Is the Metaverse Dead? XR Founders Break Down the Hype, Headlines & Reality
Every few months, headlines declare the "death of the metaverse." But is the dream really over or just evolving? In this episode of the Dauntless Podcast, co-founders Lori-Lee and Sofia get real about what’s actually happening in spatial computing, why VR/AR adoption is still niche, and what it’ll take for the metaverse to become mainstream.
Whether you’re a tech founder, product leader, or just XR-curious, this deep dive covers the real data and practical advice for making your business “metaverse ready” (even if you’re skeptical about the term).
Episode Highlights
“Ready Player One” vs. 2025 Reality
The sci-fi vision of a fully immersive, persistent digital world is still far off. Despite a $70B+ annual market, adoption of true VR/AR-based metaverse platforms remains niche. So, what’s holding us back?
The Great Metaverse Rebrand
"Metaverse" has baggage, so watch for new terms like "spatial computing." Tech is moving forward, even if the buzzwords change.
Defining the Metaverse (for Realists)
For this conversation, the metaverse means immersive, real-time digital spaces where people interact, create, and collaborate—using VR, AR, AI, and blockchain, not just 2D screens.
Adoption: Where Are We Now?
600M+ users in 2D worlds (Roblox, Minecraft, Fortnite)
141M+ people using VR worldwide (not mainstream yet)
Most VR/AR adoption is still in gaming, with enterprise use (training, aerospace, defense) quietly growing behind the scenes
Why Hasn't the Metaverse Happened Yet?
Three big roadblocks:
Interoperability: Apps and content don’t work seamlessly across devices. Developers can’t “build once, run everywhere.”
Content Creation: There aren’t enough XR-native apps, and user-generated content tools are still early. (Psst—Katana XR is tackling this!)
Hardware: Devices are either too heavy, too expensive, or not powerful enough. Form factor must match the use case, and we’re not there yet.
Hardware Is More Like Cameras Than Laptops
You’ll likely own different headsets for different jobs, just like you use different cameras for different situations.
Where Adoption Is Actually Happening
Gaming leads the way for consumers.
Training and simulation are driving enterprise adoption, especially in aerospace and defense.
Businesses are quietly improving onboarding, training, and process quality with XR even if it’s not headline news.
Frequently Asked Questions We Get About the Metaverse
Is the metaverse really dead?
Nope. The hype cycle is noisy, but real progress is happening—just slower and more quietly than headlines suggest. The tech is evolving, and so is the language (think “spatial computing”).
Why hasn’t the metaverse gone mainstream?
Three main reasons: lack of interoperability, limited content creation tools, and hardware that’s either too bulky or not powerful enough for what people want.
What’s the difference between 2D digital worlds and the “real” metaverse?
2D platforms like Roblox and Minecraft have massive adoption, but the immersive, persistent, spatial computing metaverse (VR/AR) is a different beast—one that’s still finding its footing.
How can businesses prepare for the metaverse (or spatial computing)?
Look for real problems you can solve with XR now (virtual try-ons, 3D product demos, immersive training).
Digitize your business knowledge and assets in common formats (FBX, OBJ, OpenUSD); a conversion sketch follows this list.
Experiment with AI-powered 3D asset generation and low/no-code authoring tools (like Katana XR).
Don’t wait for a “perfect” headset to be released—start experimenting and learning what works for your business.
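As a companion to the formats tip above, here is a minimal sketch of what batch-converting legacy 3D assets into open formats can look like. It assumes a Python environment with the open-source trimesh library installed; the folder names are placeholders, and FBX or OpenUSD export generally needs DCC tools or dedicated USD libraries rather than this snippet.

```python
# Minimal sketch: batch-convert 3D assets into open, widely supported formats.
# Assumes `pip install trimesh` and source files in formats trimesh can read
# (STL, PLY, glTF/GLB, OBJ, ...). Folder names below are placeholders.
from pathlib import Path

import trimesh

SOURCE_DIR = Path("assets/source")    # hypothetical input folder
EXPORT_DIR = Path("assets/exported")  # hypothetical output folder
EXPORT_DIR.mkdir(parents=True, exist_ok=True)

READABLE = {".stl", ".ply", ".gltf", ".glb", ".obj"}

for src in SOURCE_DIR.iterdir():
    if src.suffix.lower() not in READABLE:
        continue  # skip formats this sketch doesn't handle
    mesh = trimesh.load(str(src), force="mesh")  # collapse scenes into one mesh
    # OBJ and GLB are both broadly supported by game engines and XR tooling.
    mesh.export(str(EXPORT_DIR / f"{src.stem}.obj"))
    mesh.export(str(EXPORT_DIR / f"{src.stem}.glb"))
    print(f"converted {src.name} -> {src.stem}.obj / {src.stem}.glb")
```

Even if a target platform ultimately wants FBX or USD, getting assets out of proprietary formats and into something open like glTF usually makes that last conversion step a much smaller job.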
What’s the single biggest thing holding back adoption?
Hardware is the biggest barrier to widespread adoption. Form factor, compute power, and cost all have to converge before the value justifies the investment. But interoperability and content creation are close behind.
Our Top Tips for Getting “Metaverse Ready” in 2025
Don’t wait for the hype to return. Start experimenting now, even if it’s just with free tools or pilot projects.
Digitize everything. The more of your business knowledge and assets you have in accessible digital formats, the easier it’ll be to leverage XR and AI later on (a minimal extraction sketch follows this list).
Think use-case first. Not every XR project needs a headset; sometimes AR on a phone is enough.
Prepare for a multi-device world. Like cameras, you’ll likely use different hardware for different jobs.
Adopt a learning mindset. The industry is evolving fast—stay curious, experiment, and share what you learn.
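For the “digitize everything” tip, here is a minimal sketch of pulling a folder of PDFs into a single JSON knowledge file. It assumes the open-source pypdf package; the folder and file names are placeholders, and real document sets will usually need cleanup and chunking beyond this.

```python
# Minimal sketch: extract text from a folder of PDFs into one JSON file.
# Assumes `pip install pypdf`; folder and file names are placeholders.
import json
from pathlib import Path

from pypdf import PdfReader

DOCS_DIR = Path("knowledge/pdfs")                    # hypothetical input folder
OUTPUT_FILE = Path("knowledge/knowledge_base.json")  # hypothetical output file

records = []
for pdf_path in sorted(DOCS_DIR.glob("*.pdf")):
    reader = PdfReader(str(pdf_path))
    # extract_text() can return None for image-only pages, hence the `or ""`.
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    records.append({
        "source": pdf_path.name,
        "pages": len(reader.pages),
        "text": text,
    })

OUTPUT_FILE.parent.mkdir(parents=True, exist_ok=True)
OUTPUT_FILE.write_text(json.dumps(records, indent=2, ensure_ascii=False), encoding="utf-8")
print(f"wrote {len(records)} documents to {OUTPUT_FILE}")
```

Once your knowledge lives in a structured, machine-readable file like this, feeding it to an AI assistant or an XR authoring tool becomes a plumbing task instead of a transcription project.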
Podcast Transcript
Lori-Lee: Ready Player One was set in 2045. Is that realistic? In 2025, adoption is still pretty niche. You know, what gives? Is the metaverse dead? Is this $70-billion-a-year market just going to fizzle out to nothing? And will we ever get the Oasis?
Sofia: It feels like every, what, every like four to six months, there's another round of headlines that are like, VR is dead, XR is dead. It's been declared dead a lot for something that's still going, just maybe a bit slowly.
Lori-Lee: And I do think that the Metaverse, as it maybe was traditionally, is undergoing a bit of a rebrand. I'm seeing spatial computing a lot more, and that might be because Metaverse has just accrued too much baggage that people don't want, or too many associations that people don't necessarily want anymore.
Sofia: I don't know that it represents what people really want, but I think like for the purposes of saying, is the metaverse dead? When is it coming? Why hasn't it happened yet? I do think we have to define what we're talking about when we say the Metaverse, because I do think there's a lot of valid definitions out there.
But I think for the purposes of this, we'll anchor it in a definition that we talk about and go from there. So for this conversation, the metaverse would be a world where digital, physical, and virtual realities are merged, where the people interacting in this world can interact, create, and collaborate in immersive environments that are real time, just like they would in the real world. The technologies we typically talk about when we're talking about the metaverse would be VR and AR, but AI and blockchain also play a role. And it's really envisioned as this persistent space, not something that you come in and out of, where people can work, do commerce, socialize, all the things. For our conversation today, when we talk about why hasn't it happened yet, how are we gonna get there, are we gonna get there, we should focus on spatial 3D experiences rather than a digital world that's accessed through a 2D screen. I would not include Second Life as an example of the type of metaverse experience that we're defining today.
Lori-Lee: Yeah. Also digital worlds that you access through a 2D screen, so like Roblox, Minecraft, Second Life, Fortnite. Those platforms have 600 million monthly active users. There's no real question that people have adopted those particular products; they have product-market fit, they're good.
When people talk about metaverse adoption, they're talking about the things that you said, you know, augmented, mixed, and virtual reality, or extended reality, spatial computing, whatever label you want to put on it. And those things are typically accessed through a headset or through a phone with pass-through turned on.
So you can see your real environment with a digital overlay.
Sofia: That makes sense. Let's level set where adoption is today. So you said we have 600 million users in virtual games like Roblox, Minecraft, Fortnite, and I'm gonna refer to my notes for some of these stats here. We do have 141 million people worldwide who are using VR and accessing the metaverse. 49% or so of those people are using a Meta Quest to interact with the metaverse.
31% are using a PlayStation VR, 12% an HTC Vive, and 6% a Valve Index. And if you don't own a headset of any kind but you're still interacting with the metaverse, there's an 80% utilization rate. So there is evidence of product-market fit, but compared to how many people are on the 2D equivalents or other types of apps, that adoption is not mainstream.
Those are not numbers that we would call mainstream adoption. What I think has the people we talk to worried, or that we see in this space, is that the amount of dollars being poured in to make the metaverse happen is not quickly resulting in proportionate adoption. So Zuckerberg is spending, what, 12 billion a year?
So 1 billion a month. I haven't been able to visit the Oasis yet, even on that 1-billion-a-month burn. Why is that? And I think we should talk a little bit about what challenges we see, or think there are, between the current adoption rate and Ready Player One, where everyone's interacting with the Oasis every day.
Lori-Lee: Yeah, it's a good question, and I think there's obviously a very long list of things that need to happen before we get to that kind of adoption and that kind of state. But I think the top three, the really big ones that we haven't been able to, you know, fully crack yet, are interoperability, which we can talk about what that means, content creation, and then the hardware itself.
So do you wanna talk about interoperability and what that means?
Sofia: Sure. At its most basic level, the interoperability challenge is this: no one who creates a business today and then makes a website for their business expects anything other than that anyone can access their website, regardless of whether they're using a desktop, a laptop, or a phone. If they're publishing it on the internet, they expect it to be accessible to everyone. That is not true in XR today, even if you use the best of the best in terms of industry standards. So you're building on OpenXR, or if you're interacting with digital twins, you're using OpenUSD. Even if you're using what's considered the industry standard, you still cannot, today, as a developer who wants to contribute to the metaverse, build an app that doesn't require adaptation to go from one platform to another. If you own an HTC Vive and I own a Meta Quest and we both want to play an app together, someone has to have done custom development to make it work. So you and I cannot join a common game, right? And this is true across all of them. Now, you could say that interoperability is just the nature of technological development and where we are. The problem is, if you don't solve the interoperability problem, you really limit your ability to access users, and then you limit the ability for users to adopt it. So if you're a developer today building something, and you look at the stats we listed earlier about what hardware people are using to access these experiences, the first thing you're gonna do is go to phones, because everyone has those, but not all phones are equipped, although most of the modern ones are. And then, two, you're gonna go to Quest, because that's where the majority of the users are, right? But again, even with Quest, there are limitations, like on the pass-through API, and things that you can and can't do on that headset that you can do on other headsets. So yeah, we could go on and on about interoperability for a very long time, but at its core, the technological underpinning is just not there to make it true that, as a developer or publisher of metaverse content, you can guarantee it's accessible to anyone who has a computing device.
Lori-Lee: Yeah, I think we can maybe ground this in an example. Let me throw one out there and you can tell me if it's correct. So say, for example, I have a Quest and I'm gonna go put on my Quest and play Beat Saber. Awesome, right? If you're right next to me and you have an HTC Vive and you also want to play Beat Saber with me, there's no way to do that today.
I don't even know if Beat Saber is available on [HTC] Vive. Maybe it is.
Sofia: It might be, I don't have a Vive,
Lori-Lee: Yeah.
Sofia: one lately, so I don't know. But yeah, that's exactly it.
Lori-Lee: It's like if, today, you couldn't access Instagram from an Android device, if it was only on iOS and you could only access it on one of those devices. That's sort of the equivalent of where we're at with XR. And that was true for mobile in the early days: things were locked into one platform because the developers didn't have the bandwidth to put an application out everywhere, and the user bases were
concentrated in different ways. So it's not that it's an abnormal stage or phase to go through, it's just where we're at right now. And it is, you know, probably hindering adoption, because I don't wanna go pay for Beat Saber on four different devices, if that's what I have, which I probably do have like five devices in this house.
But that's a hazard of being in the industry.
Sofia: Yeah. What's another challenge that you think stands in the way between adoption rates today and the adoption rates you see in these sci-fi things like Ready Player One?
Lori-Lee: Yeah. Ready Player One is a tough example to draw from because that entire world was built by one company, right?
I don't know that it's realistic to expect that from, you know, Meta or Facebook, or even ideal, right? Like, do you really want your entire digital world dictated by one corporation, or any corporation? Who knows. But yeah, so this kind of brings us to, once you have the hardware, and we have hit a bit of a wall on hardware development at the moment.
I think AI and quantum computing may swoop in and kind of accelerate things a little bit. But, you know, at this particular time of recording, things have stagnated a bit. So what happens, going through the natural development cycle, is companies spend a lot of time and money developing the hardware, and then they go to launch this fantastic device they have built and realize that people won't use it unless there are apps that run on those devices.
We have experienced this with hardware companies coming to us and being like, hey, please build apps that work here. So we have a bit of an app deficit. There's lots of companies that have ported over mobile apps to make them run in XR, but they're not really XR-native, they're not optimized for spatial computing.
But that's okay. Like, it's not an uncommon occurrence, especially, you know, for a technology that's new. But we arrive at this point because it is uncommon for a company to be good at developing hardware, developing software, and developing content. Apple does it, Microsoft does it.
Meta is trying to do it. And for all of those companies, it took decades to get to the point where they can do all three. And it took a lot of acquisitions. They didn't home-grow everything. They purchased the capabilities that they then brought in-house.
So, like I said, Meta is trying to do the same thing. They have the Quest, they have all of their apps that run on that device, and then they're trying to do content creation as well, I believe. I think Meta purchased Beat Saber; that was their content creation. They bought it. And that's great.
But the assets that Meta has under that umbrella, they do all those things independently. And then they've had to knit them together. And that's a challenge. And I don't know how the other companies, like Apple's gonna solve that and how Microsoft will solve it other than just buying more companies that specialize in those things.
Where we get to at this point is we're going down to a level where we need user-generated content if there's gonna be enough content to go around. And the whole point of the metaverse is to create digital worlds that don't exist in real life. We would like users to
be able to do that on their own. Not all of the content is gonna come from an app or headset provider. So one of the roadblocks is we need a way for users to make their own stuff without being a developer or a 3D modeler or, you know, a metaverse developer. We're tackling that a little bit with Katana.
Well, actually, we're tackling that a lot with Katana, where we offer no-code metaverse content creation that you do on a desktop and then push to the device. And I think AI, and this is kind of a separate topic, but I think AI is going to help out a lot with asset generation, because people who have not worked with 3D models in the past can just type out what they need and one will be created.
I could go on a lot about the content side of things because it's one of the topics that's near and dear to my heart, but that's all I'll say for now. Did you wanna talk about hardware? The elephant in the room?
Sofia: I think the other thing I would add about content, not to tie it all back to interoperability, but with Katana you can publish to multiple headsets because of how we built it.
Lori-Lee: Mm-hmm.
Sofia: There are other tools out there, but then they're often exclusive to the headset. So
Lori-Lee: Yeah.
Sofia: you get these low-code things where you can build a game, but then it will only run on something specific. And this does also feed into hardware, because part of the reason this is hard is that the form factor here has to follow the use case
Lori-Lee: Mm-hmm.
Sofia: rather than the other way around. And because we're still figuring out product-market fit in all these areas of XR, figuring out what the right form factor is for the use case is arguably very hard. The example I'll give is, as a millennial, I will shop on my phone for little things and big things. I'll look for houses, right? But I'm not gonna file my taxes on my phone. As a business owner, I will leave myself voice notes and notes of ideas I have, maybe for a government proposal we're working on, but when it actually comes time to put pen to paper and build that narrative, I'm gonna go to a desktop to do it. There's this push and pull of wanting the convenience in the smallest possible form factor, which I think is the predominant narrative within XR, just make it smaller, make it lighter, make it smaller, make it lighter. But it depends on what it is you're trying to do. If you're just trying to quickly check what the weather is, or pull up your directions so you can watch where you're going while enjoying the scenery, yeah, that should be a really light, really small headset. If you're trying to visualize dense 3D data, say a flight you just flew, the mission debriefing stuff we worked on, weather, you need more compute power to do that. So you either have to offload that compute, you have to have a tethered headset, or you have to have a headset that carries a compute pack. So I do think in many ways hardware might be the biggest barrier to adoption, because with the processing power needed for a highly graphics-intensive experience, whether you're talking about augmented or virtual reality, the device has to be big, and that drives down the amount of time someone wants to spend on a device, right?
Like two to three minutes, maybe max 10. If you could get the Apple Vision Pro experience on something that looked like a Ray-Ban, we'd be having a different conversation about adoption right now. You mentioned it earlier, but Microsoft dropped their quantum chip. So who knows, a cheaper, sleeker headset might be closer than we thought.
Because of that, that might solve some of this compute problem. But also, if I could use ChatGPT through a light pair of glasses, I think I would use it all the time. I ask my phone a million questions all day, and being able to ask a pair of glasses would be great. But I do think there's still going to be a place in the market for these heavier-duty headsets for certain use cases. So I think trying to evaluate a headset that's on the market is very hard unless you're evaluating it against a very specific use case. Like, you can't evaluate a Varjo XR-4 the way that you would evaluate the Meta Ray-Bans. You just can't; they're trying to do two different things.
Lori-Lee: We talked about this a bit before. I think you came up with this analogy that headsets are gonna be closer to cameras than they are to laptops. Like, I have a different camera depending on what I want to go photograph or film. I might just use my phone, I might use that film camera over there,
I might use my DSLR, or I have a retro Camp Snap camera around here somewhere that's like an early-2000s-style digital camera. I'm gonna use a different camera depending on what I want to shoot. I think you're right in that with XR headsets, you're going to pick up a different one, potentially, depending on what you use it for the most.
Not everyone is going to have an XR-4 rig in their house. But a lot of people might have something that is Ray-Ban Meta-esque, right? Like, at that level. And maybe, depending on what you're into, how much gaming you do, what your job demands are, you might have something that is a little more intrusive and with a bit more computing power.
It's not gonna be one headset to rule them all.
Sofia: No, it will not. Okay. The top three challenges were hardware, content creation, and interoperability. Out of everyone that is using these headsets today, using this technology despite those challenges, what do we think they're doing? And what does that tell us about where adoption might continue to grow, where it might come first, where it might come last? Curious what you think about this.
Lori-Lee: Yeah, so a majority of the hardware users, so of that 141 million, give or take, a majority of those are using the metaverse for gaming. Not a huge surprise; games lend themselves really well to the use case. So outside of gaming, we have corporations that are using VR for training, and we also see traction in aerospace and defense, which a lot of people are surprised to hear.
But when you drill down, pilots are used to training in simulators, which is really just a full VR experience. So doing similar kinds of activities, but in a smaller form-factor headset, actually isn't that big of a behavioral change. If anything, it's easier to pull out a headset and run through training than it is to go to the simulator, power it up, and sit in it.
So that's where a majority of the traction is today. And we tend to see the gaming side of things reported on the most because that's the most accessible. But these less sexy enterprise use cases, they're happening, people are spending the money, but it's not that interesting to write a headline about how this company is using VR to do onboarding. Like, no one cares, but
Sofia: It's difficult to quantify,
Lori-Lee: Yeah.
Sofia: You and I both know from having worked corporate jobs, there is a bunch of process improvement that you do that is difficult to distill down to, like, here was the concrete value to the business, beyond just: it was better, there was less friction, and people hated it less while going through it.
It doesn't mean that it didn't add value, but it does mean that sometimes distilling that value down into something financial can be tricky.
Lori-Lee: Yeah. And I mean, there's lots of science, like we have lots of research, which is nice, that shows what the improvements are with using augmented or virtual reality for learning and training activities versus, you know, PowerPoints or computer-based training. There's a pretty decent set of data that shows you have better retention if you learn certain things in a spatial environment versus trying to read it off of a screen.
So, you know, there are some quantifiable pieces there. But yeah, like you said, a lot of it is just the qualitative feedback of, well, onboarding sucked less this time, or the training was less brutal, maybe because I could do it in, you know, 15 minutes instead of four hours. But that's also harder to justify as a business decision.
Like, oh, people really enjoyed it, can we spend a bunch of money on it? Sometimes that works, but not always.
What else have we got here? Do you wanna go into how to get metaverse ready?
Sofia: Yeah, let's talk about it.
Lori-Lee: Yeah.
Sofia: We get asked this a lot: if you're a business that's not yet using this type of metaverse product in your run of business, but you do believe, like we do, that it will be a technology that changes the way we work and live, how do you make your business metaverse ready so that you can capitalize on the benefits of the technology when the challenges start to get resolved and the adoption starts to catch up?
Lori-Lee: Yeah. No. This is a question we are asked quite a bit, and first I'm gonna reiterate what I said at the beginning. I think the term metaverse is going through a rebrand. So if you're looking ahead for the next, you know, kind of technology iteration, I'd look for spatial computing.
It might yield better results. As for what to look for, I would look for opportunities where you can use the technology as it stands today to solve a problem. So if you sell a physical product, do you offer virtual try-ons?
Do you offer “see it in your space”? I don't know what that product is called, but you can place your IKEA furniture in your living room and see how it looks. That's augmented reality. Do you have that available? Are you using it? It's been shown to, you know, increase conversions and sales and things like that.
If your business has a heavy training or onboarding component to it, can you streamline that experience with VR and, you know, potentially have to train people less often because their retention is better? If your business already uses 3D assets, like if you're an engineering firm and you have a bunch of 3D models, can you be
getting a better ROI out of those assets by using them in XR somehow, even if it's for marketing? The other thing I would say is go and experiment with free 3D asset generation with AI. There's a couple of them out there now. And if that sounds a little too intimidating, you could start with trying to author a mixed reality workflow for free. You can do that with our app Katana. My biggest caution is don't wait around and let someone else from a consulting firm or wherever tell you the best way to implement the technology.
You need to go find that out for yourself, especially if you're a business owner. For the love of God, do not wait for the industry or for someone else to tell you the best way to use it. It is going to be such a case-by-case basis, in my opinion.
Sofia: I think the other one that always comes to mind for me when we're getting asked this question of how to be metaverse ready, and maybe this is the nature of the markets that we've operated in, is something that you articulate to people all the time.
You can't jump to Industry 5.0 if you're on Industry 2.0. So if all of the knowledge on how your business is run, and how people build, maintain, and sell your physical or digital product, is all on paper or in someone's brain, get it out and get it into a digitally native format. If you do sell physical products, do you have 3D models of them? And are those 3D models in commonly held formats? We don't want custom proprietary formats. We want FBX, we want OBJ, we want OpenUSD, because that will make it easier to bring your content into the metaverse without having to learn how to code. If you have the content in images, in PDFs, in Word documents, anything that can be extracted down to a JSON structure, then it's much easier to take that and put it into the metaverse. If it's in a binder in a filing cabinet somewhere, much harder. So for businesses that think their line of business isn't digitally native, I would argue that even if it's not the metaverse you end up taking advantage of, maybe it's AI, beginning to digitize your content and your knowledge base will only make it easier, as these technologies increase in adoption and usefulness, for you to take advantage of them in a way that's meaningful.
Lori-Lee: Yeah. And that has the added bonus of setting you up really well to take advantage of AI. The omnipresent AI will eventually need access to your data, whether that's personal data, business data, whatever. If that data is sitting in a binder in a box under someone's desk until the robot apocalypse, we won't be able to access it.
So taking the time to make sure all of that is digitized and, like Sofia said, in a common format is a really good use of time.
Sofia: It is. I think we
Lori-Lee: Yeah.
Sofia: take it for granted because we are a remote company, so everything we do is digitally native. But I think about, like, when we decided to onboard business AI agents. I promise this is a short tangent, and we should do a whole other episode about how we use AI to help us run our business. It was minutes of us training the model, being like, here's our brand kit, here's our past proposals, here's our case study, here's all of these things you need to know, because all of that content was already digitally native. So if the barrier to onboarding those small business AI agents had been higher, I feel like it would've been harder for us to extract value out of it quickly.
But because it was so easy for us to very quickly take all of the relevant digital content
Lori-Lee: Mm-hmm.
Sofia: and feed it in, and then be able to start using it and see that the AI was referencing the content that we'd given it, it was like, oh, this is useful, and it's useful fast. So I think we take it for granted again because we're a remote company who has always had to operate in a digitally native context. But if you haven't, that's definitely something that can fall by the wayside.
Lori-Lee: Oh, for sure. And I was actually just thinking, you know, a lot of bigger companies, like we came from big corporations where at the C-suite level or at the leadership level, they're like, yep, everyone's a hundred percent digitally native. And for those people, that's true, but then it doesn't trickle down. You get down to the individual developer level or individual engineer level, and you find out, oh, this person's actually been keeping all of their spec sheets printed out in a drawer and
Sofia: on a Word file on their local drive.
Lori-Lee: Yeah, somewhere.
Sofia: not very accessible to the business.
Lori-Lee: Right? You have to make sure that that trickles down. And for us, our org's pretty flat, so we can do that pretty quickly. But yeah, I agree, the AI agent thing would be a fascinating episode, because I say all the time, we'll ask it a question or have it do something for us and we get this great answer, and I'm like, I love AI.
And then I see other people that are just bashing their heads against the wall because the answers they're getting are so mediocre. And it's such a disconnect, right? And I'm like, yeah, but when you have the data to do it right, it works really well. But you have to have the data there and accessible to enjoy those benefits.
And yeah, that was a great call-out. Digitize your knowledge into a common format.
Sofia: As quickly as you can.
Lori-Lee: Yeah.
Sofia: Okay. Before we have to jump off and we start getting slacked, what have you been listening to? What is on your recently played list?
Lori-Lee: Ooh. So this week I actually just finished, or no, I'm like halfway through the episode, so hopefully the second half of it is as good as the first half. But I've been listening to the Financial Diet episode called You're Being Gaslit by Generational Wealth, which is a great title. It's hosted by Chelsea Fagan, and she's just had some bangers lately.
The episode I listened to before this one was called It's Low Key Giving Recession, and I completely understand why that one got a lot of listens.
Sofia: These are amazing podcast titles
Lori-Lee: Yeah. Right?
Sofia: Top tier. Love it.
Lori-Lee: But yeah, and it's been fascinating. I was, you know, today years old when I found out that grown adults, people my age, have parents helping them put down payments on houses even though they have full-time jobs. So who knew? How about you?
Sofia: Mine's a repeat, and you're already sick of me saying this or you're about to be, but I started re-listening to We Are Legion (We Are Bob), because it's so good. And it's not a podcast, it's an audiobook. You've heard this, but it's so enjoyable to re-listen to because there's so much more to it. It is space sci-fi, set in the future, where we can create artificial intelligence from human intelligence, and then that artificial intelligence can go explore the universe and meet aliens and explore black holes and 3D printers in space, and all of the cool things. It's such a fun read.
You can tell it was written by an engineer, but in a way that's so accessible, even if you're nontechnical, but we are. And I know I've already made you listen to this, and I'm making James listen to it now, but it's just so good. This is like my third re-listen now, I think, and it is so wildly entertaining and I just can't not love it. And I need the author to write more. I think there's five books now; I need there to be at least five more, because there's so many fun adventures that these AI can go on in the universe, in the galaxy. And I wanna read 'em all, every single one.
Lori-Lee: Yeah. How crazy is it that I forget when I'm reading or listening to that one that the main character actually isn't human.
Sofia: Yeah. It's so good.
Lori-Lee: Yeah,
Sofia: Yeah.
Lori-Lee: That takes,
Sofia: human-machine interface to, like, a whole new level.
Lori-Lee: yeah.
Sofia: Yeah, it's a great one. But yes, We Are Legion (We Are Bob). Listen to it on audio; you can tell, I feel like it really comes across, that it was written to be listened to. It was one of the first audiobooks I listened to. I've always been a physical-book reader, obviously, but this one, like, converted me to audiobooks.
Lori-Lee: Yeah, there's very few fiction books I can listen to in audio format, and that's one of them.
Sofia: All right. Off we go.
Lori-Lee: Off we go. Thanks for joining us this week and we'll see you next time. Okay. Bye.
Sofia: Let's see how many days it is until there's another headline that says the metaverse is dead.
Lori-Lee: Right now we're at zero days.
Sofia: Zero. It is.