Meta and the Metaverse: a mostly useless big hit

Simone Brunozzi
13 min read · Nov 5, 2021

I have studied Mark Zuckerberg’s latest keynote about Facebook’s rebranding as Meta and its push towards the Metaverse (full 80-minute keynote here, or an 11-minute summary here). Most of it is, frankly, useless junk or marketing/PR fluff, as one could expect, but there are some important gems in it that are worth highlighting and commenting on.

What I’ll do is provide a quick summary of each section of the keynote (direct quotes from the keynote are in italics), followed by my commentary. Expect to spend 12–15 minutes reading it all; I hope you enjoy it. But if you don’t want to read it all, here are my conclusions, right here at the start.

  1. Facebook (now Meta) is very serious about this switch of focus to everything Metaverse, and the “old” business and the “new” will be essentially two separate entities. Innovate or die. It’s a bold move.
  2. Facebook/Meta has invested, and will invest, several billion dollars to make the Metaverse a reality.
  3. They have their own view of what the use cases for the Metaverse will be, and it seems to me that… they still haven’t figured it out. No, really. They want to be a platform, and a marketplace, and make money off that. I think they’re missing out on three HUGE use cases that I would instead bet on: fitness, learning/education, and work. Let’s be specific.
  4. Fitness: it makes sense to use AR/VR to vastly improve how people experience fitness today. I’m more of a “solo” type, happy to hike a mountain alone rather than do a Peloton class, but for most people, “group” fitness, or at least “augmented” fitness, is a thing. A $100B/year market ripe for disruption.
  5. Education: again, another huge vertical where a novel approach to engaging lessons and exams could bring in tons of revenue for the company. Or, just build MetaU, the global university, where hundreds of Nobel Prize winners teach you their stuff, where Teaching Assistants are smart AI bots that help you better than any human counterpart, where a VR environment lets you hang out with other students, and where tens of millions of people from all over the world pay $1,500/year to learn. Do the math: it’s a $50B/year revenue stream. But Facebook/Meta wants to be a platform… They don’t want to compete with Harvard and Stanford and MIT and everyone else.
  6. Work: maybe, just maybe, they will have the right technology to really outcompete Zoom and the rest. Because Zoom is much better than nothing, but Zoom sucks, and remote work is going to stay forever, and the world needs something better, soon.
  7. Apple is a pain in the butt for them (or a double pain, if they come up with their own AR/VR hardware). Possible solution: build the Metaverse, control it, become the Apple of the Metaverse, and stop depending on Apple’s walled garden. Good strategy, if it works and if Apple falls asleep for a few years.
  8. Why does my title say “mostly useless”? Because there’s a ton of desire to build things for the Metaverse, but a lot more confusion about what exactly should be built. Second Life, Minecraft, Roblox: there are tons of examples of partially or vastly successful Metaverse-like communities out there, and yet it seems that Facebook/Meta is so focused on the technical details that it is unable to describe a grand vision that makes sense. All the examples they provide are so lame, so shallow, so… sad, and so culturally narrow, that I can’t really tell whether they have a proper strategy under the hood, or are just improvising. But maybe they will get there eventually.
  9. Side comment. Facebook/Meta, do you really want to change the world? Use your data to fix divorces, finance worthy people, improve the mental health of your customers, instead of trying to squeeze every cent out of their attention span.
  10. Despite all of the above, I’m still bullish on Facebook/Meta the company, and I believe that they will do well in the next few years.

Part I: MetaSocial

We’ve gone from desktop to web to phones, from text to photos to video. But this isn’t the end of the line. The next platform and medium will be even more immersive, an embodied internet where you’re in the experience, not just looking at it, and we call this the Metaverse. It will be the successor to the mobile internet.

When you play a game with your friends, you’ll feel like you’re right there together in a different world, not just on your computer by yourself. And when you’re in a meeting in the Metaverse, it’ll feel like you’re right in the room together, making eye contact, having a shared sense of space and not just looking at a grid of faces on a screen.

Everything we do online today (connecting socially, entertainment, games, work) is going to be more natural and vivid.

Screens just can’t convey the full range of human expression and connection. They can’t deliver that deep feeling of presence, but the next version of the internet can.

Mark Zuckerberg’s (MZ from now on) view is that the Metaverse will certainly happen. How will we socialize in the Metaverse?
We will need a feeling of presence and a home space, use avatars, teleport around (like… links). We will need interoperability (related API already being built by Meta), privacy and safety. We’ll use virtual goods and import/export to/from the physical world to the Metaverse.
There will be three main modes: Virtual Reality (VR), Augmented Reality (AR), computers/phones (interfaces); but also new ways of interacting (natural interfaces), which in 5–10 years will be mainstream.
Horizon is the name for the social platform for the Metaverse, which starts at Horizon Home. Soon they’ll launch a social version of Horizon Home. People will be able to program things in Horizon Worlds.
Quite obvious so far. Let’s keep going.

Part II: MetaCreate

Avatars will be as common as profile pictures today, but instead of a static image, they’re going to be living 3D representations of you, your expressions, your gestures that are going to make interactions much richer than anything that’s possible online today.

You should be able to bring your avatar and digital items across different apps and experiences in the Metaverse. Beyond avatars, there is your home space.

Teleporting around the Metaverse is going to be like clicking a link on the internet. It’s an open standard. In order to unlock the potential of the Metaverse, there needs to be interoperability. You’re not going to be locked into one world or platform. You want to know that you own your items, not a platform. Privacy and safety need to be built into the Metaverse from day one. You’re going to be able to bring things from the physical world into the Metaverse, almost any type of media that can be represented digitally.

There are going to be new ways of interacting with devices that are much more natural. In the next five or 10 years, a lot of this is going to be mainstream and a lot of us will be creating and inhabiting worlds that are just as detailed and convincing as this one on a daily basis.
Horizon is the social platform that we are building for people to create and interact in the metaverse. Then there is Horizon Worlds, which is where you can build worlds and jump into them with people.

Horizon Workrooms will allow messaging between VR and RL (Real Life). The keynote shows a (lame) video of what a concert in VR could look like (at ~20:00).
Creators and artists will use Spark AR to program digital objects in RL and VR. There will be a Horizon Marketplace for 3D digital items.
The Metaverse is mostly about gaming. The keynote shows mockups of a chess game, a ping-pong game, and MZ surfing virtual waves. Again, quite lame so far.
There will be the need to build an entire AR/VR ecosystem (~25:00).

Gaming provides many of the most immersive experiences and it is the biggest entertainment industry by far. Gaming in the Metaverse is going to span from immersive experiences and fantasy worlds to bringing simple games into our everyday lives through holograms.

Games available or being developed on Oculus: Arizona Sunshine, Beat Saber, Billie Eilish music, Population: One (disclosure: I was an investor in BigBoxVR, the company behind it), Blade & Sorcery: Nomad, Grand Theft Auto: San Andreas. I’m not a gamer, so I’m unable to comment, but it seems to me that most of these games cater to a young population worldwide, and to a US-based, wealthy adult population. I was surprised to see no mention of the Chinese market.

The most interesting games out there take advantage of how you can move around physically. And one example that we are seeing take off is fitness.
You can do anything from boxing lessons to sword fighting to even dancing. You’ll be able to work out in new worlds, even against an AI.
We’re making a fitness accessories pack.

Fitness in AR/VR: boxing video (~31:15), cycling video, sword fighting.

My personal view is that fitness + AR/VR makes a lot of sense. It is one of those specific use cases where the ability to move around physically matches what the technology can do to enhance the experience. This enables two main things: remote social sports (e.g. biking alone with AR glasses), and augmented/virtual sport experiences (e.g. biking the Stelvio in VR, comfortably using a real bike in your living room). The global fitness market is ~$100B per year, and it makes sense to expect that a large fraction of it could be captured by the right product in the next few years, potentially also bringing in new customers.
Let’s now move to work.

Part III: MetaWork

Remote work is here to stay for a lot of people.

Imagine if you could be at the office without the commute. You would still have that sense of presence, shared physical space, those chance encounters and interactions that make your day, all accessible from anywhere.

When you’re ready to share what you’ve been working on, you can present it as if you’re right there with the team.

Hybrid work is going to be a lot more complex when some people are together and others are still remote.

Work will be hybrid. The Keynote shows an architecture studio debrief mockup video, and a work meeting mockup.

My view: I think that a better videoconferencing experience for work would be very valuable to the world. Zoom is currently the best, but still has several limitations. No one has seriously solved how we’re going to have effective remote meetings in small groups, which is the most common type of interaction, and also no one has solved how we can effectively share information (3D models, media) in these meetings. Later in the keynote there are some intriguing examples of prototype technology that could be really effective at doing that.

Here I think that Facebook is seriously underestimating how a sharp focus on this single use case could have a profound impact on their future revenue. It doesn’t surprise me that their enterprise efforts (Workplace) haven’t been very successful so far.

Perhaps this will be an area where they decide to go deep and provide a complete solution for enterprise customers.

Part IV: MetaLearn

In the Metaverse, learning won’t feel anything like the way we’ve learned before.
So we’re setting aside $150 million to train the next generation of creators to build immersive learning content and increase access to devices.

We’re going to establish a professional curriculum and certification process, make it easier to monetize, and put our Spark AR curriculum on Coursera and edX.

We plan to continue to either subsidize our devices or sell them at cost to make them available to more people.

Interactive training and learning will be huge in the Metaverse. Keynote shows an astrophysics video (~37:00), which is cool but also fails to represent what’s really going to happen here.

The education/training sector is huge, and in desperate need of a revamp. We still use centuries-old approaches to teach, train, and upskill people of all ages, and this can and should change soon. We can’t wait for millions of great teachers to appear; we need technology to augment the ability of the best teachers, and creators, to help billions of people worldwide.

Again, here Facebook/Meta seems to underestimate how effective a sharp focus on this particular use case could be at generating a huge revenue stream for them. They could be a global university (MetaU?) with millions of students enrolled, learning from tens of Nobel Prize winners… But they don’t seem to be bold enough to think in these terms.
Such a global university, leveraging Facebook/Meta’s brand, could easily get tens of millions of students enrolled per year, at a $1,500/year tuition fee. The math makes it at least a $50B/year business opportunity.
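To make that back-of-the-envelope math explicit, here is a tiny sketch; the enrollment scenarios are my own hypothetical assumptions, and only the $1,500/year tuition figure comes from the argument above:

```python
# Back-of-the-envelope check of the "$50B/year" MetaU claim.
# Enrollment scenarios below are hypothetical, not Meta's numbers.
TUITION_PER_YEAR = 1_500  # USD, tuition figure used in the text

for students in (10_000_000, 35_000_000, 50_000_000):
    revenue_billions = students * TUITION_PER_YEAR / 1e9
    print(f"{students:>10,} students -> ${revenue_billions:.1f}B/year")
```

At roughly 35 million students (squarely in the “tens of millions” range), annual revenue crosses $50B, which is where the figure in the text comes from.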

Part V: MetaMeta (and Facebook Reality Labs)

Within the next decade, the Metaverse will reach a billion people, host hundreds of billions of dollars of digital commerce and support jobs for millions of creators and developers.
Today, much of what you buy on the internet is inside a single app, website, or game.
Businesses will be creators too, building out digital spaces or even digital worlds. They’ll sell both physical and digital goods as well as experiences and services. And they’ll be able to use ads to ensure the right customers find what they’ve created.

Creator economy for the Metaverse (~44:00, Vishal Shah)

More people are going to have the freedom to find a business model that works for them, whether that’s custom work, tipping, subscriptions, ads, or other monetization tools that may only make sense in the Metaverse.

Today, we’re introducing the Presence Platform, which is a broad range of machine perception and AI capabilities that empower developers to build mixed reality experiences on Quest 2.

Realistic presence is the key to feeling connected in the Metaverse, and the Presence Platform’s capabilities are what’s going to deliver on that promise, things like environmental understanding, content placement and persistence, voice interaction, standardized hand interactions.

The Interaction SDK is a library of modular components that will make it easy to add hand interactions to your apps (~53:00).

Small funny note: they should have asked the Italians. We know how this stuff with your hands works.

Voice SDK (voice input for gameplay and navigation). With the Passthrough API, we’ve already seen breakthrough experiments from developers that blend the virtual and the real world.

To achieve that rich mixed reality experience, apps also need to be aware of things in the room and blend the virtual objects with the physical environment around them so they can co-exist in the same space.

(World Beyond video, Oppy the pet) (~54:30)

So, developers want to be able to place persistent world-locked content, like animated holograms or your Instagram feed in your real space. And tools like spatial anchors and scene understanding capabilities will help make these mixed reality experiences feel seamless.

We created a tool, codenamed Polar (~57:00), that makes AR creation possible for novice creators who have no prior experience in art, 2D or 3D design, or programming.

This is the most interesting part of the Keynote: a TON of cool stuff, with the only drawback that it’s not exactly around the corner, and that the required hardware will not be cheap enough for at least a few more years. But we will get there eventually.

Part VI: MetaPrivacy

People want to know how we’re going to do all this in a responsible way, and especially that we play our part in helping to keep people safe and protect their privacy online.
The speed that new technologies emerged sometimes left policy makers and regulators playing catch-up.
Interoperability, open standards, privacy, and safety need to be built into the Metaverse from day one. Responsible Innovation Principles.

I am almost laughing at this coming from Facebook, but let’s read on.

Part VII: MetaHardware

And one example is what we’re doing with Project Aria (~1:00:30), our research device that helps inform the AR glasses that we’re building.

Your avatar will be able to make natural eye contact and reflect your facial expressions in real time.
Project Cambria (~1:03:00), we’ll be taking this to the next level with high-resolution, colored, mixed reality Passthrough. We essentially combined an array of sensors with reconstruction algorithms to represent your physical world in the headset with a sense of depth and perspective.

We’re pushing the limits of display technology and form factor with something called pancake optics.

We are also focused on the hardware to make true augmented reality possible.

Last month, we launched Ray-Ban Stories, but the ultimate goal here is true augmented reality glasses: Project Nazare (~1:07:30)

There’s a lot of technical work to get this form factor and experience right. We have to fit hologram displays, projectors, batteries, radios, custom silicon chips, cameras, speakers, sensors to map the world around you and more into glasses that are about five millimeters thick.

It’s going to take about a dozen major technological breakthroughs to get to the next-generation metaverse and we’re working on all of them: displays, audio, input, haptics, hand tracking, eye tracking, mixed reality sensors, graphics, computer vision, avatars, perceptual science, AI, and more.

Besides what’s available today, or soon, a very interesting part is when Michael Abrash talks about future technology across all of those areas, followed by the full-body Codec Avatar presented by Yaser Sheikh of Facebook Reality Research Lab (~1:11:30).

Again, quite amazing stuff that appears to be at least a few years away.

Part VIII: MetaAvatars and Interfaces

Codec Avatars and 3D avatars; high fidelity real time rendering of the space and the moving objects.
Last year, we showed our very first full body Codec Avatar at Connect. Preventing others from using your avatar will be critical.
Neural interfaces are going to be an important part of how we interact with AR glasses, and more specifically EMG (electromyography) input from the muscles on your wrist (see Wikipedia) combined with contextualized AI.
EMG input could potentially unlock full speed typing and it could give you subtle personalized controls that you can use in any situation.
EMG enables this by picking up subtle neuromotor commands with remarkable precision.
You’re going to be able to send a text message just by thinking about moving your fingers.

(There will be an) AI that understands your context and can give you a simple set of choices based on that context, AR glasses will tell her what her available actions are at any time, and a voice interface that will let you locate any object.

New interfaces are actually an area that most people underestimate; I believe that Facebook/Meta’s investments in this area will play a huge role in enabling them to build tools for the next generation of HCI (Human/Computer Interaction). EMG in itself is an exciting field.

Part IX: MetaEnd

That gives you a sense of some of the technical challenges that we’re working on to deliver deep and immersive experiences in the Metaverse. A lot of what we’ve shown today isn’t going to be available in the next year or two. Some of this is still a long way off.

The ultimate promise of technology: to be together with anyone, to be able to teleport anywhere, and to create and experience anything.

And high taxes on creative new ideas are stifling.
Together, we can create a more open platform with more ways to discover experiences and more interoperability between them.
I hope that we are seen as a Metaverse company.
Starting today, our company is now Meta.
The word meta comes from the Greek word meaning “beyond”.

Congrats on the new name, MZ. And good luck trying to figure out how to find your true North. In the meantime, please do not screw the world too much.

Suggested further reading:

  1. The Metaverse is already here, and it’s Minecraft (2021)
  2. How I got hired by (with Second Life) (2008) — curious story from a “Metaverse” of 13 years ago. I wrote it. Lots, and very little, has changed since then.
  3. Stratechery on Meta
  4. Tim Sweeney (EPIC’s founder/CEO) on the Metaverse (2019)


