ENSPIRING.ai: Exploring the Future of AI Interaction with Mark Zuckerberg
In the video, Mark Zuckerberg discusses the future of human and artificial intelligence interfaces, focusing primarily on glasses as the medium for this interaction. He explains Meta's bold step into the realm of open-source AI and their strategic intent to avoid past platform mistakes, which once missed the mobile wave. Zuckerberg believes in creating an ideal social experience that isn't limited to a phone-based platform, aiming to harness the power of AI through glasses that offer enhanced interaction by capturing user views and sounds.
Zuckerberg addresses the practical challenges and opportunities of using glasses for AI interaction. He details how they can potentially serve as seamless AI assistants, projecting virtual images into the real world. However, he also acknowledges the limitations and discomfort that constant wear might introduce for many users. While enthusiastic about the integration of AI with wearable technology, the presenter suggests alternative formats, such as wearable pins or completely transformative hardware, that could ease usability concerns.
Please remember to turn on the CC button to view the subtitles.
Key Vocabularies and Common Phrases:
1. Interface [ˈɪntərˌfeɪs] - (n.) The point of interaction or communication between a computer and an external entity.
Zuckerberg talks about the final form of human to artificial intelligence interfaces.
2. Platform risk [ˈplætˌfɔrm rɪsk] - (n.) The potential loss or threat arising from a dependency on a single platform for business.
They were exposed to severe Platform risk because they were just built on other people's platforms.
3. Hologram [ˈhɒləˌɡræm] - (n.) A three-dimensional image formed by the interference of light beams from a laser or another coherent light source.
They're just a hologram, and we have glasses, and there's the question of delivering a realistic sense of presence.
4. Bullish [ˈbʊlɪʃ] - (adj.) Optimistic or confident, especially about something's future prospects.
And of course, he's bullish on that because he has a hit on his hands.
5. Monolithic [ˌmɒnəˈlɪθɪk] - (adj.) Massive, uniform, and unyielding, typically (in computing) denoting a system with a tightly integrated and inflexible structure.
But these things are not like pieces of software that are Monolithic.
6. Scorched Earth Strategy [skɔrʧt ɜrθ ˈstrætəʤi] - (n.) A military or political strategy that seeks to destroy anything that might be useful to an adversary.
And this is also known as the scorched earth strategy.
7. Novel [ˈnɒvəl] - (adj.) New and not resembling something formerly known or used.
You have to build a novel display stack.
8. Intellectualize [ˌɪntəˈlɛktʃuəˌlaɪz] - (v.) To think about or discuss a matter in a rational or studious way, ignoring emotional or practical concerns.
People like to intellectualize everything, but a lot of our experience is very physical.
9. Imperative [ɪmˈpɛrətɪv] - (adj.) Of vital importance, crucial.
So I think that's somewhat of an Imperative for us to go do that.
10. Ubiquitous [juːˈbɪkwɪtəs] - (adj.) Present, appearing, or found everywhere.
They are a complete monopoly in the glasses market.
Exploring the Future of AI Interaction with Mark Zuckerberg
Tell us the story of how these came to be at Meta. We've been building social experiences for 20 years now, and originally it took the form of a website, then mobile apps.
But the thing is, I never thought about us as a social media company, right? We're not a social app company. We are a social connection company, right? I mean, we talk about what we're doing as building the future of human connection. And that's not only going to be constrained over time to what you can do on a phone. Right?
All right, I'm going to pause really quickly. I know I didn't let him say that much, but I already know where he's going with this and I have a lot of thoughts on it. I talked about this in previous videos, but Meta, and Zuckerberg in particular, had been burned by building all of their apps on top of other companies' platforms.
Originally, Facebook was on the web, but when mobile came to be with Android and iOS, Meta completely missed that wave. They tried to build their own phone internally and failed; they shuttered the project and never released it. And they were exposed to severe Platform risk because they were just built on other people's platforms.
And he's going to talk a little bit about that, and this time he's not going to make that same mistake. So not only is he investing heavily to be the market leader in Open source artificial intelligence, but he also wants to be the interaction layer between humans and AI. And he believes it's in the form of glasses, not phones.
And I'm going to give you my thoughts on that after we watch a little bit more. When you think about, you know, when we got started, okay, we're like a handful of kids, you know, we weren't able, we didn't have the resources, the time to go define whatever the next computing platform is.
And also Facebook originally got started around the same time as a bunch of the early smartphones, and those platforms got started, so we didn't really get to play any role in developing that platform. And one of the big themes, I think, for the next chapter of what we do is I want to be able to build what I think are the ideal experiences, not just what you're allowed to build on some platform that someone else built, but what is actually, if you can think from first principles, what is the ideal social experience?
So I think what you would like to have is not a phone that you look down at, that kind of takes your attention away from the things and the people around you, not just a small screen. I think what you ideally have is glasses. And through the glasses, one part of it is that they can see what you see and they can hear what you hear.
And in doing so, they can be kind of the perfect AI assistant for you because they have context on what you're doing. But then part of that is also that the glasses can project images, basically like holograms, out into the world. And that way your social experiences with other people aren't constrained to these little interactions you can have on a phone screen.
You know, in the not so distant future, you can imagine, because you guys have demoed some of the stuff that we've done, a version of this where we're having a conversation like this, but maybe one of us isn't even here. They're just a hologram, and we have glasses, and there's the question of delivering a realistic sense of presence.
So there's a lot to talk about there. Mark Zuckerberg believes the future of human to AI interaction is going to be in the form of glasses. Part of it makes sense, right? You're wearing these glasses. They're capturing exactly what you're looking at and what you're hearing.
Those are two of the most important senses when it comes to feeding that into an AI model. And of course, he's bullish on that because he has a hit on his hands. As for the Meta AI Ray-Ban glasses, most people seem to really like them, myself included.
But here's the thing. They have one big flaw. You have to wear them, and you have to wear them all the time if you want to interact with AI all the time. And I don't wear prescription glasses. I wear sunglasses, and they're great outdoors. So if I'm driving or if I'm hanging out at the park, I'll put them on.
And great. I can take pictures, I can talk to my AI assistant, and that's wonderful, but I cannot imagine wearing them all day, let alone what am I supposed to do when I go indoors? Wearing glasses indoors is something that I've never done, so it's something I'd have to get used to.
And plus, it's just something you have to wear on your face all the time, and I just don't see that becoming a thing unless you wear prescription glasses and you kind of have to. Now, maybe they can make transition lenses that are tinted outside and then go clear inside, and that would make it a little bit better.
But the idea of having to wear glasses when I don't want to or don't have to just rubs me the wrong way. And I think a lot of people are going to feel the same. And again, I love my Meta Ray-Ban glasses, but at most I wear them for two or three hours per day total, and I want to have access to my AI all the time.
So I don't actually think glasses are the final form factor of interaction between human and AI. So here are a couple of other options for what I think it could be. One is AirPods, or something similar to AirPods, but again, you have to wear them all the time.
The nice thing is, at least in my opinion, they're much easier to wear for long periods of time than glasses are. But there are a few drawbacks. You can't project things onto the world in AR/VR, and at least currently, they don't have any way to see the world. They don't have a camera on them, but I suspect that's actually a solvable problem.
But again, you are still forced to wear these things all day long. And with such a small form factor, the battery life on them probably isn't going to be great, especially when you're having to add cameras to them. So what is that final form factor? Well, I actually think in the short and mid run, it might actually be a pin, something that you could just wear in your clothes that you forget about throughout the day.
It has a camera, it has sensors, it has voice capabilities, it has recording capabilities, but we haven't really seen a good implementation of that yet. There's the Humane AI Pin, and there are a few other pendants that you can wear around your neck, which I think is a good solution.
But the best part about all of those is that they kind of fade away throughout the day. You just forget that you're wearing them, and that is really important to me. But in the long run, I think the perfect form factor is none at all. I can't imagine what this actually would be, but something where you don't have to carry it around, you don't wear it, or it's something that you're already carrying around.
And it's very lightweight, but basically, you forget you have it. And of course, this thing has to be able to project things into the world so you can see it. It has to be able to see the world, hear the world, and all without really interrupting your day to day. So I don't know what exactly that device is, but if I could write a science fiction movie, it is just there.
It is just there for you. You don't actually have to have a hardware device. Maybe it's just something installed everywhere. I don't actually know. All right, let's keep watching.
Something magical in the realm of building social experiences around the feeling of human presence and, like, being there with another person and this physical perception, right, where we're very physical beings, right, people like to intellectualize everything, but a lot of our experience is very physical.
And this physical sense of presence that you are with another person doing things in the physical world, it's something that you're going to be able to do through holograms, through glasses, without being taken away from whatever else you're doing, just kind of have that mixed in with the rest of the world.
It's going to be, I think, the ultimate digital social experience. So I do agree. The ability to project things into the real world so they seem real to the human eye is incredibly important for this futuristic vision that he has. And really, what I think is going to come true as well.
But again, glasses just don't seem like the final form factor. And just last week, Snap, the company behind Snapchat, released their own AR glasses that, engineering-wise, seemed pretty darn cool, but they were extremely ugly, at least in my opinion.
So I can get behind wearing these glasses in the short term. I just, again, don't think it's the long-term solution. I know Apple, given their Apple Vision Pro, is probably working towards a smaller form factor glasses solution as well. So I think that's what a lot of companies are betting on.
But if I had to bet, for the long run, I'm going to bet that glasses are not the final form factor. And I think it's also going to be the ultimate incarnation of AI, because you're going to have conversations where it's like, all right, there's some people. It's like, maybe like, I'm physically here, there's a person, you're like a hologram there.
There's an AI that is kind of embodied as someone is there and the glasses will enable this. So, okay, so how are we going after this? Building this? This is like some huge project. We've been working on it for ten years and there are a lot of different challenges to solve.
To get there, you have to build a novel display stack. These aren't just screens like the kind that are in phones. There's this long lineage. They're connected to the screens that have been in TVs and monitors and things for a long time.
There's been this massive optimization of the supply chain. There's a brand-new display stack around holographic displays that basically needs to get created, and then it needs to be put into glasses, it needs to be miniaturized. And in the glasses you also need to fit chips, microphones, speakers, cameras, eye tracking to be able to understand what you're doing, batteries to make it last all day.
It's like new, novel RF protocols. Yeah, it's like, OK, it's a pretty big challenge. So we're like, all right, let's go try to go for the big thing. And we've been working on that for a while and we're pretty close to being able to show off the first prototype that we have of that.
And I'm really excited about that. At the same time, we also came at it from this line. All right, so just a little news there. He said they're really close to showing off this AR/VR glasses prototype that has Meta AI built into it, and now he's going to continue the story.
But talk about how the Meta AI glasses came to be. Because they weren't always Meta AI glasses. They were simply a collaboration between Meta and Ray-Ban to take pictures and basically post things to Facebook and Instagram and WhatsApp. They were not at first meant to be AI glasses.
And this is why Mark Zuckerberg is just so good, because he has these ideas and he pushes his team to execute them in such a short period of time. Let's listen. So that's a lot of new technology that needs to get developed, a lot to pack into a form factor, because the glasses have to be good-looking too.
So what if we just constrain ourselves to, like, we're going to work with a great partner, EssilorLuxottica. They make Ray-Ban, they make a lot of the iconic glasses. Let's see what we can fit into glasses today and make them as useful as possible.
And, you know, I actually, I kind of thought when we were getting started with those that it was almost like a practice project for the ultimate, which, let's be clear, that's what you thought Facebook was. That's true. That's true. Yeah, I did. Like, for your real startups.
That's true. Yeah, no, this is. Let's go on a tangent there for a second. So I started Facebook in school, came out to Silicon Valley with Dustin and a handful of people working on it at the time.
And we did that because Silicon Valley is where all the startups came from. And I remember we got off the plane, we were driving down 101. We're like, wow, eBay, Yahoo. This is amazing. All these great companies. One day maybe we'll build a company like this.
And I'd already started Facebook and I was like, surely the project that we're working on now is not a company. And Facebook had some scale at this point. Oh, no, no, it was a great project. I just didn't have the ambition to turn it into a company at the time. That just kind of happened.
But anyway, yeah, I mean, a lot of hard work, obviously, but I just, at the time, I was kind of like, yeah, I don't think this is it. Well, that's your answer of, would you have started? You actually didn't try to start Facebook. I didn't know.
So, yeah, so, I mean, the glasses, though, we thought that this was like, all right, we want to get working with EssilorLuxottica so we can start building more and more advanced glasses. And then they're really good. They look good. And then AI, like the massive transformation in AI. For listeners,
let's just be really clear. You guys shipped this product that I'm holding before LLMs, or at least before the public consciousness was aware of the ChatGPT moment. And these were not manufactured and shipped as an AI device; the AI came later, when they were already in market.
Yeah, a few years ago, I would have predicted that AR holograms would have been available before kind of full-scale AI, and now I think it's probably going to be the other order. So what he's saying, and very similar to the Apple Vision Pro, these are heavy goggles that you put on and you just don't wear all day.
And then Meta kind of came in from the bottom as well. They just had very simple glasses, no AR, no VR, just enough to take a picture. And I believe Apple has abandoned, at least for the time being, their work on the Apple Vision Pro, and instead is entering the market from the bottom as well.
They are creating very basic glasses and will add to their functionality over time. And I really like that approach. So you're basically going in from the top and from the bottom, and you're kind of seeing what's going to work. And then both companies are going to end up in the middle, with glasses that are incredibly lightweight, incredibly functional, and have a lot of AI built in.
But they're not goggles, and they're not just basic glasses. They're somewhere in the middle. So now it's like, all right, great. Well, this is actually a great product because it's got the cameras, so it can see what you see. It's got the microphone, it's got the speakers.
You can. And actually, let me say one more thing. One of the most genius moves that Mark Zuckerberg made here was working with EssilorLuxottica, the company behind Ray-Ban and a million other glasses brands. Basically, they own the entire market.
That's a whole other subject in itself. They are a complete monopoly in the glasses market. And he worked with them because they already had gorgeous glasses. So he thought, okay, how much can we fit into these existing beautiful glasses?
And then we'll go from there. I remember calling Alex Himel, the guy who runs the product group. And I'm like, hey, you know, I think we should probably pivot this and make it so that Meta AI is the primary feature of it.
And then, like, I remember I came in the next week and they built a prototype of it on Tuesday, and it was like, all right, good. Yeah, no, this is good. This is gonna be a very successful product. One week they had the glasses. The glasses had chips in them.
They had cameras, they had microphones, they had speakers. One week he just made a phone call and said, hey, I think this should be an AI product. And then within one week, they had a demo of it, which is just insane to think about.
And that is why being a founder led company is also so important, because founders can make those types of decisions and pivot an entire product line on a dime. And he was right. I mean, I'm going to say it again. I love my glasses, so let's keep watching.
He told us a much more high-stakes version of that story. I was on the highway with my kids, and I get this call on a Saturday from Mark, and he's like, those glasses, could we put Meta AI in them running on device and ship that soon so we can see if that's a good idea or not? Yeah, that tracks.
That's what I just said. Sounds right. Next, Mark is going to talk about Open source AI and why Meta went all in on Open source and what that means to Meta as a company and the ecosystem more broadly. We've also had an interesting relationship with this because sequentially, as a company, we came after Google.
So Google was the first of the great companies that built this Distributed computing infrastructure. So they came first. So they were like, all right, let's keep this proprietary because it's a big advantage for us. And then we're like, all right, we need that too, but we built it and then we're like, okay, not an advantage for us because Google already has that, so we might as well just make it open.
And by making it open, then you basically get this whole community of people building around it. So it wasn't going to help us compete with Google for any of the stuff that we were doing to have that technology. So what he just said is actually less important than what he didn't just say.
What he really meant to say is we open sourced it and then we took away Google's advantage. Not that we weren't going to compete with them, so we open sourced it and so we built a community around it. No, no, no. The reason they open sourced it is so that they could take away Google's advantage in Distributed computing.
And next, what he's going to explain is how he did the exact same thing to OpenAI. And it is such a ruthless strategy. I love it. I'm a hyper-competitive person and I just love to hear these types of stories.
And this is also known as the scorched earth strategy. So let's keep watching. But what we were able to do with things like open compute were get it to become the industry standard. So now you have like all these other cloud service platforms that basically use open compute.
And because of that, the supply chain is standardized around our designs, which means that there's way more supply and it's way cheaper to produce. We've saved billions of dollars and the quality of the stuff that we get to use goes up. So, all right, that's like a win-win.
But I think in order for this to work, we do a lot of Open source stuff. We do a lot of closed source stuff. I'm not like a zealot on this. I think Open source is very valuable, but I also think it sort of makes sense for us because of our position in the market, and the same for AI around Llama.
Okay, this is where we were going with this. Yeah, it's a similar deal. We want to make sure that we have access to a leading AI model, just like we want to build the hardware, so that we can build the best social experiences for the next 20 years.
I don't think that for us, it's like, we've just been through too much stuff with the other platforms to fully depend on them. And we're a big enough company at this point that we don't have to. We can build our own core technology platforms, whether that's going to be AR glasses or mixed reality or AI.
So I think that's somewhat of an Imperative for us to go do that. But these things are not like pieces of software that are Monolithic. They're ecosystems. They get better when other people use them. So for us, there's a huge amount of good, and it philosophically lines up with where we are.
Where, like, I mean, look, I definitely firsthand have a lot of experiences where we were trying to build stuff on mobile platforms, and the platforms were just like, nah, you can't build that. That's frustrating.
So that is the exact lesson that he learned on iOS and Android, but mostly iOS, because of Apple. Apple just decided one day that Facebook shouldn't be able to track users across apps, or at least that users should have the option to turn that off. And it really hurt Meta's bottom line for a short period of time.
They recovered, definitely, but it was just a single decision that Apple made. And again, Meta just had complete Platform risk, and he will avoid that at all costs. And the cost seems to be hundreds of billions of dollars because they're building their own AI and open sourcing it. They're building their own glasses.
They're building their own AR/VR goggles. So he learned his lesson, and now all of us benefit. So there it is. Zuck believes the future of AI to human interaction is glasses. What do you think? Do you think glasses are the future of AI interactions?
Artificial Intelligence, Technology, Innovation, Mark Zuckerberg, Human-Computer Interaction, Open Source AI