Smart Glasses: AI in the Corner of Your Eye in 2026 (Full Transcript)

A demonstration of the Meta Ray-Ban display glasses: the display, EMG gestures, handwritten messages, the camera and privacy safeguards. Adoption remains uncertain.

[00:00:00] Speaker 1: I've just sent a WhatsApp message by writing on the table. It's a new feature on the Meta display glasses. And could this be the year that smart glasses finally take off? On these ones, I can even watch my TikToks. There's certainly no shortage of companies trying to crack the market. They come in different shapes and sizes. Some are audio based, while others now include displays on the lenses. Now I'm going to play a skiing game. Oh, OK. So as I move this way, I'm skiing in the opposite direction. Oh no, I'm crashing into literally all the signs. Uptake of smart glasses is still quite slow. And perhaps one of the biggest barriers is convincing us that they're fashionable. Meta is trying to do just that. It has partnered with the company behind Ray-Ban and Oakley to bring its tech to our eyes.

[00:01:02] Speaker 2: It was really hard for us to try to build a pair of glasses that are authentically a Ray-Ban that have a display that's bright enough that you can look at it even in the sunlight, that's high resolution enough that you can read message content, that doesn't leak light out. So when you're reading it, no one can see what you're reading. And that all fits into this form factor at a light enough weight that's comfortable to wear.

[00:01:28] Speaker 1: So far, these glasses are only on sale in the US, with supply issues meaning their rollout to other countries has been delayed. And for me, that means this is my first try of this tech.

[00:01:39] Speaker 3: So you're going to take your thumb and your middle finger and double tap.

[00:01:44] Speaker 1: They do take a little bit of getting used to. There we are.

[00:01:47] Speaker 3: The display just popped up. You're on the settings page.

[00:01:52] Speaker 1: The display is appearing in front of my right eye. It's been designed that way to avoid obstructing my view.

[00:01:59] Speaker 3: It works using a technology called electromyography. The sensors on the inside of the band pick up the electrical signals in your wrist that control different gestures. So when you take your thumb and your index finger and you tap, then you're selecting. We know how to read that signal, and it changes what's on your glasses.
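
As an illustration of what such a gesture pipeline can look like, here is a minimal Python sketch: window the raw wrist-sensor signal, reduce each window to a per-channel energy feature, and map the feature vector to a gesture label. The channel count, gesture names and nearest-centroid classifier are all illustrative stand-ins, not Meta's actual model.

```python
# Illustrative sketch only -- not Meta's actual EMG pipeline.
import numpy as np

GESTURES = ["rest", "thumb_middle_double_tap", "thumb_index_tap"]
N_CHANNELS = 8  # hypothetical number of EMG electrodes on the band

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per channel for one window of shape (samples, channels)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(window: np.ndarray, centroids: np.ndarray) -> str:
    """Nearest-centroid classifier: a stand-in for a trained gesture model."""
    feats = rms_features(window)
    distances = np.linalg.norm(centroids - feats, axis=1)
    return GESTURES[int(np.argmin(distances))]

# Fake per-gesture feature centroids, as if learned from calibration recordings.
rng = np.random.default_rng(0)
centroids = np.abs(rng.normal(size=(len(GESTURES), N_CHANNELS)))

window = np.abs(rng.normal(size=(200, N_CHANNELS)))  # 200 samples x 8 channels
print(classify(window, centroids))
```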

[00:02:18] Speaker 1: And perhaps the coolest feature that's just been added for users: writing out message replies. OK, so I've had a message: "Hi, how was CES?" So let's say it was... great. It fills up.

[00:02:36] Speaker 3: You are very fast. And you just learned.

[00:02:39] Speaker 1: So my message has appeared on the screen. It's perfect. There's no spelling mistake or anything. And I have terrible handwriting.

[00:02:48] Speaker 3: I mean, that's what made this such an amazing technical challenge: everyone has such personal ways of writing. We had to train the model to successfully pick up what you're writing, even when you're in a hurry, and even though the style is different for every single person.
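
One common way to decode a continuous handwriting signal into text is a network that emits per-timestep character probabilities, collapsed CTC-style (merge repeats, drop blanks). The sketch below is purely illustrative: the hand-built probability matrix stands in for a trained model's output, and nothing here reflects Meta's actual implementation.

```python
# Illustrative CTC-style greedy decode -- not Meta's handwriting model.
import numpy as np

ALPHABET = ["-", "g", "r", "e", "a", "t"]  # "-" is the CTC blank symbol

def greedy_ctc_decode(probs: np.ndarray) -> str:
    """Argmax per timestep, collapse repeated symbols, drop blanks."""
    best = np.argmax(probs, axis=1)
    out, prev = [], None
    for idx in best:
        if idx != prev and ALPHABET[idx] != "-":
            out.append(ALPHABET[idx])
        prev = idx
    return "".join(out)

# Hand-built per-timestep probabilities spelling out "great",
# standing in for what a trained network would emit.
frames = "ggrr-eeaatt"
probs = np.zeros((len(frames), len(ALPHABET)))
for t, ch in enumerate(frames):
    probs[t, ALPHABET.index(ch)] = 1.0

print(greedy_ctc_decode(probs))  # -> "great"
```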

[00:03:03] Speaker 1: Other features include maps, translation and a teleprompter.

[00:03:08] Speaker 2: And that's useful if you're giving a talk. It's useful if you're doing an interview like this. We've got to get you a pair.

Speaker 1: Does that mean you've got a script on your glasses right now?

Speaker 2: I'm authentically talking to you right now. But if I forget something, I will probably get a message which will show up in my field of view, and I'll be able to read it.

[00:03:24] Speaker 1: People find that at the moment they have a lot of distractions in their lives. And putting the phone in your pocket because you're going into an important meeting, or to the cinema, or somewhere else, is actually something people value. Do you think people really want constant interruptions in their life?

[00:03:38] Speaker 2: I completely agree with you. Internally, we talk about helping people be present, yet connected, as the primary thing we want to do. And right now, we've been really selective about which notifications can come through. You can also set it so that the display does not turn on when a notification comes in, so you're not alerted to it in your field of view.
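
The policy described above can be pictured as a small settings object: an allow-list of apps plus a flag controlling whether the display wakes. This is a hypothetical sketch; the field names and behaviour are assumptions, not Meta's actual settings API.

```python
# Hypothetical sketch of a notification policy -- not Meta's settings API.
from dataclasses import dataclass, field

@dataclass
class NotificationPolicy:
    # Only these apps may notify at all (assumed names for illustration).
    allowed_apps: set = field(default_factory=lambda: {"WhatsApp", "Messages"})
    # If False, notifications queue silently instead of lighting the display.
    wake_display: bool = False

    def handle(self, app: str) -> str:
        if app not in self.allowed_apps:
            return "suppressed"
        return "shown in display" if self.wake_display else "queued, display stays off"

policy = NotificationPolicy()
print(policy.handle("WhatsApp"))  # queued, display stays off
print(policy.handle("NewsApp"))   # suppressed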

[00:03:59] Speaker 1: One of the main features across Meta's smart glasses and those from many other companies is a camera. It's part of interacting with AI-based search, but also for capturing what we're doing.

[00:04:12] Speaker 3: Pinch your finger and rotate like a radio dial and you can zoom in and out to get that perfect photo.

[00:04:21] Speaker 1: Okay, now I'm zooming in and out, like turning a dial. And snapping a picture.

[00:04:28] Speaker 3: And snapping a picture. You got a perfect one of Martin, good job.
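
The dial gesture just demonstrated amounts to mapping a wrist-rotation delta onto a clamped zoom level. A minimal sketch, with made-up constants rather than Meta's actual tuning:

```python
# Sketch of the dial-style zoom gesture (hypothetical values, not Meta's firmware).
MIN_ZOOM, MAX_ZOOM = 1.0, 3.0
DEGREES_PER_ZOOM_STEP = 90.0  # a quarter turn spans one zoom unit

def update_zoom(zoom: float, angle_delta_deg: float) -> float:
    """Map a rotation delta (degrees) onto the zoom level, with clamping."""
    zoom += angle_delta_deg / DEGREES_PER_ZOOM_STEP
    return max(MIN_ZOOM, min(MAX_ZOOM, zoom))

zoom = 1.0
for delta in (45.0, 45.0, 30.0, -120.0):  # simulated pinch-and-rotate stream
    zoom = update_zoom(zoom, delta)
    print(f"rotated {delta:+.0f} deg -> zoom {zoom:.2f}x")
```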

[00:04:33] Speaker 1: One of the big concerns people have with your glasses is the cameras, and whether they're recording people without them knowing about it.

[00:04:40] Speaker 2: Well, it's really important to us that not just do you like how the glasses look and how you feel wearing them and that you get value out of them, but equally important is that the people around you feel comfortable that you're wearing them, because if they don't, you're going to stop wearing them. On the right side is the privacy LED. When you take a photo, it flashes once. When you take a video, it continuously pulses for the duration of the video. We actually have a light detector on the LED, so when you activate the camera, we detect whether light is able to enter that LED. If it's not, it means someone's trying to cover it, and the camera doesn't work.
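
The anti-occlusion check described above boils down to: on camera activation, confirm that light still reaches the detector co-located with the privacy LED, and refuse to record otherwise. A minimal sketch with hypothetical sensor reads and thresholds; the real firmware logic is not public:

```python
# Sketch of the tamper check described above -- hypothetical values throughout.
OCCLUSION_THRESHOLD = 5.0  # assumed lux cutoff below which the LED is "covered"

def read_led_light_sensor() -> float:
    """Stand-in for the light detector co-located with the privacy LED."""
    return 42.0  # pretend ambient light is reaching the sensor

def activate_camera() -> bool:
    ambient = read_led_light_sensor()
    if ambient < OCCLUSION_THRESHOLD:
        print("Privacy LED appears covered; camera disabled.")
        return False
    print("Privacy LED visible; recording allowed.")
    return True

activate_camera()
```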

[00:05:15] Speaker 1: My demo of these glasses was only brief, but from the many different types I've now tried, it does feel like we're on the cusp of smart glasses becoming a part of our lives. Consumers just need to be convinced they offer something we can't get from a phone or a smartwatch.

[00:05:32] Speaker 2: We believe that wearables are the best AI form factor because they're able to have the context of where you are, what you're seeing, what you're hearing, and they're the most accessible because they're already on you. The AI is going to get a lot better in this coming year. You should see a lot more coming from us.

AI Insights

Summary
The transcript presents a demonstration of smart glasses (notably the Meta Ray-Ban display glasses) and asks whether they will see wider adoption in 2026. The journalist tests a display in the field of view, gesture control via electromyography (wrist sensors) and a new feature that converts messages "written on the table" into text, as well as uses such as maps, translation, a teleprompter and games. The speakers explain the technical challenges (brightness in full sunlight, resolution, no light leakage, weight, integration into a Ray-Ban design) and the current limits (rollout delayed outside the United States). The discussion also covers the privacy concerns raised by the camera and the signalling mechanisms (privacy LED, occlusion detection). Finally, Meta argues that glasses are the "best form factor" for AI thanks to context (what the user sees and hears) and promises rapid AI progress and new features, while acknowledging the need to convince consumers that they offer more than a phone or a watch.

Title
Smart glasses put to the test: promises, uses and privacy

Keywords
smart glasses, Meta Ray-Ban, augmented reality, on-lens display, electromyography (EMG), gestures, handwriting, notifications, teleprompter, translation, maps, camera, privacy, privacy LED, contextual AI, adoption, fashion

Key Takeaways
  • Smart glasses are maturing, with displays readable in sunlight and gesture-based interfaces.
  • Wrist-based EMG control lets you navigate and select without taking out your phone.
  • Message entry by "writing" (on a surface or in the air) is a standout new feature, requiring a model that adapts to individual handwriting.
  • Use cases highlighted: messages, TikTok, games, maps, translation and a teleprompter.
  • Adoption is still held back by fashion, the value proposition versus a smartphone or watch, and availability (limited rollout).
  • The camera is central (AI, capture) but raises privacy concerns; Meta points to a privacy LED and an anti-occlusion mechanism.
  • Meta is betting on embedded, contextual AI as the main driver of the next wave of glasses.

Sentiments
Neutral: The tone is broadly informative and exploratory: measured enthusiasm for the features (writing, display, use cases), counterbalanced by reservations about adoption, distraction and privacy (cameras, bystanders' consent).