The battle between smart glasses and the smartphone
It's no longer science fiction: major brands have made their move and are aiming for a more natural computing experience, one that lives in our eyes and ears without chaining us to a screen in our hand. The debate is serious because mobile phones haven't made any disruptive leaps in a decade, while smart glasses open up new forms of interaction and new contexts of continuous use.
Since the launch of the iPhone in 2007, smartphones have dominated personal computing. They remain at the heart of the digital ecosystem, but their recent value proposition has been limited to better cameras, thinner screens and, lately, AI features that are often more slogan than real change. Meanwhile, smart glasses have gone from clumsy prototypes to products that, although immature, already solve everyday use cases.
Context: From ubiquitous mobile to forward-looking computing

We look at our phones with our heads down at home, on the street, and at work, and that's not exactly healthy. The logical alternative is for information to appear where we look, without diverting attention from our surroundings. This is where glasses come in, not the bulky mixed reality headsets designed for short sessions or complex scenarios like games and movies.
The winning format is small frames that pass for real glasses. The rest are headsets for sessions limited by weight and ergonomics. Even Apple's project to replace the PC with a premium headset isn't going as well as hoped: the concept is dazzling, but continuous use is hampered by cost and comfort.
In this shift towards the everyday, Meta stands out with its Ray-Bans and, above all, Google with Android XR. Their idea is crystal clear: if you want a device you'll wear all day, it should look and feel like normal glasses. No disguises.
Add to this a paradigm shift in AI: conversational models want to see and hear what you see and hear. An assistant that perceives the camera and the screen opens up hands-free, contextual experiences that don't fit the same way on a phone.
Google, Android XR and the bet on lightweight glasses
Google showed at its most recent I/O that Android XR isn't just for headsets: the aim is to bring the system to conventional frames and place an assistant in your field of vision. The vision is ambitious: free you from your phone, allow more natural interactions with the environment, and have AI act as your co-pilot.
The company claims to have been working on this concept for more than a decade, and the technology is finally maturing. What's needed? A camera, microphone, speakers, and a connection to your phone. With that foundation, Android XR can already run your apps and services without turning the glasses into a technological brick. The more advanced models add a display on the lens to show data privately.
What can be done now? Send messages, ask Google Maps for directions, take photos that go straight to Google Photos, or translate conversations in real time while you're talking to someone. The pattern is clear: the system sees and hears the same things you do in order to assist you without interrupting.
The key to success is not just the technology, but also the ecosystem and the style. Google has announced collaborations with Gentle Monster and Warby Parker to create designer frames, and its alliance with Samsung aims to go beyond visors and also reach traditional glasses.
Between headsets and glasses: Vision Pro, Quest 3, XREAL and VITURE
If we think about the experience closest to the ideal of mixed reality that everyone imagines, the Apple Vision Pro is at the top: virtual objects anchored in space, occlusion with real elements, realistic shadows, and interaction that feels natural.
But there are significant trade-offs: the Vision Pro uses camera passthrough video, so there's a slight lag which makes activities like cycling or playing tennis a bad idea. Plus, it's expensive, bulky, and heavy to wear for hours.
The Meta Quest 3 headset is lighter and more affordable, but it doesn't quite reach the precision of the Vision Pro. The passthrough view has lower resolution and some distortion: objects in the real world are not always exactly where your perception says they should be.
Another category is glasses that connect to a phone or PC and act as an external display, like XREAL or VITURE. The content is projected using prisms, centered in your field of vision, and you can adjust the opacity to see more or less of the real world. The brightness is also adjustable, and distortion of the real world is virtually nonexistent because you see it directly.
- Featured devices: Samsung Gear VR (uses a Galaxy as a screen), HTC Vive XR Elite (high-end VR with PC), Oculus Quest 2 and Quest Pro (standalone wireless), Sony PlayStation VR (for PS4), Microsoft HoloLens 2 (professional mixed reality) and Pico 4 VR (business and educational focus).
- On the Apple front, beyond what has already been mentioned, its ecosystem revolves around high-resolution displays, eye and gesture tracking, spatial audio, ARKit/RealityKit, a constellation of cameras and sensors such as LiDAR, and voice control with Siri.
Consumer glasses: Ray-Ban, Oakley Meta, and a bracelet that reads your gestures
Meta has hit the jackpot in the consumer market by combining fashion and technology. The Ray-Ban Meta glasses look like ordinary glasses, but they integrate cameras, a microphone, and speakers to record video and audio, take photos with your voice, discreetly listen to music or podcasts, and even translate conversations on the fly. In some markets, they connect to advanced AI assistants like Meta AI.
We have also seen proposals like the Oakley Meta, which embrace the same approach with a sportier design, incorporating hands-free capture, voice control, and synchronization with assistants. They are already being used to support events, replacing teleprompters and translation systems, although it's worth remembering that these tools are meant to enhance our abilities, not replace them.
Meta has also shown off glasses with a very discreet display on the lens: they project notifications, WhatsApp messages, turn-by-turn directions, or translations just below the main field of vision, and disappear when not needed. To control them, Meta has created the Neural Band, a wristband that interprets muscle impulses at the wrist and converts them into gestures.
The battery is estimated at around 6 hours of continuous use, with a case that adds up to 24 hours. The demo wasn't without connection issues, but the roadmap is clear: launch in the United States first, followed by international expansion. In certain countries, some AI features may take longer to arrive due to regional support limitations.
Interaction, sound and screens: talking, gesturing and watching without distractions
Smart glasses explore more natural forms of interaction: voice commands for quick alerts, subtle gestures read from the wrist, and cameras to understand the visual context. The learning curve is shorter than with a headset because the device doesn't isolate you; it complements what you're looking at.
For audio, bone conduction lets you listen without earbuds by transmitting vibrations through the skull. It's not audiophile-grade hi-fi, but it handles calls, directions, and voice assistant responses without blocking your ears.
The display in front of your eyes is reminiscent of a car HUD: essential data hovering at the edge of your attention, without monopolizing your gaze. It's surprising that, after Google Glass, this technology has evolved more slowly than expected, but the leap in integration and battery life is underway.
Specific applications: walking around the city, you could see restaurant reviews, the history of a building, or the availability of a nearby store. For leisure, a private screen that's always with you for videos and games opens up uses that a phone can't replicate as conveniently.
Connectivity, convenience, and social acceptance
With 5G and eSIM, glasses can stay permanently connected without relying so heavily on your phone, making it easier to work, learn, or socialize anywhere. The difference with a smartphone is that distractions are reduced: you don't have to take anything out of your pocket or look down.
Furthermore, glasses are a culturally accepted object. For those who already wear frames, carrying technology in the temples feels more natural than holding a phone all day. In sports and physical activities, they weigh less and are less cumbersome than looking at a screen.
Market and supply chain: three product routes and a fierce race
Today, three major categories of AI glasses coexist, ordered by complexity: 1) AI audio, with microphones and speakers for voice input and audio output; 2) AI camera, which adds optical sensing to the above; and 3) AI + AR, with displays for visual output in addition to audio.
Prices also follow a gradient: AI audio models sit in the 1,000–1,500 yuan range, AI camera models go from 1,500 to 2,000 yuan, and AI + AR models are priced above 3,000 yuan. This stratification reflects the technical difficulty and the costs of optics, sensors, and power.
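Purely as an illustration, the tiering above can be written down as a small lookup. This is a sketch based only on the categories and price bands quoted in this section; the function name and signature are invented for the example:

```python
# Illustrative only: maps the three AI-glasses categories described above
# to the price bands quoted in this section (in yuan). Not an official taxonomy.
def classify(has_camera: bool, has_display: bool) -> tuple[str, str]:
    if has_display:
        return ("AI + AR", "above 3,000 yuan")
    if has_camera:
        return ("AI camera", "1,500-2,000 yuan")
    return ("AI audio", "1,000-1,500 yuan")

print(classify(has_camera=True, has_display=False))  # ('AI camera', '1,500-2,000 yuan')
print(classify(has_camera=True, has_display=True))   # ('AI + AR', 'above 3,000 yuan')
```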
The value chain adds new links to the world of traditional optics: chips, waveguide modules, audio, sensors, power and thermal management, as well as hardware-software integrators, AI model providers, and system and application solutions.
Whoever masters performance and manufacturing will come out on top. For every 1% improvement in waveguide performance, the total cost of the device can drop by $3 to $5. When mass-market micro-LED technology matures, the optical engine could fall from $80 to $40.
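To make those figures concrete, here is a back-of-the-envelope calculation that reuses only the numbers quoted above; assuming the $3–5 saving scales linearly and picking a hypothetical 10% improvement are simplifications of this sketch:

```python
# Back-of-the-envelope using the figures quoted above (assumed to scale linearly).
saving_per_point = (3, 5)     # $ saved per 1% waveguide improvement
improvement = 10              # hypothetical 10% improvement, for illustration
low, high = (improvement * s for s in saving_per_point)
print(f"A {improvement}% waveguide gain could trim ${low}-${high} off the device cost")

optical_engine_now, optical_engine_mature = 80, 40   # micro-LED maturity estimate
print(f"Optical engine: ${optical_engine_now} -> ${optical_engine_mature} "
      f"({100 * (1 - optical_engine_mature / optical_engine_now):.0f}% cheaper)")
```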
The Chinese market is booming: in January and February 2025 alone, domestic online sales of AI glasses grew 188.5% year on year. If 2024 was the first year of AI glasses, 2025 is shaping up to be a battle of a hundred frames, with ODM/OEM companies like Goertek, Luxshare, Huaqin, and Longcheer securing capacity for multi-million dollar orders.
In AR technology, key players include the 'Big four' (Rokid, RayNeo, XREAL, and INMO), mobile giants anticipating a second growth spurt (Xiaomi, Huawei, Samsung), and internet giants (Baidu, ByteDance, Alibaba) pushing AI models and cloud services. Traditional optical brands, with thousands of stores, will be key in optometry and eyeglass fitting.
If domestic shipments of AI glasses reach 3.5 million units in 2025 with a 30% replacement rate, they will make a real dent in the millions of conventional glasses sold today, and optics, chips, and AI models will cross a critical point. There won't be a single winner, but rather a migration of habits: more than 50% of information gathering could move to those few square centimeters in front of our eyes.
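For a sense of scale, a quick calculation with the figures above; reading the 30% replacement rate as the share of AI-glasses units that directly displace a conventional pair is an assumption made for this sketch:

```python
# Quick scale check using the figures above. Interpreting the 30% replacement
# rate as "share of AI-glasses units that displace a conventional pair" is an
# assumption made here for illustration.
shipments_2025 = 3_500_000     # projected domestic AI-glasses shipments
replacement_rate = 0.30

displaced_pairs = shipments_2025 * replacement_rate
print(f"Roughly {displaced_pairs:,.0f} conventional pairs displaced")  # ~1,050,000
```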
The evolution of capabilities can also be classified into levels from L1 to L5 (super-intelligent entity). The industry has completed the basics (L1) and is moving towards L2, with plans to reach L4 around 2027. As XREAL itself points out, the big challenge at L4 is balancing computing, features, and battery life in a design you'll still want to wear.
Business models: hardware, services and verticals
Today, the bulk of revenue comes from hardware sales. With margins constrained by the supply chain (the optical waveguide accounts for a significant portion of the cost) and brand premiums, supplier optimization will be key to lowering prices.
In B2B, vertical solutions are flourishing: civil aviation inspection with AI glasses and packaged hardware-plus-software services; surgical navigation or virtual labs with rental or usage fees; and retail with virtual try-ons and guided tours, charging for integration and maintenance.
There's also room for cloud-based revenue: photo and content storage with service fees (for example, charges for storing material on servers). In the future, an app store for AI glasses would allow revenue sharing with developers, as is already the case with mobile devices.
Marketing and product teams have a task ahead: integrating AR into customer experiences and operations. AR QR codes bring 3D content to any surface; today they are scanned with a phone, tomorrow with the glasses themselves, enabling demonstrations, virtual tours, and new ways of building loyalty.
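As a concrete, minimal illustration of that workflow, the sketch below generates a printable QR code pointing at a hosted 3D asset that a phone (or eventually the glasses) could open in an AR viewer. It uses the Python qrcode package; the URL and file names are made-up placeholders, not a real service:

```python
# Minimal sketch: a printable QR code that links to a 3D asset for an AR demo.
# Requires: pip install "qrcode[pil]". The URL below is a made-up placeholder.
import qrcode

MODEL_URL = "https://example.com/models/store-demo.glb"   # hypothetical 3D model

img = qrcode.make(MODEL_URL)     # build the QR code as an image
img.save("ar_demo_qr.png")       # print it on packaging, posters, or shelving
print("Saved ar_demo_qr.png pointing to", MODEL_URL)
```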
Challenges: privacy, energy, and ecosystem maturity
Privacy is a priority. In the EU, cameras in glasses are categorized as high risk under the GDPR, with penalties that can reach 4% of global revenue for violations. In China, the Personal Information Protection Law specifically regulates the use of wearables.
Energy is another obstacle. Intensive use can drain the battery quickly: Meta's Ray-Ban glasses last around 30 minutes at full power, and while dual-chip solutions extend that to 2 hours, they increase the bill of materials by about $15. Here, ingenuity in efficiency is key.
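Using only the figures in this paragraph, a rough cost-per-runtime estimate looks like this; treating both runtimes as comparable "full power" numbers is an assumption of the sketch:

```python
# Rough cost-of-runtime estimate from the figures quoted above; treating both
# runtimes as comparable "full power" numbers is an assumption for illustration.
single_chip_hours = 0.5    # ~30 minutes at full power
dual_chip_hours = 2.0      # with a dual-chip design
extra_bom_cost = 15        # added bill-of-materials cost in dollars

gained = dual_chip_hours - single_chip_hours
print(f"{gained:.1f} extra hours for ${extra_bom_cost} "
      f"(~${extra_bom_cost / gained:.0f} per added hour)")   # ~$10/hour
```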
The third sticking point is the application ecosystem. Without compelling use cases, there's a risk of repeating the mistakes of some smartwatches that people abandoned after three months. That's why Meta and Huawei, among others, are integrating deeply with their operating systems and offering seamless functionality across devices to retain users.
In terms of safety and ergonomics, video passthrough latency sets clear limits: walking is fine; cycling or fast-paced sports are best avoided. It's also essential that the glasses don't get in the way or attract excessive attention if you want to wear them all day.
The sector's direction is clear: a gradual transition from audio and camera glasses to models with AR displays, while optics, chips, and AI models mature. Smartphones aren't going away, but many of the queries and micro-interactions we currently perform with our heads down will shift to frames that face forward.