Wednesday, September 17, 2025, may come to mark the date smart, AR-powered glasses became broadly accessible to the public. Meta’s CEO Mark Zuckerberg introduced the new Meta Ray-Ban Display glasses during a live keynote in the USA. The core claim? The new glasses could bridge the gap between current smart glasses and real augmented reality (AR) (Soni & Wang, 2025). But Meta’s strategic interests appear to extend beyond product excellence in smart glasses.
The central technical innovation is a built-in display that appears to float a few feet in front of the wearer. The display is visible only to the user and, as Zuckerberg stated, it might be the first form factor in which AI can see what you see, hear what you hear, and interact with you throughout the day. For interacting with the AI, the glasses introduce another new idea: an accompanying wristband that detects muscle signals, letting users control the AR display discreetly with subtle hand gestures. This is a new form of hands-free interaction, as it relies on neither voice nor text input (Soni & Wang, 2025).
Providing AR-powered smart glasses is, of course, not Meta’s only objective. The company also wants to strengthen its platform dominance by integrating the glasses into its ecosystem. Users can make video calls, reply to messages, scroll through Instagram Reels, or take and post pictures in real time. The apps behind these actions are WhatsApp, Messenger, and Instagram, all part of Meta’s ecosystem. The glasses use Meta’s AI model to provide context-aware information and generate real-time answers to users’ questions. Being able to perform all these interactions hands-free, with an AR-powered display directly in your field of view, might boost the adoption rate of smart glasses (Song, 2025).
So what is Meta’s main objective with this smart-glasses push? In the end, it might be about the data the glasses capture and how that data can improve the development of Meta’s AI models. For years, human interaction with AI was mainly text-based. Advances in emerging technologies now make hands-free interaction possible, and having the best voice- or vision-based AI models on the market might be the next competitive advantage Meta is looking to gain. Additionally, user adoption of Meta’s smart AR glasses would make users even more dependent on Meta’s ecosystem, which in turn generates more data to fuel Meta’s platform-based business model (Liang et al., 2022).
To sum up: while these innovative AR-powered smart glasses are exciting for us as users, one should recognize the value of the data they collect in a world where the whole tech industry is racing to build the best AI models possible.
References:
Soni, A., & Wang, E. (2025, September 18). Meta launches smart glasses with built-in display, reaching for “superintelligence”. Reuters. https://www.reuters.com/business/media-telecom/meta-launches-smart-glasses-with-built-in-display-reaching-superintelligence-2025-09-18/
Song, V. (2025, September 18). I regret to inform you Meta’s new smart glasses are the best I’ve ever tried. The Verge. https://www.theverge.com/tech/779566/meta-ray-ban-display-hands-on-smart-glasses-price-battery-specs
Liang, W., Tadesse, G. A., Ho, D., Fei-Fei, L., Zaharia, M., Zhang, C., & Zou, J. (2022). Advances, challenges and opportunities in creating data for trustworthy AI. Nature Machine Intelligence, 4, 669–677. https://doi.org/10.1038/s42256-022-00516-1