Comparing AR and VR in Beauty Industry: the L’Oréal Case

19 September 2025


In the age of digital transformation, many companies have started implementing generative AI in their strategies and business models. Terms such as AR and VR have become markers of trendiness and competitiveness. One example is L’Oréal, one of the world’s leading companies, which has recently implemented AR and VR technologies in its business.

Even though these terms are heard frequently, the difference between AR and VR is worth clarifying to facilitate the reading. Augmented Reality (AR) starts with physical things and digitally displays what is physically present, through a smart device, using the data stored in the digital-twin cloud (Porter & Heppelman, 2025). Virtual Reality (VR) creates an entirely new virtual experience using computer-generated images and a headset: you step directly into another reality, with full sensory and visual engagement (Coursera, 2025).

So, how did L’Oréal implement these new digital technologies?

As the leading company in the beauty industry, L’Oréal owns a portfolio of thirty-two diverse and complementary brands (Mechdyne, 2017) and generated sales of over forty-three billion euros in 2024 (L’Oréal Finance, 2024). L’Oréal products are present in multiple physical distribution channels, such as mass market, department stores, pharmacies, and hair salons, as well as in e-commerce (Mechdyne, 2017).

A few years ago, L’Oréal, in collaboration with Google Cloud and Capgemini, implemented digital twins of its products, scannable via a QR code found on the physical item (Capgemini, 2022). With this digital-twin implementation, L’Oréal aims to improve customer relationships and loyalty by increasing its transparency, offering personal education and guidance, and building brand trust (Capgemini, 2022). This also creates a better customer experience and adds value to the company by strengthening its digital business model and giving it a competitive advantage.
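As a rough illustration of how a scanned QR code could resolve to a digital-twin record, here is a minimal Python sketch. The schema, SKU, and product data are all invented for illustration; L’Oréal’s actual cloud implementation is not public.

```python
# Hypothetical sketch: resolving a product QR payload to a digital-twin record.
# All names and data are illustrative stand-ins, not L'Oréal's real schema.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class ProductTwin:
    sku: str
    name: str
    ingredients: list[str]
    sourcing: str
    tips: list[str] = field(default_factory=list)

# Stand-in for the digital-twin cloud, keyed by the QR payload (here, a SKU).
TWIN_CLOUD = {
    "SKU-001": ProductTwin(
        sku="SKU-001",
        name="Hydrating Serum",
        ingredients=["aqua", "glycerin", "hyaluronic acid"],
        sourcing="EU-sourced, cruelty-free",
        tips=["Apply to damp skin", "Use before moisturiser"],
    ),
}

def scan_qr(payload: str) -> ProductTwin | None:
    """Return the digital-twin record for a scanned QR payload, if any."""
    return TWIN_CLOUD.get(payload)

twin = scan_qr("SKU-001")
if twin:
    print(twin.name, "-", ", ".join(twin.ingredients))
```

The point of the twin is that the same record backs every touchpoint: the label, the app, and customer support all read from one cloud source.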

By scanning the QR code, different product features appear, namely ingredients, formula, and sourcing (Capgemini, 2022), and customers gain access to further information such as personalised professional tips (Capgemini, 2022). Furthermore, L’Oréal, in collaboration with NVIDIA and Accenture, recently founded Noli.com, the first AI-powered multi-brand marketplace focused on beauty products (Martin, 2025). Using AR features, customers can scan their face, and the platform builds a so-called “BeautyDNA” from a large amount of skin data and analyses of product formulas (Martin, 2025). It then suggests the ideal combination of products, personalised for each customer, and presents alternative options with their associated benefits so that the customer can choose which product to buy (Martin, 2025). Users can also chat iteratively with the app and access information about ingredients and their chemical formulas (Martin, 2025). In addition, tutorials on how to apply the products, the option to purchase directly from the app, and customer reviews are available (Martin, 2025).
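The matching step behind a “BeautyDNA”-style recommendation can be imagined as scoring products against a user’s skin profile. The sketch below uses a toy overlap score with invented product names and needs; the real Noli.com pipeline is proprietary and far more sophisticated.

```python
# Toy "BeautyDNA"-style recommender: rank products by how much of the user's
# skin profile each formula claims to address. Products and needs are invented.

def match_score(skin_needs: set, product_targets: set) -> float:
    """Fraction of the user's skin needs that a product claims to address."""
    if not skin_needs:
        return 0.0
    return len(skin_needs & product_targets) / len(skin_needs)

CATALOG = {
    "Serum A": {"hydration", "redness"},
    "Cream B": {"hydration", "anti-aging"},
    "Gel C": {"oil-control"},
}

def recommend(skin_needs: set, top_n: int = 2) -> list:
    """Return the top_n (product, score) pairs, best match first."""
    ranked = sorted(
        ((name, match_score(skin_needs, targets)) for name, targets in CATALOG.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return ranked[:top_n]

print(recommend({"hydration", "redness"}))
```

Presenting several ranked options with their scores, rather than a single answer, mirrors the platform’s approach of letting the customer make the final choice.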

Internally, L’Oréal has incorporated VR in its labs in Paris and New York. This allows the company to prototype future store concepts, evaluate packaging, and create store layouts before physical development begins. L’Oréal offers this capability to its retail partners as well, giving them a better idea of the products and shelves before implementation starts (Mechdyne, 2017).

What is next?

CreAITech, an AI-powered beauty content lab launched by L’Oréal last year, has already been applied to La Roche-Posay and Kérastase to generate content and beauty images (Dominguez, 2024). The platform is used for marketing, image creation, accelerating product launches, and saving time and production costs, and it is expected to develop further in the coming years (Doolan, 2025; Dominguez, 2024).

What do you think about this digital implementation within the beauty sector and the emerging opportunities? What limitations come to mind?

References

Capgemini. (2022, September 26). Product digital twins bridge the digital and physical for L’Oréal. Capgemini. https://www.capgemini.com/news/client-stories/product-digital-twins-bridge-the-digital-and-physical-for-loreal/#

Coursera. (2025). Augmented Reality vs. Virtual Reality: What’s the Difference? Coursera. https://www.coursera.org/articles/augmented-reality-vs-virtual-reality

Dominguez, L. (2024, May 24). L’Oreal’s Generative AI Lab Driving Content and R&D Personalization. Consumer Goods Technology. https://consumergoods.com/loreals-generative-ai-lab-driving-content-and-rd-personalization

Doolan, K. (2025, June 12). L’Oréal Group plans to use 3D AI for product imagery. CosmeticsDesign-Europe.com. https://www.cosmeticsdesign-europe.com/Article/2025/06/12/loreal-plans-to-use-3d-ai-for-product-imagery/

L’Oréal Finance. (2024). 2024 Annual Results | L’Oréal Finance. Loreal-Finance.com. https://www.loreal-finance.com/eng/press-release/2024-annual-results

Martin, A. (2025, June 11). Retail Reboot: Major Global Brands Transform End-to-End Operations With NVIDIA. NVIDIA Blog. https://blogs.nvidia.com/blog/retail-agentic-physical-ai/

Mechdyne. (2017, June 28). L’Oréal Shares Their Beauty Expertise While Slashing Development Costs with Mechdyne VR Solution. AV & VR Solutions. https://www.mechdyne.com/av-vr-solutions/blog/loreal-shares-their-beauty-expertise-while-slashing-development-costs-with-mechdyne-vr-solution/

Porter, M., & Heppelman, J. (2025). Whiteboard Session: Why Every Organization Needs an AR Strategy. Hbr.org. https://hbr.org/video/5809961699001/whiteboard-session-why-every-organization-needs-an-ar-strategy


Meta’s Ray-Ban Glasses Just Levelled Up

18 September 2025


Do you remember Meta’s Ray-Ban glasses from 2023? You probably do (we just mentioned them in class), but they weren’t exactly ground-breaking. On September 30th, however, the second generation is being released, and this time the air smells different.

This iteration of the Meta Ray-Ban Display features an in-lens display visible only to the wearer, marking a significant step toward AR technology. While it isn’t true Augmented Reality yet, since the display doesn’t interact with your surroundings, this is a sign that the technology is rapidly advancing in that direction. More interesting is how you control it. The glasses connect to a neural wristband, a watch-style band that detects electrical impulses from your wrist muscles. This means you can control the display with subtle gestures, even from inside your pocket, unlike older camera-based tracking systems.
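The wristband idea, reading electrical impulses from wrist muscles and mapping them to gestures, can be caricatured in a few lines of Python. This is purely illustrative: the thresholds, features, and gesture names are invented, and Meta’s actual decoder relies on learned models rather than simple amplitude rules.

```python
# Toy sEMG gesture classifier: map the energy of a wrist-muscle signal window
# to a coarse gesture label. Thresholds and labels are invented stand-ins.

def mean_abs(window: list) -> float:
    """Mean absolute amplitude of one signal window (a crude energy feature)."""
    return sum(abs(x) for x in window) / len(window)

def classify_gesture(window: list, pinch_thr: float = 0.5, swipe_thr: float = 0.2) -> str:
    """Strong activation -> pinch, moderate -> swipe, otherwise rest."""
    energy = mean_abs(window)
    if energy >= pinch_thr:
        return "pinch"
    if energy >= swipe_thr:
        return "swipe"
    return "rest"

print(classify_gesture([0.7, -0.8, 0.6, -0.9]))  # strong muscle activation
```

Because the signal comes from the muscles themselves rather than a camera, this kind of decoding works even with the hand inside a pocket, which is exactly the advantage over camera-based tracking.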

But is this truly disruptive? Not yet. At $800, it’s priced like a flagship phone, yet it still lacks a broad app ecosystem. There is also a social barrier: are people willing to accept chunky glasses and an always-ready camera in shared spaces? Meta’s reputation is also fragile when it comes to trust and privacy. Clear recording indicators, strict on-device processing, and transparent data practices will matter just as much as the spec sheets. And the possibility of ads or brand placements drifting into your field of view is non-zero. One thing is certain: stronger privacy regulation will be crucial.

If those concerns are addressed, the upside is real: live captioning and translation, live guided navigation, quick capture and messaging, all controlled with a flick of fingers from a pocket.

But your phone can breathe a sigh of relief…

(for now)

References:

https://www.meta.com/nl/en/ai-glasses/meta-ray-ban-display

https://www.theguardian.com/technology/2025/sep/17/meta-ray-ban-smart-glasses


Philips and the Rise of Digital Twins: From Smart Systems to Smarter Lives

18 September 2025


You probably think of cars, airplanes, jet engines, or heavy machinery when you hear the term ‘digital twin.’ Conversely, Philips likely evokes images of hospital monitoring, TVs, or shavers. Nonetheless, Philips, a world leader in healthcare technology, is changing the story of the digital twin by taking it from the factory floor to the hospital setting and, eventually, to the human body.

From Buildings to Bodies
Digital twins, virtual models of physical systems, have long been used to optimize industrial operations (Emmert-Streib, 2023). Now, Philips is applying these principles to healthcare, starting with infrastructure.

In a recent hospital demonstration (Philips Healthcare, 2023), a care unit was digitally replicated and simulated to track patient flow, staff shifts, and room capacity. By adjusting parameters like staff availability and care demand, the model revealed impacts on key performance indicators such as discharge times. As such, data derived from these models provides administrators with powerful insights to optimize hospital operations.
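To give a feel for this kind of what-if simulation, here is a deliberately simple Python sketch of a care unit: deterministic lengths of stay, random daily arrivals, and a queue of patients waiting for a bed. All parameters are invented for illustration; Philips’ actual model is far richer.

```python
# Minimal care-unit capacity simulation: vary staffed beds and arrival load,
# and observe the effect on patients waiting for admission. Toy parameters.
import random

def simulate_unit(beds: int, arrivals_per_day: int, stay_days: int = 3,
                  days: int = 30, seed: int = 42) -> float:
    """Return the average number of patients queuing for a bed per day."""
    rng = random.Random(seed)
    occupied = []        # remaining stay (in days) for each occupied bed
    queue = 0            # patients waiting for a bed
    total_waiting = 0
    for _ in range(days):
        occupied = [d - 1 for d in occupied if d > 1]   # discharge finished stays
        queue += rng.randint(0, arrivals_per_day)       # new arrivals today
        while queue and len(occupied) < beds:           # admit while beds free
            occupied.append(stay_days)
            queue -= 1
        total_waiting += queue
    return total_waiting / days

print(simulate_unit(beds=10, arrivals_per_day=4))
print(simulate_unit(beds=20, arrivals_per_day=4))
```

Re-running the same arrival stream against different bed counts is the essence of the demonstration: administrators see how a staffing or capacity change propagates into waiting and discharge metrics before committing to it.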

Predictive Machines and Personalized Organs
Philips isn’t stopping at buildings. Their MRI systems now use digital twin models to track performance, forecast failures, and guide maintenance. Combining live sensor data with historical information, these simulations predict machine states, moving healthcare from reactive to predictive servicing (Philips, 2018a). In clinical settings where downtime delays diagnoses, foresight like this can be lifesaving.
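The predictive-servicing idea can be sketched as a baseline check on live sensor data: flag the machine when a reading drifts beyond its historical range. The rule, readings, and threshold below are invented for illustration; Philips’ digital-twin models are proprietary.

```python
# Toy predictive-maintenance rule: flag a machine when the latest sensor
# reading sits more than k standard deviations from its historical baseline.
from statistics import mean, stdev

def needs_service(history: list, latest: float, k: float = 3.0) -> bool:
    """Simple z-score drift test against historical readings."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > k

# Invented example: a temperature channel on an imaging system.
sensor_temps = [21.0, 21.2, 20.9, 21.1, 21.0, 20.8]
print(needs_service(sensor_temps, 21.1))  # reading within baseline
print(needs_service(sensor_temps, 24.0))  # drifted: schedule maintenance
```

The shift from reactive to predictive servicing is exactly this inversion: instead of waiting for a failure, the twin continuously compares live state against expected state and raises the flag early.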

The company has also ventured into modeling the human heart. In 2015, it introduced HeartModel, which generates personalized 3D heart simulations using ultrasound data (Philips, 2018b). By tailoring these anatomical models to individual physiology, clinicians can better evaluate cardiac function and plan treatments. Yet challenges remain. No two hearts are identical, and building universally accurate models is complex (Philips Nederland, 2022). Therefore, instead of replicating the entire human body, Philips now focuses on modular ‘building blocks’ that already add clinical value, such as single-organ models in cardiovascular care (Philips Nederland, 2022).

Beyond Twins
Digital twins are just one part of Philips’ broader vision. The company is also exploring technologies like virtual reality (VR) and augmented reality (AR). VR, for instance, would enable simulations of lifelike medical scenarios, allowing clinicians and students to practice complex procedures in safe, controlled environments. AR holds promise in surgery: imagine overlaying patient-specific 3D models onto the body, enabling surgeons to ‘see through’ the skin and anticipate anatomy before operating (Philips, 2018b).

Why this matters now
These innovations arrive at a critical moment, as healthcare systems are under immense pressure. According to the Future Health Index 2025, over 30% of patients experience worsening conditions due to delays, and 1 in 4 are hospitalized before seeing a specialist (Philips, 2025b). AI-powered digital twins could help ease these burdens by streamlining diagnoses, predicting complications, and personalizing care.

However, adoption isn’t straightforward. While 82% of healthcare professionals believe AI tools can save lives, only 59% of patients share that trust (Philips, 2025a). Concerns over accuracy, ethics, and data security remain barriers, highlighting that building public confidence is as important as advancing the technology itself.

A New Kind of Value
Philips’ transformation is not just technological, it’s strategic. Mapped onto the Business Model Canvas, Philips’ trajectory is clear. Key resources now extend beyond hardware to include AI, cloud platforms, and patient data. Customers increasingly consist of hospitals, clinicians, and health systems, and revenue streams increasingly revolve around ‘insight-as-a-service’ (McKinsey & Company, 2023), marking a shift from product-driven to data-driven ecosystems (Weill & Woerner, 2015).

The Future
So, digital twins are more than a breakthrough, they represent a shift towards predictive, personalized care that could redefine the future of healthcare. Ultimately, their impact depends not just on innovation, but on society’s willingness to embrace it.

As long as these tools remain complements to existing workflows, they have my trust. What about you, do you trust these developments?

References
Emmert-Streib, F. (2023). What is the role of AI for digital twins? AI, 4(3), 721–728. https://doi.org/10.3390/ai4030038

McKinsey & Company. (2023). How healthcare systems can become digital-health leaders. McKinsey & Company. https://www.mckinsey.com/industries/healthcare/our-insights/how-healthcare-systems-can-become-digital-health-leaders

Philips. (2018a, August 30). The rise of the digital twin: How healthcare can benefit. Philips Global. https://www.philips.com/a-w/about/news/archive/blogs/innovation-matters/20180830-the-rise-of-the-digital-twin-how-healthcare-can-benefit.html

Philips. (2018b, November 12). How a virtual heart could save your real one. Philips Global. https://www.philips.com/a-w/about/news/archive/blogs/innovation-matters/20181112-how-a-virtual-heart-could-save-your-real-one.html

Philips Nederland. (2022, May 19). Met een digitale tweeling kunnen we voorspellen hoe een patiënt reageert. Philips Nederland. https://www.philips.nl/a-w/about/news/archive/standard/about/news/articles/2022/20220519-met-een-digitale-tweeling-kunnen-we-voorspellen-hoe-een-patient-reageert.html

Philips. (2025a). Future Health Index 2025: Building trust in healthcare AI. Philips Global. https://www.philips.com/a-w/about/news/future-health-index/reports/2025/building-trust-in-healthcare-ai

Philips. (2025b, May 14). Philips Future Health Index 2025: AI poised to transform global healthcare, urging leaders to act now. Philips Global. https://www.philips.com/a-w/about/news/archive/standard/news/press/2025/philips-future-health-index-2025-ai-poised-to-transform-global-healthcare-urging-leaders-to-act-now.html

Philips Healthcare. (2023, February 16). Optimal care system design using Digital twin [Video]. YouTube. https://www.youtube.com/watch?v=2Bf6VfDVtmU

Weill, P., & Woerner, S. L. (2015). Thriving in an increasingly digital ecosystem. MIT Sloan Management Review, 56(4), 27–34. https://sloanreview.mit.edu/article/thriving-in-an-increasingly-digital-ecosystem/


Will Vehicles Be the Most Powerful Terminal Device in the Digital Era?

20 September 2024


In the movie Captain America 2, the director of S.H.I.E.L.D. drives a Chevrolet Suburban equipped with artificial intelligence and successfully escapes the enemy’s blockade with the help of automatic maintenance, real-time analysis of road conditions, and autonomous driving. We may never own a war vehicle equipped with machine guns and artillery like his, but the introduction of various new technologies has put the arrival of smart vehicles just around the corner.

Why Are Vehicles So Representative?

As a representative product of the digital era, the automobile’s innovation is closely tied to many technological advances. First of all, a new form of energy: electric vehicles make it easier for computers to take over energy management and driving. The introduction of cloud computing and artificial intelligence has further enhanced vehicles’ capabilities: large amounts of data are transmitted between the vehicle and cloud servers, and the on-board autonomous driving system analyzes road conditions in real time. Tesla’s FSD (Full Self-Driving) is a pure-vision solution, while manufacturers such as Nio use lidar-based solutions. Even where AI has not completely taken over, the combination of AR applications and HUD (head-up display) functions can make human driving easier and safer.

Tesla FSD user interface.

What Is the Current Situation of the Automotive Industry?

Less than 20 years after the release of its first prototype, Tesla has surpassed Volkswagen, General Motors, and Toyota to become the world’s most valuable automotive manufacturer. In contrast to Tesla’s success, the market share of some long-established traditional brands continues to shrink. Industry giants such as Porsche and Mercedes-Benz have also begun the transformation toward electrification and intelligent driving. Behind the decline of old-era products and the prosperity of new-era products is the “digital disruption” we are familiar with.

Mercedes-Benz Vision Avtr, steering wheel-free autonomous driving.

How to Imagine the Future?

If we regard all vehicles on the road as large mobile computers, the space for imagination becomes very broad. Reliable and powerful hardware (think of stable high-voltage power supplies and sophisticated heat-dissipation technology) will enable vehicles to become the largest and most powerful terminal devices of the digital era. What else can we expect? AI models could be deployed locally instead of in the cloud; cockpits equipped with VR devices could serve as our entry into the metaverse.

References

Wu, A. (2024) The Story Behind Tesla’s Success (TSLA). https://www.investopedia.com/articles/personal-finance/061915/story-behind-teslas-success.asp.

Not a Tesla App Staff. (2024) Tesla Releases FSD v12.4: New Vision Attention Monitoring, Improved Strike System With Update 2024.9.5. https://www.notateslaapp.com/news/2031/tesla-releases-fsd-v12-4-new-vision-attention-monitoring-improved-strike-system-with-update-2024-9-5.

VISION AVTR | Future Vehicles (no date). https://www.mercedes-benz.ca/en/future-vehicles/vision-avtr#gallery.


Bridging the Gap Between AR, AI and the Real World: A Glimpse Into the Future of Smart Technology

12 September 2024


Apple’s recent keynote showcased new products, including the iPhone’s groundbreaking AI integration. However, when you break it down, what Apple has really done is combine several existing technologies, integrate them seamlessly, and present the result as revolutionary. This sparked my imagination about what could already be possible with today’s technology, and what our future might look like.

Apple introduced advanced visual intelligence, allowing users to take a picture of a restaurant, shop, or even a dog, and instantly access a wealth of information. Whether it’s reviews, operating hours, event details, or identifying objects like vehicles or pets, this technology uses AI to analyze visual data and provide real-time insights, bridging the gap between the physical and digital worlds. Tools like Google Image Search and ChatGPT have been available for some time, but Apple has taken these capabilities and seamlessly integrated them into its ecosystem, making them easily accessible and more user-friendly [1]. The Apple Vision Pro merges AR and VR, controlled by moving your eyes and pinching your fingers [2]. I’ve tried it myself, and it was incredibly easy to navigate, with digital content perfectly overlaying the physical world. Now imagine the possibilities if Apple integrated the iPhone’s visual intelligence into the Vision Pro. This headset wouldn’t just be for entertainment or increasing work productivity; it could become an everyday wearable, a powerful tool for real-time interaction with your surroundings.

Picture walking through a city wearing the Vision Pro. By simply looking at a restaurant and pinching your fingers, you could instantly pull up reviews, check the menu, or even make a reservation. Or, if you see someone wearing a piece of clothing you like, you could instantly check online where to buy it, without needing to stop. With these capabilities, the Vision Pro could bring the physical and digital worlds closer together than ever before, allowing users to interact with their environment in ways we’re only beginning to imagine.
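The flow imagined above, recognise what the camera sees and then fetch linked information, can be sketched as a two-step lookup. The classifier stub, labels, and knowledge base below are all stand-ins, since Apple’s visual intelligence internals are not public.

```python
# Toy visual-lookup pipeline: classify an image, then retrieve linked info.
# The recogniser is stubbed with a tag; labels and data are invented.

def recognise(image_tag: str) -> str:
    """Stand-in for an on-device image classifier (here it just echoes a tag)."""
    return image_tag

# Invented knowledge base linking labels to real-world information.
KNOWLEDGE = {
    "restaurant": {"rating": 4.5, "hours": "11:00-22:00", "reservable": True},
    "dog": {"note": "Pass to a breed classifier for details"},
}

def visual_lookup(image_tag: str) -> dict:
    """Classify the image, then return whatever linked info is available."""
    label = recognise(image_tag)
    return KNOWLEDGE.get(label, {"note": f"No info for '{label}'"})

print(visual_lookup("restaurant"))
```

The interesting part is not either step alone, both exist today, but wiring them into a wearable so the whole loop runs from a glance and a pinch.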

Do you think existing technologies can already do this? Is this what the future will look like? I’m curious to hear your thoughts.

Sources:

[0] All images generated with DALL-E via ChatGPT.

[1] https://www.youtube.com/watch?v=uarNiSl_uh4&t=1744s

[2] https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/
