Meta announced the Quest 3 at Meta Connect 2023; I pre-ordered one and have been using it with great satisfaction.

Beyond the Quest 3, however, the announcement contained something quite interesting: the "Ray-Ban Meta" glasses, a product that offers a glimpse into Meta's thinking on how to put AI to use.

At Connect 2023, three main topics were announced.

  1. Meta Quest 3 Product Announcement
  2. Meta AIs (Artificial Intelligence Product Family) Announcement
  3. Ray-Ban Meta Smart Glasses Announcement

Among these, the Ray-Ban Meta glasses are smart glasses that add cameras and computing to the Ray-Ban eyewear line; unlike the Quest products, they are controlled through voice recognition and respond with audio feedback.

Ray-Ban Meta Smart Glasses Official Video

Ray-Ban Meta Smart Glasses

So, what exactly can this product do as smart glasses?

The product introduction on the Ray-Ban site highlights three representative functions: Live Stream, Capture, and Listen.

Content on the Ray-Ban website regarding Ray-Ban Meta glasses usage.

These are not the AR glasses we usually talk about. They are smart glasses with no display: you can listen to audio, take photos of your surroundings with the camera, and live stream only within the limited Meta ecosystem (Instagram, Facebook, etc.).

This is not Meta's first smart glasses announcement. They previously introduced a product called 'Ray-Ban Stories'.

Introducing New Smart Glasses, Ray-Ban Stories! | Meta About
Meet the smart glasses, 'Ray-Ban Stories', developed in collaboration between Facebook and EssilorLuxottica!

However, I remember it as a product that failed to resonate with the market: its performance fell short, and for various reasons it never appeared to be officially released in Korea.

What changes have been made to the newly released product?

The newly released Ray-Ban Meta, however, is built on the Snapdragon AR1 Gen 1 chipset, which Qualcomm produced for its AR glasses platform. It is reportedly the first commercial product to ship with a Qualcomm chipset built for an AR platform.

Image source: https://www.hardwarezone.com.sg/tech-news-ray-ban-meta-smart-glasses-2023-ar-augmented-reality-qualcomm-singapore-price-availabilty-features-specs

As a result, the glasses can reportedly capture higher-quality video and photos than before, the audio has improved, and noise cancellation for voice commands is included, raising the overall hardware specifications considerably.

Personally, I am curious why the AR2 Gen 1 chip announced last year was not used, but it is clear that the specifications have improved substantially over the previous generation.

Accordingly, users of the previous device report that the photo and video quality is now genuinely worth using, and that satisfaction with tasks such as shooting and uploading photos has increased significantly, though, as Meta stated, these features remain limited to Meta's ecosystem (Facebook, Instagram).

Ray-Ban Meta glasses review, Source: Engadget

Ultimately, this product can be read as a signal of Meta's direction on Qualcomm's AR glasses platform, and it raises the question of whether many similar smart glasses will follow.

In short, it can be viewed as a product where computing chipsets are embedded in the glasses frame, along with a camera, external speaker, and microphone.

Even if your first reaction is "Do I really need a product like this?", for users like me who wear glasses every day and like Ray-Ban's frame designs, it looks worth considering, provided the price is reasonable and the usability is high.

Still, I have significant doubts about its potential: will it actually be useful?

Will such hardware advancements alone enable smart glasses, which have long been overlooked in the market, to become widely adopted by users?

Key - AI for User Convenience

Meta has released smart glasses before and failed to generate a significant market response. Even with this round of substantially improved hardware, the product is unlikely to fare better unless it solves the problems seen in the past.

Therefore, Meta proposed a solution for this: "AI Utilization."

Ray-Ban Meta smart glasses are said to be the first device with built-in Meta AI.

Viewed through a more engineering-focused lens, the device has the following features:

  1. It is a pair of glasses featuring computing capabilities powered by the Qualcomm AR platform chipset.
  2. It allows for Full HD-level video recording through external cameras and also takes photos.
  3. Users can input voice commands using the microphone or touch gestures.
  4. Feedback provided by the device is delivered through "voice" guidance.
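The interaction model above (microphone or touch input in, voice-only feedback out) can be sketched as a simple command-handling loop. This is purely a hypothetical illustration of such an architecture; the intent names and responses are invented for this sketch and are not Meta's actual software.

```python
from dataclasses import dataclass

@dataclass
class Command:
    """A parsed user input from either the microphone or the touch sensor."""
    source: str  # "voice" or "touch"
    intent: str  # hypothetical intent name, e.g. "capture_photo"

def handle(cmd: Command) -> str:
    """Return the spoken feedback for a recognized intent.

    Because the glasses have no display, every response must be
    deliverable as audio.
    """
    responses = {
        "capture_photo": "Photo captured.",
        "start_video": "Recording started.",
        "play_music": "Playing music.",
    }
    return responses.get(cmd.intent, "Sorry, I can't do that yet.")

# A voice command and a touch gesture both funnel into the same handler.
print(handle(Command(source="voice", intent="capture_photo")))  # Photo captured.
print(handle(Command(source="touch", intent="play_music")))     # Playing music.
```

The key constraint the sketch captures is that both input channels converge on one handler whose output must always be expressible as speech.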

From the advent of computers to the era when mobile devices became mainstream, HCI (Human-Computer Interaction) has been the field in which new devices are studied and developed for widespread adoption.

We usually encounter this work as UX, but in the early stages of any platform it is HCI that is researched intensively to enable mass adoption. When HCI is poor, usability drops sharply, and mass adoption frequently fails.

An image that makes it easy to understand HCI-related content. Image source: https://en.wikipedia.org/wiki/Human–computer_interaction

Simply put

"For a new platform to survive, it must be easy to use. "

It appears that Meta has invested its considerable AI resources into improving the usability and user experience of its smart glasses.

It is a pity that this is currently limited to English-speaking users in North America, but I look forward to seeing how the results turn out.

Looking at the user experience Meta aims to present, they appear to be using voice recognition technology, in the vein of Siri, Bixby, and Google Assistant, to compensate for the relatively sparse user feedback smart glasses offer compared to other platforms, thereby increasing the product's utility.

Google Assistant, Siri, Bixby, Alexa Which Is Best? - Gizchina.com
Voice AI services have been around for quite a while, yet few of them seem to be used to their full potential. Image source: https://www.gizchina.com/2022/12/26/google-assistant-siri-bixby-and-alexa-which-works-best-lets-find-out/

In addition, judging from published information and user reviews on overseas YouTube channels, the glasses use AI to understand user input and respond through voice, and when information is difficult to convey by voice alone, they hand off to the connected smartphone, extending the user experience across both devices.

The device reportedly offers functions such as checking content on the phone by following voice guidance while wearing the glasses, or reviewing what was said as a transcript. Source: Engadget
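The hand-off behavior described in these reviews, spoken answers by default with the paired phone used when voice alone is insufficient, can be sketched as a small routing function. This is a hypothetical reconstruction of the described behavior, not Meta's actual logic; the function and parameter names are invented.

```python
def route_feedback(answer: str, needs_visual: bool, phone_connected: bool) -> tuple[str, str]:
    """Decide where to deliver an AI response.

    Returns a (channel, payload) pair: by default the answer is read
    aloud through the glasses' speakers; content that cannot be conveyed
    by voice alone (photos, transcripts, maps) is handed off to the
    companion app on the paired phone when one is available.
    """
    if needs_visual and phone_connected:
        return ("phone", answer)   # e.g. open a transcript in the phone app
    return ("voice", answer)       # read the answer aloud on the glasses

# A short factual answer stays on the glasses...
print(route_feedback("It's sunny today.", needs_visual=False, phone_connected=True)[0])  # voice
# ...while visual content is routed to the paired phone.
print(route_feedback("Here is your transcript.", needs_visual=True, phone_connected=True)[0])  # phone
```

The design point is graceful degradation: the glasses remain usable standalone, and the phone only extends the experience when it is both needed and reachable.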

Of course, since I haven't used this product yet, I don't know what the actual experience will be like, but if it is utilized well, it seems like quite a good scenario.

Given when this product was built and sold, the highest-performing Qualcomm chipset available for smart glasses was used as its platform. If Meta, a strong player in AI with a deeper understanding of the technology than ever before, applies that AI well, I expect the HCI gap in smart glasses can be closed.

In other words, improved hardware performance and AI software support technologies should make using the device convenient and highly satisfying.

Exploring the Direction of AI Utilization Through Ray-Ban Meta Smart Glasses

Personally, I am quite skeptical of the idea that AI can be applied indiscriminately as a cure-all.

However, if AI is used appropriately and with a clear direction, I believe it will undoubtedly become a partner that enriches our lives.

AI technology did not appear out of nowhere. As computing developed and spread, it was proposed, used, and refined as a methodology for tasks that machines can perform more efficiently, and at far lower cost, than humans, starting with simple repetitive calculations.

Of course, generative AI now dominates the conversation, and even ordinary people like me can easily experience its power, but I still believe the essence remains the same:

There are fields where AI works more efficiently than human intuition and labor.

Seen this way, Meta's somewhat unconventional combination of smart glasses and AI could be the key to bridging the gap between a new platform's hardware and its user experience, rather than a mere marketing exercise. I hope this case becomes an example of how AI technology can enrich our lives.