Meta Platforms has launched two new Ray-Ban smart glasses designed for people who need prescription lenses, expanding its wearable device lineup.
The announcement adds options to a product line that has gained ground in a market where many AI gadgets have struggled to survive.
This time, the company is targeting everyday users who rely on glasses, not just early adopters.
Speaking earlier this year, chief executive Mark Zuckerberg said, “billions of people wear glasses or contacts for vision correction.” That point sits at the centre of Meta’s latest move: the company wants smart eyewear to feel normal, not experimental.
The new models come in two styles, rectangular and rounded, and will be sold through regular eyewear channels. There is no display built into the lenses. Instead, the glasses rely on voice and audio, supported by a Qualcomm Snapdragon AR chipset.
Users can expect features such as real-time object recognition and location awareness. Meta has also improved the microphones and extended battery life, addressing some of the early complaints about performance.
The glasses are being developed with EssilorLuxottica, the parent company of Ray-Ban. That partnership continues to give Meta access to a well-known fashion brand, which has helped the product blend into everyday use.
Smart glasses have turned into one of the few areas where AI hardware is finding some traction. While other devices have struggled to prove their usefulness, Meta’s approach leans on something people already wear.
Still, competition is building. Apple is said to be working on its own augmented reality glasses, although its focus remains on the Vision Pro headset for now. Snap has its Spectacles, but adoption has been limited.
Meta’s edge, for now, is the mix of familiar design and added function. In adding prescription support, the company is making a case for daily use rather than occasional novelty.
Pricing has not been confirmed, though earlier versions of the Ray-Ban Meta glasses sold for between $299 and $379, suggesting a similar range.
Questions remain about how data is collected and used by features that recognise objects and capture audio. Meta has not given new details on that front with this launch.