
Meta (formerly Facebook) has rapidly expanded its smart glasses lineup under the Ray-Ban and Oakley brands, leveraging AI to deliver hands-free computing. Its product line now spans camera AI glasses (Ray-Ban Meta and Oakley Meta), display AI glasses (the Meta Ray-Ban Display), and full AR glasses still in development.
The latest Meta Ray-Ban Display glasses (launched Sept 2025) embed a small high-res screen in the lens, while the Oakley Meta glasses (2025) focus on sports performance. All models feature the “Hey Meta” voice assistant, on-device cameras (12 MP on new models), open‑ear speakers, and integration with Meta’s AI.
In contrast, earlier smart glasses like the original Ray-Ban Stories (2021) were limited to photo/video capture and basic voice commands. Below we detail the design, AI features, hardware specs, and ecosystem of Meta’s glasses – and compare them to contemporaries like Ray-Ban Stories and Snap’s Spectacles.
Design and Hardware
Meta’s AI glasses look much like stylish sunglasses. The Ray-Ban Display glasses adopt a modified Wayfarer frame: thicker temples and a squared-off nose area for a universal fit. They weigh only ~69 g and include Transitions® photochromic lenses (standard on all frames) that tint in sunlight.
Internally, the right lens hides a 600×600-pixel OLED display (20° diagonal FOV) that appears in your field of vision only when activated. Despite up to 5,000 nits peak brightness, the display leaks <2% light, so bystanders cannot see it. Meta’s Neural Band – a wrist-worn EMG sensor – comes bundled to enable touchless control via subtle hand gestures (pinches and swipes).
Figure: Meta’s new Ray-Ban Display smart glasses hide a small AR display in the right lens (shown lit green). They otherwise resemble chunky Ray-Ban Wayfarer-style sunglasses.
The Oakley Meta HSTN (2025) is built on Oakley’s signature sports frames, adding rugged design and an IPX4 water-resistance rating. Like Ray-Ban Meta, it houses a 12 MP camera, open-ear speakers and a mic array, but with a larger battery and fast charging (8 hrs typical life, 50% in 20 min, plus 48 hrs from the case) for extended outdoor use.
By comparison, the first-gen Ray-Ban Stories (2021) were very light (about 40 g) and came in classic styles (Wayfarer, Round, Meteor). They contained dual 5 MP cameras for photos and 30 s video clips (up to 60 s after a 2022 update). Stories had open-ear speakers and a 3‑microphone array for calls and music. However, no display or AI “brain” was built in, just a basic indicator LED.
AI Features and Capabilities
All of Meta’s smart glasses run “Meta AI,” an on-device assistant you can trigger hands-free by saying “Hey Meta”. You can ask questions, dictate messages, or issue commands by voice. Importantly, since 2024 Meta has added “AI with Vision”: the glasses’ camera can feed visuals into the AI so it can see and answer questions about your surroundings.
For example, asking “Hey Meta, what’s on that menu?” can trigger real-time OCR and translation. Meta AI can describe scenes, read signs, and even advise on photographic framing, all without a phone.
The display-equipped Ray-Ban Display takes this further with a tiny screen that can show visual AI responses. Meta AI answers, maps, or transcripts (like live captions) appear in the lens rather than only being read aloud. You can view messages and photos privately on the lens, or get turn-by-turn walking directions with an on-screen map.
In live video calls (WhatsApp or Messenger), the glasses show the caller’s face in your vision, while the caller sees exactly what you see. A control scheme unmatched by most competitors, the Neural Band lets you navigate apps and type via pinch-and-swipe gestures.
By contrast, Ray-Ban Stories and early Meta glasses (Gen1/2) rely on audio only. They could capture images, play music, and activate the AI assistant, but there was no heads‑up display. Snap’s Spectacles 5 (2024) are the nearest competitor with AR vision: they offer a stereo waveguide display covering 46° FOV, enabling fully immersive AR apps. Meta’s Orion AR prototype (shown at Meta Connect 2024) aims for that level, but remains an internal dev kit. For now, Meta’s consumer AI glasses stop short of full AR, using only a monocular popup display.
Hardware Specifications

Meta’s glasses pack high-end hardware. The Ray-Ban Display uses a Qualcomm Snapdragon AR1 Gen 1 processor with 32 GB of storage and 2 GB of RAM, plenty to run the AI and graphics. It has a 12 MP camera (with 3× digital zoom and 1080p video at 30 fps), six microphones for clear voice pickup, and open‑ear speakers. Connectivity includes Wi‑Fi 6 and Bluetooth 5.3. All frames in the Ray-Ban and Oakley Meta lines carry an IPX4 rating (splash resistance).
Battery life varies by model. The Ray-Ban Display glasses last about 6 hours of mixed use (capturing video, answering questions, etc), and their folding charging case provides another ~24 hours (30 hours total). The Neural Band runs separately for ~18 hours.
The new Ray-Ban Meta Gen2 (camera-only) glasses achieve up to 8 hours on a charge, while Oakley Meta HSTN sports glasses also reach ~8 hrs typical and 48 hrs with the case. For comparison, Ray-Ban Stories were rated at only a few hours per charge, with top-ups from a small 3,000 mAh charging case.
Below is a summary of key specs across Meta’s latest products and competitors:
| Feature | Meta Ray-Ban Display | Oakley Meta HSTN | Ray-Ban Stories (2021) | Snap Spectacles (5th Gen) |
| --- | --- | --- | --- | --- |
| Release (Year) | 2025 | 2025 | 2021 | 2024 (dev kit) |
| Style | Ray-Ban Wayfarer (chunky) | Oakley HSTN (sporty) | Ray-Ban Wayfarer, Round | AR sunglasses |
| Display | 600×600 px color (20° FOV) | — | None | Stereo AR (46° FOV) |
| Cameras | 12 MP (3× zoom), 1080p video | 12 MP, 3K video | Dual 5 MP (photo/video) | 4× cameras (spatial) |
| Audio | Open-ear speakers, 6‑mic array | Open-ear speakers, microphone array | Open-ear speakers, 3‑mic array | Built-in speakers/mics |
| AI/Assist. | Meta AI voice & Vision (on-device) | Meta AI with “Performance” context | Facebook Assistant (voice only) | Snap AI (Lens/AR assistant) |
| Controls | Touchpad & gestures (Neural Band) | Touchpad & gestures (Neural Band) | Touchpad, voice commands | Hand-tracking gestures & voice |
| Battery (glasses) | ~6 hrs mixed use | ~8 hrs typical | ~3–4 hrs per charge | ~45 min (streaming AR) |
| Battery (case) | +24 hrs with case | +48 hrs with case | Yes (multi-charge case) | — |
| Weight | 69 g (standard size) | ~40–50 g (per frame) | ~40 g (glasses alone) | 226 g |
| Connectivity | Wi‑Fi 6, BT 5.3 | Wi‑Fi, BT | BT 5.0, Wi‑Fi 802.11ac | Wi‑Fi, BT (Snap OS) |
| Water resistance | IPX4 (splash resistant) | IPX4 | None (no rating) | Unknown (likely splash resistant) |
| Price (USD) | $799 (including Neural Band) | $499 (special HSTN) / $399 (others) | $299 | — (dev kit); consumer TBD |
Software Ecosystem
Meta’s glasses pair to a smartphone (Android or iOS) via the Meta View app, which was formerly “Facebook View.” Through the app you sync photos and videos, adjust settings, and update the glasses’ software.
Apps like WhatsApp, Messenger, Instagram and Spotify integrate so you can make calls, send/receive messages, or control music via voice and display. For example, display glasses can show incoming WhatsApp messages or Instagram Reels as floating windows.
Meta AI is continually improving. Later in 2025, Meta is rolling out features like Handwriting input (trace letters with your finger on any surface to type messages) and Conversation Focus (which amplifies only the voice of the person you’re talking to).
Meta’s Llama models power the assistant, with processing split between the glasses, the linked phone, and the cloud. Security and privacy are emphasized: content is processed locally where possible, indicator LEDs signal recording, and users can disable the camera/mic or restrict AI functions.
Third-party support is limited today. Unlike Snap’s openly programmable “Lenses,” Meta’s glasses are primarily closed-platform devices meant to augment Meta’s own services. (An exception is that developers can access Meta AI through APIs.) Meta positions these as companion devices, not standalone computing platforms.
Real-World Use Cases
Meta’s AI glasses enable many hands-free scenarios:
- Capturing Memories: Take spontaneous photos or videos by voice or gesture. The wide-angle 12 MP camera lets adventurers record POV clips of sports, travel, or everyday life without holding a phone.
- Communication: Make voice/video calls while keeping your hands free. In a video chat, the glasses both record your view and show the caller on-screen. You can also receive texts, emails or social media updates read aloud or shown in the display and reply by dictation or hand gestures.
- Navigation: Walk hands-free with turn-by-turn directions in the display. A small map and arrows appear in view, letting you steer without looking down at a phone.
- Language Translation: Read and translate foreign text in real-time. For instance, looking at a menu triggers Meta AI to overlay the English translation in your vision.
- Accessibility: Live captioning helps the hearing-impaired. The glasses can transcribe speech or translate conversations live in front of your eyes (or via ear speakers) by focusing on whoever you look at.
- Productivity and Fitness: Oakley Meta glasses tie into athletic use – ask for weather/wind conditions during a golf round, get heart-rate data, or record performance videos tagged by time. More generally, you can glance at schedules, reminders or search queries without interrupting your workflow.
- Entertainment and Social: Watch short videos, control music, or browse lightweight content. Meta plans to add features like streaming full movies via cloud, though current battery limits make this niche.
Use-case comparison (illustrative):
| Scenario | Meta Glasses | Ray-Ban Stories | Snap Spectacles |
| --- | --- | --- | --- |
| Hands-free photography | Voice/gesture control, 12 MP camera | Yes (voice or button) | Yes (Snap OS capture) |
| Messaging/Texting | Display & voice replies | Voice only (no display) | No native chat feature |
| Navigation | On-screen maps & directions | None | Possibly via Snap Lenses |
| Translation | Real-time on-display translation | Voice translation (audio) | Via Lenses (devs) |
| Video calling | Heads-up calling on vision | None | None |
| AR Gaming/Apps | Limited (no full AR display) | None | Immersive AR (Snap OS) |
Pros and Cons
Pros: Meta’s AI glasses combine useful features in sleek everyday styles. The built‑in AI assistant and computer vision reduce phone dependency – you can get information, directions or translations completely hands-free. The full-color display (in Display model) and neural-gesture controls provide intuitive interaction.
Real-world testers note that the Ray-Ban Display glasses finally realize the Google Glass vision of a practical see‑through HUD. Battery and comfort have improved: Gen2 Ray‑Ban now lasts ~8 hrs and won’t slip on your face thanks to universal fit and spring hinges.
Extended recording and instant photo previews help content creators, while AI scene descriptions open new possibilities for visually impaired users. Meta’s partnership with Luxottica means familiar fashion brands and lens options (Transitions, prescription, etc.) are available, easing consumer adoption. Finally, Meta’s deep pockets mean continuous software updates: live translation and captioning, for example, arrived via free updates.
Cons: These glasses are still expensive and emerging tech. At $799, the Ray-Ban Display is far pricier than basic camera glasses; even Oakley’s standard models are $399‑$499. Battery life (6–8 hrs) may feel short under heavy use, requiring frequent charging. The mini-display is monocular (only one eye) and covers just a 20° view, so it’s more like checking a pop-up screen than experiencing full AR. Privacy and social acceptance remain concerns: recording indicators are tiny, and wearers draw attention when the display is active.
Finally, the software ecosystem is limited: very few third‑party apps exist, and the glasses depend heavily on the smartphone. In short, while technologically impressive, they are still more “novelty window” than daily workhorse for now.
Pricing and Availability
Meta Ray-Ban Display glasses launched on Sept 30, 2025 for $799 (including the Neural Band). The frames come in Black or Sand and two sizes (standard and large). Pre-orders at select retailers (Best Buy, LensCrafters, Verizon, etc.) began shortly after Meta Connect. Oakley Meta HSTN debuted in Aug 2025: a limited-edition version cost $499, with standard models at $399. Ray-Ban Meta Gen2 (camera-only) is priced similarly to the first Stories, at around $329 (exact pricing via Meta’s store).
By contrast, the original Ray-Ban Stories launched in 2021 at $299. Snap’s Spectacles 5 are currently available only to developers ($99/month program), with a planned consumer launch in 2026. Pricing for Snap’s future glasses is unannounced.
Comparison with Other Smart Glasses
Meta’s glasses now span the broad spectrum of “smart eyewear” technologies. Compared to Ray-Ban Stories (2021), the new Meta glasses are vastly upgraded. Stories could capture and share social-media content (via its dual 5 MP cameras and the basic Facebook Assistant), but lacked a conversational AI assistant or any on-board display. Meta’s Gen2 models more than double the camera resolution (12 MP), add Meta AI features like live translation, and extend battery life substantially.
The Display glasses go further by adding a visible HUD and the Neural Band for gesture control. In practical terms, Meta’s latest glasses can do everything Stories could, plus much more (visual AI, maps, messaging).
Snap Spectacles are a different breed. Snap’s current glasses (5th gen) are designed from the ground up for augmented reality. They use dual processors and waveguide optics to render full-scene AR overlays over 46° of vision. By contrast, Meta’s products are primarily “AI assistants with cameras.”
Snap’s glasses enable immersive 3D games and shared AR experiences (via Snap OS and Lenses), whereas Meta’s revolve around hands-free media capture and productivity. However, Meta’s approach yields a lighter, more discreet device and ties closely to everyday social apps and Meta’s AI. Both companies see AI as key: Snap is integrating OpenAI/Gemini with lenses, while Meta uses its own large models on-device.
In summary, Meta AI glasses lead the camera‑AI category with practical smart features in designer eyewear, whereas Snap Spectacles pioneer AR hardware for immersive experiences. Ray-Ban Stories and early Meta glasses were first steps; the latest models represent a “second generation” leap, blurring the line between sunglasses and personal computers.
As of late 2025, Meta’s offerings deliver the most polished consumer AI glasses on the market, but Snap’s advanced AR glasses are a looming competitive step beyond.