Laughing Hyena
Google's Android XR prototype demonstrated at Google I/O 2025

Hands on: I tried Google’s Android XR prototype and they can’t do much but Meta should still be terrified

by admin May 21, 2025



Why you can trust TechRadar


We spend hours testing every product or service we review, so you can be sure you’re buying the best. Find out more about how we test.

Google’s Android XR glasses can’t do very much… yet. At Google I/O 2025, I got to wear the new glasses and try some key features – three features, exactly – and then my time was up. These Android XR glasses aren’t the future, but I can certainly see the future through them, and my Ray-Ban Meta smart glasses can’t match anything I saw.

The Android XR glasses I tried had a single display, and it did not fill the entire lens. The glasses projected onto a small frame in front of my vision that was invisible unless filled with content.

To start, a tiny digital clock showed me the time and local temperature, information drawn from my phone. It was small and unobtrusive enough that I could imagine letting it stay active at the periphery.



Google Gemini is very responsive on this Android XR prototype

(Image credit: Philip Berne / Future)

The first feature I tried was Google Gemini, which is making its way onto every device Google touches. Gemini on the Android XR prototype glasses is already more advanced than what you might have tried on your smartphone.

I approached a painting on the wall and asked Gemini to tell me about it. It described the pointillist artwork and the artist. I said I wanted to look at the art very closely and I asked for suggestions on interesting aspects to consider. It gave me suggestions about pointillism and the artist’s use of color.

The conversation was very natural. Google’s latest voice models for Gemini sound like a real human. The glasses also did a nice job pausing Gemini when somebody else was speaking to me. There wasn’t a long delay or any frustration. When I asked Gemini to resume, it said ‘no problem’ and started up quickly.

That’s a big deal! The responsiveness of smart glasses is a metric I hadn’t considered before, but it matters. My Ray-Ban Meta smart glasses have an AI agent that can look through the camera, but it works very slowly: it’s slow to respond at first, and then it takes a long time to answer the question. Google’s Gemini on Android XR was much faster, and that made it feel more natural.

Google Maps on Android XR wasn’t like any Google Maps I’ve seen

Celebrities Giannis Antetokounmpo and Dieter Bohn wear Android XR glasses and shake hands with the crowd (Image credit: Philip Berne / Future)

Then I tried Google Maps on the Android XR prototype. I did not get a big map dominating my view. Instead, I got a simple direction sign with an arrow telling me to turn right in a half mile. The coolest part of the whole XR demo was when the sign changed as I moved my head.

If I looked straight down at the ground, I could see a circular map from Google with an arrow showing me where I am and where I should be heading. The map moved smoothly as I turned around in circles to get my bearings. It wasn’t a very large map – about the size of a big cookie (or biscuit for UK friends) in my field of view.

As I lifted my head, the cookie-map moved upward. The Android XR glasses don’t just stick a map in front of my face. The map is an object in space. It is a circle that seems to remain parallel with the floor. If I look straight down, I can see the whole map. As I move my head upward, the map moves up and I see it from a diagonal angle as it lifts higher and higher with my field of view.

By the time I am looking straight ahead, the map has entirely disappeared and has been replaced by the directions and arrow. It’s a very natural way to get an update on my route. Instead of opening and turning on my phone, I just look towards my feet and Android XR shows me where they should be pointing.

Showing off the colorful display with a photograph

(Image credit: Philip Berne / Future)

The final demo I saw was a simple photograph using the camera on the Android XR glasses. After I took the shot, I got a small preview on the display in front of me. It was about 80% transparent, so I could see details clearly, but it didn’t entirely block my view.

Sadly, that was all the time Google gave me with the glasses today, and the experience was underwhelming. In fact, my first thought was to wonder whether the Google Glass I had in 2014 offered the exact same features as today’s Android XR prototype glasses. It was pretty close.

My old Google Glass could take photos and video, but it did not offer a preview on its tiny, head-mounted display. It had Google Maps with turn directions, but it did not have the animation or head-tracking that Android XR offers.

There was obviously no conversational AI like Gemini on Google Glass, and it could not look at what you see and offer information or suggestions. What makes the two similar? They both lack apps and features.

Which comes first, the Android XR software or the smart glasses to run it?

(Image credit: Philip Berne / Future)

Should developers code for a device that doesn’t exist? Or should Google sell smart glasses even though there are no developers yet? Neither. The problem with AR glasses isn’t just a chicken-and-egg question of which comes first, the software or the device. AR hardware isn’t ready to lay eggs. We have neither a chicken nor eggs, so it’s no use debating which comes first.

Google’s Android XR prototype glasses are not the chicken, but they are a fine looking bird. The glasses are incredibly lightweight, considering the display and all the tech inside. They are relatively stylish for now, and Google has great partners lined up in Warby Parker and Gentle Monster.

The display itself is the best smart glasses display I’ve seen, by far. It isn’t huge, but it has a better field of view than the rest; it’s positioned nicely just off-center from your right eye’s field of vision; and the images are bright, colorful (if translucent), and flicker-free.

The author in Ray-Ban Meta Smart Glasses looking dumbfounded (Image credit: Future / Philip Berne)

When I first saw the time and weather, it was a small bit of text and it didn’t block my view. I could imagine keeping a tiny heads-up display on my glasses all the time, just to give me a quick flash of info.

This is just the start, but it’s a very good start. Other smart glasses haven’t felt like they belonged at the starting line, let alone on retail shelves. Eventually, the display will get bigger, and there will be more software. Or any software, because the feature set felt incredibly limited.

Still, with just Gemini’s impressive new multi-modal capabilities and the intuitive (and very fun) Google Maps on XR, I wouldn’t mind being an early adopter if the price isn’t terrible.

My Ray-Ban Meta Smart Glasses are mostly just sunglasses now (Image credit: Future / Philip Berne)

Of course, the Ray-Ban Meta smart glasses lack a display, so they can’t do most of this. They have a camera, but the images are beamed to your phone. From there, your phone can save them to your gallery, or even use the glasses to broadcast live directly to Facebook. Just Facebook – this is Meta, after all.

Given its Android provenance, I’m hoping whatever Android XR smart glasses we get will be much more open than Meta’s gear. They must be. Android XR runs apps, while Meta’s smart glasses are run by an app. Google intends Android XR to be a platform. Meta wants to gather information from cameras and microphones you wear on your head.

I’ve had a lot of fun with the Ray-Ban Meta smart glasses, but I honestly haven’t turned them on and used the features in months. I was already a Ray-Ban Wayfarer fan, so I wear them as my sunglasses, but I never had much luck getting the voice recognition to wake up and respond on command. I liked using them as open-ear headphones, but not in New York City, where the street noise overpowers them.

I can’t imagine that I will stick with my Meta glasses once there is a full platform with apps and extensibility – the promise of Android XR. I’m not saying that I saw the future in Google’s smart glasses prototype, but I have a much better view of what I want that smart glasses future to look like.

We tried on Google’s prototype AI smart glasses

by admin May 21, 2025


Here in sunny Mountain View, California, I am sequestered in a teeny-tiny box. Outside, there’s a long line of tech journalists, and we are all here for one thing: to try out Project Moohan and Google’s Android XR smart glasses prototypes. (The Project Mariner booth is maybe 10 feet away and remarkably empty.)

While nothing was going to steal AI’s spotlight at this year’s keynote — 95 mentions! — Android XR has been generating a lot of buzz on the ground. But the demos we got to see here were notably shorter, with more guardrails, than what I got to see back in December. Probably because, unlike a few months ago, there are cameras everywhere and these are “risky” demos.

The Project Moohan VR headset.

First up is Project Moohan. Not much has changed since I first slipped on the headset. It’s still an Android-flavored Apple Vision Pro, albeit much lighter and more comfortable to wear. Like Oculus headsets, there’s a dial in the back that lets you adjust the fit. If you press the top button, it brings up Gemini. You can ask Gemini to do things, because that is what AI assistants are here for. Specifically, I ask it to take me to my old college stomping grounds in Tokyo in Google Maps without having to open the Google Maps app. Natural language and context, baby.

But that’s a demo I’ve gotten before. The “new” thing Google has to show me today is spatialized video. As in, you can now get 3D depth in a regular old video you’ve filmed without any special equipment. (Never mind that the example video I’m shown is most certainly filmed by someone with an eye for enhancing dramatic perspectives.)

When angled just so, you can see a glimpse of the hidden display.

Because of the clamoring crowd outside, I’m then given a quick run-through of Google’s prototype Android XR glasses. Emphasis on prototype. They’re simple; it’s actually hard to spot the camera in the frame and the discreet display in the right lens. When I slip them on, I can see a tiny translucent screen showing the time and weather. If I press the temple, it brings up — you guessed it — Gemini. I’m prompted to ask Gemini to identify one of two paintings in front of me. At first, it fails because I’m too far away. (Remember, these demos are risky.) I ask it to compare the two paintings, and it tells me some obvious conclusions. The one on the right uses brighter colors, and the one on the left is more muted and subdued.

Tapping the side will bring up Gemini on the Android XR prototype glasses.

On a nearby shelf, there are a few travel guidebooks. I tell Gemini a lie — that I’m not an outdoorsy type, so which book would be the best for planning a trip to Japan? It picks one. I’m then prompted to take a photo with the glasses. I do, and a little preview pops up on the display. Now that’s something the Ray-Ban Meta smart glasses can’t do — and arguably, one of the Meta glasses’ biggest weaknesses for the content creators that make up a huge chunk of its audience. The addition of the display lets you frame your images. It’s less likely that you’ll tilt your head for an accidental Dutch angle or have the perfect shot ruined by your ill-fated late-night decision to get curtain bangs.

These are the safest demos Google can do. Though I don’t have video or photo evidence, the things I saw behind closed doors in December were a more convincing example of why someone might want this tech. There were prototypes with not one, but two built-in displays, so you could have a more expansive view. I got to try the live AI translation. The whole “Gemini can identify things in your surroundings and remember things for you” demo felt personalized, proactive, powerful, and pretty dang creepy. But those demos were on tightly controlled guardrails — and at this point in Google’s story of smart glasses redemption, it can’t afford a throng of tech journalists all saying, “Hey, this stuff? It doesn’t work.”

Reminder: this is a prototype. What’ll end up shipping to consumers will be different.

Meta is the name that Google hasn’t said aloud with Android XR, but you can feel its presence loom here at the Shoreline. You can see it in the way Google announced stylish eyewear brands like Gentle Monster and Warby Parker as partners in the consumer glasses that will launch… sometime, later. This is Google’s answer to Meta’s partnership with EssilorLuxottica and Ray-Ban. You can also see it in the way Google is positioning AI as the killer app for headsets and smart glasses. Meta, for its part, has been preaching the same for months — and why shouldn’t it? It’s already sold 2 million units of the Ray-Ban Meta glasses.

The problem is, even though Google let us take photos and video this time, it is so freakin’ hard to convey why Silicon Valley is so gung-ho on smart glasses. I’ve said it time and time again: you have to see it to believe it. Renders and video capture don’t cut it. And even if, in the limited time we had, we could frame the camera just so and give you a glimpse of what I see when I’m wearing these things, it just wouldn’t be the same.
