These prototype XR glasses sold me on mixed reality gaming

I was skeptical about the idea of gaming on XR glasses, to say the least. I had questions swirling in my head about how I would use them, why I would use them, and cynical answers to both.

But all those questions faded into the background when I got a chance to actually experience it myself. I had a few days to play with a prototype version of the Viture One XR glasses, a project funded on Kickstarter — and as outlandish as the concept seems, it does work.

This isn’t the future of gaming for everyone, but the Viture One XR glasses rang the same bells in my head as the early days of VR. There’s a lot of work to be done on the prototype I tried, but despite all my assumptions, they could be the first step in an exciting new category for gaming.

A massive screen, anywhere you want


The Viture Ones give you the equivalent of a 120-inch screen, and although they don’t have the specs of one of the best VR headsets, they don’t need to. You’re getting a pixel density of 55 pixels per degree, a full 1080p signal running at 60 fps, and a peak brightness of 1,800 nits, according to Viture. Now, I wasn’t able to strap a luminance meter inside the frames to verify 1,800 nits, but the screen was bright enough to combat even direct sunlight pouring through my living room windows.
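For context on that pixel-density figure: “pixels per degree” is just horizontal resolution divided by horizontal field of view, so the quoted 55 PPD and a 1080p signal together imply a particular FOV. Here’s that arithmetic as a quick sketch (the implied-FOV number is my own back-of-envelope math, not a published Viture spec):

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Angular pixel density: pixels spanning each degree of the wearer's vision."""
    return horizontal_pixels / horizontal_fov_deg

def implied_fov_deg(horizontal_pixels: int, ppd: float) -> float:
    """Invert the relationship: the horizontal FOV implied by a quoted PPD figure."""
    return horizontal_pixels / ppd

# 1080p (1920 horizontal pixels) at the quoted 55 PPD:
print(round(implied_fov_deg(1920, 55), 1))  # ~34.9 degrees horizontal
```

A roughly 35-degree window is narrow compared to a full VR headset, which is consistent with the “floating 120-inch screen” framing rather than an all-encompassing view.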

There’s a little blur around the edges, but the screen looks great. It’s sharp and super responsive, and I constantly drifted off into a game or video every time I put on the glasses. Sure, you can see the surrounding room, and it’s evident you’re not looking at a physical screen, but I never had to fight to get engrossed in whatever I was doing. The Viture Ones pulled me in, which is shocking considering I normally wear glasses with a pretty heavy prescription.


Devil May Cry 5 is what tipped me off. I played it on my Steam Deck, connected directly to the glasses through a USB-C cable, and it felt like playing the game on a normal 60Hz display. Devil May Cry 5 is extremely fast, and the Viture Ones held up exceptionally well. I also watched a few YouTube videos and some Netflix on my couch, letting me lie down or rest my head while always keeping my media in the center of my field of view.


Having a screen anywhere is a huge plus. I can’t tell you how many times I’ve had to lay my head down awkwardly while playing a game or watching a movie when I want to rest and still see the screen, and it’s generally so uncomfortable that I just don’t do it. The Viture Ones get past that issue unlike any device you can buy right now, clocking in at only 78 grams so they never feel heavy.

The experience at home is great, but I’d really like to see the Viture Ones in action on a plane. Sunglasses on a plane may look silly, but I can’t stand looking down at my phone to watch a movie or at my Steam Deck to play a game on a flight. These glasses seem like a huge win if you travel a lot.

The Viture Ones may be a glimpse into the future of gaming, at least for enthusiasts like me who don’t mind strapping crazy tech to their faces. But it’s only a glimpse; we haven’t arrived yet.

Growing pains


Any early prototype comes with a laundry list of issues, and Viture sent over a list of problems it’s aware of and working on for the glasses that will ship. I’m focused more on the hurdles that come up when designing a unique product, and I hope to see Viture address these issues either before launch or in a version two.

Above all else, size is an issue. You’re given three nose pads in different sizes, but none of them fit my (admittedly large) nose. Comfort isn’t the issue here, either. If the glasses aren’t positioned on your face in the right way, you can’t see the full screen. I’m well aware of how awkward the glasses look on my face, but that was the only way I could set them and still see the screen.

There’s a reason that regular glasses have so many points of adjustment, and it’s hard to have that flexibility with how much tech is inside the Viture Ones. The ergonomics definitely need more tuning and more flexibility for larger heads.

The glasses themselves don’t have much computing power in them. If you want to access the Android TV operating system, you’ll need to connect the glasses to the neckband. The band is super comfortable and light, and all of your controls are easy to access. Within a couple of hours, I knew where everything was without a second thought.


The actual computing power is inside the neckband, and it’s actively cooled. The neckband warms up, and you can hear a fan inside trying to keep everything cool with minimal ventilation. It’s not uncomfortable, but with the lackluster built-in speakers, it feels like the fan noise and the speakers are fighting against each other.

I didn’t get to try out the optional mobile dock, the third part of the Viture One package. The dock is exclusively for the Nintendo Switch and connects directly to the console. If it performs anywhere near as well as the Steam Deck connection did, that’s great, and Viture says it can even upscale the Switch’s 720p, 30-fps output to 1080p at 60 fps. That’s a bold claim; we’re talking about the Nintendo Switch here.
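To put that upscaling claim in perspective, here’s the raw arithmetic (my own back-of-envelope math, not Viture’s numbers): going from 720p at 30 fps to 1080p at 60 fps means 2.25x the pixels per frame and twice the frames per second, so 4.5x the pixel throughput overall.

```python
# Back-of-envelope cost of upscaling 720p/30fps to 1080p/60fps.
src_pixels = 1280 * 720                  # pixels per frame at 720p
dst_pixels = 1920 * 1080                 # pixels per frame at 1080p
pixel_factor = dst_pixels / src_pixels   # 2.25x pixels per frame
frame_factor = 60 / 30                   # 2x frames per second
print(pixel_factor * frame_factor)       # 4.5x pixel throughput overall
```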

Racing toward the finish line


The Viture Ones are the first step in what could become a popular category over the next few years, especially as we see glasses like the Lenovo Glasses T1 start to pop up.

There are some usability hurdles to overcome, but Viture has clearly done a lot to get its first version right out of the gate. The glasses work, and that’s about as much as I can ask for right now.

Editors’ Choice

Repost: Original Source and Author Link


VR in a pair of glasses? New research just made it possible

Many technology companies are working on improving aspects of VR, including making it more vivid and realistic. Now, new research from Stanford University and Nvidia aims to make headsets easier to wear.

The device is a push to make VR glasses look more like everyday glasses rather than the large, wrap-around VR headsets on the market today. The prototype, called “Holographic Glasses,” can provide “a full-color 3D holographic image using optics that are only 2.5 millimeters thick” and weighs 60 grams. For comparison, the researchers cite the Meta Quest 2, which weighs 503 grams.

The design remains rudimentary at this point, with film ribbons extending from each lens. However, the team says the design has many benefits beyond a smaller and thinner VR frame: the “pancake lenses” on the Holographic Glasses prototype in theory allow for unlimited resolution and a field of view of up to 200 degrees.

The current prototype does face a number of limitations, including a field of view of only 22.8 degrees. The Holographic Glasses also need to measure and track the user’s pupil very accurately, which will depend on a more developed design.

Overall, it would take a larger company investing in this concept to bring it to market as a consumer product.

You can read more details about the project in the research paper, “Holographic Glasses for Virtual Reality” by Jonghyun Kim, Manu Gopakumar, Suyeon Choi, Yifan Peng, Ward Lopes, and Gordon Wetzstein.


Meanwhile, brands continue to develop technology for standard VR and AR headsets. In June, Meta CEO Mark Zuckerberg showcased technologies from at least four VR headset prototypes the company is currently working on that might translate into a consumer product later this year. These technologies focus on solving issues around resolution, focal depth, optical distortion, and HDR, with “focal depth” and “retinal resolution” being similar to the goals Stanford and Nvidia aim to meet with their own prototype.

Focal depth aims to account for how the eyes shift focus between objects in VR, while retinal resolution aims to match 20/20 vision on the headset display. Such developments stand to improve on the specs of Meta’s current headsets, such as the Quest 2. However, they are especially expected to benefit the brand’s long-rumored VR headset, currently known as “Project Cambria,” or the Meta Quest Pro.

Meanwhile, Apple is rumored to be developing its own mixed-reality headset, which is rumored to weigh just 150 grams.



Steam games are coming to Nreal’s augmented reality glasses

Nreal users can now play some Steam games on their augmented reality glasses. The Chinese company has released the beta version of “Steam on Nreal,” which gives users a way to stream games from their PC to their AR eyewear. Nreal admits that installing the beta release will require a bit of effort during the setup process, and the current version is not optimized for all Steam games just yet. It will work on both Nreal Light and Nreal Air models, though, and it already supports some popular titles like the entire Halo series. 

To note, users can already play games on Nreal’s glasses by accessing Xbox Cloud Gaming on a browser inside the company’s 3D system called Nebula. But Steam on Nreal will give users who don’t have Xbox accounts the opportunity to see what gaming on the device would be like. Company co-founder Peng Jin said the beta release is “meant to give people a glimpse into what is possible.” He added: “AAA games should be played on a 200-inch HD screen and they should be played free of location restrictions.”

Nreal launched its Light mixed reality glasses in 2020 after a US court ruled in its favor in a lawsuit filed by Magic Leap. The American company had accused its former employee Chi Xu of using stolen secrets to set up Nreal, but the court decided that Magic Leap failed to make a viable claim. In 2021, Nreal launched a new model called Air, designed with streaming shows and playing mobile games in mind. The Air looks more like a pair of ordinary sunglasses than its predecessor does, and it also comes with a better display.

In an effort to offer more content and perhaps entice those on the fence to grab a pair of its glasses, Nreal has also announced AR Jam, an online international contest for AR developers that will kick off on June 27th. Developers can compete in various categories that include at-home fitness, art, games and video, with each one having a $10,000 grand prize. Those interested can head over to the company’s Developer page for more information.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.



Apple’s rumored release of its AR glasses demands patience

While the Apple rumor mill has recently been focusing on the company’s reported mixed reality headset, Apple engineers are also believed to be working on a pair of high-tech glasses featuring augmented reality (AR) technology.

The tech giant has remained characteristically tight-lipped on the matter, though a new report from 9to5Mac suggests that it’s aiming to launch the advanced specs in “late 2024.” That means they could arrive a whole two years after Apple’s AR/VR headset, which some suggest will be unveiled later this year before landing in stores in early 2023.

The information comes courtesy of oft-quoted analyst Jeff Pu of Haitong Intl Tech Research, who in a note seen by 9to5Mac said he believes that Apple will introduce its first AR glasses in the second half of 2024.

Pu also claimed that Apple would launch the second generation of its AR/VR headset in late 2024, around the same time as the AR glasses and about a year after the first-gen headset seems set to land.

Not a great deal is known about Apple’s rumored glasses. Most leaks in recent months have referenced Apple’s AR/VR headset, with Apple engineers reportedly having recently presented a prototype of the device to the company’s top team, including Apple CEO Tim Cook.

What seems certain is that the mixed reality headset will be the more advanced of the two devices, likely packing a suite of AR/VR technologies powered by Apple Silicon and possibly including an 8K display for each eye. The so-called “Apple Glasses,” meanwhile, will reportedly function primarily as a display for the iPhone, showing information transmitted from the handset.

With the first iteration of the rumored specs apparently a long way from any kind of launch, it’s possible that Apple is still fiddling with the design, so the final product could be markedly different from the one it’s playing with today. The company could even abandon the project if it feels it’s not making any progress with it.

There’s certainly a lot more information out there regarding the AR/VR headset, suggesting it’s well on its way to becoming a … ahem … reality.


Niantic’s AR glasses teaser has my inner Pokémon trainer excited

Pokémon GO is an addictive game, and you can find yourself stuck to the screen hunting mythical creatures for many hours a week. But wouldn’t it be fantastic if we could do all of that without a phone?

I hope that’s possible through Niantic’s AR glasses, a teaser of which Niantic CEO John Hanke shared on Twitter. The company, which is also behind Pokémon GO, didn’t say much about when the glasses will be available or exactly what they’ll be capable of.

While Pokémon GO on AR glasses might be a pipe dream for now, it’s no secret that Niantic wants to map the world for augmented reality. Last year, the firm announced the Niantic Real World Platform to enable AR experiences on gadgets ranging from phones to “the wearable devices of the future.”

Earlier this month, Niantic showed off a demo of real-world Pokémon GO gameplay on the HoloLens 2 headset at Microsoft’s Ignite developer conference.

The demo indicated that the game could work on mixed reality glasses — at least in a test environment. The company hasn’t indicated when it plans to unveil these glasses. We’ve asked Niantic for a comment, and we’ll update the story if we hear back.

If analyst predictions are correct, Apple’s own AR glasses are coming in 2025, and Oppo’s set to release an AR headset with gesture control this year. So the AR glasses market is going to be very hot for the next few years. I, for one, can’t wait to play Pokemon GO with dorky AR glasses on.


Published March 30, 2021 — 07:08 UTC


Facebook reveals its Smart Glasses’ nerve-tracking wristband tech

Facebook has revealed another step in its path to augmented reality Smart Glasses, a wrist-based controller that blends AI and nerve-tracking EMG to leapfrog traditional input systems. While the eventual goal is a super-smart artificial intelligence that instinctively intuits what you might want from your high-tech glasses, Facebook says, this wrist-based input controller is much more practical in the shorter term.

Earlier this month, Facebook discussed its roughly 10-year vision for smart glasses and augmented reality. Designed to be comfortable for all-day wear, and building on technological advances like transparent displays, the glasses would also feature a new, proactive AI that would effectively be your co-pilot and personal assistant.

“The AI will make deep inferences about what information you might need or things you might want to do in various contexts, based on an understanding of you and your surroundings, and will present you with a tailored set of choices. The input will make selecting a choice effortless — using it will be as easy as clicking a virtual, always-available button through a slight movement of your finger,” Facebook explains. “But this system is many years off.”

In the shorter term, then, Facebook is working on a different approach. Its Facebook Reality Labs (FRL) Research team has been exploring how to combine a less all-encompassing AI with better ways to track and respond to human input, beyond keyboards, trackpads, and voice commands. A “usable but limited contextualized AI” could then basically fill in the blanks.

The system developed is worn like a watch, but uses electromyography (EMG) to track the electric motor nerve signals passing through the wrist. “The signals through the wrist are so clear that EMG can understand finger motion of just a millimeter,” Facebook explains. “That means input can be effortless. Ultimately, it may even be possible to sense just the intention to move a finger.”

Such a system could track the messages sent from the brain to the fingers when it intends to type, for example, or swipe across a touchscreen, and share those intents with the AI. Initially, FRL says, it’s looking at just one or two such interactions, the equivalent of tapping a button. Clicking fingers and pinching-and-releasing the thumb and forefinger are the start, though eventually the team plans to expand that to full control over virtual UIs and objects.

Other possibilities are typing on a virtual keyboard, with the EMG wrist sensor monitoring each finger movement and figuring out which imaginary key you’re likely to have wanted to tap. The AI, meanwhile, would contribute its own predictions and customizations, to improve accuracy.

The short-term goal is what’s being described as an “intelligent click”: effectively a limited AI that’s aware of environmental and situational context, and which presents its suggestion for the most likely goal, to be confirmed with a virtual EMG-powered click.

“Perhaps you head outside for a jog and, based on your past behavior, the system thinks you’re most likely to want to listen to your running playlist,” Tanya Jonker, FRL Research Science Manager, suggests. “It then presents that option to you on the display: ‘Play running playlist?’ That’s the adaptive interface at work.”

It’d be combined with haptics, with the wristband opening the way to giving a physical sense of feedback from a virtual interaction or object. “You might feel a series of vibrations and pulses to alert you when you received an email marked “urgent,” while a normal email might have a single pulse or no haptic feedback at all, depending on your preferences,” Facebook says. “When a phone call comes in, a custom piece of haptic feedback on the wrist could let you know who’s calling.”

Alternatively, AR games could use haptics to give the feeling of using a virtual bow-and-arrow, or other weapon or tool. “Haptic emojis” could map emotion emojis to different haptic feedback. FRL has developed a special wristband it calls Bellowband, named for the eight pneumatic chambers it consists of. These can be precisely controlled to deliver pressure and vibration to the wearer’s wrists. A second prototype, dubbed Tasbi (Tactile and Squeeze Bracelet Interface), uses six vibrotactile actuators, along with a different wrist-squeeze mechanism.

The end goal is a wearable computer that feels collaborative and useful, rather than omnipresent and demanding. Eventually, FRL predicts, a combination of smart haptics, virtual displays, and non-traditional input and control methods will allow for many of the interactions to be pretty much unconscious, less distracting than using a current smartphone or laptop. All the same, it’s doing that with the idea of ensuring “meaningful boundaries” between users and their devices too.

Any production version of the prototypes is still some way off, FRL concedes. Still, it insists that the EMG technology is relatively near-term in its potential, though exactly what that means in terms of a product roadmap remains a mystery.


I’m not a gamer but I really want Razer’s smart glasses — for work

Razer‘s products are usually designed for gamers, and I, an Apple Arcade person, can’t claim to be one. I don’t sit in my gaming chair and fire up a custom PC to play the latest titles with graphics settings at full blast.

However, the company’s latest product has piqued my interest, and I imagine I’m not alone: a pair of smart glasses. Before you jump the gun, no, it doesn’t show you a ton of information via augmented reality (AR) like Google Glass and the rest.

In terms of smart features, the glasses connect to your phone or computer through Bluetooth. You can take voice or video calls through a built-in mic and speakers, and you can listen to music through those open-design speakers, which have 16mm drivers. Plus, you can control tracks and activate voice assistants through a touch-enabled side panel.

I know these functions are available on a pair of headphones, but something about having them in your glasses is attractive.