The Last of Us Part I, a complete remake of the PS3 classic The Last of Us, will launch on PS5 (and PC at some point in the future). Not only will the game include all of the accessibility features from 2020’s The Last of Us Part II, but Naughty Dog has also revealed some extra ones it’s including.
One that takes advantage of the DualSense controller’s haptics seems particularly novel. “[A feature that] started as a prototype but ended up being really successful during playtesting is a feature that plays dialogue through the PS5 DualSense controller as haptic feedback,” game director Matthew Gallant told the PlayStation Blog. “That way a deaf player can feel the way a line is delivered, can feel the emphasis, along with the subtitles to give some sense of how that line is delivered.”
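Naughty Dog hasn’t published the signal chain behind this feature, but the general idea (driving rumble strength from a dialogue line’s loudness envelope) can be sketched in a few lines of Python. The sample rates and the RMS approach below are assumptions for illustration, not the studio’s implementation:

```python
import numpy as np

SAMPLE_RATE = 48_000   # audio sample rate in Hz (assumed)
HAPTIC_RATE = 100      # haptic updates per second (assumed)

def haptic_envelope(dialogue: np.ndarray) -> np.ndarray:
    """Reduce a mono dialogue waveform to one rumble strength (0..1) per haptic frame."""
    frame = SAMPLE_RATE // HAPTIC_RATE                 # samples per haptic frame
    n_frames = len(dialogue) // frame
    chunks = dialogue[: n_frames * frame].reshape(n_frames, frame)
    rms = np.sqrt((chunks ** 2).mean(axis=1))          # loudness of each frame
    peak = rms.max()
    return rms / peak if peak > 0 else rms             # normalize to 0..1

# A toy "line": a quiet word followed by an emphasized one.
t = np.arange(9_600) / SAMPLE_RATE                     # 0.2 s per word
quiet = 0.1 * np.sin(2 * np.pi * 220 * t)
loud = 0.8 * np.sin(2 * np.pi * 220 * t)
env = haptic_envelope(np.concatenate([quiet, loud]))   # emphasis shows up as stronger rumble
```

A stressed word produces louder frames and therefore stronger vibration, which is roughly the “feel the emphasis” effect Gallant describes, delivered alongside the subtitles.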
Another big accessibility update is audio descriptions for cutscenes. Gallant said Naughty Dog teamed up with a company that delivers descriptions for TV, movies and game trailers. The feature will be available across all the localized languages in The Last of Us Part I. “We’re expecting this to be an accessible experience for blind players, for deaf players, for players with motor accessibility needs,” Gallant said.
The blog post details all of the settings, including presets for vision, hearing and motor accessibility. You can expect visual aids and a way to zoom into a specific section of the screen using the touchpad. You’ll be able to fully remap the controls — there’s even the option to link a command to shaking the DualSense. There are also in-depth settings for motion sickness, navigation, traversal, combat, the heads-up display and, of course, difficulty.
It’s heartening to see Naughty Dog place so much emphasis on making its games as accessible as possible. Not every developer has the resources of that studio, but here’s hoping more game creators take inspiration from Naughty Dog’s work in this area. In the meantime, if you want to find out much more about The Last of Us Part I, you can read Engadget’s review on August 31st.
The metaverse seems to be coming, and so is the futuristic hardware that will increase immersion in virtual worlds. Meta, the company formerly known as Facebook, has shared how its efforts to usher in that new reality focus on letting people actually feel sensations in a virtual world.
The engineers at Meta have developed a number of early prototypes that tackle this goal, including both haptic suits and gloves that could enable real-time tactile sensations.
Meta’s Reality Labs was tasked with developing, and in many cases inventing, new technologies that would enable greater human-computer interaction. The company laid out a vision earlier this year for the future of augmented reality (AR) and VR and how to best interact with virtual objects. This kind of research is crucial if we’re moving toward a future where a good chunk of our day is spent inside virtual 3D worlds.
Sean Keller, Reality Labs research director, said the team wants to build interactions that feel just as natural in the AR/VR world as they do in the real world. The problem, he admits, is that the technology isn’t yet advanced enough, and such an experience probably won’t arrive for another 10 to 15 years.
According to Keller, we’d ideally use haptic gloves that are soft, lightweight, and able to accurately reproduce the pressure, texture, and vibration that correspond with a virtual object. That requires hundreds of tiny actuators that can simulate physical sensations. Existing mechanical actuators are too bulky, expensive, and hot to realistically work well; Keller says the job requires softer, more pliable materials.
To solve this problem, the Reality Labs teams turned to research into prosthetic limbs, namely soft robotics and microfluidics. The researchers were able to create the world’s first high-speed microfluidic processor, which is able to control the air flow that moves tiny, soft actuators. The chip tells the valves in the actuators when to move and how far.
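The control idea described here, a chip telling each valve when to open and how far, can be caricatured as a per-tick proportional controller. The gain, the linear model, and the inflate-only simplification are all invented for this sketch; the real microfluidic processor is far more sophisticated:

```python
# Toy version of the valve-control loop described above. The gain and
# clamping values are invented purely for illustration; real microfluidic
# valve response and pressure dynamics are far more complex.

def valve_commands(targets, currents, gain=0.5, max_open=1.0):
    """One control tick: return a valve opening (0..max_open) per actuator.

    Only inflation is modeled; a negative error (over-inflated actuator)
    simply leaves the inflate valve shut in this sketch.
    """
    commands = []
    for target, current in zip(targets, currents):
        error = target - current                      # how far the actuator must move
        opening = min(max(gain * error, 0.0), max_open)
        commands.append(opening)
    return commands

# Actuator 0 needs to inflate a lot; actuator 1 is already on target.
ticks = valve_commands([1.0, 0.2], [0.2, 0.2])
```

Run every tick, a loop like this nudges each soft actuator toward its target, which is the “when to move and how far” behavior the chip provides.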
The research team was able to create prototype gloves, but the process requires them to be “made individually by skilled engineers and technicians who manufacture the subsystems and assemble the gloves largely by hand.” In order to build haptic gloves at scale for billions of people, new manufacturing processes would have to be invented. Not only do the gloves have to house all of the electronics and sensors, they also have to be slim, lightweight, and comfortable to wear for extended periods of time.
The Reality Labs materials group experimented with various polymers to turn them into fine fibers that could be woven into the gloves. To make it even more efficient, the team is trying to build multiple functions into the fibers including capacitance, conductivity, and sensing.
There have been other attempts at creating realistic haptic feedback. Researchers at the University of Chicago have been experimenting with “chemical haptics.” This involves using various chemicals to simulate different sensations. For example, capsaicin can be used to simulate heat or warmth while menthol does the opposite by simulating coolness.
Meta’s research into microfluidic processors and tiny sensors woven into gloves may be a bit more realistic than chemicals applied to the skin. It will definitely be interesting to see where Reality Labs takes its research as we move closer to the metaverse.
I never thought vibrating headphones would be a good idea until I tried the recently released Razer Kraken V3 HyperSense headset. It’s a jarring concept at first — why would I want my headset to vibrate? But after spending some time with Razer’s HyperSense technology, I’m a believer that haptic feedback will show up in the best gaming headsets in the future.
But the concept requires some faith. Although haptics have a chance to elevate gaming, watching movies, and listening to music, the options available today aren’t great — Kraken V3 HyperSense included. Here’s why haptic feedback headphones are a great idea and what companies need to do to make the tech better.
Why haptic feedback headphones make perfect sense
Vibrating headphones might seem like a gimmick, but they make perfect sense. Sound is vibration, physical vibration is just sound that you don’t hear, and all the places where you’d want haptic feedback are the same places you’d hear low frequencies — in particular low bass parts. If you’ve ever used a pair of headphones with a “bass boost” feature, you already know this. Boosting the bass also vibrates the headphones.
High frequencies are clear; either you hear a high pitch or you don’t. Low frequencies are vaguer: as the frequency drops, sound starts to morph into feeling. Past a certain point, you stop hearing and start feeling.
HyperSense makes the low-end sound bigger. You don’t add a subwoofer to an audio system to hear low frequencies — you add one to feel low frequencies. HyperSense does the same thing in headphones. The problem is that, unlike a subwoofer, HyperSense isn’t producing a range of frequencies. It’s reacting to them, which can ultimately lead to a disjointed experience. That’s exactly how haptic feedback headphones feel today.
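That reactive design can be illustrated with a toy model: low-pass the headphone feed (cutoff assumed around 200Hz here) and use the filtered signal’s strength as the haptic drive level. This is a sketch of the concept, not Razer’s actual algorithm:

```python
import numpy as np

# A haptic driver doesn't need to "hear" anything; it only needs the part
# of the signal you'd otherwise feel. A one-pole low-pass filter keeps the
# rumble and discards the audible detail. Cutoff is an assumed value.

def low_pass(audio: np.ndarray, sample_rate: int, cutoff_hz: float = 200.0) -> np.ndarray:
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sample_rate)
    out = np.empty_like(audio)
    y = 0.0
    for i, x in enumerate(audio):
        y += alpha * (x - y)          # simple RC-style smoothing
        out[i] = y
    return out

def drive_level(audio: np.ndarray, sample_rate: int) -> float:
    """RMS of the low-passed signal: how hard to shake the ear cups."""
    return float(np.sqrt(np.mean(low_pass(audio, sample_rate) ** 2)))

sr = 48_000
t = np.arange(sr) / sr
rumble = np.sin(2 * np.pi * 50 * t)       # felt, barely heard
treble = np.sin(2 * np.pi * 5_000 * t)    # heard, not felt
```

Feed it sub-bass and the drive level is large; feed it treble and almost nothing gets through, which is exactly the subwoofer-like behavior described above.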
Expectation versus reality
The Razer Kraken V3 HyperSense headset sells you on expectations. Immediately after hearing about haptic feedback headphones, I conjured up images of bombastic bass blasts in blockbuster trailers, sounds of scraping shrapnel in AAA war video games, and the thump of a thick bass guitar grooving heavy on a beat.
For brief moments while using the headset, I experienced all of those scenarios — just not consistently. There’s an inherent flaw in the design of HyperSense: it works based on a threshold. Think about haptic feedback in a controller; developers choose when to trigger the haptics, what sounds or images they react to, and what vibration they’re trying to mimic.
That’s not what HyperSense does. It’s taking the audio that it’s fed and spitting out feedback based on, from my testing, a narrow range of low-end frequencies. Massive bass blasts send a ripple throughout the headset, but so does a deep voice. That leads to a strange disconnect where HyperSense draws you into an experience before pulling you immediately out of it.
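The disconnect is easy to reproduce in a toy model: if the trigger is simply “enough energy below roughly 200Hz,” a deep speaking voice and a cinematic bass blast look identical. The band and threshold here are assumptions for the demo, not Razer’s numbers:

```python
import numpy as np

# Threshold-style triggering on a low-frequency band, as described above.
# Both signals land in the band, so both fire the haptics: the "deep voice
# rumbles like an explosion" false positive.

def triggers_haptics(audio, sample_rate, band=(20.0, 200.0), threshold=0.25):
    """True if enough of the signal's spectral energy falls inside the band."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    in_band = spectrum[(freqs >= band[0]) & (freqs < band[1])].sum()
    total = spectrum.sum()
    return bool(total > 0 and in_band / total > threshold)

sr = 48_000
t = np.arange(sr) / sr
bass_blast = 0.9 * np.sin(2 * np.pi * 45 * t)     # explosion-style rumble
deep_voice = 0.4 * np.sin(2 * np.pi * 140 * t)    # fundamental of a low voice
```

Both test signals trip the trigger, even though only one of them “deserves” the rumble; that is the immersion-breaking behavior described above.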
After watching the Dune trailer and a compilation of trailers from the latest PlayStation showcase, I was ready to shout from the rooftops that HyperSense is the way to experience media with headphones. After playing through some of Guardians of the Galaxy and hearing Star-Lord’s voice reverberate through what sounded like a broken bass port, though, I have a different impression.
There’s a strange balancing act with HyperSense between what is and what could be. Although it would take an army, individual game and movie support could elevate HyperSense from an amusing extra to an essential feature on any pair of over-ear headphones.
Preparing for a headache
One of the problems with HyperSense is the intensity. Razer thankfully included a button on the Kraken V3 HyperSense that lets you adjust the intensity on the fly, but it offers only four settings: low, medium, high, or off. Even in the Synapse software, you can’t adjust the intensity any more finely.
The feedback would bob back and forth between being too much and not enough. At its best, the vibration was a nice reassurance that immersed me in a game or movie. At its worst, HyperSense would rock the headset halfway off my ears, produce no feedback at all, or give me a massive headache.
For haptic feedback headphones to work, you need to be able to adjust the vibration and the sound independently on the fly. It’s a balancing act, and even after dozens of hours of using the Kraken V3 HyperSense, I would reach for a feedback intensity dial that wasn’t there.
Independent, granular controls are essential because everything reacts a little differently to the haptic feedback. Most well-produced music with a consistent low-end worked well with the Kraken V3 HyperSense, but video games and movies were all over the place. HyperSense makes a bad audio mix apparent immediately.
Beyond fine control over the intensity of the vibration, this technology needs a way to filter out the junk. As mentioned, HyperSense operates within a range of frequencies, topping out somewhere around 200Hz.
There are a lot of junk frequencies between 100Hz and 200Hz, and I suspect the disjointed feeling of HyperSense is largely due to this range. Here, bass starts to sound like cardboard: not low enough to feel like a sub frequency, but not high enough to venture into the midrange. Filtering would not only lead to more consistent haptics but also allow users to tune the headset for fewer headaches.
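One way such filtering could work, purely as a sketch (the band edges come from the observations above; the design is not Razer’s): restrict the haptic drive to energy below the junk band, so the cardboard range never reaches the actuators.

```python
import numpy as np

# Drive the haptics only from sub-bass, discarding the 100-200 Hz
# "cardboard" region entirely. Band edges are assumed for illustration.

def filtered_drive(audio, sample_rate, lo=20.0, hi=100.0):
    """Fraction of spectral energy in the lo..hi band, used as drive strength."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    total = spectrum.sum()
    if total == 0:
        return 0.0
    return float(spectrum[(freqs >= lo) & (freqs < hi)].sum() / total)

sr = 48_000
t = np.arange(sr) / sr
sub_bass = np.sin(2 * np.pi * 50 * t)      # should still rumble
cardboard = np.sin(2 * np.pi * 150 * t)    # junk band: filtered out
```

Exposing `lo` and `hi` to the user would double as the tuning control the review asks for: narrow the band for fewer headaches, widen it for more feedback.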
Not quite there yet
HyperSense isn’t quite there yet. Although Razer now sells two headsets with the feature, it’s still more proof of concept than finished product. It needs independent, granular controls, as well as dedicated integrations in games and movies.