Apple has an augmented reality (AR) headset in the works, and a well-known analyst now predicts that it will track hand movements using Face ID-style sensing.
The upcoming headset is said to be equipped with more 3D sensing modules than iPhones and, according to the report, may one day replace iPhones altogether.
The information comes from a note for investors prepared by Ming-Chi Kuo, a respected analyst, which was then shared by MacRumors. In his report, he elaborates on the kind of performance and features we can expect from the upcoming Apple AR/MR (augmented reality/mixed reality) headset.
According to Kuo, the new headset will feature four sets of 3D sensors, compared with the one or two sets offered by the latest iPhones. The extra sensors open the headset up to a range of new capabilities and extend the realism of the user experience.
The sensors used in the new Apple headset rely on structured light to detect motion and actions. Kuo predicts that this will make it possible for the headset to track not just the position of the user, but also the hands of the user and other people, objects in front of the user, and lastly, detailed changes in hand movements.
Kuo compared the headset’s ability to track small hand movements to the way Apple’s Face ID is capable of tracking changes in facial expressions. Being able to detect small hand and finger movements allows for a more intuitive user interface that doesn’t take away from the realism of using an AR/MR headset.
Both the iPhone and the as-yet-unnamed Apple headset rely on structured light, but the headset needs to be more powerful than the iPhone in order to offer proper hand movement detection. Kuo notes that this means the headset's structured-light system will consume more power.
“We predict that the detection distance of Apple’s AR/MR headset with structured light is 100% to 200% farther than the detection distance of the iPhone Face ID. To increase the field of view for gesture detection, we predict that the Apple AR/MR headset will be equipped with three sets of ToFs (time of flight) to detect hand movement trajectories with low latency requirements,” said Ming-Chi Kuo in his investor note.
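To make Kuo's figures concrete: "100% farther" means double the distance, and "200% farther" means triple it. The iPhone Face ID baseline distance used below is an assumption for illustration only; the investor note does not give one.

```python
# Illustrative arithmetic for Kuo's claim that the headset's structured-light
# detection distance is 100% to 200% farther than iPhone Face ID's.
# The ~0.5 m Face ID baseline is an assumed figure, not from the note.
faceid_range_m = 0.5

# "+100%" doubles the range; "+200%" triples it.
headset_range_m = (faceid_range_m * 2.0, faceid_range_m * 3.0)

print(headset_range_m)  # (1.0, 1.5)
```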
Kuo believes that Apple may one day wish to replace the iPhone with the AR headset, although that won’t happen anytime soon. He predicts that in the next 10 years, headsets may very well replace existing electronics with displays.
With the added hand gesture tracking, the new Apple headset may offer an immersive user experience. As rumors suggest that Apple may be looking to join Meta and other companies in expanding toward the metaverse, it’s possible that this headset might be the first step toward just that.
HTC has rolled out a firmware update for the latest standalone Vive Focus that greatly improves its hand-tracking capabilities. The company says firmware version 3.0.999.284 significantly improves the feature’s performance, stability and accuracy. HTC’s Vive Focus 3 launched with hand tracking back in July, allowing users to use their hands as controllers. With this software engine upgrade, HTC says the headset will be able to track fast hand movements more easily and recognize pinch-to-zoom gestures more accurately.
Because the company has opened the feature up to developers, these improvements should translate to better hand tracking within applications. Developers can integrate the headset's six current predefined hand gestures into their VR apps, and HTC has previously said that additional gestures will be added in the future.
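Integrating a fixed set of predefined gestures typically boils down to mapping recognized gesture names to app callbacks. Here is a minimal sketch of that pattern; the gesture names and the dispatcher are hypothetical illustrations, not HTC's actual Vive Wave SDK API, and the article does not list the six real gestures.

```python
# Hypothetical sketch: routing a tracker's predefined gesture events to
# application callbacks. Names are placeholders, not HTC's real gesture set.
PREDEFINED_GESTURES = {"point", "pinch", "fist", "ok", "like", "five"}

class GestureDispatcher:
    """Maps recognized gesture names to application callbacks."""

    def __init__(self):
        self._handlers = {}

    def on(self, gesture, callback):
        # Only allow gestures the tracker can actually recognize.
        if gesture not in PREDEFINED_GESTURES:
            raise ValueError(f"unknown gesture: {gesture}")
        self._handlers.setdefault(gesture, []).append(callback)

    def dispatch(self, gesture):
        # Called each frame with whatever gesture the tracker reported;
        # gestures with no registered handler are silently ignored.
        for callback in self._handlers.get(gesture, []):
            callback(gesture)

events = []
dispatcher = GestureDispatcher()
dispatcher.on("pinch", lambda g: events.append(f"zoom via {g}"))
dispatcher.dispatch("pinch")
dispatcher.dispatch("fist")  # no handler registered, so nothing happens
```

The point of the indirection is that the app never polls raw hand poses; it reacts only to the discrete gestures the headset's engine recognizes, which is why a firmware-level accuracy upgrade improves every app for free.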
HTC said in its announcement:
“Being able to navigate virtual environments naturally and intuitively will go a long way towards making VR more accessible to everyone, no matter their familiarity with technology. As we step into the metaverse era, we couldn’t be more excited to bring these quality-of-life improvements to all VIVE Focus 3 customers around the world.”
When the manufacturer launched the Vive Focus 3 back in July, we found it to be the best standalone VR headset yet. It’s not a direct competitor to the Quest 2, however, seeing as it costs $1,300. Unlike the Oculus (now Meta) headset, it targets business users and not ordinary consumers who want to enjoy VR experiences in their own home.
At a 2020 meeting of the World Economic Forum in Davos, Salesforce founder Marc Benioff declared that “capitalism as we have known it is dead.” In its place now is stakeholder capitalism, a form of capitalism that has been spearheaded by Klaus Schwab, founder of the World Economic Forum, over the past 50 years. As Benioff put it, stakeholder capitalism is “a more fair, a more just, a more equitable, a more sustainable way of doing business that values all stakeholders, as well as all shareholders.”
Unlike shareholder capitalism, which is measured primarily by the monetary profit generated for a business’ shareholders alone, stakeholder capitalism requires that business activity should benefit all stakeholders associated with the business. These stakeholders can include the shareholders, the employees, the customers, the local community, the environment, etc. As an example, Benioff’s approach includes homeless people in San Francisco as stakeholders in Salesforce.
While believers in stakeholder capitalism have been working on the idea for some time now, an important milestone was reached in early 2021. Following discussion at the 2020 meeting led by Bank of America CEO Brian Moynihan, a formalized set of ESG (environmental, social, and corporate governance) metrics was announced that businesses can report, indexed around four pillars:
Principles of governance
Planet
People
Prosperity
These metrics are important because they make it possible to audit a business's compliance with the principles of stakeholder capitalism.
Given the role that technology plays within business, it is impossible to overlook the growing impact that artificial intelligence will have on society and the parallels to the discussion of stakeholder capitalism. Many businesses are transitioning from a goal of pure profit to the more inclusive and responsible goals of stakeholder capitalism. In AI, we are at the start of a similar transition, one that moves from maximizing pure accuracy to goals that are inclusive and responsible. In fact, given the prevalence of AI technologies across businesses, they will become critical components of stakeholder capitalism.
Also present at the 2020 meeting was then-IBM CEO Ginni Rometty, who, when questioned about stakeholder capitalism in the context of the 4th Industrial Revolution, said that this is "going to be the decade of trust." It is critical that all stakeholders trust in business and the technologies it uses. With respect to AI, Rometty said it is important to have a set of ethical principles (such as transparency, bias mitigation, and explainability) and that you should audit your business against them.
Not all organizations will have adopted stakeholder capitalism principles as vocally and publicly as Benioff's Salesforce. However, businesses still have traditional CSR (corporate social responsibility) requirements, and in the context of AI, existing and proposed regulations contain themes similar to those discussed in the context of stakeholder capitalism at the World Economic Forum meeting.
Shortly after the stakeholder capitalism ESG metrics were announced in January of this year, the U.S. Department of Defense announced its AI ethical principles in February. The European Union followed with proposed AI regulation in April (which affects businesses both inside and outside the EU), and the UK announced its guidance on the ethics of algorithmic decision-making in May. Look at these announcements (and the 2019 proposed Algorithmic Accountability Act in the United States) and you will see many requirements, including those for governance, transparency, and fairness, that align clearly with the goals and metrics of stakeholder capitalism.
So, just over a year into this decade of trust, what should businesses be doing? IBM has introduced the role of Chief AI Ethics Officer, and Deloitte gives plenty of detail on what the role entails. Not all businesses will be ready for this role yet, but they can start by documenting their ethical principles. As Rometty pointed out, it is important to know what you stand for as a company. What are your values? These lead to a set of ethical principles, which in turn can lead you to form your own (or adopt an existing) AI ethics framework for your business.
Again, drawing a parallel to the ESG metrics announced in January, which take stakeholder capitalism from talk to auditable action: you must test and audit your AI systems against your framework in order to move beyond talk and demonstrate, with hard metrics, your AI systems' compliance (or lack thereof).
Thorough, auditable ethics for AI should not be seen to be at odds with your business goals. As Rometty put it, “it is not good for anyone if people do not find the digital era to be an inclusive era where they see a better future for themselves.” Effective governance of AI ethics provides benefit to all stakeholders and that includes the shareholders too.
Stuart Battersby is Chief Technology Officer of Chatterbox Labs, where he leads a team of scientists and engineers who have built the AI Model Insights platform from the ground up. He holds a PhD in Cognitive Science from Queen Mary, University of London.
The LG G8 can’t fold, doesn’t take 4K selfies, and won’t wirelessly charge another phone. It pretty much uses the same design as the G7 from last year, which means it doesn’t have a triple-camera setup like the V40. But it just might be one of the most intriguing phones of the year.
Whether that translates into sales is the big question, but my usually jaded hands couldn’t wait to pick up the G8 following LG’s briefing. I wanted to try it out almost as much as Samsung’s $2,000 Galaxy Fold. And I walked away feeling like LG might be onto something, after a string of one-and-done gimmicks going all the way back to the G5’s modular Friends accessories and the V20’s second screen.
In person, the G8 is essentially a G7 with a vertical camera array. On paper, the G8 is a typical LG flagship, filled with high-end specs that put it in good company with the rest of 2019's premium phones.
But the G8 is LG's first 'G' phone in a while that doesn't feel at least a little hobbled. For one, it finally uses an OLED display, a feature previously reserved for its 'V' phones. More importantly, it didn't have to wait for the newest Snapdragon 855 processor. Qualcomm's exclusive collaboration with Samsung on the 835 and 845 forced the G7 to arrive months late and the G6 to use an older chip. The G8 also gets the other features LG has been adding to its 'G' phones over the years: Quad DAC, Boombox sound, IP68 water resistance, HDR10, AI Cam, and the dedicated Google Assistant button.
Design-wise, the G8 is still very much an LG phone, with a notch and a chin, and noticeable bezels all around. It’s not terrible, but it’s not going to win any awards either, and like the G7 it emulates, it looks a little stale next to the latest handsets from Samsung and Apple. The back camera is entirely under glass, however, which gives the phone a sleeker touch.
Inside the notch you'll find the biggest changes to the G8. First off, you won't find a receiver, because LG has turned the whole display into a speaker by combining its Boombox amplifier with its new Crystal Sound OLED tech. LG says you'll be able to listen clearly underwater (something I wasn't able to test), but you'll need to be more mindful of your call volume lest anyone nearby hear what the other person is saying.
Also new to the notch is a time-of-flight camera, and it's here that LG gets wild. LG previously announced it was using the Infineon sensor, but now we know it's for more than facial recognition and enhanced selfies (though the G8 brings both of those things). LG is using its 'Z' camera to let you control your phone in truly unique ways, via two features called Hand ID and AirMotion.
If those aren’t the most LG names ever, I don’t know what are. When the new features were announced, there were audible snickers in the room. But while they may seem like the kind of eye-rolling gimmicks typical of a 2015 phone rather than a 2019 one, they’re not as silly as they sound. Even after just an hour of playing around with them, I could see how they could be useful. If LG takes the time to develop them, the G8’s touchless gestures might one day become as commonplace as the fingerprint sensor or navigation bar.
Speaking of the fingerprint sensor, the G8 still has a standard one on the back, but that’s the least convenient way to unlock it. Like the iPhone, you can also register a 3D scan of your face for secure facial unlocking, and LG says it works in all kinds of light thanks to the ToF sensor. I didn’t get to check it in low light, but it worked well in a normal shadowy setting.
Far more unique and revolutionary is Hand ID, the other new secure biometric on the G8. Hand ID uses the camera's infrared sensor to read the absorption characteristics of hemoglobin in the veins of your hand. If that sounds crazy, it's because it is. What's even crazier is that it works.
At least it does when you get it right. It took me a few minutes to figure out where to position my hand and how high to hold it above the camera: about six inches above the phone and close to the notch. That's pretty specific placement. But when I nailed it, waving my hand over the G8 really did unlock the phone instantaneously.
Now, I know what you’re thinking: Why would I ever need a hand sensor when the G8 has a fingerprint sensor and 3D facial recognition? I thought the same thing until I unlocked a phone lying on a table, without contorting my head so an iris scanner could read my eyes, or reaching around for a fingerprint scanner. As gimmicks go, it’s a pretty good one.
Unlocking isn't all your hands can do on the G8. There's also AirMotion, a brand-new navigation system that lets you control basic commands without touching the screen. Like Hand ID, it's equal parts finicky and pretty amazing. Once the sensor locks onto your hand (again, my best results came when my hand was roughly six inches from the sensor), you can open apps, control music, and answer calls with a flick of your wrist, which is handy when your hands are wet or your phone is in a holder. You'll see an infrared scan of your hand in a tiny window at the top of the screen.
In its current state, AirMotion is pretty basic—more of a party trick than a game-changing feature. But I can see the potential, assuming LG continues to develop the technology. I can see AirMotion actually becoming a trademark feature of LG’s phones that gives people a reason to choose them again. I had a blast using it and can’t wait to try it in a less controlled setting. Even more, I want to see the future—what the LG G10 is going to do with it, two generations out.