Category: Tech News

‘Bat-sense’ algorithm could be used to monitor people and property without cameras

A “bat-sense” algorithm that generates images from sounds could be used to catch burglars and monitor patients without using CCTV, the technique’s inventors say.

The machine-learning algorithm developed at Glasgow University uses reflected echoes to produce 3D pictures of the surrounding environment.

The researchers say smartphones and laptops running the algorithm could detect intruders and monitor care home patients.


Study lead author Dr Alex Turpin said two things set the tech apart from other systems:

Firstly, it requires data from just a single input — the microphone or the antenna — to create three-dimensional images. Secondly, we believe that the algorithm we’ve developed could turn any device with either of those pieces of kit into an echolocation device.

The system analyses sounds emitted by speakers or radio waves pulsed from small antennas. The algorithm measures how long it takes for these signals to bounce around a room and return to the sensor.

It then analyzes the signal to calculate the shape, size, and layout of the room, as well as pick out the presence of objects or people. Finally, the data is converted into 3D images that are displayed as a video feed.
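
The underlying measurement is ordinary time-of-flight: an echo that arrives later has travelled further. As a rough sketch of just that principle (not the Glasgow team's algorithm, which uses a trained neural network to build 3D images from a single channel), converting an echo delay into a distance looks like this:

```python
# Rough, illustrative sketch of the time-of-flight principle behind echolocation.
# The Glasgow system is far more involved: a trained network turns a single
# channel of echoes into full 3D images. Numbers here are purely for illustration.
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 C

def echo_distance(delay_seconds: float, speed: float = SPEED_OF_SOUND_M_S) -> float:
    """Distance to a reflector, given the round-trip delay of its echo."""
    return speed * delay_seconds / 2.0  # halve it: the pulse travels out and back

if __name__ == "__main__":
    # An echo arriving 23.3 ms after the pulse implies a reflector about 4 m away.
    print(f"{echo_distance(0.0233):.2f} m")
```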

Category: Game

Titanfall 2 goes free-to-play on Steam, but only temporarily

Apex Legends’ next season, dubbed “Legacy,” is on the horizon, and as the name suggests, the Apex Legends lore is revisiting some characters from its past. Specifically, we’ve seen references to a number of Titanfall 2 characters in the marketing for Apex Legends’ Legacy season, but if you really want to see where it all started, there’s no better way to do that than by playing Titanfall 2. Thankfully, Respawn has made that particularly easy this weekend if you happen to be a PC gamer.

Titanfall 2 has gone free-to-play on Steam for the weekend, giving interested players a chance to check it out without having to shell out the cash to buy it first. Recently, we’ve seen a number of Steam promotions that let players keep the free games permanently once they were claimed, but it’s important to note that this is not one of those promotions.

Instead, this is merely a temporary free-to-play weekend. Titanfall 2 is free to download and play on Steam now through Monday, May 3rd at 10 AM PDT/1 PM EDT. Once Monday morning rolls around, the game will no longer be free and you’ll have to buy it if you want to continue playing.

Sadly, it seems there’s no sale on Titanfall 2 to go along with this promotion, which suggests that this is more meant to promote the new Apex Legends season. Still, the Titanfall 2 Ultimate Edition runs $24.99 these days, and it’s worth pointing out that it’s included in EA Play, which is available through both Steam and Xbox Game Pass for PC.

This could be good news for the Titanfall 2 fans already playing through Steam, because for this weekend at least, it means that multiplayer lobbies will probably be packed with players. It isn’t hard to imagine the free-to-play promotion leading to at least a few new players beyond the weekend as well, so if you’ve been wanting to get in some game time with Titanfall 2, now is the time to do it.


Category: AI

AI Weekly: How the power grid can benefit from intelligent software

Google parent Alphabet’s “moonshot” X lab announced last week at the White House Leaders Summit on Climate that it’s working on a project for the electric grid. Over the past three years, the lab says it has been investigating “new computational tools” designed to bring the grid “out of the industrial age and into the age of intelligence.” Among other areas, X says it’s experimenting with (1) a real-time visualization that shows power moving onto and off the grid, (2) tools that simulate what might actually happen on the grid, and (3) a platform to make information about the grid useful to stakeholders.

The work is being led by Audrey Zibelman, former managing director of the Australian Energy Market Operator, which runs the country’s electricity and gas systems, and it remains in the planning stages. But experts believe the core of this effort — intelligent software — is likely to become increasingly important in the energy sector.

“Hybrid plants and battery energy storage now mean power plants can be controlled and can simulate traditional power plants, and this will require sophisticated IT to integrate forecasting of renewable energy production, along with forecasting prices,” Ric O’Connell, executive director of clean energy consulting firm GridLab, told VentureBeat via email.

The U.S. electrical grid has long been burdened by aging infrastructure. Sixty percent of distribution lines have surpassed their 50-year life expectancy, according to Black & Veatch, while the Brattle Group anticipates $1.5 trillion to $2 trillion in spending by 2030 to modernize the grid and maintain reliability. The latest report from the American Society of Civil Engineers found that current grid investment trends will lead to funding gaps of $42 billion for transmission and $94 billion for distribution by 2025.

Neil Sahota, chief innovation officer at the University of California, Irvine, says intelligent software opens the door to the deployment of AI designed for power grid use cases. Utilities are already employing AI to address the windfalls and fluctuations in energy usage. Precise load forecasting ensures operations aren’t interrupted, thereby preventing blackouts and brownouts. And it can bolster the efficiency of utilities’ internal processes, leading to reduced prices and improved service.
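
Production forecasting systems draw on weather, calendar effects, and long histories, but a toy sketch shows the shape of the problem: predict the next hour's demand from recent observations. Everything below, including the blending weights, is illustrative rather than taken from any utility's actual system.

```python
# Illustrative only: a naive short-term load forecast that blends the same hour
# yesterday with the most recent observation. Real systems use weather data,
# calendar effects, and far more sophisticated models.
from typing import Sequence

def naive_hourly_forecast(hourly_load_mw: Sequence[float]) -> float:
    """Forecast the next hour's load (MW) from at least 24 hours of history."""
    if len(hourly_load_mw) < 24:
        raise ValueError("need at least 24 hourly observations")
    same_hour_yesterday = hourly_load_mw[-24]
    latest = hourly_load_mw[-1]
    return 0.5 * same_hour_yesterday + 0.5 * latest

if __name__ == "__main__":
    # Two days of a simple daytime/nighttime demand pattern.
    history = [900 + (150 if 8 <= h % 24 <= 20 else 0) for h in range(48)]
    print(naive_hourly_forecast(history))
```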

“There are a lot of subtle clues that in aggregate show where and when a natural disaster can occur. To ‘see’ the clues, we need to process a lot of data across a broad spectrum of variables and look for subtle differences,” Sahota told VentureBeat via email. “This is difficult for people to do effectively but is in the wheelhouse of AI. Consider wildfires, where we are using climate information (including wind forecasts), drone surveillance, and satellite images to predict hot spots and how a fire may start and spread. AI can monitor all these millions of data points in real time and constantly generate prediction models.”

For example, startup Autogrid works with more than 50 utilities in 10 countries to deliver AI-informed power usage insights. Its platform makes 10 million predictions every 10 minutes and optimizes over 50 megawatts of power, which is enough to supply the average suburb. Flex, the company’s flagship product, predicts and controls tens of thousands of energy resources from millions of customers by ingesting, storing, and managing petabytes of data from trillions of endpoints. Using a combination of data science, machine learning, and network optimization algorithms, Flex models both physics and customer behavior, automatically anticipating and adjusting for supply and demand patterns.

O’Connell believes that efforts like X’s will face challenges, particularly on the distributed energy resource (DER) side of the equation. DER systems — small-scale power generation or storage technologies that provide an alternative to traditional power systems or enhance those systems — can be difficult to orchestrate because they might span solar panels, electric vehicle charging setups, and even smart thermostats. But if a digital transformation of the power grid succeeds, its long-term benefits could be significant, O’Connell says.

“Currently, when independent system operators want to add a new market participant type, it takes them a year to incorporate those changes. That’s legacy IT systems,” he said. “The IT systems that grid operators will need are going to have to get a serious upgrade from the ’90s technology that they use now.”

For AI coverage, send news tips to Kyle Wiggers — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI channel, The Machine.

Thanks for reading,

Kyle Wiggers

AI Staff Writer

VentureBeat


Category: Computing

How to Change File Associations in Windows 10

If you’ve ever run into the problem of Windows trying to open up a file in entirely the wrong application, you’re not alone. Although you can get around it using the “Open With” command, there is a way to make sure you don’t have to do that every time: Learn how to change file associations.

If you aren’t sure which file type you want to change the association of, right-click your desired file and click Properties from the resulting menu. Look for Type of File at the top of the window; the extension shown there (such as .txt or .docx) is its file type. You can change the association for a single file type or manage all of them from one location.

Changing one file type

The quickest method to change a single file type is to do it from that Open With menu we mentioned. If you want to change more than one file type at a time, skip to the next section.

Step 1: Right-click on a file of the type you wish to change the association for.

Step 2: Select Open With from the resulting menu.


Step 3: Windows will then offer you an app or a list of apps that can act as the default for that file type. If you see the one you want, select it, and Windows will open that file in the app you’ve chosen.

If you don’t see your preferred app, then from the menu that appears when you select Open With, either search for one by selecting the Search the Microsoft Store option or click Choose Another App for an expanded list of already-installed applications.

Step 4: When you’ve found and selected the app you want, click the gray OK button. You can also tick the box labeled Always Use This App to Open [Type of File] Files before you hit OK if you want that app to open all files of that type going forward.


From now on, any files of that type will be opened with your chosen application.

Changing any and all

If you want to change a few different file types — or even all of them — then the Settings menu is the best place to go.

Step 1: Press the Windows + X keys and click Settings from the resulting menu.

Alternatively, search for Settings in the Windows search bar and click the relevant result.

Step 2: Select Apps from the list of options.

Step 3: Click Default Apps from the left-hand menu.

Step 4: Scroll down if needed, and click Choose Default Apps by File Type.


You’ll then be presented with a list of all of the file types Windows 10 supports, with their associated applications on the right-hand side. If a file type doesn’t have a particular application set up to handle it, there will be a gray plus sign (+) icon instead.

Step 5: Scroll through the list to find the file type that you want to change the file association for. Click the application or Plus icon to its right.

Step 6: Choose your preferred application from the list that appears, and click its corresponding icon.


In the case of some file types, there will be multiple options, whereas others may have none.

There are two other avenues available should you not find an offered option. Either download a compatible application from the web or select the Look for an App in the Microsoft Store option, which brings you to the Microsoft Store.

Note: In some cases, the Microsoft Store may not return any relevant applications for a particular file type, or its search results may be off the mark. If that happens, you’ll need to find a suitable app on your own for the file type you want to change.

Once you’ve chosen your preferred application, you’re all set. This simple change resets the default application for the targeted file type; from now on, that program will open whenever you open a file of that kind.

If you find that you need to change back to the original application, you can use the steps above to get back to the default settings. Alternatively, you can go to the Default Apps section of the Windows 10 Settings menu as we did before, scroll down, and then click on the Reset button located under the phrase Reset to the Microsoft Recommended Defaults.
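
If you’d rather check an association from a script than click through dialogs, the machine-wide default can be read from the registry. The following is a minimal sketch using Python’s standard winreg module (Windows only); note that the per-user choice made through the Settings app is stored separately, so this only shows the system-wide fallback.

```python
# Minimal sketch (Windows only): read the machine-wide default handler (ProgID)
# for a file extension from HKEY_CLASSES_ROOT. The per-user override written by
# the Settings app lives elsewhere in the registry, so treat this as a rough check.
import winreg
from typing import Optional

def default_progid(extension: str) -> Optional[str]:
    """Return the ProgID registered for an extension such as '.txt', or None."""
    try:
        value = winreg.QueryValue(winreg.HKEY_CLASSES_ROOT, extension)
        return value or None
    except OSError:
        return None

if __name__ == "__main__":
    print(default_progid(".txt"))  # typically 'txtfile' on a stock install
```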


Category: AI

Google workers announce plans to unionize

A group of Google workers have announced plans to unionize with the Communications Workers of America (CWA). The Alphabet Workers Union will be open to all employees and contractors at Google’s parent company. Its goal will be to tackle ongoing issues like pay disparity, retaliation, and controversial government contracts.

“This union builds upon years of courageous organizing by Google workers,” said Nicki Anselmo, a Google program manager. “From fighting the ‘real names’ policy, to opposing Project Maven, to protesting the egregious, multi-million dollar payouts that have been given to executives who’ve committed sexual harassment, we’ve seen first-hand that Alphabet responds when we act collectively.”

Google’s work on Project Maven, an effort to use AI to improve targeted drone strikes, sparked protests among employees who saw the work as unethical. In 2018, the company decided not to renew its contract with the Pentagon. The company also ended its forced arbitration policy after 20,000 workers staged a walkout to protest former executive Andy Rubin getting a $90 million exit package after he was credibly accused of sexual harassment.

Now that the union effort is public, organizers will likely launch a series of campaigns to rally votes from Google workers. Prior to the announcement, about 230 Google employees and contractors had signed cards in support of the union.

Arranged as a members-only union, the new organization won’t seek collective bargaining rights to negotiate a new contract with the company. Instead, the Alphabet Workers Union will only represent employees who voluntarily join, as reported by the New York Times. That structure will also allow it to represent all employees who seek to participate — including temps, vendors, and contractors (known internally as TVCs) who would be excluded by labor law from conventional collective bargaining.

Google contractors have long complained about their unequal treatment compared to full-time staff. While they make up the majority of Google’s workforce, they often lack the benefits of salaried employees. In 2019, roughly 80 Google contractors in Pittsburgh voted to join the United Steelworkers union.

The Alphabet Workers Union plans to unionize with CWA Local 1400, which represents workers in Massachusetts, Maine, New Hampshire, Vermont, and California.

The news comes one month after the National Labor Relations Board filed a complaint alleging Google illegally fired two workers who were organizing employee protests. The employees, Laurence Berland and Kathryn Spiers, were organizing against the company’s decision to work with IRI Consultants, a firm famous for its anti-union efforts.

It also follows the firing of prominent AI ethicist Timnit Gebru in December. In a press release announcing the union, the Alphabet Workers Union wrote: “The firing has caused outrage from thousands of us, including Black and Brown workers who are heartbroken by the company’s actions and unsure of their future at Google.”

Last year, employees at the crowdfunding platform Kickstarter voted to unionize with the Office and Professional Employees International Union Local 153, as reported by NBC. It was the first time white-collar employees in the tech industry had unionized.

Google employees who decide to join are committing one percent of their annual compensation to the union. The money will go toward paying legal fees and organizing staff.

In a statement emailed to The Verge, Kara Silverstein, director of people operations at Google, said: “We’ve always worked hard to create a supportive and rewarding workplace for our workforce. Of course our employees have protected labor rights that we support. But as we’ve always done, we’ll continue engaging directly with all our employees.”

Updated 8:53AM ET: Added Google statement and additional clarity on AWU’s members-only status.

Correction: An earlier version of this story misstated when Google employees would begin paying one percent of their annual compensation to the union. We regret the error.


Category: Tech News

Hulu Live TV adds 14 ViacomCBS channels, but some cost extra

If you remember back in early January, Hulu announced a big deal with ViacomCBS that would bring 14 of its networks to the streaming service’s live television plan. Now, months later, those channels are finally available to subscribers, offering the opportunity to watch Comedy Central, Nickelodeon, and other popular networks.

As originally announced, a total of 14 ViacomCBS channels are now available to people who have signed up for Hulu’s Live TV offering — though you’ll need to pay extra for the platform’s Entertainment package to get access to all of the networks.

Assuming you have Hulu Live TV plus its Entertainment package at another $7.99/month, you’ll get access to Paramount Network, TV Land, Nickelodeon and Nick Jr., Comedy Central, BET, MTV, VH1, CMT, BET Her, NickToons, TeenNick, MTV Classic, and MTV2.

The last five networks on that list are locked behind the Entertainment package paywall, however, so you’ll find yourself with only the first nine if you’re not willing to sign up for the extras package.

The Hulu Live TV subscription remains at the same $64.99/month price. It’s unclear whether this new addition will result in a future price increase or if moving some of the channels behind the paid Entertainment package was a way to avoid raising the base streaming plan rate.


Category: Game

Titanfall 2 Is Free-To-Play This Weekend After Surge in Play

Titanfall 2 is going free to play on Steam this weekend. The move comes one week after the five-year-old shooter broke new records on Steam when fans flocked to the long-dormant game.

First released in 2016, Titanfall 2 is a first-person shooter by Respawn Entertainment. The game had the misfortune of launching during a crowded holiday season next to Battlefield 1 and Call of Duty: Infinite Warfare. It initially got lost in the shuffle, but is gaining new interest after Respawn announced that it’s adding elements of the game to Apex Legends. Last weekend, the game hit its highest Steam player count ever, with its player base rising 650% seemingly out of nowhere.

Respawn has now responded to the moment by making the game free to play on Steam this weekend. From now until 10 a.m. PT on May 3, anyone can dive into the multiplayer game for free.

The announcement comes in response to another coordinated fan effort to boost the game this weekend. Twitter account @TitanfallNews posted a call to all series fans asking players to log in on May 1 to show their appreciation for the game. That post was retweeted by the Xbox Game Pass account, propelling it to over 9,000 likes and counting.

Do we need to RSVP? because this is our RSVP https://t.co/kkR9OYuxIh

— Xbox Game Pass (@XboxGamePass) April 30, 2021

It’s an out-of-left-field success story for a game that struggled to find an audience at launch. The game failed to meet EA’s expected sales milestones, which put the possibility of a sequel in question. Once Respawn moved on to the popular Apex Legends, it seemed like the franchise could be over. The newfound push could make Titanfall 3 seem like a more viable possibility for EA.

Even if nothing comes of it, it’s a wholesome moment between Respawn Entertainment and fans who want to show their appreciation for its work.


Category: Tech News

A scientist created emotion recognition AI for animals

A researcher at Wageningen University & Research recently published a pre-print article detailing a system by which facial recognition AI could be used to identify and measure the emotional state of farm animals. If you’re imagining a machine that tells you if your pigs are joyous or your cows are grumpy… you’re spot on.

Up front: There’s little evidence to believe that so-called ‘emotion recognition’ systems actually work. In the sense that humans and other creatures can often accurately recognize (as in: guess) other people’s emotions, an AI can be trained on a human-labeled data set to recognize emotion with similar accuracy to humans.

However, there’s no ground-truth when it comes to human emotion. Everyone experiences and interprets emotions differently and how we express emotion on our faces can vary wildly based on cultural and unique biological features.

In short: The same ‘science’ driving systems that claim to be able to tell if someone is gay or likely to be aggressive through facial recognition is behind emotion recognition for both people and farm animals.

Basically, nobody can tell if another person is gay or aggressive just by looking at their face. You can guess. And you might be right. But no matter how many times you’re right, it’s always a guess and you’re always operating on your personal definitions.

That’s how emotion recognition works too. What you might interpret as “upset” might just be someone’s normal expression. What you might see as “gay,” well… I defy anyone to define internal gayism (i.e., do thoughts or actions make you recognizably gay?).

It’s impossible to “train” a computer to recognize emotions because computers don’t think. They rely on data sets labeled by humans. Humans make mistakes. Worse, it’s ridiculous to imagine any two humans would look at a million faces and come to a blind consensus on the emotional state of each person viewed.

Researchers don’t train AI to recognize emotion or make inferences from faces. They train AI to imitate the perceptions of the specific humans who labeled the data they’re using.
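
To make that concrete, here is a deliberately toy sketch of how such a classifier is usually assembled with scikit-learn. The features, labels, and data are invented for illustration; the point is that the ‘emotion’ the model outputs is nothing more than what the annotators wrote down.

```python
# Hypothetical sketch: an "emotion" classifier is just a supervised model fit to
# human annotations. The features and labels below are invented; whatever the
# model learns is a reflection of its labelers' judgments, not ground truth.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Pretend each row encodes 13 facial-action measurements (as in the paper) and
# each label is one annotator's judgment of the animal's emotional state.
features = rng.normal(size=(500, 13))
labels = rng.choice(["calm", "neutral", "aggressive"], size=500)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(features, labels)

# A high training score on random data only shows the model memorized the
# annotators' choices; it says nothing about the animals' inner states.
print(model.score(features, labels))
```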

That being said: Creating an emotion recognition engine for animals isn’t necessarily a bad thing.

Here’s a bit from the researcher’s paper:

The system is trained on dataset of facial features of images of the farm animals collected in over 6 farms and has been optimized to operate with an average accuracy of 85%. From these, we infer the emotional states of animals in real time. The software detects 13 facial actions and 9 emotional states, including whether the animal is aggressive, calm, or neutral.

The paper goes on to describe the system as a high-value, low-impact machine learning paradigm where farmers can gauge livestock comfort in real-time using cameras instead of invasive procedures such as hormone sampling.

We covered something similar in the agricultural world a while back. Basically, farmers operating orchards can use image recognition AI to determine if any of their trees are sickly. When you have tens of thousands of trees, performing a visual inspection on each one in a timely manner is impossible for humans. But AI can stare at trees all day and night.

AI for livestock monitoring is a different beast altogether. Instead of recognizing specifically defined signs of disease in relatively motionless trees, the researcher is attempting to tell what mood a bunch of animals are in.

Does it work? According to the researcher, yes. But according to the research: kinda. The paper makes claims of incredibly high accuracy, but that’s when compared against human spotters.

So here’s the thing: Creating an AI that can tell what pigs and cows are thinking almost as accurately as the world’s leading human experts is a lot like creating a food so delicious it impresses a chef. Maybe the next chef doesn’t like it, maybe nobody but that chef likes it. 

The point is: this system uses AI to do a slightly poorer job than a farmer can at determining what a cow is thinking by looking at it. There’s value in that, because farmers can’t stare at cows all day and night waiting for one of them to grimace in pain. 

Here’s why this is fine: Because there’s a slight potential that the animals could be treated a tiny bit better. While it’s impossible to tell exactly what an animal is feeling, the AI can certainly recognize signs of distress, discomfort, or pain well enough to make it worthwhile to employ this system in places where farmers could and would intervene if they thought their animals were in discomfort.

Unfortunately, the main reason this matters is that livestock living in relative comfort tends to produce more.

It’s a nice fantasy to imagine a small, farm-to-table family installing cameras all over their massive free-range livestock facility. But, more likely, systems like this will help corporate farmers find the sweet spot between packing animals in and keeping their stress levels just low enough to produce.

Final thoughts: It’s impossible to predict what the real-world use cases for this will be, and there are definitely some strong ones. But it muddies the water when researchers compare a system that monitors livestock to an emotion recognition system for humans.

Whether a cow gets a little bit of comfort before it’s slaughtered or as it spends the entirety of its life connected to dairy machinery isn’t the same class of problem as dealing with emotion recognition for humans.

Consider the fact that, for example, emotion recognition systems tend to classify Black men’s faces as angrier than white men’s. Or that women typically rate pain higher when observing it in people and animals. Which bias do we train the AI with?

Because, based on the current state of the technology, you can’t train an AI without bias unless the data you’re generating is never touched by human hands, and even then you’re creating a separate bias category.

You can read the whole paper here.




Category: AI

IBM is acquiring Turbonomic to advance AIOps agenda

IBM announced this week that it is acquiring Turbonomic, provider of application resource management (ARM) and network performance management (NPM) software infused with machine learning algorithms. Terms of the acquisition, which is expected to close this quarter, were not disclosed.

The two companies have a long-standing relationship under which IBM has been reselling Turbonomic’s ARM platform. Cisco also resells tools developed by the company. Turbonomic, which is privately held, claims revenues were up 41% for fiscal 2021 and counts Avon, HauteLook, and Litehouse Foods among its customers.

Applications and systems management

The decision to acquire Turbonomic comes after IBM began revamping its application and systems management portfolio last fall. This push began in earnest with the acquisition of Instana, provider of an application performance management (APM) platform for monitoring and observing applications.

IBM now plans to further integrate the ARM software Turbonomic developed with the APM software from Instana and an IBM Cloud Pak for Watson AIOps platform that employs machine learning algorithms to identify anomalies in real time.
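
IBM hasn’t published the details of those algorithms, but the general shape of real-time metric anomaly detection in AIOps tooling is easy to illustrate. The sketch below uses a rolling z-score over a trailing window, a deliberately simple stand-in rather than anything Watson AIOps actually ships.

```python
# Illustrative only, not IBM's implementation: a rolling z-score is one of the
# simplest ways a pipeline can flag metric anomalies in real time. Keep a window
# of recent values and flag new points that sit far from the window's mean.
from collections import deque
from statistics import mean, pstdev

def zscore_anomalies(values, window=30, threshold=3.0):
    """Yield (index, value) pairs that deviate strongly from the trailing window."""
    recent = deque(maxlen=window)
    for i, v in enumerate(values):
        if len(recent) == window:
            mu, sigma = mean(recent), pstdev(recent)
            if sigma > 0 and abs(v - mu) / sigma > threshold:
                yield i, v
        recent.append(v)

if __name__ == "__main__":
    latencies = [50 + i % 5 for i in range(100)] + [400] + [50 + i % 5 for i in range(20)]
    print(list(zscore_anomalies(latencies)))  # [(100, 400)]: the single latency spike
```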

“Turbonomic provides actionable observability,” IBM Automation GM Dinesh Nirmal told VentureBeat in an interview.

IBM is further extending its IT management portfolio via the recent acquisition of WDG Automation, provider of a robotic process automation (RPA) platform, and MyInvenio, which offers process mining tools, he noted.

As IT environments become more complex, Nirmal said it won’t be feasible to manage these environments without augmenting IT staff with capabilities enabled by AI platforms. It’s not likely AI platforms will replace the need for human IT administrators, but the job functions themselves will continue to evolve as lower-level manual tasks become automated, Nirmal added.

IT challenges

Now that companies are becoming more cognizant of the scope of IT management challenges, IT teams are increasingly embracing AI platforms. Organizations are now deploying a new generation of microservices-based applications that are more difficult to manage than the existing monolithic legacy applications, which are not likely to be retired anytime soon, Nirmal said. Those applications make use of cloud-native technologies such as containers, Kubernetes, and serverless computing frameworks that all need to be managed alongside virtual machines. At the same time, the IT environment has become more distributed than ever, thanks to the rise of both cloud and edge computing platforms.

The only way to contain the total cost of managing that extended enterprise is to rely more on automation enabled by AIOps platforms, Nirmal said.

IT teams need to come to terms with the fact that it takes time for machine learning algorithms to learn IT environments that are unique and subject to change. Implementing AI requires patience, Nirmal said, adding, “IT teams need to accept that AI comes with an upfront cost.”

But the return on investment in AIOps becomes apparent as rote tasks are eliminated and more potential issues are addressed before they impact an application, Nirmal noted. IT teams, for example, will be able to predict the impact new code is likely to have on the overall IT environment before it’s deployed.

IBM’s investments in AIOps are a natural extension of the capabilities IBM has developed to automate a wide range of business processes using AI technologies, Nirmal added. IT leaders can’t make a credible case for applying AI to automate business processes if the IT team isn’t using the same technologies to automate IT operations, he noted.

At this juncture, AI is about to become a mainstream component of IT operations. The issue now is determining to what degree. In some cases, AI capabilities will be slipstreamed into existing platforms, while in others, IT teams will decide to move to a new platform. Either way, machine learning algorithms will be present in one form or another.


Category: Computing

AMD FidelityFX Super Resolution May Still Be a Long Way Off

AMD FidelityFX Super Resolution is poised to counter Nvidia’s Deep Learning Super Sampling (DLSS) technology, finally giving Team Red a way to run games with ray tracing at playable frame rates. Whenever it arrives, that is.

AMD has been silent on when Super Resolution will show up, and a recent FAQ from the upcoming Metro Exodus Enhanced Edition suggests that wide adoption may be a long way off.

The since-removed FAQ entry — spotted by @Locuza on Twitter — responds specifically to Super Resolution. Apparently, AMD’s technology is “not compatible” with the rendering techniques used by Metro Exodus Enhanced Edition, and developer 4A Games says its own temporal-based reconstruction tech offers “the same or better image quality benefits for all hardware.” That doesn’t bode well for Super Resolution, suggesting it may be more difficult to implement than AMD has implied.

AMD's Super Resolution feature is likely many months away and we already have a first Ray Tracing game which according to 4A Games is not compatible with its rendering pipeline.
Their own method is claimed to be similar or even better in terms of quality.https://t.co/bOMyWwCubu pic.twitter.com/a2zTZRQjl7

— Locuza (@Locuza_) April 29, 2021

In a talk with PCWorld, Scott Herkelman, AMD vice president of graphics, said that “you don’t need machine learning to do it,” referencing Nvidia’s reconstruction technique and how Super Resolution could improve on it. Herkelman said there are “many different ways” to go about reconstructing an image, and that AMD is focused on one core question: “What do game developers want to use?”
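
For context on what reconstruction without machine learning can mean at the simplest end of the spectrum, plain spatial upscaling is just interpolation. The sketch below is a bare-bones bilinear upscaler in NumPy, purely illustrative and nothing like what FSR, DLSS, or 4A’s temporal technique actually do, all of which recover far more detail.

```python
# Illustrative only: plain bilinear upscaling of a 2D (grayscale) image.
# Reconstruction techniques such as temporal accumulation or DLSS recover far
# more detail than simple interpolation like this.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int = 2) -> np.ndarray:
    """Upscale a 2D image by an integer factor using bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * scale)   # fractional source rows
    xs = np.linspace(0, w - 1, w * scale)   # fractional source columns
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                 # vertical blend weights
    wx = (xs - x0)[None, :]                 # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bottom = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bottom * wy

if __name__ == "__main__":
    low_res = np.arange(16, dtype=float).reshape(4, 4)
    print(bilinear_upscale(low_res).shape)  # (8, 8)
```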

Clearly, 4A Games didn’t want to use whatever AMD is offering, or simply couldn’t. Metro Exodus Enhanced Edition is one of a small number of games that require hardware ray tracing support. The latest AMD RX 6000 graphics cards support ray tracing, but without a tool like DLSS to combat the massive performance loss it brings on, it’s a bit of a moot feature.

That puts AMD at a large disadvantage for ray tracing-only titles moving forward. The recommended system specs for the RT Extreme preset don’t even include an AMD card. Instead, AMD tops out with the Ultra preset, which targets 4K at 30 frames per second on an RX 6900 XT. It’s worth pointing out that 4A Games created its system specs matrix with DLSS turned off, and it says that players can expect far better performance with Nvidia’s reconstruction feature turned on.

However, Metro Exodus Enhanced Edition is just a single game. The RDNA 2 architecture behind the RX 6000 graphics cards is the same architecture inside the PlayStation 5 and Xbox Series X. Consoles still make up the majority of the gaming market, and AMD will, at some point, need an image-reconstruction method to keep games looking their best on consoles. The lack of support on Metro Exodus Enhanced Edition could point to problems with implementing Super Resolution. We won’t know for sure, though, until we hear something official from AMD.
