Want to make robots run faster? Try letting AI take control

Quadrupedal robots are becoming a familiar sight, but engineers are still working out the full capabilities of these machines. Now, a group of researchers from MIT says one way to improve their functionality might be to use AI to help teach the bots how to walk and run.

Usually, when engineers are creating the software that controls the movement of legged robots, they write a set of rules about how the machine should respond to certain inputs. So, if a robot’s sensors detect x amount of force on leg y, it will respond by powering up motor a to exert torque b, and so on. Coding these parameters is complicated and time-consuming, but it gives researchers precise and predictable control over the robots.

An alternative approach is to use machine learning — specifically, a method known as reinforcement learning that functions through trial and error. This works by giving your AI model a goal known as a “reward function” (e.g., move as fast as you can) and then letting it loose to work out how to achieve that outcome from scratch. This takes a long time, but it helps if you let the AI experiment in a virtual environment where you can speed up time. It’s why reinforcement learning, or RL, is a popular way to develop AI that plays video games.
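The trial-and-error loop is simple to sketch. The toy below is illustrative only: the `simulate_speed` model, the two gait parameters, and the random-search "training" are all made up and far simpler than MIT's actual RL setup, but they show the core idea of proposing a change to the policy and keeping it only if the reward function (here, raw speed) improves:

```python
import random

def simulate_speed(gait_params):
    """Toy stand-in for a physics simulator: returns the forward speed a
    'robot' achieves with a gait described by two parameters. Purely
    illustrative; a real simulator models contacts, torques, and terrain."""
    stride, frequency = gait_params
    return stride * frequency - (stride - 0.5) ** 2 - 0.1 * (frequency - 3.0) ** 2

def reward(gait_params):
    # The underspecified reward function: "move as fast as you can."
    return simulate_speed(gait_params)

def train(episodes=2000, seed=0):
    """Crude random-search stand-in for RL: perturb the gait, keep changes
    that earn more reward, discard the rest."""
    rng = random.Random(seed)
    params = [0.1, 1.0]  # an initial, deliberately bad gait
    best = reward(params)
    for _ in range(episodes):
        trial = [p + rng.gauss(0, 0.05) for p in params]
        r = reward(trial)
        if r > best:  # trial and error: keep only improvements
            params, best = trial, r
    return params, best

if __name__ == "__main__":
    gait, speed = train()
    print(f"learned gait {gait}, simulated speed {speed:.2f}")
```

Note that nothing here rewards a "natural" gait, which is exactly why, as the researchers explain below, the learned result can look strange.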

This is the technique that MIT’s engineers used, creating new software (known as a “controller”) for the university’s research quadruped, Mini Cheetah. Using reinforcement learning, they were able to achieve a new top speed for the robot of 3.9 m/s, or roughly 8.7 mph.

In motion, Mini Cheetah’s new running gait is a little ungainly; it looks like a puppy scrabbling to accelerate on a wooden floor. But, according to MIT PhD student Gabriel Margolis (a co-author of the research along with postdoctoral fellow Ge Yang), that’s because the AI isn’t optimizing for anything but speed.

“RL finds one way to run fast, but given an underspecified reward function, it has no reason to prefer a gait that is ‘natural-looking’ or preferred by humans,” Margolis tells The Verge over email. He says the model could certainly be instructed to develop a more flowing form of locomotion, but the whole point of the endeavor is to optimize for speed alone.

Margolis and Yang say a big advantage of developing controller software using AI is that it’s less time-consuming than messing about with all the physics. “Programming how a robot should act in every possible situation is simply very hard. The process is tedious because if a robot were to fail on a particular terrain, a human engineer would need to identify the cause of failure and manually adapt the robot controller,” they say.

Mini Cheetah gets the once-over from a non-robot dog.
Image: MIT

By using a simulator, engineers can place the robot in any number of virtual environments — from solid pavement to slippery rubble — and let it work things out for itself. Indeed, the MIT group says its simulator was able to speed through 100 days’ worth of staggering, walking, and running in just three hours of real time.

Some companies that develop legged robots are already using these sorts of methods to design new controllers. Others, though, like Boston Dynamics, apparently rely on more traditional approaches. (This makes sense given the company’s interest in developing very specific movements — like the jumps, vaults, and flips seen in its choreographed videos.)

There are also faster legged robots out there. Boston Dynamics’ Cheetah bot currently holds the record for a quadruped, reaching speeds of 28.3 mph, faster than Usain Bolt. However, not only is Cheetah a much bigger and more powerful machine than MIT’s Mini Cheetah, but it achieved its record running on a treadmill and mounted to a lever for stability. Without these advantages, maybe AI would give the machine a run for its money.

Repost: Original Source and Author Link


Autonomous trucking company Plus drives faster transition to semi-autonomous trucks

This article is part of a VB Lab Insight series paid for by Plus.

Breaking away from the competition, Plus, a Silicon Valley-based provider of autonomous trucking technology, is taking an innovative driver-in approach to commercialization that aligns with the critical challenges facing the trucking industry today.

According to newly released estimates of traffic fatalities in 2021, crashes involving at least one large truck increased 13% compared to the previous year.

With a nationwide truck driver shortage estimated at 80,000 last year and growing, PlusDrive, Plus’s market-ready supervised autonomous driving solution, helps long-haul operators reduce stress while improving safety for all road users.

First-to-market solution helps fleets today

In 2021, Plus achieved a critical industry milestone, becoming the first self-driving trucking technology company to deliver a commercial product into the hands of customers. Over the past year Plus has delivered units of PlusDrive to some of the world’s largest fleets and truck manufacturers.

These units are not demos or test systems. Shippers have installed the technology on their trucks, and PlusDrive-equipped trucks, driven by the shippers’ own drivers, are hauling commercial loads on public roads nationwide.

PlusDrive improves safety and driver comfort, and saves at least 10% in fuel expenses, addressing driver recruitment and retention while offsetting surging diesel prices and other costs tied to today’s volatile trucking market.

Plus’s first-to-market shipments and installations validate the progress the company has made in developing a safe, reliable driver-in technology solution for the long-haul trucking industry.

PlusDrive will reach more fleets this year as Plus continues to expand its close collaboration with customers pioneering the use of driver-in autonomous trucking technology for their heavy-duty truck operations.

Partnerships unlock commercial pathways for PlusDrive

Partnerships with industry stakeholders — from automotive suppliers to truck manufacturers and regulators — have been critical to Plus’s success, helping to unlock innovation and commercial pathways to deploy its technology globally. Its autonomous driving technology can be retrofitted on existing trucks or installed at the factory level. With Cummins and IVECO, Plus is also developing autonomous trucks powered by natural gas for the U.S. and Europe.

Building on the market penetration it has achieved already, Plus this month announced a collaboration with Velociti, a fleet technology solutions company, creating a nationwide installation and service network capable of delivering PlusDrive semi-autonomous trucks to customers within 12 hours. Maintenance services are also available nationwide by utilizing mobile resources to meet customers at their preferred location.

The program, known as Plus Build, equips Class 8 trucks with state-of-the-art lidar, radar and camera sensors and Plus’s proprietary autonomous driving software. 

With PlusDrive, truck drivers stay in the cabin to oversee the Level 4 technology, but they do not have to actively drive the vehicle. Instead, they can turn on PlusDrive to automatically drive the truck on highways in all traffic conditions, including staying centered in the lane, changing lanes and handling stop-and-go traffic. PlusDrive reduces driver stress and fatigue, providing a compelling recruitment and retention tool during a time of driver shortages.

Velociti’s nationwide installation and maintenance network will help get PlusDrive into the hands of more truck drivers across the country, making their jobs “safer, easier and better,” said Shawn Kerrigan, COO and co-founder of Plus.

“Plus Build helps companies unlock the benefits of autonomous driving technology today by quickly modernizing trucks to improve their safety and uptime.”

Drivers are on board for next-generation autonomous driving technology

Plus works closely with customers, drivers and industry partners to help them understand the advantages of PlusDrive. Their testimonials validate the key benefits of the system.

“I am an admitted cynic, and I was blown away,” said commercial vehicle and transportation industry analyst Ann Rundle, Vice President of ACT Research. After taking a demo ride of a PlusDrive-enabled truck at the recent Advanced Clean Transportation Expo, Rundle said, “It was so seamless. I suppose if I wasn’t watching the screen that indicated the system is ‘Engaged’ I might have wondered [if PlusDrive was indeed still doing the driving].”

A professional driver invited to test PlusDrive echoed those sentiments. “If I were to have a system like this in my truck,” he said, “it would make my job a whole lot smoother, easier and a lot less stressful.”

Another operator from a customer fleet praised PlusDrive for “reinforcing safety; just keep giving us these tools — it’s a tool to help us help the company.” 

PlusDrive keeps economy moving, safely

Startups competing for a slice of the self-driving trucking future are aligned on the long-term goal of getting fully driverless commercial trucks (with no safety drivers) on the road. But Plus stands out as the only company to release a product that enables fleets and drivers to benefit from automation today, when there is little sign of an end to the chaos and stress roiling the freight markets. Through its collaborative, considered approach to delivering its autonomous trucking technology as a commercial product now, Plus is maximizing benefits for fleets while making long-haul trucking safer and easier for the hard-working operators who keep the nation’s economy moving.

VB Lab Insights content is created in collaboration with a company that is either paying for the post or has a business relationship with VentureBeat, and it’s always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way.



How Nvidia Is Using A.I. to Deliver Pizzas Faster

Nvidia announced a new tool that can help deliver your pizzas faster — yes, really — at its fall GTC 2021 event. It’s called ReOpt, and it’s a real-time logistics tool that Domino’s is already using to optimize delivery routes based on time and cost.

ReOpt is a set of logistics-planning algorithms that can find billions of routes to the same location. It utilizes heuristics powered by GPU computing to route vehicles in the most efficient way possible. It’s like Google Maps, just way more complex and designed specifically to meet the needs of last-mile delivery.
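ReOpt’s actual heuristics aren’t public in detail, but the flavor of heuristic routing is easy to sketch. This toy nearest-neighbor pass is purely illustrative (real solvers evaluate vastly more candidate routes, in parallel, on the GPU); it builds a plausible delivery route without enumerating every possible ordering:

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor_route(depot, stops):
    """Greedy heuristic: from wherever you are, drive to the closest
    unvisited stop. Cheap to compute and usually far better than a
    random ordering, though rarely optimal."""
    route, remaining, here = [], list(stops), depot
    while remaining:
        nxt = min(remaining, key=lambda s: dist(here, s))
        remaining.remove(nxt)
        route.append(nxt)
        here = nxt
    return route

def route_length(depot, route):
    """Total distance driven from the depot through the route in order."""
    total, here = 0.0, depot
    for stop in route:
        total += dist(here, stop)
        here = stop
    return total

if __name__ == "__main__":
    depot = (0, 0)
    stops = [(2, 3), (5, 1), (1, 7), (6, 6), (4, 2)]
    route = nearest_neighbor_route(depot, stops)
    print(route, round(route_length(depot, route), 2))
```

The hard part, which is where GPU parallelism pays off, is evaluating and refining enormous numbers of candidate routes like this one at once.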

Keith Nelson/Digital Trends

Domino’s hasn’t been shy about adopting new technology. The company has been saying for years that it’s a technology company that delivers pizzas, and has adopted everything from driverless cars to in-car delivery apps to help optimize ordering and delivery. ReOpt is the next evolution of that, it seems, and it’s an increasingly important tool for how the world operates today.

Last-mile delivery comprises the final leg of a package’s journey to your door. It’s where an Amazon or FedEx driver drops off your package, and it represents the most significant pain point for e-commerce sales today. About 53% of delivery costs come just from last-mile delivery, as distant homes or traffic congestion create inefficiencies in delivery routes.

And with the pandemic boosting e-commerce sales to new highs, last-mile delivery has never been more important. ReOpt is the latest in a long line of Nvidia technologies that eye GPUs as a solution. Nvidia says it can route 1,000 packages in three seconds on a GPU, work that would normally take a CPU five minutes.

Nvidia says that ReOpt can reduce delivery costs by 15%, which it says represents “billions” in savings. To illustrate this point, Nvidia pointed to Domino’s, which was able to quickly calculate 87 billion ways to visit 14 locations using ReOpt. ReOpt is also dynamic, helping last-mile delivery workers adapt to changing traffic conditions quickly.
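That “87 billion ways to visit 14 locations” figure is just the number of possible orderings of 14 stops, which a quick factorial confirms:

```python
import math

# Number of distinct orders in which to visit 14 delivery stops:
print(math.factorial(14))  # 87178291200, about 87 billion
```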

Although Domino’s is the first company to adopt ReOpt, this type of tool applies across a wide variety of industries. From Amazon packages to DoorDash drivers, last-mile delivery has been a hurdle for several industries, and hopefully ReOpt will provide some solutions.




AMD RX 6600 XT Is 15% Faster Than the RTX 3060, but $50 More

Following months of leaks and rumors, AMD finally pulled back the curtain on the RX 6600 XT. The new graphics card is a 1080p addition to the RDNA 2 range, which should provide high frame rates at 1080p and 1440p with a little help from FidelityFX Super Resolution (FSR).

The Radeon RX 6600 XT is set to launch on August 11 for $379. In addition to board partner designs, AMD will supply units to desktop makers like Acer, Alienware, and HP. Although AMD showed off a render of a reference design, it won’t be manufacturing a reference model for the 6600 XT.

The card targets 1080p high refresh rate monitors with performance somewhere between an RTX 3060 and RTX 3060 Ti. In Doom Eternal, for example, the RX 6600 XT averaged 155 frames per second (fps) compared to 134 fps with the RTX 3060. Similarly, the card hit 92 fps in Assassin’s Creed Valhalla compared to 69 fps on Nvidia’s card. Overall, AMD claims the card is 15% faster on average.

It’s important to point out that these benchmarks come from AMD, so we’ll need to wait for further testing to draw any firm conclusions. AMD also ran the tests with Smart Access Memory (SAM) enabled, which is a feature that can boost frame rates with Ryzen 5000 and select Ryzen 3000 processors.

Here are the specs we know right now:

RX 6600 XT
GPU: Navi 23
Interface: PCIe 4.0
Compute units: 32
Stream processors: 2,048
Ray accelerators: 32
Game clock: 2,359 MHz
Memory: 8GB GDDR6
Memory speed: 16 Gbps
Bandwidth: Up to 256 GB/s
Memory bus: 128-bit
TDP: 160W

Although the performance is impressive, the suggested price of $379 is higher than the direct competition. That’s only $20 less than the RTX 3060 Ti and $50 more than the RTX 3060, the latter of which matches the RX 6600 XT in games like Cyberpunk 2077 and Horizon Zero Dawn. 

AMD set the price to be representative of where the market currently is. At launch, select designs from AMD’s partners will be available at $379, though the company pointed out how challenging this price is to meet given the ongoing GPU shortage.

RX 6600 XT models from board partners.

The biggest win for the RX 6600 XT looks like FSR. At 1080p with max settings and ray tracing turned on, the card was able to surpass 100 fps in Godfall and boost frame rates by up to 74% in The Riftbreaker. It also managed to increase the frame rate in Resident Evil Village, though only by a modest 13%.

FSR also allows you to push the resolution above 1080p. With ray tracing off at 1440p, AMD showed the RX 6600 XT jumping from 113 fps to 243 fps in Resident Evil Village. Similarly, Marvel’s Avengers climbed from 57 fps at native 1440p to 96 fps in FSR’s aggressive Performance mode.

RX 6600 XT benchmarks with FSR turned on.

With FSR available, the RX 6600 XT looks like the 1080p gamer’s dream. However, availability will likely be a problem. “We are doing our best to get supply, but the demand is unprecedented,” an AMD spokesperson said.

AMD isn’t releasing a reference design for the RX 6600 XT, but models from ASRock, Gigabyte, MSI, Asus, PowerColor, and more will be available on August 11.




Dropbox adds new file conversion tools and faster camera backups

Dropbox is getting a slew of interface tweaks alongside new tools and features. One of the more useful inclusions is a new file conversion feature, which lets Dropbox users convert images between different formats like JPEG and PNG, and files into PDFs. Support for video conversion is coming soon. Dropbox is also adding new features to its password manager.

Automatic camera backups from mobile, which will now be available to users on Dropbox’s free tier, are also seeing some improvements. Dropbox says uploads should now be faster and more reliable, and on iOS there’s now an option to specify exactly which folders should be automatically backed up (the feature is coming to Android later this year). There’s also the option to have photos automatically deleted after they’re backed up to save space.

The web interface’s details pane has also been redesigned.
Image: Dropbox

The service’s password manager, which it made available to free account holders in April, is being updated to store credit and debit card details while a password sharing feature announced in March is now rolling out. Dropbox’s password manager is available to both its free and paid account holders, but free accounts are limited to storing a maximum of 50 passwords.

Finally, on the web Dropbox is updating its side navigation interface, making it easier to organize files through dragging and dropping. The details pane is also being overhauled to offer more details on your files at a glance. Desktop users are also getting a simplified interface from their system tray, with “streamlined access to content, search, file activity, and sync progress.”



How to play irritating WhatsApp voice messages faster

Welcome to TNW Basics, a collection of tips, guides, and advice on how to easily get the most out of your gadgets, apps, and other stuff.

Voice messages are a divisive form of communication. Their devotees say they create a more personal connection, feel more natural than writing, and convey emotion better than mere text. They’re also helpful for people who have difficulties texting (who are obviously exempt from the criticisms that are coming).

Despite these advantages, voice notes have manifold detractors. The critics argue that they’re slow to scan, susceptible to rants, and disruptive to whatever else you’re doing. They’re also awkward to rewind on the rare occasions when someone says something important that you didn’t quite catch the first time.

I personally hate them. They were a constant cause of conflict with an ex-girlfriend who adored them. Not quite the reason we broke up, but they may have played a part.

The folks at WhatsApp must have heard my cries of frustration, as they’ve finally made it possible to at least abbreviate the misery.


The app now has a Fast Playback option, which lets you listen to a message at 1.5x or 2x speed, without changing the pitch of the sender’s voice.

Telegram has had a similar feature for years. But for people still on WhatsApp, Fast Playback provides a simple way to accelerate those torturously rambling voice messages. Here’s how to use it:

  1. Hit Play on the voice message you want to speed up.
  2. When the message starts playing, tap the 1x icon to increase the speed to 1.5x or 2x.

The feature might have arrived too late to salvage my relationship, but it could still save countless others.



Microsoft Build touts Power Apps, Cosmos DB enhancements to develop code faster

Elevate your enterprise data technology and strategy at Transform 2021.

At Microsoft’s Build conference this week, CEO Satya Nadella was focused on speed. “It’s all about that developer velocity,” he promised, as the company unveiled tools and services that would enable developers to turn ideas into software stacks faster.

The annual event has something for both traditional developers and newer developers who use spreadsheets and other “low-code” tools. Key announcements this week included the integration of AI technologies with the Microsoft Power FX low-code programming language and enhancements to Cosmos DB.

Turning data into dashboards

The Microsoft Power Platform allows non-technical users to create, automate, and analyze data themselves and not have to wait for developers to build the applications and processes for them. Power BI is a collection of low-code and no-code tools that turn complex data into reports and interactive dashboards. Analysts can use Power Apps to build data applications and processes.

Integrating AI into Power FX will make it easier to use natural-language input and “programming by example” techniques when developing with Power Apps. Because Power FX is a formula-based tool built on Microsoft Excel, people can write custom code without having to learn traditional programming languages.

Power FX is a “low-code programming language for everyone,” Microsoft program manager Greg Lindhorst said.

While this approach offers plenty of benefits, there are limits to how much coding the world can handle. The Excel lovers who can create elaborate interactive spreadsheets will be overjoyed to write even more complex functions that can trigger more elaborate dashboards. But casual spreadsheet users will find there is still a steep learning curve as they struggle to keep track of complex syntax and other gotchas that drive neophytes mad.

It’s low code, not no code, after all.

Teams as more than video

Power Apps is natively integrated into all Microsoft cloud offerings, including Microsoft Teams (Office 365), business apps (Dynamics 365), and the developer cloud (Azure). With an embedded app studio, Teams is more than just a place for chat and video calls. At Microsoft Build, the company tried to position the remote collaboration tool as a fully customizable platform for delivering apps.

This integration could increase the amount of custom code within an organization. While this feature won’t mean much to average users, giving teammates the ability to not just chat but also create code could be amazing. Power users would be able to share their code over Teams, and others can extend it.

A few clever hacks can save millions of hours of work.

Enhancing Cosmos DB

Cosmos DB is one of Microsoft’s flagship tools on Azure, and it remains one of the simplest and most flexible ways for developers to store data. Microsoft emphasized cost containment and serverless options for Cosmos DB.

The biggest addition may be a customized cache. In the past, Azure users could stand up a Redis instance to absorb bursts of repeated traffic. The new cache is optimized specifically for Cosmos DB.

The cache is priced like a regular instance, based on its compute power and the amount of RAM, the most important parameter for deciding how much data to cache. When the cache hits, no Cosmos DB charge is incurred, effectively trading the seemingly unbounded cost exposure of database queries for the fixed monthly cost of a caching machine.
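That trade-off is easy to model. The sketch below uses made-up rates (the per-million-request price and the monthly cache cost are illustrative, not Azure’s actual pricing) to show how a fixed-cost cache caps an otherwise traffic-proportional database bill:

```python
def monthly_db_cost(requests, cost_per_million=0.25):
    """Pay-per-query: the bill scales with traffic (rate is made up)."""
    return requests / 1_000_000 * cost_per_million

def monthly_cost_with_cache(requests, hit_rate, cache_cost, cost_per_million=0.25):
    """Cache hits incur no database charge; only misses reach the database.
    The cache itself is a fixed monthly instance cost."""
    misses = requests * (1 - hit_rate)
    return cache_cost + monthly_db_cost(misses, cost_per_million)

if __name__ == "__main__":
    reqs = 2_000_000_000  # two billion reads per month
    print(f"no cache:   ${monthly_db_cost(reqs):.0f}")
    print(f"with cache: ${monthly_cost_with_cache(reqs, hit_rate=0.9, cache_cost=200):.0f}")
```

With these illustrative numbers, a 90% hit rate cuts the monthly bill roughly in half, and the cache turns an open-ended cost into a mostly fixed one.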

Caching helps with high loads that arrive in big, concentrated bursts of activity. The Cosmos DB team is also emphasizing the opportunity to deploy serverless loads for intermittent applications and those that might still be in testing. The serverless version became generally available at Microsoft Build.

Cosmos DB users tend to be more serious developers with a grander set of assignments than power-spreadsheet users working with the Power BI platform. The new features are aimed at making it easier and faster for developers to start storing data in Cosmos DB and also help contain the costs (or even drive them lower).

Software development for everyone

Nadella’s goal is to push software development into every corner of the world. He pointed out that the number of developers at non-technology companies has grown faster than the number at technology companies, making low-code tools ideal for those environments.

“In the automotive industry, there were more software engineers than mechanical engineers hired over the last year,” he noted in his keynote address.





Chrome 91 is said to be 23% faster thanks to JavaScript improvements

Despite the obsolescence and, now, death of Flash, the web has never been richer or more interactive. That’s partly thanks to JavaScript, the much-maligned yet widely used programming language that powers the web and many apps both in browsers and on desktops. Its power doesn’t come without a price, though, and JavaScript performance has long been a bottleneck for web browsers. That’s why Google is so proud that the latest Chrome release is significantly faster and uses less memory, which will be music to users’ ears.

For all the features that JavaScript enables, it is still a programming language that humans use to type out programs. These mostly human-readable programs still need to be translated into a language that the machine can understand and run. This translation process, called interpretation and compilation, is one of JavaScript’s main performance bottlenecks in web browsers.

Google says that it has found a middle ground between starting a JavaScript program quickly and making it run fast, which usually means a slower start. This comes via a new Sparkplug compiler for the V8 JavaScript engine that Chrome uses, which, according to the company, yielded up to 23% speed improvements in Chrome 91. The improvements, however, don’t end there.
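The tiering idea behind that middle ground can be sketched in a few lines. This toy runtime is illustrative only (it does not model V8’s actual interpreter/compiler pipeline): it starts every function on a slow path and switches to a fast path once the function gets “hot”:

```python
class TieredRuntime:
    """Toy model of tiered execution: run a function on a slow path until
    it gets 'hot', then switch to a fast path. Illustrative only."""

    def __init__(self, slow_fn, fast_fn, hot_threshold=3):
        self.slow_fn, self.fast_fn = slow_fn, fast_fn
        self.hot_threshold = hot_threshold
        self.calls = 0
        self.compiled = False

    def __call__(self, *args):
        self.calls += 1
        if not self.compiled and self.calls >= self.hot_threshold:
            self.compiled = True  # the function is hot: "compile" it once
        fn = self.fast_fn if self.compiled else self.slow_fn
        return fn(*args)

if __name__ == "__main__":
    square = TieredRuntime(lambda x: x * x, lambda x: x * x)
    for n in range(1, 5):
        print(n, square(n), "compiled" if square.compiled else "interpreted")
```

The trade-off mirrors the article’s point: the slow path starts instantly, the fast path costs something up front, and a middle tier like Sparkplug tries to get most of the speed without most of the startup cost.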

Chrome now also uses a well-known trick in software optimization where code is packed more closely together in memory so that the CPU doesn’t jump around when performing the most basic operations. Google says that this improvement is even more relevant for devices running on Apple’s powerful M1 chip, like the M1 Macs and the new M1 iPad Pro.

Although it is arguably the most popular and most-used browser in the world, Chrome is also widely regarded to be a resource hog, both in CPU and memory. This 23% speed boost is no small matter and will eventually also translate to longer battery life on laptops and mobile devices.



Google Says Chrome Is Now 23% Faster in Version 91

Google has steadily been improving the performance of the Chrome web browser, and you’re about to see the biggest jump yet.

Rolling out in Chrome version 91 are some changes that make the browser up to 23% faster.

The performance improvements are largely thanks to changes in the underlying JavaScript execution under the hood of Chrome. Google says that it has introduced a new Sparkplug compiler and short built-in calls to help save more CPU time when using Chrome on your laptop or computer.

More specifically, that adds up to about 17 years’ worth of CPU time saved every day, per Google’s estimate. How does it work? Well, the new JavaScript compiler is able to fill the gap between needing to start executing quickly and optimizing the code for maximum performance.

“Short built-in calls optimize where in memory we put generated code to avoid indirect jumps when calling functions,” said Thomas Nattestad, a Chrome product manager at Google, in a blog post announcing the changes.

Over the past few releases, Google has made Chrome faster, addressing the common complaints that it takes up too much RAM and system resources. In March 2021, Google claimed that Chrome version 89 could take up to 22% less memory on Windows 10. Before that, Chrome version 85 introduced 10% faster page load times.

Chrome version 91 originally rolled out on May 25 to the stable desktop channel and comes with many security fixes as well as these performance improvements. Other features improve the experience on Android, macOS, Windows, and Linux.

Google Chrome version 91 should have already rolled out to your PC automatically. If you’re not already seeing it, you can force the update in a few steps. Just click the three vertical dots next to your profile picture, then go to Help > About Google Chrome. From there, Chrome will download the update in a few minutes depending on your internet speed. You’ll then have to restart the browser to apply it.

On a related note, Microsoft also rolled out version 91 of its Edge browser this week. That browser is based on the same open source Chromium engine as Google Chrome. It comes with the performance-improving Sleeping Tabs feature, as well as more ways to save money as you shop.




Nvidia’s Ada Lovelace and AMD RDNA3 Might Be 3 Times Faster

While many people still can’t get their hands on one of the graphics cards released in 2020, Nvidia and AMD are both already working on the next generation of GPUs. Using the Nvidia Ada Lovelace and AMD RDNA 3 architectures, these new graphics cards are rumored to offer performance that was previously unheard of. Kopite7kimi, a reliable leaker on the GPU scene, revealed some exciting information that tells us more about these cards.

The cards have not been given official names as of yet, but the leaker talks about Nvidia’s Ada Lovelace (AD102), Nvidia’s Hopper (GH202), and AMD’s RDNA 3 (Navi 3X). While nothing is set in stone yet, the most exciting piece of information is pretty straightforward — there might be a huge jump in performance.

Compared to Nvidia’s Ampere GA102 architecture, the new AD102 chip is rumored to offer a 2.2x performance increase. As if that wasn’t enough, AMD’s Navi 3X is said to take things a step further with a performance increase of up to 2.5x.

The cards in question are most likely to be based on a 5nm process node, as opposed to the 8nm node used in, for example, RTX 3070 cards. The new GPUs are also rumored to be manufactured by either TSMC or Samsung. A drastic increase in power consumption is likely to accompany the improved frames per second (fps), necessitating even stronger PSUs.

Nvidia and AMD CEOs

Additional stats are even more impressive. According to the leak, a single graphics card might come with up to 100 teraflops of FP32 compute power. If that proves true, it would be an increase of over 70 TFLOPs compared to Nvidia’s flagship RTX 3080.

It seems that Ada Lovelace will be a relatively small transition from Nvidia’s current Ampere technology, but Hopper is shaping up to be the manufacturer’s next huge GPU revolution. While the AD102 chip is set to be a straight upgrade with small architectural changes, Hopper is likely to be something else entirely, with a new approach to design and power requirements, as well as a boost in both rasterization and ray-tracing performance.

It’s possible that the future AMD RX 7000 cards will outperform Nvidia’s RTX 4000 series, at least to begin with, because AMD is likely to beat Nvidia in the race to offer the first MCM-powered graphics cards. On the other hand, Nvidia is set to follow Ada Lovelace with Hopper, an MCM-based design expected to offer up to a 3x jump in performance.

In terms of today’s hardware, it almost seems like the sky is the limit, but Kopite7kimi also talks about a possible bottleneck for both Nvidia and AMD — memory bandwidth. With GPUs as powerful as these, GDDR6X will not suffice. This might force manufacturers to go back to HBM (high bandwidth memory) or perhaps work on a new GDDR memory standard.
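A rough arithmetic-intensity estimate shows why bandwidth becomes the worry. Assuming roughly 1 TB/s of GDDR6X bandwidth (about what an RTX 3090 offers; the figure is illustrative, since these future cards’ memory specs are unknown), a 100-TFLOP GPU would need to perform on the order of 100 floating-point operations per byte fetched just to keep its compute units busy:

```python
# Back-of-envelope: how much arithmetic must a GPU do per byte of memory
# traffic to keep its compute units busy?
flops = 100e12      # rumored FP32 throughput: 100 TFLOPs
bandwidth = 1e12    # ~1 TB/s, roughly RTX 3090-class GDDR6X (illustrative)
flops_per_byte = flops / bandwidth
print(flops_per_byte)  # 100.0 operations per byte fetched
```

Workloads that can’t reuse each byte that many times would leave the GPU waiting on memory, which is exactly why caches close to the GPU (or a faster memory standard) come up next.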

The solution to memory bandwidth problems might also lie in close-to-GPU caches that could be helpful in avoiding possible bottlenecks. This is a technology we’re already familiar with thanks to AMD’s Infinity Cache, which has proven to be highly effective in its RX-6000 series cards.

The first rumors about Nvidia’s next GeForce RTX series and AMD’s next Radeon RX series surfaced in late 2020, and they are in line with the leaks Kopite7kimi reports. Another reliable leaker, KittyYYuko, has also reported similar findings about AMD’s next generation of cards.

However, as the release of these GPUs is unlikely to happen before 2022, or even 2023, it’s a good idea to wait for more information to surface.

