Nvidia’s RTX 4000 GPUs get new specs, and it’s not all good news

Nvidia’s upcoming Ada Lovelace graphics cards just received a new set of rumored specifications, and this time around, it’s a bit of a mixed bag.

While the news is good for one of the GPUs, the RTX 4070 actually had its specs cut — but the leaker says this won’t translate to a lower price.

And TBP, 450/420?/300W.

— kopite7kimi (@kopite7kimi) June 23, 2022

The information comes from kopite7kimi, a well-recognized name when it comes to PC hardware leaks, who has just revealed an update to the specifications of the RTX 4090, RTX 4080, and the RTX 4070. While we’ve already heard previous whispers about the specs of the RTX 4090 and the RTX 4070, this is the first time we’re getting predictions about the specs of the RTX 4080.

Let’s start with the good news. If this rumor is true, the flagship RTX 4090 seems to have received a slight bump in core count. The previously reported number was 16,128 CUDA cores, and this has now gone up to 16,384 cores, which translates to an upgrade from 126 streaming multiprocessors (SMs) to 128. As for the rest of the specs, they remain unchanged — the current expectation is that the GPU will get 24GB of GDDR6X memory at 21Gbps across a 384-bit memory bus.
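The core-count figures in the rumor follow directly from the SM counts. As a back-of-the-envelope check (assuming Ada keeps Ampere’s 128 CUDA cores per SM, which is exactly what the leaked numbers imply):

```python
# Illustrative arithmetic only: recent Nvidia architectures pack
# 128 CUDA cores into each streaming multiprocessor (SM).
CORES_PER_SM = 128

def cuda_cores(sm_count: int) -> int:
    """Total CUDA cores for a GPU with `sm_count` SMs."""
    return sm_count * CORES_PER_SM

print(cuda_cores(126))  # 16128 -- the previously reported RTX 4090 figure
print(cuda_cores(128))  # 16384 -- the updated figure
```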

The RTX 4090 includes the AD102 GPU, which maxes out at 144 SMs, but it seems unlikely that the RTX 4090 itself will ever reach such heights. The full version of the AD102 GPU is probably going to be found in an even better graphics card, be it a Titan or simply an RTX 4090 Ti. It’s also rumored to have monstrous power requirements. This time around, kopite7kimi didn’t reveal anything new about that card, and as of now, we still don’t know for a fact that it even exists.

Moving on to the RTX 4080 with the AD103 GPU, it’s said to come with 10,240 CUDA cores and 16GB of memory. However, according to kopite7kimi, it would rely on GDDR6 memory as opposed to GDDR6X. Seeing as the leaker predicts an 18Gbps speed, that would actually make the memory slower than the RTX 3080’s 19Gbps GDDR6X. The core count is exactly the same as in the RTX 3080 Ti. So far, this GPU doesn’t sound very impressive, but it’s said to come with a much larger L2 cache that could potentially offer an upgrade in gaming performance over its predecessors.

Jacob Roach / Digital Trends

When it comes to the RTX 4070, the GPU was previously rumored to come with 12GB of memory, but now, kopite7kimi predicts just 10GB across a 160-bit memory bus. It’s said to offer 7,168 CUDA cores. While it’s certainly an upgrade over the RTX 3070, it might not quite be the generational leap some users are hoping for. It’s also supposedly not going to receive a price discount based on the reduction in specs, but we still don’t know the MSRP of this GPU, so it’s hard to judge its value.

Lastly, the leaker delivered an update on the power requirements of the GPUs, which have certainly been the subject of much speculation over the last few months. The predicted TBP for the RTX 4090 is 450 watts. It’s 420 watts for the RTX 4080 and 300 watts for the RTX 4070. Those numbers are a lot more conservative than the 600 watts (and above) that we’ve seen floating around.

What does all of this mean for us — the end users of the upcoming RTX 40-series GPUs? Not too much just yet. The specifications may yet change, and although kopite7kimi has a proven track record, they could be wrong about the specs, too. However, as things stand now, only the RTX 4090 seems to mark a huge upgrade over its predecessor, while the other two represent much more modest changes. It remains to be seen whether the pricing will reflect that.

Editors’ Choice

Repost: Original Source and Author Link


Intel Arc Alchemist A380 Discrete Graphics Card: Specs Leak

Intel’s upcoming discrete GPUs, dubbed Intel Arc Alchemist, are coming next year, and some new leaks reveal what kind of performance we can expect from them.

According to the leak, one of the upcoming GPUs, the A380, is likely to offer performance similar to that of Nvidia’s GTX 1650 Super, an entry-level video card from Nvidia’s previous generation of graphics.

Image credit: Wccftech

The information comes from TUM_APISAK on Twitter, a well-known source for graphics card-related rumors and leaks. The tweet in question talks about some of the specifications of the upcoming Intel Arc A380 graphics card and reveals the expected naming convention Intel might use. It seems that Intel is going to name the new cards A***, with the numbers changing to correspond to the performance tier of that specific card.

What we’re seeing in TUM_APISAK’s reveal is most likely the desktop variant of this graphics card. In terms of specifications, the A380 is said to be based on an Alchemist (Xe-HPG DG2) GPU fabricated on TSMC’s 6nm process node. Its 8 Xe cores will house 128 execution units (EUs). The top model of this lineup will allegedly have 32 Xe cores and 512 EUs.

The card is also rumored to have an impressive clock speed of 2.45GHz. Whether this frequency is the boost clock or the base clock remains to be seen, but such speeds put the A380 within range of AMD’s Navi 22 and Navi 23 graphics cards. In addition, the card will have 6GB of GDDR6 memory. It has also been said that all Arc Alchemist cards will come with ray tracing support and XeSS, Intel’s image upscaling technology.

There was no mention of the bus, but previous leaks suggest a 96-bit interface. In the desktop version of the card, we can expect to see 16Gbps pin speeds, which on a 96-bit bus works out to 192GB/s of bandwidth. The laptop version is said to be slightly worse, with 14Gbps pin speeds and 168GB/s of bandwidth. The Intel Arc Alchemist A380 is likely going to be fairly conservative with power, with a TDP of 75W.
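Those bandwidth figures come from the standard formula: bus width in bits, divided by 8 to convert to bytes, multiplied by the per-pin data rate. A quick sketch, using the leaked (not confirmed) bus width and pin speeds:

```python
def bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * pin_speed_gbps

print(bandwidth_gbs(96, 14))  # 168.0 GB/s -- rumored laptop A380
print(bandwidth_gbs(96, 16))  # 192.0 GB/s -- rumored desktop A380
```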

Intel Arc A380 Graphics
2.45GHz 6GB

perf 1650S#IntelArc #DG2

— APISAK (@TUM_APISAK) December 2, 2021

TUM_APISAK hasn’t provided any benchmarks, but he did suggest that the performance of this card is going to rival that of the Nvidia GeForce GTX 1650 Super. While that is a rather dated card by now, it continues to be one of the best budget graphics cards out there. This bodes well for the Arc Alchemist.

The pricing of the card hasn’t yet been revealed, but the launch is still a few months away. Remember, the performance and specification figures are leaks and may not be accurate. If they prove to be true, this card is likely to be rather inexpensive, with a price of around $250 or less.



Micro Center Spills the Beans on Alder Lake Price, Specs

U.S. retailer Micro Center just leaked the prices for Intel’s upcoming Alder Lake processors. Twitter leaker @momomo_us grabbed a screenshot of the Core i7-12700K and Core i9-12900K listings before they were removed by Micro Center — and both show a decent price increase over 11th-gen chips.

Micro Center listed the Core i7-12700K for $470 and the Core i9-12900K for $670. That’s a $70 increase on the Core i7 model and a $120 increase on the Core i9. We’ve seen a few leaks related to Alder Lake pricing, some of which suggested retail prices of above $1,000 for the flagship chip. This listing confirms that the chips aren’t that expensive, though they still cost more than the previous generation.


— 188号 (@momomo_us) October 22, 2021

Although we recommend viewing any prerelease information with skepticism, Micro Center’s pricing lines up with some previous leaks. In October, a Reddit user was able to obtain two Core i9-12900K processors from a retailer, and they said they paid $610 for each of them. That lends more credibility to the prices listed by Micro Center.

Alder Lake looks more expensive than the previous generation, but more interesting is how the pricing stacks up to AMD. For the Core i7-12700K, it’s $20 more expensive than AMD’s competing Ryzen 7 5800X. However, the Core i9-12900K is significantly cheaper than AMD’s Ryzen 9 5950X. That processor launched for $799, though it often sells for around $750 now. Either way, Intel’s flagship is much cheaper at $670.

Outside of price, the listings revealed some other key details about Alder Lake, most of which we knew from previous leaks. The listings confirm a November 4 release date, which has been rumored for some time. It’s possible that Intel is looking to release them further into November — previous leaks pegged November 19 as the release date — but November 4 seems likely.

The listings also confirm some specs. The Core i7-12700K comes with 12 cores (eight performance and four efficiency cores), 20 threads, and a base clock speed of 3.6GHz. The Core i9-12900K comes with 16 cores (eight performance and eight efficiency cores), 24 threads, and a base clock speed of 3.2GHz.
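Those thread counts follow from Alder Lake’s hybrid design: the performance cores are Hyper-Threaded (two threads each), while the efficiency cores each run a single thread. A quick sanity check:

```python
def alder_lake_threads(p_cores: int, e_cores: int) -> int:
    # Performance cores are Hyper-Threaded (2 threads each);
    # efficiency cores run one thread each.
    return p_cores * 2 + e_cores

print(alder_lake_threads(8, 4))  # 20 -- Core i7-12700K
print(alder_lake_threads(8, 8))  # 24 -- Core i9-12900K
```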

The base clock speeds aren’t too high, but like previous Intel generations, Alder Lake has high boost clock limits. The listings say the Core i9 model can boost up to 5.2GHz, while the Core i7 model can boost up to 5GHz — both with a 125-watt TDP.

The listings also point out that neither processor will come with a cooler. That’s hardly a surprise: Intel hasn’t bundled a cooler with its K-series processors for the past several generations. Thankfully, many CPU cooler makers are offering free upgrade kits for the new LGA 1700 socket.

Intel hasn’t officially announced a release date for Alder Lake yet, though an increasing number of leaks suggest that the launch date is coming soon. Intel is hosting its Intel Innovation event next week, starting on October 27, where we expect to hear more about Alder Lake’s launch.



Battlefield 2042 technical playtests tipped for next week as PC specs surface

Battlefield 2042 is quickly approaching, but before it arrives, DICE and Electronic Arts will want to put the game through its paces with alpha and beta tests. As it turns out, the first of those tests could be right around the corner, as new reports are claiming that they could kick off as soon as next week. In addition, we’re also learning about the minimum and recommended PC specifications ahead of this series of tests.

Neither EA nor DICE has confirmed plans for beta testing Battlefield 2042 next week, but according to Video Games Chronicle (VGC), they’re preparing for it behind the scenes. VGC claims to have seen email invites sent out to select users, indicating that EA and DICE will host six technical playtests between August 12 and August 15.

According to VGC, five of those technical playtests will last for three hours, while the final one will last for six hours. Space will apparently be very limited as only a few thousand participants are slated to be accepted into this early round. EA’s emails have also revealed the minimum and recommended specifications for Battlefield 2042, which you can see below:

Minimum specs:
• OS: 64-bit Windows 10
• Processor (AMD): AMD FX-8350
• Processor (Intel): Core i5 6600K
• Memory: 8GB
• Video Memory: 4GB
• Graphics card (AMD): AMD Radeon RX 560
• Graphics card (NVIDIA): Nvidia GeForce GTX 1050 Ti
• DirectX: 12
• Online Connection Requirements: 512Kbps or faster internet connection

Recommended specs:
• OS: 64-bit Windows 10
• Processor (AMD): AMD Ryzen 5 3600
• Processor (Intel): Intel Core i7 4790
• Memory: 16GB
• Video Memory: 8GB
• Graphics card (AMD): AMD Radeon RX 5600 XT
• Graphics card (NVIDIA): Nvidia GeForce RTX 2060
• DirectX: 12
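For illustration, the requirements above boil down to simple threshold checks. A minimal sketch (the `my_rig` values are hypothetical, and a real launcher checks far more than RAM and video memory):

```python
# RAM/VRAM thresholds taken from EA's leaked requirements; the helper
# function and the sample system below are purely illustrative.
MINIMUM = {"ram_gb": 8, "vram_gb": 4}
RECOMMENDED = {"ram_gb": 16, "vram_gb": 8}

def meets(spec: dict, system: dict) -> bool:
    """True if the system matches or exceeds every listed threshold."""
    return all(system.get(key, 0) >= value for key, value in spec.items())

my_rig = {"ram_gb": 16, "vram_gb": 6}  # hypothetical machine
print(meets(MINIMUM, my_rig))       # True
print(meets(RECOMMENDED, my_rig))   # False -- only 6GB of video memory
```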

Keep in mind that these are only the minimum and recommended specifications for the Battlefield 2042 technical playtests. While we would expect the finished product to have similar hardware requirements, there’s always a chance that some specifications could change between now and release. Additionally, VGC says that these technical playtests will only be taking place on Xbox Series X|S, PS5, and PC – sorry Xbox One and PS4 players – and that they were originally scheduled to take place in July.

So, it may not be much longer before we get official word of some technical playtests for Battlefield 2042. We’ll let you know if EA and DICE confirm this report, so stay tuned for more.



Nintendo Switch vs. Steam Deck: Specs, Performance, and More

Once Valve announced the new 7-inch Steam Deck handheld, the comparisons were immediate: What’s better, the Deck or Nintendo’s immensely popular Switch? Will the Switch finally have some serious mobile console competition? Will PC players be willing to hop over to a handheld they can fit in their backpack?

However, at a closer look, there are many significant differences that set these two handhelds apart, making them suitable for very different audiences. We’re going over all the details so you can figure out which may be best for you.

Note: You may also want to compare Nintendo’s latest OLED version of the Switch with the Steam Deck. While similar to the Switch, the OLED Switch does have some updates worth noting if you want to cover all your options before you buy.

Specs and performance

A quick rundown of the specs shows how different the Deck’s goals are from the Nintendo Switch. The Deck wants enough power and speed to handle a wide array of PC games the way a gaming laptop could, while the Switch is more concerned with being a highly portable console for Nintendo’s titles, which are generally less demanding and don’t require the same performance. In our spec list below, note the higher RAM and greater storage space the Deck provides, while the Switch has more output options and better battery life.

Nintendo Switch | Steam Deck

Dimensions: 10 x 4.2 x 0.55 inches | 11.7 x 4.6 x 1.9 inches
Weight: 0.66 pounds without controllers, 0.88 pounds with controllers | 1.47 pounds
Processor: Nvidia custom Tegra X1+ | AMD custom Zen 2 APU
Storage: 32GB of flash storage, expandable via microSD | 64GB to 512GB of flash storage, expandable via microSD
A/V output: HDMI out (via dock) | None built in
I/O: One USB-C; dock adds one USB 3.0 and two USB 2.0 ports | One USB-C with DisplayPort 1.4 alt mode
Communication: Wi-Fi, Bluetooth 4.1 | Wi-Fi, Bluetooth 5.0
Controller: Joy-Con or Pro Controller | Built into system
Screen size and resolution: 6.2 inches, 720p | 7 inches, 1280 x 800
Battery life: Up to 6.5 hours (original model), 9 hours (newer model) | 2 to 8 hours
Physical media: Proprietary cartridge | None
Availability: Available now | December 2021 or later
Price: $299.99 | $399 to $649


At first glance, the Switch and the Deck have very similar designs. They both use joystick controls on either side of the screen, both have similarly sized displays (6.2 inches on the Switch, 7 inches on the Deck), and both use a charging dock that can be connected to a larger screen if you prefer (neither handheld supports 4K resolution when outputting). The Deck is noticeably heavier, however, at 1.47 pounds versus the Switch’s 0.88 pounds with Joy-Cons attached.

There are, however, some notable differences. First, the Deck’s controls cannot be removed or combined, like the Nintendo Switch’s can. They are there to stay. Second, control placement is a bit different, with the joysticks located on the upper corners instead of being staggered like the Switch controls, and buttons are generally smaller to make room for more control options, which we’ll discuss below.


Nintendo’s controls rely on the somewhat infamous Joy-Cons, which offer two thumb-friendly joysticks, two sets of four clustered buttons that can be used for a variety of control options, and two shoulder buttons. The Joy-Cons can be detached and combined into a more traditional controller, which makes switching to a larger screen or more comfortable position much easier.

Valve’s Steam Deck, meanwhile, doesn’t have the detachment options, but it does go all-in on control inputs with a remarkably wide variety. There are once again two thumbsticks here, one four-cluster of buttons, and a D-Pad, but also much more. On the left and right sides are two small thumb trackpads for touch-sensitive alternatives when controlling your game. On the top, you will find four shoulder buttons, two front and two back. And on the back of the handheld, your fingers will rest against four back buttons that can be programmed for all kinds of additional choices.

The Deck’s many options provide an incredible amount of versatility in how you can control a game, but they also involve more of a learning curve than the Switch, which anyone can sit down and play immediately.



Both handhelds offer compatibility with additional control options like a mouse and keyboard, but they do it in very different ways, and the Steam Deck is the clear winner here.

The Nintendo Switch can support accessories like a keyboard, but only if you plug them into the USB-C port or the dock’s USB port. While the Nintendo Switch does have Bluetooth, it’s a restricted version that only works with certain designated products, and things like a mouse or keyboard aren’t included. There are a number of handheld controllers that can connect with the Switch, but they don’t provide much of an advantage beyond being more comfortable than the combined Joy-Cons.

The Steam Deck, meanwhile, has the open kind of Bluetooth that allows for connections with all kinds of peripherals, including many mice and keyboards. If you’d like a few ideas, you can peruse our wireless mouse and wireless keyboard guides to get an idea of what’s available. If you’d really rather stay away from wireless accessories, the Deck’s dock offers USB-A 3.1 and USB-A 2.0 ports for wired connections.

Supported games

One of the Steam Deck’s most-touted advantages is that you can access your whole Steam library on it, and with cloud saving, you keep all your progress no matter where you are playing. In addition, Valve has made it clear that other game stores will also be compatible, such as the Epic Games Store and uPlay — and there’s even the potential to play Xbox Cloud Gaming on it, although that will need more research. Basically, if you can play it on PC, there’s a good chance you can play it on the Deck.

Of course, there are limitations. Developers that don’t have their games on Steam will make it a little harder to get their titles. You’ll need enough storage to handle larger games, which probably means relying on an SD card. While the Deck’s specs are impressive, the most demanding games may struggle on the handheld. We’ve also seen some bugs crop up with things like anti-cheat coding, although Valve is working to fix those aspects.

On the Switch, meanwhile, you can only buy games through the Nintendo eShop (or their physical versions). Of course, you are not limited to Nintendo-only titles on the Switch — plenty of AAA games have made their way to the Switch, sometimes even before consoles like PlayStation or Xbox (you can also jailbreak the device, although we don’t really suggest this, as it invites other problems). But the focus is certainly on Nintendo’s own games and franchises, which are not available on platforms like Steam.

This creates a stark choice: What games are more important to you? Do you want a handheld that can handle Skyward Sword and Super Smash Bros.? Or are you looking forward to playing Factorio or Destiny 2 on the go?

The Steam Deck handheld and the Steam platform of games.

Pricing and availability


One final difference between the Switch and Deck: The Switch is significantly cheaper, starting at $299.99 for the standard model. The Steam Deck starts at $399 for its base model and goes all the way up to $649. Given the specs, this isn’t exactly surprising, but it’s clear the Switch is the budget pick, while the Steam Deck makes more sense if you have more to spend.

As for availability, the Nintendo Switch is widely available (unless you want the OLED version, which ships in October), but the Steam Deck starts shipping in December, with staggered shipments throughout the months afterward, depending on your location and when you reserved the Deck. In other words, acting quickly will help you get a Deck faster in 2022.


While it’s a little too early to tell — we would like to spend some more time one-on-one with the Steam Deck to really test all its features — the ability to play your Steam library on the go is truly exciting and should inspire many serious gamers to consider handhelds even if they weren’t interested in the Switch. The Switch, meanwhile, continues to be the best and only option if you want to play Nintendo’s latest titles on a console.



Nvidia RTX 3080: Price, Release Date, Specs, and More

The Nvidia GeForce RTX 3000 range brings some much-needed updates to the RTX lineup. These new cards sport the Ampere architecture, which is built on Samsung’s 8nm process. This new generation is different, though, delivering performance increases that Nvidia hasn’t seen in many years.

To help you navigate the range — and hopefully the GPU shortage — we rounded up everything you need to know about every Ampere card. The RTX 3080 is the flagship card, followed by the RTX 3070 and RTX 3060. There’s also the powerful RTX 3090 for workstation purposes and Ti variants of the rest of the range.

Release date and price

The RTX 3080 was announced on September 1, 2020, and it began shipping shortly after, on September 17.

Nvidia launched the card for $699, which is the same price as the RTX 2080 and 2080 Super. The larger and more powerful card of the generation, the RTX 3090, costs $1,500. The 3090, in some sense, replaces both the RTX 2080 Ti and the RTX Titan. It’s marketed as a gaming card and sold in prebuilt gaming PCs, though its 24GB of GDDR6X memory also targets workstation workloads in fields like data science and media production.

Moving down the range, the RTX 3070 replaces the 2070 Super and costs $499. It was released shortly after the RTX 3080 and 3090 in October 2020. The “sweet spot” GPU, the RTX 3060 Ti, was released shortly after in December for $399.

Nvidia announced the RTX 3060 during its CES 2021 press conference, replacing the RTX 2060 and 2060 Super. It has an MSRP of $329 and launched in February 2021.

Finally, Nvidia revealed the RTX 3080 Ti and 3070 Ti during its Computex 2021 keynote. The RTX 3080 Ti launched on June 3 for $1,199, and the 3070 Ti came a week later on June 10 for $599.


The RTX 3080 Founders Edition features a new “unibody” design that’s a bit more modest compared to the reflective aluminum of the RTX 2000-series cards. We found the card to be quite classy in our review, with an angular design that features a matte, dark grey finish with white backlighting instead of the traditional green hue used on prior GeForce cards.

Early leaked images revealed a very industrial-looking card with a dual-fan design, and they turned out to be accurate. Marking a stark departure from older graphics cards where all the fans are aligned on the same side, the RTX 3080 has a fan on each side of the card in an effort to better control the flow of hot and cold air throughout the desktop. Nvidia calls its thermal solution a “dual axial flow” and says it results in twice the cooling performance of its predecessor.

The RTX 3090 uses this same design, only it’s much larger, as the card utilizes a triple-slot format. The RTX 3070, meanwhile, uses a more traditional dual-fan system. For a size comparison, check out the photo below. The RTX 3080 Ti and RTX 3070 Ti both use the dual axial design as the more expensive cards in the range.

The RTX 3060 Ti uses a similar design as the RTX 3070, with dual fans on the front for the Founder’s Edition, as does the RTX 3060.

The rumor mill has been spinning over a new 12-pin PCIe power interface system for some of these new cards, and the RTX 3080 and RTX 3090 will make use of the low-profile connectors. The company stated that the new connector combines the power of dual eight-pin connectors in a space-saving design. The RTX 3070, meanwhile, uses just a single eight-pin connector, as does the RTX 3060 Ti. The RTX 3080 Ti and RTX 3070 Ti utilize the 12-pin connector, which you can split off into an eight-plus-eight-pin connector.

Port selection includes three DisplayPort 1.4a ports and one HDMI 2.1 port. Importantly, HDMI 2.1 supports a variable refresh rate of up to 120Hz in 4K. New OLED TVs have begun to support this standard, which could open up more options for PC gaming in the living room.


Nvidia made some ambitious claims about its new RTX 3000-series cards, and they turned out to be true. First off, it said that both the RTX 3080 and RTX 3070 are more powerful than the current RTX 2080 Ti. As you can see from the chart below, it’s not hard to see how that’s true.

CUDA cores Memory Memory interface Boost clock Graphics card power
RTX 3090 10496 24GB GDDR6X 384-bit 1.70GHz 350W
RTX 3080 Ti 10240 12GB GDDR6X 384-bit 1.67GHz 320W
RTX 3080 8704 10GB GDDR6X 320-bit 1.71GHz 320W
RTX 3070 Ti 6144 8GB GDDR6X 256-bit 1.77GHz 290W
RTX 3070 5888 8GB GDDR6 256-bit 1.73GHz 220W
RTX 3060 Ti 4864 8GB GDDR6 256-bit 1.67GHz 200W
RTX 3060 3584 12GB GDDR6 192-bit 1.78GHz 170W
RTX 2080 Ti 4352 11GB GDDR6 352-bit 1.54GHz 250W
RTX 2080 Super 3072 8GB GDDR6 256-bit 1.82GHz 250W
RTX 2070 Super 2560 8GB GDDR6 256-bit 1.77GHz 215W

The RTX 3080 features 8,704 CUDA cores, double the RTX 2080 Ti’s 4,352 and nearly triple the RTX 2080 Super’s 3,072 (though Ampere counts its doubled FP32 units as CUDA cores, so the figures aren’t directly comparable to Turing’s). It’s also a more power-hungry card, rated at 320 watts versus the 250 watts of the RTX 2080 Ti and RTX 2080 Super.

The RTX 3080 also features 10GB of GDDR6X memory at 19Gbps, which is as fast as it gets. Nvidia said it had collaborated with Micron for the G6X video RAM on the card, which according to the memory partner, delivers a bandwidth of up to 1TB per second. The new Ampere RTX 3080 GPU comes with 30 shader TFLOPs, up from 11; 58 ray tracing TFLOPs, up from 34; and 238 tensor TFLOPs, compared to 89.
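Those TFLOPS figures can be sanity-checked with the usual peak-throughput formula: two floating-point operations per CUDA core per clock (a fused multiply-add), times the core count, times the boost clock. A quick sketch using the boost clocks from the spec table:

```python
def shader_tflops(cuda_cores: int, boost_ghz: float) -> float:
    # Peak FP32 throughput: each CUDA core can retire 2 floating-point
    # operations per clock (one fused multiply-add), so
    # TFLOPS = 2 * cores * clock (GHz) / 1000.
    return 2 * cuda_cores * boost_ghz / 1000

print(round(shader_tflops(8704, 1.71), 1))  # 29.8 -- the RTX 3080's "30 shader TFLOPs"
print(round(shader_tflops(3584, 1.78), 1))  # 12.8 -- the RTX 3060's "13 shader TFLOPs"
```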

Nvidia is also bundling in its new RTX I/O to help improve game loading times on its GPUs. There are three components in play here: new APIs for fast loading and streaming directly from SSD to GPU memory, GPU lossless decompression, and collaboration with Microsoft on the DirectStorage API for Windows.

The result of all this, Nvidia claims, is twice the performance of the original RTX 2080. In real-life performance, the RTX 3080 can consistently run games at 60 fps (frames per second) in 4K, and though results vary widely across different titles, we found in our review that the 4K-at-60-fps claim largely holds up. Even ray tracing in 4K at decent frame rates is possible if you turn on DLSS.

Even bigger claims were made about the RTX 3090, which can reportedly play games in 8K at 60 fps. That’s achieved with DLSS 2.0 rather than native rendering, and while the claim is a stretch, the RTX 3090 is still a monster of a card. The RTX 3070, meanwhile, is supposedly 1.6 times faster than the RTX 2070.

Moving into the mid-range, Nvidia has two offerings: The RTX 3060 Ti and RTX 3060. The 3060 Ti has already proven itself as the sweet spot for price and performance, with an MSRP of only $400. It’s a remarkable card at that price, delivering performance similar to the last-gen RTX 2080 Ti (a $1,200 GPU) in most games.

The RTX 3060’s specs are, for the most part, what we’ve come to expect from Nvidia’s RTX 3000 range. The exception is the 12GB of GDDR6 memory, surpassing the RTX 3060 Ti, 3070, and even 3080.

We have some general performance specs, too: 13 shader TFLOPs, 25 ray tracing TFLOPs, and 101 tensor TFLOPs. Although much lower than the flagship RTX 3080, the RTX 3060’s specs surpass the RTX 2080 Super.

The RTX 3080 Ti and RTX 3070 Ti update the higher-end cards in the range. The RTX 3080 Ti mirrors the RTX 3090 in most specs, utilizing nearly as many CUDA, Tensor, and RT cores. The difference is that it comes with half of the GDDR6X memory, making it a great card for gaming but a weaker choice for data science and 3D modeling.

The RTX 3070 Ti is more of a marginal upgrade. It comes with 256 more CUDA cores and a slightly boosted clock compared to the base 3070. The big upgrade is 8GB of G6X memory, which is able to deliver much higher bandwidth on the same 256-bit bus.


The new cards are all built on the new Ampere microarchitecture, which is fabricated on Samsung’s 8nm process and features 28 billion transistors. Ampere uses second-generation RT cores and third-generation Tensor cores to boost performance.

In our RTX 3080 review, we found the performance of the card to be quite impressive. In our 3DMark Time Spy test, the RTX 3080 performed 15% better than the RTX 2080 Ti, despite the latter costing $500 more than Nvidia’s newest offering, and 28% better than the RTX 2080.

We found that the RTX 3080 delivered a solid performance uplift compared to the RTX 2080, but the gains weren’t quite as high as Nvidia claimed. In Assassin’s Creed Odyssey, the RTX 3080 was able to hit 61 fps at native 4K resolution on Ultra High settings, a 30% performance boost compared to the RTX 2080 Super. Similarly, in the graphics-intensive Battlefield V, the game averaged 97 fps in 4K at Ultra settings, a result that was 33% better than the RTX 2080 Super.

While the generational improvement in performance was expected for a new GPU family, the new Ampere cards perform significantly better at ray tracing, the hallmark feature of the RTX family and an area of struggle for the RTX 2000 series. Ampere’s improvements with artificial intelligence and DLSS improve ray tracing performance, especially when games are rendered at higher resolutions.

In our test of the RTX 3080 on Battlefield V, we found ray tracing capabilities to be far superior when compared to the RTX 2080 Ti. In the mission Tirailleur, we found that the scenes rendered beautifully, with nice reflections and shadows, and the game played at 55 fps in 4K at Ultra settings. When DLSS was enabled, performance jumped to 69 fps, compared to just 45 fps on the RTX 2080 Ti.

Still, there were some performance bottlenecks when ray tracing was turned on. In Fortnite, for example, our review showed that the RTX 3080 was able to squeeze out an average of 23 frames per second when played at 4K with the highest settings enabled. In 1440p (2K) resolution, the game averaged 53 fps. Depending on the game, your monitor’s resolution, and what you hope to achieve, you may want to still leave ray tracing disabled at higher resolutions even on the RTX 3080.

Bringing in the other cards, you can expect around a 20% difference in performance for each step down. The RTX 3070 performs about 80% as well as the RTX 3080, and so on. The Ti models complicate things a bit, though.

The RTX 3080 Ti is a significant upgrade over the RTX 3080 on paper, but it only represents around a 10% improvement in practice. In our RTX 3080 Ti review, we found a 9% improvement in 3DMark Time Spy scores compared to the RTX 3080, and it earned around five extra frames per second in Assassin’s Creed Valhalla and Battlefield V.

We noted a similar performance increase in our RTX 3070 Ti review. We measured an 8% improvement across synthetic and real-world benchmarks, mirroring the RTX 3080 Ti (though without the price premium). That said, the RTX 3070 Ti isn’t an RTX 3080 killer. It’s more of a 1440p card than a 4K one, though it can still run some less demanding games at 4K.



Microsoft Surface Duo 2: Rumors, Specs, Design, and More

In 2020, Microsoft released the Surface Duo, and it didn’t end up being as popular as the company had hoped. Reviewers and early adopters of the device mentioned issues with the software, the cameras, and even performance.

One year later, Microsoft is apparently working on a follow-up to the device, code-named Zeta. It is believed to address a lot of problems from the first Surface Duo and could be coming at the end of this year. Here’s a look at everything we currently know about the device so far.

Price and release date

Jeremy Kaplan/Digital Trends

Microsoft has largely been quiet about the Surface Duo 2, and there is no official word yet on when it can be expected. However, since Microsoft usually releases new Surface hardware in the fall, you can expect the Surface Duo 2 around the same time the original Surface Duo was released. That would be around late October, just in time for the holiday season. But take that with a grain of salt: The pandemic has had an impact on the mobile phone chip industry, and the Duo 2 might be delayed because of it.

Microsoft also has a “flash sale” on the Duo at the moment, selling it for more than half off, which suggests a new model could be released soon. As for pricing on the Duo 2, we expect Microsoft to stay in the same range as the original Surface Duo. Many said that device was too expensive to begin with, but as part of the Surface range, the Duo 2 will again be a pricey proposition: Expect it to come in at around $1,400.

Specs and performance

Jeremy Kaplan/Digital Trends

One of the biggest criticisms of the original Surface Duo was the fact that it featured “last year’s” Qualcomm Snapdragon processor under the hood. It also lacked many features like NFC, a rear-facing camera, and wireless charging. A lot of that is expected to change in the Surface Duo 2, according to rumors from Windows Central.

The Surface Duo 2 could come with the “latest flagship from 2021.” If that holds true, it will likely be the Snapdragon 888 processor, which can be found in other phones like the Galaxy S21 5G. That means the Duo will finally get 5G support for faster connectivity. It will also pick up support for NFC, allowing for contactless payments at subways, restaurants, and other places, a feature that has become increasingly common during the pandemic.

You can also expect the same amount of RAM and storage as the original, which shipped with 6GB of RAM and either 128GB or 256GB of storage. A bump in RAM would be welcome, but it doesn’t appear likely at the moment.

Even so, a jump in processor alone would be a meaningful upgrade. Since the Surface Duo has two screens and is all about multitasking, a newer processor could help boost the responsiveness of the device, making it faster to open apps side by side. Even gaming might see some gains, especially since Microsoft has worked to improve Xbox Game Pass on the Duo, allowing you to use the second screen as a controller.


Jeremy Kaplan/Digital Trends

Rumor has it that the Surface Duo 2 could come with a better camera, as seen on an earlier prototype device for the original Duo. There could be a camera bump on the exterior, which would allow for world-facing photography.

Coming in at 11 megapixels, the camera on the original Duo isn’t necessarily bad, but the performance in low-light conditions made it hard to use for some. The fact that the screen had to be turned around each time the camera was needed for a world-facing photo also complicated things. Sometimes, the software was buggy, and the camera or the screen would not activate.

With a dedicated camera on the rear, the Duo 2 could work more like a traditional phone, letting you use the external camera for photos of the world around you instead of depending on the selfie camera for everything. This matters at a time when other phone makers are including dual-, triple-, or even quad-camera setups on their devices.


Jeremy Kaplan/Digital Trends

As far as the software goes, you can expect the Duo 2 to ship with Android 11. The original Surface Duo came with Android 10, and, as of writing, still hasn’t gotten the Android 11 update. The reason for the delay could be the fact that Microsoft is working with Google to tweak Android on the Duo 2 first. But that’s just a rumor.

Android 11 comes with improvements for dual-screen devices and foldables, so don’t count it out for the Duo 2. Other dual-screen phones like the Samsung Galaxy Z Fold 2 have already gotten Android 11 updates, and Google is already working on beta testing Android 12 on Pixel phones. So even if the Duo 2 launches with Android 11, it might still be a year old. However, Microsoft did say it was committed to three years of updates for the original Duo, so if the phone comes with Android 11, then it will eventually get Android 12, too.


Jeremy Kaplan/Digital Trends

Along with the camera changes, the design might get a bit of a tweak on the Duo 2. There could be more rounded corners with bigger displays as well as slimmer bezels. This is “for a more streamlined fit and finish,” according to Windows Central. At the same time, the dimensions of the Duo won’t change from the original. The device was already plenty thin and light, coming in at 0.19 inches in thickness and a little under half a pound in weight.

There’s even a chance Microsoft’s Duo 2 could go the way of the Surface Pro lineup. Microsoft had filed a patent for a kickstand on a hinged device, which the Duo just so happens to be. The patent never actually uses the word “kickstand,” though, referring instead to an “integrated support,” which suggests it might not really be for the Duo after all.



Tech News

Surface Duo 2 might be arriving this Fall with better specs

Samsung and other manufacturers, including Google, seem to be heading toward a foldable phone future. Microsoft is, as well, but it has a very different definition of foldable. It describes its Surface Duo as a “dual-screen foldable” and admittedly made quite a good case for it nearly two years ago. Its delivery, however, was anything but convincing, and the hardware was pretty dated even before it went out the door. For those still holding a torch for the device, the Surface Duo 2 could prove to be what Microsoft’s first dual-screen foldable should have been, and, based on the latest leak, it should be arriving soon.

The problem with the Surface Duo was more hardware than software. The latter is easier to fix with updates and patches, but mobile hardware can’t be swapped out once it ships. When the Surface Duo was first announced, it was already revealed to be using less-than-recent hardware, even more so when it finally launched a year later.

Based on Windows Central’s information, the Surface Duo 2 will have up-to-date specs, at least for 2021. That means a Snapdragon 888 at the very least, and hopefully more RAM as well. The single main camera doesn’t necessarily need a companion, but it should be upgraded, too.

This, of course, depends on whether the device will actually ship this year. According to the site, the good news is that it might finally happen this fall, around September or October. Microsoft repeatedly delayed the launch of the first Surface Duo, though, so even fans have reason to be only cautiously optimistic about this timeline.

Timing is everything in this business, and, unfortunately, the Surface Duo 2 won’t be shipping with Android 12 because of it. That’s acceptable provided Microsoft ships a more stable version of the software, and it has had more than a year to get that right.



OnePlus Nord CE 5G specs leak has everything you want to know

Last year, OnePlus embarked on a new journey towards the mid-tier smartphone market with its new Nord series, and, at least according to the company, they are hot-selling items in markets where they launched. It’s a bit curious, then, that the next OnePlus Nord that the company will launch might actually be a slight downgrade from the first one. Regardless, the OnePlus Nord CE 5G is coming next week and this latest leak leaves no stone unturned.

Granted, the switch from the Snapdragon 765G in the OnePlus Nord to the Snapdragon 750G rumored for the OnePlus Nord CE 5G isn’t as big a step down as it may sound. Dropping from two front-facing cameras to a single 16MP one may disappoint some, but the difference is probably negligible. There is also one less camera on its back, but the main camera does get upgraded to 64MP.

In other aspects, the OnePlus Nord CE 5G does have some upgrades, but a lot is also the same, at least according to MySmartPrice’s report. The battery is larger at 4,500mAh, but the memory and storage options are the same as last year’s model, meaning 6GB or 8GB of RAM and 64GB or 128GB of storage. There’s also an in-display fingerprint scanner, which was present on last year’s model as well.

All in all, the OnePlus Nord CE 5G seems to live up to its name as a “Core Edition,” meaning a distilled OnePlus Nord experience. Whether it will live up to expectations as a successor to the first OnePlus Nord remains to be seen when the company unveils it on June 10.

According to the leak, the OnePlus Nord CE 5G with 8GB RAM and 128GB storage will cost around 25,000 INR, roughly $340. Of course, OnePlus already announced that it will launch first in India and European markets and North American customers will have to wait for a much more “distilled” OnePlus Nord N200 5G later this year.



Galaxy Tab A7 Lite specs all revealed in newest leak

A lot of attention has been given to the Galaxy Tab S7 Lite (or Galaxy Tab S7 FE, depending on which rumor you subscribe to), probably because of how close it is to the premium Galaxy Tab S7. Not everyone will be interested in such a big, potentially expensive tablet, though, and some might appreciate that there is also a Galaxy Tab A7 Lite coming with a smaller size, lower specs, and, hopefully, a more accessible price tag as well.

The Galaxy Tab A7 Lite definitely falls in line with the Galaxy Tab A series, most of which have smaller screens. In this case, that means an 8.7-inch LCD with a resolution of 1340 x 800. That’s good enough for 720p content, and its 2MP front camera seems made to match.

Inside will be an octa-core MediaTek MT8768x, paired with 3GB of RAM and 32GB or 64GB of thankfully expandable storage. The 5,100mAh battery can be charged via USB-C with quick-charge support. That sounds pretty mediocre given the tablet’s size, but, again, the device is meant for budget-conscious consumers in the first place.

As for that wallet-friendly price tag, WinFuture says it will be around 150 EUR, roughly $180, for the base Wi-Fi-only model. There will also be an LTE model though the price will reportedly be only a bit higher. The leak, unfortunately, doesn’t have anything on potential launch dates.

That said, there are some rumors that Samsung has a June event just for its tablets. Yes, plural. That will include the aforementioned Galaxy Tab S7 FE and possibly a Galaxy Tab S7 XL Lite. Which ones will actually end up being announced is anyone’s guess at this point.
