Category: Computing

I make my GPU perform worse on purpose, and I’m not sorry

I have a confession to make: I have one of the most powerful GPUs you can buy, the AMD RX 6950 XT, and I deliberately make it underperform. Let me explain.

I appreciate that, coming off the back of a GPU pricing crisis that saw almost everyone unable to find a card like this (let alone afford one), that sounds super wrong. I’m lucky that one of the perks of this job is getting to test the kind of high-end components I wouldn’t otherwise shell out the cash for. But even then, I can’t bring myself to unleash the full power of this awesome GPU.

It’s sad to say, but in my day-to-day living with this graphics card, I found myself valuing a quieter, cooler system more than the extra performance GPUs of this kind can provide.

Taming the beast


The 6950 XT is a power-hungry card. It’s the same GPU as the 6900 XT but pushed to its limit, so of course, it runs hot. When put in an mATX case it runs loud, too, even with a big cooler. It’s not necessarily the kind of GPU I would have bought myself, but now that I have one, what am I going to do, not use it?

But that was a very real prospect after a few days of retreating to headphones once the testing was done. How could I continue to enjoy high frame rates and detail settings in games without having to listen to the fans hitting 2,200 RPM as soon as a game menu appears? The solution, it turns out, was to make the card run worse.

The PowerColor Red Devil 6950 XT has a BIOS switch, ostensibly for backup purposes in case you brick one BIOS trying a heavy overclock. But the secondary Silent BIOS also lowers clock speeds and undervolts the GPU. Enabling it got me halfway there: the fans were only hitting around 1,700 RPM, and junction temperatures dropped by around five degrees, resting under 100 degrees Celsius at load for the first time since I’d gotten the card up and running.

This was an exciting development. Maybe there was a way to have my PCB cake and eat it too. But it still wasn’t quite quiet enough for my sensibilities and temperature thresholds.

[Image: AMD Radeon Tuning Control interface]

Next, I played with the settings in AMD’s Adrenalin driver application and made further headway. The automated undervolting had a minor effect of its own, but a few RPM off the fans and a single-degree drop in temperatures weren’t going to cut it. I could have adjusted the fan curves manually, but ultimately, setting the tuning profile to Quiet was enough.

Clock speeds dropped a little more, and fan speeds followed suit. Suddenly I had a card that wasn’t going over 265W even in a 4K FurMark run, with a maximum junction temperature of just 80 degrees. Just as important? Fan speeds never went over 1,500 RPM, keeping the card cool and quiet enough that it only just registered over the whir of the system fans.
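
If you want to keep an eye on the same numbers I’ve been quoting, every figure above (junction temperature, fan speed, board power) comes straight from the driver’s sensors. As a minimal sketch of how you might log them yourself: on Linux, the amdgpu driver exposes these sensors as plain sysfs files. The hwmon index and channel mapping vary by card and kernel, so treat the exact paths below as assumptions to verify on your own system.

```python
import glob

def read_int(path):
    """Read a single integer sensor value, or None if the file is absent."""
    try:
        with open(path) as f:
            return int(f.read().strip())
    except (OSError, ValueError):
        return None

# amdgpu publishes GPU sensors under the card's hwmon directory; the
# hwmon index differs per system, so discover it with a glob.
for hwmon in glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*"):
    junction = read_int(f"{hwmon}/temp2_input")  # junction temp, millidegrees C
    fan = read_int(f"{hwmon}/fan1_input")        # fan speed, RPM
    power = read_int(f"{hwmon}/power1_average")  # board power, microwatts

    if junction is not None:
        print(f"junction: {junction / 1000:.0f} C")
    if fan is not None:
        print(f"fan: {fan} RPM")
    if power is not None:
        print(f"power: {power / 1e6:.0f} W")
```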

No, performance isn’t the same. The core clock now barely breaks 1,900MHz, and my Time Spy score isn’t quite as good, but I don’t care. I have a near-silent 6950 XT that still performs better than almost any other GPU out there, and it didn’t require a custom cooler or heavy tweaking. Now, I can game in relative peace, and it takes far, far longer for my little home office to heat up. I’m living the dream.

It turns out I’m not alone

I know this post reads like the whiniest of first-world problems — believe me, I’ve read it through multiple times before posting. I’m also aware that there are other ways around this problem that don’t involve tamping down the card’s performance, such as better system cooling or gaming with the air conditioning on.

I would have been too ashamed to write about it myself if I hadn’t found out that my fellow DT writer and high-end GPU owner, Jacob Roach, is also a sacrilegious downclocker. His daily driver gaming PC has an RTX 3090 inside, and it’s stupendously powerful (although my card’s better). But according to him, it’s often a bit much and, frankly, more than he needs.

“I’ve been limiting the frame rate with my 3090 for a while,” he said when I mentioned how I felt bad for making my RX 6950 XT perform worse than it can. “I just can’t handle the noise and heat, even if the card is capable of more.”
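
Driver control panels and in-game limiters handle this for you, but the mechanism behind a frame rate cap is simple enough to sketch. The hypothetical render loop below shows the idea: measure how long the frame took, then sleep off the rest of the frame budget so the GPU idles instead of immediately racing into the next frame.

```python
import time

TARGET_FPS = 90                  # cap below the GPU's maximum to cut heat and noise
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds available per frame

def render_frame():
    # Stand-in for the real rendering work a game would do.
    time.sleep(0.004)

for _ in range(300):  # a few seconds of "gameplay" for demonstration
    start = time.perf_counter()
    render_frame()
    # Sleep away the leftover frame time so the GPU rests between frames
    # instead of running flat out at an uncapped rate.
    remaining = FRAME_BUDGET - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)
```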

This is something that both of us have had to deal with as our respective countries grapple with heat waves. There might be a case for gaming with one of the best graphics cards pulling over 300W if you live somewhere temperate or it’s not the middle of summer, but when temperatures rise outside and you want to hide away indoors playing games, switching on a miniature space heater to do it doesn’t feel very comfortable.

It also doesn’t sound comfortable, because just as gaming with a hot PC next to your legs or on your desk can make you hot and bothered, it can get really loud, too. My mATX case doesn’t give this 6950 XT enough room for its triple fans to cool it effectively, so if I run the card full tilt, I run into thermal throttling after extended use. The longer I play, the worse it gets, for both the card and me.

The future looks warm

[Image: A hand grabbing a graphics card. Credit: Jacob Roach / Digital Trends]

While Jacob and I might not use our GPUs to their full potential, there’s no denying that the RTX 3090 and the RX 6950 XT are incredibly power-hungry and hot GPUs, and they’re not alone. The entire lineup of modern graphics cards from both camps has had a bump in TDP this generation, and if the rumors are to be believed, the next generation will only exacerbate this problem.

And it is a problem. Jacob and I are prime examples that if you don’t have a larger space with plenty of ventilation or capable A/C running all the time, playing games with some of these top graphics cards is decidedly uncomfortable, from both a noise and a heat perspective. I’m not all that excited about a graphics card that’s even hotter and potentially louder, even if it is much more performant.

It’s not that I wouldn’t keep it if I were given one. But don’t be surprised if you find me downclocking it to oblivion to make my gaming sessions cooler and quieter.


Category: Security

U.S. court system cyberattack was worse than we thought

A cyberattack that hit the U.S. federal court system’s infrastructure has proven to be an “incredibly significant and sophisticated” attack.

That assessment is a stark departure from the one initially provided when the breach occurred in 2020.


As reported by TechRadar, the attack itself was confirmed in January 2021 via a judiciary committee hearing, with chairman Jerrold Nadler stating that a data breach had indeed been successfully carried out by threat actors.

Upon further investigation, it seems the incident was considerably more impactful than the government initially realized.

Nadler stressed that the committee only started to uncover the “startling breadth and scope of the court’s Document Management System security failure” in March 2022.

“And perhaps even more concerning is the disturbing impact the security breach had on pending civil and criminal litigation, as well as ongoing national security or intelligence matters,” he continued.

He also stated that the hack has resulted in “lingering impacts on the department and other agencies.”

An official from the Justice Department was questioned about which investigations, types of cases, and attorneys were most affected by the breach, but the individual could not provide an adequate answer. “This is, of course, a significant concern for us given the nature of information often held by the courts,” he added.

[Image: A digital depiction of a laptop being hacked by a hacker. Credit: Digital Trends Graphic]

Another government figure, Representative Sheila Jackson Lee, asserted that the discovery of the attack’s actual impact amounts to a “dangerous set of circumstances.” Lee said the Justice Department should share more information on the matter, such as the number of cases that have been affected in any capacity, as well as how many of those cases were outright dismissed.

TechRadar highlights that this cybersecurity incident is reportedly not related to the SolarWinds attack, even though both materialized around the same time in 2020.

For reference, the SolarWinds attack has gone down in history as one of the most impactful supply chain cyberattacks ever. The group and individuals behind the incident managed to extract Microsoft 365 login credentials from SolarWinds employees via phishing methods, as detailed by TechRadar.

A compromised update was then deployed by the threat actors to hundreds of thousands of endpoints, with government agencies and several technology giants bearing the brunt of the impact.

In related governmental cybersecurity news, a bug bounty program revealed how one of the largest departments of the U.S. government — Homeland Security — discovered over 100 security vulnerabilities within external DHS systems.


Category: Computing

The M2 MacBook Pro’s performance is worse than expected

Additional benchmarks have shown the entry-level model of Apple’s MacBook Pro with an M2 chip is performing far worse than anyone expected. This comes after initial tests revealed that the device had a slower SSD when compared to last year’s MacBook Pro with an M1 chip.

Spotted by MacRumors, the M2 MacBook Pro reportedly lags behind in day-to-day multitasking performance in apps like Photoshop, Lightroom, and Final Cut Pro. Even file transfers to an external SSD suffer on Apple’s latest flagship laptop. This is all because the M2 MacBook Pro appears to be using space on the 256GB SSD as virtual memory once the built-in 8GB of unified memory is used up by the system and other apps.

Just like the issue with SSD speeds, this is believed to be due to the fact that Apple is only using a single NAND chip on the 2022 MacBook Pro 13-inch M2 models. That’s compared to the M1 MacBook Pro, which has two NAND chips for faster speeds.
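
You can watch this happen on your own machine: macOS reports swap usage through its sysctl interface, so if the “used” figure climbs while you multitask on an 8GB model, the system is paging out to that single-NAND SSD. A minimal check (macOS only), wrapped in Python:

```python
import subprocess

# Ask macOS for its virtual memory swap statistics.
result = subprocess.run(
    ["sysctl", "vm.swapusage"],
    capture_output=True, text=True, check=True,
)
# Typical output:
#   vm.swapusage: total = 2048.00M  used = 512.00M  free = 1536.00M  (encrypted)
print(result.stdout.strip())
```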

Apple

A lot of the tests in question were done by YouTuber Max Tech. In his 12-minute video, he shows that when his tests are run on their own, without background activity, the M2 MacBook Pro beats the M1 MacBook Pro. It’s only when multitasking and background activity come into play on both machines that things go bad for Apple’s latest 13-inch flagship laptop.

For basic multitasking in Google Chrome, the M2 MacBook Pro loads several tabs and pages like Google Drive slower than the M1 MacBook Pro. Having that open on top of exporting 50 RAW images in Adobe Lightroom Classic, meanwhile, takes longer on the M2 MacBook Pro at a time of 4 minutes and 12 seconds versus just 3 minutes and 36 seconds on the M1 MacBook Pro.

In other tests done by Max Tech, the M2 MacBook Pro falls even further behind the M1 MacBook Pro with so-called “pro app” background activity running in Final Cut Pro. A 5-minute 4K HEVC export on the M2 MacBook Pro took a total of 4 minutes and 49 seconds. The M1 MacBook Pro completed the same test in 3 minutes and 36 seconds with similar background activity.

Even SSD file transfers appear to suffer on the M2 MacBook Pro. In his video transfer tests, Max Tech finds that the M1 MacBook Pro writes a 35GB video file to an external SSD in 34 seconds, while the M2 MacBook Pro takes 1 minute and 25 seconds. As for read speeds, the results are closer, with the M2 MacBook Pro finishing in 58 seconds and the M1 MacBook Pro in 45 seconds.
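
Converting those times into rough throughput makes the gap clearer. Assuming the 35GB figure is decimal gigabytes (a back-of-the-envelope sketch, not a benchmark), the reported results work out as follows:

```python
# Rough throughput from the reported transfer times for a 35GB file.
FILE_GB = 35  # assuming decimal gigabytes

tests = {
    "M1 MacBook Pro write": 34,  # seconds
    "M2 MacBook Pro write": 85,  # 1 minute and 25 seconds
    "M1 MacBook Pro read": 45,
    "M2 MacBook Pro read": 58,
}

for label, seconds in tests.items():
    print(f"{label}: ~{FILE_GB / seconds * 1000:.0f} MB/s")
# Writes: roughly 1029 MB/s (M1) versus 412 MB/s (M2) -- about 2.5x slower.
```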

With all this in mind, if you’re considering buying a new MacBook Pro model with an M2 chip, you should definitely pay for the $200 upgrade and buy the higher-end model with 512GB of storage. Or, hold off and buy an older M1 model.


Category: Computing

Intel Arc Alchemist desktop GPUs may be worse than we thought

Today marks yet another round of bad news for Intel Arc Alchemist, this time pertaining to the Arc A380, the first discrete desktop GPU Intel has released. Upon announcing the card, Intel compared it to the budget AMD Radeon RX 6400, promising that the A380 would provide up to a 25% performance uplift over the RX 6400.

Intel’s claims have been closely examined, and unfortunately, the A380 fails to meet those expectations. While the Intel Arc GPU is faster than the AMD RX 6400, it only wins by 4%. The other cards from the lineup have also been given another look.


Intel has just recently released its first Arc Alchemist desktop GPU, the A380. For the time being, the card is only available in China, and is only being shipped in pre-built desktop PCs. However, Intel has promised to soon move on to the next stage, which is to release it on the DIY market in China, and then finally, globally.

As part of the release announcement, Intel shared a performance slide for the GPU, showing the average frames per second (fps) when gaming at 1080p on medium settings. With that, Intel promised that the A380 should be up to 25% faster than the AMD Radeon RX 6400 — but the slide didn’t contain any matching figures to back up that statement. This prompted 3DCenter to verify that information, and unfortunately, it’s bad news all around for Intel Arc.

It seems that the general public may have missed an important factor in Intel’s claims: the promised up to 25% increase only applies to a performance-per-price comparison. In short, since the RX 6400 is slightly more expensive than the A380, the actual performance gain is much smaller than expected.

3DCenter compared the data available for Intel Arc A380 and for the RX 6400. The Intel GPU is priced at 1,030 yuan (around $153) while the AMD graphics card costs 1,199 yuan ($178). According to 3DCenter, Intel’s claims mostly check out when it comes to performance per yuan — the Arc A380 wins by around 21%, making it more cost-effective. However, the raw performance gains are significantly smaller, amounting to around 4%.
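
The arithmetic behind that distinction is worth spelling out, because it shows how a roughly 4% raw lead turns into a 21% headline figure. Here’s a quick reconstruction using the prices above and 3DCenter’s raw performance estimate as inputs:

```python
# Reconstructing the perf-per-price math from the reported figures.
A380_PRICE = 1030    # yuan (around $153)
RX6400_PRICE = 1199  # yuan (around $178)

RAW_PERF_RATIO = 1.04  # A380 roughly 4% faster than the RX 6400, per 3DCenter

# Relative performance per yuan: a small raw lead gets amplified
# because the A380 is also the cheaper card.
perf_per_yuan = RAW_PERF_RATIO * (RX6400_PRICE / A380_PRICE)
print(f"A380 perf-per-yuan advantage: {(perf_per_yuan - 1) * 100:.0f}%")
# Prints about 21%, matching 3DCenter's cost-effectiveness figure.
```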

[Image: Intel Arc lineup, expectations versus possible reality. Credit: 3DCenter]

As a result of those findings, 3DCenter went on to take a closer look at some of the other claims made about the performance of the entire Intel Arc lineup. These comparisons, much like Intel’s claims, are difficult to verify, but it seems wise to keep your expectations muted where Intel Arc desktop GPUs are concerned.

The flagship Intel Arc A780, with the full 32 Xe-cores across a 256-bit bus and 16GB of GDDR6 memory, was often compared to the Nvidia GeForce RTX 3070, and sometimes, even the RTX 3070 Ti. However, 3DCenter now says that the GPU will be “slightly worse than RTX 3060 Ti.” The other GPUs in the range are also knocked down a notch with these updated predictions, with the most entry-level A310 now being called “significantly slower than Radeon RX 6400.”

It’s hard to deny that things are looking a little bleak for Intel’s first discrete gaming GPU launch. After numerous delays, a staggered launch, and most importantly, questionable levels of performance, it might be difficult for Intel Arc to find its footing in a GPU market dominated by Nvidia and AMD. Still, it’s early days, and further driver optimizations might yet land Intel Arc among the best GPUs — especially if the company keeps the price competitive.


Category: Game

The Nintendo Switch OLED shortage just got even worse

If you’ve been having a difficult time finding the recently released Nintendo Switch OLED, you definitely aren’t alone. It’s a tale we’re sure most are tired of hearing: global supply disruptions have made new hardware difficult to find, and the Switch OLED is not immune. In fact, a new report details the cuts Nintendo has had to make to its Switch sales forecast for the current fiscal year, and those cuts are significant.

Nintendo Switch production targets cut as OLED model launches

According to Nikkei Asia, Nintendo has been forced to cut its production targets by 20% for the 2021 fiscal year, which ends in March 2022. Nintendo originally targeted 30 million unit sales for the fiscal year as a whole – an ambitious target for a platform coming up on its fifth birthday, but an attainable one given the Switch’s popularity. Now, Nintendo reportedly expects to produce only 24 million, a target that comes after earlier revisions to that initial goal.

Like every other company that makes consumer electronics, Nintendo is feeling the squeeze of the global semiconductor shortage prompted by the COVID-19 pandemic. Switch sales actually spiked at the beginning of the pandemic, as here in the US, states began implementing lockdowns right around the time that Animal Crossing: New Horizons launched for the platform.

The sudden interest in the Switch and gaming in general made Animal Crossing: New Horizons one of the best-selling games for the platform seemingly overnight, and it led to Switch shortages in the early days of the pandemic. Demand for the Switch remains strong, which is contributing to current stock shortages as well.

In a brief statement to Nikkei Asia, a Nintendo spokesperson suggested that the impact of these component shortages is still being determined, saying simply, “We are assessing their impact on our production.” Still, even if the decrease in production targets isn’t as bad as Nikkei’s report suggests it will be, component shortages are still a reality of the world we currently live in.

Nintendo is in good company

Of course, Nintendo isn’t alone in its struggle to secure components for new gaming hardware. However, it has good company in its current position, as Microsoft and Sony are grappling with the same issues and have been for quite some time. With the component shortage, PlayStation 5 and Xbox Series X consoles are still hard to find a year out from release.

While there are signs that the component shortage may be easing a little bit – for instance, the Xbox Series S is readily available in many places – manufacturers have indicated that we could be in this for the long haul as well. At best, it seems that the component shortage is expected to start getting better at some point in 2022, though we may not see noticeable improvement until we’re into 2023.


Category: Tech News

Samsung Galaxy Note 10+ hands-on: A new model changes the game, for better or worse

When I picked up the Samsung Galaxy Note 10, it felt like I was holding the most powerful and beautiful phone on the planet. A 6.3-inch display, near-bezel-less design, triple-camera array, and supercharged S Pen are all crammed into an impossibly compact design. And then I picked up the Note 10+.

For the first time in the Note’s storied history, there are two sizes to choose from, and the difference between the two models couldn’t be clearer. While Samsung has set up the Galaxy S10 and S10+ smartphones to be mostly larger and smaller versions of the same device, the Note 10+ is a clear upgrade from the Note 10—so much so that many fans likely won’t even consider the smaller, lower-end model.

First, let’s review the specs, which really don’t tell the whole story…

Galaxy Note 10

  • Dimensions: 71.8 x 151 x 7.9mm
  • Display: 6.3-inch AMOLED FHD 2280 x 1080
  • Processor: Snapdragon 855
  • RAM: 8GB
  • Storage: 256GB
  • Camera: 12MP telephoto, f/2.1, OIS + 12MP wide-angle, dual f/1.5-f/2.4, OIS + 16MP ultrawide (123 degrees), f/2.1
  • Battery: 3,500mAh

Galaxy Note 10+

  • Dimensions: 77.2 x 162.3 x 7.9mm
  • Display: 6.8-inch AMOLED QHD 3040 x 1440
  • Processor: Snapdragon 855
  • RAM: 12GB
  • Storage: 256GB/512GB
  • Camera: 12MP telephoto, f/2.1, OIS + 12MP wide-angle, dual f/1.5-f/2.4, OIS + 16MP ultrawide (123 degrees), f/2.1 + DepthVision
  • Battery: 4,300mAh

While the 6.8-inch display might make the Note 10+ seem like a monster on paper, it’s not nearly so big in person. In fact, its frame is roughly the same size as the 6.4-inch Note 9’s (76.4 x 161.9 x 8.8mm). That’s because Samsung has seriously trimmed down the bezels on the Note 10 series, so much so that the non-plus version feels downright puny. It’s not just smaller than the Note 9 and the Note 8 either. Millimeter for millimeter, it’s the most compact Note since the Note 2, and its display is actually a tenth of an inch smaller than the Note 9’s.

[Image: The 6.3-inch Note 10 (left) is positively tiny compared to the 6.4-inch Note 9. Credit: Michael Simon/IDG]

That makes the Note 10 feel less like the latest in the lineage of premium phablets and more like a Galaxy S phone that happens to have a stylus. That’s not a criticism, nor does the Note 10 feel cheap or even inferior. It just doesn’t feel like a Note. In fact, Samsung made its motivations clear during my briefing: This is the Note for people who’ve always wanted a Note, but have been put off by its size. It’s truly impressive that Samsung was able to pack such a large-screened and high-performing device into such a small package, but I don’t think long-time Note fans will appreciate the dip in screen size, even if it is just a tenth of an inch.

Other tweaks—like the relatively low-res Full HD screen and the lack of a microSD card slot, both nonstarters for Note die-hards—only drive home that notion further. For the first time, Samsung has made a Note that Note fans probably won’t want.

The Note 10+, on the other hand, is every inch a phablet. Its screen is the biggest I’ve used on a Samsung phone (or any phone, for that matter), and it includes the high-end features missing from the smaller Note 10, mainly a Quad HD 1440p display. The difference between the two displays is obvious at even a first glance. Even after a short time with it, I have no qualms about declaring the Note 10+’s display the best to ever grace a smartphone.

[Image: The centered selfie cam hole is much less distracting on the Note 10. Credit: Michael Simon/IDG]

Like on the S10, the Note 10+’s dynamic AMOLED display is a stunner. The Note 10’s design complements it even more, with barely any bezels and a more symmetrical camera cutout. It’s the closest I’ve gotten to holding a floating pane of touchscreen glass in my hand. That said, the Note 10 is still very much a Note, down to its trademark tight corners, flat edges, and shell colors, which of course includes the usual white, black, and blue. There’s also a new “Aura Glow” iridescent option that changes color based on how the light strikes it.


Category: Tech News

NVIDIA RTX 30-Series supplies might actually be worse this quarter

Last year’s final months were filled with drama in the tech and gaming industries, including companies hyping products to the point that demand far outstripped their ability to produce adequate supply. Of course, market dynamics are not exactly simple or clear-cut, and NVIDIA promised that the shortage of its latest-gen RTX 30-Series GPUs would be resolved by the start of 2021. What it didn’t say was that things might actually get worse before they get better, especially in the first quarter of the year.

Some consider high demand a good sign for a product, but when the gap between demand and supply becomes too extreme, the scales tip the other way. The NVIDIA RTX 30-Series should have been the company’s biggest show of strength, but the cards have almost become proof of its inability to produce enough volume to satisfy the growing number of orders and disgruntled buyers.

NVIDIA promised last year that things would start to look better in 2021, but European hardware retailer Alternate reveals that it won’t be happening this quarter. In fact, the shortage might actually take a turn for the worse, though, thankfully, it won’t stay that way for long.

To be fair, many of the factors behind this worsening situation are beyond NVIDIA’s control. The Chinese New Year, during which companies in the country often close for a week or two, means that almost no new graphics cards will be produced in that period. And while companies usually take the annual event into account, the COVID-19 pandemic, along with a general shortage of raw materials further up the supply chain, has made the usual preparations ineffective.

Unfortunately, demand for NVIDIA’s newest graphics cards, along with other silicon-based products, isn’t going down while production and supply wait to reboot after February. Many orders remain open, especially for the more popular RTX 3080 and RTX 3060 Ti, and demand from GPU use cases like cryptocurrency mining only grows as time passes.


Category: Security

SolarWinds hack may be much worse than originally feared

The Russia-linked SolarWinds hack, which targeted US government agencies and private corporations, may be even worse than officials first realized, with some 250 federal agencies and businesses now believed to be affected, the New York Times reported.

Microsoft has said the hackers compromised SolarWinds’ Orion monitoring and management software, allowing them to “impersonate any of the organization’s existing users and accounts, including highly privileged accounts.” The Times reports that Russia exploited layers of the supply chain to access the agencies’ systems.

The Times reports that early warning sensors that Cyber Command and the NSA placed inside foreign networks to detect potential attacks appear to have failed in this instance. In addition, it seems likely that the US government’s attention on protecting the November elections from foreign hackers may have taken resources and focus away from the software supply chain, according to the Times. And conducting the attack from within the US apparently allowed the hackers to evade detection by the Department of Homeland Security.

Microsoft said earlier this week it had discovered its systems were infiltrated “beyond just the presence of malicious SolarWinds code.” The hackers were able to “view source code in a number of source code repositories,” but the hacked account granting the access didn’t have permission to modify any code or systems. However, in a small bit of good news, Microsoft said it found “no evidence of access to production services or customer data,” and “no indications that our systems were used to attack others.”

Sen. Mark Warner (D-Virginia), ranking member on the Senate Intelligence Committee, told the Times the hack looked “much, much worse” than he first feared. “The size of it keeps expanding,” he said. “It’s clear the United States government missed it.”
