Corsair K100 Air Wireless review: too expensive to recommend

Corsair K100 Air Wireless

MSRP $280.00

“The Corsair K100 Air Wireless boasts superb performance, but is far too expensive for its feature set.”

Pros

  • Excellent typing experience
  • Sleek design
  • iCUE customization
  • Connects to multiple devices including game consoles

Cons

  • Expensive
  • Tactile switches are the only option
  • No hot-swappable switches

Smaller 60% or tenkeyless gaming keyboards have become popular in recent years, but there will always be room for a full-sized gaming keyboard that gives you every button or key you could ever need.

The Corsair K100 Air Wireless is an excellent mechanical keyboard in that vein, boasting superb performance and a slick design. At $280, though, it asks too high a price, especially when it’s missing features we’ve come to expect from keyboards in this range.

Design

From a design perspective, the Corsair K100 Air Wireless is an attractive, low-profile keyboard. It’s well-made and rigid, with no discernible flex despite how thin it is, and it inspires confidence that it’ll last for thousands of gaming sessions and workdays. Also, despite being a full-sized keyboard, it doesn’t take up more space than it needs to.

The K100 Air has all the usual keys you’d expect. The legends are easy to read and arranged logically. Along the top is an array of buttons for extra functions: a brightness toggle, a Windows lock key, media controls, and a very handy volume roller.

Corsair K100 Air gaming keyboard top view.

The lack of Mac-specific icons was an annoyance for me, but 99% of people buying this gaming keyboard won’t have that problem.

The K100 Air comes bundled with a nice braided USB-C to USB-A cable. Unfortunately, there isn’t a USB-C to USB-C option, so if you have a device with all USB-C connections (like a Mac or non-gaming laptop), you’ll have to buy an adapter if you prefer a wired connection.

I’ll talk about the iCUE software later in this review, but it lets you configure a wide range of lighting effects and colors to suit your style. The lighting itself is well-balanced for most keys, though the keys in the Home, End, and Page Up/Down cluster aren’t fully lit, with the edges of their legends looking a little dim.

Switches and typing experience

Corsair K100 Air gaming keyboard media keys.

I’ve personally owned the Logitech G915 Lightspeed gaming keyboard (the full-sized version), so I’m used to the feeling of low-profile keyboards. In fact, I tend to prefer those types of keyboards because I lean toward having shorter key travel.

Corsair is using Cherry’s ultra-low-profile tactile key switches, which feel similar to MX Brown switches. Those who are used to a tactile switch will feel right at home, but there’s no linear or clicky option, which will disappoint anyone who prefers those switch types.

Let me start with what I liked about these low-profile switches. I tend to prefer linear switches, but the typing experience on the K100 Air was great. I was able to adjust quite well and felt no long-term fatigue or strain while typing. The key actuation (0.8mm) and travel (1.8mm) were satisfactory for my purposes.

Those who will be using this keyboard primarily for productivity tasks such as writing or coding will have no problems using these switches.

The Backspace, Enter, and Space bar keys are noticeably louder than the letter keys.

My primary gripe with these key switches is the sound. More accurately, it’s the inconsistent loudness across the keys. Most of them deliver what you’d expect from tactile switches. They’re not as loud as traditional clicky (or MX Blue) switches. However, the Backspace, Enter, and Space bar keys are noticeably louder than the letter keys.

For a writer (and programmer) like me, those three keys are essential to everyday typing. Your tolerance may vary, but I wish they were as (relatively) quiet as the other keys. Perhaps the extra loudness is by design, so you can hear when you’ve hit them, but it was a persistent distraction as I typed, so I have to point it out.

One big negative for many people will be the lack of hot-swappable switches. Unlike other gaming keyboards such as the ROG Strix Flare II Animate, you won’t be able to swap out the switches for ones you prefer. In fact, Corsair told me directly that, because of the construction of the switches, it’s very risky to remove the keycaps without damaging them.

Performance and connectivity

Corsair is very proud of its 8,000Hz polling rate, similar to its K70 Pro Mini Wireless keyboard. You can reach that higher polling rate either in wired mode or with the Slipstream wireless dongle (the keyboard defaults to 2,000Hz in wireless mode). Plugged into a game console (more on that later), it tops out at 1,000Hz.
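
For context on why that difference is so hard to feel, here’s some quick back-of-the-envelope math (my own, not Corsair’s) showing how little time separates input reports at each polling rate:

```python
# Rough illustration: the interval between keyboard reports at each polling rate.
for hz in (8000, 2000, 1000, 125):  # 125Hz is a typical budget keyboard, for comparison
    print(f"{hz:>5} Hz -> one report every {1000 / hz:.3f} ms")

# Going from 1,000Hz to 8,000Hz saves at most ~0.875 ms per keypress,
# well under the ~4.2 ms frame time of even a 240Hz monitor.
```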

I want to be able to say that I felt a difference between wired and wireless modes, but I didn’t. I tried it out with some Counter-Strike: Global Offensive, a game that can definitely benefit from responsive, lag-free input. I may not be the greatest shot in the world, but the keyboard didn’t feel any different in wired, Slipstream, or Bluetooth mode.

Corsair K100 Air gaming keyboard with PS5 and Slipstream dongle.

Most of the gaming I did was on my PS5. The K100 Air has a “PlayStation Mode” that allows it to connect to a PS4 or PS5; no such mode was needed for an Xbox console. I played primarily on a wired connection, but you can also use the Slipstream wireless dongle. I didn’t feel any difference between wired and wireless mode playing first-person shooters like Call of Duty: Modern Warfare and Destiny 2 (connecting over Bluetooth wasn’t available for PS5, unfortunately).

Performance differences aside, my favorite thing about this keyboard is how effortlessly it pairs with multiple devices at the same time. I plugged the Slipstream adapter into my Mac while keeping the wired connection to my PS5. The keyboard automatically switched to the PS5 when it was on and back to my MacBook Pro when the PS5 was off. You can’t use both simultaneously, obviously.

You’re able to save up to three Bluetooth profiles and switch between them using the Fn + Bluetooth button that corresponds to that device. That’s very helpful for people who have multiple devices and only want to use one keyboard. Technically, you can have five total devices connected (one wired, one Slipstream, and three Bluetooth).

Software

iCUE software for K100 Air Wireless color picker.

Corsair’s iCUE software offers a wealth of customization features. I’m used to Logitech’s G Hub, but I appreciate all of the different options that iCUE offers.

I echo the sentiments in the Corsair K70 Pro Mini Wireless review: iCUE lets you rebind basically any key and create crazy RGB combinations if you want your keyboard to look like fireworks. There are plenty of options for making the keyboard function exactly how you want.

I will admit that I didn’t take full advantage of the ability to remap any key, as most of the time I use in-game settings to rebind game actions. I do appreciate having quick actions in my Mac’s menu bar, though. I’m able to change profiles and various settings like the polling rate without having to open the iCUE software.

iCUE notice to use Elgato Stream Deck software.

There are four macro keys you can use, particularly for streaming purposes. You’ll have to download the Elgato Stream Deck software to actually use those macro keys, however. If you’re interested in streaming, it may be a decent way to set a few shortcuts, such as switching cameras or scenes in OBS Studio if you’re not already invested in Elgato’s Stream Deck hardware.

Finally, the keyboard has 8MB of onboard memory to store up to 50 profiles. That way, you don’t even have to use iCUE to switch between profiles.

Should you buy it?

This is where it gets a little tough to recommend. At $280, it’s definitely on the higher end of premium gaming keyboards, and there are other excellent full-sized mechanical keyboards, such as the Logitech G915 Lightspeed, which goes for around $230. That’s still expensive, but it’s a better deal considering what you’re getting.

The K100 Air’s super high polling rate is impressive on paper but doesn’t really make much of a difference in real-world use. The actual typing experience on the K100 Air is superb but could be a little too loud for some people who are sensitive to that.

The biggest fault, however, is the lack of options for key switches. I would have loved a K100 Air with linear switches, as I prefer the overall smoothness. Limiting the K100 Air to just tactile switches is a disappointing decision when other manufacturers offer choices.

If you’re invested in Corsair’s other products and want a high-end mechanical keyboard with ultra-low-profile tactile switches that offer excellent typing and gaming performance, then I would recommend it if it’s on sale. Otherwise, you’re better served with cheaper alternatives.

Age of Empires IV Performance, Tested: Minimum to Recommended Specs

Age of Empires IV is here, and it has some interesting tech under the hood. As a real-time strategy (RTS) game, it doesn’t call for a super-high framerate. Still, it’s more demanding than most RTS titles, so you’ll need to find the best settings to optimize your performance.

The game is good at automatically configuring the settings in the best way for your system, and it includes a minimum spec renderer to help low-end machines and integrated GPUs maintain a playable framerate. I took the game out for a spin on a few machines to test that minimum spec renderer and see what kind of performance you can expect from different rigs.

Age of Empires IV system requirements

Age of Empires IV is a highly scalable game, partly because it includes a dynamic minimum spec mode that will kick in when your hardware isn’t up to snuff. The developers have four different tiers for the hardware requirements, scaling up to recent CPUs and GPUs and down to integrated graphics from a few years back.

As you can see in the chart above, you don’t need a lot to run Age of Empires IV, but the visual quality will change a lot based on your hardware. If you just meet the minimum specs, you won’t get the full experience. Based on your hardware, the game will automatically use a lower-spec renderer, reducing visual fidelity across the board.

You can trigger this by turning the Image Quality setting to Low. Doing so locks off all of the other settings except Shadow Quality. You can choose to turn off shadows entirely if you want, but the rest are automatically set to Low in this configuration.

I tested the game on a few different configurations, and each time, the game automatically recognized the right settings to ensure my framerate was near or above 60 frames per second (fps). I have a few optimization tips in the desktop section below, but overall, you shouldn’t need to poke your head in the settings menu at all.

XPS 13 — Integrated GPU

XPS 13 running Age of Empires IV.

I started testing with my 2020 Dell XPS 13, which includes an Intel Core i7-1185G7 and 16GB of RAM. This machine includes Intel’s Iris Xe integrated graphics, which are actually pretty good at running some less demanding games. If you were hoping to crank everything up to max in Age of Empires IV, though, you’ll be disappointed.

At native 1920 x 1080, I managed just 17 fps. Even that doesn’t capture how bad the experience was. You don’t need a high framerate for Age of Empires IV, but with minimum framerates down in the single digits, this type of performance won’t cut it.

As mentioned, Age of Empires IV automatically adjusts your video settings when you first load up the game. For my XPS 13, it set the resolution to 720p and lowered the render resolution to 66%. Needless to say, it looked horrible the first time I loaded it up. I went back to the recommended graphics settings (below) but kept the resolution at 1080p with full render resolution.

  • Image Quality: Low
  • Animation Quality: Low
  • Shadow Quality: Low
  • Texture Detail: Low
  • Geometry Detail: Low
  • Physics: Low
  • Anti-aliasing: Off

With just the settings turned down, the machine managed a playable 58 fps average. The game certainly didn’t look great — every setting was turned to Low, and anti-aliasing was off — but the experience held up. This is the minimum spec mode at work. Based on my testing, it can help integrated graphics run what would otherwise be a fairly demanding game.

I would recommend turning to the graphics settings before lowering your render resolution. The settings have plenty of headroom, so you can optimize your performance with them alone. The render resolution slider is useful, too, but it quickly mangles the image into a mess of pixels; the settings won’t.
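
To put the render slider’s cost in perspective, here’s some rough arithmetic (mine, not the game’s exact scaling method) showing how much of a native 1080p image survives at a lower render scale:

```python
# Approximate pixel counts at different render resolution scales on a 1080p display.
width, height = 1920, 1080
for scale in (1.00, 0.66, 0.50):
    pixels = int(width * scale) * int(height * scale)
    print(f"{scale:.0%} scale: {pixels:,} pixels ({pixels / (width * height):.0%} of native)")
```

At the 66% scale the game picked for my XPS 13, you’re rendering well under half the native pixels, which is why the image degrades so much faster than it does from simply lowering the quality settings.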

XPS 15 — Low-end discrete GPU

XPS 15 running Age of Empires IV.

After my current laptop, I tested my previous laptop — a 2017 Dell XPS 15 with an Intel Core i7-7700HQ, a GTX 1050 graphics card, and 8GB of RAM. This was a solid machine a few years back, and the discrete GPU helped Age of Empires IV a lot (even if it isn’t nearly as powerful by today’s standards).

At 1080p with all of the settings maxed out, I managed 26 fps. That’s roughly 50% faster than my newer XPS 13, and although it’s below what I would consider playable, you could manage with this level of performance. The game actually defaulted to higher graphics settings when I first booted it up, and it ran at 1080p with the render resolution at 66%, not 720p like on my XPS 13.

A discrete GPU, even one that was underpowered when it was released, is better than integrated graphics. With the recommended settings (below), the machine jumped to a 58 fps average. The recommended settings were higher than on my XPS 13, too, so there’s still a little room to optimize the game beyond what it recommends.

  • Image Quality: Medium
  • Animation Quality: Low
  • Shadow Quality: Low
  • Texture Detail: Low
  • Geometry Detail: Medium
  • Physics: Low
  • Anti-aliasing: Low

1080p is the target for most mobile rigs unless you have a dedicated gaming laptop. A discrete GPU is best so you can turn up some settings, but as the XPS 13 shows, you can get by with integrated graphics if you’re willing to turn down everything. The render slider helps, too, though at a significant cost to image quality.

Desktop — High-end gaming rig

Finally, I turned to my desktop with an Intel Core i9-10900K, an RTX 3070, and 32GB of RAM. This configuration blows past Age of Empires IV's ideal system requirements, so I was able to manage a comfortable 79 fps average at 4K with all of the sliders maxed out. This really is the ideal way to play Age of Empires IV.

The minimum spec mode is great for low-end hardware and integrated GPUs, but the game shines with all of the visual bells and whistles. The lighting system looks fantastic, and the character models are surprisingly detailed. A 79 fps average is nothing to sneeze at, but you could hit framerates above 100 fps if you optimize a few settings.

If you want to optimize your game, look at the Shadow Quality and Animation Quality settings. Those will provide a decent boost in performance. If you’re using a graphics card with less than 8GB of video memory, turning down Texture Quality will offer some huge performance gains, too.

I took a couple of 4K screenshots (above) to see the difference between the regular renderer and the minimum spec one. And there are some massive differences. The character models are of lower quality, but the bigger differences come in lighting, shadows, and terrain quality. If you can play the full version, you should. That said, it’s nice to see an option to allow less powerful machines to still experience the game.

Studies find bias in AI models that recommend and diagnose diseases

Research into AI- and machine learning model-driven methods for health care suggests that they hold promise in the areas of phenotype classification, mortality and length-of-stay prediction, and intervention recommendation. But models have traditionally been treated as black boxes in the sense that the rationale behind their suggestions isn’t explained or justified. This lack of interpretability, in addition to bias in their training datasets, threatens to hinder the effectiveness of these technologies in critical care.

Two studies published this week underline the challenges yet to be overcome when applying AI to point-of-care settings. In the first, researchers at the University of Southern California evaluated the fairness of models trained with Medical Information Mart for Intensive Care IV (MIMIC-IV), the largest publicly available medical records dataset. The other, coauthored by scientists at Queen Mary University, explores the technical barriers to training unbiased health care models. Both arrive at the conclusion that ostensibly “fair” models designed to diagnose illnesses and recommend treatments are susceptible to unintended and undesirable racial and gender prejudices.

As the University of Southern California researchers note, MIMIC-IV contains the de-identified data of 383,220 patients admitted to an intensive care unit (ICU) or the emergency department at Beth Israel Deaconess Medical Center in Boston, Massachusetts between 2008 and 2019. The coauthors focused on a subset of 43,005 ICU stays, filtering out patients younger than 15 years old who hadn’t visited the ICU more than once or who stayed less than 24 hours. Represented among the samples were married or single male and female Asian, Black, Hispanic, and white hospital patients with Medicaid, Medicare, or private insurance.
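
As a rough sketch of the kind of cohort filtering described above (the table layout and column names below are illustrative stand-ins, not the study’s actual code or MIMIC-IV’s exact schema):

```python
import pandas as pd

# Hypothetical CSV exports standing in for MIMIC-IV's ICU-stay and patient tables.
stays = pd.read_csv("icustays.csv")      # one row per ICU stay
patients = pd.read_csv("patients.csv")   # demographics, including age

cohort = stays.merge(patients, on="subject_id")

# Filters analogous to those described in the study.
cohort = cohort[cohort["age"] >= 15]         # drop patients younger than 15
cohort = cohort[cohort["los_hours"] >= 24]   # drop stays shorter than 24 hours

print(f"{len(cohort):,} ICU stays retained")
```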

In one of several experiments to determine to what extent bias might exist in the MIMIC-IV subset, the researchers trained a model to recommend one of five categories of mechanical ventilation. Alarmingly, they found that the model’s suggestions varied across different ethnic groups. Black and Hispanic cohorts were less likely to receive ventilation treatments, on average, while also receiving a shorter treatment duration.
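
The kind of check that surfaces this sort of disparity is simple to sketch. The snippet below is illustrative only (hypothetical column names, not the researchers’ analysis): group the model’s outputs by ethnicity and compare the recommended ventilation rate and duration.

```python
import pandas as pd

# Hypothetical table of model outputs: one row per patient, with the
# recommended ventilation category/duration and the patient's ethnicity.
preds = pd.read_csv("ventilation_predictions.csv")

by_group = preds.groupby("ethnicity").agg(
    ventilation_rate=("recommended_ventilation", "mean"),      # share recommended any ventilation
    mean_duration_hours=("predicted_duration_hours", "mean"),
    patients=("recommended_ventilation", "size"),
)
print(by_group.sort_values("ventilation_rate"))
```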

Insurance status also appeared to have played a role in the ventilator treatment model’s decision-making, according to the researchers. Privately insured patients tended to receive longer and more frequent ventilation treatments than Medicare and Medicaid patients, presumably because patients with generous insurance could afford better treatment.

The researchers caution that there exist “multiple confounders” in MIMIC-IV that might have led to the bias in ventilator predictions. However, they point to this as motivation for a closer look at models in health care and the datasets used to train them.

In the study published by Queen Mary University researchers, the focus was on the fairness of medical image classification. Using CheXpert, a benchmark dataset for chest X-ray analysis comprising 224,316 annotated radiographs, the coauthors trained a model to predict one of five pathologies from a single image. They then looked for imbalances in the predictions the model gave for male versus female patients.

Prior to training the model, the researchers implemented three types of “regularizers” intended to reduce bias. This had the opposite of the intended effect — when trained with the regularizers, the model was even less fair than when trained without regularizers. The researchers note that one regularizer, an “equal loss” regularizer, achieved better parity between males and females. This parity came at the cost of increased disparity in predictions among age groups, though.
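
The paper’s exact regularizers aren’t reproduced here, but an “equal loss” style penalty is commonly implemented along these lines: a minimal PyTorch sketch, assuming each batch carries a binary sex attribute (an illustration of the general technique, not the authors’ formulation).

```python
import torch
import torch.nn.functional as F

def equal_loss_penalty(logits, labels, group, weight=1.0):
    """Penalize the gap between the average loss on two patient groups.

    `group` is a 0/1 tensor (e.g., recorded sex). Illustrative sketch of an
    'equal loss' regularizer, not the Queen Mary paper's exact formulation.
    """
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    if not ((group == 0).any() and (group == 1).any()):
        return torch.zeros((), device=logits.device)  # batch contains only one group
    gap = per_sample[group == 0].mean() - per_sample[group == 1].mean()
    return weight * gap.abs()

# Training step: task loss plus the fairness penalty.
# loss = F.cross_entropy(logits, labels) + equal_loss_penalty(logits, labels, group)
```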

“Models can easily overfit the training data and thus give a false sense of fairness during training which does not generalize to the test set,” the researchers wrote. “Our results outline some of the limitations of current train time interventions for fairness in deep learning.”

The two studies build on previous research showing pervasive bias in predictive health care models. Because researchers are often reluctant to release their code, datasets, and techniques, bias in the data used to train algorithms for diagnosing and treating diseases can go unexamined and so perpetuate inequalities.

Recently, a team of U.K. scientists found that almost all eye disease datasets come from patients in North America, Europe, and China, meaning eye disease-diagnosing algorithms are less certain to work well for racial groups from underrepresented countries. In another study, Stanford University researchers claimed that most of the U.S. data for studies involving medical uses of AI come from California, New York, and Massachusetts. A study of a UnitedHealth Group algorithm determined that it could underestimate by half the number of Black patients in need of greater care. Researchers from the University of Toronto, the Vector Institute, and MIT showed that widely used chest X-ray datasets encode racial, gender, and socioeconomic bias. And a growing body of work suggests that skin cancer-detecting algorithms tend to be less precise when used on Black patients, in part because AI models are trained mostly on images of light-skinned patients.

Bias isn’t an easy problem to solve, but the coauthors of one recent study recommend that health care practitioners apply “rigorous” fairness analyses prior to deployment as one solution. They also suggest that clear disclaimers about the dataset collection process and the potential resulting bias could improve assessments for clinical use.
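
In practice, that kind of pre-deployment fairness analysis often starts with something as simple as comparing error rates across subgroups on a held-out set. Here is a minimal sketch of such an audit, with hypothetical column names rather than any particular study’s data:

```python
import pandas as pd

# Hypothetical hold-out results: true label, model prediction, and
# demographic attributes for each patient.
results = pd.read_csv("holdout_predictions.csv")

def group_report(df, attribute):
    """Per-group true-positive and false-positive rates."""
    rows = []
    for value, g in df.groupby(attribute):
        positives = (g["label"] == 1).sum()
        negatives = (g["label"] == 0).sum()
        tpr = ((g["pred"] == 1) & (g["label"] == 1)).sum() / max(positives, 1)
        fpr = ((g["pred"] == 1) & (g["label"] == 0)).sum() / max(negatives, 1)
        rows.append({attribute: value, "tpr": tpr, "fpr": fpr, "n": len(g)})
    return pd.DataFrame(rows)

print(group_report(results, "ethnicity"))
print(group_report(results, "sex"))
```

Large gaps in these rates between groups are exactly the kind of signal the coauthors argue should be caught, and disclosed, before a model reaches a clinic.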

Spotify patents eerie ‘mood-detecting tech’ to recommend you songs

For example, the tone of voice may be more upbeat, high-pitched and/or exciting for users that have been assigned the personality trait of extroversion.

It also proposes using “intonation, stress, [and] rhythm” to infer your mood, “a combination of vocal tract length and pitch” to estimate your age, and environmental metadata to detect whether you’re alone or in a group.
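
To make those signals a bit more concrete, here’s a toy sketch in plain NumPy (my illustration, not anything from Spotify’s patent) that estimates the pitch of a short audio frame via autocorrelation, the kind of low-level voice feature such a system would feed into its mood or age models:

```python
import numpy as np

def estimate_pitch(frame, sample_rate, fmin=60.0, fmax=400.0):
    """Rough fundamental-frequency estimate of a voice frame via autocorrelation."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    best_lag = lo + np.argmax(corr[lo:hi])
    return sample_rate / best_lag

# Synthetic 200Hz "voice" frame: 50ms at a 16kHz sample rate.
sr = 16000
t = np.arange(int(0.05 * sr)) / sr
print(f"Estimated pitch: {estimate_pitch(np.sin(2 * np.pi * 200 * t), sr):.1f} Hz")
```

A real system would combine many such low-level features (pitch, speech rate, background metadata, and so on) before inferring anything about mood or age.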

These insights could be used to recommend songs in a variety of ways:

In one example, the output might simply be to play the next content. In another example, the output might be a recommendation on a visual display… In another example aspect, the output is a display of recommended next music tracks corresponding to the preferences.

The patent may not find its way into the platform, but it does offer a glimpse into the future of music recommendations.

Don’t worry, be happy?

Spotify has always stressed that it recommends a diverse range of music — because there are only so many times you can listen to the Black Eyed Peas before you go van Gogh on your ears. One, to be precise.

Still, the idea of determining music tastes based on demographic data sounds rather restrictive.

Just because I’m a 35-year-old male from the UK, it doesn’t mean I like Norah Jones. Maybe I wanna get down with the kidz and put on some K-pop? I absolutely don’t, but you get what I mean.

The mood-based recommendations sound more promising — but also more disturbing.

If I’m wallowing in self-pity, would Spotify recommend some Morrissey to push me deeper into the darkness, or Walking on Sunshine to drag me out of my hole? Either way, I wouldn’t wanna rely on algorithms for psychological support.


