Alphabet is putting its prototype robots to work cleaning up around Google’s offices

What does Google’s parent company Alphabet want with robots? Well, it would like them to clean up around the office, for a start.

The company announced today that its Everyday Robots Project — a team within its experimental X labs dedicated to creating “a general-purpose learning robot” — has moved some of its prototype machines out of the lab and into Google’s Bay Area campuses to carry out some light custodial tasks.

“We are now operating a fleet of more than 100 robot prototypes that are autonomously performing a range of useful tasks around our offices,” said Everyday Robots’ chief robot officer Hans Peter Brøndmo in a blog post. “The same robot that sorts trash can now be equipped with a squeegee to wipe tables, and the same gripper that grasps cups can learn to open doors.”

The robots in question are essentially arms on wheels: a multipurpose gripper on the end of a flexible arm, attached to a central tower. There’s a “head” on top of the tower with cameras and sensors for machine vision, and what looks like a spinning lidar unit on the side, presumably for navigation.

One of Alphabet’s Everyday Robot machines cleans the crumbs off a cafe table.
Image: Alphabet

As Brøndmo indicates, these bots were first seen sorting out recycling when Alphabet debuted the Everyday Robot team in 2019. The big promise that’s being made by the company (as well as by many other startups and rivals) is that machine learning will finally enable robots to operate in “unstructured” environments like homes and offices.

Right now, we’re very good at building machines that can carry out repetitive jobs in a factory, but we’re stumped when trying to get them to replicate simple tasks like cleaning up a kitchen or folding laundry.

Think about it: you may have seen robots from Boston Dynamics performing backflips and dancing to The Rolling Stones, but have you ever seen one take out the trash? That’s because getting a machine to manipulate never-before-seen objects in a novel setting (something humans do every day) is extremely difficult. This is the problem Alphabet wants to solve.

Unit 033 makes a bid for freedom.
Image: Alphabet

Is it going to? Well, maybe one day, if company execs feel it’s worth burning through millions of dollars in research to get there. For the foreseeable future, though, humans will certainly be cheaper and more efficient than robots at these jobs. Today’s update from Everyday Robots is neat, but it’s far from a leap forward. You can see from the GIFs Alphabet shared that its robots are still slow and awkward, carrying out tasks inexpertly and at a glacial pace.

However, it still counts for something that the robots are being tested “in the wild” rather than in the lab. Compare Alphabet’s machines to Samsung’s Bot Handy, for example: a similar-looking tower-and-arm bot that the company showed off at CES last year, apparently pouring wine and loading a dishwasher. Bot Handy looks like it’s performing those jobs, but it was only running through a prearranged demo. Who knows how capable, if at all, that robot is in the real world? At least Alphabet is finding this out for itself.

Repost: Original Source and Author Link


AMD May Work With Samsung on New 4nm Chromebook Processors

AMD may soon work with the Samsung chip foundry in order to prepare 4nm Chromebook processors, according to Gokul Hariharan, an analyst at J.P. Morgan.

AMD has previously worked with only two semiconductor foundries — TSMC and GlobalFoundries — but if the report proves to be true, the cooperation with Samsung may also extend to graphics cards.


The cooperation between AMD and GlobalFoundries, and later mainly TSMC, has lasted for over a decade. Now AMD appears to be looking to expand its reach and utilize Samsung’s 4LPP process to build new Chromebook processors. Chips made on this 4nm node are likely to end up in AMD’s entry-level APUs, which are optimized for long battery life at minimal power requirements.

“Our research indicates that AMD is likely to outsource a Chromebook CPU to Samsung on its 4nm (likely mass production in late 2022), and TSMC may have limited capacity to allocate for Chromebook projects, given the declining market demand,” said Hariharan in a note.

If AMD does indeed proceed with Samsung, some of its products will differ from the chips manufactured at TSMC. It’s possible that AMD may have to redesign its CPU and GPU IP to adapt it to Samsung’s 4LPP 4nm-class node. Although this could prove rather costly, diversifying beyond a single foundry reduces supply risk, and the 4nm process node seems well suited to the low power requirements of Chromebooks.

AMD Ryzen 5000G

The analysts at J.P. Morgan also believe that AMD may use the Samsung Foundry for some of its graphics card designs going forward, but this won’t happen until much later. The report predicts 2023/2024 as the time when AMD may work with Samsung 3nm on some GPU-related projects. Although AMD is definitely branching out, the analysts predict that the vast majority of its core platforms will remain with TSMC N3.

The information comes from a note prepared for the clients of J.P. Morgan that was later shared on Twitter by @MarcTheShark83. We’re not likely to see the new AMD APU anytime soon — production is set to begin in 2022. AMD is also working on new APUs in the Rembrandt 6000 series, which, when released, promise unprecedented iGPU performance for their price range.

AMD is not the first manufacturer looking to switch things up. Recent rumors suggest that Apple may soon revamp its new M1 Pro and M1 Max chips. Found inside the new MacBook Pros, these chips are currently based on a 5nm manufacturing process, and it’s impossible to deny that they deliver top-notch performance. However, Apple might be about to switch to a 3nm process in order to achieve even better output. These chips would be created in cooperation with TSMC.



Battlefield 2042: How Does the Plus System Work?

Battlefield 2042 brings plenty of new content to the storied franchise, but the Plus System is arguably the most exciting. This powerful, quality-of-life change gives you improved control over the weapons you bring into battle — and it’s a bit more complex than you’d expect.

If you’re looking to be a threat in every situation Battlefield 2042 throws at you, here’s what you need to know to master the Plus System.

Further reading

What is the Plus System in Battlefield 2042?

Battlefield 2042‘s Plus System gives you the ability to change weapon attachments while in the middle of a game. Instead of having to wait until your next respawn, you can quickly adjust your scope, barrel attachment, and other settings as needed.

See an enemy out on the horizon? Quickly clip a scope on your gun, and you’re ready for long-range engagements. Need to sneak through a building undetected? Throw on a silencer with just a few button presses. The system is surprisingly flexible, and it allows you to cater your loadout to whatever situation you happen to find yourself in.

How does the Plus System work?

A soldier stands next to a downed satellite in Battlefield 2042 Hazard Zone.

While in a match, press the Plus System button for your specific platform (LB on Xbox, L1 on PlayStation, T on PC) to pull up the menu. The Plus System is broken into four arms — hence its name — and depending on which weapon you’re carrying, you’ll have a variety of different attachments to choose from.

On console, you can simply use the D-Pad to highlight which attachment you want to add to your weapon. Release LB or L1, and the game will automatically add the selected attachment. On PC, you’ll have to left-click your selection. There’s no limit to how many times you can use the Plus System, so don’t be afraid to dive into the menu and adjust your loadout as needed.

How do I unlock more content for the Plus System?

Although your options are limited at the start of Battlefield 2042, the Plus System will gradually accumulate more attachments as you level up. These are typically tied to your Mastery Level with each specific weapon — that is, the more kills you rack up, the more attachments you’ll unlock.

You can view all currently unlocked Plus System attachments — and change your Plus System layout — by following these simple steps:

  • Head to the Main Menu.
  • Select the Collection tab at the top of the screen.
  • Navigate to the Weapons submenu.
  • Select the weapon you’d like to view/modify.

This will pull up a menu that shows all the gear you’ve unlocked for your specific weapon. It’ll also give you a chance to add each piece of gear to the Plus System — making it easy to optimize your build before heading into a match.

Keep in mind that there are more unlockable attachments than there are slots in the Plus System, so you’ll need to bring only your favorite gear into battle. You can change your selection between battles, but while you’re in-game, you’ll only have access to the attachments you’ve slotted into the Plus System.



What is the Metaverse and How Does it Work?

Wondering what the Metaverse is? Chances are you’ve already been there. There is no one perfect description for the concept, but in general, we’re talking about digital interaction and human decision making in a few key ways. The term “Metaverse” comes from the Neal Stephenson science fiction novel Snow Crash, released in 1992 – but it’s come to mean so much more since then.

The Metaverse is Second Life

There were games before it that were similar, but the release of Second Life really struck a chord with pop culture in a way that still rings true today. As Dwight said in The Office, “Second Life is not a game. It is a multiuser virtual environment. It doesn’t have points, or scores, it doesn’t have winners or losers.”

Dwight does a decent job of explaining Second Life, which represents a very rudimentary doorway into the metaverse. Incidentally, the second part of his speech, about how he plays Second Life, is important for explaining the difference between classic games and the metaverse.

Dwight says he created a version of himself in Second Life that was exactly the same as he was in his real life, except he could fly. The metaverse can be as simple as that; it doesn’t need to be anything as wild and crazy as what we see in Ready Player One.

The Metaverse is Ready Player One

The story and movie Ready Player One present a future in which the idea of the metaverse has become so pervasive that people care more about their life in the machine than they do about real life. In what the book and movie call The Oasis, we see a hosted metaverse with an idyllic (and potentially impossible) sense of freedom and openness.

At the same time, this representation of the metaverse suggests there’ll be one all-encompassing, all-accessible piece of software that’ll host everything and anything. Aside from the reality in which we live today, we’re crossing our fingers and praying that no other singular universe like this dominates our digital world in the future. The implications would be nightmarish.

The Metaverse is Minecraft and Roblox

The first time you play Minecraft, you realize you’ve entered a new realm of “gameplay.” You’re represented by an avatar that has the ability to dig holes, harvest materials, build things, and live a life as you see fit. You can also play games and go fishing.

In Minecraft, creativity comes from a careful balance of limitations and functionality. You control blocks, and you get a sense of accomplishment from achieving goals within an environment that has a clear set of rules. Minecraft was created as a game, and became a platform once its potential was revealed.

Roblox, on the other hand, was built as a platform for creators from the get-go: a place where content would be generated by users with very few restrictions.

Despite what its very fantastical trailers show, the Roblox platform is not as immediately aesthetically pleasing as Minecraft. Because of the relative lack of curation by those in charge of Roblox, it is not difficult to find glitches and games that are effectively non-functional.

The important bits of both Roblox and Minecraft are in creative potential. Both titles are immersive, and both allow you to create and modify the environment in which you exist.

The Metaverse is not new

The basic building blocks of the metaverse have been around since the early days of the internet. From the moment we started giving ourselves personalized usernames, using fun icons, and building our own webpages, we’ve been in the metaverse.

It’s just now that we’re getting to a point where describing these creative environments has become necessary. We’ve entered a point at which an all-digital environment can host more than just a game – it can be the place where we do work, socialize, and effectively live an entire second life.

The Metaverse is Mixed Reality

Niantic’s description of a “Real-World Metaverse” captures the phase we’re entering now. Non-fungible tokens (NFTs) represent one way in which digital goods can be treated as being as “real” as physical goods. An experience like Pokemon GO shows how attaching digital goods to our real world can make a platform feel like more than a game.

The potential for the metaverse is massive. Metaverse apps are expected to generate billions in consumer spending from this point forward. Companies that successfully accept and secure their place in this creative digital landscape will find monstrous room for growth.

There is no one metaverse

As an individual, it’s important that you stay aware of the dangers of this new reality. As it is with any phase change in our human experience, there’s room for profit and power, but there’s also room for malicious actors and all manner of people with bad intent.

There’ll be plenty of liars: lying liars who lie about how their take on the metaverse is the end-all, be-all platform. There is no one single “metaverse,” even if a company has branded its ecosystem as such.

Just as it has always been with the internet, so too is it true of the metaverse: there is no one single authority, only entities. There are plenty of entry points into the ephemeral environment that is the metaverse, and not all elements within this future are compatible. Whatever avenue you choose, and with whomever you choose to interact, be careful – and have fun!



Pokemon GO won’t work on your old iPhone soon

Niantic announced in mid-November 2021 that Pokemon GO would soon end support for older versions of iOS: specifically, any version released before September 2019. If you have an iPhone with software that old, now is the time to get an update from Apple. Also, who are you, and would you mind answering a few questions for us?

In September of 2019, Apple released the operating system known as iOS 13. If your iPhone runs software older than this, Pokemon GO will cease to function in the very near future. To see which version of iOS your device is running, open Settings – General – About.

Niantic suggested that their “next release” will be the one that’ll make this change. They’ve recommended that all iOS users make sure they’re updated beyond iOS 12 as soon as possible so as to “continue to play without interruption.”

The list of devices supported by Pokemon GO isn’t changing. This means that if you have a very old iPhone, but that very old iPhone is able to run iOS 13, you’ll still be golden. Niantic suggested this week that its oldest supported iPhone model remains the same – iPhone 6s and newer.

Pokemon GO release version 225 will be the first to cut off devices running iOS 12 or older. Any 12.x version counts here: you’ll need at least iOS 13 to continue playing Pokemon GO once version 225 is released.

If you have an Android device, you will not be affected by this update. Android users already have a minimum version requirement: Android 6, aka Android Marshmallow, an operating system first released all the way back in October of 2015. Which operating system versions Niantic supports comes down to the percentage of players using each one.

Once enough players are on the newest version of iOS, and few enough remain on older versions, Niantic moves forward. Every new version of the Pokemon GO app must be tested against each operating system version Niantic officially supports. Reducing that number of versions allows Niantic’s developers to focus on more important things – like plopping Shiny Pokemon into the game for special events!
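The version gate Niantic describes (a per-platform minimum OS version) can be sketched in a few lines. This is an illustrative sketch, not Niantic’s actual code; the floor values come from this article’s reporting:

```python
# Illustrative sketch of the minimum-OS gate described above.
# Not Niantic's real code: the floors (iOS 13, Android 6) come from
# this article's reporting on Pokemon GO version 225.
MIN_SUPPORTED = {"ios": (13, 0), "android": (6, 0)}

def is_supported(platform: str, os_version: tuple) -> bool:
    """Return True if this OS version can still run the new app release."""
    return tuple(os_version) >= MIN_SUPPORTED[platform]

print(is_supported("ios", (12, 5)))      # False: iOS 12.x is cut off
print(is_supported("ios", (13, 0)))      # True: iOS 13+ keeps working
print(is_supported("android", (6, 0)))   # True: Marshmallow remains the floor
```

Tuple comparison handles the “any 12.x counts” rule for free, since (12, 5) sorts below (13, 0).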

This should give you some idea of how spread out the version numbers are among active Android players, and how up to date most iOS users are at any given time. Niantic sees that most iOS users are on iOS 13 or higher, so it’ll drop support for iOS 12 soon. Once enough Android users update from… Marshmallow?! What on earth? They’ll drop that too!



Do Old Joy-Cons Work with the New Nintendo Switch OLED?

While it wasn’t the rumored Switch Pro, Nintendo did release a new version of its hardware in 2021. The Nintendo Switch OLED is the latest iteration of the handheld/home console hybrid, but it brings hardly any new power. What it does bring is the titular OLED screen for when you play in handheld mode. OLED screens are seen as the superior display type thanks to how much more vibrant and deep they make colors look. Even though the system can’t push games to actual higher resolutions, the OLED screen still makes every game on the Switch look that much sharper.

Unlike the Switch Lite, which has the controls built into the device itself, the new Switch OLED is another version of the standard Switch. That means it can be docked or played in handheld mode just like the original. For Nintendo fans looking to upgrade their original units, the new OLED screen could be very tempting, but if you’ve had the Switch for a couple of years and have collected some of the many colorful Joy-Con controllers, you may be hesitant to buy this new model if they aren’t compatible. Now that we’ve gotten our hands on the new system, we can say for certain whether your old Joy-Cons will work with the new Nintendo Switch OLED.

Further reading

Are old Joy-Cons compatible with the Nintendo Switch OLED?

The short and sweet answer is yes. Any and all Joy-Cons you have in your collection can be paired up and used with the new Nintendo Switch OLED seamlessly. That means all those bright and colorful Joy-Cons you have can be mixed and matched with your shiny new OLED screen as you wish. With this new model’s slightly bigger screen and much-improved kickstand, it has never been more comfortable to take your Switch on the go. Also, if you plan on keeping your old Switch, the new Joy-Cons that come with the OLED model can be used on that system as well. Mix and match your controllers to your heart’s content!

Also, should the situation ever come up, the entire Switch OLED can also fit into your original Switch’s dock. All this to say, if you feel the enhanced screen is worth the investment and can find any available units to purchase, there’s no downside to grabbing this slightly new Nintendo Switch system.



Best Nvidia Control Panel Settings: Gaming, Work, Creativity

Whether you’re into gaming, creative work, or just some day-to-day computing, the graphics card plays a key role in many tasks. What many people don’t know is that getting the most out of your Nvidia GPU involves knowing the best Nvidia Control Panel settings.

Are you looking for the ultimate gaming experience with higher frame rates and better visuals? We’re here to help.

Keep reading to optimize your Nvidia settings in a few quick steps.

Step 1: Update your drivers

Keeping your drivers up-to-date is the key to getting the best out of your graphics card. Before delving further into the Nvidia Control Panel, make sure that you’ve got the latest drivers that Nvidia has to offer.

Downloading Nvidia drivers is simple and can be done in two ways: via the Nvidia website or through the GeForce Experience program.

Downloading via the Nvidia website

If you’re downloading straight from the Nvidia website, simply select your graphics card, press Search, and download the latest driver. If you’re not sure about the exact model of your card, check your PC specifications first.

Once the file has been downloaded, double-click it and let the installation commence. During that time, your screen may go black at times — don’t worry about it.

Downloading via GeForce Experience

Nvidia GeForce Experience.

GeForce Experience is a program that helps you update drivers, optimize in-game settings, and more. It also alerts you when new drivers are available for download.

To use this method of updating your drivers, download the program and then install it. You may be prompted to create an account. Once that is done, simply launch GeForce Experience.

The program will automatically begin searching for new drivers. If any are found, you will be alerted at the top of the screen.

Choose Express Installation and let the drivers install. Your screen may occasionally go black during the installation — this is perfectly normal.

Step 2: Launch the Nvidia Control Panel

Before you proceed, make sure to restart your computer after the installation of new drivers.

Most owners of Nvidia graphics cards will have the Control Panel installed by default, which means you likely won’t need to download it. However, if you ever do need to, it is available as a separate download.

There are two ways to launch the Control Panel. The easiest way is to simply right-click on the desktop and choose Nvidia Control Panel from the dropdown menu.

Finding the Nvidia Control Panel.

Alternatively, launch the Windows Start menu by clicking the Windows icon at the bottom-left of your screen. You can also press the Windows button on your keyboard, located similarly near the bottom-left side.

With the Start menu open, type in Nvidia Control Panel and then press Enter.

Best Nvidia Control Panel settings for gaming and performance

You can use individual games’ settings menus to decide your GPU settings, but optimizing the settings of your Nvidia card in the Control Panel may have a huge impact on your gaming experience. Smoother gameplay and better, sharper, brighter visuals are all a possibility when the settings are properly adjusted.

The Nvidia Control Panel is easy enough to navigate, but there are so many options to choose from that it may seem confusing at first. To find the best Nvidia settings for gaming or simply day-to-day performance, work through the list of settings explained below.

If you don’t need a long explanation of what each setting does and how it can improve the performance of your graphics card, the recommended value appears at the end of each setting’s description below.

Nvidia Control Panel — 3D Settings

The 3D Settings tab on the left-hand side of the Nvidia Control Panel is arguably the most important when it comes to gaming, but it’s equally important for creativity. To access all the options, simply click on Adjust Image Settings With Preview.

Below the moving Nvidia logo, select Use the Advanced 3D Image Settings and then click Apply at the bottom.

Switch to the Manage 3D Settings tab on the left side in order to edit all the available 3D settings.

Editing 3D Settings in the Nvidia Control Panel.

Image sharpening

This setting enhances the visuals in your games, making them appear sharper and clearer. For the best performance, turn it On.

The sharpening level should be set to around 0.50 and the film grain to around 0.17, but feel free to play around.

If you see the option to turn on GPU Scaling, do that by ticking the box. This will enable scaling to the native resolution of your display.

Ambient occlusion

This setting is responsible for the shadows and environmental lighting in your games.

For the best balance between GPU load and great gameplay, set this to Performance.

Anisotropic filtering

Anisotropic filtering increases the visual quality of game textures when your camera is at a steep angle.

This setting should be set to Application-controlled.

Antialiasing — FXAA

FXAA stands for fast approximate anti-aliasing, Nvidia’s screen-space anti-aliasing algorithm. You can turn this setting Off.

Antialiasing — Gamma correction

This corrects the brightness values in images enhanced by antialiasing. It’s usually best to turn this setting On.

Antialiasing — Mode

This is a general setting related to antialiasing, which in itself is a technique that smooths out images. Leave this at Application-Controlled.

Antialiasing — Transparency

The last AA setting applies to Nvidia’s technology of applying antialiasing to transparent textures. You can usually turn this Off.

Background Application Max Frame Rate

This controls the frames per second (fps) that your games and other applications will have when minimized. If you don’t have any performance issues, you can leave this Off.

If you feel you’d rather limit background frame rates, turn this on and set it to the bare minimum frame rate you want to target, such as 60 fps, or on older or weaker GPUs, 30 fps.


CUDA – GPUs

This setting should always be set to All; it determines which of your Nvidia GPUs are available to CUDA applications.

DSR – Factors

DSR stands for Dynamic Super Resolution. This technology improves image quality by rendering games at a higher resolution and then scaling the image down to your monitor’s native resolution.

While this sounds good on paper, it’s a killer for your frame rates (fps), so it’s best to turn this setting Off.

DSR – Smoothness

Much like the previous DSR setting, this will only decrease your fps. As such, it’s better to turn it Off.

Low Latency Mode

Low Latency Mode ensures that the frames in your game are submitted into the render queue just when the graphics card requires them. Nvidia refers to this as “just in time frame scheduling”. This results in, as the name itself suggests, lower latency and higher frame rates.

To increase your fps, turn this setting On.

Max Frame Rate

This setting caps your fps at a chosen number. Different games reach different frame rates, and while your monitor’s refresh rate is the true limit of what you can see, some games run with heavily unlocked frame rates in menus, which can put undue power drain on your GPU.

If you don’t want to limit your frame rates in any way, simply turn this setting Off. If you’d like to adjust this setting to match your monitor, keep reading.

Some competitive gamers like to set this option to double that of their monitor’s refresh rate. If you’re not sure about the refresh rate of your monitor, you can check it in Windows Settings.

In order to check your display’s refresh rate, click the Windows logo on the bottom-left side of the screen. Next, type in Advanced Display and click on View Advanced Display Info. Check the refresh rate and adjust this setting to match it. In our example, the limit would be 75.

Checking monitor refresh rates in Windows 10.
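The cap arithmetic above can be expressed directly. A hypothetical rule-of-thumb helper, not part of any Nvidia tool: match the monitor’s refresh rate by default, or double it for the competitive-play approach mentioned earlier.

```python
# Rule-of-thumb frame cap from this guide: match your monitor's
# refresh rate, or double it for competitive play.
def frame_cap(refresh_hz: int, competitive: bool = False) -> int:
    return refresh_hz * 2 if competitive else refresh_hz

print(frame_cap(75))                    # 75: matches the example monitor
print(frame_cap(75, competitive=True))  # 150: doubled for competitive play
```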

Monitor technology

This setting will not be visible to all users. If you can see it, you can use it to turn on Nvidia’s G-Sync. Nvidia G-Sync is responsible for adjusting your monitor’s refresh rate to become dynamic, causing display refreshes only when a frame is sent from the GPU. It solves issues such as screen tearing.

G-Sync is a welcome addition on budget computers, but it’s not needed on modern desktops and monitors. If you’re using a mid-to-high-end setup, it’s best to leave this Off. Otherwise, turn it On.

Multi-Frame Sampled Anti-Aliasing (MFAA)

This setting removes jagged edges and smooths out graphics, resulting in improved visuals. Unfortunately, this is a small gain for the price that your frame rates may have to pay.

For gaming, we recommend that you turn this setting Off.

OpenGL Rendering GPU

This option lets you choose which of your graphics cards (if you have more than one) will be used for OpenGL rendering. Pick your GPU from the dropdown menu.

Power Management Mode

As the name implies, this setting is responsible for optimizing the power vs performance ratio of your graphics card.

If you don’t mind letting your GPU use maximum power and perhaps run a little hotter, select Prefer Maximum Performance.

Shader Cache

When turned on, this setting reduces processor usage. It’s optimal to turn it On.

Texture filtering — Anisotropic sample optimization

Anisotropic sample optimization limits the number of samples that are used by your GPU. Turn this setting On.

Texture filtering — Negative LOD bias

When turned on, this setting makes stationary images sharper and enables texture filtering. Toggle it to Allow.

Texture filtering — Quality

This setting lets you optimize texture filtering to value performance. Switch it to High Performance.

Texture filtering — Trilinear optimization

Trilinear optimization will usually be on by default. It smooths out textures in your games. If it happens to be turned off, make sure to switch it to on.

Threaded optimization

Threaded optimization allows your computer to utilize several processor cores at once. Turn this to Auto.

Triple buffering

When triple buffering is enabled, the GPU can render ahead into a second back buffer instead of waiting on the display. Although there are some games that benefit from this setting, it’s best to turn it Off.

Vertical sync

Vsync synchronizes the frame rate with your monitor’s refresh rate. As it can limit your fps, it’s better to turn it Off unless you experience severe screen tearing issues.

Virtual Reality pre-rendered frames

This setting limits the number of frames that your processor prepares ahead of your GPU being able to process them.

For optimal performance, set this to 1.

Finishing up

Once you have adjusted all of these settings, click Apply on the bottom right-hand side to save the changes.
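For quick reference, the recommendations above can be collected into a single cheat-sheet mapping. The keys paraphrase the Control Panel labels and the values are this guide’s suggestions; this is just a summary, not an actual Nvidia API or config file:

```python
# Quick-reference summary of the 3D settings recommended in this guide.
# Keys paraphrase Nvidia Control Panel labels; values are the guide's
# suggestions, not official Nvidia defaults.
RECOMMENDED_3D_SETTINGS = {
    "Image Sharpening": "On (sharpen 0.50, film grain 0.17)",
    "Ambient Occlusion": "Performance",
    "Anisotropic Filtering": "Application-controlled",
    "Antialiasing - FXAA": "Off",
    "Antialiasing - Gamma Correction": "On",
    "Antialiasing - Mode": "Application-controlled",
    "Antialiasing - Transparency": "Off",
    "Background Application Max Frame Rate": "Off",
    "CUDA - GPUs": "All",
    "DSR - Factors": "Off",
    "DSR - Smoothness": "Off",
    "Low Latency Mode": "On",
    "Max Frame Rate": "Off (or match your monitor's refresh rate)",
    "Multi-Frame Sampled AA (MFAA)": "Off",
    "Power Management Mode": "Prefer Maximum Performance",
    "Shader Cache": "On",
    "Texture Filtering - Anisotropic Sample Optimization": "On",
    "Texture Filtering - Negative LOD Bias": "Allow",
    "Texture Filtering - Quality": "High Performance",
    "Texture Filtering - Trilinear Optimization": "On",
    "Threaded Optimization": "Auto",
    "Triple Buffering": "Off",
    "Vertical Sync": "Off",
    "Virtual Reality Pre-rendered Frames": "1",
}

for setting, value in RECOMMENDED_3D_SETTINGS.items():
    print(f"{setting}: {value}")
```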

Nvidia Control Panel settings – Configure Surround, PhysX

Go back to the menu on the left-hand side and navigate to Configure Surround, PhysX.

Nvidia Control Panel.

On the right part of this section, you will find PhysX Settings. Switch that from Auto to the model of your GPU.

Nvidia Control Panel settings – Display

This section will help you optimize your display settings. Navigate to it on the left-hand side and go down the list as required.

Change resolution

Select the monitor that you are using at the top of this section. If you have multiple displays, you will have to repeat the steps for all of them.

Under Resolution, scroll down until you find the highest possible resolution in the PC section. Next, adjust the monitor’s refresh rate to the highest available.

Scroll down to the color settings and click on Use Nvidia Color Settings. Make sure that the desktop color depth is set to Highest (32-bit) and that the Output Dynamic Range is set to Full.

Nvidia Control Panel.

Click Apply to save all your changes.

Adjust desktop color settings

This section lets you play around with the color settings on your display. All the settings here are down to your personal preference. You can adjust Brightness, Contrast, and Gamma in the first row. Feel free to move the sliders and press Apply to see the result, as the changes are easily reverted.

You can also try out Digital Vibrance. This setting will increase color saturation and make the shades brighter. A value of around 70% to 80% may look best, but this depends on the game.
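Nvidia doesn’t document the exact algorithm, but Digital Vibrance behaves roughly like a saturation boost. A rough sketch (the function name and factor are our own, chosen to approximate a 75% boost):

```python
import colorsys

def boost_saturation(r: int, g: int, b: int, factor: float = 1.75):
    """Scale the HSV saturation of an 8-bit RGB color, clamped at full saturation."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    s = min(1.0, s * factor)
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)
```

A muted red like (200, 120, 120) becomes a much punchier (200, 60, 60), while grays are left untouched, which matches the “brighter shades” effect described above.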

If you’ve made any changes, press Apply.

Adjust desktop size and position

You will notice that we’ve skipped over the next three sections: Rotate display, View HDCP status, and Set Up Digital Audio. None of these affects gaming, so they don’t require adjustments.

In the Adjust Desktop Size and Position tab, pick the display you want to make changes to, and then look down toward the Scaling section.

Pick the setting that best matches your needs:

  • For the highest fps possible at native resolution, pick No Scaling.
  • For the best mix of performance and visuals, make sure GPU scaling is enabled in the 3D settings, and then set this option to Aspect ratio.
  • For low-resolution and pixel-art games, pick Integer Scaling.

Make sure that the scaling is performed on the GPU by selecting that option in the dropdown menu below. Lastly, tick the box that says Override the Scaling Mode Set By Games and Programs.
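The difference between Aspect ratio and Integer Scaling is easiest to see as arithmetic. A small sketch (the function names are ours):

```python
def aspect_ratio_scale(src_w, src_h, dst_w, dst_h):
    """Largest output size that fits the display while keeping the aspect ratio."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

def integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest whole-number multiple that fits the display (keeps pixels crisp)."""
    factor = min(dst_w // src_w, dst_h // src_h)
    return src_w * factor, src_h * factor
```

A 1280x720 game on a 1920x1200 monitor scales to 1920x1080 with black bars under Aspect ratio, while a 640x360 pixel-art game gets a crisp 3x multiple (1920x1080) under Integer Scaling.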

Once you’re done, press Apply.

Set up G-Sync

Look back at the menu on the left-hand side. If this section is not visible to you, simply move on to the next step. If it is, it will let you decide whether or not to use Nvidia G-Sync.

Nvidia’s G-Sync synchronizes your monitor’s refresh rate with the frame rate output by your graphics card. However, on modern computers with powerful graphics cards and gaming monitors with higher refresh rates, this setting is almost obsolete. It can actually lower your gaming performance.

If you are playing on a budget or mid-range computer that is several years old, you can try out G-Sync by ticking the box next to Enable. However, if your PC is somewhat new, you should keep G-Sync off.

Set up multiple displays

This section is only useful to users who run multiple monitor setups. If that’s not you, jump down to the next step.

In this part of the Control Panel, you will be able to change your display configuration and which displays to use. If you have two or more monitors, you can drag their icons using your mouse. This lets you pick which display will act as your primary monitor.

In general, you should drag the icons to match the physical setup of your displays. You can move them above, below, to the right, or to the left, so all kinds of monitor setups can be configured here.

Adjust video color settings

This setting will help you optimize the color palette used in videos and games.

Under the second question, about how color adjustments are made, choose the second option, With the Nvidia settings. Switch over to the Advanced tab below, pick Full (0-255) for the dynamic range, and press Apply to save all changes.

Finalize the changes

As the last section dubbed Adjust Video Image Settings doesn’t require any changes, you are almost done. Your PC is now running on the best Nvidia Control Panel settings for gaming and overall performance.

In order to finalize the changes, quickly restart your computer. Once that’s done, run your favorite game and check to make sure that everything is stable. Feel free to go back to the Nvidia Control Panel to tweak the settings further if needed.

Quick guide to Nvidia Control Panel settings for gaming

If you want to quickly adjust your settings to increase frames per second and offer smooth gameplay, but don’t necessarily need to know what each setting does, follow the guide below.

Start by referring to steps one and two above: download the latest drivers and launch Nvidia Control Panel. Afterward, apply the following settings:

3D Settings

  • Image sharpening: Turn this On. Set the sharpening level to 0.50 and the film grain to 0.17.
  • Ambient occlusion: Set this to Performance.
  • Antialiasing — FXAA: Turn this Off.
  • Antialiasing — gamma correction: Turn this On.
  • Antialiasing — mode: Set this to Application-controlled.
  • Antialiasing — transparency: Turn this Off.
  • Background application max frame rate: Turn this Off.
  • CUDA – GPUs: Set this to All.
  • DSR – Factors; DSR – Smoothness: Turn both of these Off.
  • Low latency mode: Turn this On.
  • Max frame rate: Turn this Off or synchronize it to your monitor’s refresh rate.
  • Monitor technology: Turn G-Sync On if you are on a budget PC/monitor combo; otherwise, turn it Off.
  • MFAA: Turn this Off.
  • OpenGL Rendering GPU: Select your GPU from the dropdown menu.
  • Power Management Mode: Select Prefer Maximum Performance.
  • Shader Cache: Turn this On.
  • Texture filtering — anisotropic sample optimization: Turn this On.
  • Texture filtering — negative LOD bias: Set this to Allow.
  • Texture filtering — quality: Set this to High Performance.
  • Texture filtering — trilinear optimization: Turn this On.
  • Threaded optimization: Set this to Auto.
  • Triple buffering: Turn this Off.
  • Vertical sync: Turn this Off.
  • Virtual reality pre-rendered frames: Set this to 1.
  • Configure surround, PhysX: Switch PhysX to the model of your GPU.

Press Apply to save all changes.

Display settings

Press Apply after completing each step.

  • Change resolution: Set the highest possible resolution and the highest refresh rate available. Select Use Nvidia Color Settings and set the color depth to the Highest, and the dynamic range to Full.
  • Adjust desktop color settings: Set Digital Vibrance to 70-80% and see if you like it.
  • Adjust desktop size and position: Refer to our full explanation to pick the right setting in this section.
  • Set up G-Sync: Turn G-Sync On if you are on a budget PC/monitor combo; otherwise, turn it Off.
  • Adjust video color settings: Select the option to Use Nvidia Settings. Pick Full (0-255).

Restart your computer to finalize the changes.

Repost: Original Source and Author Link


Apple cares about privacy, unless you work at Apple

Jacob Preston was sitting down with his manager during his first week at Apple when he was told, with little fanfare, that he needed to link his personal Apple ID and work account.

The request struck him as odd. Like anyone who owns an Apple product, Preston’s Apple ID was intimately tied to his personal data — it connected his devices to the company’s various services, including his iCloud backups. How could he be sure his personal messages and documents wouldn’t land on his work laptop? Still, he was too giddy about his new job as a firmware engineer to care. He went ahead and linked the accounts.

Three years later, when Preston handed in his resignation, the choice came back to haunt him. His manager told him to return his work laptop, and — per Apple protocol — said he shouldn’t wipe the computer’s hard drive. His initial worry had come to pass: his personal messages were on this work laptop, as were private documents concerning his taxes and a recent home loan. Preston pushed back, saying some of the files contained highly personal information and there was no reasonable way to make sure they were all removed from the laptop without wiping it completely.

He was told the policy wasn’t negotiable.

Preston’s story is part of a growing tension inside Apple, where some employees say the company isn’t doing enough to protect their personal privacy and, at times, actively seeks to invade it for security reasons. Employees have been asked to install software builds on their phones to test out new features prior to launch — only to find the builds expose their personal messages. Others have found that when testing new products like Apple’s Face ID, images are recorded every time they open their phones. “If they did this to a customer, people would lose their goddamn minds,” says Ashley Gjøvik, a senior engineering program manager.

Apple employees also can’t use their work email addresses to sign up for iCloud accounts, so many use their personal accounts.

The blurring of personal and work accounts has resulted in some unusual situations, including Gjøvik allegedly being forced to hand compromising photos of herself to Apple lawyers when her team became involved in an unrelated legal dispute.

Underpinning all of this is a stringent employment agreement that gives Apple the right to conduct extensive employee surveillance, including “physical, video, or electronic surveillance” as well as the ability to “search your workspace such as file cabinets, desks, and offices (even if locked), review phone records, or search any non-Apple property (such as backpacks, purses) on company premises.”

Apple also tells employees that they should have “no expectation of privacy when using your or someone else’s personal devices for Apple business, when using Apple systems or networks, or when on Apple premises” (emphasis added).

Many employees have a choice between getting an Apple-owned phone or having the company pay for their phone plan. But one source tells The Verge that trying to maintain two phones can become impractical. In software engineering, certain employees are expected to participate in a “live-on” program that puts out daily builds with bug fixes. “You can’t have a successful live-on program without people treating these devices exactly the same as a personal phone,” the source says. “So a work device or a work account just won’t cut it.”

None of these policies are unique. Tech companies almost always have rules in place to search employees’ corporate devices, including personal devices used for work. It’s also common practice for tech companies to ask employees to test new software, which could potentially expose personal information. But Apple sets itself apart from other tech giants through its commitment to consumer privacy. As Tim Cook said at the CPDP Computers, Privacy and Data Protection conference in January 2021, businesses built on buying and selling user data, without the knowledge or consent of consumers, “[degrade] our fundamental right to privacy first, and our social fabric by consequence.” The lack of employee privacy has made the perceived hypocrisy particularly irksome to some workers.

Now, as employees begin to push back against a variety of Apple norms and rules, these policies are coming under the spotlight, raising the question of whether the company has done enough to safeguard personal employee data. It might seem like a company obsessed with secrecy would be sympathetic to its employees’ wishes to have confidential information of their own. But at Apple, secrecy requires the opposite: extensive knowledge, and control, over its workforce.

This is how it starts: a new Apple employee is told during onboarding that collaborating with their colleagues will require them to make extensive use of iCloud storage, and their manager offers a two terabyte upgrade. This will link their personal Apple ID to their work account — in fact, the instructions for accessing this upgrade explicitly say “you must link your personal Apple ID with your AppleConnect work account.” The connection will give them access to collaborative apps like Pages and Numbers that they might need to do their jobs. (Apple employees who do not have a business need to collaborate do not go through this process.)

Employees could pause during onboarding and say they want to create a new Apple ID specifically for work or use a different phone. But most do not — it seems a little paranoid, and the Apple instructions say to go ahead and use your personal account. What’s more, most Apple devices don’t support using multiple Apple IDs. To switch between iCloud accounts on an iPhone, you have to completely sign out of one ID and into another — a clunky, disruptive process. It is far easier culturally and technically to simply link personal and work accounts, which adds a new Apple Work folder to the employee’s iCloud account.

In theory, this Apple Work folder is where all of the collaborative documents for employees are supposed to live in order to keep personal and work files separate. In practice, the owner of a document often forgets to store files in the work folder, and documents quickly become intermingled. In fact, when Apple employees create a document in, say, Pages, the app automatically enters the personal email address used for their Apple ID. “I asked my manager about it and it’s just sort of an issue everyone deals with,” Preston says.

Employees can choose to not sync certain folders, like their photo libraries. But others, like messages, can be trickier. Apple adopted Slack in 2019, but some teams still use iMessage as a primary way to communicate, which makes opting out of a message sync nearly impossible.

Over the past few weeks, employees have been discussing the difficulty of setting up different Apple IDs to keep work and personal files separate, noting that while it’s possible, there are significant technical hurdles. “I don’t understand why they didn’t create an Apple ID and iCloud account from our work email address during the onboarding process,” one employee said on Slack. “I get mad that I have to use my personal phone to text my boss,” said another.

Concerns about data privacy are not ubiquitous inside Apple. Many employees who spoke to The Verge said they were aware the company gave itself extensive rights to search their data, but — for various reasons — weren’t overly worried about the fallout.

“When I joined Apple, I personally expected it to be pretty invasive and took some serious steps to separate my work and personal life,” one source says.

For other employees, however, the mixing of personal and work data has already had real consequences. In 2018, the engineering team Ashley Gjøvik worked on was involved in a lawsuit. The case had nothing to do with Gjøvik personally, but because she’d worked on a project related to the litigation, Apple lawyers needed to collect documents from her phone and work computer.

Gjøvik asked the lawyers to confirm that they wouldn’t need to access her personal messages. She says her team discouraged the use of two phones; she used the same one for work and personal and, as a result, had private messages on her work device.

A member of the legal team responded that while the lawyers did not need to access Gjøvik’s photos, they did not want her to delete any messages. During an in-person meeting, Gjøvik says she told the lawyers the messages included nude photos she’d sent to a man she was dating — a sushi chef who lived in Hawaii. Surely, those weren’t relevant to the lawsuit. Could she delete them? She says the lawyers told her no.

In 2017, Apple rolled out an app called Gobbler that would allow employees to test Face ID before it became available to customers. The process was routine — Apple often launched new features or apps on employees’ phones, then collected data on how the technology was used to make sure it was ready for launch.

Gobbler was unique in that it was designed to test face unlock for iPhones and iPads. This meant that every time an employee picked up their phone, the device recorded a short video — hopefully of their face. They could then file “problem reports” on Radar, Apple’s bug tracking system, and include the videos if they found a glitch in the system. “All data that has your face in it is good data,” said an internal email about the project. After rumblings of criticism, Apple eventually changed the codename to “Glimmer.”

Unlike other Apple features, Glimmer wasn’t automatically installed on employee phones. It required an informed consent form so employees would know what they were getting into. Still, for some people on engineering teams, participation was encouraged — even expected, according to two staff members. Once it was installed, some data that didn’t contain personally identifiable information would automatically upload to Radar, unless employees turned off this setting.

Apple was careful to instruct employees not to upload anything sensitive, confidential, or private. But it didn’t tell people what was happening with the hundreds of images they didn’t upload in Radar reports.

The reports themselves were also a cause for concern. When employees file Radar tickets, they include detailed information about the problems they are seeing. In 2019, Gjøvik filed a ticket about Apple’s photo search capabilities. “If I search for ‘infant’ in my photo library, it returns a selfie I took of myself in bed after laparoscopic surgery to treat my endometriosis,” she wrote, including four images in the ticket. The default sharing settings for the ticket included all of software engineering.

Radar tickets also are not removable. Even when the tickets are closed, they remain searchable. In training, employees say they are told: “Radar is forever.”

What’s more, when employees file Radar tickets, they are often asked to include diagnostic files, internally called “sysdiagnose,” to give Apple more information about the problem. If they are filing a bug about iMessage, they might be asked to install a sysdiagnose profile that exposes their iMessages to the team tasked with fixing the issue. For employees using a live-on device, default settings can mean that, as they are filing a Radar ticket, a sysdiagnose profile is being automatically created in the background, sending data to Apple without the employee realizing it.

When sysdiagnose profiles are not included, employees have been known to post memes calling out the omission.

Gjøvik is currently on administrative leave from Apple due to an ongoing investigation into claims she made about harassment and a hostile work environment. If she leaves the company, she’ll likely face the same conundrum as Jacob Preston, related to the mixing of her personal and work files.

Employees likely wouldn’t care too much about this were it not for another Apple rule that bars them from wiping their devices when they leave the company. If they do, they’ll be in direct violation of their employment agreement, leaving them vulnerable to legal action.

After Preston gave notice, he received a checklist from his manager that explicitly said: “Do not wipe or factory reset any Apple owned units (such as laptops, Mac, ipads, and iPhones).”

“Before joining Apple I had a lot of respect for the company,” Preston says. “They’re the one tech company that takes privacy seriously. But then they go and have these policies that are hypocritical and go against their stated values. It’s sort of hard to reconcile. It’s like now that I’m leaving, my privacy isn’t a concern anymore.”

Apple did not respond to a request for comment from The Verge.



Apple’s Siri will finally work without an internet connection thanks to on-device processing

Apple’s digital assistant Siri will process audio on-device by default in iOS 15, meaning you will be able to use the feature without an active internet connection. Apple says the upgrade will also make Siri faster.

Processing audio on-device will make using Siri more private, says Apple. This follows the company’s well-established preference for implementing machine learning features on-device, rather than sending data away to the cloud to be processed.

“This addresses one of the biggest privacy concerns for voice assistants, which is unwanted audio recording,” said Apple in a press statement.

Obviously, if you don’t have an internet connection, then you won’t be able to carry out certain Siri functions like searching the web. But you can use Apple’s digital assistant for simple navigational tasks, like controlling your music, opening apps, setting timers and so on. Making these functions accessible offline is a great move for accessibility, too.

However, the feature will only be available on iPhones and iPads with an A12 Bionic chip or later. That means the iPhone XS, XS Max, and XR; iPhone 11, 11 Pro, 11 Pro Max, and iPhone SE (second generation); iPhone 12, 12 Mini, 12 Pro, and 12 Pro Max; iPad Mini (fifth generation); iPad Air (third and fourth generations); and iPad (eighth generation). It won’t come to the Apple Watch at all.




Eidos’ Quebec game studios shift to a four-day work week

Eidos is joining the ranks of game developers pushing back against crunch time. The Square Enix-owned studio has announced that its Quebec locations (Montreal and Sherbrooke) are shifting to a four-day, Monday-to-Thursday work week sometime in the “next few weeks.” Salaries and working conditions will remain the same, the Deus Ex and Tomb Raider developer said.

The hope, as you might expect, is to improve the quality of working hours. Eidos is accordingly encouraging teams to redefine work conditions and improve efficiency, such as by cutting meeting times. The company had already implemented some quality-of-life changes during the pandemic, such as rest periods and compensation for mental and physical health costs — this is ostensibly a logical extension of that strategy.

Eidos isn’t the first studio to adopt a four-day week. Bugsnax developer Young Horses made that switch in September. It’s very rare for a large studio to make this move, though, and it might prompt similar moves by other developers if the strategy proves successful.

“If” is the operative term, however. Eidos was eager to tout past tests with shortened work weeks, like with Iceland’s civil service, but it’s not yet clear if that translates to the game industry. Developers are notorious for rushing games to make the holiday season, even if that leads to extremely buggy results. Eidos will have to make a hard choice: does it stick to the four-day schedule and risk delaying games, or demand extra hours to be sure a title is ready? This shortened week could pay off with happier developers, but it could also cause problems if teams take months more to finish projects.

