Categories
Computing

Mac Mini 2022: new design, better performance, and more

At WWDC 2022, Apple announced the M2 chip that would power the 13-inch MacBook Pro and redesigned MacBook Air. However, the Mac Mini was notably absent from this announcement and the event at large. Apple may still release the M2 Mac Mini, but it’s hard to tell when.

If you’re looking to get into Apple’s Mac ecosystem, the superb Mac Mini is one of the best-value ways to do it. After 2020’s M1 model, expectations are high for how Apple could follow up with new chips and new features, including a high-end model for more demanding users.

We’ve put together this roundup with as many details on the next Mac Mini as we can find. Simply read on to see what Apple has planned for its smallest desktop Mac.

Release date

Rich Shibley/Digital Trends

For a while, there were rumors of two Mac Minis being in the works, with one high-end model and one entry-level version said to be coming. After Apple’s Peak Performance event in March, it became apparent that the rumored high-end Mac Mini was almost certainly the Mac Studio, which essentially looks like several Mac Minis stacked on top of each other.

With that out of the way, we’re still waiting on updates to the main Mac Mini line. Not only has the M1 Mac Mini not been updated since late 2020, but Apple is still selling an Intel-based Mac Mini on its website, despite promising to have almost completed its transition to its own Apple Silicon chips. That means both versions could be updated sooner rather than later.

But when specifically can we expect these changes? Well, Apple’s Worldwide Developers Conference (WWDC) on June 6th was a good bet, but the Mac Mini didn’t make an appearance. However, that doesn’t mean that the M2 Mac Mini isn’t coming out. It just didn’t launch at the same time as the new MacBooks.

A new Mac Mini is still on the way, and that idea is bolstered by a discovery made by iOS developer Steve Troughton-Smith, who unearthed an interesting clue in firmware for Apple’s Studio Display monitor. The firmware made mention of an as-yet-unreleased Mac dubbed “Macmini10,1,” which Troughton-Smith believes could be referring to an M2 Mac Mini. Having it mentioned in official firmware is a strong indication that Apple is almost ready to deploy the Mac, and even with WWDC out of the way, it could still release in 2022.

As for the high-end Mac Mini that is set to replace the Intel-based version, the timing of this model is less certain. We’ll have to wait and see.

Price

The latest Mac Mini, sitting under a PC monitor.

Now for the price. The current M1 Mac Mini starts at $699, with a second model costing $899; the Intel version, meanwhile, starts at $1,099. That pricing structure makes sense, so we wouldn’t be surprised if Apple stuck with it for the new models.

The only caveat is that there are rumors swirling that the forthcoming high-end Mac Mini will get a redesigned chassis. When Apple has done this in the past, it has sometimes come with a price increase — see the 2021 16-inch MacBook Pro for a recent example — so we could see a similar situation hit the Mac Mini.

A familiar design?

Renders of the next Mac Mini, complete with a new design.

As we outlined above, one rumored Mac Mini — complete with a redesigned chassis — turned out to be the Mac Studio. However, there is another rumor that has not yet been disproved that suggests the Mac Mini will still get a new look.

In May 2021, leaker Jon Prosser released renders depicting the next Mac Mini with a much slimmer design than its current iteration (largely due to the more power-efficient Apple Silicon chip inside), with an aluminum body topped with a plexiglass-like surface. He also asserted Apple has been experimenting with different color options, but whether this will make it to the finished product is unknown.

As well as that, in August 2021, Mark Gurman stated in his Power On newsletter that the Mac Mini “will have an updated design and more ports than the current model.” However, he did not go into specifics regarding the shape and size of the upcoming device.

The slimmed-down design reported by Prosser makes sense. With the advent of the M1 chip, Apple has been able to design its computers around the chip’s greater efficiency compared to Intel processors by cutting their bulk. We’ve already seen the results in the totally overhauled 24-inch iMac, which was reduced to a minuscule 11.5mm in thickness, and the Mac Mini could be next to get this treatment.

It’s also believable for another reason. The Mac Mini is a popular computer in server farms thanks to its small size, which is one reason we doubted the rumored “multi-stack” Mac was actually a Mac Mini (and in the end, it was released under the Mac Studio name instead). If Apple thins down the Mac Mini’s chassis, it will be good news for server farms, which will potentially be able to squeeze even more of the machines onto their racks.

All that said, there was a dissenting voice in the form of well-known Apple analyst Ming-Chi Kuo. In a tweet from March 2022, Kuo explained that “the new Mac Mini in 2023 will likely remain the same form factor design,” suggesting that Apple will not go for a slimmed-down appearance. By pointing to 2023, Kuo also accurately predicted that no new Mac Mini would appear at WWDC 2022.

Even better performance

The Apple Mac Mini 2018 under a monitor with two external speakers.
Julian Chokkattu/Digital Trends

We can’t be certain of the next Mac Mini’s performance for one big reason: It’s not yet clear what chip it will use. Right now, rumors suggest it could be either the as-yet-unreleased M2 chip or the M1 Pro.

The M2 seems to be making the stronger case, especially since the new M2 MacBook Air launched at WWDC (along with a hardware refresh for the 13-inch MacBook Pro). It would seem odd for Apple to launch an M2 Mac — its next generation of chip architecture — then also launch a previous-generation M1 Pro Mac Mini alongside it. Such a move could make the Mac Mini instantly feel out of date. For that reason, an M2 Mac Mini feels much more likely, with an M2 Pro Mac Mini perhaps following later in 2022 or 2023.

So if the M2 is the most probable chip we’ll see inside the next Mac Mini, what kind of performance can we expect? Well, if Apple’s numbers from WWDC can be believed, the M2 will be nearly 20% faster than the M1. That’s a sizeable performance bump, even though the chip keeps the same number of cores as the last generation. Early speculation predicted that the M2 would have more cores than the M1, but that turned out not to be the case.

When we eventually get a high-end Mac Mini, its M2 Pro chip (assuming that’s what it comes with) will be a noticeable upgrade over the M2. The current M1 Pro and M1 Max have various options, with memory ranging from 16GB to 64GB. They also include the following core options:

  • M1 Pro with eight-core CPU and 14-core GPU
  • M1 Pro with 10-core CPU and 14-core GPU
  • M1 Pro with 10-core CPU and 16-core GPU
  • M1 Max with 10-core CPU and 24-core GPU
  • M1 Max with 10-core CPU and 32-core GPU

The M2 Pro and M2 Max (if the Mac Mini gets it) are likely to upgrade those core counts, although it’s too early to say what the complete lineup might look like. However, Mark Gurman has stated Apple is working on a 14-inch MacBook Pro with an M2 Pro chip featuring 12 CPU cores and 38 GPU cores. A previous newsletter from Gurman had also suggested Apple was planning an M2 Pro chip with 12 CPU cores and 16 GPU cores. It’s possible that one or both of these will be offered inside the upcoming Mac Mini.

Note that both Gurman and 9to5Mac have separately claimed that Apple is testing an M2 Pro Mac Mini but have not mentioned an M2 Max version, so we’re skeptical that the Mac Mini will get that chip at this stage.

Features: More ports and monitor support

The port selection in the rear of a Mac Mini.

The new chips won’t just mean more power — they will also affect the features you can expect to find in the upcoming Mac Mini. That’s because they control a number of things beyond simply raw performance, such as the port selection and external monitor support.

While the M1 Mac Mini was a step up over its Intel predecessor in almost every way, it had one notable drawback: Instead of the four Thunderbolt ports the Intel model offered, the M1 edition only came with two. The most likely explanation is that this was a limitation imposed by the chip itself.

There are no such worries on Macs with M1 Pro and M1 Max chips. The 14-inch and 16-inch MacBook Pros, for example, offer three Thunderbolt ports compared to the two found on the M1 MacBook Pro. And according to both Mark Gurman and Jon Prosser, the Mac Mini will also get a more generous port selection.

While Gurman has been coy about the exact port arrangement, Prosser has laid his cards on the table: Four Thunderbolt/USB-C ports, two USB-A slots, one Ethernet port, and one HDMI port is his prediction, and that matches the offering on the current Intel-based model. There could also be a MagSafe-style power adapter like the one on the 24-inch iMac, Prosser believes.

The M1 Pro and M1 Max chips could fix another annoyance linked to the M1 chip: The poor support for external monitors. Every M1 Mac is limited to one external display (barring the Mac Mini itself, but that’s only thanks to its HDMI port). That’s something we lamented in our M1 MacBook Air review, and it isn’t really good enough these days.

Luckily, the latest Apple chips have remedied this situation. The M1 Pro allows up to two 6K displays to be attached to the 2021 MacBook Pro, while the M1 Max can support up to four monitors (three 6K and one 4K). With the Mac Studio, meanwhile, you can attach up to five external displays (four 6K and one 4K). The Mac Mini doesn’t come with its own display, so external monitor support is crucial — and the more you can connect, the better.

Repost: Original Source and Author Link

Categories
AI

Unity moves robotics design and training to the metaverse

Unity, the San Francisco-based platform for creating and operating games and other 3D content, on November 10 announced the launch of Unity Simulation Pro and Unity SystemGraph to improve modeling, testing, and training complex systems through AI.

With robotics usage in supply chains and manufacturing increasing, such software is critical to ensuring efficient and safe operations.

Danny Lange, senior vice president of artificial intelligence for Unity, told VentureBeat via email that the Unity SystemGraph uses a node-based approach to model the complex logic typically found in electrical and mechanical systems. “This makes it easier for roboticists and engineers to model small systems, and allows grouping those into larger, more complex ones — enabling them to prototype systems, test and analyze their behavior, and make optimal design decisions without requiring access to the actual hardware,” said Lange.

Unity’s execution engine, Unity Simulation Pro, offers headless rendering — eliminating the need to project each image to a screen and thus increasing simulation efficiency by up to 50% and lowering costs, the company said.

Use cases for robotics

“The Unity Simulation Pro is the only product built from the ground up to deliver distributed rendering, enabling multiple graphics processing units (GPUs) to render the same Unity project or simulation environment simultaneously, either locally or in the private cloud,” the company said. This means multiple robots with tens, hundreds, or even thousands of sensors can be simulated faster than real time on Unity today.

According to Lange, users in markets like robotics, autonomous driving, drones, agriculture technology, and more are building simulations containing environments, sensors, and models with million-square-foot warehouses, dozens of robots, and hundreds of sensors. With these simulations, they can test software against realistic virtual worlds, teach and train robot operators, or try physical integrations before real-world implementation. This is all faster, more cost-effective, and safer, taking place in the metaverse.

“A more specific use case would be using Unity Simulation Pro to investigate collaborative mapping and mission planning for robotic systems in indoor and outdoor environments,” Lange said. He added that some users have built a simulated 4,000 square-foot building sitting within a larger forested area and are attempting to identify ways to map the environment using a combination of drones, off-road mobile robots, and walking robots. The company reports it has been working to enable creators to build and model the sensors and systems of mechatronic systems to run in simulations.

A major application of Unity SystemGraph is that it enables those building simulations with physically accurate cameras and lidar models, via SensorSDK, to take advantage of SystemGraph’s library of ready-to-use models and easily configure them for their specific use cases.

Customers can now simulate at scale, iterate quickly, and test more to drive insights at a fraction of current simulation costs, Unity says. The company adds that customers like Volvo Cars, the Allen Institute for AI, and Carnegie Mellon University are already seeing results.

While there are several companies that have built simulators targeted especially at AI applications like robotics or synthetic data generation, Unity claims that the ease of use of its authoring tools makes it stand out above its rivals, including top competitors like Roblox, Aarki, Chartboost, MathWorks, and Mobvista. Lange says this is evident in the size of Unity’s existing user base of over 1.5 million creators using its editor tools.

Unity says its technology is aimed at impacting the industrial metaverse, where organizations continue to push the envelope on cutting-edge simulations.

“As these simulations grow in complexity in terms of the size of the environment, the number of sensors used in that environment, or the number of avatars operating in that environment, the need for our product increases. Our distributed rendering feature, which is unique to Unity Simulation Pro, enables you to leverage the increasing amount of GPU compute resources available to customers, in the cloud or on-premise networks, to render this simulation faster than real time. This is not possible with many open source rendering technologies or even the base Unity product — all of which will render at less than 50% real time for these scenarios,” Lange said.

The future of AI-powered technologies

Moving into 2022, Unity says it expects to see a steep increase in the adoption of AI-powered technologies, with two key adoption motivators. “On one side, companies like Unity will continue to deliver products that help lower the barrier to entry and help increase adoption by wider ranges of customers. This is combined with the decreasing cost of compute, sensors, and other hardware components,” Lange said. “Then on the customer adoption side, the key trends that will drive adoption are broader labor shortages and the demand for more operational efficiencies — all of which have the effect of accelerating the economics that drive the adoption of these technologies on both fronts.”

Unity is doubling down on building purpose-built products for its simulation users, enabling them to mimic the real world by simulating environments with various sensors, multiple avatars, and agents for significant performance gains with lower costs. The company says this will help its customers to take the first step into the industrial metaverse.

Unity will showcase the Unity Simulation Pro and Unity SystemGraph through in-depth sessions at the forthcoming Unity AI Summit on November 18, 2021.


Categories
Game

Xbox Design Lab brings back rubberized grips and metallic finishes for controllers

After being on hold while Microsoft launched the Xbox Series X and S consoles, Xbox Design Lab returned this summer. Unfortunately, rubberized grips and other options that were available before were nowhere to be found. Now, Xbox Design Lab has announced that rubberized grips and metallic color finishes are back, along with some all-new additions.

Black rubberized grips are now available for the side grips and back case, and you can choose from 19 metallic finish colors. Those include three types of silver (sterling, pewter and gunmetal) along with more exotic shades like Deep Pink, Oxide Red, Zest Orange, Gold, Electric Volt, Velocity Green and Glacier Blue.

Xbox Design Lab also introduced three new regular colors on top of the 18 already available: Dragonfly Blue, Nocturnal Green and Velocity Green. The latter two replace Military Green and Electric Green, which are no longer an option. It also launched “inspired by” controller designs from The Elder Scrolls V: Skyrim, Battlefield 2042, Forza Horizon 5, and Riders Republic.

The new options bring Design Lab back to where it was before, with literally millions of combinations possible. As before, you’ll see a nice 3D rendering of the product each time you add an option, to help make sure your final product doesn’t look like hot garbage. The controllers start at $70, though “pricing may vary for additional options” like the rubberized grips, Microsoft points out. You can custom design your controller here.


Categories
Game

Xbox Design Lab controller option explosion just made your decision even tougher

Microsoft announced some new additions to Xbox Design Lab customization options today. When the service re-launched earlier this year with Xbox Series X gamepads, it was missing some customization options it had previously. Those are now available once more, along with some entirely new options and designs inspired by four big games.

Grips and metallic finishes back on the menu

Returning to Xbox Design Lab today is the option to outfit your custom controller with rubberized grips or a metallic finish for specific controller components. Prospective buyers can choose to outfit their controller with rubberized grips both on the sides and back of the gamepad, though those grips only come in one color (black).

The d-pad and triggers can be given a metallic design as well. There are 19 colors in all, according to Microsoft, making for quite the long list of options: Sterling Silver, Pewter Silver, Gunmetal Silver, Abyss Black, Retro Pink, Deep Pink, Oxide Red, Zest Orange, Gold, Electric Volt, Velocity Green, Glacier Blue, Dragonfly Blue, Mineral Blue, Photon Blue, Midnight Blue, Regal Purple, Nocturnal Green, and Warm Gold.

In addition to those returning customization options, Microsoft has also launched some new stuff as well. As revealed in that list above, Dragonfly Blue is an entirely new color that can be applied to some components. Microsoft also says that Military Green and Electric Green have been replaced by Nocturnal Green and Velocity Green, respectively.

Finally, there are four pre-made controller designs (pictured in the gallery above) inspired by Forza Horizon 5, The Elder Scrolls V: Skyrim, Battlefield 2042, and Riders Republic that customers can use as a jumping-off point for their own custom controllers. On their own, the controllers look pretty good, though each one comes with rubberized side and back grips along with metallic d-pads and triggers, so customers may want to tweak those before checkout.

New options come at a premium price

That’s because rubberized grips and metallic components add a decent amount of money to the overall cost of an Xbox Design Lab controller. For example, rubberized grips for the sides and back cost $5.99 each, while metallic d-pads and triggers cost $3.99 each. That doesn’t seem like much, but when you consider that Xbox Design Lab controllers start at $69.99, it’s possible to craft a controller that costs just a few cents shy of $100 – assuming that you also opt for a $9.99 engraving on it.
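To put those numbers together, here is a quick back-of-the-envelope sketch in Python. The option names are illustrative; the prices are the figures quoted above:

```python
# Price of a fully loaded Xbox Design Lab controller, using the
# figures quoted above (all prices in USD; option names illustrative).
BASE_PRICE = 69.99

PREMIUM_OPTIONS = {
    "rubberized side grips": 5.99,
    "rubberized back grips": 5.99,
    "metallic d-pad": 3.99,
    "metallic triggers": 3.99,
    "engraving": 9.99,
}

total = BASE_PRICE + sum(PREMIUM_OPTIONS.values())
print(f"${total:.2f}")  # a few cents shy of $100
```

Dropping any one premium option from the dictionary shows how mixing and matching brings the total back down.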

That’s a pricey controller for sure, but that’s if you choose every premium option available. Mixing and matching those premium options will, obviously, save you some cash, resulting in a custom controller that’s still expensive but not triple-digit expensive.

Xbox Design Lab has proven to be a popular service for Microsoft, so if you’ve ever wanted to create a controller that’s uniquely yours, you now have more options in that endeavor. These new options are live on the Xbox Design Lab right now, but just a heads-up: At the time of this writing, the website seems to be having a few issues, so you may have to refresh a few times before any changes you make to your controller are applied.


Categories
Computing

Apple M3 Max Rumored to Use 3nm Design With Up to 40 Cores

A new report reveals some important details on the future of Apple’s Mac chips, covering the next two generations, presumably to be called the M2 Max and M3 Max.

The report comes from The Information, which states that the second generation of Apple silicon will be a minor improvement compared to the third, which will allegedly feature 3nm chips with up to 40 cores.

Apple’s current M1 Max and M1 Pro chips are produced by TSMC, the Taiwan-based semiconductor foundry, and based on the latest report, their successors reveal how Apple will scale performance in future generations.

The M1, M1 Max, and M1 Pro chips are all 5nm chips, and according to the rumors, the upcoming generation, likely called M2, will also utilize TSMC’s 5nm process. However, there will be improvements: The chip will contain two dies, allowing for the use of more cores. This will still be an upgrade over the current (already incredibly solid) generation, but it won’t be as big a leap as what we’ve seen between the M1 and the M1 Max and M1 Pro chips.

A new variant of the M1 Max chip is reportedly going to be used in the successor to the current Mac Pro and will feature two dies, which should provide a notable jump in performance. It’s likely that the second generation of M1 Max and M1 Pro chips will be found in the next MacBook Pro models.

While the immediate successor to the M1 Max and M1 Pro sounds interesting, it’s the third generation of Apple silicon that truly captures attention. According to The Information, Apple is looking to start producing 3nm chips with the rumored M3 chip. The swap from 5nm to 3nm would make room for up to four dies, opening up the possibilities for much greater performance than what we’re seeing now.

Using the 3nm chip with four dies clears room for up to 40 compute cores. Compared to the 5nm chip, this is a massive upgrade. The highest number of cores presently found inside Apple products is the high-end Mac Pro tower with up to 28 cores, but that’s on an Intel Xeon W processor. Apple’s own silicon offers eight cores (M1) and 10 cores (M1 Max and M1 Pro) at the most.


The third generation of Apple chips is currently code-named Ibiza, Lobos, and Palma. The report reveals that TSMC may be able to produce 3nm chips in partnership with Apple as soon as 2023. They would likely first be found in Apple’s premium models, such as the next generation of MacBook Pro, but also in future models of the iPhone. Apple is also planning to release a less powerful version of the 3nm chip specifically for the MacBook Air.

Apple struck gold with its recent M1 Max and M1 Pro chips. Found inside the new 14-inch and 16-inch MacBook Pros, these powerful chipsets are performing excellently in benchmarks. It seems that Apple’s road map is quite clear: 2022 may bring the second generation of Apple silicon with smaller upgrades, and 2023 will be the year of a massive leap that has the chance to blow the competition out of the water.


Categories
Computing

AMD Zen 4D Could Use Hybrid Design to Rival Intel Alder Lake

YouTuber and leaker Moore’s Law is Dead revealed new information regarding AMD’s future architecture plans. According to leaks, AMD is working on a “dense” version of Zen 4 called Zen 4D. Zen 4D is basically a fork of Zen 4 that strips out features and reduces clock speeds.

It will also feature a newly designed cache system. All of this is to slightly reduce single-core performance in exchange for greatly increased multi-core performance. This would also allow AMD to increase the chip density, hence the “D” in the name.

If the leaks are true, it seems the company may be creating its own hybrid architecture to compete with the success of Intel’s 12th-gen Alder Lake chips. This follows in the footsteps of both Intel and Apple, who have utilized similar architectures in their respective CPU designs.

These Zen 4D processors would have about half the L3 cache of regular Zen 4 and feature 16 cores per chiplet. Moore’s Law is Dead stated that Zen 4D is expected to have simultaneous multithreading (SMT), though he couldn’t be 100% certain. He was also unsure whether Zen 4D would support AVX-512, but did confirm that Bergamo, AMD’s 128-core server-grade EPYC CPU slated for the second quarter of 2023, would feature the new architecture.

The new architecture for Zen 5 was also leaked, and this is by far the most interesting news. The leaks suggest that Zen 5 will be AMD’s first hybrid processor architecture. It would use eight Zen 5 “big” cores and up to 16 Zen 4D “little” cores. Zen 5 is also rumored to be codenamed Granite Ridge and to power the Ryzen 8000 series processors built on TSMC’s ridiculously tiny 3nm process.

As we’ve seen with Intel’s Alder Lake chips and Apple’s M1 Pro/Max CPUs, the hybrid approach can offer huge performance increases. It makes sense that AMD would architect its chips in a similar manner, as Zen 5 could offer a 20-25% IPC increase over Zen 4. The problem is that Zen 5 is still a few years out, and Alder Lake currently outperforms AMD’s best consumer chips.


Categories
AI

Google is using AI to design its next generation of AI chips more quickly than humans can

Google is using machine learning to help design its next generation of machine learning chips. The algorithm’s designs are “comparable or superior” to those created by humans, say Google’s engineers, but can be generated much, much faster. According to the tech giant, work that takes months for humans can be accomplished by AI in under six hours.

Google has been working on how to use machine learning to create chips for years, but this recent effort — described this week in a paper in the journal Nature — seems to be the first time its research has been applied to a commercial product: an upcoming version of Google’s own TPU (tensor processing unit) chips, which are optimized for AI computation.

“Our method has been used in production to design the next generation of Google TPU,” write the paper’s authors, co-led by Google research scientists Azalia Mirhoseini and Anna Goldie.

AI, in other words, is helping accelerate the future of AI development.

In the paper, Google’s engineers note that this work has “major implications” for the chip industry. It should allow companies to more quickly explore the possible architecture space for upcoming designs and more easily customize chips for specific workloads.

An editorial in Nature calls the research an “important achievement,” and notes that such work could help offset the forecasted end of Moore’s Law — an axiom of chip design from the 1970s that states that the number of transistors on a chip doubles every two years. AI won’t necessarily solve the physical challenges of squeezing more and more transistors onto chips, but it could help find other paths to increasing performance at the same rate.

Google’s TPU chips are offered as part of its cloud services and used internally for AI research.
Photo: Google

The specific task that Google’s algorithms tackled is known as “floorplanning.” This usually requires human designers who work with the aid of computer tools to find the optimal layout on a silicon die for a chip’s sub-systems. These components include things like CPUs, GPUs, and memory cores, which are connected together using tens of kilometers of minuscule wiring. Deciding where to place each component on a die affects the eventual speed and efficiency of the chip. And, given both the scale of chip manufacture and computational cycles, nanometer-changes in placement can end up having huge effects.

Google’s engineers note that designing floor plans takes “months of intense effort” for humans, but, from a machine learning perspective, there is a familiar way to tackle this problem: as a game.

AI has proven time and time again that it can outperform humans at board games like chess and Go, and Google’s engineers note that floorplanning is analogous to such challenges. Instead of a game board, you have a silicon die. Instead of pieces like knights and rooks, you have components like CPUs and GPUs. The task, then, is simply to find each board’s “win conditions.” In chess that might be checkmate; in chip design, it’s computational efficiency.

Google’s engineers trained a reinforcement learning algorithm on a dataset of 10,000 chip floor plans of varying quality, some of which had been randomly generated. Each design was tagged with a specific “reward” function based on its success across different metrics like the length of wire required and power usage. The algorithm then used this data to distinguish between good and bad floor plans and generate its own designs in turn.
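To make the idea concrete, here is a toy sketch in Python of the kind of reward signal described above: a floor plan scored on proxy metrics such as wirelength and power usage. The function name, metrics, and weights are all hypothetical illustrations, not Google’s actual formulation:

```python
# Toy reward function for floorplanning: higher (less negative) is better.
# All names, weights, and metrics are hypothetical illustrations.
def floorplan_reward(wirelength_km: float, power_watts: float,
                     w_wire: float = 1.0, w_power: float = 0.5) -> float:
    """Penalize long total wiring and high power draw."""
    return -(w_wire * wirelength_km + w_power * power_watts)

# A plan with shorter wires and lower power earns the larger reward,
# so the learning algorithm is pushed toward more efficient layouts.
plan_a = floorplan_reward(wirelength_km=30.0, power_watts=12.0)
plan_b = floorplan_reward(wirelength_km=45.0, power_watts=15.0)
assert plan_a > plan_b
```

In the real system the reward would combine many more metrics (timing, congestion, area), but the principle is the same: the scalar score is the only feedback the algorithm needs to rank one layout over another.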

As we’ve seen when AI systems take on humans at board games, machines don’t necessarily think like humans and often arrive at unexpected solutions to familiar problems. When DeepMind’s AlphaGo played human champion Lee Sedol at Go, this dynamic led to the infamous “move 37” — a seemingly illogical piece placement by the AI that nevertheless led to victory.

Nothing quite so dramatic happened with Google’s chip-designing algorithm, but its floor plans nevertheless look quite different to those created by a human. Instead of neat rows of components laid out on the die, sub-systems look like they’ve almost been scattered across the silicon at random. An illustration from Nature shows the difference, with the human design on the left and machine learning design on the right. You can also see the general difference in the image below from Google’s paper (orderly humans on the left; jumbled AI on the right), though the layout has been blurred as it’s confidential:

A human-designed chip floor plan is on the left, and the AI-designed floor plan on the right. The images have been blurred by the paper’s authors as they represent confidential designs.
Image: Mirhoseini, A. et al

This paper is noteworthy, particularly because its research is now being used commercially by Google. But it’s far from the only aspect of AI-assisted chip design. Google itself has explored using AI in other parts of the process like “architecture exploration,” and rivals like Nvidia are looking into other methods to speed up the workflow. The virtuous cycle of AI designing chips for AI looks like it’s only just getting started.

Update, Thursday Jun 10th, 3:17PM ET: Updated to clarify that Google’s Azalia Mirhoseini and Anna Goldie are co-lead authors of the paper.

Repost: Original Source and Author Link

Categories
Game

Design Adorable Vacation Homes In New Animal Crossing DLC

New paid DLC for Animal Crossing: New Horizons that gives players the opportunity to design vacation homes for the game’s various villagers has been announced during today’s Animal Crossing Nintendo Direct. The DLC, titled Happy Home Paradise, is set to launch on November 5, along with the game’s 2.0 update, and will cost players $25.

Happy Home Paradise will also be available as part of Nintendo’s recently revealed Nintendo Switch Online + Expansion Pack membership, along with access to Nintendo 64 and Sega Genesis titles.

In the expansion, players assist a resort developer called Paradise Planning by designing vacation homes tailored to villagers’ requests. Customization is king: players can change nearly every aspect of a home and the lot it sits on. They can resize the house itself and put up partition walls or pillars to section off spaces. Outside, players can change the house’s location and design, and edit the surrounding area with a new top-down editor.

Along with vacation homes, players will be able to redesign different facilities across the DLC’s archipelago. Whether players build a restaurant, a hospital, or a school, villagers will occupy and use each facility.

Once players have successfully designed a vacation home, they’ll receive payment in Poki, a new currency. Although Poki can only be spent on the archipelago, it buys rare furniture and other cosmetics, which can then be brought back to players’ main islands.

The new design techniques that players use on vacation homes can also be applied to homes on the main island, dramatically expanding how player homes can be customized. And for players who are tired of visiting their villagers’ ugly homes, once enough vacation homes have been designed, villagers may ask the player to give their house a makeover.


Categories
Game

Xbox Series S teardown reveals “brilliant” design and a Master Chief surprise

While there’s currently a lot of focus on the Xbox Series X and PlayStation 5, the Xbox Series S may just wind up being one of the biggest surprises of the new generation. The Xbox Series S is underpowered compared to the Xbox Series X and PlayStation 5, but its $300 price point and its ability to play current-gen games at a lower target resolution and framerate could make it an attractive buy for some gamers. Now, thanks to a new teardown, we’re getting to take a look inside the Xbox Series S.

The teardown was performed by Rich Leadbetter of the always-excellent Digital Foundry, though if you’re worried that a functioning Xbox Series S was broken down in this age of console shortages, you can breathe a sigh of relief. Leadbetter says at the outset that this is actually a console that was bricked in another experiment, so there is no functional hardware being destroyed here.

Leadbetter says that the teardown requires a Torx 8 security screwdriver to remove all of the screws in the system and notes that the back can be taken off with the removal of just two screws. With the back, lid, and internal chassis removed, it’s clear that the Xbox Series S uses up pretty much all the space in its small footprint. With the main fan removed, we see a modular power supply and a massive heatsink covering the console’s SoC. The two are big enough to cover up the entire Xbox Series S motherboard, and it’s on that power supply that we find the hidden Master Chief Easter egg.

We also get to take a peek at the tiny PCIe SSD that provides the Xbox Series S with its 512GB of storage. One of the most interesting parts of Leadbetter’s teardown is when he shows us the Xbox Series X SoC side-by-side with the Xbox Series S SoC. We also get a rare look at the SoC layout for both systems thanks to diagrams that Microsoft gave to Digital Foundry.

All in all, it’s a fascinating and illuminating look into the Xbox Series S, and we get to take that look without taking apart our own consoles. In fact, it’s a good idea to keep your own Xbox Series S intact because Leadbetter notes that once the cooling assembly is removed, the console will be irrevocably damaged. We’ve embedded the teardown above, so be sure to give it a watch.


Categories
AI

AI has become a design problem

AI is facing new calls for regulation now that it has emerged from the laboratory and is being deployed more widely across our daily lives. The public does not trust the technology. Nor should it. The problems are numerous: understanding how the models actually work, controlling the data that feeds AI, and addressing growing distrust in those who wield the technology. Even companies themselves are unsure of how to use AI safely and effectively as part of their business.

The bulk of the discussion is currently centered on engineering. It’s understandable: artificial intelligence is a mysterious black box to many people, one assumed to be fixable only through lines of code and better data. But I want to argue that in order to truly understand and begin to master this technology, we must get better at seeing AI as a component of larger systems and design accordingly. Fundamentally, AI is not just a technology problem; it has become a design problem.

Human-centered design has a vital role across three key areas: design thinking can help companies map their systems to understand how and where AI fits; design is needed to devise better tools to create, monitor, and manage AI; and design must create new interfaces centered on the kind of information AI delivers to users.

Design has a long-familiar practice around research and discovery (aka design thinking) to help effectively frame problems. And it can help companies understand what it takes to ensure their AI works as desired. Design teams now routinely create journey maps to show how a customer flows through all of a company’s touchpoints, as well as the external touchpoints, and how those collectively drive their experience. Similar methods can be used to map the flow of data, software, and decision-making within a company, covering not only the AI itself, but more holistically the larger systems that will influence the AI. This exercise can help companies begin to better understand what drives an AI to perform — or not. It’s complex, to be sure, but in short, AI is not an island. Any first step to ensuring ethical AI requires understanding all of the systems impacting it.

Right now, creating an AI system that contributes usefully to any given business is still the primary struggle. The data may be too raw, suspect, or shallow. The models may be unproven. And integrating AI into the rest of the business engine is difficult. Because of this, there is often not enough attention paid to higher-order goals: efficiency, accuracy, and business value. It’s a Wild West attitude: move fast and break things, and we’ll figure it out as we go.

Much of this attitude can be attributed to the early stages of how AI systems are created. The process is still very engineering-driven and, even then, requires a highly customized approach to each problem. In such situations, engineering teams tend to measure success by whether they can get the system to work, not by how well it fits its purpose.

Because of this, it is imperative to move the act of making things “up the stack.” That means creating tools that make the development of AI systems less of a raw engineering chore and more of a creative and operational task for the business itself. This is where design is critical. Tools must be designed to demystify the data, objects, and processes that make up AI so that subject-matter experts focused on business outcomes can participate in authoring these systems.

There are many analogies to draw from. Desktop publishing moved graphic design from a draftsman-and-camera-room specialty to a simple desktop tool anyone could use. The result was an explosion of contributors and a dramatic improvement in the quality of design overall. In software engineering, simplified tools like HTML and JavaScript have moved application and website development into the hands of people with intent and ideas rather than solely engineering skills. These people have more time and attention to focus on the quality of the work.

All the best data, model, and development practices in the world cannot fully guarantee perfectly behaved AI. In the end, good user interface design has to present AI to end users appropriately. An effective user interface can, for instance, tell the user the provenance of the AI’s insights, recommendations, and decisions. This gives the user agency in making sense of what the AI has to offer.

UI design also needs to evolve its art of presenting information. Historically, UIs presented data as matter of fact: common lists of data were not suspect; they simply regurgitated what was stored. But increasingly, the data presented is sourced, culled, and shaped by AI, and therefore carries the suspect nature of the AI’s curation. UI design must introduce new mechanisms that let users inspect data provenance and reasoning, along with visual cues that convey data confidence and bias to the user.

As we navigate the intricacies of a technology already integrated into many of our systems, we must design these systems in a responsible manner, mindful of transparency, privacy, and fairness. Design can frame AI-driven user experiences to end users in a manner that engenders trust and helps the end user understand the scope, strengths, and weaknesses of a given system. In turn, fear and mistrust are alleviated around the mysterious black boxes.

Trust is where the story ends — or begins. Better systems, tools, and interfaces will lead to AI that performs as designed and can be trusted. Because trust will be the final measure of effective and responsible AI systems.

Mark Rolston is Founder and Chief Creative Officer of argodesign, a global product design consultancy. He was previously Chief Creative Officer of frogdesign and has worked with such companies as Disney, Magic Leap, Dreamworks, Salesforce, GE, Microsoft, and AT&T. He currently serves as advisor to the Responsible AI Institute (RAI), working to define responsible AI with practical tools and expert guidance.
