
What are mouse jigglers? | Digital Trends

Along with the rise of remote work and hybrid workplaces, monitoring software, and business-collaboration options like Slack or Teams, another kind of technology has started to go mainstream — the mouse jiggler, also known as a mouse mover.

This odd little tech solution is being used to thwart some types of micromanagement and help employees manage time on their own terms, among other uses. Does it work? Are jigglers allowed in the workplace? Here's everything you should know.

What are mouse jigglers?

Mouse jigglers are technology, typically hardware, that simulate mouse activity on a computer screen by automatically moving the cursor. That, in turn, keeps the computer active and avoids states like sleep mode or a status indicator on an app switching to “away” or “inactive.” This is a big deal for users who have their computers monitored.

How are people using mouse jigglers?


The rise of remote work during the COVID-19 pandemic — among other factors — made many businesses worry that they could not physically track employees the way they used to. From installing mandatory monitoring software (“bossware”) to even hiring private investigators, employers started going to great lengths to find some way to keep an eye on employees. Others started using additional productivity measurements, such as active versus inactive statuses on company communication apps so employees who went inactive too frequently would face consequences.

Employees frequently disagreed. Many believed that “time on the computer” had little to do with actual productivity, especially if their tasks were completed in a timely manner. Others wanted employers to realize that work-from-home and other work situations required a more flexible approach and that measuring active status on a computer was a very poor indicator, unfair to those juggling their work/life balance. The Center for Democracy & Technology directly called the practice hazardous to employees’ health.

A solution existed to these tracking issues: mouse jigglers that simulated computer activity enough to keep status indicators active. In the past few years, their use has quickly grown among monitored employees. Activating these jigglers can be useful in situations like:

  • An employee has to step away for a bathroom break but doesn’t want surveillance software to think they aren’t working.
  • A remote work employee has to take a moment to take care of a child but resumes their task afterward.
  • An employee needs to take time to read a datasheet, whitepaper, or manual but doesn’t want their status to go inactive while they do so.
  • A user has to step away but doesn’t want their computer to go into sleep mode, which could interfere with current downloads or other important activities.

Are there different kinds of mouse jigglers?


Yes, there are two main types. One is a dock-like device that you plug into a power source and rest beside your mouse. When you step away from the keyboard, you place your mouse on this dock, and it manipulates the mouse's optical sensor so the cursor keeps moving slowly across the screen.

The second kind is a USB plugin, similar to the wireless receiver that many mice use, except it includes firmware that wiggles the mouse cursor around when the mouse isn't being used. These tend to be more affordable and take up less space but may be more detectable.

There are also apps that can simulate mouse movement, but these are less common.
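On the software side, a few lines of scripting are all it really takes. The snippet below is a minimal sketch in Python, assuming the third-party pyautogui package is installed; it nudges the cursor a few pixels back and forth on a timer, which is essentially what jiggler apps do.

```python
# Minimal mouse-jiggler sketch (assumes the third-party pyautogui package).
# Nudges the cursor a few pixels every 30 seconds to keep the system "active".
import time

import pyautogui

JIGGLE_INTERVAL = 30   # seconds between nudges
JIGGLE_DISTANCE = 5    # pixels to move each way

def jiggle_forever():
    while True:
        pyautogui.moveRel(JIGGLE_DISTANCE, 0, duration=0.2)   # small move right
        pyautogui.moveRel(-JIGGLE_DISTANCE, 0, duration=0.2)  # and back again
        time.sleep(JIGGLE_INTERVAL)

if __name__ == "__main__":
    jiggle_forever()  # stop with Ctrl+C
```

Keep in mind that on a managed work machine, even a small script like this can show up in a software inventory, which is one reason the hardware versions exist.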

Do they actually work?

They do what they are intended to do: They keep status lights green and keep screensavers and sleep modes from activating.

However, they can’t imitate actual work. They can’t fool keystroke tracking, which some of the most invasive bossware may use, and they can’t respond to callouts you may receive on videoconferencing apps like Zoom or WebEx.

Are mouse jigglers legal? Will I get in trouble using one?

There’s no law against using a mouse jiggler. But a workplace that implements bossware or rules about status indicators is likely to have a problem with mouse jigglers, too. That could put your job in jeopardy if someone finds out you are using one, and at least one story of someone being caught has gone viral.

Jigglers and movers are generally hard to detect unless you are using a work computer, which may be monitored for extra peripherals or additional software. That's why some people prefer the dock-style mouse jigglers that can be plugged into a separate power source.

We’re still early on in this arms race, and it’s uncertain just how hard employers will pursue technology to detect mouse jigglers and similar solutions (along with the requisite IT resources). For now, it’s something that can vary greatly from company to company.

What are some popular mouse jigglers?

Tech8 has a colorful mouse mover with the dock design for $30 that you may be interested in. Vaydeer sells a similar model with a different design and also sells a $14 USB version that it claims won’t be detected as an unknown USB device. If you’re willing to pay a bit more, Liberty Mouse Mover has a larger, fourth-generation Mouse Mover for $50 that uses a microprocessor to create random mouse movements with the ability to adjust the parameters.



Why AI and digital twins could be the keys to a sustainable future



Digital twins aren’t new, but AI is supercharging what they can do. Together, they are transforming how products are designed, manufactured and maintained. The combination of the technologies provides forensic insights into our increasingly complex and interconnected world.

By deploying digital twins and AI, organizations obtain granular insights into their operations, enabling them to achieve significant benefits spanning cost savings, efficiency gains and improved sustainability efforts. Product quality is also enhanced through a reduction in defects and the accelerated resolution of issues throughout the lifecycle. In addition, innovation increases through more frequent and comprehensive development.

Gartner defines a digital twin as “a digital representation of a real-world entity or system. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.” AI enhances digital twins, enabling the technology to look at what-if scenarios and run simulations, providing previously unavailable insights. This improved situational awareness of cause and effect supports more agile and sustainable decision-making.

The ESG imperative

Digital twins are not only helping optimize operations; they play a pivotal role in enabling organizations to realize their environmental, social and governance (ESG) goals. Research from Capgemini found that 57% of organizations believe digital twin technology is critical to improving sustainability efforts. The digital twin provides a way to model and understand how to reduce energy consumption and emissions so organizations can test scenarios to reach sustainability and climate goals. And with sustainability a global imperative, this will accelerate adoption, particularly as AI is increasingly used to augment digital twins.


So, let’s look at how digital twins and AI are helping improve sustainability in various settings.

Smart cities 

Digital twins and AI will play a vital role as cities strive to reduce their environmental impact. Together they can create a virtual emulation to help planners understand how to reduce congestion, emissions, pollution and other challenges by analyzing data from various sources and testing different variables in the virtual model.

One city pioneering this approach is Las Vegas, which uses the technology to model future energy needs, emissions, parking, traffic and emergency management. IoT sensors collect data from cars, charging networks and municipal infrastructure to model and scenario plan. City officials will use the insights garnered to inform ESG policies and priorities.

As more cities across the globe focus on becoming carbon neutral, digital twins and AI provide a way to model and process vast volumes of data from disparate sources so municipalities can fully understand how different decisions and policies will impact strategic climate goals.

Smart industry

In industrial settings, digital twins provide manufacturers with a way to understand how to optimize their operations and improve sustainability. For example, the simulation can identify potential pain points, highlight where energy loss is occurring and highlight opportunities to reduce consumption. The AI algorithm can process data, recognize patterns and predict future outcomes far beyond human cognitive abilities. In addition, the virtual simulation reduces the waste and power associated with building physical prototypes.
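As a loose illustration of that pattern (not any vendor's implementation), the Python sketch below compares measured energy use per production station against a modeled baseline and flags likely energy-loss hotspots. The station names and numbers are invented; a real digital twin would be fed by live sensor data and far richer physics and AI models.

```python
# Toy "digital twin" check: compare measured energy use per station against a
# modeled baseline and flag likely energy-loss hotspots. Values are invented.

BASELINE_KWH = {"press": 120.0, "weld": 95.0, "paint": 160.0, "assembly": 60.0}

def flag_energy_loss(measured_kwh, tolerance=0.10):
    """Return stations whose measured draw exceeds baseline by more than tolerance."""
    flags = {}
    for station, baseline in BASELINE_KWH.items():
        actual = measured_kwh.get(station, 0.0)
        excess = (actual - baseline) / baseline
        if excess > tolerance:
            flags[station] = round(excess * 100, 1)  # percent over baseline
    return flags

# Example reading from (hypothetical) shop-floor sensors:
reading = {"press": 131.0, "weld": 96.0, "paint": 201.0, "assembly": 61.0}
print(flag_energy_loss(reading))  # {'paint': 25.6} -> investigate the paint line
```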

By creating an emulation of a production line, manufacturers can understand how to make changes at every stage that reduce environmental impact and improve efficiency — thereby increasing cost savings. Unilever tested the technologies at one site and realized savings of $2.8 million by reducing energy consumption and achieving an uptick in productivity.

These are just a handful of examples that underscore how AI and digital twins are ushering in a new era of intelligent manufacturing.

Smart buildings

Another area where digital twins are aiding sustainability efforts is in creating smart buildings. With increasing regulation aimed at designing greener buildings, the construction industry needs a way to scenario plan to reduce the environmental impact and minimize energy consumption before any ground is broken.

The digital model enables infrastructure owners to utilize resources better, address human needs, and make decisions that support a more sustainable built environment. Better resource planning is now possible by tapping into data from various sources. To provide perspective on the impact, Accenture estimates that energy consumption in buildings can be reduced by 30% to 80% using virtual twin technologies.

As digital twin adoption and intelligent technologies become increasingly pervasive, they will enable better decisions that support a more circular, less carbon-intensive economy, ultimately creating a more sustainable planet.

Cheryl Ajluni is IoT solutions lead at Keysight Technologies.



Nvidia advances digital twins for retail, rail and telco



The concept of digital twins is all about modeling the physical world in the metaverse, enabling humans as well as AI to make better decisions.

Building digital twins requires both hardware and software. At the Nvidia GTC conference today, the company announced its next-generation OVX computing systems to help power metaverse applications, including Nvidia's Omniverse. The new OVX systems are powered by eight Nvidia L40 GPUs and integrate the ConnectX-7 SmartNIC for high-speed networking and storage.

In a press briefing, Richard Kerris, vice president of Omniverse at Nvidia, said that the new OVX systems have been designed for building complex industrial digital twins.

“The new OVX systems are designed to build virtual worlds using leading 3D software applications from our many software partners to be able to operate immersive digital twin simulations in Nvidia Omniverse enterprise, which is a scalable end-to-end platform enabling enterprises to build and operate metaverse applications,” Kerris said.



Digital twins come to telco

“Omniverse extends and enhances existing workflows across industries, bringing AI superpowers to multi-trillion dollar industries across telecommunications, transportation, retail, energy, media and entertainment and more,” Kerris said.

Among the industries that are embracing the digital twin concept is telecommunications. At GTC, Nvidia announced that Heavy AI is using Nvidia’s Omniverse digital twin technology to help telcos optimize 5G cellular networks. Kerris said that Heavy AI made use of digital twins to help Charter Communications with its network deployment.

“Heavy AI is an AI data analytics company that built an AI-accelerated application framework on Omniverse, which enables telcos to develop physically accurate interactive digital twins to plan, build and operate 4G and 5G networks at nationwide scale,” Kerris said.

Digital twins in all the aisles at home improvement retailer Lowe’s

Nvidia is also using GTC as a venue to highlight digital twin adoption by Lowe’s, which is one of the world’s largest home improvement retailers with over 2000 stores and over 300,000 retail associates. 

“Lowe’s is now using Omniverse as their platform to design, build and operate digital twins of their stores to optimize operations and enhance the shopping experience,” Kerris said.

Lowe’s store associates can now use augmented reality headsets to see what’s on the shelves and the current status of inventory levels. The digital twin also helps with store planning to make sure it’s as easy as possible for consumers to get what they need.

Riding the digital twin rails with Deutsche Bahn

Another industry use case for digital twins that Nvidia is talking about at GTC is in transportation with Deutsche Bahn.

Kerris said that Deutsche Bahn is the second-largest transport company in the world and the National Railway of Germany. Deutsche Bahn is using Omniverse to build and operate digital twins of over 5700 stations and over 33,000 kilometers of track. Omniverse is also being used for capacity optimization, as they’re using the digital twin to train and validate AI models that can continuously monitor the railways and trains to recognize hazards and situations that could affect network operations. 

“Deutsche Bahn expects to increase capacity and efficiency of the railway and reduce its carbon footprint without building any new tracks,” Kerris said.



Gartner predicts ‘digital twins of a customer’ will transform CX



Digital twins of physical products and infrastructure are already transforming how companies design and manufacture products, equipment and infrastructure. In its latest Immersive Hype Cycle, Gartner predicts that digital twins of a customer (DToC) could transform the way enterprises deliver experiences. Simulating a customer experience (CX) is a bit more nuanced than a machine — and there are privacy considerations to address, not to mention the creepiness factor. Though if done right, Gartner predicts the DToC will drive sales while delighting customers in surprising ways. 

Gartner has a nuanced view of the customer, including individuals, personas, groups of people and even machines. It is worth noting that many enterprise technologies are moving toward this more comprehensive vision. Customer data platforms consolidate a data trail of all aspects of customer interaction. Voice-of-the-customer tools help capture data from surveys, sensors and social media. Meanwhile, customer journey mapping and customer 360 tools analyze how customers interact with brands across multiple apps and channels.

The critical innovation point of DToC is that it helps contextualize data to help understand what customers really need to improve the overall experience, Gartner VP analyst Michelle DeClue-Duerst told VentureBeat. For example, a hotel with knowledge about a customer’s gluten allergy might identify nearby gluten-free restaurants and only stock the minibar with snacks the customer will enjoy. 

When done right, DToCs can help business teams design ways to serve or capture customers and facilitate new data-driven business models. They will also improve customer engagement, retention and lifetime value.


Developing core capabilities

Gartner notes that DToC implementations are still embryonic, with about 1% to 5% penetration of the target audience. At the same time, enterprises have been busy finding ways to get the most value from their investment using various marketing analytics tools.

Subha Tatavarti, CTO of Wipro, told VentureBeat there have been several important milestones in using tools to simulate customers and improve experiences. The most notable have been the ability to define customer experience transformation objectives; the capability to identify and assess the relevant data assets, personas and processes; and tools for building and testing behavior models. New ModelOps approaches for integrating, monitoring and enhancing the models are also advancing the field.

“A new generation of recommendation systems based on intention, context and anticipated needs is a very exciting development in combined modeling and simulation capabilities,” Tatavarti said. “Personalized learning and hyper-personalized products are great advancements and personalized healthcare will have critical impacts on that industry.”

Enterprises are taking advantage of new identity resolution capabilities that assemble pieces of data to create a holistic view of the customer. This stitching can help a company understand what an individual customer buys, how frequently they purchase, how much they spend, how often they visit a website and more. 
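As a rough picture of what that stitching looks like, the minimal Python sketch below merges records from two systems into a single customer view by matching on a normalized email address. Real identity-resolution systems use probabilistic matching across many more signals; all field names and values here are invented.

```python
# Toy identity resolution: stitch records from separate systems into one
# customer profile keyed on a normalized email address. Data is invented.
from collections import defaultdict

crm_records = [{"email": "Ana.Diaz@Example.com", "purchases": 4, "total_spend": 310.0}]
web_records = [{"email": "ana.diaz@example.com ", "site_visits": 12}]

def normalize(email):
    return email.strip().lower()

def stitch(*sources):
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            key = normalize(record["email"])
            profiles[key].update({k: v for k, v in record.items() if k != "email"})
    return dict(profiles)

print(stitch(crm_records, web_records))
# {'ana.diaz@example.com': {'purchases': 4, 'total_spend': 310.0, 'site_visits': 12}}
```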

“Without identity resolution, the company may have to rely on only some of the attributed data sources to fill out the digital persona, meaning the simulation would be somewhat inaccurate,” said Marc Mathies, senior vice president of platform evolution at Vericast, a marketing solutions company.

Bumpy road

Enterprises will need to address a few challenges to scale these efforts. Gartner observed that privacy and security concerns could lengthen the time it takes DToCs to mature and increase regulatory risks. Organizations must also build teams familiar with machine learning and simulation techniques. 

Tatavarti said the most difficult obstacles are the quality and availability of customer data from physical and digital interaction and data sharing between multiple organizations. These challenges will also involve privacy considerations and the ability to connect physical systems and virtual models without affecting the experience or performance. Teams also need to ensure the accuracy of the models and eliminate bias.

Bill Waid, chief product and technology officer at FICO, a customer analytics leader, told VentureBeat that another challenge in implementing digital twins for customer simulation is the impact of localized versus global simulation. Frequently, teams only simulate subsegments of the decision process to improve scale and manageability. Enterprises will need to compose these digital twins for more holistic and reusable simulations.

Organizations will also need to be transparent. 

“Initially, it will be hard to convince customers they need a digital twin that your brand stores and that the customer should help create it to improve their experience,” said Jonathan Moran, head of MarTech solutions marketing at SAS.

Building the right foundation

Industry leaders have many ideas about how enterprises can improve these efforts. 

Unlike digital twins in areas like manufacturing, customer behavior shifts quickly and often. Karl Haller, partner at IBM Consulting, said it is therefore essential to implement ongoing optimization and calibration, analyzing the simulation results to determine ways to improve the performance of the models. He also recommends narrowly defining the focus of a customer simulation to optimize outcomes and reduce costs. Innovations in natural language processing, machine learning, object and visual recognition, acoustic analytics and signal processing could help.

Moran recommends enterprises develop synthetic data generation expertise to build and augment virtual customer profiles. These efforts could help expand data analytics and address privacy considerations.
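A minimal sketch of that idea, using nothing beyond Python's standard library: generate synthetic customer profiles that models can be built and tested against without touching real personal data. The attributes and distributions are invented for illustration.

```python
# Generate synthetic customer profiles for model building and testing.
# Attribute names and distributions are invented for illustration only.
import random

SEGMENTS = ["bargain hunter", "loyalist", "window shopper"]

def synthetic_profile(profile_id):
    return {
        "id": f"synth-{profile_id:05d}",
        "age": random.randint(18, 80),
        "segment": random.choice(SEGMENTS),
        "monthly_visits": max(0, int(random.gauss(6, 3))),
        "avg_basket_value": round(random.uniform(10, 250), 2),
    }

synthetic_cohort = [synthetic_profile(i) for i in range(1000)]
print(synthetic_cohort[0])
```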

Mark Smith, vice president of digital engagement solutions at CSG, recommends businesses overlay voice-of-the-customer data with behavioral data captured through customer journey analytics. This modeling method is typically the fastest and most accurate route to understanding the peaks and valleys of the customer journey.

“Comparing customers’ actual actions with their reported lived experience data unearths disconnects between customers’ perception of the experience and brands’ analysis of their own offerings,” Smith said. 

A mixed future 

Eventually, enterprises will need to find ways to optimize for profits along with customer well-being. Eangelica Germano Aton, product owner at Gryphon AI, a conversational intelligence platform, predicts that things will initially get worse for people as machines get better at predicting choices that reduce emotional well-being.

“I think it will take a customer-driven or a bottom-up revolution and rejection of the current model before a more sophisticated and genuinely humanist AI can emerge that doesn’t maximize such a shallow objective function as profit,” Germano Aton said. 

Others are more optimistic. 

“Over time, it will be possible to use a deep understanding of the customer in a way that creates value for the consumer, the brand and the employees of the brand,” said Chris Jones, chief product officer at Amperity, a customer data platform (CDP) vendor. “One of the things we are observing is the ability of these capabilities to deepen the human connection between brands and the customers they serve by empowering employees across the brand to truly see their customer and provide the most personalized experience possible.”

In the long run, digital twin capabilities could become embedded into marketing and customer experience automation tools.

“As digital twin work moves more into marketing and CX in five to ten years, I think we will see solutions with more simulation capabilities built in,” Moran said. “Any type of marketing KPI and expected results will be simulated within the tool. Vendors already have some simulation capabilities for optimization, reinforcement learning and predictions, but I think this will start to increase even more in the coming years.”



How to install Ubuntu | Digital Trends

Ubuntu is one of the most popular Linux distributions in the world and has been for a decade. It is easy to use and pleasing to look at, with a straightforward user interface and a streamlined installation. The Ubuntu Software Store makes Linux repositories easy to use, like downloading apps from the Google Play Store or Apple App Store. It’s also completely free to use, like most Linux distros, and learning how to install Ubuntu on a Windows computer doesn’t take long.

There are three installation types to choose from, so you need to know how you want to use Ubuntu. For example, you can use it inside a Windows virtual machine, or you can use Bash features directly from Windows. But those are for very specific use cases. For our purposes, we’re going to assume you want to download a fresh copy of Ubuntu directly onto your computer for everyday use. Here’s how you can do that.



Meet Ubuntu

The world's most popular Linux distro is created by Canonical, a UK-based company founded by South African entrepreneur Mark Shuttleworth. While Ubuntu is free to install and use for individuals, organizations can pay for support and for a host of other Ubuntu-based products, such as server software.

Ubuntu gets a new release every six months and a major long-term support (LTS) release every two years. Each version carries an alliterative animal codename. For example, the latest LTS is Ubuntu 22.04 “Jammy Jellyfish.” It was released in spring 2022, and Canonical will provide standard support for it for five years.

Ubuntu uses the GNOME desktop environment, which is the underlying user interface you see when you use Ubuntu. GNOME keeps the desktop clean and uncluttered and allows you to launch applications simply by clicking an icon rather than typing commands into the terminal. GNOME also gives you a Mac-like bar along the top of your screen where you can see the date and time, access app-specific files and options, and use other useful features.

However, if you want to get more nerdy, you can. After all, Ubuntu is a Linux distro, which means the terminal is never far away. The terminal lets you tweak deeper settings and opens up the full power of Linux. For instance, you can use it to convert a batch of images into PDFs at once or to create nested file directories deep in the system.


Download Ubuntu

The first step to installing Ubuntu is to download Ubuntu. It comes as a single ISO image that is a few gigabytes in size.

Step 1: Head to the Ubuntu desktop site on your favorite browser.


Step 2: Select Download.


Step 3: You will find the downloaded ISO in your downloads folder.
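Optionally, it's good practice to verify that the ISO downloaded intact before writing it to a USB stick. Canonical publishes SHA-256 checksums for each release alongside the downloads; the short Python snippet below (standard library only) computes the checksum of a local file so you can compare it against the published value. The file name shown is just an example; adjust it to match your download.

```python
# Compute the SHA-256 checksum of the downloaded ISO (standard library only).
# Compare the printed value against the checksum published on ubuntu.com.
import hashlib

def sha256_of(path, chunk_size=1024 * 1024):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example file name; adjust to wherever your browser saved the ISO.
print(sha256_of("ubuntu-22.04-desktop-amd64.iso"))
```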

Format your USB stick

Once you’ve downloaded Ubuntu to your computer, you’ll need to create a live USB stick. This is what your computer will use to install and run Ubuntu the first time around.

You’ll need to install a free third-party USB formatting program to do this, but don’t worry, these are small applications and they’re extremely easy to use. You can delete it afterward if you don’t want to keep it.

We recommend using Rufus for Windows. Here’s how.

Step 1: Download and install Rufus.


Step 2: Launch Rufus once it is installed.

Step 3: Choose your USB drive from the Device menu if it hasn’t auto-populated.


Step 4: Choose Select, to the right of the Boot Selection menu.


Step 5: Select the Ubuntu ISO file you downloaded.


Step 6: The Volume Label menu will update to show your version of Ubuntu. Leave the other settings as is.

Step 7: Select Start.

Step 8: Wait while Rufus writes the Ubuntu ISO to your USB. You can see the progress in the status bar at the bottom. It should take 10 minutes or less.

Step 9: Your USB is ready when the Status bar is green and displays “Ready.”

Replace your OS with Ubuntu

Next, you’re going to install Ubuntu on the computer or laptop of your choice. You can choose to partition your hard drive and dual-boot Ubuntu or to completely overwrite your current operating system and use only Ubuntu.

Step 1: Insert the USB thumb drive with your live Ubuntu ISO into the computer.

Step 2: Restart the device.

The computer should recognize the USB thumb drive and boot from it automatically. If it doesn't, you will need to choose the boot device yourself.

With the thumb drive inserted, restart your computer, then hold down the boot-menu key during startup (F12 on many machines, though some use F2, Esc, or Del) and select your USB device from the list.

Step 3: Choose Install Ubuntu.

You can also test-drive Ubuntu directly from the USB stick by choosing Try Ubuntu. However, this live session is limited and won't save any of your changes.


Step 4: Choose your language, and select Continue.


Step 5: Select your preferred installation setup.

We recommend choosing Normal installation unless you are using an extremely limited computer.

Select Download updates while installing Ubuntu and Install third-party software for graphics and Wi-Fi hardware and additional media formats.

Select Continue.


Step 6: The Installation Type menu allows you to choose between overwriting your current system with Ubuntu and dual-booting Ubuntu from a partitioned drive.

Select Erase disk and install Ubuntu, followed by Install Now.


Step 7: Choose your location.

Press Continue.


Step 8: Input your details. Add your name, your computer name (make this up), and pick a username. You’ll also need to create a password for logging into Ubuntu and some Ubuntu services. Keep it secure but easy to remember.

You can then select to log in to your computer automatically or require a password each time you use it. Ignore the Use Active Directory option for now.

Select Continue.


Step 9: Ubuntu will now complete the installation.


Step 10: Select Restart now when the installation is finished.


Step 11: Congratulations! You just installed Ubuntu!

Install Ubuntu alongside your operating system

If you don't want to completely erase your current OS, be it Windows, macOS, or another Linux distro, you'll need to install Ubuntu on a partitioned drive. Thankfully, the Ubuntu installer makes it easy to do this.

For starters, you'll need to go through steps one to six in the “Replace your OS with Ubuntu” section above. However, when you get to the Installation Type menu, pick up from here and we'll walk you through the rest.

Step 1: From the Installation type menu, select Something else and click Install now.


Step 2: Select New partition table on the next screen.


Step 3: Select Continue on the pop-up.


Step 4: Double-click on Free space to create a new partition.

A new partition window pop-up will appear.

Step 5: In the Create partition menu, set the size to 1024MB.

Set the type for new partitions to Primary.

Set the location for the new partitions to Beginning of this space.

In the Use as menu, select Ext4 Journaling file system.

In the Mount point menu, select /boot.

Select OK.


Step 6: Double-click on Free space to create a home partition.

Leave the free space as is.

Set the type for new partitions to Primary.

Set the location for the new partitions to Beginning of this space.

In the Use as menu, select XFS journaling file system.

In the Mount point menu, select /home.

Select OK.


Step 7: Double-click on Free space to create a root file system.

Set the type for new partitions to Primary.

Set the location for the new partitions to Beginning of this space.

In the Use as menu, select XFS journaling file system.

In the Mount point menu, select /.

Select OK.


Step 8: Double-click on Free space to create a swap partition to help when RAM fills up.

Set the size to 4096MB.

Set the type for new partitions to Primary.

Set the location for the new partitions to Beginning of this space.

In the Use as menu, select Swap area.

Select OK.


Step 9: Now that you’ve got a partition scheme set up, you can complete your Ubuntu install.


Step 10: Choose your location.

Press Continue.


Step 11: Input your details.

Add your name, your computer name (make this up), and pick a user name.

You’ll also need to create a password for logging into Ubuntu and some Ubuntu services. Keep it secure but easy to remember.

You can then select to log in to your computer automatically or require a password each time you use it.

Ignore the Use Active Directory option for now.

Select Continue.


Step 12: Ubuntu will now complete the installation. Hit Restart now when the installation is complete.


Congratulations! You have successfully installed Ubuntu and can now dive into a smooth and fast OS with the ability to handle almost anything you throw at it. Interested in trying other distros? Here’s a list of our favorite Linux distros.



Why healthcare industry leaders need to prioritize digital equity

Presented by Optum


Barriers like broadband access, digital literacy, language, disabilities and more shut many out of the healthcare system. Watch now to learn why it’s urgent for industry leaders to close the gaps, and come away with a plan to identify and eliminate the digital challenges your customers face.

Watch free on demand here.


Healthcare is facing a new frontier, says Tushar Mehrotra, senior vice president, analytics at Optum. In just a few years, the industry has seen a boom in digital health tools and technologies on both the patient and provider side, along with an explosion of health data, which has been driving increasingly sophisticated predictive and prescriptive insights into individuals and populations.

Unfortunately, this frontier has proven to be hostile to marginalized communities. There is a growing digital divide, where healthcare technology has actually posed challenges, instead of benefits. The barriers to accessing newly digitized care are legion: it’s everything from language barriers to low income, lack of broadband or mobile access, disabilities and physical differences, low digital literacy, a fully understandable mistrust of the healthcare system and much more. The danger is that this divide will continue to grow, and even become insuperable.

“As we continue to advance healthcare technology and drive innovation in the space, we’ll see tremendous benefits — but it has to be done in a way where we’re thoughtful about implications and consumption across communities,” Mehrotra says. “The challenge is to reach all consumers without exacerbating the disparities that exist in our communities today.”

In other words, putting what he calls techquity front and center. Mehrotra describes techquity as using advancements in healthcare technology to drive health equity in underserved, vulnerable and at-risk populations, and close the access gaps.

Healthcare industry leaders are responsible for driving the techquity movement – it’s not only an ethical consideration, but also offers a number of advantages for consumers and organizations alike.

The real-world benefits of techquity

On the consumer side, techquity can change – or save — a person’s life. It unlocks new ways to drive health outcomes, safety and healthcare decisions, and enables the right care at the right point in time, in a way that wasn’t possible in the past. Access to healthcare technology and knowledge creates transparency into the system, enabling more choices for consumers navigating treatment.

But there are tremendous benefits for organizations as well. Techquity opens up innovation for organizations, promoting new ways of thinking, new avenues of exploration, and possibilities. It builds valuable trust between an organization and a customer, and opens up access to new potential customers that have previously been unreachable, or even invisible, in the past.

“As leaders we need to help consumers understand why it’s essential for their healthcare outcomes to stay on the digital landscape, and help them get comfortable with it,” he says. “If you want to reap the potential of healthcare technology, fundamentally change the industry, and drive adoption, it’s going to be important to be a trusted partner for consumers navigating this new world.”

Why techquity rests in the hands of the C-suite

Techquity starts at the top, Mehrotra says.

“It’s important for a healthtech leadership team or an organization to really understand that you can build and design tools and technologies that are relevant for anyone in the population,” Mehrotra says. “We have influence, if we set up our product teams and tech teams in a way that we haven’t maybe thought about in the past. That’s why it’s important to treat this as a C-suite-level topic.”

For organizations, it’s about fundamentally changing their approach to building technology, doing the right research and market testing, and incorporating that equitable approach into designing, building and launching products. If this is not a top-team agenda item, then it isn’t going to be funneling down to the technology or product or design teams.

“If leadership isn’t there, you’ll run into challenges in terms of making sure it disseminates and is incorporated into your organizational approach,” he says.

But the biggest challenge is finding ways to address the fear and concern of the highest-risk consumers, who are in danger of being separated even further from access to healthcare. Leadership must take point on this effort too.

“There has to be a willingness, a persistence, a focus, and a commitment of resources in an organization, one, to understand that this is important, and two, to understand the implications of it,” he says. “There has to be proactive outreach to those communities. Unless you have that outreach — the partnerships in the local community to drive education, drive understanding — you’re not going to get the change in behavior.”

To learn more about the dangers of healthcare inequity, why industry leaders should care, how your organization can address your customer’s digital divide, and more, don’t miss this VB On-Demand event.


Watch free on demand here!


Agenda

  • How to build a data-driven map that identifies the health literacy, digital access and social determinants that impact digital engagement and outcomes
  • How to align your efforts with the cultural, social and economic environments experienced by the people you serve
  • Ideas for addressing the root causes that create barriers to health — and where simple digital solutions can close gaps
  • How to offer simple choices to ensure a consumer’s digital experience is consistent across the health journey

Presenters

  • Duncan Greenberg, VP of Product, Oscar Health
  • Michael Thompson, VP, Chief of Staff, Systems Improvement, Bassett Healthcare Network
  • Tushar Mehrotra, SVP, Data & Analytics, OptumInsight
  • John Li, Senior Director, Clinical Analytics and Product Solutions, Optum



Death, resurrection and digital immortality in an AI world



I have been thinking about death lately. Not a lot — a little. Possibly because I recently had a month-long bout of Covid-19. And, I read a recent story about the passing of the actor Ed Asner, famous for his role as Lou Grant in “The Mary Tyler Moore Show.” More specifically, the story of his memorial service where mourners were invited to “talk” with Asner through an interactive display that featured video and audio that he recorded before he died. The experience was created by StoryFile, a company with the mission to make AI more human. According to the company, their proprietary technology and AI can match pre-recorded answers with future questions, allowing for a real-time yet asynchronous conversation.

In other words, it feels like a Zoom conversation with a living person.

This is almost like cheating death. 

Even though the deceased is materially gone, their legacy appears to live on, allowing loved ones, friends, and other interested parties to “interact” with them.  The company has also developed these experiences for others, including the still very much alive William Shatner. Through this interactive experience, I asked Shatner if he had any regrets. He then “spoke” at length about personal responsibility, eventually coming back to the question (in Shatner-like style). The answer, by the way, is no. 

William Shatner introduces StoryFile. Source: https://www.youtube.com/watch?v=HVPmGbynBrw 

There are other companies developing similar technology such as HereAfter AI. Using conversational AI, the company aspires to reinvent remembrance, offering its clients “digital immortality.” This technology evolved from an earlier chatbot developed by a son hoping to capture his dying father’s memories.


It is easy to see the allure of this possibility. My father passed away ten years ago, shortly before this technology was available. While he did write a short book containing some of his memories, I wish I had hours of video and audio of him talking about his life that I could query and both see and hear the responses in his own voice. Then, in some sense, he would seem to still be alive. 

This desire to bring our deceased loved ones “back to life” is understandable as a motivation and helps to explain these companies and their potential. Another company is ETER9, a social network set up by Portuguese developer Henrique Jorge. He shared the multi-generational appeal of these capabilities:  “Some years from now, your great-grandchildren will be able to talk with you even if they didn’t have the chance to know you in person.”

How can you talk to dead people?

In “Be Right Back,” an episode from the Netflix show “Black Mirror,” a woman loses her boyfriend in a car accident and develops an attachment to an AI-powered synthetic recreation. This spoke to the human need for love and connection.

In much the same way, a young man named Joshua who lost his girlfriend Jessica to an autoimmune disease recreated her presence through a text-based bot developed by Project December using OpenAI’s GPT-3 large language transformer. He provided snippets of information about Jessica’s interests and their conversations, as well as some of her social media posts. 
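Project December's internals aren't public, so the Python snippet below is only a rough, hypothetical sketch of the general technique: it assembles a persona prompt from a handful of remembered details and hands it to a placeholder generate_reply() function standing in for whatever large language model sits underneath. Every name and detail here is invented for illustration.

```python
# Hypothetical sketch of grounding a "persona" chatbot in remembered details.
# generate_reply() is a placeholder for a call to a large language model.

persona_facts = [
    "Loved hiking and old science-fiction films.",
    "Always signed off messages with 'talk soon'.",
    "Studied marine biology.",
]

def build_prompt(facts, conversation, user_message):
    persona = " ".join(facts)
    history = "\n".join(conversation)
    return (
        f"You are role-playing a person with these traits: {persona}\n"
        f"Conversation so far:\n{history}\n"
        f"User: {user_message}\nPersona:"
    )

def generate_reply(prompt):
    # Placeholder: a real system would send the prompt to a language model here.
    return "(model response would appear here)"

print(generate_reply(build_prompt(persona_facts, [], "Hey, it's me.")))
```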

The experience for Joshua was vivid and moving, especially since the bot “said” exactly the sort of thing the real Jessica would have said (in his estimation). Moreover, interacting with the bot enabled him to achieve a kind of catharsis and closure after years of grief. This is more remarkable since he had tried therapy and dating without significant results; he still could not move on. In discussing these bot capabilities, Project December developer Jason Rohrer said: “It may not be the first intelligent machine. But it kind of feels like it’s the first machine with a soul.”

It likely will not be the last. For example, Microsoft announced in 2021 that it had secured a patent for software that could reincarnate people as a chatbot, opening the door to even wider use of AI to bring the dead back to life.

In an AI-driven world, when is someone truly dead?

“We’ve got to verify it legally

To see if she is morally, ethically

Spiritually, physically

Positively, absolutely

Undeniably and reliably dead!” 

–Munchkinland scene — “Wizard of Oz”

In the novel “Fall; or, Dodge in Hell,” author Neal Stephenson imagines a digital afterlife known as “Bitworld” contrasting the here and now of “Meatworld.” In the novel, the tech industry eventually develops the ability to map Dodge’s brain through precise scanning of the one hundred billion neurons and seven hundred trillion synaptic connections humans have, upload this connectome to the cloud and somehow turn it on in a digital realm. Once Dodge’s digital consciousness is up-and-running, thousands of other souls who have died in Meatworld join the evolving AI-created landscape that becomes Bitworld. Collectively, they develop a digital world in which these souls have what appears as consciousness and a form of tech-fueled immortality, a digital reincarnation.
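For a rough sense of scale, here is a back-of-the-envelope estimate using the figures above and an invented assumption of a few bytes per synapse. It says nothing about how you would scan a living brain at that resolution or "turn it on," which is where the real difficulty lies.

```python
# Back-of-the-envelope estimate of raw connectome storage, using the figures
# quoted above. The bytes-per-synapse value is a deliberate simplification.
neurons = 100e9            # one hundred billion neurons
synapses = 700e12          # seven hundred trillion synaptic connections
bytes_per_synapse = 8      # assume ~8 bytes to encode each connection and weight

total_bytes = synapses * bytes_per_synapse
print(f"~{total_bytes / 1e15:.1f} PB just to store the wiring")  # ~5.6 PB
```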

Just as the technology did not exist ten years ago to create bots that virtually maintain the memories and — to a degree — the presence of the deceased, today the technology does not exist to create a human connectome or Bitworld. According to Louis Rosenberg of Unanimous A.I.: “This is a wildly challenging task but is theoretically feasible.”

And people are working on these technologies now through the ongoing advances in AI, neurobiology, supercomputing, and quantum computing. 

AI could provide digital immortality

Neuralink, a company founded by Elon Musk focused on brain-machine interfaces, is working on aspects of mind-uploading. A number of wealthy people, including tech entrepreneur Peter Thiel, have reportedly arranged to have their bodies preserved after death until such time as the requisite technology exists. Alcor is one such organization offering this preservation service. As futurist and former Alcor CEO Max More said: “Our view is that when we call someone dead it’s a bit of an arbitrary line. In fact, they are in need of a rescue.”

The mind-uploading concept is also explored in the Amazon series “Upload,” in which a man’s memories and personality are uploaded into a lookalike avatar. This avatar resides in what passes for an eternal digital afterlife in a place known as “Lakeview.” In response, an Engadget article asked: “Even if some technology could take all of the matter in your brain and upload it to the cloud, is the resulting consciousness still you?”

This is one of many questions, but ultimately may be the most relevant — and one that likely cannot be answered until the technology exists. 

When might that be? In the same Engadget article, “Upload” showrunner Greg Daniels implies that the ability to upload consciousness is all about information in the brain, noting that it is a finite amount, albeit a large amount. “And if you had a large enough computer, and a quick enough way to scan it, you ought to be able to measure everything, all the information that’s in someone’s brain.”

The ethical questions this raises could rival the connectome in number and will become critical much sooner than we think.

Although in the end, I would just like to talk with my dad again. 

Gary Grossman is the senior VP of technology practice at Edelman and global lead of the Edelman AI Center of Excellence.



Why composability is key to scaling digital twins



Digital twins enable enterprises to model and simulate buildings, products, manufacturing lines, facilities and processes. This can improve performance, quickly flag quality errors and support better decision-making. Today, most digital twin projects are one-off efforts. A team may create one digital twin for a new gearbox and start all over when modeling a wind turbine that includes this part or the business process that repairs this part. 

Ideally, engineers would like to quickly assemble more complex digital twins to represent turbines, wind farms, power grids and energy businesses. This is complicated by the different components that go into digital twins beyond the physical models, such as data management, semantic labels, security and the user interface (UI). New approaches for composing digital elements into larger assemblies and models could help simplify this process. 

Gartner has predicted that the digital twin market will cross the chasm in 2026 to reach $183 billion by 2031, with composite digital twins presenting the largest opportunity. It recommends that product leaders build ecosystems and libraries of prebuilt functions and vertical market templates to drive competitiveness in the digital twin market. The industry is starting to take note.

The Digital Twin Consortium recently released the Capabilities Periodic Table framework (CPT) to help organizations develop composable digital twins. It organizes the landscape of supporting technologies to help teams create the foundation for integrating individual digital twins. 


A new kind of model

Significant similarities and differences exist in the modeling used to build digital twins compared with other analytics and artificial intelligence (AI) models. All these efforts start with appropriate and timely historical data to inform the model design and calibrate the current state with model results.

However, digital twin simulations are unique compared to traditional statistical learning approaches in that the model structures are not directly learned from the data, Bret Greenstein, data, analytics and AI partner at PwC, told VentureBeat. Instead, a model structure is surfaced by modelers through interviews, research and design sessions with domain experts to align with the strategic or operational questions that are defined upfront.

As a result, domain experts need to be involved in informing and validating the model structure. This time investment can limit the scope of simulations to applications where ongoing scenario analysis is required. Greenstein also finds that developing a digital twin model is an ongoing exercise. Model granularity and systems boundaries must be carefully considered and defined to balance time investment and model appropriateness to the questions they are intended to support. 

“If organizations are not able to effectively draw boundaries around the details that a simulation model captures, ROI will be extremely difficult to achieve,” Greenstein said.

For example, an organization may create a network digital twin at the millisecond timescale to model network resiliency and capacity. It may also have a customer adoption model to understand demand at the scale of months. This exploration of customer demand and usage behavior at a macro level can serve as input into a micro simulation of the network infrastructure. 
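As a rough, hypothetical sketch of that hand-off, the Python snippet below uses a toy logistic adoption curve (the months-scale model) to estimate demand, then converts it into a per-cell busy-hour load that a much finer-grained network twin would have to absorb. Every parameter is invented; the point is only that the macro model's output becomes the micro model's input.

```python
# Toy illustration of a macro model feeding a micro model: a monthly customer
# adoption curve sets the peak load checked against per-cell network capacity.
# Every parameter here is invented for illustration.
from math import exp

def adoption(month, market_size=1_000_000, growth=0.35, midpoint=18):
    """Logistic adoption curve: estimated subscribers after `month` months."""
    return market_size / (1 + exp(-growth * (month - midpoint)))

def peak_cell_load(subscribers, busy_hour_share=0.15, per_user_mbps=3.0, cells=1000):
    """Very coarse busy-hour demand per cell, derived from the macro model."""
    return subscribers * busy_hour_share * per_user_mbps / cells

CELL_CAPACITY_MBPS = 400.0
for month in (6, 12, 24, 36):
    load = peak_cell_load(adoption(month))
    status = "OK" if load <= CELL_CAPACITY_MBPS else "over capacity -> run detailed sim"
    print(f"month {month:>2}: ~{load:,.0f} Mbps per cell ({status})")
```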

Composable digital twins

This is where the DTC’s new CPT framework comes in. Pieter van Schalkwyk, CEO at XMPRO and co-chair of the Natural Resources Working Group at the Digital Twin Consortium, said the CPT provides a common approach for multidisciplinary teams to collaborate earlier in the development cycle. A key element is a reference framework covering six capability categories: data services, integration, intelligence, UX, management and trustworthiness.

This can help enterprises identify composability gaps they need to address in-house or from external tools. The framework also helps to identify specific integrations at a capabilities level. The result is that organizations can think about building a portfolio of reusable capabilities. This reduces duplication of services and effort.

This approach goes beyond how engineers currently integrate multiple components into larger structures in computer-aided design tools. Schalkwyk said, “Design tools enable engineering teams to combine models such as CAD, 3D and BIM into design assemblies but are not typically suited to instantiating multi use case digital twins and synchronizing data at a required twinning rate.”

Packaging capabilities

In contrast, a composable digital twin draws from six clusters of capabilities that help manage the integrated model and other digital twin instances based on the model. It can also combine IoT and other data services to provide an up-to-date representation of the entity the digital twin represents. The CPT represents these different capabilities as a periodic table to make it agnostic to any particular technology or architecture. 

“The objective is to describe a business requirement or a use case in capability terms only,” Schalkwyk explained. 

Describing the digital twin in terms of capabilities helps match a specific implementation to the technologies that provide the appropriate capability. This mirrors the broader industry trend towards composable business applications. This approach allows different roles, such as engineers, scientists and other subject-matter experts, to compose and recompose digital twins for different business requirements. 

It also creates an opportunity for new packaged business capabilities that could be used across industries. For example, a “leak detection” packaged business capability could combine data integration and engineering analytics to provide a reusable component that can be used in a multitude of digital twins use cases, Schalkwyk explained. It could be used in digital twins for oil & gas, process manufacturing, mining, agriculture and water utilities.
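As a loose illustration of describing a use case "in capability terms only," the sketch below models a hypothetical leak-detection packaged business capability as plain data in Python, naming the capabilities it needs from each category rather than any specific product. The category and capability names are paraphrased for illustration, not taken verbatim from the official framework.

```python
# Describe a packaged business capability purely in terms of the capabilities
# it requires, independent of any vendor or product. Names are illustrative.

leak_detection = {
    "name": "leak detection",
    "requires": {
        "data_services": ["time-series ingestion", "historian access"],
        "integration": ["OPC UA connector", "event streaming"],
        "intelligence": ["anomaly detection", "engineering analytics"],
        "ux": ["alert dashboard"],
        "management": ["versioning", "deployment"],
        "trustworthiness": ["access control", "audit logging"],
    },
}

def missing_capabilities(required, available):
    """Compare a use case's required capabilities against an organization's portfolio."""
    gaps = {}
    for category, needs in required["requires"].items():
        absent = [n for n in needs if n not in available.get(category, [])]
        if absent:
            gaps[category] = absent
    return gaps

portfolio = {"data_services": ["time-series ingestion"], "intelligence": ["anomaly detection"]}
print(missing_capabilities(leak_detection, portfolio))  # lists what still must be sourced
```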

Composability challenges

Alisha Mittal, practice director at Everest Group, said, “Many digital twin projects today are in pilot stages or are focused on very singular assets or processes.”

Everest research has found that only about 15% of enterprises have successfully implemented digital twins across multiple entities. 

“While digital twins offer immense potential for operational efficiency and cost reduction, the key reason for this sluggish scaled adoption is the composability challenges,” Mittal said. 

Engineers struggle to integrate the different ways equipment and sensors collect, process and format data. This complexity gets further compounded due to the lack of common standards and reference frameworks to enable easy data exchange. 

Suseel Menon, senior analyst at Everest Group, said some of the critical challenges they heard from companies trying to scale digital twins include:

  • Nascent data landscape: Polishing data architectures and data flow is often one of the biggest barriers to overcome before fully scaling digital twins to a factory or enterprise scale.
  • System complexity: It is rare for two physical things within a large operation to be similar, complicating integration and scalability. 
  • Talent availability: Enterprises struggle to find talent with the appropriate engineering and IT skills. 
  • Limited verticalization in off-the-shelf platforms and solutions: Solutions that work for assets or processes in one industry may not work in another. 

Threading the pieces together

Schalkwyk said the next step is to develop a second layer of the composability framework with more granular capability descriptions. A separate effort on a ‘digital-twin-capabilities-as-a-service’ model will explore how digital twin capabilities could be specified and provisioned in a zero-touch approach from a capabilities marketplace.

Eventually, these efforts could also lay the foundation for digital threads that help connect processes that span multiple digital twins. 

“In the near future, we believe a digital thread-centric approach will take center stage to enable integration both at a data platform silo level as well as the organizational level,” Mittal said. “DataOps-as-a-service for data transformation, harmonization and integration across platforms will be a critical capability to enable composable and scalable digital twin initiatives.”


Categories
Computing

How to Overclock RAM | Digital Trends

When people talk about overclocking, they are usually referring to the CPU and GPU. However, it is also possible to overclock the RAM, and in some cases, it can lead to greater performance enhancements than any other tweaks you make.

If you’ve never tried your hand at it, overclocking your RAM might feel a little intimidating, but don’t worry — it’s actually quite easy. You don’t need to have one of the best RAM kits, either, even though it certainly helps! Should you overclock your computer’s memory? What can you gain from it, and how do you do it? We’ll walk you through the whole process below.



What are the benefits of overclocking RAM?

Overclocking RAM improves the data transfer rate, which refers to how quickly the RAM delivers data to the CPU to complete a process. If your RAM is too slow, it can create a bottleneck that doesn’t fully utilize the potential of your CPU. If you’re running a budget system, this might not matter much, but if your computer houses a top-notch processor, you don’t want to slow it down with a memory kit that’s underperforming.

The easiest solution to this is to just buy newer, faster RAM. However, you can actually bump up the speeds on your RAM manually as long as you don’t exceed the recommended voltage, and in some cases, you can only get the most from your faster memory kits by performing some system tweaks of your own.

Overclocking your RAM can net some performance gains in day-to-day uses, but it really shines in gaming. Playing games that rely more on the processor than the graphics card will highlight the importance of RAM. Using faster memory speeds up the data transfer to the CPU and may result in higher frames per second (fps).

Whether you’re an enthusiast, a gamer, a person who likes to get the most out of their hardware, or simply a curious user, overclocking your RAM can bring a share of benefits — and it’s safe if you do it correctly.

RAM overclocking explained

Before we jump into the ins and outs of overclocking your RAM, we’ll give you a quick rundown of what your RAM does and how the whole process works.

RAM, short for random access memory, lets your computer store data, but only temporarily. It holds the data that’s currently in use and feeds it to the CPU. Streamlining this process can have an impact on the performance of your whole system.

Every program that you use loads into the RAM directly from your storage, be it a hard-disk drive (HDD) or a solid-state drive (SSD), which is a persistent type of memory that can store files on a long-term basis. The purpose of overclocking RAM is to boost its speed, which in turn makes it quicker to receive information and then pass it on to your processor.

You’ll usually see RAM speed quoted in megahertz; for example, your memory kit may be a DDR4 kit rated at 3200MHz. Strictly speaking, that figure is the data rate in megatransfers per second (MT/s); because DDR memory transfers data twice per clock cycle, the actual clock runs at half the rated number. In simple terms, the rating tells you how many times per second the RAM can move data in and out of its memory cells.
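
As a back-of-the-envelope illustration of what those ratings translate to, you can estimate a kit’s peak theoretical bandwidth from its transfer rate and bus width (a rough calculation, not a benchmark):

```python
def peak_bandwidth_gbps(transfer_rate_mt_s: int, bus_width_bits: int = 64) -> float:
    """Peak theoretical bandwidth in GB/s for one 64-bit memory channel."""
    bytes_per_transfer = bus_width_bits / 8
    return transfer_rate_mt_s * 1_000_000 * bytes_per_transfer / 1e9


# DDR4-3200: 3,200 MT/s on a 64-bit channel is about 25.6 GB/s per channel,
# or roughly 51.2 GB/s in a typical dual-channel configuration.
print(peak_bandwidth_gbps(3200))      # 25.6
print(peak_bandwidth_gbps(3200) * 2)  # 51.2
```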

Many RAM kits come with a pre-overclocked profile called XMP (Extreme Memory Profile). This can be a little tricky because when you install your new RAM, it will often default to its standard speed rather than the advertised, overclocked speed. Loading the saved XMP profile brings the kit up to the factory overclock set by its manufacturer.

To overclock RAM, you’ll be adjusting its speed and its timings. That means finding the sweet spot that works for your machine and then making sure your RAM runs stably at that point. Let’s take a look at how to get there.

A pair of G.Skill Trident Z5 DDR5 RAM modules.

Prepare to overclock

The most important thing to do when you start overclocking RAM (or any component) is to establish a baseline. Doing so isn’t difficult. Follow the steps below to prepare for overclocking.

Step 1: Take note of your memory’s default speed and timings with a utility like CPU-Z. Write down the speed and timings because you’ll be comparing them to your new settings later.

In addition to CPU-Z, you’ll also want to have a tool like HWInfo running in the background to keep an eye on memory temperatures and for finer frequency tracking.

If you’re using an AMD Ryzen processor and you plan to overclock manually, you can also download a tool called DRAM Calculator for Ryzen. This will help you choose the right frequency for your particular hardware.
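
If you prefer the command line, you can also record a quick baseline of each installed module’s rated speed. A minimal sketch, assuming Windows with the built-in wmic tool or Linux with dmidecode available (root required):

```python
import platform
import subprocess

# Query each installed module's reported speed so you have a baseline on record.
if platform.system() == "Windows":
    # wmic is deprecated but still present on most Windows 10/11 installs.
    subprocess.run(["wmic", "memorychip", "get", "Manufacturer,PartNumber,Speed"], check=False)
else:
    # Requires root; prints per-module rated and configured speeds on Linux.
    subprocess.run(["sudo", "dmidecode", "--type", "memory"], check=False)
```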

CPU-Z stress and monitoring tool.

Step 2: With both of your tools set up, it’s time to put your current RAM settings through a bit of a stress test. This is where benchmarking tools come in.

PassMark and AIDA64 are great synthetic benchmarks that will give you raw bandwidth numbers to help gauge how much of an effect your overclocking has had. Cinebench is a CPU-intensive application that can show how much your RAM overclock has improved CPU performance. Lastly, Memtest is great for catching memory errors, so use it throughout the process to verify stability.

Run these programs before overclocking and then once more when you’re done so you can compare the scores.
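
Alongside those tools, a rough memory-copy micro-benchmark can serve as a sanity check that your numbers are moving in the right direction. A minimal sketch using NumPy; the absolute figures only matter compared against your own baseline on the same machine:

```python
import time

import numpy as np

# Copy a ~400 MB buffer several times and report the best effective copy bandwidth.
# Run it before and after overclocking, then compare the two results.
src = np.ones(50_000_000, dtype=np.float64)
dst = np.empty_like(src)

best = 0.0
for _ in range(5):
    start = time.perf_counter()
    np.copyto(dst, src)
    elapsed = time.perf_counter() - start
    # One read plus one write of the buffer per copy.
    gb_moved = 2 * src.nbytes / 1e9
    best = max(best, gb_moved / elapsed)

print(f"Best copy bandwidth: {best:.1f} GB/s")
```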

For more real-world testing, CPU-intensive games like Shadow of the Tomb Raider, Civilization VI and GTA V can give you a good idea of what game performance improvements you have managed to achieve.

Cinebench stress test.

Overclock using XMP memory profiles

Most modern AMD and Intel CPUs support anywhere between 2666MHz and 3600MHz of memory right out of the box, which means your motherboard and processor will default to running memory at those speeds. If you’ve purchased a kit that’s rated to go faster than that, though, it will have shipped with an XMP, or extreme memory profile. These automatically “overclock” the memory by setting it to its rated speed and timings, providing a quick and easy way to boost performance.

We will guide you through the process of XMP overclocking in our steps below.

Step 1: To access XMP, head into your UEFI/BIOS by hitting your motherboard’s respective key on startup. It’s typically one of the F1 to F10 keys or Delete.

Step 2: With the BIOS open, it’s time to look around. Every motherboard is different, but you want to search for overclocking settings. In our ASUS example, it’s in the Extreme Tweaker menu. Look in the memory tuning section, and when you find your memory’s XMP settings, choose the one that you want to use.

Save the settings and restart, and you should be able to see your new memory settings.

Many of these kits can go further than the XMP profiles allow, however. To do so, you’ll need to dive into the more time-consuming and complicated world of manual overclocking.

XMP settings in BIOS.


Manual RAM overclocking

Manual overclocking is the most time-consuming option, but it can also have the biggest payoff if you know what you’re doing. It’s also the best way to hit higher RAM speeds than those saved in XMP profiles. While it might seem a bit scary, you can rest assured that overclocking your RAM should be fairly safe as long as you’re careful and don’t adopt an “all at once” kind of approach.

**Warning:** Don’t raise your DDR4 memory’s voltage above 1.5V, as that can damage your RAM over the long term. You also want to keep the temperature of your memory under 122 degrees Fahrenheit (50 degrees Celsius) at all times to help avoid crashes and instability.

If you’re running an AMD CPU, it’s also important to consider the Infinity Fabric clock and its synchronization with your memory. Read more in the section below.

Step 1: As with the XMP settings, find the memory-tweaking menu in your UEFI/BIOS, only this time use Manual settings as opposed to the pre-determined XMP options.

Begin raising the frequency slowly, one step at a time; smaller increments are typically better. You want to take it slow and steady instead of rushing into it.

RAM overclocking in the Asus BIOS.

Step 2: Once you’ve adjusted your memory frequency, restart your computer and boot Windows. It’s now time to run some benchmarks using the programs we mentioned above. Test thoroughly, not just using programs but also using your favorite CPU-intensive games.

Step 3: If you complete all the benchmarks without crashes or errors, raise the frequency again. If you run into crashes, you can scale back your overclock and consider the job complete, or raise the voltage to see if that improves stability.

Remember to take it slow and do your due diligence with testing. If you raise the frequency too high in one go, you won’t know what frequency is unstable and what isn’t, forcing you to go further back down to find a point of stability.

Keep an eye on the performance numbers in your benchmarking too. Raising the frequency can sometimes cause an automatic loosening of your RAM’s timings, which can affect its latency and performance. Sometimes it’s better to have a lower frequency with tighter timings.
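
To see why tighter timings can beat raw frequency, convert CAS latency from clock cycles into nanoseconds. A quick worked example in Python (plain arithmetic, not a measurement):

```python
def cas_latency_ns(transfer_rate_mt_s: int, cas_cycles: int) -> float:
    """First-word latency in nanoseconds: CAS cycles divided by the real clock (MT/s / 2)."""
    real_clock_mhz = transfer_rate_mt_s / 2
    return cas_cycles / real_clock_mhz * 1000


# DDR4-3200 CL16 and DDR4-3600 CL18 both land at 10 ns of CAS latency,
# so the faster kit only wins on bandwidth, not first-word latency.
print(cas_latency_ns(3200, 16))  # 10.0
print(cas_latency_ns(3600, 18))  # 10.0
```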

Step 4: When you find a frequency you’re happy with, perform additional, longer-term benchmarking and stability testing to confirm that even under repeated load, your memory won’t cause any system crashes. If it does, lower the frequency or raise the voltage as necessary, and perform another round of heavy stability testing.

G.Skill RAM sticks installed in a computer.

How to tighten the timings

If you want to take things a step further, you can always tighten the timings in addition to tweaking your RAM’s frequency.

You can typically do so in the UEFI/BIOS in the same section as adjusting the frequency. You’ll have to disable XMP profiles and switch to manual overclocking.

Tightening the timings means that you’ll be tweaking the various numbers, saving your new settings, and then restarting the computer to see if it runs smoothly. This is a bit of a trial-and-error job and should mostly be reserved for more advanced users who feel confident that they know what to do.

Never change too many things at once, and remember that not all frequency and timing combinations will work well together. Play with the settings until the benchmarks return stable results for at least 30 minutes of testing.

Things to remember in regard to AMD and Infinity Fabric

Overclocking RAM with an AMD Ryzen processor is very similar to doing so on Intel CPUs, but you also need to consider the Infinity Fabric. It’s a proprietary system interconnect architecture within AMD CPUs whose clock speed is synchronized with your memory’s. When your memory clock goes up, so does the Infinity Fabric clock, up to a point. That 1:1 ratio changes above 3,600MHz, and while higher memory frequencies can still mean greater overall performance, the latency penalty isn’t always worth it.

Infinity Fabric overclocking is also a possibility for those who want to play with frequencies after de-syncing it from the memory, but that is more advanced overclocking and requires time and energy of its own.

Just know that if you want to raise the memory frequency over 3,600MHz on an AMD Ryzen system, you’re going to need to adjust the Infinity Fabric as well to get the best performance.
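
As a rule of thumb, the memory clock is half the rated transfer rate, and you want the Infinity Fabric clock (FCLK) to match it 1:1. The sketch below illustrates that check; the 1,800MHz FCLK ceiling is a typical figure for recent Ryzen parts, not a guarantee for your particular chip:

```python
def fabric_sync(transfer_rate_mt_s: int, fclk_ceiling_mhz: int = 1800) -> str:
    """Report whether a DDR4 kit can plausibly run 1:1 with the Infinity Fabric."""
    memory_clock = transfer_rate_mt_s / 2
    if memory_clock <= fclk_ceiling_mhz:
        return f"{transfer_rate_mt_s} MT/s -> {memory_clock:.0f} MHz memory clock, 1:1 sync possible"
    return (f"{transfer_rate_mt_s} MT/s -> {memory_clock:.0f} MHz memory clock, "
            "above the FCLK ceiling; expect a 2:1 divider and extra latency")


for rate in (3200, 3600, 4000):
    print(fabric_sync(rate))
```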


Categories
AI

How digital twins are transforming network infrastructure: Future state (part 2)

This is the second of a two-part series. Read part 1 for a look at the current state of networking, how digital twins are being used to help automate it, and the shortcomings involved.

As noted in part 1, digital twins are starting to play a crucial role in automating the process of bringing digital transformation to networking infrastructure. Today, we explore the future state of digital twins – comparing how they’re being used now with how they can be used once the technology matures.

The market for digital twins is expected to grow at a whopping 35% CAGR (compound annual growth rate) between 2022 and 2027, from a valuation of $10.3 billion to $61.5 billion. Internet of things (IoT) devices are driving a large percentage of that growth, and campus networks represent a critical aspect of infrastructure required to support the widespread rollout of the growing number of IoT devices.

Current limitations of digital twins

One of the issues plaguing the use of digital twins today is that network digital twins typically only help model and automate pockets of a network isolated by function, vendors or types of users. However, enterprise requirements for a more flexible and agile networking infrastructure are driving efforts to integrate these pockets.


Several network vendors, such as Forward Networks, Gluware, Intentionet and Keysight’s recent Scalable Networks acquisition, are starting to support digital twins that work across vendors to improve configuration management, security, compliance and performance. 

Companies like Asperitas and Villa Tech are creating “digital twins-as-a-service” to help enterprise operations.

In addition to the challenge of building a digital twin for multivendor networks, there are other limitations that digital twin technology needs to overcome before it’s fully adopted, including:

  • The types of models used in digital twins need to match the actual use case. 
  • Building the model, supporting multiple models and evolving the model over time all require significant investment, according to Balaji Venkatraman, VP of product management, DNA, at Cisco.
  • Keeping the data lake current with the state of the network. If the digital twin operates on older data, it will return out-of-date answers. 

Future solutions

Manas Tiwari, client partner for cross-industry comms solutions at Capgemini Engineering, believes that digital twins will help roll out disaggregated networks composed of different equipment, topologies and service providers, in the same way enterprises now provision services across multiple clouds. 

Tiwari said digital twins will make it easier to model different network designs up front and then fine-tune them to ensure they work as intended. This will be critical for widespread rollouts in healthcare, factories, warehouses and new IoT businesses. 

Vendors like Gluware, Forward Networks and others are creating real-time digital twins that simulate network, security and automation environments to forecast where problems may arise before changes are rolled out. These tools are also starting to plug into continuous integration and continuous deployment (CI/CD) tools to support incremental updates and rollback using existing devops processes.

Cisco has developed tools for what-if analysis, change impact analysis, network dimensioning and capacity planning. These areas are critical for proactive and predictive analysis that prevents network or service downtime and avoids degrading the user experience.

Overcoming the struggle with new protocols

Early modeling and simulation tools, such as the GNS3 virtual labs, help network engineers understand what is going on in the network in terms of traffic paths, connectivity and isolation of network elements. Still, they often struggle with new protocols, new domains or scaling to larger networks. They also need to simulate the ideal flow of traffic, along with all the ways traffic could break or paths could become isolated from the rest of the network. 

Christopher Grammer, vice president of solution technology at IT solutions provider Calian, told VentureBeat that one of the biggest challenges is that real network traffic is random. The network traffic produced by a coffee shop full of casual internet users is a far cry from the needs of petroleum engineers working with real-time drilling operations. Simulated network performance is therefore subject to users’ needs, which can change at any time, making it harder to predict.

Not only that, but modeling tools are costly to scale up. 

“The cost difference between simulating a relatively simple residential network model and an AT&T internet backbone is astronomical,” Grammer said. 

Thanks to algorithmic and hardware improvements, platforms like Forward Networks’ Forward Enterprise are starting to scale these computations to support networks of hundreds of thousands of devices.

Testing new configurations

The crowning use case for networking digital twins is evaluating different configuration settings before updating or installing new equipment. Digital twins can help assess the likely impact of changes to ensure equipment works as intended. 

In theory, these could eventually make it easier to assess the performance impact of changes. However, Mike Toussaint, senior director analyst at Gartner, said it may take some time to develop new modeling and simulation tools that account for the performance of newer chips.

One of the more exciting aspects is that these modeling and simulation capabilities are now being integrated with IT automation. Ernest Lefner, chief product officer at Gluware, which supports intelligent network process automation, said this allows engineers to connect inline testing and simulation with tools for building, configuring, developing and deploying networks. 

“You can now learn about failures, bugs, and broken capabilities before pushing the button and causing an outage. Merging these key functions with automation builds confidence that the change you make will be right the first time,” he said.

Wireless analysis

Equipment vendors such as Juniper Networks are using artificial intelligence (AI) to incorporate various kinds of telemetry and analytics that automatically capture information about wireless infrastructure and identify the best layout for wireless networks. Ericsson has started using Nvidia Omniverse to simulate 5G reception in a city. Nearmap recently partnered with Digital Twin Sims to feed dynamically updated 5G coverage maps into 5G planning and operating systems. 

Security and compliance

Grammer said digital twins could help improve the heuristics and behavioral-analysis aspects of network security management. This could help identify potentially unwanted or malicious traffic, such as botnets or ransomware. Security companies often model known good and bad network traffic to teach machine learning algorithms to identify suspicious network traffic. 
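
As a toy illustration of that approach, the sketch below trains a classifier on made-up flow features; it is not a production detector, and the feature set is purely illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Made-up flow features: [packets/sec, mean packet size, distinct destination ports].
benign = np.column_stack([rng.normal(50, 15, 500), rng.normal(800, 200, 500), rng.integers(1, 5, 500)])
botnet = np.column_stack([rng.normal(400, 80, 500), rng.normal(120, 40, 500), rng.integers(20, 200, 500)])

X = np.vstack([benign, botnet])
y = np.array([0] * 500 + [1] * 500)  # 0 = known good traffic, 1 = known bad traffic

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")

# Score a new, suspiciously chatty flow.
print("Suspicious" if model.predict([[350, 150, 90]])[0] else "Looks benign")
```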

According to Lefner, digital twins could model real-time data flows for complex audit and security compliance tasks. 

“It’s exciting to think about taking complex yearly audit tasks for things like PCI compliance and boiling that down to an automated task that can be reviewed daily,” he said. 

Coupling these digital twins with automation could allow a step change in challenging tasks like identifying up-to-date software and remediating newly identified vulnerabilities. For example, Gluware combines modeling, simulation and robotic process automation (RPA) to allow software robots to take actions based on specific network conditions. 

Peyman Kazemian, cofounder of Forward Networks, said they are starting to use digital twins to model network infrastructure. When a new vulnerability is discovered in a particular type of equipment or software version, the digital twins can find all the hosts that are reachable from less trustworthy entry points to prioritize the remediation efforts. 
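
In graph terms, that prioritization is a reachability query over the twin’s model of the network. A minimal sketch with networkx, using hypothetical host names; a real twin would populate the graph from discovered topology and software inventory:

```python
import networkx as nx

# Directed graph of which hosts/zones can reach which, as modeled by the digital twin.
net = nx.DiGraph()
net.add_edges_from([
    ("internet", "dmz-web"),
    ("dmz-web", "app-server"),
    ("app-server", "db-server"),
    ("vpn-gateway", "app-server"),
    ("corp-lan", "db-server"),
])

vulnerable = {"app-server", "db-server", "corp-lan"}   # hosts running the affected software version
untrusted_entry_points = {"internet", "vpn-gateway"}   # less trustworthy entry points

# Patch first whatever is both vulnerable and reachable from an untrusted entry point.
exposed = set()
for entry in untrusted_entry_points:
    exposed |= nx.descendants(net, entry) & vulnerable

print("Patch first:", sorted(exposed))               # ['app-server', 'db-server']
print("Patch later:", sorted(vulnerable - exposed))  # ['corp-lan']
```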

Cross-domain collaboration

Network digital twins today tend to focus on one particular use case, owing to the complexities of modeling and transforming data across domains. Teresa Tung, cloud-first chief technologist at Accenture, said that new knowledge graph techniques are helping to connect the dots. For example, a digital twin of the network can combine models from different domains such as engineering R&D, planning, supply chain, finance and operations. 

They can also bridge workflows between design and simulations. For example, Accenture has enhanced a traditional network planner tool with new 3D data and an RF simulation model to plan 5G rollouts. 
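
One way to picture the knowledge-graph approach is as a set of typed links that tie the same asset together across domain models, so a query can hop from an engineering model to planning and operations data. The triples below are purely illustrative and are not Accenture’s implementation:

```python
# Each triple links an entity in one domain model to related entities in another.
triples = [
    ("tower-017", "hasDesignModel", "cad:tower-017-rev3"),      # engineering R&D
    ("tower-017", "plannedUpgrade", "plan:5g-rollout-phase2"),  # network planning
    ("tower-017", "suppliedBy", "vendor:acme-antennas"),        # supply chain
    ("tower-017", "monitoredBy", "ops:telemetry-feed-84"),      # operations
]


def neighbors(entity):
    """Everything the graph knows about one entity, keyed by relationship."""
    return {rel: obj for subj, rel, obj in triples if subj == entity}


# A single query now spans design, planning, supply chain and operations.
print(neighbors("tower-017"))
```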

Connect2Fiber is using digital twins to help model its fiber networks to improve operations, maintenance and sales processes. Nearmap’s drone management software automatically inventories wireless infrastructure to improve network planning and collaboration processes with asset digital twins. 

These efforts could all benefit from the kind of innovation driven by building information models (BIM) in the construction industry. Jacob Koshy, an information technology and communications associate at Arup, an IT services firm, predicts that comparable network information models (NIM) could play a similarly transformative role in building complex networks. 

For example, the RF propagation analysis and modeling for coverage and capacity planning could be reused during the installation and commissioning of the system. Additionally, integrating the components into a 3D modeling environment could improve collaboration and workflows across facilities and network management teams.

Emerging digital twin APIs from companies like Mapped, Zyter and PassiveLogic might help bridge the gap between dynamic networks and the built environment. This could make it easier to create comprehensive digital twins that include the networking aspects involved in more autonomous business processes. 

The future is autonomous networks

Grammer believes that improved integration between digital twins and automation could help fine-tune network settings based on changing conditions. For example, business traffic may predominate in the daytime and shift to more entertainment traffic in the evening. 

“With these new modeling tools, networks will automatically be able to adapt to application changes switching from a business video conferencing profile to a streaming or gaming profile with ease,” Grammer said. 

How digital twins will optimize network infrastructure

The most common use case for digital twins in network infrastructure is testing and optimizing network equipment configurations. Down the road, they will play a more prominent role in testing and optimizing performance, vetting security and compliance, provisioning wireless networks and rolling out large-scale IoT networks for factories, hospitals and warehouses. 

Experts also expect to see more direct integration into business systems such as enterprise resource planning (ERP) and customer relationship management (CRM) to automate the rollout and management of networks to support new business services.
