Teleoperation and the future of safe driving

This post was written by Amit Rosenzweig, CEO of Ottopia.

Teleoperation: the technology that enables a human to remotely monitor, assist and even drive an autonomous vehicle.

Teleoperation is a seemingly simple capability, yet it involves numerous technologies and systems in order to be implemented safely. In the first article of this series, we established what teleoperation is and why it is critical for the future of autonomous vehicles (AVs). In the second article, we showed the legislative traction this technology has gained. In the third and fourth articles, we explained two of the many technical challenges that must be overcome to enable remote vehicle assistance and operation. In this article, we will explore how all of this is achieved in the safest possible way.

More than a decade ago, the major AV companies made a promise: autonomous vehicles would be completely self-sufficient by now, rendering human driving obsolete. As the years pass, this goal has proven elusive, and it has become clear that a human will always need to be kept in the loop. The initial response to this was remote driving.

Remote Driving? Major danger

Teleoperation was originally conceived as a system that overrides the autonomy of a vehicle and allows a human to drive it manually from a remote location. Essentially, it would replace all self-driving functions and safety systems with a remote driver. This would appear to make a degree of sense. Currently, the solution for unknown situations, aka edge cases, is to put a “safety driver” in the driver’s seat. When the autonomy does not know what to do and gets stuck, the human can manually solve the problem by driving the car for just a few seconds. By moving the human driver to a remote location, one person can monitor and solve problems for multiple vehicles, thereby cutting driver costs.

Chances are when people first envisioned remote driving, they assumed we would have perfect, fully immersive virtual reality with zero latency, as seen in a sci-fi movie like Black Panther. Unfortunately, remote driving has critical shortcomings. As it is, from the instant a driver recognizes an obstacle in the road until their foot hits the brake pedal – the brake reaction time – about 0.7 seconds elapse. This means that at a speed of only 30 mph, which translates to 44 feet per second, the vehicle covers more than 30 feet before braking even begins. And this is if the driver is IN the vehicle, traveling at ONLY 30 mph, and assuming the car stops on the spot.


Above: Figure 1: “Obstacles” can appear in almost every environment

Image Credit: Ottopia

For a remote driver, one must factor in at least a few extra fractions of a second of latency, plus the lack of haptic feedback. In other words, the brake reaction time alone is at least 0.8 seconds, meaning a minimum of 35 feet is covered before braking begins at 30 mph. And this still does not factor in braking distance. Maybe this is why, in a different sci-fi movie, Guardians of the Galaxy 2, remote pilots are shown to be inferior to those onboard the ship.
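The arithmetic above can be sketched in a few lines (a back-of-the-envelope calculation; the 0.7-second and 0.8-second reaction times come from the discussion above):

```python
def reaction_distance_ft(speed_mph: float, reaction_time_s: float) -> float:
    """Distance (in feet) a vehicle travels before braking even begins."""
    feet_per_second = speed_mph * 5280 / 3600  # 30 mph = 44 ft/s
    return feet_per_second * reaction_time_s

# In-car driver: ~0.7 s brake reaction time
print(round(reaction_distance_ft(30, 0.7), 1))  # ≈ 30.8 ft
# Remote driver: at least ~0.1 s of added latency
print(round(reaction_distance_ft(30, 0.8), 1))  # ≈ 35.2 ft
```

Both figures are reaction distance only; actual stopping distance adds the braking distance on top.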

Clearly, humans cannot be allowed to drive a vehicle from a remote location. At least not on their own.

Advanced Teleoperator Assistance System (ATAS®): the first transformation for teleoperation

Yes, originally the teleoperation system would shut off the autonomy stack and enable a person to drive the vehicle, but why? Why would you shut off this incredible piece of technology that already knows how to sense, react and respond in ways a person will never be able to do? This is why the second stage of teleoperation involved systems like ATAS® (an Ottopia registered trademark).

Like the more familiar ADAS (Advanced Driver Assistance System) the purpose of ATAS® is to work with the (remote) driver while leveraging the existing safety functions enabled by the vehicle’s autonomous capabilities. The main directive of an ATAS® is to prevent collisions. There are two main ways to do this, both made possible by the autonomy stack.

The first is collision warning. At every given moment, the powerful LiDAR, perception, and computation capabilities are identifying every object in the field of view of the AV. As the vehicle progresses on its way, the system tracks its own speed and trajectory as well as anything that may pose a safety hazard. The teleoperator’s display includes a layer that shows the vehicle’s heading and can alert them if anything warrants slowing down, stopping, or steering around an obstacle. This system helps compensate for the reactive shortcomings of a human driver while still allowing them to make the important decisions of how to get where they need to go.
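As a rough illustration of the idea (not Ottopia’s actual implementation; the thresholds and function names are invented for this example), a collision-warning layer reduces to computing time-to-collision from the perception data and mapping it to an alert level:

```python
def time_to_collision_s(distance_ft: float, closing_speed_fps: float) -> float:
    """Seconds until impact if neither party changes speed."""
    if closing_speed_fps <= 0:
        return float("inf")  # not on a collision course
    return distance_ft / closing_speed_fps

def warning_level(ttc_s: float, alert_threshold_s: float = 3.0,
                  stop_threshold_s: float = 1.0) -> str:
    """Map time-to-collision to an alert shown on the teleoperator display."""
    if ttc_s <= stop_threshold_s:
        return "STOP"
    if ttc_s <= alert_threshold_s:
        return "SLOW"
    return "CLEAR"

# Obstacle 88 ft ahead while closing at 44 ft/s (30 mph) → 2 s to impact
print(warning_level(time_to_collision_s(88, 44)))  # SLOW
```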


Above: Figure 2: Remote collision warning in action

Image Credit: Ottopia

The second is collision avoidance. The ultimate safety decision-making power does not and cannot lie with the human driver. Yes, the human is subject to what the autonomy decides is safest! This may seem backwards until you remember that the vehicle is in the moment. It has instant perception abilities. It sees the oncoming crash before any human ever could. Furthermore, even if the human driver could see the potential risk, they may be distracted, blinded, or otherwise incapable of recognizing the impending danger. That is why, with regard to braking in safety-critical situations, the vehicle and its autonomy system must make the decision to stop and prevent a disaster.
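The arbitration rule described here – the autonomy stack’s emergency brake always outranks the remote operator – can be sketched as follows (a simplified illustration, not any vendor’s actual control code; the command range is an assumption):

```python
def arbitrate(operator_throttle: float, autonomy_emergency_brake: bool) -> float:
    """Safety arbitration: the autonomy stack's brake decision always wins.

    Returns the command actually sent to the vehicle, in [-1.0, 1.0],
    where negative values mean braking.
    """
    if autonomy_emergency_brake:
        return -1.0  # full brake, regardless of operator input
    # Otherwise pass the operator's command through, clamped to a safe range
    return max(-1.0, min(1.0, operator_throttle))

print(arbitrate(0.6, False))  # operator command passes through: 0.6
print(arbitrate(0.6, True))   # autonomy overrides: -1.0
```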

Clearly, a remote driver must have a system like ATAS® in order to ensure the safety of those in an AV and those around it. However, there remains serious room for improvement.

Tele-assistance. The final form?

Tele-assistance – also known as remote vehicle assistance (RVA), high-level commands, or indirect control – is when the operator gives certain orders to the AV without directly deciding how it completes that task. Tele-assistance helps reduce many of the risks involved in remote driving, even with ATAS®. It is also dramatically more efficient in terms of how many operators are needed.

This is how tele-assistance works: in the traditional teleoperation scenario, an AV drives along until it encounters an event it does not know how to handle. It pulls over to the safest possible spot, stops, and triggers an alert for human intervention. The operator links in, observes the situation, and decides how best to remedy the problem. But instead of putting their hands on a steering wheel and feet on pedals, the operator chooses from a menu of commands to guide the vehicle out of its predicament.

Examples of such commands include path choosing – where the operator selects one of a few offered choices for an optimal path forward; path drawing – where the operator makes a custom path for the AV to follow; and object override – recognizing when the seeming obstacle is not a problem (e.g., a small cardboard box in the middle of the lane) and, in fact, the vehicle can simply continue on its way.
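A hypothetical message schema for such a command menu might look like this (the names and payloads are invented for illustration; this is not Ottopia’s API):

```python
from dataclasses import dataclass
from enum import Enum, auto

class CommandType(Enum):
    CHOOSE_PATH = auto()      # pick one of the paths the AV proposed
    DRAW_PATH = auto()        # custom waypoints drawn by the operator
    OBJECT_OVERRIDE = auto()  # mark a perceived obstacle as drivable

@dataclass
class AssistCommand:
    kind: CommandType
    payload: object  # e.g. a path index, a waypoint list, or an object id

# Operator picks the second of the proposed paths:
cmd = AssistCommand(CommandType.CHOOSE_PATH, 2)
print(cmd.kind.name, cmd.payload)  # CHOOSE_PATH 2
```

The key property is that the payload describes *what* to do, while the autonomy stack retains full control of *how* the maneuver is executed.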


Above: Figure 3: Tele-assistance in action

Image Credit: Ottopia

Traditional teleoperation created more problems than it solved. It is hubristic to claim that a human can remotely drive a full-sized automobile or truck without any assistance or dedicated safety technology. Humans are still required to handle the situations autonomy cannot, but the solution should ideally be tele-assistance – and at the very least, remote driving paired with a safety system like ATAS®.

When tele-assistance is coupled with maximized network connectivity and dynamic video compression, as described in the previous two articles, autonomous vehicles can be commercially deployed in the safest and most efficient manner.

Amit Rosenzweig is the CEO & Founder of Ottopia


VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member

Repost: Original Source and Author Link


3 trends driving data observability

Enterprise “data observability” is a hot space right now.

Over the past couple of months, investors have pumped $200 million into each of Cribl and Grafana Labs, two data observability startups, and lesser amounts into related companies like Acceldata and DeepFactor.

What’s behind this frenzy?

Well, enterprise data systems are like a busy family household. From room to room, you have a complex ebb and flow of activity, with people coming and going, and doors opening and closing. Various inbound streams from utilities make it all go: water flowing through pipes, electricity, and Wi-Fi powering everything, and warm or cool air flowing through the vents.

The difference is that in the enterprise, the data deluge is increasing at an unprecedented rate.

At home, as in the enterprise, it’s easy to take this complexity for granted day-to-day, but when something goes haywire, life can instantly grind to a halt. At home, this is why we have modern conveniences such as smart thermostats, connected appliances, and webcam security systems. These gadgets let us monitor what’s going on in the home, be it a dead lightbulb or an unwanted intruder — and then try to rectify the problem.

This ability to monitor and understand the system is the reason why data observability is one of the hottest topics in enterprise IT at the moment. To be clear, here is what we’re discussing:

  • Monitoring: solutions that allow teams to watch and understand what is happening in their data systems, based on gathering predefined sets of metrics or logs.
  • Observability: solutions that allow teams to understand why changes are happening in their systems, including answering questions that may not have been previously asked or thought of.

The home analogy is one that Clint Sharp, cofounder and CEO of data observability company Cribl, sometimes uses to explain data observability in relatable terms.

“Observability is the ability to ask and answer questions of complex systems, including questions I may not have planned in advance,” Sharp said, likening observability tools to a thermostat that will notify you if the temperature in your home suddenly goes dramatically higher or lower than expected.

“A harder question to answer is: Why did the temperature go awry?” Sharp said. “That can be a difficult thing to diagnose, especially if I’m doing it on a modern application with dozens of developers working on it and all kinds of complex interactions.”

Data observability is about the ‘why’

The “why” part is what data observability is all about, and it’s what sets it apart from simply monitoring for problems — meaning the “what” — within IT infrastructure and data systems. During the last few years, enterprises have begun shifting from mere data monitoring to data observability, and the trend is only now beginning to pick up steam.

By 2024, enterprises will increase their adoption rate of observability tools by 30%, according to research firm Gartner. And 90% of IT leaders say that observability is critical to the success of their business, with 76% saying they expect their observability budgets to increase next year, according to New Relic’s 2021 Observability Forecast.

This is good news for people such as Cribl’s Sharp, whose startup is just one of many players in this fast-growing ecosystem. For its part, Cribl offers a centralized observability infrastructure that can plug into a vast array of data sources and observability tools. There are plenty of them out there: Splunk, Acceldata, Monte Carlo, Bigeye, and Databand are just a handful of the companies focused on data observability at the moment.

Data observability is a fast-growing discipline in the world of enterprise technology that seeks to help organizations answer one question: How healthy is the data in their system? With all the disparate (and often differently formatted) data flowing into, within, and out of enterprises, where are the potential weaknesses — such as missing, broken, or incomplete data — that could lead to a business-crippling outage?

Observability consists of five pillars

Good data observability includes:

  • Freshness, or how up-to-date the data tables are;
  • Distribution, or whether the data covers the correct range;
  • Volume, or the amount and completeness of data;
  • Schema, which monitors changes to data’s structure;
  • Lineage, which identifies where data breaks and tells you which sources were impacted.
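A few of these pillars can be expressed as simple automated checks (an illustrative sketch; the function names and thresholds are invented, not any vendor’s API):

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_updated: datetime, max_age: timedelta) -> bool:
    """Freshness pillar: is the table recent enough?"""
    return datetime.now(timezone.utc) - last_updated <= max_age

def check_volume(row_count: int, expected_min: int) -> bool:
    """Volume pillar: did at least the expected amount of data arrive?"""
    return row_count >= expected_min

def check_schema(columns: list, expected: list) -> bool:
    """Schema pillar: has the table's structure changed?"""
    return columns == expected

# A table updated 10 minutes ago, against a 1-hour freshness SLA:
ok = check_freshness(datetime.now(timezone.utc) - timedelta(minutes=10),
                     timedelta(hours=1))
print(ok)  # True
```

Real observability platforms learn these thresholds from history rather than hard-coding them, but the pillars they evaluate are the same.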

The cost of data outages can be enormous. From lost revenue and eroded customer confidence to reduced team productivity and morale, enterprises have a lot to lose when data pipelines break. As enterprise data systems grow more complex and multi-layered — with data flowing from a wide variety of sources and more people interacting with it — the need for observability is becoming increasingly urgent.

Good data observability is about more than just preventing a catastrophe. By applying observability best practices to their data stacks, enterprises can boost efficiency, speed up innovation, and even reduce IT costs by making it easier to optimize their data infrastructure and avoid unnecessary over-provisioning. It can even help with talent retention, as a well-oiled and problem-free environment keeps engineers and other team members happy.

It’s no wonder enterprises are starting to take data observability seriously. So what’s next for this up-and-coming space? Here are three major trends shaping the future of data observability.

Trend No. 1: AI supercharges data observability

Like many aspects of modern life, artificial intelligence is making its mark on enterprise data observability. In fact, many would argue that AIOps — or the use of AI to automate and enhance IT operations — is an essential requirement for true observability. At a high level, machine learning and other AI technologies can help teams more easily analyze large, interconnected sets of data, automatically detecting problematic patterns and zeroing in on the root cause of issues when they do occur.

Observability platform company Monte Carlo, for example, uses AI models to identify patterns in query logs, trigger investigative follow-ups, and look for upstream dependency changes to determine the cause of a given issue. Another vendor, whose observability tool targets call centers, uses natural language processing and automatic speech recognition to transcribe and analyze customer service phone calls, automatically flagging repetitive patterns, data shifts, and anomalies.
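A minimal version of this kind of anomaly detection — here, a simple z-score test on daily row counts from a query log — might look like this (an illustrative sketch, far simpler than the ML models these vendors actually use):

```python
from statistics import mean, stdev

def anomalies(values: list, z_threshold: float = 3.0) -> list:
    """Indices of points more than z_threshold standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # no variation, nothing to flag
    return [i for i, v in enumerate(values)
            if abs(v - mu) / sigma > z_threshold]

# Daily row counts for a table; the last day's volume collapses unexpectedly:
daily_rows = [1000, 1020, 990, 1010, 1005, 100]
print(anomalies(daily_rows, z_threshold=2.0))  # [5]
```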

Trend No. 2: data standardization helps observability evolve

There’s a reason that the schema of data is one of the five pillars of observability. With data coming from so many sources and in different formats, it’s no wonder that variances in the structure of those datasets can cause mismatches and other data problems.

So enterprises are pushing for standardization. For example, OpenTelemetry is a new, open source framework that aims to tame some of the data chaos and make observability easier across different platforms, pipelines, and data sources. Through its collection of open, vendor-neutral tools, SDKs, and APIs, OpenTelemetry gives organizations a standardized way to collect telemetry data — the metrics, traces, and logs that make up the heart of data observability — and easily route that data between various services and data analysis tools.
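The core idea — a vendor-neutral envelope for telemetry that can be routed to any analysis tool — can be illustrated with a toy sketch (this is not the actual OpenTelemetry API; every name here is invented for the example):

```python
from dataclasses import dataclass, field
import time

@dataclass
class TelemetryRecord:
    """A vendor-neutral envelope for a metric, trace span, or log line."""
    kind: str        # "metric" | "trace" | "log"
    name: str
    value: object
    timestamp: float = field(default_factory=time.time)
    attributes: dict = field(default_factory=dict)

def route(record: TelemetryRecord, sinks: dict) -> None:
    """Send the record to every sink registered for its kind."""
    for sink in sinks.get(record.kind, []):
        sink(record)

# Any tool that accepts the common format can be plugged in as a sink:
received = []
sinks = {"metric": [received.append]}
route(TelemetryRecord("metric", "query_latency_ms", 42.0), sinks)
print(received[0].name)  # query_latency_ms
```

Because producers and consumers agree on one record shape, swapping analysis backends becomes a routing change rather than a reinstrumentation project — which is the promise of frameworks like OpenTelemetry.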

Trend No. 3: data observability shifts further into the cloud

With more and more aspects of enterprise tech and operations happening in the cloud, it’s no surprise that data observability would be shifting in that direction as well. Increasingly popular cloud data architectures such as Snowflake allow enterprises to store and use their data in the cloud, while data virtualization and visualization tools make it easier for teams to make sense of that data.

The cloud is also becoming a friendlier place for data observability itself. Cribl, for example, recently announced a new feature called LogStream Cloud Enterprise, which allows companies to move sensitive data processing to the cloud in a way that protects the security of local data using cryptographically secured, zero trust tunnels.




AI Weekly: AI adoption is driving cloud growth


The adoption of cloud technologies continues to accelerate. According to the newest report from Canalys, in Q2 2021, companies spent $5 billion more on cloud infrastructure services compared to the previous quarter. While a number of factors are responsible, including an increased focus on business resiliency planning, the uptick illustrates the effect AI adoption has had — and continues to have — on enterprise IT budgets.

In a recent survey, 80% of U.S. enterprises said they accelerated their AI adoption over the past two years. A majority consider AI to be important in their digital transformation efforts and intend to set aside between $500,000 and $5 million per year for deployment efforts. Organizations were projected to invest more than $50 billion in AI systems globally in 2020, according to IDC, up from $37.5 billion in 2019. And by 2024, investment is expected to reach $110 billion.

The cloud is playing a role in this due to its potential to improve AI training and inferencing performance, lowering costs and in some cases providing enhanced protection against attacks. Most companies lack the infrastructure and expertise to implement AI applications themselves. As TierPoint highlights, outside of corporate datacenters, only public cloud infrastructure can support massive data storage as well as the scalable computing capability needed to crunch large amounts of data and AI algorithms. Even companies that have private datacenters often opt to avoid ramping up the hardware, networking, and data storage required to host big data and AI applications. According to Accenture global lead of applied intelligence Sanjeev Vohra, who spoke during VentureBeat’s Transform 2021 conference, the cloud and data have come together to give companies a higher level of computing power and flexibility.

Cloud vendor boost

Meanwhile, cloud vendors are further stoking the demand for AI by offering a number of tools and services that make it easier to develop, test, enhance, and operate AI systems without big upfront investments. These include hardware optimized for machine learning, APIs that automate speech recognition and text analysis, productivity-boosting automated machine learning modeling systems, and AI development workflow platforms. In a 2019 whitepaper, Deloitte analysts gave the example of Walgreens, which sought to use Microsoft’s Azure AI platform to develop new health care delivery models. One of the world’s largest shipbuilders is using Amazon Web Services to develop and manage autonomous cargo vessels, the analysts also noted. And the American Cancer Society uses Google’s machine learning cloud services for automated tissue image analysis.

“The symbiosis between cloud and AI is accelerating the adoption of both,” the analysts wrote. “Indeed, Gartner predicts that through 2023, AI will be one of the top workloads that drive IT infrastructure decisions. Technology market research firm Tractica forecasts that AI will account for as much as 50% of total public cloud services revenue by 2025: AI adoption means that, ‘essentially, another public cloud services market will be added on top of the current market.’”

With the global public cloud computing market set to exceed $362 billion in 2022 and the average cloud budget reaching $2.2 million today, it appears clear that investments in the cloud aren’t about to slow down anytime soon. As long as AI’s trajectory remains bright — and it should — the cloud industry will have an enormous boom from which to benefit.

For AI coverage, send news tips to Kyle Wiggers — and be sure to subscribe to the AI Weekly newsletter and bookmark our AI channel, The Machine.

Thanks for reading,

Kyle Wiggers

AI Staff Writer




TSMC’s 5nm chip enhancements steer AI driving, 5G


Taiwan Semiconductor Manufacturing Company (TSMC) is pressing ahead with more improvements to its industry-leading 5-nanometer process technology, including the introduction of its N5A enhancement for automotive application processors designed for AI-enabled driver assistance and N6RF for 5G smartphone chips.

TSMC is billing N5A as a way to bring technology used in supercomputers to vehicles and smartphones. Set for availability in the third quarter of 2022, the new design “packs the performance, power efficiency, and logic density” of the foundry’s industry-first 5nm N5 process, the company said in a statement.

The new process went into volume production last year. TSMC said this week that it has seen quicker defect density improvements with its 5nm process than it had with the preceding 7nm generation.

The N5A enhancement meets automotive safety and quality standards for packaged integrated circuits (ICs) such as the AEC-Q100 Grade 2 stress test, TSMC said. N5A-based ICs are expected to appear in driver assistance applications, digitized cockpit systems, and other applications.

TSMC on Wednesday announced its roadmap for enhanced and specialized 5nm process technologies at its 2021 Technology Symposium, held virtually for the second year in a row. The Hsinchu, Taiwan-based semiconductor giant said N4, the next enhancement to its first-generation N5 5nm process, is set for risk production in the third quarter of this year. The next stage after N4 is N3, which is currently in testing and is projected for volume production in the second half of 2022. The N3 process is expected to deliver either a 15% speed gain or a 30% reduction in power consumption compared with N5 and will also provide up to a 70% logic density gain.

Advanced packaging stacks more memory on-chip

In addition to new specialty chip designs, TSMC highlighted its 3DFabric advanced packaging and chip-stacking technologies, such as InFO_B, which supports DRAM stacking on an integrated mobile processor package for better performance and power efficiency. The company said it will offer InFO_B later this year, as well as InFO_oS and CoWoS packaging solutions with integrated high-bandwidth memory for high-performance computing (HPC).

The pure-play foundry said these packaging technologies and specialty processes like N5A and N6RF — a newly announced process that translates the benefits of N6 logic to 5G radio frequency (RF) and WiFi 6/6e solutions — are instrumental to the continuing digitization and AI-enhancement of more parts of our daily lives. The company’s expansive plans come despite growing concerns about global chip shortages.

“Digitalization is transforming society faster than ever as people use technology to overcome the barriers created by the global pandemic to connect, collaborate, and solve problems,” TSMC CEO Dr. C.C. Wei said in a statement. “This digital transformation has opened up a new world full of opportunities for the semiconductor industry.”




Fortnite Season 6, Week 10 Challenge Guide: How to Destroy Opponent Structure While Driving a Modded Vehicle

With the latest set of Fortnite challenges for season 6, week 10, you’ll likely have an easy time blasting through most of them. However, one that might cause some problems is to destroy an opponent’s structure while driving a modded vehicle. There are a few moving parts with this one, so you’ll want to do everything you can to prepare yourself for it. On paper, it might not seem too difficult, but a few things have to go right for it to work.

In this guide, we’ll show you everything you’ll need to know about destroying an opponent’s structure while driving a modded vehicle in Fortnite.


How to mod your vehicle

This challenge is a callback to one from week 5 of this season. We covered how to mod your vehicle in that guide, but the short version is that you need to find tires around the map. Using the map above (credit to its original creator), you’ll see numerous locations that feature tires you can use to upgrade your vehicles. Although there are a lot of areas where you can do this, we recommend visiting one close to a high-traffic area so you can prepare for the next step.

For instance, the areas around Retail Row are great since there are a few spots nearby that are frequently visited. You’ll find either a tire you can pick up or a stack of tires that can be destroyed. Either way, you’ll need to pick up the tire and throw it at the vehicle of choice to upgrade it. This will give your vehicle large tires, lifting it off the ground even higher.

How to destroy an opponent’s structure while driving a modded vehicle


So, once you’ve acquired a modded vehicle, you should drive around while looking for enemy player-built structures. This is tough because it’s usually unclear if these structures have been built by enemies or teammates. The most effective way to do this is to keep driving your vehicle around while running through any structure you see. With some luck, you’ll run into an enemy’s structure and get credit for the challenge.

Alternatively, if you last until the later portions of a match, you’ll almost certainly come across an enemy who will build in front of themselves for protection. When this happens, ram into the structure they’ve built to gain credit. You can attempt this in Battle Royale or Team Rumble. Both have pros and cons, but we’ve found it easier to finish in Battle Royale since there are more enemies in any given match. This means there’s a higher chance of finding their structures around — but it also means more opportunities to get taken out.

Either way, break through one structure, and you’ll earn 24,000 XP.



Gartner says low-code, RPA, and AI driving growth in ‘hyperautomation’


Research firm Gartner estimates the market for hyperautomation-enabling technologies will reach $596 billion in 2022, up nearly 24% from $481.6 billion in 2020.

Gartner is expecting significant growth for technology that enables organizations to rapidly identify, vet, and automate as many processes as possible and says it will become a “condition of survival” for enterprises. Hyperautomation-enabling technologies include robotic process automation (RPA), low-code application platforms (LCAP), AI, and virtual assistants.

As organizations look for ways to automate the digitization and structuring of data and content, technologies that automate content ingestion, such as signature verification tools, optical character recognition, document ingestion, conversational AI, and natural language technology (NLT), will be in high demand. For example, these tools could be used to automate the process of digitizing and sorting paper records.

Gartner currently anticipates the hyperautomation market reaching $532.4 billion this year.

Drivers of growth

Gartner said process-agnostic tools such as RPA, LCAP, and AI will drive the hyperautomation trend because organizations can use them across multiple use cases. Even though they constitute a small part of the overall market, their impact will be significant, with Gartner projecting 54% growth in these process-agnostic tools.

Through 2024, the drive toward hyperautomation will lead organizations to adopt at least three of the 20 process-agnostic types of software that enable hyperautomation, Gartner said.

The demand for low-code tools is already high as skills-strapped IT organizations look for ways to move simple development projects over to business users. Last year, Gartner forecast that three-quarters of large enterprises would use at least four low-code development tools by 2024 and that low-code would make up more than 65% of application development activity.

Software automating specific tasks, such as enterprise resource planning (ERP), supply chain management, and customer relationship management (CRM), will also contribute to the market’s growth, Gartner said.

Lots of potential use cases

Hyperautomation extends the idea of intelligent automation, as it promises end-to-end process automation with minimal human intervention required. The convergence of intelligent process automation technologies and cloud computing, along with the need to process unstructured content, helps make the case for hyperautomation across several industries, including shared services, hospitality, logistics, and real estate.

Some day-to-day examples of automation include self-driving cars, self-checkouts at grocery stores, smart home assistants, and appliances. Business use cases include applying data and machine learning to build predictive analytics that react to consumer behavior changes and implementing RPA to streamline operations on a manufacturing floor.

Gartner earlier included hyperautomation in its Top 10 Strategic Technology Trends for 2021.

Benefits of hyperautomation

Gartner said tools that provide visibility to map business activities, automate and manage content ingestion, orchestrate work across multiple systems, and provide complex rule engines make up the fastest-growing category of hyperautomation-enabling software. Organizations will be able to lower operational costs by 30% by 2024 through combining hyperautomation technologies with redesigned operational processes, Gartner projected.

“Hyperautomation has shifted from an option to a condition of survival,” Gartner VP Fabrizio Biscotti said in a statement. “Organizations will require more IT and business process automation as they are forced to accelerate digital transformation plans in a post-COVID-19, digital-first world.”


The Best Cars for Driving in the Snow

If you regularly drive through blizzards, look no further than the Subaru Crosstrek the next time you’re shopping for a car. It’s rugged, it’s reliable, and above all, it’s extremely capable, thanks in part to Subaru’s time-tested all-wheel drive system and a generous amount of ground clearance. It remains user-friendly around town, too.

There are other good options if the Crosstrek is too small or underpowered for your needs. Digital Trends has traveled to the coldest, snowiest parts of the world to find out which cars keep old man winter at bay and which ones get stuck on ice. We’ve also selected the best electric snow car and the best luxury snow car, among other choices.

Product | Category | Rating
Subaru Crosstrek | Best snow car overall | Not yet rated
Volvo V90 Cross Country | Best luxury snow car | Not yet rated
Audi E-Tron | Best electric snow car | Not yet rated
Subaru WRX | Best performance snow car | Not yet rated
Jeep Grand Cherokee | Best SUV for the snow | 3.5 out of 5

The best: Subaru Crosstrek

Why you should buy this: It will get you where you need to go, regardless of the weather.

Who it’s for: The winter-weary.

How much it will cost: $22,245+

Why we picked the Subaru Crosstrek:

Almost every Subaru is a good winter car. With the notable exception of the rear-wheel-drive BRZ sports car, which just entered its second generation, every model in the Japanese automaker’s lineup comes standard with all-wheel drive. In particular, we think the Crosstrek hatchback is a good all-around package for winter driving.

The Crosstrek is basically an Impreza hatchback with extra ground clearance and plastic body cladding added to mimic the styling of SUVs. It isn’t an SUV though; it proves that you don’t need one.

All-wheel drive lets the Crosstrek handle all sorts of nasty weather, and the extra ground clearance is helpful on dirt roads. The rest of the time, the Crosstrek drives like a normal car. Its compact dimensions give it relatively responsive handling, and its acceleration is adequate, though we wouldn’t call it fast. Subaru did add a bigger engine for the 2021 model year that puts much-needed extra power under the driver’s right foot. All told, it’s a well-executed package with handsome styling, a spacious interior, and a modern infotainment system. What more do you need?

The best luxury car for the snow: Volvo V90 Cross Country

Why you should buy this: It’s a masterpiece of Swedish design.

Who it’s for: People who want a rugged wagon with more appeal than a Subaru Outback.

How much it will cost: $54,900+

Why we picked the Volvo V90 Cross Country:

Volvo has been building its Cross Country-badged models in one form or another since 1997. They’re station wagons (and, rarely, sedans) with SUV-like ground clearance and rugged-looking styling cues such as plastic body cladding.

All-wheel drive turns the V90 Cross Country into a true winter warrior. Digital Trends tested it in the middle of winter in northern Sweden, and it never got stuck. It offered excellent traction even on a frozen lake. In addition to an extra dose of ruggedness, the V90 Cross Country offers everything that’s great about recent additions to the Volvo family, like an ergonomic interior made with high-quality materials, and user-friendly tech features.

Volvo offers the V90 Cross Country with a supercharged and turbocharged 2.0-liter four-cylinder engine tuned to deliver 316 horsepower. That makes for brisk acceleration, but the Cross Country is happier when it’s cruising on the highway. It’s perfect for, well, crossing the country.

The best electric car for the snow: Audi E-Tron

Why you should buy this: It offers an electrified version of Audi’s Quattro system.

Who it’s for: Tech-savvy motorists.

How much it will cost: $65,900+

Why we picked the Audi E-Tron:

Quattro all-wheel drive is one of Audi’s claims to fame. It helped the company dominate the rallying scene during the 1980s, and it allows thousands of motorists to drive through awful weather each year. Going electric wasn’t an excuse for Audi to ditch Quattro; it perfected it. Two electric motors power the E-Tron — one is mounted over the front axle to spin the front wheels, and the other is positioned above the rear axle to zap the rear wheels into motion.

This is called a through-the-road setup, because there’s no physical connection between the axles, yet all four wheels are driven. Speaking to Digital Trends, Audi engineer Tobias Greiner compared the powertrain his team developed to a network. The different components share information and work together to decide how much torque each axle needs in real time. For example, if the armada of sensors detects understeer during hard cornering, the system brakes the inside wheels to counter it. If the sensors detect that the rear axle has lost traction, they send more torque to the front wheels to keep the car moving. In other words, snow and sand won’t stop the E-Tron in its tracks.
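
The decision logic Greiner describes can be sketched in a few lines. This is purely illustrative, not Audi’s actual control code; the function name, the slip threshold, and the linear biasing rule are all invented for the example.

```python
# Illustrative sketch of a through-the-road AWD torque split.
# NOT Audi's control code: names, thresholds, and the biasing rule
# are invented for illustration.

def split_torque(total_torque_nm, front_slip, rear_slip, slip_threshold=0.1):
    """Return (front_nm, rear_nm): start from an even split, then
    shift torque away from whichever axle is slipping the most."""
    front_share = 0.5
    if rear_slip > slip_threshold and rear_slip > front_slip:
        # Rear wheels spinning: bias torque toward the front axle.
        front_share = min(0.9, 0.5 + (rear_slip - slip_threshold))
    elif front_slip > slip_threshold and front_slip > rear_slip:
        # Front wheels spinning: bias torque toward the rear axle.
        front_share = max(0.1, 0.5 - (front_slip - slip_threshold))
    return total_torque_nm * front_share, total_torque_nm * (1 - front_share)


# Rear axle on ice: most of the 400 Nm moves to the front wheels.
front, rear = split_torque(400, front_slip=0.02, rear_slip=0.35)
```

A real controller would also weigh steering angle, yaw rate, and individual wheel speeds, and would update the split many times per second.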

We also liked the E-Tron’s infotainment system, which is one of the most intuitive systems on the market, and we appreciated its smooth, silent ride on the highway. It’s a good daily driver — even in the winter — that just happens to be electric.

The best performance car for the snow: Subaru WRX

Why you should buy this: It’s a performance car that foul weather can’t stop.

Who it’s for: Snowbound speed freaks.

How much it will cost: $27,495+

Why we picked the Subaru WRX:

If the Crosstrek is a good all-rounder for winter driving, then the WRX is a performance-focused smile machine that plays well in slippery conditions. Like the Crosstrek, the WRX is a derivative of the Subaru Impreza compact, but it’s based on an older body style. That’s not the difference that really counts, though.

The WRX packs a turbocharged 2.0-liter boxer-four engine that produces 268 hp and 258 lb-ft of torque. (Subaru also offers the WRX STI, powered by a 305-hp, 2.5-liter engine.) All-wheel drive allows the WRX to keep going when most other performance cars would be spinning off the road and into snowbanks. Torque vectoring channels power from side to side, helping to turn the car into corners. That’s something you’ll appreciate even on dry pavement.

All-wheel drive isn’t the only thing that makes the WRX a practical choice. Underneath the boy-racer hood scoop and quad exhaust tips, it’s still a practical four-door sedan. A reasonably sized interior and trunk, as well as good road manners, make the WRX a performance car you’ll actually want to use every day.

The best SUV for the snow: Jeep Grand Cherokee

Why you should buy this: It’s a family SUV for the Rubicon Trail.

Who it’s for: Outdoorsy types.

How much it will cost: $36,220+ (4×4)

Why we picked the Jeep Grand Cherokee:

When you think of a Jeep, you picture a vehicle with impressive off-road prowess for the serious adventurer. For the company, this is more than clever marketing: Jeep puts an impressive amount of hardware into its vehicles to help them tackle the elements. Like the Wrangler, the Grand Cherokee benefits from Jeep’s decades-long expertise in building serious go-anywhere off-roaders, but it also reflects the firm’s upmarket ambitions.

The Grand Cherokee has a long legacy of being a capable winter vehicle and can handle snowy roads with ease. Whether you’re heading to the store or the slopes, the Grand Cherokee has plenty of room for five adults and gear. 

The Grand Cherokee is available in 12 trim levels, all of which include a touchscreen-based Uconnect infotainment system for easy navigation on your winter excursions. While the standard V6 is extremely capable, the hot-rodded Trackhawk delivers an astonishing 707 horsepower for those wanting extra performance.

How we test

Our team evaluates each vehicle we review by using extensive testing processes. We take advantage of the car experts on staff to ensure that the reviews posted on Digital Trends give buyers all of the information they need before making such a major decision.

Each vehicle is put through real-world testing on highways and back roads, as well as off-road and on race tracks when applicable. We use experienced test drivers to take the vehicles out in all sorts of weather so they can accurately report on the safety levels of each car.

We also evaluate all of a vehicle’s safety features, testing as many as possible in a controlled environment. The experts test and review each vehicle from the inside out to make sure you’re getting exactly what you expect according to the dealer’s listing. Vehicles are ranked based on others in their class to help you in your decision process.


Is your internet connection driving you mad? Here’s what might be behind it

For most people, diagnosing a dodgy internet connection is nigh on impossible. After all, the internet is a complex hodgepodge of hardware and software, and the odd jumpy Zoom call is often accepted as an inexplicable quirk of a network we don’t fully understand.

But internet connection issues are actually quite easy to explain. They arise when the flow of data along internet cables is interrupted, most frequently when the demand to use the cables is very high. That’s why your connection seems worst during “peak TV viewing” hours, when everyone’s trying to stream videos using the same cables at the same time.

And while modern fiber-optic cables deliver faster internet speeds, it’s likely that we’ll always experience frustratingly slow internet from time to time. That slowness is a byproduct of a network built to be flexible, and of the finite capacity of the cables that support it.
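
The peak-hour effect comes down to simple arithmetic: a shared link’s capacity is divided among everyone using it at once. The figures below are invented for illustration, not real network numbers.

```python
# Why streaming slows at peak hours: a shared link's capacity is
# split among its active users. Figures are illustrative only.

def per_user_mbps(link_capacity_mbps, active_users):
    """Fair-share bandwidth per user on a fully utilised link."""
    return link_capacity_mbps / active_users

# A hypothetical 10 Gbit/s neighbourhood link:
off_peak = per_user_mbps(10_000, 500)   # 500 users   -> 20.0 Mbit/s each
peak = per_user_mbps(10_000, 4_000)     # 4,000 users -> 2.5 Mbit/s each
```

Real networks share capacity less evenly than this, but the principle holds: eight times the users at 8 p.m. means a fraction of the bandwidth each.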

The physical network

The internet is a network of cables that send digital data across vast distances at close to the speed of light. Between countries and continents, the internet is distributed via a vast series of undersea cables. Within countries, smaller cables run underground until they eventually branch into each of our homes.
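
“Close to the speed of light” is easy to quantify: light in optical fibre travels at roughly two-thirds of its vacuum speed, so a link’s minimum one-way delay follows directly from its length. The distance used below is an approximate figure for illustration.

```python
# Minimum one-way delay down a fibre link. Light in glass travels at
# roughly 2/3 of its vacuum speed, about 200,000 km/s.

SPEED_IN_FIBRE_KM_S = 200_000

def one_way_latency_ms(distance_km):
    return distance_km / SPEED_IN_FIBRE_KM_S * 1_000

# A ~5,600 km transatlantic cable: about 28 ms one way, before any
# routing or queuing delays are added on top.
transatlantic_ms = one_way_latency_ms(5_600)
```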

In the UK, BT and Virgin Media are the major cable infrastructure providers. It’s they who physically plug the internet into UK homes, and they’re also responsible for laying and updating the underground cables that carry your data around the country or to the undersea cables to go further afield.

Some homes have “fiber to the premises” (FTTP) connections, connecting homes directly to fiber optic cables which can carry digital data incredibly quickly. But most UK homes have “fiber to the cabinet” (FTTC) connections, which are a little slower.

These deliver a high-speed fiber-optic connection to local internet cabinets, from which slower copper wires run the “final mile” to surrounding homes. Copper can only carry analog signals, so digital data has to be continually converted to analog (and back again) in homes connected to the internet via copper wires.
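
The practical difference between FTTP and FTTC can be sketched as a “weakest segment” calculation: the line runs at the slower of the fibre rate and the copper rate, and copper slows with distance from the cabinet. The linear attenuation model below is a made-up approximation for illustration, not a real VDSL rate curve.

```python
# FTTP vs FTTC, sketched: a connection is only as fast as its slowest
# segment, and the copper "final mile" loses speed with distance.
# The linear copper model is invented for illustration.

def fttc_speed_mbps(fiber_mbps, copper_metres):
    copper_mbps = max(80 - 0.06 * copper_metres, 5)  # toy attenuation model
    return min(fiber_mbps, copper_mbps)

def fttp_speed_mbps(fiber_mbps):
    return fiber_mbps  # fibre runs all the way to the home

# 300 m from the cabinet, copper caps the line far below the fibre rate;
# 2 km out, the toy model bottoms out at its floor.
near_cabinet = fttc_speed_mbps(1_000, 300)
far_from_cabinet = fttc_speed_mbps(1_000, 2_000)
```

Whatever the exact rate curve, the shape of the result is the same: with FTTC, your distance from the cabinet, not your fibre backbone, sets your speed.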