How digital twins are transforming network infrastructure: Future state (part 2)

This is the second of a two-part series. Read part 1 for a look at the current state of networking, how digital twins are being used to help automate it, and the shortcomings involved.

As noted in part 1, digital twins are starting to play a crucial role in automating the process of bringing digital transformation to networking infrastructure. Today, we explore the future state of digital twins – comparing how they’re being used now with how they can be used once the technology matures.

The market for digital twins is expected to grow at a whopping 35% CAGR (compound annual growth rate) between 2022 and 2027, from a valuation of $10.3 billion to $61.5 billion. Internet of things (IoT) devices are driving a large percentage of that growth, and campus networks represent a critical aspect of infrastructure required to support the widespread rollout of the growing number of IoT devices.

Current limitations of digital twins

One of the issues plaguing the use of digital twins today is that network digital twins typically only help model and automate pockets of a network isolated by function, vendors or types of users. However, enterprise requirements for a more flexible and agile networking infrastructure are driving efforts to integrate these pockets.


Several network vendors, including Forward Networks, Gluware, Intentionet and Keysight (via its recent acquisition of Scalable Network Technologies), are starting to support digital twins that work across vendors to improve configuration management, security, compliance and performance.

Companies like Asperitas and Villa Tech are creating “digital twins-as-a-service” to help enterprise operations.

In addition to the challenge of building a digital twin for multivendor networks, there are other limitations that digital twin technology needs to overcome before it’s fully adopted, including:

  • The types of models used in digital twins need to match the actual use case. 
  • Building the model, supporting multiple models and evolving the model over time all require significant investment, according to Balaji Venkatraman, VP of product management, DNA, at Cisco.
  • Keeping the data lake current with the state of the network. If the digital twin operates on older data, it will return out-of-date answers. 
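The staleness risk in the last point can be guarded against mechanically: refuse to answer from a snapshot older than some freshness bound. A minimal, vendor-neutral sketch (the snapshot structure and 15-minute threshold are illustrative assumptions, not any product's API):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class TwinSnapshot:
    """State snapshot a network digital twin answers queries from."""
    collected_at: datetime      # when device state was last polled
    device_count: int

def is_stale(snapshot: TwinSnapshot,
             max_age: timedelta = timedelta(minutes=15)) -> bool:
    """Flag snapshots older than max_age so a query can trigger
    re-collection instead of silently returning out-of-date answers."""
    return datetime.now(timezone.utc) - snapshot.collected_at > max_age

fresh = TwinSnapshot(datetime.now(timezone.utc), device_count=1200)
old = TwinSnapshot(datetime.now(timezone.utc) - timedelta(hours=2),
                   device_count=1200)
print(is_stale(fresh), is_stale(old))  # False True
```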

Future solutions

Manas Tiwari, client partner for cross-industry comms solutions at Capgemini Engineering, believes that digital twins will help roll out disaggregated networks composed of different equipment, topologies and service providers, in the same way enterprises now provision services across multiple cloud providers. 

Tiwari said digital twins will make it easier to model different network designs up front and then fine-tune them to ensure they work as intended. This will be critical for widespread rollouts in healthcare, factories, warehouses and new IoT businesses. 

Vendors like Gluware, Forward Networks and others are creating real-time digital twins that simulate network, security and automation environments, forecasting where problems may arise before changes are rolled out. These tools are also starting to plug into continuous integration and continuous deployment (CI/CD) tools to support incremental updates and rollbacks using existing devops processes.
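In a CI/CD pipeline, that pre-deployment check amounts to applying a candidate change to a copy of the twin's state and running invariant checks before anything touches real devices. A hypothetical sketch (the state dictionary and the invariants are made up for illustration):

```python
def validate_change(twin_state: dict, change: dict, invariants) -> list[str]:
    """Apply a candidate config change to a copy of the twin's state and
    run invariant checks; return the names of failing checks so the
    pipeline can gate the rollout instead of causing an outage."""
    candidate = {**twin_state, **change}   # simulate the change off-line
    return [name for name, check in invariants if not check(candidate)]

invariants = [
    ("mtu-consistent", lambda s: s["mtu"] >= 1500),
    ("mgmt-vlan-present", lambda s: "vlan100" in s["vlans"]),
]

current = {"mtu": 9000, "vlans": ["vlan100", "vlan200"]}
failures = validate_change(current, {"mtu": 1400}, invariants)
print(failures)  # ['mtu-consistent']
```

A failing list blocks the deployment step and leaves the last known-good configuration in place, matching the rollback workflow described above.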

Cisco has developed tools for what-if analysis, change impact analysis, network dimensioning and capacity planning. These areas are critical for proactive and predictive analysis that prevents network or service downtime and avoids adverse impacts on user experience.

Overcoming the struggle with new protocols

Early modeling and simulation tools, such as GNS3 virtual labs, help network engineers understand what is going on in a network in terms of traffic paths, connectivity and isolation of network elements. Still, they often struggle with new protocols, new domains or scaling to more extensive networks. They also need to simulate not just the ideal flow of traffic, but all the ways traffic could break and paths could become isolated from the rest of the network. 
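At its simplest, the connectivity-and-isolation analysis these tools perform reduces to graph reachability over the modeled topology. A toy sketch (the topology is invented; production tools model forwarding state, ACLs and failure modes, not just links):

```python
from collections import deque

def reachable(links: dict[str, set[str]], src: str) -> set[str]:
    """Breadth-first search over the modeled topology: every node
    traffic from src can reach under the current link state."""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        for nbr in links.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

links = {"edge1": {"core"}, "core": {"edge1", "edge2"},
         "edge2": {"core"}, "lab": set()}   # 'lab' has no links up
print("lab" in reachable(links, "edge1"))   # False -> lab is isolated
```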

Christopher Grammer, vice president of solution technology at IT solutions provider Calian, told VentureBeat that one of the biggest challenges is that real network traffic is random. The network traffic produced by a coffee shop full of casual internet users is a far cry from the needs of petroleum engineers working with real-time drilling operations. Simulating network performance is therefore subject to users’ needs, which can change at any time, making performance difficult to predict.

Modeling tools are also costly to scale up. 

“The cost difference between simulating a relatively simple residential network model and an AT&T internet backbone is astronomical,” Grammer said. 

Thanks to algorithmic and hardware improvements, vendors like Forward Networks are starting to scale these computations to support networks of hundreds of thousands of devices.

Testing new configurations

The crowning use case for networking digital twins is evaluating different configuration settings before updating or installing new equipment. Digital twins can help assess the likely impact of changes to ensure equipment works as intended. 

In theory, these could eventually make it easier to assess the performance impact of changes. However, Mike Toussaint, senior director analyst at Gartner, said it may take some time to develop new modeling and simulation tools that account for the performance of newer chips.

One of the more exciting aspects is that these modeling and simulation capabilities are now being integrated with IT automation. Ernest Lefner, chief product officer at Gluware, which supports intelligent network process automation, said this allows engineers to connect inline testing and simulation with tools for building, configuring, developing and deploying networks. 

“You can now learn about failures, bugs, and broken capabilities before pushing the button and causing an outage. Merging these key functions with automation builds confidence that the change you make will be right the first time,” he said.

Wireless analysis

Equipment vendors such as Juniper Networks are using artificial intelligence (AI) to incorporate various kinds of telemetry and analytics, automatically capturing information about wireless infrastructure to identify the best layout for wireless networks. Ericsson has started using Nvidia Omniverse to simulate 5G reception in a city. Nearmap recently partnered with Digital Twin Sims to feed dynamically updated 5G coverage maps into 5G planning and operating systems. 

Security and compliance

Grammer said digital twins could help improve network heuristics and behavioral analysis aspects of network security management. This could help identify potentially unwanted or malicious traffic, such as botnets or ransomware. Security companies often model known good and bad network traffic to teach machine learning algorithms to identify suspicious network traffic. 
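A crude illustration of the behavioral idea: hosts that fan out to an unusually large number of distinct peers (as scanning bots do) stand out against a baseline. Real products learn these baselines from labeled good and bad traffic rather than using the fixed cutoff assumed here:

```python
def flag_fanout(flows: list[tuple[str, str]], max_peers: int = 100) -> set[str]:
    """Flag source hosts contacting an unusually large number of distinct
    peers -- a crude stand-in for learned behavioral models."""
    peers: dict[str, set[str]] = {}
    for src, dst in flows:
        peers.setdefault(src, set()).add(dst)
    return {src for src, dsts in peers.items() if len(dsts) > max_peers}

flows = [("10.0.0.5", f"198.51.100.{i}") for i in range(150)]  # scanning host
flows += [("10.0.0.7", "198.51.100.1")] * 20                   # normal host
print(flag_fanout(flows))  # {'10.0.0.5'}
```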

According to Lefner, digital twins could model real-time data flows for complex audit and security compliance tasks. 

“It’s exciting to think about taking complex yearly audit tasks for things like PCI compliance and boiling that down to an automated task that can be reviewed daily,” he said. 

Coupling these digital twins with automation could allow a step change in challenging tasks like identifying up-to-date software and remediating newly identified vulnerabilities. For example, Gluware combines modeling, simulation and robotic process automation (RPA) to allow software robots to take actions based on specific network conditions. 

Peyman Kazemian, cofounder of Forward Networks, said they are starting to use digital twins to model network infrastructure. When a new vulnerability is discovered in a particular type of equipment or software version, the digital twins can find all the hosts that are reachable from less trustworthy entry points to prioritize the remediation efforts. 
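That prioritization can be sketched as a reachability query against the twin: patch first the vulnerable hosts an attacker can actually reach from untrusted entry points. The topology and host names below are hypothetical:

```python
from collections import deque

def prioritize(links: dict[str, set[str]], entry_points: set[str],
               vulnerable: set[str]) -> tuple[list[str], list[str]]:
    """Split vulnerable hosts into (urgent, deferred): urgent hosts are
    reachable from untrusted entry points, deferred ones are not."""
    seen = set(entry_points)
    queue = deque(entry_points)
    while queue:
        node = queue.popleft()
        for nbr in links.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return sorted(vulnerable & seen), sorted(vulnerable - seen)

links = {"internet": {"dmz"}, "dmz": {"app1"}, "app1": {"db"}, "mgmt": {"db"}}
urgent, deferred = prioritize(links, {"internet"}, {"app1", "db", "mgmt"})
print(urgent, deferred)  # ['app1', 'db'] ['mgmt']
```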

Cross-domain collaboration

Network digital twins today tend to focus on one particular use case, owing to the complexities of modeling and transforming data across domains. Teresa Tung, cloud first chief technologist at Accenture, said that new knowledge graph techniques are helping to connect the dots. For example, a digital twin of the network can combine models from different domains such as engineering R&D, planning, supply chain, finance and operations. 

They can also bridge workflows between design and simulations. For example, Accenture has enhanced a traditional network planner tool with new 3D data and an RF simulation model to plan 5G rollouts. 

Connect2Fiber is using digital twins to help model its fiber networks to improve operations, maintenance and sales processes. Nearmap’s drone management software automatically inventories wireless infrastructure to improve network planning and collaboration processes with asset digital twins. 

These efforts could all benefit from the kind of innovation driven by building information models (BIM) in the construction industry. Jacob Koshy, information technology and communications associate at the engineering consultancy Arup, predicts that comparable network information models (NIMs) could play a similarly transformative role in building complex networks. 

For example, the RF propagation analysis and modeling for coverage and capacity planning could be reused during the installation and commissioning of the system. Additionally, integrating the components into a 3D modeling environment could improve collaboration and workflows across facilities and network management teams.

Emerging digital twin APIs from companies like Mapped, Zyter and PassiveLogic might help bridge the gap between dynamic networks and the built environment. This could make it easier to create comprehensive digital twins that include the networking aspects involved in more autonomous business processes. 

The future is autonomous networks

Grammer believes that improved integration between digital twins and automation could help fine-tune network settings based on changing conditions. For example, business traffic may predominate in the daytime and shift to more entertainment traffic in the evening. 

“With these new modeling tools, networks will automatically be able to adapt to application changes switching from a business video conferencing profile to a streaming or gaming profile with ease,” Grammer said. 
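A toy sketch of that adaptation: pick a QoS profile from the observed traffic mix, falling back to time of day. The profile names and thresholds are assumptions for illustration, not Calian's implementation:

```python
PROFILES = {
    "business": {"video_conf": "high", "streaming": "low"},
    "evening": {"video_conf": "low", "streaming": "high"},
}

def pick_profile(hour: int, traffic_mix: dict[str, float]) -> str:
    """Prefer the live traffic mix; fall back to time of day when the
    mix is ambiguous. A real controller would also damp flapping."""
    if traffic_mix.get("streaming", 0.0) > traffic_mix.get("video_conf", 0.0):
        return "evening"
    if traffic_mix.get("video_conf", 0.0) > traffic_mix.get("streaming", 0.0):
        return "business"
    return "business" if 8 <= hour < 18 else "evening"

print(pick_profile(10, {"video_conf": 0.7, "streaming": 0.2}))  # business
print(pick_profile(20, {"video_conf": 0.1, "streaming": 0.8}))  # evening
```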

How digital twins will optimize network infrastructure

The most common use case for digital twins in network infrastructure is testing and optimizing network equipment configurations. Down the road, they will play a more prominent role in testing and optimizing performance, vetting security and compliance, provisioning wireless networks and rolling out large-scale IoT networks for factories, hospitals and warehouses. 

Experts also expect to see more direct integration into business systems such as enterprise resource planning (ERP) and customer relationship management (CRM) to automate the rollout and management of networks to support new business services.

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.

Repost: Original Source and Author Link


How SoftIron used digital twins to reduce its carbon footprint

A few years back, SoftIron, which makes data center hardware for software-defined storage, turned to digital twins to help optimize its hardware, not just for cost and performance, but also to dramatically reduce its carbon footprint. A recent assessment by ESG investment firm Earth Capital found that these efforts are paying off.

SoftIron’s latest products generate about 20% of the heat of comparable enterprise storage products and consume as little as 20% of the power. In total, Earth Capital estimates that every 10 petabytes of SoftIron storage installed translates to savings of about 6,656 tons of carbon, compared to industry norms. 

SoftIron COO Jason Van der Schyff detailed how the company has used digital twins to achieve such impressive gains in an exclusive interview with VentureBeat. He also explained how the company brought environmental considerations into the design workflow for both its products and the factories that build and ship them. This helped SoftIron recognize that a focus on I/O rather than CPU performance could satisfy enterprise requirements and sustainability goals alike. 

VentureBeat: How do you go about building digital twins to optimize your carbon and energy footprint?

Jason Van der Schyff: At SoftIron, we use a variety of digital twinning strategies across our physical products and our facilities and supply chain to determine and analyze our carbon and energy footprint. Our products are entirely digitally modeled, from the foundational circuit boards to mechanical components and all internal active and passive components. This allows us to not only model the thermal performance but also analyze both indigenous and foreign influences, such as vibrations induced from harmonic oscillations caused by cooling fans – an innovation we were recently awarded a patent for. This type of analysis in digital form allows us to adapt our designs to create less heat, use less cooling and therefore less energy, allowing us to provide our customers with some of the lowest power consumption in the market and aid them in their carbon reduction goals.

With respect to our manufacturing, digital twins provide efficiency in designing and deploying new manufacturing techniques. It enables us to model digitally the impact of design changes in the production workflow across our various manufacturing sites before it ever manifests in the real world – all without wasting materials. 

SoftIron’s factory digital twin enables innovation from our manufacturing center of excellence in Berlin. There we are able to model and control our global manufacturing footprint as a single global capability. While this modeling sometimes happens thousands of miles from where actual production is taking place, it means that the physical product can be manufactured close to the point of consumption, in a way that utilizes local supply chains, all of which has a positive impact on both supply chain resilience and sustainability. Digital twinning underpins this strategy – which we call “Edge Manufacturing.”

VentureBeat: What kind of tools do you use to store the raw data and share it among different stakeholders in the process?

Van der Schyff: As a designer and manufacturer of enterprise storage, SoftIron chooses to deploy our digital twin on our own infrastructure in our facility in Berlin, with real-time resilience provided by geo-replication across our facilities in California and Sydney. A unified internal network allows all collaborators direct access in real-time to collaborate and contribute to the iteration of the digital twin designs.

VentureBeat: What is involved in identifying some of the biggest contributions to inefficiency and then mitigating these in the final products?

Van der Schyff: The bulk of inefficiencies is introduced through waste, be it wasted power, extraneous componentry or even manufacturing wasted time. By developing a digital twin throughout the development process, we’re able to model and analyze inefficiencies in the design and functionality of our products. This improves quality and minimizes rework. Our manufacturing floor is further modeled to provide accurate time and motion studies and utilize a variety of layouts to optimize efficiency before physical construction is completed to further mitigate inefficiencies.

VentureBeat: What have been some of your discoveries around the specific improvements or changes that led to the most significant impact?

Van der Schyff: Early in the history of the company, by modeling the performance and interaction between the hardware and software layer, we were able to determine that software-defined storage is primarily an I/O problem rather than a compute problem. This discovery informed the selection of components and the adoption of a low power ARM64 architecture to provide highly performant, yet economical storage appliances. These low-power appliances provide savings such that for every 10 PB of data storage shipped by SoftIron, an estimated 6,656 tons of CO2 are saved by reduced energy consumption alone in the customer data center over its lifetime.

VentureBeat: How do digital twins fit into this process?

Van der Schyff: Digital twins provide open access to all data in one place, increasing cross-border and asynchronous collaboration. Through this collaboration, SoftIron can bring cross-functional expertise to each design, be it a product or a manufacturing process, to observe and mitigate inefficiencies and exploit opportunities to optimize our carbon and energy footprint. 

The significant supply chain disruptions we have seen over the last year or more have only highlighted the weaknesses in the way IT is currently produced. In this way, we believe that sustainability and resilience are inextricably linked. Manufacturing has historically placed all of its eggs in a few very large, low-cost baskets around the world, driving for ever-increased volumes of smaller and smaller component variation in order to drive out costs.

Digital twinning is one enabling technology (along with the current generations of super flexible, efficient low-volume assembly line machinery) that helps to break the cost-to-volume equation apart. This fosters small, distributed manufacturing operations, opens up the supply chain to more local, perhaps lower volume suppliers and, over time, enables a more resilient, sustainable, global IT industry to emerge. SoftIron, we believe, is at the vanguard of this, but we expect this model to become more widespread over the coming decade.

VentureBeat: What’s next?

Van der Schyff: As SoftIron expands its Edge Manufacturing strategy, further opportunities will become available to optimize our carbon and energy footprint and implement further reductions by shortening supply chains, increasing local recycling opportunities and drastically reducing the amount of energy spent in delivering SoftIron’s low-energy appliances to its customers.

We believe what we are doing will serve as both a model and catalyst for others to follow. Over the last 12 months we have seen some major announcements regarding developing chip production in the U.S. and Europe and we hope that by the time these facilities come online, there will be a U.S. and European IT manufacturing economy, of which SoftIron is a leading part.



12 factors heating up the popularity of digital twins and simulations

The concept of digital twins is a leading trend in enterprise strategy. It gets its name from the way that companies are building virtual equivalents, or twins, of physical objects. These digital copies are increasingly popular because they can be used to drive important simulations that haven’t been possible until now.

Take a wind turbine, as an example of where digital twin technology comes in handy. The turbine can be outfitted with sensors, which produce real-time data about the turbine’s performance, be it speed, energy output, or weather conditions. This data can then be used to make a digital copy of the turbine, including a 3D digital representation. Machine learning and other models can be applied to recognize patterns in this turbine — for example, whether it is working optimally. The digital copy can be used to run simulations without bothering the original turbine, and improvements can then be fed back to the original.
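The turbine example can be sketched in a few lines: the twin mirrors sensor readings and answers what-if questions from the copy, never disturbing the physical asset. The power model here is a deliberate oversimplification; real twins use calibrated power curves:

```python
from dataclasses import dataclass, field

@dataclass
class TurbineTwin:
    """Toy digital twin: mirrors sensor readings and runs a simple
    simulation without touching the physical turbine."""
    readings: list = field(default_factory=list)

    def ingest(self, wind_speed: float, output_kw: float) -> None:
        """Mirror one (wind speed, power output) sensor reading."""
        self.readings.append((wind_speed, output_kw))

    def efficiency(self) -> float:
        """kW produced per unit wind speed, averaged over mirrored data."""
        return sum(p / w for w, p in self.readings) / len(self.readings)

    def simulate(self, wind_speed: float) -> float:
        """What-if: predicted output at a given wind speed, computed on
        the copy so the real turbine is never disturbed."""
        return self.efficiency() * wind_speed

twin = TurbineTwin()
twin.ingest(8.0, 1600.0)
twin.ingest(10.0, 2000.0)
print(twin.simulate(12.0))  # 2400.0 predicted kW
```

Improvements found this way (a new control setting, say) would then be fed back to the physical turbine, closing the loop the paragraph describes.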

Simulation drives interest in digital twins

Observers see significant demand for multi-physics simulations that present a holistic view across different physical domains like electronics, structures, and heat. This is critical for areas like noise and vibration. Top simulation techniques include computational fluid dynamics (CFD), multi-body systems (MBS) and finite element analysis (FEA).

Simulation is increasingly relevant in the manufacturing industry. Simulation software acts as an insurance policy for manufacturers, ABI Research principal analyst Michael Larner told VentureBeat, and there is an arms race in the supplier community over the algorithms that can be deployed. This “insurance” allows manufacturers to respond to rapid changes in consumer demand and to supply chain disruptions, such as the chip shortage currently hobbling the auto industry. ABI forecasts that simulation technologies for manufacturing could grow at a rate of 7.1% to reach $2.6 billion by 2030.

Others expect to see simulation advances used to improve various aspects of operations, particularly with the rise of the so-called “omniverse” for rendering models — referring to the use of things like VR and AR, automated data labeling, AI-powered physics, and improved supply chains.

12 ways simulation trends affect digital twins

1. Omniverse for collaboration

“The most exciting development in simulation and modeling tools over the next three to five years will be the evolution of the omniverse,” Blackshark CEO and cofounder Michael Putz said. Some more exciting improvements will involve AI-supported modeling for reconstructing buildings, live labeling, and AI frameworks. Top omniverse use cases will combine simulation and collaboration for urban planning, location scouting, architectural acceptance, logistics, UAV flight planning, and insurance.

2. Learning with less data

The generative AI techniques used in deepfakes are also getting better at refining and optimizing the simulation models used for different digital twins. As NASA JPL chief technology and innovation officer Chris Mattmann explained to VentureBeat, “The key is balancing between the need for labeled training data and realistic environmental simulation for ground truth of the digital twin environment.” He predicted more adoption of synthetic data techniques to improve model accuracy and efficacy with less manual labeling.

3. Covering gaps in physics

Modeling and simulation tools are improving using AI to build physics models from live data captured from physical industrial processes. Nnaisense’s CEO and cofounder Faustino Gomez told VentureBeat that digital twins from conventional physics models are too slow to be used for complex processes involving chemistry and fluid dynamics in real time. For example, Nnaisense worked with EOS GmbH to develop a digital twin for modeling heat in additive manufacturing processes without explicit physics models. These models can predict important phenomena in real time instead of days. Top AI algorithms he sees bridging the physics gap include geometric deep learning, neural ordinary differential equations (ODE) models, and contrastive learning.

4. Inferential models simulate manufacturability

Digital twin simulations have traditionally focused on simulating product performance characteristics. Improvements in sensors embedded in the manufacturing process are enabling inferential models that can simulate manufacturability characteristics affecting quality, cost and ease of assembly. Tempo Automation chief product officer Jeff Kowalski said that inferential modeling techniques automate the process of generating digital twin models through direct observation. This reduces the human effort in handcrafting the rules that go into a model, and it automatically updates models in response to changes in the environment.

5. Improving autonomous systems

Better digital twins could also improve models that guide autonomous cars, ships, forklifts, and even factories. Kalypso director of data science Jordan Reynolds told VentureBeat, “Major advancements in autonomous system performance are attributable to model predictive control (MPC), a digital twin methodology that simulates how a complex system will respond to operational inputs and changes in its environment.” These models are used to simulate dynamic system behavior and autonomously control these systems in the physical world. MPC is also used to simulate the spread of COVID-19 and determine the optimal interventions to accelerate its decline.
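MPC's core loop is easy to sketch: search candidate action sequences over a short horizon against the model, apply only the first action of the best sequence, then re-plan. The dynamics below (state changes by exactly the chosen action) are a placeholder for the digital twin's real predictive model:

```python
from itertools import product

def mpc_step(state: float, target: float, horizon: int = 3,
             actions: tuple = (-1.0, 0.0, 1.0)) -> float:
    """Receding-horizon control: evaluate every action sequence over the
    horizon against the (toy) model, return the first action of the best."""
    def cost(seq):
        s, total = state, 0.0
        for a in seq:
            s += a                       # placeholder dynamics: s' = s + a
            total += (s - target) ** 2   # penalize deviation along the path
        return total
    best = min(product(actions, repeat=horizon), key=cost)
    return best[0]

# Drive the state toward the target, re-planning at every step
state = 0.0
for _ in range(8):
    state += mpc_step(state, target=5.0)
print(state)  # 5.0
```

Real MPC replaces the brute-force search with an optimizer and the placeholder dynamics with the twin's learned or physics-based model, but the receding-horizon structure is the same.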

6. Simulation orchestration

Simulation containers promise to build on the success of application containers underpinning agile software development and deployment practices. Aveva chief technologist of XR Maurizio Galardo expects to see simulation tools move from finessing products designed to solve specific tasks to enabling a container of features that allow users to synthesize complete product designs quickly. These simulation microservices could be reused across different design, simulation, and production workflows.

7. Generative design of systems

Generative design techniques automate design suggestions from a set of starting specifications. PTC’s vice president of product management, Paul Sagar, explained that engineers have traditionally used generative design to create and optimize single parts. He expects improvements in algorithms and processing capacity to solve broader problems around simulating complete assemblies, such as seeing how a carburetor might perform using a digital twin of the complete car.

8. Engineering business products

Improvements in computational horsepower and interoperability are ushering in digital twins that combine business and technical simulation techniques. Deloitte Consulting national emerging tech research director Scott Buchholz explained, “Digital twins can be very useful for simulating things like the change from selling widgets to selling as-a-service.” For example, Bridgestone uses digital twins to optimize fleets’ cost per mile, maintenance, and tire selection. This helps business teams sell tire miles as a service and align engineering decisions around longevity and maintenance strategies to improve this new business model.

9. Supply chain collaboration

Simulation tool providers like Synopsys are finding ways to run simulations that span chip designs and the software that runs on them. This promises to improve collaboration for products like automotive chips, which have faced significant shortages owing to the integration challenges of more modern chip designs. Synopsys verification group vice president of engineering Tom De Schutter sees big promise in developing scalable digital threads that operate across the supply chain, from individual components through full system products. This includes digital twins of individual hardware designs, systems on a chip, electronic subsystems, and full systems. However, this will also require new infrastructure to capture, share, and track the fine-grained data powering these hybrid digital twins.

10. Smaller models

AI can also be honed to build smaller models, called reduced-order models, that require less data and compute power than traditional approaches, said Altair chief technology officer Brett Chouinard. He expects this to increasingly support provisioning sophisticated digital twin models on remote devices like edge gateways and equipment. These smaller digital twins will increasingly add value to new products and services. Chouinard said, “While this is already happening, it will only get more center-stage with newer applications built around it and resulting in increased sophistication and demand for more capacities at the edge.”
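A reduced-order model in its simplest possible form: sample the expensive model a few times, fit a cheap surrogate, and ship only the surrogate to the edge device. The least-squares line below stands in for real projection-based ROM techniques, and the "expensive model" is a made-up placeholder:

```python
def fit_reduced_order(xs, ys):
    """Least-squares line through samples of the expensive model --
    the simplest possible reduced-order model, cheap enough to run
    on an edge gateway."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return lambda x: my + slope * (x - mx)

def expensive_model(x):
    """Stand-in for a full physics simulation run in the data center."""
    return 3.0 * x + 2.0

xs = [0.0, 1.0, 2.0, 3.0]
rom = fit_reduced_order(xs, [expensive_model(x) for x in xs])
print(rom(10.0))  # 32.0 -- matches the full model on this linear example
```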

11. Multi-domain digital twins

Simulation integration techniques are also opening opportunities for multi-domain simulations. These build on multi-physics and integration techniques to support working across domains such as security and physical infrastructures like power grids and gas pipelines, Scalable Network Technologies CEO and founder Dr. Rajive Bagrodia said. For example, in the power grid, attacks that delay control of a circuit breaker or falsify sensor load reporting may cause a series of cascading effects with potentially catastrophic outcomes to a regional power grid.  Multi-domain digital twins that couple physical system simulators with network emulators could improve resilience, detection, and response to these kinds of scenarios.

12. Democratization of simulation

The democratization of simulation could open new planning and development opportunities for less technical business users. Today the simulation market principally addresses industrial designers and engineers in R&D departments. More accessible tools will lower the barriers to adoption for business users, purchasing departments, and subject matter experts. “Design decisions will be much smarter because, instead of new products being selected purely based on aesthetics or performance, they will be chosen based on a full range of factors,” Roger Assaker, president of Hexagon Manufacturing Intelligence’s MSC Software division, explained.




Qualcomm Smart Cities partner weaves IoT lighting into large-scale digital twins

IoT product provider Zyter has partnered with Juganu, a technology company providing solutions for the professional lighting market, to weave advanced lighting tech into smart cities.

Zyter’s IoT platform breaks down information silos by integrating and consolidating data from devices and applications, weaving IoT infrastructure into large-scale smart city digital twins for buildings, stadiums, campuses, and cities. It allows municipalities to gain visibility across a network of connected devices and sensors supported by analytics. Juganu uses light fixtures as the base for a self-orchestrated wireless grid that provides tech support for smart cities, which makes it a natural choice for this digital twin project.

“The partnership with Juganu will help us bring robust smart lighting, AI-enabled security, communication, and other capabilities to any of these verticals,” Zyter founder and CEO Sanjay Govil told VentureBeat.

Teaming up for smart city digital twins

The partnership will integrate Zyter’s core platform and Juganu’s lighting tech with tools and software from more than 400 Qualcomm Smart Cities Accelerator Program partners. Zyter is a core partner in Qualcomm’s program.

Juganu’s The Foam platform integrates with cameras, pedestrian counters, and edge computers to customize lighting and characterize foot traffic with privacy safeguards. Its latest lighting, which the company claims kills the COVID-19 virus, has attracted fresh funding from Comcast and NCR.

Previous Zyter partnerships have focused on construction safety (Everguard), clinical data management (TruCare), and remote patient monitoring (Ceiba). Its platform aims to unify app development across devices in verticals such as health care, education, logistics, retail, travel, and construction.

This partnership could also take advantage of Zyter’s work in creating lidar-based digital twins: virtual representations of an object or system that span its lifecycle, are updated with real-time data, and use simulation, machine learning, and reasoning to aid decision-making. Digital twins make it easier to interface with and control IoT devices in a given space. For example, a city manager can view and manage all the lights in both indoor and outdoor spaces, pull up camera surveillance feeds, or review all recorded incidents in an area while navigating the digital twin of a city or of individual buildings within it.
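The "manage all the lights in a space" scenario reduces to spatial queries against the twin's mirrored state, followed by bulk commands relayed to the physical fixtures. A minimal, entirely hypothetical sketch:

```python
from dataclasses import dataclass

@dataclass
class Light:
    id: str
    x: float
    y: float
    on: bool = False

class DistrictTwin:
    """Hypothetical sketch: query and control all lights in a rectangular area."""

    def __init__(self, lights):
        self.lights = {light.id: light for light in lights}

    def in_area(self, x0, y0, x1, y1):
        # Spatial query against the twin's mirrored positions.
        return [l for l in self.lights.values()
                if x0 <= l.x <= x1 and y0 <= l.y <= y1]

    def switch_area(self, x0, y0, x1, y1, on=True):
        # Mirror a bulk command through the twin to every fixture in the area.
        for light in self.in_area(x0, y0, x1, y1):
            light.on = on

twin = DistrictTwin([Light("a", 1, 1), Light("b", 5, 5), Light("c", 9, 9)])
twin.switch_area(0, 0, 6, 6)                      # turn on lights in one district
print(sorted(l.id for l in twin.lights.values() if l.on))  # ['a', 'b']
```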

In the long run, Zyter’s CEO Govil believes there are opportunities to collaborate with other companies working on APIs for physical infrastructures, like the data infrastructure platform Mapped, to help with standardization and interoperability. Mapped simplifies access to physical building assets through a standard vocabulary while supporting a secured API perimeter.

“We feel that Mapped and Zyter are trying to solve similar problems, although in slightly different ways,” Govil said.



Using digital twins in health care to stave off the grim reaper

All the sessions from Transform 2021 are available on-demand now. Watch now.

VentureBeat caught up with NTT Research Medical & Health Informatics Lab director Dr. Joe Alexander, who elaborated on his view of the future of “bio digital twins,” which promise to improve precision medicine and bring digital transformation to the health care industry.

Japanese telecom giant NTT has launched a major initiative to improve digital health through precision medicine using digital twin technology. This project is part of NTT Research, a new R&D hub focused on basic research. The goal is to address long-term technological challenges with solutions that, once achieved, can positively impact wider ranges of businesses and many parts of our lives. These projects are not tied to specific product roll-out plans but could lead to much more significant long-term improvements than conventional incremental research conducted by enterprises.

The why behind the digital twin application

VentureBeat: What exactly is medical and health informatics — where does it fit into the landscape of other enterprise medical software like EHRs, diagnostics, telemedicine, and research?

Dr. Joe Alexander: Medical informatics is the sub-discipline of health informatics that directly impacts the patient-physician relationship. It focuses on the information technology that enables the effective collection of data using technology tools to develop medical knowledge and to facilitate the delivery of patient medical care. The goal of medical informatics is to ensure access to critical patient medical information at the precise time and place it is needed to make medical decisions. Medical informatics also focuses on the management of medical data for research and education.

The acquisition, storage, retrieval, and use of health care information to foster better collaboration among a patient’s various health care providers is the study of health informatics. It plays a critical role in the push toward health care reform. Health informatics is an evolving specialization that links information technology, communications, and health care to improve the quality and safety of patient care. EHRs help providers better manage care for patients and are an important part of health informatics.

Telemedicine has more to do with the access and sharing of medical information for the purpose of treating patients remotely. The term “diagnostics” can be applied to any process or device that involves techniques for (medical) diagnoses.

One current area of research that is of particular interest to our team is precision cardiology. This includes the cardiovascular bio digital twin as well as heart-on-a-chip technologies.

Research at the MEI Lab does not currently target EHR software development or telemedicine per se. Our work does, however, support remote monitoring, diagnostics, and advanced therapeutics.

VentureBeat: What is the bio digital twin initiative, and how do you plan to advance it?

Alexander: A bio digital twin is an up-to-date virtual representation (an electronic replica) which provides real-time insights into the status of a real-world asset to enable better management and to inform decision-making. This concept has been applied to the preventive maintenance of jet engines and may be applied as well to the predictive maintenance of health.

The Bio Digital Twin (BioDT) initiative aims to individualize and revolutionize health care by use of BioDT technologies. We will first realize precision cardiology on multiple scales through development of a cardiovascular BioDT (CV BioDT) and heart-on-a-chip platforms. The CV BioDT is at the whole organ physiological system level, whereas the heart-on-a-chip is at a microfluidics level, making use of an individual’s stem cells to make in vitro organs.

For the CV BioDT, we will begin with acute conditions (acute myocardial infarction and acute heart failure) and progress to chronic cardiovascular conditions and their co-morbidities and complications. The latter requires heavy dependence on organ systems other than the heart. Ultimately, based on our accumulating knowledge of underlying physiological and pathophysiological mechanisms (together with advanced sensing technologies), we will be able to move into wellness and prevention.

Can digital twins in health care save a life?

VentureBeat: What is the value of a digital twin, and how does it build upon other technologies for capturing and managing medical data or simulating things?

Alexander: We expect that our bio digital twin will best enable individualized care. By reproducing an individual’s entire physiology based on causal mechanisms, we should be able to predict health issues as well as provide recommendations for therapies in complex patients through “what if” scenario testing.

Autonomous therapies — delivered by the bio digital twin — become possible, where the physician would simply monitor autonomous devices. Virtual clinical trials in populations of bio digital twins also become feasible and would dramatically accelerate drug (or vaccine) development.
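Conceptually, a virtual clinical trial runs the same intervention across a population of simulated patients instead of recruiting real ones. The toy model below is purely illustrative; the risk model, effect size, and all numbers are invented and bear no relation to NTT Research's actual physiology models:

```python
import random

def simulate_twin(baseline_risk, drug_effect, rng):
    """Toy physiology stand-in: does this virtual patient avoid an adverse event?"""
    return rng.random() > baseline_risk * (1 - drug_effect)

def virtual_trial(n_patients, drug_effect, seed=0):
    # Run the intervention across a whole population of simulated patients.
    rng = random.Random(seed)
    outcomes = [simulate_twin(rng.uniform(0.1, 0.5), drug_effect, rng)
                for _ in range(n_patients)]
    return sum(outcomes) / n_patients  # fraction of patients who stay event-free

control = virtual_trial(10_000, drug_effect=0.0)
treated = virtual_trial(10_000, drug_effect=0.5)
print(treated > control)  # the simulated drug should improve outcomes
```

In a real bio digital twin the per-patient model would be a causal, mechanistic simulation rather than a random draw, but the trial loop (simulate every twin, compare arms) has the same shape.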

What we are proposing is not evolutionary, but revolutionary. An ambitious project of this scope and scale will take time. We will certainly need to continuously inventory the evolving clinical and technology landscapes for points of facilitatory impact.

VentureBeat: Why did you decide to start with the heart, and how will this complement other, similar efforts?

Alexander: We started with cardiovascular disease because it is the global leading cause of death. One of the principal missions of NTT Research is to provide long-term benefits to humanity; this is fundamental to deciding what projects to pursue.

Our immediate cardiovascular disease targets will be acute myocardial infarction (AMI) and acute heart failure (acute HF). We will pursue chronic heart failure and other conditions afterwards.

VentureBeat: What’s next in digital twins and why?

Alexander: Following development of the CV BioDT, our next pursuit will be neurodegenerative diseases, e.g., Alzheimer’s disease and Parkinson’s disease. Our reasoning here is similar: neurodegenerative diseases are the 2nd leading cause of death, at least in the U.S.

Organs on a chip

VentureBeat: What kinds of things are you working on with nano and microscale sensors and electrodes?

Alexander: MEI Lab is developing “organ-on-a-chip” microfluidics platforms as well as three-dimensionally transformable and implantable electrodes. This work involves the exploration and examination of new materials that include nanofibers and nanofiber-based paper electrodes.

VentureBeat: Which ones show the most promise in the short term and possibility in the long term?

Alexander: This is a difficult question for me to answer since I am not directly involved in the research. However, all our targets tend to be long term. Based on current progress, microscale three-dimensionally transformable electrodes for sensing are more promising in the shorter term, followed by similar types of electrodes for both stimulating and sensing. Organ-on-a-chip platforms will likely mature in the longer term.

VentureBeat: What are some of the key developments in digital biomarkers, wearable technologies, and remote sensing you are exploring?

Alexander: While we are in an ongoing background process of doing a clinical and technical landscape inventory of such devices, we have not yet developed a strategy within the MEI Lab to point us in any particular directions. Our focus right now is on acute conditions where patients are hospitalized and well-instrumented for access to the directly observable data necessary for early model building, verification, and validation.




Aforza boosts digital twins for consumer packaged goods with $22M raise

Aforza, a digital twin platform for orchestrating consumer packaged goods sales, has landed $22 million in Series A funding. The investment will fund hiring and the creation of a new U.S. headquarters. Investors included DN Capital, Bonfire Ventures, Daher Capital, and Next47.

Consumer packaged goods (CPG) include food and beverage, alcohol, beers and spirits, consumer health care, household products, tobacco, pet care, and consumer electronics. The company was founded by former Salesforce execs involved in CPG efforts.

The cloud service sits on top of Salesforce and Google Cloud platforms and helps connect the dots across key processes required to sell consumer goods. This includes commercial planning, field sales to retail channels, coordinating promotions, orchestrating distribution, and tracking the impact of promotional experiments. This allows teams to iteratively experiment with ideas across different markets and scale up the successful ones.

Growing in a slow growth market

The company plays in the $14.5 billion CPG software market led by SAP, Microsoft, Adobe, and Salesforce, according to Apps Run the World. As a whole, this category of tools is growing at only 1.5%.

However, Aforza believes its strategy focused on building digital twins to enhance workflows gives it a competitive edge. Research from Oliver Wyman found that companies using legacy CPG platforms lost at least 5% of sales because of lack of availability. And Progressive Grocer found that 70% of the money invested into trade promotion programs was unprofitable.

“We see an inherent disconnect across the industry in the way commercial planning and field sales teams are working together. This is both in the way they communicate with each other and the systems they use,” Aforza CEO and cofounder Dominic Dinardo told VentureBeat.

Vendors are exploring ways to create digital twins for more ephemeral things like digital twins of the organization and supply chains. Aforza is arguably creating something similar for product distribution.

For example, there are hiccups between the applications used for finance, trade planning, sales, and marketing. Even when these systems are integrated, the data flow only provides historical context. As a result, finance has to wait to see the impact of investments, trade planners have trouble tracking execution, and sales teams have difficulty tracking which stores faithfully carry out the promotions.

Aforza believes that a digital twin of the sales and distribution system helps to align these efforts and provides real-time data exchanges across different workflows such as:

  • Launching highly targeted promotions, measuring results, and then simulating different scenarios to estimate budget impact.
  • Connecting new plans to the retail auditing process to see which stores comply with promotion agreements and correlate these efforts with sales, pricing information, and competitive product strategies.
  • Capturing the financial progress and ROI of a new promotion in real time so that teams can launch multiple experiments across various markets and then pivot to the most successful ones.
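The closed loop sketched in these bullets reduces to a simple pattern: measure lift against a baseline, compute ROI per experiment, and scale up the winners. The snippet below is a hypothetical illustration; the ROI formula, margin, and figures are invented, not Aforza's model:

```python
def promotion_roi(spend, baseline_sales, observed_sales, margin=0.3):
    """Incremental margin generated per promotion dollar (hypothetical formula)."""
    lift = observed_sales - baseline_sales
    return (lift * margin - spend) / spend

# Closed loop: run several market experiments in parallel, keep the profitable ones.
experiments = {
    "coastal": promotion_roi(spend=1_000, baseline_sales=20_000, observed_sales=28_000),
    "urban":   promotion_roi(spend=1_000, baseline_sales=20_000, observed_sales=22_000),
}
winners = [market for market, roi in experiments.items() if roi > 0]
print(winners)  # ['coastal']
```

The point of the real-time twin is that `observed_sales` updates continuously from the field, so the winners list can be recomputed mid-campaign instead of after the quarter closes.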

Digitizing ephemeral things

Digital twins are typically associated with concrete physical things like cars, airplanes, or buildings. Supply chains and distribution channels are a bit more ephemeral, which can make it difficult for everyone to see their outlines and mechanics: how they work, how they break, and which strategies bring the most success.

Aforza’s tools provide a digital twin of a company’s route-to-market and distribution channel. This approach has strong parallels with supply chain digital twins because it provides a real-time closed loop of information across multiple parties.

In other industries, the digital supply chain twin is a digital copy of a company’s actual supply chain, with inputs fed into the model in real time. Aforza does something similar for trade promotion ROI and optimization.

The CPG industry is starting to adopt the term “trade promotion execution” (TPX) to describe the lifecycle of experimenting with new marketing and promotion ideas. This helps companies close the loop between trade promotion planning and retail execution. They can plan, execute, and improve promotions in real time.

One thing companies need to keep an eye on is compliance. Just because a large retailer has taken your money to launch a new promotion does not mean that every branch will faithfully hang the new signs; some managers may find them disagreeable or too much effort. Non-compliance not only hurts sales, it also prevents planning teams from properly correlating sales changes with what is actually happening in the stores.

Another key element of building a digital twin is understanding how outside events like the weather and the competition affect sales efforts. For example, an ice cream company may want to experiment with promotions across several outlets by the beach. Meanwhile, a competitor has just launched a series of competitive campaigns that are hitting sales. The first field team picks up on this, which is immediately shared with the digital twin. This input is fed into an AI model, which automatically optimizes the targeting, suggests a set of new promotions for the sales team to run, and predicts the impact of various strategies for outmaneuvering the competition.

Dinardo maintains the Aforza approach is part of a much larger trend driving digital transformation across various industries.

“The digital twin concept is all about moving away from using historical data and outdated analytical models to inform your decision-making process, towards a real-time, closed-loop way of working,” he said.

He points to the success of other leaders such as Veeva Systems in the life sciences industry and Vlocity (Salesforce Industries) across the communications, energy & utilities, and health care sectors.

“I am a great admirer of what they have done and how they have connected planning to execution in real time using the cloud,” Dinardo said.




Ansys CTO sees simulation accelerating digital twins development

Long before there were digital twins or the internet of things, Ansys was making simulation tools to help engineering teams design better products, model the real world, and expand the boundaries of science research.

VentureBeat caught up with Ansys CTO Prith Banerjee, who elaborated on why interest in digital twins is taking off, how modeling and simulation are undergoing key developments, and how AI and traditional simulation approaches are starting to complement one another. His view is that of a foundational player surveying a robust set of new applications.

This interview has been edited for clarity and brevity.

VentureBeat: What do executive managers need to know about modeling and simulation today? They both allow us to peer deeper into things, but how do these underlying technologies serve in various contexts to speed up the ability to explore different designs, trade-offs, and business hypotheses?

Prith Banerjee: Simulation and modeling help companies around the world develop the products that consumers rely on every day — from mobile devices to cars to airplanes and frankly everything in between. Companies use simulation software to design their products in the digital domain — on the computer — without the need for expensive and time-consuming physical prototyping.

The best way to understand the advantages of simulation is by looking at an example: One blue chip customer is leveraging simulation technology to kickstart digital transformation initiatives that will benefit its customers by lowering development costs and cutting the time it takes to bring products to market. A more specific example would be a valve in an aircraft engine that regulates pressure in a pipe or duct and needs to be modeled in many ways.

Through digital modeling, engineers can vary the pressure and temperature of the valve to gauge its strength and discover failure points more quickly. As a result, engineers no longer need to build and test several different configurations. In the past, engineers would build multiple prototypes in hardware, resulting in long lead times and high costs. Now they can build the entire virtual prototype through software simulation and arrive at an optimal design by exploring thousands of candidates.
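The virtual-prototype workflow Banerjee describes is essentially a parameter sweep over a simulation model. The sketch below substitutes a trivial surrogate formula for a real physics solver; the stress model, threshold, and ranges are all hypothetical:

```python
def valve_stress(pressure_kpa, temp_c):
    """Toy surrogate model: stress grows with pressure and temperature."""
    return 0.5 * pressure_kpa + 2.0 * temp_c

YIELD_LIMIT = 500.0  # hypothetical failure threshold

# Sweep the design space virtually instead of building physical prototypes:
# every (pressure, temperature) pair is one "virtual test".
failures = [(p, t)
            for p in range(100, 701, 100)
            for t in range(0, 151, 50)
            if valve_stress(p, t) > YIELD_LIMIT]

print(failures)  # operating points past the failure threshold
```

In practice each call to the surrogate would be a full finite-element or CFD run, which is why the HPC advances discussed later in the interview matter so much for sweeps like this.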

VentureBeat: How would you define a digital twin, and why do you think people are starting to talk about them more as a segment?

Banerjee: Think of a digital twin as a connected, virtual replica of an in-service physical entity, such as an asset, a plant, or a process. Sensors mounted on the entity gather and relay data to a simulated model (the digital twin) to mirror the real-world experience of that product. Digital twins enable tracking of past behavior of the asset, provide deeper insights into the present, and, most importantly, they help predict and influence future behavior.

While digital twins as a concept are not new, the technology necessary to enable digital twins (such as IoT, data, and cloud computing) has only recently become available. So, digital twins represent a distinct new application of these technology components in the context of product operations and are used in various phases — such as design, manufacturing, and operations — and across various industries — like aerospace, automotive, manufacturing, buildings and infrastructure, and energy. Also, they typically impact a variety of business objectives. That could include services, predictive maintenance, yield, and [overall equipment effectiveness], as well as budgets. They also scale with a number of monitored assets, equipment, and facilities.

In the past, customers built digital twins using data analytics alone, applied to data gathered from sensors via an IoT platform. Today, we have demonstrated that the accuracy of digital twins can be greatly enhanced by complementing the data analytics with physics-based simulation. It’s what we call hybrid digital twins.
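A hybrid digital twin can be pictured as a physics baseline plus a data-driven correction learned from sensor history. The sketch below is a deliberately simplified illustration of that idea, not Ansys's implementation; the model, sensor log, and numbers are invented:

```python
def physics_model(load):
    """First-principles estimate of bearing temperature (toy linear physics)."""
    return 40.0 + 0.5 * load

def fit_residual(history):
    """Data-driven part: learn the average gap between sensor data and physics."""
    errors = [measured - physics_model(load) for load, measured in history]
    return sum(errors) / len(errors)

def hybrid_predict(load, bias):
    # Hybrid twin: physics baseline corrected by the learned data-driven term.
    return physics_model(load) + bias

history = [(10, 46.0), (20, 51.0), (30, 56.0)]   # sensor log: (load, measured temp)
bias = fit_residual(history)                      # physics underpredicts by 1.0
print(hybrid_predict(40, bias))                   # 61.0
```

Here the "ML" is just a mean residual; a production hybrid twin would learn a richer correction, but the split between a physics core and a data-fitted correction is the defining feature.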

Above: Ansys CTO Prith Banerjee

VentureBeat: In what fundamental ways do you see modeling and simulation complementing digital twins and vice versa?

Banerjee: Simulation is used traditionally to design and validate products — reducing physical prototyping and cost, yielding faster time to market, and helping design optimal products. The connectivity needed for products to support digital twins adds significant complexity. That complexity could include support for 5G or increased concerns about electromagnetic interference.

With digital twins, simulation plays a key role during the product operation, unlocking key benefits for predictive and prescriptive maintenance. Specifically, through physics, simulation provides virtual sensors, enables “what-if” analysis, and improves prediction accuracy.

VentureBeat: AI and machine learning models are getting much press these days, but I imagine there are equally essential breakthroughs in other types of models and the trade-offs between them. What do you think are some of the more exciting advances in modeling for enterprises?

Banerjee: Artificial intelligence and machine learning (AI/ML) have been around for more than 30 years, and the field has advanced from concepts of rule-based expert systems to machine learning using supervised learning and unsupervised learning to deep learning. AI/ML technology has been applied successfully to numerous industries such as natural language understanding for intelligent agents, sentiment analysis in social media, algorithmic trading in finance, drug discovery, and recommendation engines for ecommerce.

People are often unaware of the role AI/ML plays in simulation engineering, where it is critical to disrupting and advancing customer productivity. Advanced simulation technology, enhanced with AI/ML, supercharges the engineering design process.

We’ve embraced AI/ML methods and tools for some time, well before the current buzz around this area. Physics-based simulation and AI/ML are complementary, and we believe a hybrid approach is extremely valuable. We are exploring the use of these methods to improve the runtimes, workflows, and robustness of our solvers.

On a technical level, we are using deep neural networks inside the Ansys RedHawk-SC product family to speed up Monte Carlo simulations by up to 100x to better understand the voltage impact on timing. In the area of digital twins, we are using Bayesian techniques to calibrate flow network models that then provide highly accurate virtual sensor results. Early development shows flow rate correlation at multiple test points within 2%.
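Bayesian calibration of a model parameter from test-point data can be illustrated with a grid-based posterior update. This toy example is not Ansys's method; the flow model, noise level, and measurements are invented for illustration:

```python
import math

def flow_rate(coeff, pressure_drop):
    """Toy flow network: rate proportional to coeff * sqrt(pressure drop)."""
    return coeff * math.sqrt(pressure_drop)

def posterior(coeff_grid, observations, sigma=0.5):
    """Grid Bayesian update of the flow coefficient from test-point data."""
    weights = []
    for c in coeff_grid:
        # Gaussian log-likelihood of the observations under candidate c.
        loglik = sum(-((obs - flow_rate(c, dp)) ** 2) / (2 * sigma ** 2)
                     for dp, obs in observations)
        weights.append(math.exp(loglik))
    total = sum(weights)
    return [w / total for w in weights]  # normalized posterior over the grid

grid = [0.5 + 0.1 * i for i in range(11)]        # candidate flow coefficients
data = [(4.0, 1.6), (9.0, 2.4), (16.0, 3.2)]     # (pressure drop, measured rate)
probs = posterior(grid, data)
best = grid[probs.index(max(probs))]
print(round(best, 1))  # MAP estimate of the flow coefficient: 0.8
```

Once calibrated this way, the model can report flow at locations with no physical sensor, which is the "virtual sensor" idea Banerjee mentions.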

Another great example where machine learning is meaningfully impacting customer design comes from autonomous driving simulations. An automotive customer in Europe leveraged Ansys OptiSLang machine learning techniques for a solution to the so-called “jam-end” traffic problem, where a vehicle in front changes lanes suddenly, [impacting] traffic. According to the customer, they were able to find a solution to this 1,000 times faster than when using their previous Monte Carlo methods.

VentureBeat: So, Ansys has been in the modeling and simulation business for quite a while. How would you characterize some of the significant advances in the industry over this period, and how is the pace of innovation changing with faster computers, faster DevOps processes in software and in engineering, and improvements in data infrastructure?

Banerjee: Over time, model sizes have grown drastically. Fifty years ago, simulation was used to analyze tiny portions of larger components, yet it lacked the detail and fidelity we rely on today. At that time, those models were composed of dozens, at most hundreds, of simulation "cells." Today, simulation is solving massive models composed of millions (and sometimes even billions) of cells.

Simulation is now deployed to model entire products, such as electric batteries, automobiles, engines, and airplanes. As a result, simulation is at the forefront of advancing electrification, aerospace, and key sustainability initiatives aimed at solving the world’s biggest problems.

The core concepts of simulation were known a decade ago; however, customers were forced to run their simulations with coarse meshing to approximate the physics and still get results back overnight. Today, with advances in high-performance computing, it is possible to accomplish incredibly accurate simulation of the physics in a very short amount of time. Furthermore, by using AI/ML we are exploring another ten- to one-hundred-fold improvement in speed and accuracy over what was previously possible, all enabled by HPC on the cloud.

VentureBeat: What do you think are some of the more significant breakthroughs in workflows, particularly as you cross multiple disciplines like mechanical, electrical, thermal, and cost analysis for designing new products?

Banerjee: The world around us is governed by the laws of physics, and we solve these physics equations using numerical methods such as finite element or finite volume methods. In the past, our customers used simulation to model only a single physics — such as structures or fluids or electromagnetics — at a given time since the computational capabilities were limited. But the world around us is not limited to single physics interactions. Rather, it has multiphysics interactions.

Our solvers now support multiphysics interactions quickly and accurately. Ansys Workbench, which allows cross-physics simulation tools to integrate seamlessly, was a key breakthrough in this market. Workbench opened new simulation capabilities that, prior to its inception, would have been nearly impossible. Our LS-DYNA tool supports multiphysics interactions in the tightest manner at each time step. Beyond Workbench, today the market is continuing to expand into areas like model-based systems engineering, as well as broader systems workflows like cloud.

Finally, with the use of AI/ML, we are entering a world of generative design, exploring 10,000 different designs to specification, and rapidly simulating all of them to give the best option to the designer. A very exciting future indeed!
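Generative design at its simplest is a large sampled sweep of the design space, scored by simulation, with the best candidate returned to the designer. A toy sketch; the objective function, design variables, and numbers are invented:

```python
import random

def simulate(design):
    """Toy objective: reward stiffness, penalize weight (hypothetical physics)."""
    thickness, ribs = design
    stiffness = thickness * (1 + 0.1 * ribs)
    weight = thickness + 0.2 * ribs
    return stiffness - 0.8 * weight

# Sample thousands of candidate designs and score each one by "simulation".
rng = random.Random(42)
candidates = [(rng.uniform(1, 10), rng.randint(0, 20)) for _ in range(10_000)]
best = max(candidates, key=simulate)

print(simulate(best) >= simulate(candidates[0]))  # the sweep never does worse
```

Real generative design replaces random sampling with smarter search (gradients, surrogates, evolutionary methods) and replaces the toy objective with full multiphysics runs, but the explore-score-select loop is the same.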




21 ways medical digital twins will transform health care

The health care industry is starting to adopt digital twins to improve personalized medicine, health care organization performance, and new medicines and devices. Although simulations have been around for some time, today’s medical digital twins represent an important new take. These digital twins can create useful models based on information from wearable devices, omics, and patient records to connect the dots across processes that span patients, doctors, and health care organizations, as well as drug and device manufacturers.

It is still early days, but the field of digital twins is expanding quickly based on advances in real-time data feeds, machine learning, and AR/VR. As a result, digital twins could dramatically shift how we diagnose and treat patients, and help realign incentives for improving health. Some proponents liken the current state of digital twins to where the human genome project was 20 years ago, and it may require a similar large-scale effort to take shape fully. A team of Swedish researchers recently wrote, “Given the importance of the medical problem, the potential of digital twins merits concerted research efforts on a scale similar to those involved in the HGP.”

While such a “moon shot” effort may not be immediately underway, there are many indicators that digital twins are gaining traction in medicine. Presented here are 21 ways digital twins are starting to shape health care today, broken roughly into personalized medicine, health care organization improvement, and drug and medical device development. Many types of digital twins span multiple use cases and even categories; these cross-domain use cases are a major strength of digital twins.

Personalized medicine

Digital twins show tremendous promise in making it easier to customize medical treatments to individuals based on their unique genetic makeup, anatomy, behavior, and other factors. As a result, researchers are starting to call on the medical community to collaborate on scaling digital twins from one-off projects to mass personalization platforms on par with today’s advanced customer data platforms.

1. Virtual organs

Several vendors have been working on virtual hearts that can be customized to individual patients and updated to track the progression of disease over time or the response to new drugs, treatments, or surgical interventions. The Philips HeartModel simulates a virtual heart, starting with the company’s ultrasound equipment. Siemens Healthineers has been working on a digital twin of the heart to improve drug treatment and simulate cardiac catheter interventions. European startup FEops has already received regulatory approval and commercialized its FEops Heartguide platform, which combines a patient-specific replica of the heart with AI-enabled anatomical analysis to improve the study and treatment of structural heart diseases.

Dassault launched its Living Heart Project in 2014 to crowdsource a virtual twin of the human heart. The project has evolved as an open source collaboration among medical researchers, surgeons, medical device manufacturers, and drug companies. Meanwhile, the company’s Living Brain project is guiding epilepsy treatment and tracking the progression of neurodegenerative diseases. The company has organized similar efforts for lungs, knees, eyes, and other systems.

“This is a missing scientific foundation for digital health able to power technologies such as AI and VR and usher in a new era of innovation,” Dassault senior director of virtual human modeling Steve Levine told VentureBeat. He added that this “could have an even greater impact on society than what we have seen in telecommunications.”

2. Genomic medicine

Swedish researchers have been mapping mouse RNA into a digital twin that can help predict the effects of different types and doses of arthritis drugs. The goal is to personalize human diagnosis and treatment using RNA. The researchers observed that medication fails to work about 40% to 70% of the time. Similar techniques are also mapping the characteristics of human T-cells, which play a crucial role in immune defense. These maps can help diagnose many common diseases earlier, when treatment is more effective and cheaper.

3. Personalized health information

The pandemic has helped fuel the growth of digital health services that help people assess and address simple medical conditions using AI. For example, Babylon Health‘s Healthcheck App captures health data into digital twins. It combines manually entered data, such as health histories and mood and symptom tracking, with data captured automatically from fitness devices and wearables like the Apple Watch. The digital twin can provide basic front-line information or help guide priorities and interactions with doctors to address more severe or persistent conditions.

4. Customizing drug treatment

The Empa research center in Switzerland is working on digital twins to optimize drug dosage for people afflicted by chronic pain. Characteristics such as age and lifestyle help customize the digital twin to help predict the effects of pain medications. In addition, patient reports about the effectiveness of different dosages calibrate digital twin accuracy.
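To make the idea concrete, here is a minimal sketch of the kind of calibration loop described above: a simple dose-response (Emax) curve fitted to a patient’s self-reported relief scores. The model form, parameter grids, and all numbers are illustrative assumptions, not Empa’s actual method.

```python
# Toy sketch (not Empa's actual model): calibrate a simple Emax
# dose-response curve to a patient's self-reported pain relief scores.
# All names and numbers here are illustrative assumptions.

def emax(dose, emax_val, ed50):
    """Predicted pain relief (0-1) for a given dose in mg."""
    return emax_val * dose / (ed50 + dose)

def calibrate(reports):
    """Grid-search the (Emax, ED50) pair that best fits patient reports.

    reports: list of (dose_mg, observed_relief) pairs.
    """
    best, best_err = None, float("inf")
    for e in [x / 20 for x in range(1, 21)]:      # candidate Emax 0.05..1.0
        for ed50 in range(5, 200, 5):             # candidate ED50 in mg
            err = sum((emax(d, e, ed50) - r) ** 2 for d, r in reports)
            if err < best_err:
                best, best_err = (e, ed50), err
    return best

# Simulated patient feedback: relief ratings at three trial doses.
reports = [(10, 0.18), (25, 0.35), (50, 0.50)]
emax_val, ed50 = calibrate(reports)

# Smallest candidate dose predicted to reach a target relief level.
suggested = min((d for d in range(5, 200, 5)
                 if emax(d, emax_val, ed50) >= 0.45), default=None)
print(emax_val, ed50, suggested)
```

As more patient reports arrive, refitting the curve tightens the twin’s dose predictions; a production system would use a proper pharmacokinetic model with uncertainty estimates rather than a grid search.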

5. Scanning the whole body

Most approaches to digital twins build on existing equipment to capture the appropriate data, while Q Bio’s new Gemini Digital Twin platform starts with a whole-body scan. The company claims to capture a whole-body scan in 15 minutes without radiation or breath holds, using advanced computational physics models that are more precise than conventional MRI for many diagnoses. The company has received over $80 million from Andreessen Horowitz, Kaiser Foundation Hospitals, and others. Q Bio is also developing integrations to improve these models using data from genetics, chemistry, anatomy, lifestyle, and medical history.

6. Planning surgery

A Boston hospital has been working with Dassault’s digital heart to improve the planning of surgical procedures and assess outcomes afterward. The digital twins also help the team determine the shape of a cuff between the heart and arteries.

Sim&Cure’s Sim&Size is a digital twin that helps brain surgeons treat aneurysms using simulations to improve patient safety. Aneurysms are enlarged blood vessels that can result in clots or strokes. These digital twins can improve the ability to plan and execute less invasive surgery using catheters to install unique implants. Data from individual patients helps customize simulations that run on an embedded simulation package from Ansys. Preliminary results show a dramatically reduced need for follow-up surgery.

Improving health care organizations

Digital twins also show promise in improving the way health care organizations deliver care. Gartner coined the term digital twin of the organization to describe this process of modeling how an organization operates in order to improve its underlying processes.

In most industries, this can start by using process mining to discover variations in business processes. New health care-specific tools can complement these techniques.

7. Improving caregiver experience

Digital twins can also help caregivers capture and find information shared across physicians and multiple specialists. “We’re generating more data than ever before, and no one has time to sort through it all,” said John Snow Labs CTO David Talby. For example, when a person sees their regular primary care physician, that doctor has a baseline understanding of the patient, their medical history, and their medications. But if the same patient sees a specialist, they may be asked many of the same repetitive questions.

A digital twin can model the patient and then use technologies like NLP to understand all of the data and cut through the noise to summarize what’s going on. This saves time and improves the accuracy of capturing and presenting information like specific medications, health conditions, and more details that providers need to know in context to make clinical decisions.
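A toy illustration of that summarization step, substituting a simple keyword lookup for real clinical NLP (John Snow Labs’ production systems use trained models; the vocabularies and notes below are invented):

```python
import re
from collections import defaultdict

# Illustrative sketch only: real clinical NLP uses trained entity
# recognition models, not hand-built keyword lists like these.
MEDS = {"metformin", "lisinopril", "atorvastatin"}
CONDITIONS = {"diabetes", "hypertension", "hyperlipidemia"}

def summarize(notes):
    """Aggregate medication and condition mentions across notes
    written by different providers into one patient summary."""
    summary = defaultdict(set)
    for note in notes:
        for word in re.findall(r"[a-z]+", note.lower()):
            if word in MEDS:
                summary["medications"].add(word)
            elif word in CONDITIONS:
                summary["conditions"].add(word)
    return dict(summary)

notes = [
    "Pt with type 2 diabetes, started metformin 500mg.",
    "Hypertension well controlled on lisinopril.",
]
print(summarize(notes))
```

Even this crude version shows the payoff: a specialist sees one consolidated list instead of rereading every prior note.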

8. Driving efficiency

The GE Healthcare Command Center is a major initiative to virtualize hospitals and test the impact of various decisions on overall organizational performance. It includes modules for evaluating changes in operational strategy, capacities, staffing, and care delivery models to objectively determine which actions to take. For example, GE has developed modules to estimate the impact of bed configurations on care levels, optimize surgical schedules, improve facility design, and optimize staff levels. This allows managers to test ideas without having to run a pilot. Dozens of organizations are already using the platform, GE said.

9. Shrinking critical treatment window

Siemens Healthineers has been working with the Medical University of South Carolina to improve the hospital’s daily routine through workflow analysis, system redesign, and process improvement methodologies. For example, they are working to reduce the time to treat stroke patients. This is important since early treatment is critical but requires the coordination of several processes to perform smoothly.

10. Value-based health care

The rising cost of health care has many nations exploring new incentive models to better align new drugs, interventions, and treatments with outcomes. Value-based health care is one approach that is growing in popularity. The basic idea is that participants, like drug companies, will only get compensation proportionate to their impact on the outcomes. This will require the development of new types of relationships across multiple players in the health delivery systems. Digital twins could provide the enabling infrastructure for organizing the details for crafting these new types of arrangements.

11. Supply chain resilience

The pandemic illustrated how brittle modern supply chains can be. Health care organizations immediately faced shortages of essential personal protective equipment owing to shutdowns and restrictions in countries like China. Digital twins of a supply chain can help health care organizations model their supply chain relationships to better understand how to plan around new events, shutdowns, or shortages. This can aid planning and negotiations with government officials in a pinch, as was the case in the recent pandemic. A recent Accenture survey found that 87% of health care executives say digital twins are becoming essential to their organization’s ability to collaborate in strategic ecosystem partnerships.

12. Faster hospital construction

Digital twins could also help streamline construction of medical facilities required to keep up with rapid changes, such as those seen during the pandemic. Atlas Construction developed a digital twin platform to help organize all the details for health care construction. The platform was conceived long before the pandemic, when Atlas founder Paul Teschner saw how hard it was to get new facilities built in remote areas of the world. The platform helps organize design, procurement, and construction processes. It is built on top of the Oracle Cloud platform and Primavera Unifier asset lifecycle management service.

13. Streamlining call center interactions

Digital twins can make it easier for customer service agents to understand and communicate with patients. For example, a large insurance provider used a TigerGraph graph database to integrate data from over 200 sources to create a full longitudinal health history of every member. “This level of detail paints a clear picture of the member’s current and historical medical situation,” said TigerGraph health care industry practice lead Andrew Anderson.

A holistic view of all diagnoses, claims, prescriptions, refills, follow-up visits, and outstanding claims reduced call handling time by 10%, TigerGraph claimed, resulting in over $100 million in estimated savings. Also, shorter but more relevant conversations between agents and members have increased Net Promoter Score and lowered churn.
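The underlying idea is straightforward: merge events from many source systems into one member-centric, date-ordered timeline. A minimal sketch follows; the records, fields, and source names are invented for illustration, and the real deployment used a TigerGraph graph database spanning 200-plus sources:

```python
from datetime import date

# Toy stand-ins for three of the many source systems.
claims = [{"member": "M1", "date": date(2023, 1, 5), "dx": "hypertension"}]
prescriptions = [{"member": "M1", "date": date(2023, 1, 7), "drug": "lisinopril"}]
visits = [{"member": "M1", "date": date(2023, 2, 1), "type": "follow-up"}]

def longitudinal_history(member_id):
    """Merge events from all sources into one date-ordered timeline,
    the consolidated view a call-center agent would see."""
    events = []
    for c in claims:
        if c["member"] == member_id:
            events.append((c["date"], "claim", c["dx"]))
    for p in prescriptions:
        if p["member"] == member_id:
            events.append((p["date"], "rx", p["drug"]))
    for v in visits:
        if v["member"] == member_id:
            events.append((v["date"], "visit", v["type"]))
    return sorted(events)

print(longitudinal_history("M1"))
```

A graph database earns its keep when these relationships fan out (members to providers to claims to prescriptions), but the agent-facing result is the same: one timeline per member instead of 200 separate lookups.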

Drug and medical device development

There are many ways that digital twins can improve the design, development, testing, and monitoring of new medical devices and drugs. The U.S. FDA has launched a significant program to drive the adoption of various types of digital approaches. Regulators in the U.S. and Europe are also identifying frameworks for including modeling and simulation as sources of evidence in new drug and device approvals.

14. Software-as-a-medical device

The FDA is creating the regulatory framework to allow companies to certify and sell software-as-a-medical device. The core idea is to generate a patient-specific digital twin from different data sources, including lab tests, ultrasound, imaging devices, and genetic tests. In addition, digital twins can also help optimize the software in medical devices such as pacemakers, automated insulin pumps, and novel brain treatments.

15. Classifying drug risks

Pharmaceutical researchers are using digital twins to explore the heart risks of various drugs. This could improve the safety of individual drugs and drug combinations more cost-effectively than manual testing. Researchers have built a basic model covering 23 drugs. Extending this model could help reduce the estimated $2.5 billion required to design, test, gain approval for, and launch a new drug.

16. Simulating new production lines

Siemens worked with several vaccine manufacturers to design and test various vaccine production line configurations. New mRNA vaccines are fragile and must be produced on microfluidic production lines that precisely combine nanoscale particles. Digital twins allowed the manufacturers to design and validate the production equipment, scale the processes, and accelerate launch from a year down to five months.

17. Improving device uptime

Philips has launched a predictive maintenance program that collates data from over 15,000 medical imaging devices. The company hopes digital twins can improve uptime and help its engineers customize new equipment for the needs of different customers. It also hopes to apply similar principles across all of its medical equipment.

18. Post-market surveillance

Regulators are placing greater emphasis on device makers monitoring the performance of their equipment after sale, a process called post-market surveillance. This requires either staffing expensive specialists to maintain the equipment or embedding digital twin capabilities into the equipment itself. For example, Sysmex worked with PTC to incorporate performance testing into its blood analyzer to receive a waiver from these new requirements, PTC CTO Steve Dertien told VentureBeat. This opened the market for smaller clinical settings closer to patients, which can speed diagnosis.

19. Simulating human variability

Skeletons and anatomical atlases commonly depict the perfect human. However, real-life humans typically have minor variations in their muscles or bones that mostly go unnoticed. As a result, medical device makers struggle to understand how common anatomical variations may affect the fit and performance of their equipment. Virtonomy has developed a library of common variations that helps medical equipment makers conduct studies on how these variations may affect the performance and safety of new devices. In this case, they simulate the characteristics representing common variations in a given population rather than individuals.

20. Digital twin of a lab

Modern drug development often requires testing out thousands or millions of possibilities in a highly controlled environment. A digital twin of the lab can help to automate these facilities. It can also help to prioritize tests in response to discoveries. Digital twins could also improve the reproducibility of experiments across labs and personnel in the same lab. In this quest, Artificial recently closed $21.5 million in series A funding from Microsoft and others to develop lab automation software. The company is betting that unified data models and platforms could help them jump to the front of the $10 billion lab automation market.

21. Improving drug delivery

Researchers at Oklahoma State have been working with Ansys to develop a digital twin that improves drug delivery using simulated lung models as part of the Virtual Human System project. They found that only about 20% of many drugs reached their target. The digital twins allowed them to adjust the drugs’ particle size and composition to improve delivery efficiency to 90%.
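The optimization at the heart of that result can be caricatured as a parameter sweep over a deposition model. The bell-shaped curve below is a made-up stand-in, not the Ansys/Oklahoma State physics model, and the assumed 3-micrometer optimum is purely illustrative:

```python
import math

# Toy parameter sweep, loosely analogous to tuning aerosol particle
# size for lung deposition. The deposition model is invented.
def deposition_fraction(diameter_um):
    """Fraction of drug reaching the target region (illustrative).
    Peak efficiency is assumed near ~3 micrometers."""
    return math.exp(-((diameter_um - 3.0) ** 2) / 2.0)

candidates = [d / 2 for d in range(1, 21)]  # 0.5 .. 10.0 micrometers
best = max(candidates, key=deposition_fraction)
print(best, round(deposition_fraction(best), 3))
```

The real work lies in the deposition model itself: a physics-based twin lets researchers run this sweep virtually instead of synthesizing and trialing each formulation.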


VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member

Repost: Original Source and Author Link


Digital twins help transform the construction industry

Elevate your enterprise data technology and strategy at Transform 2021.

Digital twins promise to be a key enabler as the construction industry races to catch up with demand for new facilities and new layouts in the wake of COVID-19. Use of such technology, which creates a digital representation of real-world systems and components, is important for an industry seen as slow to adopt digital technology relative to others.

Construction is a complex undertaking, with legacy processes that span regulators, architects, contractors, and building owners. Digital transformation requires finding ways to bridge these divides — not just within elements of each participant’s domain, but also between them.

Still, practical benefits will come from harmonizing the way different groups create and manage data, according to John Turner, vice president of innovative solutions at Gafcon, a digital twin systems integrator.

Growing demand, increased complexity, and more sophisticated design authoring tools will drive the change, according to Rich Humphrey, vice president of construction at infrastructure software maker Bentley Systems. He estimates that the construction software market is currently upwards of $10 billion and could grow significantly thanks to the adoption of digital twins. “The industry is already seeing value in managing risk, reducing rework, and driving efficiencies in the way they deliver projects using digital twins,” Humphrey told VentureBeat.

Change could be far-reaching in an industry that represents one of the largest asset classes in the world.

“There are more than 4 billion buildings in the world today, which is twice the number of websites online,” said RJ Pittman, CEO of Matterport, a reality capture service for buildings. The rush is on not only to build more efficiently, but also to increase the value of existing buildings, which today represent a $230 trillion asset class.

Warp speed ahead

COVID is accelerating the demand for digital twin technology. CRB, a construction provider for the biotech industry, recently turned to Matterport to help design and build new vaccine plants as part of Operation Warp Speed. The firm used Matterport to capture the layout of existing plants, as well as to improve the design and layout of new ones. A digital twin also allowed it to model the workflow and safety properties of the new facilities to identify and rectify any bottlenecks before the facilities came online.

“Tools like Matterport enable seamless collaboration in the same space because it’s browser-based,” said Chris Link, virtual design and construction manager at CRB. Data is not lost from multiple handoffs between a designer, builder, and owner.

Digital twins also dramatically reduced the need for engineers to travel to existing or new plants. On one project, CRB reduced the number of onsite engineers from 10 to 1, reduced travel costs by 33%, and expedited design by three weeks. One key benefit is that Matterport can capture and harmonize data across different participants and enable people to collaborate within a single platform instead of what was previously a handoff scenario between design and engineering.

Digital twins can reduce the operational expenditures incurred after facility handoff, which account for 80% or more of a facility’s total lifetime cost.

“A digital twin is a goldmine to a facility owner because there is currently a significant data loss in engineering and construction,” Link said. Building managers can use digital twins to understand why things were engineered and designed in the manner they were, and this understanding translates to simplified maintenance. For example, maintenance technicians called in to repair a broken pump can utilize the digital twin to understand the design and intent of the pump. They can see the bigger picture, not just the broken pump in front of them.

Reshape, rewire, rethink

Construction-related spending accounts for about 14% of the world GDP and is expected to grow from $10 trillion in 2017 to $14 trillion in 2025, according to McKinsey. The consulting firm also says that about $1.6 trillion in additional value could be created through higher productivity. McKinsey identified seven best practices that could use digital twins to boost productivity by 50 to 60%:

  1. Reshape regulation — Accelerate approvals with testable plans and enable the adoption of performance-based requirements.
  2. Rewire contracts — Improved information sharing enables new contractual models.
  3. Rethink design — New designs could be tested and iterated more efficiently.
  4. Improve onsite execution — Easier detection of scheduling clashes.
  5. Infuse technology and innovation — Improve orchestration with IoT, drones, and AI planning.
  6. Reskill workers — Facilitate new training programs for innovative technologies using VR.
  7. Improve procurement and supply chain — Better harmonization between current progress and deliveries.

McKinsey predicts that firms could see further productivity gains by adopting a mass-production manufacturing system, with much more standardization across global factory sites. These efforts require greater harmonization between design, manufacturing, and construction, as well as much tighter tolerances. Early successes include:

  • Barcelona Housing Systems estimates it can reduce labor 5 to 10 times for multi-story homes.
  • Finnish industrial company Outotec has created a process for small mines that reduces labor by 30%, capital by 20%, and time by 30%.
  • Broad Sustainable Buildings of China erected a 30-story hotel in 15 days.

Digital twins mind the gaps

“Digital twins are about connecting to real-life objects or information,” said Connor Christian, senior product manager at Procore, a construction software provider. That is a key issue in an area that combines so many different engineering facets.

In fact, the construction industry has evolved a piecemeal approach to managing different data sources, including GIS for location data, building information modeling (BIM) for 3D data, and virtual design and construction (VDC) for project management. This challenges digital twin implementation.

Any job site with sensors or cameras can feed a digital twin that allows access, control, and reporting from those devices. But not all data is good data, so standards, processes, and verifications must be in place to filter out unnecessary data, Christian said.

Different processing stages are involved in turning raw data into the higher-level abstractions required to improve construction processes, said David McKee, CEO, CTO, and founder at Slingshot Simulations and co-chair at the Digital Twin Consortium. For example, Slingshot recently deployed a workflow that combined European Space Agency Sentinel-1 InSAR data from SatSense that looks at ground movement, merged this with infrastructure data, correlated this with traffic data, and presented that back to stakeholders to understand the risks to transport infrastructure.
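A hedged sketch of the kind of multi-source fusion described above: ground-movement, asset-condition, and traffic data joined per road segment into a single risk indicator. The datasets, weights, and risk formula below are invented for illustration; Slingshot’s actual pipeline works on real InSAR and traffic feeds:

```python
# Toy stand-ins for the three data sources, keyed by road segment.
ground_movement_mm_yr = {"seg-1": 4.2, "seg-2": 0.3}   # e.g., from InSAR
asset_condition = {"seg-1": "poor", "seg-2": "good"}   # infrastructure data
daily_traffic = {"seg-1": 18000, "seg-2": 2500}        # traffic counts

def risk_score(segment):
    """Combine the three sources into a single 0-1 risk indicator.
    Normalizations and weights are illustrative assumptions."""
    movement = min(ground_movement_mm_yr[segment] / 10, 1.0)
    condition = 1.0 if asset_condition[segment] == "poor" else 0.2
    exposure = min(daily_traffic[segment] / 20000, 1.0)
    return round(movement * 0.4 + condition * 0.3 + exposure * 0.3, 2)

# Rank segments so stakeholders see the riskiest infrastructure first.
ranked = sorted(ground_movement_mm_yr, key=risk_score, reverse=True)
print([(s, risk_score(s)) for s in ranked])
```

The point of the abstraction layers McKee describes is exactly this: stakeholders see a ranked risk list, not raw satellite interferometry.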

McKee has found it helpful to adopt IBM’s Design Thinking approach and agile software engineering practices for building and deploying digital twins.

“This approach means that even in some of the biggest infrastructure projects, you can start engaging stakeholders within a couple of weeks,” McKee said. For example, his team has recently kicked off a project to improve the transport network in one of the busiest shipping hubs in the UK in the wake of Brexit.

Digital twins can also help fill in the semantic gaps in traditional BIM and GIS tools, said Remi Dornier, vice president of construction, cities, and territories at Dassault Systemes. Digital twins also provide a way to include all the necessary details to perform purchasing and construction assembly. And they can also improve ergonomics. For example, Dassault has been working on a simulation for nursing homes to help eliminate heavy lifting associated with caring for patients.

DevOps for construction

Gafcon’s Turner said the next era involves using digital twins to bring a DevOps-like approach to construction. That can transform the entire construction lifecycle.

But teams need to rethink the entire construction and management process to see the highest efficiencies. For example, mass timber construction is a new approach to building that uses standardized manufactured wood products with different properties than traditional wood. It involves gluing small pieces of wood together in the proper orientation.

If teams treat the material like traditional timber, they might see marginal improvements in costs, productivity, and speed. But more dramatic improvements are possible, and the kinship to IT DevOps should be apparent. Digital transformation for construction will mean including test and ops teams earlier in the process. This collaboration can sort out issues like defining assembly steps and how components must be delivered, hopefully creating far better results.

It is not entirely clear how the construction industry will evolve from a patchwork of different tools to the well-orchestrated CI/CD-like pipelines transforming software development.

But vendors are in the hunt. Leading vendors include a patchwork of companies expanding beyond their core strengths in fields such as GIS (Trimble, ESRI), BIM (Autodesk, Bentley, Dassault), construction management (Procore, and Oracle Construction), reality capture (Matterport and SiteAware), and supply chain management (SiteSense). Digital twins integrators such as Swinerton, Gafcon, and Lendlease Podium help to meld these tools into well-orchestrated workflows that span the design, construction, and operations lifecycle.

Construction ahead

This industry’s attempts at transformation are complicated, and a lot of subsidiary elements need to successfully evolve in order for digital twins to gain traction. The recent Katerra bankruptcy underscores the challenges that even high-profile operations face in trying to transform the construction industry.

For one thing, the industry needs better data quality and context, Oracle senior director of new products, BIM, and innovation Frank Weiss told VentureBeat. The technology to gather and integrate data to create an ecosystem of digital twins is available today.

But it comes from many different sources in different formats, which can be challenging for analysis. “It’s going to take vendors, governments, and other stakeholders to work together,” Weiss said.

In addition, the industry will also have to find consensus on what defines digital twins and how they plug into existing processes. “There is still a general lack of understanding of what a digital twin is,” said Procore’s Christian. Right now, any virtual object associated with data is being called a digital twin, he suggested.

And more challenges are in the offing, including the lack of a common data interchange environment that would allow data to easily flow from software to software.

“Even with all the great APIs, cloud-based data, and platform solutions, there still remains a massive amount of data stuck in silos that are not able to be fully accessed,” Christian said.

Assembly required

Today, experts believe enterprises are barely scratching the surface of what digital twins can accomplish, Steve Holzer, principal at Holzer, an architectural and planning consultancy and member of the infrastructure working group at the Digital Twin Consortium, told VentureBeat.

While much attention focuses on the bright shiny side of digital twins, pragmatic considerations are coming into greater play, and guides from other industries are being studied. In the long run, the industry will need to adopt a new mindset to replace most legacy construction methods and processes with the product-driven mindset used in other industries.

“Once we have project thinking replaced with product thinking, construction will be replaced with assembly,” Holzer said.




Microsoft paves digital twins’ on-ramp for construction, real estate


Digital twins for the smart building of the future are still under construction. But Microsoft is working to enable this advanced technology with a special ontology that works with its internet of things (IoT) platform Azure Digital Twins. Such capabilities move smart buildings closer to reality.

An ontology is essentially a shared data model that simplifies the process of connecting applications in a particular domain, and it’s one of the core elements for developing digital twins.
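Concretely, Azure Digital Twins models are written in DTDL (Digital Twins Definition Language), a JSON-LD format. The fragment below is a simplified, illustrative model in that spirit, not part of the RealEstateCore ontology itself; consult the DTDL specification for the full grammar:

```python
# A minimal DTDL-style model expressed as a Python dict.
# IDs and contents are invented for illustration.
room_model = {
    "@context": "dtmi:dtdl:context;2",
    "@id": "dtmi:example:Room;1",
    "@type": "Interface",
    "displayName": "Room",
    "contents": [
        {"@type": "Property", "name": "occupied", "schema": "boolean"},
        {"@type": "Telemetry", "name": "temperature", "schema": "double"},
        {"@type": "Relationship", "name": "isPartOf",
         "target": "dtmi:example:Floor;1"},
    ],
}

def content_names(model):
    """List the capabilities a twin built from this model exposes."""
    return [c["name"] for c in model["contents"]]

print(room_model["@id"], content_names(room_model))
```

The value of a shared ontology is that every vendor’s application agrees on what a “Room” is and which properties, telemetry, and relationships it carries, so data flows between tools without custom mapping.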

“Microsoft is investing heavily in enabling our partners with the technology and services they need to create digital twin solutions that support new and existing needs of the world’s largest real estate portfolios,” said Microsoft Azure IoT general manager Tony Shakib.

This recent push into construction extends the utility of Microsoft’s Azure Digital Twins, released last year.

To gain a foothold in the field, Microsoft partnered with RealEstateCore, a Swedish consortium of real estate owners, tech companies, and research groups, to integrate these services with various industry standards. The resulting smart-building RealEstateCore ontology for Azure Digital Twins enables the various parties in building markets — owners, construction teams, and vendors — to collaborate and communicate about real estate.

This could accelerate the ability to weave IoT data, AI models, and analytics into digital twins, and help simplify the transition to sustainable and green innovation, currently one of the fastest-growing venture capital sectors.

Accelerating digital transformation

Digital transformation has been slow to develop in construction and real estate markets. Microsoft believes that the development of better standards and integrations could help accelerate such transformation. That is important if only because real estate represents one of the largest asset classes in the world. In its recent Global Building Stock Database update, Guidehouse Insights predicts the square footage of buildings will grow from about 166 billion square meters in 2020 to 196 billion square meters in 2030.

Building owners are hoping that digital twins could help increase the value of their existing holdings at less cost than building new ones.

But figuring out how to increase building asset value and net operating income is a complicated problem that spans technology and change management issues, Shakib said.

This shift is further complicated by challenges in retrofitting digital twins’ capabilities to existing building management systems. Shakib said many building management and automation vendors have attempted to limit buildings to custom, proprietary “walled garden” approaches that can hurt clients in the long run.

Better ontologies could smooth this transition. Such thinking was behind the RealEstateCore Consortium, which was born out of a partnership between academia and industry. The consortium created the RealEstateCore ontology, which employs a graph data model and builds on years of best practices gleaned from experience with large property owners such as Vasakronan.

RealEstateCore can provide a bridge to various building industry standards such as Brick Schema, Project Haystack, W3C Building Topology Ontology (W3C BOT), and more. Today, different partners can run into problems integrating applications using custom data formats. This is especially relevant in construction, as there are huge pitfalls from data loss in the steps from building design to construction, commission, handover, and operation.

Seeing a return

Improved digital twins promise significant ROI for building owners and operators. By improving the categorization, integration, and fidelity of data, digital twin developers can create better digital replicas of physical buildings and their component systems.

Some of the early gains come from cost savings related to energy efficiency. Microsoft has been exploring these techniques on its campuses to realize 20% to 30% energy savings. These projects can start by harvesting data from existing building control systems to find room for improvement.

Microsoft’s Project Bonsai has been able to squeeze out an additional 10% to 15% of savings by applying AI to optimize controls further. Down the road, the U.S. Department of Energy’s Grid-Interactive Efficient Buildings initiative could help owners save even more by enabling their facilities to interact with the electric grid in real time.
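At its simplest, control optimization of this kind means searching for setpoints that trade off energy cost against occupant comfort. The sketch below is illustrative only; the cost functions, coefficients, and candidate range are invented, and systems like Project Bonsai use far richer reinforcement-learning models:

```python
# Toy setpoint search: balance cooling energy against comfort.
# All functions and constants are illustrative assumptions.
def energy_cost(setpoint_c, outdoor_c=30.0):
    """Proxy for cooling load: wider indoor/outdoor gap costs more."""
    return max(outdoor_c - setpoint_c, 0) * 1.5

def comfort_penalty(setpoint_c, preferred_c=22.0):
    """Quadratic penalty for drifting from the preferred temperature."""
    return abs(setpoint_c - preferred_c) ** 2

def total_cost(setpoint_c):
    return energy_cost(setpoint_c) + comfort_penalty(setpoint_c)

candidates = [20 + i * 0.5 for i in range(13)]  # 20.0 .. 26.0 degC
best = min(candidates, key=total_cost)
print(best, round(total_cost(best), 2))
```

A digital twin makes this safe to explore: the search runs against the building model first, and only vetted setpoints reach the physical control system.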

Beyond energy savings, there has been rapidly growing interest in using digital twins to optimize building space, activate building amenities, and support various health and wellness scenarios in the wake of COVID. For example, RXR Realty uses Azure Digital Twins to combine building data with people counting, social distance detection, face mask detection, and air quality monitoring to provide a building wellness index. The appropriate ontology also allowed them to capture important metrics while still respecting privacy and ethics.

Turning things into assets

Digital twins help a group of people make sense of the data surfaced by IoT devices. An ontology provides a set of models for wiring these up in a particular domain, such as a building structure, system, city, or energy grid.

An ontology can provide a starting point for organizing the information to solve a problem that spans different roles, such as designers, builders, vendors, and operators. For example, a construction team might need to know how to install a new heater; a general contractor would want to know how long installing it will take, while the owner would want to know the appropriate maintenance schedule.

The built world is complex, and a smart building’s ontology must seek to represent that intricate reality in a way that is simple for developers to use. “An ontology must balance power and comprehensiveness with simplicity and ease of use to generate enough adoption,” Shakib said.

All of the major cloud vendors have announced various IoT initiatives to help weave sensors and actuators into new cloud applications. But Microsoft has been the only one to champion digital twins thus far. The real value of digital twins lies in helping decision-makers understand how these IoT-related applications can be woven together to affect assets in the real world.

