
Nvidia creates digital twin of Earth to battle climate change



Nvidia CEO Jensen Huang boasted of many world-changing technologies in his keynote at Tuesday’s GPU Technology Conference (GTC), but he chose to close on a promise to help save the world.

“We will build a digital twin to simulate and predict climate change,” he said, framing it as a tool for understanding how to mitigate climate change’s effects. “This new supercomputer will be E2, Earth 2, the digital twin of Earth, running Modulus-created AI physics at a million times speeds in the Omniverse. All the technologies we’ve invented up to this moment are needed to make E2 possible. I can’t imagine greater and more important news.”

Utilizing digital twins to model improvements for the real world

Consider this Nvidia’s stretch goal for 2021 — one that feeds not just into scientific computing but into the company’s ambition to transform into a full-stack computing company. Although he spent a lot of time talking up the Omniverse, Nvidia’s concept for connected 3D worlds, Huang wanted to make clear that it’s not intended as a mere digital playground but also a place to model improvements in the real world. “Omniverse is different from a gaming engine. Omniverse is built to be data center scale and hopefully, eventually, planetary scale,” he said.

Earth 2 is meant to be the next step beyond Cambridge-1, the $100 million supercomputer Nvidia launched in June and made available to health care researchers in the U.K. Nvidia pursued that effort in partnership with drug development and academic researchers, with participation from GSK and AstraZeneca. Nvidia press contacts declined to provide more details about who else might be involved in Earth 2, although Huang may say more in a press conference scheduled for Wednesday.

The lack of detail left some wondering if E2 was for real. Tech analyst Addison Snell tweeted, “I believe the statement was meant to be visionary. If it’s a real initiative, I have questions, which I will ask after a good night’s sleep.”


By definition, a supercomputer is many times more powerful than the general-purpose computers used for ordinary business applications. That means the definition of what constitutes a supercomputer keeps changing, as performance trickles down into general-purpose computing — to the point where an iPhone of today is said to be more powerful than the IBM supercomputer that beat world chess champion Garry Kasparov in 1997 and far more powerful than the computers used to guide the Apollo missions.

Battling climate change with Earth’s digital twin

Many of the advances Nvidia announced are aimed at making very high-performance computing more broadly available, for example by allowing businesses to tap into it as a cloud service and apply it to purposes such as zero-trust computing.

Today’s supercomputers are typically built out of large arrays of servers running Linux, wired together with very fast interconnects. As supercomputing centers begin opening access to more researchers — and cloud computing providers begin offering supercomputing services — Nvidia’s Quantum-2 platform, available now, offers an important change in supercomputer architecture, Huang said.

“Quantum-2 is the first networking platform to offer the performance of a supercomputer and the shareability of cloud computing,” Huang said. “This has never been possible before. Until Quantum-2, you get either bare metal high performance or secure multi-tenancy, never both. With Quantum-2, your valuable supercomputer will be cloud-native and far better utilized.”

Quantum-2 is a 400Gbps InfiniBand networking platform made up of the Nvidia Quantum-2 switch, the ConnectX-7 network adapter, the BlueField-3 data processing unit (DPU), and supporting software.

Nvidia did not detail the architecture of E2, but Huang said modeling the Earth’s climate in enough detail to make accurate predictions 10, 20, or 30 years into the future is a devilishly hard problem.

“Climate simulation is much harder than weather simulation, which largely models atmospheric physics — and the accuracy of the model can be validated every few days. Long-term climate prediction must model the physics of Earth’s atmosphere, oceans and waters, ice, the land, and human activities and all of their interplay. Further, simulation resolutions of one to ten meters are needed to incorporate effects like low atmospheric clouds that reflect the sun’s radiation back to space.”
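A quick back-of-envelope calculation shows why that resolution target is so demanding. Assuming Earth's surface area of roughly 5.1 x 10^14 square meters, moving from the ~100 km grids typical of today's global climate models to a 1 m grid multiplies the cell count per vertical layer by ten orders of magnitude:

```python
# Back-of-envelope: why metre-scale climate simulation is so hard.
# Earth's surface area is roughly 5.1e8 km^2 = 5.1e14 m^2.
EARTH_SURFACE_M2 = 5.1e14

def surface_cells(resolution_m):
    """Number of grid cells needed to tile Earth's surface at a given resolution."""
    return EARTH_SURFACE_M2 / (resolution_m ** 2)

# Typical global climate models today run at roughly 100 km (1e5 m) resolution.
coarse = surface_cells(100_000)   # ~5.1e4 cells per vertical layer
fine = surface_cells(1)           # ~5.1e14 cells per vertical layer

print(f"100 km grid: {coarse:.1e} cells/layer")
print(f"1 m grid:    {fine:.1e} cells/layer")
print(f"ratio: {fine / coarse:.0e}x")  # ten orders of magnitude more cells
```

And that is only the horizontal grid for a single layer, before accounting for vertical levels, time steps, and the coupled ocean, ice, land, and human-activity models Huang describes.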

Nvidia is tackling this issue using its new Modulus framework for developing physics machine learning models. Progress is sorely needed, given how fast the Earth’s climate is changing, for example with evaporation-induced droughts and drinking water reservoirs that have dropped by as much as 150 feet.

“To develop strategies to mitigate and adapt is arguably one of the greatest challenges facing society today,” Huang said. “The combination of accelerated computing, physics ML, and giant computer systems can give us a million times leap — and give us a shot.”

VentureBeat

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member

Repost: Original Source and Author Link


Buildots boosts digital twin process mining with $30M



Buildots, a construction digital twin company, garnered a $30 million series B round led by Lightspeed Ventures, bringing its total investment to $46 million. Buildots will use the new funds to double the size of its global team, focusing on sales and R&D to expand its digital twins efforts, which use process mining techniques to improve outcomes as construction trades go digital.

“The new funding will support our ambitious growth plans for 2021-2022, including extending our existing sales team and opening new territories,” Buildots cofounder and CEO Roy Danon told VentureBeat.

“It will also support additional enhancements to the product, such as supporting more project workflows, integrations with other ecosystem players, and [fine-tuning] our AI to provide more critical insights to our clients,” he continued.

Buildots has early customers in 13 different countries, including Build Group in California and Washington state, MBN in Germany, Gammon in Hong Kong, and Wates in the U.K. Previous investors include TLV Partners, Future Energy Ventures, and Tidhar Construction Group.

Operationalizing digital twins

While other companies focus on the design or presentation of 3D construction data, Buildots specializes in operationalizing it. The company concentrates on the gap between existing design, scheduling, and document management tools and the process controls that provide visibility into what’s happening on construction sites, with a focus on higher-frequency updates and greater detail.

Founded in 2018, Buildots aims to improve the user experience for workers and managers. Its special sauce lies in streamlining and automating the reality capture process using hardhat-mounted 360-degree cameras.

The Buildots tools bring process mining techniques to construction projects. For the first time, the software can track the exact process by which construction projects are built, Danon said. Connecting these process models with the original design and schedule information is intended to help managers learn more about bottlenecks in their existing processes and how to get them right the first time.

In the background, Buildots’ AI algorithm double-checks new work against the plan, tracks progress, and updates an as-built digital twin model. The granularity of information in Buildots enables teams to drill down on any issue found on-site and take immediate actions to keep the project on budget and on schedule.

Identifying bottlenecks through process mining

A project’s current state is captured on an ongoing basis through cameras while teams make incremental changes. Proprietary AI and computer vision algorithms fuse this data with the latest design and scheduling plans and update the platform’s internal digital twin.

For example, one European company using Buildots discovered that its concrete finishing team was proceeding much more slowly than the partition building team. This created a bottleneck for the construction of new floors.

The Buildots application alerted managers to the problem. Then it helped them formulate a new plan that diverted workers away from building partitions to finishing the concrete, which reduced delays for everyone.
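As a rough illustration of the kind of throughput comparison such process mining can make, a bottleneck check could look like the sketch below. The function, trade names, and thresholds are invented for the example and are not Buildots' actual model:

```python
# Illustrative sketch (not Buildots' actual algorithm): spot a bottleneck by
# comparing each trade's weekly throughput against the pace the schedule needs.
def find_bottlenecks(progress, required_rate):
    """progress: {trade: units completed this week}; required_rate: units/week
    needed to stay on schedule. Returns trades falling behind, slowest first."""
    behind = {trade: done for trade, done in progress.items() if done < required_rate}
    return sorted(behind, key=behind.get)

weekly = {"partitions": 14, "concrete_finishing": 6, "electrical": 11}
print(find_bottlenecks(weekly, required_rate=10))
# concrete finishing is the constraint, as in the European project above
```

The real system derives these rates from camera captures fused with the schedule, but the managerial output is the same: a ranked list of the trades holding everyone else up.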

Improving 3D model quality

The platform can also identify quality gaps between the plan and what was actually built. It is common for humans to miss some elements when manually comparing building documents to what they see.

Manual tracking processes tend to be infrequent; have low granularity; and rely on people’s objectivity, skill, and attention to detail. Once such processes are automated, teams capture details more frequently, which reduces the delays in resolving problems. It is also possible to drill down into construction progress at the level of an individual socket and its different stages of installation.

For example, the image below shows a white outline where the application detected an element missing from the 3D model of the plan.

“While this isn’t a huge deal for any given outlet, on the average project, we spot a missing element for every 50 to 100 square meters,” Danon said. Averting hundreds of those issues can lead to a substantial efficiency improvement.

Above: Here, software detected an overlooked power outlet requirement. (Image Credit: Buildots)

Transparent AI builds trust

The focus on updating and auditing the data trail across the lifecycle of a project is another key feature. Existing market solutions such as PlanGrid, Procore, and others have already paved the way for mobile apps on the construction site, and today’s engineers and managers are generally comfortable using iPads or web applications in their day-to-day work.

But all these tools require someone to enter data manually. In contrast, Buildots’ approach to digital twins automates this process and connects the data to an audit trail woven into AI models. This transparency allows construction teams to understand how conclusions about a particular project scheduling problem were reached.

“We have built our platform with the principle of transparent AI, meaning that every conclusion the system makes can be drilled down into so that construction managers can develop trust with their new virtual team members,” Danon said.



Cupix digital twin plugs into Autodesk BIM 360 for 3D builder workflows



Construction digital twins pioneer Cupix today announced an integration with Autodesk’s BIM 360 construction management platform. This is intended to streamline construction workflows that weave up-to-date information about the construction process into Autodesk planning tools.

Cupix’s move builds on a prior integration into the Autodesk PlanGrid platform for construction planning. For the vendor and its customers alike, such integrations with the Autodesk environment are a key to bringing digital twins to wider markets. As a mainstay provider of tools for organizing architectural, engineering, and construction management processes, Autodesk will likely influence uptake of digital twins in these key sectors.

“We believe the 3D digital twin platform will come to be seen as a new IT pillar — in the same way ERP, BIM, CRM, and groupware are relied on to improve corporate-wide productivity,” Cupix CEO and Founder Simon Bae told VentureBeat.

He said Cupix’s goal is to simplify the process of capturing real-time data about construction progress using low-cost cameras. This allows remote contractors, managers, owners, and architects to virtually walk through job sites, create new requests for information (RFIs), and assign them based on what they see and learn in the virtual walkthrough.

Streamlining digital twin workflows

Cupix’s special sauce lies in reducing the time, cost, and effort needed to capture up-to-date spatial data in a 3D digital twin for construction. This complements other tools that generate 3D walkthroughs from drawings.

“To date, construction remains largely a 2D industry and one that hungers for technological innovation,” Bae said. “We believe 3D digital twin technology, and CupixWorks in particular, is a game-changer for customers.”

Importantly, Cupix allows non-technical users to capture a 3D representation of a job site using a consumer-grade 360-degree camera, rather than high-end cameras or lidar, components that can cost tens of thousands of dollars. With the Cupix approach, teams can update scans daily rather than waiting days or weeks for a fresh scan.

Benefits go beyond the act of data capture because traditional 3D scan data eats up a lot of bytes.

“You can easily end up with several gigabytes of data after scanning just 10,000 square feet of space,” Bae said. Cupix has focused on reducing data requirements while preserving enough fidelity for everyday use cases, he indicated.

Cupix has particularly focused on improving user experience and workflows in the construction industry. Bae argues that other 3D scanning platforms, such as Matterport, target broader sectors with different requirements. While they may provide high-resolution imagery at a low price, completing regular 3D scans of an active job site can be time-consuming, which makes them less useful for frequently capturing job site views during construction, Bae maintains.

“We believe that the Cupix approach will deliver to customers the collaboration, confidence, and control they need to stay on time, on budget, and on target,” Bae said. That is important if digital twin technology is going to fulfill its promise of bringing digital transformation to construction.



Esri boosts digital twin tech for its GIS mapping tools



Geographic information system (GIS) mainstay Esri is looking to expand its stake in digital twin technologies through significant updates in its product portfolio. As it announced at its recent user conference, the company is updating complex data conversion, integration, and workflow offerings to further the digital twin technology mission.

In fact, GIS software is foundational to many digital twin technologies, although that is sometimes overlooked as the nebulous digital twin concept seeks greater clarity in the market.

Esri’s updates to its ArcGIS Velocity software promise to make diverse big data types more readily useful to digital twin applications. At Esri User Conference 2021, these enhancements were also joined by improvements in reality capture, indoor mapping, and user experience design for digital twin applications.

Reality capture is a key to enabling digital twins, according to Chris Andrews, who leads Esri product development in geo-enabled systems, intelligent cities, and 3D. Andrews gave VentureBeat an update on crucial advances in Esri’s digital twin capabilities.

“Reality capture is a beginning — an intermittent snapshot of the real world in high accuracy 3D, so it’s an integral part of hydrating the digital twin with data,” he said. “One area we will be looking at in the future is indoor reality capture, which is something for which we’re hearing significant demand.”

What is reality capture? One of the most important steps in building a digital twin is to automate the process of capturing and converting raw data into digital data.

Raw data comes in many forms, and organizing it has generally been a manual process. Esri is rapidly expanding workflows for creating, visualizing, and analyzing reality capture content from different sources. This includes point clouds (lidar), oriented and spherical imagery (flat or 360-degree pictures), reality meshes, and data derived from 2D and 3D raster and vector content such as CAD drawings.

For example, Esri has combined elements it gained from acquiring SiteScan and nFrames over the last two years with its in-house developed Drone2Map. Esri also created and is growing the community around I3S, an open specification for fusing data captured by drones, airplanes, and satellites, Andrews told VentureBeat.

ArcGIS Velocity handles big data

Esri recently disclosed updates to ArcGIS Velocity, its cloud integration service for streaming analytics and big data.

ArcGIS Velocity is a cloud-native, no-code framework for connecting to IoT data platforms and asset tracking systems, and making their data available to geospatial digital twins for visualization, analysis, and situational awareness. Esri released the first version of ArcGIS Velocity in February 2020.

“Offerings like ArcGIS Velocity are integral in bringing data into the ArcGIS platform and detecting incidents of interest,” said Suzanne Foss, Esri product manager.

Updates include stateful real-time processing introduced in December 2020, machine learning tools in April and June 2021, and dynamic real-time geofencing analysis in June 2021. The new stateful capabilities allow users to detect critical incidents in a sensor’s behavior over time, such as change thresholds and gap detection. Dynamic geofencing filters improve the analysis between constantly changing data streams.

Velocity is intended to lower the bar for bringing in data from across many different sources, according to Foss. For example, a government agency could quickly analyze data from traffic services, geotagged event data, and weather reports to make sense of a new problem. While this data may have existed before, it required much work to bring it all together. Velocity lets users get mashup data into new analytics or situational reports with a few clicks and appropriate governance. It is anticipated that emerging digital twins will tap into such capabilities.
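To make terms like "gap detection" and "change thresholds" concrete, here is a minimal stream-processing sketch in Python. It is not the ArcGIS Velocity API (Velocity is a no-code cloud service); it is only an illustration of the kind of stateful checks over a sensor's behavior described above:

```python
# Generic illustration of two stateful stream checks: gaps in a sensor feed
# and sudden value jumps. Not the ArcGIS Velocity API; thresholds are invented.
def detect_incidents(readings, gap_s=60, jump=5.0):
    """readings: list of (timestamp_s, value) pairs, sorted by time.
    Flags gaps (no data for more than gap_s seconds) and jumps (> jump)."""
    incidents = []
    for (t0, v0), (t1, v1) in zip(readings, readings[1:]):
        if t1 - t0 > gap_s:
            incidents.append(("gap", t1))
        if abs(v1 - v0) > jump:
            incidents.append(("jump", t1))
    return incidents

stream = [(0, 20.1), (30, 20.3), (150, 27.0), (180, 27.2)]
print(detect_incidents(stream))  # flags the 120 s silence and the 6.7-unit jump
```

Checks like these are "stateful" because each decision depends on the previous reading, not just the current one; Velocity layers geofencing and machine learning tools on top of the same streaming foundation.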

Building information modeling tie-ins

One big challenge with digital twins is that vendors adopt file formats optimized for their particular discipline, such as engineering, operations, supply chain management, or GIS. When data is shared across tools, some of the fidelity may be lost. Esri has made several advances to bridge this gap, such as adding support for Autodesk Revit and open IFC formats. It has improved the fidelity of reading CAD data from Autodesk Civil 3D and Bentley MicroStation in a way that preserves semantics, attribution, and graphics, and it has enhanced integration into ArcGIS Indoors.

Workflows are another area of focus for digital twin technology. The value of a digital twin comes from creating digital threads that span multiple applications and processes, Andrews said. It is not easy to embed these digital threads in actual workflows.

“Digital twins tend to be problem-focused,” he said. “The more that we can do to tailor specific product experiences to include geospatial services and content that our users need to solve specific problems, the better the end user experience will be.”

Esri has recently added new tools to help implement workflows for different use cases.

  • ArcGIS Urban helps bring together available data with zoning information, plans, and projects to enable a digital twin for planning applications.
  • ArcGIS Indoors simplifies the process of organizing workflows that take data from CAD tools for engineering facilities, building information modeling (BIM) data for managing operations, and location data from tracking assets and people. These are potentially useful in, for example, ensuring social distancing.
  • ArcGIS GeoBIM is a new service slated for launch later this year that will provide a web experience for connecting ArcGIS and Autodesk Construction Cloud workflows.

Also expected to underlie digital twins are AR/VR technologies, AI, and analytics. To handle that, Esri has been working to enable the processing of content as diverse as full-motion imagery, reality meshes, and real-time sensor feeds. New AI, machine learning, and analytics tools can ingest and process such content in the cloud or on-premises.

AI models trained on digital twin farms

The company has also released several enhancements to organizing map imagery, vector data, and streaming data feeds into features for AI and machine learning models. These can work in conjunction with ArcGIS Velocity either for training new AI models or for pushing them into production to provide insight or improve decision making.

For example, a farmer or agriculture service may train an AI model on digital twins of farms, informed by satellite feeds, detailed records of equipment movement, and weather predictions, to suggest steps to improve crop yield.

Taken as a whole, Esri’s efforts seek to tie very different kinds of data together into a comprehensive digital twin. Andrews said the company has made strides to improve how these might be scaled for climate change challenges. Esri can potentially power digital twins at “the scale of the whole planet” and address pressing issues of sustainability, Andrews said.

Like so many events, Esri UC 2021 was virtual. The company pledged to resume in-person events next year.



Mapped raises $6.5M to build API for the ‘digital twin of data infrastructure’



Data infrastructure platform maker Mapped has raised an additional $6.5 million in seed II funding, on top of $3 million raised in February, to build out a universal API for connecting control systems for elevators, HVAC, and other physical assets. Investors include MetaProp and Allegion Ventures as co-leads, as well as Singtel Innov8, Greycroft, and Animo Ventures.

Mapped simplifies access to physical building assets through a standard vocabulary, while supporting a secured API perimeter. The company already provides access to 30,000 different types of equipment. This investment will help it expand to support more equipment types and integrations and grow its go-to-market efforts.

The platform can be thought of as a digital twin of data infrastructure, Mapped founder and CEO Shaun Cooley told VentureBeat. This complements other types of construction digital twins for pre-construction modeling, simulation, and digital representation.

Today, developers confront different APIs, security models, and data elements when building applications spanning multiple physical devices and cloud applications. Mapped provides a middleware tier to simplify the work required to describe the relationship between people, places, and equipment in new applications. For example, a developer can create applications for triggering restroom service requests driven by fixture sensors, integrate visitor management applications with access control systems, or share building data with managed service providers.
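To make the middleware idea concrete, here is a hypothetical sketch of what such a normalization layer does: it maps vendor-specific payloads into one shared schema. The vendor names, payload fields, and output schema below are invented for illustration and are not Mapped's actual API:

```python
# Hypothetical illustration of the middleware idea: two vendors report data in
# different shapes, and a mapping layer normalizes both into one schema.
# All vendor and field names are invented for this example.
def normalize(vendor, payload):
    """Translate a vendor-specific payload into a common reading schema."""
    if vendor == "hvac_vendor_a":
        return {"asset": payload["unitId"], "metric": "temperature",
                "value": payload["tempF"], "unit": "F"}
    if vendor == "lighting_vendor_b":
        return {"asset": payload["fixture"], "metric": "power",
                "value": payload["watts"], "unit": "W"}
    raise ValueError(f"no mapping for {vendor}")

print(normalize("hvac_vendor_a", {"unitId": "AHU-1", "tempF": 71.5}))
```

With a layer like this in place, an application developer writes against the common schema once instead of learning each equipment vendor's API separately.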

Painful integration challenges

Cooley conceived of the idea while vice president and CTO for IoT at Cisco as he watched the largest industrial and commercial businesses in the world struggle to scale their digitization efforts.

“A successful pilot in one building or factory would immediately run into the painful realities of integration when it was moved to the building or factory across the street,” he said.

The big challenge is that existing modern buildings were automated by one or more systems integrators using whatever technologies were available at the time. For example, HVAC systems used one approach, lighting systems another, and plumbing another. The integrator’s job was to automate the necessary processes — and not to worry about how the components, configuration, and programming of the automation compared to that building or factory across the street.

Mapped provides a consolidated data infrastructure layer for discovering, ingesting, and normalizing data. This eliminates months of manual integration effort for applications that may span multiple systems.

Ontologies on the rise

Ontologies provide a dictionary for characterizing how to structure data in an organized, simple, and extensible way to support data reuse. But there are many ways to do this. Cooley said that existing middleware platforms and digitization efforts left ontology as an exercise for the implementer, which resulted in custom ontologies on top of custom equipment.

Mapped has spearheaded an open source ontology called the Brick schema for describing physical, logical, and virtual assets and the relationship between them. It complements other industry ontologies for describing physical building layouts (Building Topology Ontology), tagging building assets (Project Haystack), and describing smart appliances (Smart Applications Reference, or SAREF).
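Brick is an RDF ontology, typically serialized in Turtle and queried with SPARQL. As a rough illustration of the idea, the plain-Python sketch below models a building as (subject, predicate, object) triples; the class and relationship names follow Brick's style but should be treated as illustrative rather than authoritative:

```python
# Plain-Python sketch of the Brick idea: a building described as RDF-style
# (subject, predicate, object) triples. Real deployments use RDF tooling;
# the class and relationship names below are modeled on Brick's style.
triples = [
    ("AHU-1", "a", "brick:Air_Handling_Unit"),
    ("AHU-1", "brick:feeds", "VAV-2"),
    ("VAV-2", "a", "brick:Variable_Air_Volume_Box"),
    ("VAV-2", "brick:hasPoint", "TS-7"),
    ("TS-7", "a", "brick:Temperature_Sensor"),
]

def objects_of(subject, predicate):
    """All objects linked from `subject` via `predicate` -- a toy graph query."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects_of("AHU-1", "brick:feeds"))     # ['VAV-2']
print(objects_of("VAV-2", "brick:hasPoint"))  # ['TS-7']
```

Because every integrator describes equipment with the same vocabulary, an application can ask "which sensors sit downstream of this air handler?" without knowing which vendor installed any of it.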

Cooley is betting that Mapped can follow in the footsteps of other companies that have normalized data and ontology layers in pivotal use cases. Examples could include Twilio for communications, Plaid for financial records, and Stripe for payments. Mapped can normalize data from nearly 50 different system and source types, including HVAC, lighting controls, elevators, 3D maps, access controls, air quality monitoring, calendaring systems, digital signage, energy management, fire safety systems, indoor positioning, visitor management, parking systems, security systems, telepresence units, and Wi-Fi.

Mapped uses several techniques to discover and ingest data from disparate devices in industrial and commercial environments. It can both actively speak to machines and controllers and passively monitor existing communications to extract data without adding load. The gateway can also reach into a controller, interpret existing programs, and present and generate a service interface to other apps for securely adjusting equipment.

Another significant innovation is that Mapped can move the security perimeter from each class of equipment ecosystem to a unified security perimeter in the cloud for all types of equipment. As a result, partners and software developers get a simple, robust, and reliable API rather than wasting months building one-off integrations for each deployment.

“The inbound interest from partners and their developers that want to make use of Mapped’s APIs has been amazing, and we will continue to work with partners to simplify their integration efforts and build new capabilities,” Cooley said.



Our Milky Way isn’t as unique as we thought — there’s a twin galaxy 320 million light-years away

It’s no surprise the Milky Way is the most-studied galaxy in the universe, given it’s where we live.

But studying just one galaxy can only tell us so much about the complex processes by which galaxies form and evolve.

One crucial question that can’t be solved without looking farther afield is whether the Milky Way is a run-of-the-mill galaxy, or whether it’s unusual or even unique.

Our research, published today in The Astrophysical Journal Letters, suggests the former is true. Key details of our galaxy’s structure are shared by other nearby galaxies, suggesting our home isn’t all that special.