Meow Wolf, Anthos team for multi-cloud app management in art shows

Meow Wolf’s work with SADA, a Google Cloud Premier Partner with multiple specializations, and its use of Anthos multi-cloud app management were featured in a spotlight session on immersive art experiences at the Google Cloud Next ’21 conference held online through October 14.

Meow Wolf is an American arts and entertainment company that creates large-scale immersive art installations and produces streaming content, music videos, and arts and music festivals. SADA is a cloud-computing consultant based in North Hollywood, California. Google Anthos is a next-gen, hybrid- and multi-cloud application management platform that aims to provide a consistent development and operations experience for cloud and on-premises environments.

Scalable, flexible multi-cloud app management

Known more for its work with enterprise clients, SADA is helping Meow Wolf design and apply solutions for its permanent multimedia installations, such as Omega Mart, now open in Las Vegas. Anthos fit Meow Wolf’s requirements for a modern cloud application that could be deployed on-premises to ensure low latency and fault tolerance.

The complex, always-on nature of Omega Mart required the scalable IT infrastructure Anthos offers. Anthos allows apps to run unmodified on existing on-premises hardware and many public clouds in simple, flexible, and secure ways.

“Anthos has helped us create a groundbreaking experience that immerses guests in a way that’s never been done before,” said Jordan Snyder, vice president of platform at Meow Wolf. “It gives us a ‘single pane of glass’ to monitor, maintain, and quickly push out app updates.”
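As a concrete, if simplified, illustration of that single-pane workflow, here is a minimal sketch of pushing one app update to several clusters from a single control point using the open-source Kubernetes Python client. The cluster context names, image, and namespace are hypothetical placeholders, and Anthos itself layers fleet registration, config sync, and policy management on top of standard Kubernetes APIs like these.

```python
# Minimal sketch: roll the same app update out to several clusters.
# Context names and the image are hypothetical placeholders.
from kubernetes import client, config

# One on-prem cluster in the venue (low latency), one cloud cluster.
CLUSTER_CONTEXTS = ["onprem-exhibit", "gcp-us-west1"]

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="exhibit-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "exhibit-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "exhibit-app"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="exhibit-app",
                    image="registry.example.com/exhibit-app:v2",
                )
            ]),
        ),
    ),
)

for ctx in CLUSTER_CONTEXTS:
    # Load credentials for this cluster and apply the identical manifest,
    # so every environment runs the same unmodified application.
    config.load_kube_config(context=ctx)
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )
    print(f"Deployed exhibit-app:v2 to {ctx}")
```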

Omega Mart, an interactive “supermarket,” is Meow Wolf’s second permanent art exhibition leveraging the hybrid cloud platform to run sensory installations.

Billed as the world’s most surreal supermarket and sensory playground, the installation, which opened in February, features otherworldly displays, hidden portals, immersive art experiences, and shelves stocked with peculiar products. With live, interactive displays that can be accessed via RFID-powered Boop Card readers, shoppers become part of the experience.

“It’s exciting to know that technology like Anthos can be applied to bring artistic visions to life in new and creative ways,” said Miles Ward, CTO at SADA. “Omega Mart is one of many amazing ways to apply Anthos technology.”

“SADA has been instrumental to this process, from helping us conceive the technical solutions to tackling various hurdles along the way,” Snyder said. SADA’s consultants worked with Anthos to help Meow Wolf design and apply solutions that meet Omega Mart’s needs. “Their guidance, expertise, and support helped make the launch of Omega Mart a huge success,” he added.

Anthos now hosts Meow Wolf’s applications and the installations that capture customer interactions with the Boop Card and computer stations, which facilitate the experience’s interactive gameplay and drive the exhibit’s story. Since the exhibit opened, SADA has continued to provide technical account management and support.

Copado acquires Qentinel to bring multicloud software testing to DevOps

Copado, a developer operations (DevOps) company initially built for the Salesforce ecosystem, has announced plans to acquire AI-powered software testing platform Qentinel. Terms of the deal were not disclosed.

The software testing market was estimated to be worth $45.6 billion last year, a figure that’s expected to more than double within six years. With every company now effectively a software company and the cloud taking center stage, businesses in every sector need tools that help them develop and ship bug-free software at speed. This is why the software testing space has seen a flurry of activity of late, with BrowserStack and LambdaTest raising $200 million and $16 million, respectively, in the past few weeks alone.
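As a quick sanity check on that forecast, “more than double within six years” implies a compound annual growth rate north of roughly 12 percent. The back-of-the-envelope calculation below simply restates the article’s figures:

```python
# Implied growth rate if a $45.6B market at least doubles in six years.
start_billions, years = 45.6, 6
cagr = 2 ** (1 / years) - 1          # doubling over `years` periods
print(f"Implied CAGR: {cagr:.1%}")   # ~12.2% per year
print(f"Projected floor: ${start_billions * 2:.1f}B")
```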

Founded in 2013, Copado serves enterprises with an integrated Salesforce-native platform spanning the whole DevOps process, including agile planning, continuous delivery, automated testing, and compliance. More recently, the company has expanded its support to additional clouds and platforms outside of Salesforce, such as Veeva, Heroku, and MuleSoft.

Finnish company Qentinel provides automated software testing tools that work across major cloud platforms, including Oracle, SAP, and Microsoft, for major enterprises such as Kone and Thyssenkrupp.

This is a big deal for Copado, as it signals the company’s intentions to continue its transition beyond the Salesforce ecosystem for which it is better known.

“Bringing Qentinel into Copado DevOps allows us to ensure continuous quality when building and deploying in multicloud environments,” Copado CEO Ted Elliott said in a statement.

The Qentinel acquisition, expected to close in the next month, follows shortly after Copado raised $96 million in funding and acquired multicloud developer security operations (DevSecOps) startup New Context.

Microsoft turns attention to multi-cloud and AI with Azure updates

Microsoft delivered more than 100 product and service announcements at its Build conference last week. As CEO Satya Nadella said in his keynote, these announcements support the continued “tech intensity” that accelerated throughout the 2020 global pandemic and expedited the company’s investments in digital transformation, especially the adoption of cloud computing, artificial intelligence, and big data.

Build, the company’s annual event for developers, focused on the intersection of IT modernization and the development of applications that can leverage data and AI to power the way we work, learn, and communicate. The event also focused a lot on the idea of “creators” — not the kind we hear about in the YouTube or TikTok world, but the kind Microsoft hopes to support with its no-code, low-code, and pro-code offerings designed to allow more members of an organization to concurrently deliver material value to the group, making development a team sport where both coders and business experts contribute.

With so many new capabilities brought to market over the three-day event, it would be hard to cover the whole gamut of announcements. But as an analyst, I was particularly drawn to updates around the Azure ecosystem and wanted to provide some insights on what caught my attention and what it means for Microsoft and its ecosystem.

Key enhancements for Azure Arc

The first thing that caught my eye was the increased support for Azure Arc and the company’s clear positioning that multi-cloud is here to stay: everything it builds needs to support not only Azure but also on-prem environments, the edge, and other clouds such as Google Cloud and AWS.

Arc was originally announced in 2019 as the company’s solution for managing resources across clouds. This year’s event included the announcement of several newly Arc-enabled cloud services, including Azure App Service, Functions, Logic Apps, API Management, and Event Grid, allowing more of Microsoft’s services to run in other clouds.

Here is a quick rundown of each.

Azure App Service: A fully managed service for building, deploying, and scaling web apps.

Functions: An event-driven serverless compute platform designed to reduce the complexity of application orchestration.

Logic Apps: A new integration platform as a service (iPaaS) built on a containerized runtime that makes apps more scalable, portable, and automated across the IT environment.

API Management: A hybrid and multi-cloud management platform that lets developers deploy API gateways side by side regardless of host location, optimizing API traffic flow.

Event Grid: A single service for managing event routing from any source to any destination.

In brief, these services enable users to run Kubernetes clusters on-premises, across multiple clouds, and in edge environments using Azure, unlocking data for use with Azure compute services no matter where it lives. I believe these offerings are essential to the current landscape of IT modernization, which often proceeds in parallel with application development; as time goes on, these functions will work together more harmoniously.
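As a small illustration of that “any location” model, the following sketch uses the Azure Python SDK to inventory Arc-connected Kubernetes clusters in a subscription, wherever the underlying clusters physically run. The subscription ID is a placeholder, and the snippet assumes credentials are available to DefaultAzureCredential (for example, from a prior az login).

```python
# Minimal sketch: list Kubernetes clusters registered with Azure Arc.
# Requires: pip install azure-identity azure-mgmt-resource
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Arc-enabled Kubernetes clusters register under this resource type,
# whether they run on-prem, at the edge, or in another cloud.
arc_clusters = client.resources.list(
    filter="resourceType eq 'Microsoft.Kubernetes/connectedClusters'"
)

for cluster in arc_clusters:
    print(cluster.name, cluster.location)
```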

Microsoft forms new Azure Applied AI Services

Microsoft announced a plethora of enhancements for Azure AI, Azure Cognitive Services, and Azure ML. All of these updates are designed to enable users to do more with their data. Moreover, the company was pretty straightforward about wanting to allow developers to better leverage the power of AI in apps being developed on its platform.

The new tools, along with some newly minted general-availability (GA) announcements, include:

Updates to Azure Bot Service: A visual authoring canvas along with open-source tools that let developers add telephony and speech capabilities when testing, debugging, and deploying multi-channel bots without requiring massive changes to developer code.

Azure Metrics Advisor: This service went into preview last September but is now GA. Metrics Advisor is a monitoring platform that offers APIs for data ingestion, diagnostics, and anomaly detection without the requirement of machine learning knowledge.

Azure Video Analyzer: Video analytics are hot, and this new offering combines Live Video Analytics and Video Indexer. This service is currently in preview and is designed to deliver analytics from streaming and stored videos, including the auto-extraction of advanced metadata.

Microsoft notably announced it will combine these new and updated offerings with Azure Form Recognizer, Azure Immersive Reader, and Azure Cognitive Search to make up what it will call the ‘Azure Applied AI Services Group.’ The importance of these tools and connectors comes down to making AI more usable for developers. Despite the constant chatter about AI’s ability to enhance applications, Microsoft knows the critical path is to shorten development time and simplify the inclusion of AI while building and improving applications.

Azure, for the foreseeable future, will continue to chase AWS’s massive infrastructure business. Still, it is hard to argue with the vast developer ecosystem the company has created and the hooks that tie these tools and services together. Azure serves as the foundation for the data ecosystem, and the continued evolution of Microsoft’s developer ecosystem toward “Code meets Creator” gives companies a platform to embrace the massive portfolio of software and solutions Microsoft offers its customers.

This year’s event addressed the rapid proliferation of hybrid cloud and multi-cloud and the enterprise requirement to apply data at scale. I expect this trend to continue as Microsoft seeks to further cement adoption of its applications and use of its cloud and AI services.

Daniel Newman is the principal analyst at Futurum Research, which provides research, analysis, advising, and consulting to companies in the tech and digital industries.

AI, cyber terrain analytics improve hybrid multicloud security

Typical hybrid cloud IT integration strategies have fundamental design flaws that CIOs and CISOs need to address if they’re going to avert another attack on the scale of SolarWinds. The flaws are evident in existing approaches to integrating public and private clouds with legacy systems: inconsistent endpoint security and privileged access management have turned out to be highly penetrable and painfully lacking.

The first two articles in this series explain how getting hybrid cloud security right is hard and how the SolarWinds hack exposed hybrid clouds’ greatest weaknesses. This post lays out an approach to solve hybrid cloud security challenges today.

Finding security gaps with network maps

The best first step to improving hybrid cloud security is to gain an accurate, real-time view of every public, private, and community cloud and its integrations into legacy systems. The goal is to gain greater visibility and control across the entire network by continually capturing data on network activity down to the endpoint. Applying machine learning algorithms and cyber terrain analysis to the data uncovers security gaps hidden in data logs or points to openings where data is not captured at all.

A network mapping strategy must focus on quantifying how data moves within and between hybrid platforms. Hidden in the terabytes of data that hybrid clouds generate are indicators of potential vulnerabilities and, in the worst cases, anomalous activity indicating a breach attempt.

Comprehensive network maps that range down to the IP address level, combined with a network’s activity data, can identify potential security gaps. A data-centric approach based on real-time monitoring of a hybrid cloud network identifies the most vulnerable systems, network connections, and endpoints.
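The sketch below illustrates the core idea under stated assumptions: treat observed flows as a directed graph and ask whether any internet-facing endpoint can reach a sensitive host. The flow records and host labels are hypothetical stand-ins for what a monitoring platform would actually capture.

```python
# Minimal sketch: model observed network flows as a graph and surface
# paths from exposed endpoints to sensitive hosts.
import networkx as nx

# (src_ip, dst_ip) pairs observed in real-time flow data (hypothetical).
flows = [
    ("203.0.113.10", "10.0.1.5"),   # internet -> web tier
    ("10.0.1.5", "10.0.2.8"),       # web tier -> app tier
    ("10.0.2.8", "10.0.3.12"),      # app tier -> database
]
internet_facing = {"203.0.113.10"}
sensitive = {"10.0.3.12"}

graph = nx.DiGraph(flows)

# Any path from an exposed endpoint to a sensitive host is a candidate
# security gap (missing segmentation, stale firewall rule) to review.
for src in internet_facing:
    for dst in sensitive:
        if nx.has_path(graph, src, dst):
            path = " -> ".join(nx.shortest_path(graph, src, dst))
            print(f"Exposure path: {path}")
```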

Real-time network monitoring also proves more effective than unifying the completely different monitoring approaches every public cloud platform has. Please don’t believe the hype from cloud platform providers that claim to support visibility across third-party cloud platforms and secure a hybrid cloud configuration. It’s best to take an impartial, independent strategy when it comes to network mapping a hybrid cloud configuration, ideally choosing a monitoring platform that delivers real-time data monitoring too.

Look for these core areas of expertise when evaluating hybrid cloud mapping and security analysis platforms.

First, understand that, at a minimum, any cyber risk modeling platform needs to identify and isolate device endpoint vulnerabilities at the physical level of the network. It’s essential that a mapping platform supports this, because the telemetry data this generates is the foundation for creating an accurate network map.

Second, network mapping platforms need to identify whether each endpoint is up to date on patch management, where the endpoint sits in the configuration structure of the hybrid cloud network, and what its potential vulnerabilities are, down to the level of the operating system and endpoint security patches.

Third, an effective network mapping platform can track each device down to the IP address, providing contextual intelligence and locational data.

Fourth, any network mapping platform needs to excel at visualization and provide insightful analysis at a graphical level to identify potential security anomalies and actual breach activity.

A useful illustration is how RedSeal’s cyber risk modeling software for hybrid cloud environments works. Cisco has standardized on this approach to identify security gaps in its hybrid cloud strategy and optimize hybrid cloud network performance.

Above: Combining real-time monitoring with visualization is key to finding security gaps in hybrid cloud networks.

Image Credit: RedSeal

Machine learning identifies network vulnerabilities

Machine learning models are proving effective at identifying security gaps in hybrid cloud networks by combining supervised and unsupervised algorithms to identify anomalies and create new predictive models from the results. The real-time monitoring data obtained from network mapping starts to pay off when risk and threat correlation engines provide terrain mapping data and visualizations of a hybrid cloud network. Flaws, gaps, overlooked security configurations, and potential breach attempts are faster to find and remediate using machine learning analysis and visualization techniques.
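For a flavor of the unsupervised half of that pairing, the sketch below trains an isolation forest on synthetic flow features and flags outliers. The features (bytes transferred, duration, destination-port entropy) are assumptions for illustration; a production pipeline would pair this with supervised models trained on confirmed incidents, as described above.

```python
# Minimal sketch: flag anomalous network flows with an isolation forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic "normal" flows: [bytes_transferred, duration_s, port_entropy]
normal = rng.normal(loc=[5e4, 2.0, 1.5], scale=[1e4, 0.5, 0.3],
                    size=(1000, 3))
# Synthetic outliers resembling large, exfiltration-like transfers.
suspects = np.array([[5e6, 300.0, 4.0], [2e6, 120.0, 3.5]])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# predict() returns -1 for flows the model deems anomalous.
print(model.predict(suspects))    # expected: [-1 -1]
print(model.predict(normal[:3]))  # mostly 1 (normal)
```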

Machine learning’s impact on hybrid cloud network mapping and vulnerability assessment has led some vendors to create threat reference libraries, which compare configurations using threat correlation engines. By capitalizing on insights from supervised machine learning models that continually learn from real-time monitoring data, threat correlation engines prove accurate at identifying breach attempts and anomalous activity. For organizations pursuing a hybrid cloud infrastructure strategy to support new businesses and services, that’s welcome news.

Paralleling the development of correlation engines are risk engines that capitalize on the data captured from real-time network monitoring. Risk engines use advanced predictive analytics to calculate the relative risk levels posed by unique combinations of hosts. By employing algorithms to cycle through multiple scenarios involving randomized hosts, these risk engines identify the most critical vulnerabilities. From there, risk scores define a prioritized list of vulnerabilities that need security teams’ immediate attention.
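Here is a minimal sketch of that scoring step, with hypothetical hosts and weights: it blends vulnerability severity, network exposure, and asset criticality into a single score and emits a prioritized worklist. Real risk engines also cycle through randomized host-combination scenarios, which this toy version omits.

```python
# Minimal sketch: rank hosts by a blended risk score (hypothetical data).
hosts = [
    # (hostname, max CVSS severity 0-10, internet-reachable, criticality 1-5)
    ("web-01", 9.8, True, 3),
    ("db-01", 7.5, False, 5),
    ("hvac-ctrl", 8.1, True, 2),
]

def risk_score(severity: float, exposed: bool, criticality: int) -> float:
    # Weighted blend, capped at 100; exposure acts as a multiplier
    # because reachable hosts make likelier first footholds.
    base = 0.6 * (severity / 10) + 0.4 * (criticality / 5)
    return round(min(base * (1.5 if exposed else 1.0), 1.0) * 100, 1)

ranked = sorted(((h, risk_score(s, e, c)) for h, s, e, c in hosts),
                key=lambda item: item[1], reverse=True)

for host, score in ranked:    # highest-risk hosts first
    print(f"{host}: {score}")
```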

Cyber terrain analytics combines the results of the risk and threat correlation engines, continually refining them with real-time network monitoring data. Over time, the machine learning algorithms supporting the two engines fine-tune the terrain analytics to quantify how resilient a hybrid cloud network is while also identifying vulnerabilities. The approach is proving effective at identifying threats in real time and thwarting breach attempts in hybrid cloud configurations that would otherwise go undetected. Terrain analytics can also model or simulate threat scenarios, providing invaluable data to organizations focused on hardening their hybrid cloud configurations.

Above: Cyber terrain analytics provide a real-time assessment of hybrid cloud resilience levels by combining insights gained from machine learning-based risk and threat correlation engines.

Image Credit: RedSeal

Answers lurk in the real-time data streams

Hybrid clouds’ greatest security weaknesses haven’t been discovered yet. That’s because they’re being managed for the most part with security techniques and tools that are decades old and were made for a time when business models were much simpler.

Today we need a more data-centric approach to security for hybrid cloud infrastructure, one that combines the best of what data governance can provide with the latest machine learning technologies for identifying and acting on vulnerabilities.

The answers to improving hybrid cloud security are hidden in the real-time data streams these platforms produce as they operate and interact with both valid internal users and bad actors attempting to breach the system. Creating contextual intelligence, along with a real-time view of all hybrid cloud activity, is where that effort needs to start.
