Algolia improves site search functionality with Search.io acquisition

AI-powered search API platform provider Algolia is acquiring privately held vector-search vendor Search.io in a deal being formally announced today. Financial terms of the deal are not being publicly disclosed.

Algolia has developed proprietary technology that enables organizations to search internal resources and websites. To date, Algolia’s technology has used a keyword-based approach to search, augmented by artificial intelligence (AI) to help improve relevance. Search.io has developed its own system as well, but unlike Algolia’s core engine, it doesn’t rely on keyword relevancy. Instead, Search.io has built a vector database engine that uses AI to convert content into numerical representations, so that relevance can be determined by how close those representations sit to one another.
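
To make the idea concrete (this is a generic illustration, not Algolia’s or Search.io’s actual code), the sketch below converts a few product descriptions into vectors and ranks them by cosine similarity to a query vector; the embed() function is a hypothetical stand-in for whatever trained model does the text-to-numbers conversion in a real system.

    import numpy as np

    def embed(text):
        # Hypothetical stand-in: a real system uses a trained language model
        # to turn text into a dense vector; here we just derive a repeatable
        # random vector from the string so the mechanics can be shown.
        rng = np.random.default_rng(abs(hash(text)) % (2**32))
        return rng.normal(size=8)

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    docs = ["evening gown", "mother of the bride dress", "running shoes"]
    doc_vectors = {d: embed(d) for d in docs}

    # Relevance comes from proximity in vector space, not keyword overlap.
    query_vector = embed("killer outfit for the mother of the bride")
    ranked = sorted(docs, key=lambda d: cosine(query_vector, doc_vectors[d]), reverse=True)
    print(ranked)

With real embeddings, semantically related phrases land near one another, which is what lets a query match a product it shares no keywords with.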

With its acquisition of Search.io, Algolia’s goal is to enable an even more accurate approach to site search using the power of AI. For example, instead of a basic search built on one or two keywords like “women’s clothes,” Bernadette Nixon, CEO of Algolia, told VentureBeat that a more natural way to search is to describe exactly what the user wants. If her sister’s son were getting married, she said, she would want to use a search query like “killer outfit for the mother of the bride.”

“Consumers question whether keywords are the most effective way for them to search when they’re shopping,” Nixon said. “What people want is to be able to search as they think.”

Search.io has branded the technology it has developed as Neuralsearch, which provides AI-powered semantic search capabilities. At its core, it’s a vector database that enables highly relevant search queries to be executed.

“The reason that the vector database is so much more powerful than previous incarnations of how you deliver semantic search, for example, is because it has been trained on literally billions of documents,” Nixon said.  “So the vector engine is therefore able to make the connections and give better context.”

Nixon explained that in the vector engine, content is computed into a multidimensional representation, meaning it carries multiple associations with other items in the same index. She added that the engine also computes distances between items, because distance affects context as well.

She noted that a common concern with vector engines is that processing, storage, and retrieval are more expensive, because data is converted into floating-point numbers. That is where Search.io has taken a distinctive approach: its Neuralsearch technology uses a hashing technique that lets the vector engine scale without specialized hardware and infrastructure.
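
The article doesn’t disclose how Neuralsearch’s hashing works, but locality-sensitive hashing is one common way to get this kind of saving, and a minimal sketch of that general idea (an assumption for illustration, not Search.io’s actual method) looks like this: random hyperplanes turn each floating-point vector into a short bit signature, and cheap Hamming distance on the bits stands in for expensive float comparisons.

    import numpy as np

    # Generic locality-sensitive hashing (LSH) sketch: random hyperplanes map
    # floating-point vectors to compact bit signatures, and Hamming distance
    # on those bits approximates similarity without storing or comparing the
    # full float vectors.
    DIM, BITS = 8, 16
    rng = np.random.default_rng(0)
    planes = rng.normal(size=(BITS, DIM))

    def signature(vec):
        return (planes @ vec > 0).astype(np.uint8)  # one bit per hyperplane

    def hamming(sig_a, sig_b):
        return int(np.count_nonzero(sig_a != sig_b))

    a, b = rng.normal(size=DIM), rng.normal(size=DIM)
    print(hamming(signature(a), signature(b)))  # smaller distance ~ more similar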

Combining keyword and vector engines will enable a new type of site search and better recommendations

A traditional keyword-based search index is very different from a vector-based index. What Nixon said her team plans to do is bring to market a hybrid search engine that combines both keyword search and vector search.

Nixon said that Algolia will need to maintain two different indices, but that will be abstracted away from users. Algolia’s technology exposes an API that an organization connects to in order to query and retrieve search results. With the new hybrid keyword/vector search, Algolia will combine both indices behind a single API call: a user’s query is sent to both engines, and the results are merged to provide the highest level of accuracy and relevance.
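
Algolia hasn’t published its merging logic, but the general shape of such a hybrid query, one call fanned out to a keyword index and a vector index with the scores blended, might look roughly like the sketch below; keyword_index, vector_index, and the 50/50 weighting are all assumptions for illustration.

    def hybrid_search(query, keyword_index, vector_index, alpha=0.5):
        """Fan a single query out to both engines and blend the scores.

        keyword_index.search and vector_index.search are hypothetical; each
        is assumed to return a {doc_id: score} dict with scores in 0..1.
        """
        kw_scores = keyword_index.search(query)
        vec_scores = vector_index.search(query)
        blended = {
            doc_id: alpha * kw_scores.get(doc_id, 0.0)
                    + (1 - alpha) * vec_scores.get(doc_id, 0.0)
            for doc_id in set(kw_scores) | set(vec_scores)
        }
        # The caller still sees one ranked result list behind one API call.
        return sorted(blended, key=blended.get, reverse=True)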

Algolia offers a range of technologies, including its search engine as well as a recommendation product that suggests items to users. The recommendation engine will also benefit from Search.io’s technology, which will bring in new AI models to help improve those results.

“Both companies have a long history of really focusing on relevancy,” Nixon said. “Combining the capabilities that we have as the two companies is what is going to be able to make us be able to have the most performance and the most cost effective results on the market.”

At Ignite 2021, Microsoft showcases functionality for scalable AI apps

Part of what Microsoft set out to showcase at its Ignite conference, which kicked off Tuesday, was the extent to which Azure is maturing into an architecture for building scalable, AI-infused apps that also work in hybrid cloud and edge computing scenarios.

Microsoft put out a barrage of AI, data analytics, and DevOps announcements for Azure at Ignite. They include updates to services like Azure Stack HCI, a Windows Server 2019-based cluster offering that uses “validated” hardware to run virtualized workloads locally; Azure Arc, a hybrid cloud platform introduced in 2019; and Azure Kubernetes Service (AKS), Microsoft’s implementation of the open source standard for containerized applications.

Taken together, they provide a way of packaging applications or application infrastructure such as SQL Server instances to make them portable between cloud, datacenter, and edge locations, in addition to being manageable within the same framework.

The conference gave Microsoft an opportunity to highlight how beta customers are putting these capabilities to work. Vinh Tran, head of cloud engineering at RBC, Canada’s largest bank, told the audience he’d used Azure Arc to automate and manage on-premises database deployments in Kubernetes containers. By simplifying deployments, whether on-premises or in the cloud, Azure Arc has helped his data team stretch its skill sets and capabilities, he said. “It’s allowed us to focus more on the integration of these products and capabilities into our systems than on building, securing, and managing them ourselves,” he said. “It’s allowed us to reduce our operational overhead managing on-premise databases at scale.”

Another customer Microsoft cited several times is SKF, a Swedish manufacturer of ball bearings and industrial seals with more than 100 factories in 28 countries. By extending cloud services for factory automation to run within its factories, SKF said it saved 40% in hardware costs and 30% in overtime related to machine downtime.

A factory is a good example of the difference between hybrid cloud computing and edge computing. A hybrid architecture might mean running applications or data services in a manufacturer’s datacenter, whereas edge computing would mean putting the technology in the individual factories. Microsoft says it is designing the Azure architecture so that even sophisticated machine learning “inferencing” can happen within an edge location without the need to ship data to the cloud.
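
As a small, hypothetical illustration of what inferencing at the edge means in practice, the sketch below runs a model locally with ONNX Runtime so the raw data never leaves the site; the model file and input shape are assumptions, not part of Microsoft’s announcement.

    import numpy as np
    import onnxruntime as ort  # pip install onnxruntime

    # Load a previously trained model that was shipped to the edge location.
    session = ort.InferenceSession("model.onnx")  # hypothetical model file
    input_name = session.get_inputs()[0].name

    # The raw sensor or imaging data stays on-site; only the prediction (or
    # nothing at all) needs to travel back to the cloud.
    local_batch = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumed shape
    prediction = session.run(None, {input_name: local_batch})[0]
    print(prediction.shape)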

This is also the reason Microsoft is making Azure Virtual Desktop deployable on the Azure Stack HCI, so it can be deployed in corporate datacenters or offices “at the edge.”

An architecture for scalable AI

“When I talk to customers about their cloud strategy and adoption, I often hear that they want the new innovation and agility that the cloud enables,” Scott Guthrie, executive VP for cloud and AI at Microsoft, said in a keynote presentation. “But they also need to integrate with the existing technology investments within their organizations. They sometimes have dozens, hundreds, or even thousands of servers, applications, and databases that they need to manage across their multiple cloud and on-premises environments,” he said. Moving on-premises resources to the cloud often isn’t practical for regulatory or latency reasons, he said.

Healthcare is a good example, where privacy and security might dictate that patient data not leave the hospital, and where data-intensive applications like medical imaging will perform better with data processed locally. A medical technology customer exemplifying this approach is Siemens Healthineers, which is using Azure Arc to deploy and maintain apps across tens of thousands of edge locations, including clinics and diagnostic equipment, according to Microsoft.

Building on Kubernetes allows Microsoft to claim portability to any container environment that adheres to the Cloud Native Computing Foundation standards, in the cloud or otherwise. It’s also an enabling technology for multicloud deployments and the ability to migrate between clouds or between cloud and on-premises environments.

However, Guthrie acknowledged that learning the intricacies of Kubernetes can also be “a little daunting” for the uninitiated. Microsoft’s workaround is Azure Container Apps, a simplified packaging of the technology. “Container Apps enables you to easily start building container-based microservices with just your app code, while giving you the flexibility to choose to upgrade to our full Azure Kubernetes Service if and when you’re ready to leverage the full power of Kubernetes,” he said.

Microsoft’s approach is not necessarily unique. Amazon Web Services offers its EKS Anywhere and EKS Connector for Kubernetes, for example. But Microsoft’s on-premises clout is allowing it to claim fans that, in addition to those mentioned above, include the likes of Walmart, Starbucks, HSBC, and the UK’s National Health Service.

Transposit unveils CloudOps workflow-building functionality

If someone were to select three terms that represent the frontier of IT in 2021, they might choose: automation, automation, automation.

Others might select user experience (UX), AI, edge computing, CloudOps workflows, or DevOps. They’re all legitimate, but the common denominator is automation because each relies to some extent on automated processes.

One might expect next-gen self-service workflows, a major enterprise automation trend that draws on the functions above, to be built in everywhere at the usual-suspect cloud service providers (AWS, Azure, and Google Cloud Platform), since those providers are themselves progressive IT consumers. But that isn’t the case; a newer DevOps player, Transposit, knows it and sees a niche market.

The San Francisco, California-based startup, which describes itself as a DevOps process orchestration provider, today announced new functionality for its Amazon Web Services (AWS), Google Cloud Platform (GCP), and Azure connectors that it claims allows engineering and cloud operations teams to move faster and more confidently in enabling self-service infrastructure through automated workflows.

Transposit Actions, as these functions are called, don’t have to be used only with a major cloud service provider. They can be added to any step within an enterprise runbook to create automated workflows across diverse stacks, so that whichever tools teams are using, the visibility, context, and actionability to use them optimally are right in front of them, Transposit’s VP of marketing, Ed Sawma, told VentureBeat.

Bringing cloud services into these workflows empowers CloudOps admins to ensure outstanding customer service, UX, and reliability, Sawma said. This is about enabling engineering teams to get immediate access to the services and infrastructure they need to ship software rapidly and safely, he said.

CloudOps workflows get enterprise traction

“First and foremost, the big cloud providers are focused on core infrastructure, storage, compute, and providing those core building blocks,” Sawma said. “They’ve moved up the stack a bit from there, but traditionally they haven’t really offered solutions for how you operate the workflow, how you deploy code and run that code. They know DevOps, they see CI platforms and continuous delivery platforms, but because it’s not the core building blocks of cloud infrastructure, they’re not expert in it.”

Transposit uses what it calls a “human-in-the-loop” methodology in its specialized platform, Sawma said.

“A lot of what IT and ops teams have to do require a human to think about it and to make some kind of decision, and so we built our platform from the ground up for what we believe should be called intelligence augmentation,” Sawma said. “It’s like, ‘How do we take a human operator and give them superpowers to be able to go and take action across all these hundreds of different tools?’— Which is the complexity of a modern development stack.”
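
Transposit itself is a no-code product, but the human-in-the-loop pattern Sawma describes can be sketched in a few lines of ordinary Python: automation gathers context, pauses at a decision point, and only acts once an operator approves. Everything below is a hypothetical placeholder, not Transposit’s API.

    def request_approval(prompt):
        # Hypothetical stand-in for a Slack prompt, ticket, or dashboard button.
        return input(f"{prompt} [y/N]: ").strip().lower() == "y"

    def restart_service(service):
        print(f"restarting {service} ...")  # placeholder for the real action

    def remediation_runbook(service):
        # Automation gathers context, a human makes the judgment call,
        # and automation carries out whatever the operator approves.
        print(f"diagnostics collected for {service}")
        if request_approval(f"Restart {service} now?"):
            restart_service(service)
        else:
            print("operator declined; escalating to on-call instead")

    remediation_runbook("checkout-api")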

Transposit delivers DevOps process orchestration. Its fully integrated, human-in-the-loop approach to automation empowers engineering operations teams to streamline DevOps practices, improve service reliability, and resolve incidents faster. As the glue between tools, data, and people, Transposit claims to codify institutional knowledge to make processes work more efficiently.

Transposit’s no-code builder, coupled with developer customization, enables operations to create self-service workflows that let users do everything from provisioning infrastructure to creating new accounts and permissions, Sawma said.

The new functionality for the AWS, GCP, and Azure connectors allows engineering and cloud ops teams to immediately access data from cloud services (such as recent deployments, service status across regions, and instance lists); to reroute connected cloud services when a server goes down; and to enable service-request automation that can provision a new EC2 instance, create a new AWS account, or grant new permissions.
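
The EC2 case is the easiest to picture. Stripped of Transposit’s no-code layer and approval gates, the underlying self-service step amounts to something like the boto3 call below; the AMI ID, instance type, and region are placeholder values.

    import boto3  # pip install boto3; assumes AWS credentials are configured

    def provision_instance(ami_id="ami-0123456789abcdef0", instance_type="t3.micro"):
        """Provision one EC2 instance -- the kind of action a self-service
        workflow would wrap with requests, approvals, and an audit trail."""
        ec2 = boto3.resource("ec2", region_name="us-east-1")  # placeholder region
        instances = ec2.create_instances(
            ImageId=ami_id,              # placeholder AMI
            InstanceType=instance_type,
            MinCount=1,
            MaxCount=1,
        )
        return instances[0].id

    print(provision_instance())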
