
Hugging Face takes step toward democratizing AI and ML



The latest generation of artificial intelligence (AI) models, also known as transformers, has already changed our daily lives, taking the wheel for us, completing our thoughts when we compose an email and answering our questions in search engines.

However, right now only the largest tech companies have the means and manpower to wield these massive models at consumer scale. Getting a model into production typically takes data scientists one to two weeks of wrangling GPUs, containers, API gateways and the like, or means handing the work off to another team, which adds further delay. These time-consuming tasks are a major reason why 87% of machine learning (ML) projects never make it to production.

To address this challenge, New York-based Hugging Face, which aims to democratize AI and ML through open source and open science, has launched Inference Endpoints. The AI-as-a-service offering is designed to handle the large workloads of enterprises, including those in regulated industries that are heavy users of transformer models, such as financial services (e.g., air-gapped environments), healthcare (e.g., HIPAA compliance) and consumer tech (e.g., GDPR compliance). The company claims that Inference Endpoints will enable more than 100,000 Hugging Face Hub users to go from experimentation to production in just a couple of minutes.

“Hugging Face Inference Endpoints is a few clicks to turn any model into your own API, so users can build AI-powered applications on top of scalable, secure and fully managed infrastructure, instead of weeks of tedious work reinventing the wheel building and maintaining ad-hoc infrastructure (containers, Kubernetes, the works),” said Jeff Boudier, product director at Hugging Face.


Saving time and making room for new possibilities

The new feature can be useful for data scientists, saving time that they can instead spend improving their models and building new AI features. With their custom models integrated into apps, they can see the impact of their work more quickly.

Software developers, meanwhile, will be able to use Inference Endpoints to build AI-powered features without needing machine learning expertise.

“We have over 70k off-the-shelf models available to do anything from article summarization to translation to speech transcription in any language, image generation with diffusers; as the cliché says, the limit is your imagination,” Boudier told VentureBeat.
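To give a concrete sense of what those off-the-shelf models can do, here is a minimal sketch (not from the article) that runs one of the tasks Boudier mentions, summarization, locally with the open-source Transformers pipeline API; the checkpoint name is an assumption, and any summarization model on the hub would work.

```python
# Minimal sketch: try an off-the-shelf summarization model locally with the
# open-source Transformers library. The checkpoint name is illustrative.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="sshleifer/distilbart-cnn-12-6",  # assumed checkpoint; any summarization model works
)

article = (
    "Hugging Face has launched Inference Endpoints, an AI-as-a-service offering "
    "that lets teams deploy transformer models to managed, autoscaling infrastructure "
    "in a few clicks instead of building their own serving stack."
)
summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```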

So, how does it work? Users first select any of the more than 70,000 open-source models on the hub, or a private model hosted on their Hugging Face account. From there, they choose a cloud provider and region, and can also specify security settings, compute type and autoscaling. After that, they can deploy any machine learning model, ranging from transformers to diffusers. Users can even build completely custom AI applications, such as matching lyrics to music or creating original videos from just text. Compute use is billed by the hour and invoiced monthly.
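To make the “turn any model into your own API” step concrete, the sketch below shows a plain HTTPS call to a deployed endpoint; the URL and token are placeholders for the values an account's Inference Endpoints dashboard provides, and the shape of the response depends on the deployed model's task.

```python
# Minimal sketch: call a deployed Inference Endpoint over plain HTTPS.
# The endpoint URL and token are placeholders; copy your own from the dashboard.
import requests

ENDPOINT_URL = "https://your-endpoint-name.endpoints.huggingface.cloud"  # placeholder
HF_TOKEN = "hf_xxx"  # a Hugging Face access token authorized for this endpoint

response = requests.post(
    ENDPOINT_URL,
    headers={
        "Authorization": f"Bearer {HF_TOKEN}",
        "Content-Type": "application/json",
    },
    json={"inputs": "Inference Endpoints turn a hosted model into a private API."},
)
response.raise_for_status()
print(response.json())  # output shape depends on the model's task (e.g. labels and scores)
```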

“We were able to choose an off-the-shelf model that’s common for our customers to get started with, and set it so that it can be configured to handle over 100 requests per second with just a few button clicks,” said Gareth Jones, senior product manager at Pinecone, a company using Hugging Face’s new offering. “With the release of the Hugging Face Inference Endpoints, we believe there’s a new standard for how easy it can be to go build your first vector embedding-based solution, whether it be semantic search or a question answering system.”
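For readers wondering what a vector embedding-based solution looks like in miniature, the hedged sketch below embeds a few documents and ranks them against a query by cosine similarity; the sentence-transformers checkpoint is an assumption, and a production setup such as Pinecone's would keep the vectors in a vector database rather than in memory.

```python
# Minimal semantic-search sketch: embed documents, then rank them against a query
# by cosine similarity. The model choice is illustrative; a real deployment would
# store vectors in a vector database (such as Pinecone) instead of a numpy array.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # assumed checkpoint

docs = [
    "Inference Endpoints deploy models on managed infrastructure.",
    "The cafeteria menu changes every Tuesday.",
    "Transformers can be fine-tuned for question answering.",
]
doc_vecs = model.encode(docs, normalize_embeddings=True)  # unit-length vectors
query_vec = model.encode(["How do I put a model into production?"], normalize_embeddings=True)

scores = (doc_vecs @ query_vec.T).ravel()  # cosine similarity via dot product
best = int(np.argmax(scores))
print(docs[best], float(scores[best]))
```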

Hugging Face started its life as a chatbot and aims to become the GitHub of machine learning. Today, the platform offers 100,000 pre-trained models and 10,000 datasets for natural language processing (NLP), computer vision, speech, time-series, biology, reinforcement learning, chemistry and more.

With the launch of Inference Endpoints, the company hopes to bolster the adoption of the latest AI models in production for companies of all sizes.

“What is really novel and aligned with our mission as a company is that with Inference Endpoints even the smallest startup with no prior machine learning experience can bring the latest advancements in AI into their app or service,” said Boudier. 


Hugging Face triples investment in open source machine learning models



Hugging Face launched in 2016 with a chatbot app designed to be your “AI friend.” Now the NLP company has more than 100,000 community members and is planning to triple its efforts and expand beyond language models into fields like computer vision. Developers have used a hub on Hugging Face to share thousands of models, and CEO and cofounder Clement Delangue told VentureBeat Hugging Face wants to become to machine learning what GitHub is to software engineering.

As part of that effort, Hugging Face closed a $40 million series B funding round today. The round was led by Addition, with participation from Lux Capital, A.Capital, and Betaworks. Notable individual investors in the round include MongoDB CEO Dev Ittycheria, NBA star Kevin Durant, Dataiku CEO Florian Douetteau, and former Salesforce chief scientist Richard Socher.

Delangue said Hugging Face believes transfer learning is critical to the future of machine learning. As evidence of this trend, he pointed to an AI research paper published earlier this week by researchers from Google Brain, Facebook AI Research, and UC Berkeley about pretrained language models working with numerical computation, vision, and protein fold prediction. This and other recent advances, he said, signify that “transfer learning models are starting to eat the whole field of machine learning.”

“Everything transfer learning-based we believe is here to stay and is going to transform machine learning for the next five years,” he told VentureBeat. “We’ve seen that they completely changed the NLP field, and they’re starting to change the computer vision fields, like with vision transformers and the speech-to-text fields. Ultimately, we think transfer learning is going to power machine learning, and hopefully we’re going to be able to power all these transfer learning models.”
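As a small illustration of the transfer learning Delangue describes (a sketch, not code from Hugging Face), the snippet below reuses a pretrained encoder and fine-tunes it, together with a freshly initialized classification head, on a toy downstream task; the checkpoint, labels and hyperparameters are arbitrary.

```python
# Minimal transfer-learning sketch: reuse a pretrained encoder and fine-tune it,
# together with a new classification head, on a downstream task. Checkpoint name,
# labels and hyperparameters are illustrative.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # pretrained body, randomly initialized head
)

texts = ["great product, would buy again", "arrived broken and late"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
loss = model(**batch, labels=labels).loss  # cross-entropy on the new head's logits
loss.backward()
optimizer.step()
print(f"one fine-tuning step done, loss = {loss.item():.3f}")
```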

Hugging Face has also published AI research. A paper about the Transformers NLP library that’s seen more than 10 million Python pip installs and been used by a number of businesses — including Microsoft’s Bing and MongoDB — received the Best Demo paper award at the EMNLP research conference last year.

Delangue said that, in addition to tripling efforts to grow an open source community for the development of language models, the funds will help ensure Hugging Face has the resources to act as a “counter-power” to major cloud AI services being sold to enterprise customers. NLP is an area of interest for a number of companies hoping to sell AI services to enterprise customers, including Databricks, which raised $1 billion last month and plans to focus on acquiring NLP startups.

“I think one of the big challenges that you have in machine learning, it seems these days, is that most of the power is concentrated in the hands of a couple of big organizations,” he said. “We’ve always had acquisition interests from Big Tech and others, but we believe it’s good to have independent companies — that’s what we’re trying to do.”

Democratization, Delangue said, will be key to assuring the benefits of AI extend to smaller organizations. Hugging Face CTO Julien Chaumond echoed that thought. In a statement shared with VentureBeat, he said democratization of AI will be one of the biggest achievements for society and that no single company, not even a Big Tech business, can do it alone. 

Hugging Face began monetizing ways to help businesses create custom models six months ago, and now it works with over 100 companies, including Bloomberg and Qualcomm. A Hugging Face spokesperson told VentureBeat the company has been cash-positive in the first months of 2021.

“You can start seeing that companies are really going to have dozens of what we call machine learning features or NLP features,” he said. “It’s not going to be like one big feature, but they’re going to have a lot of different NLP features that are going to be really deeply embedded into their products or their workflow in multiple different ways.”

In other recent news, Hugging Face extended into machine translation last year and in recent weeks launched subcommunities for people working with low-resource languages to create language models.

Hugging Face raised $15 million in a 2019 series A funding round and has raised a total of $60 million to date. In 2017, Hugging Face was part of the Voicecamp startup accelerator hosted by Betaworks in New York City.

Hugging Face currently has 30 employees, with offices in New York and Paris.
