Alphabet is launching a company that uses AI for drug discovery

A new Alphabet company will use artificial intelligence methods for drug discovery, Google’s parent company announced Thursday. It will build on the work done by DeepMind, another Alphabet subsidiary that has done groundbreaking work using AI to predict the structure of proteins.

The new company, called Isomorphic Laboratories, will leverage that success to build tools that can help identify new pharmaceuticals. DeepMind CEO Demis Hassabis will also serve as the CEO for Isomorphic, but the two companies will stay separate and collaborate occasionally, a spokesperson said.

For years, experts have pointed to AI as a way to make it faster and cheaper to find new medications for various conditions. AI could help scan databases of potential molecules to find the ones that best fit a particular biological target, for example, or fine-tune proposed compounds. Hundreds of millions of dollars have been invested in companies building such tools over the past two years.
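As a toy illustration of that database-scanning idea (not Isomorphic’s actual method), virtual screening often ranks candidate molecules by fingerprint similarity to a known binder, for example with the Tanimoto coefficient. The compound names and fingerprints below are made up:

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between two fingerprint bit sets."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

# Hypothetical fingerprints: each set holds the indices of "on" bits
# in a binary molecular fingerprint.
target = {1, 4, 7, 9, 12}          # fingerprint of a known binder
library = {
    "compound_A": {1, 4, 7, 9, 13},
    "compound_B": {2, 5, 8},
    "compound_C": {1, 4, 9, 15},
}

# Rank the library by similarity to the target, best matches first.
ranked = sorted(library.items(),
                key=lambda kv: tanimoto(target, kv[1]),
                reverse=True)
```

Real screening pipelines use learned models and far richer molecular representations, but the basic shape — score every candidate against a target, keep the best — is the same.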

Isomorphic will try to build models that can predict how drugs will interact with the body, Hassabis told Stat News. It could leverage DeepMind’s work on protein structure to figure out how multiple proteins might interact with each other. The company may not develop its own drugs but instead sell its models. It will focus on developing partnerships with pharmaceutical companies, a spokesperson said in a statement to The Verge.

Developing and testing drugs, though, could be a steeper challenge than figuring out protein structure. For example, even if two proteins have structures that fit together physically, it’s hard to tell how well they’ll actually stick. And a drug candidate that looks promising at the chemical level might still fail when it’s given to an animal or a person. Over 90 percent of drugs that make it to a clinical trial end up not working, as chemist and writer Derek Lowe pointed out in Science this summer, and most of those failures aren’t caused by anything wrong at the molecular level.

The work done at DeepMind and the proposed work at Isomorphic could help bust through some research bottlenecks but aren’t a quick fix for the countless challenges of drug development. “The laborious, resource-draining work of doing the biochemistry and biological evaluation of, for example, drug functions” will remain, as Helen Walden, a professor of structural biology at the University of Glasgow, previously told The Verge.



Alphabet is putting its prototype robots to work cleaning up around Google’s offices

What does Google’s parent company Alphabet want with robots? Well, it would like them to clean up around the office, for a start.

The company announced today that its Everyday Robots Project — a team within its experimental X labs dedicated to creating “a general-purpose learning robot” — has moved some of its prototype machines out of the lab and into Google’s Bay Area campuses to carry out some light custodial tasks.

“We are now operating a fleet of more than 100 robot prototypes that are autonomously performing a range of useful tasks around our offices,” said Everyday Robots’ chief robot officer Hans Peter Brøndmo in a blog post. “The same robot that sorts trash can now be equipped with a squeegee to wipe tables, and the same gripper that grasps cups can learn to open doors.”

The robots in question are essentially arms on wheels: a multipurpose gripper on the end of a flexible arm, attached to a central tower. There’s a “head” on top of the tower with cameras and sensors for machine vision, and what looks like a spinning lidar unit on the side, presumably for navigation.

One of Alphabet’s Everyday Robot machines cleans the crumbs off a cafe table.
Image: Alphabet

As Brøndmo indicates, these bots were first seen sorting out recycling when Alphabet debuted the Everyday Robot team in 2019. The big promise that’s being made by the company (as well as by many other startups and rivals) is that machine learning will finally enable robots to operate in “unstructured” environments like homes and offices.

Right now, we’re very good at building machines that can carry out repetitive jobs in a factory, but we’re stumped when trying to get them to replicate simple tasks like cleaning up a kitchen or folding laundry.

Think about it: you may have seen robots from Boston Dynamics performing backflips and dancing to The Rolling Stones, but have you ever seen one take out the trash? That’s because getting a machine to manipulate never-before-seen objects in a novel setting (something humans do every day) is extremely difficult. This is the problem Alphabet wants to solve.

Unit 033 makes a bid for freedom.
Image: Alphabet

Is it going to? Well, maybe one day — if company execs feel it’s worth burning through millions of dollars in research to achieve this goal. Certainly, though, humans are going to be cheaper and more efficient than robots for these jobs in the foreseeable future. The update today from Everyday Robot is neat, but it’s far from a leap forward. You can see from the GIFs that Alphabet shared of its robots that they’re still slow and awkward, carrying out tasks inexpertly and at a glacial pace.

Still, it’s notable that the robots are being tested “in the wild” rather than in the lab. Compare Alphabet’s machines to Samsung’s Bot Handy, a similar-looking tower-and-arm bot that the company showed off at CES last year, apparently pouring wine and loading a dishwasher. Bot Handy looked like it was performing those jobs, but it was only carrying out a prearranged demo. Who knows how capable, if at all, that robot is in the real world? At least Alphabet is finding out for itself.



Alphabet is repurposing Google TPUs for quantum computing simulations


Sandbox at Alphabet, the Google parent company’s second, secretive quantum computing team, plans to launch a set of APIs called Floq that will allow developers to use tensor processing units (TPUs) to simulate quantum computing workloads. The announcement, which was made during a February livestream that garnered little mainstream attention, hints at the potential for hardware originally designed for AI applications to extend into the quantum realm.

Experts believe that quantum computing, which at a high level entails the use of quantum-mechanical phenomena like superposition and entanglement to perform computation, could one day accelerate AI workloads compared with classical computers. Scientific discoveries arising from the field could transform energy storage, chemical engineering, drug discovery, financial portfolio optimization, machine learning, and more, leading to new business applications. Emergen Research anticipates that the global quantum computing market for the enterprise will reach $3.9 billion by 2027.

According to Sandbox at Alphabet research scientist Guillaume Verdon, Floq, which will initially be made available in alpha to 50 teams in the QHack Open Hackathon, will offer a simulator API that leverages the “bleeding edge” of AI compute for experimentation. The Sandbox at Alphabet team repurposed TPUs, chips developed by Google specifically for AI training and inference, to accelerate simulations in the cloud so that developers can use frontends like TensorFlow Quantum and PennyLane to create quantum models and run them remotely on Floq.


Google’s TPUs are liquid-cooled and designed to slot into server racks; deliver up to 100 petaflops of compute; and power Google products like Google Search, Google Photos, Google Translate, Google Assistant, Gmail, and Google Cloud AI APIs. Google announced the third generation in 2018 at its annual I/O developer conference and this morning took the wraps off the successor, which is in the research stages.


Verdon says that Floq simulators, dubbed Floq Units, run 10 to 100 times faster than state-of-the-art GPU-accelerated simulations. Moreover, by quantum volume — the maximum size of square quantum circuits that a machine can implement successfully — Floq currently exceeds 4 billion and will eventually reach into the trillions. In simplest terms, quantum circuits are a sequence of matrix math operations performed on qubits, the quantum version of bits.
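In those terms, simulating a quantum circuit is just repeated matrix-vector math over a state vector, which is why matrix-multiply hardware like TPUs suits it. A minimal single-qubit sketch in plain Python (an illustration of the idea, not Floq’s actual implementation):

```python
import math

# Single-qubit state |0>, stored as a 2-entry complex vector.
state = [1 + 0j, 0 + 0j]

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]

def apply_gate(gate, vec):
    """One circuit step: multiply the gate matrix into the state vector."""
    return [sum(gate[r][c] * vec[c] for c in range(len(vec)))
            for r in range(len(gate))]

state = apply_gate(H, state)
probs = [abs(amp) ** 2 for amp in state]  # measurement probabilities
```

An n-qubit state vector has 2^n entries, so these multiplies grow exponentially — which is exactly the workload Floq offloads to TPUs.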


“The team has been experimenting with how to use Floq for physics, machine learning — all kinds of cool applications,” Verdon said. “And we’ve developed our own open source library for tensor networks that runs on TPUs … It’s [surprising] how good the chips are for quantum simulation. It’s almost like they were designed for this task.”

When it launches more broadly, Floq will complement Cirq, Google’s open source framework for programming and accessing its quantum computing hardware. And it’ll compete with a number of other quantum simulators already on the market, including a service in IBM’s Quantum Experience suite and simulators from Intel, Amazon, and Microsoft.

But Floq’s reliance on TPUs looks to set it apart from the pack — at least on the performance end of the equation.


