InVia Robotics raises $30M for warehouse robotics push

InVia Robotics, an industrial robotics company based in Los Angeles, California, today announced it has raised $30 million in a series C round co-led by Microsoft’s M12 Ventures and Qualcomm, with participation from Hitachi. InVia says the new equity funding will be used to support its growth, specifically through adopting Qualcomm’s Robotics RB5 Platform and drawing on AI expertise from Hitachi and Microsoft.

Companies are increasingly determined to improve warehouse automation in light of pandemic-related supply chain challenges. A recent Honeywell survey found that 14% of enterprises rank improving automation within their facilities as a top priority, while 37% rank it among their top three near-term goals. A separate report published in Forbes found that 96% of warehouse executives expect the warehouse automation value proposition to increase over the next three years.

To fill this need, InVia Robotics offers a subscription-based service that includes autonomous robots, optimization software, and dedicated monitoring teams. The platform integrates with existing warehouses, determining paths, mapping workflows, and identifying task interdependencies to minimize inefficiencies and “warehouse walking.” InVia says its robots can be scaled up or down or have tasks reassigned in response to changes in order volume and seasonality. The company owns, operates, and maintains its equipment over the lifetime of service contracts.

“You don’t need to buy robots or be a robotics expert to reap the benefits of warehouse automation. We handle everything. Through our robotics-as-a-service (RaaS) model, we own, operate, and continuously optimize our robots to integrate with your fulfillment processes and maximize your efficiency,” InVia explains on its website. “Our engineers work remotely to keep your system maintained and optimized, [and we] make sure all … resources are fully utilized [by directing] the workforce to execute all workflow tasks — that can be [handled by] people, our robots, or both.”

Simulation and deployment

InVia’s robots retrieve and move goods with the help of hot-swappable batteries, a self-charging feature, and built-in lighting. Via a tool called Pickmate, human workers get directions to pick goods from the robots, a pallet, or a rack and place them into order bins.

Customers can view status reports via a dashboard that shows the productivity of various segments of their fulfillment chain.

Despite the appeal of warehouse automation, it remains far from a perfect science. Accidents can happen, like the fire at U.K.-based grocer Ocado’s fulfillment center that was reportedly caused by a robot collision.

As a safety measure, InVia says it employs predictive monitoring for each of its warehouse clients. The company’s engineering staff watches simulations of warehouses, as well as real-time feeds, and attempts to troubleshoot issues before they become a problem.

InVia competes with Locus, Berkshire Grey, 6 River Systems, and others in the growing RaaS market, which accounts for an estimated 30% of the robotics industry’s total worth. But InVia, which recently notched customer wins with ecommerce company Hollar and Cascade Orthopedic, claims its model is unique in that it allows businesses to pay only for productivity rather than leasing at a flat rate. InVia subscribers pay only for what they need based on throughput requirements.

“Warehouse automation is critical for ecommerce companies competing against behemoths like Amazon, but the overhead cost of purchasing a fleet of robots to streamline efficiency can be crippling,” founder and CEO Lior Elazary told VentureBeat in a previous interview. “InVia’s innovative RaaS technology eliminates this challenge for our customers.”

This latest round of funding brings InVia’s total raised to date to $59 million.

Oracle’s Autonomous Data Warehouse expansion offers potential upside for tech professionals

In March, Oracle announced an expansion to its Autonomous Data Warehouse (ADW) that can bring the benefits of ADW, namely automating previously manual tasks, to large groups of new potential users. Oracle calls the expansion “the first self-driving database,” and its goal with the new features is to “completely transform cloud data warehousing from a complex ecosystem … that requires extensive expertise into an intuitive, point-and-click experience” that will enable all types of professionals, from engineers to analysts and data scientists to business users, to access, work with, and build business insights with data, all without the help of IT.

A serious bottleneck to data work delivering business value across industries is the amount of expertise required at many steps along the data pipeline. The democratization of data tooling is about increasing ROI when it comes to an organization’s data capabilities, as well as increasing the total addressable market for Oracle’s ADW. Oracle is also reducing the total cost of ownership with elastic scaling and auto-scaling for changing workloads. We spoke with George Lumpkin, Neil Mendelson, and William Endress from Oracle, who shared their time and perspective for this article.

The landscape: democratization of data tooling

There is a growing movement toward data tooling democratization, and the space is getting increasingly crowded with tools such as AWS SageMaker Studio (which we have reviewed previously), DataRobot, Qlik, Tableau, and Looker. It is telling that Google has acquired Looker and Salesforce has acquired Tableau in recent years. On top of this, the three major cloud providers all offer drag-and-drop data tooling to varying extents: AWS has a growing set of GUI-based data transformation and machine learning tools; Microsoft Azure has a point-and-click visual interface for machine learning, “data preparation, feature engineering, training algorithms, and model evaluation”; and Google Cloud Platform offers similar functionality as part of its Cloud AutoML offering.

In its announcement, Oracle frames the ADW enhancements as self-service tools for:

  • Analysts: loading and transforming data, building business models, and extracting insights from data (note that ADW also provides some interesting third-party integrations, such as automatically building data models that can be consumed by Tableau or Qlik).
  • Data scientists (and “citizen data scientists”): building and deploying machine learning models (in a video, Andrew Mendelsohn, executive VP of Oracle Database Server Technologies, describes how data scientists can “easily create models with AutoML” and “integrate ML models into apps via REST or SQL”).
  • Line-of-business (LoB) developers: low-code app development and API-driven development.

Oracle Autonomous Data Warehouse competes with incumbent products including Amazon Redshift, Azure Synapse, Google BigQuery, and Snowflake. But Oracle does not necessarily see ADW as directly competitive, targeting existing on-premises customers in the short run but with an eye to self-service ones in the longer term. As Lumpkin explained, “Many of Oracle’s Autonomous Data Warehouse customers are existing on-prem users of Oracle who are looking to migrate to the cloud. However, we have also designed Autonomous Data Warehouse for the self-service market, with easy interfaces that allow sales and marketing operations teams to move their team’s workloads to the cloud.”

Oracle’s strategy highlights a tension in tech: Traditional CIOs with legions of database administrators (DBAs) are worried about the migration to the cloud. DBAs who have built entire careers around being an expert at patching and tuning databases may find themselves lacking work in a self-service world where cloud providers like Oracle are patching and tuning enterprise databases.

CIOs who measure their success by headcount and on-premises spend might also be worried. As Mendelson put it: “70% of what the DBA used to do should be automated.” Given that Oracle’s legacy business still caters to DBAs and CIOs, how does the company feel about potentially upsetting its traditional advocates? While the Oracle representatives acknowledged that automation would reduce some of the tasks traditionally performed by DBAs, they were not worried about complete job redundancy. Lumpkin explained, “By lowering the total cost of ownership for analytics, the business will be demanding 5x the number of databases.” In other words, DBAs and CIOs will see the same transformation that accountants saw with the advent of the spreadsheet, and there should be plenty of higher-level strategic work for DBAs in the new era of Oracle cloud.

Of course, this isn’t to say there won’t be any changes. After all, change is inevitable as certain functions are automated away. DBAs need to refocus on their unique value add. “Some DBAs may have built their skill sets around patching Oracle databases,” explains Lumpkin. “That’s now automated because it was the same for every customer, and we could do it more consistently and reliably in the cloud. It was never adding value to the customer. What you want is your people doing work that is unique to your datasets and your organization.”

We did a deep dive into the different parts of the ADW tooling. Here’s what we found.

Autonomous Data Warehouse setup

The automated provisioning and database setup tools were well done. The in-app screens and tutorials were mostly consistent with one another, and we could get set up in about five minutes. That said, there were still some rather annoying steps. For example, the user needs to create both a “database user” and an “analytics user.” This makes a lot of sense for centrally administered databases serving an entire enterprise, but it is overkill for a single analyst trying to get started (much less for a tutorial aimed at analysts). The vast majority of data scientists and data analysts do not want to be database administrators, and the tool could benefit from a mode that hides this detail from the end user. This is a shortcoming Oracle understands. As Lumpkin explains, “We have been looking at how to simplify the create-user flow for new databases. There are competing best practices for security [separation of duties between multiple users] and fastest onboarding experiences [with only one user].” Overall, the documentation is very well done, and onboarding is straightforward, though it could be a bit smoother.

Data insights

The automated insights tool is also interesting and could prove powerful. It runs many queries against your dataset, generating predicted values for a target column, and then highlights the cases where the predicted values deviate significantly from the actual values. The algorithm appears to run multiple group-bys and flag groups with highly unexpected values. While this carries some risk of data dredging if used naively, it does offer quick wins: a large fraction of data analysis consists of understanding unexpected results, and this feature can help with that.
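To make that idea concrete, here is a minimal pandas sketch of the kind of group-by scan we believe is happening under the hood: compute each group’s aggregate, compare it to the overall distribution, and surface the groups that deviate most. This is our own illustration, not Oracle’s implementation, and the column names in the usage comment are hypothetical.

```python
import numpy as np
import pandas as pd

def flag_unexpected_groups(df: pd.DataFrame, group_cols, target: str, z_thresh: float = 2.0):
    """Flag groups whose mean of `target` deviates strongly from the overall mean.

    A rough stand-in for the group-by insight scan described above; not Oracle's
    actual algorithm.
    """
    overall_mean = df[target].mean()
    overall_std = df[target].std()
    stats = df.groupby(group_cols)[target].agg(["mean", "count"]).reset_index()
    # z-score of each group mean against the overall distribution of `target`
    stats["z_score"] = (stats["mean"] - overall_mean) / (overall_std / np.sqrt(stats["count"]))
    flagged = stats[stats["z_score"].abs() > z_thresh]
    return flagged.sort_values("z_score", key=lambda s: s.abs(), ascending=False)

# Hypothetical usage with made-up columns:
# insights = flag_unexpected_groups(sales_df, ["region", "product"], "sales")
```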

Business model

One of the pervasive challenges with data modeling is defining business logic on top of raw enterprise data. Typically, this logic resides in the heads of individual business analysts, leading to inconsistent application of business logic across reports by different analysts. Oracle’s Data Tools provide a “Business Model” feature that centralizes business logic in the database, increasing consistency and improving performance via caching. The tool offers some excellent features, such as automatically detecting schemas and finding the keys for table joins. However, some of these features may not be very robust: while the tool could identify many valuable potential table joins in the tutorial movie dataset, it found only a small subset of the relationships in the publicly available MovieLens dataset. Nonetheless, this is a valuable tool for solving a critical enterprise problem.
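For intuition, here is one common heuristic for the “finding keys for table joins” step: treat a column pair as a join candidate when one column is nearly unique and contains almost all of the other column’s values. This is only our sketch of the general idea, with hypothetical table and column names; Oracle has not published the details of its detection logic.

```python
import pandas as pd

def join_key_candidates(left: pd.DataFrame, right: pd.DataFrame, min_coverage: float = 0.95):
    """Suggest (left_col, right_col, coverage) triples that look like key/foreign-key joins.

    Heuristic illustration only, not Oracle's schema-detection algorithm.
    """
    candidates = []
    for lcol in left.columns:
        lvals = left[lcol].dropna()
        if lvals.empty or lvals.nunique() / len(lvals) < 0.99:
            continue  # left column is not unique enough to act as a key
        key_set = set(lvals)
        for rcol in right.columns:
            rvals = right[rcol].dropna()
            if rvals.empty:
                continue
            coverage = rvals.isin(key_set).mean()  # fraction of right values found in the key column
            if coverage >= min_coverage:
                candidates.append((lcol, rcol, round(float(coverage), 3)))
    return sorted(candidates, key=lambda c: -c[2])

# Hypothetical usage on a MovieLens-style schema:
# join_key_candidates(movies_df, ratings_df)  # might suggest ("movieId", "movieId", 1.0)
```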

Data transform

The data transform tool provides a GUI for specifying functions to clean data. Cleaning data is the No. 1 job of a data scientist or data analyst, making this a critical feature. Unfortunately, the tool makes some questionable design choices, and they stem from the use of a GUI: rather than letting you specify the transformation as a CREATE TABLE query in SQL, it asks you to assemble the logic graphically, awkwardly connecting functions with lines and clicking through menus to select options. While the end result is a CREATE TABLE query, this abandons the syntax that data scientists and analysts are familiar with, makes the code less reproducible and less portable, and ultimately makes analysts and their queries more dependent on Oracle’s GUI. Data professionals may wish to avoid this feature if they are eager to develop transferable skills and sidestep tool lock-in.
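For comparison, the text-based workflow we would prefer boils down to an ordinary CREATE TABLE … AS SELECT statement that can live in version control. The sketch below submits such a query from Python with the python-oracledb driver; the connection details, table names, and columns are placeholders we made up, not output from Oracle’s Data Transforms tool, and it assumes an already-configured ADW connection.

```python
import oracledb  # pip install oracledb; assumes ADW connection details/wallet are already set up

# A hypothetical cleaning transform expressed as plain, portable SQL
CLEAN_ORDERS_SQL = """
CREATE TABLE clean_orders AS
SELECT order_id,
       TRIM(UPPER(customer_name)) AS customer_name,
       NVL(quantity, 0)           AS quantity,
       CAST(order_ts AS DATE)     AS order_date
FROM   raw_orders
WHERE  order_id IS NOT NULL
"""

def run_transform(user: str, password: str, dsn: str) -> None:
    """Run the transform; the same SQL could be executed from any SQL client or a CI job."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(CLEAN_ORDERS_SQL)
```

Because the transformation is just SQL plus a thin driver call, it stays diffable, reproducible, and portable to other warehouses with minor dialect changes.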

To be clear, there are useful drag-and-drop features in a SQL integrated development environment (IDE). For example, Count.co, which offers a BI notebook for analysts, supports dragging and dropping table and field names into SQL queries. This nicely connects the data catalog to the query in the SQL IDE and helps prevent misspelled table or field names without abandoning the familiar text-based query scripts. Overall, it felt much more natural as an interface.

Oracle Machine Learning

Oracle’s Machine Learning offering is growing and now includes ML notebooks, AutoML capabilities, and model deployment tools. One of the big challenges for Oracle and its competitors will be to demonstrate utility to data scientists and, more generally, people working in both ML and AI. While these new capabilities have come a long way, there’s still room for improvement. Making data scientists use Apache Zeppelin-based notebooks will likely hamper adoption when so many of us are Jupyter natives; so will preventing users from custom-installing Python packages, such as PyTorch and TensorFlow.

The problem Oracle is attempting to solve here is one of the biggest in the space: How do you get data scientists and machine learning practitioners to use enterprise data that sits in databases such as Oracle’s? The ability to use familiar objects such as pandas data frames and APIs such as matplotlib and scikit-learn is a good step in the right direction, as is the decision to host notebooks. However, we need to see more: Data scientists often prototype code on their laptops in Jupyter notebooks, VSCode, or PyCharm (among many other choices) with cutting-edge OSS package releases. When they move their code to production, they need enterprise tools that mimic their local workflows and allow them to use the full suite of OSS packages.
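For context, this is the kind of local prototyping loop we mean: a generic pandas plus scikit-learn sketch with made-up data, rather than an OML4Py example. It is exactly this workflow that data scientists expect enterprise tooling to mirror.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical feature table; in practice this would come from a warehouse query
df = pd.DataFrame({
    "tenure_months": [1, 24, 36, 3, 48, 12, 60, 2],
    "monthly_spend": [20.0, 55.5, 80.0, 25.0, 90.0, 40.0, 120.0, 22.5],
    "churned":       [1, 0, 0, 1, 0, 0, 0, 1],
})

X_train, X_test, y_train, y_test = train_test_split(
    df[["tenure_months", "monthly_spend"]], df["churned"], test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```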

A representative of Oracle said that the ability to custom install packages on Autonomous Database is a road map item to address in future releases. In the meantime, the inclusion of scikit-learn in OML4Py allows users to work with familiar Python ML algorithms directly in notebooks or through embedded Python execution, where user-defined Python functions run in database-spawned and controlled Python engines. This supplements the scalable, parallelized, and distributed in-database algorithms and provides the ability to manipulate data in database tables and views using Python syntax. Overall, this is a step in the right direction.

Oracle Machine Learning’s documentation and example notebook library are extensive and valuable, allowing us to get up and running in a notebook in a matter of minutes with intuitive SQL and Python examples of anomaly detection, classification, clustering, and many other techniques. This is welcome in a tooling landscape that all too often falls short on useful developer relations material. Learning new tooling is a serious bottleneck, and Oracle has removed a lot of friction here with its extensive documentation.

Oracle has also recognized that the MLOps space is heating up and that the ability to deploy and productionize machine learning models is now table stakes. To this end, OML4Py provides embedded Python execution along with a REST API that allows users to store ML models and create scoring endpoints for them. It is welcome that this functionality supports not only classification and regression OML models but also Open Neural Network Exchange (ONNX) format models, including models exported from frameworks such as TensorFlow. Once again, the documentation here is extensive and very useful.
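As an illustration of the ONNX path, the snippet below trains a scikit-learn model and converts it with the community skl2onnx package; the resulting file is the kind of artifact a model store and scoring endpoint would accept. We do not reproduce the Oracle-specific REST calls for uploading the model or creating an endpoint, so treat this as a generic sketch rather than OML-specific code.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Train a small classifier, then export it to the ONNX format mentioned above
X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

onnx_model = convert_sklearn(
    clf, initial_types=[("features", FloatTensorType([None, X.shape[1]]))]
)
with open("iris_logreg.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())
# The .onnx file is what would be registered with a model store / scoring service.
```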

Graph Analytics

Oracle’s Graph Analytics offers the ability to run graph queries directly against data already in the warehouse. This sets it apart from Neptune, AWS’ graph solution, which requires loading data out of the data warehouse (Redshift) first. Graph Analytics uses PGQL, an Oracle-supported language that queries graph data in much the same way that SQL queries structured tabular data. The language’s design stays close to SQL, and it is released under the open source Apache 2.0 license. However, the main contributor is an Oracle employee, and Oracle is the only vendor supporting PGQL. The preferred mode of interacting with PGQL is through the company’s proprietary Graph Studio tool, which does not promote reproducibility, advanced workflows, or interfacing with the rest of the development ecosystem. Lumpkin promised that REST APIs with Python and Java support would be coming soon.

Perhaps unsurprisingly, Oracle’s graph query language appears to be less popular than Cypher, the query language supported by Neo4j, a rival graph database (the PGQL language has 114 stars on GitHub, while Neo4j has more than 8,000). A proposal to bring together PGQL, Cypher, and G-Core has over 95% support across nearly 4,000 votes, has its own landing page, and is gaining traction internationally. While the survey methodology may be questionable (the proposal is authored by the Neo4j team on a Neo4j website), it is understandable why graph database users would prefer a more widely used open standard. Hopefully, a common graph query standard will emerge to consolidate the competing languages and simplify graph querying for data scientists.

Final thoughts

Oracle is a large incumbent in an increasingly crowded space that is moving rapidly. The company is playing catch-up with recent developments in open source tooling and with the long tail of emerging data tooling businesses, even as the total addressable market of the space keeps growing. And we’re not talking only about seasoned data scientists and machine learning engineers, but also about the increasing number of data analysts and citizen data scientists.

For Oracle, best known for its database software, these recent moves are intended to extend its offerings into the data analytics, data science, machine learning, and AI spaces. In many ways, this is the data tooling equivalent of Disney’s move into streaming with Disney+. For the most part, Oracle’s recent expansion of its Autonomous Data Warehouse delivers on its promise: to bring the benefits of ADW to large groups of new potential users. There are some lingering questions around whether these tools will meet all the needs of working data professionals, such as being able to work with their open source packages of choice. We urge Oracle to prioritize such developments on its road map, as access to open source tooling is now table stakes for working data scientists.

New Amazon robots could enable ‘safer’ exploitation of warehouse staff

Weeks after a study revealed that Amazon warehouse workers are injured at higher rates than staff at rival firms, the company has announced it’s testing new robots designed to improve employee safety.

The e-commerce giant has ingratiatingly named two of the bots after Sesame Street’s Bert and Ernie.

Bert is an Autonomous Mobile Robot (AMR) that’s built to navigate through Amazon facilities. In the future, the company envisions the bot carrying large and heavy items or carts across a site, reducing the strain on its human coworkers.

Ernie, meanwhile, is a workstation system that removes totes from robotic shelves and then delivers them to employees.

“The innovation with a robot like Ernie is interesting because while it doesn’t make the process go any faster, we’re optimistic, based on our testing, it can make our facilities safer for employees,” said Kevin Keck, worldwide director of Advanced Technology at Amazon.

The duo may one day be joined at work by another pair of robot colleagues: Scooter and Kermit, which transport carts across facilities.

Amazon said it plans to deploy Scooter in at least one Amazon facility this year, and introduce Kermit in a minimum of 12 North American sites.

The robots were unveiled amid growing concerns about worker safety at Amazon. Earlier this month, a union-backed report on safety data found serious injury rates at the company were almost 80% higher than the rest of the industry.

Amazon has previously been accused of deceiving the public about the rising injury rates in its warehouses. But in recent months, the company has begun to publicly acknowledge the problem.

In April, Jeff Bezos revealed another system designed to improve worker safety: an algorithm that rotates staff around tasks that use different body parts.

These initiatives are unlikely to discourage accusations that Amazon treats workers like robots. But hopefully, the systems can provide some support for their overworked human colleagues — and don’t end up replacing them.

Locus Robotics raises $150 million to scale its warehouse robotics platform

Locus Robotics, a Wilmington, Massachusetts-based warehouse robotics startup, today announced it has raised $150 million in series E funding at a $1 billion post-money valuation. The company says the funding will allow it to accelerate product innovation and global expansion. Locus expects that in the next four years, over a million warehouse robots will be installed and that the number of warehouses using them will grow tenfold.

Worker shortages attributable to the pandemic have accelerated the adoption of automation. According to ABI Research, more than 4 million commercial robots will be installed in over 50,000 warehouses around the world by 2025, up from under 4,000 warehouses as of 2018. In China, Oxford Economics anticipates 12.5 million manufacturing jobs will become automated, while in the U.S., McKinsey projects machines will take upwards of 30% of such jobs.

Locus’ autonomous robots — called LocusBots — can be reconfigured with totes, boxes, bins, containers, or peripherals like barcode scanners, label printers, and sensors. They work collaboratively with humans, minimizing walking with an app that recognizes workers’ Bluetooth badges and switches to their preferred language. On the backend, Locus’ LocusServer directs robots so they learn efficient travel routes, sharing the information with other robots and clustering orders to where workers are. As orders come into warehouse management systems, Locus organizes them before transmitting back confirmations, providing managers real-time performance data, including productivity, robot status, and more.

When new LocusBots are added to the fleet, they share warehouse inventory status and item locations. Through LocusServer, they detect blockages and other traffic issues to improve item pick rates and order throughput. Locus’ directed picking technology points workers to their next picks, optionally providing challenges through a gamification feature that supports individual, team, and shift goals, plus events and a mechanism managers can use to provide feedback. In addition, Locus’ backend collates various long-tail metrics, including hourly pick data, daily and monthly pick volume, current robot locations, and robot charging levels.

Locus offers a “robot-as-a-service” program through which customers can scale up by adding robots on a limited-time basis. For a monthly subscription fee, the company ships robots to warehouses or retrieves them upon request, and it provides software and hardware updates for those robots, in addition to maintenance.

Locus claims that its system, which takes about four weeks to deploy, has delivered a 2 to 3 times increase in productivity and throughput and 15% less overtime spend for brands like Boots UK, Verst Logistics, Ceva, DHL, Material Bank, Radial, Port Logistics Group, Marleylilly, and Geodis. The company’s robots passed 100 million units picked in February 2020, and last April UPS announced that it would be piloting Locus machines in its own facilities.

Locus recently opened a new headquarters in the EU and surpassed 50 customer deployments, with companies including DHL, Boots UK, and Geodis. As of late 2020, the company’s robots had picked and sorted over 300 million units, equating to ten thousand units every 15 minutes or roughly a million units a day.

Bond and Tiger Global Management led the round announced today. It follows a $40 million series D investment last June and lifts the Quiet Logistics spinout’s total raised from just over $105 million to more than $250 million.

Locus competes in the $3.1 billion intelligent machine market with Los Angeles-based robotics startup InVia, which leases automated robotics technologies to fulfillment centers. Gideon Brothers, a Croatia-based industrial startup backed by TransferWise cofounder Taavet Hinrikus, is another contender. And then there’s robotics systems company GreyOrange; Otto Motors; and Berkshire Grey, which combines AI and robotics to automate multichannel fulfillment for retailers, ecommerce, and logistics enterprises. Fulfillment alone is a $9 billion industry — roughly 60,000 employees handle orders in the U.S., and companies like Apple manufacturing partner Foxconn have deployed tens of thousands of assistive robots in assembly plants overseas.
