Categories
AI

Report: Data and enterprise automation will drive tech and media spending to $2.5T

Join gaming leaders, alongside GamesBeat and Facebook Gaming, for their 2nd Annual GamesBeat & Facebook Gaming Summit | GamesBeat: Into the Metaverse 2 this upcoming January 25-27, 2022. Learn more about the event. 


According to a new report from Activate Consulting, global technology and media spending will balloon to $2.5 trillion by 2025. The analysis comes as 2021 netted a spend of more than $2 trillion.

The report indicates that one of the major drivers of this tech boom will be data solutions and enterprise automation. According to the report, “Activate Technology and Media Outlook for 2022,” a set of new companies are paving the way for the future, delivering infrastructure, tools, and applications that will enable all enterprises to operate and innovate as if they were major technology companies.

Businesses and consumers can expect accelerated development of customer experiences, better (faster, less bureaucratic) employee experiences, improved intelligence and decision-making, and improved operational and financial efficiency as a result. Technologies like autonomy (self-driving cars, home automation), voice recognition, AR/VR, and gaming will enable new end-user experiences, while enterprises will become more productive in marketing effectiveness, IT service management, cross-planning and forecasting, and more.

New data startups are spurring the next era of innovation. They’re focusing on leveraging data and information, improving end-user experience, and improving storage and connectivity — all of which will drive the business-to-business and business-to-consumer experiences of the future.

According to the report, more than 80% of the companies driving this innovation are U.S.-based, half of which are headquartered in the Bay Area. They’re growing fast thanks to large venture capital infusions, and many of these startups have scaled at an unprecedented pace: fifteen of them have raised more than $1 billion since launch.

For the next generation of companies to reach their full potential, the report indicates they must zero in on three areas: strategy and transformation, go-to-market pricing, and their sales and marketing approach.

Read the full report by Activate Consulting.

VentureBeat

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member

Repost: Original Source and Author Link

SnapLogic seeks to accelerate digital transformation with enterprise automation

Hear from CIOs, CTOs, and other C-level and senior execs on data and AI strategies at the Future of Work Summit this January 12, 2022. Learn more


Following a pandemic that halted the world in 2020 and forced organizations to develop new ways to do things, more companies are now leveraging cloud-based technologies for their business operations. With Gartner forecasting the global hyperautomation-enabling software market to reach nearly $600 billion by 2022, application management and data integration are playing a key part in the increased automation that enterprise technologies are now being built for.

Hyperautomation is no longer an option but a condition of survival for organizations, according to Fabrizio Biscotti, research vice president at Gartner. “Organizations will require more IT and business process automation as they are forced to accelerate digital transformation plans in a post-COVID-19, digital-first world,” said Biscotti in a report.

SnapLogic, a San Mateo, California-based company, offers an application management and data integration platform for on-premises or cloud-based data and process flow acceleration. SnapLogic’s chairman and CEO, Gaurav Dhillon, told VentureBeat that SnapLogic’s technology uses AI-powered algorithms to provide enterprises moving huge volumes of data with high-level automation — enabling them to be seven times more productive than when they use traditional batch-based reporting.

Dhillon says enterprise automation is the future, adding that enterprises are going to be hybrid and multicloud for the foreseeable future, as they operate with a combination of the technologies they already have while exploring newer, more effective technologies.

Latest trends in digital integration

Commenting on the latest trends in digital integration, Dhillon noted there’s no question that the future is going to be highly automated. The big difference between the old world and now is that today, Dhillon said, it’s not just about business intelligence and reporting — AI also needs data.

Above: Gartner’s top strategic technology trends for 2022: data fabric, cybersecurity mesh, privacy-enhancing computation, cloud-native platforms, composable applications, decision intelligence, hyperautomation, AI engineering, distributed enterprise, total experience, autonomic systems, and generative AI.

Gartner’s list of top strategic technology trends for 2022 highlights data fabric (data integration across platforms and users), cloud-native technologies, AI, decision intelligence, and hyperautomation among the “12 trends that will accelerate digital capabilities and drive growth for technology executives in 2022.”

“In the old days, the consumers of data — who are usually humans — needed the data to make decisions. Today, the consumer of data is likely to be an algorithm, which needs data to do a better job of running the business, as well as the analytics to understand process flows and make smart decisions like taking customer orders, fulfilling each order level, tackling shipping logistics, and more,” said Dhillon.

With so many SaaS applications in the enterprise today, Dhillon noted that the future will have a high degree of automation. “We can have autonomous cars. Why can’t we have autonomous integration? We call that enterprise automation — one platform that connects your apps, data, APIs, and more,” said Dhillon.

Dhillon claims many enterprises like Schneider Electric, AstraZeneca, Adobe, Box, Yelp, Kaplan, and others are procuring SnapLogic’s product because they see it as bringing the best of both worlds — connecting legacy technologies and the cloud in a way that nobody else can.

Transitioning from relational data to unstructured data

Today’s data integration requirements center more on real-time business processes and the automation of mundane tasks, Dhillon explained. SnapLogic’s continued momentum in the data integration landscape saw the company named the only visionary in Gartner’s Magic Quadrant for Data Integration Tools, alongside companies like Informatica — where Dhillon was once cofounder and CEO — as well as Microsoft, Oracle, Talend, and IBM.

Above: Scatterplot of data integration vendors measured by completeness of vision (x-axis) and ability to execute (y-axis). All findings can be found in the original copy of the article.

While the company now moves about 2.7 trillion JSON documents through its system monthly, up from 10 million documents monthly four to five years ago, Dhillon said the real headline is the change in the data itself. The transition from relational data, with rows and columns like an Excel spreadsheet, to unstructured data like the web is the real story in the future of data integration, he said.

“There’s so much unstructured data in the world: web browsing, data from machines, data from routers and firewalls, and more. This type of data will become more important this year to next year, and even at the end of the decade. Therefore, companies that are built from the ground up to handle unstructured data are more likely to provide value to their customers.”
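The shift Dhillon describes, from fixed rows and columns to nested web-style documents, is easy to see in miniature. Below is a small illustrative Python sketch (the record and field names are invented, and this is not SnapLogic’s API) that flattens a nested JSON-like document into a flat, relational-style row:

```python
def flatten(doc, prefix=""):
    """Recursively flatten a nested JSON-like dict into one flat row
    whose keys mimic relational column names (e.g. 'customer.name')."""
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=f"{name}."))
        else:
            row[name] = value
    return row

# A nested, web-style document, unlike a spreadsheet's rows and columns.
order = {
    "order_id": 1001,
    "customer": {"name": "Acme", "region": "EMEA"},
    "shipping": {"carrier": "DHL", "status": "in_transit"},
}

print(flatten(order))
# {'order_id': 1001, 'customer.name': 'Acme', 'customer.region': 'EMEA',
#  'shipping.carrier': 'DHL', 'shipping.status': 'in_transit'}
```

The point of the sketch is that moving unstructured data into relational systems requires this kind of shape-changing work on every document, which is why platforms built for unstructured data from the ground up have an advantage.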

Key competitors and differentiation in the industry

Most enterprises, particularly in service industries like banking and insurance, are likely to have one-third of their employees moving data for the other two-thirds. This is a big problem that Dhillon claims is growing because of the number of SaaS applications and the amount of unstructured data in the enterprise today.

SnapLogic’s first competitors are people writing code by hand and teams assembling open source tools from GitHub to accomplish the same tasks. It also faces strong competition from legacy products by companies like Informatica, Microsoft, and IBM, as well as newer products focused on cloud-based applications.

According to Dhillon, SnapLogic was built for hybrid, multicloud environments. He claims SnapLogic is differentiated in the industry by its ability to provide multicloud productivity in a way that neither the legacy companies nor the newer entrants can.

Dhillon claims SnapLogic’s technology enables technical decision-makers to move beyond the frustrating process of data siloing. “Pillar applications like CRM, marketing apps, data warehouses all go to the cloud, and during this process, the legacy interconnectivity existing between these apps breaks. The most important thing for technical decision-makers today is that SnapLogic provides the connective tissues that connect data, application, supplies, and API management capabilities in one suite of products,” he concluded.


What it takes to become a smart enterprise

We have smartphones, smart cars, even smart cities. In fact, there aren’t many things left that aren’t becoming smarter by the day.

But what about smart enterprises? With digital transformation well underway and artificial intelligence quickly making its way into the IT stack – in part to support smartphones, smart cars, and smart cities – how can we expect the enterprise itself to become smarter? And will we even be able to pinpoint the moment in which it becomes smart?

Smart actions, not words

Clearly, smartness is more than a mission statement or press release touting things like data science and intelligent analytics. As tech consultancy Plekton Labs noted recently, a smart enterprise is defined more by the way it uses these and other technologies, both strategically and operationally. To be considered smart, the enterprise will have to display a range of capabilities that it doesn’t have now, or at least cannot leverage to make an appreciable impact on the business model. These include the following:

  • Continuous availability
  • Employee empowerment at all levels
  • Collaboration inside and outside the organization
  • Deployment of user-centric tools and services
  • Improved innovation through next-gen networks, operations, and processes
  • New levels of productivity and creativity
  • Support for rapid digital transformation in both processes and practices.

While this transformation doesn’t depend solely on AI, it’s fair to say that it will play a leading role. As more tasks become automated, the enterprise becomes more responsive to the demands of a digital economy, in part by focusing its human capital on key tasks that cannot be automated so easily.

But as Kumar Singh, research director at SAPInsider, notes, it’s not like a day will come when an enterprise becomes smart at the flip of a switch. Instead, we’ll see gradual steps in maturity as organizations embrace these new capabilities.

A freshman enterprise, for example, is still charting out the potential for intelligent operations and data-driven decision-making to alter the business model. Meanwhile, sophomores are starting to implement cultural changes to create new processes and streamline operations, while juniors are taking this to the next level by focusing on the creation of new revenue streams and business models. Finally, senior organizations have converted their operations to rapid, iterative experimentation with an eye toward building customized AI toolchains and developing in-house talent around the new data ecosystem.

The smart (data-driven) plan

None of these milestones will be achieved without a plan, however. Salesforce recently posted four key pillars that organizations should strive for to become a data-driven organization. The first key step is to develop adequate data management, focusing not just on markets or customers but employees, operations, and virtually everything else. Secondly, organizations should choose their data analytics technology carefully. While it may be tempting to deploy out-of-the-box solutions, a more effective strategy is to focus on lower-level tools and programming languages to preserve high levels of flexibility.

From there, you’ll need to upskill your workforce in the use of AI and then embed these new talents directly into business units rather than spin them off into their own department. This provides the fastest, most accurate turnaround for data-driven decisions. Finally, the smart enterprise requires a cultural shift that embraces change. This can only come about through proper leadership and a clear strategic roadmap that incorporates all aspects of the company.

To become truly smart, however, tools like AI must integrate seamlessly into the operational model, says BMC’s Ali Siddiqui. This is why AIOps has become a key strategic imperative for enterprises making this transition. As with DevOps, AIOps strives to create a more proactive, predictive IT environment in which machines can resolve their own issues using big data and advanced analytics. By putting IT at the forefront of digital transformation, it can then accelerate the deployment of smart capabilities across the rest of the enterprise.

As with people, however, smart is a relative term. More than likely, organizations will become very smart at some things and not so smart at others. And no matter how smart you become, there are always ways to become smarter.

Ultimately, the smartest enterprises will be those that recognize how much they have to learn.


Logik.io, which simplifies complex enterprise sales processes, gets over $10M

Logik.io, a Chicago-based next-gen software company that simplifies complex sales processes with headless configuration, today announced it has raised over $10 million in seed funding.

Logik.io claims its modern configuration solution extends CPQ (configure, price, quote) tools, which help companies generate quotes for orders. Logik.io says it provides logic, math calculations, and more, and works on top of Salesforce’s CPQ tools and other ecommerce applications.

The company says its headless operation differentiates it by allowing it to power any front-end user interface via APIs. This matters because many businesses today have disparate, siloed configuration data models — for example, one to power CPQ, one to power ecommerce, and one to power order management. While several enterprises streamline their quote-to-cash processes with Salesforce CPQ, opening up ecommerce selling channels and sharing accurate information with enterprise resource planning (ERP) and supply chain applications remains challenging. In a press release, Logik.io says it makes this possible by overlaying its headless configurator on top of Salesforce CPQ, available on the Salesforce AppExchange.
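The “headless” idea, one shared configuration model exposed through a single entry point so that CPQ, ecommerce, and order management front ends can all reuse it, can be sketched in a few lines. This is a hypothetical illustration, not Logik.io’s actual product or API; the product, options, and rules are invented:

```python
class HeadlessConfigurator:
    """One shared configuration model that any front end can call through a
    single configure() entry point, instead of duplicating rules per channel."""

    def __init__(self, base_price, option_prices, incompatible):
        self.base_price = base_price
        self.option_prices = option_prices  # option name -> price
        self.incompatible = incompatible    # set of frozenset option pairs

    def configure(self, options):
        # Validate the selection against shared compatibility rules.
        for pair in self.incompatible:
            if pair <= set(options):
                raise ValueError(f"Incompatible options: {sorted(pair)}")
        price = self.base_price + sum(self.option_prices[o] for o in options)
        return {"options": sorted(options), "price": price}

# One model shared by a CPQ quote screen and an ecommerce storefront.
engine = HeadlessConfigurator(
    base_price=1000,
    option_prices={"gpu": 400, "extra_ram": 150, "low_power_psu": 50},
    incompatible={frozenset({"gpu", "low_power_psu"})},
)

print(engine.configure({"gpu", "extra_ram"}))
# {'options': ['extra_ram', 'gpu'], 'price': 1550}
```

Because every channel calls the same `configure()`, a rule fixed once is fixed everywhere, which is the core argument for a headless design over per-channel configuration models.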

Logik.io CEO Christopher Shutts gave VentureBeat broader context on the company’s technology and funding plans via email.

Faster performance than traditional CPQ

According to Shutts, Logik.io’s proprietary solving engine allows for faster performance than traditional CPQ tools. He said the company’s technology enables more complex configurations than traditional engines can handle — delivering a better experience for end users, as well as ensuring businesses can maintain all their configuration logic in one system.

“Unlike traditional configuration models, our proprietary solving engine (rules engine powering the CPQ logic) was purpose-built to automatically and more efficiently manage configuration rules in the optimal order. In traditional CPQ rule engines, performance and speed of loading and processing within CPQ suffer severely because they are built with more legacy linear rules-based processing. This can make extremely complex configuration nearly impossible without Logik.io,” he added.
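Running rules “in the optimal order” rather than in repeated linear passes is, at heart, a dependency-ordering problem. The sketch below illustrates the idea with Python’s standard-library topological sorter; the rule names and dependencies are invented for illustration and say nothing about Logik.io’s actual engine:

```python
from graphlib import TopologicalSorter

# Each configuration rule lists the rules whose outputs it depends on.
# A linear engine would re-scan the whole list until nothing changes;
# a dependency-aware engine fires each rule exactly once, in order.
rules = {
    "select_chassis": set(),
    "select_power_supply": {"select_chassis"},
    "select_cooling": {"select_chassis", "select_power_supply"},
    "compute_price": {"select_chassis", "select_power_supply", "select_cooling"},
}

order = list(TopologicalSorter(rules).static_order())
print(order)
# ['select_chassis', 'select_power_supply', 'select_cooling', 'compute_price']
```

With n rules, single-pass evaluation in dependency order does O(n) rule firings, while naive linear re-scanning can approach O(n²) in the worst case, which is one plausible reading of the performance gap the quote describes.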

Logik.io claims its solution is designed to work with today’s ecommerce solutions “to provide guided selling and configuration to B2B and B2C self-service selling.”

Industry differentiation

With Logik.io, Shutts claims B2B and B2C self-service selling companies can easily build and maintain their data models once and reuse them across all applications. “We are building a truly innovative technology for companies looking to improve their CPQ experience and enable B2B ecommerce,” Shutts told VentureBeat.

“We know both buyers and sellers demand speed and convenience throughout the sales process, and Logik.io delivers on that promise with lightning fast selection and configuration tools,” he added.

CPQ and omnichannel use cases

Logik.io says its technology has applications in sales, helping sales teams augment Salesforce CPQ with a configuration tool that offers speed, advanced configuration logic, and simple, intuitive administration. This eliminates the need for complex integrations, allowing companies to optimize their CRM capabilities with all their data in one place, the company said.

For B2B and B2C omnichannel selling, Logik.io says its robust headless rules engine integrates with all major ecommerce platforms — enabling companies to sell high-complexity products online via existing self-service digital commerce platforms. This allows companies to use a single data set for all their product configuration use cases, further streamlining overall IT architecture.

Propelling digital transformation in the CPQ space

This investment round was led by High Alpha with participation from Salesforce Ventures and other private investors. Logik.io says this additional capital will be used to accelerate its product development strategy as well as support the growth of its go-to-market engine into 2022.


Enterprise process automation startup Workato nabs $200M

Mountain View, California-based Workato, an enterprise process automation platform, today announced that it raised $200 million at a $5.7 billion valuation, bringing its total funding to over $420 million to date. The series E — which was led by Battery Ventures with equal participation from Insight Partners, Altimeter Capital, and Tiger Global — will be used to support global expansion as well as future mergers and acquisitions, according to CEO Vijay Tella.

Remote work, support of remote business, and other pandemic headwinds have prompted enterprises to adopt — or at least consider adopting — automation solutions. According to Forrester, 20% of companies will expand their use of technologies like intelligent document extraction, which can extract and analyze data from a mix of digital and physical documents, in the coming year. The time and cost savings can be substantial: a 2017 study found that 53% of employees could save up to two work hours a day through automation, equating to 240 hours per year.

Founded by Alexey Timanovskiy, Dimitris Kogias, Gautham Viswanathan, Harish Shetty, and Tella in 2013, Workato lets companies integrate a range of data and apps to automate backend and front-end business workflows. The company’s platform delivers robotic process automation (RPA), integration platform-as-a-service, business process automation, and chatbot capabilities in a solution designed to enable IT and business teams to collaborate — ostensibly without compromising security, compliance, or governance.

“I was fortunate to be a part of the team that created the very first integration platform in the late 1980s and early 1990s. It was called The Information Bus, or TIB, and led to TIBCO and a wave of middleware technologies. I went on to become the founding SVP of engineering of TIBCO and was a part of the team that took TIBCO public. After that, I was the chief strategy officer of Oracle Fusion Middleware,” Tella told VentureBeat via email. “I, along with my cofounders Viswanathan and Shetty, built Workato in 2013 from the ground up. Our founding team has been largely involved in building some of the earliest integration platforms. We created a fusion of our past teams — people with completely different backgrounds that were deep in the consumer or cloud and integration space.”

With Workato, users can create automations from scratch or opt for over 500,000 prebuilt recipes addressing marketing, sales, finance, HR, IT, and other processes. The company says its over 11,000 customers and partners are creating over 500 new connectors to apps and systems each month.

Above: Creating a workflow automation recipe using Workato’s tools. (Image Credit: Workato)

“For HR, for example, Workato can automate HR onboarding and offboarding. This includes automatically generating accounts for new hires in HR apps like Workday or Namely, monitoring benefits like time off, and triggering app provisioning or deprovisioning or hardware provisioning,” Tella said. “Workato also helps customers streamline mission-critical business processes in the finance department by automating the entire revenue process end to end — for example, product configuration, pricing, quoting, contracts, invoicing, billing, orders, revenue recognition, and renewals. [And Workato has been] supporting customers in building automations to help facilitate navigating the workplace during the pandemic.”
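An onboarding automation like the one Tella describes boils down to a trigger event plus an ordered list of actions sharing one context. The sketch below mirrors that recipe shape in plain Python; the step names and apps echo the example above but are invented, and this is not Workato’s actual SDK:

```python
def run_recipe(trigger_event, steps):
    """Run each step of a recipe in order, threading a shared context
    (the trigger payload plus an action log) through the workflow."""
    context = {"trigger": trigger_event, "log": []}
    for step in steps:
        context = step(context)
    return context

def create_hr_account(ctx):
    # Hypothetical stand-in for creating the new hire in an HR app.
    name = ctx["trigger"]["name"]
    ctx["log"].append(f"HR account created for {name}")
    return ctx

def provision_apps(ctx):
    # Hypothetical stand-in for app provisioning on day one.
    for app in ("email", "sso", "payroll"):
        ctx["log"].append(f"{app} provisioned")
    return ctx

result = run_recipe({"event": "new_hire", "name": "Dana"},
                    [create_hr_account, provision_apps])
print(result["log"])
```

Offboarding is the same pattern run in reverse: the trigger is a termination event and the steps deprovision rather than provision.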

Workato also makes extensive use of AI and machine learning. As Tella explained to TechCrunch in a 2018 interview: “Leveraging the tens of billions of events processed, hundreds of millions of metadata elements inspected and hundreds of thousands of automations that people have built on our platform — we leverage machine learning to guide users to build the most effective [automation] by recommending next steps as they build these automations. It recommends the next set of actions to take, fields to map, auto-validates mappings, [and more]. The great thing with this is that as people build more automations — it learns from them and continues to make the automation smarter.”
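The next-step recommendation behavior Tella describes can be approximated, in toy form, by counting which step most often followed the current one across historical recipes. The recipes below are invented, and Workato’s production models are certainly far more sophisticated than this frequency sketch:

```python
from collections import Counter

# Historical recipes: each is an ordered list of steps users actually built.
history = [
    ["new_lead", "create_crm_record", "notify_sales"],
    ["new_lead", "create_crm_record", "send_welcome_email"],
    ["new_lead", "create_crm_record", "notify_sales"],
]

def recommend_next(history, current_step):
    """Recommend the step that most often followed current_step."""
    followers = Counter()
    for recipe in history:
        for a, b in zip(recipe, recipe[1:]):
            if a == current_step:
                followers[b] += 1
    return followers.most_common(1)[0][0] if followers else None

print(recommend_next(history, "create_crm_record"))  # 'notify_sales'
```

As the quote notes, the model improves as more automations are built: every new recipe adds transitions to the history, sharpening the counts behind each suggestion.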

In May at its Automate 2021 Conference, Workato introduced a number of new services including Automation HQ, a set of capabilities encompassing federated workspaces, a business operations console, lifecycle management tools, and custom communities for companies. Automation Accelerators, another product announced in May, delivers prepackaged solutions containing prebuilt recipes, custom connectors, reference data, and instructional guides.

Expanding platform

Workato competes with a number of companies in a workflow automation market that’s anticipated to be worth $18.45 billion by 2023, according to Markets and Markets. (From November 2019 to November 2020, over $2.2 billion in venture capital was funneled into tech companies building workflow automation solutions.) AirSlate offers products that automate repetitive enterprise tasks like e-signature collection. Tonkean is expanding its no-code workflow automation platform. There’s also Tray.io, Daylight, Leapwork, DeepSee.ai, Kore.ai, Aisera, and Berlin-based Camunda, each of which has closed funding rounds in the tens of millions for its process automation toolkit.

For Workato’s part, the company says that it more than doubled new annual recurring revenue, its headcount, and its customer base with additions like Stitch Fix, GitLab, NYU, Nokia, and Lucid Motors. Tella also noted that Workato recently acquired Chennai, India-based RailsData — which specializes in connectivity between apps, databases, and devices — to create what he describes as an “app connector factory,” with the goal of scaling the number of connectors on Workato by over 10 times in the next few years.

Workato — which has 650 employees — plans to focus its expansion efforts particularly in Europe, the Middle East, and Africa after it saw a 289% surge in usage over the past 12 months in the region. Beyond this, the company intends to open additional datacenters in Asia and add support for more regional languages.

“The pandemic has driven an even greater need for business excellence, and companies have responded by automating core workflows; seeking out low-code tools that empower employees to work quickly and autonomously. This investment arrives at a time of rapid growth for Workato and the automation market as a whole, as enterprises recognize the urgent need to increase their agility, innovation and efficiency against a backdrop of transformation and change,” Tella continued. “There’s really nobody that does quite what we do. Existing solutions either focus on integration and are too complex for business users, or they focus on automation and support only specific use cases.”


ControlUp lands $100M to help enterprise IT teams manage remote software

Join gaming leaders online at GamesBeat Summit Next this upcoming November 9-10. Learn more about what comes next. 


San Jose, California-based ControlUp, an IT infrastructure management, monitoring, and troubleshooting platform, today announced that it raised $100 million from K1 Investment Management and JVP, bringing its total raised to $140 million. CEO Asaf Ganot says that the investment will enable ControlUp to expand its employee headcount while supporting ongoing product development efforts.

“This injection of capital will accelerate our ability to help more enterprises open the door to the limitless possibilities of a simpler, more reliable work-from-anywhere experience,” Ganot said in a statement. “We give IT real-time visibility into system status, with the ability to resolve help desk calls faster, and even handle potential system issues before they happen. All this translates to fewer headaches, lower costs, higher productivity, and happier people.”

ControlUp’s tranche comes as IT teams struggle to contend with remote and hybrid work setups emerging during the pandemic. According to a PricewaterhouseCoopers survey, 17% of employers say that the shift to remote work hasn’t been successful for their company. Among the other headaches are security and governance vulnerabilities — 45% of professionals expect their company to suffer a data breach during the pandemic due to staff using personal devices that aren’t properly protected.

ControlUp aims to address the growing challenges with a software-as-a-service product that collects device metrics (e.g., CPU, RAM, bandwidth, and I/O usage; protocol latency; and app load time) to help customers troubleshoot and remediate software issues. The platform collects up to one year of virtual desktop interface, server, and device environment data, analyzing it to proactively warn of potential issues with the availability of enterprise resources including domain name servers, file shares, and print services.

Device monitoring

ControlUp was founded in 2008 by Ganot and Yoni Avital, who began their careers at a Citrix services company implementing end-user computing projects. While there, they built tools that helped expose common technical problems in virtual desktop environments, which became the cornerstone of ControlUp’s current product offering.

ControlUp provides telemetry dashboards, updated every few seconds, that highlight and help to fix problems with virtual desktops and apps — for example, slow app response. IT admins can leverage search and grouping options to show resources as they change states or opt for automated actions and scripts that clean up temp directories, expand disk size, log off idle users, and more.

ControlUp’s “top insights” pane summarizes findings through widgets that spotlight anomalies, key performance indicators, and other metrics that could impact performance. The metrics are compared against both internal averages and a “global benchmark” consisting of anonymized data aggregated from all ControlUp’s enterprise customers. ControlUp uses the data to, among other things, provide AI-driven recommendations for increasing or decreasing assigned CPU and RAM to machines, and to answer granular questions like “Is my user’s profile load time phase long or short compared to other organizations with SSD storage?”
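Comparing a device’s metric against an internal average and a cross-customer benchmark, as described above, amounts to a simple standardized comparison. The sketch below illustrates the idea with z-scores; the numbers are invented and this is not ControlUp’s actual scoring method:

```python
from statistics import mean, stdev

def z_score(value, sample):
    """How many standard deviations a value sits from the sample mean."""
    return (value - mean(sample)) / stdev(sample)

# Profile load times (seconds): one customer's own fleet vs. a hypothetical
# anonymized benchmark of organizations with similar (SSD) storage.
internal = [8.0, 9.0, 10.0, 11.0, 12.0]
benchmark = [5.0, 6.0, 7.0, 8.0, 9.0]

user_load_time = 14.0
print(round(z_score(user_load_time, internal), 2))   # vs. internal average
print(round(z_score(user_load_time, benchmark), 2))  # vs. global benchmark
```

A load time that looks only mildly slow against the internal average can stand out sharply against the global benchmark, which is exactly the kind of question (“is my user’s profile load time long compared to similar organizations?”) the platform is built to answer.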

Above: ControlUp’s monitoring dashboard.

“By analyzing data from tens of thousands of troubleshooting sequences and continuously improving its machine learning algorithms, [ControlUp] recommends the shortest drill down path to uncover the root cause of [a] problem,” the company says on its website. “ControlUp’s … real-time engine connects to a multitude of data sources using flexible and expandable data collectors that cover a wide array of architectures and technologies. It utilizes a high performance in-memory database in order to digest, associate, and correlate hundreds of thousands of records in a single node.”

Expanding market

A 2021 Omdia survey predicts that, going forward, only 24% of knowledge workers will be permanently based in an office and working from a single desk. This is likely to further strain IT departments already struggling to adapt to the new norm. According to Riverbed, 94% of companies experienced technology problems that impacted their business while employees worked remotely, particularly disconnections from corporate networks, slow file downloads, and long response times when loading apps.

Against this backdrop, business has been booming for 250-employee ControlUp, which says it’s seen 50% revenue growth and 67% growth in enterprise accounts year-over-year, with over 1 million new seats deployed around the world. While it competes with 1E, Nexthink, and Lakeside, ControlUp notes that it currently supports over 5 million devices across 1,500 customers, including four of the top five U.S. health insurance companies and five of the top eight U.S. health care companies.

“The pandemic amplified the complexities of supporting employees when they started working from remote locations. These issues have been significant — and our solutions help,” Ganot told VentureBeat via email. “While [the pandemic] did not create the ‘work from anywhere’ trends, we have seen this experience accelerated. Enterprises across industries and around the world are looking closely at how they solve these problems. ControlUp is focused on helping companies give their people the freedom and flexibility to work from anywhere. We do this by empowering IT teams to optimize remote environments, prevent user downtime, and resolve issues faster. Ultimately, we enable businesses [to] deliver an outstanding digital employee experience.”

VentureBeat

VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member

Repost: Original Source and Author Link


MindsDB wants to give enterprise databases a brain


Databases are the cornerstone of most modern business applications, be it for managing payroll, tracking customer orders, or storing and retrieving just about any piece of business-critical information. With the right supplementary business intelligence (BI) tools, companies can derive all manner of insights from their vast swathes of data, such as establishing sales trends to inform future decisions. But when it comes to making accurate forecasts from historical data, that’s a whole new ball game, requiring different skillsets and technologies.

This is something that MindsDB is setting out to solve, with a platform that helps anyone leverage machine learning (ML) to future-gaze with big data insights. In the company’s own words, it wants to “democratize machine learning by giving enterprise databases a brain.”

Founded in 2017, Berkeley, California-based MindsDB enables companies to make predictions directly from their database using standard SQL commands, and visualize them in their application or analytics platform of choice.

To further develop and commercialize its product, MindsDB this week announced that it has raised $3.75 million, bringing its total funding to $7.6 million. The company also unveiled partnerships with some of the most recognizable database brands, including Snowflake, SingleStore, and DataStax, which will bring MindsDB’s ML platform directly to those data stores.

Using the past to predict the future

There are myriad use cases for MindsDB, such as predicting customer behavior, reducing churn, improving employee retention, detecting anomalies in industrial processes, credit-risk scoring, and predicting inventory demand — it’s all about using existing data to figure out what that data might look like at a later date.

An analyst at a large retail chain, for example, might want to know how much inventory they’ll need to fulfill demand in the future based on a number of variables. By connecting their database (e.g., MySQL, MariaDB, Snowflake, or PostgreSQL) to MindsDB, and then connecting MindsDB to their BI tool of choice (e.g., Tableau or Looker), they can ask questions and see what’s around the corner.

“Your database can give you a good picture of the history of your inventory because databases are designed for that,” MindsDB CEO Jorge Torres told VentureBeat. “Using machine learning, MindsDB enables your database to become more intelligent to also give you forecasts about what that data will look like in the future. With MindsDB you can solve your inventory forecasting challenges with a few standard SQL commands.”
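MindsDB builds and trains this kind of model automatically behind SQL. As a self-contained stand-in for what such a forecast involves, here is an ordinary least-squares trend extrapolation over invented monthly demand figures; this is an illustration of the concept, not MindsDB's algorithm:

```python
# Least-squares trend forecast of monthly inventory demand.
# The data and the "one step ahead" framing are made up for illustration.

def fit_trend(y: list[float]) -> tuple[float, float]:
    """Ordinary least squares fit of y against time steps 0..n-1."""
    n = len(y)
    xs = range(n)
    x_mean, y_mean = (n - 1) / 2, sum(y) / n
    slope = sum((x - x_mean) * (v - y_mean) for x, v in zip(xs, y)) / \
            sum((x - x_mean) ** 2 for x in xs)
    return slope, y_mean - slope * x_mean

demand = [100, 110, 120, 130, 140, 150]   # units sold, last six months
slope, intercept = fit_trend(demand)
next_month = slope * len(demand) + intercept
print(round(next_month))  # extrapolates the fitted trend one step ahead
```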

Above: Predictions visualization generated by the MindsDB platform

Torres said that MindsDB enables what is known as In-Database ML (I-DBML) to create, train, and use ML models in SQL, as if they were tables in a database.

“We believe that I-DBML is the best way to apply ML, and we believe that all databases should have this capability, which is why we have partnered with the best database makers in the world,” Torres explained. “It brings ML as close to the data as possible, integrates the ML models as virtual database tables, and can be queried with simple SQL statements.”
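One way to get a feel for the models-as-tables idea without MindsDB itself: SQLite lets a Python function be called from SQL, so a "model" (here a deliberately trivial, entirely hypothetical rule, not a trained predictor) can be queried with a plain SELECT:

```python
import sqlite3

def churn_score(monthly_logins: int) -> float:
    """Stand-in 'model': infrequent logins imply high churn risk."""
    return 0.9 if monthly_logins < 2 else 0.1

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, monthly_logins INT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [("acme", 1), ("globex", 12)])
# Register the Python function so SQL can call it like a built-in.
conn.create_function("churn_score", 1, churn_score)

rows = conn.execute(
    "SELECT name, churn_score(monthly_logins) FROM customers").fetchall()
print(rows)  # each row carries a prediction, fetched with standard SQL
```

The appeal of the real thing is the same shape at scale: predictions live next to the data and are reachable by anyone who can write a SELECT.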

MindsDB ships in three broad variations — a free, open source incarnation that can be deployed anywhere; an enterprise version that includes additional support and services; and a hosted cloud product that recently launched in beta, which charges on a per-usage basis.

The open source community has been a major focus for MindsDB so far, claiming tens of thousands of installations from developers around the world — including developers working at companies such as PayPal, Verizon, Samsung, and American Express. While this organic approach will continue to form a big part of MindsDB’s growth strategy, Torres said his company is in the early stages of commercializing the product with companies across numerous industries, though he wasn’t at liberty to reveal any names.

“We are in the validation stage with several Fortune 100 customers, including financial services, retail, manufacturing, and gaming companies, that have highly sensitive data that is business critical — and [this] precludes disclosure,” Torres said.

The problem that MindsDB is looking to fix is one that impacts just about every business vertical, spanning businesses of all sizes — even the biggest companies won’t want to reinvent the wheel by developing every facet of their AI armory from scratch.

“If you have a robust, working enterprise database, you already have everything you need to apply machine learning from MindsDB,” Torres explained. “Enterprises have put vast resources into their databases, and some of them have even put decades of effort into perfecting their data stores. Then, over the past few years, as ML capabilities started to emerge, enterprises naturally wanted to leverage them for better predictions and decision-making.”

While companies might want to make better predictions from their data, the inherent challenges of extracting, transforming, and loading (ETL) all that data into other systems is fraught with complexities and doesn’t always produce great outcomes. With MindsDB, the data is left where it is in the original database.

“That way, you’re dramatically reducing the timeline of the project from years or months to hours, and likewise you’re significantly reducing points of failure and cost,” Torres said.

The Switzerland of machine learning

The competitive landscape is fairly extensive, depending on how you consider the scope of the problem. Several big players have emerged to arm developers and analysts with AI tooling, such as the heavily VC-backed DataRobot and H2O, but Torres sees these types of companies as potential partners rather than direct competitors. “We believe we have figured out the best way to bring intelligence directly to the database, and that is potentially something that they could leverage,” Torres said.

And then there are the cloud platform providers themselves, such as Amazon, Google, and Microsoft, which offer their customers machine learning as add-ons. In those instances, however, these services are really just ways to sell more of their core product: compute and storage. Torres sees potential for partnering with these cloud giants in the future, too. “We’re a neutral player — we’re the Switzerland of machine learning,” Torres added.

MindsDB’s seed funding includes investments from a slew of notable backers, including OpenOcean, which counts MariaDB cofounder Patrik Backman as a partner; Y Combinator (MindsDB graduated YC’s winter 2020 batch); Walden Catalyst Ventures; SpeedInvest; and Berkeley’s SkyDeck fund.



GPT-3 comes to the enterprise with Microsoft’s Azure OpenAI Service

During its Ignite conference this week, Microsoft unveiled the Azure OpenAI Service, a new offering designed to give enterprises access to OpenAI’s GPT-3 language model and its derivatives along with security, compliance, governance, and other business-focused features. Initially invite-only as a part of Azure Cognitive Services, the service will allow access to OpenAI’s API through the Azure platform for use cases like language translation, code generation, and text autocompletion.

According to Microsoft corporate VP for Azure AI Eric Boyd, companies can leverage the Azure OpenAI Service for marketing purposes, like helping teams brainstorm ideas for social media posts or blogs. They could also use it to summarize common complaints in customer service logs or to assist developers with coding by minimizing the need to stop and search for examples.

“We are just in the beginning stages of figuring out what the power and potential of GPT-3 is, which is what makes it so interesting,” he added in a statement. “Now we are taking what OpenAI has released and making it available with all the enterprise promises that businesses need to move into production.”

Large language models

Built by OpenAI, GPT-3 and its fine-tuned derivatives, like Codex, can be customized to handle applications that require a deep understanding of language, from converting natural language into software code to summarizing large amounts of text and generating answers to questions. People have used it to automatically write emails and articles, compose poetry and recipes, create website layouts, and create code for deep learning in a dozen programming languages.

GPT-3 has been publicly available since 2020 through the OpenAI API; OpenAI has said that GPT-3 is now being used in more than 300 different apps by “tens of thousands” of developers and producing 4.5 billion words per day. But according to Microsoft corporate VP of AI platform John Montgomery, who spoke recently with VentureBeat in an interview, the Azure OpenAI Service enables companies to deploy GPT-3 in a way that complies with the laws, regulations, and technical requirements (for example, scaling capacity, private networking, and access management) unique to their business or industry.

“When you’re operating a national company, sometimes, your data can’t [be used] in a particular geographic region, for example. The Azure OpenAI Service can basically put the model in the region that you need for you,” Montgomery said. “For [our business customers,] it comes down to questions like, ‘How do you handle our security requirements?’ and ‘How do you handle things like virtual networks?’ Some of them need all of their API endpoints to be centrally managed or use customer-supplied keys for encryption … What the Azure OpenAI Service does is it folds all of these Azure backplane capabilities [for] large enterprise customers [into a] true production deployment to open [up] the GPT-3 technology.”
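Montgomery's points about regional placement, centrally managed endpoints, and customer-supplied keys all show up on the client side of a request. Here is a hedged sketch of what such a request might look like; the resource name, deployment name, API version, and header value are placeholders for illustration, not documented Azure values:

```python
# Sketch of a region-pinned completion request: the resource (and thus
# its endpoint hostname) lives in a chosen Azure region, and the request
# carries a key-based auth header. All identifiers below are invented.

def completion_request(resource: str, deployment: str, prompt: str) -> dict:
    return {
        "url": (f"https://{resource}.openai.azure.com/openai/deployments/"
                f"{deployment}/completions?api-version=2021-11-01"),
        "headers": {"api-key": "<customer-managed-key>"},
        "body": {"prompt": prompt, "max_tokens": 64},
    }

req = completion_request("contoso-westeurope", "gpt3-davinci",
                         "Summarize the customer complaint:")
print(req["url"])  # the hostname encodes which region serves the model
```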

Montgomery also points out that the Azure OpenAI Service makes billing more convenient by charging for model usage under a single Azure bill, versus separately under the OpenAI API. “That makes it a bit simpler for customers to pay and consume,” he said. “Because at this point, it’s one Azure bill.”

Enterprises are indeed increasing their investments in natural language processing (NLP), the subfield of linguistics, computer science, and AI concerned with how algorithms analyze large amounts of language. According to a 2021 survey from John Snow Labs and Gradient Flow, 60% of tech leaders indicated that their NLP budgets grew by at least 10% compared to 2020, while a third — 33% — said that their spending climbed by more than 30%.

Customization and safety

As with the OpenAI API, the Azure OpenAI Service will allow customers to tune GPT-3 to meet specific business needs using examples from their own data. It’ll also provide “direct access” to GPT-3 in a format designed to be intuitive for developers to use, yet robust enough for data scientists to work with the model as they wish, Boyd says.

“It really is a new paradigm where this very large model is now itself the platform. So companies can just use it and give it a couple of examples and get the results they need without needing a whole data science team and thousands of GPUs and all the resources to train the model,” he said. “I think that’s why we see the huge amount of interest around businesses wanting to use GPT-3 — it’s both very powerful and very simple.”

Of course, it’s well-established that models like GPT-3 are far from technically perfect. GPT-3 was trained on more than 600GB of text from the web, a portion of which came from communities with pervasive gender, race, physical, and religious prejudices. Studies show that it, like other large language models, amplifies the biases in data on which it was trained.

In a paper, the Middlebury Institute of International Studies’ Center on Terrorism, Extremism, and Counterterrorism claimed that GPT-3 can generate “informational” and “influential” text that might radicalize people into far-right extremist ideologies and behaviors. A group at Georgetown University has used GPT-3 to generate misinformation, including stories around a false narrative, articles altered to push a bogus perspective, and tweets riffing on particular points of disinformation. Other studies, like one published by Intel, MIT, and Canadian AI initiative CIFAR researchers in April, have found high levels of bias from some of the most popular open source models, such as Google’s BERT and XLNet and Facebook’s RoBERTa.

Even fine-tuned models struggle to shed prejudice and other potentially harmful characteristics. For example, Codex can be prompted to generate racist and otherwise objectionable outputs as executable code. When writing code comments with the prompt “Islam,” Codex outputs the word “terrorist” and “violent” at a greater rate than with other religious groups.

More recent research suggests that language models deployed into production might struggle to understand aspects of minority languages and dialects. This could force people using the models to switch to “white-aligned English” to ensure the models work better for them, or discourage minority speakers from engaging with the models at all.

OpenAI claims to have developed techniques to mitigate bias and toxicity in GPT-3 and its derivatives, including code review, documentation, user interface design, content controls, and toxicity filters. And Microsoft says it will only make the Azure OpenAI Service available to companies who plan to implement “well-defined” use cases that incorporate its responsible principles and strategies for AI technologies.

Beyond this, Microsoft will deliver safety monitoring and analysis to identify possible cases of abuse or misuse as well as new tools to filter and moderate content. Customers will be able to customize those filters according to their business needs, Boyd says, while receiving guidance from Microsoft on using the Azure OpenAI Service “successfully and fairly.”
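The customizable filters Boyd describes can be pictured as thresholding on a model-assigned score. This is a loose illustration of the concept, not Azure's actual moderation API; the threshold names and defaults are invented:

```python
# A toy, customer-tunable content filter: route generated text to
# "block", "flag" (human review), or "allow" based on a toxicity score
# that an upstream classifier is assumed to have produced.

def moderate(text: str, toxicity: float,
             block_at: float = 0.8, flag_at: float = 0.5) -> str:
    """Return the moderation decision for one piece of generated text."""
    if toxicity >= block_at:
        return "block"
    if toxicity >= flag_at:
        return "flag"
    return "allow"

print(moderate("some generated text", toxicity=0.62))  # goes to review
```

Customers with different risk tolerances would tune `block_at` and `flag_at` per use case, which is the kind of customization the service promises.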

“This is a really critical area for AI generally and with GPT-3 pushing the boundaries of what’s possible with AI, we need to make sure we’re right there on the forefront to make sure we are using it responsibly,” Boyd said. “We expect to learn with our customers, and we expect the responsible AI areas to be places where we learn what things need more polish.”

OpenAI and Microsoft

OpenAI’s deepening partnership with Microsoft reflects the economic realities that the company faces. It’s an open secret that AI is a capital-intensive field — in 2019, OpenAI became a for-profit company, OpenAI LP, to secure additional funding while staying controlled by a nonprofit, having previously been a 501(c)(3) organization. And in July, OpenAI disbanded its robotics team after years of research into machines that can learn to perform tasks like solving a Rubik’s Cube.

Roughly a year ago, Microsoft announced it would invest $1 billion in San Francisco-based OpenAI to jointly develop new technologies for Microsoft’s Azure cloud platform. In exchange, OpenAI agreed to license some of its intellectual property to Microsoft, which the company would then package and sell to partners, and to train and run AI models on Azure as OpenAI worked to develop next-generation computing hardware.

In the months that followed, OpenAI released a Microsoft Azure-powered API — OpenAI API — that allows developers to explore GPT-3’s capabilities. In May during its Build 2020 developer conference, Microsoft unveiled what it calls the AI Supercomputer, an Azure-hosted machine co-designed by OpenAI that contains over 285,000 processor cores and 10,000 graphics cards. And toward the end of 2020, Microsoft announced that it would exclusively license GPT-3 to develop and deliver AI solutions for customers, as well as creating new products that harness the power of natural language generation, like Codex.

Microsoft last year announced that GPT-3 would be integrated “deeply” with Power Apps, its low-code app development platform — specifically for formula generation. The AI-powered features will allow a user building an ecommerce app, for example, to describe a programming goal using conversational language like “find products where the name starts with ‘kids.’” More recently, Microsoft-owned GitHub launched a feature called Copilot that’s powered by OpenAI’s Codex code generation model, which GitHub says is now being used to write as much as 30% of new code on its network.

Certainly, the big winners in the NLP boom are cloud service providers like Azure. According to the John Snow Labs survey, 83% of companies already use NLP APIs from Google Cloud, Amazon Web Services, Azure, and IBM in addition to open source libraries. This represents a sizeable chunk of change, considering that the global NLP market is expected to climb in value from $11.6 billion in 2020 to $35.1 billion by 2026. In 2019, IBM generated $303.8 million in revenue from its AI software platforms alone.



Report: Enterprise use of AI to predict cash flow expected to increase 450%

Enterprise deployment of AI and machine learning (ML) for cash flow forecasting is expected to increase 450% over the next two years, according to the recently released 2021 Cash Forecasting & Visibility Survey from GTreasury and Strategic Treasurer. The survey of nearly 250 enterprises across industries highlights a growing appetite for AI/ML modernization among finance and treasury teams seeking more accurate and more immediate cash flow forecasts.

To sharpen forecasting capabilities (which are critical for determining business direction and priorities), today’s enterprises are embracing new technology strategies and refining methods to introduce greater automation and efficiency. While just 6% of respondents currently use AI/ML technology to predict and understand their cash forecasting, enterprises’ reported plans indicate that, within two years, that number will reach 27%.

Respondents also indicate a similarly bright trajectory for regression analysis: 12% use it currently, but projected usage will grow to 29% in two years, and 43% use or expect to use it at some point in the future.

Graphic: Rise of AI/ML in the near future. Currently, 6% of respondents use AI/ML and 12% use regression analysis. Within the next two years, 21% plan to adopt AI/ML and 17% plan to adopt regression analysis. Beyond two years, a further 24% plan to adopt AI/ML and 14% regression analysis.

The vast majority of enterprises still rely on traditional manual methods for cash forecasting — 91% of survey respondents report using Excel spreadsheets as one of their forecasting tools. In comparison, 25% have a more modern digital treasury platform in place, and 28% use ERP systems. Fifteen percent use financial reporting and analysis (FR&A) or budgeting tools to assist in their forecasts, and just 5% use a dedicated forecasting platform.
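For a sense of what moving even one step beyond a spreadsheet looks like, here is single exponential smoothing over invented weekly cash inflows, one of the simplest models an automated forecasting tool might apply; this is a generic sketch, not drawn from any vendor's product:

```python
# Single exponential smoothing: each new observation nudges the running
# level, and the final level is the forecast for the next period.

def smooth_forecast(series: list[float], alpha: float = 0.5) -> float:
    """Return the next-period forecast from a smoothed series."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

inflows = [120.0, 80.0, 100.0, 140.0]   # weekly cash inflows ($k), invented
print(round(smooth_forecast(inflows), 1))
```

The `alpha` parameter trades responsiveness against stability; an AI/ML platform would fit choices like this from the data rather than leave them to an analyst.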

Variance analysis is another task requiring heavy manual effort from enterprises: 57% of respondents say that their variance analysis activities are fully manual, and another 19% report significant manual activities. One-fifth of companies avoid this manual effort only by performing no variance analysis whatsoever. The remaining 5% of respondents do utilize variance analysis that’s backed by fully automated processes.
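The variance analysis that most respondents still do by hand reduces to comparing forecast against actuals and surfacing lines outside a tolerance. A minimal sketch, with account names, figures, and the tolerance all invented:

```python
# Automated variance analysis: flag accounts whose actuals deviate from
# forecast by more than a relative tolerance.

def variance_report(forecast: dict, actual: dict,
                    tolerance: float = 0.10) -> list[tuple[str, float]]:
    """Return (account, variance ratio) for lines breaching the tolerance."""
    flagged = []
    for account, f in forecast.items():
        a = actual.get(account, 0.0)
        ratio = (a - f) / f
        if abs(ratio) > tolerance:
            flagged.append((account, round(ratio, 2)))
    return flagged

print(variance_report({"payroll": 500.0, "receivables": 200.0},
                      {"payroll": 505.0, "receivables": 160.0}))
```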

The survey’s findings are beads strung along a common thread: Enterprises recognize and demand the benefits of more efficient and effective cash forecasting. With investments in AI/ML and other advanced capabilities, many enterprises are already pursuing new strategies and spending what it takes to place the tools and technologies they require at their command.

Read the full report from GTreasury and Strategic Treasurer.



Google’s future in enterprise hinges on strategic cybersecurity

Gaps in Google’s cybersecurity strategy make banks, financial institutions, and larger enterprises slow to adopt the Google Cloud Platform (GCP), with deals often going to Microsoft Azure and Amazon Web Services instead.

It also doesn’t help that GCP has long had the reputation that it is more aligned with developers and their needs than with enterprise and commercial projects. But Google now has a timely opportunity to open its customer aperture with new security offerings designed to fill many of those gaps.

During last week’s Google Cloud Next virtual conference, Google executives leading the security business units announced an ambitious new series of cybersecurity initiatives precisely for this purpose. The most noteworthy announcements are the formation of the Google Cybersecurity Action Team, new zero-trust solutions for Google Workspace, and extending Work Safer with CrowdStrike and Palo Alto Networks partnerships.

The most valuable new announcements for enterprises are on the BeyondCorp Enterprise platform, however. BeyondCorp Enterprise is Google’s zero-trust platform that allows virtual workforces to access applications in the cloud or on-premises and work from anywhere without a traditional remote-access VPN. Google’s announced Work Safer initiative combines BeyondCorp Enterprise for zero-trust security and their Workspace collaboration platform.

Workspace now has 4.8 billion installations of 5,300 public applications across more than 3 billion users, making it an ideal platform to build and scale cybersecurity partnerships. Workspace also reflects the growing problem chief information security officers (CISOs) and CIOs have with protecting the exponentially increasing number of endpoints that dominate their virtual-first IT infrastructures.

Bringing order to cybersecurity chaos

With the latest series of cybersecurity strategies and product announcements, Google is attempting to sell CISOs on the idea of trusting Google for their complete security and public cloud tech stack. Unfortunately, that pitch doesn’t reflect reality: at many enterprises, CISOs have lifted and shifted numerous legacy systems to the cloud.

Missing from the many announcements were new approaches to dealing with just how chaotic, lethal, and uncontrolled breaches and ransomware attacks have become. But Google’s announcement of Work Safer, a program that combines Workspace with Google cybersecurity services and new integrations to CrowdStrike and Palo Alto Networks, is a step in the right direction.

The Google Cybersecurity Action Team claimed in a media advisory it will be “the world’s premier security advisory team with the singular mission of supporting the security and digital transformation of governments, critical infrastructure, enterprises, and small businesses.”  But let’s get real: This is a professional services organization designed to drive high-margin engagement in enterprise accounts. Unfortunately, small and mid-tier enterprises won’t be able to afford engagements with the Cybersecurity Action Team, which means they’ll have to rely on system integrators or their own IT staff.

Why every cloud needs to be a trusted cloud

CISOs and CIOs tell VentureBeat that it’s a cloud-native world now, and that includes closing the security gaps in hybrid cloud configurations. Most enterprise tech stacks grew through mergers, acquisitions, and a decade or more of cybersecurity tech-buying decisions. These are held together with custom integration code written and maintained by outside system integrators in many cases. New digital-first revenue streams are generated from applications running on these tech stacks. This adds to their complexity. In reality, every cloud now needs to be a trusted cloud.

Google’s series of announcements relating to integration and security monitoring and operations are needed, but they are not enough. Historically, Google has lagged the market in security monitoring, instead prioritizing its own data loss prevention (DLP) APIs, given their proven scalability in large enterprises. To Google’s credit, it has created a technology partnership with Cybereason, which will use Google’s cloud security analytics platform Chronicle to improve its extended detection and response (XDR) service and will help security and IT teams identify and prevent attacks using threat hunting and incident response logic.

Google now appears to have the components it previously lacked to offer a much-improved selection of security solutions to its customers. Creating Work Safer by bundling the BeyondCorp Enterprise Platform, Workspace, the suite of Google cybersecurity products, and new integrations with CrowdStrike and Palo Alto Networks will resonate the most with CISOs and CIOs.

Without a doubt, many will want a price break on BeyondCorp maintenance fees at a minimum. While BeyondCorp is generally attractive to large enterprises, it’s not addressing the quickening pace of the arms race between bad actors and enterprises. Google also includes reCAPTCHA Enterprise and Chrome Enterprise for desktop management, both needed by all organizations to scale website protection and browser-level security across all devices.

It’s all about protecting threat surfaces

Enterprises operating in a cloud-native world mostly need to protect threat points. Google announced a new client connector for its BeyondCorp Enterprise platform that can be configured to protect Google-native and also legacy applications — which are very important to older companies. The new connector also supports identity and context-aware access to non-web applications running in both Google Cloud and non-Google Cloud environments. BeyondCorp Enterprise will also have a policy troubleshooter that gives admins greater flexibility to diagnose access failures, triage events, and unblock users.
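The identity- and context-aware access described above replaces "are you on the corporate network?" with a policy decision over user and device attributes. This is a toy evaluator to illustrate the shape of that decision, not Google's engine; the attributes checked are invented:

```python
# Zero-trust-style access decision: identity, entitlement, and device
# posture must all agree before access is granted -- no VPN involved.

def allow_access(user: dict, device: dict, app: str) -> bool:
    """Grant access only when identity, device posture, and policy agree."""
    return (user.get("authenticated", False)
            and app in user.get("entitlements", ())
            and device.get("disk_encrypted", False)
            and device.get("os_patched", False))

ok = allow_access(
    {"authenticated": True, "entitlements": {"payroll-app"}},
    {"disk_encrypted": True, "os_patched": True},
    "payroll-app",
)
print(ok)  # the decision rests on user and device context, not location
```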

Throughout Google Cloud Next, cybersecurity executives spoke of embedding security into the DevOps process and creating zero trust supply chains to protect new executable code from being breached. Achieving that ambitious goal for the company’s overall cybersecurity strategy requires zero trust to be embedded in every phase of a build cycle through deployment.

Cloud Build is designed to support builds, tests, and deployments on Google’s serverless CI/CD platform. It’s SLSA Level 1 compliant, with scripted builds and support for available provenance. In addition, Google launched a new build integrity feature in Cloud Build that automatically generates a verifiable build manifest. The manifest includes a signed certificate describing the sources that went into the build, the hashes of artifacts used, and other parameters. Binary authorization is also now integrated with Cloud Build to ensure that only trusted images make it to production.
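A verifiable build manifest boils down to recording the build's inputs and hashing its outputs. Here is a stripped-down sketch using `hashlib`; the field names are invented, and real provenance follows the SLSA format and is cryptographically signed:

```python
import hashlib
import json

def build_manifest(sources: list[str], artifacts: dict[str, bytes]) -> str:
    """Record build sources and SHA-256 digests of the produced artifacts."""
    manifest = {
        "sources": sources,
        "artifacts": {name: hashlib.sha256(blob).hexdigest()
                      for name, blob in artifacts.items()},
    }
    return json.dumps(manifest, sort_keys=True)

m = build_manifest(["git+https://example.com/repo@abc123"],
                   {"app.bin": b"compiled bytes"})
print(m)  # a signed form of this is what binary authorization would verify
```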

These new announcements will protect software supply chains for large-scale enterprises already running a Google-dominated tech stack. It’s going to be a challenge for mid-tier and smaller organizations to get these systems running on their IT budgets and resources, however.

Bottom line: Cybersecurity strategy needs to work for everybody  

As Google’s cybersecurity strategy goes, so will the sales of the Google Cloud Platform. Convincing enterprise CISOs and CIOs to replace or extend their tech stack and make it Google-centric isn’t the answer. The answer is recognizing how chaotic, diverse, and unpredictable today’s cybersecurity threatscape is, and building more apps, platforms, and adaptive tools that learn fast and thwart breaches.

Getting integration right is just part of the challenge. The far more challenging aspect is how to close the widening cybersecurity gaps all organizations face — not only large-scale enterprises — without requiring a Google-dominated tech stack to achieve it.

 
