
Gartner predicts ‘digital twins of a customer’ will transform CX

Digital twins are already transforming how companies design and manufacture products, equipment and infrastructure. In its latest Immersive Hype Cycle, Gartner predicts that digital twins of a customer (DToC) could transform the way enterprises deliver experiences. Simulating a customer experience (CX) is a bit more nuanced than simulating a machine, and there are privacy considerations to address, not to mention the creepiness factor. If done right, though, Gartner predicts the DToC will drive sales while delighting customers in surprising ways.

Gartner has a nuanced view of the customer that includes individuals, personas, groups of people and even machines. It is worth noting that many enterprise technologies are moving toward this more comprehensive vision. Customer data platforms consolidate a data trail of all aspects of customer interaction. Voice-of-the-customer tools help capture data from surveys, sensors and social media. Meanwhile, customer journey mapping and customer 360 tools analyze how customers interact with brands across multiple apps and channels.

The critical innovation of the DToC is that it contextualizes data to reveal what customers really need, improving the overall experience, Gartner VP analyst Michelle DeClue-Duerst told VentureBeat. For example, a hotel that knows about a customer’s gluten allergy might identify nearby gluten-free restaurants and stock the minibar only with snacks the customer will enjoy.
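
As a toy illustration of that kind of contextualization, the sketch below filters an inventory against what a customer’s twin records about dietary restrictions. The data model is entirely hypothetical, not any vendor’s product:

```python
# Toy DToC profile driving a contextual recommendation.
# All names and attributes are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class CustomerTwin:
    name: str
    dietary_restrictions: set = field(default_factory=set)

def recommend_snacks(twin: CustomerTwin, inventory: list[dict]) -> list[str]:
    """Return only snacks compatible with the twin's known restrictions."""
    return [
        item["name"]
        for item in inventory
        if twin.dietary_restrictions.isdisjoint(item["allergens"])
    ]

guest = CustomerTwin("A. Guest", dietary_restrictions={"gluten"})
minibar = [
    {"name": "pretzels", "allergens": {"gluten"}},
    {"name": "rice crackers", "allergens": set()},
    {"name": "chocolate", "allergens": set()},
]
print(recommend_snacks(guest, minibar))  # ['rice crackers', 'chocolate']
```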

When done right, DToCs can help business teams design ways to serve or capture customers and facilitate new data-driven business models. They will also improve customer engagement, retention and lifetime value.

Developing core capabilities

Gartner notes that DToC implementations are still embryonic, with about 1% to 5% penetration of the target audience. At the same time, enterprises have been busy finding ways to get the most value from their investment using various marketing analytics tools.

Subha Tatavarti, CTO of Wipro, told VentureBeat there have been several important milestones in using tools that simulate customers to improve experiences. The most notable are the ability to define customer experience transformation objectives, the capability to identify and assess data assets, personas and processes, and tools for building and testing behavior models. New ModelOps approaches for integrating, monitoring and enhancing the models are also advancing the field.

“A new generation of recommendation systems based on intention, context and anticipated needs is a very exciting development in combined modeling and simulation capabilities,” Tatavarti said. “Personalized learning and hyper-personalized products are great advancements and personalized healthcare will have critical impacts on that industry.”

Enterprises are taking advantage of new identity resolution capabilities that assemble pieces of data to create a holistic view of the customer. This stitching can help a company understand what an individual customer buys, how frequently they purchase, how much they spend, how often they visit a website and more. 

“Without identity resolution, the company may have to rely on only some of the attributed data sources to fill out the digital persona, meaning the simulation would be somewhat inaccurate,” said Marc Mathies, senior vice president of platform evolution at Vericast, a marketing solutions company.
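
The stitching itself can be sketched simply for the deterministic case, where records merge only on exact shared identifiers. The union-find sketch below uses hypothetical records and is illustrative rather than Vericast’s method; production platforms layer probabilistic matching on top:

```python
# Minimal deterministic identity resolution: records that share any
# identifier (email, phone, device) are stitched into one profile.
from collections import defaultdict

def stitch(records: list[dict]) -> list[list[dict]]:
    parent = list(range(len(records)))

    def find(i: int) -> int:
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i: int, j: int) -> None:
        parent[find(i)] = find(j)

    seen = {}  # (field, value) -> first record index seen with it
    for i, rec in enumerate(records):
        for key in ("email", "phone", "device_id"):
            value = rec.get(key)
            if value is None:
                continue
            if (key, value) in seen:
                union(i, seen[(key, value)])
            else:
                seen[(key, value)] = i

    profiles = defaultdict(list)
    for i, rec in enumerate(records):
        profiles[find(i)].append(rec)
    return list(profiles.values())

records = [
    {"email": "a@example.com", "phone": "555-0100"},
    {"phone": "555-0100", "device_id": "d1"},  # merges with the record above
    {"email": "b@example.com"},
]
print(len(stitch(records)))  # 2 profiles from 3 records
```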

Bumpy road

Enterprises will need to address a few challenges to scale these efforts. Gartner observed that privacy and security concerns could lengthen the time it takes DToCs to mature and increase regulatory risks. Organizations must also build teams familiar with machine learning and simulation techniques. 

Tatavarti said the most difficult obstacles are the quality and availability of customer data from physical and digital interaction and data sharing between multiple organizations. These challenges will also involve privacy considerations and the ability to connect physical systems and virtual models without affecting the experience or performance. Teams also need to ensure the accuracy of the models and eliminate bias.

Bill Waid, chief product and technology officer at FICO, a customer analytics leader, told VentureBeat that another challenge in implementing digital twins for customer simulation is the impact of localized versus global simulation. Frequently, teams only simulate subsegments of the decision process to improve scale and manageability. Enterprises will need to compose these digital twins for more holistic and reusable simulations.

Organizations will also need to be transparent. 

“Initially, it will be hard to convince customers they need a digital twin that your brand stores and that the customer should help create it to improve their experience,” said Jonathan Moran, head of MarTech solutions marketing at SAS.

Building the right foundation

Industry leaders have many ideas about how enterprises can improve these efforts. 

Unlike digital twins in areas like manufacturing, customer behavior shifts quickly and often. Karl Haller, partner at IBM Consulting, said it is therefore essential to implement ongoing optimization and calibration that analyzes the simulation results and determines ways to improve the performance of the models. He also recommends narrowly defining the focus of a customer simulation to optimize outcomes and reduce costs. Innovations in natural language processing, machine learning, object and visual recognition, acoustic analytics and signal processing could help.

Moran recommends enterprises develop synthetic data generation expertise to build and augment virtual customer profiles. These efforts could help expand data analytics and address privacy considerations.
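
One simple form of synthetic data generation is sampling new customer profiles from distributions fitted to real data. The sketch below assumes hypothetical fitted parameters and columns; real programs would use dedicated synthesis tools (GAN- or copula-based, for instance) and validate privacy properties before use:

```python
# Minimal synthetic customer-profile generation: sample new rows from
# distributions that were (hypothetically) fitted to real data.
import numpy as np

rng = np.random.default_rng(42)

# Pretend these parameters were fitted to real customer data.
visit_rate = 3.2                    # mean visits per month (Poisson)
spend_mu, spend_sigma = 4.0, 0.6    # log-normal spend parameters
segments = ["bargain", "loyal", "premium"]
segment_probs = [0.5, 0.3, 0.2]

def synthesize(n: int) -> list[dict]:
    return [
        {
            "visits_per_month": int(rng.poisson(visit_rate)),
            "monthly_spend": round(float(rng.lognormal(spend_mu, spend_sigma)), 2),
            "segment": str(rng.choice(segments, p=segment_probs)),
        }
        for _ in range(n)
    ]

print(synthesize(3))  # three synthetic profiles, no real customer behind them
```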

Mark Smith, vice president of digital engagement solutions at CSG, recommends businesses overlay voice-of-the-customer data with behavioral data captured through customer journey analytics. This modeling method is typically the fastest and most accurate route to understanding the peaks and valleys of the customer journey.

“Comparing customers’ actual actions with their reported lived experience data unearths disconnects between customers’ perception of the experience and brands’ analysis of their own offerings,” Smith said. 

A mixed future 

Eventually, enterprises will need to find ways to optimize for profits along with customer well-being. Eangelica Germano Aton, product owner at conversational intelligence platform Gryphon AI, predicts that things will initially get worse for people as machines get better at predicting choices that reduce emotional well-being.

“I think it will take a customer-driven or a bottom-up revolution and rejection of the current model before a more sophisticated and genuinely humanist AI can emerge that doesn’t maximize such a shallow objective function as profit,” Germano Aton said. 

Others are more optimistic. 

“Over time, it will be possible to use a deep understanding of the customer in a way that creates value for the consumer, the brand and the employees of the brand,” said Chris Jones, chief product officer at customer data platform Amperity. “One of the things we are observing is the ability of these capabilities to deepen the human connection between brands and the customers they serve by empowering employees across the brand to truly see their customer and provide the most personalized experience possible.”

In the long run, digital twin capabilities could become embedded into marketing and customer experience automation tools.

“As digital twin work moves more into marketing and CX in five to ten years, I think we will see solutions with more simulation capabilities built in,” Moran said. “Any type of marketing KPI and expected results will be simulated within the tool. Vendors already have some simulation capabilities for optimization, reinforcement learning and predictions, but I think this will start to increase even more in the coming years.”


New Gartner survey: Only half of AI models make it into production

Automation and artificial intelligence (AI) are being broadly embraced by organizations even as multiple challenges remain – though the challenges may not be what many think. 

Across multiple aspects of IT and AI, a lack of qualified IT professionals is often cited as a barrier to adoption. According to a new survey released by Gartner today, a lack of AI talent really isn’t an issue. A whopping 72% of organizations surveyed claimed they can either source or already have the AI talent they need. 

Everyone is building AI models, but production is harder

While lack of talent isn’t an issue, moving from pilot to production certainly is. Gartner’s survey identified a stubborn gap between the number of AI models developed by organizations and the actual number that make it into production. 

The survey reported that, on average, only 54% of AI models move from pilot to production. That figure is only marginally higher than the often-cited 53% that Gartner reported in a 2020 survey.

“The biggest surprise was the sheer number of organizations that reported having thousands of AI models deployed coupled with the fact that only 54% make it into production, and many [indicating] they have not aligned to business value,” Frances Karamouzis, distinguished VP analyst at Gartner, told VentureBeat. 

So what is needed to move the needle to have more AI projects move from pilot to production? Karamouzis said that the one-word answer is discipline.

In her view, organizations must have a disciplined approach to aligning to value, ensuring the right talent is in place and ensuring critical areas of AI trust and security are properly implemented.

Governance remains a challenge

The Gartner study also found that 40% of organizations have thousands of AI models deployed and that volume leads to complexity for governance, as well as tracking the value and return on investment from AI. 

The challenge of AI’s lack of governance has been identified in other surveys released in 2022. A global research project conducted by Juniper Networks and Wakefield Research, released June 15, found that a lack of maturity in AI governance policies is a barrier to further adoption. The Wakefield Research report, however, also found that a lack of talent was an issue, which isn’t what Gartner is seeing.

An April 2022 report from O’Reilly Media also found governance to be an AI adoption challenge, with 51% of organizations lacking some form of governance plan for AI projects.

The intersection of security, privacy and AI

Security was not identified as a top barrier to adoption by respondents to the Gartner survey either. Only 3% of respondents listed security as a top barrier; the top barriers identified were the difficulty of measuring value, a lack of understanding of AI benefits and uses, and data accessibility challenges.

Yet even though security did not crack the list of top barriers, AI-related security and privacy issues are rampant, with 41% of organizations admitting they have had an issue at some point in the past. 

Digging deeper into the question of AI security, half of organizations (50%) were worried about competitors or even partners as risks. The actual source of risk, however, appears to be insiders. Of those organizations that admitted to having an AI-related privacy or security issue, 60% were attributed to insiders.

“Organizations’ AI security concerns are often misplaced, given that most AI breaches are caused by insiders,” Erick Brethenoux, distinguished VP analyst at Gartner, wrote in a release. “While attack detection and prevention are important, AI security efforts should equally focus on minimizing human risk.”

The Gartner survey was conducted from October through December 2021 across the U.S., Germany and the U.K. Its 699 respondents were employed by organizations that have already deployed AI or intend to deploy it within the next three years.


Gartner research: 2 types of emerging AI near hype cycle peak

According to new Gartner research, two types of emerging artificial intelligence (AI) — emotion and generative AI — are both reaching the peak of the digital advertising hype cycle. This is thanks to AI’s expansion into targeting, measurement, identity resolution and even generating creative content. 

“I think one of the key pieces is that the options for marketers have been accelerating,” Mike Froggatt, senior director analyst in the Gartner marketing practice, told VentureBeat. “When you think about the fragmentation of digital media, ten years ago, there was display, search, video, rich media, but now, there’s podcasts, over-the-top platforms, blockchain and NFTs. AI is helping marketers target, measure and identify consumers, even generating the content that can appear in those channels, creating all new artifacts to give marketers a voice in those channels.” 

Traditional methods for targeting customers are being deprecated, noted the Gartner report, Hype Cycle for Digital Advertising 2022, as the industry evolves from an assumed quid pro quo to an explicit, consent-driven media and advertising economy.

While Google continues to delay the date it will stop supporting third-party cookies — which digital advertisers have historically relied on for ad tracking — digital marketers will need to learn to adapt as customer data becomes more scarce and targeting difficulty increases. 

Emotion AI: Opportunities and privacy challenges

According to an analysis by Gartner analyst Andrew Frank, emotion AI technologies “use AI techniques to analyze the emotional state of a user…[and] can initiate responses by performing specific, personalized actions to fit the mood of the customer.”

Frank says it is part of a larger trend called “influence AI” that “seeks to automate elements of digital experience that guide user choices at scale by learning and applying techniques of behavioral science.” 

With public criticism around the use, or even potential use, of emotion AI tools, privacy and trust will be essential to emotion AI’s success, said Froggatt.

“It’s going to have to be transparent in how it’s being used, and we’re going to have to move away from bundling it in types of tracking within apps that collect things implicitly,” he explained.

But emotion AI will create interesting opportunities for brands if tied to trust and explicit consent, he added. According to the Gartner report, access to emotion data “delivers insights into motivational drivers that help test and refine content, tailor digital experiences and build deeper connections between people and brands.” 

The Gartner report cautioned that emotion AI would likely take another decade to become firmly established. For now, organizations should review vendor capabilities carefully, since the emotion AI market is immature and companies may only support limited use cases and industries. 

Generative AI: Soon to reach mainstream adoption

The Gartner report also found that generative AI covers a broad swath of tools that “learn from existing artifacts to generate new, realistic artifacts such as video, narrative, speech, synthetic data and product designs that reflect the characteristics of the training data without repetition.”

Within the next two to five years, the report predicts, these solutions will reach mainstream adoption. 

Elements of the metaverse, including digital humans, will rely on generative AI. Transformer models, like OpenAI’s DALL-E 2, can create original images from a text description. Synthetic data is also an example of generative AI, helping to augment scarce data or mitigate bias.

For marketing professionals, generative AI tackles many of the issues they face today, including the need for more content and assets and for smart, personalized customer engagement.

“Imagine a brand taking a generative AI tool and feeding their existing creative and copy assets into it and coming up with whole new versions of ad, video and email content,” said Froggatt. “It automates a lot of that and allows marketers to focus on the strategy around it.”

In addition, generative data assets can remove the need for the individual identity data that targeting typically requires.

“I think that it can be super-powerful for advertisers and media,” he added.

Still, steep challenges around possible regulations and issues such as deepfakes remain. The Gartner report recommends examining and quantifying the advantages and limitations of generative AI, as well as weighing technical capabilities with ethical factors. 

Gartner research: Future of AI in marketing

For now, marketing pros still have the old tools – like third-party cookies – available to them. But with trends like media fragmentation and deprecation of customer data sources not slowing down, they will need the right tools to adapt to new forms of measurement and targeting. 

“I think that’s where AI is really going to start showing its value,” said Froggatt, adding that while he doesn’t think solutions like generative and emotion AI will avoid the Gartner Hype Cycle’s “trough of disillusionment” after reaching the peak, “I think they will be finding their own route through the hype cycle.”


Gartner: Citizen developers will soon outnumber professional coders 4 to 1

To meet a coming wave of hyper-automation, IT organizations need to do a better job of partnering with professionals outside of IT to automate business processes and data integration, according to research firm Gartner.

Gartner defines hyper-automation as “a business-driven, disciplined approach that organizations use to rapidly identify, vet, and automate as many business and IT processes as possible.” The possibilities are so vast that IT can’t pursue them alone — and shouldn’t try to, Jason Wong, a Gartner distinguished VP analyst covering software design and development, argued in a presentation at the Gartner IT Symposium this week.

Instead of complaining about “shadow IT” efforts outside the control of the CIO, IT should engage with business unit developers to make sure they have what they need to get their work done.

This group includes “business technologists,” or trained, full-time developers who are embedded in a department like marketing. But it also includes “citizen developers” who know how to use no-code, low-code, or data management and analytics tools to automate processes for themselves and their teams. Gartner’s prediction: “By 2023, the number of active citizen developers at large enterprises will be at least four times the number of professional developers.”

IT organizations must break out of the mindset that the work done by these groups is trivial or insignificant, Wong said. “In fact, they are doing serious work. They create algorithms. They create user interfaces that make it easier for their teams to do their work,” he said. Often, they are creating new capabilities, not just making tweaks, he said. “They see the power of workflow and business logic.”

A prescription for business-driven automation

In recent Gartner surveys, only 42% of these workers reported that they were using tools specifically designed for them, such as robotic process automation software. Wong suggested that may be simply because they haven’t been offered access to those tools, which are often only made available to centers of excellence or other pockets within the organization. In contrast, 64% said they were working with database, data science, analytics, and AI tools, 59% said they worked with application development tools, and 45% said they worked with integration tools including data integration and API management tools.

As a result of their work, 82% say they are making their departments more effective, 68% say they are improving efficiency, and 63% say they are boosting business agility.

Wong’s prescription for organizations that want to get the most value out of business-driven automation includes:

  • Co-operate: take these teams seriously and work to amplify their skills
  • Co-own: provide access to a variety of tools, and include business technologists and citizen developers in communities of practice that promote excellence
  • Co-create: scale up what can be accomplished by including these developers in multidisciplinary teams that plan big initiatives, and help them understand broader enterprise considerations such as technology risk

For a win-win strategy, organizations should work to ensure their business achieves automation that makes it more effective and efficient, while also ensuring that whatever the business develops, IT can support, Wong concluded.


Gartner advises tech leaders to prepare for action as quantum computing spreads

Quantum computing has hit the radar of technical leaders because of the huge efficiency it offers at scale. It will take years to develop for most applications, however, even as it makes limited near-term progress in highly specialized fields such as materials science and cryptography.

Quantum methods are gaining more rapid attention, however, with special tools for AI, as seen in recent developments around natural language processing that could open up the “black box” of today’s neural networks.

Last week’s release of a Quantum Natural Language Processing (QNLP) toolkit by Cambridge Quantum shows the new possibilities.

Known as lambeq, the kit takes the form of a conventional Python repository that is hosted on GitHub. It follows the arrival at Cambridge Quantum of noted AI and NLP researchers and affords the chance for hands-on experience in QNLP.

The lambeq package, which takes its name from late semantics researcher Joachim Lambek, is said to convert sentences into quantum circuits, offering a new view into text mining, language translation, and bioinformatics corpora.

Using quantum principles, NLP can provide explainability not possible in “bag of words” neural approaches run on classical computers today, according to Bob Coecke, the chief scientist at Cambridge Quantum. QNLP, he said, layers a compositional structure onto circuits. As represented in schemas, these structures look not unlike parsed sentences on grade-school blackboards.

Presently popular methods of NLP “don’t have an ability to compose things together to find a meaning,” Coecke told VentureBeat. “What we want to bring in is compositionality in the classical sense — to use the same compositional structure. We want to bring reasoning back.”
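
A rough sketch of the lambeq workflow, following the pattern in the project’s documentation (class names and signatures may differ across versions): parse a sentence into a compositional diagram, then map it to a parameterized quantum circuit.

```python
# Sketch of the QNLP pipeline with lambeq: sentence -> string diagram
# -> parameterized quantum circuit. Illustrative; check the lambeq
# docs for the API of your installed version.
from lambeq import BobcatParser, AtomicType, IQPAnsatz

parser = BobcatParser()
diagram = parser.sentence2diagram("Alice prepares a tasty dinner")

N, S = AtomicType.NOUN, AtomicType.SENTENCE
ansatz = IQPAnsatz({N: 1, S: 1}, n_layers=2)  # 1 qubit per noun/sentence wire
circuit = ansatz(diagram)                     # a trainable quantum circuit
circuit.draw()
```

The diagram preserves the grammatical composition of the sentence, which is the structure Coecke argues restores reasoning to NLP.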

Quantum computing timelines

Cambridge Quantum’s efforts to expand quantum infrastructure got significant backing earlier this year when Honeywell said it would merge its own quantum computing operations with Cambridge Quantum, to form an independent company to pursue cybersecurity, drug discovery, optimization, material science, and other applications, including AI.

Honeywell said it would invest between $270 million and $300 million in the new operation. Cambridge Quantum said it would remain independent, working with various quantum computing players, including IBM.

The lambeq work is part of an overall AI project that is the longest-term effort at Cambridge Quantum, said Ilyas Khan, founder and CEO of Cambridge Quantum, in an email interview.

“We might be pleasantly surprised in terms of timelines, but we believe that NLP is right at the heart of AI more generally and therefore something that will really come to the fore as quantum computers scale,” he said. Khan cited cybersecurity and quantum chemistry as the most advanced application areas in Cambridge Quantum’s estimation.

What kind of timeline does Khan see ahead for quantum hardware?

“There is a very well-informed consensus not only about the hardware roadmap,” he replied, citing Honeywell and IBM among credible corporate players in this regard.

These “and the very well amplified statement by Google about having fault-tolerant computers by 2029 are just some of the reasons why we say that the timelines are generally well-understood,” Khan said.

The march of quantum

Alliances, modeling advances, mergers, and even — in the cases of IonQ and Rigetti — public offerings comprise most of the recent advancements in the quantum computing industry. These often involve hybrid couplings of quantum and classical computing.

New developments in the quantum industry include:

  • D-Wave, builders of a quantum annealing computer that carried forward much of the early research in the area, this year added constrained quadratic model solvers to hybrid tooling for problems that run across classical and quantum systems;
  • Rigetti Computing is working with Riverlane and Astex Pharmaceuticals to pair Rigetti’s quantum processors with cloud-based classical computing resources that, in effect, test quantum algorithms for drug discovery on a hybrid platform that mixes classical and quantum processing;
  • IBM said it would partner with European electric utility company E.ON to develop workflow solutions for future decentralized electrical grids using the open-source Qiskit quantum computing SDK and the IBM Cloud; and,
  • Sandbox, at Alphabet, has reportedly launched APIs that let developers use Google Tensor Processing Units to simulate quantum computing workloads.

Use case drill down

Indications are that, as researchers bounce between breakthroughs and setbacks, a variety of new quantum-inspired algorithms and software tools will appear. Enterprises need to pick targets carefully while treading some novel ground.

Gartner analyst Chirag Dekate emphasized that, where applicable, enterprises should begin to prepare for quantum computing. He spoke this week at Gartner IT Symposium/Xpo 2021 Americas.

He said companies should be sure not to outsource quantum innovation, but to instead use this opportunity to foster skills via small quantum working groups.

“Starting early is the surest form of success,” he said.

He said enterprise decision-makers must drill down on very specific use cases, as they prepare for quantum commercialization.

“Quantum computing is not a general-purpose technology — we cannot use quantum computing to address all the business problems that we currently experience,” Dekate told the assembled and virtual conference audiences.

Gartner’s Hype Cycle for Computing Infrastructure for 2021 has it that more than 10 years will elapse before quantum computing reaches the Plateau of Productivity. That’s the place where the analyst firm expects IT users to truly benefit from employing a given technology.

The assessment is the same as it was in 2020, as is quantum computing’s present post on the Peak of Inflated Expectations — Gartner’s designation for rising technologies that are considered overhyped.


Digital transformation will spur economic boom in 2021, CEOs tell Gartner

Chief executives around the world expect a return to strong economic growth over the next two years and are betting on digital transformation, AI technology, and corporate activism to help make it happen.

Some 60% of CEOs polled for Gartner’s 2021 CEO Survey said they anticipate a return to economic growth this year and in 2022. That follows pandemic-ravaged global economic performance in 2020, the research firm said. Gartner on Tuesday released its annual survey, which over six months last year polled 465 CEOs and other senior business executives at companies varying in size, revenue, and industry, located in North America, EMEA, and APAC.

“CEOs’ top priorities for 2021 show confidence,” said Mark Raskino, research vice president at Gartner. “Over half report growth as their primary focus and see opportunity on the other side of the crisis, followed by technology change and corporate action.”

“This year, all leaders will be working hard to decode what the post-pandemic world looks like, and redeveloping mid- to long-range business strategy accordingly. In most cases, that will uncover a round of new structural changes to capability, location, products, and business models,” Raskino said in a statement.

AI, quantum computing, 5G are strategic priorities

Respondents cited business growth, technology change, and corporate actions such as mergers and acquisitions as the top three priorities for their companies over the next two years. Technology is a particularly strategic concern for CEOs — digital capabilities were the only area where a majority of respondents said they planned to increase investment in 2021.

Gartner found that more CEOs than ever are citing digital change and investment as a priority for their organizations. When they gave answers about top strategic business priorities in their own words, 20% of CEOs used the word “digital,” up from 17% in 2020 and 15% in 2019. The unprompted citation of digitization as a priority has been steadily increasing in Gartner’s survey over the past several years, growing from just 2% of citations in 2012.

Drilling down to specific technological areas where CEOs expect to invest, respondents cited AI as the “most industry-impactful technology” over the coming years, Gartner said. Some 30% of respondents said quantum computing would be “highly relevant” to their companies’ long-term plans, but a majority weren’t certain how that would look. Respondents also cited blockchain and 5G as technologies they were focused on.

While a majority of CEOs polled did not have designated data officers such as chief digital officers or chief data officers, 83% of respondents said they employed chief information officers. A majority of CEOs surveyed by Gartner said their “top ask” of their CIOs is digitalization.

The United States-China economic rivalry and trade relations between the countries was another area of concern for Gartner respondents. One-third of surveyed CEOs said that “evolving trade disputes between the two nations” over core technologies like AI and 5G were “a significant concern for their businesses.”

CEOs see M&A opportunities, remote work in store

Global CEOs also cited M&As and other corporate actions, social and environmental issues, and new workplace conditions resulting from the pandemic as primary areas of focus.

Interestingly, fewer respondents than in previous surveys cited “sales revenue” as a growth priority, while more mentioned “new markets.” Gartner’s Raskino suggested that this shift, plus the increased emphasis on M&A opportunities, “shows that CEOs and senior executives seeking advantage from a cyclical downturn are going shopping for structural inorganic growth” rather than counting on incremental sales growth “using the strategies that have served them well in the past.”

“‘Techquisitions’ can bolster digital business progress, while also providing access to potential fast-growth market sectors,” Raskino said.

Meanwhile, more than 80% of CEOs believe the “societal behavior change” taking place during the pandemic will become more or less the “new normal.” Most expect hybrid work-from-home arrangements to become permanent for many workers, while expenditures on travel-related activities will remain lower than before the pandemic.

These developments, as well as nearly half of surveyed companies’ prioritization of sustainability to mitigate climate change, will further increase companies’ reliance on digital technology and digital channel flexibility in the coming years, said Kristin Moyer, Gartner research vice president.

“This suggests that continuing to improve the way customers are served digitally will be vital,” Moyer said.


Gartner says composable data and analytics key to digital transformation

Gartner wrapped up the Data & Analytics Summit Americas 2021 virtual event this week with a lively overview of top trends for enterprises to watch in 2021. Overall, Gartner analysts saw pressing uber trends around accelerating change, operationalizing business value, and — more and more — “distributed everything.”

Accelerating change these days means feeding and scaling AI, and composable data and analytics are key, said Donald Feinberg, a Gartner distinguished analyst.

This is about making it easy to assemble AI from across many different tools for BI, data management, and predictive analytics. This trend will allow companies to use microservices and containerization to bring together the necessary pieces to create a service, Feinberg said.

“This is a great way to pursue experiments because you can pick and choose how it works together,” Feinberg said.

Composable data and analytics initiatives might uncover new ways of packaging data as part of a service or product. These could be built using low-code and no-code tools that could be sold via the cloud or new kinds of data service intermediaries.

Providing the foundation for composable data and analytics is the data fabric, which allows easy access and sharing across distributed data environments.

“You should not have to worry about where it is and how to access it,” Feinberg said of composable data. It is not a single tool but rather the set of tools put together into a solution. Metadata powered by a graph database is the glue that holds this together. It is not easy to do, but the technology is getting better.
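
As a toy illustration of that metadata glue (hypothetical datasets, using the open-source networkx library rather than any fabric product), a metadata graph can record lineage and location so consumers can find data without knowing where it physically lives:

```python
# Toy metadata graph for a data fabric: nodes are datasets, edges
# record lineage, and attributes record location. Hypothetical names.
import networkx as nx

g = nx.DiGraph()
g.add_edge("crm.contacts", "warehouse.customers", relation="feeds")
g.add_edge("web.clickstream", "warehouse.customers", relation="feeds")
g.add_edge("warehouse.customers", "bi.churn_dashboard", relation="feeds")
g.nodes["warehouse.customers"]["location"] = "cloud-us-east"

# Lineage query: everything upstream of the churn dashboard.
print(nx.ancestors(g, "bi.churn_dashboard"))
# {'warehouse.customers', 'crm.contacts', 'web.clickstream'}
```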

Big data becomes small and wide data

There is a growing need to weave a wider variety of data into applications to improve situational awareness and decision making.

COVID-19 caused a lot of historical data to become obsolete. There are also many small data use cases where there is just less data to work with. This trend requires investigating technologies like federated learning, few-shot learning, and content analytics that can organize new types of data such as voice, text, and video.

The accelerated road to digital transformation also runs through responsible, scalable AI.

Teams now need to pay attention to new privacy rules and AI models, said Rita Sallam, a distinguished analyst at Gartner.

Trust is growing in importance owing to regulations like GDPR in Europe and CCPA in California and new AI regulations being proposed in Europe.

“We see that many organizations are struggling with scaling AI prototypes and pilots into production, and the effort to integrate AI into production is underestimated,” Sallam said.

Uber trend: Operationalizing business value

Gartner said business-facing data initiatives were key drivers of digital transformation in the enterprise. Research showed that 72% of data and analytics leaders are leading, or are heavily involved, in their organizations’ digital transformation efforts. These data leaders now confront emerging trends on varied fronts.

XOps: DataOps has evolved into XOps to support AI and machine learning workflows; the X can also stand for MLOps, ModelOps, and even FinOps. This promises flexibility and agility in coordinating infrastructure, data sources, and business needs in new ways.

Engineering decision intelligence: Decision support is not new, but now decision making is more complex. Engineering decision intelligence frames a wide range of techniques from conventional analytics to AI to align and tune decision models and make them more repeatable, understandable, and traceable.

Data and analytics as the core business function: With the chaos of the pandemic and other disruptors, data and analytics are more central to the organization’s success. Successful companies will have to prioritize data and analytics as core functions rather than a secondary activity done by IT. This will also drive data literacy efforts and new organizational models that distribute analytics functions across more teams.

Everything is distributed, and graph relates everything: Graph databases have been around for a while but struggled due to limited tools, data sources, and workflows. But the technology is seeing major growth due to graph data improvements in popular BI and analytics tools. There are a wide variety of graph techniques for representing knowledge, relationships, properties, social networks, business rules, and metadata. Gartner predicts that graph technologies will underpin 80% of data analytics innovations by 2025.

Data and analytics at the edge: The Internet of Things (IoT) allows enterprises to work with data at the edge. What’s new is the different ways enterprises are also embedding analytics, AI, and decision intelligence into edge applications. Use cases include providing better predictive maintenance for factories, delivering new insights to oil rigs, and enabling better mobile apps. The edge improves speed and resiliency because there’s no need for constant cloud connectivity. However, analytics at the edge complicates governance, so enterprises need to find tools that help with governance and analytics at the edge, Feinberg said.

Rise of the augmented consumer: Gartner is focusing on business consumers and the importance of making analytics exploration easier and richer, such as the shift from pre-designed dashboards to new, more automated and dynamic presentation and delivery of analytics. This will shift the analytics superpower to the augmented consumer, Sallam said. Expect to see a significant growth in new vendors that deliver more conversational and interactive analytics experiences across new channels such as voice, mobile, and web applications, she said.

Gartner’s presenters advised enterprises to keep in mind that these are all technologies and practices companies can pick up and apply today using commercial software.

These trends complement one another in many ways. It’s useful to consider the entire collection in an integrated manner, and then prioritize the one worth researching for your own business domain, with an eye toward how it may work for others, the analysts said.


Gartner says low-code, RPA, and AI driving growth in ‘hyperautomation’

Research firm Gartner estimates the market for hyperautomation-enabling technologies will reach $596 billion in 2022, up nearly 24% from the $481.6 billion in 2020.

Gartner is expecting significant growth for technology that enables organizations to rapidly identify, vet, and automate as many processes as possible and says it will become a “condition of survival” for enterprises. Hyperautomation-enabling technologies include robotic process automation (RPA), low-code application platforms (LCAP), AI, and virtual assistants.

As organizations look for ways to automate the digitization and structuring of data and content, technologies that automate content ingestion, such as signature verification tools, optical character recognition, document ingestion, conversational AI, and natural language technology (NLT), will be in high demand. For example, these tools could be used to automate the process of digitizing and sorting paper records.
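
As a small, hypothetical example of what automated content ingestion looks like in code, the sketch below OCRs a scanned page with the open-source pytesseract library and routes it by keyword; enterprise pipelines add document classification, entity extraction and human review on top:

```python
# Toy content-ingestion step: OCR a scanned page, then route the
# document by simple keyword rules. File name and routing rules are
# hypothetical; requires the tesseract binary to be installed.
import pytesseract
from PIL import Image

ROUTES = {"invoice": "accounts_payable", "contract": "legal"}

def ingest(path: str) -> str:
    text = pytesseract.image_to_string(Image.open(path)).lower()
    for keyword, queue in ROUTES.items():
        if keyword in text:
            return queue
    return "manual_review"

print(ingest("scanned_page.png"))  # e.g., 'accounts_payable'
```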

Gartner currently anticipates the hyperautomation market reaching $532.4 billion this year.

Drivers of growth

Gartner said process-agnostic tools such as RPA, LCAP, and AI will drive the hyperautomation trend because organizations can use them across multiple use cases. Even though they constitute a small part of the overall market, their impact will be significant, with Gartner projecting 54% growth in these process-agnostic tools.

Through 2024, the drive toward hyperautomation will lead organizations to adopt at least three of the 20 process-agnostic types of software that enable hyperautomation, Gartner said.

The demand for low-code tools is already high as skills-strapped IT organizations look for ways to move simple development projects over to business users. Last year, Gartner forecast that three-quarters of large enterprises would use at least four low-code development tools by 2024 and that low-code would make up more than 65% of application development activity.

Software automating specific tasks, such as enterprise resource planning (ERP), supply chain management, and customer relationship management (CRM), will also contribute to the market’s growth, Gartner said.

Lots of potential use cases

Hyperautomation extends the idea of intelligent automation, as it promises end-to-end process automation with minimal human intervention required. The convergence of intelligent process automation technologies and cloud computing, along with the need to process unstructured content, helps make the case for hyperautomation across several industries, including shared services, hospitality, logistics, and real estate.

Some day-to-day examples of automation include self-driving cars, self-checkouts at grocery stores, smart home assistants, and appliances. Business use cases include applying data and machine learning to build predictive analytics that react to consumer behavior changes and implementing RPA to streamline operations on a manufacturing floor.

Gartner earlier included hyperautomation in its Top 10 Strategic Technology Trends for 2021.

Benefits of hyperautomation

Gartner said tools that provide visibility to map business activities, automate and manage content ingestion, orchestrate work across multiple systems, and provide complex rule engines make up the fastest-growing category of hyperautomation-enabling software. Organizations will be able to lower operational costs by 30% by 2024 by combining hyperautomation technologies with redesigned operational processes, Gartner projected.

“Hyperautomation has shifted from an option to a condition of survival,” Gartner VP Fabrizio Biscotti said in a statement. “Organizations will require more IT and business process automation as they are forced to accelerate digital transformation plans in a post-COVID-19, digital-first world.”


Gartner: 75% of VCs will use AI to make investment decisions by 2025

By 2025, more than 75% of venture capital and early-stage investor executive reviews will be informed using AI and data analytics. In other words, AI might determine whether a company makes it to a human evaluation at all, deemphasizing the importance of pitch decks and financials. That’s according to a new whitepaper by Gartner, which predicts that in the next four years, the AI- and data-science-equipped investor will become commonplace.

Increasingly advanced analytics capabilities are shifting early-stage venture investing away from “gut feel” and qualitative decision-making toward a “platform-based” quantitative process, according to Patrick Stakenas, senior research director at Gartner. Stakenas says that data gathered from sources like LinkedIn, PitchBook, Crunchbase, and Owler, along with third-party data marketplaces, will be leveraged alongside data on diverse past and current investments.

“This data is increasingly being used to build sophisticated models that can better determine the viability, strategy and potential outcome of an investment in a short amount of time. Questions such as when to invest, where to invest and how much to invest are becoming almost automated,” Stakenas said. “The personality traits and work patterns required for success will be quantified in the same manner that the product and its use in the market, market size and financial details are currently measured. AI tools will be used to determine how likely a leadership team is to succeed based on employment history, field expertise and previous business success.”

As the Gartner report points out, current technology is capable of providing insights into customer desires and predicting future behavior. Unique profiles can be built with little to no human input, which can be further developed via natural language processing AI that can determine qualities about a person from real-time or audio recordings. While this technology is currently used primarily for marketing and sales purposes, by 2025, investment organizations will be leveraging it to determine which leadership teams are most likely to succeed.

Already, one venture capital firm — San Francisco-based SignalFire — is using a proprietary platform called Beacon to track the performance of more than 6 million companies. At a cost of over $10 million per year, the platform draws on 10 million data sources, including academic publications, patent registries, open-source contributions, regulatory filings, company webpages, sales data, social networks, and even raw credit card data. Companies that are outperforming are flagged on a dashboard, allowing SignalFire to see deals ostensibly earlier than traditional venture firms.

This isn’t to suggest that AI and machine learning are — or will be — a silver bullet when it comes to investment decisions. In an experiment last November, Harvard Business Review built an investment algorithm and compared its performance with the returns of 255 angel investors. Leveraging state-of-the-art techniques, they trained the system to select the most promising investment opportunities among 623 deals from one of the largest European angel networks. The model, whose decisions were based on the same data available to investors, outperformed novice investors but fared worse than experienced investors.
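
Harvard Business Review has not published its model, but the general recipe is familiar: train a classifier on features of historical deals and their outcomes, then rank new deals by predicted probability of success. The sketch below uses synthetic stand-in data and hypothetical features:

```python
# Rough sketch of the general approach: fit a classifier on past deal
# outcomes, then rank held-out deals by predicted success probability.
# Data and features here are synthetic; HBR's actual model is unpublished.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 623  # number of deals, as in the HBR experiment
X = rng.normal(size=(n, 5))  # e.g., team size, traction, market growth...
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Rank held-out deals by predicted probability of success.
scores = model.predict_proba(X_test)[:, 1]
top = np.argsort(scores)[::-1][:5]
print("Top-ranked deals:", top, "scores:", scores[top].round(2))
```

The key caveat, as the experiment showed, is that such a model inherits whatever biases are encoded in its historical training data.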

Part of the problem with Harvard Business Review’s model was that it exhibited biases that experienced investors did not. For example, the algorithm tended to pick white entrepreneurs rather than entrepreneurs of color and preferred investing in startups with male founders. That’s potentially because women tend to be disadvantaged in the funding process and ultimately raise less venture capital, which may lead to their startups being less successful. In other words, the AI was projecting into the future the discriminatory societal mechanisms that cause ventures by female and non-white founders to die at an earlier stage.

Because it might not be possible to completely eliminate these forms of bias, it’s crucial that investors take a “hybrid approach” to AI-informed decision-making with humans in the loop, according to Harvard Business Review. While it’s true that algorithms can have an easier time picking out better portfolios because they analyze data at scale, potentially avoiding bad investments, there’s always a tradeoff between fairness and efficiency.

“Managers and investors should consider that algorithms produce predictions about potential future outcomes rather than decisions. Depending on how predictions are intended to be used, they are based on human judgement that may (or may not) result in improved decision making and action,” Harvard Business Review wrote in its analysis. “In complex and uncertain decision environments, the central question is, thus, not whether human decision making should be replaced, but rather how it should be augmented by combining the strengths of human and artificial intelligence.”
