Report: Data and enterprise automation will drive tech and media spending to $2.5T


According to a new report released by Activate Consulting, global technology and media spending will balloon to $2.5 trillion by 2025, up from more than $2 trillion in 2021.

The report, “Activate Technology and Media Outlook for 2022,” indicates that one of the major drivers of this tech boom will be data solutions and enterprise automation. A set of new companies is paving the way for the future, delivering infrastructure, tools, and applications that will enable all enterprises to operate and innovate as if they were major technology companies.

Businesses and consumers can expect to see accelerated development of customer experiences, better (faster, less bureaucratic) employee experiences, improved intelligence and decision-making, and improved operational and financial efficiency as a result.  Technology like autonomy (self-driving cars, home automation), voice recognition, AR/VR, gaming and more will enable end-user experiences while enterprises will become more productive in their marketing effectiveness, IT service management, cross-planning and forecasting, and more.

New data startups are spurring the next era of innovation.  They’re focusing on leveraging data and information, improving end-user experience, and improving storage and connectivity — all of which will drive the business-to-business and business-to-consumer experiences of the future.



According to the report, more than 80% of the companies driving this innovation are U.S.-based, half of which are headquartered in the Bay Area.  They’re growing fast thanks to large venture capital infusions – and many of these startup companies have scaled at an unprecedented pace.  Fifteen of them have raised more than $1 billion since their launch.

In order for the next generation of companies to reach their full potential, the report indicates they must zero in on three specific areas: strategy and transformation, go-to-market pricing, and their sales and marketing approach.

Read the full report by Activate Consulting.


VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021
  • networking features, and more

Become a member

Repost: Original Source and Author Link


Intel CTO Greg Lavender interview — Why chip maker is spending on both manufacturing and software


Intel has been on a spending spree ever since Pat Gelsinger returned to the company as CEO earlier this year. He pledged to spend $20 billion on U.S. factories and another $95 billion in Europe. Those expenses worry investors, as they could take a toll on the chip giant’s bottom line, but Gelsinger said he hopes they will pay off over four or five years.

And Intel is making investments in other ways too. In June, Gelsinger brought aboard Greg Lavender, formerly of VMware, as chief technology officer and senior vice president and general manager of the Software and Advanced Technology Group.

I spoke with Lavender in advance of the online Intel Innovation event happening on October 27-28. At that event, a revival of the Intel Developer Forum that Gelsinger used to lead years ago, Intel will re-engage with developers.

The event will highlight not only what Intel is doing with its manufacturing recovery (after multiple years of delays and costly mistakes) but also its software, such as Intel’s oneAPI technology. Lavender is tasking Intel’s thousands of software engineers with creating more sophisticated software that brings more value through a systems-focused approach, rather than just a chip-based approach.



We talked about a wide variety of subjects across the spectrum of technology. Here’s an edited transcript of our interview.

Above: Greg Lavender is CTO of Intel.

Image Credit: Intel

VentureBeat: Tell me more about yourself. This seems like a very different role for you.

Greg Lavender: I’ve been in the technology industry for a long time, working for hardware companies like Sun and Cisco. For 25 years I was a network software engineer, writing system software. Always working close to the metal. I have graduate degrees in engineering and computer science. We all get the same courses on Maxwell’s electromagnetic theory and physics. I’m a math geek. But I came up with the growth of the industry, right? Pat is three months older than me. Our careers have kind of tracked along. We’ve known each other for not quite 14 years.

VentureBeat: What is the task that [Intel CEO] Pat Gelsinger gave you when he brought you aboard?

Lavender: We’ve known each other since I was running Solaris engineering. He was CTO at Intel. Intel launched the Nehalem platforms, if you remember back when that was their first server CPU. We were only shipping AMD Opteron, dual socket, dual core boxes at the time. Pat gave us some money to port it over to the Intel CPU chipsets. We got to know each other and built a trust relationship there. He obviously hired me into VMware and continued that relationship. He knows I’ve got that hardware and software background.

He surprised me when he called me up. I understood the CTO part, but then he also said I’d be the SVP and GM of the software group. I said, “How big is that software group?” He said, “Well, we don’t have a software group. We have fragmented parts of software across the company.” In my first 120 days, about how long I’ve been here, I ran a defrag, a disk defrag, and pulled a 6,000-person software organization together. Everything from firmware to BIOS to compilers to operating systems, all the Linux, Windows, Chrome, Android. All of our system software, all the security software.

I have a big team now. There’s other parts of software going on in the company, but I’m in the driver’s seat for the software strategy and ensuring the software quality for every hardware product we ship.

Above: Intel is focusing on oneAPI to make software creation easier.

Image Credit: Intel

VentureBeat: Is this a smaller percentage of the staff than it would have been in different years? There were things like Intel Architecture Labs and some of the investments that happened in the last decade way outside the chip space. Has that narrowed down again to a smaller percentage of the overall employees?

Lavender: We have a lot, and I’m hiring more. But I’d just say that Pat came in with his eight years at VMware. I was there for half of that. It’s a real software mindset, that the value of software is enabling the open source software ecosystem. Maybe we don’t need to directly monetize our software, right? We can monetize our very diverse platforms.

I’ve spent most of my time here pushing changes into the new compiler system. We just delivered the AMX accelerator code into the Linux kernel, so that when Sapphire Rapids comes out next year we already have the advanced matrix multiplier for machine learning and AI workloads in the Linux kernel. I have a compiler team — I’m sure you’re familiar with the LLVM compiler ecosystem, where all of our new compilers are built on LLVM. We can accelerate our GPUs, CPUs, and FPGAs. It’s a massive set of IP, and it’s IP we give away for free to enable our platforms. We’re contributing to PyTorch, TensorFlow, ONNX. We just updated Intel acceleration into TensorFlow 2.6. That had 8 million downloads just in Q3. We’re enabling the ecosystem for all the developers out there with these accelerated capabilities. We have our crypto library using OpenSSL, accelerated crypto as software.

I think Intel has just failed to tell everyone about all the cool stuff we’re doing. We talk about our chips and our hardware and our customers. We don’t talk about all this great software. We’ve pulled it all together into my org. And I have Intel Labs, 700 researchers at Intel Labs, with all our future software and AI and ML, as well as our quantum computing group. We have this neural computing chip. We just taped out the second version of it. We open-sourced the programming environment for it, called Lava. There were some articles about Loihi 2. That’s our neural processing chip.

VentureBeat: Is some of the investment in software more around the edges of what Intel does? Would that be harder, because there’s so much capital spending going into manufacturing now, with this recommitment to making sure the core manufacturing part of Intel was taken care of? Maybe that leaves less money for software investment.

Lavender: Our view is we need to prime the ecosystem. We need to be open, be trusted. We need to practice responsible AI in all the things we do with our software. My goal is to meet the developers where they are. Historically Intel wanted to capture the developers. I want to enable them and set them free, so that they have choice.

You may be familiar with the SYCL open source programming language, data parallel C++. It’s an extension to C++ for programming GPUs and FPGAs. We have a SYCL compiler built on LLVM. We make that freely available through our oneAPI ecosystem. We have a new website coming online next week, where you’ll find all these things. We’ve just been poor about letting the world know what those investments have already paid for and delivered. Developers would be shocked to know how much of the open source technology they’re currently using has Intel free software in it. It gives them a better TCO for running their workloads in the clouds, as well as in the datacenters or on their laptops.

If anything is lacking, it’s efficient amplification and communication. Just telling everybody, “This is already here.” From my perspective, I just have to leverage it and go further up the stack. We’ve mostly just pushed out software that enables and tickles the hardware. But we’ve been quietly, or relatively quietly, sprinkling all of these accelerated capabilities in all the common open source environments. I mentioned PyTorch. We just don’t talk about it. What I have to change is marketing and communication. We’re going to do this at Intel.

That’s one of the major themes: engaging with the developer community and getting them access to all this cool technology so that they can choose which platforms they want to run on and get that enablement for free. They don’t have to do anything. Maybe set a flag or something. But they don’t have to do any new coding. As you well know, most developers — of 24 million developers, according to some recent data — are up the stack. If you look at the systems people, there’s maybe 1 million. There’s this big group of people in the middleware layer, the DevSecOps people. Maybe not the no-code/low-code developers, the top of the stack. But there are 4 million enterprise developers just on Red Hat. The fact that I’m pushing stuff into the new compiler ecosystem, pushing stuff into the Linux kernel, into Chrome, means all that technology will be there for all those enterprise developers. I can instantly enable 4 million developers for Sapphire Rapids or Ponte Vecchio GPU.


Above: Intel’s Ponte Vecchio is an amalgamation of graphics cores.

Image Credit: Intel

VentureBeat: If you think of things that Intel is getting back to, that maybe it used to do when it communicated through things like the Intel Developer Forum, are there things you expect will be reminders of that?

Lavender: Intel Developer Forum was one of the best tech conferences back when I was at Sun and Cisco. I think it stopped in, what, 2013? Intel Innovation is essentially a relaunch of that theme. “The geek is back,” as Pat would like to say. We were just rehearsing our dialogues for next week. I love it. We’ve grown up together in the industry. I was originally an assembly language programmer on the 8088 and the 8086. Pat and I cut our teeth on Intel as young kids. It’s just so great to be here together at this time given some of Intel’s missteps in the past. We’re in the driver’s seat, and we’re going to steer this massive company into the future.

All those investments we’ve talked about into our fabs and our foundry services business are part of the overall game plan. But if we build all these chips and then don’t have software to make it sing, what good is that? The software is what makes the hardware sing.

VentureBeat: What are some of the messages for people about how Intel has gotten over those missteps in things like the manufacturing process?

Lavender: Pat’s already been out communicating on that and what he’s doing, putting the company’s balance sheet to work to address the world’s lack of capacity to support the demand for semiconductor technologies. When we broke ground in Arizona three weeks ago, there was a lot of press around that. I think you covered Intel Accelerated, where we discussed Ponte Vecchio and how it will use our new process technology, even using TSMC tiles for the Ponte Vecchio general-purpose GPU. We’ve been adopting the new processes we’ve talked about. We’re getting the yields we need. We’re highly optimistic that the industry demand for semiconductor technologies will make IFS a strong business for us. My team, by the way, develops all the pre-silicon simulation software that IFS customers can use to simulate the functionality of their chip before they send it for tape-out.

VentureBeat: I’ve written a few stories from Synopsys and Cadence about how much AI is going into chip design these days. I imagine you’re making use of all that.

Lavender: Being CTO, I get to look across the whole company. That’s one of the advantages of being CTO. I spend a lot of time with the people in our process technology. They’re leading adopters of AI and ML technology in the manufacturing process, both in terms of optimizing yield from each wafer — wafers are expensive and you want to get the most out of every wafer — and then also for diagnostics, for defects.

Every company has silent data errors as a result of their manufacturing processes. As you get to lower and lower nanometer, into angstroms, the physics gets interesting. Physics is a statistical science. You need statistical reasoning, which is what AI and ML are really about, to help us make sure we’re reducing our defects per million, as well as getting the densities we want per wafer. You’re right. That’s the data to physics layer. You have to use machine learning and inference. We have our own models for that, about how to optimize that so we’re more competitive than our competitors.

Above: Intel CEO Pat Gelsinger breaking ground on chip production.

Image Credit: Intel

VentureBeat: If we go back in history some, Nvidia’s investments in Cuda were interesting for breaking the GPU out of its straitjacket, loosening it up for AI. That led to many changes in the industry. Does Intel have its own version of how you’d like to have something like that happen again?

Lavender: There’s at least three parts to that in the way I think about it. Everyone’s interested in roofline performance. Those are the bragging rights in the industry, whether it’s for a CPU or a GPU. We’ve released some preliminary ML performance numbers for Ponte Vecchio. I think it’s on the 23rd of this month that we’ll be submitting additional ML performance numbers for Xeon into the community for validation and publication. I don’t want to pre-announce those, but wait a couple of days.

We’re continually making progress on what we’re doing there. But it’s really about the software stack. You mentioned Cuda. Cuda has become the de facto standard for programming the GPU in the AI and ML space, not just for gaming. But there are alternatives. Some people do OpenCL. Are you familiar with SYCL, the open source effort for data parallel C++? All of our oneAPI compilers compile for CPU, for Xeon and our client CPUs, for GPU and FPGAs, which are also going into network accelerators particularly. If you want to program in C++ with the SYCL extensions, which are up for standardization in the ISO C++ standards bodies, there’s a lot of effort going into writing SYCL as an open source, industry neutral technology. We’re supporting that for our own platforms, and we’d like to see more adoption across the industry.

I’m sure you’re familiar with AMD announcing their HIP, this thing called a heterogeneous programming environment, which is essentially — think of it as a source-to-source translation of Cuda into this HIP syntax for running on their own CPU and GPU. From Intel’s perspective, we want to support the open source community. We want open standards for how to do this. We’re investing, and we’re going to support the SYCL open source community, which is the Khronos Group. We think that provides a more neutral environment. In fact, I’m told you can program SYCL on top of Nvidia GPUs.

That’s sort of step two, once you get competitive at the GPU level. Step three is, what’s the ecosystem that’s already out there? There’s lots of ISVs that are already in these spaces like health care, edge computing, automotive. Everybody wants choice. Nobody wants proprietary lock-in. We’re going to pursue the path of presenting the market and the industry and our customers with choice.

VentureBeat: How open do you want to be? That’s always a good question.

Lavender: We’ll announce this more specifically at Intel Innovation, but the oneAPI ecosystem we’ve talked about — in some sense, the oneAPI name doesn’t mean there’s one single API. It’s really just a brand name. We have more than seven different vertical toolkits for building various things with the technology. We have more than 40 components — toolkits, SDKs, and so on — that make up the oneAPI ecosystem. It’s really an ecosystem of Intel accelerated technologies, all freely available. We’re doing the oneAPI release. We’re accelerating everything from crypto to codecs to GPUs to FPGAs to CPUs — x86 CPUs, obviously, but not necessarily ours. You can use those tools on AMD if you choose.

Our view is to provide the toolkits out there, and we’ll compete at the system level together with our customers, our partners. We’ll enable all the ISVs. It’s not just the open source. We’ll enable the ISVs to use those libraries. It enables anybody doing cloud development. It enables those 4 million enterprise developers on Red Hat. Just enable everybody. We all know about how software eats the world. The more software that’s out there, in the end, cloud to edge — ubiquitous computing, we call it — that enables the advancement of society, the advancement of culture, the advancement of security.

We’re big on pushing our security features in our hardware through those software components. We’re going to get to a more secure world with less supply chain risk from hackers. Even now, machine learning models are being stolen. People spend millions of dollars to train these things, develop these models, and when they deploy them at the edge people are stealing them, because the edge is not secure. We can use all the security features like SGX and TDX in our hardware to create a security as a service capability for software. We can have secure containers. We pushed an open source project called Kata Containers that gets security from our trusted extensions and our hardware through Linux.

The more we can deliver the value of those innovations in our hardware — that most people don’t know about — through the software stack, then that value materializes. If you use Signal messenger for your communications, did you know that Signal’s servers run on Intel hardware with SGX providing a secure enclave for your security credentials, so your communications aren’t hacked or viewed by the cloud vendors? Only Signal has access to the certificates. That’s enabled by us running on Intel hardware in the cloud. The CTO of Signal will be on stage with me as we talk about this, along with the CTO of Red Hat. The CTO of Signal did his undergraduate honors thesis under me on secure anonymous communication over the internet in 2002. I’m really proud of my student and what he’s done.


Above: Greg Lavender came to Intel in June from VMware.

Image Credit: Intel

VentureBeat: How do you think about something like RISC-V?

Lavender: It shows that innovation is ever-present and always occurring. RISC-V is another set of technologies that will be adopted particularly, I would think, outside the United States, as in Europe and China and elsewhere in Asia people want alternatives to ARM for their own reasons. It’ll be another open architecture, open ecosystem, but the challenge we have as an industry is we have to develop the software ecosystem for RISC-V. There’s a massive software system that’s evolved over a decade or more for ARM. Either we co-opt that software ecosystem for RISC-V, or a new one emerges. There’s appetite for both, I think. There’s already investment in ARM, but at the same time there’s potential to develop something that’s not tied to the ARM environment.

There are differing opinions. I’ve heard from various people about the opportunity for RISC-V. But clearly it’s happening. I think it’s good. It gives more choice in the industry. Intel will track and see where it goes. I generally believe that it’s a positive trend in the industry.

VentureBeat: As far as what people can expect next week, when it was in person there were so many different kinds of options for deep dives. I guess you may have even more options when you’re doing it online. How would you compare this experience to what people might remember from before about Intel Developer Forum?

Lavender: It’s going to be very interactive, with Pat and myself, Sandra Rivera, Gregory Bryant for the client side, Nick McKeown. Sandra, myself, and Nick are all new in our roles, around 100-plus days. It’s going to be a lively conversation style. I forget the total number, but we have more than 100 “meet the geek” demos. We’ll have some cool stuff, everything from 5G edge robotics to deep learning, AI, ML, obviously graphics. We’re going to show off our new Alder Lake processor. Lots of stuff about various open source toolkits we’ve launched. You may not have heard of iPDK. It’s an open source project we launched. A lot of people are jumping on the bandwagon to offload workloads that traditionally run on the cores to the smart NIC. We have some partners that will be showing up to talk about our technology and how they’re using it.

It’s only a two-day event, but there’s a lot of material packed into those two days. It’s a video format. You can browse around and pick and choose what you want. I think we’re all fatigued of these virtual conferences. We’re trying to make it not just a bunch of talking heads, but more of an interactive dialogue about things we’re doing, about our customers and how they’re taking advantage of it, and then quickly transitioning to live or recorded demos to show that it’s real. It’s not just marketing. It’s real.

VentureBeat: Does this sort of thing make you wish the metaverse was here, that we could make it happen faster?

Lavender: There’s this whole sociological, anthropological conversation to have about the transition we’ve all been through for the last two years. For me, I worked in banking, so I’ve learned to think like a global economist. You can’t help but do that when you’re CTO of a global financial company. I look at these things at more of the macroeconomic level in terms of the likely societal changes. Clearly the shortages in the supply chain and the chokes in the supply chain have shown the insatiable demand for technology generally. Everything we’re doing now is technology-enabled. Can you imagine if we didn’t have Zoom, Teams, whatever? What would that have been like? Obviously this is something in the human experience. We’ve all experienced that.

Above: Intel has 6,000 software engineers.

Image Credit: Intel

But without a doubt, the demand for semiconductors, the demand for software will outstrip the talent, the global talent we have to produce it. We have to get economies of scale. This is where Intel has an advantage. We have those economies of scale more than anyone. We can satisfy more of that demand, even if we have to build factories. We have to accelerate all of that with software. This is why there’s a software-first strategy here. If we’re talking five years from now, it could be a very different story, because the company is putting its mojo back into software, particularly open source software. We’re going to continue to deliver a broad portfolio of technologies to enable that global demand to be met in multiple verticals. We all know software is the liquid. It’s the lubricant that enables that technology to add social and economic value.

VentureBeat: Does it look like 2023 is when the supply chain gets back to its healthier self?

Lavender: I read the same press you read. It seems like it’s a two-year cycle to get there. I’ve read stories about people building their own containers to take over on a ship and collect the parts to bring back. Walking supplies through customs in various countries to get it through the process and the bureaucracy. Right now it seems like a lot of unusual things are happening. I’ve even heard about people receiving SOC packages and they go to test them and there’s actually no guts inside the SOC. That hasn’t happened to us, but these are the stories I’ve read about in the press.

VentureBeat: I would hope that the U.S. government comes around and sees the need to invest in bringing a lot of this back onshore.

Lavender: The CHIPS Act — I’m sure you’re familiar with that. It’s passed the Senate. It hasn’t yet passed the House. I think it’s tied up in the politics of the current spending bill. The Biden administration is trying to put it through. Obviously we’re supporters of that. It’s as good for the industry as it is for Intel. But your guess is as good as mine about geopolitics. It’s not an area that I have any expertise in.

VentureBeat: As far as some futuristic things, I wonder if you’ve thought about some things like Web 3 and the decentralized web, whether that may come to pass or whether it needs certain investments across the industry to happen.

Lavender: There’s a lot of talk. We all think about the datacenter of the future — you may have heard us talk about going from exascale to zettascale. When you get to those scales, to zettascale, it becomes a communications issue. We’ve invested and pioneered in silicon photonics. We can get latencies over long distances down to a millisecond. That’s quite a distance you can travel at the speed of light.
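A quick back-of-envelope check of that claim (my own illustrative numbers, not Intel's): light in optical fiber covers roughly 200 kilometers per millisecond, so a one-millisecond latency budget really does span a sizable geographic distance.

```python
# Rough back-of-envelope: how far can a signal travel in 1 ms?
# Assumes light in optical fiber propagates at roughly 2/3 the vacuum speed of light.
C_VACUUM_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_FACTOR = 2 / 3              # typical refractive-index slowdown in glass fiber

def km_per_millisecond(fiber: bool = True) -> float:
    """Distance covered in one millisecond, in kilometers."""
    speed = C_VACUUM_KM_PER_S * (FIBER_FACTOR if fiber else 1.0)
    return speed / 1000.0  # 1 ms = 1/1000 s

print(f"vacuum: {km_per_millisecond(fiber=False):.0f} km/ms")  # ~300 km
print(f"fiber:  {km_per_millisecond(fiber=True):.0f} km/ms")   # ~200 km
```

In other words, a millisecond of one-way fiber latency corresponds to a link on the scale of a large metro region or between nearby datacenters.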

First off, the innovations in core networking and the edge — it’s not just 5G. I have a new Nighthawk modem from Netgear. I get 400 megabits download. It cost me 800 bucks for that device, but if you’re on a good 5G network, you see the value of it. We’re going to be close to gigabit before too much longer. 6G is going to give you much more antenna bandwidth as well. The bandwidth has to go there before all the other compute density distributes.

I think what you’re talking about is workloads moving not necessarily to the cloud, but away from the cloud and more to the edge. That’s certainly a trend. We see that in our own business and our own growth, in demand for FPGAs and our 5G technologies. Compute becomes ubiquitous. That’s what we’ve said. Network connectivity becomes pervasive. And it’s software-controlled. There has to be software to manage that level of distribution, that level of autonomy, that level of disaggregation.

Humans aren’t good at building distributed control planes. Just look at what goes on today. The security architecture that has to overlay all of that — you’ve created a massive surface area for attack vectors. Again, here at Intel we think about these things. We have the capacity and the manufacturing capability to start building prototype technology. I have Intel Labs. That’s 700 researchers. Those are areas we’re discussing as we look at our funding for the next fiscal year, to start exploring these distributed architectures. But most important, back to the software story — I can build the hardware. We can do that. It’s about how you actually manage that at zettascale.

Above: Intel is taking a systems approach to software.

Image Credit: Intel

VentureBeat: You must be happy that Windows 11 has that hardware security feature built in. I think some of these game companies are starting to realize that ring zero access for things like anti-cheat in multiplayer games is important.

Lavender: Windows 11 requires TPM. I have an old Intel NUC that I use for programming. I’ve tried to upgrade to Windows 11 and it told me I needed to buy a new one because I didn’t have the Trusted Platform Module. I asked my colleagues here when the next NUC is coming out. I don’t want to get the currently shipping one. I want one with the new chips. So I’m in line for a beta box.

I just got put onto the Open Source Security Foundation, along with the CTOs of VMware and Red Hat and HPE and Dell. We’re really going to tackle this problem for the industry in that forum. From my platform at Intel as the CTO, I want to engage with all my ecosystem partners so that we solve this problem as an industry. It’s too big a problem to solve one-off.




IDC: AI spending will reach $342B in 2021


Companies could spend nearly $342 billion on AI software, hardware, and services in 2021. That’s according to the latest edition of IDC’s Worldwide Semiannual Artificial Intelligence Tracker, which forecast that the AI market will accelerate in 2022, with 18.8% growth, and remain on track to break the $500 billion mark by 2024.
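As a sanity check on those headline figures (the dollar amounts and the 18.8% rate are from the report; the smooth-growth assumption and the solver are mine), compounding the forecast rates shows the path to $500 billion:

```python
# Sanity-check IDC's headline figures: $342B in 2021, 18.8% growth in 2022,
# and the $500B mark crossed by 2024. All values in billions of dollars.
base_2021 = 342.0
market_2022 = base_2021 * 1.188          # apply the forecast 2022 growth rate

# Implied annual growth needed over 2023-2024 to cross $500B by 2024.
needed = (500.0 / market_2022) ** (1 / 2) - 1

print(f"2022 market: ${market_2022:.0f}B")          # ~$406B
print(f"implied 2023-24 growth: {needed:.1%}/yr")   # ~11%/yr
```

The arithmetic is consistent: after the 18.8% jump in 2022, the market only needs to compound at roughly 11% a year to break $500 billion by 2024.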

For its report, IDC surveyed over 700 large enterprises across 27 countries and five rest-of-world regions. While the report suggests the competitive AI landscape remains highly fragmented, 2020 was the year that strengthened the value of enterprise AI, according to IDC’s Ritu Jyoti.

“Disruption is unsettling, but it can also serve as a catalyst for innovation and transformation. We have now entered the domain of AI-augmented work and decision-making across all the functional areas of a business,” Jyoti, group VP for AI and automation research, said in a statement. “Responsible creation and use of AI solutions that can sense, predict, respond, and adapt at speed is an important business imperative.”

Among AI software, services, and hardware, software — which includes applications such as lifecycle management and enterprise relationship management solutions, platforms, and system infrastructure controls — occupies 88% of the overall AI market, according to IDC. AI lifecycle software is anticipated to grow the most quickly within the AI platforms segment, reflecting the increased need for governance, development, and maintenance solutions. In terms of expansion, the AI hardware market is expected to grow the fastest over the next couple of years, while AI services are forecast to become the fastest-growing category from 2023 onward.

Growth in AI services

IDC estimates that the AI services market and its subcategories, IT services and business services, were worth $19.4 billion in 2020, representing the steepest uptick relative to hardware and software. As a case in point, IBM, Accenture, and Tata Consultancy Services notched over $1 billion in revenues in 2020. For 2021, AI services are forecast to grow at 19.3%, according to IDC, reaching a compound annual growth rate (CAGR) of 21% over the next five years.

AI hardware — specifically servers and storage — represents the smallest portion of the larger AI market, with a 5% share, IDC found. Nonetheless, it’s projected to grow the fastest in 2021, at 29.6% year over year — and to hold the best growth spot in 2022. Over the next five years, IDC estimates AI hardware will hit a 19.4% CAGR, with companies like Dell, HPE, Huawei, IBM, Inspur, and Lenovo poised to be the big winners. In 2020, each of the companies generated over $500 million in the AI server market, the IDC report notes.
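IDC’s projections above are all applications of the standard compound-annual-growth-rate formula. As a rough sanity check (a sketch, not from the report itself), compounding the quoted base figures at the quoted rates reproduces the forecast numbers:

```python
def project(value: float, rate: float, years: int) -> float:
    """Compound a starting value at a constant annual growth rate (CAGR)."""
    return value * (1 + rate) ** years

# IDC figures: AI services worth $19.4B in 2020, at a 21% CAGR
# over the following five years.
services_2025 = project(19.4, 0.21, 5)

# Overall AI market: ~$342B in 2021, with 18.8% growth forecast for 2022.
market_2022 = project(342, 0.188, 1)

print(f"AI services, 2025: ~${services_2025:.1f}B")      # ~$50.3B
print(f"Overall AI market, 2022: ~${market_2022:.1f}B")  # ~$406.3B
```

Carrying the overall market forward at roughly 19% a year pushes the total past the $500 billion mark during 2024, consistent with IDC’s forecast.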


Other reports agree with IDC’s top-level finding: AI technologies are becoming prevalent in enterprises around the world. While the adoption rate varies between businesses, a majority of them — 95% in a recent S&P Global report — consider AI to be important in their digital transformation efforts. The benefits could be enormous. McKinsey predicts automation alone could raise productivity growth globally by 0.8% to 1.4% annually.

“AI has emerged as an essential component of the future enterprise, fueling demand for services partners to help organizations clear the many hurdles standing between pilot projects and enterprise AI,” IDC research manager Jennifer Hamel said in a press release. “Client demand for expertise in developing production-grade AI solutions and establishing the right organization, platform, governance, business process, and talent strategies to ensure sustainable AI adoption at scale drives expansion across both IT services and business services segments.”


VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative technology and transact.

Our site delivers essential information on data technologies and strategies to guide you as you lead your organizations. We invite you to become a member of our community, to access:

  • up-to-date information on the subjects of interest to you
  • our newsletters
  • gated thought-leader content and discounted access to our prized events, such as Transform 2021: Learn More
  • networking features, and more

Become a member


Xbox family app now lets parents set spending limits

You won’t have to worry about your kids going on a game shopping spree, at least if you’re part of a Windows or Xbox household. Microsoft has updated the Xbox Family Settings app with controls to manage your kids’ spending. You can set spending limits, and require that children ask permission to buy content when they don’t have the funds in their account. And if you’re wondering what your young ones bought, you can check their spending history.

Microsoft pitches this as a way not just to prevent out-of-control spending, but to reward good behavior. You can top up an account when your child finishes their chores, or reward them with money for Minecraft extras when they ace a test.

The refreshed Family Settings app is available now for Android and iOS. The spending tools aren’t exactly novel concepts, but they could make all the difference if you want to teach your kids better spending habits — or at least, save yourself from unpleasant credit card bills.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.


Xbox Family Settings app can now manage kids’ spending

Gaming consoles like the Xbox are often thought of as solitary machines for solitary gamers, but that hasn’t been the case for years. Consoles have become a way for families to spend time together at home and for kids to develop skills through safe gaming. The latter is an important consideration for parents who want to give their kids access to consoles and the games available for them. The Xbox Family Settings app tries to make that task less of a chore, and its newest features promise to save parents money, or at least prevent them from accidentally losing some.

In-app purchases and accidental game purchases aren’t just the bane of mobile gaming; they can happen on any gaming platform and to any age group. Xbox has provided parents and guardians with ways to monitor and manage a child’s use of the console, but parents still need more control and safeguards, especially when money is involved. The latest update to the parental control app for Android and iOS finally delivers that, allowing parents to monitor and limit spending on games.

Parents will be able to set limits on how much a child can spend when buying games or making in-app purchases. Of course, that presupposes that kids have a sort of “wallet” to buy from. The app does let parents add to and view a child’s balance, allowing them to reward kids for good grades or chores well done.

In case their money does run out, kids can also ask their parents if they can buy a game. Parents can either buy it on their behalf or give them additional money for it. Of course, they can also deny the request.

Parents can also keep track of what their kids buy with the money given to them, in case they trust them with the freedom to make purchases without asking every time. The app’s new features not only empower parents but also let children make their own decisions and earn enough trust to gain more freedom in their gaming choices.


Sony’s PlayStation Studios spending spree may not be over yet

Earlier today, we learned that Sony is buying Returnal developer Housemarque. That’s big enough news on its own, but it turns out that Housemarque may not be the only studio Sony is looking to absorb. Around the time the Housemarque announcement went live this morning, PlayStation Japan accidentally leaked what appears to be another major acquisition for Sony.

Apparently, Sony is also cooking up the acquisition of Bluepoint Games. When it was announcing the acquisition of Housemarque, the PlayStation Japan Twitter account accidentally uploaded an image welcoming Bluepoint Games to the PlayStation family. The tweet was quickly pulled, but not before that image was grabbed by Nibellion on Twitter.

So, it would seem that Sony is planning to announce an acquisition of Bluepoint Games shortly. Of course, that announcement may not come today – Sony might want to spread these acquisition announcements out a little bit – but given that an image welcoming Bluepoint to PlayStation Studios has already been created, confirmation probably isn’t far off.

A Sony acquisition of Bluepoint Games makes a lot of sense. For more than a decade, Bluepoint has basically acted as Sony’s go-to for PlayStation remakes and remasters. Some of the games the studio is responsible for include Uncharted: The Nathan Drake Collection, Gravity Rush Remastered, the Shadow of the Colossus remake for PS4, and the Demon’s Souls remake for PlayStation 5 (which was co-developed with Sony’s Japan Studio).

In other words, if Sony does announce that it’s buying Bluepoint Games, it won’t be much of a shock since Bluepoint has worked with Sony a lot throughout its history. We’ll see if this acquisition is confirmed later today or later on in the week, so stay tuned for more.


U.S. Senate committee to consider technology research spending bill

(Reuters) — A U.S. legislative proposal to allocate about $110 billion for basic and advanced technology research and science in the face of rising competitive pressure from China will be debated by the Senate Commerce Committee on May 12, sources said on Wednesday.

The bipartisan “Endless Frontier” bill would authorize most of the money, $100 billion, over five years to invest in basic and advanced research, commercialization, and education and training programs in key technology areas, including artificial intelligence, semiconductors, quantum computing, advanced communications, biotechnology and advanced energy.

The bill had been expected to be considered on April 28, but was delayed after more than 230 amendments were filed for consideration. Senate Democrats and Republicans are moving closer to reaching agreement. A congressional aide said “there has been very encouraging progress toward a deal.”

The measure, sponsored by Senate Democratic Leader Chuck Schumer, Republican Senator Todd Young and others, would also authorize another $10 billion to designate at least 10 regional technology hubs and create a supply chain crisis-response program to address issues like the shortfall in semiconductor chips harming auto production.

Many lawmakers want to use the legislation to advance other priorities and attach additional proposals, and some sought to use the bill to speed the deployment of thousands of self-driving cars.

Republican Representative Mike Gallagher, another sponsor, warned earlier that U.S. superiority in science and technology is at risk. “The Chinese Communist Party has used decades of intellectual property theft and industrial espionage to close this technological gap in a way that threatens not only our economic security, but also our way of life,” he said.

Senator John Cornyn, a Republican, said lawmakers are also likely this month to vote on approving at least $37 billion to fund programs to boost U.S. semiconductor production that was authorized under a law enacted in January.



Apple is spending $430 billion more on 5G and silicon: Here’s why

On the heels of its Spring Forward event last week, Apple has announced that it will commit $430 billion to US investments over the next five years. While these investments will support the engineering of its own in-house silicon and the build-out of 5G, Apple also says it will add 20,000 jobs across the country in the same time period. Today, the company listed a number of states where it will focus its efforts.

In its announcement today, Apple said that its $430 billion in investments will include “direct spend with American suppliers, data center investments, capital expenditures in the US, and other domestic spend — including dozens of Apple TV+ productions across 20 states, creating thousands of jobs and supporting the creative industry.”

One state that will get a lot of attention from Apple is North Carolina, where Apple plans to spend $1 billion to open a new campus and engineering hub in the state’s Research Triangle area in the Raleigh-Durham region. Apple also says that it will “establish a $100 million fund to support schools and community initiatives in the greater Raleigh-Durham area and across the state,” in addition to contributing $110 million in infrastructure spending across the state.

The company then detailed its plans for several states, saying that it will grow its teams in San Diego, Culver City, Boulder, Boston, and Seattle in addition to building a new campus in Austin (where construction is already underway) and a new data center in Waukee, Iowa. Back in 2018, Apple said that it would add 20,000 jobs in the US by 2023, and while the company says it’s on track to meet that goal, it should be noted that today’s announced 20,000 jobs over the next five years are in addition to the 20,000 it committed to in 2018.

Apple’s plans also include partnering with other companies through its Advanced Manufacturing Fund. Some of the partnerships Apple detailed today include investments with XPO Logistics in Indiana, Corning in Kentucky, II-VI in Texas, and silicon engineering and 5G investments in California, Colorado, Maine, Massachusetts, New York, Oregon, Texas, Vermont, and Washington.

That last bit about silicon engineering is important, because these days, Apple is rolling with its own in-house CPUs in a number of devices like the MacBook, Mac mini, and now the iMac, and we expect that to expand more in the future too. So, these investments in silicon engineering will probably mean big things for Apple in the long run. You can read more about Apple’s investment plans over on the company’s website.

The data broker industry is spending big bucks lobbying Congress

The data brokers who’ve made fortunes from collecting and sharing millions of people’s personal information tend to fly under the radar. Names like LiveRamp or RELX might not be familiar to most Americans, but they’re making themselves known on Capitol Hill.

Collectively, data broker spending on lobbying in 2020 rivaled the spending of individual Big Tech firms like Facebook and Google. The Markup searched lobbying disclosures in the U.S. Senate’s Lobbying Disclosure Act database and the watchdog Center for Responsive Politics’ tool OpenSecrets for the names of companies that registered as data brokers in Vermont or California. Those states are the only two that require companies to annually disclose that they collect, sell, or share people’s personal information without having a direct relationship to them.

All in all, we found 25 companies whose combined spending on federal lobbying totaled $29 million in 2020. Many of the top spenders were not pure data brokers but companies that nonetheless have massive data operations. Oracle, which has spent the past decade acquiring companies that collect data, spent the most by far, with disclosure documents showing $9,570,000 spent on federal lobbying.

For comparison, of the Big Tech firms with heavy lobbying presences, Facebook spent $19,680,000, Amazon $18,725,000, and Google $8,850,000 in the same period, according to the Center for Responsive Politics. Public Citizen, a consumer advocacy group, found that Big Tech spent $108 million collectively on lobbying in 2020.

Oracle has its own data collection arm but has also built its portfolio by buying up companies like DataRaker, Compendium, and Crosswise. The companies, which were acquired in 2012, 2013, and 2016, respectively, take data from a variety of sources. DataRaker gets data from millions of smart meters and sensors for utilities companies, while Compendium delivers targeted ads. Crosswise allows Oracle to track people across devices, claiming to process data from billions of devices every month.

In 2014, Oracle also acquired Datalogix, which connected offline purchases to online profiles. Additionally, Oracle combines datasets from more than 75 other data brokers, which it calls “the world’s largest collection of third-party data.”

Our report comes as the data broker industry is not only growing but also facing scrutiny for the first time. California recently passed a statewide privacy law that establishes an agency focused on regulating data privacy issues. Virginia and Maine have also passed regulations to protect people’s online information.

California’s and Virginia’s privacy laws hit at the core of what data brokers do by requiring companies to delete data collected about people upon request and allowing people to prevent their personal information from being used for targeted advertising. Maine’s privacy law prevents internet service providers from sharing personal information with data brokers until people give “express, affirmative consent.”

And the past year also saw concerns raised about how well these brokers protect their massive troves of data—Oracle, for instance, suffered a data breach after security researchers found billions of records from its BlueKai data collection were left exposed on a server.

The Markup contacted all 25 companies for comment on their lobbying activities. Several companies, like Inmar Intelligence and LiveRamp, denied being data brokers, though they had self-identified as such to California and/or Vermont regulators.

“Inmar Intelligence does not generally consider itself a data broker though one of our entities, Inmar-OIQ, LLC, is registered as such in a couple of states,” Holly Pavlika, a corporate marketing senior vice president for Inmar Intelligence, said.

The industry itself is hard to define—many companies, including tech giants, make money off of personal data, though the technical details of how they use that data vary. So The Markup relied on companies that self-reported to Vermont and California as members of the industry. The list, 480 companies long, shows just how pervasive it has become for companies to traffic in personal information.

The list includes businesses that are primarily data brokers, like CoreLogic, which claims to have collected data on 99 percent of property and homeowners in the U.S. and spent $215,000 on lobbying in 2020; and Acxiom, which spent $360,000 on lobbying in 2020 for issues related to data security and privacy. Acxiom’s InfoBase boasts of datasets on more than 250 million people and collects information including location data, purchases, interests, life events, and behaviors, according to its marketing material.

Our list also includes credit monitoring services like TransUnion, Equifax, and Experian, which each spent about $1.4 million lobbying in 2020 on issues related to credit score reform, like the Protecting Your Credit Score Act of 2020, the Credit Access and Inclusion Act of 2019, and the Clarity in Credit Score Formation Act of 2019. TransUnion also owns subsidiaries like Callcredit, Iovation, and Signal, which has been used for collecting and profiling users for gambling apps, according to The New York Times.

Some registered data brokers also didn’t lobby specifically on privacy issues. Refinitiv, which collects data on people for its risk assessment tool, spent $120,000 lobbying on banking issues, the National Defense Authorization Act for Fiscal Year 2021, and cybersecurity legislation.

Notably, our tally also does not include lobbying by trade associations that data brokers are a part of, such as the Interactive Advertising Bureau and the Digital Advertising Alliance.

Oracle, the biggest spender, used lobbyists to advocate on issues like annual defense and intelligence budgets and “competition and antitrust in the mobile telecommunications and digital advertising industries,” according to public records.

The company didn’t respond to requests for comment.

The second largest spender was Accenture—a technology and consulting company that boasts a marketing and analytics branch called Accenture Interactive. According to its marketing materials, the company can combine datasets from sources including people’s purchase history and location to help its clients build customer profiles for advertising.

In 2020, the company spent $3,250,000 lobbying on issues like artificial intelligence, the Digital Dollar, and COVID-19 contact tracing, according to public records.

When The Markup reached out to Accenture, the company pointed us to two statements CEO Julie Sweet made on data privacy legislation in 2018. The statements called for a federal privacy law that would preempt state laws.

PricewaterhouseCoopers, a major accounting firm, spent $2,820,000 lobbying the federal government in 2020, third most among registered data brokers.

PWC collects data from advertising networks and data analytics partners including personal information like names and addresses, which it uses to help clients personalize advertising campaigns, according to the company’s privacy statement.

Spending went to monitoring privacy legislation and accounting and auditing issues.

PWC didn’t respond to a request for comment.

Other big spenders included Deloitte, which was awarded a $106 million Department of Defense contract to build a development environment for the Joint Artificial Intelligence Center this summer. The company spent $2,400,000 on federal lobbying in 2020, including on bills like the Artificial Intelligence in Government Act and the Artificial Intelligence Initiative Act.

Deloitte is an auditing and advisory company but provides services like PredictRisk, which uses information including people’s hobbies, interests, and financial data provided by data brokers to generate a health risk prediction score and help life insurance companies figure out how likely people are to buy their products.

Another big spender was RELX, which spent $2,375,000 on federal lobbying in 2020, including on data privacy bills like the Data Accountability and Trust Act, the Information Transparency & Personal Data Control Act, and the Data Broker Accountability and Transparency Act, as well as on bills related to data privacy and COVID-19, including the COVID-19 Consumer Data Protection Act and the Exposure Notification Privacy Act.

RELX is a major data broker that owns companies like LexisNexis and ThreatMetrix, which has customers in law enforcement, insurance, and financial services. ThreatMetrix alone boasts tracking on 4.5 billion devices, according to a statement from RELX in 2018.

Deloitte and RELX didn’t respond to a request for comment.

An industry increasingly under fire

Some of the bills targeted by lobbyists would have regulated the data broker industry, though disclosures do not specify whether they lobbied for or against the bills.

The Data Accountability and Trust Act looked to establish security standards and require postbreach audits for data brokers and also prohibit collecting information under false pretenses. The Information Transparency & Personal Data Control Act would have required data brokers to get consent to collect sensitive data and go through an annual privacy audit.

The Data Broker Accountability and Transparency Act of 2020 would have mandated opt-outs from data brokers and required the FTC to create a national list of data brokers.

While lobbying records don’t always list specific bills, filings show that RELX paid lobbyists to address all three bills. Deloitte records show that some of its lobbying efforts went toward addressing the Data Accountability and Trust Act.

Both COVID-19 data privacy bills looked to provide stronger controls over data related to contact tracing.

None of the bills named in this story passed Congress, except for the National Defense Authorization Act, which was enacted.

Some of the companies that showed up in lobbying records faced other sorts of pressure during 2020.

For instance, Apple and Google banned X-Mode’s trackers from their app stores last December following a series of reports from Motherboard on how the location data broker was providing information to contractors who passed that information to the U.S. military. X-Mode spent $30,000 on lobbying in the last quarter of 2020, during the height of the public pressure.

Both X-Mode and Venntel, another location data broker, which spent $160,000 on lobbying, are facing scrutiny from Congress over their location data sales. Sen. Ron Wyden, a Democrat from Oregon who signed on to a letter calling for an investigation on Venntel, called the data broker industry “out of control,” in a statement to The Markup.

“Americans are learning more every day about the secretive and shady data broker industry and they’re demanding new laws to protect our privacy,” Wyden said in the email to The Markup. “It’s no surprise data brokers are trying every avenue they can think of to ward off the common sense protections Americans desperately need.”

X-Mode and Venntel didn’t respond to a request for comment.

Some companies said they were lobbying to have a voice on legislation, including potential federal laws on privacy.

“With increased momentum, attention and legislative activity in multiple states, we support a federal data privacy law that can rebalance the system and set standards that rebuild trust with the people providing the data—consumers,” Christine Travis, a senior communications director for LiveRamp, said in an email. “A federal approach to data privacy and security is better than a patchwork of state laws for all stakeholders.”

LiveRamp was the top lobbying spender among companies whose primary focus is collecting data for advertising purposes. The company, which connects people’s activity across the web and from offline purchases for advertisers, spent $630,000 on lobbying in 2020 on issues such as the Augmenting Compatibility and Competition by Enabling Service Switching (ACCESS) Act and Privacy Shield.

Privacy Shield was a framework for transferring data between the European Union and the U.S. but was declared invalid last July for failing to meet the EU’s privacy regulations.

LiveRamp collects its datasets from a multitude of sources, including credit card transactions and location data, and connects it to people’s online profiles for advertisers. The company claims to have data on more than 250 million Americans.

Travis denied that LiveRamp is a data broker despite being listed on California’s data broker registry, insisting instead that the company is a “data connectivity platform.”

Experts said scrutiny and oversight are fairly new to the data broker industry.

“When somebody shows up on the lobbying records, or in meetings or in trade associations, it just tells me they’ve probably recently woken up to this issue and they see a real threat to their business model by privacy regulations,” Hayley Tsukayama, a legislative activist for the Electronic Frontier Foundation, said.

Others, however, said Congress is not doing nearly enough to regulate the unwieldy industry.

“What is widely understood now in Congress and among independent agencies like the FTC are the ways in which large tech companies like Facebook and Google are extracting data from consumers and using that to monopolize markets like the advertising industry,” Jane Chung, a Big Tech accountability advocate at the consumer advocacy group Public Citizen, said. “What a lot of people don’t understand is how data brokers fit into that ecosystem.”

This article by Alfred Ng and Maddy Varner was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

How to preorder the Samsung Galaxy S20 without spending an arm and a leg

Update 2 p.m.: Added Amazon’s preorder bundle.

For those of you who have been counting the days until you could preorder a Galaxy S20, the wait is finally over. Starting today, you can get in line to own Samsung’s newest high-priced premium phone when they hit shelves on Friday, March 6.

But before we get into the new hotness, a quick PSA: Don’t sleep on 2019’s Galaxy S10. Samsung has slashed the price of all three models by $150 in its store, making it one of the best premium values you can find. It might not be the shiny new thing, but it’s an excellent phone with a great camera and a fantastic screen.

And even if your heart is set on the S20, you don’t need to spend quite as much as Samsung is asking. Lots of stores and carriers are offering great preorder deals on the Galaxy S20, and we’ve rounded them all up:

Samsung Store

Samsung is offering the best trade-in deals we’ve seen so far, with up to $700 for a new Samsung or Apple device and up to $600 for a Pixel phone, as well as interest-free financing for 36 months. Additionally, Samsung is offering up to $200 in Samsung Store credit, redeemable for a pair of Galaxy Buds+ or other accessories, on new S20 purchases made at its own store and most other outlets.


Verizon

Verizon isn’t selling the smallest S20 yet (a version that supports mmWave is coming in the second quarter), but new and existing customers who buy an S20+ will save $150 off the $1,000 price tag, spread out over 24 monthly bill credits. Existing customers on an unlimited plan can save more: Verizon is offering up to $300 if you upgrade to a new Galaxy S20 and trade in your existing smartphone, with the amount depending on the make and model of the old phone. You’ll also get six free months of the Hatch streaming game service.

Finally, anyone who buys an S20+ or S20 Ultra can get $1,050 towards the purchase of another Galaxy S20+ or S20 Ultra when they add a new line and purchase the phone on a Verizon Device Payment plan.
