
Use AI to weather the upcoming data deprecation storm

This article was contributed by Tara DeZao, product marketing director, AdTech and MarTech at Pega.

Last summer’s announcement that Google’s third-party cookie deprecation had been pushed out to 2023 drew cheers, yawns, and everything in between. But the question remains: what will advertisers and marketers do with the extra time? Will they squander it and resist evolution until the bitter end? Or will they use it to make real changes in pursuit of better business outcomes and improved customer experiences?

There’s a scene in “My So-Called Life,” the ’90s TV drama that followed a group of angsty teens as they navigated the pitfalls of a suburban high school. The protagonist, Angela Chase, is huddled in the girls’ bathroom, frantically studying for a test and berating herself for not starting sooner. Suddenly, another student bursts through the door and triumphantly shouts, “The copy machine ate our geometry test!” Given the time period and its lack of digitization, this meant the girls huddled in the bathroom had a one-day reprieve, and they all cheered, grateful for the extra study time.

We’ve now arrived at our “the copy machine ate our geometry test” moment. This tipping point has been a long time coming, whether or not Google pulled the plug on cookies in Chrome. Even without formal data deprecation, consumers have been opting out of third-party cookie tracking and data storage amid ongoing privacy concerns for the past several years, and government regulations around the globe have emerged in response to those concerns. Browser and operating system restrictions, along with tightening data portability rules within the walled gardens, have signaled major disruption to the digital marketing environment for years. That is why closing the door on legacy data practices is a golden opportunity to deliver better customer experiences, a goal that continues to elude many of us in the existing landscape.

Overall, consumers aren’t happy with the state of the union between themselves and the digital marketing industry. We struggle with frequency, relevance, context, and personalization in our engagement with them: we reach out too often, with irrelevant content, at the wrong times, and we deliver either too little or too much personalization. We’ve been getting it wrong for quite some time, and the number of consumers who’ve installed ad blockers since 2019 proves it: Statista reported in March that there are more than 763.5 million ad blocker users worldwide. We’re driving consumers away from wanting to have conversations with us at all.

But it’s not that they don’t want to hear from us; it’s the way we’re speaking to them that isn’t working. When there is a true value exchange, consumers welcome engagement. According to Merkle’s “Consumer Experience Sentiment Report,” released in April, which examined how consumers feel about brands using their data in digital marketing and advertising, about half of adult US internet users said that when brands use their data in advertising, it helps them discover (50%) and find (49%) products and services that interest them. Even so, 44% said the practice often feels invasive.

Undoubtedly, there will be brands and other organizations in the digital ecosystem that resist change and hold onto legacy practices until the bitter end of data deprecation. But if you want to be in the better outcomes and experiences camp, now is the time, even if you haven’t started your transformation. Here’s how to get started.

Commit to better privacy and transparency on all your channels

This may seem obvious, but many organizations do not lead with consumer trust, privacy, and transparency in their digital marketing and advertising. Even though it lives in the fine print, respecting consumer privacy always makes for a better outcome. Always be transparent about how you use customer data, be responsive when a consumer asks you for information, and deliver clear, understandable explanations of the types of data you collect and how you use them.

Audit your data strategy and MarTech stack

Many current MarTech stacks are a jumble of point solutions built to deploy segment-based, fixed marketing campaigns, and as such they draw on many kinds of data. Spend the time to audit your exposure: you might have external partners in your media ecosystem without even knowing what types of data segments they use. What do your data usage, audience targeting, and search capabilities look like? What steps can you take, in a data-deprecated future, to deploy new types of targeting, modeling, and beyond that aren’t reliant on third-party data and technology? Are there new partnerships you can create to access the data you need to round out your strategy? Do some of your technologies need to be replaced?

Various cookie alternatives are emerging to fill the data gap looming for many organizations, but these solutions will not be like-for-like replacements. Because the alternatives remain largely unproven, it becomes even more essential to identify where your business needs to strengthen or replace existing capabilities. For example, in 2020 Forrester Research defined a new type of data, zero-party data, as “data that a customer intentionally and proactively shares with a brand, which can include preference center data, purchase intentions, personal context and how the individual wants the brand to recognize her.” It is essentially a subset of first-party data that gives its users broader contextual signals about a consumer, with their consent. But this approach assumes that consumers will want to participate, and at the scale advertisers need to make the investment worth the reach. Also positioned as a viable alternative are data cohorts, in which organizations share first-party data to create interest-based segments in privacy-compliant ways; Google’s FLoC is an example. The efficacy of this approach is yet to be determined.
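
To make the zero-party idea concrete, here is a minimal sketch of how a preference-center submission might be folded into a customer profile with explicit consent. The field names and structure are hypothetical, invented for illustration rather than taken from any vendor’s schema.

```python
# Hypothetical sketch: zero-party data is what the customer volunteers,
# stored alongside first-party data observed on our own properties.
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    customer_id: str
    observed_interests: list = field(default_factory=list)   # first-party: observed behavior
    stated_preferences: dict = field(default_factory=dict)   # zero-party: volunteered by the customer
    consent: dict = field(default_factory=dict)

def record_preference_submission(profile, submission, consented):
    """Store a preference-center submission only if the customer consented."""
    if not consented:
        return profile   # no consent, nothing is stored
    profile.stated_preferences.update(submission)
    profile.consent["preference_center"] = True
    return profile

profile = CustomerProfile("c-123", observed_interests=["running shoes"])
profile = record_preference_submission(
    profile,
    {"channel": "email", "frequency": "weekly", "purchase_intent": "marathon gear"},
    consented=True,
)
print(profile.stated_preferences)
```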

Invest in AI and centralize all your data

Today’s consumers move through the web at an accelerated pace, constantly changing their context. If we as marketers don’t move with them, we miss the opportunity to connect. Fixed campaigns and rules-based customer journeys aren’t about what the customer needs or wants in the moment; they are about what products and services we want to talk about. The only way to keep up with both the consumer and the industry as it evolves is to implement solutions built on AI and machine learning.

Artificial intelligence-backed marketing and advertising solutions can ingest your customers’ signals as they move through the web and interact with your brand, then deliver highly relevant, well-timed interactions. Most leverage either predictive or adaptive modeling to anticipate consumer behavior, even in the face of a data deprecation storm; the most accurate tools leverage both. Predictive modeling and analytics are not new to marketing and advertising technology: predictive analytics ingests data from historical customer interactions to forecast future behavior, and the models are typically built by data scientists. Adaptive modeling, conversely, starts with no data inputs and learns in real time from data gathered during customer interactions; models auto-create and optimize on the fly. The key to hyper-personalized, perfectly timed customer engagement is the capability to ingest immediate inputs and deliver an interaction in that moment.
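
To make the predictive-versus-adaptive distinction concrete, here is a minimal sketch of an adaptive model in the sense described above: a tiny online logistic regression that starts with zero training data and updates its weights after every interaction. It illustrates the general technique only, not any particular vendor’s implementation.

```python
import math

class OnlineLogisticModel:
    """A tiny adaptive model: starts untrained and learns one interaction at a time."""

    def __init__(self, n_features, learning_rate=0.1):
        self.weights = [0.0] * n_features  # no historical data required
        self.lr = learning_rate

    def predict(self, features):
        """Probability that the customer responds (e.g., clicks the offer)."""
        z = sum(w * x for w, x in zip(self.weights, features))
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, features, responded):
        """Adjust weights immediately after observing the outcome."""
        error = (1.0 if responded else 0.0) - self.predict(features)
        self.weights = [w + self.lr * error * x
                        for w, x in zip(self.weights, features)]

# Each interaction: score in the moment, then learn from the outcome on the fly.
model = OnlineLogisticModel(n_features=3)
interactions = [([1, 0, 1], True), ([0, 1, 1], False), ([1, 1, 0], True)]
for features, responded in interactions:
    p = model.predict(features)        # decide what to offer right now
    model.update(features, responded)  # the model optimizes itself as data arrives
```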

These capabilities greatly reduce, and in some cases eliminate, the need for external data sources, especially because AI empowers businesses to aggregate and centralize data from across the organization for use in marketing initiatives. We keep hearing that brands in various industries don’t have access to enough first-party data to survive in a post-cookie digital landscape, but many organizations are sitting on a trove of it. Because that data may sit outside of marketing, in areas of the business such as finance, customer service, and operations, and because organizations lack solutions that can stitch it together, it goes to waste.

Making these changes may seem daunting, and they are: major strategic and organizational shifts are a heavy lift. But the delay has given us the gift of time to make them, and because we aren’t time-crunched, we can do it right and get away from the cobbled-together stacks of the past. We are in the middle of a digital revolution, and the time brands have to make the big changes needed to weather the coming data deprecation storm is running out. And isn’t it always better to fix the roof while it’s dry outside?

Tara DeZao, product marketing director, AdTech and MarTech at Pega.



How Pizza Hut aims to turn weather forecasts into sales

It’s not enough to be first — just ask tech giants like Netscape and Friendster. And Pizza Hut.

The deep-dish chain was first to online food ordering, an industry projected to hit $126.91 billion this year. But it didn’t keep pace with innovation and was later eclipsed by competitors. Now, nearly three decades later, Pizza Hut knows it’s time to get serious. The company is looking to data, analytics, and AI to learn more about its customers in order to boost digital experiences and sales.

So while the other aforementioned early entrants are no more, Pizza Hut is still spinning up pies, and there are lessons to be learned from its game of catchup. To find out more about the company’s approach to data, its partnerships, and why it chose build over buy for its machine learning technologies, we chatted with Tristan Burns, Pizza Hut’s global head of analytics.

This interview has been edited for brevity and clarity.

VentureBeat: Pizza Hut was fairly early to online ordering and digital customer experiences. How has the vision and approach evolved over time?

Tristan Burns: You’re right — Pizza Hut was the first brand to create an online ordering experience. That was back in 1994, in California. You could submit an order online, it would end up in a store, and it’d be prepared and sent to your house, which was pretty cool. And while Pizza Hut was quite early to the ecommerce game, I think no one would mind me saying we were kind of eclipsed by Domino’s in the 2000s. They came out swinging, saying they were a tech company that makes pizzas and with some pretty innovative technologies. Now Pizza Hut Digital Ventures, the organization I work for that is specific to Pizza Hut International, is taking a tech-first approach to redesigning, reimagining, and recreating our ecommerce capabilities. We’re in the process of building and scaling out some pretty robust solutions. It’s a very, very data-centric and very customer-centric approach.

VentureBeat: Are you using AI and machine learning as well? What does that technology look like, and what role is it playing in the organization, specifically in regards to personalization and customer experiences?

Burns: Definitely. We’re in the early stages of an AI journey. And part of our machine learning program is to ingest customer behavior and a little bit about who customers are, where in the world they are, what the weather might be at their location, and then surface relevant product recommendations to them during their experience. We’re still early days in that process, but we’re building in-house capabilities to own it and with the hope that we can do a better job and have more specific outcomes. I think there are limitations when you use an off-the-shelf platform, and because we’re global and working across multiple different regions of the world, we have to be pretty nimble in how we implement and use AI. So those nuances and specifics mean we’re going to have a lot more flexibility if we own the experience and the platform.
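
As a rough illustration of the kind of context-aware recommendation Burns describes, here is a minimal sketch of a scorer that nudges menu suggestions based on weather and time of day. The menu items, signals, and weights are all invented for illustration; Pizza Hut’s in-house system has not been made public.

```python
# Hypothetical sketch of context-aware product recommendation.
# All items, signals, and weights are invented for illustration.

MENU = ["classic pepperoni", "hot wings", "ice cream sundae", "hearty meat feast"]

def score_item(item, context):
    """Score a menu item given contextual signals about the customer."""
    score = 1.0
    if context["weather"] == "cold" and item == "hearty meat feast":
        score += 0.5  # comfort food tends to do better in cold weather
    if context["weather"] == "hot" and item == "ice cream sundae":
        score += 0.5
    if context["hour"] >= 18 and item == "classic pepperoni":
        score += 0.3  # dinner-time staple
    return score

def recommend(context, top_n=2):
    """Surface the highest-scoring items for the customer's current context."""
    return sorted(MENU, key=lambda item: score_item(item, context), reverse=True)[:top_n]

print(recommend({"weather": "cold", "hour": 19}))
```

In a real system, the hand-tuned weights above would be learned by the machine learning models Burns mentions, but the shape of the problem (contextual signals in, ranked products out) is the same.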

VentureBeat: What are the more recent challenges the company’s been facing in terms of data and analytics? What have you been looking to execute or improve?

Burns: So we’re very much trying to improve our daily decision-making, and conversion rate optimization (CRO) is a huge part of that. We’re probably in the early to mid stages of a new CRO-led approach to designing our digital experiences. We have a lot of stakeholders, so we’ve had a lot of input from various corners on how the experiences should work and how we should go about building them. But we’re in a position now where we have to be really conscious of data and what the customer needs, and experimentation is a really big part of that for us now going forward. We’re becoming a lot more mature in making sure that we test and validate with data and user research.

VentureBeat: I know you tapped digital analytics company Contentsquare as part of these efforts. Why did you seek an outside partnership? What is it enabling you to do?

Burns: It’s been almost two years with them now, and I saw that the opportunity and the capabilities of what they were trying to do would just be so effective in getting to the bottom of the problems we were experiencing. We had a lot of what the problem was, but we didn’t have a why. And Contentsquare gave us the opportunity to kind of look right into the customers’ behaviors and get a far better understanding of what they’re doing on our platform. Now it primarily supports our CRO programs, but it also allows us to come up with and test ideas really efficiently. We can see customers do something unexpected or that might not be optimal and then run tests to see if we actually solved the problem.

VentureBeat: What more specifically are the capabilities you’re referring to? And can you give a specific example or anecdote of how you use the technology and the results you’ve seen?

Burns: Personally, I’m a big fan of Contentsquare’s page comparator, where you can superimpose click rate, scroll rate, and attractiveness rate metrics on top of your experience. And one great example was when we saw that customers were not immediately clicking on our deals page. It became clear they weren’t sure where they needed to click, or even whether the deal cards themselves were clickable. We hypothesized it was because we didn’t have a CTA (call to action), so we ran a test and saw a phenomenal increase in the rate at which customers added those deals to their baskets. We estimated an almost $7 million to $8 million uplift in sales if we were to extrapolate the performance we saw over a 12-month period.
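
The arithmetic behind that kind of extrapolation is straightforward: multiply the extra conversions a test implies by order value and by 52 weeks. The sketch below uses invented inputs, since the interview does not disclose Pizza Hut’s real traffic, order values, or measured lift.

```python
# Hypothetical extrapolation of an A/B test to a 12-month sales uplift.
# Every input below is invented; the real figures are not disclosed.

weekly_visitors = 500_000      # visitors who see the deals page
baseline_add_rate = 0.040      # share who add a deal before the CTA change
relative_lift = 0.10           # 10% relative improvement measured in the test
avg_order_value = 22.50        # dollars per added deal

extra_orders_per_week = weekly_visitors * baseline_add_rate * relative_lift
annual_uplift = extra_orders_per_week * avg_order_value * 52
print(f"Estimated 12-month uplift: ${annual_uplift:,.0f}")
# With these made-up inputs, about $2.3M; every input scales the estimate
# linearly, which is how one page change can extrapolate to millions.
```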

VentureBeat: What are the top considerations you found are important when applying data and analytics to customer experiences?

Burns: I think data is fantastic at telling you where you might have a problem and what customers are doing, but I believe you always have to supplement that with user research and insight to really get the full picture. So for any problem data analysis surfaces, we also want to attack it from a different angle. And if we see a problem in both the data and the research, that’s something we should look into solving.

VentureBeat: Is there anything else you think is important to know about all this?

Burns: One thing is the role of data within an organization, and the power that tools such as Contentsquare can have to support the democratization and communication of data insight across less data-focused teams, as well as leadership and other stakeholders. Traditionally, I think data people are not seen as phenomenal communicators: you know, diving deep into a spreadsheet, coming up for short breaks, and then it’s back into the spreadsheets. And we’ve not always looked to hire or to focus on fantastic communicators within the data space. But I think that as data takes on a much more central role within organizations, it’s going to be really crucial for companies to think of their data people as strong communicators.



Extreme weather: A drone’s worst enemy

Small aerial drones are touted as a disruptive technology, with massive investment and hype surrounding their use. They may deliver our morning coffee, pizzas, time-sensitive medical supplies, and Amazon orders.

These on-demand drone applications require a high proportion of uptime, or flyability: the amount of time when drones can fly safely. But a key factor often overlooked in the hype around on-demand drone applications is the weather: drones cannot and should not fly in all conditions.

The weather conditions most likely to prevent drone use are precipitation (which can damage electronics), strong winds (which can increase battery usage or even cause drones to lose control), and cold temperatures (which greatly reduce battery performance).
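
In practice, a flyability check boils down to comparing forecast values against a drone’s operating limits, with overall flyability being the share of time the check passes. A minimal sketch follows; the threshold values are illustrative assumptions, not figures from the study or from any manufacturer’s specification.

```python
# Hypothetical flyability check. Real limits vary by drone model; these
# illustrative thresholds are NOT from the study or any manufacturer spec.

SAFE_LIMITS = {
    "max_precip_mm_per_hr": 0.0,  # any precipitation risks damaging electronics
    "max_wind_m_per_s": 10.0,     # strong winds drain batteries or cause loss of control
    "min_temp_c": 0.0,            # cold greatly reduces battery performance
}

def is_flyable(precip_mm_per_hr, wind_m_per_s, temp_c, limits=SAFE_LIMITS):
    """Return True only if every weather variable is within the drone's limits."""
    return (precip_mm_per_hr <= limits["max_precip_mm_per_hr"]
            and wind_m_per_s <= limits["max_wind_m_per_s"]
            and temp_c >= limits["min_temp_c"])

# Flyability over a day is then just the share of hours that pass the check.
hourly_forecast = [(0.0, 4.2, 12.0), (0.2, 6.0, 11.0), (0.0, 12.5, 9.0)]
flyable_hours = sum(is_flyable(*hour) for hour in hourly_forecast)
print(f"Flyable {flyable_hours} of {len(hourly_forecast)} hours")
```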

Ambitions to expand on-demand operations hinge on drones performing with sufficient uptime to supplement or supplant conventional practices. The viability of using drones for these applications is diminished if weather prevents them from flying.

In our new study published in Nature Scientific Reports, we examined the impact of precipitation, wind speed and temperature on drone flyability at a global scale.


Here’s why predicting space weather is even harder than it sounds

Recent developments at the forefront of astronomy allow us to observe that planets orbiting other stars have weather. Indeed, we have known that other planets in our own solar system have weather, in many cases more extreme than our own.

Our lives are affected by short-term atmospheric variations of weather on Earth, and we fear that longer-term climate change will also have a large impact. The recently coined term “space weather” refers to effects that arise in space but affect Earth and regions around it. More subtle than meteorological weather, space weather usually acts on technological systems, and has potential impacts that range from communication disruption to power grid failures.


An ability to predict space weather is an essential tool for providing warnings, so that mitigation can be attempted and, in extreme cases, a disaster can hopefully be forestalled.

The history of weather forecasting

We are now used to large-scale meteorological forecasts that are quite accurate for about a two-week timescale.

Scientific weather forecasting originated about a century ago, with the term “front” being associated with the First World War. Meteorological prediction is based on a good knowledge of underlying theory, codified into massive computer programs running on the most advanced computers, with huge amounts of input data.

Important aspects of weather, like moisture content, can be measured by satellites that monitor continuously. Other measurements are also readily taken, for example by the nearly 2,000 weather balloons launched each day. Exploring the limits of weather forecasting gave rise to chaos theory, sometimes called the “butterfly effect.” The buildup of error brings about the two-week practical limit.
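
The butterfly effect is easy to demonstrate numerically. The sketch below uses the logistic map, a standard toy chaotic system (not a weather model): two runs whose starting points differ by one part in a million diverge completely within a few dozen steps, the same error-amplification mechanism that caps useful weather forecasts at about two weeks.

```python
# The "butterfly effect" in miniature: the logistic map is a classic toy
# chaotic system. Two runs that start almost identically end up unrelated.

def logistic_map(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001  # initial conditions differing by one part in a million
for step in range(1, 51):
    a, b = logistic_map(a), logistic_map(b)
    if step % 10 == 0:
        print(f"step {step:2d}: run A = {a:.6f}, run B = {b:.6f}, gap = {abs(a - b):.6f}")
# The gap grows roughly exponentially until the runs are uncorrelated:
# tiny measurement errors eventually dominate, no matter how good the model.
```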

In contrast, the prediction of space weather is only truly reliable about one hour in advance!


Solar effects

Most space weather originates from the sun. Its outermost atmosphere blows into space at supersonic speeds, although at such low density that interplanetary space is more rarefied than what is considered a vacuum in our laboratories. Unlike winds on Earth, this solar wind carries along a magnetic field. That field is much weaker than Earth’s own field that we can detect with a compass at the surface, and vastly weaker than the field near a fridge magnet, but it can interact with Earth’s, and it plays an important role in space weather.

The very thin solar wind, with a very weak magnetic field, can nevertheless affect Earth in part because it interacts with a large magnetic bubble around Earth, called the magnetosphere, over a very large area, at least a hundred times as big as the surface of our planet. Much like a breeze that can barely move a thread can move a huge sailing ship when caught on the large sails, the effect of solar wind, through its direct pressure (like on a sail) or through its magnetic field interacting with Earth’s, can be enormous.

As the origin point, the sun itself is a seething mass of hot gas and magnetic fields, and their interaction is complex, sometimes even explosive. Magnetic fields are concentrated near sunspots, and produce electromagnetic phenomena like solar flares (the name says it all) and coronal mass ejections. Much as with tornadoes on Earth, we know generally when conditions are favorable for these localized explosions, but precise prediction is difficult.

Even once an event is detected, if a large mass of fast, hot and dense gas is shot in our direction (and such a “cloud” in turn is difficult to detect, coming at us against the glare of the sun), there is a further complicating factor in predicting its danger.


Detecting magnetic fields

Unlike the detectable, sometimes even visible, water content in the atmosphere that is so important in meteorology, the magnetic field of gas ejected from the sun, including in hot and denser clouds from explosions, is almost impossible to detect from afar. The effect of an interplanetary cloud is greatly enhanced if the direction of its magnetic field is opposite to Earth’s own field where it hits the barrier of Earth’s magnetosphere. In that case, a process known as “reconnection” allows much of the cloud’s energy to be transferred to the region near Earth, and accumulate largely on the night side, despite the cloud hitting on the side facing the sun.

By secondary processes, usually involving further reconnection, this energy produces space weather effects. Earth’s radiation belts can be greatly energized, endangering astronauts and even satellites. These processes can also produce bright auroras, whose beauty hides danger since they in turn produce magnetic fields. A generator effect takes place when dancing auroras make magnetic fields vary, but unlike in the generators that produce much of our electricity, the electric fields from auroras are uncontrolled.

The electric fields from auroras are small, and undetectable to human senses. However, over a very large region they can build up to apply a considerable voltage. It’s this effect that poses a hazard to our largest infrastructure, such as electric grids. To predict when this might happen, we would need to measure from afar the size and direction of magnetic field in an incoming space cloud. However, that invisible field is stealthy and hard to detect until it is nearly upon us.
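
To get a feel for the scale involved, consider a back-of-the-envelope calculation: even a field of one volt per kilometer, far too weak to feel, adds up along a continent-scale power line. The numbers below are hypothetical round figures chosen only to illustrate the arithmetic.

```python
# Back-of-the-envelope: a weak geoelectric field integrated along a long
# conductor. Both numbers are hypothetical round figures for illustration.

e_field_v_per_km = 1.0    # locally imperceptible electric field
line_length_km = 1_000.0  # a continent-scale transmission line

induced_voltage = e_field_v_per_km * line_length_km
print(f"Voltage built up along the line: {induced_voltage:.0f} V")
# Roughly 1,000 V of uncontrolled quasi-DC driving current through equipment
# designed for AC is enough to stress large grid transformers.
```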

Satellite monitors

By the gravitational laws of orbits, a satellite continuously monitoring magnetic fields by direct measurement must sit about a million miles (1.6 million kilometers) from Earth, between us and the sun, which is about a hundred times farther away. A magnetic cloud causing minor space weather effects usually takes about three days to travel from the sun to Earth; a truly dangerous cloud, from a bigger solar explosion, may take as little as a day. Since our monitoring satellites are relatively close to Earth, we learn the crucial magnetic field direction at most one hour in advance of impact. This is not much time to prepare vulnerable infrastructure, like power and communication networks and satellites, to best survive.
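
The one-hour figure follows directly from the geometry: warning time is simply the monitor’s upstream distance divided by the incoming cloud’s speed. A short sketch, using typical published distances and speeds for illustration:

```python
# Warning time from an upstream solar wind monitor: distance / cloud speed.
# The distance and speeds are typical published values, used illustratively.

L1_DISTANCE_KM = 1_500_000  # monitor sits about 1.5 million km sunward of Earth

def warning_minutes(cloud_speed_km_per_s):
    """Lead time between the monitor sensing the cloud and it reaching Earth."""
    return L1_DISTANCE_KM / cloud_speed_km_per_s / 60.0

for label, speed in [("slow solar wind", 400),
                     ("fast cloud", 800),
                     ("extreme event", 2000)]:
    print(f"{label:>15} at {speed:4d} km/s -> {warning_minutes(speed):5.1f} min of warning")
# Slow wind gives about an hour; the fastest, most dangerous clouds give
# well under 15 minutes, matching the "at most one hour" figure above.
```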

Since the fleets of satellites needed to give better warning are not even on the drawing boards, we must rely on luck in the face of space weather. It may be a small comfort that the coming solar maximum — when the surface of the sun is at its most active during a cycle and is expected to peak in 2025 — is predicted to be mild.

It may have been Mark Twain who said “it is hard to make predictions, especially about the future,” but the saying is certainly true in the case of space weather.

This article by Martin Gerard Connors, Professor of Space Science and Physics, Athabasca University, is republished from The Conversation under a Creative Commons license. Read the original article.



Taiwan predicts its chip industry will weather global shortage

(Reuters) — Taiwan’s key semiconductor industry has years of growth ahead of it and no worries about oversupply, despite a massive capital investment program, and it will face only a few competitors in the next decade or so, a senior government minister said on Friday.

Kung Ming-hsin, the head of Taiwan’s economic planning agency, the National Development Council, told Reuters the business opportunities presented by the global transformation to a digital economy were “very, very enormous”.

Kung also sits on the board of Taiwan Semiconductor Manufacturing Co as a representative of the largest shareholder, the government’s National Development Fund, which holds around 6% of the stock of the world’s most valuable semiconductor company.

He said between now and 2025, Taiwan companies have planned more than T$3 trillion ($107 billion) in investment in the semiconductor sector, citing expansion plans from chip giants including TSMC and Powerchip Semiconductor Manufacturing.

“Once they are built, Taiwan’s competitors in semiconductors in the next decade will be very few,” Kung said in an interview in his office building, which overlooks the presidential office.

Taiwan’s semiconductor firms are ramping up production to tackle a global chip shortage, which has affected everything from carmakers to consumer products, and meet booming demand following the work-from-home trend during the COVID-19 pandemic.

Soaring demand is set to continue, driven by 5G, artificial intelligence and electric vehicles, Kung said.

“In the next decade or even longer there won’t be oversupply for semiconductors,” he added, when asked if the massive investment plans could have a downside.

Taiwan is currently in the grip of its worst drought in more than half a century, but Kung said the impact on chip firms was limited at present, citing the amount of water they are able to recycle and the location of their main factories in Hsinchu in northern Taiwan, and in the island’s south.

“These two places are okay at the moment. So the impact on semiconductors is not bad.”

Still, Taiwan does face other challenges, not least from China where President Xi Jinping has made semiconductors a strategic priority.

Kung named Samsung Electronics as Taiwan’s most serious competitor, one also able to match TSMC’s advanced chipmaking, but said U.S. tech restrictions had for now blunted the Chinese threat.

Intel — both a TSMC client and competitor — last month announced a $20 billion plan to expand its advanced chip making capacity.

Kung said there was perhaps room for TSMC to cooperate with Intel, but “what’s important is really how you upgrade yourself”.

To that end, the government is helping the industry develop the next generation of semiconductor manufacturing technology like 1 nanometre and beyond with funding support and talent recruitment programmes in the works, he added.

($1 = 28.1070 Taiwan dollars)



Windows 10’s big taskbar update with news and weather is on the way

Back in January, Microsoft rolled out a Windows Insider update that included some big changes to the taskbar, among them a new weather readout next to the system tray that, when hovered over, shows a panel with more detailed weather information and headlines from various news sources. After a few months of testing, it seems these taskbar updates are ready for prime time, as Microsoft is starting to roll them out on a wider scale.

If you saw coverage of this Windows Insider update back in January, or even participated in that Insider release, then you already have a pretty good handle on how this works, because Microsoft doesn’t seem to have implemented many dramatic changes since. The idea, it seems, is to let Windows 10 users stay up to date with current news without necessarily getting absorbed in it.

The panel that appears when you hover over the weather information in the taskbar shows a number of cards, each with a headline. You can click one of those cards to open the full story in a “streamlined reading experience,” or, if you want to come back to it later, you can save or share the story.

You also have personalization options when it comes to which information cards appear in the panel. Not only can you select from a collection of different topics, but you can also select your preferred publishers and even tell Microsoft which news stories you’d be likely to read to better tailor the add-on to your tastes. If, on the other hand, you have no interest in this panel at all, Microsoft says you can turn it off.

While the News and Interests panel is beginning to roll out to the taskbar today, it might still be a while before it appears on your desktop; Microsoft says it’s taking a “phased and measured approach” to this rollout, so while some users will begin to see it in the coming weeks, it’ll be a few months before it’s broadly available.
