You are a walking data repository. While outside your residence or vehicle, walking down a street, shopping in a store, or visiting any type of public event or meeting, you potentially lose your personal privacy and cross the boundary from being a private individual to a virtual public figure. You can be filmed or photographed, your image can be transported to a storage silo anywhere in the world, your voice can be recorded, and your time in public view can be noted. This is the world in which we live in 2022.
When you go online to make a purchase, you open a whole new door to your personally identifiable information (PII). You will invariably be volunteering to strangers your name, address, phone number, email address and possibly more extensive information about yourself. Ostensibly, this data remains private between you and the vendor. "Ostensibly" is the key word here, however; one never really knows how much of your PII stays legitimately private.
Everything cited above can become data and go on your record somewhere in the world, whether you like it or not. Over-the-top severe assessment? Possibly, but it’s up to you to know this and act accordingly.
What information qualifies as personally identifiable information?
According to the U.S. Department of Labor (DoL), companies may maintain PII on their employees, customers, clients, students, patients, or other individuals, depending on the industry. PII is defined as information that directly identifies an individual (e.g., name, address, social security number or other identifying number or code, telephone number, email address, etc.). It can also mean information by which an agency intends to identify specific individuals in combination with other data elements, such as a combination of gender, race, birthdate, geographic indicator and other descriptors.
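To illustrate how direct identifiers like those listed above are handled in practice, here is a minimal Python sketch of field-level redaction. The field names and the `mask_pii` helper are hypothetical, for illustration only; they are not part of any DoL guidance or specific product.

```python
# Fields treated as direct identifiers (hypothetical list for illustration)
DIRECT_IDENTIFIERS = {"name", "address", "ssn", "phone", "email"}

def mask_pii(record: dict) -> dict:
    """Return a copy of the record with direct identifiers redacted."""
    masked = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            masked[field] = "[REDACTED]"  # drop the identifying value
        else:
            masked[field] = value  # non-identifying fields pass through
    return masked

record = {"name": "Jane Doe", "email": "jane@example.com", "zip": "94105"}
print(mask_pii(record))
# {'name': '[REDACTED]', 'email': '[REDACTED]', 'zip': '94105'}
```

Real-world systems go further, since quasi-identifiers such as ZIP code, birthdate, and gender can re-identify individuals in combination, as the DoL definition notes.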
Whether you want this PII to be in the hands (or databases) of numerous outsiders is largely, but not totally, your own decision. The DoL says specifically: “It is the responsibility of the individual user to protect data to which they have access.”
People have long been uncomfortable with the way companies can track their movements online, often gathering credit card numbers, addresses and other critical information. They found it creepy to be followed around the web by ads that had clearly been triggered by their online searches, which led them to worry constantly about identity theft and fraud. This is a direct result of putting PII in the hands of companies that want to profit from your movements on the web.
Those concerns have led to the passage of regulations in the United States and Europe guaranteeing internet users some level of control over their personal data and images — most importantly, the European Union’s 2018 General Data Protection Regulation (GDPR). Of course, those measures didn’t end the debate around companies’ use of personal data; they are merely a starting point for deeper and more specific laws.
The California Consumer Privacy Act is a prime example, a data privacy law (enacted in 2020) that provides privacy rights to California residents, giving them options as to how their PII can be used. There’s also California’s Automated Decisions Systems Accountability Act (still in the legislative process), which aims to end algorithmic bias against groups protected by federal and state anti-discrimination laws.
Privacy, AI regulations moving in parallel fashion
Data privacy laws and regulation of data gathered for the use of artificial intelligence are progressing in parallel paths through government agencies because they are so intertwined.
Anytime a human is involved in an analytics project, bias can be introduced. In fact, AI systems that produce biased results have been making headlines. One highly publicized example is Apple's credit card algorithm, which has been accused of discriminating against women and prompted an investigation by New York's Department of Financial Services. Another is the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) algorithm used in U.S. court systems to predict the likelihood that a defendant will become a repeat offender. COMPAS in particular has been shown to be wrong numerous times.
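One simple way auditors probe for the kind of bias described above is to compare positive-outcome rates across groups, a check known as demographic parity. The sketch below uses made-up data and a hypothetical `approval_rate` helper; it is not a description of how either system above was actually audited.

```python
def approval_rate(decisions: list[tuple[str, bool]], group: str) -> float:
    """Fraction of applicants in `group` whose decision was positive."""
    outcomes = [approved for g, approved in decisions if g == group]
    return sum(outcomes) / len(outcomes)

# Hypothetical (group, approved) decisions for illustration only
decisions = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

# A large gap between groups is a red flag worth investigating
gap = approval_rate(decisions, "A") - approval_rate(decisions, "B")
print(f"Demographic-parity gap: {gap:.2f}")  # 0.75 - 0.25 = 0.50
```

A nonzero gap alone does not prove discrimination, but it is the kind of measurable signal that regulators and legislation such as the bills discussed below are asking companies to monitor.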
As a result of all this PII collection, the rapid rise of the use of analytics and machine learning in online applications, and the constant threat of bias in AI algorithms, law enforcement agencies are chasing down an increasing number of complaints from citizens regarding online fraud.
Governments too are trying to get their arms around appropriate legislation in statewide efforts to curb this criminal activity.
The state of AI regulations
Are there regulations for artificial intelligence? Not yet, but they are coming. States can move more quickly on this than the federal government, which is not a surprise. For two years, the California legislature has been debating and modifying the Automated Decision Systems Accountability Act, which stipulates that state agencies use an acquisition method that minimizes the risk of adverse and discriminatory impacts resulting from the design and application of automated decision systems. There's a possibility it will become law later this year or early next year.
These are just the first wave of a phalanx of new laws and regulations that will be impacting online companies and their customers during the next several years. There’s plenty of evidence that tighter regulations are needed to contain deep-pocket companies such as Google and Amazon, which are becoming virtual monopolies due to the continued use of their users’ PII.
There’s no question that the ocean of PII is the fuel that analytics uses to produce data that can lead to business value. Analytics is the basis for artificial intelligence that can suggest a strategy correction for a business, warn of an impending problem in the supply chain, or make a prediction about where any market is headed over months or years. This is all bottom line-important to an enterprise and its investors, not to mention all the employees, partners, contractors, and customers that rely on the business itself.
Bobby Napiltonia is the president of Okera.