California’s passage of its “GDPR-lite” caught people off guard. We think this is part of a trend we’ve studied for a long time. Much of the current analysis misses key points, so it seems worth explaining.

Anytime something happens in a computer, someone thinks they have the right to understand it. At Lone Star Analysis, we call this the “right to transparency.”

About two years ago, we asked several thought leaders in the U.S. about the odds we’d see legislation like the E.U. General Data Protection Regulation (GDPR). GDPR provides clear rights to E.U. citizens, controlling data captured online. It limits how data and algorithms can be used, and it levies hefty fines on companies that violate those rights.

“GDPR won’t happen here” was the answer we heard. The experts had some great logic. After the 2016 national elections, the emphasis was on reducing regulation; neither the FTC nor the FCC would be willing to take this up. Legislation along these lines seemed impossible: Congress wasn’t going to add new laws, and none of the three big states with a stake in tech were going to do it.

  • California was going to listen to Silicon Valley
  • Texas Blues would listen to Tech Austin, and Texas Reds hate regulation
  • New York would listen to the Wall Street Banks, and to Silicon Valley

But the experts were wrong. California’s version of GDPR passed and will go into effect in 2020. It’s likely this will drag the rest of the U.S., and perhaps Canada, along.

How did this happen? There’s a short answer and a longer answer.

The short answer is “democracy is messy.” California’s ballot initiative process is raw, direct democracy; the people can pass their own laws directly. This was about to happen, even though Big Tech had quietly spent millions of dollars to prevent it. Their coordinated effort was called “The Committee to Protect California Jobs.” The group included huge firms that don’t agree on much of anything: Amazon, AT&T, Comcast, Facebook, Google, Microsoft and Verizon.

They offered grim predictions. For example, Lothar Determann, a Berkeley professor and attorney at Baker & McKenzie (Palo Alto), said, “As a U.S. company, this is going to be a big hit to the business model.” He predicted free-services business models like Waze and Google Maps would suffer or go away.

In the end, it seems polling data showed the people weren’t listening to PR or dire warnings. The CA legislature and governor moved to pass a law they could live with, rather than face a ballot initiative they could not live with.

What did their private polling data show? It’s hard to say, but it must have been bleak. A poll sponsored by the Bay Area News Group showed that even in the heart of Silicon Valley, voters “deeply distrust social media companies”:

  • Only 17% trusted social media firms, and only about one quarter trusted telecom firms
  • 51% thought the government needed to regulate this topic “more” and only 8% favored “less”
  • 86% were concerned about the security of their personal and financial data

We could stop there, with the short answer, “democracy is messy.” But the longer answer is even more interesting.

Some observers point to the recent public outcry around Facebook’s woes.

We disagree. The belief in “digital civil rights” runs a lot deeper than the news cycle. And it’s been hardening for some time. Privacy laws are not new. The U.S. Constitution and Bill of Rights have several protections. Since 1970, the U.S. Congress has passed at least 20 laws touching this topic. At least 48 states have their own laws on the books. California’s statutes go back at least as far as 1971.

But even these laws are not enough for the typical voter. Lone Star’s research shows the “age of algorithms” is part of this hardening of sentiment. Data is one thing. When large data sets are merged, managed, and mangled by black box algorithms, people become wary.

Why does the “black box” topic matter?

Because “black box” Artificial Intelligence is the darling of Silicon Valley. It’s the only technology we can use to automate some important processes. There’s no other practical way to do image classification (is it a cat or a zebra?). We can’t perform some critical speech and text processes any other way. It makes computers seem less stupid, less cold. It makes them appear to be more caring, more understanding.

But black boxes are unexplainable.

We can’t quite figure out how the AI algorithms decide between the cat and the zebra. If it stopped there, Big Tech would not be in trouble.
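To make that point concrete, here is a minimal sketch of what “unexplainable” looks like in practice. It is our illustration, not part of the original research; it assumes scikit-learn is installed, and the “cat vs. zebra” labels are stand-ins for a real image classifier. The model answers correctly, but the only “explanation” it can offer is thousands of learned numeric weights.

```python
# Minimal illustrative sketch (assumes scikit-learn; "cat vs. zebra" labels are
# stand-ins for a real image classifier). The model predicts well, but its
# internals are just weight matrices -- no human-readable reason for any answer.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Toy stand-in for image features: 2,000 samples, 64 numeric features, 2 classes.
X, y = make_classification(n_samples=2000, n_features=64, n_informative=20,
                           n_classes=2, random_state=0)
labels = {0: "cat", 1: "zebra"}

# A small neural network -- the simplest kind of "black box."
clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
clf.fit(X, y)

sample = X[:1]
print("Prediction:", labels[int(clf.predict(sample)[0])])
print("Confidence: %.2f" % clf.predict_proba(sample)[0].max())

# The only "explanation" available is the raw parameters the network learned.
n_params = sum(w.size for w in clf.coefs_) + sum(b.size for b in clf.intercepts_)
print("Learned parameters:", n_params)  # thousands of numbers, no narrative reason
```

A production image classifier has millions of such parameters rather than thousands, so the opacity only grows.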

Big Tech didn’t stop there, of course. Big Tech uses AI to decide what news stories to show us, and what might be fake news. AI offers sentencing guidelines to judges. It’s a core technology for self-driving cars. It’s being promoted for Finance Technology (FinTech), and it’s being used for medical research.

We’ve been testing public sentiment on this since early 2017. Our research poses hypothetical scenarios where a computer makes a ruling that ends badly for someone:

  • You are on a jury considering a self-driving car crash
  • Your loved one is turned down for an organ transplant
  • You are turned down for a loan you need

The data shows some interesting trends.

  • Black boxes are ALWAYS harder for people to accept
  • The more important the life topic, the more people feel a “right to transparency”
  • These views seem to be hardening over time

We plan to keep watching this. Lone Star has been saying for some time that companies need to keep an eye on their legal risk from black boxes. Juries and consumers won’t tolerate “the computer made me do it” as a “reason.”

We’ve been predicting a lot of this will play out in court. For consumer privacy, the large classes involved make tempting class action targets. It seems the E.U. and California agree. They both use violations as a source of government revenue. Companies should be concerned about using black boxes for important decisions.

You probably won’t be sued or go to jail for calling a tiger striped kitty a “zebra.” Beyond that, you might want to think twice.

About Lone Star Analysis

Lone Star Analysis enables customers to make insightful decisions faster than their competitors.  We are a predictive guide bridging the gap between data and action.  Prescient insights support confident decisions for customers in Oil & Gas, Transportation & Logistics, Industrial Products & Services, Aerospace & Defense, and Military & Intelligence.

Lone Star® delivers fast time to value supporting customers’ planning and ongoing management needs. Utilizing our TruNavigator® software platform, Lone Star® brings proven modeling tools and analysis that improve customers’ top lines, by winning more business, and improve the bottom line, by quickly enabling operational efficiency, cost reduction, and performance improvement. Our trusted AnalyticsOS™ software solutions support our customers’ real-time predictive analytics needs when continuous operational performance optimization, cost minimization, safety improvement, and risk reduction are important.

Headquartered in Dallas, Texas, Lone Star is found on the web at http://www.Lone-Star.com.