By Chris Sands
As privacy issues continue to heat up in the technology and legal communities, we wanted to provide a broad overview of how digital privacy law is affecting, and will continue to affect, the technology ecosystem in the United States. We collected discussions from thought leaders who touch privacy from all sides, as a well-rounded perspective that includes corporate, legal, and academic viewpoints can help break down the inclination to view these issues in a vacuum. This should add some clarity around the California Consumer Privacy Act (CCPA), the largest privacy legislation since the General Data Protection Regulation (GDPR) took effect in 2018, while also giving a sense of what incoming federal privacy legislation may look like. This topic undoubtedly affects every technology company that uses data as a core aspect of its business model. And because tech startups require a lean, nimble, ready-to-pivot strategy, insight into areas of impact, particularly major shifts in the law, ought to be a fixture of forward-thinking leadership at any company currently innovating.
We’ve highlighted our discussions with the following privacy experts.
Hyongsoon Kim (Law Partner)
The CCPA is about empowering consumers through knowledge: knowledge of what information businesses are collecting about them, and of what businesses do with that information once they’ve collected it. The details of how to comply with the CCPA can become very technical, but I think the principles that drive the CCPA are the same principles that drive a lot of the existing privacy litigation today. The legal standard that’s often applied in these cases is whether the conduct in question was so egregious that it falls outside accepted social norms.
I think it’s helpful to understand just how aggressive and sophisticated the Bar is in California. The Bar is not interested in bringing cases that it can’t certify as a class. We’re now entering, I think, or we’re really in the midst of, a period when consumers are much more sophisticated about the fact that their data is being collected. We are in an age when consumers understand that if I’m downloading an application on my phone, or purchasing a smart TV or some device that’s connected to the internet, and I am using it to obtain particular services - things I otherwise would not be able to do - chances are the seller of that device is collecting some information about me. I think consumers increasingly understand and accept that this is more and more a world of shared information.
I believe it ultimately comes down to disclosure. Companies, I think, are increasingly realizing that they can put these disclosures in front of consumers and encourage them to accept what might seem onerous just from reading the language - be it the terms of service or disclosures about the information being collected - because consumers understand and accept those types of disclosures and that kind of use of their information.
Ben Barokas (CEO/Founder)
You can say that the rise of Facebook, Google, and Amazon has been built around harvesting terabytes and terabytes of data in an intelligent fashion, in order to create marketplaces that are almost irresistible. What is key, though, is that these Googles and Facebooks and Amazons have also provided an amazing public good. What it has cost is data. Through machine learning, you can understand people’s intent, and advertisers will bid on those people when their propensity and likelihood to do a particular thing are really high.
Who’s allowed to process what data, and release it to whom, is something that I believe the market is not going to take care of. We’re going to need government to step in and say, these are the social norms for the next ten years. The regulation doesn’t scare me. There’s a lot of space within digital privacy in which innovation will need to exist in order for us to live in a world where privacy is respected by corporations.
Michelle De Mooy (Policy Reformer)
This is not about transparency. If the corporate world wants to get to a single federal standard, it needs to move much further down the privacy line, to actual data-use issues, rather than some of the more complex things. It really isn’t about notice and consent. That is how the internet has worked for the last couple of decades, and it has fundamentally failed. It’s an illusion that any of us could understand the hundreds of companies that touch and process our data every day. So if you’re talking about a serious privacy law, it has to go beyond notice and consent and get serious about what counts as a fair use of people’s data.
People usually say, well, just don’t use Facebook - but this is so much bigger than Facebook now. This is everything. Data is used in housing, education, employment, and in the prices you pay when you buy things online. This is very serious, and you can’t opt out of it.
Opening access to information that has historically been opaque to the consumer may well be the right direction for many companies offering data-driven products and services. The General Data Protection Regulation and now the California Consumer Privacy Act are setting these types of standards. The big question mark is the standards that future federal data privacy laws will set. Perhaps executives should also think about ways to go beyond GDPR and CCPA and establish their own ethical standards around privacy. In a rapidly evolving digital environment, with bipartisan support for drawing hard lines on privacy, taking proactive measures to stay ahead of privacy trends could be highly advantageous, while also expressing goodwill to regulators.