California now has comprehensive consumer privacy protection… well, at least compared to the rest of the United States. Will it result in changes to federal privacy laws?

January 3, 2020

California’s much touted, and in some quarters feared, privacy law is now two days old and the West Coast of the USA has not slid into the sea. The Attorney General of California has set out the operation of the California Consumer Privacy Act (CCPA) in a short fact sheet.

It is common practice in the United States for significant law reform to emanate from the States, and then for the Federal Government to legislate to cover the field and generally supersede those state laws.  That is particularly the case where the law applies to commerce that crosses state boundaries, clearly a matter for federal law.  Often that is done to establish uniformity, avoid duplication and promote efficiency.  States also tend to be incubators for public policy experimentation, and successful policies tend to get picked up and adopted federally; that happened with welfare reform in the 1990s.  That occasionally occurred in Australia; however, the States are rarely so ambitious these days and are content to have the Commonwealth organise uniformity through the COAG process, amongst other fora.  The problem is that the genius of experimenting with new concepts has been suppressed for the sake of sameness.  In the area of privacy that has resulted in a dismal Federal Act and generally pale imitations at the State level, with a few slight variations in Victoria and New South Wales.  The States could have adopted stronger and more effective privacy legislation but chose to be constrained.

There are many articles about it, but in terms of readability, balancing the tech with ease of reading, it is hard to go past Wired’s article California’s Privacy Law Goes Into Effect Today. Now What?

It provides:

The Golden State officially has the strongest consumer data protections in the US. Here’s everything you need to know.

Fittingly for the start to a new decade, California decided to go big with its 2020 New Year’s resolution. Today, the California Consumer Privacy Act goes into effect. Passed unanimously in June 2018, it’s the first law in the US to set up a comprehensive set of rules around consumer data, akin to the European Union’s General Data Protection Regulation, or GDPR. Industry and privacy advocates have been fighting over the fine print ever since.

Now the law is officially on the books in the biggest state in the union and the world’s fifth-largest economy. For the average internet user in California, life will not be radically different. But as the mechanisms of the law get finalized, and depending on how it’s enforced, its impact could go a long way to determining whether the 2020s become the decade when the US started taking privacy seriously.

New Year, New Rights

The CCPA applies to any company that operates in California and either makes at least $25 million in annual revenue, gathers data on more than 50,000 users, or makes more than half its money off of user data. For California residents, it creates a handful of new rights over their data. The most significant categories are what Alastair Mactaggart, the California real estate magnate behind the ballot initiative that led to the law being passed, calls “the right to know” and “the right to say no.” That means users will, as of today, be able to see what data companies have gathered about them, have that data deleted, and opt out of those companies selling it to third parties from now on.
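
The thresholds described above can be sketched as a simple check. This is a hypothetical illustration only (the function and parameter names are assumptions); actual legal applicability turns on the statute's definitions, not this simplification.

```python
# Hypothetical sketch of the CCPA applicability thresholds described above.
# A business operating in California is covered if it meets ANY one of the tests.

def ccpa_applies(annual_revenue_usd: float,
                 consumers_with_data: int,
                 share_of_revenue_from_data: float) -> bool:
    return (
        annual_revenue_usd >= 25_000_000          # at least $25m annual revenue
        or consumers_with_data > 50_000           # data on more than 50,000 users
        or share_of_revenue_from_data > 0.5       # over half its money from user data
    )

# A $30m-revenue firm with only 10,000 users is covered by the revenue test alone.
print(ccpa_applies(30_000_000, 10_000, 0.1))  # → True
```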

It’s important to remember that we’re not just talking about the Googles and Facebooks of the world, but any big company that does a lot of business online—which is to say, any big company. One such corporation is Condé Nast, WIRED’s parent company. So if you’re reading this from a California IP address, you should have seen a pop-up banner with a big button reading “Do Not Sell My Personal Information.” What happens if you click it? Well, WIRED doesn’t exactly “sell” your data right now—no one is giving us cash (or withholding military aid, for that matter) in exchange for dirt on our readers. But, like just about every site on the internet, we track your behavior—what articles you read, for how long, etc.—using cookies. We use that data internally for research and site improvements, but the information can also go to a third-party vendor, like Google AdSense, which combines it with similar data from other sites to create user profiles that advertisers can target. The infamous shoe ad that follows you across the internet long after you close out your Zappos tab? That’s how it works—and advertisers pay extra for the privilege of this personalized ad targeting. If you ask to stop “selling” your data, you won’t get those types of ads from us anymore, and your browsing history on our site won’t factor into the types of ads you see elsewhere.
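
The opt-out flow described above can be sketched in miniature. This is not any real site's implementation; the cookie name and the vendor callback are invented for illustration.

```python
# Hypothetical sketch of honoring a "Do Not Sell My Personal Information" opt-out.
# OPT_OUT_COOKIE and send_to_ad_vendor are assumed names, not a real API.

OPT_OUT_COOKIE = "ccpa_do_not_sell"

def handle_pageview(cookies: dict, event: dict, send_to_ad_vendor) -> bool:
    """Forward a tracking event to a third-party vendor only if the reader
    has not opted out. Returns True if the event was shared."""
    if cookies.get(OPT_OUT_COOKIE) == "1":
        return False          # opted out: keep the data first-party only
    send_to_ad_vendor(event)  # e.g. combined with other sites' data for ad profiles
    return True
```

The key design point is that the first-party site still records the event for its own research; only the onward transfer to the third party is switched off.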

Many companies already had to implement processes allowing European users to delete their data or opt out of tracking thanks to GDPR, which laid some groundwork for the CCPA. Some platforms, including Facebook, have built tools allowing users to exercise the rights that the CCPA now guarantees to California residents.


Final regulations that clarify and define the parameters of the law haven’t been released, but California attorney general Xavier Becerra is expected to issue them sometime in the next six months. The state won’t start enforcing the law until July 1. It’s an open question whether enforcement will be robust enough for the law to really make an impact.

The law grants Californians the right to sue companies for failing to take reasonable precautions to prevent data breaches. But apart from that, making sure companies comply with the CCPA is the sole province of the attorney general’s office, which has indicated that it will only have the bandwidth to bring a handful of cases each year.

“The California attorney general has said, ‘We only have resources to bring a few cases a year,’” said Justin Brookman, director of privacy and tech policy at Consumer Reports. “So maybe companies are saying, ‘The odds of getting sued are pretty slim.’”

Mactaggart, however, said he expects businesses to fall in line.

“I come from one of the most heavily regulated industries in the country: real estate development,” he said. “I’ve literally never even come close to sitting in any meeting where I’ve heard anyone say something like, ‘It’s the law, but we’re not going to get caught, so let’s just do it anyway.’” He argued that even if cases are rare, the threat of crippling fines—$2,500 per user per piece of data, which could easily scale to the tens of billions for a company that flouts the law—should be an effective deterrent.
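
The scale Mactaggart describes follows from simple arithmetic. The per-violation figure is from the quote; the user count is an illustrative assumption.

```python
# Back-of-the-envelope check of the "tens of billions" claim above.
fine_per_violation = 2_500   # dollars per user per piece of data, as quoted
users = 10_000_000           # illustrative assumption: 10 million affected users
pieces_of_data = 1           # a single mishandled data element per user

total = fine_per_violation * users * pieces_of_data
print(f"${total:,}")  # → $25,000,000,000 — already in the tens of billions
```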

Still, Mactaggart granted that some violations of the law might be hard to detect in the first place, let alone police.

“It’s easy to see on the page if they’re tracking,” he said. “The harder part is, how do I know they deleted it, or how do I know they didn’t sell it?”

What Comes Next?

In part to solve the potential enforcement problem, Mactaggart is working to get another initiative on the ballot this November that would beef up the existing law. “Right now, the regulation is in the hands of the attorney general, who has stated, and I don’t blame him, ‘We’re cops, not regulators,’” he said. The initiative would create an independent agency focused just on the privacy law, with the power to audit companies for compliance. It would also restrict the legislature from watering down the law in the future—a serious concern given the amount of industry lobbying that has already taken place.

Meanwhile, the California law puts pressure on Congress to act at the national level, as the business community howls at the prospect of complying with a patchwork of state requirements. (States like Nevada and Vermont have their own privacy statutes; lawmakers in other states, like New York, have tried to introduce bills that are even more ambitious than California’s, although with less success so far.) The Senate is considering a number of bills, but so far Democrats and Republicans are far apart on two key issues: whether to grant ordinary Americans the right to sue for violations (Democrats generally think yes, Republicans no), and whether the federal law should preempt tougher state regulations (Democrats no, Republicans yes). The longer Congress waits to act, the more California—and any state that goes even further—will get to determine the facts on the ground.

“Really, you have to have a short- and long-term CCPA strategy,” said Jennifer Rathburn, a partner at the law firm Foley & Lardner, who advises companies on compliance with the law. “The final regulations come out; you’re going to have ballot initiative 2.0 coming out; and then you’re going to have potentially other state laws. This isn’t a one and done. This is an evolving area that’s pretty new to the US.” She added, “In sum, privacy is here to stay.”

The Economist has an excellent article on the new legislation, Companies should take California’s new data-privacy law seriously.  It provides:

History does not repeat but sometimes it rhymes. So, it seems, do efforts to protect netizens’ privacy. The European Union led the world with its General Data Protection Regulation (GDPR), which came into force in May 2018. That law shook up internet giants and global advertising firms, both of which had previously used—and at times abused—consumer data with little oversight. On December 11th India’s government introduced a bill that would force firms to handle data only with consumer consent and give the authorities sweeping access to them. The same day Scott Morrison, Australia’s prime minister, promised a review of privacy laws and said the competition authority will monitor how advertising is done on digital platforms. But the most important piece of legislation rhyming with GDPR right now is the California Consumer Privacy Act (CCPA), which comes into force on January 1st. To online businesses, it jars.

The Californian law copies some of the GDPR’s provisions. It gives consumers the right to know what online information is collected about them and how it is used, permits them to demand that their data be destroyed, and allows them to sue companies for data breaches. In some ways, the CCPA is looser than its European predecessor. It does not, for instance, insist that firms have a “legal basis” for collecting and using personal data or restrict the international transfer of data. It also stops short of demanding the appointment of corporate data-protection officers and assessments of projects’ data-protection risks. And whereas the GDPR lets individuals demand that private information about them be removed from the web under certain circumstances, the First Amendment makes this “right to be forgotten” a non-starter in America.

In other respects, though, California goes further than the EU. The CCPA adopts a broader definition of personal information (which extends to such things as internet cookies that identify users on websites) and it explicitly forbids discrimination against consumers who exercise their rights (for example, by reserving discounts for those who grant firms access to their data). Companies must enable Californians to opt out of the sale of personal data with a clear “do not sell” link on their home page, rather than through GDPR’s fiddlier process. Michelle Richardson of the Centre for Democracy and Technology, a privacy-advocacy group which is bankrolled in part by big tech companies, calls the CCPA “ground-breaking”.

The California law will apply to firms with revenues of $25m or more that do business in the state or process its residents’ data, even if not based there. Any for-profit entity anywhere that buys, shares or sells the data of more than 50,000 Californian customers, households or devices a year is also covered. Law-breakers face fines of up to $7,500 for every violation, compared with the GDPR’s 4% of global annual revenues or €20m ($22m), whichever is higher. But California’s relatively trifling ceiling can add up quickly for firms with thousands of users.
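
The two penalty regimes compare roughly as follows, using the figures in the quote. The revenue and violation counts are illustrative assumptions, not figures from any real case.

```python
# GDPR: the greater of 4% of global annual revenue or ~$22m, per the quote.
# CCPA: up to $7,500 per violation, so the total scales with violation count.

def gdpr_max_fine_usd(global_revenue_usd: float) -> float:
    return max(0.04 * global_revenue_usd, 22_000_000)

def ccpa_max_fine_usd(violations: int) -> int:
    return 7_500 * violations

# Illustrative: a firm with $1bn revenue vs one with 100,000 per-user violations.
print(gdpr_max_fine_usd(1_000_000_000))  # 40 million dollars
print(ccpa_max_fine_usd(100_000))        # 750 million — the "trifling" ceiling adds up
```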

The GDPR’s track record suggests the effects of the CCPA will be far-reaching. Some 250,000 complaints have been lodged under the EU rules, and some penalties approach €100m. If breaking the rules can prove expensive, so can respecting them. The International Association of Privacy Professionals, an industry body, and EY, an accountancy, reckon that complying with the GDPR costs the average firm $2m. Tech firms spend over $3m; financial firms, more than $6m. By one estimate, the total cost to all American firms with more than 500 employees could reach $150bn.

“Initial compliance” with the CCPA may, for its part, cost the estimated 500,000-odd affected American firms $55bn, according to a study commissioned by California’s attorney-general. Any such estimates should be taken with a grain of salt. For one thing, global firms that are already GDPR-compliant have a head start, even if differences between the two laws mean abiding by the Californian one will be far from automatic. Big firms, which are already on the hook for GDPR, are expected to spend another $2m each. For the tech giants that looks like chump change. Microsoft and Apple say they are not only ready for CCPA, but also plan to implement it across America.

For America’s legions of smaller online trinket-sellers, app-makers or other firms present on the internet the Californian law will be onerous. They can ignore European regulations, because most have no EU business, but cannot easily stay away from one of America’s biggest domestic markets. A new survey by the US Chamber of Commerce, a lobby group, claims that only 12% of small businesses in America know about the law, let alone have prepared for it.

The impact of the CCPA is being felt beyond boardrooms. Big Tech is lobbying lawmakers in Washington, DC, for a federal statute on the subject. “We really, really support an omnibus federal privacy law,” says a data-privacy official at a large American technology company. Facebook and Google do, too, they profess. The US Chamber of Commerce, better known for opposing regulations, is also now in favour.

One explanation for tech firms’ sudden enthusiasm to safeguard user information is their reasonable desire to avert a balkanised mess of contradictory state laws. Illinois, New York and Washington have differing state legislation in the works. Many others are looking into the matter.

Tame west, wild east

Tech companies could have another motive to back federal rules. Because much online activity crosses state boundaries it falls under federal jurisdiction. A national data law would therefore supersede California’s, unless it explicitly made federal rules the floor which states could raise if they wished. A Democratic proposal in the Senate does just this. A rival Republican one would set business-friendlier rules as the ceiling, in effect obviating the CCPA. No points for guessing which one of these America Inc would prefer. Neither is likely to pass before November’s presidential elections. Until then companies will need to heed California’s data sheriffs. After that, expect a shoot-out.

In its World in 2020 the Economist goes slightly out on a limb and predicts pushback against surveillance in 2020 in Pushback against the surveillance state. It provides:

Riot police in Hong Kong have attacked protesters with the usual weapons: clubs, tear-gas, water cannons. Protesters, conversely, have deployed distinctly unusual weapons to defend themselves, including laser pointers and spray paint. These are intended not to blind or mark police, but to impede the use of facial-recognition technology via Hong Kong’s roughly 50,000 closed-circuit television cameras.

That may sound like a lot. In fact, measured by cameras per 1,000 people, Hong Kong does not crack the top 25 most-surveilled cities on Earth. With 6.71 cameras per 1,000 people, it has less than one-twentieth as many as the mainland Chinese cities of Chongqing and Shenzhen. China’s government has bet heavily on surveillance technology, providing startup funding for multiple companies now hawking their wares around the world. But Hong Kong’s camera density also trails that of cities in freer countries, including London (68.4 per 1,000 people), Atlanta (15.6) and Chicago (13.1).

In none of those places did people vote to create a surveillance state—and yet, here it is. For years, the rollout of surveillance technology around the world, whether under dictatorship or democracy, followed a drearily predictable pattern. Whatever security forces said they needed, they tended to get.

As a result, police in liberal countries now have a host of tools at their disposal. As well as facial-recognition systems, they have cameras mounted on police cars or telephone poles that recognise and record the licence plate of every passing vehicle; and Stingrays, which mimic mobile-phone towers and let police intercept data from every passing phone, including texts, websites visited and the phone numbers of incoming and outgoing calls. All these gadgets allow the police to build detailed portraits of people’s lives.

That is the bad news. The good news is that, in the United States at least, concerned citizens are starting to hit the brakes.

In 2019 cities on both coasts of America banned police forces from using facial recognition. California was poised to become the first state to ban it statewide on police body-worn cameras—used by an increasing number of agencies to record encounters with citizens. Kade Crockford, of the Massachusetts branch of the ACLU, a civil-liberties group, called these bans “a wrench thrown into the gears of techno-determinism: this belief that if it’s invented then it has to be deployed, and you’d better just get out of the way.” This trend looks set to continue in 2020, and travel inland.

New York City convened a task force in 2018 to examine how its agencies use algorithms. Decision-making software pops up in surprising places. It is used not just in predictive-policing programmes (which rely on historical crime data to forecast where future crimes will be committed, and therefore where police should go), or in creating risk-assessment scores to determine who is eligible for pre-trial release and who has to wait in jail, but also to determine who goes to which schools, and which buildings merit a fire-safety inspection.

We got algorithm

Many people worry that, far from being impersonal, impartial tools, algorithms are trained on racially biased data which, whether consciously or unconsciously, will replicate that bias. Another concern is that algorithms are unaccountable black boxes with opaque decision-making processes, which is inappropriate in a democracy. Congressional Democrats have introduced a bill requiring companies to investigate and fix algorithmic bias in their systems. But with Congress divided and a presidential election looming, it is unlikely to become law in 2020.

If Democrats capture the presidency next November, however, the bill could pass in the Congress to follow. Several candidates have announced criminal-justice reform packages sceptical of surveillance technology. Bernie Sanders wants an end to algorithmic risk-assessment and a full ban on facial recognition. Elizabeth Warren has promised to establish “privacy protections” and a “task force on digital privacy in public safety”. Kamala Harris has expressed more measured support for regulations to combat tech-enabled racial bias.

Although Democrats may be the loudest voices in this arena, privacy and scepticism of state power have traditionally been bipartisan concerns. The optimism that led politicians from both parties to become starry-eyed about technology is fading. So is the tough-on-crime consensus that made politicians afraid to say no to police and eager to impose ever-harsher sentences. A convergence of these two trends means that in 2020, the all-seeing watchers can expect to get a long overdue poke in the eye.

Whether the above bullish predictions will come to pass remains to be seen.  There is no doubt there is some momentum to improve privacy and data security regulations, given the Cambridge Analytica fiasco and the overweening power of Google, Facebook, Instagram and Microsoft (to name only the top-of-the-pile data piranhas).  In the US Senate, Senators Chris Coons and Mike Lee have introduced a bill to require court orders before law enforcement can use facial recognition technology.  The Senate Democrats have introduced goals for federal data privacy legislation.  Given that the Republicans control the Senate, that is a long way from legislation being introduced, let alone passed; however, in the US Congress it is often preliminary suggestions that presage changes.  And privacy has risen up the priority lists of many legislators since 2016.
