
The world Google built: Privacy, data, and the surveillance machine

In the second post in our series on Google and our discontents, we trace how Google’s long history of privacy erosion produced an internet built to observe and control its users—that is, all of us.

Written by
  • Jamie Lawrence
  • Rex Mizrach
Publish date
09/01/2026

Your entire life will be searchable.

Larry Page

Privacy. The linchpin of democratic society. Across the world, privacy is recognized as a universal human right. Increasingly, the right to privacy is extended to include data protection, with the European Union leading the charge.

Meanwhile, Big Tech—and Google in particular—treat digital privacy rights as mortal threats. They fight tooth and nail to sink data protection laws; and when that fails, they move to dilute them as much as possible.

This is, of course, not a design flaw.

In The Age of Surveillance Capitalism, Shoshana Zuboff traces Google’s hostility towards privacy rights back to the early 2000s, when the dot-com crash collided with the company’s original promise to "organize the world’s information."

Google saw the trade-off: survival required scale, and scale demanded revenue. The answer: targeted advertising. Leveraging personal data, Google built digital profiles of their users that predicted tastes, behavioral patterns, etc. They then used these profiles to sell ad space with the promise of “guaranteed” clicks.

It made them very, very rich.

But—there was a problem. What if users chose to withhold their personal information? Back in the day, the prevailing wisdom of Ye Olde Internet was: never, ever give out your real name, don't share your location, don't trust strangers or strange platforms with your data. Those norms (all but extinct now) stood directly in the way of Google's ambitions. As did the possibility that governments might step in to protect users' interests (what an idea!).

As Zuboff explains:

"If new laws were to outlaw extraction operations, the surveillance model would implode […] The survival and success of surveillance capitalism depends upon engineering collective agreement through all available means while simultaneously ignoring, evading, contesting, reshaping, or otherwise vanquishing laws that threaten [its business model]."

So began the era of surveillance capitalism: the largest data-collection project the world has ever seen. (We'll go into that in our next post.)

We need to keep this history in mind when considering Google's privacy pitfalls: Google sucks at privacy because data protection is fundamentally opposed to its business model.

But it's worse than that: no one company has done more to destroy the very idea of online privacy than Google. Its revenue model alone shifted (quietly, insidiously) the norms of the internet writ large. After Google, things would never be the same.

Fast forward to 2026—should we trust Google's privacy policy? A 20-year history of violations suggests… no, we should not.

In recent years, courts have repeatedly found Google in violation of its own privacy promises.

Last September, a federal jury ordered Google to pay $425 million to users for violating its own privacy policy, determining it continued to collect user data even after users had explicitly opted out of tracking.

Earlier in 2025, a Texas court found Google liable for collecting biometric data and private search history without users' consent.

When cases become too damaging to fight, however, Google often opts to settle.

In 2023, it agreed to a $5 billion class-action settlement over allegations that it tracked users browsing in Chrome's Incognito mode.

Last August, Google moved to settle yet another lawsuit accusing the company of collecting children's personal data, without parental consent, to use for targeted advertising. This follows a nearly identical case in 2019.

Google, it would seem, never learns. Even when caught red-handed, Google denies any wrongdoing.

Every. Single. Time.

Digital privacy under (perpetual) attack

We could go on, and on, and on… Google has faced no shortage of lawsuits over the years (estimates are in the hundreds)—but you get the point. Actions speak louder than privacy policies, and Google’s actions offer little reason to believe that user data is treated as anything other than fodder for profit.

Again, there's no mystery here—this is quite literally the business model. No wonder Big Tech has spent millions on lobbying efforts in the EU to water down digital privacy protections (nearly €27 million in 2020 alone), following similar practices in the US. (See this, and this.)

And they’re getting bolder about it. In 2019, Google lobbied heavily to alter the language of California's data-privacy legislation to allow the company "to continue collecting user data for targeted advertising, and in some cases, the right to do so even if users opt out."

Often, Google will try to broaden or loosen definitions in data protection policies to obscure what it collects and how that data is actually used. Take this example: as per their privacy policy, Google claims they don't sell your personal data—and technically that's true: they don't sell raw data. Rather, they sell predictions of your behavior fashioned from your data. (Zuboff)

At the end of the day, the gap between what Google says and what it does—its brazen violations of its own policies; its freedom to unilaterally change its terms and conditions (documents deliberately written so long that no one reads them); its sneaky attempts to opt you in to data-scraping initiatives—makes real privacy on its platforms impossible.

That's right, impossible. Because a real privacy policy is transparent: a platform should tell you, in plain and simple terms, what it has access to and to what ends it uses your data, with sufficient guarantees that those terms won't change.

Walled gardens

These shadowy practices inform the structure of today’s internet. Google has designed its architecture to deter users from ever leaving. So what if they take a little extra, we told ourselves, what harm could it do? We all stayed in the Googleverse out of inertia, an inertia carefully engineered to look like convenience.

That’s how we ended up here, in the surveillance-industrial complex. A world where Google and Big Tech collaborate not just with advertisers, but also with bad actors and governments moving (rapidly) toward censorship-oriented content policies, expanding the infrastructure of mass observation.

Finally, we can't trust what they say.

What, exactly, would stop Google from rolling back the already pitifully thin privacy protections that remain? Its own history suggests... very little. Google has shown itself more than willing to cooperate with hostile administrations when it suits them (see: monetary donations to, and a slow-burn romance with, the Trump regime—again, more on that later).

As political norms continue to erode, taking regulatory capabilities down with them, the question isn’t if Big Tech will cave on privacy commitments—but when, how much, and at whose expense. (Hint: it's all of us, and creatives especially.)

As we know, surveillance changes how people behave—and what they write. It pushes writers to self-censor, silences activists, and engenders the chilling disappearance of ideas and communities, putting marginalized people at greater risk. A system that records everything for gain hobbles dissent, turning ideas into topics to be carefully sidestepped rather than freely expressed. Ultimately, such a system wants to discipline human thought itself.

Today, sticking around with Google and its ilk is an ethical—and existential—liability.

But there's good news—we can leave. The Big Tech fiefdoms have alternatives. (See here, for starters.)

Google's vision of the future isn't inevitable. People like you (and us!) are exercising the freedom to choose spaces that don't treat data, thoughts, and human work as mined material. The culture of the internet isn't predetermined; we're deciding it in real time, choice by choice. We still get to decide what kind of world we build together: places where we can own our work, where we can think, write, and connect with each other as humans.

Together we'll get there, step by step.

Want to connect with a like-minded community and get the latest news on Ellipsus? Join our Discord to follow announcements and share your feedback.
