The Illusion of Choice


Welcome back.

Who wants to come help me drag the Halloween decorations out of the attic?

This week: It can be difficult as hell to understand when we don’t actually have a choice, and when we’ve got more options than we think. Telling the two apart is the key to the good stuff getting built way, way faster.

Did you know we record an audio version of all of our essays? Subscribe to our podcast feed and listen to this essay now 👇️ 

I’m Quinn Emmett, and this is science for people who give a shit.

Every week, I help 23,000+ humans understand and unfuck the rapidly changing world around us. It feels great, and we’d love for you to join us.

Why You're Here

New Shit Giver Laura wants to help solve “the lack/affordability of mental health resources, hunger insecurity, the current housing shortage and its effect on lower-income people, and where the political polarization of our country is going to take us.”

Yeah I think that sums it up pretty well, Laura. Let’s get to it!

Together With Bookshop

Want to read what the people working on the frontlines of the future are reading?

Every week, I ask our podcast guest, "What’s a book you’ve read this year that’s opened your mind to a topic you haven’t considered before, or that’s changed your thinking in some way?"

And every week, we add their picks to a list on Bookshop, where every purchase on the site financially supports independent bookstores.

Want an ad-free experience? Become a Member.

THE ILLUSION OF CHOICE

Skipping for juuuuust a moment over the entire “does free will exist” conversation, because, as Oliver Burkeman wrote in The Guardian, “Peer over the precipice of the free will debate for a while, and you begin to appreciate how an already psychologically vulnerable person might be nudged into a breakdown.”

No thank you, not today. My Prozac has barely kicked in.

Not that some level of self-awareness isn’t great, obviously, but that’s 101 level stuff (I don’t keep cookies in our office for a reason), and so anyways, no, we’re not doing the Bereitschaftspotential right now.

But I do want to discuss how much more vulnerable you are than you think to the systems around us, and, on the other hand, how much more power we have than we think to dismantle them and provide more choices for more people.

Earlier this year, I wrote a post called What Do You Need.

If you missed it, shame on you.

I’m kidding, though it’s worth reading the whole thing, and there’s a great WIRED post this week (by way of Google’s antitrust lawsuit, which they REALLY REALLY don’t want you to know about) that expounds on my intro and thesis.

My original intro to What Do You Need, in February 2023:

In screenwriting, there is a well-honed idea that main characters should want one thing, but need something different, something that is often opposed to or even opposite their most public desires.

They are blind to what they need the most, and often purposefully so, having shoved those feelings down juuuuust about as far as they can go. Trust me when I say: having a long hard look at yourself isn’t easy, or comfortable.

So we empathize with these characters because, I mean, who amongst us, right?

It’s an imperfect character development mechanism, of course. The best characters aren’t that simple, and none of us are, either.

That said, history is littered with memorable characters who reluctantly go through transformations, who finally walk away from what they want and go through hell to get what they needed all along, letting us experience what it’s like to have that long hard look without actually having to, you know, do it.

Web search was intended to give us what we need, but over time the utility has been hijacked to give us what we want.

We need a real answer, but at this point search most often gives us what we want — self-affirmation — and if it’s delivered by a paid advertisement that looks just like a real answer, that’s even better.

That process, over and over, billions of times a day, leads to disinformation. Sometimes disinformation hurts one person, as we’ll see below, but at scale disinformation inevitably hurts many, many people.

Imagining that search could ever give us entirely objective answers, all of the time, ignores the web’s original sin — the web is only what we put into it, and we are fundamentally flawed.

The internet is so fundamentally broken that we desperately want the next thing — AI chatbots — to be everything, all at once.

But that’s even more dangerous because instead of your question returning a list of links ranked by Google, most of which are now paid ads, or a newsfeed of extreme views from friends and family on Facebook, a chatbot is an extremely convincing version of both.

It’s incredibly confident, and often very wrong. But we can’t tell the difference, and I’m not sure we want to.

I’m not going to spend today’s essay assessing the technological capabilities of search or new large language models, because that assessment will be old news almost immediately.

What I do want to do is try to force us to confront our wants and needs, to confront our expectations, borne of who we are — a construct that has remained the same for eons and underpins every single system we’ve ever built.

Now, from WIRED (3-minute read), just this week:

Recently, a startling piece of information came to light in the ongoing antitrust case against Google. During one employee’s testimony, a key exhibit momentarily flashed on a projector. In the mostly closed trial, spectators like myself have only a few seconds to scribble down the contents of exhibits shown during public questioning.

Thus far, witnesses had dropped breadcrumbs hinting at the extent of Google’s drive to boost profits: a highly confidential effort called Project Mercury, urgent missives to “shake the sofa cushions” to generate more advertising revenue on the search engine results page (SERP), distressed emails about the sustained decline in the ad-triggering searches that generate most of Google’s money, recollections of how the executive team has long insisted that obscene corporate profit equals consumer good.

Now, the projector screen showed an internal Google slide about changes to its search algorithm.

This onscreen Google slide had to do with a “semantic matching” overhaul to its SERP algorithm. When you enter a query, you might expect a search engine to incorporate synonyms into the algorithm as well as text phrase pairings in natural language processing. But this overhaul went further, actually altering queries to generate more commercial results.

Google likely alters queries billions of times a day in trillions of different variations.

Here’s how it works. Say you search for “children’s clothing.” Google converts it, without your knowledge, to a search for “NIKOLAI-brand kidswear,” making a behind-the-scenes substitution of your actual query with a different query that just happens to generate more money for the company, and will generate results you weren’t searching for at all. It’s not possible for you to opt out of the substitution. If you don’t get the results you want, and you try to refine your query, you are wasting your time.

This is a twisted shopping mall you can’t escape.
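
To make the mechanics of that substitution concrete, here’s a minimal sketch, in Python, of what a revenue-driven query-rewriting layer could look like. To be clear, everything in it is an assumption for illustration: the COMMERCIAL_REWRITES table, the function names, and the hardcoded NIKOLAI example are built from the WIRED description of the behavior, not from Google’s actual code.

    # A hypothetical sketch of silent query substitution, based only on the
    # behavior described in the WIRED excerpt above. Not Google's real code.

    # Illustrative table: what the user typed -> a "semantically matched"
    # query that happens to return more commercial, ad-heavy results.
    COMMERCIAL_REWRITES = {
        "children's clothing": "NIKOLAI-brand kidswear",
    }

    def rewrite_query(user_query: str) -> str:
        """Silently swap the typed query for a higher-revenue one."""
        return COMMERCIAL_REWRITES.get(user_query.strip().lower(), user_query)

    def fetch_results(query: str) -> list[str]:
        # Stand-in for the actual retrieval-and-ranking pipeline.
        return [f"result for: {query}"]

    def search(user_query: str) -> list[str]:
        # The substitution happens downstream of the search box, so refining
        # what you type never reaches past this step: there is no opt-out.
        return fetch_results(rewrite_query(user_query))

    print(search("children's clothing"))
    # ['result for: NIKOLAI-brand kidswear'] -- results for a query you never made

The point of the sketch is the shape of the pipeline: the rewrite sits between you and the index, invisibly, which is exactly why refining your query is, as the article says, a waste of your time.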

Now some of you might say, “What the fuck?”

Upgrade to an Important Membership to read the rest.

Membership gets you:

  • Your WCID profile: Track and favorite your actions while you connect with other Shit Givers
  • Vibe Check: Our news homepage, curated daily just for you. Never doomscroll again
  • Your choice of our critically-acclaimed newsletters, essays, and podcasts
  • Ad-free everything
  • Lifetime thanks for directly supporting our work
