What happens when the technologies that shape our lives also shape our sense of self? In Searches, Vauhini Vara offers a personal response to that question. This is an introspective collection of essays that traces how the tools we use daily have become mirrors, magnifiers, and even editors of our identities.
Structured as a series of mostly chronological essays, Searches moves through the decades, from the dawn of internet culture in the 1990s to the hyper-connected present, showing how technology has subtly and profoundly altered the way we live, think, and relate to one another. Vara filters these reflections through the lens of her own life, drawing on intimate sources: her Google search history, Amazon product reviews, AdSense demographic data, and even chat logs with early versions of ChatGPT, which she asked to read and comment on drafts of the very book you're holding. She explores the interplay between technology, creativity, and human connection, and how innovation both responds to and reshapes our cultural and emotional landscapes.
One of the most intriguing chapters for me focuses on Google Translate, where language becomes both a barrier and a bridge. Vara places her own writing side-by-side with the tool's translations, sometimes showing how it fumbles with perfectly crafted Spanish sentences, and other times marveling at how it somehow improves upon her own mangled grammar and invented words. This experiment is about more than linguistic accuracy; it's a metaphor for how technology mediates communication, revealing both its limitations and its strange, unexpected brilliance.
She writes:
“Different languages divide us. It seems to me that we need to invent a universal language, in which everyone could talk to everyone and be understood. But, even speaking in our own language… full understanding is impossible.”
This tension between clarity and misunderstanding, between the desire to connect and the impossibility of perfect translation, runs throughout the book. It echoes a classic philosophical dilemma: the problem of other minds. If we can only access our own thoughts, how can we ever be sure that others think, feel, or perceive the world in the same way?
If you’re interested in the intersection of technology and selfhood, or simply enjoy memoirs that are as curious as they are vulnerable, this is one to sit with slowly.
Summary
Why Search Results Can Be Misleading Without Us Realizing It
Some people worry that Google shows us results that simply confirm what we already believe, especially when it comes to politics. This idea is called a "filter bubble." Google says that's not true: it claims its system reflects what you're interested in but doesn't infer things like your religion, race, or political views. Still, researchers don't all agree on what's really happening. What is clear, though, is that when something shows up in a Google search, it feels more trustworthy, even if it's not. That can be confusing, or even dangerous, depending on what you're looking at.
How Social Media Shapes Us to Perform Instead of Be Ourselves
Social media isn’t built to help us be our true selves. It’s built for performance. These platforms reward whatever gets more likes, views, or comments, so we naturally start shaping ourselves to fit what the algorithm wants. That often means showing faces and bodies that fit a narrow, mostly white and Western beauty standard. It means speaking in certain ways, acting with big emotions, especially anger, and learning how to grab attention.
Over time, we start changing ourselves in small, subtle ways just to fit in. There are even names for these changes: “Instagram face,” “TikTok voice.” It’s like constantly seeing yourself through someone else’s eyes, and not just anyone’s, but the dominant culture’s, amplified by machines. First it was information shaped by algorithms, then products. Now, it’s us: our personalities, our looks, our voices.
Living like this, always comparing our real lives to the polished, filtered versions we see online, can wear us down without us even noticing. It’s quiet but constant pressure. Teens especially feel it, spending an average of five hours a day on social media. And the more time they spend, the more likely they are to struggle with anxiety and depression.
As writer Jia Tolentino put it, trying to live and work online turns your entire self, your thoughts, your face, your feelings, even your dog, into content. It becomes a never-ending performance, all for the feed.
How Big Tech Is Quietly Taking Control of What We Know and Believe
Author Shoshana Zuboff warns that tech companies are gaining too much power by controlling what they know about us. She calls it an “epistemic coup,” a takeover of knowledge itself, and says it’s happening in four steps.
- First, these companies collect huge amounts of personal data from us, such as what we click, search, buy, and say online, and then treat it as their own property.
- Second, they keep this data to themselves, creating a gap between what we know about how they work and what they know about us.
- Third, that gap opens the door for harmful actors to spread false or extreme content using profit-driven algorithms, which damages our public conversations and political systems.
- The fourth step, which Zuboff fears is next, is the most alarming. It’s when tech companies use all this knowledge to quietly shape our behavior on a massive scale. Not just what we see online, but what we think, how we feel, and how we act.
How Machine Learning Helps Tech Companies Learn About Us Through Our Words
Machine learning is a way for computers to spot patterns in large amounts of data. If you give a machine-learning system enough examples (words, pictures, or sounds), it can start figuring out what goes together and how things work.
Google used this method to improve language translation. They fed the computer tons of text in different languages and let it learn how to match words and meanings across them. But this wasn’t just about building better tools. It also gave Google something more valuable: insight into how people communicate.
Every time we search, type, or speak online, we give tech companies bits of information about ourselves. And because Google sees so much of this language, it can use machine learning not just to understand words but to understand us.
Why Human Language Has Meaning and AI Just Mimics It
When people talk to each other, whether it’s face-to-face, over video, or even through writing, we’re not just saying words. We’re building meaning together. Even if we’re talking to someone we don’t know, like a writer we’re reading or a speaker we’re listening to, we still imagine who they might be and what they might have in common with us. That helps us understand what they really mean.
But AI doesn’t work that way. Language models like ChatGPT don’t have thoughts, feelings, or goals. They don’t know who they’re talking to or even that they’re talking to anyone at all. They just generate text based on patterns in data, like a very advanced form of echoing. There’s no intent behind the words, no real understanding, and no personal perspective. If we feel like the language means something, it’s because we give it meaning. Not because the AI does.
How AI Might Shape Culture to Fit What Corporations Want Us to Like
As AI gets better at learning what we enjoy, it won't just recommend content; it will start creating it. Chatbots and other AI tools will be able to write stories, make music, or even generate entire films designed to match our personal tastes. But it won't be real artists behind these creations. It'll be algorithms owned by powerful tech companies. Even the testing process might be done by fake audiences, built by those same companies.
Over time, the culture that comes from this won’t be neutral. It will reflect the values, preferences, and blind spots of the people and corporations who control the technology. Just like we’ve seen with the rise of “Instagram face” or “TikTok voice,” everything might start to look and sound the same. A single style could take over: one that reflects what’s profitable, not what’s personal or diverse.
If AI is a major leap in technology, then without pushback, it could also be a major leap in how deeply corporations shape our lives, from what we buy to what we believe to how we express ourselves.
Author: Vauhini Vara
Publication date: 8 April 2025
Number of pages: 352