This morning in a taxi, the driver glanced at me at a red light, hesitated, then mustered the courage to say: “Sir, could you please give me five stars? Please.”
His tone wasn’t that of a polite request; it was genuine pleading. If his average rating drops below 4.6, the system routes trips to higher-rated drivers first. For him, every star translates to actual money. I wasn’t sure how to respond. Not because the request was unreasonable, but because I suddenly realized that, sitting in the backseat, the one second my finger spent tapping the screen would determine how many rides he could take today and how much he could earn.
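The rule the driver described can be sketched in a few lines. To be clear, this is a purely illustrative assumption: the 4.6 cutoff comes from his account, but the field names and the sorting logic below are hypothetical, not any platform’s actual code.

```python
# Illustrative sketch of a rating-threshold dispatch rule like the one the
# driver described. The 4.6 cutoff comes from his account; everything else
# is a hypothetical assumption, not any platform's real algorithm.
from dataclasses import dataclass

RATING_FLOOR = 4.6  # below this, a driver is deprioritized


@dataclass
class Driver:
    name: str
    avg_rating: float  # running average of star ratings


def dispatch_order(drivers):
    """Offer the trip to above-floor drivers first, highest-rated first;
    fall back to the remaining drivers only if nobody qualifies."""
    eligible = [d for d in drivers if d.avg_rating >= RATING_FLOOR]
    pool = eligible if eligible else drivers
    return sorted(pool, key=lambda d: d.avg_rating, reverse=True)


drivers = [Driver("A", 4.9), Driver("B", 4.55), Driver("C", 4.7)]
queue = dispatch_order(drivers)
print([d.name for d in queue])  # → ['A', 'C']
```

Under a rule like this, driver B (at 4.55) never even sees the trip while A or C is available. A single low rating doesn’t just cost stars; it can silently remove a driver from the queue.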
This isn’t just one taxi driver’s story. This is the epitome of an entire era.
From Merit Books to Real-Time Judgment
I remember a news story from China a few years ago about a delivery worker. A young man broke down crying on the street because a customer gave him a negative review. Under the delivery platform’s rules, one negative review doesn’t just mean losing money; it triggers a system downgrade that compresses his order allocation for the next few days, while his colleagues pick up the opportunities he loses. One negative review can nullify an entire day of more than ten hours of hard work.
The contrast with an earlier era is striking. Less than a century ago, the “merit books” of folk religion worked like this: do a good deed, and heaven recorded a mark; do a bad one, and that was recorded too. But the reckoning happened once per lifetime. You had ample time to make amends, to adjust, to prove you were more than that one moment of error.
Now algorithms have upgraded the merit book: it no longer settles accounts once per lifetime, but daily, per order, per interaction, in real time. Efficiency has indeed improved, but the space left for people to breathe and recover has almost vanished.
The Invisible Panopticon
Michel Foucault, in his 1975 work “Discipline and Punish,” deeply analyzed Jeremy Bentham’s late-18th-century concept of the “Panopticon”: a circular building with a watchtower at the center and cells around the perimeter. Prisoners never know if they’re being watched, but precisely because of the pressure of “possibly being monitored,” they automatically discipline their own behavior.
The rating systems of the platform economy are the digital version of the Panopticon.
Taxi drivers don’t know which passenger will give them low stars, so they’re extra careful with every passenger. Delivery workers don’t know which order will draw a bad review, so they race desperately against the clock, tempted to run red lights they know they shouldn’t. The first thing Uber drivers do when they open the app each day isn’t checking their income; it’s checking whether their rating has dropped.
The difference is that in Foucault’s prison, there was at least a visible watchtower. In the platform economy, you don’t even know “who’s watching you.” Ratings come from an anonymous collective you can never question, while verdicts are executed by algorithms whose code you’ll never see.
Quantification Is Eating Everything
If you think this is just a problem for blue-collar workers, you probably haven’t realized how far quantification’s tentacles reach.
Credit scoring is already routine. In China, a Sesame Credit score above 600 lets you rent an apartment without a deposit, while the government-led social credit system directly affects millions of people’s freedom of movement: those on the “untrustworthy” list can’t even board planes or high-speed trains. In Taiwan, Joint Credit Information Center records determine how much you can borrow and at what interest rate. Your “credit” is no longer your neighbors’ word-of-mouth about your character; it’s a set of numbers.
Health data, too. Insurance companies already use wearable-device data to set premiums: people who walk more pay less; those who sit too much pay more. Your body is no longer just yours; it’s also a continuously quantified asset.
Even knowledge work can’t escape. In my experience running companies, I’ve deeply felt how “numbers” dominate decision-making. When we use KPIs to evaluate employees, conversion rates to measure marketing campaigns, reading time to judge an article’s value, we’re all doing the same thing: forcibly quantifying what cannot be quantified, then making decisions that affect real lives based on those quantified results.
The Entrepreneur’s Dilemma
To be honest, I’m also a participant in this system.
During my years as a digital transformation consultant, I helped clients build various “data-driven” evaluation systems. Revenue dashboards, customer satisfaction tracking, employee performance boards—all these tools started with good intentions, to make decisions more objective and transparent.
But the more I did this work, the more I discovered an uncomfortable truth: when you compress a person’s performance into a number, you’re essentially telling them, “Your value as a person equals this number.”
This reminds me of a concept I learned in seminary: the imago Dei, the image of God in humans. Christian theology maintains that human value is intrinsic and irreducible, existing independent of external performance. Your value doesn’t come from your output, nor from your rating, much less from an algorithm’s classification of you.
But algorithms say exactly the opposite: your value = your data.
The tension between these two ways of viewing humanity is, to my mind, one of the deepest conflicts of our time. It’s not just a matter of tech ethics; it’s a fundamental interrogation of what makes us human.
The Cost of Data as the New Oil
“Data is the new oil” has become so popular it’s almost a cliché. But most people hear only the wealth fantasy in “new oil,” without considering the other side of oil extraction: environmental destruction, the resource curse, geopolitical conflict.
Data extraction also has its costs. Except this cost isn’t polluted rivers, but eroded human autonomy. When every click, every swipe, every pause is recorded and analyzed, when your consumption behavior, social patterns, health data are all used to feed recommendation systems and credit models—your digital footprints are no longer just footprints. They’re a continuously operating self-portrait, and the right to interpret this portrait isn’t in your hands.
In one respect data outdoes oil: oil doesn’t pump itself out of the ground, while data is something you provide freely, automatically, and abundantly every day. We are both the producers of data and those judged by it.
The Algorithm is Watching You
So returning to that morning scene in the taxi.
After the driver said “please give me five stars,” I gave him five stars. But I kept thinking: how much courage does it take for someone to ask a stranger to evaluate them? Under what kind of system are people forced to do such things?
What we’re experiencing isn’t a game you can choose to join or leave. Rating systems have infiltrated labor, credit, health, education: almost every domain essential to survival. And the most paradoxical aspect of this system is that it makes the monitored believe they are free.
You can choose not to drive for Uber, but you can’t opt out of your credit score. You can choose not to use social media, but you can’t opt out of your profile in various databases.
The algorithm is watching you. And the more brutal question is: under its gaze, who are you?