Late last year, I spent twelve days using AI to build an entire website. Code, automation, multilingual translation—everything done. But after finishing, I sat staring at the screen for a long time.

Not because I was tired. But because a thought suddenly surfaced: if all of this can be done for three thousand dollars, what is the judgment I’ve accumulated over twenty years actually worth?

This question made me uncomfortable. But uncomfortable questions are usually the right questions.

Negative Entropy Is Not an Organization Technique

In “What Is Life?”, Schrödinger proposed a concept: life can resist the universe’s tendency toward chaos because it continuously “extracts order” from its environment. He called this ability negative entropy.

When most people hear negative entropy, they think of organizing desks, building knowledge management systems, taking notes. But that’s tool-level negative entropy, and AI does it a hundred times better than you.

True humanistic negative entropy is something else—it’s your ability to transform a fragmented experience into a meaningful story. It’s your ability to recognize what’s worth caring about among a pile of contradictory information. It’s your ability to make a decision you’re willing to bear the consequences of when facing a situation with no standard answer.

No model’s loss function can capture these abilities.

What Theology Taught Me

I have fifteen years of theological training. This experience is usually inconvenient to mention in tech circles, but it’s actually the underlying operating system for all my judgments.

The core of theological training isn’t memorizing scriptures. It’s facing an ultimate uncertainty—you can neither prove God exists nor prove He doesn’t—and then, within this uncertainty, choosing a way to live.

This is very similar to entrepreneurship. You can never prove a business model will definitely work. You can only make judgments with limited information, then validate through action. The difference is that entrepreneurship is validated by the market, while faith is validated by a life.

In the AI era, this ability to “act within uncertainty” has become even more critical. AI excels at providing optimal solutions in high-certainty domains. But life’s important decisions—whether to divorce, whether to start a business, whether to abandon a stable position to pursue an uncertain vision—have no optimal solutions. Only your solutions.

Here lies the first meaning of humanistic negative entropy: it doesn’t make you know more; it enables you to act when you don’t know.

Why “Efficiency” Is Actually a Trap

I spent ten years in the circular economy industry. The core logic of this industry is: what others see as waste, you see as resources. The key isn’t that the thing itself changed, but that the framework through which you view it changed.

The same logic applies to knowledge.

The problem now isn’t insufficient knowledge. It’s too much knowledge, so much that it loses meaning. You can have AI organize a complete knowledge graph of the concept of “negative entropy” in five minutes. But after organizing it, then what? How has your life changed as a result?

Efficiency was the core metric of the industrial age. But in an era of meaning scarcity, efficiency is actually a trap. The faster you digest information, the easier it is to miss the things that need slow chewing: the warmth behind a poem, the struggle within a piece of history, the lesson inside a failure.

This is why I say humanistic negative entropy isn’t chicken soup. Chicken soup makes you feel good. Humanistic negative entropy makes you willing to face things that make you feel bad, then extract meaning from them.

Meaning Is Not Found, It’s Created

During my collaboration with Claude, the biggest realization wasn’t technical. It was being forced to redefine “where my value lies.”

When AI can write code, translate, do data analysis, and generate images, the things I thought required a team can be handled by one person plus one AI. What remains?

Judgment. Taste. The decision of where you choose to spend your time. The intuition that looks at a bunch of viable options and says “No, this direction is wrong.”

These things don’t come from databases. They come from the books you’ve read, the people you’ve loved, the traps you’ve fallen into, the questions you’ve asked yourself at three in the morning.

Meaning is not discovered, it’s constructed. Every time you choose to go deep instead of scroll past, every time you choose to face instead of avoid, you’re doing humanistic negative entropy. You’re resisting the universe’s tendency to push everything toward disorder.

Chaos is not the enemy. Chaos is raw material.

The tools you use to refine it into order determine what kind of person you become. AI is a good tool. But choosing what to refine and for whom—that’s your business.