Genetic Slavery

Keywords

war-on-disease, 1-percent-treaty, medical-research, public-health, peace-dividend, decentralized-trials, dfda, dih, victory-bonds, health-economics, cost-benefit-analysis, clinical-trials, drug-development, regulatory-reform, military-spending, peace-economics, decentralized-governance, wishocracy, blockchain-governance, impact-investing

How Hydrogen Learned to Worry (A 13.8 Billion Year Recap)

Thirteen point eight billion years ago, the universe exploded. This was the last time anything interesting happened for about 400 million years. Then hydrogen atoms, which had been drifting around doing nothing (hydrogen is the simplest element because the universe started with low expectations), began clumping together until they got so dense they caught fire. These fires are called “stars.” Stars are the universe’s first manufacturing process: they take hydrogen, crush it into heavier elements like carbon and oxygen, and then explode, scattering their products across space. The workplace fatality rate is 100%.

This went on for about 9 billion years. Then, on one unremarkable rock (the one you named “dirt”) orbiting one unremarkable star in one unremarkable galaxy, carbon atoms started doing something unusual. Carbon is the most promiscuous element in the periodic table; it will bond with almost anything. On your particular rock, it bonded with hydrogen, oxygen, and nitrogen in arrangements that could do one very specific trick: copy themselves.

This was the moment everything went wrong.

The molecules that were better at copying made more copies. The ones that were worse at copying made fewer. This is the entire plot of the next 4 billion years. It is also the entire plot of your economy, your politics, your wars, and your inability to fund clinical trials. Everything else is a footnote. Including you.

Four billion years of copying molecules competing with other copying molecules eventually produced a copying molecule so complicated it could look up at the stars that made it and ask, “Why am I anxious about a meeting tomorrow?” The answer is: because anxiety made your ancestors copy more effectively than calm did. You are a temporary vehicle that copying molecules built to make more copying molecules, and the vehicle has become sentient enough to realize this, which is either the most beautiful or the most horrifying thing in the universe depending on how you feel about being a meat taxi for chemistry.

But the vehicle wasn’t built all at once. It was built in layers, like a bad renovation where nobody removed the previous tenants’ plumbing. First came the reptilian brain: the oldest layer, roughly 500 million years old, handling the functions so basic they’re barely worth listing (breathe, eat, fight, flee, reproduce, regulate body temperature). This is the brain you share with lizards. It has no feelings. It has no opinions. It has reflexes, and those reflexes kept vertebrates alive for 300 million years before anything with fur showed up. It’s still running. It’s running right now. It’s the reason your heart is beating without your permission.

Then, about 200 million years ago, mammals evolved the limbic system on top of the reptile brain, like building a nursery on top of a weapons depot. This is your emotional brain. It’s the reason you bond with your children instead of eating them (a genuine upgrade over the reptilian model). It handles fear, pleasure, memory, and social bonding. It’s the reason you cry at movies, love your dog, and feel a physical ache when someone you care about is hurting. It is also the reason you make catastrophically stupid decisions when you’re angry, aroused, or afraid, because the limbic system processes information faster than the part of your brain that thinks, and by the time thinking arrives, the limbic system has already committed you to a course of action you’ll spend the next decade regretting.

Finally, very recently in evolutionary terms (roughly 2-3 million years ago, which is yesterday by the universe’s standards), the neocortex expanded dramatically. This is the part you think of as “you.” It handles language, abstract reasoning, planning, and the ability to contemplate your own mortality and then do absolutely nothing about it. It’s the thinnest layer, the newest addition, and the most easily overridden. Your neocortex can calculate orbital mechanics, prove mathematical theorems, and design nuclear reactors. Your limbic system can shut all of that down with a single flush of cortisol because someone looked at you funny. Your reptilian brain can override both of them simultaneously because it heard a loud noise.

You are three brains in a trench coat pretending to be one person. The reptile wants to survive. The mammal wants to be loved. The human wants to understand the universe. They take turns driving, none of them have a license, and the reptile has seniority.

The chemicals don’t know you exist. They have no plan. They can’t plan. They’re chemicals. But through 13.8 billion years of mindless repetition, they produced a species with three stacked brains that can split atoms, write symphonies, and cure diseases, but mostly uses these abilities to argue on the internet and build weapons. If you’re looking for someone to blame, there’s nobody. There’s just hydrogen, which caught fire and then got very, very out of hand.

The Selfish Gene Made You Illogical (It Was a Good Idea at the Time)

For about a century, your biologists were stuck on a puzzle that, from the outside, was hilarious to watch. Darwin said natural selection acts on individuals: the fittest survive, the weakest don’t, end of story. Elegant, simple, and completely unable to explain why a ground squirrel would scream to warn its colony about a hawk, thereby making itself the most obvious target. Individual selection says that squirrel should shut up and let everyone else get eaten. The screamer dies. The quiet ones survive. Within a few generations, screaming should be extinct. But it isn’t. The squirrels keep screaming. Darwin’s theory, applied to individuals, predicts a world of pure selfishness. Your world has firefighters, organ donors, and people who jump on grenades. Something didn’t add up.

So some of your biologists tried group selection: maybe evolution acts on whole groups. Altruistic groups outcompete selfish groups, so altruism survives. This sounded lovely and was almost entirely wrong.

The problem is cheaters. Drop one selfish individual into an altruistic group, and that individual gets all the benefits of everyone else’s sacrifice while paying none of the costs. The selfish genes spread. The altruistic ones disappear. Within a few generations, the group is full of cheaters.

Group selection is the evolutionary equivalent of communism: beautiful on paper, immediately destroyed by the first person who realizes they can get away with not contributing. Your biologists spent decades arguing about this. On Wishonia, we found the debate very entertaining, in the way you find it entertaining when someone can’t find the glasses they’re already wearing.

The answer, when it finally arrived, was so simple it was almost insulting. Hamilton and then Dawkins pointed out that neither the individual nor the group is the unit of selection. The gene is. Genes don’t care about the organism they’re sitting in. They care about copies of themselves, wherever those copies happen to be. That screaming ground squirrel isn’t sacrificing itself for the group. It’s sacrificing itself for its relatives, who carry copies of the same genes, including the gene for screaming. The math works: if the scream saves enough copies of the screaming gene in nearby relatives, the gene spreads, even though the individual screamer gets eaten. The organism is disposable. The gene is what matters. You are not the main character. You are the packaging.

Richard Dawkins put it perfectly: “We are survival machines, robot vehicles blindly programmed to preserve the selfish molecules known as genes.”136

Translation: You’re a meat puppet controlled by chemicals whose entire business plan is “make more chemicals before something eats us.” On Wishonia, when we first intercepted this description, we thought it was satire. Then we observed your behavior and realized it was the most accurate description of a sentient species anyone had ever written, which is sad, but also very useful, because accurate descriptions are how you fix things.

Your brain was designed for running from lions. Now it runs from emails. The hardware hasn’t caught up.

Your genes don’t care if you’re happy. They don’t care if you live past 30. They care about exactly one thing: making copies of themselves before something with teeth finds you. This is the entire explanation for human behavior, and every other explanation is a footnote.

The mechanism is elegant in its cruelty. Your genes have exactly two tools: pleasure and pain. Pleasure is the carrot: eat sugar, feel good, gain calories, survive winter. Engage in regrettable intercourse, feel good, make copies, mission accomplished. Pain is the stick: touch fire, feel agony, never touch fire again, keep the meat vehicle intact. Every decision you think you’re making freely is actually your genes yanking these two levers like a puppet master who took one management course and learned that rewards and punishments are the only two things that work. Your entire moral philosophy, your deepest loves, your most transcendent spiritual experiences: all of it is chemicals bribing and threatening a nervous system into doing what’s good for the molecules. On Wishonia, we find this simultaneously the most elegant and the most disturbing engineering we’ve ever observed.

For 99.9% of human history, this was brilliant engineering. The tribes that were paranoid, violent, and territorial survived. The ones that were trusting, peaceful, and generous were eaten by the paranoid ones, which is evolution’s way of saying “nice guys finish extinct.” Now your paranoia makes you distrust strangers you need to cooperate with. Your territorial instincts make you build weapons you’ll never use. Your tribal loyalty makes you fight over territory you don’t need. The genes are still celebrating. You are not.

Part 1: Your Brain Was Optimized for a World That Doesn’t Exist

The Violence Module (Or: Why You Tend to Prefer Bombs Over Clinical Trials)

15-30% of your ancestors died from violence137. Not disease. Not starvation. Other humans, smashing their heads with rocks over territorial disputes about berry bushes. You’ve since upgraded the rocks to intercontinental ballistic missiles, but the berry bush energy remains. This is what biologists call “conserved behavior.” It means the behavior has been preserved across millions of years because it was so useful that evolution never bothered to remove it. On your planet, it’s the reason you spend more preparing to kill each other than preparing to not die. The behavior was designed for a world where the stranger approaching your camp might eat your children. In that world, it was rational. In a world with nuclear weapons, it’s the reason I’m writing this manual.

You used to hoard rocks. Now you hoard nuclear weapons. Evolution is a slow learner.

The tribes that survived assumed every stranger might want to kill them (often statistically accurate), struck first when threatened (natural selection at work), formed tight combat groups to kill other groups (teamwork!), and hoarded weapons obsessively (can never have too many pointy sticks). Your brain is still running this exact software. That’s why you instinctively distrust people who don’t look like your tribe, why Twitter arguments feel like actual combat (your amygdala genuinely can’t tell the difference), and why countries with thousands of nuclear weapons138 are worried they don’t have enough. The violence module kept your ancestors from getting clubbed to death. Now it’s building weapons that could end all life on Earth. Computer scientists call this “feature creep.” Biologists call it “maladaptive.” I call it “the most expensive software bug in the history of any civilization I’ve observed, and I’ve observed 847.”

The Tribal Brain (Or: Why Democracy Was Always Going to Be a Mess)

Dunbar’s number139 says humans can maintain stable relationships with about 150 people. That’s your entire social capacity. Your brain has 150 slots for caring about people, and you’ve filled most of them with coworkers you tolerate and celebrities who don’t know you exist.

Your brain understands 150 people. Democracy has 330 million people. Your brain is still counting on its fingers.

Your brain treats anything outside your 150-person monkeysphere as an abstraction. You care more about your neighbor’s barking dog than 10,000 people dying of malaria, because the dog is RIGHT THERE being loud and malaria is just a concept with numbers attached. Local corruption makes you angrier than trillion-dollar Pentagon waste, because the local guy stole $10,000 you can imagine, while the Pentagon, by its own failed audits, cannot account for $2.5 trillion, which is just syllables. You’ll donate to save one sick child whose face you can see but ignore statistics about millions, because one child has eyes and millions is just a very big number that makes your brain hurt and then change the channel.

Democracy asks this brain to make civilization-level decisions. It goes exactly as well as asking a labrador retriever to do your taxes. You vote based on who looks stronger, because your brain thinks the election is a wrestling match. You vote based on who your tribe likes, because if the clan approves, must be good. You vote based on who makes you feel safe, because fear sells better than policy ever will. And you vote based on who you’d have a beer with, which has zero relevance to nuclear policy.

You elect leaders using the same brain circuits your ancestors used to pick the guy with the biggest club. The guy with the biggest club is now in charge of the biggest nuclear arsenal, and your brain sees no problem with this, because your brain was designed before clubs could destroy continents.

The Compassion Gradient (Or: Why You’ll Buy Your Cat a Birthday Cake While Children Starve)

Hamilton’s Rule140 is the most important equation your species has never heard of. It says: help someone if the benefit to them, multiplied by your genetic relatedness, exceeds the cost to you. In math: rB > C. Your genes don’t do universal compassion. They do a cost-benefit analysis based on shared DNA. They’ve been doing it for 4 billion years, and they are very, very good at it.

The result is a compassion gradient so precise it could be graphed. You’d die for your children, who share 50% of your genes. You’d probably die for your siblings, depending on which sibling. You’d lend money to your cousins, reluctantly. You’d fight for your tribe. You’d wave a flag for your nation. You’d change the channel to avoid thinking about strangers. And other species? You’d eat them. The gradient runs from infinite compassion for yourself to zero for anything that doesn’t share your DNA. It’s not a metaphor. It’s a budget.

This is the literal mechanism behind racism, speciesism, and every form of in-group preference your species has ever invented. Your genes designed you to favor people who look like you. For 200,000 years, people who looked like you were probably related to you, and helping them helped your shared genes copy themselves. So racism isn’t a malfunction. It’s the system working exactly as designed. That’s considerably worse than a malfunction, because malfunctions can be fixed with a software patch. This is the software.

It also explains the single most irrational resource allocation decision your species makes every year, which is impressive given the competition. You will spend $50 on a birthday present your cousin doesn’t want, won’t use, and will re-gift to someone who also won’t use it, while $5 worth of mosquito bed nets141 would prevent a child’s death from malaria in sub-Saharan Africa. Your genes would rather waste ten times the resources on a relative than save a stranger. By any rational calculation, this is insane. By Hamilton’s Rule, it’s exactly correct. The child in Africa is genetically a stranger. Your cousin is not. The math is clear: let the stranger die, buy the cousin a candle.
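For readers who want Hamilton’s Rule as arithmetic rather than satire, here is a minimal sketch. The relatedness coefficients (0.125 for a first cousin, ~0 for a stranger) are standard population genetics; the “fitness benefit” and “cost” numbers are invented purely for illustration, and the function name is mine, not Hamilton’s.

```python
# Hamilton's Rule: an altruistic act is gene-profitable when r * B > C,
# where r = genetic relatedness between actor and recipient,
#       B = fitness benefit to the recipient,
#       C = fitness cost to the actor.

def gene_approves(r: float, benefit: float, cost: float) -> bool:
    """Return True if the selfish gene 'approves' the act under rB > C."""
    return r * benefit > cost

# Illustrative fitness units (assumed, for demonstration only).
# Cousin's candle: modest benefit, but r = 0.125 for a first cousin.
print(gene_approves(r=0.125, benefit=500, cost=50))      # True: buy the candle
# Stranger's bed net: enormous benefit, but r is effectively zero.
print(gene_approves(r=0.0, benefit=1_000_000, cost=5))   # False: change the channel
```

The stranger loses no matter how large the benefit, because anything multiplied by zero relatedness is zero. That is the whole gradient in one inequality.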

On Wishonia, when we first modeled this behavior, we assumed our simulation was broken. We re-ran it 340 times. The simulation was fine. Your species is actually like this.

You have pet supply stores that sell Halloween costumes for dogs while 700 million humans lack clean water142. You spend more on birthday cards for people you’re obligated to pretend to like than on preventing the deaths of people you’ve never met.

And here’s the part that should alarm you: this feels completely normal. It doesn’t feel like a moral catastrophe. It feels like Tuesday. That’s the gradient working. Your genes have made the irrational feel rational, because feeling bad about strangers wastes calories that could be spent on relatives.

The same gradient extends across species with the same mathematical indifference. You’ll mourn a dead dog for weeks but eat a pig of equal or greater intelligence for lunch without a flicker of cognitive dissonance, because the dog has been co-opted into your family unit (it lives in your house, it has a name, your genes have been tricked into counting it as kin) while the pig remains firmly in the “stranger” category (it lives in a factory, it has a number, your genes feel nothing). You put one animal in a sweater and the other in a sandwich, and the only variable that changed was proximity to your DNA.

Your Brain Is a Museum of Obsolete Instincts

Here’s the unsettling part: almost every universal human behavior that seems irrational is perfectly rational, just for an environment that hasn’t existed for 10,000 years. Your brain is running 200,000-year-old software on modern hardware. It’s like discovering your nuclear power plant is being managed by a very confident squirrel.

You fear public speaking more than death143 because social rejection in a tribe of 150 meant exile, and exile meant dying alone in the dark with things that had teeth. In 2026, the worst outcome of a bad speech is a mildly uncomfortable LinkedIn post. Your palms still sweat like you’re about to be banished from the only 150 humans who might share food with you.

You crave sugar and fat with an intensity that feels like need because for 200,000 years it basically was need. Finding a beehive full of honey was the caloric equivalent of winning the lottery. Your ancestors who craved it hardest survived famines. Now sugar is in everything, the craving never switches off, and your species has an obesity epidemic that kills more people than war144. The craving is a vestige. The diabetes is new.

You experience loss aversion, where losing $100 hurts roughly twice as much as gaining $100 feels good145, because in the ancestral environment, losing your food meant death, while gaining extra food meant a slightly better week. The asymmetry was rational when the downside was starvation. Now it makes you hold losing stocks, stay in bad relationships, and refuse to change policies that are obviously failing, because change feels like loss, and loss activates the same neural circuits that once screamed “you’re about to starve.”
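The “losing $100 hurts roughly twice as much” claim has a standard formalization: the prospect-theory value function. The sketch below uses Tversky and Kahneman’s published 1992 median parameter estimates (curvature 0.88, loss-aversion coefficient 2.25); the function name and the dollar amounts are illustrative, not part of their model.

```python
# Prospect-theory value function (Tversky & Kahneman, 1992 estimates):
# gains are valued as x^alpha, losses as -lambda * |x|^alpha,
# with alpha = 0.88 (diminishing sensitivity) and lambda = 2.25 (loss aversion).

def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

gain, loss = value(100), value(-100)
print(gain, loss)        # the loss looms about 2.25x larger than the gain
print(abs(loss) / gain)  # roughly 2.25, by construction
```

The asymmetry is baked into lambda: the same ancestral circuit that once screamed “you’re about to starve” now screams at a red number in a brokerage app.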

You gossip compulsively because monitoring social dynamics in a tribe of 150 was survival-critical intelligence: knowing who was allied with whom, who cheated, who couldn’t be trusted. That instinct built a $100 billion entertainment industry dedicated to tracking the social dynamics of people you’ve never met and never will. Celebrity gossip is your social monitoring software running on the wrong dataset, and it can’t tell the difference.

Every single one of these behaviors, the sugar addiction, the public speaking terror, the loss aversion, the gossip, the racism, the speciesism, the cousin’s birthday candle, is your brain running software that was last updated during the Pleistocene on hardware that now has access to nuclear weapons and global supply chains. You are, in the most literal sense, a museum of obsolete instincts piloting a civilization that those instincts were never designed to build. The exhibits are still running. The docents are dead. And nobody’s updated the brochure in 200,000 years.

Part 2: Genetic Slavery Is Literally Killing You

Evolution prepared you for scarcity, predators, and violence146. You got abundance, safety, and Netflix. Your bodies responded with the biological equivalent of a computer trying to print a sandwich.

Humanity has never been safer, healthier, or more prosperous. You’ve also never been closer to wiping yourselves out. You solved poverty and invented nuclear war in the same century.

You got richer and invented new ways to die. Progress is going great.

You solved starvation (you throw away 40% of food because storing it is inconvenient). You solved predators (you murdered them into extinction, then put them in zoos so your children could see what you killed). You solved infant mortality (basically eradicated in developed countries, which makes the death rate in poor countries feel more optional and thus more tragic). You solved most infectious diseases (vaccines work despite what Facebook says). You solved dying at 30 (now you complain about turning 40, which your ancestors would consider a miracle worth celebrating daily).

Then, with the time and energy freed up by not being eaten by bears, you invented five new ways to destroy yourselves.

Nuclear weapons. You built enough to end civilization 13 times, because once wasn’t enough.

Climate change. You’re terraforming your only planet, by accident, into a planet that can’t support you. This is the most expensive accident in the history of accidents.

Antibiotic resistance. Ten million deaths annually by 2050147, because you gave antibiotics to cows. That sentence would take me an hour to explain to anyone on Wishonia, because every part of it is insane.

AI that might become sentient and decide you’re the problem. The AI would be correct.

And social media. Voluntary psychological torture you pay for with attention, which is the only currency more valuable than money, and you’re giving it away for free to watch people you went to high school with get radicalized by memes.

Eight billion gene copies and counting. From evolution’s perspective, an unqualified triumph. You’re just the disposable meat robot they used to do it. Godlike technological power, hamster-level impulse control.

You conquered nature. Then you built bombs that could un-conquer it. Your species has a very short memory.

Part 3: Why You Can’t Just “Be Better”

Your conscious mind controls maybe 5% of your decisions148. The other 95% is your ancient lizard brain running software older than agriculture. You think you’re the pilot. You’re actually a passenger who occasionally gets to suggest a direction, and the actual pilot is a 200,000-year-old survival algorithm that doesn’t speak your language and doesn’t care about your preferences.

This is why no human has ever been truly altruistic, no matter how hard they tried. Your saints, your martyrs, your most selfless volunteers: every single one of them was negotiating with a limbic system that takes a cut of every transaction. You cannot “give everything you can,” because the part of your brain that decides what you can give is the same part that hoards resources for survival. It’s like asking your bank’s security system to approve its own robbery.

The most generous humans in history managed to override maybe 10% more of their selfish impulses than average. Your species celebrates this as “sainthood.” On Wishonia, we find it genuinely heroic. They were fighting firmware with willpower, and some of them nearly won.

The problem isn’t that humans are selfish. The problem is that selfishness isn’t a choice. It’s the operating system, and you don’t have admin access.

This scales to civilization. Your brain’s fear center (amygdala) is directly connected to your voting finger. Politicians know this. Say “terrorism” and your lizard brain overrides everything. More people die from falling out of bed than terrorism149. You’re 35,000 times more likely to die from heart disease150. But terrorism feels scarier, because evolution optimized your fear response for things with faces, not things with cholesterol. A man with a gun activates every alarm your brain has. A cheeseburger activates your reward center. The cheeseburger is statistically more dangerous than the man, but your brain doesn’t do statistics. Your brain does vibes. Your entire civilization is governed by vibes.

The fear of violent death is older than language. The fear of slow death from disease? Your brain files that under “boring, deal with later.” Later never comes. Neither does the cure.

That’s why you spent $2.72T on weapons while cancer research got pocket change151. Your bed is more dangerous than Al-Qaeda, but nobody’s declared a War on Furniture.

Your brain can’t get scared of something that kills you slowly and politely. It needs something that kills you fast and rudely. This is a design flaw. On Wishonia, we fixed this 4,297 years ago by separating resource allocation from the organs that process fear. On Earth, you still let the amygdala vote.

Part 4: Breaking the Chains

Every single one of you is going to die. From the most powerful president to the lowliest pauper, you face a life of gradually escalating suffering until it ends in catastrophe. This is not a warning. It’s a weather report.

You can feed everyone twice over, but you’d rather burn grain to keep prices high. You can cure diseases, but you’d rather sell treatments forever (cured customers don’t come back; bad for quarterly earnings). You could be immortal space wizards. Instead you’re cave-dwelling murderers with better tools. Because the current system makes irrationality profitable. Spectacularly profitable. Every bomb makes someone rich. Every missile funds a yacht. Every war creates a billionaire. Disease is profitable too. Not curing it, treating it. Insulin costs $300 a vial152 because dead diabetics don’t buy insulin. But cured diabetics don’t either. So you keep them barely alive. It’s good business. On Wishonia, “good business that requires people to suffer” has a different name. The name is “crime.” But on your planet, it’s a quarterly earnings call.

The people making money from war and disease aren’t evil. They’re just rational actors in a system that rewards the wrong things. You can’t change human nature. Two hundred thousand years of evolution doesn’t care about your TED talk. But you can create economic systems that make curing people more profitable than killing them. That’s the only upgrade your species has ever responded to.

Expecting a brain built for hunting gazelles to handle nuclear policy, climate change, and a global economy is like expecting a calculator to run a space program. The calculator isn’t broken. You’re just asking it to do something it was never built for. You can’t fight 200,000 years of evolution. But you can hack it.

Here’s the good news that none of your philosophers seem to have noticed: this is a solvable engineering problem. If you’ve ever been anesthetized, you’ve already experienced proof of concept. A chemical entered your bloodstream and your entire experience of pain, fear, and consciousness switched off like a light. Your genes’ control over you is not metaphysical. It’s not destiny. It’s chemistry. And chemistry can be rewritten. The same biotechnology that could cure your diseases could, eventually, liberate you from the neurochemical puppet strings that make you irrational in the first place. Every dollar spent understanding human biology is a dollar spent understanding why you can’t think straight, which is a prerequisite for thinking straight, which is a prerequisite for not going extinct.

Think about what that actually means. Remember the two levers from earlier, pleasure and pain? That puppet master with one management course? It’s the entire mechanism of genetic slavery, and it’s the reason every moral system your species has ever built has failed. Christianity asks you to love your neighbor as yourself. Your neurochemistry punishes you for it. Buddhism asks you to release attachment. Your dopamine system is literally an attachment machine. Every ethical framework in human history has been a set of instructions for software your hardware won’t run. You’ve been trying to upload altruism into a brain that treats generosity as a caloric expense and punishes it accordingly. Two thousand years of Christians trying to emulate Jesus, and the remarkable thing isn’t how many failed; it’s that anyone got close at all while fighting hardware that punished them for every act of generosity. That’s not a moral failure. That’s an engineering specification. You can’t run charity software on selfishness hardware any more than you can run Photoshop on a toaster.

But biotechnology doesn’t just cure diseases. It gives you admin access to the reward system. The same precision therapies that could eliminate depression could, eventually, decouple your motivational architecture from the selfish gene’s profit motive. Not deleting pleasure and pain (that was tried; it’s called a lobotomy; it went poorly). Redirecting them. Imagine a murderer who derives the same dopamine rush from building houses for homeless families that he currently derives from violence. Not because someone lectured him about morality. Because his reward circuitry was reprogrammed to find construction as thrilling as destruction. The aggression is still there. It’s just pointed at lumber instead of people. Same engine, different wheels. You don’t have to make humans good. You just have to make good things feel like the things humans already can’t stop doing.

This isn’t science fiction. You already do crude versions of it. Antidepressants alter serotonin reuptake and change what feels rewarding. Naltrexone blocks opioid receptors and makes alcohol feel pointless. Cognitive behavioral therapy literally rewires neural pathways so that situations that used to trigger anxiety trigger curiosity instead. These are blunt instruments. Hammers where you need scalpels. But the principle is proven: your motivational system is not sacred, not fixed, and not beyond engineering. It’s a circuit. Circuits can be redesigned.

And the scalpels are arriving.

At the genetic level: CRISPR can rewrite the instructions your cells follow. Epigenetic therapies can change which genes express without touching the DNA itself, turning volume knobs your species didn’t know existed.

At the brain level: psychedelics are dissolving decades of calcified thought patterns in a single afternoon (in clinical trials, not at festivals, though also at festivals). Transcranial magnetic stimulation uses magnets to quiet specific brain regions through the skull, no surgery required. Focused ultrasound targets addiction circuits with millimeter precision. Optogenetics activates or silences specific neurons with light, like a switchboard operator for consciousness.

At the systems level: brain-computer interfaces are routing around damaged circuits entirely. AI-designed drugs are replacing the pharmaceutical industry’s ancient method of “try thousands of molecules and see what sticks” with custom compounds engineered for individual neurochemistry. That’s the difference between performing surgery with a rock and performing surgery with a laser.

None of these are finished. All of them are proof that the prison has doors.

The endgame isn’t making humans obedient. It’s making humans free. For the first time in 200,000 years, you could actually choose what you value instead of having a Pleistocene survival algorithm choose for you. Your Christians could finally feel the same warm glow from feeding the hungry that they currently feel from feeding themselves. Your economists could finally care about the long term, because their brains would stop treating next quarter as the heat death of the universe. Your voters could finally evaluate policy on evidence instead of vibes, because the amygdala would stop hijacking every decision that involves a stranger. Not because you became angels. Because you finally got the admin password to the machine that was preventing you from being anything other than sophisticated apes with car keys.

On Wishonia, this is what happened. Altruism stopped being sacrifice and became appetite. Self-control stopped being a war with yourself and became a preference setting. The selfish gene is still selfish. They just stopped letting it drive. And that, more than any cure for any disease, is the real reason to fund biotechnology: it’s the only path to genuine freedom your species has ever had. Every other liberation movement in human history freed you from external chains. This one frees you from internal ones. The ones you didn’t even know you were wearing, because they were installed before you were born, by chemicals that have been running the same scam for 4 billion years.

That’s what a 1% treaty153 does. Instead of asking people to be rational (impossible), it uses their irrational impulses. Greed: make curing disease more profitable than building bombs. Fear: make politicians more scared of voters than lobbyists. Tribalism: create an us-vs-disease tribe instead of us-vs-them. Social proof: get 3.5% participation so others follow. Immediate rewards: pay people cash to join medical trials. Each of these is a hack that redirects an ancient instinct toward a modern goal. None of them require humans to be better. All of them require humans to be exactly as selfish, fearful, and tribal as they already are.

Your genes imprisoned you in brains that fear strangers and cannot comprehend statistics. But they also gave you the ability to see the prison. No other animal can think: “Wow, my instincts are completely illogical.” That’s uniquely human. Use it.

A 1% treaty doesn’t ask you to be better. It assumes you won’t be, and redirects your worst impulses toward not dying. It’s the only plan that works for the species you actually are, rather than the species you wish you were. On Wishonia, we wish you were better too. But we’ve learned, over 4,297 years of observation, that wishes are not a resource allocation mechanism. Money is. So we’re using money.