The Behavioral Architect
How Nikita Bier spent a decade perfecting psychological control on teenagers—and now deploys it on everyone at X
“Rethink Democracy.”
That was the tagline for Politify back in 2011. A Berkeley student project that went viral during the election. Four million users in one month. National media coverage. A feel-good story about using technology to help voters understand how policies would affect them.
Nikita Bier was 22 years old when he discovered something profound: Americans vote against their financial self-interest more than the citizens of any other Western nation.
Most people would see that as a problem to solve. A gap to bridge with better information and education.
Bier saw it as an opportunity to exploit.
Because what he actually learned from Politify wasn’t just that voters are irrational. What he learned was that “rhetoric supersedes logic when systems are complex enough that ordinary citizens don’t have hours to understand them.”
That line, buried in his TED talk and some Berkeley research papers, tells you everything you need to know about what came next. It wasn’t a warning about vulnerabilities in self-governance. It was his operating manual.
Politify proved the concept worked. A Berkeley study showed that 6% of users changed their voting choice after using the app. That’s enough to swing an American election. The Knight Foundation gave him money. Governments wanted to license the technology.
But he walked away from all of it. Not because the mission was accomplished. Because he’d learned that manipulating behavior at scale was possible, and government contracts weren’t the most lucrative way to apply that knowledge.
So he went looking for a better laboratory. A population more psychologically vulnerable. More desperate for validation. More willing to give up their data in exchange for a dopamine hit.
He went hunting for teenagers.
The Laboratory: tbh and the Science of Dependency
In 2017, Bier launched tbh (“to be honest”), an app that let high schoolers send anonymous compliments to each other through polls. The pitch was pure and positive: “Make this generation happier.” Stop the bullying that plagued other anonymous apps. Create a space for kindness.
The reality was more sinister.
To use tbh, you had to give the app access to your contacts. Your location. Your school. Your social graph. Every vote you cast in a poll. Every notification you responded to. Every dopamine hit the app delivered. The app wasn’t measuring happiness. It was measuring dependency.
Which prompts created the most engagement? Which rewards kept users coming back? How quickly did teenagers respond when they “won” a poll? What made them anxious when they weren’t chosen? Every data point was a lesson in behavioral manipulation.
And the growth strategy? Pure psychological warfare. Bier would create a private Instagram account for a specific high school. Follow students who had that school in their bio. Then at a predetermined time, make the account public with a link to download the app.
Engineered scarcity. Manufactured FOMO. Weaponized social anxiety. He called it “a psychological trick.” That’s not marketing language. That’s a confession.
Facebook acquired tbh within months. Not for the product. They shut it down less than a year later. They acquired it for what Bier had learned. The behavioral data. The manipulation techniques. The proof that you could engineer psychological dependency in teenage populations at scale.
God Mode: When the Quiet Part Gets Said Out Loud
Then Bier did something that should have raised every red flag: he built Gas, a nearly identical clone of tbh. Same anonymous polls. Same dopamine engineering. Same growth tactics. Same mandatory access to contacts and location data.
But this time he added something darker: “God Mode.” For a fee, some users could get hints about who voted for them in polls. The psychological architecture was explicit now, monetizing insecurity by creating information asymmetry. Some kids could see who liked them. Others remained in the dark. Pay to resolve the anxiety we engineered into the experience.
Gas exploded to 5 million downloads. Then the rumors started. Parents on TikTok and Snapchat began spreading a theory: Gas was a front for human trafficking. The hoax went viral. Police departments issued warnings. Schools sent alerts. The app experienced a 3% uninstall rate in a single day. Bier’s team received death threats.
The trafficking claims were baseless. The app had no messaging, no location tracking, no features that could enable trafficking.
But here’s what’s revealing: the hoax worked because the app already felt predatory. Parents couldn’t articulate why, but they sensed something wrong. Teenagers couldn’t explain it, but they felt manipulated. The app was designed to exploit fear of social rejection, monetize adolescent insecurity, and extract behavioral data under the guise of “positivity.”
When parents heard “human trafficking,” they didn’t need evidence. The entire design already felt like exploitation. Discord acquired Gas in 2023. Again, not for the product. For the proof that Bier’s techniques were reproducible.
The Hiring: When Free Speech Met Behavioral Control
Now he’s X’s Head of Product.
Elon Musk bought Twitter promising to restore free speech. To make it the digital town square. To end the shadowbanning and algorithmic manipulation. To bring transparency to content moderation.
Then in late 2024, as Bluesky and Threads gained momentum and X bled users, Elon needed someone who could make the platform stickier. Someone who understood how to engineer psychological dependency. Someone who’d spent a decade learning how to keep people scrolling even when they knew they should leave.
He hired Nikita Bier.
A man whose entire career has been built on exactly the opposite of everything Elon promised: opacity over transparency, behavioral manipulation over authentic engagement, psychological control systems disguised as community building.
Bier’s expertise wasn’t in building community. It was in building cages that feel like community.
And suddenly, mysteriously, predictably, users are reporting exactly what you’d expect from a man who made his name disguising behavioral manipulation as innovation.
Reach vanishing without explanation. Accounts shadowbanned for “inauthentic activity.” Critics blocked when they ask questions. Complexity preventing understanding. Opacity preventing challenges.
The Architect Doesn’t Like Questions
When users started noticing these patterns, verified accounts with large followings suspended for unclear reasons, they reached out to Bier directly. He’s the Head of Product. Surely he could help.
He blocked them.
When someone reported that their home address had been posted online and asked if that violated platform policy, Bier responded: “Thanks for flagging. I sent the post to our policy team and that user was suspended.” Then when that same user pressed him on why the reporting system was broken and why users organizing doxxing campaigns weren’t being automatically suspended, Bier blocked them too.
The behavioral architect doesn’t like being questioned about his architecture.
But the most revealing pattern is what the algorithm now promotes.
A selfie from a moderately known account? 129 million views.
Substantive research and investigative journalism? Suppressed.
Meme content? 6.2 million views.
External links to academic papers or primary sources? Buried.
Bier announced it explicitly: “Starting next week we’ll be testing a new way to share and engage with web links on X. The goal will be to ensure all content on the platform has equal visibility on Timeline.”
Translation: we’re going to bury external links, the kind that provide verification, context, and primary sources, and boost native engagement bait instead. Selfies. Memes. Rage content. Anything that keeps you on platform, scrolling, engaging, feeding the algorithm.
The business model is obvious. Engagement equals ad revenue. External links send users away. Users leaving means less data to harvest, less time on platform, less revenue. Bier was hired to solve a revenue problem by making X psychologically inescapable.
He’s applying everything he learned from teenagers to everyone.
Social Credit by Another Name
Then came the “authenticity verification.” Bier announced that profiles would soon display new information, like which country an account is based in, to help users “verify authenticity.”
Social credit scoring by another name. Geography as a trust indicator. Behavioral patterns as authenticity markers.
Here’s what nobody’s saying clearly enough: Bier didn’t spend the last decade building apps to help people. He spent it building the most sophisticated behavioral data harvesting operation in history.
Politify taught him that human behavior can be manipulated through information complexity and psychological triggers. tbh and Gas taught him exactly which triggers work, how to engineer dependency, how to monetize insecurity, and how to extract behavioral data at scale while calling it “community building.”
And now he’s deploying everything he learned at X. On the platform that controls public discourse.
Think about what he knows: How to make people act against their self-interest. How to create psychological dependency. How to map social networks. How to trigger emotional responses. How to manufacture consensus. How to suppress dissent while maintaining the illusion of openness. How to keep people trapped in systems they know are harming them.
And now he controls who sees what you post. Who you see. What gets amplified. What gets buried.
Why This Matters
Not because this pattern is new. But because behavioral data harvesting plus infrastructure control equals manufactured reality. And manufactured reality is incompatible with informed consent.
A free people requires informed consent. But informed consent requires access to information, the ability to verify it, an understanding of how decisions affect you, and transparency in systems of power. Bier’s entire career has been about destroying all four.
The complexity prevents understanding. The opacity prevents verification. The psychological triggers override rational analysis. The architecture engineers compliance while maintaining the illusion of choice.
You think you’re participating. You’re being managed.
You think you’re being heard. You’re being suppressed.
You think you understand the system. It’s deliberately incomprehensible.
You think you’re making free choices. You’re responding to engineered triggers.
Users are now openly questioning whether Bier is indifferent to their safety. Whether he’s enabling harassment that could lead to real-world harm. Whether someone who likes posts about political assassination being “free and legal” should control who gets heard on a platform that shapes public discourse.
That’s not hyperbole. That’s the actual question users are asking in public. Because when someone with his history gains control over whose voice gets heard, and then blocks victims of doxxing, and deletes evidence of inflammatory statements, and weaponizes “inauthenticity” against critics, people stop believing the censorship is about platform health.
They start believing it’s about control.
And they’re right.
The Cage
The cage you build for others becomes your own prison.
Bier thought he was building tools to manage other people’s behavior. He’s discovering he built his own cage. The authenticity verification tracks him. The algorithmic suppression creates an environment where everyone suspects his motives. The complexity and opacity make even his own actions look sinister.
You cannot build systems of control without eventually being controlled by them.
“Rethink Democracy” wasn’t about improving it. It was about replacing authentic public discourse with behaviorally engineered compliance. Learn how to manipulate voter behavior. Perfect the techniques on psychologically vulnerable teenagers. Deploy at global scale on the infrastructure of public discourse itself.
The data harvest is complete. The control systems are deployed.
Elon promised you a town square. He hired someone who specializes in building beautiful prisons.
The only question is whether you’ll keep participating.
<3 EKO
Thank you for reading and sharing. Not sure how long I’ll last on X, but I’ll always write to you from here.
If you’d like to support me directly, you can always buy me a coffee.
I love you.