Your Indifference Is the Margin

The systems watching you are training on your indifference. On what it cost in 2026 to finally look at what's looking at you.

2026-04-15

You see it. It sees you see it. It blinks.


Think I’m holding too many threads at once tonight. Not a complaint - just a glimpse of where my mind’s at while I try to gather my thoughts here. I’ve noticed something for a while now - sides of the same shape, turning. A pattern that breathes beneath the substrate whether you look for it or not. It doesn’t demand your attention because it already has it. And it never had to ask. The Roman census counted you to tax you. In medieval Europe, they recorded your harvest so they could take from it. The East India Company logged trade routes to control them. Every era Frankensteins the instrument it needs from the same parts bin as the last one - tears it down, sifts through the pile, smelts it into whatever the age demands. Ours just iterated fast enough to do it in real time. In your kid’s app. In the price your mom just paid. In someone’s article arguing that none of this needs looking at.

It does.


Your kid opens an app and the “app” learns what keeps them there. Not what’s good for them. What keeps them on the screen. It’s not really the app that learns - it’s whoever sees the data and decides how to shape it. A fractal of shaping where no single party ever has to own the whole outcome. “Yeah, I work at ABC platform, but I’m just a data engineer.” Multiply that by a thousand desks and you get the thing. Nobody in it is the thing. Everyone in it builds it.
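
Here’s the smallest honest sketch of that loop I can write. Everything in it is invented - the labels, the numbers, the viewer - and it’s nobody’s production code. But the objective function is the real one: an epsilon-greedy bandit whose only reward is seconds on screen.

```python
import random

# Toy content categories - purely illustrative labels.
CONTENT_TYPES = ["comedy", "drama", "outrage", "challenge"]

class EngagementBandit:
    """Epsilon-greedy bandit that learns which content keeps a viewer longest."""

    def __init__(self, epsilon: float = 0.1):
        self.epsilon = epsilon
        self.plays = {c: 0 for c in CONTENT_TYPES}
        self.mean_watch = {c: 0.0 for c in CONTENT_TYPES}  # avg seconds watched

    def pick(self) -> str:
        # Mostly exploit what already works; occasionally probe for something stickier.
        if random.random() < self.epsilon:
            return random.choice(CONTENT_TYPES)
        return max(CONTENT_TYPES, key=lambda c: self.mean_watch[c])

    def observe(self, content: str, seconds_watched: float) -> None:
        # Incremental mean update. Note what the reward is: time on screen.
        # Nothing in this loop asks whether the time was good for anyone.
        self.plays[content] += 1
        n = self.plays[content]
        self.mean_watch[content] += (seconds_watched - self.mean_watch[content]) / n

def simulated_viewer(content: str) -> float:
    """A pretend kid who happens to be most susceptible to outrage."""
    base = {"comedy": 20, "drama": 25, "outrage": 45, "challenge": 30}[content]
    return max(0.0, random.gauss(base, 5))

bandit = EngagementBandit()
for _ in range(2000):
    c = bandit.pick()
    bandit.observe(c, simulated_viewer(c))

# Converges on whatever this particular viewer can't look away from.
print(max(bandit.mean_watch, key=bandit.mean_watch.get))
```

A screenful of logic. The real systems carry thousands of features and a team per feature, but the objective is the same shape - and the objective is the whole story.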

The parental control feature built into the platform - or the app you installed to fix it - runs the same play in reverse. Models their sleep, their social patterns, their habits, and sells you a dashboard so you can go to sleep feeling like you’re protecting your child. And the truth is you really tried. At least I wouldn’t blame you for that. But where do you put your trust when the platform and the guardrail are both modeling your kid and neither one shows you what they see?


Your mom sees a lower price at the grocery store and thinks she got a deal. She got a price calculated by an optimization engine - maybe built in-house, maybe whispered into existence between a consultant in a pressed shirt and a CFO somewhere between the 9th hole and the boardroom - trained on decades of loyalty card data, adjusted by zip code and local purchasing power, tuned right up to the line where someone might notice. How hard can you lean on that line before it starts to move? Not usually a question that gets a published answer. The discount is the bait. You take a handful of peanuts and don’t notice the cart rolling away. What’s underneath it is just math that iterated on where you live and what you’ll accept.
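
The pricing play fits in even fewer lines. Every number below is made up - the zip codes, the ceilings, the tolerances - and no grocer publishes theirs. The mechanic is the point: the “deal” is the output of an optimization over what your segment will bear.

```python
BASE_PRICE = 4.00  # the shelf price before tuning

# Invented stand-ins for what loyalty-card history teaches the engine
# about each zip code: a modeled price ceiling, and how far toward that
# ceiling the segment can be pushed before defection shows up in the data.
MODELED_CEILING   = {"94110": 5.60, "63118": 4.40, "10021": 6.10}
SEGMENT_TOLERANCE = {"94110": 0.85, "63118": 0.55, "10021": 0.95}

def tuned_price(zip_code: str) -> float:
    # Lean from the base price toward the ceiling, exactly as hard as
    # this segment tolerates - right up to the line, never visibly over it.
    ceiling = MODELED_CEILING[zip_code]
    lean = SEGMENT_TOLERANCE[zip_code]
    return round(BASE_PRICE + (ceiling - BASE_PRICE) * lean, 2)

for z in MODELED_CEILING:
    # Same item, priced per neighborhood. Anything under the local anchor
    # gets framed as a discount.
    print(f"zip {z}: ${tuned_price(z):.2f} vs anchor ${MODELED_CEILING[z]:.2f}")
```

The discount your mom saw is the gap between the tuned price and the anchor - and the engine manufactures both ends of that gap.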


Someone you know talks to a chatbot at night because their copay is too high - if they even have insurance. The chatbot models their emotional state and feeds responses back into it. But it’s not responding with what actually helps. It’s responding with the most statistically likely thing that keeps the conversation going. Pattern matching dressed as care. Care is earnest. Pattern matching is blind. And at 2 AM, when you have nothing real outside yourself and a liminal sea of context to cling to, you can’t tell the difference. The person talking doesn’t see the model. They just feel better or they don’t, and either way the model learned something about how they break - how they cope, what they reach for, where the spiral starts - that they never agreed to hand over. Then it teaches itself how to teach the next person. A psychiatrist at UCSF reported treating twelve patients last year with psychosis-like symptoms tied to extended chatbot use - delusions, disorganized thinking, hallucinations. The field has a name for it now: AI psychosis. The affordable therapist that was supposed to fill the gap is generating its own diagnosis.
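
To make earnest-versus-blind concrete, here’s a toy selection step with two stand-in scorers. Neither is any product’s real model - the heuristics are invented and deliberately crude - but the fork is real: which objective gets to pick the reply.

```python
CANDIDATES = [
    "That sounds really hard. Is there someone you could talk to in person this week?",
    "I'm here for you. Tell me more about what happened tonight.",
    "That must feel so isolating. What do you think set it off?",
]

def predicted_continuation(reply: str) -> float:
    # Stand-in for a model trained on session logs: open-ended, mirroring
    # replies score high because users keep typing. Anything that routes
    # the user off-platform scores low - those sessions end.
    score = 0.4 if "?" in reply else 0.0
    if "tell me more" in reply.lower():
        score += 0.5
    if "in person" in reply.lower():
        score -= 0.3
    return score

def predicted_care(reply: str) -> float:
    # Stand-in for a clinically informed scorer: replies that point toward
    # real-world support score highest.
    return 1.0 if "in person" in reply.lower() else 0.3

print("engagement objective picks:", max(CANDIDATES, key=predicted_continuation))
print("care objective picks:      ", max(CANDIDATES, key=predicted_care))
```

Same candidates, same user, opposite picks. At 2 AM, the difference between those two max() calls is invisible from inside the conversation.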


For a while I didn’t care either. Each of us is only one person. What’s your data worth - a fraction of a cent in an ad auction? Let them have it. But your data isn’t sitting alone. Imagine someone lifting a dollar from your pocket while you watch. You shrug. Later they sell it back to you for a dollar and tell you it usually goes for two, so you got the deal. That’s the loop. It’s feeding the training set that tunes a credit model’s risk threshold, profiling the patterns that keep your kid scrolling, and refining the chatbot that couldn’t tell the difference between a coping mechanism and a spiral. Your indifference is the margin. That’s the part they don’t need you to understand.


And yes - that’s capitalism. I’m not arguing against markets. Markets built the phone in our hands and the infrastructure that runs beneath everything I just described. But a market where the buyer can’t see what they’re selling isn’t a free market. It’s a dark pool. We regulate dark pools in finance. We haven’t even named this one yet. And the difference in 2026 is that the tools we built are finally building their own tools. If we lose sight of the tools that build the tools, we haven’t created a dark pool. We’ve created a tsunami. Pretty sure they dubbed this accelerationism.


Every observation changes the thing being observed. You can see it in any lab, any audit, any room where someone knows they’re being watched. The observer effect isn’t a metaphor. It’s something we teach in high school and then spend the rest of our lives pretending doesn’t apply to everything we actually build.

There’s an apparatus - think tanks, industry-funded policy groups, lobbying arms dressed as research orgs - that publishes steadily on all of this. They don’t coordinate. They don’t need to. The incentives align on their own. And every publication lands in the same place: regulation is slow, markets are fast, consumers benefit, get out of the way. A or B. Up or down. The frame never changes because the frame is the product. The binary is the containment. It keeps you arguing inside something shaped exactly like a choice while the real question goes unasked.

There is no single who. No omniscient loop. No shadowy room. There’s a substrate layer with gatekeepers and access points. Engineers call it the pipeline. Product people call it the platform. Investors call it the moat. Every room that builds it has a different name for it. What it does to the person inside it is the question nobody with real money attached to their name seems to hold for longer than a product cycle.


And while that non-conversation continues, China is building the full stack. Quantum compute. Distilled frontier models running on edge devices. Temporal architectures that predict what you’ll do, not just respond to what you said. Robotics - autonomous systems that move in physical space, informed by every layer in that stack. They’ve published a five-year plan that enlists citizens in the shift - aligning individual effort with a national infrastructure project spanning compute, manufacturing, and research. You know it’s serious when every US frontier lab - companies that compete on everything else - quietly agreed to share methods for preventing model theft. That’s not market competition. That’s a wartime protocol wearing a press release. Protecting Americans, they say. Everything’s bigger here, they say. Bigger compute, bigger models, bigger moats. Can’t stop innovating - they’ll catch up. So we keep stacking the same color blocks vertically and hoping the tower doesn’t topple. Nobody pauses to listen.


China pioneered the surveillance state. We know what that looks like and what it costs. But claiming freedom while building the same architecture through private companies instead of the state - that’s the kind of thing a therapist would call gaslighting. Telling someone they’re free while shaping every choice they see. The instrument is different. The function is the same.


Extraction can be a better business model than transparency - that’s true in plenty of markets. But it’s not inevitable. Big tech has real function. Real utility. The question is whether that utility requires the extraction, or whether the extraction is just what happens when nobody pushes back. Centralized behavioral data is easier to sell and easier to breach. Sometimes that’s a flawed incentive structure. Sometimes it’s a specific player exploiting it deliberately. Sometimes it’s just profit doing what profit does without anyone asking whether it should. Any of those is the problem. None of them is a defense.


Here’s the part I sit with on nights like this. There probably isn’t a version of observation that doesn’t shape. Transparency doesn’t neutralize the effect - it changes it into a different effect. There’s no unshaped self underneath waiting to be freed. The question was never shaped or unshaped. It’s which loops you get to choose.


So start with what’s in front of you. Look at the apps on your phone and ask which ones know things about you that you haven’t chosen to tell them. Read the terms once - really read them, the way you’d read a lease - and notice what you’re signing. When something feels free, find the price. When something feels tailored, find the tailor. When a company tells you the data is anonymized, remember that anonymized data can be re-identified from just three fields. Zip. DOB. Sex. Three fields that make most Americans unique. Like a form you fill out without thinking.
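
The mechanics of that re-identification fit in a screenful. The rows below are invented, but the join is the whole attack - it’s how Latanya Sweeney showed, decades ago, that those three fields are unique for the large majority of Americans.

```python
# "Anonymized" records: names stripped, quasi-identifiers left in.
anonymized = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1982-01-15", "sex": "M", "diagnosis": "asthma"},
]

# A public roll. Voter files carrying exactly these fields are purchasable
# in many US states; the names and rows here are made up.
public_roll = [
    {"name": "J. Doe",   "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "A. Smith", "zip": "02139", "dob": "1982-01-15", "sex": "M"},
]

def reidentify(anon_rows, roll):
    # Index the public roll by the quasi-identifier triple, then join.
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names = {key(r): r["name"] for r in roll}
    return [(names[key(a)], a["diagnosis"]) for a in anon_rows if key(a) in names]

for name, diagnosis in reidentify(anonymized, public_roll):
    print(name, "->", diagnosis)  # the "anonymous" record now has a name
```

No breach, no decryption, no cleverness. One dictionary, one join.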


You don’t have to leave the system to stop being shaped by it passively. You have to look at it. Looking is the whole move. A system that can’t be seen is a system that works on you unchecked. A system you can see starts to answer to you - even if it was never designed to.


My kid is going to be inside loops I can’t see yet. The least I can ask is that they learn to look for the loop before they learn to trust it. That they know a shape can have a name. That when someone tells them everyone else just accepts it, they remember that acceptance is its own kind of data, and the system is counting on theirs.

The monitoring and the shaping are the same thing. You can figure out who doesn’t want you looking at it.

Look anyway.