01 / 10
Novapunks Dispatch — Issue 010

THE
PANOPTICON
ECONOMY

The surveillance economy has a simple mechanism at its core: your behavior is its raw material, and you are not a customer — you are the source. Every platform, every app, every service offered without a cash price is running a harvest operation. The data extracted is not incidental to the product; it is the product. Your attention, your revealed preferences, your hesitations, your social connections, your location at 2:47 a.m. — all of it moves upstream through an invisible supply chain to buyers who want to predict and modify what you do next. What follows is not a paranoid manifesto. It is a structural analysis of a legitimate industry operating exactly as designed — an industry whose design is the problem. Who built it, how it works, who profits from it, and most importantly: how to begin withdrawing. Because the exit is not impossible. It is just unfamiliar. And that unfamiliarity was engineered.

02 / 10
Infrastructure

The Architecture
of Capture

Most people, when they think about online surveillance, imagine cookies. That mental model is about twenty years out of date. The modern behavioral data collection apparatus is a layered, cross-platform, cross-device infrastructure that converts every digital interaction — and an expanding range of physical ones — into machine-readable behavioral signals that persist indefinitely and are combined into profiles of staggering intimacy.

Device fingerprinting identifies your specific browser and hardware configuration with sufficient precision to track you across sites even when you delete cookies, use incognito mode, or change IP addresses. The combination of screen resolution, installed fonts, browser plugins, graphics card rendering patterns, and dozens of other signals creates a fingerprint that is effectively unique. You are identifiable not because you told anyone who you are, but because your device is distinctive enough to be recognized — like a gait pattern or a writing style that cannot easily be faked.
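The mechanism can be made concrete in a few lines. This is a toy sketch, not production fingerprinting code — real libraries gather far more signals (canvas rendering, audio stack, WebGL output) and tolerate partial changes via fuzzy matching — and every signal value below is invented. The principle, though, is just a stable hash over device attributes:

```python
import hashlib

def fingerprint(signals: dict) -> str:
    """Hash a bag of device/browser signals into a stable identifier."""
    # Serialize signals in a canonical order so the same device
    # always produces the same digest.
    canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "screen": "2560x1440",
    "timezone": "UTC-5",
    "fonts": "Arial,Helvetica,Courier New",
    "gpu": "renderer-string-xyz",
    "plugins": "pdf-viewer",
}

print(fingerprint(device))
# Deleting cookies changes nothing here: the identifier is derived
# from the device itself, not from anything stored on it.
```

Note what is absent from the sketch: no cookie, no login, no stored state of any kind. Change one signal (a new monitor, a freshly installed font) and the digest changes — which is why real trackers match fuzzily rather than exactly.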

Location data is collected continuously by the apps you've granted permission to and intermittently by many you haven't. The location record is not a list of places you visited. It is a behavioral film: where you sleep, where you work, who you spend time with inferred from co-location, which medical facilities you visit, which political gatherings you attend, which religious communities you belong to. The data broker market turns this film into a product available for purchase — by insurers, employers, political campaigns, and government agencies that cannot legally collect it themselves.

The real-time bidding system that runs the internet advertising economy means every ad you see represents a live auction in which dozens of data brokers share information about you in the milliseconds between when you navigate to a page and when it loads. You are profiled, auctioned, and targeted faster than you can read this sentence. This is not occasional. It happens every time you load any page on the commercial web.
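The auction mechanics are simple enough to sketch. What follows is a toy second-price auction over an invented user profile and invented bidders — a real exchange speaks the OpenRTB protocol to dozens of demand-side platforms in parallel under a timeout of roughly a hundred milliseconds — but the structure is this:

```python
def run_auction(page_url: str, user_profile: dict, bidders: list) -> tuple:
    """Toy real-time-bidding round: each bidder prices the impression
    from the user profile; the highest bid wins at the second price."""
    bids = []
    for bidder in bidders:
        # Every bidder sees the profile. This is the data-sharing step:
        # the broadcast itself leaks the user to all bidders,
        # including the ones that lose the auction.
        bid = bidder["base_bid"] * user_profile.get(bidder["target_trait"], 0.1)
        bids.append((bid, bidder["name"]))
    bids.sort(reverse=True)
    winner = bids[0][1]
    clearing_price = bids[1][0]  # second-price rule
    return winner, round(clearing_price, 4)

profile = {"recently_searched_loans": 0.9, "late_night_activity": 0.7}
bidders = [
    {"name": "payday-lender-dsp", "base_bid": 2.0, "target_trait": "recently_searched_loans"},
    {"name": "generic-retail-dsp", "base_bid": 1.5, "target_trait": "late_night_activity"},
    {"name": "insurer-dsp", "base_bid": 1.0, "target_trait": "recently_searched_loans"},
]

print(run_auction("news.example/article", profile, bidders))
# → ('payday-lender-dsp', 1.05)
```

The detail worth noticing is in the loop, not the sort: the losing bidders still received the profile. The auction is also a distribution mechanism.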

The Internet of Things has extended this architecture into physical space. Smart TVs listen to ambient conversations. Smart speakers do so by design. Fitness trackers record your sleep, your heart rate, your movement through space. Smart doorbells build a database of who enters and leaves your home and when. The data from these devices, sold to brokers and aggregated across millions of users, converts the private domestic space into an extension of the behavioral harvest. The home was always the refuge. It is now also the crop.

03 / 10
Class Analysis

The New
Feudalism

"The platform is the manor. Engagement is the rent. And the peasant does not recognize themselves as a peasant because the relationship has been designed to feel like a gift."

Industrial capitalism was organized around the ownership of productive machinery. The ruling class owned the means of production — the factory, the mill, the mine. The working class owned only its labor, which it sold in exchange for wages. The exploitation was visible enough to generate an entire tradition of political analysis and organized resistance.

The successor system is more elegant and considerably harder to name. In the surveillance economy, the scarce resource being extracted is not labor but behavior — specifically, the continuous behavioral stream that your life in digital space produces. You do not sell this behavior. It is taken. The transaction is invisible because there is no transaction: the platform offers a service, and the behavior that service generates is harvested as the real product without your meaningful awareness or consent.

The new class structure has a ruling class that does not own factories — it owns the architecture through which information flows and through which behavior is monetized. The curators who determine what is and is not visible in the feed. The nexialists who connect across networks and accumulate influence through centrality. The platform owners who do not produce content but control the infrastructure through which all content moves and through which all data flows upward. Their power comes not from capital accumulation but from network position — from being the node through which everything passes.

Below them: the consumtariat. Not the proletariat — the consumtariat may be comfortable, housed, employed. It does not recognize itself as a class because its exploitation is not experienced as hardship. But it is structurally excluded from the networks that determine what happens. It does not own the platforms it depends on. It does not set the terms of the data relationship it is enrolled in by default. It generates the behavioral raw material that the ruling network processes, packages, and sells — and receives in return access to services it could not build and does not own.

This is feudalism in its essential structure: an extractive relationship between those who control the land and those who work it. The gift economy of surveillance capitalism — free email, free social connection, free mapping, free everything — is the ideological achievement of our era: a system of extraction that presents itself as generosity, and whose beneficiaries are genuinely grateful for what is being taken from them. The alternative does not begin with legislation or protest. It begins with the recognition that the gift was never free.

04 / 10
The Profile

What They
Actually Know

The phrase "they know a lot about you" fails to convey the texture and depth of what the behavioral profile actually contains. Most people, if pressed, would describe it as a record of their shopping habits or search queries. The reality is more intimate and more unsettling than that framing suggests.

Psychological profiling from digital behavior is not speculative. Research published in mainstream academic journals has demonstrated that the Big Five personality traits — openness, conscientiousness, extraversion, agreeableness, neuroticism — can be accurately inferred from social media engagement patterns. Political alignment can be predicted from search behavior with significant reliability. Creditworthiness is inferred from app usage patterns that have no obvious connection to financial behavior. Health conditions are inferred from the combination of searches, app installations, and the routes your daily movement traces past specialist clinics. The inference engine doesn't need you to volunteer the information. It derives it from everything else.

The concept of behavioral surplus captures the core of what is being extracted: what is harvested beyond what is necessary to deliver the service. If you search for a restaurant, the engine needs your location and query to return results. The behavioral surplus is everything else — that you hesitated before searching, that you searched at 11 p.m. on a Tuesday, that you've searched for that cuisine sixteen times this month, that your location is shifting neighborhood by neighborhood in a pattern that suggests a life change. None of this is necessary to answer your question. All of it is profitable to sell.

Vulnerability windows are what make the profile particularly consequential. Behavioral systems can identify periods of psychological susceptibility — grief, loneliness, financial stress, relationship instability — before the individual has consciously registered them, based on changes in behavioral patterns that precede self-awareness. This knowledge is then available to whoever purchases the prediction product: the payday lender, the addiction-economy platform, the political campaign that wants to reach you at the moment your priors are softest.

The permanence of the record means the version of you from a decade ago — your worst searches, your 3 a.m. spirals, your moments of political radicalization or emotional collapse — exists in a dataset that will likely outlive you. What future institutions will do with that data, under what legal regimes, in response to what political pressures, is unknown. The profile was not built for any specific purpose. It was built to be sold for any purpose that presents itself.

05 / 10
Behavioral Modification

The Compliance
Engine

The most commonly cited harm of the surveillance economy is privacy violation. This is a real injury, but it is not the primary one. The deeper harm is not that they know what you do. It is that what they know shapes what you do next.

Behavioral modification is the endpoint of behavioral data collection, and the loop is the product. Your behavior is observed and profiled; the profile is used to curate the information you receive; the curated information shapes your subsequent behavior; the changed behavior generates new data that refines the profile; the refined profile produces more accurate curation. This is not a side effect of the advertising business model. It is the business model. A platform that can reliably predict your next behavior is valuable. A platform that can reliably influence your next behavior is worth considerably more.
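The loop described above can be simulated in miniature. This is a deliberately crude model with made-up numbers — its only point is the shape of the dynamic: reinforcement concentrates attention on whatever was marginally ahead at the start.

```python
def feedback_loop(interests: dict, rounds: int = 10) -> dict:
    """Simulate the observe -> profile -> curate -> behave loop.

    Each round the platform recommends the topic with the highest
    profiled engagement; consuming the recommendation strengthens
    that topic's weight in the profile.
    """
    profile = dict(interests)
    for _ in range(rounds):
        # Curation: serve whatever the profile says engages most.
        recommended = max(profile, key=profile.get)
        # Behavior: consumption reinforces the profiled interest.
        profile[recommended] *= 1.2
    # Normalize so the drift toward a single topic is visible.
    total = sum(profile.values())
    return {k: round(v / total, 3) for k, v in profile.items()}

start = {"outrage": 0.35, "hobbies": 0.33, "news": 0.32}
print(feedback_loop(start))
# A small initial edge compounds: after ten rounds the leading topic
# dominates the profile, and therefore dominates the feed.
```

Run it and the near-even starting split collapses into a profile dominated by the single topic that began two points ahead. That collapse is the refinement the business model pays for.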

Content curation as behavioral modification operates at civilizational scale. Hundreds of millions of people receive their understanding of current events, their sense of what is socially normal, their political priors, and their emotional baseline from information environments whose architecture was optimized not for accuracy or depth but for engagement — the metric that predicts behavioral data generation and advertising revenue. The result is a population whose collective attention has been deliberately organized around outrage, anxiety, and tribal identity, because these emotional states produce more clicks, more shares, more time-on-platform than calm deliberation or genuine complexity. The manipulation is not incidental. It is the output the system was designed to produce.

The predictive policing applications make the mechanism legible by making it explicit. Behavioral data from commercial platforms is used to flag individuals as elevated-risk before any specific act has been committed — based on social graph, location patterns, and online behavior. The individual is managed not for what they have done but for what the statistical model predicts they might do. This is the actuarial logic of behavioral capitalism applied to criminal justice: you become a prediction rather than a person.

Jeremy Bentham's panopticon never needed an observer permanently stationed in the central tower — the power of the architecture came from the prisoner's uncertainty about whether they were being watched at any given moment. The surveillance economy has built the panopticon at scale and removed the uncertainty. The observer is always present. The behavioral consequence is the internalization of compliance — not because rules are enforced, but because the architecture of observation produces self-governance before enforcement is necessary. This is the system working as intended.

06 / 10
The Integration

The Commercial‑State
Pipeline

There is a common tendency to frame surveillance capitalism and state authoritarianism as opposing tendencies — private excess on one side, government overreach on the other. This framework is wrong. They are the same architecture operating with two revenue models and a shared interest in its perpetuation.

The commercial surveillance apparatus and the state security apparatus have been formally integrated through procurement relationships that both sides have every incentive to keep opaque. The U.S. government cannot legally collect continuous location data on its citizens without a warrant. But it can purchase that data from commercial brokers who collected it through terms-of-service agreements embedded in apps that citizens downloaded voluntarily. This legal arbitrage — using commercial collection to bypass constitutional constraints on state collection — is not a workaround discovered after the fact. It is documented policy, confirmed in litigation and government procurement records. The commercial sector collects what the state cannot. The state buys what it could not legally gather. The civil liberty prohibition remains on paper while the data flows regardless.

The intelligence fusion centers built in the post-2001 period are the structural expression of this integration: facilities where federal intelligence agencies, state law enforcement, and commercial data analysts share data streams in real time. The fusion center does not distinguish between commercial behavioral data and criminal records: both flow into the same analytic systems, are assessed by the same risk-scoring models, and produce outputs about individual behavior that neither the commercial nor the state actor could have generated alone.

The authoritarian export is the most visible expression of the system's political neutrality. The facial recognition platforms, behavioral scoring databases, and predictive policing infrastructure enabling contemporary authoritarian surveillance states were not built from scratch by those governments. They incorporate technology from companies that simultaneously serve the commercial surveillance market in liberal democracies. The architecture is infrastructure-neutral — deployable by any sufficiently resourced actor for whatever behavioral management purpose they choose. What varies between contexts is not the technology but the legal constraints on its use — and those constraints erode under sufficiently sustained pressure.

Every major expansion of the surveillance state in the past two decades has been justified by an emergency and has outlasted it. The emergency creates political permission. The commercial data industry provides the infrastructure. The security apparatus locks in the access. The emergency ends. The integration remains. This is not a pattern of exception. It is the operating mechanism of a system that grows with each crisis it is offered.

07 / 10
The Exit

The Counter‑Surveillance
Stack

The counter-surveillance exit is not a single act. It is a set of layered practices that together reduce your behavioral footprint to a level where commercial profiling becomes uneconomical and state surveillance becomes significantly more expensive. No single tool provides complete protection. The goal is not invisibility — it is raising the cost of watching you high enough that the return on investment no longer justifies the expenditure.

At the network layer: encrypted DNS prevents your internet provider from logging every domain you visit. Services like NextDNS or a self-hosted Pi-hole simultaneously block the tracking domains that commercial websites load before the page renders. A VPN encrypts your traffic between your device and the VPN exit node, masking activity from your ISP and concealing your real IP address from sites you visit. For higher-risk activities, Tor — routing traffic through multiple encrypted relay nodes — provides substantially stronger network anonymity at the cost of speed.
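As a concrete starting point, encrypted DNS can be as small as a two-line resolver change. A sketch for systems running systemd-resolved (most current Linux distributions) — the resolver shown is Cloudflare's public DNS-over-TLS endpoint, and any provider you trust can be substituted:

```ini
# /etc/systemd/resolved.conf — illustrative fragment
[Resolve]
DNS=1.1.1.1#cloudflare-dns.com
DNSOverTLS=yes
```

After a `systemctl restart systemd-resolved`, your ISP sees encrypted resolver traffic instead of a plaintext log of every domain you visit. Note the trade being made: you have moved the query log from the ISP to the resolver operator, which is why the choice of provider — or a self-hosted one — matters.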

At the browser layer: Firefox configured with the arkenfox user.js profile combined with uBlock Origin in medium mode blocks the majority of tracking scripts, fingerprinting attempts, and behavioral collection infrastructure built into commercial sites. Separate browser profiles or containers for different identity contexts — general browsing, accounts you care about, sensitive research — prevent cross-context linkage that would otherwise connect your identities. The behavioral complement matters as much as the technical one: varying your patterns, compartmentalizing your digital contexts, and treating identity as a choice rather than a fixed disclosure.
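For readers who have never opened a user.js file, this is the flavor of the browser-layer configuration — a few representative prefs in the arkenfox style. An illustrative excerpt only, not the hardened profile: pref names and recommended values change between Firefox releases, so take the real list from the maintained arkenfox project.

```js
// Illustrative excerpt only — not the full arkenfox profile.
user_pref("privacy.resistFingerprinting", true);   // normalize fingerprintable surfaces
user_pref("privacy.trackingprotection.enabled", true);
user_pref("network.cookie.cookieBehavior", 5);     // isolate cookies to the site that set them
user_pref("media.peerconnection.enabled", false);  // stop WebRTC from leaking your real IP
```

Each line is a default being reclaimed: the stock configuration of every mainstream browser was chosen by parties with a stake in the data flowing.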

At the communications layer: Signal for messaging. The Signal protocol provides forward secrecy, sealed sender, and minimal metadata retention. Sealed sender hides the sender's identity from the service itself, shrinking the metadata available for traffic analysis — the attack surface that survives content encryption intact. For email, Protonmail or Tutanota for correspondence where privacy matters, with the recognition that email's structural design makes genuine privacy difficult to achieve, and it should be treated accordingly.
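Forward secrecy, the property that makes the messaging layer matter, fits in a short sketch. This is a toy Diffie-Hellman exchange with a fresh ephemeral keypair per message — an illustration of the idea only, not Signal itself, which uses X25519 and the double ratchet; the prime here is far too convenient for real use.

```python
import hashlib
import secrets

# Toy finite-field Diffie-Hellman parameters. Illustration only:
# do not use this group for anything real.
P = 2**127 - 1   # a Mersenne prime
G = 5

def fresh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def message_key(my_priv, their_pub):
    # Both sides compute G^(a*b) mod P and hash it into a message key.
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(shared).encode()).hexdigest()

# Forward secrecy in miniature: a fresh ephemeral keypair per message.
a_priv, a_pub = fresh_keypair()
b_priv, b_pub = fresh_keypair()
key_msg1 = message_key(a_priv, b_pub)
assert key_msg1 == message_key(b_priv, a_pub)   # both sides agree

# Ephemeral privates are deleted after use; the next message runs a
# new exchange, so compromising today's keys reveals nothing prior.
a_priv, a_pub = fresh_keypair()
b_priv, b_pub = fresh_keypair()
key_msg2 = message_key(a_priv, b_pub)
print("per-message keys differ:", key_msg1 != key_msg2)
```

The design consequence is the point: an adversary who seizes a device tomorrow gets tomorrow's keys, not an archive. Content encryption without forward secrecy protects messages only until the first compromise.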

At the payments layer: cash for physical transactions where financial surveillance is a concern. Monero for digital transactions requiring financial privacy — the only major cryptocurrency with mandatory privacy by default, providing transaction-graph confidentiality that Bitcoin's transparent public ledger cannot. The 80/20 principle applies throughout: the configuration above eliminates the majority of commercial behavioral surveillance for most threat models with a few hours of initial setup. You do not need to become a security professional. You need to stop being the easiest person in the room to profile.

08 / 10
Structural Alternatives

Platform
Cooperativism

The counter-surveillance stack addresses the symptom: it limits how much data the existing platforms can extract from you. Platform cooperativism addresses the structure: it changes who owns the platform and therefore changes whose interests the data serves.

A platform cooperative is a digital platform owned and democratically governed by the people who use and work on it. The structural implication for data is immediate. In a conventional platform, behavioral data is extracted from users and sold to third parties because the platform is owned by shareholders whose returns require it. In a cooperative, the users are the shareholders. There are no third parties to sell data to — selling users' data to outsiders would require a membership vote to authorize. Data governance is not a unilateral terms-of-service decision. It is a democratic question subject to the will of the people whose data it is.

The examples are working and growing. Stocksy United is a photographer-owned stock photography cooperative that distributes the majority of its revenue to member creators. Up&Go is a worker-owned home cleaning platform routing clients to worker-owners rather than capturing labor arbitrage. Resonate is a musician-owned streaming service paying substantially higher per-stream rates because it does not generate shareholder returns. Driver's Seat is a driver-owned data cooperative that collects drivers' own behavioral data and returns it to them as operational intelligence — the kind of route efficiency and surge pattern data that conventional platforms collect and use against the drivers rather than for them.

The federated model offers a different architectural approach. ActivityPub — the protocol underlying Mastodon, PeerTube, and Pixelfed — enables a federated social internet: a network of independently owned and governed servers that interoperate without requiring centralized control. No single corporation owns the whole graph. No single entity can monetize the aggregate behavioral stream. Each server sets its own data governance policies. The user's social presence is not hostage to a platform's business model decisions, because the infrastructure is distributed rather than concentrated.
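At the protocol level, federation looks like this: every account on an ActivityPub server is described by a small actor document that any other server can fetch and deliver messages to. A minimal sketch — the domain and username are invented, and a real server's actor also carries public keys and media endpoints:

```json
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "type": "Person",
  "id": "https://social.example.coop/users/ada",
  "preferredUsername": "ada",
  "inbox": "https://social.example.coop/users/ada/inbox",
  "outbox": "https://social.example.coop/users/ada/outbox"
}
```

Because the actor lives at a URL on a server the community controls, the social graph is addressed through open endpoints rather than through one company's database — which is precisely what makes the aggregate behavioral stream impossible for any single entity to monetize.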

Choosing a cooperative or federated platform is not equivalent to choosing a slightly different app. It is a claim about infrastructure: about who should own the systems through which social life is organized, and what those owners' obligations to the people using the system should be. The surveillance economy's data extraction rests on a specific answer to that question. Platform cooperativism is the counter-claim — and unlike legislation or protest, it builds its argument by existing.

09 / 10
The Inner Dimension

Cognitive
Sovereignty

The counter-surveillance stack protects your data. It does not protect your mind. That is a different project — and in some respects the more fundamental one, because everything you do with your data flows from what you have been shaped to think, want, and believe.

Attention is the resource the surveillance economy monetizes first. Every design decision — the infinite scroll, the notification badge, the algorithmic feed that serves the content most likely to trigger the next click — is optimized to capture and hold attention, because attention is what gets sold to advertisers and what generates the behavioral data that makes the profiles valuable. But before attention can be sold, it must be shaped. The system does not just capture your attention for inventory. It uses that attention to curate the information you receive, which shapes your beliefs, which modifies your subsequent behavior, which generates more data, which improves the targeting. Extraction and modification are inseparable in the design.

What distraction costs is not only time. Research on sustained versus fragmented attention is consistent: single-tasking, deep focus produces qualitatively different thinking than interrupt-driven, stimulus-responsive browsing. Long-form reading, extended problem-solving, patient deliberation — these are cognitive modes that the attention economy systematically degrades, because degraded deliberation produces more behavioral data and is more susceptible to targeted influence. The shallowing of thought is the feature, not a regrettable side effect.

Reclaiming attention is not asceticism. It is maintenance of a cognitive faculty that will determine the quality of every other decision you make. You exercise the body because it matters what condition the body is in when you need it. The same logic applies to the mind's capacity for deliberate, sustained thought. Long-form reading, single-tasking, designed periods of genuine disconnection — these are not productivity strategies. They are the practices through which the attention faculty is kept in working order against a system that has a financial interest in its degradation.

The most invasive form of capture in the surveillance economy is not the data profile. The data profile knows what you have done. The manufactured desire is prior to action — it shapes what you want to do before you act. An opinion that feels like yours but was optimized into existence through months of curated information exposure is more invasive than any data breach, because it is inside the deliberative process rather than a record of its outputs. Cognitive sovereignty is the recognition that the contents of your attention are political — and that defending them is as important as defending your data, because they are the instrument with which you will decide everything else.

10 / 10
Synthesis

The Invisible
Insurgency

The exit from the surveillance economy is not dramatic. It does not require a manifesto, a movement, or a confrontation. It requires a series of defaults changed, habits shifted, and choices made with the awareness that infrastructure is not neutral — and that what you normalize through your daily behavior you endorse through your daily behavior.

The individual exit is meaningful and insufficient simultaneously. Meaningful because it reduces your personal exposure to behavioral extraction, protects your cognitive sovereignty, and removes your data from the supply chain that funds the industry's expansion. Insufficient because surveillance capitalism is a collective action problem: the system's power comes from network effects, and individual exits do not by themselves degrade the model. The behavioral profiles that train the targeting algorithms stay accurate so long as the population of non-exiters remains large. Individual exit is necessary, but it is not the end of the work.

The collective dimension is therefore essential. Communities that build shared infrastructure — mesh networks routing around the commercial ISP layer, community-run federated servers providing social and communication infrastructure outside the platform ecosystem, cooperative economic structures removing the behavioral extraction incentive from the institutional design — are not just protecting individual members. They are building the alternative infrastructure that makes it possible for others to exit. Each community server, each cooperative platform, each encrypted local network is a node in a parallel architecture that routes around the surveillance layer entirely. This is not metaphor. This is the technical and social mechanism of the counter-economy working at the infrastructure level.

This is the cypherpunk inheritance in its most concrete form. The people who designed and built PGP, Tor, Signal, and Monero spent years on code that would never make them wealthy because they understood that privacy is not a preference. Privacy is the precondition for free thought, free assembly, free markets, and free speech. Without the ability to act without observation, every other freedom is conditional — exercisable only to the extent that the observer permits. They built the tools because the tools needed to exist. They did not wait for political permission or commercial validation.

The invisible insurgency looks like this: changing your browser settings on a Tuesday afternoon. Switching your messenger. Paying with cash. Moving your social presence to a server your community controls. Reading long-form work instead of feeds. These are not heroic acts. They are, cumulatively, the practice of refusing to live inside a system designed to convert your life into someone else's raw material. And in refusing, you join a tradition that predates the surveillance economy and will outlast it — of people who understood that sovereignty is not granted. It is built, quietly, in the choices of every ordinary day.