CHAPTER 14
Society as the Other-One: Understanding Weiser’s Warning and the Rise of Instrumentarian Power
Introduction: What This Passage Is Really Saying
The passage argues that Mark Weiser—the pioneer of “ubiquitous computing”—saw something far deeper than just a future filled with many computers. He sensed a new form of power emerging from the presence of countless interconnected devices around us. This power is not classical totalitarianism (like the Stalinist or fascist models of the 20th century) but something far more subtle, pervasive, and unprecedented.
This new power, later called instrumentarian power by Shoshana Zuboff, has the capacity to reshape society because it works through data, prediction, and behaviour shaping rather than fear or force.
Ubiquitous Computing: What Weiser Really Envisioned
Key idea: Weiser imagined a world where computers are everywhere—in every object, every room, every space—working silently in the background.
What does this mean in simple terms?
Weiser predicted that computers would become so common and so embedded in daily life that we would stop noticing them.
Not one computer on your desk—
but hundreds around you in your home, workplace, car, city.
Real-world examples today
- Your phone tracks your steps, sleep, screen time.
- Smart speakers (Alexa, Google Home) listen for commands—and sometimes more.
- CCTV cameras with facial recognition track bodies and movements.
- Smart TVs report what you watch and for how long.
- GPS in cars tracks driving behaviour.
- Shopping apps monitor what you browse and what you buy.
These devices quietly observe you and send data to powerful institutions.
What Weiser sensed
If all these devices can sense your presence, collect your data, and communicate with each other, then they create an infrastructure that is more powerful than any surveillance in the past.
Making Old Totalitarianism Look Like “Anarchy”
Key idea: Classical totalitarianism controlled your public behaviour.
Instrumentarian power can shape your internal choices.
Why did Weiser say totalitarianism would look like "sheer anarchy"?
Because older dictatorships used crude methods:
- police,
- spies,
- censorship,
- propaganda.
But imagine a system that knows:
- what you think,
- what you fear,
- what you desire,
- how likely you are to act in a certain way.
And it learns this from your devices—without needing fear, force, or prisons.
Practical manifestations
- TikTok and Instagram algorithms shape what people find attractive, desirable, or dangerous—without users realizing the manipulation.
- Political micro-targeting (e.g., Cambridge Analytica) influences voters based on psychological profiles.
- E-commerce nudges push users into impulsive purchases using behavioural data.
- Insurance apps adjust premiums based on your driving or health data, pressuring you to behave in certain ways.
- City surveillance systems automatically detect “unusual behaviour,” influencing how people move and act.
Such systems influence behaviour softly and continuously, not violently and openly.
Instrumentarian Power: A New, Unprecedented Force
Key idea: Instrumentarian power does not want to conquer your mind or ideology.
It wants to predict and modify your behaviour for profit or control.
What makes it new?
Unlike totalitarian regimes that imposed political ideology, instrumentarian power operates through:
- massive data collection,
- algorithmic predictions,
- behavioural nudges,
- automated environments.
It doesn’t need to make you fear the government.
It just needs to shape your choices without you noticing.
Real-world examples
- Google Maps predicts your route and nudges you toward certain paths (influencing city traffic).
- Netflix predicts what you will watch next and designs screens to keep you watching.
- Online shopping sites personalize prices and offers based on your psychological weaknesses.
- Smart city sensors predict crowd behaviour and send automated policing alerts.
In each case, your future behaviour becomes something to be measured, predicted, and steered.
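To make the mechanism concrete, here is a minimal sketch in Python of the predict-then-steer loop described above. All profile traits, nudge names, and weights are invented for illustration; no real platform’s internals are shown.

```python
# A minimal sketch (all traits, nudges, and weights are hypothetical) of
# the predict-then-steer loop: estimate what a user is likely to do, then
# serve whatever best serves the platform's objective.

user_profile = {"likes_thrillers": 0.9, "binge_prone": 0.7, "price_sensitive": 0.2}

# Candidate nudges and their estimated engagement payoff per trait.
nudges = {
    "autoplay_next_thriller": {"likes_thrillers": 0.8, "binge_prone": 0.9},
    "limited_time_discount": {"price_sensitive": 0.9},
    "top_10_carousel": {"likes_thrillers": 0.4, "binge_prone": 0.5},
}

def predicted_engagement(profile, payoffs):
    """Score a nudge by how well it matches the inferred profile."""
    return sum(profile.get(trait, 0.0) * w for trait, w in payoffs.items())

# The platform serves whichever nudge it predicts will keep you engaged.
best = max(nudges, key=lambda n: predicted_engagement(user_profile, nudges[n]))
print("Nudge served:", best)  # -> autoplay_next_thriller
```

The point is structural: the system optimizes for its own objective (engagement), and the “choice” you see is the output of that optimization.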
Society as the “Other-One”: How This Power Redefines Social Life
Key idea: When machines observe everything, society becomes a mirror that constantly watches and shapes individuals.
What this means
The passage suggests that society itself becomes the “other”—
not a community of people, but an instrumental environment that reacts to your behaviour and influences it.
Practical manifestations
- You modify your behaviour online because you know platforms are “watching.”
- Young people shape their identities based on algorithmic trends instead of community norms.
- Social life becomes performance: the self curated for the machine, not for human relationships.
- Public spaces become behavioural laboratories—where sensors track your footfall, speed, gestures.
Society no longer simply consists of people interacting.
It becomes a data-driven system that classifies, predicts, and adjusts human behaviour.
What Might This Power Have in Store for Us?
Key idea: If this power is stronger than past totalitarianism, its future effects could be enormous.
Possible future manifestations
- Behavioural futures markets: companies may trade predictions of your future behaviour like stocks.
- Automated social control: algorithms may determine who gets loans, who gets jobs, who is flagged as a “risk,” and who is denied entry to places (see the sketch after this list).
- Loss of human agency: people may outsource choices to machines (what to eat, what to watch, what to believe, whom to vote for).
- Normalization of surveillance: constant tracking becomes an accepted part of life.
- Invisible governance: the real power lies not with elected governments but with data systems that control behaviour subtly.
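As a toy illustration of automated social control, the sketch below shows a single scored rule deciding who passes a gate. The fields, weights, and threshold are all hypothetical.

```python
# A toy gate (fields, weights, and threshold all invented): one scored
# rule quietly deciding approval, with no human in the loop.

RISK_THRESHOLD = 0.5

def risk_score(profile):
    """Hypothetical risk model: a weighted sum over behavioural signals."""
    return (0.4 * profile["missed_payments"]
            + 0.3 * profile["neighbourhood_risk"]
            + 0.3 * profile["online_activity_flag"])

# A reliable payer who happens to live in a "high-risk" postcode.
applicant = {"missed_payments": 0.2,
             "neighbourhood_risk": 0.9,
             "online_activity_flag": 0.6}

score = risk_score(applicant)  # 0.53: denied largely because of an address
print("denied" if score > RISK_THRESHOLD else "approved")
```

The applicant never sees the score, the weights, or the reason; the gate simply opens or stays shut.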
Conclusion: Why This Passage Matters Today
The passage warns that the computational environment around us is not merely technology—it is a new form of social power.
Unlike totalitarianism, which tried to dominate through coercion, instrumentarian power works silently through:
- prediction,
- personalisation,
- nudging,
- data-driven shaping of choices.
This makes it harder to detect, harder to resist, and more deeply embedded in everyday life.
Reasoned conclusion
To protect human freedom, societies must recognize that:
- surveillance is no longer a political tool—it is an economic engine;
- threats to autonomy don’t come from dictators—they come from systems that track and modify behaviour;
- democracy must evolve to regulate not just governments, but the algorithmic infrastructures that shape human experience.
Unless we understand this shift, we risk entering a future where the greatest power over society is not political authority but a data-driven, algorithmic force that we neither see nor control.
The Normalization of Surveillance Capitalism: How Society Becomes a Laboratory for Behavioural Control
Introduction: What This Passage Is Warning Us About
The passage argues that ideas once considered disturbing and unacceptable—especially the behavioural conditioning vision imagined by B.F. Skinner—are now becoming normal, even inspiring for today’s technology giants.
Surveillance capitalism, which depends on predicting and controlling human behaviour, has moved beyond the digital world and is now turning society itself into a field for data extraction and behavioural modification.
Skinner’s “Walden Two”: From Rejected Utopia to Tech Industry Blueprint
Key idea: Skinner imagined a perfect society engineered by controlling human behaviour through positive reinforcement.
People in the 1950s were horrified by this idea.
Why was “Walden Two” seen as repellent?
Because it treated human beings as objects to be shaped, not free individuals making choices.
It assumed that if behaviour can be predicted and controlled, society becomes more efficient.
What the passage says
Today, surveillance capitalism is doing exactly what Skinner imagined—except now it is real, and it is marketed as innovation, efficiency, and personalization.
Real-world manifestations
- Reward systems in apps (likes, hearts, streaks) condition behaviour like Skinner’s pigeons in a box.
- Social media feeds reinforce addictive scrolling.
- Gamified workplaces (Amazon warehouses, delivery platforms) reward employees to push for more speed and compliance.
- Fitness apps use badges and milestones to condition daily routines.
Skinner’s thought experiment has become a commercial strategy.
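A minimal sketch of the conditioning mechanism at work: Skinner’s variable-ratio schedule, where rewards arrive unpredictably, is exactly the pattern likes and streaks follow. The probability below is illustrative, not a measured value from any real app.

```python
import random

random.seed(42)

# Variable-ratio reinforcement: the reward is unpredictable, so the
# checking behaviour itself becomes compulsive.

REWARD_PROBABILITY = 0.3   # hypothetical chance that opening the app "pays off"

def check_app():
    """One pull of the lever: sometimes a like, usually nothing."""
    return random.random() < REWARD_PROBABILITY

checks = 0
rewards = 0
while rewards < 10:          # the user keeps checking until rewarded 10 times
    checks += 1
    rewards += check_app()   # True counts as 1, False as 0

print(f"{checks} checks for {rewards} rewards")
```

Because the payoff cannot be predicted, the behaviour is maintained by the schedule itself—which is what made Skinner’s pigeons peck, and what keeps thumbs scrolling.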
Normalization and Habituation: When the Unacceptable Becomes Ordinary
Key idea: Practices once considered invasive now feel normal because we have become used to them.
How does normalization happen?
Gradually.
What begins as discomfort becomes convenience.
What begins as unusual becomes routine.
Practical examples
- Fifteen years ago, people were shocked that Google scanned emails for advertising. Today it feels normal.
- Smart speakers at home quietly listening were once frightening; many now see them as harmless helpers.
- Sharing location data felt unsafe; now millions leave location tracking on permanently.
- Facial recognition in public spaces seemed dystopian; today it is used in airports, malls, even schools.
Normalization is the silent lubricant of surveillance capitalism.
The “Prediction Imperative”: Why Big Tech Wants Total Information
Key idea: Surveillance capitalism needs to predict human behaviour with certainty to make profits.
Why?
Because companies sell “behavioural futures”—predictions about what people will do next—to advertisers, insurers, political campaigns, and others.
More data → better prediction
Thus the industry constantly pushes toward total information:
- where you are,
- what you buy,
- what you say,
- how you move,
- what you feel,
- who you talk to,
- how long you pause on a screen.
Practical manifestations
- Google’s smart city plans (e.g., Sidewalk Labs in Toronto, later abandoned) sought total data on urban life.
- Meta’s metaverse vision hopes to capture body language, micro-expressions, and emotional cues.
- AI wearables track heart rate, stress, and even mental states.
The closer they get to total information, the more accurate their behavioural predictions become.
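The logic of “more data → better prediction” can be shown with a toy simulation: estimating a single behavioural rate from a growing number of observations. The “true” click rate below is hypothetical.

```python
import random

random.seed(0)

# Toy simulation of the prediction imperative: the more behavioural
# observations a system collects, the closer its estimate gets to the truth.

TRUE_CLICK_RATE = 0.35   # what the user would actually do, unknown to the system

def estimate(n):
    """Estimate the click rate from n observed behaviours."""
    clicks = sum(random.random() < TRUE_CLICK_RATE for _ in range(n))
    return clicks / n

for n in (10, 100, 1_000, 10_000):
    est = estimate(n)
    print(f"{n:>6} observations -> estimate {est:.3f} (error {abs(est - TRUE_CLICK_RATE):.3f})")
```

The error shrinks roughly like 1/√n, so each step toward certainty demands disproportionately more data: total information is the logical endpoint of wanting total certainty.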
From the Virtual World to the Real World: Expanding the Data Frontier
Key idea: Data extraction is no longer limited to your online activity.
Surveillance capitalism wants to monitor your offline life as well.
How this expansion works
The aim is to turn the entire real world into a data source.
Real-world manifestations
- Smart homes (lights, ACs, fridges, locks) constantly report user behaviour.
- Smart cars record driving patterns, routes, and conversations.
- Retail stores track customer movements with cameras and sensors.
- Wearables collect health, fitness, and emotional data.
- Public transport apps record travel histories.
Your entire physical existence becomes data.
The “Reality Business”: Turning Everything Into a Computational Object
Key idea: Surveillance capitalism transforms all aspects of life—people, objects, activities—into things that can be measured and manipulated.
What does “equivalence without equality” mean?
Everything is treated as data—
but not with equal dignity.
Your value is not as a human being, but as a predictable data point.
Examples
- A person’s worth is reduced to a “credit score.”
- Job applicants are ranked by algorithms, not by human judgment.
- Police categorize neighbourhoods by “crime risk scores.”
- Insurers adjust premiums based on your digital footprint.
- Social media ranks users by engagement potential, not humanity.
Everyone becomes equivalent as data, but no one is equal as a person.
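A minimal sketch of “equivalence without equality”: two different lives, flattened by invented weights into the same comparable score.

```python
# Two different applicants, flattened into the same number. The score is
# comparable; the lives behind it are not. All fields and weights invented.

WEIGHTS = {"on_time_payments": 0.5, "income_stability": 0.3, "debt_ratio": -0.2}

def score(person):
    """Reduce a person to a single figure; history and context vanish."""
    return sum(WEIGHTS[k] * person[k] for k in WEIGHTS)

person_a = {"on_time_payments": 0.8, "income_stability": 0.5, "debt_ratio": 0.0}
person_b = {"on_time_payments": 0.9, "income_stability": 0.6, "debt_ratio": 0.4}

# Equivalent as data points, whatever the inequality behind the numbers.
print(round(score(person_a), 2), round(score(person_b), 2))  # 0.55 0.55
```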
Annexing Society Itself: The Final Frontier of Data Extraction
Key idea: Once data extraction enters personal behaviours, the next target is society’s fundamental structures.
What gets annexed?
- friendships,
- family relations,
- community ties,
- civic life,
- political behaviour,
- social norms.
Practical manifestations
- WhatsApp and Facebook influence political polarization and voting patterns.
- Social media shapes how children form identities and friendships.
- Dating apps shape relationship trends and marriage patterns.
- Workplace algorithms influence collaboration and teamwork.
- Consumer data profiles shape how lenders treat entire communities.
- Predictive policing shapes how society sees “dangerous” neighbourhoods.
Society becomes a computational field: every relation is rendered, measured, predicted, and nudged.
Conclusion: What This Passage Ultimately Warns Us About
The passage warns that we are entering a stage where surveillance capitalism no longer just observes individuals—it tries to shape society itself.
Reasoned conclusion
- Ideas once rejected as manipulative (like Skinner’s behavioural conditioning) are now embedded in our technologies.
- We have become accustomed to constant surveillance, mistaking it for convenience.
- Surveillance capitalism’s hunger for total information pushes it to extract data from every corner of real life.
- This transforms society into a predictable, controllable environment—an engineered reality, not a free one.
- If society becomes a computational object, human agency, dignity, and democratic choice weaken.
To protect freedom, we must:
- recognize how behaviour is being shaped,
- challenge the normalization of surveillance,
- demand accountability for data use,
- and reassert human agency over machine prediction.
Only then can society remain a space for self-determination rather than a behavioural laboratory built for profit.
The Rise of an Instrumentarian Society: How Big Other Seeks Total Coordination and Control
(A clear, simple, nuanced explanation with practical manifestations and reasoned conclusions)
Big Other’s All-Seeing Presence: From Inevitable to All-Controlling
Key idea:
People now accept the presence of “Big Other”—the vast digital surveillance infrastructure—as something unavoidable.
But acceptance is not the final goal.
What is the real aim?
To achieve full visibility and control over social behaviour, interactions, and collective processes so that large corporations can operate at massive scale and influence.
Practical manifestations
- Social media platforms track not only individuals but how entire communities behave.
- Navigation apps track city-wide movement to influence traffic flows.
- Delivery platforms track labour patterns to control supply chains.
- Smart cities monitor energy use, human movement, and crowd density to shape urban behaviour.
This is not just watching individuals—
it is synchronizing society like a giant algorithm.
Totalitarianism vs. Instrumentarianism: Two Paths to Totality
Key idea:
Both totalitarianism (like 20th-century dictatorships) and instrumentarianism (data-driven behavioural control) aim for “totality,” but for different reasons.
Totalitarianism
- Goal: political control
- Method: fear, violence, coercion
- Objective: obedience to ideology or leader
Instrumentarianism
- Goal: market dominance
- Method: data extraction, behavioural prediction, algorithmic control
- Objective: certainty and profit
Practical distinctions
- North Korea uses fear to control citizens.
- Big Tech uses data and algorithms to shape citizen choices invisibly.
- A dictator threatens punishment; a platform manipulates through “nudges,” personalized feeds, and predictive models.
Instrumentarianism does not want your soul—
it wants your behavioural patterns.
The Division of Learning: How Corporations Take Control of Knowledge
Key idea:
Instrumentarian power works by controlling who has access to large-scale learning.
What does this mean?
Only Big Tech companies (Google, Meta, Amazon, Baidu) have the:
- computational power,
- datasets,
- algorithms,
- behavioural models.
They learn about society at a depth no government or community can match.
Practical manifestations
- Google knows global movement patterns via Maps.
- Meta knows social emotions and political polarization via Facebook.
- TikTok understands human attention flows with precision.
- Amazon predicts consumer desires better than consumers themselves.
This knowledge imbalance becomes a new form of power.
Optimizing Society for Market Goals: A New Kind of Social Engineering
Key idea:
Instrumentarianism reshapes society not for political ideology, but to serve market needs.
What does “societal optimization” mean?
It means designing society—its habits, emotions, behaviours—to produce:
- more predictability,
- more engagement,
- more consumption,
- more profitable behaviour.
Real-world manifestations
- Uber directs drivers using algorithmic instructions, optimizing supply for profit.
- Social media algorithms amplify emotions that keep users scrolling.
- Smart home devices guide consumption patterns (electricity, entertainment).
- Health apps nudge people into routines that feed commercial ecosystems.
Society becomes an optimized market machine, not a community of free human beings.
China vs. Surveillance Capitalists: Different Motivations, Similar Tools
Key idea:
The instrumentarian vision looks similar to China’s political surveillance system, but motivations differ.
China’s political elite
- Uses data for political control and stability.
- Surveillance is a state project aimed at obedience.
Surveillance capitalists
- Use data for profit, not political domination.
- The aim is a market opportunity, not state ideology.
Common features
- Predictive policing models
- Real-time surveillance
- Social behaviour scoring
- Algorithmic guidance
- Datafication of daily life
Example
China uses its Social Credit System to enforce obedience.
Silicon Valley uses scoring systems (credit, reputation, engagement) to drive profitable behaviours.
Both reshape society—but toward different ends.
Society Becomes Data: “Equivalence Without Equality”
Key idea:
To surveillance capitalists, every person and every relationship is reduced to behavioural metrics.
Meaning
You and your neighbour become identical data points, even though your lived experiences are profoundly unequal.
Practical manifestations
- Social media ranking systems don’t distinguish by dignity—just engagement.
- Predictive policing treats neighbourhoods as risk profiles, ignoring historical injustices.
- Credit scoring ignores context but determines life opportunities.
- Employers use algorithmic filters that reduce human beings to data attributes.
Society becomes a predictive spreadsheet, not a human community.
The New Vision: A Society Designed Like a Machine Learning System
Key idea:
Just as industrial society copied the factory, today’s emerging society is copying machine learning models.
What does that mean?
A machine learning system needs:
- constant data inputs,
- continuous behavioural feedback,
- calculated predictions,
- refined control loops.
Surveillance capitalists want society to work the same way:
humans become the “data,”
algorithms become the “managers,”
and social life becomes a “predictive environment.”
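A minimal sketch of that loop, with every name and number hypothetical: sense behaviour, compare it to the prediction, nudge, and update the model.

```python
import random

random.seed(1)

# The point is the loop's shape: sense -> predict -> nudge -> observe ->
# update. It closes on the person rather than for the person.

estimated_preference = 0.5   # the system's current model of one user
LEARNING_RATE = 0.2

def observe_behaviour(nudge_strength):
    """The 'sensor': noisy behaviour, slightly pulled by the nudge."""
    true_preference = 0.3
    pulled = true_preference + 0.4 * nudge_strength + random.uniform(-0.05, 0.05)
    return max(0.0, min(1.0, pulled))

for step in range(5):
    nudge = estimated_preference                    # act on the prediction
    behaviour = observe_behaviour(nudge)            # sense the response
    error = behaviour - estimated_preference        # prediction vs outcome
    estimated_preference += LEARNING_RATE * error   # refine the model
    print(f"step {step}: nudge={nudge:.2f}, observed behaviour={behaviour:.2f}")
```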
Real-world manifestations
- Smart cities functioning like autonomous control systems.
- Supply chains run by predictive models instead of managers.
- Automated workplaces where algorithms assign tasks.
- Dating, friendships, and even political opinions shaped by algorithmic feedback loops.
The world becomes a giant, self-adjusting machine.
The Collapse of Social Trust and Social Relations
Key idea:
When machine predictions replace human relationships, society loses its human core.
Big Other replaces trust
Instead of trusting people, society starts trusting data:
- algorithms judge risk, not communities;
- machine predictions substitute for human understanding;
- metrics substitute for relationships.
Practical manifestations
- Ride-sharing relies on rating systems, not human trust.
- Online sellers trust algorithmic fraud detection, not customer honesty.
- Schools use surveillance proctoring instead of trusting students.
- Employers use keystroke monitoring instead of trusting workers.
Society becomes a place where trust is automated, not earned.
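A toy sketch of automated trust, with an invented cutoff: reputation reduced to a running mean that is recomputed after every transaction.

```python
# An invented cutoff and invented ratings: reputation as arithmetic,
# recomputed after every ride instead of earned over years.

DEACTIVATION_CUTOFF = 4.6

def platform_trust(ratings):
    """Trust as a number: the mean rating stands in for a reputation."""
    return sum(ratings) / len(ratings)

driver_ratings = [5, 5, 4, 5, 3, 5, 5, 4, 5, 5]   # one bad day in the data

mean = platform_trust(driver_ratings)
print(f"mean={mean:.2f} ->", "active" if mean >= DEACTIVATION_CUTOFF else "deactivated")
```

One more three-star ride tips the mean below the cutoff: trust here is a threshold, not a relationship.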
Conclusion: What This Transformation Ultimately Means
The passage delivers a profound warning:
we are moving toward a society where human behaviour, relationships, and even the meaning of social life are reorganized around the needs of machine learning systems and market forces.
Reasoned conclusion
- Instrumentarianism is not violent, but it is deeply transformative.
- It reshapes society by data extraction, prediction, and behavioural steering, not by terror.
- Corporations imagine a future where society functions like a giant algorithm—predictable, coordinated, and profitable.
- Social trust, values, and relationships are replaced by computational certainty.
- Society as we understand it—messy, emotional, unpredictable, deeply human—is at risk of becoming obsolete.
If we fail to recognize this transformation, the future will not be shaped by democratic choice or moral reasoning but by the cold logic of machine learning and market optimization.
The question is no longer whether society will change, but whether humans will remain central to it—or become inputs to a grand computational design.