You don’t “join” TikTok communities the way you join a club.
You open the app. It opens a door.
Within minutes you’re walking among strangers you didn’t pick, inside jokes you didn’t learn, arguments you didn’t agree to referee. Someone calls it “my side of TikTok,” like it’s a ridge line you can point to. They’re not being poetic. They’re describing routing.
TikTok treats community as a navigation problem. It doesn’t ask, “Who are your friends?” It asks, “What keeps you here?” Then it builds a world around that answer.
That world can be generous. It can also be corrosive. Same machine, different trail.
What makes TikTok culturally distinct from the “follow your friends” era is that it doesn’t build community on your social graph. It builds it on an interest graph. When your neighbors are chosen by shared watch patterns, not shared history, you get fast intimacy with slow accountability. You can learn from a stranger for three minutes, feel understood, and never see them again.
Useful. Also fragile.
This piece follows a chain that’s easy to miss while you’re scrolling:
Ranking decides what shows. What shows repeats until it’s normal. Normal hardens into belonging. Belonging becomes who you are. And who you are decides what you trust.
TikTok is our case study because it’s the clearest mainstream example of an interest-first feed. It is not a timeline. It is a terrain generator.
The chain
Ranking decides what shows
A feed is a decision. There is no “the internet” in your hand. There is a ranked list.
TikTok has been unusually direct about its basic inputs: watch behavior, likes, shares, comments, follows, plus weaker signals like device and location [1]. Wired adds a detail people feel but rarely name: TikTok tests videos with smaller audiences first, then expands distribution if people watch to the end [2].
Technically, that’s a ranking loop. Culturally, it’s zoning. Ranking decides what is “nearby” in your attention, which means it quietly decides which people and ideas become present as social reality.
Two consequences matter.
First, the default is discovery-heavy. The system introduces you to strangers on purpose [1]. Niche communities form at speed. You can also be routed into a subculture before you have context for what you’re seeing.
Second, early signals are loud. A few minutes of watch time act like a trailhead sign. Once the system thinks you want “that kind of thing,” it keeps offering it. Repetition validates the guess.
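A minimal sketch of that loop, with invented names and thresholds. This is the pattern the Newsroom post [1] and the WIRED report [2] describe, not TikTok’s actual code:

```python
from dataclasses import dataclass

AUDIENCE_TIERS = [100, 1_000, 10_000, 100_000]  # illustrative batch sizes
PROMOTE_AT = 0.6  # hypothetical completion-rate bar for wider distribution

@dataclass
class Video:
    topic: str
    completion_rate: float = 0.0  # fraction of the test batch that watched to the end
    tier: int = 0                 # index into AUDIENCE_TIERS

def expand_if_watched(video: Video) -> None:
    """Test small, then expand: a video earns the next audience tier only
    if the previous, smaller batch watched it through."""
    if video.completion_rate >= PROMOTE_AT and video.tier + 1 < len(AUDIENCE_TIERS):
        video.tier += 1

def record_watch(profile: dict[str, float], video: Video, watch_fraction: float) -> None:
    """Early signals are loud: one long watch bumps a topic's weight, and
    the bumped topic is what the sampler reaches for next."""
    profile[video.topic] = profile.get(video.topic, 0.0) + watch_fraction
```

The compounding is the point: `record_watch` tilts what gets sampled next, and `expand_if_watched` decides whose guess gets scaled. A preference inferred in minute one shapes hour ten.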
McLuhan’s point lands here with less mystery: the medium is ranked selection [11]. It doesn’t just carry culture. It edits culture into view.
Repetition turns content into norms
Community is not just shared interest. It’s shared expectations.
How do we talk here? What counts as funny? What gets punished? What gets rewarded?
TikTok answers by repetition. The feed ranks, then repeats. Repetition teaches norms faster than any “community guidelines” page ever will.
The platform’s creative grammar accelerates this. Reuse a sound, stitch a clip, follow a template, swap in your meaning. Participation is easy. Norms become copy-pasteable.
On good days, repetition is apprenticeship. You watch ten variations on a skill. People correct each other. Someone who felt alone finds others who speak the same lived language.
On bad days, repetition becomes mood engineering. The norm is speed. The norm is certainty. The norm is the quick moral verdict delivered like a weather report. Postman warned that entertainment logic can swallow public life [12]. TikTok is not television, but retention logic still steers the tone.
Norms decide who belongs
Once norms form, belonging follows. Then exclusion.
A good community lets you be wrong in public and still come back tomorrow. Online, the penalty for being wrong can be instant exile, or instant fame. Both warp behavior. People either self-censor into beige or perform certainty like it’s oxygen.
Online belonging arrives fast. That’s convenient. It also means you don’t have to earn trust, and you don’t have to practice repair. You just have to perform the local vibe.
Legibility becomes a gate. You compress yourself into something the system can read and the community can approve.
Belonging hardens into identity
Once you belong, you start performing.
TikTok turns identity into a naming convention: BookTok, CleanTok, FinanceTok, TeacherTok. Labels help people find each other. Labels also tell people what to be consistent about.
Identity online is partly self-expression and partly optimization. You post. The feed responds. You adjust. The feedback is immediate and public.
This is where the platform shifts from showing you a community to shaping you for it. Not through command but through reward. Content matching the neighborhood’s norms gets surfaced. Content violating them gets ignored, mocked, or punished.
Zuboff’s critique applies at the local, observable level: when identity becomes a set of signals for a ranking system, people start optimizing their selfhood for legibility [14].
That’s also where equity shows up as mechanics. Noble’s work lays out how “neutral” algorithms reproduce and amplify social bias while looking objective [13]. When the system decides what gets seen, it decides what gets normalized. At scale, that becomes culture.
Identity reshapes trust
The tired debate is “Do algorithms polarize people?” The useful question is “What kind of trust does this environment produce?”
Trust is not agreement. Trust is the sense that other people are real, that they have reasons, and that future contact is possible without humiliation.
TikTok increasingly mediates public reality. Pew’s work shows users encounter news through different pathways, with TikTok users less likely than X or Facebook users to see news articles specifically [6]. Format changes feel. Feel changes interpretation.
Then there’s emotional temperature. A 2025 Science experiment reranked X feeds to reduce exposure to partisan animosity, and measured a shift in how warmly people felt toward the other side [8]. Different platform, same mechanism: ranking tunes feelings.
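The mechanism fits in a few lines. A toy version, assuming some classifier `animosity(post)` that returns a 0-to-1 score (the study used a trained model; the scorer and demotion weight here are placeholders):

```python
def rerank(posts: list[dict], animosity, weight: float = 0.5) -> list[dict]:
    """Reorder, don't remove: demote posts a classifier scores high for
    partisan animosity. The feed's inventory is unchanged; its emphasis isn't."""
    return sorted(posts, key=lambda p: p["base_score"] - weight * animosity(p),
                  reverse=True)
```

Nothing is deleted. The feed just reaches for hostility less often, which is the sense in which ranking tunes feelings.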
Pew reports trust in social media information hovers around the “one-third” range [7]. Whatever the exact number, many people now treat the information environment as hostile terrain.
Hostile terrain does not grow thick social bonds. It grows brittle tribes.
What we can do about it
If you accept the chain (ranking → norms → identity → trust), interventions stop looking like moral lectures and start looking like engineering. The goal is not to remove recommender systems. The goal is to make community-shaping infrastructure contestable, steerable, and less obsessed with compulsive motion.
Offer real choice, not a placebo. The EU Digital Services Act requires very large platforms to provide at least one recommender option not based on profiling [9]. “Choice” only counts if alternatives are usable, understandable, and presented as equals. A buried toggle is not a choice.
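The engineering here is almost trivial, which sharpens the point: what’s scarce is placement, not code. A sketch with assumed field names:

```python
def profiled_feed(posts: list[dict], profile: dict[str, float]) -> list[dict]:
    """The default: rank by learned interest weights."""
    return sorted(posts, key=lambda p: profile.get(p["topic"], 0.0), reverse=True)

def non_profiled_feed(posts: list[dict]) -> list[dict]:
    """A non-profiled alternative of the kind the DSA requires:
    no personal signals, just recency."""
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)
```

The legal bar is that the second function exists. The cultural bar is that it sits one tap from the default and is presented as an equal.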
Show people the shape of their feed. Give users a readable neighborhood view: what clusters they’re in, what topics dominate, what shifts when they act. Content controls are fine. Structural visibility is better.
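One hedged sketch of what that view could compute, assuming the platform labels each served video with a topic cluster (the labels here are invented):

```python
from collections import Counter

def neighborhood_view(served_topics: list[str]) -> list[tuple[str, float]]:
    """Turn a stretch of impressions into topic shares: what dominates this feed?"""
    counts = Counter(served_topics)
    total = sum(counts.values()) or 1
    return [(topic, count / total) for topic, count in counts.most_common()]

# neighborhood_view(["booktok"] * 70 + ["cleantok"] * 20 + ["news"] * 10)
# -> [("booktok", 0.7), ("cleantok", 0.2), ("news", 0.1)]
```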
Break topic lock-in on purpose. If a user signals “less of this,” the system should respond quickly, not eventually. TikTok’s preference tools like Manage Topics [3] and keyword filtering [4] are a start. Credible reset and cooldown modes are the next step.
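“Quickly” has a concrete shape. A sketch of the behavior, where the multiplicative penalty and the function names are assumptions, not TikTok’s implementation:

```python
def not_interested(profile: dict[str, float], topic: str, penalty: float = 0.1) -> None:
    """Respond now: slash the topic's learned weight immediately instead of
    letting it decay over weeks of withheld engagement."""
    if topic in profile:
        profile[topic] *= penalty

def reset_feed(profile: dict[str, float]) -> None:
    """A credible reset: drop the learned weights entirely so the feed
    re-explores rather than replaying last month's self."""
    profile.clear()
```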
Make pile-ons less profitable. Public shaming is a retention engine. Treat it like one. Platforms can detect dogpiling and harassment dynamics and reduce distribution accordingly. If that feels political, good. Community governance is political.
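Detection doesn’t need to be exotic to be useful. A crude first-pass signal with invented thresholds; a production system would also weigh sentiment, quote-to-reply ratios, and overlap among participating accounts:

```python
def looks_like_pile_on(hourly_inbound: list[int], baseline_per_hour: float,
                       spike_factor: float = 20.0) -> bool:
    """Flag a sudden surge of inbound mentions on one account, far above
    its normal rate. On a flag, distribution of the triggering content
    can be damped while the surge is reviewed."""
    peak = max(hourly_inbound, default=0)
    return peak > spike_factor * max(baseline_per_hour, 1.0)
```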
Fund independent audits, then tolerate the results. The DSA includes provisions for vetted researcher access to platform data [9], and the European Commission has published guidance on how that access should work [10]. If systems shape culture, society needs the ability to measure that influence without asking the system’s owner for permission every time.
Move metrics closer to human outcomes. Watch time is easy to optimize. It is also culturally blunt. If a platform claims it supports community, it should track community health: regret, harassment rates, concentration of exposure, satisfaction that holds up a week later. Some growth is just extraction with better lighting.
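Two of those are measurable with almost no machinery. A sketch: the regret prompt is hypothetical, and the concentration measure is a standard Herfindahl index over the topic shares from the neighborhood view above:

```python
def regret_rate(sessions: list[dict]) -> float:
    """Share of sessions the user later flagged as regretted,
    e.g. via a 'was that time well spent?' prompt a week on."""
    if not sessions:
        return 0.0
    return sum(1 for s in sessions if s.get("regretted")) / len(sessions)

def exposure_concentration(topic_shares: list[float]) -> float:
    """Herfindahl index of feed exposure: near 1.0 means one topic owns
    the feed; near 0 means exposure is spread. A lock-in dashboard metric."""
    return sum(share * share for share in topic_shares)
```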
A final orientation
When someone says “TikTok showed me my people,” they’re telling the truth. They’re leaving out the agent.
The agent is a ranking system that decided which people would feel real to you today.
The sharp questions are not new. They are old questions wearing new infrastructure:
What does the system make easy? What does it make unlikely? Who gets to contest the defaults?
Every community technology answers these questions, usually by not asking them. The printing press answered them. The telephone answered them. The shopping mall answered them. Now the feed answers them, at scale, in real time, while you scroll.
The difference is speed and opacity. The feed decides faster than you notice. The logic is proprietary. The effects accumulate before they become legible.
Friction would help. Not friction that blocks, but friction that reveals. Surfaces that show you the shape of your own attention before it hardens into identity. Pauses that let you ask: Is this where I want to be?
We are not going to abolish recommender systems. We are going to live inside them. The question is whether we build the tools and the habits that let us see the walls.
Sources
- [1] “How TikTok recommends videos #ForYou,” TikTok Newsroom (2020)
- [2] Louise Matsakis, “TikTok Finally Explains How the ‘For You’ Algorithm Works,” WIRED (2020)
- [3] “More ways to discover new content and creators you love,” TikTok Newsroom (2025)
- [4] “TikTok rolls out AI-powered Smart Keyword Filters to limit content you don’t want to see,” TechCrunch (2025)
- [5] Ibrahim et al., “TikTok’s recommendations skewed towards Republican content during the 2024 U.S. presidential race,” arXiv (2025)
- [6] “How Americans get news on TikTok, X, Facebook and Instagram,” Pew Research Center (2024)
- [7] “How Americans’ trust in information from news organizations and social media sites has changed over time,” Pew Research Center (2025)
- [8] Piccardi et al., “Reranking partisan animosity in algorithmic social media feeds alters affective polarization,” Science (2025)
- [9] Regulation (EU) 2022/2065 (Digital Services Act), EUR-Lex (2022)
- [10] “FAQs: DSA data access for researchers,” European Commission (2025)
- [11] Marshall McLuhan, Understanding Media: The Extensions of Man (1964)
- [12] Neil Postman, Amusing Ourselves to Death (1985)
- [13] Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism, NYU Press (2018)
- [14] Shoshana Zuboff, The Age of Surveillance Capitalism (2019)
