Lesson 4 of 6 · Digital Literacy

Spread and Algorithms

What you see online is not a neutral window on the world. It has been shaped by systems designed to keep your attention, not inform you accurately.

Introduction

Your feed is curated for you.

Every major platform uses algorithms that decide what content to show, amplify, and recommend. These systems are not designed to show you the most accurate information. They are designed to maximise engagement: time on the platform, clicks, reactions, and shares.

The core dynamic: Emotional content generates more engagement than calm, accurate content. Algorithms reward engagement. So emotional content, including content that makes you angry or afraid, spreads further and faster than balanced reporting.

This is not a conspiracy. It is an incentive structure. But it has real consequences for what most people believe about the world.
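
To make the incentive concrete, here is a minimal toy sketch in Python (not any platform's actual code; the post titles and numbers are invented) of what ranking purely by predicted engagement looks like. Notice that accuracy is never an input, so the most emotionally charged post ends up at the top of the feed.

```python
# A toy sketch of engagement-based ranking. Post titles and numbers are
# invented. The score rewards predicted reactions and shares; nothing in
# it measures whether a post is accurate.

posts = [
    {"title": "Calm, accurate policy explainer", "reactions": 40, "shares": 5},
    {"title": "Outrage-inducing hot take", "reactions": 900, "shares": 300},
    {"title": "Nuanced investigation", "reactions": 120, "shares": 20},
]

def engagement_score(post):
    # Weighted sum of predicted engagement signals. Accuracy is not an input.
    return post["reactions"] + 3 * post["shares"]

# The feed simply ranks by score, so the most emotional post comes first.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post["title"])
```

Real ranking systems are far more complex than this, but the underlying point holds: whatever the score rewards is what rises.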

Key Concepts

Three ideas to understand

Filter Bubbles

Algorithms learn your preferences and show you more of what you already engage with. Over time, you see less of what challenges your views and more of what confirms them. Your information environment narrows without you noticing.
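
As a rough illustration, the sketch below (a toy model with invented numbers, not a description of any real platform) shows how this feedback loop narrows a feed: each week the simulated algorithm shows more of the one topic you click on, until it dominates what you see.

```python
import random

# A toy model of the filter-bubble feedback loop described above.
# Two topics start with equal weight. Each week, every click on topic_a
# increases its weight, so the next feed contains more topic_a posts.
# All numbers are invented for illustration.

weights = {"topic_a": 1.0, "topic_b": 1.0}

def build_feed(weights, size=10):
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics], k=size)

for week in range(1, 6):
    feed = build_feed(weights)
    clicks = feed.count("topic_a")      # suppose you only click topic_a posts
    weights["topic_a"] += 0.5 * clicks  # engagement is rewarded with more weight
    print(f"Week {week}: {clicks}/{len(feed)} posts shown were topic_a")
```

Run it a few times: the exact numbers vary, but the share of the clicked topic climbs toward the whole feed.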

Viral Spread

A 2018 MIT study found that false news stories on Twitter spread about six times faster than true ones and reached far more people, largely because false stories tend to be more novel and more emotionally activating. Accuracy does not determine reach. Emotion does.

Echo Chambers

When most of what you see agrees with what you already think, disagreement starts to feel extreme. Moderate positions become invisible. This is a structural feature of social media, not a personal failing. Knowing this lets you compensate deliberately.

The practical response: You cannot opt out of algorithms, but you can deliberately diversify your sources. Follow accounts with different perspectives. Search for topics directly rather than waiting for the feed to bring them to you. Read past the headline.

Examples

How the feed shapes belief

The anger loop

You click on an outrage-inducing article. The algorithm notes the engagement and shows you more similar content. Over weeks, your feed fills with the most extreme versions of one viewpoint. You begin to assume everyone holds extreme views because that is all you see.

The invisible story

A complex policy story with nuance and multiple perspectives generates fewer clicks than a simplified, emotional version. The nuanced version receives fewer algorithmic recommendations and reaches fewer people. The simplified narrative spreads.

The deliberate reader

Someone actively follows sources they disagree with, searches for topics directly, and pauses before sharing. Their understanding of events is more complete and more accurate because they choose to look beyond their feed.

Quick Checks

Test your understanding

Answer each question correctly to unlock the next one.

Q1. What is the primary goal of most social media algorithms?
A To show users the most accurate and verified information.
B To show users the most recent content from accounts they follow.
C To maximise user engagement: time on the platform, clicks, and shares.
D To show users a balanced range of perspectives on each topic.
Q2. Why does emotionally charged content tend to spread faster than calm, accurate reporting?
A People prefer emotional content because they find it more informative.
B Emotional content generates more clicks, shares, and reactions. Algorithms reward that engagement by amplifying the content further.
C Emotional content is usually more recent and therefore ranked higher.
D Platforms deliberately promote emotional content for political reasons.
Q3. What is a filter bubble?
A A deliberate attempt by a platform to hide political content.
B A feature that blocks offensive or harmful content.
C A personalised information environment created when algorithms show you increasingly more of what you already engage with, reducing exposure to different viewpoints.
D A setting users can turn on to filter their own feeds.
Q4. You notice your social feed mostly shows content that confirms your views. What is the most useful step to take?
A Switch to a different platform entirely.
B Stop using social media for news.
C Accept it. Algorithmic personalisation is unavoidable.
D Deliberately seek out sources with different perspectives, search for topics directly, and expand the range of accounts and outlets you follow.
Q5. A story is shared by millions of people across multiple platforms. Does this mean the story is likely true?
A Yes. Mass sharing functions as collective fact-checking.
B No. Research shows false information often spreads more widely than true information online. Reach reflects emotional resonance, not accuracy.
C Only if verified accounts are among those sharing it.
D Yes, if the story has been circulating for more than 24 hours.

Mini-Game

Feed Simulator

You are shown a feed of posts. Over four rounds, choose which post to engage with. Watch how your choices shape what you see next, and what you miss.

Practice Round

Five more questions

Apply what you have learned. Each question unlocks after you answer the previous one.

Question 1 of 5
A friend says "I see the same kinds of opinions everywhere online, so everyone must think this way." What is the most accurate response?
A Your friend is right: popular views are more visible online.
B Your friend should switch platforms to get a more accurate view.
C Your friend is experiencing a filter bubble. Algorithmic personalisation shows you more of what you engage with, creating the illusion that your information environment represents everyone.
D The observation is valid: social media is a reliable proxy for public opinion.
Question 2 of 5
A nuanced article about a policy issue gets 200 shares. A simplified, emotionally charged version of the same story gets 50,000 shares. What does this tell you?
A The second article is more accurate because it resonated with more people.
B Algorithmic amplification rewards emotional engagement over accuracy. The nuanced version reaches fewer people not because it is worse, but because the platform's incentives work against complexity.
C People are too busy to read long articles, so shorter versions always spread more.
D The first article failed because its author had fewer followers.
Question 3 of 5
What is the most effective way to counteract echo chambers in your own information diet?
A Trust only content from sources you have used for a long time.
B Avoid all social media and rely only on traditional news.
C Engage only with content that has been algorithmically ranked as high quality.
D Actively follow sources that represent different perspectives, search for topics directly, and read beyond your default feed.
Question 4 of 5
A false health rumour spreads to 3 million people in 24 hours. A correction from the health authority reaches 40,000 people. What structural factor explains this?
A The health authority posted the correction too late.
B People distrust official sources.
C Algorithms amplify emotionally engaging content regardless of accuracy. The rumour triggered fear and sharing. The correction was accurate but less emotionally compelling, so the algorithm showed it to far fewer people.
D The correction was written in language too technical for a general audience.
Question 5 of 5
Every time you see an outrage-inducing post on your feed, you click and react. Over time, what effect does this have?
A No effect: algorithms do not track individual post interactions.
B The algorithm learns that outrage-inducing content keeps you engaged and shows you progressively more of it. Your feed narrows toward increasingly extreme emotional content.
C You will eventually be shown a wider range of perspectives as the algorithm tests new content types.
D Your profile is flagged as high-engagement and shown to more advertisers.

Reflection

Think it through

Think about the last time you felt strongly about something you saw on social media. Was the content accurate, or was it emotionally compelling? Did you share it? Would you approach it differently now?

This is just for you. Nothing is saved or submitted.
