Lesson Plan
Bias Busters Plan
Students will define algorithmic bias, identify unfair AI examples, and propose fairness improvements through collaborative analysis and reflection.
Understanding algorithmic bias enhances critical thinking and digital literacy, empowering students to recognize and challenge unfairness in technology.
Audience
5th Grade
Time
45 minutes
Approach
Interactive scenarios, group exploration, and discussion
Materials
- Fair or Foul AI slide deck
- Algorithm Stories handout
- Sticky notes
- Chart paper
- Markers
Prep
Prepare Materials
10 minutes
- Review Fair or Foul AI slide deck
- Print copies of Algorithm Stories
- Gather sticky notes, chart paper, and markers
- Set up projector for the slide deck
Step 1
Warm-Up: Fair or Foul AI
5 minutes
- Display sample AI scenarios from Fair or Foul AI
- Ask students to vote whether each outcome is fair or unfair
- Invite 2–3 students to share quick reasons for their votes
Step 2
Group Exploration: Algorithm Stories
15 minutes
- Divide students into small groups of 3–4
- Provide each group with Algorithm Stories
- Instruct groups to read and highlight examples of bias
- Have students record each bias example on a sticky note
Step 3
Discussion: Fairness Forum
15 minutes
- Reconvene whole class and display chart paper
- Invite each group to place their sticky-note examples on the paper
- Facilitate discussion: Why did this bias occur? What impact does it have?
- Record student ideas for solutions around the chart
Step 4
Assessment: Bias Fix Proposals
10 minutes
- Ask each student to write a brief reflection that includes:
  - One bias they observed
  - One suggestion to make the AI fairer
- Collect reflections to assess understanding and gather students' suggestions
Slide Deck
Fair or Foul AI
Welcome! In this activity, we’ll explore three AI scenarios. For each one, decide: Is it fair, or is it foul?
Introduce the lesson. Explain that today we’ll look at real AI examples and decide if they’re fair or foul. Encourage curiosity and respectful discussion.
What Is Algorithmic Bias?
• Algorithms are sets of instructions that help computers make decisions.
• Bias happens when these instructions treat some people unfairly.
• We want to spot bias and learn how to fix it.
Define algorithmic bias in kid‐friendly terms. Emphasize that computers can learn bad habits from people.
Why Fairness Matters
• Unfair AI can hurt feelings or stop people from getting help.
• Fair AI treats everyone equally, no matter their background.
• We all deserve tools that are honest and fair.
Discuss why fairness in AI matters. Relate to everyday experiences like games or homework apps.
Scenario 1: Face Recognition Fun
A new app uses a camera to identify classmates’ faces. It works well on some students but fails more often on students with darker skin, showing the wrong name.
Read the scenario aloud. Ask students to show thumbs-up for “fair” or thumbs-down for “foul.” Record responses on the board.
Scenario 2: Playtime Preference
An AI suggests activities at recess. It recommends basketball for boys and dance for girls, even when interests differ.
After votes, ask 2–3 students why they voted fair or foul. Guide them toward understanding training data issues.
Scenario 3: Homework Helper
An AI divides chores and study tasks. It gives harder assignments to students from certain neighborhoods, assuming they need more discipline.
Encourage students to think about stereotypes. Prompt ideas for how to make the app learn from everyone’s interests.
Key Takeaways
• AI can copy human bias if we’re not careful.
• Spotting unfairness helps us ask better questions.
• We can make AI fairer by using good data and listening to everyone’s voice.
Summarize the discussion. Reinforce that fair AI must be tested and improved to treat all people equally.
Reading
Algorithm Stories
In these short stories, you’ll read about different algorithms—special computer instructions that help make decisions. Notice how each one can treat people fairly or unfairly. As you read, look for examples of bias and think about how to make each algorithm fairer.
Story 1: The Book Buddy Algorithm
Every morning, Maya logs into her reading app called Book Buddy. It recommends new books based on what it thinks she likes. Maya notices it always suggests mystery books written by authors from one country. She loves adventure stories from around the world, but the app never shows them.
Why does this happen? Book Buddy learned from a small set of reading lists that mostly featured authors from one place. Because of this, it keeps suggesting similar books and ignores diverse voices Maya might enjoy.
Story 2: The Quiz Wizard
Leo uses Quiz Wizard to practice math and reading. One day, he tries the reading section about everyday life situations. The app asks questions about things like dialing rotary phones or mailing letters—things most kids today don’t know.
The algorithm was trained on old textbooks, so it forgets that technology and daily life change. Leo feels confused and frustrated because the questions don’t match his world.
Story 3: The Sports Coach App
Jamal and Sofia both try the Sports Coach App to find new games. Jamal sees recommendations for basketball, football, and skateboarding. Sofia sees tennis, ballet, and swimming. Neither app asked them what they liked first.
The algorithm used past data showing boys played certain sports and girls played others. By following these patterns, it assumes what each child will enjoy without asking them directly.
Reflection Questions
- Name one bias you noticed in one of the stories.
- Why could this bias be unfair or harmful to someone?
- What is one idea you have to make the algorithm fairer?
- Which story did you relate to the most, and why?
Use your answers in our Fairness Forum to share examples and solutions with classmates.
Discussion
Fairness Forum
In this whole‐class discussion, students share examples of bias they discovered and work together to propose and critique solutions. Use this guide to keep the conversation focused, respectful, and productive.
Purpose
• Celebrate discoveries from Algorithm Stories
• Deepen understanding of why bias happens and how it affects people
• Brainstorm realistic ways to make AI fairer
Discussion Setup (within the 15-minute forum)
- Display chart paper covered with each group’s sticky‐note examples.
- Ensure markers are ready for students to jot down key ideas.
- Remind everyone of our discussion guidelines.
Discussion Guidelines
- Listen respectfully; wait your turn to speak.
- Refer to specific examples (“In Story 2, the Quiz Wizard…”).
- Ask questions if you need clarity.
- Build on each other’s ideas.
Core Discussion Questions
- What bias did your group find?
  • Why do you think it happened?
- How might this bias affect real people?
  • Who could be hurt or left out?
- What suggestion did your group make to fix it?
  • Could we ask users more questions, add new data, or change the rules?
- Which solution seems most practical, and why?
  • How easy is it to test or update?
- Can you think of another app or game that might have the same bias?
  • How would you investigate it?
Follow-Up Prompts
- “Can someone restate Maria’s idea in their own words?”
- “What might happen if we only change the training data?”
- “How could we check that our fix really works?”
- “What challenges might come up when we try this solution?”
Recording & Next Steps
- As students share, write key points around each group’s sticky notes.
- Highlight common themes or unique ideas.
- At the end, summarize 2–3 top strategies for fairer AI.
- Connect back to our Bias Busters Plan to choose one idea for deeper exploration in future lessons.
Reflection Exit Ticket (5 minutes)
Ask each student to write on a new sticky note:
- One new insight they gained today.
- One question they still have about AI fairness.
Collect these as students leave to guide our next session.