AI and Ethics: A Mini-Unit for High School, Lesson Set Two

This is the second set of three lessons in a six-lesson mini-unit on the basics of how artificial intelligence and machine learning work and the ethics of artificial intelligence. In this lesson set, students learn about socio-technical systems, algorithms, and ethics as they redesign a recommender algorithm to make it less biased and more ethical. Students then examine how they use AI in their own lives and devise a plan to use it ethically.

Author: Sarah Horen
Grade Level: 9-12
Materials: Website (works best with Google Chrome and the Google Education Suite); YouTube.com
Tag: WySTACK

National Standards Alignment

CSTA: 3A-AP-13, 3A-AP-14, 3A-AP-15, 3A-AP-21, 3A-IC-24, 3A-IC-25, 3A-IC-26
ISTE: ISTE-1c, ISTE-1d, ISTE-2a, ISTE-2b, ISTE-4a, ISTE-4b, ISTE-5c, ISTE-5d, ISTE-6c, ISTE-6d

OVERVIEW

Activity Overview:

As described above, students in this lesson set learn about socio-technical systems, algorithms, and ethics as they redesign a recommender algorithm to make it less biased and more ethical, then examine their own AI use and devise a plan to use AI ethically.

Meta description

  • Subject Area: Computer Science, Technology, Advisement classes. Lessons can also be adapted to other content areas.
  • Grade Level: 9-12
  • Computer Science Domains:
    • Algorithms and Programming
    • Impacts of Computing
  • Computer Science Principles:
    • Fostering an Inclusive Computing Culture
    • Developing and Using Abstractions
    • Communicating About Computing
  • Materials:
    • Website (works best with Google Chrome and the Google Education Suite); YouTube.com
  • Considerations:
    • Educators will need a good general understanding of computer science concepts and of how machine learning works in order to answer student questions that may come up during the lesson. They should test the links to slide decks, student access to YouTube, and the other AI tools students will use, as some require a Google account. Slide decks can be converted to PowerPoint if needed.

Lesson Plan

Overview

This lesson set continues the six-lesson mini-unit: students learn about socio-technical systems, algorithms, and ethics as they redesign a recommender algorithm to make it less biased and more ethical, then examine their own AI use and devise a plan to use AI ethically.

ASSESSMENT PRE/POST-TEST

Rather than a single formal pre/post-test, this unit assesses students at the end of each lesson:

Lesson Four: Have students fill out an exit ticket where they write three to five sentences reflecting on the ethical implications of the YouTube recommender algorithms. How have they seen these implications reflected in their lives or the lives of others?

Lesson Five: Each group will turn in their worksheets for their new algorithm.

Lesson Six: Have students look back on the paragraph they wrote at the beginning of class on how they use AI in their daily life. Based on what they have learned during this mini unit, have them choose one area of their life where they could improve their use of AI. Have students craft a SMART (specific, measurable, achievable, relevant, and time-bound) goal to improve their use of AI in that area in order to become a more responsible and ethical AI user.

OBJECTIVES

Lesson Four:

  1. Understand that all technical systems are socio-technical systems, and that socio-technical systems are not neutral sources of information.
     1A. Understand the term “optimization” and recognize that humans decide the goals of the socio-technical systems they create.
     1B. Reason about the goals of socio-technical systems in everyday life and distinguish advertised goals from true goals (for example, the YouTube recommendation algorithm aims to make a profit for the company, while it is advertised as a way to entertain users).
     1C. Map features in existing socio-technical systems to identified goals.

Lesson Five:

  1. Recognize that there are many stakeholders in a given socio-technical system and that the system can affect these stakeholders differently.
     1A. Identify relevant stakeholders in a socio-technical system.
     1B. Justify why an individual stakeholder is concerned about the outcome of a socio-technical system.
     1C. Identify the values an individual stakeholder has in a socio-technical system, e.g., explain what goals the system should hold in order to meet the needs of a user.
     1D. Construct an ethical matrix around a socio-technical system.

Lesson Six:

  1. Be able to define bias and ethics and explain how they relate to artificial intelligence.
     1A. Provide real-world examples of each and explain why the ethical issues are present.
     1B. Describe various ways to use AI ethically in daily life.
     1C. Know how to identify bias and other ethical issues in AI algorithms.
     1D. Be able to define ways to use AI ethically.

CATCH/HOOK

Lesson Four: Once class has started, ask the students the following questions: How many of you have used YouTube before? How do you decide which YouTube videos to watch?

Lesson Five: Upon arrival, students brainstorm ways they could change the YouTube recommender to better serve their friend group. This can be done on paper or, if seated in groups, students could discuss with their tablemates.

Lesson Six: Students will do a quick write to answer the question “How do you use AI in your daily life?”

ACTIVITY INSTRUCTIONS

Lesson Four:

  1. Tell students that today we are going to consider socio-technical systems (put the diagram from GeeksforGeeks on the SmartBoard). At a basic level, a socio-technical system is a mixture of people and technology: the study of how technology is used and produced, looking at people, software, hardware, data, and laws/regulations. It is important to note that socio-technical systems always have a goal in mind and are not neutral sources of information. There are often many stakeholders in a given system (think program developers, users of differing backgrounds/needs, etc.), and these systems can impact each of them in different ways.
  2. Next, tell students we are going to use YouTube as a socio-technical system and examine its recommender function. Utilize the MIT AI Ethics Education Curriculum section on how algorithms can have various motives and goals (starting on page 94) to lead students through an activity examining the algorithms of the YouTube recommender.
  3. Lead the group in a short discussion about the possible ethical impacts of the recommender's existing algorithms; students will continue this train of thought on their exit ticket. Are the recommender algorithms ethical or not? How can they perpetuate the spread of misinformation? How do they exclude groups of people?
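For teachers who want a concrete artifact to project during this discussion, the idea that a recommender's ranking follows from the goal it optimizes can be sketched in a few lines of Python. The videos, attributes, and scoring functions below are invented for illustration and are not YouTube's actual system:

```python
# Toy demo: the same pool of videos, ranked under two different optimization goals.
# All video data and scores here are made up for classroom illustration.

videos = [
    {"title": "Clickbait challenge", "watch_minutes": 12, "accuracy": 0.20, "matches_interest": 0.3},
    {"title": "History documentary", "watch_minutes": 6,  "accuracy": 0.90, "matches_interest": 0.8},
    {"title": "Study skills tips",   "watch_minutes": 4,  "accuracy": 0.95, "matches_interest": 0.9},
]

def platform_score(video):
    """Optimize for time-on-site, a stand-in for the platform's profit goal."""
    return video["watch_minutes"]

def user_score(video):
    """Optimize for what the viewer says they want: accurate, relevant videos."""
    return video["accuracy"] + video["matches_interest"]

by_platform = sorted(videos, key=platform_score, reverse=True)
by_user = sorted(videos, key=user_score, reverse=True)

print([v["title"] for v in by_platform])  # the engagement goal ranks clickbait first
print([v["title"] for v in by_user])      # the user-value goal ranks clickbait last
```

The point for students: nothing about the video pool changed between the two rankings, only the goal the system was told to optimize.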

Lesson Five:

  1. Remind the class that last time you met, the group looked at the goals of the YouTube recommender and considered what impacts it has on stakeholders. Tell them that today they are going to redesign the YouTube recommender to better serve a group of stakeholders they select.
  2. Put students into groups or let them select a group to work with. Using the slide deck from the DAILy Workshop's Redesign YouTube activity and the worksheets from the MIT AI Ethics curriculum, walk students through designing their own version of the YouTube recommender, pausing as you go to give groups time to make decisions and fill out their sheets.
  3. Have a few groups share their new algorithms!
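If helpful, the redesign exercise can also be mirrored in code: instead of optimizing watch time alone, a toy recommender can weight several stakeholder values, much like turning a row of an ethical matrix into scoring weights. All of the videos, attributes, and weights below are hypothetical:

```python
# Toy sketch of a "redesigned" recommender: combine several stakeholder values
# into one score rather than optimizing watch time alone. Data are invented.

videos = [
    {"title": "Viral prank",       "watch_minutes": 10, "accuracy": 0.3, "age_appropriate": 0.4},
    {"title": "Science explainer", "watch_minutes": 5,  "accuracy": 0.9, "age_appropriate": 1.0},
]

# A (hypothetical) row of the ethical matrix turned into weights: how much
# each value the stakeholders care about counts toward the final ranking.
weights = {"watch_minutes": 0.1, "accuracy": 0.5, "age_appropriate": 0.4}

def redesigned_score(video):
    """Weighted sum of the values the student group chose to prioritize."""
    return sum(weights[key] * video[key] for key in weights)

ranked = sorted(videos, key=redesigned_score, reverse=True)
print([v["title"] for v in ranked])  # the accurate, age-appropriate video now wins
```

Student groups could experiment with the weights to see how each choice of stakeholder values changes which videos surface first.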

Lesson Six:

  1. Quickly review all the concepts covered during this mini unit – what is AI, what is supervised machine learning, bias in algorithms, socio-technical systems, and what AI ethics are/why they matter.
  2. Ask students to think about how they interact with AI in their daily lives. Have them write down examples of the interactions they have, then have students evaluate whether or not the AI they encounter is biased or unethical in any way.
  3. In small groups, have students consider how they are using AI in their daily lives. Have each group outline three ways that students can be ethical users/creators of artificial intelligence.
  4. Have each group share out the ways they suggested for students to ethically use/create AI. Consider gathering the suggestions and posting them somewhere for the group to reference as needed.

Supplements

Any items in this section are the property of, and licensed by, their respective owners.

REVIEW

Lesson Four: Review the term/concept of a socio-technical system; remind students that these systems are not bias-free and usually have some sort of motive or goal. Review the main facets of AI ethics.

Lesson Five: Review the concept of a socio-technical system, and why stakeholders/users are an important part of designing these systems.

Lesson Six: Review all the concepts covered during this mini unit – what is AI, what is supervised machine learning, bias in algorithms, socio-technical systems, and what AI ethics are/why they matter (this is completed in the first step of the lesson).

STANDARDS

CS Domains: Algorithms and Programming, Impacts of Computing
CS Principles: Fostering an Inclusive Computing Culture, Developing and Using Abstractions, Communicating About Computing