Module 8: Developing skills related to digital ethics

Aim of the module

The aim of this module is to equip educators with the essential knowledge and practical tools to promote ethical digital behaviour in educational contexts. It addresses key challenges such as privacy, consent, misinformation, and AI ethics. By fostering critical thinking and ethical awareness, the module enables educators to guide learners in becoming responsible digital citizens.

Learning outcomes

After completing this module, participants will have gained the knowledge and skills listed below:

  • Understand the fundamental principles of digital ethics, including privacy, consent, fairness, and accountability, and their application in digital environments.
  • Recognise and analyse ethical dilemmas related to digital tools, online interactions, and emerging technologies like Artificial Intelligence (AI) and big data analytics.
  • Learn about global frameworks and standards, such as the GDPR and UNESCO’s Guidelines on Digital Citizenship, and their implications for educational practices.
  • Identify the potential risks associated with digital platforms, such as misinformation, cyberbullying, and data breaches.
  • Develop strategies to teach digital ethics, fostering critical thinking, responsible decision-making, and respect for others in online spaces.
  • Design and implement interactive activities, including case studies and role-playing scenarios, to engage students in discussions on ethical issues.
  • Evaluate and adjust classroom practices and policies to promote ethical use of technology and digital tools.
  • Facilitate the creation of a digital code of conduct with students, encouraging collective responsibility and respect in online environments.

By achieving these learning outcomes, participants will be well-prepared to address the ethical challenges of the digital age and guide their students toward becoming responsible and ethical digital citizens.

List of Topics

Topic 1 – Ethical principles in digital environments: This topic introduces fundamental ethical principles that guide responsible behaviour in digital spaces. It explores key concepts such as privacy, consent, fairness, and accountability, emphasising their role in education. Educators learn how to foster a culture of respect and ethical digital citizenship by addressing challenges like data protection, misinformation, and online interactions.

Topic 2 – Data privacy and online security: This topic delves into the risks and best practices associated with data protection and cybersecurity in educational settings. It equips educators with tools to implement strong security measures, protect digital identities, and guide students in managing their digital footprints.

Topic 3 – Ethical challenges in emerging technologies: This topic examines the ethical dilemmas arising from advancements in AI, digital surveillance, and automation in education. Educators learn to critically assess technology’s impact on students and society, fostering responsible innovation.

Objective, Key Concepts, Skills to Develop

You will learn about key principles of digital ethics including privacy, consent, fairness, and accountability. You will be able to identify ethical challenges, evaluate digital risks, and teach responsible online behaviour. You will develop the skills to promote digital citizenship and support learners in making ethical decisions in digital environments.

Guide for Learning

Explore the module by starting with real-life scenarios and reflecting on key ethical principles. Engage with case studies, group activities, and guided questions to deepen your understanding and apply the content to your educational context. Use the suggested activities for classroom practice.

Motivating Case Study

Developing a Digital Code of Conduct

At Drinkwater Elementary School, students faced challenges with inappropriate digital behaviour, such as oversharing and disrespectful comments. Inspired by the YouTube video “Drinkwater Elem Digital Code of Conduct: BE SMARTER”, an educator collaborated with students to create a Digital Code of Conduct. This initiative emphasised ethical online behaviour, respectful communication, and shared accountability, fostering a positive digital environment.

Objective

To empower students to:

  • Understand ethical digital behaviour and respectful online communication.
  • Collaborate in developing guidelines for responsible technology use.
  • Apply critical thinking to resolve online conflicts and promote digital citizenship.

Implementation

Students participated in discussions, analysed examples from the video, and collaboratively drafted rules. Through group activities and reflections, they created a meaningful code to ensure ethical digital practices.

Expected Outcomes

  • Students will identify and apply the principles of ethical digital conduct, including privacy, respectful communication, and accountability in online interactions.
  • Students will work collaboratively to create a Digital Code of Conduct and critically evaluate online scenarios to resolve conflicts and promote responsible digital citizenship.
  • Students will demonstrate respect and inclusivity in their digital interactions, contributing to a safer and more positive online environment.

 Drinkwater Elem Digital Code of Conduct: BE SMARTER.

https://youtu.be/F_vxmpE0M7w?si=S9U0krkuQdyIAX6Q

Reflection Questions:

  1. If you could add one more guideline to the Digital Code of Conduct, what would it be and why?
  2. How did the process of creating a Digital Code of Conduct help you recognise the importance of respectful and ethical online behaviour?
  3. What strategies from the Digital Code of Conduct will you personally apply in your daily online interactions, and why?

Topic 1 - Ethical principles in digital environments

You will learn about the foundational ethical principles of privacy, consent, fairness, and accountability in digital environments. You will be able to identify ethical challenges, promote responsible behaviour online, and apply these principles to foster respectful and equitable digital practices. By the end of this topic, you will understand how these principles shape ethical digital decision-making and their impact on digital communities.

This topic takes a practical approach, guiding educators to apply ethical principles in real-life scenarios. Through structured activities, case studies, and reflective discussions, participants will explore teaching strategies, design interactive exercises, and promote digital responsibility effectively.

Privacy in digital environments is essential for protecting personal information from misuse. In schools, this means safeguarding student records, digital communications, and other sensitive data. The General Data Protection Regulation (GDPR) requires schools to ensure that personal data is collected, stored, and used responsibly. Educators looking to strengthen their understanding of digital privacy in education can refer to the European Framework for the Digital Competence of Educators (DigCompEdu), which provides guidelines on how teachers can integrate data protection principles into their digital teaching practices. (Link: https://op.europa.eu/en/publication-detail/-/publication/fcc33b68-d581-11e7-a5b9-01aa75ed71a1/language-en)

Teachers play a crucial role in maintaining digital privacy and security. They can protect student data by using encrypted platforms, strong passwords, and secure communication tools. Additionally, it’s important to educate students about digital footprints—the traces they leave online that can have long-term consequences. A great way to promote privacy awareness is through real-life examples and best practices. For instance, educators can guide students in analysing privacy policies of commonly used applications by using 7 Best Practices for Student Data Security and Privacy Compliance, which offers practical strategies for ensuring online safety. (Link: https://innovaresip.com/student-data-privacy-best-practices/).

Group activity: Analysing a school privacy policy

Objective: To understand the importance of privacy in digital environments and analyse risks in educational settings.
Duration: 40 minutes
Materials Needed: A School Privacy Policy Model (provided in the annex) and access to GDPR resources.
Implementation:

  • Review a school’s privacy policy with students. (See the Annex: School Privacy Policy Model for the full document)
  • Analyse scenarios where student data could be compromised (e.g., oversharing, unsecured platforms).
  • Propose solutions to mitigate these risks.

Expected outcomes:

  • Participants will recognise privacy risks and learn strategies to secure digital data.

Group work: Privacy policy deciphering

Objective: To teach students how to analyse privacy policies and recognise data collection practices.
Duration: 40 minutes
Materials needed: Excerpts from real privacy policies (Google, TikTok, school platforms).

Implementation:

  • Provide students with sections of privacy policies from different platforms.
  • Ask them to identify key information, such as:
    • What data is collected?
    • How is the data used?
    • Who has access to this data?
  • Discuss why companies collect data and how students can adjust settings to protect their privacy.
  • Conclude with a student-led discussion on their comfort level with data collection and what they would like to see changed in privacy policies.

Expected outcomes:

  • Students will develop critical reading skills to understand privacy policies.
  • They will become more conscious of how companies use personal data and learn to adjust privacy settings accordingly.
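
Educators who want to extend the deciphering exercise into a computing lesson can demo a simple keyword scan. The Python sketch below is a minimal illustration under stated assumptions: the keyword list and the sample excerpt are invented for the example, not taken from any real policy. It counts how often data-collection terms appear, giving students a rough map of what a policy emphasises.

    # Minimal sketch: scan a privacy-policy excerpt for data-collection keywords.
    # Keyword list and sample text are illustrative assumptions only.
    import re
    from collections import Counter

    KEYWORDS = [
        "collect", "share", "third party", "advertising",
        "location", "cookies", "retain", "delete", "consent",
    ]

    def scan_policy(text: str) -> Counter:
        """Count case-insensitive occurrences of each keyword in the text."""
        lowered = text.lower()
        return Counter({kw: len(re.findall(re.escape(kw), lowered)) for kw in KEYWORDS})

    if __name__ == "__main__":
        sample = ("We collect your location and share data with third party "
                  "advertisers. We retain information until you ask us to delete it.")
        for keyword, count in scan_policy(sample).most_common():
            if count:
                print(f"{keyword}: {count}")

Students can paste in excerpts from the policies they analysed and compare which terms dominate across platforms.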

Consent and informed decision-making

Consent ensures individuals have control over their data by giving explicit permission before it is collected, shared, or used. In education, informed consent is particularly critical when engaging students with digital platforms, third-party tools, or online surveys. Educators must communicate clearly with parents and students about the purposes of data collection and how it will be managed.

Informed decision-making goes beyond legal obligations; it empowers students to understand their rights and responsibilities in digital interactions. For instance, students should be aware that tagging someone in a social media post or sharing a peer’s project requires their explicit approval. Educators can foster this understanding through class discussions and real-life scenarios, helping students recognise the importance of consent in maintaining trust and respect in digital spaces. For further exploration, the article Understanding Consent in a Digital World (https://saferinternet.org.uk/blog/understanding-consent-in-a-digital-world) provides practical insights into the concept of digital consent and offers tips on educating students about the importance of informed and respectful data-sharing practices.

Additionally, the resource How to Practice Digital Consent (https://nofiltr.org/resource/how-to-practice-digital-consent/) offers a comprehensive guide for students and educators on how to engage in digital consent in a thoughtful and responsible manner, covering key strategies for applying consent principles in online settings.

This principle also applies to collaborative projects where data is shared. Educators can set rules for obtaining consent before publishing group work or multimedia presentations. By promoting transparency, educators create an environment where students feel confident about their choices and the security of their data.

Group Discussion: Navigating digital consent

Objective: Help students analyse the importance of informed consent in digital interactions and explore ethical strategies for data protection.

Duration: 30-40 minutes
Materials needed: Printed case study, discussion prompts, flip chart, or notebook.

Implementation:

  • Present the printed case study on digital consent to the class.
  • In small groups, students discuss the following questions:
    • What should a clear consent policy include?
    • How can schools educate parents and students about data collection?
    • What are the risks if consent is not properly obtained?
  • Groups discuss solutions and present their recommendations to the class.

Expected Outcomes:

  • Participants understand why informed consent matters in digital interactions.
  • They can identify the risks of collecting or sharing data without proper consent.

Peer Learning Activity: Red Light, Green Light: Digital Consent Edition

Objective: Help students differentiate between acceptable and unacceptable data-sharing practices.
Duration: 30 minutes
Materials: Red and green cards (or digital polling tools).

Implementation:

  • Read out a series of data-sharing scenarios (e.g., tagging a classmate in a photo, sharing a peer’s project without asking, reposting a private message).
  • Students hold up a green card if the practice is acceptable and a red card if it is not (or vote using a digital polling tool).
  • After each scenario, briefly discuss whether informed consent was present and why.

Expected outcomes:

  • Students will be able to identify situations where digital consent is required.
  • They will understand that not all data-sharing is ethical, even if legal.

Fairness and addressing digital inequities

Fairness in digital environments focuses on providing equitable access to technology, ensuring that all students, regardless of socioeconomic background, geographical location, or ability, have equal opportunities to succeed. Digital inequities manifest in various forms, such as lack of internet access, insufficient devices, or limited digital literacy. Addressing these inequities is essential to creating inclusive educational experiences.

Schools can adopt strategies like loaning devices, offering internet subsidies, or providing offline materials to ensure every student can participate in digital learning. Educators, on the other hand, should design lesson plans that cater to diverse needs. For instance, assignments should not solely rely on digital resources if some students have limited access.

Fairness also extends to the use of digital tools and technologies. Educators should evaluate educational software and platforms for biases that may disadvantage certain groups. For example, automated grading systems must be scrutinised to ensure they assess students equitably. By fostering inclusivity and mitigating biases, educators can create a learning environment that reflects the values of equity and justice.

Moreover, fairness involves raising awareness among students about the ethical use of digital tools. Educators can guide students in examining the societal impact of technologies and encourage discussions about how these tools can be designed to benefit all users, including marginalised groups.

For further support, Equitable Access Resources by Digital Promise (https://digitalpromise.org/digital-equity/) provides practical tools and guidance for educators to ensure fairness in using educational technology, helping create more inclusive and accessible digital learning environments.

Accountability in Digital Practices

Accountability ensures that individuals and organisations take responsibility for their actions in digital environments, fostering trust, transparency, and ethical behaviour. In educational contexts, accountability includes adhering to ethical guidelines, addressing online misconduct, and promoting responsible use of digital tools.

Educators play a crucial role in establishing accountability. By modelling ethical practices, such as respecting student privacy, obtaining consent, and using digital tools responsibly, they set the foundation for a respectful online culture. Additionally, schools should implement clear policies that outline acceptable behaviour, consequences for violations, and reporting mechanisms for issues like cyberbullying or inappropriate content. These guidelines provide a framework for maintaining a safe and ethical digital environment. For further guidance, the Digital Citizenship Education Handbook (https://rm.coe.int/digital-citizenship-education-handbook/168093586f) offers practical tools and strategies to support ethical digital behaviour in schools.

Students also have an important role in maintaining accountability. By participating in the creation of a classroom code of conduct, students gain a sense of ownership and responsibility. This code can include rules for respectful communication, proper use of digital tools, and appropriate online collaboration. Through this collaborative process, students learn to take responsibility for their actions and understand the impact of their behaviour on others.

At the institutional level, schools must extend accountability to the adoption and use of digital technologies. Regular audits should be conducted to ensure compliance with ethical standards and the protection of data privacy. Schools should also establish feedback mechanisms to address concerns from students, parents, and educators. These practices reinforce a culture of trust and continuous improvement. Resources like ICT Tools for Improving Transparency and Accountability in Education available at https://www.iiep.unesco.org/en/projects/digital-tools-promote-transparency provide strategies and examples for enhancing accountability through technology.

Finally, educators can enhance accountability by incorporating real-world examples of its presence—or absence—in digital environments. Analysing case studies of data breaches or online misconduct allows students to critically evaluate actions, identify ethical lapses, and propose strategies to prevent similar issues. This approach not only fosters ethical awareness but also empowers students to navigate digital spaces responsibly.

Activities

Project-based activity: Tech for good: Designing inclusive tools

Objective: Encourage students to think critically about designing fair digital tools.

Duration: 45 minutes

Materials needed: Brainstorming sheets or digital whiteboards.

Implementation:

  • In small groups, students identify a barrier that keeps some learners from benefiting from digital tools (e.g., cost, disability, language, limited connectivity).
  • Groups brainstorm and sketch a digital tool or feature that addresses the barrier they chose.
  • Each group presents its design, explaining how it promotes fairness and accessibility.

Expected outcomes:

  • Students apply digital equity principles in technology design.
  • They consider accessibility and fairness in educational tools.

Scenario-based activity: Disconnected dreams

Objective: Help students understand the impact of digital inequities and explore possible solutions.
Duration: 40 minutes
Materials needed: Fictional scenario of a student facing digital access challenges.

Implementation:

  • Divide students into small discussion groups.
  • Present the following scenario:
    • Maria is a high school student in a rural town with unreliable internet access. Her school requires online assignments and group projects, assuming all students have constant internet access. Maria can only access the internet a few times a week at a community centre 10 kilometres away. She struggles to participate in group projects and submit assignments on time, affecting her grades and confidence.
  • Groups analyse the challenges Maria faces and discuss:
    • How do digital inequities impact students like Maria?
    • What solutions could her school implement to support students with limited internet access?
    • What alternative learning strategies could her teachers adopt to make learning fairer?
  • Each group presents their proposed solutions.
  • Facilitate a whole-class discussion on practical ways to bridge the digital divide.

Expected outcomes:

  • Students will recognise how digital inequities affect academic performance.
  • They will brainstorm realistic solutions for schools and communities to ensure fair digital access.
  • They will develop critical thinking and empathy by exploring the perspectives of students with limited resources.

Case Study: The Viral Post – A Digital Ethics Discussion

Objective: Help students evaluate privacy, consent, fairness, and accountability in online interactions.
Duration: 45 minutes

Materials: Internet access, ethical dilemma worksheets, digital ethics guidelines.

Implementation:

  • Present the Scenario:

Alex, a student, posts a photo of his classmate Taylor at a private school event on social media without permission. The post goes viral, attracting both positive and negative attention. Taylor feels uncomfortable and did not consent to the exposure. The school gets involved, raising ethical concerns about privacy, digital responsibility, and accountability.

  • Divide students into small groups to discuss:
    • Did Alex violate Taylor’s privacy?
    • How should accountability be handled in this situation?
    • What steps could Alex have taken to share the photo ethically and respect Taylor’s wishes?
    • How should schools address consent and privacy concerns in digital spaces?

  • Each group presents its analysis and suggested solutions.
  • The class discusses best practices for responsible digital behaviour.
  • Students develop a list of digital consent and accountability rules to apply in their own digital interactions.

Expected outcomes:

  • Students recognise privacy and consent violations in real-world scenarios.
  • They develop solutions for ethical digital behaviour.
  • They understand how accountability applies in online spaces.
  • They propose preventative measures for responsible digital interactions.

Examples

Group work: Digital code of conduct – Creating classroom accountability rules

Objective: Engage students in developing rules for responsible digital behaviour.
Duration: 45 minutes
Materials Needed: Chart paper or digital collaboration tools (Google Docs, Padlet).

Implementation:

  • Start with a discussion:
    • What does accountability mean in digital spaces?
    • What are examples of ethical vs. unethical online behaviour?
  • Ask students to work in small groups to draft sections of a classroom digital code of conduct, covering:
    • Respectful communication in online discussions.
    • Consequences for plagiarism or misinformation.
    • Proper use of digital tools and resources.
  • Groups present their rules, and the class finalises an agreed-upon code.

Expected outcomes:

  • Students take ownership of digital responsibility by setting accountability guidelines.
  • They develop a clear understanding of ethical digital behaviour.

Group discussion: The anonymous post

Objective: Teach students the consequences of online anonymity and accountability.
Duration: 30 minutes
Materials needed: Fictional cyberbullying scenario.

Implementation:

  • Present the scenario: A student receives an anonymous, hurtful message on a school platform. The sender believes they won’t get caught because it’s anonymous. The victim reports it to the teacher, but there’s no clear way to trace it.
  • In groups, students discuss:
    • Should anonymous posting be allowed?
    • How can schools balance privacy with accountability?
    • What steps can students take if they witness online bullying?
  • Groups present their solutions and discuss how schools can prevent cyberbullying while ensuring fair digital policies.

Expected outcomes:

  • Students understand how anonymity impacts accountability.
  • They explore ethical ways to report misconduct.

Topic 2 – Data Privacy and Online Security

You will learn about the fundamentals of data privacy and online security, focusing on identifying and mitigating cybersecurity risks, implementing best practices for data protection, and understanding the impact of digital footprints. You will develop skills to teach secure online habits, evaluate privacy tools, and help students manage their digital presence responsibly. By mastering these concepts, you will foster a culture of safety and accountability in digital spaces.

The learning process for Data Privacy and Online Security adopts a hands-on, reflective approach to empower educators and students to navigate the digital world safely. This guide is structured to ensure participants develop a practical understanding of cybersecurity risks, data protection practices, and the management of digital footprints.

Understanding cybersecurity risks

Cybersecurity threats are a growing concern in schools, where student and teacher data must be protected from cyberattacks. Some of the most common risks include phishing attacks, where hackers send fake emails to steal personal information, and malware, which can damage files and systems. Ransomware is another serious threat that locks school data and demands payment for its release, while weak passwords make it easier for cybercriminals to gain access to accounts. A more detailed analysis of these cybersecurity risks and their impact on education can be found in Cybersecurity Challenges in Education (Link: https://nordlayer.com/blog/cybersecurity-challenges-in-education/ ).

To reduce these risks, schools must follow strong cybersecurity practices, such as keeping software updated, using multi-factor authentication (MFA) to protect accounts, and teaching students how to recognise potential cyber threats. Educators should also include cybersecurity awareness in their lessons by using real-life scenarios and interactive activities. For example, students can be given sample phishing emails and asked to identify suspicious details, such as unusual links or urgent language.

Another way to build awareness is by showing educational videos on cybersecurity best practices. The “Pause, Think, and Act” video provides a great starting point, demonstrating practical strategies for staying safe online. This video can be accessed here: (Link: https://youtu.be/JpfEBQn2CjM?si=1133pS0wL_gGXFfi ). By making cybersecurity education a routine part of digital learning, schools can equip students with the skills needed to navigate online spaces securely and responsibly.
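
To connect this with hands-on practice, educators can show students that the red flags they hunt for manually can also be written down as explicit rules. The short Python sketch below is a minimal, hypothetical rule-based filter: the phrases and weights are assumptions chosen for illustration and are far simpler than a real spam filter.

    # Minimal sketch of rule-based phishing "red flag" scoring.
    # The phrases and weights are illustrative assumptions, not a production filter.
    RED_FLAGS = {
        "urgent": 2,
        "immediately": 2,
        "verify your account": 3,
        "password": 3,
        "click the link": 2,
        "suspended": 2,
    }

    def phishing_score(email_text: str) -> int:
        """Sum the weights of every red-flag phrase found in the email."""
        text = email_text.lower()
        return sum(weight for phrase, weight in RED_FLAGS.items() if phrase in text)

    if __name__ == "__main__":
        email = ("URGENT: your school account will be suspended. "
                 "Click the link and enter your password immediately.")
        score = phishing_score(email)
        print(f"Red-flag score: {score} -> {'suspicious' if score >= 5 else 'looks ok'}")

Running the sketch on a mix of genuine and fake emails also exposes the limits of such rules: a legitimate IT reminder can score high, which opens a useful discussion about false positives and why human judgment still matters.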

Best practices for data protection

Data protection in educational settings involves safeguarding personal and institutional information from loss, theft, or misuse. This requires adherence to data protection laws, such as the General Data Protection Regulation (GDPR), which mandates transparent data handling practices. Schools must ensure that only necessary data is collected, stored securely, and accessed by authorised personnel.

Educators play a critical role in promoting data protection by modelling responsible behaviours. This includes using encrypted communication tools, minimising unnecessary data sharing, and securely storing sensitive information. For example, when assigning digital projects, educators can use platforms with robust privacy policies to protect student submissions.

Educators should also incorporate lessons on data protection into their teaching practices. Students can be taught to create strong, unique passwords, recognise suspicious activities, and utilise privacy settings on digital platforms. By integrating these lessons into their curriculum, schools foster a culture where data protection becomes a shared responsibility. For a deeper understanding of how data protection can be effectively implemented in schools, you can watch Data Protection in Schools – Compliance is a Culture (https://youtu.be/klgfbIW_3Fw?si=pR_L1y4hgUzlLwSS), which outlines practical strategies for fostering a culture of compliance and trust.

At the institutional level, schools can conduct regular security audits to evaluate the effectiveness of their data protection measures. This ensures compliance with legal standards and builds trust among students, parents, and staff. Additionally, providing ongoing professional development for educators on data protection ensures they stay updated on the latest best practices and technologies.

Digital footprints and their long-term implications

A digital footprint is the record of an individual’s online activities, including social media posts, website visits, and email communications. These traces can be analysed, shared, or exploited, often leaving a permanent mark that may affect future opportunities. For students, understanding their digital footprints is crucial, as seemingly harmless online interactions can have long-lasting consequences.

Educators have a responsibility to teach students about the implications of their digital footprints. This includes helping them understand how their online behaviours reflect their personal and professional identities. For instance, a student’s social media post might impact their chances of gaining admission to a university or securing a job.

Practical lessons on managing digital footprints can include reviewing privacy settings on social media platforms, avoiding oversharing, and evaluating the credibility of the websites they engage with. Teachers can also introduce case studies of individuals whose digital footprints negatively or positively influenced their lives, illustrating the importance of online reputation management. 

By encouraging reflective practices, such as questioning the appropriateness of a post before sharing it, educators help students develop critical thinking skills for managing their digital identities. These lessons extend beyond the classroom, equipping students with the tools they need to navigate digital spaces responsibly.

Activities

Role-play activity: Identifying phishing attacks

Objective: Teach cybersecurity best practices through role-playing real-world phishing scenarios.
Duration: 30-40 minutes
Materials needed: Printed phishing email example, role descriptions, discussion prompts.

Implementation:

Present the scenario:

  • A student receives an email disguised as a school announcement, requesting login credentials for a “mandatory system update.”
  • The email appears legitimate, featuring the school logo and urgent language, but contains suspicious links.

Assign role-playing groups:

  • Student: Decides whether to trust the email or report it.
  • Scammer: Attempts to persuade the student to click the link and enter login details.
  • IT Expert: Explains how to detect phishing attempts and take preventative actions.

Role-playing exercise:

  • Groups act out the scenario, demonstrating common phishing tactics and responses.
  • Participants identify red flags and discuss alternative actions the student could take.

Conclusion & Discussion:

  • How can students identify phishing emails?
  • What security measures can schools implement to prevent cyberattacks?
  • What should students do if they accidentally click on a malicious link?

Expected outcomes:

  • Recognise phishing attempts by identifying warning signs in emails and messages.
  • Understand the importance of reporting cyber threats to school IT departments.
  • Develop cybersecurity awareness to prevent online scams and attacks.
  • Learn best practices for protecting personal and school-related digital information.

Group discussion – Cybersecurity threats: What would you do?

Objective: Help students recognise cybersecurity threats and discuss solutions.
Duration: 40 minutes
Materials needed: Case studies or real-life cybersecurity incidents in schools.

Implementation:

  • Divide students into small discussion groups.
  • Present each group with a different cybersecurity threat scenario, such as:
    • A student receives a suspicious email from an unknown sender asking them to reset their password.
    • A teacher’s computer is infected with ransomware, blocking access to lesson plans.
    • A school Wi-Fi network gets hacked, exposing student login credentials.
  • Each group discusses:
    • What are the potential risks?
    • What steps should be taken immediately?
    • How could the attack have been prevented?
  • Groups present their solutions, followed by a class discussion.

Expected outcomes:

  • Students recognise common cybersecurity risks in education.

Scenario-based activity: Cyberattack at school!

Objective: Train students to respond to cybersecurity incidents in real time.
Duration: 45 minutes
Materials needed: A fictional cyberattack scenario affecting a school.

Implementation:

  • Present the following scenario:
    • The school’s grading system has been hacked, and student data is exposed.
    • The IT department has detected malware on multiple school computers.
    • A student’s personal information was leaked due to weak password security.
  • Divide students into crisis response teams:
    • IT Security Team: Investigates the breach.
    • School Administration: Communicates with teachers, students, and parents.
    • Affected Students: React to their data being exposed.
  • Each team develops a response plan and presents their solution to the class.

Expected outcomes:

  • Students gain insight into real-world cybersecurity threats.
  • They develop quick problem-solving skills for managing digital crises.

Individual task: Securing personal data

Objective: Develop skills for protecting personal data by evaluating real-world scenarios and proposing solutions.
Duration: 30-40 minutes
Materials needed: Worksheet with data security scenarios

Scenario 1: The weak password dilemma

Situation:
Emma, a high school student, uses the same password “Emma123” for all her accounts, including her school portal, social media, and email. One day, she receives a notification that someone has attempted to log into her email from an unknown location. She starts worrying that her account has been compromised.

Task:

  • Identify why Emma’s password is weak and what risks she faces.
  • Propose a stronger password and explain why it’s more secure.
  • Suggest best practices for managing passwords safely.
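
As an optional extension to the tasks above, the Python sketch below shows the kind of checks a simple password audit might apply. It is a classroom illustration built on common-sense assumptions, not a complete security standard; in practice, length and not reusing passwords across accounts matter more than any single rule.

    # Minimal sketch: flag common password weaknesses (simplified rules).
    import string

    def password_issues(password: str) -> list[str]:
        """Return a list of weaknesses found in the password."""
        issues = []
        if len(password) < 12:
            issues.append("shorter than 12 characters")
        if not any(c.islower() for c in password):
            issues.append("no lowercase letters")
        if not any(c.isupper() for c in password):
            issues.append("no uppercase letters")
        if not any(c.isdigit() for c in password):
            issues.append("no digits")
        if not any(c in string.punctuation for c in password):
            issues.append("no symbols")
        return issues

    if __name__ == "__main__":
        for pw in ["Emma123", "correct-Horse-battery-9!"]:
            problems = password_issues(pw)
            verdict = "weak: " + ", ".join(problems) if problems else "passes the basic checks"
            print(f"{pw!r} -> {verdict}")

Students can test "Emma123" and their proposed replacements, then discuss why even a strong password becomes risky when reused across accounts.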

Scenario 2: Suspicious email from a “Teacher”

Situation:
David receives an email that looks like it’s from his teacher, asking him to click on a link to update his student account password. The email includes a school logo and urgent language:
“Your account will be locked in 24 hours if you don’t update your password immediately!”

David is unsure whether this is a legitimate request or a phishing scam.

Task:

  • Identify the warning signs that this might be a phishing attempt.
  • Explain what actions David should take to verify the email’s authenticity.
  • List steps to avoid falling for phishing scams in the future.

Scenario 3: Public Wi-Fi and personal information

Situation:
Sophia is at a café working on a school project. She connects to the free public Wi-Fi and logs into her email and school platform. She then notices an alert saying:
“Your connection may not be secure. Someone may be able to see your information.”

She’s unsure whether her personal data is at risk.

Task:

  • Explain the risks of using public Wi-Fi for logging into personal accounts.
  • Propose solutions that could help Sophia protect her data when using public networks.
  • Identify secure alternatives for accessing sensitive information on the go.

Expected outcomes:

  • Participants understand how to create strong passwords and manage them securely.
  • They learn to recognise phishing attempts and avoid online scams.
  • They develop strategies for protecting their data on public networks.

Examples

Securing student data in an online platform

A school adopts an online platform for assignments and grades, but a student’s account is compromised due to phishing, exposing personal data. Recognising the issue, a teacher collaborates with the IT team to address vulnerabilities. They launch a cybersecurity campaign to educate students on secure practices.

Objective

To teach students:

  • How to identify and avoid phishing.
  • The importance of strong passwords and multi-factor authentication.
  • Best practices for safeguarding personal information.

Implementation

Teachers host workshops on cybersecurity, students create educational materials, and all accounts adopt stronger security measures. Mock phishing tests evaluate awareness, and surveys assess learning outcomes.

Outcomes

Improved cybersecurity awareness, protected data, and collaborative accountability ensure a secure online environment.

Reflection questions:

Why is it important for students and teachers to take shared responsibility in maintaining cybersecurity in an online learning environment?

Group discussion - "Why does data protection matter?" – Understanding the risks

Objective: Help students and educators understand why protecting data is essential in educational settings.
Duration: 40 minutes
Materials Needed: Real-life examples of data breaches in schools or education sectors.

Implementation:

  • Divide students into small groups and provide each group with a data breach case study, such as:
    • A school’s grading system is hacked, leaking students’ personal records.
    • A teacher’s email account is compromised, exposing private student communication.
    • A student unknowingly shares sensitive school documents on a public platform.
  • Groups analyse the risks, consequences, and preventative measures for each case.
  • Each group presents their insights, followed by a class discussion on the importance of secure data practices.

Expected outcomes:

  • Students understand real-world risks of poor data protection.
  • They learn preventative measures to safeguard personal and institutional data.

Role-play activity: “The hacked school system”

Objective: Teach students how to respond to a data breach responsibly.
Duration: 45 minutes
Materials needed: A fictional scenario describing a data breach in a school.

Implementation:

  • Present the scenario:
    • A cyberattack exposes student records, including names, addresses, and grades.
    • The school’s IT team suspects weak passwords or a phishing attack.
    • The principal needs to communicate the breach to parents and students.
  • Assign roles:
    • IT Team: Investigates the breach.
    • School Administration: Manages communication.
    • Students & Parents: React to the news.
  • Groups develop an incident response plan, addressing:
    • What immediate actions should be taken?
    • How should the school inform students and parents?
    • What policies should be changed to prevent future attacks?

Expected outcomes:

  • Students understand how institutions respond to data breaches.
  • They learn the importance of strong security measures and communication.

Examples

Case study: The impact of digital footprints on future opportunities

Objective: Help participants understand how online activities shape their future opportunities and develop strategies for managing their digital identities responsibly.
Duration: 40 minutes

Materials needed:

  • Real or fictional examples of social media posts that have influenced individuals’ careers (both positive and negative).
  • Discussion prompts on digital reputation management.
  • Chart paper or digital collaboration tools for brainstorming solutions.

Scenario: A viral social media post with unexpected consequences

Situation:
Alex, a university student, has been actively posting on social media for years. Recently, a tweet he made two years ago—containing an offensive joke—resurfaced and went viral. A company where he applied for an internship revoked his offer, citing concerns about professionalism.

At the same time, another student, Emily, gained attention for her well-documented volunteer work and leadership in online projects. When she applied for a similar internship, the company saw her positive online presence and reached out with an offer. Both cases highlight how digital footprints can impact career opportunities, whether positively or negatively.

Implementation:

  • Discuss the consequences of oversharing online.
  • Analyse Alex’s situation:
    • How did his past online activity affect his professional future?
    • Could he have prevented this outcome?
  • Compare it with Emily’s case:
    • How did her digital footprint enhance her career prospects?
    • What types of content contribute to a positive online presence?
  • Evaluate personal digital footprints:
    • Participants search their own name online (if comfortable) to see what comes up.
    • Discuss what kind of content might be concerning for future employers.
    • Identify common mistakes that could harm a digital reputation.
  • Propose strategies to manage digital footprints:
    • Develop a checklist for responsible social media use.
    • Discuss privacy settings and how to curate content to reflect professional goals.
    • Create a list of positive digital engagement activities, such as:
      • Professional networking on LinkedIn.
      • Showcasing skills or community involvement.
      • Engaging in meaningful discussions instead of controversial topics.
  • Group presentations and whole-class discussion:
    • Each group presents their top three digital footprint management strategies.
    • Discuss how employers, universities, and scholarship committees use online searches to evaluate candidates.

Expected outcomes:

  • Participants recognise the long-term impact of their digital presence on career and academic opportunities.
  • They develop strategies to curate a professional and positive online identity.
  • They learn to differentiate between private and public online content, ensuring responsible social media use.

Group work: Digital footprint timeline

Objective: Help students visualise the long-term impact of their digital actions.
Duration: 30 minutes
Materials needed: Large sheets of paper 

Implementation:

  • In groups, students create a timeline of an individual’s digital life, starting from childhood to adulthood, including:
    • Early online activities (first social media account, gaming profiles, school projects).
    • Teen years (sharing memes, participating in online debates, posting photos).
    • Adulthood (job applications, professional social media use).
  • They analyse:
    • Which actions might help or harm their future opportunities?
    • What steps can they take to manage or improve their digital footprint?
  • Groups present their timelines and discuss strategies for maintaining a positive digital presence.

Expected outcomes:

  • Students recognise how digital footprints accumulate over time.
  • They learn strategies for protecting and managing their online identity.

Topic 3 – Ethical challenges in emerging technologies

You will learn about the ethical challenges posed by emerging technologies, focusing on understanding AI ethics, addressing algorithmic bias, and balancing surveillance with privacy. You will develop skills to critically evaluate technology’s social implications, design activities fostering ethical awareness, and guide students in navigating technological innovation responsibly. By mastering these concepts, you will promote fairness, accountability, and inclusivity in digital environments.

This topic follows a structured, practice-oriented approach. Through critical discussions, hands-on activities, and reflective exercises, you will develop strategies to evaluate technology critically and teach its responsible use, empowering students to navigate technological advancements ethically and inclusively.

AI ethics and algorithmic bias

Artificial Intelligence (AI) is becoming more common in education, helping automate grading, track student progress, and personalise learning. However, these systems rely on data, and if the data contains bias, AI can produce unfair results. For example, an AI grading system trained mostly on essays from native English speakers may lower the scores of multilingual students, even if their arguments are well-structured. Similarly, AI-powered career counselling tools might suggest STEM careers primarily to male students, reflecting existing societal biases instead of individual student potential.

In classrooms, AI should support fair decision-making rather than reinforce inequalities. Teachers should critically evaluate AI-based tools before integrating them into lessons, ensuring that these systems are transparent and used alongside human judgment. Students should also learn how to recognise and challenge algorithmic bias to become responsible digital citizens. The classroom application of this topic can include exercises where students compare AI-generated and human-made decisions to assess fairness.

For further reading on AI bias in education, see:

“AI in Schools: Pros and Cons” – University of Illinois: https://education.illinois.edu/about/news-events/news/article/2024/10/24/ai-in-schools–pros-and-cons#:~:text=Concerns%20with%20AI%20in%20education,maintaining%20AI%20technologies%20in%20schools.

AI Grading simulation – fair or biased?

Objective: To help students understand how AI grading works and how biases affect fairness.

Implementation:

  • The teacher introduces a real-world case where an AI grading system was found to be biased, explaining how AI evaluates essays differently than human teachers.
  • Students are provided with three sample essays graded by AI, along with a teacher assessment rubric.
  • Students compare AI scores with their own evaluations, noting differences in grading criteria and fairness.
  • A class discussion follows, where students analyse why AI might favour longer essays, how word choice or grammar complexity affects AI grading, and what criteria should be included to ensure fairer AI assessment.

Expected outcome: Students develop critical thinking skills about fairness in AI and understand the implications of algorithmic bias.
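
To make the "AI favours longer essays" point tangible, teachers with some programming background could demo a deliberately naive scorer such as the Python sketch below. This toy model is an assumption built purely for illustration (no real grading product works this simply): it rewards only surface features, word count and average word length, which is exactly the kind of proxy that lets bias creep in.

    # Toy "AI grader" scoring only surface features (illustrative, not a real product).
    def toy_ai_grade(essay: str) -> float:
        """Score out of 100 using only word count and average word length."""
        words = essay.split()
        length_points = min(len(words), 200) / 200 * 70           # rewards longer essays
        avg_word_len = sum(len(w) for w in words) / max(len(words), 1)
        vocab_points = min(avg_word_len / 7, 1.0) * 30            # rewards longer words
        return round(length_points + vocab_points, 1)

    if __name__ == "__main__":
        concise = "School surveillance must be limited because it erodes trust."
        padded = ("It is fundamentally and unquestionably imperative that educational "
                  "institutions comprehensively reconsider surveillance practices ") * 10
        print("Concise, well-argued essay:", toy_ai_grade(concise))
        print("Padded, repetitive essay:  ", toy_ai_grade(padded))

The padded, repetitive text scores far higher than the concise argument, which students can then contrast with how a human marker using the rubric would rank the same two texts.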

Group discussion: Who’s responsible? Ethical decision-making in AI

Objective: Help students analyse who is accountable when AI systems produce biased or unfair outcomes.
Duration: 40 minutes
Materials needed: Case studies of AI-related ethical dilemmas (e.g., biased grading, facial recognition in schools).

Implementation:

  • Divide students into small discussion groups.
  • Provide real-world AI bias cases, such as:
    • An AI-powered grading tool consistently scores essays from non-native English speakers lower than those from native speakers.
    • A facial recognition system at a school incorrectly flags students from minority backgrounds as “unauthorised” visitors.
    • A career guidance AI recommends STEM fields primarily to male students and humanities to female students.
  • Groups discuss:
    • Who is responsible for these biases?
    • How could AI developers, teachers, or policymakers prevent such issues?
    • What actions should be taken to ensure fairness?
  • Groups present their findings, followed by a whole-class discussion on accountability in AI ethics.

Expected outcomes:

  • Students understand how algorithmic bias affects real-world decisions.
  • They develop critical thinking skills on ethical AI implementation.

Balancing surveillance and privacy

Many schools are introducing student monitoring technologies, such as facial recognition for attendance tracking, AI-powered exam proctoring tools, and classroom surveillance cameras. While these technologies aim to increase security, they also raise serious privacy concerns. Some students feel uncomfortable with constant surveillance, while others worry about how their personal data is stored and used.

Schools must find a balance between security and privacy, ensuring that students understand why data is being collected and how it is protected. Implementing ethical data collection policies, informing students and parents about privacy implications, and involving them in discussions about school surveillance policies are essential steps toward fostering transparency. A detailed discussion on the impact of EdTech surveillance in classrooms and on campuses can be explored in the video “EdTech and Surveillance in Classrooms and on Campus” at the following link: https://youtu.be/mAy2kMTsOXM?si=_cpZXUQbF4-1UjYd.

By integrating activities such as debates, case studies, and school-wide surveys, students can voice their opinions, critically evaluate monitoring systems, and contribute to policy decisions that affect their educational environment.
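
Before turning to the case studies below, a quick numeric demonstration can ground the fairness debate. The Python sketch uses entirely invented figures (an assumption for illustration only) to show how a monitoring system with a seemingly low overall error rate can still misidentify one group of students far more often than another.

    # Invented figures: a low overall error rate can hide subgroup disparities.
    groups = {
        "Group A": (900, 9),   # (students, misidentifications): 1% error rate
        "Group B": (100, 8),   # (students, misidentifications): 8% error rate
    }

    total_students = sum(n for n, _ in groups.values())
    total_errors = sum(e for _, e in groups.values())
    print(f"Overall error rate: {total_errors / total_students:.1%}")

    for name, (students, errors) in groups.items():
        print(f"{name}: {errors / students:.1%} of students misidentified")

The overall rate looks acceptable at 1.7%, yet students in the smaller group face an 8% error rate; audits of real facial recognition systems have repeatedly reported this pattern, including the accuracy issues for darker skin tones mentioned in the case study below.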

Case study: Security vs. Privacy – Where should we draw the line?

Objective: Help students evaluate the benefits and risks of surveillance technologies in schools by analysing real-world scenarios.
Duration: 40 minutes
Materials needed: Pre-written case studies on AI-powered exam monitoring and facial recognition attendance tracking.

Implementation:

  • Divide students into small discussion groups.
  • Assign each group a surveillance case study:
    • Group 1: AI-powered online exam monitoring.
    • Group 2: Facial recognition attendance tracking.

Case study for Group 1: AI-powered online exam monitoring

  • At Greenwood High School, an AI-based remote exam proctoring system began flagging students for “suspicious behaviour”:
    • Lena was marked for frequent eye movement away from the screen.
    • Javier, a bilingual student, was flagged for murmuring questions aloud, misinterpreted as cheating.
    • Mia, a student with ADHD, was flagged for excessive movement, delaying her results.
  • The flagged students underwent manual review, causing stress and delays.

Discussion questions for Group 1:

  • What privacy concerns arise from AI-powered exam monitoring?
  • How might AI bias impact different student groups?
  • Should human oversight be required alongside AI monitoring?
  • How can schools balance security and fairness in remote assessments?

Case study for Group 2: Facial recognition attendance tracking

  • Westwood Academy installed facial recognition cameras to automate attendance.
  • Issues arose:
    • Identical twins were frequently misidentified, marking them absent.
    • Students with darker skin tones reported accuracy issues due to bias.
    • Some students felt uncomfortable being constantly watched, raising privacy concerns.
  • The school administration defended the system, citing efficiency and campus security.

Discussion questions for Group 2:

  • What privacy rights do students have regarding facial recognition?
  • How might algorithmic bias impact attendance tracking?
  • Should students and parents have a say in using this technology?
  • How can schools ensure fair and accurate implementation?

Group presentations & Class discussion:

  • Groups present their findings, highlighting:
    • Ethical concerns with surveillance technologies.
    • Potential security benefits of AI-powered monitoring.
    • Recommendations for balancing privacy and security in schools.
  • The class engages in a whole-group discussion, debating:
    • Which surveillance tools are fair or unfair?
    • How can schools develop ethical policies?
    • What rights should students have regarding privacy?

Expected outcomes:

  • Students analyse the ethical challenges of surveillance technologies.
  • They explore ways to reduce bias and improve fairness in monitoring systems.
  • They understand how to balance security, privacy, and transparency in school settings.

Responsible innovation

AI and emerging technologies offer exciting possibilities, but they must be designed with fairness, inclusivity, and ethical considerations in mind. If AI tools are developed without considering diverse perspectives, they may unintentionally cause harm—such as voice assistants struggling to recognise accents or AI hiring tools favouring certain applicants over others. These issues highlight the need for ethical innovation that prioritises accountability and social responsibility. A comprehensive discussion on AI’s ethical implications and challenges in schools can be found in the University of Illinois article “AI in Schools: Pros and Cons” at the following link: https://education.illinois.edu/about/news-events/news/article/2024/10/24/ai-in-schools–pros-and-cons#:~:text=Concerns%20with%20AI%20in%20education,maintaining%20AI%20technologies%20in%20schools.

Students should be encouraged to think critically about who benefits from AI systems and who might be left out. Classroom discussions should focus on how AI tools can be designed to be fairer and what role accountability plays in ethical innovation. Instead of just learning about AI ethics in theory, students should be actively involved in designing their own AI solutions that address real-world challenges while ensuring fairness and inclusivity.

Group discussion: Balancing innovation and ethics in emerging technologies

Objective: Help students evaluate the ethical implications of emerging technologies and develop responsible guidelines.
Duration: 45-60 minutes
Materials needed: Case study on facial recognition in schools, discussion prompts, chart paper, or digital collaboration tools.

Scenario: Facial recognition in schools

A school district plans to install facial recognition cameras to improve security and automate attendance. However, concerns arise:

  • Privacy risks: Students and parents feel uncomfortable with constant monitoring.
  • Bias and fairness: Research shows potential inaccuracies for certain racial and ethnic groups.
  • Transparency and security: Parents question how student data is stored and who can access it.

Implementation:

  • Discuss potential benefits and risks of facial recognition in schools.
  • Provide real-world examples and ask students to think critically about ethical concerns.
  • Divide students into groups, each representing a stakeholder (students, teachers, parents, school administrators, tech developers).
  • Groups discuss:
    • Privacy violations – Is student consent required?
    • Security vs. surveillance – Does this make schools safer?
    • Bias risks – Could some groups be unfairly affected?
    • Accountability – Who is responsible for data protection?
  • Groups propose fairness, transparency, privacy, and accountability rules for school surveillance policies.
  • Each group presents recommendations, followed by a class discussion on alternative solutions.

Expected outcomes:

  • Students evaluate the trade-offs between security, privacy, and fairness in school surveillance.
  • They practise representing different stakeholder perspectives and proposing ethical policy rules.

Group discussion: Who benefits? Assessing the societal impact of emerging technologies

Objective: Encourage students to critically assess the societal impact of emerging technologies.
Duration: 40 minutes
Materials needed: Case studies or short articles on new technologies (e.g., AI, VR, gamification).

Technology is advancing rapidly, but not everyone benefits equally. Some innovations solve major challenges, while others create unintended harm or leave certain groups behind. Consider a company developing AI-driven hiring systems. They claim their system eliminates bias, but later, investigations reveal that their training data favoured applicants from privileged backgrounds, leading to discrimination against women and minorities.

Implementation:

Divide students into small groups and assign each group a different technology with ethical concerns, such as:

  • AI-driven hiring systems (impact on diversity and bias).
  • Virtual reality in education (benefits vs. psychological effects).
  • Social media algorithms (engagement vs. misinformation).
  • Self-driving cars (safety vs. job displacement).

Each group discusses:

  • Who benefits from this technology?
  • Who might be harmed or excluded?
  • How can ethical concerns be addressed in its design?

Groups present their analysis, followed by a whole-class discussion on responsible innovation.

Expected outcomes:

  • Students develop critical thinking skills about technology ethics.
  • They learn to assess who gains and who loses in technological advancements.

Examining bias in AI-powered grading tools

At Green Valley High School, teachers adopted an AI-powered grading system to streamline assessments. However, students noticed inconsistencies, with certain groups receiving lower grades despite similar performance. A teacher turned this issue into a learning opportunity, engaging students to analyse how algorithmic bias could impact fairness. Together, they explored AI ethics and developed guidelines for equitable use of such technologies.

Objective

To empower students to:

  • Understand the ethical implications of AI tools in education.
  • Critically evaluate algorithmic bias and its societal impact.
  • Collaborate on strategies to ensure fairness and accountability in AI applications.

Implementation

Students discussed real-world AI bias cases, analysed their school’s grading tool, and proposed improvements. Through collaborative activities and reflections, they developed practical guidelines for ethical AI use, promoting fairness and accountability in digital education.

Reflection questions

  1. If you were tasked with improving AI-powered grading, what guidelines or safeguards would you propose?
  2. What lessons from this case study can be applied to other AI-powered tools, such as hiring algorithms or facial recognition systems?
  3. What ethical responsibilities do developers have when designing AI for educational use?
  4. What measures can be taken to make AI-powered grading more transparent and accountable?

Module Summary

The Digi’Aware Training Module equips educators with essential skills and knowledge to foster ethical digital behaviour among students, preparing them for responsible engagement in digital spaces. The module addresses critical topics such as digital privacy, cybersecurity, misinformation, AI ethics, and the societal impact of emerging technologies. Participants gain the ability to navigate ethical dilemmas, apply global frameworks like GDPR, and teach students to critically analyse and responsibly use digital tools.

Educators will develop both foundational knowledge and practical skills to teach digital ethics effectively. These include understanding core ethical principles, managing cybersecurity risks, addressing algorithmic bias, and promoting digital equity. By achieving these outcomes, educators are empowered to guide students in developing critical thinking and responsible decision-making in digital environments.

Key learning outcomes

Knowledge: Explore digital ethics principles, global standards like GDPR, and the implications of emerging technologies on education.

Skills: Design interactive activities, foster ethical awareness, and implement policies promoting safe and responsible technology use.

Through structured activities, reflective discussions, and real-world case studies, this module provides educators with the tools to shape a culture of digital responsibility in classrooms and beyond.

Module 8: Quiz

Test Your Understanding: Ethical Digital Practices

Glossary of terms

Digital ethics: The study and practice of ensuring ethical behaviour, principles, and practices in digital environments, addressing issues such as privacy, consent, fairness, and accountability.

Privacy policy: A statement or legal document that outlines how an organisation collects, uses, manages, and protects the personal data of its users.

Cybersecurity: The practice of protecting systems, networks, and data from cyber threats and unauthorised access to ensure safety and integrity.

Algorithmic bias: Unfair outcomes resulting from biases embedded in the algorithms or data sets used in Artificial Intelligence (AI) systems.

Digital footprint: The record of an individual’s online activities, including browsing history, social media interactions, and any data shared online.

General Data Protection Regulation (GDPR): A legal framework established by the European Union for data protection and privacy, providing guidelines for organisations handling personal data.

Digital citizenship: The ability to use digital technology and the internet responsibly, safely, and ethically.

Responsible innovation: The process of designing and implementing technologies that are inclusive, sustainable, and ethical, minimising harm while maximising societal benefits.

Bibliography and references

Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor EACEA can be held responsible for them. Project number: 2023-1-NO01-KA220-ADU-000151380 
