Your digital footprint

Objective: 

Learners should understand the concept of the digital footprint, become aware of its risks, and learn specific strategies for reducing it and protecting their online reputation.

Content and methods: 

The worksheet introduces the topic of digital footprints (active/passive) and cookies and highlights the associated social and individual risks. The methods used include a video, a practical self-assessment of one's own digital footprint, analysis of a specialist text on a selected risk, and concluding tips on digital security and reducing one's footprint.

Skills:

  • Acquiring and applying knowledge about active and passive digital footprints
  • Critical reflection on one's own online presence and the potential risks
  • Analysis of the social and individual effects of data misuse
  • Implementation of practical strategies to improve digital security and protect privacy

Target group and level:

Grade 10 and above

Subjects

Economics, Politics, Ethics

Your digital footprint

Introduction

Every click and every “like” leaves traces on the internet. In this worksheet, you will learn what a digital footprint is and what risks it can pose—both for you personally and for society as a whole.

🎬 Watch the video to learn what a digital footprint is.

📝 Task: What does your digital footprint look like? Search online for information about yourself using the following list. Then enter all the information you find in the space provided (e.g., name, email address, place of residence, close social circle, etc.).

1. Google your full name

Search for your full name in quotation marks (“first name last name”) and look at the first two pages of results.

2. Check images and nicknames

Go to the image search for your name and also search for your most commonly used nickname or gamer tag.

3. Social media check (external)

Enter your name in the search bar of the largest social networks where you do not have an account (e.g., Instagram, Facebook).

4. The stranger's perspective

Ask yourself the following question for all results found: What would a stranger learn about me from this?

👉🏽 Space for your information

📝 Much of the information found probably belongs to your active digital footprint. Think about what information from your passive footprint is likely to be stored online about you. Write down your thoughts.
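To make the passive footprint more concrete, here is a small Python sketch of the kind of record a web server can create from a single page view, with no action on the visitor's part. All values (the IP address, user agent, page, and referrer) are invented for illustration.

```python
# What a single page view can leave behind, without the user doing anything.
# The values below are made up; real servers log entries like this automatically.
from datetime import datetime, timezone

visit = {
    "ip_address": "203.0.113.42",               # reveals rough location and provider
    "timestamp": datetime(2024, 5, 3, 21, 17, tzinfo=timezone.utc),
    "user_agent": "Mozilla/5.0 (iPhone; ...)",  # device type and browser
    "page": "/sneakers/sale",                   # what was viewed, and when
    "referrer": "https://social.example/profile/jamie01",  # where the visitor came from
}

# A server typically stores this as one line in a log file.
log_line = " | ".join(f"{k}={v}" for k, v in visit.items())
print(log_line)
```

Combined over many visits and many sites, lines like this are exactly the material the passive footprint is made of.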

📌 Today, almost everyone leaves a digital footprint—completely automatically. But why should we even care about this?

Read the text to find out what dangers can arise when digital data is misused.

The Algorithmic Gatekeeper: Digital Data and the Erosion of Equal Opportunity

In our increasingly digitized economy, every online action—from a simple search query to a complex financial transaction—contributes to a vast personal data trail. This accumulation of information forms an individual's "digital footprint," a detailed and dynamic profile of their behaviors, preferences, and associations. Grappling with the implications of this footprint is of paramount economic and social importance. While many dangers, such as privacy violations or consumer manipulation, are widely discussed, one of the most insidious threats is the potential for future discrimination, which systematically undermines the principle of equal opportunity. This form of discrimination does not rely on protected characteristics like race or gender directly, but on data-driven proxies that achieve the same exclusionary outcomes.

The individual and societal risks are profound. On an individual level, the data collected is used to construct predictive profiles that algorithmically score a person's perceived value, reliability, or risk. These scores can become decisive factors in life-altering moments: securing a loan, applying for a job, renting an apartment, or even being accepted into an educational institution. An applicant might be denied a mortgage not because of their financial history, but because their online consumption patterns correlate with a higher default risk in a predictive model. A candidate could be filtered out of a job application process because their social media activity suggests a personality type deemed a poor "cultural fit" by an algorithm. This creates a "data-driven ceiling," an invisible barrier to advancement based on statistical correlations rather than individual merit or potential. The process is often opaque, leaving the individual with no clear reason for the rejection and no recourse for appeal.
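The proxy-based scoring logic described above can be sketched as a deliberately tiny model. Everything here (the feature names, the weights, and the threshold) is invented for illustration; real scoring systems are far more complex and proprietary. The point is structural: the decision never consults the applicant's actual record, only correlated signals.

```python
# Toy illustration of proxy-based scoring.
# All features, weights, and the threshold are invented for this example.
def risk_score(applicant: dict) -> float:
    """Combine behavioral proxies into a single 'risk' number."""
    weights = {
        "late_night_shopping": 0.4,          # a consumption-pattern proxy
        "frequent_address_changes": 0.3,
        "social_circle_default_rate": 0.3,   # other people's statistics
    }
    return sum(weights[k] * applicant[k] for k in weights)

def loan_decision(applicant: dict, threshold: float = 0.5) -> str:
    # Note: actual payment history is never consulted -- only proxies.
    return "denied" if risk_score(applicant) > threshold else "approved"

applicant = {
    "late_night_shopping": 1.0,         # shops late at night
    "frequent_address_changes": 0.0,
    "social_circle_default_rate": 0.8,  # friends' defaults, not their own
}
print(risk_score(applicant))   # -> 0.64
print(loan_decision(applicant))  # -> denied
```

The applicant is rejected purely because their behavior *correlates* with risk in the model, which is the "data-driven ceiling" the text describes.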

At the societal level, this practice threatens to create a new form of digital caste system, reinforcing and amplifying existing inequalities. Algorithms are trained on historical data, which inevitably reflects past and present societal biases. If a certain demographic has historically been underserved by financial institutions, a credit-scoring algorithm trained on this data will learn to associate that demographic with higher risk. This creates a self-perpetuating cycle of disadvantage, a phenomenon known as algorithmic bias. The result is the erosion of social mobility and the entrenchment of a stratified society where opportunity is allocated based on predictive scores rather than actual achievement. This undermines the very foundation of a meritocratic economy and can lead to widespread disenfranchisement and a loss of trust in core economic institutions.
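The self-perpetuating cycle can be shown with a minimal simulation. The "model" below simply learns each group's historical approval rate and applies it to new applicants; the data and the 0.5 threshold are invented, and real systems are vastly more elaborate, but the feedback mechanism is the same.

```python
# Minimal simulation of algorithmic bias. All data is invented.
# Historical decisions were biased: group B was approved less often,
# regardless of individual merit.
historical = [
    ("A", 1), ("A", 1), ("A", 0), ("A", 1),
    ("B", 0), ("B", 1), ("B", 0), ("B", 0),
]

def train(data):
    """'Learn' each group's historical approval rate."""
    rates = {}
    for group in sorted({g for g, _ in data}):
        outcomes = [ok for g, ok in data if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

def predict(rates, group, threshold=0.5):
    # New applicants are judged by their group's past rate alone.
    return "approve" if rates[group] >= threshold else "reject"

rates = train(historical)
print(rates)                # {'A': 0.75, 'B': 0.25}
print(predict(rates, "A"))  # -> approve
print(predict(rates, "B"))  # -> reject: past bias becomes future policy
```

Every rejection of a group-B applicant adds another `("B", 0)` row to tomorrow's training data, which is exactly the self-reinforcing loop the paragraph describes.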

This danger is not a distant dystopia; its precursors and parallels are well-documented. The historical practice of "redlining" in the United States, where banks systematically denied services to residents of certain, often minority-inhabited, neighborhoods, serves as a powerful analogue. Today, this is being replicated in digital form. China's Social Credit System is a state-driven example where data from various sources is aggregated to score citizens, affecting their ability to travel, get loans, or access public services. In Western economies, automated systems are already used in hiring to screen résumés and in the justice system to assess recidivism risk, often with demonstrably biased outcomes against marginalized groups.

The advent of Artificial Intelligence (AI) acts as a powerful accelerant for this risk. Before AI, the sheer volume of "Big Data" was a limiting factor. Now, machine learning algorithms can analyze immense datasets in real-time, identifying subtle and often non-intuitive correlations that a human analyst would miss. The complexity of these models, particularly deep learning networks, often results in a "black box" problem, where even the developers cannot fully explain why the system made a particular decision. This transforms the risk from one of simple data collection to one of automated, scalable, and inscrutable judgment. The danger is no longer just that our data is being collected, but that it is being fed into autonomous systems that are increasingly becoming the gatekeepers of economic and social opportunity, operating with a logic that is both invisible and potentially unjust.

Sources:

  • O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Publishers.
  • Council on Foreign Relations. (2022). China’s Social Credit System.
  • Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine Bias. ProPublica.
  • Brookings Institution. (2019). Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms.

💭 Vocabulary work

📌 Here is a list of tips to help you reduce your digital footprint and improve your online security:

✅ Digital Footprint Checklist

I. 📲 App and Device Settings (Passive Footprint)

  • Check Location Services (Disable location for all apps that do not require GPS; choose the "While using the app" option where location is needed.)
  • Review Camera & Microphone Access (Revoke access for apps that do not need it, such as weather apps.)
  • Manage Contacts and Calendar Access (Allow access only for essential apps such as messengers.)
  • Adjust Tracking Identifiers (On iOS, turn off "Allow Apps to Request to Track"; on Android, delete or reset your advertising ID and disable ad personalization.)
  • Remove Metadata (Turn off location tagging in your camera app so photos are not geotagged.)

II. 🌐 Browser and Surfing Behavior (Passive Footprint)

  • Block Third-party Cookies (Enable this option in your browser's privacy settings.)
  • Use Tracking Protection (Install a content-blocking extension such as uBlock Origin.)
  • Use Incognito/Private Mode (Helpful for quick searches or on shared devices, but note that it does not hide your activity from websites or your network.)
  • Delete Search History (Regularly clear your search and browsing history.)

III. 💬 Social Media (Active Footprint & Reputation)

  • Set Profile to Private (Ensure only confirmed followers can see your content.)
  • Control Tags (Require your manual approval before tags added by others appear on your profile.)
  • Deactivate Activity Status (Turn off the visibility of when you were last online.)
  • Review Posts Before Sending (Ask yourself whether a post would be suitable for a future employer or your family.)
  • Turn Off Advertising Personalization (Disable personalized ads in each network's settings.)

IV. 🔒 Digital Security

  • Delete Old Accounts (Search for inactive accounts and delete or deactivate them.)
  • Use Secure Passwords (Choose a complex, unique password for each service; a password manager makes this easier.)
  • Enable Two-factor Authentication (2FA) (Activate it for important services such as email and social media.)
  • Check Your Online Reputation (Google yourself regularly and review the public search results.)
  • Request Content Deletion (Report harmful content posted by others and request its removal.)