Bias in Computing: Popcorn Hacks & Extra Credit

Popcorn Hack #1

An online job recruitment algorithm favored male candidates over equally qualified female candidates for tech jobs. This is an example of Pre-existing Social Bias, since the system learned from historical hiring data that was already gender-biased in tech recruitment. One way to reduce this bias is to audit the training data for patterns of discrimination and alter the algorithm to rank qualifications independently of gender.

Popcorn Hack #2

A money management AI system prefers male applicants because of discriminatory training data.

Balance training data for equal representation of both male and female applicants with similar qualifications and financial histories.

Remove gender-specific attributes (e.g., name or inferred gender) from data so decisions are only based on relevant financial data.
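The two fixes above can be sketched in a few lines of Python. This is a minimal illustration with made-up applicant records: "balancing" is done by downsampling the larger gender group, and the sensitive attribute is then dropped so a model never sees it. The field names and data are assumptions for the example, not a real dataset.

```python
import random

# Hypothetical applicant records; "gender" is the sensitive attribute,
# the rest are the financial features a lending model should actually use.
applicants = [
    {"gender": "F", "income": 72000, "credit_score": 710, "debt_ratio": 0.21},
    {"gender": "F", "income": 65000, "credit_score": 698, "debt_ratio": 0.30},
    {"gender": "M", "income": 70000, "credit_score": 705, "debt_ratio": 0.25},
    {"gender": "M", "income": 68000, "credit_score": 690, "debt_ratio": 0.28},
    {"gender": "M", "income": 75000, "credit_score": 720, "debt_ratio": 0.19},
    {"gender": "M", "income": 62000, "credit_score": 680, "debt_ratio": 0.33},
]

def balance_by_gender(records, seed=0):
    """Downsample the majority group so each gender appears equally often."""
    groups = {}
    for r in records:
        groups.setdefault(r["gender"], []).append(r)
    n = min(len(g) for g in groups.values())
    rng = random.Random(seed)
    balanced = []
    for g in groups.values():
        balanced.extend(rng.sample(g, n))
    return balanced

def strip_gender(records):
    """Remove the sensitive attribute so decisions rest on financial data only."""
    return [{k: v for k, v in r.items() if k != "gender"} for r in records]

training_data = strip_gender(balance_by_gender(applicants))
print(len(training_data))             # 4 records: 2 per gender
print("gender" in training_data[0])   # False: attribute removed
```

In practice, fairness libraries offer more sophisticated reweighting than this downsampling sketch, but the idea is the same: equalize representation, then hide the protected attribute.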

Homework Hack

I watch a lot of YouTube. The algorithm mostly recommends content aligned with my current interests and beliefs, and rarely surfaces differing views. This is an example of Emergent Social Bias because the algorithm learns from users' behavior and gradually builds echo chambers over time.

To counteract this bias, YouTube could introduce a “diverse perspectives” mode that periodically recommends content outside of a viewer’s typical watch history to enhance exposure and reduce filter bubbles.

Extra Credit

AP CSP MCQ Performance Chart

1. Pre-existing Social Bias

Example: Amazon's AI recruiting tool biased against hiring women

Why it’s a bias: This is a sign of Pre-existing Social Bias, since the system was trained on past hiring data that already favored men, especially for tech roles. The algorithm “learned” that bias from human decisions.

Impact:

Unfair employment practices that shortchange women and create imbalance.

Qualified women are likely to be passed over for jobs they deserved.

Solution: Modify training data to remove gender markers and reweight hiring outcomes to reflect equal opportunity, or use blind hiring algorithms that consider only skills.
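The "blind hiring" idea can be sketched directly: identifying fields are redacted before any scoring, so the ranking can only depend on skills and experience. The field names and the toy scoring rule here are illustrative assumptions, not a real screening system.

```python
# Fields that could reveal a candidate's gender or identity.
SENSITIVE_FIELDS = {"name", "gender", "photo_url"}

def redact(candidate):
    """Drop identifying fields before the candidate is ever scored."""
    return {k: v for k, v in candidate.items() if k not in SENSITIVE_FIELDS}

def score(candidate):
    """Toy skills-only score: years of experience plus matched skills."""
    blind = redact(candidate)
    return blind["years_experience"] + 2 * len(blind["matched_skills"])

candidates = [
    {"name": "A. Smith", "gender": "F", "years_experience": 6,
     "matched_skills": ["python", "sql", "ml"]},
    {"name": "B. Jones", "gender": "M", "years_experience": 4,
     "matched_skills": ["python", "sql"]},
]

# Ranking uses only the redacted view, so gender cannot influence the order.
ranked = sorted(candidates, key=score, reverse=True)
print([c["name"] for c in ranked])  # scores: 12 vs 8
```

The key design choice is that `score` calls `redact` internally: the scoring function is structurally unable to see the sensitive fields, rather than merely being trusted to ignore them.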

2. Technical Bias

Example: Google Translate applies gender roles (e.g., “he is a doctor”, “she is a nurse”)

Why it’s a bias: This is Technical Bias because the system learns patterns from the structure of its data; here, skewed language use in the training corpus.

Impact:

Reinforces harmful gender stereotypes

Shapes global perception through language

Solution: Train the model on more balanced language sources and update the system to offer gender-neutral options in translation.

3. Emergent Social Bias

Example: Tay chatbot picks up hate speech from users

Why it’s a bias: This is Emergent Social Bias: the system wasn’t originally biased, but learned in real time from the behavior of its users and began spreading offensive content.

Impact:

Spreads hate speech, disinformation, and harm

Erodes public trust in AI systems

Solution: Add content filters and moderation rules to limit what the bot can learn or repeat, and define ethical boundaries on learning.
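A minimal sketch of such a learning filter: the bot only adds a user message to its training memory if it passes a blocklist check. The blocklist terms and messages are placeholders for illustration, not Tay's real moderation rules.

```python
# Placeholder blocklist; a real system would use a much richer classifier.
BLOCKLIST = {"hate", "slur"}

def is_safe(message):
    """Reject any message containing a blocklisted term."""
    words = set(message.lower().split())
    return words.isdisjoint(BLOCKLIST)

training_memory = []

def learn_from(message):
    """Only safe messages are ever added to the bot's learning data."""
    if is_safe(message):
        training_memory.append(message)
        return True
    return False

learn_from("hello there bot")      # accepted
learn_from("spread hate please")   # rejected by the filter
print(training_memory)             # ['hello there bot']
```

The point of the design is that filtering happens *before* learning: unsafe input never enters the training data at all, instead of being cleaned up after the bot has already absorbed it.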