
Coded Bias Reflection

During her first semester, the main character was astonished to find that the machine learning algorithms she used could not recognize Black faces, while white faces were recognized without issue. Exploring further, she recalls Amazon's failed AI experiment for resume filtration, which resulted in the rejection of women applying for big tech jobs. In a separate case, when a CEO of Citibank and his wife applied for the same loan amount, with all of their assets shared and their incomes equal, the husband was offered ten times the amount his wife received.

All of these examples show that algorithms already carry pre-existing biases, precisely because they are not truly intelligent.

“When you think of A.I., it’s forward-looking,” she says. “But A.I. is based on data, and data is a reflection of our history.”

Machine learning relies on training datasets that allow a program to recognize patterns. When the program is fed heavily skewed data, the algorithm ends up reproducing existing real-world conditions rather than making a fair judgment on the matter being decided. In Amazon's case, the training data reflected the fact that very few women work at big tech companies, so the AI absorbed that pattern and, once applied, rejected women applicants.
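To make the mechanism concrete, here is a minimal sketch in Python. It is not Amazon's actual system; the features, data, and model are all invented for illustration, but it shows how a classifier trained on skewed historical hiring data learns to score applicants down for their gender alone:

```python
# A toy sketch (not Amazon's real system) of how a classifier trained on
# skewed historical data reproduces that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
# Hypothetical features: gender (1 = male, 0 = female) and a skill score.
gender = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)
# Skewed "historical" labels: past hires favored men regardless of skill.
hired = ((skill > 0) & (gender == 1)).astype(int)

model = LogisticRegression().fit(np.column_stack([gender, skill]), hired)

# Two equally skilled applicants who differ only in gender:
applicants = np.array([[1, 1.0], [0, 1.0]])
print(model.predict_proba(applicants)[:, 1])  # male scores far higher
```

Nothing in this code mentions bias explicitly; the model simply learns whatever pattern is present in the data, which is exactly the point the film makes.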

The notion that these algorithms are black boxes is true and worrisome. I feel that legislation and oversight of such practices move far more slowly than the development and deployment of AI algorithms. Police are deploying facial recognition software without any legal oversight, infringing on people's rights without regard for the fact that the AI they use might not be accurate. Altogether, this points to a dystopian future in which everyone's movements are under surveillance while big tech exploits these algorithms to make more money off of people.

Moreover, the movie resonates with my day-to-day experience. Upon entering AUC, the gates track my entry point with extensive checks, and cameras everywhere track my movement across campus. They appear to deploy facial recognition software, since many of my friends have received smoking fines without a security guard present. In Egypt more broadly, the use of CCTV is still limited, and the application of AI, where present, is ambiguous and unchecked.

Upon reading Engine Failure: Safiya Umoja Noble and Sarah T. Roberts on the Problems of Platform Capitalism, I truly agree with the notion that search engines have failed to deliver quality results. They prioritize results that drive ad revenue and reward well-funded search engine optimization rather than checking whether the results they provide are biased.

“It also doesn’t often pick up voices in the margins, where people are writing in small fields that are not represented by powerful journals or publishing houses. The metrics used don’t capture all of the ways that knowledge is being created and disseminated.”

People tend to assume that results at the top of the list are the most relevant; in reality, ranking is largely driven by access to capital, which funds better SEO (search engine optimization). Worse still, there is no filter for false information, which is increasingly alarming. For example, prior to the 2016 election, there was no way to filter out the fake news perpetuated by Donald Trump and his supporters. Facebook has since introduced a fact-checking feature that allows people to verify circulated content, which is a step in the right direction.
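As a toy illustration of that ranking dynamic (purely hypothetical; no real search engine publishes its formula, and the fields and weights here are invented), consider how heavily weighting a capital-driven signal lets money dominate the ordering:

```python
# A toy sketch, not any real engine's algorithm: paid optimization
# can outrank genuine relevance in a results list.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float   # how well the page answers the query (0-1)
    seo_budget: float  # hypothetical spend on optimization (0-1)

def rank(results, seo_weight=0.7):
    # Blend true relevance with a capital-driven SEO signal; with a
    # high seo_weight, budget dominates the final ordering.
    key = lambda r: (1 - seo_weight) * r.relevance + seo_weight * r.seo_budget
    return sorted(results, key=key, reverse=True)

results = [
    Result("Well-researched article", relevance=0.9, seo_budget=0.1),
    Result("Ad-heavy content farm", relevance=0.3, seo_budget=0.9),
]
for r in rank(results):
    print(r.title)  # the content farm lands on top
```

With money weighted heavily, the ad-heavy page outranks the well-researched one even though it answers the query far worse.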

Whenever they click some fake news story posted by Macedonian teenagers, Facebook makes money. It doesn’t matter what the content is—so long as the content circulates. Virality and clicks generate revenue for the platform.

Finally, the films The Social Dilemma and Coded Bias, along with the aforementioned article, have helped me understand the problematic nature of big tech and CCTV surveillance. I have started using ad blockers and similar tools to limit data tracking, and I have deleted all the data Facebook gathered about me.

Image: Monitor Binary System – Free image on Pixabay, Pixabay License (free for commercial use, no attribution required)

