6/16 – Coded Bias Documentary

From this week’s readings on biopolitics, data colonialism, and biometric surveillance and tracking, I immediately thought of the documentary that came out on Netflix last year, “Coded Bias.” The documentary exposes how human interaction with technology (data) has taught technology the same biases that are found in our society. It focuses on facial recognition technology, or biometric identity tracking, and the biases that are built into the use and accuracy of the algorithms behind this type of tracking and surveillance. I have posted the trailer below, but if you have Netflix I highly suggest you watch the full piece.

At one point in the documentary, they discuss the William Gibson quote, “the future is already here—it’s just not very evenly distributed.” Whereas some people use this quote to imply that people in power, people with money, are living in a future the rest of us are trying to catch up to, the documentary argues that it is minority groups and the oppressed who are living in this future (the Orwellian Big Brother future). They (we) are the ones being hyper-tracked, the ones on whom power is being wielded through data science. People in power, people with money, are being elevated by these advances in technology, while other groups are being punished through their use.

It is also a catch-22 that we are placed in (and that we are placing ourselves in): in order to fully participate in society, you have to allow some amount of data to be collected about you. Yet working hard to keep data from being collected about you also means that the data “teaching” the technology comes from only a select group of people.


9 thoughts on “6/16 – Coded Bias Documentary”

  1. Stephanie Thomson

    This is so interesting, thanks for sharing! I’ve long followed the discussion about how the data we feed into algorithms is biased and therefore the results that get spat out are as well. Although it’s been an issue for a long time, that still doesn’t mean progress is being made fast enough (e.g. just last month Google showcased a new dermatology app that doesn’t work very well on people with dark skin; my mind actually boggles at how this type of thing is still happening – we have, quite rightly, had so much bad press about this type of issue, to the point that it’s now something that’s been incorporated into our individual bi-annual reviews, and yet we’re still falling short). I read this great piece from the Economist just today that showed even some of the easiest things to fix when it comes to algorithmic biases (such as gender stereotypes in search image results) have yet to be tackled: https://www.economist.com/graphic-detail/2021/06/05/demographic-skews-in-training-data-create-algorithmic-errors

    But I’ve never thought about this other angle you bring up: the idea of being hyper-tracked, of the technology actually working even more effectively on those not in a position of power but in incredibly nefarious ways. I don’t think this angle is ever really discussed when it comes to the debate about the need for regulation and ethical AI boards, etc., so thank you so much for bringing this up – I’ll definitely be watching this documentary!

  2. Ryan Yaffee

    The trailer made the idea you share far too real, so thank you for sharing it. Throughout history, there has constantly been a hierarchy based on class and a person’s appearance. Today, as we advance in technology, the people making and designing these technologies are higher up in that hierarchy and would like to keep their place. Therefore, when they make advancements, they create them in ways that continue these biases. I am not shocked, however, because the people who make these technologies are biased, and therefore they create bias. What scares me the most is that policing systems know these machines are biased and continue to use them to police and discriminate. I feel it is not only the maker who is at fault; the people who use the machine to discriminate are just as at fault.

  3. Carolle Pinkerton

    Wow, now I’m quite interested in seeing this film. I think this raises the question: what will be the data revolution in the next generation that fights for a more equitable system of data collection? I think that in the future, in order for a new technology to be implemented, there will be a process whereby the biases are tested by a panel that did not create the technology, in order to catch possible issues. It would then get a stamp of approval or disapproval saying that it passed or failed the bias test.

  4. Nick Schiff

    A good first principle, I think, is to not trust anything the company tells you about what it’s doing, especially if it is engaged in public relations.

    For example, “In May 2019 Sundar Pichai, chief executive of Google, wrote in The [New York] Times of his corporation’s commitment to the principle that ‘privacy cannot be a luxury good.’ Five months later Google contractors were found offering $5 gift cards to homeless people of color in an Atlanta park in return for a facial scan.” (https://www.nytimes.com/2020/01/24/opinion/sunday/surveillance-capitalism.html)

    Thanks to these filmmakers, journalists, and researchers, we have some real information about these companies’ actual practices.

  5. Amanda Filchock (she/her)

    This documentary has been on my watch list for a while now and I’m bumping it to the top! The quote, “the future is already here—it’s just not very evenly distributed”, and your multiple readings of it are very interesting. I watched “The Social Dilemma” on Netflix and it was wild to hear that former and current tech execs won’t let their children have access to social media and severely restrict their own usage. One former exec created a new Chrome extension to help undo the damaging data collection work he was hired to do in the first place. They know the terrifying impacts at stake here. This makes me think that maybe the quote could additionally be spun to say that it’s these people, who have worked or continue to work in tech and have cut themselves off from social media, who are living in the future: a safe space without these dangerous data collection practices. We’re all living in the past because we’re all still active in this danger zone and not as aware of the effects.

  6. Yohanna M M Roa

    Thanks for this post; I had not seen the documentary. It is interesting that you frame the use of biometric technology in political-economic terms. It is quite frightening how our physical characteristics are used and manipulated to produce money, and how, once again, social outcomes depend on economic resources. Personally, I try to resist, but it is getting more and more complicated because the entire social system is heading in that direction.

  7. Catherine Winograd (she/her)

    I think there is no use trying to resist these technologies, but it’s worth fighting to ensure that they are as unbiased and fair to all as possible. It was appalling to see the MIT student forced to wear a white mask in order for the computers to properly recognize her; that is shameful. I think it reflects the problems engendered by having a small, non-diverse group create programs for the general population without properly testing them first against a heterogeneous sample.

  8. Kelly Hammond

    I’d recommend, too, Ruha Benjamin’s brilliant “Race After Technology.” In it, she gets at (among other things) the gross injustices manifested by surveillance technology that, as you note, Melinda, has propelled people of color into an Orwellian future. She traces the problems around the globe, and she gives careful attention to how the digital world not only allows for encoded bias, but for the rapid replication of that bias. For example, code libraries that are shared liberally by coders (such as facial recognition code) take their bias with them. Some of you may have heard of Sophia, the AI robot who was the first to be cast in a movie. Plans to mass-replicate her mean replicating her as a model of (white) feminine beauty as well.
  9. Camila Santander

    Hi Melinda, thank you so much for sharing this documentary! I will definitely watch it in the near future. It is incredibly sad how something that should be doing public good can be so damaging to communities because of biases that should not exist.

Comments are closed.