Spin to Win
Big Brother is watching.
Our data is being recorded without our consent and used in ways we cannot control, predict, or sometimes even fathom.
Data and new technologies, however benign they initially seem, are ripe for error and abuse. Kinks take time to find, and laws and oversight are always several steps behind.
Facial recognition technology is one of those creepy frontiers. It's been around for decades but is only now seeping into public consciousness, thanks to the pervasive use of cameras, including our own phones. It may seem comforting that governments and law enforcement agencies can use this tech to find wanted or missing persons, but these programs and their algorithms have been shown to misidentify people, especially women and people with darker skin. This is, in essence, algorithmic bias, an issue found across AI systems in many sectors: job recruiting, mortgage lending, health-care risk assessment, even courtroom sentencing. Marginalized groups are often the most affected.
It's difficult to prevent data mining, but it's worth being aware of how it can be misused, intentionally or not, and which laws, if any, regulate the technology's use. Several cities have already banned facial recognition tech. Again, it may seem harmless, but in the wrong hands it's dangerous. Case in point: the Chinese government's use of facial and gait recognition to track (racially profile) and detain the Uyghur Muslim population, not to mention profiling the rest of the population for its "social credit" system.
--
Inspired by New York Magazine's article "There Will Be No Turning Back on Facial Recognition: It's not perfect yet, but it's already changing the world"