Good luck getting that information.
If you're in high school right now and considering which career path to take, I strongly suggest learning how to write and manipulate algorithms. Our world is already deeply influenced by algos, from our social media to our grocery purchases. Their importance in our lives will only grow.
(From Raw Story)
In criminal justice systems, credit markets, employment arenas, higher education admissions processes and even social media networks, data-driven algorithms now drive decision-making in ways that touch our economic, social and civic lives. These software systems rank, classify, associate or filter information, using human-crafted or data-induced rules that allow for consistent treatment across large populations.
But while there may be efficiency gains from these techniques, they can also harbor biases against disadvantaged groups or reinforce structural discrimination. In terms of criminal justice, for example, is it fair to make judgments on an individual’s parole based on statistical tendencies measured across a wide group of people? Could discrimination arise from applying a statistical model developed for one state’s population to another, demographically different population?
You can bet your sweet assets that it could.
This has been a concern of mine for a long time. In another life, years and years ago, I worked for a company that employed fairly crude credit algorithms for important decisions. It (almost) didn’t matter if a customer had been a lifelong customer in good standing. It didn’t matter if the customer always paid on time. What mattered was their credit score, plus a few other minor factors, and I saw firsthand how that collective algorithmic score didn’t always reflect sensible business decisions, at least in particular cases. The decisions I had problems with did, however, often make sense within the algorithm, so deviating from the system’s suggestions undermined the overall system. So we generally went with the system. Statistically it made sense. Sure, there were outliers who didn’t get a fair shake, but we were managing a “population,” not individuals.
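To make the point concrete, here is a minimal sketch of the kind of rule I'm describing. This is my own illustration, not the actual system I worked with: a bureau-style score dominates the decision, and the individual's history counts only as "minor factors" that nudge the score slightly.

```python
# Hypothetical illustration of a crude, score-dominated credit rule.
# The names, thresholds, and bonus values are all invented for this sketch.

def approve(credit_score: int, on_time_payment_rate: float,
            years_as_customer: int, threshold: int = 650) -> bool:
    """Population-level rule: the credit score dominates.

    Individual signals (payment rate, tenure) can only nudge the
    score a little; they can never outweigh it.
    """
    adjusted = credit_score
    if on_time_payment_rate > 0.95:
        adjusted += 10   # small bonus for reliable payers
    if years_as_customer > 10:
        adjusted += 10   # small bonus for long-standing customers
    return adjusted >= threshold

# A 20-year customer who always pays on time, but with a low bureau score,
# is declined (600 + 10 + 10 = 620, below the 650 threshold):
print(approve(credit_score=600, on_time_payment_rate=1.0, years_as_customer=20))
# A brand-new customer with a high score sails through:
print(approve(credit_score=700, on_time_payment_rate=0.0, years_as_customer=0))
```

The first customer is exactly the outlier who doesn't get a fair shake: every individual signal says "good customer," but the population-level score says no, and the rule is built so the score wins.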
Now apply that kind of thinking to healthcare, for instance. Yikes. (It is already being done on a grand scale.) Have a cold or the flu? You are likely to get good treatment. Have something unusual that hasn’t been accounted for in the algorithm, something that falls outside of any statistical relevance, and you might be in some trouble.
We need to be more algorithmically literate, and we need to have a long-overdue discussion about where algos are taking us.