Like you, we at DFC have no time for Luddites: technology is here to stay, and it’s important for us to grapple with what kind of effect it will have on us, rather than sticking our heads in the sand and hoping it goes away.
This is easy to do when examples of said technology are obvious and ridiculous — I have been dining out on Juicero jokes for months now. But there are plenty of other insidious interventions that are either too scary to look at directly, or too well-hidden by the bad actors behind them.
I include in this category the fallout from Big Data — the general term for the use of predictive analytics and other rigid algorithms to crunch masses of information. The results often have an impact on human lives in ways those automated processes can't take into account. For example, a faceless algorithm's understanding of who it thinks you are online can funnel ads and news to you that exclude alternate viewpoints: editing (or virtually censoring!) the world before you can make a decision about it. I've put mathematician Cathy O'Neil's book Weapons of Math Destruction on my reading list, and have been researching in preparation for diving in. What I've found worries me:
“Like the dark financial arts employed in the run-up to the 2008 financial crisis, the Big Data algorithms that sort us into piles of ‘worthy’ and ‘unworthy’ are mostly opaque and unregulated, not to mention generated (and used) by large multinational firms with huge lobbying power to keep it that way. ‘The discriminatory and even predatory way in which algorithms are being used in everything from our school system to the criminal justice system is really a silent financial crisis,’ says O’Neil. […]
Indeed, O’Neil writes that WMDs punish the poor especially, since ‘they are engineered to evaluate large numbers of people. They specialize in bulk. They are cheap. That’s part of their appeal.’ Whereas the poor engage more with faceless educators and employers, ‘the wealthy, by contrast, often benefit from personal input. A white-shoe law firm or an exclusive prep school will lean far more on recommendations and face-to-face interviews than a fast-food chain or a cash-strapped urban school district. The privileged… are processed more by people, the masses by machines.’”
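To make the filter-bubble mechanism concrete, here's a toy simulation — entirely hypothetical, not code from any real ad or news system — of a naive recommender that always serves whatever category a user has clicked most. Even a slight early lean gets amplified until alternate viewpoints disappear from the feed:

```python
# Hypothetical sketch: a recommender that maximizes short-term engagement
# by always showing the user's most-clicked category. Small initial
# preferences quickly crowd out everything else.
from collections import Counter

def recommend(history, categories):
    """Return the category clicked most so far (ties broken by list order)."""
    counts = Counter(history)
    return max(categories, key=lambda c: counts[c])

def simulate(initial_clicks, categories, rounds):
    """Run the feedback loop: the user clicks whatever they are shown."""
    history = list(initial_clicks)
    for _ in range(rounds):
        history.append(recommend(history, categories))
    return Counter(history)

# Start with a mild 2-to-1 lean toward one viewpoint, then run 20 rounds.
feed = simulate(["left", "left", "right"], ["left", "right"], rounds=20)
print(feed)  # the "right" category is never shown again after round one
```

The point of the sketch is the feedback loop, not the (deliberately crude) ranking rule: the algorithm isn't malicious, it just optimizes clicks, and the exclusion of other viewpoints falls out as a side effect.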
The supposed impartiality that Big Data dangles in front of us flawed humans is definitely attractive. It’s attractive because it’s aspirational; we are flawed. But we can’t forget that it’s precisely our nature to use or interpret Big Data in ways that are biased or prejudiced. Our reliance on technology doesn’t absolve us of moral responsibility, whether to the people we know directly or to society at large. We’re all in this together… You can’t say that about algorithms!