We now know some of the unintended consequences of planetary-scale AI systems like Facebook and YouTube, and how the engineers and executives who conceived those systems failed to appreciate the ways the products they built could be misused, exploited, and gamed. Most of these systems, I believe, were not intentionally designed to create harm. Instead, I think their founders and engineers were idealists who believed that having good intentions mattered more than producing good outcomes.
We all inhabit this new regime of digital data, but we don't all experience it in the same way. What made my family's experience endurable was the access to information, discretionary time, and self-determination that professional middle-class people often take for granted.
In his famous novel 1984, George Orwell got one thing wrong. Big Brother is not watching you; he's watching us. Most people are targeted for digital scrutiny as members of social groups, not as individuals. People of color, migrants, unpopular religious groups, sexual minorities, the poor, and other oppressed and exploited populations bear a much higher burden of monitoring and tracking than advantaged groups.