The real threat of Big Tech
In Washington, DC, the policy arguments about Big Tech are not nearly big enough. Politically electric issues like censorship and monopoly tend to dominate the policy conversation. But the Information Age has fundamentally altered how human beings relate, how children develop, how markets function, and how our shared social and political life unfolds. As the Industrial Revolution did before it, the Information Revolution requires a broad and updated policy vision to govern it. In developing such a vision, policymakers must recall an essential truth that challenges their usual faith in markets to solve problems: market forces alone do not maximize human flourishing, and not every good belongs in a market.
In their book “Civil Economy,” economics professors Luigino Bruni and Stefano Zamagni turn to the Italian philosophical tradition to describe an old but neglected conception of the marketplace: “Actual markets,” they remind us, “are never ethically neutral; they are either civil or uncivil.” By “civil” Bruni and Zamagni do not mean “polite” but civilizing. Markets either operate according to their “ancient, original vocation as an ally of the common good, representing a space for liberty, sociality and the expression of our capabilities…as persons,” and thus are civil, or they do not, instead “compulsively seeking wealth rather than public happiness, and…forgetting and destroying such fundamental economic goods as relational goods, common goods, and gratuitous goods.” Many of the changes the Information Age has wrought have been, in this sense, uncivilized.
Democratic government must play a role in correcting this. Technology leaders would prefer we believe that any harms resulting from their products and practices are the unavoidable cost of progress—that the neutral market has spoken, and the role of government is to get out of its way. This obscures the reality that allowing technological change to unfold in ways that damage individual well-being, community economic thriving, the integrity of our shared social fabric, and our political liberties is a political choice. We can and should make better choices, bearing several key principles in mind.
The first is that a free and flourishing nation must defend its children. Children deserve a protective sphere in which to safely grow and develop, but a growing body of evidence makes clear that the mass migration of childhood social life into the virtual environment of social media has caused them profound harm. This is unacceptable. The virtual world that Big Tech induces children to inhabit is not built for their benefit, but to capture their attention and profit from it, at the expense of their well-being. The market has not corrected for this harm. Quite the opposite: it is now effectively beyond the ability of parents to voluntarily remove their children from the reach of social media, while for children themselves, the product is engineered to be addictive and to render them socially dependent on its use. Policy must intervene. We do not usually leave questions of child safety for the market to define and determine, or sacrifice children for the sake of “innovation” and “growth,” nor should we in this case.
Public policy must also protect the right of persons to enjoy a private life to which neither the state nor the market has access. This is a matter of personal privacy, certainly, but also of autonomy. The massive amount of personal data available to corporations (and in many cases to governments) presents frightening opportunities to manipulate both belief and behavior. This use of personal data, especially when it has not been consensually and affirmatively disclosed for such a purpose, is unethical and dangerous. Yet the provision of such data, while in some cases technically “voluntary,” has functionally become the necessary price of entry to both social and professional life in America, rendering truly free choice difficult. And meaningful consent is difficult to give in an environment in which personal data, once provided, effectively leaves the control of the individual whom it concerns. Functioning democracies should not permit a market in the privacy of their citizens, nor in the personal autonomy for which privacy is a necessary precondition.
Citizens must be in control of technology, not the other way around. To paraphrase the New Testament, human beings were not made for technology, but technology for human beings. Algorithms can enhance decision-making power in valuable ways, but their use must be subject to human judgment and democratic reflection. Failure to properly oversee the use of algorithms in the pursuit of market efficiency results in unacceptable but avoidable injustices and indignities, as when healthcare algorithms meant to improve administrative efficiency distribute care in a racially discriminatory manner, or hiring algorithms discard qualified but non-traditional job candidates before their applications ever reach human eyes.
Finally, the right of nations to democratically make and enforce their laws, protect their citizens, and uphold their values should not be for sale in the marketplace. And yet the global technology market ignores the integrity of national prerogatives. American tech companies must not be permitted to sidestep American law and values simply because they believe their business models demand it. Foreign tech companies wishing to conduct business in the United States must be subject to American law written in the interests of its citizens.
There are signs of hope. In recent months, in response to the growing discontent and concern of the American public, U.S. policymakers have increasingly wrestled with larger questions of what governing the Information Age well might entail. They must continue expanding that understanding, until it is big enough to govern Big Tech properly.
Chris Griswold is the Policy Director at American Compass. @Chris_Griz
More on this topic can be found at americancompass.org/technology.