Journalists and people who think aloud for a living are often invited to gatherings where experts in various fields share what they know. These meetings often operate under the Chatham House Rule, under which you can write of the ideas presented but not directly quote speakers. At such a gathering this week I was especially struck by the talks on Big Tech, and since Congress is considering various regulatory bills I want to say what I gleaned.
First and most obviously, nobody understands the million current aspects of social media sites. They raise questions ranging from the political (misinformation, disinformation, deliberate polarization, ideological bias) and the technological (hidden data harvesting) to the legal (antitrust law, First Amendment rights) and the moral and ethical (deliberately addicting users, the routine acquisition and selling of private information, pornography). It’s all so big and complex. Mark Zuckerberg, who invented the social-media world we live in, appears to have thrown in the towel and fled to the metaverse, where things will no doubt become even more complex and bizarre. But what he calls a visionary next step looks very much like an escape attempt.
The breakthrough event in public understanding of social-media problems was the congressional testimony, last fall, of Facebook whistleblower Frances Haugen. She said Instagram, owned by Facebook parent Meta, was fully aware it was damaging the mental health of children and teenagers. She had proof, internal documents showing Instagram knew of studies showing increased suicidal thoughts and eating disorders among young girls who used the site. Big Tech had failed what Google, at the turn of this century, famously took as its motto: “Don’t be evil.” That wouldn’t seem the most demanding mission, yet they all failed.
One thing that was strange and unreal about her celebrated testimony was that it was a revelation of what everybody already knew. Professionals in the field knew; think-tank observers knew. Big Tech knew its platforms had addictive properties, because those properties were put there deliberately. It was part of the business model. Attentive parents knew as they watched their kids scroll. Ms. Haugen spoke of what she called “little feedback loops” in which “likes and comments and reshares” trigger “hits of dopamine to your friends so they will create more content.” But now at least everyone else knows.
The difficulty at the heart of all Big Tech debate is how hard it is to get the facts, and how the facts keep changing. Transparency and disclosure are urgently required—how much information is being gathered about you each day, to whom is it sold, and for what purpose? The social-media sites don’t want to tell you, or tell each other. The nature of the beast is opaque and fluid. How do you audit an algorithm? It’s a moving river changing all the time. And the algorithms are proprietary. But constructive regulation must be based on clear information.
I asked a speaker if I was thinking correctly when I imagine algorithms: I see them as a series of waves, not necessarily in sequence, different in size, pushing my small skiff in this direction or that. No, she said, the algorithm isn’t the wave, it’s the water. It’s the thing on which you sail. To go to a site is to choose to cast off.
Another speaker: When we speak of the internet we speak of “privacy rights.” Companies are taking information they glean from your use of tech and without your permission selling it for purposes that aren’t fully clear. This violates your privacy, but there’s another way to look at it. Many of the devices you carry with you are pinging out exactly where you are. They know you got out of a car at 23rd and M. But your current location should belong to you. It is a private property issue when someone takes it from you. Because you belong to you. Making it an issue of property rights makes things clearer.
No one among the experts or participants had faith in Congress’s ability to understand adequately or to move in a knowing and constructive way to curb Big Tech. The previous hearings have shown how out of their depth its members are. The heads of Big Tech had been hauled in a few years ago and were supposed to break out in a sweat under heavy grilling, but they were pressed on petty irrelevancies and sucked up to, along the lines of: You started your business in a garage—only in America! Does Facebook charge for membership? No, Senator, we’re totally free! Why doesn’t my page load? The hearings were a signal moment—the stakes were high and the inventors of Big Tech walked out more arrogant than ever. Because now they knew their opposition, their supposed regulators—the people’s representatives!—were uninformed, almost determinedly so, and shallow. Big Tech had hired every lobbying shop in Washington and made generous contributions to organizations and candidates.
We’ll see what happens on Capitol Hill. It would probably be best for America’s worried parents to assume the cavalry isn’t coming and take matters into their own hands.
A participant suggested an at least partial solution that doesn’t require technological sophistication and could be done with quick and huge public support.
Why can’t we put a strict age limit on using social-media sites: You have to be 18 to join TikTok, YouTube or Instagram? Why not? You’re not allowed to drink at 14 or drive at 12; you can’t vote at 15. Isn’t there a public interest here?
Applying such a control would empower parents who face “all the other kids are allowed” with an answer: “Because it’s against the law.”
When we know children are being harmed by something, why can’t the state help? In theory this might challenge economic libertarians who agree with what Milton Friedman said 50 years ago: that it is the duty of companies to maximize shareholder value. Instagram makes massive profits from ads and influencers aimed at teenagers. But a counter and rising school of conservative thought would answer: Too bad. Our greater responsibility is to see to it that an entire generation of young people not be made shallow and mentally ill through addictive social-media use.
The nature and experience of childhood have been changed by social media in some very bad ways. Why can’t we, as a nation, change this? We all have a share in this.
A participant here told a story of a friend, the mother of a large Virginia family who raised her kids closely and with limited use of social media. The mother took her children to shop for food. The woman at the checkout counter, who had been observing the family, asked the mother, “Do you homeschool your kids?” The mother wasn’t sure of the spirit of the question but said, “Yes, I do. Why do you ask?” The checkout woman said, “Because they have children’s eyes.” And not the thousand-yard stare of the young always scrolling on their phones.
There were many different views expressed at the meetings, but on this all seemed to agree, and things became animated.