Children in the U.S. are protected from exploitation not as humans in development but as "consumers" of software that was designed to hook them and keep them hooked, not to teach them or protect them.
A little cynical? The executives of TikTok and Snap who appeared this week in Washington were brought before a "Senate consumer-protection panel" to face a few questions from senators – questions they were free to deflect and deny. A consumer-protection response. Not a child-protection crisis group.
Senator Amy Klobuchar asked whether TikTok's algorithm "can push young users into content glorifying eating disorders, drugs, violence."
The answer? According to Michael Beckerman, vice president and head of TikTok public policy for the Americas, the recent Wall Street Journal article [link in comments below] cited by Klobuchar did not represent an "authentic experience" that a user would have.
The declarations of good intentions have a hollow ring. Younger users shape tomorrow's consumer habits, as every marketer knows. Their attention is gold dust.
The power of AI, like processing power on a chip, keeps rising exponentially. More powerful technology and more cutting-edge R&D are now directed at addicting us than at any other goal. Yes, the executives protest that they are working on improvements.
Like the Player Queen in Hamlet, they protest too much.
Facebook executives know their products are harming children. Facebook now faces a separate probe – from the Federal Trade Commission [link below] – into its internal research showing that Instagram fueled eating disorders in young people. Another Facebook whistle-blower, Sophie Zhang, has said that she found a culture of neglect toward the company's abuses and problems – that Facebook always "waited for things to get dire before taking any action."
Facebook’s founder Mark Zuckerberg has been hauled before Congress on several occasions to answer questions about privacy, the company’s plans to offer financial services, and many details of potential abuse of customers, children, and society. He will now be called to do so again. When will we address these effects as destructive – as evil, when they are such – and not simply as matters of dishonest commerce? The tools are laying waste to the mental equilibrium of a generation of children.
Finally, as the well-known observation has it, the consumers are themselves the product, exposed to all the exploitation and erosion of personality that the Internet can create.
In the E.U., in Ireland, we can find job listings for Alphabet, ByteDance and Facebook. These parent companies are showing how responsible they are. We find roles like 'Emergency Response Triage Specialist' at TikTok, whose purpose is to "ensure user safety."
What does the job involve?
"Be aware of and willing to operate in a work environment with sensitive content that includes child exploitation, graphic violence, self-injury and suicide, and other content which may be considered offensive or disturbing."
They protest too much. At what moment will we replace calls for better regulation with a forceful rejection of what isn't working? It is clear that they will never provide the answer themselves.
Alphabet, Google's parent, recorded its highest profits and fastest growth in a decade last quarter. The 38-year-old founder of ByteDance, parent of TikTok, is now China's richest man. These parents are doing a lousy job.
And it's time to take responsibility away from them.