FTC report on predatory social media data hoarding hints at future regulations
A new FTC report on how social media and streaming sites collect and monetize their hoards of user data doesn’t hold many surprises for anyone who has followed the industry. It’s more useful to read it as part of a paper trail the agency is laying down to justify new regulations in the space.
The report has its roots way back in late 2020, when the FTC ordered nine of the tech companies with the biggest data collection operations to disclose numerous aspects of how their surveillance capitalism business models worked. (The companies: Amazon, Facebook, YouTube, Twitter, Snap, ByteDance, Discord, Reddit, and WhatsApp.)
What data do you collect, on whom, and how long is it kept? If asked to delete it, do you? What do you use it for, whom do you sell it to, and what do they use it for? The questions are comprehensive, leaving little room for prevarication or for obscuring the picture by withholding key details.
The companies’ responses were, predictably, evasive, as Samuel Levine, director of the FTC’s Bureau of Consumer Protection, notes in the preface:
Echoing the way that firms conceal and hide their collection practices, many of the Companies provided the Commission with limited, incomplete, or unhelpful responses that appeared to have been carefully crafted to be self-serving and avoid revealing key pieces of information.
The resulting report details all manner of shenanigans, some born of malice and some of incompetence. Few of the practices disclosed will surprise anyone at this point, but the executive summary starting on page 9 is a solid refresher on the skulduggery we have come to expect from companies like these.
Of course, nearly four years have passed since then, and many of the companies have changed their practices or been fined or otherwise chastised. But despite Lina Khan’s elevation to FTC chair after this inquiry began, there has been no major revision or expansion of rules that lay down bright lines like “thou shalt not sell data on a user’s health challenges to advertisers.”
One area where you might hope for an exception, compliance with the Children’s Online Privacy Protection Act (COPPA), also seems to be an afterthought. As the FTC writes:
…In an apparent attempt to avoid liability under the COPPA Rule, most [social media and video streaming services] asserted that there are no child users on their platforms because children cannot create accounts. Yet we know that children are using SMVSSs. The SMVSSs should not ignore this reality…Almost all of the Companies allowed teens on their SMVSSs and placed no restrictions on their accounts, and collected personal information from teens just like they do from adults.
Meta allegedly ignored obvious violations for years; Amazon settled for $25 million after “flouting” the law; TikTok owner ByteDance is the target of a similar lawsuit filed just last month.
So what’s the point of the report, if all this is known?
Well, the FTC has to do its due diligence when considering rules that could restrict a group of multi-billion-dollar global tech companies. If the FTC in 2020 had simply declared, “These companies are out of control, we propose a new rule!” the affected industries could quite justifiably have challenged it by arguing there was no evidence of the practices the rule would prohibit. Something similar happened with net neutrality: the broadband companies challenged it on the grounds (among other things) that the harms were overstated, and won.
Though Chair Khan’s statement accompanying the report suggests it will help inform state and federal lawmakers’ efforts (which is likely true), it will almost certainly also serve as a factual foundation on which to build a new rulemaking. That the companies admit to doing these things, and have been caught red-handed doing others in the meantime, would strengthen any argument for new regulations.
Khan also fends off dissent from within, from Commissioners who (despite voting unanimously to issue the report) accuse it of attempting to regulate speech or dictate business models. She dispatches these arguments with the confidence of someone already drafting a proposal.
That proposal (should it exist) would likely aim to clip the wings of companies that have come to embody entire industries within themselves. As Khan puts it:
…It is the relative dominance of several of these platforms that gives their decisions and data practices an outsized impact on Americans. When a single firm controls a market and is unchecked by competition, its policies can effectively function as private regulation. A consolidated market is also more susceptible to coordination with–or cooptation by–the government. Unchecked private surveillance by these platforms creates heightened risk of improper surveillance by the state. How these markets are structured can result in greater risks to—or greater protections of—people’s core liberties.
In other words, let’s not leave it to them, and the FTC likely doesn’t intend to.