TikTok and the onus on social media companies

The responsibility to ensure user data privacy does not lie solely on one platform

TikTok chief executive Shou Zi Chew testifies before the House Energy and Commerce Committee on Capitol Hill in Washington, on March 23. Reuters

Last week, TikTok chief executive Shou Zi Chew testified before the US Congress to address concerns about potential Chinese influence over the social media platform, as well as its impact on the mental health of children. During the hearing, which lasted more than four hours, Mr Chew said: "TikTok has never shared, or received a request to share, US user data with the Chinese government." The Chinese government has also denied the claims made by several members of Congress.

About 150 million Americans use TikTok, a short-form video hosting service, but its parent company, ByteDance, is based in Beijing – a fact that has consistently raised national security questions in the West, particularly in the US, amid a brewing cold war with China. Earlier this month, the UK banned TikTok from government devices. In 2020, India banned the platform, along with dozens of other Chinese apps, over privacy and security concerns.

However, as legislators in several western countries escalate efforts to restrict access to TikTok and other Chinese apps – which some experts view as part of a broader push to contain China's economic rise – a number of American social media influencers have called the stance taken by their own lawmakers hypocritical. Cassidy Jacobson, who has been on TikTok for six years and whose account has 1.5 million followers, said: "Even big US companies are taking our information and we don't really know what they're doing with it but they are sharing it with other big companies."

American companies such as Meta and Google, which owns YouTube, are not beyond rebuke and have also faced criticism over the harmful content their platforms host. To be clear, TikTok is far from the first social media platform to come under scrutiny over concerns about privacy and misinformation – and rightfully so: studies repeatedly point to the toll that unmonitored use of social media takes on young and impressionable people.

A 2018 Pew Research Centre survey of nearly 750 teenagers aged 13 to 17 found that 45 per cent were online almost constantly and 97 per cent used a social media platform such as YouTube, Facebook, Instagram or Snapchat. The study linked such use to conditions including higher levels of anxiety and depression.

Manipulative algorithms can make dubious content far too easily available to schoolchildren and other minors – content they should under no circumstances be consuming, be it trends promoting a certain type of body image linked to eating disorders, or even self-harm and suicide. These may be extreme cases, but the ill effects of unmonitored social media content fall not only on the young or the especially vulnerable. There is precedent for such manipulation.

In April 2018, Facebook chief executive Mark Zuckerberg admitted to the US Congress that personal data belonging to 87 million of the platform's users had been collected, without their consent, by Cambridge Analytica, a British consulting firm hired to provide analytical assistance to former president Donald Trump's 2016 election campaign.

Mr Zuckerberg was forced to lift the lid on the extent to which algorithms and social media can influence people's decisions and manipulate their thinking. The long-term costs of algorithms engineered towards particular outcomes are damaging to all, not just to any one nationality or group of people – such is the egalitarian nature of the internet. By the same token, the responsibility for the lack of privacy surrounding user data has to be borne by more than any one social media company. It is for all social media chief executives and Big Tech stakeholders to resolve, so that users of all such apps can be assured of privacy and safety.

Published: March 27, 2023, 3:00 AM