
‘The exact number of minutes needed for children to become TikTok addicts’ & the hidden dangers posed by ‘filter bubbles’

IT’S an attention economy and social media companies will do anything they can to keep your eyes on the screen.

Now, US President Donald Trump has revealed “a group of very wealthy people” are lined up to buy Chinese-owned video-sharing app TikTok in the US.


The app has been ordered to find a US owner or face being banned in the States.

This is due to concerns that Americans’ data could be passed on to the Chinese government.

But where is the concern about what these social media sites and their addictive algorithms are doing to our children?

We all know the horror stories of worrying content about eating disorders, violence or misogyny being pushed into our kids’ feeds.

So, the political party People Before Profit has tabled a Dail bill that would force social media companies to stop using these algorithms on children.

Here, Olga Cronin, senior policy officer at the Irish Council for Civil Liberties, tells Irish Sun readers why she believes it is important for all politicians across the Dail to back this bill.

JUST 35 minutes – that’s all it allegedly takes for a child to become hooked on TikTok.

Not only that — should a child appear interested in an unhealthy type of content, TikTok’s algorithm feeds the child similar unhealthy content in a “filter bubble”.

TikTok users are placed in filter bubbles after just 30 minutes of use in one sitting.

These disturbing details were revealed in confidential company documents disclosed and referenced in a US court filing last year, which should be required reading for all Irish politicians.


Why? Because an opportunity has arisen for Irish politicians to support a new bill which could help end the predatory nature of manipulative, harmful and profit-driven social-media algorithms that push content concerning self-loathing, self-harm and suicide into children’s feeds.

ALGORITHMIC PROFILING

The recent Online Safety Monitor from the Children’s Rights Alliance called for regulations requiring algorithmic profiling to be disabled by default for children and young users, and People Before Profit has now put forward a bill providing for exactly that.

The bill also states that any algorithms based on profiling or sensitive personal data should have to be actively turned on by adult users.

The aforementioned US court filing is crucial to understanding why we must demand that these algorithms are turned off by default.

TikTok’s business model, according to the lawsuit, is based on keeping users engaged for as long as possible, in order to collect valuable data and expose them to more advertising — which in turn generates more revenue.

The court document, from a case taken by the state of Kentucky against the platform, explains: “Defendants [TikTok] know that all it takes to hook an average user is viewing 260 videos.

“While this may seem substantial, TikTok videos can be as short as eight seconds, and are played for viewers in rapid-fire succession, automatically. Thus, in under 35 minutes, an average user is likely to become addicted to the platform.”

THREE STAGES

The filing notes how, in an internal presentation devoted to increasing user retention rates, TikTok identified three “moments” in the formation of a TikTok habit.

First there is the Set-up Moment, when a young user watches their first video. Next is the Aha Moment, when TikTok’s algorithm has begun to discern what content a user will respond to; this comes after a child has watched 20 videos or more on their first day on the platform.

And finally, there is what they call the Habit Moment, in which “new users start to form a habit of coming to TikTok regularly”. This occurs if a youth has watched 260 videos or more during the first week of having a TikTok account.

TikTok is aware of negative impacts on its users.

The court filing notes how internal reports observed “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth and empathy, and increased anxiety” — and that “compulsive usage interferes with essential personal responsibilities like sufficient sleep, work/school and connecting with loved ones.”

And yet, the lawsuit alleges TikTok has failed to disclose these harms and continues to mislead the public.

The document outlines how the platform’s design exploits psychological triggers that cause compulsive usage, such as low-friction variable rewards (where users are randomly rewarded with engaging content), social manipulation (where users are encouraged to interact and engage with the content), and ephemeral content (which creates urgency by presenting time-sensitive material).

CYCLE OF ADDICTION

These features make it difficult for users to control their time on the platform, creating a cycle of addiction.

But this is not just about TikTok.

All Big Tech companies are using algorithms to addict children and polarise adults, as mounting research into these algorithmic harms shows.

Tackling this problem is long overdue and, if 35 minutes is all it takes to addict a young child to TikTok, time is not on our side.

Politicians of all hues should get on board with this bill and have these manipulation machines turned off.
