
TikTok fined in Italy after ‘French scar’ challenge led to consumer safety probe


Italy’s competition and consumer authority, the AGCM, has fined TikTok €10 million (almost $11M) following a probe into algorithmic safety concerns.

The authority opened an investigation last year into a so-called “French scar” challenge in which users of the platform were reported to have shared videos of marks on their faces made by pinching their skin.

In a press release Thursday, the AGCM said three European companies in the ByteDance group (Ireland-based TikTok Technology Limited, TikTok Information Technologies UK Limited and TikTok Italy Srl) had been sanctioned for what it summarized as an “unfair commercial practice”.

“The company has failed to implement appropriate mechanisms to monitor content published on the platform, particularly those that may threaten the safety of minors and vulnerable individuals. Moreover, this content is systematically re-proposed to users as a result of their algorithmic profiling, stimulating an ever-increasing use of the social network,” the AGCM wrote.

The authority said its investigation confirmed TikTok’s responsibility in disseminating content “likely to threaten the psycho-physical safety of users, especially if minor and vulnerable”, such as videos related to the “French scar” challenge. It also found the platform did not take adequate measures to prevent the spread of such content and said it failed to fully comply with its own platform guidelines.

The AGCM also criticized how TikTok applies the guidelines, which it says are applied “without adequately accounting for the specific vulnerability of adolescents”. It pointed out, for example, that teens’ brains are still developing and that young people may be especially at risk because they can be prone to peer pressure to emulate group behaviour in order to fit in socially.

The authority’s remarks particularly highlight the role of TikTok’s recommendation system in spreading “potentially dangerous” content, pointing out the platform’s incentive to drive engagement and increase user interactions and time spent on the service to boost ad revenue. The system powers TikTok’s ‘For You’ and ‘Followed’ feeds and is, by default, based on algorithmic profiling of users, tracking their digital activity to determine what content to show them.
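
To make that mechanism concrete, here is a deliberately simplified, hypothetical sketch in Python of an engagement-optimised, profiling-based ranker of the kind the AGCM describes. It is not TikTok’s actual system, and every class, function and field name is illustrative: tracked watch time feeds an inferred-interest profile, and that profile then boosts similar content in the ranked feed.

```python
# Illustrative sketch only: a toy engagement-driven, profiling-based ranker.
# NOT TikTok's real recommendation system; all names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    # Interest scores inferred from tracked activity (watches, likes, shares).
    interests: dict[str, float] = field(default_factory=dict)

    def update(self, topic: str, watch_seconds: float) -> None:
        # Longer watch time strengthens the inferred interest, which in turn
        # makes similar content more likely to be re-proposed to the user.
        self.interests[topic] = self.interests.get(topic, 0.0) + watch_seconds


def rank_for_you(profile: UserProfile, candidates: list[dict]) -> list[dict]:
    # Score each candidate by predicted engagement: inferred interest in the
    # item's topic multiplied by how long people typically watch it.
    def score(item: dict) -> float:
        return profile.interests.get(item["topic"], 0.0) * item["avg_watch_time"]

    return sorted(candidates, key=score, reverse=True)


# Example: a user who lingered on challenge videos sees more of them first.
profile = UserProfile()
profile.update("challenge", watch_seconds=45.0)
feed = rank_for_you(profile, [
    {"id": 1, "topic": "cooking", "avg_watch_time": 12.0},
    {"id": 2, "topic": "challenge", "avg_watch_time": 20.0},
])
print([item["id"] for item in feed])  # [2, 1]: the challenge video ranks first
```

The feedback loop in the toy example is the point the regulator flags: the more a user lingers on a given type of content, the more prominently that type is re-proposed, which in turn drives further engagement.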

“This causes undue conditioning of users who are stimulated to increasingly use the platform,” the AGCM suggested in another remark that’s notable for being critical of engagement driven by profiling-based content feeds.

We’ve reached out to the authority with questions. But its negative assessment of the risks of algorithmic profiling looks interesting in light of renewed calls by some lawmakers in Europe for profiling-based content feeds to be off by default.

Civil society groups such as the ICCL also argue this would shut off the outrage tap that ad-funded social media platforms monetize through engagement-focused recommender systems, which they say have the secondary effect of amplifying division and undermining societal cohesion for profit.

TikTok disputes the AGCM’s decision to issue a penalty.

In a statement, the platform sought to play down the authority’s assessment of the algorithmic risks posed to minors and vulnerable individuals by framing the intervention as relating to a single controversial but small-scale challenge. Here’s what TikTok told us:

We disagree with this decision. The so-called ‘French Scar’ content averaged just 100 daily searches in Italy prior to the AGCM’s announcement last year, and we long ago restricted visibility of this content to U18s, and also made it ineligible for the For You feed.

While the Italian enforcement is limited to one EU Member State, the European Commission is responsible for overseeing TikTok’s compliance with algorithmic accountability and transparency provisions in the pan-EU Digital Services Act (DSA), where penalties for non-compliance can scale up to 6% of global annual turnover. TikTok was designated as a very large online platform under the DSA back in April last year, with compliance expected by late summer.

One notable change as a result of the DSA is TikTok offering users non-profiling-based feeds. However, these alternative feeds are off by default, meaning users remain subject to AI-based tracking and profiling unless they take action themselves to turn it off.
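
As a rough illustration of that default, the toy Python snippet below shows a feed selector where the profiled feed is served unless the user explicitly opts out. The names are hypothetical and do not reflect TikTok’s real settings or API; the sketch only captures the default-on behaviour described above.

```python
# Hypothetical sketch of the default-on behaviour described above: the
# profiling-based feed is served unless the user explicitly switches it off.
from dataclasses import dataclass


@dataclass
class FeedSettings:
    # The non-profiled alternative exists, but personalisation defaults to on.
    personalised: bool = True


def select_feed(settings: FeedSettings) -> str:
    return "profiled_for_you_feed" if settings.personalised else "non_profiled_feed"


print(select_feed(FeedSettings()))                    # profiled_for_you_feed (no action taken)
print(select_feed(FeedSettings(personalised=False)))  # non_profiled_feed (explicit opt-out)
```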

Last month the EU opened a formal investigation of TikTok, citing addictive design, harmful content and the protection of minors as among its areas of focus. That procedure remains ongoing.

TikTok has said it looks forward to the opportunity to provide the Commission with a detailed explanation of its approach to safeguarding minors.

However, the company has had a number of run-ins with regional enforcers concerned about child safety in recent years, including a child safeguarding intervention by the Italian data protection authority; a fine of €345 million last fall over data protection failures also related to minors; and long-running complaints from consumer protection groups worried about minor safety and profiling.

TikTok also faces the possibility of increasing regulation by Member State-level agencies applying the bloc’s Audiovisual Media Services Directive, such as Ireland’s Coimisiún na Meán, which has been considering rules for video-sharing platforms that would require profiling-based recommender algorithms to be turned off by default.

The picture is no brighter for the platform over in the US, either, as lawmakers have just proposed a bill to ban TikTok unless it cuts ties with Chinese parent ByteDance, citing national security and the potential for the platform’s tracking and profiling of users to provide a route for a foreign government to manipulate Americans.



by The Economic Times

IBM said Tuesday that it planned to cut thousands of workers as it shifts its focus to higher-growth businesses in artificial intelligence consulting and software. The company did not specify how many workers would be affected, but said in a statement the layoffs would “impact a low single-digit percentage of our global workforce.” The company had 270,000 employees at the end of last year. The number of workers in the United States is expected to remain flat despite some cuts, a spokesperson added in the statement. A massive supplier of technology to…

by The Economic Times

The number of Indian startups entering famed US accelerator and investor Y Combinator’s startup programme might have dwindled to just one in 2025, down from the high of 2021, when 64 were selected. But not so for Indian investors, who are queuing up to find the next big thing in AI by relying on shortlists made by YC to help them filter their investments. In 2025, Indian investors have invested in close to 10 Y Combinator (YC) AI startups in the US. These include Tesora AI, CodeAnt, Alter AI and Frizzle, all with Indian-origin founders but based in…

by TechCrunch

Lovable, the Stockholm-based AI coding platform, is closing in on 8 million users, CEO Anton Osika told this editor during a sit-down on Monday, a major jump from the 2.3 million active users number the company shared in July. Osika said the company — which was founded almost exactly one year ago — is also seeing “100,000 new products built on Lovable every single day.”