At its European Trust and Safety Forum in Dublin, TikTok said it had removed more than 6.5 million videos in the first half of the year for breaching its rules on violent and hateful content.

The company said these accounts attempted to avoid detection by using coded emojis, rebranding accounts and directing users to external websites.

Up to 17 organisations, operating more than 920 accounts, were reported to be linked to extremist material, including content praising terrorist attacks, encouraging mass violence or targeting protected groups.

However, TikTok said 98.9pc of the content was taken down before users flagged it, and 94pc was removed within 24 hours.

After removing these accounts, the platform said it continues monitoring attempts to rebuild activity on the platform.