Ireland has launched investigations into two major tech platforms, TikTok and LinkedIn, over concerns they may be violating EU regulations regarding digital content reporting.
The main issue at hand is how these platforms present and implement their reporting tools. Regulators have identified "deceptive interface designs" that could make users believe they are reporting content as illegal when they are in fact only reporting it as a breach of the platform's terms and conditions. Such designs risk confusing users and undermining the effectiveness of the reporting mechanisms.
At its core, the EU's Digital Services Act (DSA) requires platforms to have easy-to-access and user-friendly reporting mechanisms for content that may be illegal. However, regulators believe some providers are not meeting these requirements, using design elements that could deceive or manipulate users.
According to John Evans, DSA Commissioner at Coimisiún na Meán, the key issue is ensuring that reporting mechanisms are transparent and trustworthy, allowing users to make informed decisions about what content they report as illegal. Providers must also avoid designing their interfaces in ways that could distort or impair this decision-making process.
Ireland's regulators have already forced other tech companies to make significant changes to their reporting mechanisms for illegal content, with the threat of financial penalties hanging over them. Platforms found to be violating the DSA can face fines of up to six percent of their global annual turnover.
The investigation into TikTok and LinkedIn follows a separate probe by Ireland's Data Protection Commission into social media platform X, which is alleged to have trained its AI assistant on user posts in violation of the General Data Protection Regulation (GDPR). If found to be in breach, X could face fines of up to four percent of its global annual turnover.