The Supreme Court ruled today in two cases that could have a major impact on how social media platforms operate and how far the government can go in intervening on behalf of political speech on these platforms.
Social media companies never wanted to aggressively police content on their platforms. Now, they are deciding they don’t have to anymore. Mark Zuckerberg’s announcement that Meta Platforms will end ...
California will no longer enforce key provisions of a law requiring social media companies to disclose details about their content moderation practices after settling a lawsuit with Elon Musk’s X Corp ...
Two state laws that could upend the way social media companies handle content moderation are still in limbo after a Supreme Court ruling sent the challenges back to lower courts, vacating previous ...
Attorney General Andrew Bailey issued a regulation this week requiring social media platforms to give users in Missouri a choice over the algorithm that curates their feeds. Bailey's proposal, modeled after the road map ...
Social media platforms commonly use artificial intelligence for content moderation, with the AI systems relying on machine-learning algorithms to screen content posted by social media users. Ultimately, the AI ...
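To make the idea concrete, here is a minimal, hypothetical sketch of how an AI-assisted moderation pipeline might route user posts. The scoring function is a stand-in for a trained classifier's output; real platforms use proprietary models, and every name, keyword weight, and threshold below is an illustrative assumption, not any platform's actual system.

```python
# Hypothetical sketch: route posts to remove/review/allow based on a risk score.
# The keyword weights below stand in for a trained model's probability output.

FLAG_WEIGHTS = {
    "spam": 0.6,
    "scam": 0.8,
    "threat": 0.9,
}

REMOVE_THRESHOLD = 0.8   # assumed cutoff: automatically remove
REVIEW_THRESHOLD = 0.5   # assumed cutoff: escalate to human review


def score_post(text: str) -> float:
    """Return a pseudo 'risk score' in [0, 1] for a post.

    Stands in for a classifier's probability; here it is simply the
    maximum weight of any flagged keyword found in the text.
    """
    words = text.lower().split()
    return max((FLAG_WEIGHTS.get(w, 0.0) for w in words), default=0.0)


def moderate(text: str) -> str:
    """Route a post to 'remove', 'review', or 'allow' based on its score."""
    score = score_post(text)
    if score >= REMOVE_THRESHOLD:
        return "remove"
    if score >= REVIEW_THRESHOLD:
        return "review"
    return "allow"


if __name__ == "__main__":
    for post in ["hello world", "this is spam", "a threat against someone"]:
        print(f"{post!r} -> {moderate(post)}")
```

In practice the scoring step would call a trained model rather than a keyword table, and the thresholds, along with the decision to auto-remove versus escalate, are policy choices that vary by platform.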
The Centre (India's central government) has issued a stern warning to social media platforms about the consequences of failing to remove unlawful content.