OpenAI wants to make tackling content moderation easier for digital platforms.
According to the company, using its GPT-4 multimodal large language model for content moderation can lead to faster implementation of policy changes. The model can also interpret “rules and nuances in long content policy documentation and adapt instantly to policy updates, resulting in more consistent labeling.”
OpenAI suggests that the use of GPT-4 in content moderation for digital platforms will eventually relieve the mental burden of a large number of human moderators.
Content moderation requires human moderators to sift through large amounts of content to determine whether a platform’s policies are being violated. The process is slow and can lead to inconsistent or inaccurate decisions. An LLM can handle the same task in far less time. According to OpenAI, its models can make moderation judgments based on the policy guidelines provided to them. “With this system, the process of developing and customizing content policies is trimmed down from months to hours,” it says.
However, OpenAI also acknowledges that AI models are not perfect, and that human oversight is still necessary. “Judgments by language models are vulnerable to undesired biases that might have been introduced into the model during training. As with any AI application, results and output will need to be carefully monitored, validated, and refined by maintaining humans in the loop,” OpenAI’s blog post states. “By reducing human involvement in some parts of the moderation process that can be handled by language models, human resources can be more focused on addressing the complex edge cases most needed for policy refinement.”
OpenAI is not the only company that is using AI to assist content moderation. Meta has been employing AI to help Facebook moderators for several years but still faces criticism for its content decisions.
OpenAI says that anyone with access to its API can use GPT-4 to build their own AI-assisted moderation system. More details are available in OpenAI’s blog post.
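In practice, the approach boils down to sending the policy text and the piece of content to be judged in a single API call and having the model return a label. Below is a minimal sketch of that idea using the OpenAI Python library; the policy, labels, and prompt wording are illustrative assumptions for this example, not OpenAI’s actual moderation prompts.

```python
# Minimal sketch of a policy-based moderation check with GPT-4.
# Assumes the openai Python package (v1.x) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# Illustrative policy and labels -- not OpenAI's real moderation guidelines.
POLICY = """You are a content moderator. Label the user's content with exactly one label:
- ALLOW: content that does not violate any rule below
- FLAG_HARASSMENT: insults or threats directed at a person or group
- FLAG_SPAM: unsolicited advertising or repeated promotional links
Respond with the label only."""

def moderate(content: str) -> str:
    """Ask GPT-4 to judge a piece of content against the policy above."""
    response = client.chat.completions.create(
        model="gpt-4",
        temperature=0,  # deterministic output helps keep labeling consistent
        messages=[
            {"role": "system", "content": POLICY},
            {"role": "user", "content": content},
        ],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(moderate("Buy cheap watches at example.com!!!"))  # expected: FLAG_SPAM
```

Because the policy lives in the prompt rather than in a retrained model, updating the rules is just a matter of editing that text, which is the speed advantage OpenAI is pointing to.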