
Facebook’s Large Language Model leaks on 4Chan

The model was initially available only to approved researchers, government officials or members of civil society

Facebook’s LLaMA, short for Large Language Model Meta AI, is an artificial intelligence model that leaked online for public download last week.

As reported by Vice, the language model was initially available only to approved researchers, government officials or members of civil society, and was distributed for research purposes to evaluate and improve the model. Last week, however, it surfaced as a downloadable torrent file on 4Chan.

The exact implications of the leak still aren’t clear, though it is worth noting that this is the first time a major tech company’s proprietary AI model has leaked to the public.

“To date, firms like Google, Microsoft, and OpenAI have kept their newest models private, only accessible via consumer interfaces or an API, ostensibly to control instances of misuse,” wrote Vice.

In a statement given to Vice’s Motherboard, Meta did not deny the leak and said that it stands by its approach of sharing the LLaMA model among researchers:

“It’s Meta’s goal to share state-of-the-art AI models with members of the research community to help us evaluate and improve those models. LLaMA was shared for research purposes, consistent with how we have shared previous large language models. While the model is not accessible to all, and some have tried to circumvent the approval process, we believe the current release strategy allows us to balance responsibility and openness.”

In a blog post from February, Meta wrote that access to the AI model would be granted on a case-by-case basis; that gatekeeping has clearly fallen apart now. Meta is currently filing takedown requests against the torrent file, and it is unclear if it has made any progress.

Learn more about LLaMA here.

Source: Vice

