Don’t like Copilot? Opera will let you run any LLM you want

And best of all, it’s all local.

Key Takeaways

  • The Opera One browser adds support for 150 local LLMs from around 50 families, letting AI run entirely on your own hardware.
  • AI PCs now allow users to run LLMs locally, reducing the need to send queries to remote servers and addressing potential privacy concerns.
  • Microsoft Edge and Chrome are adding AI features, but Opera steals the show with local LLM support built directly into its browser.

It’s hard to find an area of technology that artificial intelligence hasn’t changed in some way. Most of it is pretty obvious, such as AI’s ability to generate images or make a scarily good recreation of someone’s voice. However, behind the scenes, companies are hard at work trying to become the best in the market for AI. In the browser scene, we have Microsoft changing its mobile browser’s name to “Microsoft Edge: AI Browser,” and Chrome adding AI-generated themes, but Opera is keen to take the crown by adding locally running LLMs directly into its browser.

Opera One receives local LLM support

An example of an LLM running on Opera One
Image Credit: Opera

As announced on Opera Blogs, the Opera One browser is receiving local LLM support. Usually, when you interact with an LLM such as ChatGPT, your computer sends the query to a remote server to be processed. That lightens the load on your system, but it also opens up potential privacy concerns. Fortunately, with AI PCs now reaching the public, you can purchase the hardware you need to run these LLMs on your own system.
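Opera hasn’t published the internals of its implementation, but tools that run LLMs locally typically expose the model over a loopback HTTP endpoint, so the query never leaves your machine. As a rough sketch of the difference, here’s what a request to a hypothetical local server might look like, assuming an Ollama-style API on `localhost:11434` (the endpoint, port, and model name are illustrative, not Opera’s actual setup):

```python
import json
import urllib.request

# Assumed local endpoint (Ollama-style); a remote LLM would point at a cloud URL instead.
LOCAL_ENDPOINT = "http://localhost:11434/api/generate"

def build_local_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for a locally hosted LLM; the prompt stays on this machine."""
    payload = json.dumps({
        "model": model,    # e.g. "gemma:2b" or "mixtral" -- illustrative names
        "prompt": prompt,
        "stream": False,   # ask for one complete response instead of streamed chunks
    }).encode("utf-8")
    return urllib.request.Request(
        LOCAL_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_local_request("gemma:2b", "Summarize this paragraph in one sentence.")
# urllib.request.urlopen(req)  # would only succeed with a local model server running
```

The privacy win is visible in the URL alone: the request targets loopback, so the prompt is processed by your own hardware rather than being shipped to a remote data center.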

The new update for Opera One brings in 150 LLMs you can tinker with on your PC, all of which stem from around 50 families. Opera lists the following families as available right now:

  • Llama from Meta
  • Vicuna
  • Gemma from Google
  • Mixtral from Mistral AI
  • And many more families

To get this update, you need to grab Opera One through the developer stream; a link is included in the blog post mentioned above. Once done, you can follow the instructions Opera gives in its announcement to get your very own LLM set up and running locally. Now that local LLMs have arrived in a browser for the first time, it’ll be interesting to see how competitors such as Google Chrome, Mozilla Firefox, and Microsoft Edge respond, if at all.

