Opera Launches Local LLM Support in Developer Browser

The new feature offers enhanced privacy and offline functionality, with no set timeline for broader release.

Overview

  • Opera introduces experimental support for running Large Language Models (LLMs) locally in its Opera One Developer browser, offering users enhanced privacy and offline functionality.
  • The update includes 150 LLM variants from around 50 model families, including LLaMA, Gemma, and Mixtral, each requiring 2-10 GB of local storage.
  • Opera's move to local LLMs aims to give it a competitive edge in the AI landscape, with the potential for faster response times and improved data privacy.
  • There's no timeline for when the local LLM feature will be introduced to the regular Opera browsers, as it remains exclusive to the developer version for now.
  • Opera accounts for an estimated 4.88% of the browser market in Europe and 3.15% globally, a position this feature could help strengthen.