Mozilla is in a tricky position. It contains both a nonprofit organization dedicated to making the internet a better place for everyone, and a for-profit arm dedicated to, you know, making money. In the best of times, these things feed each other: The company makes great products that advance its goals for the web, and the nonprofit gets to both advocate for a better web and show people what it looks like. But these are not the best of times. Mozilla has spent the last couple of years implementing layoffs and restructuring, attempting to explain how it can fight for privacy and openness when Google pays most of its bills, while trying to find its place in an increasingly frothy AI landscape.

Fun times to be the new Mozilla CEO, right? But when I put all that to Anthony Enzor-DeMeo, the company’s just-announced chief executive, he swears he sees opportunity in all the upheaval. “I think what’s actually needed now is a technology company that people can trust,” Enzor-DeMeo says. “What I’ve seen with AI is an erosion of trust.”

Mozilla is not going to train its own giant LLM anytime soon. But there’s still an AI Mode coming to Firefox next year, which Enzor-DeMeo says will offer users their choice of model and product, all in a browser they can understand and from a company they can trust. “We’re not incentivized to push one model or the other,” he says. “So we’re going to try to go to market with multiple models.”

-_-

  • tonyn@lemmy.ml · 1 day ago

    I wouldn’t mind if they added integration for ollama, but right now the feature is useless to me as they only seem to have the major commercial LLMs integrated.

    • Sims@lemmy.ml · 23 hours ago

      No, you can point it at something like ‘localhost:8080’, so if you have your ollama/webui or another agent/LLM listening there, FF will show that interface in the side window. Summary etc. still works. You just choose ‘localhost’ as your provider.

      Better to search for the pref, but it looks like this: `browser.ml.chat.provider` set to `http://192.168.1.83:8080/`
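
      As a `user.js` sketch of that setup (the address and the `browser.ml.chat.enabled` pref are examples; point the provider at wherever your local model's web UI actually listens):

      ```js
      // user.js — assumes a local LLM front end (e.g. ollama behind a web UI)
      // listening on this address; substitute your own host/port.
      user_pref("browser.ml.chat.provider", "http://localhost:8080");
      // If the chatbot sidebar isn't visible, this pref may also need enabling.
      user_pref("browser.ml.chat.enabled", true);
      ```

      The same values can be set by hand in about:config instead of a `user.js` file.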