
Share your feedback on the AI services experiment in Nightly

asafko
Employee

Hi folks, 

In the next few days, we will start the Nightly experiment which provides easy access to AI services from the sidebar. This functionality is entirely optional, and it’s there to see if it’s a helpful addition to Firefox. It is not built into any core functionality and needs to be turned on by you to see it. 

If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here). 

We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.

Thanks so much for helping us improve Firefox!

3,278 REPLIES

Indeed, there might be some trouble with various LLMs supporting the 100+ Firefox locales. Those using these localized Firefox builds might not like seeing English show up in the chatbot, but perhaps this is something we can test to see whether the chatbot interprets content with better quality.

If you want to pass in the selected text without a prompt, you could create a new `browser.ml.chat.prompts.empty` string pref and put in a space:

[screenshot: pt-BR blank.png]
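As a sketch, the pref from the post above could be set via user.js (the pref name comes from the post; the single-space value is the workaround being described):

```js
// user.js sketch: add an "empty" prompt entry that passes the selected
// text through without any prompt text. The single space is the
// workaround described above.
user_pref("browser.ml.chat.prompts.empty", " ");
```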

Or if you just want a fresh chatbot to ask general questions without passing in the selection, you can make the chatbot icon available with the new Sidebar experimental feature (also in Firefox Labs), or add the sidebar icon to your toolbar to toggle it open/closed.

...Why are you posting this? I'm sorry, but I don't see what this adds to the discussion, and it very much looks like it was written by an LLM.

<deleted>

williamwalker
Making moves

Hey asafko, thanks for the heads up! I'm always excited to try new Firefox features. Quick question - will this AI sidebar integration work with existing AI services we might already use, or is it limited to specific partners?

Are there other AI services that you think should be in the list? You can try it out in the sidebar by setting your AI service's url for `browser.ml.chat.provider` from about:config.

Here's an example of setting Claude (which has since been added to the list). https://connect.mozilla.org/t5/discussions/share-your-feedback-on-the-ai-services-experiment-in-nigh...
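For reference, a minimal user.js sketch of setting that pref; the URL below is an assumption, so substitute your preferred service's address:

```js
// user.js sketch: point the chatbot sidebar at a provider of your
// choice. The URL here is an example, not a recommendation.
user_pref("browser.ml.chat.provider", "https://claude.ai/new");
```

The same value can be set by hand from about:config.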

gpiper
Making moves

AI is the opposite of "private" — if it is integrated, data WILL be used for it to learn, everywhere.

If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and not be sent anywhere for learning. Would that be better for you?

You can get a similar behavior today by configuring this feature to use a local chatbot like https://llamafile.ai
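As a sketch, that local configuration could look like this in user.js; port 8080 is llamafile's default, but check what your local server actually listens on:

```js
// user.js sketch: point the chatbot feature at a llamafile server
// running locally, so prompts stay on your machine.
user_pref("browser.ml.chat.provider", "http://localhost:8080");
```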

I also want Firefox to integrate a native LLM to make it more secure and private.

Nope. I don't want any part of any AI on my browser. It is evil. It will get smarter while our brains get dumber. It is literally a Mind Flayer.


@Mardak wrote:

If Firefox had integrated local LLM inference to power this AI chatbot feature, all the data you pass in for prompting and the generated responses could stay on your computer and not be sent anywhere for learning. Would that be better for you?


Yes, it would be better. Firefox has a reputation for having very strong standards for its users' privacy. By implementing the feature with cloud LLMs, you are eroding its ethos. Local is the only way and you need to put a lot of thought into how it'll be privacy-conscious and security-conscious for your users, all while making sure it's not getting in the way for users who don't want it. Microsoft is being absolutely flamed for their integration of AI into Windows because they didn't do any of these things. This feature should never have been shipped in the state it's in now, and you need to take a long time to think about these things I mentioned and how it can be implemented in the users' favor.

BiORNADE
Making moves

Thinking about it, that sounds like what a sidebar would do. Wouldn't the right-click functions be better as an extension, and instead of making it ChatGPT-only, make a sidebar so users can plug in their own chatbots?

In the end, everybody wins

Have you tried setting a custom chat provider to any page you want with `browser.ml.chat.provider`? It might be similar to what you're asking for, so if that matches up with what people want, we could try to figure out a more streamlined way to do this. E.g., "use current tab as chatbot"?

WoodwardIII
Making moves

Any plan to integrate with Apple Intelligence on Macs? Also, it would be great to see integration with local LLMs like Ollama (not sure how execution would work, but maybe by typing in a localhost address).

Making use of on-device inference APIs provided by the OS is an interesting approach, as macOS/iOS, Windows, and Android are all adding capabilities that should be much faster for those with newer hardware. The current chatbot feature can already use local LLMs like llamafile exposed on localhost running on existing hardware, so we'll want to see how we can support both old and new devices with acceptable performance and quality.

wutongtaiwan
Familiar face

I suddenly had an idea: some people are afraid of AI violating privacy, but as long as AI is used in the right place, it can protect privacy. For example, AI could identify website trackers. There are many trackers in the world, and they are not necessarily on Firefox's tracker list; using AI to identify possible trackers may be a good way to protect privacy. Of course, it could also block things by mistake, so it's best to put this feature in the Strict mode of Enhanced Tracking Protection.

Thanks for the suggestion. Were you thinking this would be a custom model running locally? We probably wouldn't want to send requests to check whether something is a tracker, so perhaps something similar to the existing Safe Browsing feature that detects phishing and malware sites? This is still AI in some form, but it might not be a good fit for generative AI.

Allan-L
Making moves

Hello, I enabled the "browser.ml.chat.shortcuts" option, which displays a floating menu when selecting text or performing a long click. Is this option a test for future versions? If so, corrections are needed to ensure this menu only appears when text is selected. Currently, I am using Ubuntu with the Nightly build, and my mouse's scroll wheel is faulty, so I use the scroll bars everywhere. The problem is that this menu keeps appearing constantly, whether I'm using the scroll bar or simply performing a long click without moving the mouse. I believe the best option would be for it to appear only when text is selected on the page. Thank you!

Thanks for reporting. Could you try the latest Nightly 131 (20240808093537) to see if the shortcuts stop showing up with the scrollbar click or scrollwheel click? The long-press behavior should be off for now, but we'll keep this in mind as we add selection-less behaviors.

Yeah, it's perfect now, you are very fast!

I believe the long-press option is a good idea on touchscreens for performing page actions, e.g. back, forward, send or share the page, send to printer, etc. On mobile these options make more sense on a long press of the page or a blank area; on desktop we already have the right-click menu with many more options.

haingdc
Making moves

I also like shortcuts to increase/decrease the sidebar width. Sometimes I feel that it's too narrow for the content. We have the drag feature to adjust the width, but IMO it could be easier with shortcuts.

Are you referring to the chatbot sidebar being too narrow? It recently got wider with Nightly 130 (20240803095257), so if you switch between a narrow sidebar for say history and chatbot, it should automatically get wider and return to narrow without needing to drag.

Yes, the chatbot sidebar is narrow. I also think the automatic widening feature is an improvement.

lstep
Making moves

Support for an open source Chat bot (through an API) like ollama (https://ollama.com/) would be greatly appreciated, it would also allow more privacy as local LLMs can be used. Ollama supports the standard OpenAI API, so it would "just" need to get the base_url as a parameter and the model...

Currently the chatbot feature supports any web chatbot including open-source https://llamafile.ai which runs LLMs locally on-device. llamafile also supports OpenAI API chat/completions, but the current Firefox implementation relies on a server responding with a webpage to show in the sidebar.

Would you want a chatbot or maybe a dedicated summarize feature (without followup chat/questions) that directly uses inference APIs potentially pointed at locally running ollama?

Tom4
Making moves

You should provide an option for on-device models, or focus on providing access to privacy-respecting AI services. Proprietary AI services that have free-usage limits, require registration, and have privacy problems should not be integrated into the browser.

Are there particular prompts that you would find more useful? Building on what we've implemented for local PDF alt-text generation, we can use that for running other models such as something specialized for generating summaries. Because these models run on-device, there wouldn't be usage limits but the quality and response speed will depend on your hardware.

haingdc
Making moves

Hi, it takes a while for the chatbot sidebar to load whenever I switch from another sidebar tool to the chatbot sidebar. The delay can affect the user experience. Hope you guys make it faster. Thanks

These chatbots from various providers are hosted webpages, so requests will depend on your location. Are the requests similarly slow if you open these chatbots in a regular tab? Maybe we can make it feel a bit more responsive by hiding the previous content when you're switching and showing a loading indicator?

reckless
Making moves

Hi, this feature is very interesting.

Could it be possible to connect self-hosted LLMs like Ollama? (On localhost, as well as remote over https.)

I've seen ollama showing a chatbot webpage similar to self-hosted llamafile. Does the sidebar load the webpage when you set browser.ml.chat.provider to your own (local) server? It might not support passing in a prompt, but you'll at least have your self-hosted chatbot available in the sidebar.

We're currently exploring not requiring the server to respond with HTML; instead, Firefox would display responses from calling an inference API directly, and I believe llamafile, vLLM, and ollama all support POST to /v1/chat/completions.
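For reference, a minimal sketch of the request shape an OpenAI-compatible /v1/chat/completions endpoint accepts; the model name and port are assumptions, so adjust them for your local server:

```javascript
// Sketch: JSON body for an OpenAI-compatible chat/completions endpoint
// as served by llamafile, vLLM, or ollama. Model name is an assumption
// (llamafile typically ignores it; ollama requires a pulled model).
const payload = {
  model: "local-model",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Summarize the selected text." },
  ],
};

// Not invoked here: POST the payload and read back the first choice.
async function chat(baseUrl = "http://localhost:8080") {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```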

gpiper
Making moves

You should be branding Mozilla products as "AI Free" not integrating the demon seed into your codebase.

ffffff
Making moves

Related: Vivaldi's stance on the current AI/LLM trend

If Vivaldi didn't use Google's Blink engine, this discussion thread is the kind of reason I'd consider leaving Firefox for. It's very disappointing to see Mozilla of all entities embracing AI like this.

The Mozilla Manifesto says "commercial involvement in the development of the internet is critical." Seems it's entirely possible to have commercial involvement without adopting extremely controversial and highly criticized market trends.

Maybe I'll be proven wrong and this will be good for the internet and Firefox users, but I'm not optimistic.

In unrelated news, I keep trying to unsubscribe from this discussion because it only serves to make me sad, yet I still receive e-mails notifications. I want out. Anyone know what I might be doing wrong?

Don't be too afraid of AI; after all, Firefox doesn't force AI on you, and Mozilla doesn't enable it by default, so you don't have to worry about it infringing on your privacy.

They could have just made it an extension then. As a Lab it is far more likely to be included as a feature in the future which is something I didn't ask for and do not want, even if it is opt-in by default.

 just humouring the product infringes not only the privacy of every web user, but their copyrights as well. and that's before we get into the environmental and ethical problems (generative 'ai' marketing is a transparent financial fraud by VCs and CEOs).

so yes, we should oppose all usage of, and cooperation with LLM and generative 'ai' scams.

The current Firefox feature supports open models from providers like Hugging Face and allows us to guide users to fully local inference and truly open-source models like OLMo when those functionalities are ready. Mozilla is also improving AI such as democratizing access with llamafile, supporting open-source models that have open training data with better privacy for everyone, and generally engaging with the broader community including lawmakers to make AI good for the internet.

Being more integrated with AI such as this Firefox feature allows us to make a difference for users and non-users of chatbots by magnifying the efforts we have across Mozilla.

AI doesn't make the Internet better. People do.

not sure about this discussion specifically but since I only signed up for this forum to enter _this thread_ to say I don't want AI, i just went into my settings and turned off every notification and email option I could find

raidingshaman
Making moves

Need Keyboard shortcuts in order to maximize productivity with this new AI feature