06-21-2024 11:55 AM - last edited on 10-18-2024 02:19 PM by Jon
Hi folks,
In the next few days, we will start a Nightly experiment that provides easy access to AI services from the sidebar. This functionality is entirely optional, and it's there to see if it's a helpful addition to Firefox. It is not built into any core functionality, and you need to turn it on yourself to see it.
If you want to try the experiment, activate it via Nightly Settings > Firefox Labs (please see full instructions here).
We’d love to hear your feedback once you try out the feature, and we’re open to all your ideas and thoughts, whether it’s small tweaks to the current experience or big, creative suggestions that could boost your productivity and make accessing your favorite tools and services in Firefox even easier.
Thanks so much for helping us improve Firefox!
09-03-2024 05:08 PM
The current implementation renders a webpage in the sidebar, so exposing ollama through open-webui or other web chatbot interfaces works for now. We have been looking into directly calling inference APIs, such as the chat/completions endpoints that llamafile and ollama already expose, which would allow Firefox to build its own custom response interface rather than necessarily a chatbot.
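For anyone who wants to poke at those endpoints directly, here is a minimal sketch of a chat/completions call. The port, path, and model name below are assumptions for a typical local ollama setup (llamafile or other servers will differ), not something Firefox configures for you:

```python
# Minimal sketch: call a local OpenAI-style chat/completions endpoint.
# Assumptions: ollama's OpenAI-compatible API is listening on
# http://localhost:11434/v1 and a model named "llama3" has been pulled.
import json
import urllib.request

payload = {
    "model": "llama3",  # placeholder model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this page title: Firefox Labs"},
    ],
}

req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
    # OpenAI-style responses put the generated text under choices[0].message
    print(reply["choices"][0]["message"]["content"])
```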
09-04-2024 01:22 AM - edited 09-04-2024 03:01 PM
Thanks! I already use open-webui, so this worked well! One thing I did notice is that the page is not provided to the chatbot unless I select text first, and even then only the title is provided, not the URL (the URL would be handy so I could give the chatbot a tool that fetches the page into its context window itself).
For example, if I don't select anything, it would be nice if the whole page text were provided to the AI, or at least the URL so it can retrieve the page through an automation.
09-03-2024 10:01 AM
Awesome! Can we get an easier way to set up custom providers? And maybe a shortcut to summarize an entire page?
Using about:config, I managed to get ollama with open-webui working, but I noticed that every time I switch to another provider I need to set it up again in the settings.
It would also be lovely to be able to add custom prompts more easily. Finally, being able to provide the chatbot with the entire page instead of just a selection would be great too.
Awesome feature, especially now that small models like Gemma 2 2B are achieving higher user preference than GPT-3.5 on lmsys. Gemma 2 2B and open-webui are a great pair with this feature.
09-03-2024 02:59 PM - edited 09-03-2024 03:06 PM
Would you mind explaining how you got ollama + open-webui working? I don't really understand how to do this 🙂
I have this running here on a separate server but how to integrate this into firefox isn't clear to me.
Edit: Ah, I found it elsewhere in the thread; it's simply a matter of setting browser.ml.chat.provider to my server's URL. That was easy. It doesn't offer me a translate option though (which would normally work better than Firefox's built-in translation), so that would be a great option to add 🙂 I'd much rather have that than "Quiz" or "Simplify".
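For anyone else hunting for it, here is roughly what that looks like, shown as a user.js line for clarity; the URL is a placeholder for wherever your own open-webui (or other chatbot UI) is served:

```
// user.js equivalent of the about:config change (URL is a placeholder)
user_pref("browser.ml.chat.provider", "http://localhost:3000");
```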
09-03-2024 04:29 PM
Do you always want it to translate to the same language? Here's an example of creating a new String pref `browser.ml.chat.prompts.translate`:
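A sketch in user.js form (the prompt wording is just an example; adjust the target language as you like):

```
// user.js equivalent of adding the String pref in about:config
user_pref("browser.ml.chat.prompts.translate", "Translate to English");
```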
I did need to set ollama + open-webui to have a default model, but it seems to be working fine after configuring the provider.
09-04-2024 01:12 AM - edited 09-04-2024 01:21 AM
Yes, always to English. Thank you!
I set browser.ml.chat.prompts.translate to "Translate to English". It now shows up in the context menu, thanks! But could you tell me where to define the prompt itself? (PS: Is there documentation about this perhaps? I couldn't find it).
09-03-2024 05:05 PM
How do you think we should expose summarizing an entire page? We're looking into ways that don't require text selection, such as getting the article contents or the comments section. Would you want it to always summarize this selection-less text, or apply other prompts to it as well?
The current behavior of custom providers isn't great for frequent switching; as you noticed, the custom URL is forgotten when switching to a "known" provider. Do you expect to switch among multiple custom providers, or would just saving the last custom provider be enough to make it easy to switch back? Are you using these as one-off chats with the provider, or do you tend to switch provider and then use it for multiple chats at a time?
09-03-2024 06:45 PM
I believe that making it available in the right-click menu would be awesome. For my use, summarizing the article content would be enough, but the comments section would also be nice. I think that a summarization prompt plus a "talk with the page" prompt would be enough for most users. Something like: "Here is the page: {page}. In the following messages I will ask you questions about the content of the page. Wait for me to ask a question." It would also be awesome to be able to easily customize and add prompts.
Yes, being able to add and switch between different custom providers would be awesome. With more and more interfaces, chatbots, etc., I think that being able to add a list of custom providers would make the feature versatile for all users. Maybe even default to having no providers at all (or just one), and let users manage which providers they want.
For my personal use, I only need the latest custom provider. But again, ideally it would allow adding multiple custom providers at the same time.
09-03-2024 10:23 AM
Can't choose duck.ai
09-03-2024 04:35 PM
You can set `browser.ml.chat.provider` to duck.ai's URL to use it as a custom provider.
09-03-2024 10:51 AM
All generative "AI" is bad. I don't want it in my browser, OS, or anything I own.
Generative "AI" is a fad that is yet to be useful for anything -- it provides falsehoods as if they were fact and scrapes private data to train.
Adding "ai" features to Firefox not only misunderstands what people want their browsers to be for, it also detracts time and money from improving Firefox itself.
If this is kept, I will have to assume that FF is a lost cause and move to another browser.
09-03-2024 04:42 PM
Yes, depending on the model, hallucinations or misunderstandings can be quite frequent, but even then, some people find value in having an optional AI feature readily available for situations where this type of creativity is helpful.
This feature allows for people to configure any provider and model, so hopefully we can help increase interest in training without scraping private data.
09-05-2024 02:26 PM
This is a terrible standpoint to take. By adding this functionality despite openly admitting that the currently available options have gross ethical problems, you are not promoting the idea that those ethical issues should be resolved; you are signaling that you consider them secondary to hopping on the bandwagon. Until these tools can be independently verified to address all basic ethical concerns regarding AI, none of them should come anywhere close to being integrated by default.
09-08-2024 01:56 PM
@Mardak wrote:[...]
This feature allows for people to configure any provider and model, so hopefully we can help increase interest in training without scraping private data.
Yeah, but the ones you’ve specifically included as default options are models that do use irresponsibly scraped data. So, the message you’re sending is that you approve of rampant content theft.
This sucks.
09-09-2024 08:06 AM
If this feature is just for "some" people who find it helpful, then this can be an extension, not a base feature. Why is your stated goal here to "increase interest in [model] training"? So this "feature" is just advertising for the plagiarism machine? Why are you adding adverts for unethical technology into the base version of the browser?
09-13-2024 06:24 PM
"Hallucinations" and misunderstandings" is in and of itself a misunderstanding of how LLMs work. They aren't factbots. They don't "know" things. They predict what text sounds like a plausible sentence. That's ALL they do. If they ever tell you the truth it's purely by coincidence.
09-14-2024 09:48 PM - edited 09-14-2024 09:48 PM
real happy firefox is tanking its reputation for something that in your own words has a problem where "hallucinations or misunderstanding can be quite frequent".
i'm really hoping you get a good paycheck before this browser tanks for shattering user base trust.
09-21-2024 10:40 PM
Mardak, listen. Listen, man.
'Hallucinations' is a specifically chosen word meant to anthropomorphize an advanced autocorrect program. LLM chatbots do not know anything. There are no personalities or people involved. There is no hallucination. It is JUST predicting text, and that text is wrong over half the time. It is JUST a misinformation generation machine. The hype is already crashing. Stop before the hole you're digging gets too deep.
And I shouldn't even need to say this, but there is no creativity in a LLM or an image generation program. None. There is no creativity. There is no helping people be creative. It is physically, at a base level, incapable of creativity. Arguing with that point demonstrates a misunderstanding of how these programs work.
09-03-2024 11:08 AM
Please allow the usage of containers in the Sidebar. I don't want to be logged in to one of the AI API providers without isolating it to some specific container.
09-03-2024 04:49 PM
Currently the AI Chatbot in the sidebar reuses the same container as regular tabs, so if you also want to use a different account for the same provider, you can open that provider in a different container. Do you have a particular provider you want to use with multiple accounts? I currently have a personal and a work ChatGPT account and use Firefox containers to access both at the same time, but I do need to keep in mind which one I log into in a container versus a regular tab.
Or are you not using multiple accounts and dislike how the feature reuses your logged-in state from regular tabs? At least for ChatGPT in the US, this could be useful for using the logged-out experience.
09-09-2024 02:04 PM
I'm not sure if I understand, are you saying that the sidebar uses the container of the tab from which it was launched?
09-03-2024 12:24 PM
I can't use it as it currently breaks Sideberry
09-03-2024 03:31 PM
Same here 😞 That's a bummer. It would be nice if we could have sideberry on the left and the AI on the right.
Or just to have native horizontal tabs again 😉
09-03-2024 04:55 PM
What do you mean by native horizontal tabs? Turning on AI Chatbot with or without Sideberry still has the normal tabs at the top for me.
09-04-2024 06:15 AM
Oh yes I meant vertical tabs 😞
I thought of them as horizontal because it's still a horizontal bar 🙂
09-03-2024 04:53 PM
How is Sideberry broken? Or is it more that you need to keep switching between the AI Chatbot and Sideberry sidebar panels? It seems like Sideberry at least has a keyboard shortcut to switch back to it.
09-03-2024 04:54 PM - edited 09-03-2024 04:55 PM
I need to keep switching between the AI Chatbot and Sideberry sidebars. It would be nice if there were a button to open/close the AI Chatbot.
09-03-2024 05:45 PM
When you have a sidebar open, whether it's Sideberry or AI Chatbot, there should be a dropdown near the top to switch which sidebar is shown, as well as an X button to close it. You can also customize the toolbar to add a "Show sidebars" button that toggles your last sidebar open and closed.
Alternatively, is the chatbot feature still useful for you if you change `browser.
09-03-2024 03:18 PM - edited 09-03-2024 03:19 PM
You can't run a local LLM in the AI chatbot UI. Can this feature be added?
09-03-2024 04:59 PM
Are you talking about running an LLM directly in Firefox, or using local chatbot providers? There's existing configuration, and others here have gotten llamafile or ollama working in the sidebar, including passing in prompts. Firefox's alt-text generation for pdfjs supports running various models, including LLMs, so there's a path to doing it within Firefox, but currently it's quite slow. If people do want to try this out on Nightly, we can look into exposing it, at least for advanced users with sufficient hardware.
09-03-2024 05:33 PM
That would be nice, Thank you.
09-03-2024 05:51 PM - edited 09-03-2024 06:32 PM
I just got this feature in Firefox 130 and made an account specifically to post my feedback:
I would like to know specifically who asked for this feature in its current state. There are problems others have stated here about quality issues and environmental issues of generative AI, which are all valid, but in my opinion the bigger issue is that you are integrating privacy-intrusive LLMs into a browser that has a good reputation for its strong privacy standards. In layman's terms, by implementing this, you are eroding the ethos of your browser. This is a recipe for disaster and you need to pivot ASAP.
The fact that you are even throwing around ideas of integrating AI with people's browsing activity is not good if it's going to be with proprietary models that run in the cloud. You implemented this feature at a time when Microsoft is getting absolutely flamed for their integration of AI into Windows and their poor handling of it from a security and privacy standpoint. What you should have done is taken a step back after that news broke and thought, "How can we implement AI in a way that's not invasive to the privacy of our users and doesn't get in the way for users who don't want it?"
In my opinion, here is what needs to be done in order to right this wrong:
If these things are done, the AI people will be happy and the privacy people will be happy.
You need to fix this as soon as possible otherwise Firefox is going to get a metric ton of bad PR. This is a disaster in the making.
09-07-2024 09:11 PM
If you are on a Debian/apt machine, you can add a file to /etc/apt/preferences.d to revert and pin Firefox to version 129. That will either solve the problem or at least buy you time to find alternatives.
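A minimal sketch of such a pin file (the package name and version pattern are assumptions; depending on your repository the package may be firefox-esr, and the versions actually available will differ):

```
Explanation: keep Firefox on the 129 series (sketch; adjust package name)
Package: firefox
Pin: version 129.*
Pin-Priority: 1001
```

A Pin-Priority above 1000 tells apt to keep (or downgrade to) the pinned version even when a newer one is available.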
09-08-2024 11:01 PM
How is that easier than just leaving the feature turned off?
09-09-2024 10:48 AM
If the feature is present on the system, you need to monitor whether future updates will "magically" enable it against your will. If the feature is never installed in the first place, that concern becomes a non-issue.
Aside from that, if there is a continuum between easy and convenient versus private and secure, many of us will lean much further toward the security end of that spectrum.
I would also be wary of anyone trying to sell "easy and convenient"; they often don't fully understand the problem domain and are up to some shady marketing.
09-10-2024 09:15 PM
The implication that staying with an older piece of software is more secure is laughable. By staying on 129, you would be missing out on security patches introduced from 130 onward. That much should be obvious, and yet you tout this as being more "secure".
Realistically, there are two ways you could be doing this if you wanted to do it right:
It's hypocritical to say that you want more security whilst deliberately running outdated software. Consider one of the above options and stop giving people advice that makes their systems worse.
09-03-2024 05:51 PM
Please don't add A.I. to the search engine. I specifically started using Firefox to get away from A.I. searches. If you start using A.I., I'm moving search engines again.
09-03-2024 08:54 PM
I think it would be helpful to have a short description of each chatbot, how they compare with each other, and which of them Mozilla believes meet ethical standards (both for the end user and for everyone in general), and to only include by default those that do.
Also, a general disclaimer about AI chatbots would be helpful (or at least about the current generation): how they can help, some example use cases, what issues they have, what harms they might cause on a personal and societal level, and then leave it to the user to decide whether to use the bot.
I understand the need for Firefox to stay relevant (even if it means doing what other browsers shamelessly do even with large market shares), but providing additional information for the people who will read it is always a good thing, in my personal opinion. 🙂
09-03-2024 09:13 PM
Please add an AI chat button to the toolbar items.
09-04-2024 03:22 AM
Cool! I would also like you to add the ability to use local LLMs for more privacy, such as the ability to select a llamafile or to point to the ollama API. I would also like to be able to work with images on the page, either using the browser's "screenshot" function or by selecting a file.
And in general, the more AI the better. You could add the ability to recognize text in images, speech recognition, auto-translation, subtitle creation, etc. The main thing is that it can all run locally on your PC, and that the models are not loaded into memory until the user wants them.