ZDNET key takeaways
- If you want to use an agentic browser, consider local AI.
- Local AI places less of a strain on the electricity grid.
- The approach keeps your queries on your local device.
Agentic browsers are storming the castle gates, and it looks like things are heating up for another browser war; only this time with 'smarter' tools.
From my perspective, that battle is going to cause a major problem. Imagine everyone around the globe using agentic web browsers. These agentic tasks can consume serious power, which could mean skyrocketing electricity prices and a profoundly negative impact on the climate.
There is a solution to this problem: local AI.
Also: Opera agentic browser Neon begins rolling out to users – how to join the waitlist
On the rare occasions that I need to use AI, I always do so at the local level, specifically using Ollama.
Unfortunately, all but one of the agentic browsers on the market use cloud-based AI. For me, that approach makes those agentic browsers a no-go. Not only do I dislike the idea of placing extra strain on the electrical grid, but I would much prefer to keep all of my queries local so a third party can't use them for training or profiling.
I've found two agentic browsers that can work with local AI: BrowserOS and Opera Neon. Unfortunately, only one of those is currently available to the public: BrowserOS.
BrowserOS is available for Linux, MacOS, and Windows. To use it with locally installed AI, you must have Ollama installed and have downloaded a model that supports agentic browsing, such as qwen2.5:7b.
Also: I've been testing the top AI browsers – here's which ones actually impressed me
I've been testing BrowserOS and have found it to be a solid entry in the agentic browser market. In fact, I've found that it can stand toe-to-toe with browsers that rely on cloud-based AI, without the negative impacts or privacy issues.
Once I had BrowserOS set up to work with Ollama (more on that in a bit), I opened the agentic assistant and ran the following query: Open amazon.com and search for a wireless charging stand that supports a Pixel 9 Pro.
It took a while to get everything set up properly so that BrowserOS's agentic tool could work, but once I had it functioning, it performed flawlessly.
I'll warn you: using BrowserOS this way requires quite a bit of system resources, so if your computer is underpowered, it may struggle to perform.
Also: The top 20 AI tools of 2025 – and the #1 thing to remember when you use them
According to the Ollama website, the minimum RAM requirements for running local AI are:
- Minimum (8GB): This is the absolute minimum requirement to get started and will allow you to run smaller models, typically in the 3B to 7B parameter range.
- Recommended (16-32GB): For a smoother experience and the ability to run more capable 13B models, 16GB of RAM is recommended. To comfortably handle 30B+ models, you should aim for at least 32GB of RAM.
- Large models (64GB+): To run the largest and most powerful models, such as 70B parameter variants, you'll need 64GB of RAM or more.
From experience, the minimum RAM won't cut it. My System76 Thelio desktop has 32GB of RAM, and that setup worked well enough for my purposes. If you want to use a larger LLM (or you want more speed for your agentic use case), I would go with 64GB+. Even at 32GB, agentic tasks can be slow, especially when other apps and services are running.
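If you're on Linux and aren't sure how much RAM your machine has, you can check before pulling a model. A quick sketch (the thresholds above are Ollama's guidelines, not hard limits):

```shell
# Print human-readable memory totals (total, used, available)
free -h

# Total RAM in GB, read from /proc/meminfo (Linux only)
awk '/MemTotal/ {printf "Total RAM: %.0f GB\n", $2/1024/1024}' /proc/meminfo
```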
Also: Are AI browsers worth the security risk? Why experts are worried
With sufficient resources, BrowserOS will complete your agentic tasks.
But how do you get there? Let me show you.
I'll assume you already have BrowserOS installed on your platform of choice.
Installing Ollama
Because Ollama can easily be installed on both MacOS and Windows by downloading the binary installer from the Ollama download page, I'll show you how to install it on Linux.
The first thing to do is open a terminal app.
Next, run the command to install Ollama on Linux, which is:
curl -fsSL https://ollama.com/install.sh | sh
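If you want to confirm the install succeeded before moving on, the Ollama CLI can report its version, and on Linux the install script registers a systemd service you can check (assuming a systemd-based distribution):

```shell
# Print the installed Ollama version
ollama --version

# On Linux, the install script sets up a systemd service; verify it's active
systemctl is-active ollama
```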
Once that installation finishes, you can then download a model that supports agentic browsing. We'll go with qwen2.5:7b. To pull that model, issue the command:
ollama pull qwen2.5:7b
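To verify the pull before wiring up BrowserOS, you can list your local models and send the model a one-off prompt (the prompt text here is just an arbitrary smoke test):

```shell
# Confirm qwen2.5:7b appears among the locally installed models
ollama list

# Optional smoke test: run a single prompt against the model
ollama run qwen2.5:7b "Say hello in five words or fewer."
```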
After the model pull is complete, it's time to configure BrowserOS to use it.
Configure BrowserOS
Let's configure BrowserOS to use Ollama.
1. Open BrowserOS AI settings
Open the BrowserOS app and then point it to:
chrome://settings/browseros-ai
2. Select Ollama
In the resulting window, click the Use button associated with Ollama. Once you've completed that step, you'll need to configure Ollama as follows:
- Provider Type – Ollama
- Provider Name – Ollama Qwen
- Base URL – Leave as is, unless you are running Ollama on a different server within your LAN, in which case replace 127.0.0.1 with the IP address of the hosting server
- Model ID – qwen2.5:7b
- Context Window Size – 12800 (but only if you have a powerful system; otherwise, go with a smaller number)
- Temperature – 1
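Before saving, it can help to confirm that BrowserOS will actually be able to reach Ollama at that Base URL. Ollama listens on port 11434 by default, and its /api/tags endpoint lists the installed models; a quick check (assuming Ollama is running on the same machine):

```shell
# Ask the local Ollama API which models it has; qwen2.5:7b should be listed
curl -s http://127.0.0.1:11434/api/tags
```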
Also: I tested all of Edge's new AI browser features – and it felt like having a personal assistant
Make sure to select the correct model and bump up the Context Window Size.
Screenshot by Jack Wallen/ZDNET
Make sure to then set the new provider as the default.
3. Stop and start the Ollama service
To make BrowserOS work with Ollama, you first have to stop the Ollama service. To do that on Linux, the command would be:
sudo systemctl stop ollama
Once the service is stopped, you have to start it with CORS (Cross-Origin Resource Sharing) enabled. To do that on Linux and MacOS, run the command:
OLLAMA_ORIGINS="*" ollama serve
To do this step with Windows PowerShell, the command would be:
$env:OLLAMA_ORIGINS="*"; ollama serve
To do this from the Windows command line, run the following command:
set OLLAMA_ORIGINS=* && ollama serve
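One caveat with this approach: running ollama serve in a terminal means the service stops when you close that terminal. On Linux with systemd, one way to make the CORS setting persistent is a service override; this is a sketch of that approach, not a required step:

```shell
# Create a systemd override for the ollama service (non-interactive form).
# It adds OLLAMA_ORIGINS=* to the service environment so CORS stays enabled
# across restarts, without a foreground "ollama serve" session.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf > /dev/null <<'EOF'
[Service]
Environment="OLLAMA_ORIGINS=*"
EOF

# Reload systemd and restart the service with the new environment
sudo systemctl daemon-reload
sudo systemctl restart ollama
```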
Using the new provider
At this point, you can open the Agent side panel in BrowserOS and run your agentic query, such as the one I suggested above: Open amazon.com and search for a wireless charging stand that supports a Pixel 9 Pro.
Also: I let ChatGPT Atlas do my Walmart shopping for me – here's how the AI browser agent did
BrowserOS will do its thing and eventually open a tab with the search criteria above.
BrowserOS in action, using Ollama to find me a new charging stand for my Pixel 9 Pro.
Screenshot by Jack Wallen/ZDNET
Working with an agentic browser does take some getting used to, but once you're accustomed to the practice, you'll find it can be quite helpful. And by using a locally installed service, you can feel a bit less guilty about using AI.

