ZDNET's key takeaways:
- Reins is a free GUI frontend for Ollama.
- The app only works on macOS.
- Reins includes features not found in other GUIs.

When I need to use AI, my first choice is always local, for privacy reasons, and most of the time that means Ollama. I can use Ollama's command-line tools just fine, but certain features in various GUIs enhance and simplify the experience. Although Ollama has its own well-designed and user-friendly GUI, it's a bit bare-bones for me.

Also: 5 reasons I use local AI on my desktop – instead of ChatGPT, Gemini, or Claude

Another option is Reins, which gives researchers and hobbyists total control over self-hosted models. Reins has plenty of features, including remote model access, per-chat system prompts, prompt editing and regeneration, image integration, advanced configuration, model selection, model creation from prompts, multiple chat management, dynamic model switching, and real-time message streaming.

Also: I tried a Claude Code rival that's local, open source, and completely free – how it went

All of those features come together to make Reins my new go-to for working with local LLMs in Ollama. Let's install Reins and see how some of the more important features work.

Installing Reins

What you'll need: Reins is a free, macOS-only app, so if you're using Linux or Windows, you'll have to turn to another GUI (such as the official Ollama app or Alpaca). To use Reins, you'll need Ollama installed (download the installer from the official site) and, of course, a Mac. You can connect Reins to an Ollama instance on another machine on your LAN, or you can install Ollama on the same machine that will host Reins; there's a quick connectivity check after the installation steps below if you go the remote route. Reins is available in the Apple App Store, so open that app from either Launchpad or your Dock.
Search for Reins from within the App Store.
To install Reins, click the Get button and allow the installation to start and finish.
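If you plan to point Reins at an Ollama instance on another machine on your LAN rather than on the Mac itself, it's worth confirming the server is reachable before blaming the app. The following is a minimal Python sketch, assuming Ollama's default port of 11434 and its standard /api/tags model-listing endpoint; the hostname is a placeholder for your own server's address. Note that on the remote machine, Ollama typically has to be configured to listen beyond localhost (for example, via the OLLAMA_HOST environment variable) before other devices can reach it.

```python
# Minimal reachability check for an Ollama server (local or on your LAN).
# Assumptions: default port 11434, the standard /api/tags endpoint, and a
# placeholder hostname -- swap in your own server's address or "localhost".
import requests

OLLAMA_URL = "http://ollama-server.local:11434"  # hypothetical LAN host

try:
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = resp.json().get("models", [])
    print(f"Ollama is reachable; {len(models)} model(s) installed:")
    for model in models:
        print(f"  - {model.get('name')}")
except requests.RequestException as exc:
    print(f"Could not reach Ollama at {OLLAMA_URL}: {exc}")
```

If that prints a list of models, Reins should have no trouble connecting to the same address.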
That's it for the installation.

Using Reins

Reins is very simple to use. Once you open the app, you'll see that it has connected to Ollama.

Also: I stopped using ChatGPT for everything: These AI models beat it at research, coding, and more

When you type your first prompt, Reins asks you to select a model. Once you've selected a model, you can submit the prompt and Reins will do its thing:

The Reins GUI is very easy to use. Screenshot by Jack Wallen/ZDNET

One of the coolest features of Reins is that you can switch models on the fly. To do that, click the model name at the top of the window (under the Reins title) and select the model you want to use. While Ollama is working on one prompt, you can select a previous prompt, switch models, and continue working. You don't get that kind of flexibility with some Ollama GUIs:

Switch models on the fly by clicking the name drop-down under Reins. Screenshot by Jack Wallen/ZDNET

You can even attach files to your queries by clicking the + directly to the left of the prompt field. The only caveat is that this feature is limited to image uploads.

Also: 5 ways you can stop testing AI and start scaling it responsibly in 2026

If you have a text document, you can copy and paste it into the query field. When you do, Reins will use whatever model you've selected to summarize the material.

One helpful feature (that doesn't work… yet)

There is one feature that had me excited about Reins, even though it's currently failing me.

Let's say I've spent quite a while on a single chat, asking follow-up questions, adding research and images, and more. I might spend a week on that research. When I feel it's complete, I can save it as a model for later queries. To save a prompt as a model, select the prompt in the sidebar, then click the Settings icon in the top-right corner of the Reins window. In the resulting pop-up, click "Save as a new model," which opens another pop-up that asks for a name:

Hopefully, the Save as model feature will work in future releases. Screenshot by Jack Wallen/ZDNET

I've been trying to get this feature to work, but it keeps failing. I even jokingly asked Reins why the feature wasn't working and, to my surprise, it offered some genuinely helpful advice. Sadly, none of the suggestions worked.

Also: 5 ways rules and regulations can help guide your AI innovation

I reached out to the developer to see if there was a solution. He responded, "I will investigate it, and also I am going to release a new update to improve errors to show the exact error message soon. I'll let you know when I fix it."

That's reassuring. Despite that one (temporarily) broken feature, I've found Reins to be an outstanding frontend for Ollama. I'm also a fan of the terminal version of Ollama, but I like a good GUI, and if you use macOS, I'd suggest you give this one a try.
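Under the hood: two rough Ollama sketches

Reins doesn't document how it talks to Ollama, but the features above (per-chat system prompts, real-time streaming, and dynamic model switching) all map neatly onto Ollama's standard /api/chat endpoint. The Python sketch below shows how any frontend could wire those up under that assumption; it is not a description of Reins' actual code, and the model names and system prompt are placeholders for whatever you have pulled locally.

```python
# A hedged sketch of streaming a chat reply from Ollama's /api/chat endpoint.
# This is how a frontend *could* implement per-chat system prompts, streaming,
# and on-the-fly model switching -- not necessarily how Reins does it.
import json
import requests

OLLAMA_URL = "http://localhost:11434"  # or the LAN host from the earlier sketch

def stream_chat(model: str, system_prompt: str, user_prompt: str) -> str:
    """Send one chat turn to Ollama and print tokens as they stream in."""
    payload = {
        "model": model,
        "stream": True,
        "messages": [
            {"role": "system", "content": system_prompt},  # per-chat system prompt
            {"role": "user", "content": user_prompt},
        ],
    }
    reply = []
    with requests.post(f"{OLLAMA_URL}/api/chat", json=payload, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)  # one JSON object per streamed line
            piece = chunk.get("message", {}).get("content", "")
            print(piece, end="", flush=True)
            reply.append(piece)
            if chunk.get("done"):
                break
    print()
    return "".join(reply)

# "Switching models on the fly" is just a different model name on the next call.
# Both model names below are placeholders for models you have pulled locally.
stream_chat("llama3.2", "You are a concise research assistant.", "Summarize what Ollama is.")
stream_chat("mistral", "You are a concise research assistant.", "Now answer the same question.")
```

Because each request names its own model, switching models mid-conversation is nothing more exotic than sending the next request with a different model field.

As for the currently broken "Save as a new model" button, Ollama itself already offers a way to get a similar result from the terminal while we wait for the fix: write a Modelfile that bakes a system prompt into a new named model and run ollama create. The sketch below assumes the standard Modelfile syntax (FROM plus SYSTEM) and that the ollama binary is on your PATH; it's a workaround under those assumptions, not a claim about what Reins does when the feature works.

```python
# A hedged workaround for "save this chat's behavior as a model": write a
# Modelfile that bakes a system prompt into a new named model, then call the
# ollama CLI. Assumes the ollama binary is on your PATH; names are placeholders.
import subprocess
import tempfile
from pathlib import Path

BASE_MODEL = "llama3.2"                # placeholder base model
NEW_MODEL = "my-research-assistant"    # placeholder name for the saved model
SYSTEM_PROMPT = "You are an assistant specialized in my week-long research project."

modelfile = f'FROM {BASE_MODEL}\nSYSTEM """{SYSTEM_PROMPT}"""\n'

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "Modelfile"
    path.write_text(modelfile)
    # Equivalent to running: ollama create my-research-assistant -f Modelfile
    subprocess.run(["ollama", "create", NEW_MODEL, "-f", str(path)], check=True)

print(f"Created {NEW_MODEL}; run it with: ollama run {NEW_MODEL}")
```

Once the new model exists in Ollama, it should show up in Reins' model picker alongside everything else, so you can keep doing the actual chatting in the GUI.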

















