Set the environment variable for Ollama
Open the Command Prompt as administrator and run the following command:
This will set the OLLAMA_HOST system environment variable so that Ollama listens on all network interfaces instead of only on localhost.
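A minimal sketch of such a command, assuming the standard OLLAMA_HOST variable documented by Ollama (the /M flag writes the variable system-wide, which is why the prompt must be elevated; restart Ollama afterwards so it picks up the change):

```shell
:: Make Ollama listen on all network interfaces, not just localhost
setx OLLAMA_HOST "0.0.0.0" /M
```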
To change the location of the models
By default, the models are stored in C:\Users\[your_username]\.ollama\models.
You can move them elsewhere: first stop Ollama, then create a new folder, for example D:\Ollama\models.
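One way to complete the move, sketched here with Windows' built-in robocopy and Ollama's documented OLLAMA_MODELS variable (run in an elevated Command Prompt with Ollama stopped; the D:\Ollama\models path is just the example folder from above):

```shell
:: Move the existing model files to the new folder
robocopy "%USERPROFILE%\.ollama\models" "D:\Ollama\models" /E /MOVE

:: Point Ollama at the new location (system-wide)
setx OLLAMA_MODELS "D:\Ollama\models" /M
```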
Tailscale
Tailscale creates a virtual private network between all your devices, allowing you to access them as if they were on the same local network, even when you're on the move.
On Windows
- Download Tailscale from the official website: https://tailscale.com/download
- Install and connect to your Tailscale account
- Make a note of the Tailscale IP address assigned to your Windows PC (usually something like 100.x.y.z)
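If you prefer the command line, the tailscale CLI installed with the Windows client can print this address directly:

```shell
tailscale ip -4
```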
On your smartphone
- Download the application from the Play Store
- Log in with the same account as on your PC
Chatbox AI
Chatbox AI is a desktop and mobile application that lets you chat with AI models in the cloud or, as in this case, with Ollama running locally.
Configure Chatbox AI on Windows:
- Open Chatbox AI
- In the settings, select ‘OLLAMA’ as the model provider
- Enter the URL: http://localhost:11434 or http://127.0.0.1:11434
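To confirm the URL works before configuring Chatbox AI, you can query the endpoint from the same PC; a running Ollama instance answers its root URL with a short status message:

```shell
curl http://localhost:11434
```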
Configure Chatbox AI on your smartphone:
- Open Chatbox AI
- In the settings, select ‘OLLAMA’ as the model provider
- Enter the URL using the Tailscale IP address of your Windows PC: http://100.x.y.z:11434 (replace 100.x.y.z with your actual Tailscale IP address)
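You can also verify the connection from any other device on your tailnet that has curl. This sketch includes a sample request against Ollama's documented /api/generate endpoint; the model name "llama3" is only an example (use one you have actually pulled), and 100.x.y.z must be replaced with your real Tailscale IP as above:

```shell
# Basic reachability check over Tailscale
curl http://100.x.y.z:11434

# Sample generation request (assumes a model named "llama3" is installed)
curl http://100.x.y.z:11434/api/generate -d '{"model": "llama3", "prompt": "Hello"}'
```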