LLM Setup Troubleshooting
This page lists common setup problems for OpenAI, Ollama local models, and Ollama-compatible remote endpoints in MLJAR Studio.
Provider cannot be saved
You need to click Test connection before saving a provider. MLJAR Studio tests whether the provider can be reached and whether the selected model is available. The Save provider button becomes active only after the test succeeds.
After you click Save provider, you should see a success toast, and the active provider appears as a chip at the top of the sidebar with a green dot.
OpenAI API key error
If OpenAI authentication fails, check the following:
- The API key was copied completely.
- The API key was not revoked in the OpenAI dashboard.
- The OpenAI account has billing enabled if required.
- The selected model is available to your account.
- There are no extra spaces before or after the key.
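Two of the checks above, an incompletely copied key and stray whitespace, can be caught before you even click Test connection. A minimal local format check, assuming the usual sk- key prefix; it cannot verify the key against OpenAI's servers, so Test connection is still the authoritative check:

```python
def check_key_format(key):
    """Return a list of obvious formatting problems in an API key.

    This only catches copy-paste issues; it cannot tell whether the
    key is valid or revoked (use Test connection for that).
    """
    problems = []
    if key != key.strip():
        problems.append("key has leading or trailing whitespace")
    if "\n" in key or " " in key.strip():
        problems.append("key contains internal whitespace (incomplete copy?)")
    if not key.strip().startswith("sk-"):
        problems.append("key does not start with the usual 'sk-' prefix")
    return problems

print(check_key_format("  sk-abc123  "))  # flags the surrounding whitespace
print(check_key_format("sk-abc123"))      # no problems found
```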
OpenAI invalid model error
An invalid model error usually means the model name is misspelled or your account does not have access to that model. For example, gpt-5.4 works only if that model is enabled for your account. Try a model name you know is enabled, then click Test connection again.
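When the name is merely misspelled, a fuzzy match against the models enabled for your account often points straight at the fix. A sketch using Python's standard difflib; the available list below is illustrative, in practice you would fetch it from the provider (OpenAI exposes a model list endpoint):

```python
import difflib

# Hypothetical list of models enabled for an account; replace with the
# real list from your provider's model listing.
available = ["gpt-5.4", "gpt-5.4-mini", "o4-mini"]

def suggest_model(requested, models):
    """Return the closest enabled model name, or None if nothing is close."""
    matches = difflib.get_close_matches(requested, models, n=1)
    return matches[0] if matches else None

print(suggest_model("gpt5.4", available))  # a missing hyphen is a common typo
```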
Ollama connection refused
This usually means MLJAR Studio cannot reach the Ollama server. Check that Ollama is installed and running.
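You can also probe the endpoint directly to distinguish "server not running" from other failures. A minimal sketch using only Python's standard library; 11434 is Ollama's default port:

```python
import socket

def can_reach(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# False here matches the "connection refused" symptom: nothing is
# listening on the default Ollama port.
print(can_reach("localhost", 11434))
```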
ollama --version
ollama list
Also check that the endpoint is correct. The default local endpoint is usually:
http://localhost:11434
Ollama model not found
If the model is missing, download it first. The model name in MLJAR Studio should match the model available in Ollama.
ollama pull qwen-3.5:27b
ollama pull gemma4:31b
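To compare names programmatically, you can parse the NAME column of the ollama list output. A sketch with illustrative sample output (run ollama list yourself for the real table; the model names and IDs below are placeholders):

```python
def installed_models(ollama_list_output):
    """Parse the NAME column from `ollama list` table output."""
    lines = ollama_list_output.strip().splitlines()
    # Skip the header row; the first whitespace-separated field is the name.
    return {line.split()[0] for line in lines[1:] if line.split()}

# Illustrative output only; the real table comes from `ollama list`.
sample = """NAME            ID            SIZE    MODIFIED
qwen-3.5:27b    a1b2c3d4e5f6  17 GB   2 days ago
gemma4:31b      0f9e8d7c6b5a  19 GB   5 days ago
"""

# The name configured in MLJAR Studio must appear in this set.
print("qwen-3.5:27b" in installed_models(sample))
```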
ollama list
Local model is too slow
Local inference speed depends on your hardware and model size. If the model is too slow, try a smaller model, close other heavy applications, or use a remote provider such as OpenAI or an Ollama-compatible cloud endpoint.
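A quick way to judge whether a model fits your hardware is a back-of-the-envelope weight-memory estimate. A rough sketch that deliberately ignores KV cache and runtime overhead (a simplifying assumption), so treat the result as a lower bound:

```python
def approx_weight_gb(params_billions, bits_per_weight):
    """Rough weight memory: parameter count times bits per weight.

    Ignores KV cache and runtime overhead, so the real requirement
    is higher than this lower bound.
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 27B model at 4-bit quantization needs roughly 13.5 GB for weights alone.
print(round(approx_weight_gb(27, 4), 1))
```

If that number is close to or above your available RAM/VRAM, a smaller or more aggressively quantized model will likely run much faster.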
Ollama Cloud does not connect
- Check that the Ollama API key is correct.
- Check that the model name is correct, for example qwen3.5:397b or gemma4:31b.
- Confirm that the model is available in your Ollama Cloud account.
- Click Test connection again before saving the provider.
- Ask your IT or infrastructure team whether a proxy, firewall, or VPN rule is blocking access.
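For the proxy question, a quick local check is to list the proxy environment variables most HTTP clients honor. A minimal sketch; the variable names are common OS conventions, not MLJAR Studio settings:

```python
import os

# Proxy settings commonly honored by HTTP clients; if any are set,
# ask IT whether they allow traffic to the Ollama Cloud endpoint.
PROXY_VARS = ["HTTP_PROXY", "HTTPS_PROXY", "NO_PROXY",
              "http_proxy", "https_proxy", "no_proxy"]

def active_proxies():
    """Return the proxy-related environment variables currently set."""
    return {v: os.environ[v] for v in PROXY_VARS if v in os.environ}

print(active_proxies())  # empty dict means no proxy is configured via env vars
```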
Still not working?
Review the provider setup pages again: OpenAI Integration, Ollama Local Setup, and Ollama Cloud Setup.