LLM Providers in MLJAR Studio
MLJAR Studio can work with different LLM providers. You can use the built-in MLJAR AI provider, connect your own OpenAI API key, or run local models with Ollama. This gives you a flexible setup for AI-assisted data analysis, notebooks, and machine learning workflows.
The main decision is simple: use MLJAR AI for the fastest start, OpenAI for cloud model quality, and Ollama when local execution and data privacy are more important than convenience.
Supported LLM providers
MLJAR AI
Use the built-in provider with no API key or local model setup.
OpenAI
Connect your own OpenAI API key and use cloud-hosted GPT models.
Ollama Local
Run local LLMs on your own computer with Ollama.
Ollama Cloud
Use Ollama Cloud with an API key and model name when local hardware is not enough.
Quick decision table
| Provider | Best for | Setup | Privacy profile |
|---|---|---|---|
| MLJAR AI | Quick start and default workflows | No setup, default provider | Managed provider |
| OpenAI | High-quality cloud models | API key required | Cloud API provider |
| Ollama Local | Private local LLM workflows | Install Ollama and a model | Runs on your machine |
| Ollama Cloud | Remote model access with an Ollama-compatible API | API key and model name | Depends on your endpoint |
How provider setup works
All configurable providers follow the same save flow. First, fill in the required fields for the provider. Then click Test connection. MLJAR Studio checks whether the provider can be reached and whether the selected model is available.
If the test succeeds, the Save provider button becomes active. After you click Save provider, MLJAR Studio shows a success toast. The active provider name is then shown in the top chip in the sidebar with a green dot, which indicates that the connection is OK.
How to choose
- Choose MLJAR AI if you want to start immediately. It is the default provider and does not require setup.
- Choose OpenAI if you already use OpenAI and want to connect your own API key and a model of your choice.
- Choose Ollama Local if you have a model downloaded and running locally.
- Choose Ollama Cloud if you want a remote Ollama provider and have an Ollama API key and model name.
Related pages
If you are not sure which setup is right for you, start with Local vs Cloud LLMs. If you already want local models, go directly to Ollama Local Setup.