Database Connections

Connect databases for AI data analysis

Database connectors are available in AI Data Analyst (conversational notebooks). In the prompt box, click DB Connector to open the connection dialog, configure your engine, test the connection, and save.

After saving a connection, MLJAR Studio loads table metadata to build context for the LLM. You can decide which tables are available to the AI, then confirm the save and start querying your database from the notebook.

Supported databases and platforms

Required fields by engine

| Engine | Required fields | Notes |
| --- | --- | --- |
| PostgreSQL | host, port, database, schema, user, password | Python driver installed automatically when missing. |
| Supabase | host, port, database, schema, user, password | Uses the PostgreSQL-compatible connection flow. |
| MySQL | host, port, database, user, password | Python driver installed automatically when missing. |
| MS SQL Server | host, port, database, schema, user, password | Requires an OS-level ODBC runtime/driver in addition to Python packages (Linux/macOS: unixODBC + msodbcsql18; Windows: ODBC Driver 18 for SQL Server). |
| Snowflake | account, database, schema, user, password, warehouse | Cloud warehouse connection with account-level auth. |
| Databricks | host, http path, access token, catalog, schema | Requires a valid workspace SQL endpoint and access token. |
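As an illustration of how the PostgreSQL fields above fit together, the sketch below assembles them into a standard libpq keyword/value connection string. This is not MLJAR Studio code; all values are placeholders, and the `schema` field is represented here via the `search_path` option, which is the usual libpq mechanism.

```python
# Placeholder values for the PostgreSQL fields listed in the table above.
# In MLJAR Studio you enter these in the DB Connector dialog instead.
fields = {
    "host": "localhost",
    "port": 5432,
    "dbname": "analytics",              # "database" in the dialog
    "user": "readonly_user",
    "password": "secret",
    "options": "-csearch_path=public",  # "schema" maps to the search_path
}

# libpq-style keyword/value DSN, accepted by common Python drivers.
dsn = " ".join(f"{key}={value}" for key, value in fields.items())
print(dsn)
```

The same set of fields (minus `schema`/`options`) applies to MySQL, while Snowflake and Databricks use their own account- or token-based parameters as shown in the table.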

DB Connector flow (conversational notebook)

  1. Open AI Data Analyst notebook.
  2. Click DB Connector in the prompt box.
  3. Choose engine and fill required fields.
  4. Click Test connection. MLJAR Studio validates the connection details.
  5. If Python database drivers are missing, MLJAR Studio offers to install them and performs the installation.
  6. When connection test succeeds, Save connection is enabled.
  7. Click Save connection. MLJAR Studio scans available tables to build LLM context.
  8. Select which tables should be visible to the LLM.
  9. Click final Save to confirm table context.
  10. The active connection appears in the prompt box, and the selected tables appear in the Data Awareness panel in the left sidebar.
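The metadata scan in step 7 can be pictured as collecting table and column information for the LLM. The sketch below shows the idea using an in-memory SQLite database so it runs anywhere; MLJAR Studio performs the equivalent scan against your configured engine, and the exact context format is an internal detail.

```python
import sqlite3

# In-memory stand-in for your connected database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL, created_at TEXT);
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
""")

# List the available tables (step 7: scan tables to build LLM context).
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]

# For each table, gather column names and types -- the kind of context
# that lets the LLM generate correct SQL against your schema.
context = {
    table: [(col[1], col[2]) for col in conn.execute(f"PRAGMA table_info({table})")]
    for table in tables
}
print(context)
```

Limiting the visible tables in steps 8-9 simply restricts which entries of this context are passed to the model.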

SQL blocks in conversational notebook

In AI Data Analyst, SQL blocks are compact by default. Users can expand a block to inspect the full query.

Collapsed SQL code block in MLJAR Studio conversational notebook

Expanded SQL block view:

Expanded SQL code block in MLJAR Studio conversational notebook

Related docs

For conversational analysis after SQL queries, see AI Data Analyst. For model provider setup, see LLM Providers.
