Ollama

Generate text using Llama models

Use the Ollama generate function in Python. Build a query, set the model and the prompt, and print the response in either streaming or standard mode.

Required packages

You need the packages below to use the code generated by this recipe. All packages are automatically installed in MLJAR Studio.

ollama>=0.3.3

Interactive recipe

You can use the interactive recipe below to generate code. This recipe is available in MLJAR Studio.

Python code

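Below is a minimal sketch of the kind of code the recipe produces, assuming the Ollama server is running locally and the model has already been pulled. The model name and prompt are illustrative assumptions, not part of the recipe:

import ollama

# Assumed model name; any model pulled with `ollama pull` works.
model = "llama3.2"

# Formulate the query you would like the model to answer.
prompt = "Explain what a Python list comprehension is in one paragraph."

# Standard mode: the full response is returned at once.
response = ollama.generate(model=model, prompt=prompt)
print(response["response"])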

Code explanation

  1. Formulate a query for what you would like to generate.
  2. Print the response in the mode you have chosen (streaming or standard), as shown in the sketch below.
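For streaming mode, a sketch along the same lines (again assuming the llama3.2 model is available locally):

import ollama

# Streaming mode: chunks arrive as the model produces them.
stream = ollama.generate(
    model="llama3.2",  # assumed model name
    prompt="Write a haiku about open-source software.",
    stream=True,
)

for chunk in stream:
    # Each chunk carries a partial piece of the generated text.
    print(chunk["response"], end="", flush=True)
print()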

Ollama cookbook

Code recipes from the Ollama cookbook.
