OllamaReader¶
Overview¶
The OllamaReader class is designed to interface with hosted Ollama models to extract relationships between entities in a given text. It extends the BaseLLMReader class and uses the requests library to communicate with the API.
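Under the hood, a reader like this typically sends a POST request to Ollama's /api/generate endpoint. The sketch below builds such a request payload; the endpoint and field names follow the public Ollama REST API, but the exact payload OllamaReader constructs internally is an assumption:

```python
# Sketch of the JSON body a reader would POST to {base_url}/api/generate.
# Endpoint and field names follow the public Ollama REST API; how
# OllamaReader builds its payload internally is an assumption.

def build_generate_payload(model: str, prompt: str,
                           temperature: float = 0.75,
                           stream: bool = True) -> dict:
    """Build the request body for POST {base_url}/api/generate."""
    return {
        "model": model,                        # e.g. "qwen2:0.5b"
        "prompt": prompt,                      # the text to process
        "stream": stream,                      # newline-delimited JSON chunks
        "options": {"temperature": temperature},
    }

payload = build_generate_payload("qwen2:0.5b", "Extract relationships: ...", 0.7)
# With requests installed, the call would look like:
# requests.post("http://localhost:11434/api/generate", json=payload, stream=True)
```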
Installation¶
- Install the requests library: pip install requests
Example Usage¶
Here's an example demonstrating how to use the OllamaReader class:
from euler.llm_readers import OllamaReader
# Initialize the OllamaReader
ollama_reader = OllamaReader(
    base_url="http://localhost:11434",
    model="qwen2:0.5b",
    temprature=0.7
)
# Read text and extract relationships
text = """Thomas Alva Edison was an American inventor and businessman. He developed many devices in fields such as electric power generation, mass communication, sound recording, and motion pictures."""
response = ollama_reader.read(text)
print(response.text if response else "No response received.")
Using LLMReaderFactory¶
The LLMReaderFactory class can be used to easily initialize different LLM readers, including OllamaReader.
from euler.llm_readers import LLMReaderFactory
# Initialize the OllamaReader using LLMReaderFactory
ollama_reader = LLMReaderFactory.get_llm_reader(
    reader_type='ollama',
    base_url="http://localhost:11434",
    model="qwen2:0.5b",
    temprature=0.7
)
# Read text and extract relationships
text = """Thomas Alva Edison was an American inventor and businessman. He developed many devices in fields such as electric power generation, mass communication, sound recording, and motion pictures."""
response = ollama_reader.read(text)
print(response.text if response else "No response received.")
Classes and Methods¶
OllamaConfig¶
The OllamaConfig class is a configuration model that holds the API base URL, model name, temperature, context window, and an optional default prompt.
Attributes:
- base_url (str): The base URL for the hosted model.
- model (str): The hosted Ollama model to use.
- temprature (float): The sampling temperature.
- context_window (int): The maximum number of tokens in the context window.
- default_prompt (Optional[str]): A default prompt to use if no custom prompt is provided.
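As a rough stand-in, the configuration can be pictured as a dataclass with the fields listed above. The field names mirror the documented attributes (including the euler API's spelling of temprature), but the defaults shown are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in mirroring the documented OllamaConfig fields; the defaults
# shown here are assumptions, not euler's actual values.
@dataclass
class OllamaConfigSketch:
    base_url: str
    model: str
    temprature: float = 0.75        # spelled as in the euler API
    context_window: int = 2048      # assumed default
    default_prompt: Optional[str] = None

cfg = OllamaConfigSketch(base_url="http://localhost:11434", model="qwen2:0.5b")
```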
OllamaResponse¶
The OllamaResponse class represents the response from the Ollama API.
Attributes:
- text (str): The response text generated by the Ollama model.
OllamaReader¶
The OllamaReader class interfaces with the hosted Ollama models to extract relationships between entities in a given text.
Methods:
- __init__(self, base_url: str, model: str, temprature: float = 0.75, default_prompt: Optional[str] = None): Initializes the reader with the base URL, model, temperature, and optional default prompt.
- read(self, text: str, custom_prompt: Optional[str] = None) -> Optional[OllamaResponse]: Reads the text and returns the extracted relationships.
- stream_complete(self, prompt: str, **kwargs: Any) -> Optional[OllamaResponse]: Sends a request to the Ollama API and processes the streamed response.
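When streaming, the Ollama API replies with newline-delimited JSON chunks, each carrying a "response" fragment, ending with a chunk whose "done" field is true; a method like stream_complete would concatenate the fragments into the final text. The chunk format below follows the public Ollama API, but the method's internals are an assumption:

```python
import json

def join_stream_chunks(lines):
    """Concatenate the "response" fragments from a streamed
    /api/generate reply, stopping at the chunk marked done."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Example chunks in the shape the Ollama API streams them:
raw = [
    '{"response": "Edison ", "done": false}',
    '{"response": "invented...", "done": true}',
]
text = join_stream_chunks(raw)  # "Edison invented..."
```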
This documentation provides an overview, example usage, and a detailed description of the OllamaReader class and its associated classes and methods.