
OpenWebUIAI

This notebook provides a quick overview for getting started with Open WebUI chat models.

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution. Usage instructions are available in the Open WebUI documentation.

Overview

Integration details

| Class | Package | Local | Serializable | JS support | Package downloads | Package latest |
| :--- | :--- | :---: | :---: | :---: | :---: | :---: |
| OpenWebUIAI | langchain_community | ❌ | ✅ | ✅ | PyPI - Downloads | PyPI - Version |

Model features

| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |

Setup

To access OpenWebUIAI models you'll need to create an Open WebUI account and get an API key.

Credentials

Head to https://docs.openwebui.com/ to sign up for Open WebUI and generate an API key. Once you've done this, set the OPENWEBUI_API_KEY environment variable (the instantiation step below also reads OPENWEBUI_API_BASE, the URL of your Open WebUI server):
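For example, a minimal non-interactive sketch — the placeholder key and the base URL `http://localhost:3000/api` are assumptions for a default local install; substitute your own values:

```python
import os

# Placeholder values -- substitute your real API key and server URL.
# http://localhost:3000/api is an assumed default for a local install.
os.environ.setdefault("OPENWEBUI_API_KEY", "your-api-key")
os.environ.setdefault("OPENWEBUI_API_BASE", "http://localhost:3000/api")
```

`setdefault` leaves the variables untouched if they are already set in your shell, so exported credentials take precedence over the placeholders.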

Installation

The LangChain Open WebUI integration lives in the langchain_community package:

%pip install -qU langchain_community openwebui-api
Note: you may need to restart the kernel to use updated packages.

Instantiation

import os

openwebui_api_key = os.environ["OPENWEBUI_API_KEY"]
openwebui_api_base = os.environ["OPENWEBUI_API_BASE"]

from langchain_community.chat_models import OpenWebUIAI


llm = OpenWebUIAI(
    temperature=0.1,
    api_key=openwebui_api_key,
    api_base=openwebui_api_base,
    model="gemini-2.0-pro-exp-02-05",
)
API Reference: OpenWebUIAI

Invocation

from langchain_core.messages import HumanMessage


response = llm.invoke(input=[HumanMessage(content="Hello")])
response
API Reference: HumanMessage

Chaining

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough


prompt = ChatPromptTemplate.from_template(
    "Please answer the question below: {question}"
)
chain = (
    {"question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
chain.invoke("Translate 'I love programming' into German.")

' Ich liebe Programmieren.\n\n'

API reference

https://docs.openwebui.com/getting-started/api-endpoints

