Example Python service located at ./application/python/example.py:

```python
from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langserve import add_routes
from langstream import Service
import uvicorn

app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces",
)

# Expose the bare chat model at /openai
add_routes(
    app,
    ChatOpenAI(),
    path="/openai",
)

# Expose a prompt | model chain at /chain
model = ChatOpenAI()
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")
add_routes(
    app,
    prompt | model,
    path="/chain",
)

# The LangStream Service subclass is the entry point the agent runs
class ChatBotService(Service):
    def main(self):
        uvicorn.run(app, host="0.0.0.0", port=8000)
```
Configure the agent in pipeline.yaml to use the Python class:
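A pipeline.yaml along these lines should work. The `python-service` agent type and the `className` key follow LangStream's Python agent conventions, and the module and class names (`example.ChatBotService`) match the file above; treat the exact keys as assumptions to verify against the LangStream reference. The agent `id` here is what the gateway's `agent-id` must point to.

```yaml
module: "default"
pipeline:
  - name: "LangServe chatbot service"
    id: "langserve-service"
    type: "python-service"
    configuration:
      className: example.ChatBotService
```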
It is important that the service listens on port 8000: this port is mapped to the LangStream service, and the LangStream operator ensures the service is reachable from the LangStream Gateway API Service.
Then you can expose your service using a gateway of type "service".
```yaml
gateways:
  - id: chatbot
    type: service
    service-options:
      agent-id: langserve-service
```
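Once the gateway is up, clients reach the LangServe routes through it. LangServe exposes an `/invoke` endpoint under each registered path, and the chain mounted at `path="/chain"` expects the prompt variable `topic`. The sketch below builds such a request with the standard library; the base URL is a placeholder assumption, since the actual host and path prefix depend on how the Gateway API Service is exposed in your deployment.

```python
import json
from urllib import request

# Placeholder: substitute the address where your LangStream
# Gateway API Service is actually exposed.
BASE_URL = "http://localhost:8091"

def build_invoke_request(base_url: str, path: str, body: dict) -> request.Request:
    """Build a POST request for a LangServe /invoke endpoint."""
    return request.Request(
        url=f"{base_url}{path}/invoke",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The chain's prompt template has one variable, "topic"
req = build_invoke_request(BASE_URL, "/chain", {"input": {"topic": "cats"}})

# Sending it requires the service to be running, e.g.:
# with request.urlopen(req) as resp:
#     print(resp.read())
```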