
Application Lifecycle

Application lifecycle

When a developer deploys a new application to the LangStream control plane, the control plane works through a series of steps to build and deploy it. Once the topics and agents are created, the control plane monitors the application's health and reports any issues.
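
Deployment is typically done with the LangStream CLI. A minimal sketch, assuming an application directory alongside instance and secrets manifests (the application name and file paths are illustrative; see the CLI Commands page for the exact options):

# Deploy the application in ./application to the control plane
langstream apps deploy my-app \
  -app ./application \
  -i ./instance.yaml \
  -s ./secrets.yaml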

Here is the typical lifecycle of a LangStream application:

  1. Deploy to control plane

  2. Download dependencies (if declared)

  3. Create a final application artifact

  4. Plan the topic and agent layout

  5. Validate connections to resources (if declared) and message brokers

  6. Create the topics (in the messaging platform)

  7. Create the processing agent(s) (as a StatefulSet)

  8. Start processing data

  9. Monitor agents' health (and report)
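
Once the application is running, its status and agent health can be checked from the CLI. A sketch, assuming the apps get and apps logs commands described on the CLI Commands page (the application name is illustrative):

# Show the application's status, including per-agent state
langstream apps get my-app

# Tail logs from the application's agents
langstream apps logs my-app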

If you are familiar with the design of Kubernetes, the choice of a StatefulSet for each agent will make sense. A desired number of pods is declared for the set, and if a pod crashes, Kubernetes makes every effort to reconcile by creating a replacement. Each pod keeps a “sticky” identity, which makes debugging and reporting easier to follow.
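
To see this in the cluster, you can inspect the agents' StatefulSets and pods directly. A sketch, assuming the langstream-default namespace used elsewhere on this page (StatefulSet and pod names depend on your application):

# List the StatefulSets and pods backing the application's agents
kubectl -n langstream-default get statefulsets
kubectl -n langstream-default get pods

# Inspect a specific agent pod (the name is illustrative)
kubectl -n langstream-default describe pod my-app-agent-0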

The base image of an agent pod is custom to LangStream. It includes a fair amount of additional management and tooling to support the agent's processing capabilities. Because LangStream focuses on generative AI, many of these tools are centered on that use case; libraries such as LangChain and OpenAI are included in the image.

Promoting an application through environments

Once the application is in a stable place, you'll want to promote it off your desktop into a higher, managed environment. Ideally, your environments maintain parity with one another, and the only differences are infrastructure details (URLs, passwords, and so on). This is why the secrets and instance manifests are kept separate from the application manifest: you can swap these files while reusing the same application, and promote it in a stable, known way.
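
For example, the same application directory can be deployed to each environment with environment-specific instance and secrets files. A sketch (the application name, file names, and paths are illustrative):

# Local development
langstream apps deploy my-app -app ./application -i ./instance-dev.yaml -s ./secrets-dev.yaml

# Production: same application directory, different instance and secrets
langstream apps deploy my-app -app ./application -i ./instance-prod.yaml -s ./secrets-prod.yaml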

Upgrading and Downtime

When you update your application, all impacted agents in the pipeline are restarted at the same time. If an agent has more than one replica, its replicas are restarted one at a time. In Kubernetes terms, each agent is a StatefulSet, and updating agents behaves like a rolling update of StatefulSets.
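
An update is pushed through the CLI in much the same way as the initial deployment, and the restart can be observed from Kubernetes. A sketch, assuming the apps update command from the CLI Commands page (names and paths are illustrative):

# Push the updated application
langstream apps update my-app -app ./application -i ./instance.yaml -s ./secrets.yaml

# Watch the agent pods restart one replica at a time
kubectl -n langstream-default get pods -w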

Deleting an application

Deleting an application removes the application's pod(s) and StatefulSet(s). If your application fails to deploy and you attempt to clean things up, the Application custom resource's finalizer can deadlock the removal of the namespace.
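
In the normal case, an application is removed through the CLI. A sketch, assuming the apps delete command from the CLI Commands page (the application name is illustrative):

langstream apps delete my-app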

To remove the finalizer causing the deadlock:

appId="some-super-cool-app"
kubectl -n langstream-default patch Application/${appId} \
  -p '{"metadata":{"finalizers":[]}}' --type=merge
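
Afterwards, confirm that the Application resource is gone so the namespace can finish terminating:

kubectl -n langstream-default get Application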
