Secrets


The secrets.yaml file is a place to hold secrets. Each label:value in this file can be referenced from the configuration and pipeline manifests, and the values are carried into the environment of the step where they are applied. Secret values can be set directly in secrets.yaml, or you can pass secrets as environment variables.

Manifest

An example secrets.yaml manifest contains the credentials necessary to connect to Astra and OpenAI.

The :- characters designate a default value. For example, provider: "${OPEN_AI_PROVIDER:-openai}" designates openai as the default.
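For example, exporting the corresponding environment variable (covered below) overrides the default, while leaving it unset falls back to the default. A small sketch, assuming azure as an alternative provider value:

# "azure" is assumed here as an alternative provider value;
# with OPEN_AI_PROVIDER unset, "${OPEN_AI_PROVIDER:-openai}" resolves to "openai".
export OPEN_AI_PROVIDER=azure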

For help finding these credentials, see Credentials below.

secrets:
  - id: astra
    data:
      clientId: ${ASTRA_CLIENT_ID:-}
      secret: ${ASTRA_SECRET:-}
      token: ${ASTRA_TOKEN:-}
      database: ${ASTRA_DATABASE:-}
      # uncomment this and link to a file containing the secure connect bundle
      # secureBundle: "<file:secure-connect-bundle.zip>"
      secureBundle: ${ASTRA_SECURE_BUNDLE:-}
      environment: ${ASTRA_ENVIRONMENT:-PROD}
  - id: open-ai
    data:
      access-key: "${OPEN_AI_ACCESS_KEY:-}"
      url: "${OPEN_AI_URL:-}"
      provider: "${OPEN_AI_PROVIDER:-openai}"
      embeddings-model: "${OPEN_AI_EMBEDDINGS_MODEL:-text-embedding-ada-002}"
      chat-completions-model: "${OPEN_AI_CHAT_COMPLETIONS_MODEL:-gpt-3.5-turbo}"
      text-completions-model: "${OPEN_AI_TEXT_COMPLETIONS_MODEL:-gpt-3.5-turbo-instruct}"
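
These values can then be referenced from the configuration and pipeline manifests. As a minimal sketch (the resource type and field names follow the OpenAI examples elsewhere in the docs), a configuration.yaml resource could consume the open-ai secret above like this:

configuration:
  resources:
    - type: "open-ai-configuration"
      name: "OpenAI configuration"
      configuration:
        # Each value is resolved from secrets.yaml at deploy time
        url: "${secrets.open-ai.url}"
        access-key: "${secrets.open-ai.access-key}"
        provider: "${secrets.open-ai.provider}"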
 

Pass secrets as environment variables

Secret values can be set directly in secrets.yaml, or you can pass your secrets as environment variables. The placeholders in secrets.yaml are resolved from these environment variables.

export ASTRA_CLIENT_ID=...
export ASTRA_SECRET=...
export ASTRA_DATABASE=...
export ASTRA_TOKEN=...

When you go to production, you should create a dedicated secrets.yaml file for each environment.
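The secrets file to use is chosen at deploy time with the -s flag. A sketch, assuming an application named my-app in ./application with a per-environment secrets file:

# Deploy using the production secrets file (application name and paths are illustrative)
langstream apps deploy my-app -app ./application -i ./instance.yaml -s ./secrets-prod.yaml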

Credentials

Where do you find credentials for these items? Here's a little help:

| Secret | Key | Location | Notes and Example Value |
|---|---|---|---|
| kafka | username | ssl.properties | KAFKA_USERNAME=langstream-tenant |
| | password | ssl.properties | KAFKA_PASSWORD=token:eyXxx... |
| | tenant | ssl.properties | KAFKA_USERNAME=langstream-tenant |
| | bootstrap.servers | ssl.properties | KAFKA_BOOTSTRAP_SERVERS=kafka-gcp-useast1.streaming.datastax.com:9093 |
| open-ai | access-key | OpenAI Access Key | access-key: xxx |
| | url | OpenAI Azure URL | OPEN_AI_URL=https://company-openai-dev.openai.azure.com/ |
| | provider | OpenAI Azure | OPEN_AI_PROVIDER=openai |
| | embeddings-model | OpenAI Azure | OPEN_AI_EMBEDDINGS_MODEL=text-embedding-ada-002 |
| | chat-completions-model | OpenAI Azure | OPEN_AI_CHAT_COMPLETIONS_MODEL=gpt-35-turbo |
| | text-completions-model | OpenAI Azure | OPEN_AI_TEXT_COMPLETIONS_MODEL=gpt-3.5-turbo-instruct |
| vertex-ai | url | Google Service Account | VERTEX_AI_URL=https://us-central1-aiplatform.googleapis.com |
| | token | Vertex API token | VERTEX_AI_TOKEN=xxx |
| | serviceAccountJSON | Google Service Account | A JSON file downloaded from the Google console containing auth info. VERTEX_AI_JSON=xxx |
| | region | Google Service Account | VERTEX_AI_REGION=us-central1 |
| | project | Google Service Account | VERTEX_AI_PROJECT=myproject |
| | chat-completions-model | Google Service Account | VERTEX_AI_CHAT_COMPLETIONS_MODEL=chat-bison |
| | text-completions-model | | VERTEX_AI_TEXT_COMPLETIONS_MODEL=text-bison |
| hugging-face | access-key | hugging-face | access-key: |
| | provider | hugging-face | Can be api or local. HUGGING_FACE_PROVIDER=api |
| | embeddings-model | hugging-face | HUGGING_FACE_EMBEDDINGS_MODEL=multilingual-e5-small |
| | embeddings-model-url | | HUGGING_FACE_EMBEDDINGS_MODEL_URL=djl://ai.djl.huggingface.pytorch/intfloat/multilingual-e5-small |
| astra | clientId | Astra | Generated with the token. ASTRA_CLIENT_ID=fnsNZtMgvgBHurHJjfSbgQwifnsNZtMgvgBHurHJjfSbgQwi |
| | secret | Astra | Generated with the token. ASTRA_SECRET=xxxx |
| | token | Astra | ASTRA_TOKEN=AstraCSxxxx |
| | database | Astra | The name of your Astra database. ASTRA_DATABASE=my-database |
| | secureBundle | Astra | Base64-encoded secure connect bundle downloaded from Astra, or referenced as a file (file:secure-connect-bundle.zip). ASTRA_SECURE_BUNDLE="" |
| | environment | Astra | ASTRA_ENVIRONMENT=PROD |
| s3 | bucket-name | Minio console | S3_BUCKET_NAME=langstream-code-storage |
| | endpoint | Minio console | S3_ENDPOINT=http://minio.minio-dev.svc.cluster.local:9000 |
| | access-key | Minio console | S3_ACCESS_KEY=minioadmin |
| | secret | Minio console | S3_SECRET=minioadmin |
| | region | Minio console | S3_REGION=us-central1 |
| google | client-id | Google Service Account | client-id: xxxx |
| github | client-id | Github | client-id: xxxx |
| pinecone | service | Pinecone console | PINECONE_SERVICE=pinecone |
| | access-key | Pinecone console | PINECONE_ACCESS_KEY=xxxx |
| | project-name | Pinecone console | PINECONE_PROJECT_NAME=b4ea705 |
| | environment | Pinecone console | PINECONE_ENVIRONMENT=asia-southeast1-gcp-free |
| | index-name | Pinecone console | PINECONE_INDEX_NAME=my-pinecone-index |

Please note that the example values above are illustrative and should be replaced with the actual values for your environment.

The secrets.yaml manifest has the following structure:

| Root | Node | Type | Description |
|---|---|---|---|
| secrets | | | The base node in the yaml; holds the collection of secrets. |
| | name | | The secret name, used for display. |
| | id | | The id of the secret, used for referencing its value. |
| | data | <any key:value> | Object of values applicable to the given secret. Provide any combination of key:value pairs that apply. |

To retrieve a value, use the format secrets.<name>.<key> (don't include "data" when referencing secrets).
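For instance, a pipeline agent can pull one of the values defined above. A minimal sketch (topic names are placeholders; the agent type and fields follow the compute-ai-embeddings examples in the docs):

pipeline:
  - name: "compute-embeddings"
    type: "compute-ai-embeddings"
    input: "input-topic"
    output: "output-topic"
    configuration:
      # The model name is resolved from the open-ai secret
      model: "${secrets.open-ai.embeddings-model}"
      embeddings-field: "value.embeddings"
      text: "{{ value.text }}"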
