Embeddings

Cinna LLM Gateway lets you convert text into high-dimensional vectors that preserve semantic meaning. These embeddings unlock capabilities like semantic search, clustering, classification, and retrieval-augmented generation.

Use embeddings to:

  • Find conceptually similar text (see the similarity sketch after this list)

  • Cluster or classify large corpora

  • Power RAG pipelines with vector search

  • Enhance onchain or agent-based reasoning on Solana

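For semantic search, a common pattern is to embed both the query and the candidate documents, then rank candidates by cosine similarity. The following is a minimal sketch, assuming the same credentials and BAAI/bge-large-en-v1.5 model used in the example below, plus NumPy for the vector math:

from openai import OpenAI
import numpy as np

client = OpenAI(
    api_key="your_user_id#your_api_key",
    base_url="https://llm-gateway.cinna.ai"
)

def embed(text):
    # Request a single embedding and return it as a NumPy vector.
    response = client.embeddings.create(
        model="BAAI/bge-large-en-v1.5",
        input=text,
        encoding_format="float"
    )
    return np.array(response.data[0].embedding)

def cosine_similarity(a, b):
    # 1.0 means identical direction; values near 0 mean unrelated text.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embed("How do I stake SOL?")
document = embed("A guide to staking tokens with Solana validators.")
print("similarity:", cosine_similarity(query, document))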

Example: Generate Embeddings with Cinna

from openai import OpenAI

client = OpenAI(
    api_key="your_user_id#your_api_key",
    base_url="https://llm-gateway.cinna.ai"
)

embeddings = client.embeddings.create(
    model="BAAI/bge-large-en-v1.5",
    input="Hello, world!",
    encoding_format="float"
)

print(embeddings.data[0].embedding)
print("Prompt tokens used:", embeddings.usage.prompt_tokens)

The interface mirrors the OpenAI SDK, so existing embedding code drops in with only the api_key and base_url changed. All embeddings are served through the Cinna decentralized stack, designed for efficient AI pipelines and Solana-native infrastructure.
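Because the client follows the OpenAI SDK, batching should work the same way: pass a list of strings as input and the response contains one embedding per item, in order. A minimal sketch, assuming the gateway accepts list inputs like the upstream API and reusing the client and model from the example above:

texts = [
    "Solana transaction fees explained",
    "How to mint an NFT on Solana",
    "Validator staking rewards"
]

response = client.embeddings.create(
    model="BAAI/bge-large-en-v1.5",
    input=texts,  # a list of strings returns one embedding per item
    encoding_format="float"
)

vectors = [item.embedding for item in response.data]
print(len(vectors), "embeddings,", len(vectors[0]), "dimensions each")
print("Prompt tokens used:", response.usage.prompt_tokens)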
