Welcome to Alveare

Private SLM inference at 10% of the cost. One shared model powers multiple specialists — classification, summarisation, extraction, Q&A, chat, and code generation — through a single API.

Quickstart

Get from zero to your first API response in three steps.

1. Sign up

Create your account at alveare.ai. Every plan starts with a 7-day free trial, and no credit card is required to start.

2. Get your API key

After signing up, visit the dashboard to generate an API key. Keys are prefixed with alv_live_.

3. Make your first request

Call any specialist with a single POST request. Pick your language below.

```bash
curl -X POST https://api.alveare.ai/v1/infer \
  -H "Authorization: Bearer alv_live_abc123..." \
  -H "Content-Type: application/json" \
  -d '{
    "specialist": "summarise",
    "prompt": "Summarise this quarterly report in 3 bullet points: ...",
    "max_tokens": 256
  }'
```
```python
from alveare import Alveare

client = Alveare(api_key="alv_live_abc123...")

response = client.infer(
    specialist="summarise",
    prompt="Summarise this quarterly report in 3 bullet points: ...",
    max_tokens=256,
)

print(response.result)
```
```typescript
import Alveare from '@alveare-ai/sdk';

const client = new Alveare({ apiKey: 'alv_live_abc123...' });

const response = await client.infer({
  specialist: 'summarise',
  prompt: 'Summarise this quarterly report in 3 bullet points: ...',
  maxTokens: 256,
});

console.log(response.result);
```

Your API key is sensitive. Store it as an environment variable (ALVEARE_API_KEY) rather than hard-coding it. Both SDKs read this variable automatically.
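If you prefer to resolve the key yourself, for example to fail fast at startup with a clear error, the check is a few lines of standard-library Python. `load_api_key` is a hypothetical helper shown for illustration, not part of the SDK; both SDKs perform an equivalent lookup internally.

```python
import os

def load_api_key(env_var: str = "ALVEARE_API_KEY") -> str:
    # Look up the key from the environment rather than hard-coding it.
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export it before running.")
    # Live keys are prefixed with alv_live_ (see step 2 above).
    if not key.startswith("alv_live_"):
        raise RuntimeError(f"{env_var} does not look like an Alveare key.")
    return key
```

Passing the result explicitly (`Alveare(api_key=load_api_key())`) keeps the failure at process start instead of at the first request.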

Install an SDK

Python:

```shell
pip install alveare
```

TypeScript / JavaScript:

```shell
npm install @alveare-ai/sdk
```

CLI:

```shell
npm install -g @alveare-ai/cli
```

See the full guides: Python SDK, TypeScript SDK, CLI.

OpenAI-compatible endpoint

Already using OpenAI? Point your existing code at Alveare by changing only the base URL and API key. The /v1/chat/completions endpoint accepts the same request format; set the model field to a specialist name.

```python
# Works with the official openai Python package
import openai

client = openai.OpenAI(
    api_key="alv_live_abc123...",
    base_url="https://api.alveare.ai/v1",
)

response = client.chat.completions.create(
    model="alveare-summarise",
    messages=[{"role": "user", "content": "Summarise this report..."}],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

Specialists overview

A specialist is a tuned configuration — system prompt, sampling parameters, and guardrails — running on a shared model. One model, many capabilities.

classify: Categorise text into labels. Sentiment, intent, topic routing.

summarise: Condense long text into short summaries. Bullet points or prose.

extract: Pull structured data from unstructured text. Outputs JSON.

qa: Answer questions given a context passage. Grounded responses.

chat: Multi-turn conversation with memory. Customer support, assistants.

code: Generate, explain, and refactor code in any mainstream language.
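Because every specialist goes through the same /v1/infer endpoint, switching between them is just a change of one field in the request body. A minimal sketch, using the field names from the curl quickstart above; `build_infer_payload` and the validation set are illustrative, not part of the SDK:

```python
# The six specialist slugs listed above.
SPECIALISTS = {"classify", "summarise", "extract", "qa", "chat", "code"}

def build_infer_payload(specialist: str, prompt: str, max_tokens: int = 256) -> dict:
    # Catch typos in the slug locally instead of waiting for an API error.
    if specialist not in SPECIALISTS:
        raise ValueError(f"unknown specialist: {specialist!r}")
    # Field names match the POST /v1/infer request shown in the quickstart.
    return {"specialist": specialist, "prompt": prompt, "max_tokens": max_tokens}
```

The same payload shape works for any specialist; only the prompt conventions differ (for example, extract prompts typically describe the JSON fields you want back).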

Read the full Specialists Guide for examples, best practices, and tips on choosing the right specialist.

What's next?