Get Up and Running
Create an Account
Navigate to app.traceport.ai and sign up for a free account.
Connect Your Providers
Traceport routes your LLM requests to the appropriate model providers. Connect your preferred providers first.
- Go to Integrations in the left sidebar.
- Select your provider under the “Available” section.
- Enter your provider’s API key; Traceport stores it securely and uses it to forward your requests.
Generate a Traceport API Key
- Navigate to API Keys in the sidebar.
- Click Create New Key.
- Name your key (e.g., “Production API”).
- Copy the generated key.
Update Your Codebase
Integrating Traceport requires minimal changes: point your SDK at Traceport’s gateway URL and authenticate with your Traceport API key in place of your provider’s key.
Traceport provides an OpenAI-compatible Gateway API, so you can use the standard OpenAI SDKs (Python, Node.js) or any framework that supports custom base URLs, such as LangChain or LlamaIndex.
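As a minimal sketch of what this looks like, the snippet below sends an OpenAI-style chat completion request to the gateway using only the Python standard library (the same call works through the OpenAI SDK by passing `base_url`). The gateway URL `https://gateway.traceport.ai/v1` and the model name are placeholder assumptions — use the base URL shown in your Traceport dashboard and a model from a provider you connected.

```python
import json
import os
import urllib.request

# Assumed placeholder: substitute the gateway base URL from your dashboard.
TRACEPORT_BASE_URL = "https://gateway.traceport.ai/v1"


def build_chat_request(model: str, messages: list, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request aimed at the gateway."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=f"{TRACEPORT_BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # Use your Traceport API key here, not the provider's own key.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request(
        model="gpt-4o-mini",  # assumed example model
        messages=[{"role": "user", "content": "Hello from Traceport!"}],
        api_key=os.environ["TRACEPORT_API_KEY"],
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the gateway speaks the OpenAI wire format, switching an existing codebase over is usually just a change of base URL and key, with no changes to request or response handling.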
View Your Logs
Run your code, then return to your Traceport Dashboard and navigate to Logs. You’ll see your captured completions, latency, and cost in real time.
What’s Next?
API Logs
Deep-dive into individual request details, tokens, and costs.
Workflow Traces
Monitor complex multi-step AI workflows end-to-end.
Model Playground
Test and compare models interactively before writing code.
API Reference
Explore the full Traceport Gateway API documentation.