This feature is only available on the Enterprise tier
Prerequisites
- You must have the latest version of the Gitpod CLI installed.
- You must have an Ona enterprise license.
- You must have a Portkey account.
- You should have configured at least one LLM provider in your Portkey account. Vertex, Bedrock, and Anthropic are currently supported.
Create API credentials in Portkey
Step 1: Get your Portkey API Key
- Go to the Portkey Dashboard
- Navigate to API Keys in the left sidebar
- Click Create API Key
- Enter a name for your API key (e.g., “ona-agent-integration”)
- Click Create
- Copy the generated API key and store it safely in your vault
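The key created above is what authenticates requests against Portkey itself; Portkey expects it in an `x-portkey-api-key` header. As a quick orientation, a minimal sketch of the headers such a request carries (the key value is a placeholder, and the gateway URL is the one used later in this guide):

```python
# Sketch: assemble the headers for a request to the Portkey gateway.
# PORTKEY_API_KEY is a placeholder -- substitute the key created above.
PORTKEY_API_KEY = "pk-placeholder"

headers = {
    "x-portkey-api-key": PORTKEY_API_KEY,  # authenticates against Portkey itself
    "content-type": "application/json",
}

# The gateway endpoint configured later in this guide:
gateway_url = "https://api.portkey.ai/v1/messages"

assert headers["x-portkey-api-key"], "the Portkey API key must be non-empty"
```

The snippet only builds the header set; actually sending a request additionally needs provider routing information (a Virtual Key, a Config, or provider headers), covered in the next steps.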
Step 2: Create a Virtual Key (Optional but Recommended)
Virtual Keys in Portkey allow you to configure specific LLM providers, models, and settings:
- In the Portkey Dashboard, go to Virtual Keys
- Click Create Virtual Key
- Configure your preferred settings:
- Provider: Select your LLM provider (e.g., Anthropic, Vertex)
- Model: Choose the specific model you want to use
- Additional settings: Configure any provider-specific parameters
- Click Create Virtual Key
- Copy the Virtual Key ID
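With a Virtual Key, requests reference the provider and model settings stored in Portkey instead of carrying raw provider credentials. A minimal sketch, using Portkey's `x-portkey-virtual-key` header (both key values are placeholders for the credentials created above):

```python
# Sketch: headers when routing through a Portkey Virtual Key.
# Both values are placeholders.
PORTKEY_API_KEY = "pk-placeholder"
VIRTUAL_KEY_ID = "vk-placeholder"

headers = {
    "x-portkey-api-key": PORTKEY_API_KEY,
    # Selects the provider/model settings stored under the Virtual Key,
    # so no raw provider API key travels with the request.
    "x-portkey-virtual-key": VIRTUAL_KEY_ID,
    "content-type": "application/json",
}

assert "x-portkey-virtual-key" in headers
```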
Step 3: Configure the LLM Integration on Ona
- Log in to your Ona account:
- If needed, switch to the organization on which you want to configure the integration:
- Find the runner on which you want to configure the integration:
- Create the LLM integration:
`PORTKEY_GATEWAY_URL` should be https://api.portkey.ai/v1/messages or the URL of a self-hosted gateway.
`PLACEHOLDER` can be any non-empty value (such as `gitpod`), as it is not actually used by the integration: the API key is either managed by Portkey or passed as a header, but the command would be invalid without a non-empty value here.
- Check if your integration was created as expected:
- Configure Portkey headers: Portkey uses headers and configurations to access different LLM providers and to customise its behaviour; both determine how you use Portkey. Ona lets you configure headers to be passed to an LLM integration, in this case to Portkey.
`PORTKEY_CONFIG_ID` is the ID of the configuration that you set up earlier (it looks like `pc-xxx`).
`PORTKEY_API_KEY` is Portkey's API key.
`LLM_API_KEY` is the API key of the provider that you want Portkey to talk to, for example an Anthropic API key.
`PROVIDER` is the name of the LLM provider to be used, for example `anthropic`.
- Removing or updating headers (optional): you might want to customise the configuration further or remove headers added by mistake.
- Use the `set-header` command to create or overwrite a header.
- Use the `remove-header` command to delete a header.
- Use the `list-headers` command to list the headers enabled on the integration.
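For reference, the `PORTKEY_CONFIG_ID` (`pc-xxx`) mentioned above points at a Portkey config created in the Portkey Dashboard. A minimal sketch of such a config with a fallback between two targets, assuming Portkey's documented config schema (`strategy`/`targets` keys; the API key values are placeholders):

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "provider": "anthropic", "api_key": "PRIMARY_KEY_PLACEHOLDER" },
    { "provider": "anthropic", "api_key": "BACKUP_KEY_PLACEHOLDER" }
  ]
}
```

Check the Portkey documentation for the full set of supported keys and per-provider credential fields before relying on this shape.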
Monitoring and Observability
Portkey provides comprehensive observability for your LLM requests:
- Request Logs: View all requests made through Portkey
- Analytics: Monitor usage, costs, and performance metrics
- Alerts: Set up notifications for errors or usage thresholds
Verify the integration
- Create a new environment with an enabled runner
- Open Ona Agent and confirm it can access your configured LLM models through Portkey
- Test with a simple code generation request
- Check your Portkey Dashboard to verify requests are being logged
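A quick way to exercise the integration end to end is to send one small request through the gateway yourself. A sketch of an Anthropic-style `/v1/messages` payload for such a smoke test (the model name is only an example; actually sending it requires the headers from the earlier steps and a valid key, so the snippet just builds and checks the body):

```python
import json

# Sketch: a minimal Anthropic-style messages payload to send through
# https://api.portkey.ai/v1/messages as a smoke test. The model name is
# an example -- use one enabled in your Portkey config or Virtual Key.
payload = {
    "model": "claude-sonnet-4",
    "max_tokens": 64,
    "messages": [
        {"role": "user", "content": "Reply with the single word: pong"}
    ],
}

body = json.dumps(payload)
assert json.loads(body)["messages"][0]["role"] == "user"
```

If the request succeeds, it should appear in the Portkey Dashboard's request logs within a few seconds.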
Troubleshooting
Common Issues
Authentication Errors
- Verify your Portkey API key is correct and active
- Ensure your Virtual Key (if used) is properly configured
- Check that your underlying LLM provider credentials in Portkey are valid
- Confirm the model specified in your Virtual Key or Config is available
- Verify your underlying LLM provider has sufficient quota/credits
- Check Portkey’s status page for any service disruptions
- Review request logs in your Portkey Dashboard for detailed error messages
- Ensure your Portkey Config (if used) has valid fallback providers configured
- Verify network connectivity between your Gitpod environment and Portkey
Getting Help
- Check the Portkey Documentation for detailed configuration guides
- Contact Portkey support through their dashboard for gateway-specific issues
- Reach out to your Ona account manager for support (if you are an enterprise customer).
Next Steps
Your Portkey LLM gateway is now configured and ready to use with Ona Agent. You can:
- Configure additional LLM providers in Portkey for redundancy
- Set up advanced routing rules and fallbacks
- Monitor your AI usage and costs through Portkey’s analytics
- Explore Portkey’s prompt management and caching features
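As a starting point for the routing, caching, and fallback features above, a sketch of a Portkey config that layers retries and response caching onto a fallback target, assuming Portkey's documented `retry` and `cache` config keys (all values are illustrative placeholders):

```json
{
  "retry": { "attempts": 3 },
  "cache": { "mode": "simple", "max_age": 3600 },
  "strategy": { "mode": "fallback" },
  "targets": [
    { "provider": "anthropic", "api_key": "KEY_PLACEHOLDER" }
  ]
}
```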