Configuring LLM Access for Your Bot
To enable LLM functionality in your bot, you need to declare it in the bot manifest file:
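The manifest example itself is missing from this page; the following is a hedged sketch assuming a JSON manifest, where the field names (`permissions`, `defaultModel`) are assumptions rather than the documented schema:

```json
{
  "name": "my-bot",
  "permissions": ["llm"],
  "defaultModel": "openai/gpt-4o-mini"
}
```

The model ID would come from the supported-models table at the end of this page.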
Using LLMs in Your Bot Code
JavaScript Runtime
In JavaScript-based bots, you can use the `LLM` class provided by the lilbots library:
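The original example code is not shown here. Below is a minimal sketch; the real `LLM` class is provided by the lilbots library, and this stand-in only mirrors the OpenAI-compatible interface described later on this page so the call pattern is concrete. The method names and response shape are assumptions based on that compatibility claim.

```javascript
// Stand-in for the lilbots `LLM` class (hypothetical; mirrors the
// OpenAI SDK's chat.completions.create() shape).
class LLM {
  constructor() {
    this.chat = {
      completions: {
        // The real method sends the request to the platform-selected
        // model; no `model` argument is passed, since the model is
        // fixed by the user's selection or the manifest default.
        create: async ({ messages }) => {
          const last = messages[messages.length - 1].content;
          return {
            choices: [{ message: { role: 'assistant', content: `echo: ${last}` } }],
          };
        },
      },
    };
  }
}

async function main() {
  const llm = new LLM();
  const response = await llm.chat.completions.create({
    messages: [
      // "developer" carries system-level instructions to the LLM
      { role: 'developer', content: 'You are a concise assistant.' },
      { role: 'user', content: 'Say hello.' },
    ],
  });
  return response.choices[0].message.content;
}

main().then(console.log);
```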
Python Runtime
In Python-based bots, you can use the `LLM` class from the lilbots library:
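The original Python example is also missing. The sketch below uses a stand-in in place of the real lilbots `LLM` class, mirroring the OpenAI-compatible interface this page describes; the import path, method names, and response shape are assumptions based on that compatibility claim.

```python
# Stand-in for the lilbots `LLM` class (hypothetical; mirrors the
# OpenAI SDK's chat.completions.create() shape).
from types import SimpleNamespace


class LLM:
    """Minimal mock of the OpenAI-compatible interface."""

    def __init__(self):
        self.chat = SimpleNamespace(
            completions=SimpleNamespace(create=self._create)
        )

    def _create(self, messages):
        # The real implementation sends the request to the model chosen
        # by the user or the manifest default; no model argument is taken.
        last = messages[-1]["content"]
        reply = SimpleNamespace(role="assistant", content=f"echo: {last}")
        return SimpleNamespace(choices=[SimpleNamespace(message=reply)])


llm = LLM()
response = llm.chat.completions.create(
    messages=[
        # "developer" carries system-level instructions to the LLM
        {"role": "developer", "content": "You are a concise assistant."},
        {"role": "user", "content": "Say hello."},
    ],
)
print(response.choices[0].message.content)
```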
API Compatibility
The `LLM` class is designed to be compatible with the OpenAI API client, making it easy to integrate into existing code. The interface follows the same patterns and methods as the official OpenAI SDK.
Important Notes
- The model cannot be customized in the code. It is determined by the user's selection or the default model specified in the bot manifest.
- The `developer` role can be used in messages to provide system-level instructions to the LLM.
- Credits are consumed based on the model used and the number of tokens processed (both input and output).
Supported Models and Pricing
The following models are supported on the lilbots platform:

| Model ID | Provider | Model | Credits per 100 tokens |
|---|---|---|---|
| openai/gpt-4o | OpenAI | gpt-4o | 5 |
| openai/gpt-4.1 | OpenAI | gpt-4.1 | 4 |
| openai/o3 | OpenAI | o3 | 20 |
| openai/o4-mini | OpenAI | o4-mini | 2 |
| openai/gpt-4.1-mini | OpenAI | gpt-4.1-mini | 1 |
| openai/gpt-4o-mini | OpenAI | gpt-4o-mini | 1 |
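The per-model rates above can be used to estimate how many credits a call will consume. The sketch below assumes a flat per-100-token rate applied to input and output tokens combined; the page states that both count, but the exact rounding rule is an assumption.

```python
# Credits-per-100-tokens rates, copied from the table above.
CREDITS_PER_100_TOKENS = {
    "openai/gpt-4o": 5,
    "openai/gpt-4.1": 4,
    "openai/o3": 20,
    "openai/o4-mini": 2,
    "openai/gpt-4.1-mini": 1,
    "openai/gpt-4o-mini": 1,
}


def estimate_credits(model_id, input_tokens, output_tokens):
    """Estimate credit cost, assuming a flat rate on total tokens."""
    rate = CREDITS_PER_100_TOKENS[model_id]
    total = input_tokens + output_tokens
    return total * rate / 100


# e.g. 1,000 input + 500 output tokens on gpt-4o:
print(estimate_credits("openai/gpt-4o", 1000, 500))  # 75.0
```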