0.15.0
feat: Cerebras API support.
You can now run Langroid with LLMs hosted on Cerebras by setting a CEREBRAS_API_KEY in your environment and specifying the chat_model in the OpenAIGPTConfig as cerebras/<model_name>, e.g. cerebras/llama3.1-8b.
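A minimal sketch of this setup, assuming langroid is installed, CEREBRAS_API_KEY is exported in your environment, and the model name llama3.1-8b is available on Cerebras:

```python
# Sketch: configure Langroid to use a Cerebras-hosted model.
# Assumes CEREBRAS_API_KEY is set in the environment, so no key
# appears in code; the "cerebras/" prefix routes the request.
import langroid.language_models as lm

llm_config = lm.OpenAIGPTConfig(
    chat_model="cerebras/llama3.1-8b",  # cerebras/<model_name>
)
llm = lm.OpenAIGPT(config=llm_config)

# One-off chat call; returns an LLMResponse whose .message holds the text.
response = llm.chat("Say hello in one short sentence.")
print(response.message)
```

This is a configuration sketch rather than a tested recipe: running it requires network access and a valid Cerebras API key.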
Cerebras docs: https://inference-docs.cerebras.ai/introduction
Guide to using Langroid with Cerebras-hosted LLMs: https://langroid.github.io/langroid/tutorials/local-llm-setup/