When running OpenHands, you’ll need to set the following in the OpenHands UI through the Settings under the LLM tab:
- LLM Provider to Groq
- LLM Model to the model you will be using. Visit here to see the list of models that Groq hosts. If the model is not in the list, enable Advanced options and enter it in Custom Model (e.g. groq/&lt;model-name&gt; like groq/llama3-70b-8192).
- API Key to your Groq API key. To find or create your Groq API key, see here.
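The Custom Model format above follows the common provider/model naming convention. A minimal sketch of how such a string breaks down (the helper function is illustrative, not part of OpenHands):

```python
# Sketch: how a "provider/model" Custom Model string such as
# "groq/llama3-70b-8192" splits into a provider prefix and a model name.
# This mirrors the naming convention only, not OpenHands internals.
def split_model_string(custom_model: str) -> tuple[str, str]:
    """Split on the first "/" into (provider, model)."""
    provider, _, model = custom_model.partition("/")
    return provider, model

print(split_model_string("groq/llama3-70b-8192"))
# -> ('groq', 'llama3-70b-8192')
```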
The Groq endpoint for chat completion is mostly OpenAI-compatible. Therefore, you can access Groq models as you
would access any OpenAI-compatible endpoint. In the OpenHands UI through the Settings under the LLM tab:
- Enable Advanced options
- Set Custom Model to the prefix openai/ + the model you will be using (e.g. openai/llama3-70b-8192)
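To make "OpenAI-compatible" concrete, here is a sketch of the request such a client would send. Assumptions: https://api.groq.com/openai/v1 is Groq's documented OpenAI-compatible base URL, llama3-70b-8192 is only an example model, and (as in LiteLLM's convention) the openai/ prefix selects the OpenAI-compatible client while the name after the prefix is what actually reaches the endpoint. The helper below builds the request without sending it:

```python
# Sketch: the OpenAI-style chat completion request shape, pointed at
# Groq's OpenAI-compatible base URL (an assumption from Groq's docs,
# not taken from this page). Nothing is sent over the network.
import json

GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, prompt: str, api_key: str) -> dict:
    """Build (without sending) an OpenAI-style chat completion request.

    The "openai/" prefix from the Custom Model setting is stripped:
    it only routes to the OpenAI-compatible client, and the remainder
    is the model name the endpoint sees.
    """
    model_name = model.removeprefix("openai/")
    return {
        "url": f"{GROQ_BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model_name,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("openai/llama3-70b-8192", "Hello", "YOUR_GROQ_API_KEY")
print(req["url"])
# -> https://api.groq.com/openai/v1/chat/completions
```

Because the request shape is identical to OpenAI's, any OpenAI-compatible client works once it is pointed at Groq's base URL with a Groq API key.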