I'm trying to configure a Z.ai GLM model in the Continue.dev VS Code extension, but the model doesn't appear in the model dropdown even though the configuration looks correct to me.
My Current Configuration
I have the following in ~/.continue/config.yaml:
```yaml
name: Local Agent
version: 1.0.0
schema: v1
models:
  - name: "ZGLM 4.6"
    provider: openai
    model: "glm-4.6"
    apiBase: "https://api.z.ai/api/coding/paas/v4"
    apiKey: "my-api-key"
    roles:
      - chat
      - edit
      - apply
```
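To rule out the API side, a direct request with the same base URL and key can be made independently of Continue. This is only a sketch: it assumes the Z.ai base exposes the standard OpenAI-style `/chat/completions` path (which is what an OpenAI-compatible provider is generally expected to serve) and uses the `requests` library; the key is the same placeholder as in the config.

```python
# Direct check of apiBase + apiKey outside Continue.
# Assumption: the endpoint follows the OpenAI-compatible /chat/completions path.
import requests

API_BASE = "https://api.z.ai/api/coding/paas/v4"
API_KEY = "my-api-key"  # placeholder, same as in config.yaml

resp = requests.post(
    f"{API_BASE}/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "glm-4.6",
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 8,
    },
    timeout=30,
)
print(resp.status_code)
print(resp.text[:500])
```

A 200 response here would at least confirm the credentials and base URL, which would leave the problem on the Continue side.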
What I've Tried
- Verified that the YAML syntax is valid (see the parse check after this list)
- Restarted VS Code multiple times
- Followed the Continue.dev documentation for OpenAI-compatible providers
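For reference, this is the kind of check I mean by "valid syntax" (a minimal PyYAML sketch; it only confirms the file parses and that the model entry is nested under `models`, not that Continue accepts the schema):

```python
# Parse ~/.continue/config.yaml and show how the model entries are nested.
# Assumes PyYAML is installed (pip install pyyaml).
from pathlib import Path
import yaml

config_path = Path.home() / ".continue" / "config.yaml"
with open(config_path) as f:
    config = yaml.safe_load(f)

print("Top-level keys:", sorted(config))
for entry in config.get("models") or []:
    if isinstance(entry, dict):
        print(entry.get("name"), "->", entry.get("provider"),
              entry.get("model"), entry.get("roles"))
    else:
        print("Unexpected entry under models:", entry)
# If "ZGLM 4.6" doesn't show up here with its provider/model/roles,
# the nesting in the file itself is off.
```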
Expected Behavior
The "ZGLM 4.6" model should appear in Continue.dev's model dropdown for Chat, Edit, and other roles.
Actual Behavior
The model doesn't appear in any dropdown menus. Only the default models (Claude Sonnet 4.5, Mercury Coder, etc.) are visible.
What am I missing in my configuration? Are there any common pitfalls when setting up custom OpenAI-compatible models in Continue.dev?
