
I'm trying to configure a Z.ai GLM model in the Continue.dev VS Code extension, but the model doesn't appear in the dropdown menu despite having what appears to be a correct configuration.

My Current Configuration

I have the following in ~/.continue/config.yaml:

name: Local Agent
version: 1.0.0
schema: v1
models:
  - name: "ZGLM 4.6"
    provider: openai
    model: "glm-4.6"
    apiBase: "https://api.z.ai/api/coding/paas/v4"
    apiKey: "my-api-key"
    roles:
      - chat
      - edit
      - apply

What I've Tried:

Expected Behavior

The "ZGLM 4.6" model should appear in Continue.dev's model dropdown for Chat, Edit, and other roles.

Actual Behavior

The model doesn't appear in any dropdown menus. Only the default models (Claude Sonnet 4.5, Mercury Coder, etc.) are visible.

What am I missing in my configuration? Are there any common pitfalls when setting up custom OpenAI-compatible models in Continue.dev?

2 Answers


If the model doesn't appear, it usually means it can't be found. I'm not sure the openai provider serves Z.ai models (actually, this is the first time I've heard of them, but anyway).

You can use openrouter as a provider instead. They have all sorts of models: basically, they route requests to models registered from other providers, so if a model is well known, it will almost certainly be on OpenRouter.

Here's GLM 4.6's page on OpenRouter. I'd guess this config will work for that model:

name: Local Agent
version: 1.0.0
schema: v1
models:
  - name: "ZGLM 4.6"
    provider: openrouter
    model: z-ai/glm-4.6 # Pasted from OpenRouter site
    apiBase: "https://openrouter.ai/api/v1" # OpenRouter's endpoint; the Z.ai URL from the question won't work here
    apiKey: ${{ secrets.OPENROUTER_API_KEY }}
    roles:
      - chat
      - edit
      - apply

Well, if the model actually is served through OpenAI (I don't know which models they provide), it could just be that you mistyped the model's ID.

On Continue Hub there's this config, and it uses GLM-4.6 as the identifier. That one might work too.
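For reference, here's what the question's config would look like with the Hub's capitalized identifier swapped in. This is just a sketch: I'm assuming the only difference that matters is the model ID, and I haven't tested whether Z.ai's API is case-sensitive about it.

```yaml
name: Local Agent
version: 1.0.0
schema: v1
models:
  - name: "ZGLM 4.6"
    provider: openai                 # Z.ai's docs claim OpenAI compatibility
    model: "GLM-4.6"                 # identifier as used on Continue Hub
    apiBase: "https://api.z.ai/api/coding/paas/v4"
    apiKey: "my-api-key"
    roles:
      - chat
      - edit
      - apply
```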

Though, don't judge if it's no help (e.g. if the identifiers are all fine). I'm kinda new to Continue and all that stuff.
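One more debugging idea: you can take the extension out of the equation and hit the endpoint directly with curl. This assumes the apiBase and model ID from the question and that the API really is OpenAI-compatible, as Z.ai's docs claim; put your real key in ZAI_API_KEY. If this returns a normal chat completion, the problem is on the Continue side, not the API side.

```shell
# Sanity-check the Z.ai endpoint and API key outside Continue.
# apiBase and model ID taken from the question; key is a placeholder.
curl -s "https://api.z.ai/api/coding/paas/v4/chat/completions" \
  -H "Authorization: Bearer $ZAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "glm-4.6", "messages": [{"role": "user", "content": "ping"}]}'
```

A 401 here points at the key, a 404 at the apiBase or model ID, and a valid JSON response means the config values themselves are fine.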


2 Comments

Thanks a lot for your attention! The official Z.ai documentation says its API is OpenAI-compatible. Z.ai isn't in the list of Continue's Model Providers, but I guess it should still be possible to use OpenAI-compatible APIs (I'm new to this stuff too!).
Thank you for providing a working example. After looking at the config you linked, I realized my config should work. I then looked more thoroughly through the Continue extension and discovered that I had to select the Local Config instead of the Default Assistant.


Thanks for the attention, everyone!

I needed to look more thoroughly through the Continue extension: selecting Local Config instead of Default Assistant was the step I missed.

