Seclai

GLM 4.7 Flash

Type

LLM

Provider

Z.AI

GLM 4.7 Flash is a compact open-weight model optimized for lightweight deployment, balancing efficiency with strong coding, agentic task planning, and tool-collaboration capabilities in the 30B-parameter class.

Context tokens

203,000

Output tokens

16,384
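Given the 203,000-token context window and the 16,384-token output cap above, a client should keep prompt plus completion within the window. A minimal sketch of that budget check (the constants come from this page; the helper name is illustrative):

```python
# Token-budget check using the limits listed on this page.
CONTEXT_TOKENS = 203_000    # total context window
MAX_OUTPUT_TOKENS = 16_384  # per-response output cap

def max_prompt_tokens(desired_output: int) -> int:
    """Largest prompt that still leaves room for `desired_output` tokens."""
    desired_output = min(desired_output, MAX_OUTPUT_TOKENS)
    return CONTEXT_TOKENS - desired_output

print(max_prompt_tokens(16_384))  # 186616
```

Requesting more output than the cap allows is clamped to the cap, so the prompt budget never shrinks below 186,616 tokens.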

Schema

OpenAI-compatible chat format.

Schema documentation
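Because the model speaks the OpenAI-compatible chat format, a request body is the familiar `messages` array. A hedged sketch of building one (the model identifier string is an assumption for illustration, not taken from this page; check the schema documentation for the exact value your deployment expects):

```python
import json

# Build an OpenAI-compatible chat-completions payload.
# The "model" id below is a placeholder, not a confirmed identifier.
payload = {
    "model": "glm-4.7-flash",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the trade-offs of a 30B-class model."},
    ],
    "max_tokens": 1024,  # must stay within the 16,384-token output cap
}

body = json.dumps(payload)
print(body[:20])
```

The same payload shape works with any OpenAI-compatible client or a plain HTTP POST to the provider's chat-completions endpoint.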

Capabilities

Thinking
Multilingual

Supported languages

en (English)
zh (Chinese)

Supported tools

No tools enabled.

Pricing

Type     Credits   Units
Input    0.93      Credits per 1k tokens
Output   5.32      Credits per 1k tokens
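With the per-1k-token rates above, the credit cost of a call is a straightforward linear sum over input and output tokens. A small sketch:

```python
INPUT_RATE = 0.93   # credits per 1k input tokens (from the table above)
OUTPUT_RATE = 5.32  # credits per 1k output tokens

def cost_credits(input_tokens: int, output_tokens: int) -> float:
    """Total credits for one request at the listed rates."""
    return input_tokens / 1000 * INPUT_RATE + output_tokens / 1000 * OUTPUT_RATE

# e.g. a 10k-token prompt with a 2k-token completion:
print(round(cost_credits(10_000, 2_000), 2))  # 19.94
```

Note the asymmetry: output tokens cost roughly 5.7x input tokens, so long completions dominate the bill.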

Variants

No variants available for this model.

Try This Model

Write a prompt and experiment with GLM 4.7 Flash in the playground. You can compare it with other models side by side.