DeepSeek V3.2

DeepSeek V3.2 is a mixture-of-experts language model with 671 billion total parameters and 37 billion active parameters, featuring DeepSeek Sparse Attention (DSA) for efficient long-context inference and a scalable reinforcement learning framework for superior reasoning. It introduces a 'thinking with tools' capability that integrates chain-of-thought reasoning directly into tool-use scenarios, achieving GPT-5-class performance — including gold-medal results at the 2025 IMO and IOI — across advanced reasoning, coding, and agentic tasks.

Features

Serverless API

DeepSeek V3.2 is available via sciforium's serverless API, where you pay per token. There are several ways to call the sciforium API, including sciforium's Python client, the REST API, or OpenAI's Python client.
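As a minimal sketch of the REST route using only the Python standard library, the request below builds an OpenAI-compatible chat-completions call. The base URL, API key, and model identifier are assumptions; check sciforium's docs for the real values.

```python
import json
import urllib.request

# Hypothetical endpoint and model ID -- consult sciforium's API docs for the real ones.
API_URL = "https://api.sciforium.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

payload = {
    "model": "deepseek-v3.2",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Explain sparse attention in one sentence."}
    ],
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Sending requires a valid key and network access:
# with urllib.request.urlopen(request) as resp:
#     reply = json.loads(resp.read())
#     print(reply["choices"][0]["message"]["content"])
```

The same request body works unchanged with OpenAI's Python client by pointing its `base_url` at the sciforium endpoint.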

Docs

Agentic Capabilities

Integrates chain-of-thought reasoning directly into tool use ('thinking with tools'), backed by DeepSeek Sparse Attention (DSA) for efficient long-context inference and a scalable reinforcement learning framework for superior reasoning.
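The 'thinking with tools' flow maps onto the familiar OpenAI-style function-calling schema: you declare tools in the request, and the model interleaves its reasoning with decisions about when to call them. A minimal sketch of such a request body, where the model ID and the tool definition are illustrative assumptions:

```python
import json

# Illustrative tool-calling request body. The model reasons step by step,
# then may emit a call to the declared tool. Model ID and tool are assumptions.
request_body = {
    "model": "deepseek-v3.2",
    "messages": [
        {"role": "user", "content": "What's 37 * 671? Use the calculator if needed."}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "calculator",
                "description": "Evaluate a basic arithmetic expression.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "expression": {"type": "string"}
                    },
                    "required": ["expression"],
                },
            },
        }
    ],
}

print(json.dumps(request_body, indent=2))
```

If the model decides to use the tool, the response contains a `tool_calls` entry; your code executes the tool and sends the result back as a `tool` role message so the model can continue reasoning.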

Docs