ERNIE-4.5-300B-A47B is a 300B-parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool calling, and extended context lengths up to 131k tokens. It is suitable for general-purpose LLM applications with high reasoning and throughput demands.
| Spec | Value |
|---|---|
| Context | 123K |
| Max Output | 12K |
| Parameters | 300B |
| Input price | $0.280 |
| Output price | $1.10 |
| Platform | Input | Output |
|---|---|---|
| OpenRouter | $0.280 | $1.10 |
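Assuming the listed prices follow the common per-million-token convention (an assumption; the table does not state its units), a request's cost can be estimated with a short sketch like this. The function name and defaults are illustrative, not part of any official API:

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price: float = 0.280,
                  output_price: float = 1.10) -> float:
    """Estimate USD cost for one request.

    Assumes prices are quoted per 1,000,000 tokens (hypothetical
    convention here); defaults match the table above.
    """
    return (input_tokens * input_price
            + output_tokens * output_price) / 1_000_000

# Example: a 100K-token prompt with a 12K-token completion
cost = estimate_cost(100_000, 12_000)
print(f"${cost:.4f}")  # → $0.0412
```

Under this assumption, even a near-maximum-context request stays well under a dollar, which is worth keeping in mind when comparing platforms.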
Estimates based on INT8 quantization. Actual requirements vary by framework and configuration.
Data sourced from official provider APIs and documentation
Last updated: Mar 24, 2026