Baidu: ERNIE 4.5 300B A47B

Server-rendered model summary page for indexing/share previews. Use the interactive explorer for full filtering and comparison.

Match confidence: Unmatched
Source type: openrouter_only
Context window
123K
Arena overall rank
—
Input price
$0.28 / 1M
Output price
$1.10 / 1M

Identifiers & provenance

Primary ID
baidu/ernie-4.5-300b-a47b
OpenRouter ID
baidu/ernie-4.5-300b-a47b
Canonical slug
baidu/ernie-4.5-300b-a47b

Source semantics

  • Arena rank is a human-preference leaderboard signal, not a universal truth metric.
  • OpenRouter usage/popularity reflects adoption/traffic, not benchmark quality.
  • Pricing fields may differ by provider and can include extra modes beyond prompt/completion.

Read more on Methodology & data sources.
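The per-million-token prices shown on this page are derived from OpenRouter's raw per-token USD price strings (the "prompt" and "completion" fields in the snapshot below). A minimal sketch of that conversion, assuming the field names and string format used in the snapshot:

```python
# Convert OpenRouter per-token USD price strings to USD per 1M tokens (PPM).
def to_ppm(per_token: str) -> float:
    """Parse a per-token price string and scale it to 1M tokens."""
    return round(float(per_token) * 1_000_000, 6)

# Values copied from the raw snapshot's "pricing" object.
pricing = {"prompt": "0.00000028", "completion": "0.0000011"}
ppm = {k: to_ppm(v) for k, v in pricing.items()}
print(ppm)  # {'prompt': 0.28, 'completion': 1.1}
```

This matches the snapshot's precomputed "PPM" object (0.28 prompt, 1.1 completion); note that providers may expose extra pricing modes beyond these two fields.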

Description

ERNIE-4.5-300B-A47B is a 300B parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool parameters, and extended context lengths up to 131k tokens. Suitable for general-purpose LLM applications with high reasoning and throughput demands.

Raw fields snapshot

{
  "id": "baidu/ernie-4.5-300b-a47b",
  "name": "Baidu: ERNIE 4.5 300B A47B ",
  "description": "ERNIE-4.5-300B-A47B is a 300B parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool parameters, and extended context lengths up to 131k tokens. Suitable for general-purpose LLM applications with high reasoning and throughput demands.",
  "created": 1751300139,
  "canonical_slug": "baidu/ernie-4.5-300b-a47b",
  "hugging_face_id": "baidu/ERNIE-4.5-300B-A47B-PT",
  "source_type": "openrouter_only",
  "context_length": 123000,
  "max_completion_tokens": 12000,
  "is_moderated": false,
  "architecture": {
    "modality": "text->text",
    "input_modalities": [
      "text"
    ],
    "output_modalities": [
      "text"
    ],
    "tokenizer": "Other",
    "instruct_type": null
  },
  "input_modalities": [
    "text"
  ],
  "output_modalities": [
    "text"
  ],
  "modality": "text->text",
  "tokenizer": "Other",
  "instruct_type": null,
  "supported_parameters": [
    "frequency_penalty",
    "max_tokens",
    "presence_penalty",
    "repetition_penalty",
    "response_format",
    "seed",
    "stop",
    "structured_outputs",
    "temperature",
    "top_k",
    "top_p"
  ],
  "default_parameters": {},
  "per_request_limits": null,
  "top_provider": {
    "context_length": 123000,
    "max_completion_tokens": 12000,
    "is_moderated": false
  },
  "pricing": {
    "prompt": "0.00000028",
    "completion": "0.0000011"
  },
  "PPM": {
    "prompt": 0.28,
    "completion": 1.1
  },
  "openrouter_raw": {
    "id": "baidu/ernie-4.5-300b-a47b",
    "canonical_slug": "baidu/ernie-4.5-300b-a47b",
    "hugging_face_id": "baidu/ERNIE-4.5-300B-A47B-PT",
    "name": "Baidu: ERNIE 4.5 300B A47B ",
    "created": 1751300139,
    "description": "ERNIE-4.5-300B-A47B is a 300B parameter Mixture-of-Experts (MoE) language model developed by Baidu as part of the ERNIE 4.5 series. It activates 47B parameters per token and supports text generation in both English and Chinese. Optimized for high-throughput inference and efficient scaling, it uses a heterogeneous MoE structure with advanced routing and quantization strategies, including FP8 and 2-bit formats. This version is fine-tuned for language-only tasks and supports reasoning, tool parameters, and extended context lengths up to 131k tokens. Suitable for general-purpose LLM applications with high reasoning and throughput demands.",
    "context_length": 123000,
    "architecture": {
      "modality": "text->text",
      "input_modalities": [
        "text"
      ],
      "output_modalities": [
        "text"
      ],
      "tokenizer": "Other",
      "instruct_type": null
    },
    "pricing": {
      "prompt": "0.00000028",
      "completion": "0.0000011"
    },
    "top_provider": {
      "context_length": 123000,
      "max_completion_tokens": 12000,
      "is_moderated": false
    },
    "per_request_limits": null,
    "supported_parameters": [
      "frequency_penalty",
      "max_tokens",
      "presence_penalty",
      "repetition_penalty",
      "response_format",
      "seed",
      "stop",
      "structured_outputs",
      "temperature",
      "top_k",
      "top_p"
    ],
    "default_parameters": {},
    "expiration_date": null
  }
}
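One practical use of this snapshot is gating request options on the model's supported_parameters list before calling it. A short sketch, where the parameter names come from the snapshot above but the desired options dict (including the unsupported "logit_bias" entry) is a hypothetical illustration:

```python
# Filter a request's sampling options down to what this model accepts,
# based on the supported_parameters list from the raw snapshot.
supported = {
    "frequency_penalty", "max_tokens", "presence_penalty",
    "repetition_penalty", "response_format", "seed", "stop",
    "structured_outputs", "temperature", "top_k", "top_p",
}

# Hypothetical caller options; "logit_bias" is not in the supported set.
desired = {"temperature": 0.7, "top_p": 0.9, "logit_bias": {"50256": -100}}

accepted = {k: v for k, v in desired.items() if k in supported}
dropped = sorted(set(desired) - supported)
print(accepted)  # {'temperature': 0.7, 'top_p': 0.9}
print(dropped)   # ['logit_bias']
```

Dropping unsupported keys client-side avoids provider-specific errors when the same request template is reused across models with different parameter support.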
Baidu: ERNIE 4.5 300B A47B · NNZen