Documentation

Inference playground, model catalog, and GPU planning tools.

Getting Started

The basics of using Inferbase

Inferbase is an AI inference and model intelligence platform. Run models in the playground, compare them across providers, and plan GPU infrastructure.

Quick Start

  1. Open the Playground to run models instantly
  2. Browse the model catalog to explore and compare AI models
  3. Use the Model Finder to get recommendations for your use case

Platform

  • Inference Playground: Run models and compare outputs side-by-side, with smart routing
  • Model Catalog: Browse and filter AI models across providers
  • Model Comparison: Compare up to 4 models side-by-side on benchmarks and specs
  • Model Finder: Get recommendations based on your use case and priorities
  • GPU Catalog: Browse GPU specs and cloud pricing
  • GPU Capacity Planning: Estimate VRAM, throughput, and latency for self-hosting
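The capacity planner's internals aren't documented here, but the kind of VRAM estimate it produces can be sketched with the standard back-of-envelope rule: weight memory is parameter count times bytes per parameter, plus headroom for KV cache, activations, and framework buffers. The function name and the 20% overhead factor below are illustrative assumptions, not Inferbase's actual formula.

```python
def estimate_vram_gb(params_billions: float,
                     bytes_per_param: float = 2.0,
                     overhead: float = 1.2) -> float:
    """Rough VRAM needed to serve a model, in GB (hypothetical sketch).

    params_billions: parameter count in billions (e.g. 7 for a 7B model).
    bytes_per_param: 2 for FP16/BF16, 1 for INT8, 0.5 for 4-bit quantization.
    overhead: multiplier for KV cache, activations, and runtime buffers
              (a rough 20% here; real workloads vary with context length
              and batch size).
    """
    return params_billions * bytes_per_param * overhead

# A 7B model in FP16 comes out around 16.8 GB; 4-bit quantized, around 4.2 GB.
print(round(estimate_vram_gb(7), 1))         # FP16/BF16
print(round(estimate_vram_gb(7, 0.5), 1))    # 4-bit quantization
```

Numbers like these are a starting point for GPU selection; long contexts or large batch sizes can push KV-cache memory well past a flat overhead factor.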