---
title: Overview
description: A tabular foundation model that delivers strong predictions in seconds — no dataset-specific training required.
---

TabPFN is a transformer pre-trained on billions of synthetic datasets to “learn the learning process.” Instead of re-optimizing weights for every new dataset, TabPFN encodes inductive biases, priors, and optimization strategies and applies them to your data via in-context learning. That means one forward pass → high-quality predictions in seconds.

## How to access TabPFN

- **API (cloud):** the fastest way to get started with TabPFN. Access our models through the cloud without requiring local GPU resources.
- **Local installation:** for research and privacy-sensitive use cases, with GPU support and a scikit-learn compatible interface.
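The local workflow follows the usual scikit-learn estimator pattern. As a minimal sketch, `RandomForestClassifier` stands in below for TabPFN's classifier; with a local install, `from tabpfn import TabPFNClassifier` would take its place with the same `fit`/`predict_proba` calls (an assumption based on the scikit-learn compatible interface described above).

```python
# Sketch of the scikit-learn style workflow.
# RandomForestClassifier is a stand-in; swap in TabPFNClassifier
# from the `tabpfn` package for the real model (assumed API).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(random_state=0)  # stand-in for TabPFNClassifier()
clf.fit(X_train, y_train)      # for TabPFN: stores the context, no gradient training
proba = clf.predict_proba(X_test)  # class probabilities, one row per test sample
print(proba.shape)
```

Because TabPFN does a single forward pass rather than an optimization loop, `fit` is near-instant: it only registers the training data as context for inference.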

## Capabilities

- **Classification:** solve binary or multi-class problems with calibrated probabilities.
- **Regression:** estimate continuous values with uncertainty-aware outputs and minimal preprocessing.
- **Time series:** model time series (via TabPFN for forecasting) to predict future values and trends.
- **Outlier detection:** detect rare and anomalous samples using TabPFN.
- **Synthetic data:** generate realistic synthetic tabular data with TabPFN.
- **Fine-tuning:** optimize TabPFN models on your own data.
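The "uncertainty-aware outputs" for regression can be pictured as predicting an interval rather than a single number. The sketch below illustrates that pattern in the scikit-learn idiom using quantile gradient boosting as a stand-in; TabPFN's own regression API is not shown on this page, so the interface here is illustrative only.

```python
# Illustration of uncertainty-aware regression: predict a median plus
# an interval. Quantile GradientBoostingRegressor models stand in for
# TabPFN's regressor (assumed capability, not its actual API).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=300)  # noisy sine wave

# One model per quantile: 10th, 50th (median), and 90th percentiles.
models = {
    q: GradientBoostingRegressor(loss="quantile", alpha=q, random_state=0).fit(X, y)
    for q in (0.1, 0.5, 0.9)
}

X_new = np.array([[0.0]])
lo, med, hi = (models[q].predict(X_new)[0] for q in (0.1, 0.5, 0.9))
print(f"median={med:.2f}, 80% interval=({lo:.2f}, {hi:.2f})")
```

A point estimate plus an interval lets downstream code act on how confident the model is, not just on its best guess.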

## Why teams choose TabPFN

- Reaches tuned-ensemble–level performance with near-instant training.
- Skips repeated training loops: simply update the context and TabPFN performs zero-shot inference.
- Plugs into any workflow via the familiar `scikit-learn` interface or the Prior Labs API.
- Handles missing values, outliers, and categorical & text features natively.
- Returns calibrated probabilities and integrates SHAP for explainable outcomes.

## Get Started

Get up and running in minutes with step-by-step instructions.