Uncertainty Quantification 360
Uncertainty quantification (UQ) gives AI the ability to express that it is unsure, adding critical transparency for the safe deployment and use of AI. This extensible open-source toolkit helps you estimate, communicate, and use uncertainty in machine learning model predictions throughout the AI application lifecycle. We invite you to use it and improve it.

Algorithms

Auxiliary Interval Predictor
Use an auxiliary model to improve the calibration of UQ generated by the original model.
Blackbox Metamodel Classification
Extract confidence scores from trained black-box classification models using a meta-model.
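One way to sketch the meta-model idea (using scikit-learn rather than the toolkit's own API, with all names below illustrative) is to train a second classifier to predict whether the black-box model's prediction is correct; the meta-model's predicted probability then serves as a confidence score.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=3000, n_informative=5, random_state=0)
X_base, X_meta, y_base, y_meta = train_test_split(X, y, test_size=0.5, random_state=0)

# Black box: we only use its predictions, never its internals.
blackbox = GradientBoostingClassifier(random_state=0).fit(X_base, y_base)

# Meta-model learns to predict whether the black box is correct on held-out data.
meta_targets = (blackbox.predict(X_meta) == y_meta).astype(int)
meta = LogisticRegression(max_iter=1000).fit(X_meta, meta_targets)

confidence = meta.predict_proba(X_meta)[:, 1]  # per-point confidence score
```

In practice the meta-model would be trained on data disjoint from where confidence is reported; the single split here keeps the sketch short.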
Blackbox Metamodel Regression
Extract prediction intervals from trained black-box regression models using a meta-model.
Classification Calibration
Post-hoc calibration of classification models using Isotonic Regression and Platt Scaling.
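The same post-hoc recipe can be sketched with scikit-learn's `CalibratedClassifierCV`, which supports both Platt scaling (`method="sigmoid"`) and isotonic regression; this is an illustration, not the toolkit's own interface.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.calibration import CalibratedClassifierCV

X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

base = RandomForestClassifier(n_estimators=50, random_state=0)
# Isotonic regression fitted on held-out folds; use method="sigmoid" for Platt scaling
calibrated = CalibratedClassifierCV(base, method="isotonic", cv=3)
calibrated.fit(X_tr, y_tr)

probs = calibrated.predict_proba(X_te)[:, 1]  # calibrated class probabilities
```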
Heteroscedastic Regression
Train regression models that capture data uncertainty, assuming the targets are noisy and the amount of noise varies between data points.
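A minimal sketch of the idea, assuming a linear mean and a linear log-variance fit jointly by gradient descent on the Gaussian negative log-likelihood (plain NumPy, not the toolkit's API):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 4, 500)
y = 2.0 * x + rng.normal(0, 0.2 + 0.4 * x)  # noise level grows with x

# Parameters: mean a*x + b, log-variance c*x + d
a, b, c, d = 0.0, 0.0, 0.0, 0.0
lr = 0.01
for _ in range(3000):
    mu = a * x + b
    logvar = c * x + d
    var = np.exp(logvar)
    resid = y - mu
    # Gradients of the mean Gaussian negative log-likelihood
    dmu = -resid / var
    dlogvar = 0.5 * (1.0 - resid**2 / var)
    a -= lr * np.mean(dmu * x)
    b -= lr * np.mean(dmu)
    c -= lr * np.mean(dlogvar * x)
    d -= lr * np.mean(dlogvar)

sigma = np.exp(0.5 * (c * x + d))  # learned per-point noise level
```

After training, `sigma` should be larger for points with larger `x`, matching the simulated noise pattern.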
Homoscedastic Gaussian Process Regression
Train Gaussian Process Regression models with homoscedastic noise that capture data and model uncertainty.
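This pattern can be sketched with scikit-learn: a `WhiteKernel` term models the single homoscedastic noise level, and `return_std=True` returns the combined predictive uncertainty. Illustrative only; the toolkit's own estimator will differ.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, (40, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 40)

# RBF captures the signal; WhiteKernel fits one noise level shared by all points
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0).fit(X, y)

X_new = np.linspace(0, 5, 100).reshape(-1, 1)
mean, std = gpr.predict(X_new, return_std=True)  # data + model uncertainty
```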
Horseshoe BNN Classification
Train Bayesian neural network classifiers with Gaussian and Horseshoe priors that capture data and model uncertainty.
Horseshoe BNN Regression
Train Bayesian neural network regression models with Gaussian and Horseshoe priors that capture data and model uncertainty.
Infinitesimal Jackknife
Extract uncertainty from trained models by approximating the effect of training data perturbations on the model’s predictions.
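For ordinary least squares the idea reduces to a closed form: the directional derivative of the coefficients with respect to each training point's weight is (XᵀX)⁻¹xᵢrᵢ, and summing the squared per-point effects on a prediction gives an infinitesimal-jackknife variance estimate. A NumPy sketch under these simplifying assumptions (the toolkit handles more general models):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(0, 0.5, n)

# Fit OLS; (X^T X)^{-1} appears in every influence computation
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
resid = y - X @ beta

# Influence of down-weighting each training point on the coefficients
influence = XtX_inv @ (X * resid[:, None]).T  # shape (p, n)

x_star = np.array([1.0, 1.0, 1.0])
pred = x_star @ beta
# Infinitesimal-jackknife variance: sum of squared per-point effects
var_ij = np.sum((x_star @ influence) ** 2)
std_ij = np.sqrt(var_ij)
```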
Quantile Regression
Train Quantile Regression models that capture data uncertainty by learning separate models for the upper and lower quantiles to obtain prediction intervals.
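A common way to sketch this is gradient boosting under the pinball (quantile) loss, one model per quantile; scikit-learn is used here for illustration, not as the toolkit's API.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (500, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.3, 500)

# One model per quantile; together they bound a 90% prediction interval
lower = GradientBoostingRegressor(loss="quantile", alpha=0.05, random_state=0).fit(X, y)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95, random_state=0).fit(X, y)

lo, hi = lower.predict(X), upper.predict(X)
coverage = np.mean((y >= lo) & (y <= hi))  # empirical interval coverage
```

Because the two quantile models are trained independently, their predictions can occasionally cross; production implementations typically guard against this.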
UCC Recalibration
Recalibrate the UQ of a regression model to a specified operating point using the Uncertainty Characteristics Curve.
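As a loose, simplified stand-in for the UCC-based procedure (not the toolkit's algorithm), one can sweep a single scaling factor on the predicted standard deviations and keep the scale whose empirical coverage lands closest to the desired operating point:

```python
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.normal(0, 1, 1000)
y_pred = y_true + rng.normal(0, 0.5, 1000)   # actual error std is 0.5
y_std = np.full(1000, 0.2)                   # model is overconfident

def coverage(scale):
    half = 1.96 * scale * y_std              # rescaled 95% Gaussian interval
    return np.mean(np.abs(y_true - y_pred) <= half)

# Sweep scaling factors; keep the one closest to the 95% coverage target
scales = np.linspace(0.5, 10, 200)
best = scales[np.argmin([abs(coverage(s) - 0.95) for s in scales])]
recalibrated_std = best * y_std
```

The real UCC-based method chooses the operating point on a curve trading off coverage against interval width; the scalar sweep above only conveys the recalibration idea.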