Uncertainty Quantification 360
Uncertainty quantification (UQ) gives AI the ability to express that it is unsure, adding critical transparency for the safe deployment and use of AI. This extensible open-source toolkit can help you estimate, communicate, and use uncertainty in machine learning model predictions throughout the AI application lifecycle. We invite you to use it and improve it.
Tutorials
Calibrated Housing Price Prediction
See how to generate and use UQ for a regression model to help decision-makers set the right housing price.
Selective Classification on Adult Income Dataset
See how to generate UQ for a classification model making income predictions, and use UQ to compare the model performance for different demographic groups.
Algorithms
Auxiliary Interval Predictor
Use an auxiliary model to improve the calibration of UQ generated by the original model.
Blackbox Metamodel Classification
Extract confidence scores from trained black-box classification models using a meta-model.
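To illustrate the meta-model idea (this is a conceptual sketch with scikit-learn, not the toolkit's API): train a base classifier, then train a second "meta" model to predict whether the base model's prediction is correct, and use the meta-model's probability as a confidence score.

```python
# Illustrative blackbox meta-model sketch: the meta-model sees only the
# base model's outputs, so the base model can remain a black box.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)
X_base, X_meta, y_base, y_meta = train_test_split(X, y, random_state=0)

base = RandomForestClassifier(random_state=0).fit(X_base, y_base)

# Meta-features: base-model class probabilities on held-out data.
# Meta-target: whether the base model's prediction was correct.
meta_features = base.predict_proba(X_meta)
correct = (base.predict(X_meta) == y_meta).astype(int)
meta = LogisticRegression().fit(meta_features, correct)

# Confidence score for new points = P(base prediction is correct).
confidence = meta.predict_proba(base.predict_proba(X[:5]))[:, 1]
```

Holding out a separate split for the meta-model matters: correctness measured on the base model's own training data would be overly optimistic.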
Blackbox Metamodel Regression
Extract prediction intervals from trained black-box regression models using a meta-model.
Classification Calibration
Post-hoc calibration of classification models using Isotonic Regression and Platt Scaling.
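A minimal post-hoc calibration sketch using scikit-learn (illustrative, not the toolkit's API): `CalibratedClassifierCV` with `method="sigmoid"` performs Platt scaling, while `method="isotonic"` fits an isotonic regression on top of the uncalibrated scores.

```python
# Calibrate an SVM's decision scores into probabilities, post hoc.
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=1000, random_state=0)

# Platt scaling: fit a sigmoid on cross-validated decision scores.
platt = CalibratedClassifierCV(LinearSVC(dual=False), method="sigmoid", cv=3).fit(X, y)
# Isotonic regression: fit a monotone step function instead.
iso = CalibratedClassifierCV(LinearSVC(dual=False), method="isotonic", cv=3).fit(X, y)

probs_platt = platt.predict_proba(X[:5])  # calibrated class probabilities
```

Isotonic regression is more flexible but can overfit on small calibration sets; Platt scaling is the safer default when data is scarce.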
Heteroscedastic Regression
Train regression models that capture data uncertainty, assuming the targets are noisy and the amount of noise varies between data points.
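A minimal heteroscedastic-regression sketch (illustrative, not the toolkit's API): fit a linear mean `w*x + b` and a linear log-variance `v*x + c` jointly by minimizing the Gaussian negative log-likelihood, so the predicted noise level can vary from point to point.

```python
# Heteroscedastic regression: the model outputs both a mean and a
# per-point variance, trained with the Gaussian NLL.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.uniform(0, 4, 300)
y = 2.0 * x + rng.normal(scale=0.2 + 0.5 * x)  # noise grows with x

def nll(params):
    w, b, v, c = params
    mean, log_var = w * x + b, v * x + c
    # Gaussian negative log-likelihood (up to a constant).
    return np.sum(0.5 * log_var + 0.5 * (y - mean) ** 2 / np.exp(log_var))

w, b, v, c = minimize(nll, x0=[0.0, 0.0, 0.0, 0.0]).x
pred_std = np.exp(0.5 * (v * x + c))  # per-point predicted noise level
```

Parameterizing the log-variance keeps the predicted variance positive without constraints; the same trick is standard when a neural network predicts both outputs.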
Homoscedastic Gaussian Process Regression
Train Gaussian Process Regression models with homoscedastic noise that capture data and model uncertainty.
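A minimal Gaussian-process sketch with scikit-learn (illustrative, not the toolkit's API): an RBF kernel captures model uncertainty, which grows away from the training data, while a `WhiteKernel` adds a single homoscedastic noise level capturing data uncertainty.

```python
# Homoscedastic GP regression: one learned noise level for all points,
# plus model uncertainty that grows outside the training range.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(40, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=40)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0).fit(X, y)

# Predictive mean and standard deviation (data + model uncertainty).
mean, std = gpr.predict(np.array([[2.5], [10.0]]), return_std=True)
# std is larger at x=10, far outside the training range [0, 5]
```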
Horseshoe BNN classification
Train Bayesian neural network classifiers with Gaussian and Horseshoe priors that capture data and model uncertainty.
Horseshoe BNN regression
Train Bayesian neural network (BNN) regression models with Gaussian and Horseshoe priors that capture data and model uncertainty.
Infinitesimal Jackknife
Extract uncertainty from trained models by approximating the effect of training data perturbations on the model’s predictions.
Quantile Regression
Train Quantile Regression models that capture data uncertainty by learning two separate models for the upper and lower quantiles to obtain prediction intervals.
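A minimal quantile-regression sketch (illustrative, not the toolkit's API): two gradient-boosted models trained with the pinball loss at the 5th and 95th percentiles together give a 90% prediction interval.

```python
# Quantile regression via scikit-learn's gradient boosting:
# loss="quantile" with alpha selecting the target quantile.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 4, size=(500, 1))
y = 2.0 * X.ravel() + rng.normal(scale=0.5, size=500)

lower = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X, y)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)

lo, hi = lower.predict(X), upper.predict(X)
coverage = np.mean((y >= lo) & (y <= hi))  # should be near 0.90
```

Because the two quantile models are trained independently, the predicted bounds can occasionally cross; production implementations typically guard against that.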
UCC Recalibration
Recalibrate the UQ of a regression model to a specified operating point using the Uncertainty Characteristics Curve.
Metrics
Classification Metrics
Include Expected Calibration Error, Brier score, and other metrics for classification models. Diagnostic tools such as reliability diagrams and risk-vs-rejection-rate curves are also provided.
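To make the most-cited of these concrete, here is a minimal expected-calibration-error (ECE) sketch (illustrative, not the toolkit's API): bin predictions by confidence, then average the gap between each bin's accuracy and its mean confidence, weighted by bin size.

```python
# ECE: how far, on average, do stated confidences drift from
# observed accuracy?
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            # |bin accuracy - bin mean confidence|, weighted by bin mass.
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap
    return ece

conf = np.array([0.9, 0.8, 0.7, 0.95, 0.6])
hit = np.array([1, 1, 0, 1, 1])
print(expected_calibration_error(conf, hit))
```

A perfectly calibrated model (e.g. 80% confidence, 80% accuracy) scores zero; systematic overconfidence pushes the ECE up.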
Regression Metrics
Include Prediction Interval Coverage Probability (PICP) and Mean Prediction Interval Width (MPIW), among others.
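Both metrics are simple to state; here is a minimal sketch (illustrative, not the toolkit's API): PICP is the fraction of true values falling inside their prediction intervals, MPIW is the average interval width, and good UQ achieves high PICP with low MPIW.

```python
# PICP and MPIW for a set of prediction intervals [lower, upper].
import numpy as np

def picp(y, lower, upper):
    # Fraction of targets covered by their intervals.
    return np.mean((y >= lower) & (y <= upper))

def mpiw(lower, upper):
    # Average interval width.
    return np.mean(upper - lower)

y = np.array([1.0, 2.0, 3.0, 4.0])
lower = np.array([0.5, 1.8, 3.2, 3.0])
upper = np.array([1.5, 2.5, 3.8, 5.0])
print(picp(y, lower, upper), mpiw(lower, upper))  # 3 of 4 covered
```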
Uncertainty Characteristics Curve
A novel operating-point-agnostic approach for evaluating UQ, allowing comparison of the trade-off between PICP and MPIW.
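An illustrative sketch of the operating-point sweep behind such a curve (a simplification, not the toolkit's implementation): scale a model's uncertainty band by a range of factors and record the coverage/width pair at each scale, tracing out the trade-off instead of judging a single operating point.

```python
# Sweep interval scale factors and trace the coverage-vs-width curve.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=500)
pred = np.zeros(500)          # predicted means
band = np.full(500, 0.5)      # raw half-widths from some UQ method

curve = []
for scale in np.linspace(0.5, 4.0, 8):
    lo, hi = pred - scale * band, pred + scale * band
    coverage = np.mean((y >= lo) & (y <= hi))   # PICP-like axis
    width = np.mean(hi - lo)                    # MPIW-like axis
    curve.append((width, coverage))
# Coverage rises monotonically with width; the whole curve, rather than
# any single point, characterizes the quality of the uncertainty.
```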