Metadata-Version: 2.3
Name: psruq-python
Version: 0.0.3
Summary: Code for uncertainty quantification with proper scoring rules.
Requires-Python: >=3.10
Requires-Dist: bitsandbytes==0.42.0
Requires-Dist: matplotlib>=3.2.2
Requires-Dist: opencv-python>=4.10.0.84
Requires-Dist: pandas>=2.2.2
Requires-Dist: safetensors>=0.4.5
Requires-Dist: scikit-learn>=1.5.2
Requires-Dist: scipy>=1.14.1
Requires-Dist: seaborn>=0.13.2
Requires-Dist: torch-uncertainty
Requires-Dist: torch>=2.4.1
Requires-Dist: torchvision>=0.19.1
Requires-Dist: tqdm>=4.66.5
Provides-Extra: laplace
Requires-Dist: laplace-torch>=0.2.1; extra == 'laplace'
Provides-Extra: notebook
Requires-Dist: ipykernel>=6.29.5; extra == 'notebook'
Requires-Dist: jupyter>=1.1.1; extra == 'notebook'
Requires-Dist: jupyterlab>=4.2.5; extra == 'notebook'
Description-Content-Type: text/markdown

# Predictive Uncertainty Quantification via Risk Decompositions for Strictly Proper Scoring Rules

This repository contains code for the paper "Predictive Uncertainty Quantification via Risk Decompositions for Strictly Proper Scoring Rules," submitted to NeurIPS 2024.

### UV
We use uv to manage dependencies. An installation guide and documentation are available [here](https://docs.astral.sh/uv/getting-started/installation/#standalone-installer).

**Install uv**

On Linux/macOS, the following command installs uv on your system:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

**Install dependencies**

**All commands should be executed from the folder containing the `pyproject.toml` file.**

To install the dependencies declared in the `pyproject.toml` file, run
```bash
uv sync
```
This creates a `.venv` folder containing a virtual environment with all required dependencies.
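The package also declares optional dependency groups (`laplace` and `notebook`, listed under `Provides-Extra` above). Assuming a recent uv version, a sketch of installing them alongside the base dependencies with the `--extra` flag:

```shell
# Install base dependencies plus the optional extras declared in pyproject.toml
uv sync --extra laplace --extra notebook
```

Each `--extra` flag can be given independently if only one group is needed.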

**Run python script**

uv installs dependencies into the virtual environment located in `.venv/`. To run any Python code with it, use:
```bash
uv run python ...
```
For example:
```bash
cd experiments/laplace
uv run python main.py -f checkpoints/resnet18_ce.pth -d cifar10_one_batch -v
```

### Repository Structure

The repository is organized as follows:

### Experiments
This folder contains different experiments related to the paper.
- `cifar10`: Training scripts for the CIFAR-10 dataset.
- `laplace`: Inference scripts for the "Laplace Redux – Effortless Bayesian Deep Learning" model.

### Embedding Extraction
- `source/source/evaluation_utils.py`: Script to extract embeddings after the ensembles are trained.

### Notebooks
The `notebooks/` folder contains notebooks with data analysis of the experiments.
- joint_tables.ipynb: Prepares comprehensive pandas tables for both the OOD and misclassification problems.
- ood_analysis.ipynb: Generates plots for the OOD problem.
- ood_visualizations.ipynb: Creates visualizations for the OOD problem.
- mis_analysis.ipynb: Generates plots for the misclassification problem.
- mis_visualizations.ipynb: Creates visualizations for the misclassification problem.
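With the `notebook` extra installed, the notebooks can be opened from the project's virtual environment. A possible invocation (assuming `uv sync --extra notebook` has been run from the project root):

```shell
# Launch JupyterLab from the project environment, rooted at the notebooks/ folder
uv run jupyter lab notebooks/
```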

### Notes on Trained Models
Due to the 100MB supplementary material size limit, we cannot include the trained models in this repository. However, we will provide them upon request through any resource that preserves anonymity.