*Joint graph recovery and attribution from post-hoc XAI*

# GrATSI: GRaph-based Attributions for Time Series explainability

GrATSI is a framework for graph-based attributions in time-series models, bridging saliency methods and graph recovery. By using post-hoc explainers to build a structured graph, it supports understanding and evaluating feature interactions and feature relevance in temporal modelling.


## 📦 Installation

Clone and set up the environment:

```bash
git clone https://github.com/<your-username>/grats-xai.git
cd grats-xai

# Create conda environment
conda create -n grats python=3.9 -y
conda activate grats

# Install requirements
pip install -r requirements.txt
```

## Project Structure

```
├── configs/               # YAML configs for data generation & experiments
│   └── data_gen.yaml
├── src/
│   ├── datasets/          # Synthetic DBN generator
│   │   └── synthetic_dbn.py
│   ├── models/            # Simple baselines (e.g. LSTM)
│   │   └── simple_lstm.py
│   ├── explainers/        # Explainability methods (IG, TimeRISE, etc.)
│   └── evaluation/        # Metrics (infidelity, comprehensiveness, etc.)
├── runs/                  # Auto-saved experiments (ignored via .gitignore)
└── README.md
```

## 🚀 Usage

All pipeline stages are configured through the YAML config files.

Quick check (debugging)

```bash
python scripts/pipeline.py --config configs/pipeline_quick.yaml
```

Complete Run

```bash
python scripts/pipeline.py --config configs/pipeline.yaml
```

Aggregate the results across runs for visualization:

```bash
python scripts/agg_results.py
```
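As a rough illustration of what aggregation across run folders can look like, the sketch below assumes each run directory under `runs/` writes a `metrics.json` file; the actual layout used by `agg_results.py` may differ.

```python
# Hypothetical aggregation sketch: collect per-run metric files into records.
# Assumes each run folder contains a metrics.json (illustrative, not the
# repo's actual file layout).
import json
from pathlib import Path


def aggregate_runs(runs_dir):
    """Load metrics.json from every run folder into a list of records."""
    records = []
    for metrics_file in sorted(Path(runs_dir).glob("*/metrics.json")):
        with open(metrics_file) as f:
            record = json.load(f)
        record["run"] = metrics_file.parent.name  # tag the record with its run
        records.append(record)
    return records


def mean_metric(records, key):
    """Average a single metric across all runs that report it."""
    values = [r[key] for r in records if key in r]
    return sum(values) / len(values) if values else float("nan")
```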

### 1. Generate synthetic data

Outputs are saved under:

```
runs/dbn_n{params}/
  ├── train.pkl
  ├── val.pkl
  └── plots/
```
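The generator itself is not reproduced here; as a hedged sketch, a DBN-style synthetic process can be sampled as a linear VAR with a sparse lagged weight graph, as below. The function and variable names are illustrative, not the API of `synthetic_dbn.py`.

```python
# Illustrative sketch of a DBN-like synthetic generator: a linear VAR(p)
# process x_t = sum_l W_l @ x_{t-l} + noise, where the sparse W_l matrices
# form the ground-truth lagged graph. Names are hypothetical.
import numpy as np


def generate_var(n_series, length, n_features, lags, seed=0):
    """Sample sequences from a linear VAR(lags) process with a sparse graph."""
    rng = np.random.default_rng(seed)
    # One weight matrix per lag; sparsify so only ~30% of edges survive.
    W = rng.normal(0.0, 0.2, size=(lags, n_features, n_features))
    W *= rng.random(W.shape) < 0.3
    data = np.zeros((n_series, length, n_features))
    for s in range(n_series):
        x = np.zeros((length, n_features))
        x[:lags] = rng.normal(size=(lags, n_features))  # random initial state
        for t in range(lags, length):
            for lag in range(lags):
                x[t] += W[lag] @ x[t - lag - 1]
            x[t] += 0.1 * rng.normal(size=n_features)  # process noise
        data[s] = x
    return data, W


train, W_true = generate_var(n_series=8, length=50, n_features=5, lags=2)
```

In the real pipeline the sampled splits would be pickled to `train.pkl` / `val.pkl` under the run directory.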

### 2. Train a model
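Conceptually, the baseline in `src/models/simple_lstm.py` is a small recurrent model over the time dimension. Purely as an illustration (the real model is presumably built in a deep-learning framework), a single LSTM layer's forward pass can be written in NumPy:

```python
# Minimal LSTM forward pass in NumPy, for illustration only; not the
# repo's actual model implementation.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def lstm_forward(x, Wx, Wh, b):
    """Run a single-layer LSTM over x of shape (T, input_dim).

    Wx: (input_dim, 4*hidden), Wh: (hidden, 4*hidden), b: (4*hidden,).
    Returns the final hidden state of shape (hidden,).
    """
    hidden = Wh.shape[0]
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x_t in x:
        z = x_t @ Wx + h @ Wh + b
        i, f, o, g = np.split(z, 4)            # input, forget, output, cell gates
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c + i * g                      # update cell state
        h = o * np.tanh(c)                     # emit hidden state
    return h
```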

### 3. Run explainability

Choose an explainer:

```bash
# Integrated Gradients
# TimeRISE

# TODO: Integrated Hessians, DeepLIFT, Timex++, etc.
```
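To make the first option concrete, here is a generic Integrated Gradients sketch for a black-box scalar model, approximating gradients with central finite differences. This is not the interface of the repo's explainers; it only illustrates the attribution rule IG(x) = (x − baseline) × (average path gradient).

```python
# Generic Integrated Gradients sketch for a black-box scalar function,
# using central finite differences for the gradient (illustrative only).
import numpy as np


def numerical_grad(f, x, eps=1e-4):
    """Central-difference gradient of a scalar function f at x."""
    grad = np.zeros_like(x, dtype=float)
    for idx in np.ndindex(x.shape):
        xp, xm = x.copy(), x.copy()
        xp[idx] += eps
        xm[idx] -= eps
        grad[idx] = (f(xp) - f(xm)) / (2 * eps)
    return grad


def integrated_gradients(f, x, baseline=None, steps=32):
    """Approximate IG: (x - baseline) * average gradient along the path."""
    if baseline is None:
        baseline = np.zeros_like(x, dtype=float)
    avg_grad = np.zeros_like(x, dtype=float)
    for k in range(1, steps + 1):
        point = baseline + (k / steps) * (x - baseline)  # point on the straight path
        avg_grad += numerical_grad(f, point)
    return (x - baseline) * avg_grad / steps
```

For a linear model `f(x) = w @ x` with a zero baseline, this recovers the exact attributions `w * x`, a standard sanity check for IG implementations.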

### 4. Asymmetric Perturbation Response to estimate W_hat (p copies, one per lag)
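The exact perturbation-response procedure is not spelled out in this README; as a hedged sketch of the general idea, one can nudge input feature `j` at lag `l` and record how each predicted feature `i` responds, filling one (d × d) slice of `W_hat` per lag. The method's actual asymmetric variant may differ.

```python
# Hedged sketch of estimating a lagged influence matrix W_hat by
# perturbation response; names and details are illustrative, not the
# repo's actual procedure.
import numpy as np


def estimate_w_hat(model, window, n_lags, delta=1e-3):
    """model: maps a (p, d) window of past values to a (d,) prediction.

    Returns W_hat of shape (n_lags, d, d), where W_hat[l, i, j] is the
    response of output i to a perturbation of feature j at lag l+1.
    """
    p, d = window.shape
    base = model(window)
    W_hat = np.zeros((n_lags, d, d))
    for l in range(n_lags):
        for j in range(d):
            pert = window.copy()
            pert[p - 1 - l, j] += delta   # lag l+1, counted back from the end
            W_hat[l, :, j] = (model(pert) - base) / delta
    return W_hat
```

On a linear model whose next-step prediction is a sum of lagged matrix products, this recovers the true weight matrices exactly (up to floating-point error).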

### 5. Attribution refinement using the recovered graph
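One plausible form of graph-based refinement (a sketch, not necessarily the method used here) is to diffuse raw saliency scores along the recovered graph, so that features with strong edges share attribution mass:

```python
# Illustrative attribution refinement: mix each feature's score with a
# graph-weighted average of its neighbours' scores. Hypothetical, not
# the repo's actual refinement step.
import numpy as np


def refine_attributions(A, W_hat, alpha=0.3):
    """A: (T, d) raw attributions; W_hat: (d, d) recovered feature graph.

    Returns a convex combination of each feature's own score and the
    scores propagated from its graph neighbours.
    """
    # Row-normalise |W_hat| so each feature's neighbourhood weights sum to 1.
    W = np.abs(W_hat)
    row_sums = W.sum(axis=1, keepdims=True)
    W = np.divide(W, row_sums, out=np.zeros_like(W), where=row_sums > 0)
    # (T, d): alpha controls how much credit flows along the graph.
    return (1 - alpha) * A + alpha * A @ W.T
```

With `alpha=0` (or an identity graph) the attributions are returned unchanged, which makes the smoothing strength easy to ablate.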

## 📊 Example Output
