RohanSardar/CNNexplainability
🧠 CNN Explainability

This repository contains implementations of popular CNN explainability techniques, including CAM, Grad-CAM, and Grad-CAM++. The goal is to make it easy to visualize which parts of an image a convolutional neural network focuses on when making predictions.

At the moment, only Class Activation Mapping (CAM) is implemented. More methods — including Grad-CAM, Grad-CAM++, Score-CAM, and others — will be added soon.

🚧 Project Status

This project is a work in progress. Expect frequent updates as new explainability techniques are added.

🖼️ Class Activation Mapping (CAM)

CAM highlights the image regions that contribute most to a model’s prediction. For a network that ends in global average pooling followed by a linear classifier, the CAM for a class is the sum of the last convolutional layer’s feature maps, weighted by that class’s weights in the final linear layer. See class_activation_mapping.ipynb for the full implementation.
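The core computation is small. Below is a minimal NumPy sketch (independent of the notebook, which presumably uses a deep learning framework): given the last conv layer’s feature maps and the target class’s weights from the final linear layer, the CAM is their weighted sum, normalized to [0, 1] for display. The function name and toy inputs here are illustrative, not taken from the repository.

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    """Compute a CAM as the class-weighted sum of conv feature maps.

    feature_maps:  array of shape (C, H, W) — activations of the last conv layer
    class_weights: array of shape (C,) — final linear-layer weights for the target class
    """
    # Weighted sum over the channel axis -> (H, W) heatmap
    cam = np.tensordot(class_weights, feature_maps, axes=1)
    cam = np.maximum(cam, 0)       # keep only positive evidence for the class
    cam -= cam.min()               # shift to start at 0
    if cam.max() > 0:
        cam /= cam.max()           # scale to [0, 1] for visualization
    return cam

# Toy example: 4 feature maps of spatial size 7x7 (real models use e.g. 512x7x7)
rng = np.random.default_rng(0)
fmaps = rng.random((4, 7, 7))
weights = np.array([0.5, -0.2, 0.1, 0.8])
cam = class_activation_map(fmaps, weights)
```

To overlay the heatmap on the input image, the (H, W) map would then be upsampled to the image resolution (e.g. with bilinear interpolation) and blended with the original pixels.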

🤝 Contributing

Contributions, suggestions, and ideas are welcome! Feel free to open an issue or submit a pull request.
