Layer-Wise Relevance Propagation with Conservation Property for ResNet
Published in the European Conference on Computer Vision (ECCV), 2024
The black-box nature of neural network models sometimes masks the underlying logic of their inference processes. This opacity presents significant challenges in verifying the validity of the models’ predictions. Layer-wise Relevance Propagation (LRP) stands out as a well-established method that transparently traces the flow of a model’s prediction backward through its architecture by backpropagating relevance scores. However, LRP does not fully account for skip connections, and its application to the widely used ResNet architecture has not been thoroughly explored.
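As a point of reference for the conservation property mentioned above, the following is a minimal sketch of a standard LRP rule (the LRP-0 rule for a single linear layer), not the paper's ResNet-specific method; the function name and variables are our own. It illustrates how relevance backpropagated through a layer sums to the same total as the relevance entering it.

```python
import numpy as np

def lrp_linear(a, W, R_out):
    """Illustrative LRP-0 rule: redistribute output relevance R_out
    to the inputs a of a linear layer with weight matrix W."""
    z = a @ W           # forward pre-activations, shape (out,)
    s = R_out / z       # relevance per unit of activation
    R_in = a * (W @ s)  # relevance attributed to each input, shape (in,)
    return R_in

rng = np.random.default_rng(0)
a = rng.random(4) + 0.1        # positive inputs (avoids division by zero)
W = rng.random((4, 3)) + 0.1   # positive weights
R_out = a @ W                  # take the output activations as relevance
R_in = lrp_linear(a, W, R_out)

# Conservation property: total relevance is preserved across the layer.
print(np.isclose(R_in.sum(), R_out.sum()))  # True
```

The paper's contribution concerns extending such conservation-preserving rules to architectures with skip connections, where the naive rule above is insufficient because relevance splits across two branches.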
Project page: Click here
Paper link: Click here
Recommended citation: Otsuki, Seitaro, Tsumugi Iida, Félix Doublet, Tsubasa Hirakawa, Takayoshi Yamashita, Hironobu Fujiyoshi, and Komei Sugiura. "Layer-Wise Relevance Propagation with Conservation Property for ResNet." In European Conference on Computer Vision, pp. 349-364. Springer, Cham, 2025. https://arxiv.org/pdf/2407.09115