
What is Attention Anyways?

I thought I knew what “attention” was, so I looked around and found that everyone is quite sure they understand it, and that the latest version they happen to be using is the only one. But the idea of looking at the attention, or focus, or “internal thinking”, of a DNN to understand, or improve, its learning has been around for a while.

Reading That Led Me to These Articles

This great article, which gives an overview of many of these topics, guided me to many of the articles below:

New “Age” of Attention with Transformers

Learning with Attention for Image Classification and Labelling

The Original Use of Attention in CNNs

Torr, ICLR, 2018 - Learn to Pay Attention

https://arxiv.org/pdf/1804.02391.pdf

Bengio, ICML, 2015 - Show, Attend and Tell

http://proceedings.mlr.press/v37/xuc15.pdf

These papers use an attention calculation on an already-trained CNN to indicate to the user which features the CNN is using to make its classification, and they tie these attention scores to descriptive word labels.
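To make that concrete, here is a minimal Python/NumPy sketch of soft attention over CNN feature maps, in the spirit of Show, Attend and Tell. The dot-product scorer, the shapes, and the toy inputs are my simplifications for illustration, not the paper's exact formulation (the paper scores locations with a small learned network conditioned on the decoder state).

```python
import numpy as np

def soft_attention(features, query):
    """Soft attention over CNN spatial features.

    features: (L, D) array -- L spatial locations, each a D-dim feature vector
    query:    (D,) array   -- e.g. the decoder's hidden state at one time step

    Returns the attention weights over locations and the attended context vector.
    """
    # Relevance score per location. A plain dot product is used here as an
    # illustrative stand-in; the paper uses a small learned MLP.
    scores = features @ query                # (L,)
    # Softmax turns scores into weights that sum to 1 over the image locations.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # (L,)
    # The context vector is the weighted average feature, fed to the word predictor.
    context = weights @ features             # (D,)
    return weights, context

# Toy example: a 14x14 feature map flattened to 196 locations of 512-dim features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(196, 512))
hidden = rng.normal(size=(512,))
w, ctx = soft_attention(feats, hidden)
print(w.shape, ctx.shape, w.sum())           # (196,) (512,) ~1.0
```

Reshaping `w` back to 14x14 and overlaying it on the input image gives the kind of “where the network is looking” picture these papers show.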

Using Attention to make CNNs Interpretable was all the rage for a while

Grad-CAM and other algorithms create cool images showing us what CNNs are “thinking”.

But these maps are computed after the CNN has been trained, so we don’t really know how they relate to the learning process.
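As a rough illustration of that post-hoc recipe, here is a hedged PyTorch sketch of the Grad-CAM computation: spatially average the gradients of the class score to weight the last convolutional feature maps, take the ReLU of the weighted sum, and upsample it into a heatmap. The choice of ResNet-18, the `layer4` hook point, and the random stand-in input are my assumptions for illustration, not the paper's setup; the point to notice is that everything runs on an already-trained (frozen) network.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Randomly initialised here to keep the sketch self-contained; load pretrained
# weights and a real preprocessed image for meaningful heatmaps.
model = models.resnet18().eval()

activations, gradients = {}, {}

def fwd_hook(module, inputs, output):
    activations["maps"] = output            # (1, C, H, W) feature maps

def bwd_hook(module, grad_input, grad_output):
    gradients["maps"] = grad_output[0]      # d(score)/d(feature maps)

layer = model.layer4                         # last conv block (an assumption)
layer.register_forward_hook(fwd_hook)
layer.register_full_backward_hook(bwd_hook)

x = torch.randn(1, 3, 224, 224)              # stand-in for a preprocessed image
score = model(x).max()                        # score of the top predicted class
score.backward()

# Channel weights = spatially averaged gradients; CAM = ReLU of the weighted sum.
weights = gradients["maps"].mean(dim=(2, 3), keepdim=True)     # (1, C, 1, 1)
cam = F.relu((weights * activations["maps"]).sum(dim=1))        # (1, H, W)
cam = F.interpolate(cam.unsqueeze(1), size=(224, 224),
                    mode="bilinear", align_corners=False)
print(cam.shape)                              # heatmap to overlay on the image
```

Note that no parameter is updated anywhere in this snippet, which is exactly the complaint above: the explanation is read off the finished network, not tied to how it learned.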

Also, some people have pointed out that even these methods can highlight things that aren’t really there:
