Attention Module

10 December 2020, 7–9 PM CET

The paper “Rethinking Attention with Performers” introduced the Performer, a Transformer architecture that approximates full softmax attention with provable accuracy while reducing the quadratic space and time complexity of standard attention to linear in the sequence length.
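As an illustration (not taken from the event materials), the following NumPy sketch contrasts standard softmax attention with a simplified positive-random-feature approximation in the spirit of the Performer's FAVOR+ mechanism. The feature map phi(x) = exp(w·x − ‖x‖²/2)/√m gives an unbiased estimate of the softmax kernel, so attention can be computed as phi(Q)(phi(K)ᵀV) without ever forming the L×L attention matrix. Function names, dimensions, and the feature count m are illustrative choices, not the paper's reference implementation.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: materializes an L x L matrix, O(L^2) in sequence length.
    A = np.exp(Q @ K.T / np.sqrt(Q.shape[1]))
    return (A / A.sum(axis=1, keepdims=True)) @ V

def performer_attention(Q, K, V, m=4096, seed=0):
    # Linear attention with positive random features (a FAVOR+-style sketch):
    # phi(x) = exp(w.x - ||x||^2 / 2) / sqrt(m) estimates the softmax kernel,
    # so cost is O(L * m * d) instead of O(L^2 * d).
    d = Q.shape[1]
    Q, K = Q / d**0.25, K / d**0.25        # fold in the 1/sqrt(d) scaling
    W = np.random.default_rng(seed).normal(size=(m, d))
    def phi(X):
        return np.exp(X @ W.T - (X**2).sum(axis=1, keepdims=True) / 2) / np.sqrt(m)
    Qp, Kp = phi(Q), phi(K)
    num = Qp @ (Kp.T @ V)                  # (L, m) @ (m, d): never forms L x L
    den = Qp @ Kp.sum(axis=0)              # row-wise softmax normalizer
    return num / den[:, None]

# Compare the two on small random inputs.
L, d = 128, 16
rng = np.random.default_rng(1)
Q, K, V = (rng.normal(scale=0.5, size=(L, d)) for _ in range(3))
exact = softmax_attention(Q, K, V)
approx = performer_attention(Q, K, V)
print(np.abs(exact - approx).max())
```

Because the L×L attention matrix is never built, memory and compute grow linearly with sequence length, which is the efficiency gain the panel discusses.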

Recently, Sepp Hochreiter and his team showed connections between Transformer self-attention and modern Hopfield networks, connections that can also be extended to the new Performer networks.
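To make the connection concrete, here is a minimal sketch (an illustration under my own naming, not code from the paper) of the modern continuous Hopfield update rule ξ ← Xᵀ softmax(β X ξ): a single update step retrieving a stored pattern is exactly transformer attention with the query ξ and the stored patterns as both keys and values.

```python
import numpy as np

def hopfield_retrieve(X, xi, beta=1.0, steps=1):
    # Modern (continuous) Hopfield update: xi <- X^T softmax(beta * X xi).
    # One step is transformer attention with query xi and the rows of X
    # serving as both keys and values.
    for _ in range(steps):
        a = beta * (X @ xi)
        p = np.exp(a - a.max())    # numerically stable softmax
        p /= p.sum()
        xi = X.T @ p
    return xi

# Store a few random patterns and retrieve from a noisy query.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))               # 5 stored patterns of dimension 16
query = X[0] + 0.1 * rng.normal(size=16)   # corrupted version of pattern 0
retrieved = hopfield_retrieve(X, query, beta=1.0, steps=1)
print(np.abs(retrieved - X[0]).max())
```

With a sufficiently large β (inverse temperature), one update step snaps the noisy query onto the nearest stored pattern, which is the associative-memory reading of self-attention.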

We are delighted to host a fireside session with the authors of these papers, Sepp Hochreiter (IARAI), Krzysztof Choromanski (Google Research) and Johannes Brandstetter (JKU Linz). The panelists will discuss the new linear attention mechanism in Performers, which significantly improves computational efficiency for large inputs, and its connection to classical Hopfield networks.

©2021 IARAI - INSTITUTE OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE

