attention (Q102016)

Language: English
Label: attention
Also known as: Self-Attention Algorithm

    Statements

    Versions:
    0.1.0 (24 June 2022)
    0.2.0 (12 July 2022)
    0.3.0 (23 April 2023)
    0.4.0 (10 November 2023)
    Helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth on how to construct it. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
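    The scaled dot-product self-attention the package demonstrates (Vaswani et al. 2017) can be sketched as follows. This is an illustrative Python/NumPy rendering, not the package's own R code; all variable names and dimensions are chosen for the example.

    ```python
    import numpy as np

    def self_attention(X, Wq, Wk, Wv):
        """Single-head scaled dot-product self-attention over a token sequence X."""
        # Project the input sequence into queries, keys, and values.
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        d_k = K.shape[-1]
        # Similarity scores between every pair of tokens, scaled by sqrt(d_k).
        scores = Q @ K.T / np.sqrt(d_k)
        # Row-wise softmax (shifted by the row max for numerical stability).
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Each output token is a weighted mixture of the value vectors.
        return weights @ V

    rng = np.random.default_rng(0)
    X = rng.standard_normal((4, 8))   # 4 tokens, embedding dimension 8
    Wq = rng.standard_normal((8, 8))  # illustrative projection matrices
    Wk = rng.standard_normal((8, 8))
    Wv = rng.standard_normal((8, 8))
    out = self_attention(X, Wq, Wk, Wv)
    print(out.shape)  # (4, 8)
    ```

    The output has the same shape as the input sequence, so the operation can be stacked, which is the structure the package's vignettes build up step by step.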

    Identifiers
