attention (Q102016)

From MaRDI portal
 
Property / software version identifier: 0.1.0
Property / software version identifier: 0.1.0 / rank
Normal rank
Property / software version identifier: 0.1.0 / qualifier
publication date: 24 June 2022

Property / software version identifier: 0.2.0
Property / software version identifier: 0.2.0 / rank
Normal rank
Property / software version identifier: 0.2.0 / qualifier
publication date: 12 July 2022

Property / software version identifier: 0.3.0
Property / software version identifier: 0.3.0 / rank
Normal rank
Property / software version identifier: 0.3.0 / qualifier
publication date: 23 April 2023

Property / software version identifier: 0.4.0
Property / software version identifier: 0.4.0 / rank
Normal rank
Property / software version identifier: 0.4.0 / qualifier
publication date: 10 November 2023
Property / last update: 10 November 2023
Property / last update: 10 November 2023 / rank
Normal rank
Property / description: Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm. It is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>; Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>; and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
Property / description / rank
Normal rank
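
For context, the scaled dot-product self-attention that the vignettes build up can be sketched in a few lines of base R (the package's language). This is a minimal illustration following Vaswani et al. (2017); the function names below are hypothetical and do not reproduce the attention package's own API.

softmax <- function(x) {
  e <- exp(x - max(x))    # subtract the max for numerical stability
  e / sum(e)
}

self_attention <- function(Q, K, V) {
  d_k <- ncol(K)                            # dimensionality of the keys
  scores <- Q %*% t(K) / sqrt(d_k)          # scaled dot-product scores
  weights <- t(apply(scores, 1, softmax))   # row-wise softmax -> attention weights
  weights %*% V                             # each output row is a weighted sum of value rows
}

# Toy usage: three tokens with four-dimensional embeddings
set.seed(1)
X <- matrix(rnorm(12), nrow = 3)
self_attention(X, X, X)

Each row of the result is a convex combination of the rows of V, weighted by that token's attention distribution.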
Property / author: Bastiaan Quast
Property / author: Bastiaan Quast / rank
Normal rank
Property / copyright license: GNU General Public License
Property / copyright license: GNU General Public License / rank
Normal rank
Property / copyright license: GNU General Public License / qualifier
edition/version: ≥ 3 (English)

Property / cites work: Attention Is All You Need
Property / cites work: Attention Is All You Need / rank
Normal rank
Property / MaRDI profile type: MaRDI software profile
Property / MaRDI profile type: MaRDI software profile / rank
Normal rank

Latest revision as of 18:56, 12 March 2024

Language: English
Label: attention
Description: Self-Attention Algorithm