attention (Q102016): Difference between revisions
From MaRDI portal
Removed claim: cites work (P223): Attention Is All You Need (Q102015)
Changed an Item
Property / last update
Property / last update: 23 April 2023 / rank
Property / software version identifier
0.1.0
Property / software version identifier: 0.1.0 / rank
Normal rank
Property / software version identifier: 0.1.0 / qualifier
publication date: 24 June 2022
Property / software version identifier
0.4.0
Property / software version identifier: 0.4.0 / rank
Normal rank
Property / software version identifier: 0.4.0 / qualifier
publication date: 10 November 2023
Property / last update
10 November 2023
Property / last update: 10 November 2023 / rank
Normal rank
Property / description
Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm; it is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) <https://web.stanford.edu/~jurafsky/slp3/> "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) <https://www.youtube.com/watch?v=AIiwuClvH6k> "Attention and Memory in Deep Learning".
Property / description: Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm; it is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) <https://web.stanford.edu/~jurafsky/slp3/> "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) <https://www.youtube.com/watch?v=AIiwuClvH6k> "Attention and Memory in Deep Learning". / rank
Normal rank
Property / author
Property / author: Bastiaan Quast / rank
Normal rank
Property / copyright license
Property / copyright license: GNU General Public License / rank
Normal rank
Property / copyright license: GNU General Public License / qualifier
edition/version: ≥ 3 (English)
Property / cites work
Property / cites work: Attention Is All You Need / rank
Normal rank
Revision as of 15:11, 29 February 2024
Self-Attention Algorithm
Language | Label | Description | Also known as
---|---|---|---
English | attention | Self-Attention Algorithm |
Statements
last update
10 November 2023
0 references
description
Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm; it is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) <https://web.stanford.edu/~jurafsky/slp3/> "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) <https://www.youtube.com/watch?v=AIiwuClvH6k> "Attention and Memory in Deep Learning".
0 references
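The description refers to the scaled dot-product self-attention of Vaswani et al. (2017), which the package demonstrates in R vignettes. As a hedged illustration of that algorithm only — not of the package's own functions or API — a minimal NumPy sketch, with all names and shapes chosen here for the example:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention (Vaswani et al., 2017).

    X: (n_tokens, d_model) input embeddings.
    W_q, W_k, W_v: hypothetical projection matrices, d_model -> d_k (or d_v).
    """
    Q = X @ W_q                      # queries
    K = X @ W_k                      # keys
    V = X @ W_v                      # values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (n_tokens, n_tokens) similarities
    weights = softmax(scores)        # each row sums to 1
    return weights @ V               # attention-weighted sum of values

# Toy example: 3 tokens, d_model = 4, d_k = d_v = 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 2)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (3, 2)
```

The division by √d_k keeps the dot products from saturating the softmax as the key dimension grows, which is the "scaled" part of the name.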