github.com/nlpodyssey/spago/nn/attention
package · v1.1.0
Repository: https://github.com/nlpodyssey/spago.git
Documentation: pkg.go.dev
Versions: 1 · Dependencies: 3 · Dependents: 1 · Files: 44 SLOC
Packages
multiheadattention
selfattention
Functions
ScaledDotProductAttention
ScaledDotProductAttention is a self-attention mechanism that relates different positions of a single sequence in order to compute a representation of that same sequence.