# Package attention (spago v1.1.0)
Repository: https://github.com/nlpodyssey/spago.git
Documentation: pkg.go.dev

# Functions

ScaledDotProductAttention is a self-attention mechanism that relates different positions of a single sequence in order to compute a representation of that same sequence.
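
For orientation, the computation behind this operation is the standard scaled dot-product attention from "Attention Is All You Need": Attention(Q, K, V) = softmax(Q Kᵀ / √d_k) V. The sketch below is a minimal, self-contained Go illustration of that formula using plain float64 slices; the function and variable names are illustrative only and do not reflect spago's actual signature, which operates on the library's own matrix/tensor types.

```go
package main

import (
	"fmt"
	"math"
)

// softmax normalizes a slice of scores into a probability distribution,
// subtracting the maximum score first for numerical stability.
func softmax(scores []float64) []float64 {
	maxScore := math.Inf(-1)
	for _, s := range scores {
		if s > maxScore {
			maxScore = s
		}
	}
	out := make([]float64, len(scores))
	var sum float64
	for i, s := range scores {
		out[i] = math.Exp(s - maxScore)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

// scaledDotProductAttention computes softmax(Q Kᵀ / sqrt(d_k)) V.
// q, k, v are row-major matrices: one row per sequence position,
// with d_k columns for q/k and d_v columns for v.
func scaledDotProductAttention(q, k, v [][]float64) [][]float64 {
	dk := float64(len(k[0]))
	scale := 1.0 / math.Sqrt(dk)
	out := make([][]float64, len(q))
	for i, qi := range q {
		// Attention scores of query i against every key.
		scores := make([]float64, len(k))
		for j, kj := range k {
			var dot float64
			for d := range qi {
				dot += qi[d] * kj[d]
			}
			scores[j] = dot * scale
		}
		weights := softmax(scores)
		// Context vector: weighted sum of the value rows.
		ctx := make([]float64, len(v[0]))
		for j, w := range weights {
			for d, vjd := range v[j] {
				ctx[d] += w * vjd
			}
		}
		out[i] = ctx
	}
	return out
}

func main() {
	q := [][]float64{{1, 0}, {0, 1}}
	k := [][]float64{{1, 0}, {0, 1}}
	v := [][]float64{{1, 2}, {3, 4}}
	fmt.Println(scaledDotProductAttention(q, k, v))
}
```

Dividing the raw scores by √d_k keeps the dot products in a range where the softmax does not saturate as the key dimension grows, which is what distinguishes scaled dot-product attention from plain dot-product attention.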