Go-attention: A full attention mechanism and transformer in pure Go
Published on: 2025-07-07 14:38:50
From the Frontier Research Team at takara.ai, we present the first pure Go implementation of attention mechanisms and transformer layers, designed for high performance and ease of use.
Quick Start
Run our comprehensive examples:
```shell
# Get the module
go get github.com/takara-ai/go-attention

# Run the examples
go run api_examples.go
```
API Documentation
Core Types
```go
type Vector []float64 // Represents a 1D vector of float64 values
type Matrix []Vector  // Represents a 2D matrix of float64 values
```
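Since `Matrix` is just a slice of `Vector` rows, the core operation behind attention scoring is a plain inner product between a query vector and each row. A minimal sketch (the `Vector`/`Matrix` types are redeclared locally here so the snippet runs standalone; in real code you would use the `attention` package's types):

```go
package main

import "fmt"

// Local copies of the library's core types, mirroring
// attention.Vector and attention.Matrix.
type Vector []float64
type Matrix []Vector

// dot computes the inner product of two equal-length vectors.
func dot(a, b Vector) float64 {
	s := 0.0
	for i := range a {
		s += a[i] * b[i]
	}
	return s
}

func main() {
	q := Vector{1, 0, 1, 0}
	m := Matrix{
		{1, 0, 1, 0}, // identical to q: highest score
		{0, 1, 0, 1}, // orthogonal to q: zero score
	}
	for i, row := range m {
		fmt.Printf("dot(q, row %d) = %v\n", i, dot(q, row))
	}
}
```

Higher dot products mean the key is more similar to the query, which is exactly what the attention weights below are built from.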
1. Basic Dot-Product Attention
The simplest form of attention, useful for basic sequence-processing tasks.
```go
import "github.com/takara-ai/go-attention/attention"

// Create a query-key-value setup.
query := attention.Vector{1.0, 0.0, 1.0, 0.0} // Pattern to search for

keys := attention.Matrix{
	{1.0, 0.0, 1.0, 0.0}, // Similar to query
	{0.0, 1.0, 0.0, 1.0}, // Different from query
	{0.5, 0.5, 0.5, 0.5}, // Neutral pattern
}

values := attention.Matrix{
	{1.0, 2.0},
	// (the source article is truncated here; the remaining value
	// rows, one per key, are not shown)
}
```