# ToyGraph - A Go Project for Computational Graphs
I created ToyGraph as a hands-on way to better understand computational graphs and automatic differentiation. While I’m not making any speed guarantees, I’ve focused on performance where I can (e.g., reusing allocated matrices).
ToyGraph is designed to be flexible with the types of data it can handle. I started with single-value variables and gradually expanded to include matrices and vectors, all while keeping support for scalar values. The nodes in ToyGraph are typed, ensuring type safety at compile time, though matrix/vector shape safety is only checked at runtime. You can mix scalar and matrix/vector values within the same graph, but be aware that I don’t plan to add tensor support—so this probably isn’t the tool for machine learning.
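One way compile-time node typing like this can be sketched in Go is with generics. The types and functions below are illustrative assumptions for the sake of the example, not ToyGraph's actual API:

```go
package main

import "fmt"

// Node is parameterized by its value type, so connecting, say, a float64
// node where an int node is expected fails to compile. (Illustrative only;
// ToyGraph's real node types will differ.)
type Node[T any] struct {
	Value T
	Grad  T
}

// Add is defined only for nodes whose element type supports +, and both
// operands must share the same type parameter T.
func Add[T float64 | float32 | int](a, b *Node[T]) *Node[T] {
	return &Node[T]{Value: a.Value + b.Value}
}

func main() {
	x := &Node[float64]{Value: 2.5}
	y := &Node[float64]{Value: 1.5}
	z := Add(x, y)
	fmt.Println(z.Value) // 4

	// Add(x, &Node[int]{Value: 1}) // would not compile: mismatched node types
}
```

Shape mismatches between matrices, by contrast, only surface at runtime, since Go's type system has no notion of matrix dimensions.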
Currently, ToyGraph only runs on the CPU, though extending it to GPU support is something I might explore down the line (it would require a lot more code and operations). While the number of operations available is still limited, adding more is pretty straightforward. You can even use your own types, like a different matrix package than gonum, to build a graph.
## What can it do?
ToyGraph can perform both the forward pass and automatic differentiation on any equation you build. Variables in the equation can be either scalars (float64, float32, int) or gonum/mat dense vectors and matrices.

It works like this: first you build a graph (an equation represented as a set of nodes with connections between them). You then set the values of the variables on this graph and perform a forward pass, which computes the result of the equation (you can also inspect the result of any single node in the graph). If you want, you can then set the gradients of the output node and perform a backward pass, which propagates those gradients back through the graph. From there you can do things such as gradient descent (which is how neural nets learn). A ToyGraph graph can have as many inputs and outputs as you want.
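To make the build-graph / forward / backward flow concrete, here is a minimal self-contained sketch of the idea using a tiny scalar-only graph. All names here are invented for illustration; ToyGraph's real API will differ:

```go
package main

import "fmt"

// node holds a value, an accumulated gradient, its input nodes, and a
// closure that propagates this node's gradient back to its inputs.
type node struct {
	value, grad float64
	inputs      []*node
	backward    func()
}

func variable(v float64) *node { return &node{value: v} }

func mul(a, b *node) *node {
	out := &node{value: a.value * b.value, inputs: []*node{a, b}}
	out.backward = func() {
		a.grad += b.value * out.grad // d(ab)/da = b
		b.grad += a.value * out.grad // d(ab)/db = a
	}
	return out
}

func add(a, b *node) *node {
	out := &node{value: a.value + b.value, inputs: []*node{a, b}}
	out.backward = func() {
		a.grad += out.grad
		b.grad += out.grad
	}
	return out
}

func main() {
	// Build the graph for f(x, y) = x*y + x.
	x, y := variable(3), variable(4)
	f := add(mul(x, y), x)

	// Forward pass: 3*4 + 3 = 15.
	fmt.Println(f.value)

	// Backward pass: seed the output gradient, then propagate it
	// through the add node and then the mul node.
	f.grad = 1
	f.backward()
	f.inputs[0].backward()
	fmt.Println(x.grad, y.grad) // df/dx = y+1 = 5, df/dy = x = 3
}
```

A gradient-descent step would then just be `x.value -= learningRate * x.grad` for each variable, followed by zeroing the gradients before the next forward pass.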