# Addition using Go TF

## Intro

TensorFlow (TF) is a library for high-performance computation, often used for implementing neural nets and other types of ML. Its core is written in C++ and exposed through a C API, and its primary bindings are in Python.

Recently, Go bindings have been created. As of this writing, the internet lacks adequate documentation for the TensorFlow Go bindings. Therefore, seeking to partly remedy this, I present basic examples of their usage.

To begin with, I present this trivial example of adding two int8s.

## Installation

If you have not already installed the TensorFlow C library and the Go bindings, do so now. Instructions may be found in the official TensorFlow installation documentation.

## Example

Import the TF Go library and the op library.

```
import (
	tf "github.com/tensorflow/tensorflow/tensorflow/go"
	"github.com/tensorflow/tensorflow/tensorflow/go/op"
)
```

### Constructing the compute graph

TF requires that nodes live within a scope. Create one.

```
s := op.NewScope()
```

To get data into the compute graph, we need placeholders.
Placeholders are typed according to the type of data they hold.
Note that the type of a placeholder refers to the data type of its elements, not its dimensionality or shape: a `tf.Float` placeholder may contain a single float, an array of floats, an array of arrays of floats, and so on.
In this case, we create two placeholders to hold signed 8-bit integers.
Note that we use the optional `op.PlaceholderShape()` to enforce the shape of the tensors fed to the placeholder; in this case, the shape is scalar.
We create a sub-scope for each placeholder to avoid naming collisions.

```
ph1 := op.Placeholder(s.SubScope("ph1"), tf.Int8, op.PlaceholderShape(tf.ScalarShape()))
ph2 := op.Placeholder(s.SubScope("ph2"), tf.Int8, op.PlaceholderShape(tf.ScalarShape()))
```

Then create the main operation, which will perform the actual compute.
`op.Add` takes the scope it is to live in and two outputs, in this case the two placeholders.

```
sum := op.Add(s, ph1, ph2)
```

Then, finalize the scope to get the graph.

```
graph, err := s.Finalize()
```

To put data into the placeholders, we must convert it to a `tf.Tensor`.
In this case, we wish to sum the integers `1` and `2`.

```
tensor1, err := tf.NewTensor(int8(1))
tensor2, err := tf.NewTensor(int8(2))
```

To run a compute graph, we must provide TF with the tensors that are to fill the placeholders, and a list of the operations whose outputs we wish to fetch.

Create the input: a map from placeholder to tensor:

```
input := map[tf.Output]*tf.Tensor{
	ph1: tensor1,
	ph2: tensor2,
}
```

Create the list of output ops:

```
output := []tf.Output{sum}
```

Create a new session in which to run the graph.

```
sess, err := tf.NewSession(graph, nil)
```

It should be noted that at this point, TF has performed no actual computation. TF compute graphs are lazily evaluated; outputs are computed only when we request them.
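As a rough sketch of this idea in plain Go (no TensorFlow required; the `node` type and helpers below are hypothetical illustrations, not part of the TF API), a lazily evaluated graph can be modeled with closures: building the graph costs nothing, and arithmetic happens only when a node is invoked.

```go
package main

import "fmt"

// node is a deferred computation: nothing runs until the closure is called.
type node func() int8

// constant wraps a value in a node.
func constant(v int8) node { return func() int8 { return v } }

// add builds a node that sums two other nodes on demand.
func add(a, b node) node {
	return func() int8 { return a() + b() }
}

func main() {
	// Building the graph performs no arithmetic...
	sum := add(constant(1), constant(2))
	// ...the addition happens only here, when we ask for the result.
	fmt.Println(sum()) // prints 3
}
```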

Now, however, for the first time in this program's execution, TF performs computation.
We run the session, passing it the map of placeholders to tensors as input, and asking it to return the outputs of the operations listed in `output`.

```
result, err := sess.Run(input, output, nil)
```

If you are using a GPU, at this point the tensors of data would be copied from CPU RAM to GPU RAM, incurring overhead.
If this were a serious compute graph, execution of the Go code would block for significant time while the C code of TensorFlow executed.
However, in this case we are adding two 8-bit integers, so `sess.Run` should return in negligible time.

`result` is a slice of tensors, one for each operation whose output you requested.
Fetch the first element of the slice and extract its value.

```
value := result[0].Value().(int8)
```

`Value()` returns an `interface{}` which, in this case, can be type-asserted to an `int8`.
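
For reference, the snippets above assemble into one complete program. This is a sketch: it assumes the TF C library and Go bindings are installed, and handles errors with `panic` for brevity.

```go
package main

import (
	"fmt"

	tf "github.com/tensorflow/tensorflow/tensorflow/go"
	"github.com/tensorflow/tensorflow/tensorflow/go/op"
)

func main() {
	// Build the graph: two scalar int8 placeholders and their sum.
	s := op.NewScope()
	ph1 := op.Placeholder(s.SubScope("ph1"), tf.Int8, op.PlaceholderShape(tf.ScalarShape()))
	ph2 := op.Placeholder(s.SubScope("ph2"), tf.Int8, op.PlaceholderShape(tf.ScalarShape()))
	sum := op.Add(s, ph1, ph2)

	graph, err := s.Finalize()
	if err != nil {
		panic(err)
	}

	// Wrap the input values in tensors.
	tensor1, err := tf.NewTensor(int8(1))
	if err != nil {
		panic(err)
	}
	tensor2, err := tf.NewTensor(int8(2))
	if err != nil {
		panic(err)
	}

	// Run the graph in a session.
	sess, err := tf.NewSession(graph, nil)
	if err != nil {
		panic(err)
	}
	defer sess.Close()

	result, err := sess.Run(
		map[tf.Output]*tf.Tensor{ph1: tensor1, ph2: tensor2},
		[]tf.Output{sum},
		nil,
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(result[0].Value().(int8)) // prints 3
}
```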

## Conclusion

Using TensorFlow to add two integers is obviously silly. The overhead is many orders of magnitude greater than the cost of the actual addition, to say nothing of the complexity of ~39 lines of code. However, if we wish to add, multiply, or perform some more complex operation on vast multi-dimensional arrays of numbers, and wish to avail ourselves of the awesome parallelism of modern GPUs, the advantages of using TF become more obvious.

In the next post, I show how the Go TF bindings can be used in somewhat more complex ways.