# TensorFlow

## On this page:

1. [TensorFlow](#tensorflow)
   1. [Version 1.x](#tensorflow-1-x)
   2. [Version 2.x](#tensorflow-2-x-eager-execution)
   3. [TensorBoard](#tensorboard-visualization-tool)
2. [Keras](#keras)
3. [TensorFlow Probability](#tensorflow-probability)

## TensorFlow

TensorFlow (TF) is a powerful ML framework for developing and training computationally optimized models that run on a variety of processing units, including CPUs, GPUs, and the more recent TPUs.

### TensorFlow 1.x

{% embed url="https://www.tensorflow.org/versions/r1.15/api_docs" %}

In TF 1.x, the ***computational graph***, which maps out the order of operations and their dependencies, is specified first, before any computation occurs. This type of graph is commonly called a ***static graph*** because it does not change during a ***session***, the phase in which the graph is executed and values are computed.

![Tensorflow graph visualized using Tensorboard](https://3071976149-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MIB5AI4iyB7O4uln8Ob%2F-MN9ho8E6fzRK3WZZlpe%2F-MN9jGxVzVgvNTPKHh78%2Fimage.png?alt=media\&token=7b0c6207-f469-4553-8e37-ff1b78f1ee15)

A TF computational graph is specified using a combination of the following objects and operations:

* ***TF Placeholders*** are objects that receive data at computation time. Ex: the input layer of a neural network.
* ***TF Variables*** are the parameters that are evaluated and updated during a session. Ex: weights and biases.
* ***TF operations,*** which are performed on placeholders and variables, closely follow NumPy operations. TF also includes more ML-specific functions, such as one-hot encoding (*i.e.* turning integer class labels into binary indicator vectors) and nonlinear activation functions.
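As a minimal sketch, the pieces above can be wired into a small graph. This uses `tf.compat.v1`, which exposes the 1.x interface under modern TF installs; the shapes are arbitrary, for illustration:

```python
import tensorflow as tf

# The 1.x-style API lives under tf.compat.v1 in TF 2.x installs.
tf1 = tf.compat.v1
tf1.disable_eager_execution()

# Placeholder: filled with data at computation time (e.g. the input layer).
x = tf1.placeholder(tf.float32, shape=(None, 3), name="x")

# Variables: parameters evaluated (and updated) during a session.
w = tf1.get_variable("w", shape=(3, 1), initializer=tf1.zeros_initializer())
b = tf1.get_variable("b", shape=(1,), initializer=tf1.zeros_initializer())

# Operations: NumPy-like ops composed into the graph (nothing runs yet).
y = tf.matmul(x, w) + b
```

Note that defining `y` performs no computation; it only adds nodes to the static graph.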

In addition to the graph, we also need to specify the training method and any functions to execute during training:

* The ***optimizer*** iteratively updates the TF variables to minimize the specified loss function.
* ***Callback functions*** are executed during training at set intervals (e.g. every few iterations or epochs). Popular callbacks include early stopping (*i.e.* halting training once a monitored metric stops improving), logging to [TensorBoard](#tensorboard-visualization-tool), etc.

Finally, to perform the computation, we start a TF session and feed the placeholders with our dataset.
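Putting the pieces together, a minimal TF 1.x-style training loop (again via `tf.compat.v1`; the toy dataset and learning rate are illustrative) looks like:

```python
import numpy as np
import tensorflow as tf

tf1 = tf.compat.v1  # 1.x-style API, available under TF 2.x
tf1.disable_eager_execution()

# Graph: fit y = w * x to data generated with w = 2.
x = tf1.placeholder(tf.float32, shape=(None,), name="x")
y_true = tf1.placeholder(tf.float32, shape=(None,), name="y_true")
w = tf1.Variable(0.0, name="w")
loss = tf1.reduce_mean(tf.square(w * x - y_true))

# Optimizer: updates w each step to minimize the loss.
train_op = tf1.train.GradientDescentOptimizer(learning_rate=0.1).minimize(loss)

# Session: execute the graph, feeding the placeholders with the dataset.
with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    feed = {x: np.array([1.0, 2.0, 3.0], dtype=np.float32),
            y_true: np.array([2.0, 4.0, 6.0], dtype=np.float32)}
    for _ in range(100):
        sess.run(train_op, feed_dict=feed)
    w_fit = sess.run(w)

print(w_fit)  # converges toward 2.0
```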

### TensorFlow 2.x (Eager Execution)

{% embed url="https://www.tensorflow.org/api_docs" %}

With TF 2.x, there are no longer TF "sessions": operations are executed eagerly, meaning they run immediately and return concrete values instead of building a graph to run later. Graph execution is still available through the `tf.function` decorator, which traces a Python function into a callable graph.
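A short sketch of the difference:

```python
import tensorflow as tf

# Eager execution: no graph or session setup; results are available immediately.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.matmul(a, a)
print(b.numpy())  # [[ 7. 10.] [15. 22.]]

# tf.function traces the Python function into a reusable TF graph.
@tf.function
def square(x):
    return x * x

print(square(tf.constant(3.0)).numpy())  # 9.0
```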

### TensorBoard (Visualization Tool)

{% embed url="https://www.tensorflow.org/tensorboard/get_started" %}

TensorBoard is TensorFlow's visualization tool for viewing computational graphs, tracking variables and performance metrics, and more. It is installed along with TensorFlow. To launch the dashboard, run the following in a terminal, pointing `--logdir` at the directory containing your logs:

```
tensorboard --logdir=summaries
```

Open the link shown in the terminal (default: localhost:6006) in a web browser.
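To give TensorBoard something to display, log summaries during training. A minimal TF 2.x sketch, writing to the same `summaries` directory as the command above (the loss values here are a stand-in for a real training curve):

```python
import tensorflow as tf

# Log one scalar per step; TensorBoard plots everything written under --logdir.
writer = tf.summary.create_file_writer("summaries")
with writer.as_default():
    for step in range(100):
        fake_loss = 1.0 / (step + 1)  # stand-in for a real training loss
        tf.summary.scalar("loss", fake_loss, step=step)
writer.flush()
```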

<div align="center"><img src="https://3071976149-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-MIB5AI4iyB7O4uln8Ob%2F-MN9nlvItmEgyWOvbM6s%2F-MN9qm3tonz-gUgkujph%2FScreen%20Shot%202020-11-27%20at%2012.57.06%20PM.png?alt=media&#x26;token=2ba18859-3bb8-4375-bafb-136a2355ecfd" alt="TensorBoard dashboard"></div>

## Keras

{% embed url="https://keras.io" %}

Keras is a popular high-level API for building deep neural networks from prebuilt layers (*i.e.* dense layers, convolutional layers, batch normalization layers, etc.). Keras is great for quickly assembling standard models, but it is less flexible than lower-level TensorFlow/PyTorch when custom operations are needed.
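For instance, a small classifier can be assembled from prebuilt layers in a few lines (the layer sizes here are arbitrary, for illustration):

```python
import numpy as np
from tensorflow import keras

# Stack prebuilt layers into a model: dense -> batch norm -> dense.
model = keras.Sequential([
    keras.layers.Dense(64, activation="relu"),
    keras.layers.BatchNormalization(),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# A forward pass on dummy data builds the model and returns class probabilities.
probs = model(np.zeros((4, 20), dtype="float32"))
print(probs.shape)  # (4, 10)
```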

Another benefit of Keras is the abundance of [pre-trained model architectures](https://keras.io/api/applications/) that are available for transfer learning.
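A hedged sketch of how such an architecture might be reused (here `weights=None` skips the download; in practice you would pass `weights="imagenet"` and fine-tune on your own data):

```python
from tensorflow import keras

# Load a prebuilt architecture without its classification head.
base = keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None)
base.trainable = False  # freeze the (pre-trained) base for transfer learning

# Attach a new head for a hypothetical 5-class task.
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(5, activation="softmax"),
])
```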

## TensorFlow Probability

{% embed url="https://www.tensorflow.org/probability" %}

TensorFlow Probability is a library for probabilistic deep learning models and includes useful tools for variational inference, Bayesian neural networks, MCMC sampling, etc.
