Hands-On Neural Networks

TensorBoard

TensorFlow provides a handy tool, TensorBoard, to visualize a variety of important aspects of our network. To use it, Keras needs to write log files that TensorBoard will read.

A way to do this is to use callbacks. A callback is a set of functions applied at specified stages of the model's training. These functions can be used to get a view of the model's internal states and statistics while it is training. It is possible to pass a list of callbacks to the .fit() method of a Keras model; the relevant methods of the callbacks will then be called at each stage of the training.

Here is an example of a TensorBoard callback:

import keras

tb_callback = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
                                          write_graph=True, write_images=True)
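
The callback can then be passed to .fit(). The following is a minimal sketch, assuming the callback above is stored in tb_callback and that a compiled Keras model called model and training arrays x_train and y_train already exist (these names are only illustrative):

# Pass the TensorBoard callback to .fit(); Keras writes the logs to ./Graph
# while the model trains. model, x_train, and y_train are assumed to exist.
model.fit(x_train, y_train,
          epochs=10,
          batch_size=32,
          validation_split=0.2,
          callbacks=[tb_callback])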

It is then possible to launch the TensorBoard interface to visualize not only the graph, but also the metrics, the loss, or even the word embeddings.
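
For example, logging histograms (and, for models that contain an Embedding layer, embeddings) is controlled by dedicated arguments of the callback. The values below are only an illustrative sketch, not a prescription:

# Hedged sketch: histogram_freq=1 writes weight and activation histograms
# every epoch (this requires validation data to be passed to .fit()).
# For models with an Embedding layer, the callback also accepts an
# embeddings_freq argument to export embeddings for TensorBoard's projector.
tb_callback = keras.callbacks.TensorBoard(log_dir='./Graph',
                                          histogram_freq=1,
                                          write_graph=True,
                                          write_images=True)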

To launch TensorBoard from a terminal window, simply type in the following:

tensorboard --logdir=path/to/log-directory

This command will start a server that can be accessed at http://localhost:6006. With TensorBoard, it is easy to compare the performance of different network architectures or parameters:

This is a screenshot of a running TensorBoard instance
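
To compare several runs, a common pattern is to give each run its own subdirectory under the log directory; TensorBoard then shows them as separate, selectable runs. The following is a minimal sketch in which make_tensorboard_callback and run_name are hypothetical helpers, not part of the code above:

import os
import keras

def make_tensorboard_callback(run_name):
    # Hypothetical helper: each run logs under ./Graph/<run_name>,
    # for example ./Graph/two_layers or ./Graph/lr_0.01.
    log_dir = os.path.join('./Graph', run_name)
    return keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=0,
                                       write_graph=True, write_images=True)

Launching tensorboard --logdir=./Graph then displays all of these runs side by side.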