Posted by Gal Oshri, Product Manager
TensorBoard, TensorFlow’s visualization toolkit, is often used by researchers and engineers to visualize and understand their ML experiments. It enables tracking experiment metrics, visualizing models, profiling ML programs, visualizing hyperparameter tuning experiments, and much more.
While TensorBoard makes it easy to visualize your own experiments, machine learning often involves collaboration. You might want to share your research about the effect of a hyperparameter, explain a complicated training procedure, or get help troubleshooting strange model behavior.
We have seen people sharing screenshots of their TensorBoards to achieve this. However, screenshots aren’t interactive and fail to capture all the details. At Google, researchers and engineers often communicate their insights about model behavior by sending their TensorBoard visualizations to teammates. Our goal is to provide this capability to the broader community.
That is why we launched TensorBoard.dev: a managed service (currently in preview) that enables you to easily host, track, and share your ML experiments for free. Simply upload your TensorBoard logs and receive a link that can be viewed by everyone, with no installation or setup.
If a picture is worth a thousand words, we believe an interactive TensorBoard can be even more valuable.
We are excited to see how the community engages with TensorBoard.dev. Here are a few examples and ideas:
- Research: The paper “Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer” investigates transfer learning for NLP using a text-to-text model and achieves state-of-the-art results on several tasks. This TensorBoard.dev example shows the training results of the baseline for the “pretraining dataset” exploration, corresponding to the first row of table 8 in the paper. The process of pretraining for ~520k steps followed by task-specific training is shown through the loss curve in TensorBoard.
- Example models: TensorBoard.dev can provide a point of reference for people who want to understand the training process used for example models, or make sure they are reproducing them correctly.
- Troubleshooting: Suppose you encounter unexpected behavior during training. Sharing a link to the TensorBoard (instead of a screenshot) could help convey this quickly and aid troubleshooting.
- Tutorials: The TensorFlow.org tutorials on overfitting and underfitting and Pix2Pix now use TensorBoard.dev to help illustrate experiment results.
Getting Started
The first step is to identify the TensorBoard logs you want to share (you can download a sample from here). Note that the TensorBoard you upload will be publicly visible, so do not upload sensitive data.
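If you do not yet have logs of your own, one common way to produce a log directory is Keras’s TensorBoard callback. Below is a minimal sketch, assuming TensorFlow 2.x; the model, data, and the logs/fit path are illustrative placeholders rather than part of the sample logs above:

# Minimal sketch: producing a TensorBoard log directory with Keras.
# The model, data, and log_dir below are placeholders for illustration.
import numpy as np
import tensorflow as tf

x = np.random.rand(1000, 32).astype("float32")
y = np.random.randint(2, size=(1000, 1))

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(32,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# The TensorBoard callback writes loss and metric summaries to log_dir,
# which is the directory you would later pass to --logdir when uploading.
tensorboard_cb = tf.keras.callbacks.TensorBoard(log_dir="logs/fit")
model.fit(x, y, epochs=5, validation_split=0.2, callbacks=[tensorboard_cb])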
Make sure you have the latest TensorBoard installed:
pip install -U tensorboard
Then, simply use the upload command:
tensorboard dev upload --logdir {logs}
After following the instructions to authenticate with your Google Account, a TensorBoard.dev link will be provided. You can view the TensorBoard immediately, even during the upload. The uploader will continue running and uploading new logs that appear in the log directory until you stop the process.
The TensorBoard.dev link can be opened by everybody, so feel free to use it to share your research, ask for advice in a GitHub issue or Stack Overflow question, or simply track your experiments without opening TensorBoard locally. A Google Account is needed to upload logs, but not to view the TensorBoard.
Several other commands are available for listing, deleting, or exporting your experiments. You can learn more by using the tensorboard dev --help command. There is currently a limit of 10M data points per user. If you reach this limit (you will get an error during the upload), please reach out to us! For a quick fix, you can delete some of your existing experiments.
You can find an end-to-end tutorial that runs in Colab here. While the tutorial shows how to use TensorBoard logs created with Keras’s .fit(), you can also use logs created with the GradientTape-based training loop (as shown in TensorBoard’s Scalars tutorial) or any other valid TensorBoard logs.
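For reference, a bare-bones GradientTape-style training loop that writes scalar summaries might look like the sketch below (the model, data, and logs/custom_loop directory are illustrative placeholders, not taken from the tutorial):

# Minimal sketch: logging scalars from a custom GradientTape training loop.
# Everything here (model, data, paths) is a placeholder for illustration.
import numpy as np
import tensorflow as tf

x = tf.constant(np.random.rand(256, 8), dtype=tf.float32)
y = tf.constant(np.random.rand(256, 1), dtype=tf.float32)

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam()
loss_fn = tf.keras.losses.MeanSquaredError()
writer = tf.summary.create_file_writer("logs/custom_loop")

for step in range(100):
    with tf.GradientTape() as tape:
        loss = loss_fn(y, model(x))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    with writer.as_default():
        # Scalars written this way appear in TensorBoard’s Scalars dashboard.
        tf.summary.scalar("loss", loss, step=step)

Either kind of log directory can then be uploaded with the same tensorboard dev upload command shown above.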
What’s next for TensorBoard.dev?
TensorBoard.dev is in preview and currently only includes TensorBoard’s Scalars dashboard. We are adding more of TensorBoard’s capabilities and expanding the sharing functionality. We are also exploring some ideas on how to make it easier to discover interesting TensorBoards that have been published.
If you have any feedback or ideas on how to make TensorBoard.dev more useful for you, we’d love to hear about it at tensorboard.dev-support@google.com.