Part 2: Tips and tricks for running your deep learning executions on Valohai CLI
Valohai executions can be triggered directly from the CLI, which lets you roll up your sleeves and fine-tune your options more hands-on than in our web-based UI. In part one, I showed you how to install and get started with Valohai’s command-line interface (CLI). Now, it’s time to take a deeper dive and power up with features that’ll take your daily productivity to new heights.
Prerequisites & Installation
See part one of the series for pre-reqs and an easy step-by-step installation guide.
Ad Hoc or Not?
First, let’s jump right in and look at one of the most useful flags, --adhoc (or -a for short):
vh exec run -a mystep
Using this flag will take the contents of your currently active local folder, compress it down to a single tarball, and send it to the server for execution. (Note: any file or folder starting with a dot is considered hidden and won’t be bundled in.)
By default, executions are based on the latest (or a specific) commit in your version control system, such as git. With --adhoc, they are based on your local files instead.
While an ad-hoc execution is not based on any source repository, it’s still version-controlled by Valohai (we save the tarball for you). That said, using a DVCS like git for your code is still highly recommended. Ad-hoc executions are best suited for quick exploration and debugging.
Parameters for deep learning executions
You can feed parameters to your execution via the CLI, too. For example, if you have defined two parameters, learning_rate and dropout, in your valohai.yaml, you can set their values like this:
vh exec run -a mystep --learning_rate 0.001 --dropout 0.1
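For reference, a step definition with those two parameters might be sketched out in valohai.yaml roughly like this (the image and command lines are placeholder values, not from this article):

```yaml
- step:
    name: mystep
    # Placeholder Docker image and command for illustration only:
    image: python:3.9
    command: python train.py {parameters}
    parameters:
      - name: learning_rate
        type: float
        default: 0.001
      - name: dropout
        type: float
        default: 0.1
```

The {parameters} token in the command is expanded by Valohai into the parameter flags at runtime.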
If you want to list all available parameters for a specific step, you can see them using the --help flag. For example:
vh exec run mystep --help
Inputs for your DL models
Inputs are predefined “slots” for files that you want Valohai to fetch for you before the execution. Using Valohai’s CLI in combination with inputs is a good strategy when dealing with big data, as you don’t need to fetch huge files into your local machine.
vh exec run -a mystep --input1=http://t.com/file.gz
If you want to download multiple files into one slot, you can repeat the name like this:
vh exec run -a mystep --input1=http://t.com/1 --input1=http://t.com/2
You can also see which inputs are available for a specific step, just as you can for parameters, with the --help flag:
vh exec run mystep --help
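The input slot itself is declared in your valohai.yaml. As a sketch, a step with the input1 slot used above might look something like this (image and command are placeholder values):

```yaml
- step:
    name: mystep
    # Placeholder Docker image and command for illustration only:
    image: python:3.9
    command: python train.py
    inputs:
      - name: input1
        # Optional default URL; the CLI flag overrides it per execution.
        default: http://t.com/file.gz
```

Files fetched for a slot appear under /valohai/inputs/<slot-name>/ inside the execution.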
I Spy with CLI: Executions
Another handy flag for running an execution is --watch (or -w for short):
vh exec run mystep -w
When you use --watch, the CLI gives you a neat terminal view with the latest log output and the status of the execution. This saves you a few clicks, as you don’t need to open the web UI for the same information.
If your execution is already running or finished, you can also stream the logs of the latest execution with:
vh exec logs --stream latest
If you want the logs of a specific execution, say #13, you can write:
vh exec logs 13
If you just want to see the stdout logs, and none of the stderr or Valohai status logs, type:
vh exec logs latest --no-status --no-stderr
Now, let’s say you’d prefer a third-party tool for watching execution progress, like TensorBoard. For that, there is a similar flag, --sync, which downloads the output files of your execution (say, TensorBoard checkpoint files) to your local machine live, as they are produced. (For more info on this, see our TensorBoard tutorial here.)
After your execution is finished, you might want to download the outputs to your local machine. It’s your lucky day; the Valohai CLI makes this task a breeze. For example:
vh exec outputs 13
This will list all the outputs of execution #13. You can also get the listing as JSON:
vh --table-format json exec outputs 13
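The JSON listing is handy for scripting. As a sketch, you could pipe it into a parser to pull out individual fields; note that the exact shape of the listing (e.g. a "name" key per output entry) is an assumption here, so inspect the real output first:

```shell
# The real pipeline would be (field name "name" is assumed, not verified):
#   vh --table-format json exec outputs 13 | python3 -c '...'
# The same parsing, demonstrated on sample JSON of that assumed shape:
printf '[{"name":"model.h5"},{"name":"train.log"}]' |
  python3 -c 'import json,sys; print("\n".join(o["name"] for o in json.load(sys.stdin)))'
```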
If you want to download the outputs, you can call:
vh exec outputs 13 -d ./myfolder
You can also filter the download with a wildcard, or select just a specific file, with the -f flag. Quote the pattern so your shell doesn’t expand it locally:
vh exec outputs latest -f "*.txt"
If you want to do something with the output files right away, you can also chain multiple commands like this:
vh exec outputs 13 -d ./outputs && tensorboard --logdir ./outputs
This concludes your quick introduction to using the Valohai CLI.
Want to learn more? Our blog and documentation have all the answers. Create your free account and dive right in – we’re here to help if you need anything. Feel free to reach out anytime – shoot us a message through the app’s chat window, on Slack, or at firstname.lastname@example.org.