Running a local notebook is great for early data exploration and model tinkering, there’s no doubt about it. But eventually you’ll outgrow it and want to scale up: training the model in the cloud with easy parallel executions, full version control, and robust deployment, so you can reproduce your experiments and share them with team members at any time.
Transitioning your local code to the cloud can often feel complicated, with many steps in an unfamiliar environment. Here at Valohai, we want to help you take your notebook project from the early prototyping phase to production as painlessly as possible, so we’re excited to share that we’ve been developing a Valohai plug-in for Jupyter Notebook.
Running your local notebook on the Valohai deep learning platform is just a few clicks away. The plug-in takes care of configuration, spinning up the cloud instance, streaming the notebook output back in real time, and finally shutting down the cloud instance automatically. You’ll still have the power-user tools, the Valohai CLI and API, at your disposal, but the plug-in offers an easier option with no learning curve.
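For context, executions on Valohai are defined in a `valohai.yaml` configuration file, which is the kind of setup the plug-in handles for you behind the scenes. A minimal sketch of what such a step definition looks like (the step name, Docker image, and script name here are illustrative placeholders, not output of the plug-in):

```yaml
- step:
    name: train-model                      # hypothetical step name
    image: tensorflow/tensorflow:2.6.0     # any Docker image works
    command: python train.py               # script to run on the cloud instance
```

With the plug-in, you don’t need to write this by hand; it’s generated from your notebook when you launch a run.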
Unlike many other notebook cloud hosting services, you’re not only able to choose any cloud provider (like AWS, Azure or GCP) on demand, but more importantly your instances are always shut down immediately after the code has executed. There’s no longer any need to budget for persistent instances; you only pay for the time your code actually runs! And as an added benefit, every training run you do is automatically version controlled – you can share your experiments with others and reproduce them with the same code, data, hyperparameters and more.
Want to hear more or give the plug-in a try as a beta tester? Don’t hesitate to get in touch. It’s in closed beta at the moment but ready to use for our most eager users!