How Viator reduced ML pipeline development time from 2 weeks to half a day
Viator is a Tripadvisor company and the leading travel experiences marketplace that makes it easy to plan tours, activities, and excursions around the world.
Viator’s team had experimented with various machine learning platforms, including one built on open-source tools. Recently, they set out to refine their approach — not just to improve collaboration and speed up deployment, but to transform their entire ML workflow.
With Valohai, they achieved even more than expected:
Key results TL;DR:
- ~0.5 days to develop ML pipelines in Valohai vs. 2 weeks in the legacy platform
- Under 2 months to migrate all the key pipelines and projects
- Faster iteration speed and cross-team collaboration
- Easier debugging and model version management
Quick migration to Valohai with tailored onboarding and support

It’s scary - in a good way - how quickly we’ve been able to move things over from the legacy environment to Valohai.
Jacob Barnett – Senior Machine Learning Manager, Viator

While it could take up to 2 weeks to develop a pipeline with open-source solutions, Viator’s ML team has reduced these cycles with Valohai to just half a day.
To speed up the migration even further, the team used Valohai’s utilities and templates to build their own “cookie-cutter” project templates.
With a hand from Valohai’s Customer Engineering team, Viator migrated all of its key projects to Valohai in under 2 months. Today, all production pipelines and new experiments run on Valohai.
Notably, every Valohai customer is assigned a dedicated team of Customer Engineers who proactively help with any issues that come up.

The support we’ve received has been amazing. If one of us leaves a question, it usually gets answered within a few minutes if it’s during office hours. And whoever answers it is extremely knowledgeable about not just the product but MLOps in general.
Jacob Barnett – Senior Machine Learning Manager, Viator

Taking experimentation and collaboration to new heights
After migrating to Valohai, team members became more enthusiastic about working collaboratively. This helped them take on larger projects and make a broader impact across various functions at Viator.
One of the ML team’s use cases with the marketing department centers on showing relevant products to visitors based on the keywords searched before landing on the Viator website. This is made possible by a sophisticated convolutional neural network trained on 2-3 years’ worth of data.
With every version of the model taking at least 8 hours to train, experimenting in local environments was no longer an option. The team’s previous ML platforms also made it challenging to run experiments in parallel and compare results.

With Valohai, we don’t even think twice about running five versions of the same model.
Jacob Barnett – Senior Machine Learning Manager, Viator

The Valohai MLOps platform has been designed and built to meet the needs of fast-moving, forward-thinking ML teams that aim to maximize their impact on business growth. This is strongly reflected in Viator’s experience with the platform:
- With Valohai’s reproducibility-first design, there’s virtually no additional work required to create new model versions with different parameters and algorithm approaches.
- Valohai’s Smart Orchestration takes on the burden of infrastructure management, allowing Viator’s ML team to run multiple experiments simultaneously and utilize resources more efficiently.
- Valohai’s Knowledge Repository makes it easy to compare models based on the inputs, parameters, code, and outputs – all of which are automatically tracked and stored.
As a result, Viator’s ML team can run more experiments in parallel, iterate faster, improve model accuracy, and extend its impact on business growth.

Things are more transparent because people want to run code in Valohai rather than local machines. Reproducibility and keeping track of experiments are vital for us.
Jacob Barnett – Senior Machine Learning Manager, Viator

As of today, every team member spends more time working in Valohai than in their local environments. And because Valohai automatically version-controls all development events and assets, the whole organization benefits from lineage and reproducibility.
Debugging with Valohai is much easier as everyone on the team can access logs, view generated artifacts, and re-run failed jobs with local debuggers attached, hitting breakpoints and debugging line by line.
Looking ahead
These are just a few of the many ways Viator’s ML team has amplified its impact — without being held back by challenges related to collaboration, infrastructure, traceability, or reproducibility. We’re inspired to support teams like theirs, and we’re just as eager to learn about yours.