The challenge

Demand forecasting had been a persistent challenge for one of our major FMCG clients. Despite using machine learning techniques and advanced infrastructure, the client was struggling to improve their MAPE (mean absolute percentage error) on longer-horizon forecasts.
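For readers less familiar with the metric, here is a minimal sketch of how MAPE is computed; the function below is illustrative rather than the client's actual code.

```python
import numpy as np

def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent, skipping zero-demand periods."""
    actuals = np.asarray(actuals, dtype=float)
    forecasts = np.asarray(forecasts, dtype=float)
    mask = actuals != 0  # avoid division by zero when demand was zero
    return np.mean(np.abs((actuals[mask] - forecasts[mask]) / actuals[mask])) * 100

# e.g. mape([100, 120, 80], [90, 130, 100]) is roughly 14.4
```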

Step 01.

Discovery

Before building any models, we took a step back to understand the client's supply chain operations and the forecast lead times under which they can operate efficiently. Next, we studied their previous models inside and out, to make sure that all the learnings from their earlier modelling work were captured accurately. Finally, based on their existing forecasts, we built the as-is benchmark that we needed to beat for each category.
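A rough sketch of how such an as-is benchmark can be tabulated from historical forecasts is shown below; the file name and column layout are assumptions for illustration, not the client's actual data model.

```python
import pandas as pd

# One row per SKU-week with the client's own forecast and the realised demand
# (file name and column names are assumed for illustration).
history = pd.read_csv("historical_forecasts.csv")  # columns: category, week, actual, forecast

history = history[history["actual"] > 0]  # skip zero-demand weeks so MAPE stays well defined
history["ape"] = (history["actual"] - history["forecast"]).abs() / history["actual"]

benchmark = history.groupby("category")["ape"].mean().mul(100).rename("as_is_mape")
print(benchmark)  # the per-category MAPE each new model has to beat
```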

Step 02.

Proof of Concept

A lot of work had already been done in this space, so we collated all the features that had been used. However, we also held discussions with the Demand Planning team and the other supply chain teams who use the forecasts. This led us to believe that some important information was missing from the features used to build the models, such as raw material order lead times, customer dispatch preferences, and customer promotional calendars.

Our feature investigation helped us understand the gaps in the information being supplied to the demand model. We worked with our client to source this new information and built several more features. We then collated these new features together with the existing ones into a single database.
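As an illustration of this step, a feature such as raw material order lead time can be derived from order records and joined onto the existing feature table. The table and column names below are hypothetical.

```python
import pandas as pd

# Hypothetical source tables supplied by the client
orders = pd.read_csv("raw_material_orders.csv", parse_dates=["order_date", "delivery_date"])
existing_features = pd.read_csv("existing_features.csv", parse_dates=["week"])

# New feature: average raw material order lead time per category and week
orders["lead_time_days"] = (orders["delivery_date"] - orders["order_date"]).dt.days
lead_time = (
    orders.groupby(["category", pd.Grouper(key="order_date", freq="W")])["lead_time_days"]
    .mean()
    .reset_index()
    .rename(columns={"order_date": "week", "lead_time_days": "avg_lead_time_days"})
)

# Collate the new feature with the existing ones into a single table
features = existing_features.merge(lead_time, on=["category", "week"], how="left")
features.to_parquet("feature_store.parquet")  # single store of model-ready features
```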

Before using all the features, we started with only the features that were already in use and tried to build better models to improve the MAPE against the benchmark. We applied an array of techniques, including Lasso and Elastic Net regression, XGBoost, Beta regression, deep neural networks, and recurrent neural networks, and chose the best model for each category based on cross-validation. This gave us a mere 3% improvement over the existing MAPE.
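A simplified sketch of this model bake-off follows, using scikit-learn and XGBoost estimators with cross-validated MAPE to pick a winner. The candidate list and scoring setup are illustrative, and the deep and recurrent network candidates are omitted here for brevity.

```python
from sklearn.linear_model import Lasso, ElasticNet
from sklearn.model_selection import cross_val_score, TimeSeriesSplit
from sklearn.metrics import make_scorer, mean_absolute_percentage_error
from xgboost import XGBRegressor

candidates = {
    "lasso": Lasso(alpha=0.1),
    "elastic_net": ElasticNet(alpha=0.1, l1_ratio=0.5),
    "xgboost": XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05),
}

# Lower MAPE is better, so use a negated scorer and un-negate the mean afterwards
scorer = make_scorer(mean_absolute_percentage_error, greater_is_better=False)
cv = TimeSeriesSplit(n_splits=5)  # respect the time ordering of demand data

def pick_best_model(X, y):
    """Return the candidate name with the lowest cross-validated MAPE, plus all scores."""
    scores = {
        name: -cross_val_score(model, X, y, cv=cv, scoring=scorer).mean()
        for name, model in candidates.items()
    }
    return min(scores, key=scores.get), scores
```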

Next, instead of using only the previously used features, we added all the new features to the list. This caused our feature dimensionality to grow massively, so rather than pushing everything into the modelling framework, we applied a sequential feature selection technique to retain the most significant features, keeping the total number of observations at least four times the number of dimensions. This selection was run separately for each sub-category, so that the algorithm could pick different features for different categories.
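One way to sketch that selection step is with scikit-learn's SequentialFeatureSelector, capping the number of selected features so that observations stay at least four times the dimensionality. The estimator choice and the model-ready `features` DataFrame (with `demand` and `sub_category` columns) are assumptions carried over from the sketches above.

```python
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import Lasso

def select_features(X, y, max_ratio=4):
    """Keep at most n_obs / max_ratio features, chosen by forward selection."""
    n_keep = max(1, len(X) // max_ratio)
    n_keep = min(n_keep, X.shape[1] - 1)  # the selector needs fewer features than are available
    selector = SequentialFeatureSelector(
        Lasso(alpha=0.1),
        n_features_to_select=n_keep,
        direction="forward",
        cv=3,
    )
    selector.fit(X, y)
    return X.columns[selector.get_support()]

# Run independently per sub-category so each can pick its own feature set
selected = {
    name: select_features(group.drop(columns=["demand", "sub_category"]), group["demand"])
    for name, group in features.groupby("sub_category")
}
```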

Then, using the new feature set for each category, we ran several machine learning techniques and built a code framework that automatically picked the best model per category based on MAPE. After this exercise we were able to reduce the MAPE by 12 percentage points on average: for the same groups of products over the same hold-out period, the MAPE fell from 30% to 18%.
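Putting the pieces together, the automated framework can be sketched as a loop over categories that runs feature selection, compares candidates, and keeps the winner. This is a simplified illustration reusing the hypothetical helpers sketched above, not the production framework itself.

```python
def fit_category(df):
    """Select features, compare candidate models, and return the refit winner."""
    y = df["demand"]
    X = df.drop(columns=["demand", "category", "sub_category"], errors="ignore")
    cols = select_features(X, y)                         # sequential feature selection
    best_name, cv_scores = pick_best_model(X[cols], y)   # lowest cross-validated MAPE
    model = candidates[best_name].fit(X[cols], y)
    return {"model": model, "features": list(cols), "cv_mape": cv_scores[best_name]}

# One automatically selected model per category
fitted = {name: fit_category(group) for name, group in features.groupby("category")}
```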

Step 03.

Scale

Our results were encouraging to the business, so the proof of concept was signed off. We then worked to generalize the solution so that it could be deployed on any standard data platform within our client's infrastructure. The proof of concept had been run on one chosen region and category; using the generalized, automated solution we deployed the models across all categories for that region and revisited the accuracies. The results of the scaled-up deployment were consistent with the proof of concept, and at this stage we had a fully automated solution ready to be scaled to other regions as well.

Step 04.

Empower

Once we had proved the value of our generalized, automated modelling solution, it was time to share the deployment guides with our client's internal team so they could take over further deployments. We created CI/CD pipelines and repositories for easy deployment and delivered all the required documentation. Finally, we provided detailed training to the internal team on running the pipelines.

Step 05.

Support

For this particular solution we only needed to provide one day of support per month in case any issues arose, and after six months the entire solution was fully handed over to our client.

Conclusion

Starting from a simple approach on a restricted scope, we developed a complete end-to-end solution that is now deployable across all of the client's regions and categories. At the heart of our approach is our endeavour to empower clients with scalable solutions they can run themselves through their own teams, rather than paying ongoing costs to a vendor to support a solution that the vendor built. We demonstrated this value through our step-by-step approach to solution building in this project.