
Intelligent price offering

A happy customer base is one of the ultimate goals of any business, and one of the biggest local e-taxi startups in Middle East Asia is no exception. To achieve this, riders need to be offered the best possible price by an intelligent pricing service, while the drivers and the platform itself can still cope with the offer. This would be a simple job if you were willing to recruit dozens of people to sit down and quote prices as requests come in, but that is no longer the case once the system grows and the customer base becomes large.


So you need an intelligent price offering system in place, built on AI techniques, to help you. The usual approach is to start from historical data and evaluate which features could help the model. This historical data consists of dates, geographical zones of origin and destination, weather conditions, whether a carnival or other traffic-causing event is running anywhere that affects the trip (the geo-analytics system already in place assigns a ratio factor to each trip for the chosen path), time of day, type of service, customer demographics, and many more detailed data points, from which we had to select the best set of features and build a model.
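
To make that more concrete, here is a minimal sketch of how such a historical-trip feature set could be assembled with pandas. The column names (pickup_zone, event_ratio, accepted_price, and so on) are illustrative assumptions, not the actual schema used in the project.

```python
# Minimal sketch of preparing candidate features from historical trip data.
# Column names are hypothetical; the real dataset had ~100 features.
import pandas as pd

trips = pd.read_csv("historical_trips.csv", parse_dates=["requested_at"])

# Derive simple temporal features from the request timestamp.
trips["hour_of_day"] = trips["requested_at"].dt.hour
trips["day_of_week"] = trips["requested_at"].dt.dayofweek

# Candidate features: zones, weather, event/traffic ratio, service type, demographics.
feature_columns = [
    "pickup_zone", "dropoff_zone", "weather_condition",
    "event_ratio", "hour_of_day", "day_of_week",
    "service_type", "customer_age_band",
]

# One-hot encode the categorical columns so a neural network can consume them.
X = pd.get_dummies(trips[feature_columns], columns=[
    "pickup_zone", "dropoff_zone", "weather_condition",
    "service_type", "customer_age_band",
])
y = trips["accepted_price"]
```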


This project was really challenging and complex to handle. We had different scenarios, but since the volume of data was huge, we could tackle deep learning comfortably. In normal situations we always start with simple solutions (as simple as linear or logistic regression), but here, after scrutinizing the data, and because feature engineering would be complex given the relatively large number of features (almost 100), we pulled out our deep learning toolset and created our model using TensorFlow and Keras. We trained the system on Azure on a single GPU and tuned it later to get a better model for our intelligent price offering system.
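
The sketch below shows the general shape of such a Keras model: a feed-forward regression network over roughly 100 input features that outputs a price. The layer sizes and dropout rate are assumptions for illustration, not the tuned architecture from the project.

```python
# Hypothetical feed-forward price model in Keras (TensorFlow backend).
import tensorflow as tf
from tensorflow import keras

def build_price_model(n_features: int = 100) -> keras.Model:
    model = keras.Sequential([
        keras.layers.Input(shape=(n_features,)),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dropout(0.2),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(1),  # predicted price offer
    ])
    # RMSProp was the initial choice of optimizer (see below).
    model.compile(optimizer="rmsprop", loss="mse", metrics=["mae"])
    return model

model = build_price_model()
# model.fit(X_train, y_train, validation_split=0.1, epochs=50, batch_size=1024)
```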


I don’t want to get too technical here, but one very interesting outcome was the process of finding the best optimizer. We initially used the RMSProp optimisation method for our deep neural network and were sure it would work perfectly, but later, in the tuning phase, one of our researchers suggested that using Adam in our deep layers might work better because of the way it works and the momentum it carries. We did that and saw a 10% surge in the overall accuracy of the system.
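
In code, that swap is a one-line change at compile time, as sketched below; the learning rate shown is an illustrative default, and the comparison against the RMSProp baseline is the part that actually justified the switch.

```python
# Recompile the same network with Adam (which tracks momentum-like
# first- and second-moment estimates) instead of RMSProp.
from tensorflow import keras

model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),
    loss="mse",
    metrics=["mae"],
)
# Re-run the same training schedule and compare validation error
# against the RMSProp baseline before adopting the new optimizer.
```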


OK, this was one big step: we could now predict what prices to offer people under different conditions with an accuracy of around 85%, but we could enhance it even more.


What we came up with afterwards was a recommendation system, built with PyTorch using a collaborative filtering technique, based on similarities between conditions. With this system we could estimate user acceptance of prices for a specific set of conditions, including the origin and destination zones, and ultimately adjust the results of the first system so that it fit the current situation more closely. Adding this second AI recommendation system increased the overall accuracy of the intelligent price offering system to almost 90%, which is a very good result for such a service.
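
A minimal PyTorch sketch of the collaborative-filtering idea is shown below: learn embeddings for customers and for "condition buckets" (an origin/destination/time combination, for example) and predict how likely a given customer is to accept a price under those conditions. The entity names, sizes, and the dot-product formulation are assumptions for illustration.

```python
# Hypothetical collaborative-filtering model for price acceptance in PyTorch.
import torch
import torch.nn as nn

class PriceAcceptanceCF(nn.Module):
    def __init__(self, n_customers: int, n_condition_buckets: int, dim: int = 32):
        super().__init__()
        self.customer_emb = nn.Embedding(n_customers, dim)
        self.condition_emb = nn.Embedding(n_condition_buckets, dim)

    def forward(self, customer_ids, condition_ids):
        # Dot product of the two embeddings -> acceptance probability.
        score = (self.customer_emb(customer_ids) *
                 self.condition_emb(condition_ids)).sum(dim=-1)
        return torch.sigmoid(score)

model = PriceAcceptanceCF(n_customers=100_000, n_condition_buckets=5_000)
loss_fn = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative training step on a dummy batch of (customer, conditions, accepted).
customers = torch.randint(0, 100_000, (512,))
conditions = torch.randint(0, 5_000, (512,))
accepted = torch.randint(0, 2, (512,)).float()

pred = model(customers, conditions)
loss = loss_fn(pred, accepted)
loss.backward()
optimizer.step()
```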


We also created a simple Bayesian A/B testing system to tweak prices automatically for some customers and measure their price sensitivity, so the results could be recorded and fed back into our price offering system.
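
Such a test can be as simple as a Beta-Binomial model, sketched below: two price variants, their acceptance counts, and the posterior probability that the tweaked price is accepted at least as often as the baseline. The counts are made up for illustration.

```python
# Minimal Beta-Binomial Bayesian A/B test for price sensitivity.
import numpy as np

rng = np.random.default_rng(42)

# Observed results: offers shown and offers accepted per variant (illustrative).
shown_a, accepted_a = 1_000, 620   # baseline price
shown_b, accepted_b = 1_000, 575   # slightly higher price

# Beta(1, 1) prior updated with the observed acceptances, sampled by Monte Carlo.
posterior_a = rng.beta(1 + accepted_a, 1 + shown_a - accepted_a, size=100_000)
posterior_b = rng.beta(1 + accepted_b, 1 + shown_b - accepted_b, size=100_000)

# Probability that the higher price still gets accepted at least as often.
prob_b_not_worse = (posterior_b >= posterior_a).mean()
print(f"P(variant B acceptance >= variant A): {prob_b_not_worse:.3f}")
```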


The model is hosted on Azure using the Azure Machine Learning service as an API on Azure Kubernetes Service (AKS), which is well suited to high-scale production deployments and provides fast response times and autoscaling of the deployed service. With our multi-cloud motto in mind, we also created a dashboard using Google BigQuery and Data Studio to get a clear understanding of what we need to do, and we kept our final model on GCP as well, so that in case of any incident we could switch to GCP in almost no time.
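
For reference, the sketch below shows one way such an AKS deployment can look with the Azure ML (v1) Python SDK. The resource names ("price-model", "aks-prod", "price-model-env", score.py) are placeholders, not the project's actual resources.

```python
# Hedged sketch: deploying a registered model as an autoscaling API on AKS
# with the Azure Machine Learning (v1) Python SDK. All names are placeholders.
from azureml.core import Workspace, Model, Environment
from azureml.core.compute import AksCompute
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AksWebservice

ws = Workspace.from_config()
model = Model(ws, name="price-model")              # previously registered model

inference_config = InferenceConfig(
    entry_script="score.py",                       # loads the model and answers requests
    environment=Environment.get(ws, "price-model-env"),  # previously registered environment
)

aks_target = AksCompute(ws, "aks-prod")            # existing AKS compute target
deployment_config = AksWebservice.deploy_configuration(
    cpu_cores=1, memory_gb=2, autoscale_enabled=True,
)

service = Model.deploy(ws, "price-offering-api", [model],
                       inference_config, deployment_config, aks_target)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)
```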

Date

April 21, 2019

Category

AI, Cloud, Deep Learning, Google GCP, Machine Learning, Microsoft Azure, Recommendation System