Monetize your Data with Operational Optimization

ML use cases from the trenches 

 

Last week I wrote about the key tenets for building analytics teams for real, measurable impact in your organization. This week, I’ll focus on one of the four fundamental data monetization strategies that companies should employ: capitalizing on their data assets to deploy operational improvement initiatives that drive cost savings, revenue increases or both. Operational optimization initiatives are usually a good place for companies to start their analytics journey, assuming some foundational data capabilities are already in place: reliable internal data, decent data governance and tech stack, a good understanding of customer behavioral profiles and foundational data science capabilities.  

Recall that one of the biggest failure points in companies’ analytics transformation efforts is both a lack of focus (trying to pursue all key data monetization strategies at once, like operational improvements, building information businesses to sell data, or deploying customer-facing analytics products and solutions) and a lack of adoption (end users are not engaged). Admittedly, AI, ML and analytics in general have been in their hype phase for the last decade, and many organizational leaders still view these capabilities as some sort of panacea for foundational domain ailments. As you strengthen the analytics DNA of your organization, I strongly suggest you focus on one clear data monetization strategy for the first couple of years, one that builds your data foundation and captures measurable results while aligning to broader business goals or company vision. 

Having built from the ground up or transformed analytics and revenue management teams in three different industries (Consumer Packaged Goods, Retail and Distribution), I wanted to share some real-world use cases of operational optimization analytics efforts. For those with a bit more analytics acumen or who want to learn more about various machine learning techniques (really, it’s all statistics!), we will also break things down into Classification, Regression or Optimization problems. For now, I’m omitting use cases that leverage unsupervised machine learning methods (e.g.: clustering being the most popular one), though unsupervised methods are foundational inputs to your predictive analytics models.  

And before you read further, please remember that analytics is not about the technology or the algorithm. It is about understanding key business pain points, decomposing them into smaller, manageable problems that you can clearly frame and articulate (i.e., your grandma or 13-year-old child can understand them), and solving those problems with the simplest analytics method possible that will give you strong, measurable business results. Afterwards you can iterate and perfect your solution. 

 

Key Operational Optimization Use Cases from Select Industries 

 

Consumer Packaged Goods (CPG) 

For CPG, Supply Chain and Sales and Marketing improvements are by far the most impactful. Within supply chain, problem areas like predictive maintenance for production lines, inventory optimization, spend optimization and demand forecasting improvements are top of mind for most CEOs and Chief Supply Chain Officers. These are all key areas that improve the company's manufacturing efficiency and fill rates, minimize lost sales or maximize liquidity. In Sales and Marketing, the most impactful analytics improvement initiatives center around Revenue Management optimization efforts (mix management and optimization; pricing and promotional effectiveness and optimization), with marketing spend effectiveness and optimization (both advertising and shopper marketing spend/mix) a distant second. 

Retail 

Retail is similar to CPG, although Sales and Marketing use cases can often be more impactful given the B2C nature of the business. Pricing and promotional effectiveness and optimization work dominate, but customer acquisition, retention and customer lifetime value optimization projects are also important. On the supply chain side, inventory optimization is still key, and so are improvements to demand forecasts using more impactful machine learning methods, stronger feature engineering (including relevant exogenous variables), or both. Another, less talked about operational improvement initiative is intelligent automation (e.g., robotic process automation, or RPA), especially as most traditional big-box retailers still rely on internal processes that are unnecessarily complex and manual.  

Distribution 

By far the biggest area for inventory-heavy distribution businesses is Supply Chain improvements, with Sales and Marketing-focused analytics initiatives a distant second. Warehouse optimization (assortment, network optimization), product stocking and merchandising analytics, and route optimization are the problem areas where analytics can deliver the most impact for distributors. Pricing and promotional improvement opportunities are also impactful, but largely because of the lack of revenue management maturity in most distribution businesses.  

Examples of Use Cases and Machine Learning Methods 

 

Inventory optimization 

This is a mature data science problem even in many traditional industries. There are a host of specialist supply chain analytics firms dedicated to one or more industries, ranging from small analytics companies centered on a handful of use cases to well-established analytics consultancies with turnkey solutions that address all key areas of the inventory management journey. Inventory optimization requires multi-dimensional problem framing and a well-coordinated problem-solving effort across the enterprise. Next to sales and marketing-type analytics problems, this is typically the most important analytics use case for manufacturing, and especially for distribution companies. It reduces operational costs and therefore increases your operating profit, increases both B2C and B2B customer satisfaction, and reduces the need for deep reactionary markdowns due to heavy overstocking, which further improves operating profit.  

Classification problem 

Inventory auditing and cycle counting. Many companies have outdated inventory management software, or are simply bad at using it and adhering to processes, resulting in countless manual overrides of system records. This leads to physical on-hand quantities being misaligned with database records. I’ve seen this firsthand, where a company has a tough time accurately tracking what is coming into the warehouse, where it is placed and at what quantity, and what is leaving the warehouse. This results in costly inventory audits, recounts and reshuffling of inventory to ensure it is placed in the right section or bins for faster retrieval. In recent years, with the advances in image recognition algorithms, companies have started to employ automated inventory audits using static images or video feeds from cameras installed in the warehouse, or using drones. Drones can fly around the warehouse, scanning barcodes or augmenting barcode scans with image recognition to properly identify the type and quantity of product on hand, along with its location in the warehouse. Certain DNNs are particularly effective at the image recognition piece, and you can use this to update your inventory management databases in real time.  
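
To make this concrete, here is a minimal sketch (not a production pipeline) of fine-tuning a pretrained CNN to recognize product types from warehouse images. The folder layout, class labels and hyperparameters are illustrative assumptions; a real system would also handle localization, barcode fusion and the write-back to your inventory database.

```python
# Minimal sketch: fine-tune a pretrained CNN to recognize product types
# from warehouse images (e.g., drone or fixed-camera captures).
# Assumes images are organized as data/train/<product_class>/*.jpg -- hypothetical layout.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

train_data = datasets.ImageFolder("data/train", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")                       # pretrained backbone
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))    # one output per product class

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)           # only tune the new head
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                                                 # small epoch count for the sketch
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```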

Regression problem 

Demand forecasting to understand inventory levels, safety stock and reorder points. Surprisingly, most companies struggle with accurate demand forecasts, especially businesses that deal with multiple channels of distribution. In some cases, there are 2-3 layers of distribution between manufacturing and the moment the consumer finally buys the product: the item goes from the manufacturer to a distributor, sometimes to a secondary distributor, then to a retailer and finally to the end consumer. Forecasting inefficiencies at each level ripple through the supply chain of an entire industry, causing poor fill rates, stock-outs and lost profitability, overstocking and tied-up cash, and ultimately poor customer satisfaction or defection to a competitor.  

Whether you are a manufacturer or distributor wanting to improve demand forecasts, the optimal way to do it is to augment your historical shipment data with consumer sellout data (that is, with aggregated consumer purchases). For most companies, your product is ultimately sold to a consumer, even if you are in the business of selling or distributing to other companies. Having an accurate consumer forecast feed your own shipment demand forecast as an important set of exogenous variables will be a game changer. Many companies, especially in more traditional manufacturing or distribution industries, are still stuck with demand forecasts based solely on their own historical data, not accounting for other key inputs like consumer sellout. And when you do create your forecast algorithms, remember to exercise good business judgment and focus on the problem you are trying to solve. Businesses with steep assortment curves (i.e., 5% of products generate 95% of sales) will have the majority of their products with very sparse sales histories. You may find that simple moving averages suffice (and even outperform more sophisticated algorithms) given the heavy data sparsity. You may also find that even when there is strong data density, more classical statistical methods like (S)ARIMAX (the “S” and “X” standing for the inclusion of seasonal and exogenous variables, respectively) or Exponential Smoothing will do an excellent job – please remember this, as some of you in data science may be quick to reach for LSTMs or Prophet.
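
As an illustration, here is a minimal SARIMAX sketch that augments a weekly shipment history with consumer sellout as an exogenous regressor. The file, column names and model orders are illustrative assumptions; in production, the future exogenous values would come from an actual consumer sellout forecast, not from the trailing history used as a placeholder below.

```python
# Minimal sketch: weekly shipment forecast with SARIMAX, using consumer
# sellout as an exogenous regressor. File and column names are illustrative.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

df = pd.read_csv("demand_history.csv", parse_dates=["week"], index_col="week")

model = SARIMAX(
    df["shipments"],
    exog=df[["consumer_sellout"]],
    order=(1, 1, 1),
    seasonal_order=(1, 1, 1, 52),   # weekly data with annual seasonality
)
fit = model.fit(disp=False)

# Forecast the next 8 weeks; requires a consumer sellout forecast as future exog.
future_sellout = df[["consumer_sellout"]].tail(8)   # placeholder for a real sellout forecast
forecast = fit.forecast(steps=8, exog=future_sellout)
print(forecast)
```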

Optimization problem 

  • Optimizing inventory assortment and on-hand quantities. These can be computationally expensive analytics problems, particularly for large companies with tens of thousands of products to stock. However, it is a particularly important problem to solve, as stocking too much inventory ties up precious capital and cash, leading to forced discounting efforts and margin erosion. On the other hand, not having enough inventory on hand results in missed sales opportunities, share decline and lowered customer satisfaction. (A minimal code sketch of the core trade-off follows this list.) 

  • Warehouse / network optimization. Beyond simply figuring out what products at what quantities should be carried, this solves an important problem for companies with multiple distribution centers (DCs), especially those with a hub-and-spoke network. While it makes sense to hold popular products in each local DC to fulfill the demands of the local market, it is suboptimal to also carry less popular products there that serve only a niche customer segment. Instead of housing these products at your local warehouse, it may make sense to carry them in your central DC (a hub) that serves many local distribution points.  

  • Placement optimization. Is your warehouse optimally set up so the most popular products are closest to your loading docks and the least popular ones the furthest? While I am oversimplifying the problem statement, many companies are dealing with this phenomenon where they are optimizing for space based on product features, or worse, clustering products based on some hierarchical attribute (product groups, brands, sizes, etc.) as opposed to what is most optimal for their business KPIs and customers. 
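
Here is a minimal sketch of the core trade-off behind assortment selection, framed as a tiny knapsack-style problem in PuLP: which SKUs to carry at a local DC under a space budget. All values are illustrative; real implementations cover tens of thousands of SKUs, multiple echelons, and service-level and demand-uncertainty constraints.

```python
# Minimal sketch: choose which SKUs to stock at a local DC under a space budget,
# maximizing expected gross profit. All data values are illustrative.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

skus = ["A", "B", "C", "D"]
expected_profit = {"A": 1200, "B": 800, "C": 450, "D": 300}   # per period, per SKU
space_required = {"A": 40, "B": 25, "C": 20, "D": 10}         # pallet positions
space_available = 70

prob = LpProblem("assortment_selection", LpMaximize)
stock = LpVariable.dicts("stock", skus, cat="Binary")          # 1 = carry the SKU locally

prob += lpSum(expected_profit[s] * stock[s] for s in skus)     # objective: expected profit
prob += lpSum(space_required[s] * stock[s] for s in skus) <= space_available

prob.solve()
print({s: int(value(stock[s])) for s in skus})
```

The binary stocking variables are what make this a mixed-integer problem, and they are the main driver of computational cost once you scale to a full enterprise assortment.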

Pricing and Promotions 

Classification problem 

  • Predicting pricing bid wins or losses. For larger customers in more complex deal structures (e.g.: a manufacturer bidding for a retailer’s private label business), highly customized and transparent pricing is often the norm. You may be bidding against many competitors, and by knowing your historical bid performance, including competitive intelligence inputs like competitors’ prices and quality, you can build simple but effective models that predict the win probability of larger, more complex B2B pricing bids (a minimal sketch follows this list). 

  • Predicting customer churn due to pricing or promotional actions. Customers routinely defect to competitors if businesses increase prices too much, or if they simply change prices (up or down) too often. Knowing with high confidence which of your key customers will be lost or retained is an important foundational capability. 
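
A minimal sketch of the bid win/loss idea, using a plain logistic regression in scikit-learn. The file and feature names are illustrative assumptions; the real feature set will depend on the competitive intelligence you can reliably collect.

```python
# Minimal sketch: logistic regression on historical B2B bids to predict win probability.
# File and feature names are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

bids = pd.read_csv("historical_bids.csv")
features = ["our_price", "competitor_low_price", "volume_committed", "relationship_years"]
X_train, X_test, y_train, y_test = train_test_split(
    bids[features], bids["won"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

win_prob = model.predict_proba(X_test)[:, 1]        # predicted win probability per bid
print("AUC:", roc_auc_score(y_test, win_prob))
```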

Regression problem 

  • Calculate price sensitivity (elasticity). Knowing the price sensitivity of your customer or consumer base is a key input not only to basic scenario analyses but, more importantly, to your overarching pricing strategy. In B2C businesses (i.e., retail) that use promotions and discounts to spur product sales, differentiating between everyday price elasticity and promotional price elasticity is also crucial. Everyday and promotional price sensitivity, coupled with other inputs such as competitive density, competitor pricing and promotional intensity (or competitive cross-price elasticity), product pageviews, market basket effects and others, will enable you to formulate the right pricing strategy, such as the everyday-low-pricing approach that Walmart or Amazon predominantly use, or a high-low pricing strategy (deep promotional discounts, followed by longer periods of high base prices) that most grocery chains or department stores rely on. (A minimal estimation sketch follows this list.) 

  • Predict business impact from pricing or promotional changes. Believe it or not, in the age of AI and Deep Learning, multiple linear and logistic regression models still dominate most industries, particularly for demand forecasting-type problems that need interpretability and are often used for what-if scenario analyses. Suppose your suppliers just increased pricing on key products, and you have competitive intel that your competitors intend to follow suit and pass the price increases on to consumers in the next week. You need a mechanism to evaluate empirically whether holding price for N weeks is the right choice for your Unit, Revenue, Gross Profit or Market Share goals, or whether you should match competition. This type of scenario analysis should be foundational in all companies, and it needs to be dynamic and democratized (i.e., scenario tools available to merchants and finance team members as well). 
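
A minimal sketch of the elasticity estimate, using a log-log regression in statsmodels. The file, columns and controls are illustrative assumptions; a production model would add competitive, promotional and seasonal terms and separate everyday from promotional elasticity.

```python
# Minimal sketch: estimate own-price elasticity with a log-log regression.
# File and column names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

sales = pd.read_csv("weekly_item_sales.csv")
sales["log_units"] = np.log(sales["units"])
sales["log_price"] = np.log(sales["price"])

# on_promo and holiday_week act as controls; the log_price coefficient is the elasticity estimate
model = smf.ols("log_units ~ log_price + on_promo + holiday_week", data=sales).fit()
print(model.params["log_price"])    # e.g., -1.8 means a 1% price cut lifts units by roughly 1.8%
```

The same fitted model can then power a simple what-if tool: plug in a proposed price change and read off the predicted unit, revenue and gross profit impact.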

Optimization problem 

  • Promotional optimization. Knowing which products to promote, how often, for which customer segments and at what discount depth, in order to maximize profitability, unit sales, revenue or market share, is a key differentiating capability for companies. While many companies use pricing or trade promotion optimization software nowadays, I have yet to see promotional optimization done well, dynamically and at scale (at least in CPG, Retail and Distribution). The reality is that most pricing software implementations fail to deliver on their promise, and over 90% of promotions (both B2B and B2C) have a negative ROI. If you have a strong data science team with equally strong domain expertise in pricing or revenue management, I recommend building your promotion effectiveness and optimization capability in-house, as it can be more easily tailored to company and industry dynamics and to your operating rhythm. Otherwise, there are many excellent vendors to choose from in this space – just make sure you carefully evaluate the best fit for your business and build in enough flexibility in the system to enable a successful outcome.  

  • Price optimization / dynamic pricing. Price optimization is also a foundational capability for most companies that have pricing or revenue management teams. Price optimization itself may or may not be automated and dynamic. However, if you are a company that manages thousands of products across hundreds or thousands of customers in an environment characterized by relatively strong competitive density, then automated, dynamic price optimization is the way to go. I highly recommend coupling price optimization with carefully designed A/B tests since iterative, in-market price testing will give you the real answers to what your optimal price points should be (by channel, product, customer segment, etc.). 
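
A minimal sketch of the price optimization logic, assuming a constant-elasticity demand curve and a bounded grid of candidate prices. The elasticity, cost and price range are illustrative; in practice they would come from your elasticity models, cost data and business guardrails, and the recommendation would be validated with the A/B tests mentioned above.

```python
# Minimal sketch: pick a profit-maximizing price from a candidate grid under a
# constant-elasticity demand assumption. All figures are illustrative.
import numpy as np

base_price, base_units = 10.00, 1000       # current observed price and weekly units
unit_cost = 5.50
elasticity = -2.5                          # e.g., from a log-log elasticity model

candidate_prices = np.arange(8.00, 13.01, 0.25)            # bounded grid (business guardrails)
demand = base_units * (candidate_prices / base_price) ** elasticity
profit = (candidate_prices - unit_cost) * demand

best = candidate_prices[np.argmax(profit)]
print(f"profit-maximizing price in the grid: {best:.2f}")
```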

Predictive Maintenance 

Companies, especially manufacturing businesses, rely heavily on machines and systems (e.g.: assembly lines, vehicle fleets, electricity, etc.) to produce goods and services. In the absence of an algorithmic predictive maintenance system, companies have historically performed periodic maintenance to ensure proper continuity for their production machines and systems. Most traditional companies still approach maintenance this way. This creates substantial resource waste – both labor and parts. Predictive maintenance typically predicts two things:  

Classification problem 

The probability that there is a failure in the next N iterations. “N” is typically based on a specified time window (e.g.: next day, week, month), but could be based on production runs or units produced as well. To make it more actionable, this could be a multi-class problem where you are predicting the probability of various types of failures, which will give the engineering teams insight into which specific maintenance actions need to be executed. 
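
A minimal sketch, assuming you have already engineered per-machine, per-week features from sensor readings and maintenance logs. The file and column names are illustrative; a real model would need careful handling of class imbalance and time-based validation.

```python
# Minimal sketch: predict the probability of a machine failure in the next week from
# aggregated sensor and maintenance-log features. File and column names are illustrative.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

data = pd.read_csv("machine_weekly_features.csv")
features = ["vibration_mean", "temp_max", "run_hours", "weeks_since_service"]
X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["failed_next_week"], test_size=0.2, random_state=42
)

clf = GradientBoostingClassifier()
clf.fit(X_train, y_train)
failure_prob = clf.predict_proba(X_test)[:, 1]     # probability of failure in the next week
```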

Regression problem 

The amount of time (e.g., days, weeks, production runs, etc.) left before the next failure (typically called “Remaining Useful Life” in manufacturing).  

Optimization problem 

Optimizing the maintenance cadence of various systems to maximize throughput and minimize human labor and machine waste. 

For both the classification and regression problems, your choice of algorithm will depend on your specific use case: what type of failure you are predicting, the size and scope of available data, and what sort of model output you need to make things actionable for your internal customers. If you plan to use more complex inputs such as sound or image files, which can greatly augment your predictive maintenance models, then deep learning (especially LSTMs, given the time series aspect of the problem) is a safe choice. 
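
For illustration, a minimal Keras sketch of an LSTM that maps a fixed window of sensor readings to Remaining Useful Life. The shapes, layer sizes and random stand-in arrays are illustrative only; real inputs would be engineered sensor windows per machine.

```python
# Minimal sketch: an LSTM that maps a window of sensor readings to remaining useful
# life (in days). Shapes and layer sizes are illustrative.
import numpy as np
from tensorflow import keras

n_machines, window, n_sensors = 500, 50, 8
X = np.random.rand(n_machines, window, n_sensors)    # stand-in for real sensor windows
y = np.random.rand(n_machines) * 100                 # stand-in for remaining useful life (days)

model = keras.Sequential([
    keras.layers.Input(shape=(window, n_sensors)),
    keras.layers.LSTM(64),
    keras.layers.Dense(1),                           # predicted remaining useful life
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```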

Famous last words... 

Regardless of the analytics application you focus on, please remember it is always about solving a real problem that matters for your customers and makes their lives better and easier. Customers can be defined as your internal stakeholders as well as your external B2C or B2B constituents. Doing analytics for the sake of analytics will almost always result in frustration among your stakeholders – but if you ensure that you and your team spend ample time getting to know your key stakeholders and understanding their domain problems, you will be that much closer to adding real, measurable and sustainable value with your analytics efforts. And your customers will love you for it! 

I would love to learn about your own experiences building data science solutions to solve a key business pain point. Whether you have a cool analytics story (success or failure, and lessons learned) or need help thinking through an analytics and machine learning use case, get in touch with me on LinkedIn or drop me a line at armin@revologyanalytics.com.
