The answer to a single question can quickly tell me if a company is mature with its data.
That question is, “What is your average sales price (ASP)?”
If the answer is of the flavor of, “well, it’s hard to say because we sell all sorts of things to all sorts of clients,” then the organization is not mature with its data.
ASP as a metric of data maturity
When a company cannot tell what their ASP is, it indicates that there are several issues with their data.
Concomitantly, not being able to nail down your ASP can significantly impact how your sales and marketing organization operates. So why is it so difficult for some organizations to get a handle on their ASP?
The issues that bind up an organization and inhibit it from determining its ASP are:
- the market they are targeting,
- their swimlanes, and
- the parsing and binning of the data therein.
While all these may seem different, they are all symptoms of the same thing.
Making sense of your ASP
The conversation that I’ve had with numerous founders, sales leads, and other folks charged with handling sales metrics usually goes something like this:
“What’s your ASP?”
“It’s all over the place.”
“Ok, do you have any addressable market that you feel pretty good with?”
“Yes, we are great in market X,Y, & Z.”
“Fantastic, what’s the ASP in market X?”
“It’s all over the place.”
Reading it out loud or having folks write down the issue sometimes gets a mild “aha” out of them. They realize that if they parse by market segment, industry, or some other dimension, they can begin to identify an ASP or three. However, getting here sometimes requires a lot more work, and not everyone is equipped to do it. Or rather, they are equipped to deal with the problem, but other issues are far more pressing than figuring out a meaningful ASP value.
Here is where I stand on my soapbox and state: knowing your ASP needs to be part of your fundamental metrics, like churn and growth.
ASP for a market segment is an immediate indicator of your effectiveness in that segment. Suppose your organization claims to be significant in a particular industry. In that case, your ASP in that sector should be growing, and growing faster than your ASP in most other market sectors.
For example, at FatStax, we claimed we were dominant in the life science sector; however, our ASP growth was stagnant. We didn’t think we were competitive in the HVAC sector; however, as our sales tactics matured, our ASP growth crept slowly up and to the right. Based on those and several other indicators, we refocused our sales team on these more traditional markets. The result? FatStax was acquired by BigTinCan less than a year later.
Now that you have identified that your ASP needs to be accurately measured, how can you do it?
What is your data dependent on?
The first step in solving your data problem is identifying the independent axis.
In the case of business data, the independent axis is the dimension you will segment by. While it may seem obvious to parse your market by employee count or annual revenue, you need to know how your product or service naturally partitions according to your buyers.
At SkySync, we sold content migration software. Since content scales with the people who create it, I used the number of employees as the market segment metric.
At FatStax, we sold a digital catalog to field sales representatives. In this case, the number of SKUs and field sales representatives were how we demarcated market segments.
With Systematic Ventures, we sell an analysis package to private equity investors; hence, assets under management is how we break out the market segments.
Now that you’ve figured out how your market segment is partitioned, it is time to figure out the dependent axis.
What revenue is your dependent variable?
While this will by and large be a revenue metric, the question is which one. Is it the “all-in price,” or is it simply new software sales each month, or the annualized contract value? Figuring out your yardstick for revenue is key; without it, you could be comparing apples to orange meteors: two things that have zero relationship to each other.
Consider the SkySync example.
We sold data migration software; each package had a software cost and a services cost, and both prices were variable. When we aggregated the two and plotted against the independent variable, the results were muddled; however, when we parsed the revenue data and plotted only annual software costs against employee counts, the results in each plot became much clearer.
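The value of parsing can be sketched in a few lines. This is a minimal illustration with made-up figures (every number below is hypothetical): when one revenue component tracks company size and the other is scoped per project, correlating each series against employee count shows why the parsed view is cleaner.

```python
# Illustrative sketch with hypothetical figures: aggregate revenue vs. a
# single parsed component, each correlated against employee count.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, standard library only."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

employees = [50, 120, 300, 800, 2000]
software = [e * 10 for e in employees]          # tracks company size closely
services = [4000, 90000, 7000, 150000, 12000]   # scoped per project, noisy
aggregate = [s + v for s, v in zip(software, services)]

r_aggregate = pearson(employees, aggregate)
r_software = pearson(employees, software)
# Software-only revenue tracks employee count far more tightly
# than the aggregate does.
```

In this toy data the software-only series correlates almost perfectly with employee count, while the aggregate barely correlates at all, which is the same effect we saw at SkySync.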
With that covered, here is the three-step process for parsing your data.
Step 1: Plot your data
Now that you’ve got your independent and dependent variables, it is time to plot them.
In this example, I’ve generated a random matrix of the number of employees (independent variable) and revenue dollars (dependent variable).
This plot looks pretty stochastic, similar to several data sets I’ve reviewed for companies in the past, where the dependent variable, price, tends to follow the independent variable, number of employees.
The first objection I hear is, “we don’t have enough data to do this.”
For me, that doesn’t particularly matter; just plot it.
You may find you have considerable variability in your data, or you may find there isn’t much variance, in which case you can quickly parse by eye, splitting your data into three groups: SMB, Midmarket, and Enterprise.
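Eyeballed cut points like these are easy to encode once you have them. A minimal sketch, with hypothetical thresholds (100 and 1,000 employees) that you would tune to your own plot:

```python
# Hypothetical cut points for eyeballed segments; tune these to your data.
SEGMENT_BOUNDS = [
    ("SMB", 0, 100),
    ("Midmarket", 100, 1000),
    ("Enterprise", 1000, float("inf")),
]

def segment(employee_count):
    """Return the market segment label for a given employee count."""
    for name, low, high in SEGMENT_BOUNDS:
        if low <= employee_count < high:
            return name
    raise ValueError(f"no segment for {employee_count}")
```

With this in place, every deal record can be tagged with a segment label in one pass, which is all the later ASP analysis needs.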
Step 2: Apply a trendline
While this sounds fancy, it really isn’t.
Anyone reading this article has more computational power at their disposal than the entire Apollo space program had throughout its nine years in operation. In this case, applying a trend line is a relatively trivial task; the key is understanding which trend to apply.
In most spreadsheet packages, there are several options to choose from: linear, exponential, or polynomial. Choose the polynomial: a linear fit is just a straight line, which won’t reveal segments; an exponential only curves one way; a polynomial of degree n can give you up to n - 1 humps, which is exactly what makes segments visible. Remember, the goal isn’t being as precise or accurate as would be required to land something on Mars or even the moon; the goal is figuring out what your market segments are.
You aren’t even trying to figure out the best-fit equation for your dataset; you are simply applying a heuristic so you can begin segmenting your market.
In the above example, I applied a polynomial function that much more readily allowed me to see where there might be apparent segments in my data.
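If you want the trendline outside a spreadsheet, a least-squares polynomial fit is short enough to write by hand. A sketch in plain Python (no libraries assumed), checked against a known quadratic so you can see it recovers the coefficients:

```python
# Minimal polynomial trendline fit (least squares via the normal equations),
# standard library only; a spreadsheet's "add trendline" does the same job.

def polyfit(xs, ys, degree):
    """Fit y = c0 + c1*x + ... + cd*x^d; returns coefficients [c0..cd]."""
    n = degree + 1
    # Build the normal equations A c = b.
    A = [[float(sum(x ** (i + j) for x in xs)) for j in range(n)] for i in range(n)]
    b = [float(sum(y * x ** i for x, y in zip(xs, ys))) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for row in range(col + 1, n):
            f = A[row][col] / A[col][col]
            for k in range(col, n):
                A[row][k] -= f * A[col][k]
            b[row] -= f * b[col]
    # Back substitution.
    coeffs = [0.0] * n
    for row in range(n - 1, -1, -1):
        s = sum(A[row][k] * coeffs[k] for k in range(row + 1, n))
        coeffs[row] = (b[row] - s) / A[row][row]
    return coeffs

# Sanity check on a known quadratic: y = 1 + 3x + 2x^2
xs = [0, 1, 2, 3, 4, 5]
ys = [1 + 3 * x + 2 * x ** 2 for x in xs]
c0, c1, c2 = polyfit(xs, ys, 2)
```

On real, noisy revenue data the recovered curve won’t be exact, and it doesn’t need to be; per the point above, you only need the humps, not a moon landing.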
Here I would caution you that moving from this step to the next, where you demarcate the segments, requires a bit of thoughtful consideration. While it would be easy to simply drop vertical lines to create bins, consider these three factors:
- how easy is it to search for prospects given your segmentation?
- how continuous is your independent variable? and
- how discontinuous is your dependent variable?
- e.g., is there a segment that typically pays more or less than would be expected?
The key here is marrying the quantitative (trendline) and qualitative (your gut) data to come up with ‘swimlanes’ or market segments that make sense to you and your company and, importantly, are easily searchable.
Step 3: Deploy your swimlanes
Once you’ve decided where the swim lane demarcations should be, it is time to run some analysis.
Now for the first time, you and your organization can start to come up with your ASP values.
From the illustration below, you can see that while the data is random, the polynomial helps us visualize several different swimlanes. I recommend starting the segmentation process as granular as possible, then binning multiple groups together.
For example, at Duo Security, we parsed our market segments along the lines of employee counts as listed on LinkedIn as they were easy to identify and if we moved data providers it would not affect our segments.
The result was a matrix such as:
In this way, we always kept the most granular values, so if we needed to parse into different or finer segments later on, we could do so without fully redoing the entire process.
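The roll-up approach can be sketched as follows. Band edges and deal figures below are hypothetical; the point is that coarser swimlanes are just unions of the granular bands, so nothing needs reprocessing when you merge them.

```python
# Sketch: keep deal data binned at the most granular employee bands, then
# roll bands up into coarser swimlanes without touching the raw deals again.

deals = [  # (employee_count, sale_price), all figures hypothetical
    (8, 1200), (30, 2500), (70, 4000), (150, 9000),
    (400, 15000), (900, 30000), (3000, 80000),
]

FINE_BANDS = [(1, 10), (11, 50), (51, 100), (101, 500), (501, 1000), (1001, 10**6)]

def asp_for_bands(bands):
    """Average sale price per band; None where a band has no deals."""
    out = {}
    for low, high in bands:
        prices = [p for e, p in deals if low <= e <= high]
        out[(low, high)] = sum(prices) / len(prices) if prices else None
    return out

fine = asp_for_bands(FINE_BANDS)
# Later, merge the fine bands into three swimlanes; no rebinning of deals.
coarse = asp_for_bands([(1, 100), (101, 1000), (1001, 10**6)])
```

Because the coarse bands share edges with the fine ones, any future re-segmentation is just a different list of band tuples passed to the same function.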
When we ran the numbers, we’d also be able to see if there were any natural convergences or segmentations happening that might affect the way we segmented moving forward. As well, and likely a topic for another time, we also began to parse by industries, with each industry possibly having different market segments.
With that out of the way, it is time to figure out the minimal level of data you and your organization should collect for each swimlane.
Those metrics would be:
- Average sales price (ASP)
- Median sales price (MSP)
- Mode, and
- Sales cycle
Average vs. Median
First and foremost, the key metric to measure for each swimlane would be the average sales price (ASP). However, the average is not necessarily the best measure for each segment; the median sales price often is. Where the ASP is simply the average value of the group, the median is the middle of the group. While these may seem the same, they sometimes are, and sometimes aren’t.
If you have a symmetric, or normal, distribution, the average and median are the same. If the values are skewed one way or another, the median is a much better measure: it sits at the middle of the group, whereas the average gets pulled toward one end. Think of a group of prices skewed by several higher-paying clients and a few lower-priced clients. The ASP would be pulled toward the higher-paying clients, whereas the median would be the middle price point. This is also the reason you should measure the mode.
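Python’s standard statistics module makes the difference easy to see. The deal sizes below are hypothetical, skewed by two large clients:

```python
import statistics

# Hypothetical deal sizes: most clients pay around $6-7.5k, but two large
# clients drag the average up while the median stays at the typical deal.
prices = [5_000, 6_000, 6_500, 7_000, 7_500, 60_000, 95_000]

asp = statistics.mean(prices)    # pulled toward the big deals
msp = statistics.median(prices)  # the middle of the pack
```

Here the median lands on the typical deal while the average is several times higher, which is exactly the gap that makes ASP alone a misleading yardstick for a skewed segment.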
The mode is the most commonly occurring value in a set of data. Particularly for SaaS products, where prices fall on a discrete scale, so to speak, similar values keep coming up.
Why is the mode important?
At Duo Security, we sold licenses in blocks of ten up to one hundred, blocks of twenty-five up to five hundred, then blocks of one hundred. When we launched Duo Access, a feature-rich version of our platform, alongside our original product, we wanted to know if the original Duo product would cannibalize the new Duo Access product. We found that in the smallest swimlane, 2-50 employees, the mode for our original product was 20; in contrast, it was 40 for Duo Access. This confirmed our hypothesis that the smallest organizations did not all need these feature sets, yet there was still a need for Access even in the smallest market segments, and it would drive a profit.
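A sketch of that comparison, using hypothetical license counts that echo the block pricing described above:

```python
import statistics

# Hypothetical license counts per deal in the smallest swimlane,
# echoing the block sizes described in the text.
original_product = [10, 20, 20, 30, 20, 10, 20]
access_product = [20, 40, 40, 30, 40, 50, 40]

mode_original = statistics.mode(original_product)  # most common block size
mode_access = statistics.mode(access_product)
```

With the two modes side by side per swimlane, the cannibalization question becomes a one-line comparison instead of a debate.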
The last basic metric to measure for each segment is the sales cycle: how long it takes a prospect to get to paying. Knowing the sales cycle, married with the ASP or MSP, can make or break a company. With these two numbers, it is easy to quickly key into the more lucrative, fast-converting prospects.
At SkySync, if a rep was coming into the third month of the quarter with a pipeline that was a little light, putting them at risk of missing their quota, they’d be directed to the fast-converting, low-friction market vertical: law firms.
Why? We knew that law firms that employed a full-time IT manager were large enough to pay a reasonable price, wouldn’t haggle on our price, and would close three times faster than a SaaS company of a similar size.
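One simple way to operationalize this is to rank segments by revenue velocity, i.e. ASP per day of sales cycle. The segment names and figures below are hypothetical:

```python
# Sketch: rank segments by revenue velocity (ASP per day of sales cycle).
# Segment names and figures are hypothetical.
segments = {
    "Law firms": {"asp": 30_000, "cycle_days": 30},
    "SaaS companies": {"asp": 35_000, "cycle_days": 90},
    "Manufacturing": {"asp": 50_000, "cycle_days": 120},
}

def velocity(seg):
    """Dollars of ASP earned per day of sales cycle."""
    return seg["asp"] / seg["cycle_days"]

ranked = sorted(segments, key=lambda name: velocity(segments[name]), reverse=True)
# Law firms lead despite the lowest ASP: $30k / 30 days = $1,000 per
# day of pipeline time.
```

Note that the segment with the highest ASP is not the one a quota-pressed rep should chase at the end of a quarter; velocity, not sticker price, picks the vertical.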
When you or your colleagues can’t appropriately state your ASP, it is a staggeringly strong indication that the organization does not have a good handle on its data. The average sales price, median sales price, mode, and sales cycle are all metrics that should be commonly known across the top level, and at the very least known to the sales representatives working their segments.
As Lars Nilsson puts it, sales reps are coin-operated. Just handing them a patch and a quota is not an avenue for their success. Sales reps need to know many things about their target market to be successful, with the ASP being a core part of their standard equation:
Quota attainment = #prospects * conversion rate * ASP
The above equation is the most rudimentary version of a quota attainment plan you can hand a sales rep: a sales leader can state, objectively, on average how many prospects need to be fed into the funnel every day/week/month for the rep to be successful. This equation can be expanded and drilled into in a number of ways; however, every way it is sliced, ASP is a core component. ASPs can be different for every market segment, industry, and even geolocation. The key is collecting all the data, reviewing your model on a semi-annual basis (too much review and adjustment will kill morale; trust me, I’ve been there), then slowly pivoting into a system that maximizes growth.
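Rearranged, the same equation tells a sales leader exactly how many prospects a rep needs. The quota, conversion rate, and ASP below are hypothetical:

```python
import math

# Quota attainment = prospects * conversion_rate * ASP, solved for prospects.
def prospects_needed(quota, conversion_rate, asp):
    """Number of prospects a rep must feed into the funnel to hit quota."""
    return math.ceil(quota / (conversion_rate * asp))

# A hypothetical $300k quarterly quota at a 5% conversion rate and $10k ASP:
n = prospects_needed(300_000, 0.05, 10_000)  # 600 prospects per quarter
```

Divide that by the working days in a quarter and the rep has a concrete daily prospecting target, which is the whole point of knowing the ASP in the first place.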