Thursday, 29 January 2009
Meaningful
When we hear the word "average" we know what it means and we know it's a good number to know because it tells us something useful, right?
Wrong.
I work in an industry where numbers are used all day, every day.
Still, we bandy average around far too frequently for my liking.
When we say "average" we generally use it to describe a common value within a set of data.
If we're talking about the number of days a service provider may take to do something for us, we might talk about something being turned around on average in 10 days.
As a customer, that gives us a measure of expectation.
In this example we'd probably calculate the "average" by adding up all the numbers of days that a sample of work took the company and divide it by the number of values we added up. That would be the arithmetic mean.
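That calculation is simple enough to sketch. Here's a minimal example, using made-up turnaround times rather than any real provider's figures:

```python
# Hypothetical turnaround times (in days) for a sample of jobs -- invented numbers.
turnaround_days = [9, 10, 11, 10, 8, 12, 10]

# Arithmetic mean: add up all the values, divide by how many there are.
mean = sum(turnaround_days) / len(turnaround_days)
print(mean)  # 10.0
```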
I don't have a problem with calculating the mean, per se, but a lot of people forget that for it to be a useful value the underlying numbers need to conform, more or less, to what is typically called a "normal distribution".
The graph at the top of the page shows a normal distribution. If you calculate the arithmetic mean of the numbers being graphed here you'll get the number in the middle of the graph, at the point where the incidence is most frequent.
However, if the graph of the numbers you're looking at doesn't form this sort of shape, then there's little or no point calculating an arithmetic mean.
Imagine for a second that the graph is flipped upside down. Now the most frequent incidences of the numbers are at the beginning and end of the graph. If you calculate the arithmetic mean, it'll still come out as the number in the middle of the graph, but now it represents almost none of the data in the data set. Ergo, it's just a number that doesn't tell you anything about what you can expect in the real-life stuff underlying the data.
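You can see this with a small sketch. The sample below is invented, U-shaped data: the values cluster at the two extremes, yet the mean lands squarely in the empty middle:

```python
# A U-shaped (bimodal) sample -- invented data, values pile up at the extremes.
bimodal = [1, 1, 2, 18, 19, 19]

mean = sum(bimodal) / len(bimodal)
print(mean)  # 10.0

# How far is the nearest actual observation from the mean?
closest_gap = min(abs(x - mean) for x in bimodal)
print(closest_gap)  # 8.0 -- nothing in the data comes anywhere near the "average"
```

The mean of 10 is a perfectly valid calculation, but as an expectation it describes a value that never actually occurs.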
Averages are OK, but context is everything.