Ian Ayres reminded me how much I like Pandora and the BoDeans. It had been a while since I'd paid attention to either, but when he used Pandora as an example of the way regression analysis can help an organization more accurately meet its members' needs, I was hooked. Sure, I knew Pandora had numbers behind its recommendations, but Ayres explained the use of data in a very accessible manner.
As Ayres said, "If you aren't running some regressions and running the numbers, you're screwing up."
Clearly there is a need for data-driven decisions. ASAE's 7 Measures of Success includes data-driven strategy as one of the measures of a great organization. Ayres agrees. He also pointed out that anyone trying to introduce data-driven decision-making into an organization that traditionally doesn't use data will run into the "Iron Law of Resistance": the natural tendency of most people to say, "What I do is too special to be aided by statistics."
How might associations apply regression analysis? Start by moving from a dashboard, which provides descriptive statistics, to regressions, which provide predictive statistics. Come up with attrition scores for your members, and go beyond engaged/not engaged to a membership value score. A decision rule tells you whether a study is worth doing: if you aren't willing to write down a decision rule, it isn't worth doing the study. For example, "if a member's value score falls below X, don't offer renewal" is a decision rule.
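A decision rule like the one above is simple enough to sketch in code. Everything here is hypothetical: the threshold, the scores, and the function name are invented for illustration, not taken from any real membership system.

```python
# Hypothetical sketch of a written-down decision rule applied to member
# value scores. The threshold and all scores below are invented.

RENEWAL_THRESHOLD = 40.0  # hypothetical cutoff "X" from the decision rule


def renewal_decision(member_value_score: float) -> str:
    """Apply the rule: value score below the threshold -> don't offer renewal."""
    if member_value_score < RENEWAL_THRESHOLD:
        return "do not offer renewal"
    return "offer renewal"


# Invented example members and scores.
members = {"A-1001": 72.5, "A-1002": 18.0, "A-1003": 55.3}
for member_id, score in members.items():
    print(member_id, renewal_decision(score))
```

The point isn't the code itself but that the rule is explicit before the study runs: anyone can see exactly what action follows from each score.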
Good regressions produce both a prediction and an estimate of that prediction's precision; the regression procedure generates the precision estimate alongside the prediction itself. For example, the forecast feature on Bing. Super crunching can go wrong, though: think AIG.
Randomized testing is more powerful than regression because you get more buy-in from decision-makers using the data. For example, a charity wants to raise X dollars. Through randomized testing, the charity learns that people who have already donated tend to respond more to "to go" appeals (how much remains to be raised), whereas people who have not yet donated tend to respond to the competitive pressure of "to date" appeals (how much has been raised so far).
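The mechanics of a randomized test like the charity's are simple: split recipients at random between the two framings, then compare response rates. This sketch simulates that; the response probabilities are invented, and the arm names just echo the example above.

```python
import random

random.seed(7)

# Hypothetical simulation of a randomized test of two appeal framings.
# The "true" response rates below are invented for illustration only.
TRUE_RATES = {"to go": 0.12, "to date": 0.08}

# Randomly split 2,000 recipients into two equal arms -- the randomization
# is what lets us attribute any difference in response to the framing.
recipients = list(range(2000))
random.shuffle(recipients)
arms = {"to go": recipients[:1000], "to date": recipients[1000:]}

results = {}
for appeal, group in arms.items():
    responses = sum(random.random() < TRUE_RATES[appeal] for _ in group)
    results[appeal] = responses / len(group)
    print(f"'{appeal}' appeal: {results[appeal]:.1%} response rate")
```

In a real test, the shuffle would be a random assignment in the mailing system and the rates would come from actual responses rather than a simulation.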
Randomized testing is more accurate than "experts" because it focuses only on what actually comes up and at what frequency. Experts tend to over-emphasize 3rd- and 4th-level factors rather than properly weighing 1st-level factors.
Randomized testing differentiates between cause and effect; regression analytics doesn't always.
The flaw with randomized testing is that you need a minimum sample size to get a significant benefit. You can do credible studies with 200 or 300 participants, but larger groups are better. Randomized testing and regression analytics may not be the best techniques to use when there are insufficient numbers, when the results don't justify the cost, when the data pools aren't consistent enough, or when there are legal or ethical issues.
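The minimum-sample-size question can be estimated up front with the standard two-proportion sample-size formula. This sketch uses invented figures (an 8% baseline response rate versus a hoped-for 12%) purely to show the calculation.

```python
import math
from statistics import NormalDist

# Standard two-proportion sample-size approximation. The rates passed in
# below are invented for illustration.
def sample_size_per_arm(p1: float, p2: float,
                        alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate recipients per arm to detect p1 vs. p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)


# Hypothetical: 8% baseline response vs. a 12% response under the new appeal.
n = sample_size_per_arm(0.08, 0.12)
print(f"need about {n} recipients per arm")
```

Running this shows why small organizations hit the sample-size wall: even a fairly large effect (8% to 12%) calls for several hundred recipients per arm, and smaller effects need far more.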
I would love to do more with data, but I'm afraid WATDA might fall into the too-small-a-data-pool category. This is on my list to research further once my current projects are completed.
As an aside, Ian Ayres did extensive research on discrimination in auto finance transactions.