This op-ed of mine ran in yesterday's Gazette. It's about the study of right-to-work-for-less mandated by the legislature earlier this year. It also gave me a chance to give some love to one of my dear departed professors.
Most people probably had a teacher or professor who left a mark. One who did for me was the late Marshall sociology professor Bill Westbrook, whom some readers may recall.
When I knew him, he taught basic and advanced research and statistics, four classes I avoided as long as possible. Given the option, I preferred to BS my way through theory classes. But you couldn’t BS Westbrook.
He taught that doing research right really matters, especially when people’s lives are at stake.
A case in point is the Republican-led Legislature’s decision to have WVU study proposed “right to work” legislation (RTW), which I think of as “right to work for less.” I believe this legislation, if enacted, would reduce wages, benefits and living standards for workers across the state, union and non-union.
But love it or hate it, it’s important to get the research right. We’re playing for real marbles.
Westbrook emphasized three things relevant to any credible RTW study: causality, validity and reliability.
We’re great at shooting from the hip about cause and effect. Turn on any talking heads TV show or listen to many conversations and you’ll hear all kinds of wild claims about this causing that.
For example, a friend of mine believes that unpleasant events are caused by the planet Mercury being in retrograde, whatever that means. A school friend once went on an extended rant about how juvenile misbehavior was caused by teaching evolution. (I was tempted to interject that the delinquents I knew didn’t seem to grasp the finer points of natural selection, but he was on a roll.)
Supporters of RTW often claim that huge benefits will follow as a direct result or that harm is done by not implementing it.
I learned from Westbrook that before you can say X causes Y, three criteria need to be met, two of which are easy and one of which isn’t. First, the proposed cause and effect have to be associated or correlated. Second, the cause has to come before the effect. Then the kicker: you have to be able to rule out everything else. Most of the time, direct causality is hard to establish. Often, the best we can do is to calculate how much we can reduce our errors if we take certain variables into account.
Two other key ideas for a credible RTW study are validity and reliability. To be valid, the methods chosen need to be ones that can actually measure what one is trying to study. Thermometers are great for measuring heat, but not so much for measuring speed. Clocks are great for measuring time but lousy for measuring weight. Individual case studies or surveys might be right for some kinds of research but not for others, which may require statistical analysis, controlled experiments or other methods.
Reliability simply means that if others repeat the study using similar valid means, they should get the same result. That too can be a kicker. Several years ago, some scientists made worldwide headlines by announcing they’d discovered cold fusion. If that were true, it would provide a clean and virtually endless supply of energy. Alas, their research couldn’t be replicated.
Now back to RTW. Recently, two heavy-hitting economists, Dr. Richard Freeman of Harvard and Dr. Paula Voos of Rutgers, sent WVU a letter outlining five standards for a credible (i.e. valid and reliable) study of the subject. Here goes:
• First, the study should measure outcomes most directly impacted by RTW (i.e. weakening unions and lowering wages and benefits), rather than factors such as per capita income, which includes things like profits, dividends and interest income.
• Second, the data should cover a relevant time period, with more recent information given greater weight. “Data that go back to the 1950s or 1960s are less relevant to assessing the likely impact of a RTW law in 2015 than data for recent years, when globalization and inequality have massively changed the U.S. labor market.”
• Third, the study should select “appropriate and robust statistical design.” This means controlling as much as possible for other factors that could influence a state’s economy and growth, such as geography, natural resources, etc. Translation: not shooting from the hip about causality.
• Fourth, the study should compare with already existing research and “consult the major, rigorous studies done by other organizations,” particularly a recent study by the University of Kentucky. In addition, “It is critical that any study make its data and methodology transparent and available to the public so that others can test and confirm the results.” Translation: transparency helps ensure the study was valid and its results will be reliable.
• Finally, the study should rely on quantitative evidence, not anecdotes or testimonies from business groups, unions or other interested parties.
Freeman and Voos argue that this is a big decision that should be made only after a careful look at all the possible upsides and downsides and a careful review of how this would or would not fit in with the state’s economic development strategies.
This is no time to shoot from the hip — especially when West Virginia’s working people are the targets.