Data Transparency is a Really Big Deal

I consider myself many things — dedicated Flyers fan, irrationally fearful of monkey attacks, the most fabulous person you’ll ever meet, and wickedly humble. But I am foremost a massive proponent of data transparency.

So when I read this article in The New York Times recently, I was appalled. The article is about Michael LaCour, a political science grad student at UCLA, who had a fascinating question: Can canvassers with a personal stake in a political issue directly change the opinion of a voter?

He picked the best time to pursue this question: it was 2012, and California was in the midst of trying to legalize same-sex marriage. He decided to study canvassers who self-identified as part of the LGBTQ community and to measure how voters responded after their interactions.

A study of this magnitude takes a few things, starting with funding and clout. Mr. LaCour found the latter in Dr. Donald P. Green, the Godfather of modern field experiments in political campaigns. In 2014, their study was published in the journal Science.

This is why it’s kind of a big deal. I’ve spent my entire academic career studying the social sciences (my undergrad is in political science), and I can attest that mainstream science gives no credence to the social sciences. The major reason is that some social science theories cannot be properly “tested” by the scientific method. I won’t annoy everyone with my standard “there are other ways to test a theory” tirade (ahem, the Meta-Policy and Third Wave Evaluation approaches, ahem). To publish such a landmark study in such a prestigious journal as a graduate student? It’s the academic equivalent of Sidney Crosby becoming eligible for the NHL draft: every team would want you.

Dr. Green agreed to co-author the study because of the importance and timeliness of the subject matter. Mr. LaCour also assured Dr. Green that his funding was solid, although Dr. Green never did find out where Mr. LaCour secured it. Also perplexing is the fact that no one, not even Dr. Green, ever saw Mr. LaCour’s raw data.

The paper published in Science reported a 12 percent response rate among participants. That’s fairly high for something voluntary and multifaceted. Mr. LaCour did report that he was paying participants, which would account for the elevated numbers; paying participants is a standard practice, widely used in marketing data collection. However, when fellow researchers tried to replicate the results in a similar political setting in Florida, their survey yielded only a 3 percent response rate.

After this failure to replicate, several people requested Mr. LaCour’s raw data. He claimed he had deleted his data files to protect the identity of respondents. For the analysis projects I have worked on for Montgomery County Community College, I am given raw prospective-student data, which contains sensitive information. I am cognizant of that, so I encrypt my files and keep backups. I also have an open-door policy about my raw data, particularly with my colleague Simon Lindsay, who acts as my Svengali during analysis projects. So, frankly, I don’t buy Mr. LaCour’s excuse, and I find it downright upsetting.
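To give a sense of what that practice looks like in code, here is a minimal sketch of an encrypt-and-back-up step using Python’s cryptography package. The file names and paths are placeholders, not my actual project files, and the key handling is deliberately simplified.

# Minimal sketch: encrypt a raw data file and keep a backup copy.
# Assumes the "cryptography" package (pip install cryptography);
# the file names below are placeholders, not real project files.
import shutil
from pathlib import Path

from cryptography.fernet import Fernet

RAW_FILE = Path("prospective_data.csv")        # hypothetical raw export
ENCRYPTED_FILE = Path("prospective_data.enc")  # encrypted copy
BACKUP_DIR = Path("backups")

def encrypt_and_back_up(key: bytes) -> None:
    """Encrypt the raw file, then back up the encrypted copy."""
    fernet = Fernet(key)
    ciphertext = fernet.encrypt(RAW_FILE.read_bytes())
    ENCRYPTED_FILE.write_bytes(ciphertext)

    # Back up the encrypted copy, never the plaintext.
    BACKUP_DIR.mkdir(exist_ok=True)
    shutil.copy2(ENCRYPTED_FILE, BACKUP_DIR / ENCRYPTED_FILE.name)

if __name__ == "__main__":
    # Generate the key once and store it separately from the data
    # (e.g., in a password manager), so a leaked file stays unreadable.
    key = Fernet.generate_key()
    encrypt_and_back_up(key)

The point isn’t the particular library; it’s that encrypting files protects respondents without destroying the raw data, so “I deleted it for privacy” doesn’t hold up.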

The article suggests Mr. LaCour really wanted his theory to be right, and that desire may have led him to make some career-limiting decisions along the way. It also may have sullied the reputation of a well-respected scholar (who should have been more proactive, and readily admits as much in hindsight). And it further delegitimizes social science research within the scientific community.

Transparency should be a top priority when conducting any analysis, particularly with surveys. Data tells a story, and sometimes it isn’t the story your client would like to hear. Either way, it is your responsibility to hold yourself accountable: if someone calls your findings into question, that’s what the raw data is for. Have I made mistakes in my calculations? Yes. But I always had someone there to vet my data before it was presented. I cannot stress enough the importance of data vetting, and vetting cannot be effective unless the entire picture is shown.
