I love doing research. Actually, I like finding out new stuff. But sometimes the research process makes me rue the fact that I work on a dry campus.
Like this week.
I've been working on a paper whose data I needed to update. Since the latest version was a rush job put together for a conference (yes - this happens a lot), I decided to go back and check every line of my program (always a good thing to do). I also wanted to do the anal-retentive (I know, that's redundant - except in research, where it's expected) thing of accounting for exactly what happens to my sample at each filtering step. While doing this, I found out that I'd used the wrong data code for one of my variables - one of my MAIN variables. So, the whole data set was, in a word, crap.
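(If you're wondering what that filter-by-filter accounting looks like, here's a rough sketch in Python with pandas. To be clear, this isn't my actual code - the variable names, filter rules, and data below are made up purely for illustration. The idea is just to record the observation count after every filtering step so you can see exactly where the sample shrinks.)

    import pandas as pd

    def apply_filters_with_audit(df, filters):
        """Apply a sequence of (name, mask function) filters to df,
        recording the observation count after each step so sample
        attrition can be reported line by line."""
        audit = [("raw sample", len(df))]
        for name, mask_fn in filters:
            df = df[mask_fn(df)]
            audit.append((name, len(df)))
        return df, pd.DataFrame(audit, columns=["step", "n_obs"])

    # Made-up data and filter rules, purely for illustration.
    raw = pd.DataFrame({
        "price":  [12.0, 0.5, 45.0, 3.0, None],
        "assets": [100, 250, None, 75, 40],
    })
    steps = [
        ("non-missing price",  lambda d: d["price"].notna()),
        ("price >= $1",        lambda d: d["price"] >= 1.0),
        ("non-missing assets", lambda d: d["assets"].notna()),
    ]
    sample, attrition = apply_filters_with_audit(raw, steps)
    print(attrition)   # one row per filtering step, with its n_obs

A table like that, run before and after a fix, also makes it much easier to pin down exactly which step a discrepancy crept into.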
After taking a deep breath, I made the corrections and redid most of the analysis. Luckily, the results still held, with minor modifications.
Then I discovered a minor discrepancy in the number of observations at one step. It's probably not important, but I need to track it down before I go further. So, since my coauthor reads the blog, he'll just have to wait. But I'm getting close, so I should be able to finish my part and ship it off to my coauthors in another day or two.
Then a coauthor on another paper told me that she'd found an error in her own coding. In this case, correcting it quadrupled our sample size. If you're an empiricist, you know how much going from about 90 observations to about 400 means. If not, let's just say it's a big deal to the alpha nerds among us (and that description applies to most of my friends).
So, like most days in the research salt mines, there's some good and some bad.
Now about that "dry campus" thing...