Oftentimes I come across a question that baffles me to this day. At this point, I encourage my mathematically inclined colleagues to chime in. There is this 'thing' people do where they ask what type of data you have analyzed in the past. Have you analyzed oncology data? Have you touched psychology data? I get why the question is asked: you want someone who is used to all the nuances that come with a given type of data. For instance, you need to know that certain parameters in virology data require a log transformation. You can't simply dive in and start analyzing CD4 data; you need the log. I get it.

But there is one question that makes me scratch my head to this day, and that is: what is the largest sample size you have worked with in the past? As I reflect on the question, many thoughts come rushing in. If a person has only been exposed to sample sizes of 200, they are well equipped to handle 500. That same person could probably handle the analysis with 100 more cases added to the sample. It all depends on the point of reference, which I have mentioned in previous blogs. If you have 30,000 cases, well, that should be disclosed in your questioning.

A better question would be: have you dealt with survey data? Or legacy data? Because nine times out of ten, certain types of data come with thousands of records, and there is a particular way of dealing with those scenarios. This is just my two cents. What do you think?
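For readers curious about the log transformation I mentioned, here is a minimal sketch of what that step looks like in practice. The CD4 values below are hypothetical, made up purely for illustration; the point is simply that skewed counts are often analyzed on the log10 scale.

```python
import math

# Hypothetical CD4 counts (cells/uL) -- illustrative values, not real data.
cd4_counts = [350, 500, 1200, 80]

# CD4 counts tend to be right-skewed, so analyses are commonly
# run on the log10 scale rather than the raw counts.
log_cd4 = [math.log10(c) for c in cd4_counts]

print([round(v, 3) for v in log_cd4])  # -> [2.544, 2.699, 3.079, 1.903]
```

Nothing fancy, but it is exactly the kind of nuance you would want an analyst to know before they dive into the data.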
-Moore to follow-Amy