Sunday, July 22, 2012

Is replication an issue in music cognition?

This week the 12th International Conference on Music Perception and Cognition (ICMPC) is being held in Thessaloniki, Greece. For a whole week, hundreds of researchers will present their latest work in a dense program with five parallel sessions and four keynotes. Slightly overdone perhaps, but it shows the still-growing international interest in music cognition as a research topic.

On the first day there will be a symposium on 'Replication'. By way of introduction, below is a blog entry that was originally published in May 2010:

"In the last few years Web-based experiments have become an attractive alternative to lab-based experiments. In addition to the advantages of versatility and the ecological validity of the results, Web-based experiments can potentially reach a much larger, more varied and intrinsically motivated participant pool. Especially in the domain of music perception and cognition, it is important to probe a wide variety of participants, with different levels of training and cultural backgrounds.

Nevertheless, getting research published that takes advantage of the Internet is not straightforward. An important reason for the conservatism of some journals in publishing results obtained with Web-based experiments is the issue of replicability. Especially in the fields of experimental psychology and psychophysics there are serious concerns about the (apparent) lack of control one has in Web experiments as opposed to those performed in the laboratory. Whereas in the lab most relevant factors, including all technical issues, are under the control of the experimenter (i.e. have high internal validity), it is argued that Web experiments lack this important foundation of experimental psychology. As a result of the first issue, it often proves problematic to convince university review panels to give permission when there is little insight into the environment in which participants tend to do these experiments. As a result of the second issue, some high-impact journals made it a policy decision not to publish Web-based studies, thereby discouraging Web experiments from being performed (cf. Honing & Reips, 2008). Nevertheless, it is important to stress that if an effect is found - despite the limited control in Web-based experiments over the home environment and the technological variance caused by the Internet - the argument for that effect and its generalizability is even stronger.

The latter issue was recently discussed in an issue of Nature Methods by researchers from the Universities of Giessen and Münster, Germany (see reference below and [modified] figure above). In fact, the authors make the opposite argument: they argue that standardization should be seen as a cause of, rather than a cure for, poor reproducibility of experimental outcomes. Their study showed that environmental standardization can contribute to spurious and conflicting findings in the literature. Würbel and colleagues conclude that, to generate results that are most likely to be reproducible in other laboratories, the strategies to standardize environmental conditions in an experiment should be minimized.

As such, the variance introduced by Web-based setups (as discussed above) may actually yield experimental results with a much higher external validity than previously thought."

Richter, S., Garner, J., Auer, C., Kunert, J., & Würbel, H. (2010). Systematic variation improves reproducibility of animal experiments. Nature Methods, 7(3), 167-168. DOI: 10.1038/nmeth0310-167

Honing, H., & Reips, U.-D. (2008). Web-based versus lab-based studies: a response to Kendall (2008). Empirical Musicology Review, 3(2), 73-77.

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science. DOI: 10.1177/0956797611417632


  1. Interesting post! I suspected there was some quiet skepticism in the field about online studies, but didn't realize some journals have a formal position on this.

    For what it's worth, my lab has tried extensively to replicate well-known cognitive psychology laboratory studies online (using Mechanical Turk) and has had great success. The difference between college undergraduates in the lab and online participants seems small and unsystematic.

    Here's a blog post we wrote about our investigation into this:
