Richard Wiseman, interviewed for the submitted article, is a psychologist who studies human cognitive biases. His concern regarding Bem's purported studies on precognition is that if journals reject studies showing no effect while accepting flawed studies that appear to show precognition, the result is a file drawer problem: publication bias among the published studies will skew any subsequent attempt at meta-analysis of them. The file drawer effect is of particular concern in meta-analyses of purported extrasensory perception.
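The skew Wiseman worries about is easy to see in a toy simulation (a minimal sketch of the mechanism, not anyone's actual methodology): generate many studies of a true null effect, "publish" only the ones that reach a significant positive result, and compare the naive average of the published effects with the average over all studies.

```python
import random
import statistics

random.seed(42)

def run_study(n=30, true_effect=0.0):
    """Simulate one study of a null effect: n noisy observations.
    Returns the sample mean and a crude z-like score (mean / standard error)."""
    data = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.mean(data)
    se = statistics.stdev(data) / n ** 0.5
    return mean, mean / se

all_studies = [run_study() for _ in range(1000)]

# The "file drawer": only significant positive results get published.
published = [m for m, z in all_studies if z > 1.96]

print("mean effect, all studies:    %.3f" % statistics.mean(m for m, _ in all_studies))
print("mean effect, published only: %.3f" % statistics.mean(published))
```

Even though the true effect is exactly zero, a meta-analysis restricted to the published studies recovers a clearly positive "effect", because the null results stayed in the file drawer.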
By the way, Wiseman is perhaps best known online as the author of the Colour-changing Card Trick website, featuring videos of an experiment that demonstrates selective attention in human perception.
The main point is that the parapsychologists have done nothing wrong: they're following the same methods and using the same techniques as the actual psychologists. The real problem is that those methods and techniques are not good enough.
I don't know how the LessWrong people came to that conclusion about parapsychology a priori. This is a problem in science in general: people use p-value-based reasoning (e.g. significance thresholds) to draw binary conclusions (the existence or non-existence of physical processes, etc.).
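As a concrete illustration of the binary-threshold problem (a hypothetical sketch, not from the article): even when the true effect is exactly zero, a fixed fraction of experiments will clear the conventional significance bar by chance, so a lone "significant" result says little by itself.

```python
import random

random.seed(0)

def significant(n=50, z_threshold=1.96):
    """One experiment on pure noise: is the sample mean 'significantly'
    different from zero at the usual two-sided 5% threshold?"""
    data = [random.gauss(0.0, 1.0) for _ in range(n)]
    mean = sum(data) / n
    sd = (sum((x - mean) ** 2 for x in data) / (n - 1)) ** 0.5
    se = sd / n ** 0.5
    return abs(mean / se) > z_threshold

trials = 10000
false_positives = sum(significant() for _ in range(trials))
print("false positive rate: %.3f" % (false_positives / trials))
```

The printed rate lands near the 5% threshold by construction: the binary significant/not-significant verdict is a statement about error rates over many experiments, not about whether a phenomenon exists.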
Which, to me, seems to be an admission that the Journal of Personality and Social Psychology isn't actually a scientific journal. An initial experiment or study is only part of the process. If you get extraordinary results out of an experiment, but no one else sees the same results with the same experiment, then something was up with your run of it. It is vital to the body of scientific knowledge that re-runs of published experiments are performed, recorded, and published. Otherwise, there's no way to know whether the original result was real, a fluke, or fraudulent.
When dealing with failed replicability it is important not to conflate a poorly designed experiment with a poorly written methods section. Often there is tacit or socially negotiated knowledge required to properly replicate an experiment. This doesn't mean the science is bad; it means science is more social than some would like to admit. For some interesting reading on this, check out Harry Collins's paper "The Seven Sexes: A Study in the Sociology of a Phenomenon, or the Replication of Experiments in Physics" [1].
The truth of that depends on the accuracy of the title of:
Violence is a curvilinear function of temperature in Dallas: A replication.
Rotton, James; Cohn, Ellen G.
Journal of Personality and Social Psychology, Vol 78(6), Jun 2000, 1074-1081. doi: 10.1037/0022-3514.78.6.1074
I'm curious how this will play out in the scientific world - if there's a slew of failed attempts to reproduce the findings, this will be the new cold fusion, and such an embarrassment may actually affect the way papers get published.
In the layman world, of course, the original paper will be used as justification for whole shelves of pseudoscience books, and the public won't actually hear about any failed attempts to reproduce the claims.
http://en.wikipedia.org/wiki/Publication_bias#The_file_drawe...
http://www.skepdic.com/filedrawer.html
http://www.quirkology.com/USA/Video_ColourChangingTrick.shtm...
If you haven't seen the video before, it is well worth watching.
I hope the more responsible journals of psychology will step up their efforts to publish commentaries on the study design
http://norvig.com/experiment-design.html
of ESP studies and, in general, to publish more studies showing failures to replicate earlier findings.