

A Word Stretched Far Beyond Its Breaking Point
Evidence is a complicated word, but it shouldn’t be.
This photo from my garden is evidence that it’s spring in southern Australia. Sure, it’s not the best evidence of what season it is. Evidence has limits, and a single close-up photo has a lot of them.
That narrow snapshot wouldn’t even count as relevant evidence, say, for me to claim that my whole garden is looking fab-U-lous, or that I’m a great gardener. (It isn’t, and I’m not.) To justify being called evidence for a claim, something should, at the very least, be capable of proving the point.
With research, that means you have to have the right type of study. This week, I waded into some claims that particular, very expensive grief services are “evidence-based” or proven to work – even though studies of them don’t come anywhere near clearing that minimum bar.
That doesn’t mean the people behind these claims are not well-intentioned. Many people in this field seem to think any kind of study at all is proof that what they do works, as long as the participants felt better over time. But grief is one of those experiences that improves over time for the overwhelming majority of people anyway. It’s easy, then, to take credit for something that would have happened without your intervention. You could even be dragging people down or holding them back, and you wouldn’t be able to tell without well-controlled trials.
It’s tough when problematic scientific practices are very widespread in a field: How, then, are researchers supposed to know they’re on the wrong track? Yesterday, Cailin O’Connor tweeted that a paper she and colleague Paul Smaldino wrote has been accepted by a journal. You can see an earlier (preprint) version of it here. “Why,” they ask, “do bad methods persist in some academic disciplines, even when they have been clearly rejected in others?”
It could be, they wrote, that poor scientific methods persist because too many people in the field don’t have the competence to judge them. Weak methods might proliferate just because people copy each other, too. (That was the hypothesis from an old study about problematic statistical methods in meta-analyses as well.) The Smaldino and O’Connor paper also discusses self-preferential bias – a tendency for researchers to favor “the (possibly inferior) methods that they themselves are already using” – and “conservative biases for existing methods”.
Smaldino and O’Connor use a case study and a model to argue that interdisciplinary science could spread superior scientific methods into fields with weaker science. I wish that would happen more quickly! Here’s hoping that it’s the stronger methods, rather than the shortcuts to wrongness, that are more inclined to win those encounters. And thanks to journalist Christie Aschwanden, whose work was drawn on heavily in the case study, and whose tweet alerted me to this interesting paper.
Needing a bit of inspiration, I dug into the life of a scientist I’ve been wanting to learn more about for a while – the amazing Katsuko Saruhashi.


Saruhashi was a shy child who became an activist for peace, nuclear testing bans, and women in STEM. Her experience is an important reminder that sheer prejudice can be responsible for the perpetuation of weaker scientific methods, too. One of her achievements was a more precise method for detecting radioactivity in seawater – critical when the US was testing hydrogen bombs in the Pacific near Japan in the 1950s.
Hamblin and Richards wrote an illuminating article on the political backdrop to the struggle for acceptance Saruhashi and her colleagues faced. They report that “many American scientists toed the line in playing down the dangers from atmospheric testing, and dismissed Japanese scientific work as ill-informed at best or wildly irresponsible at worst.”
Saruhashi then faced the additional whammy of gender discrimination when she went to the Scripps Institution in San Diego to test her analytical technique against a well-accepted US one. First, writes Sumiko Hatakeyama, Saruhashi wasn’t set up at the Institution itself, but “was instead asked to work in a wooden hut.” The scientific scales were reportedly tilted against her technique as well: She was given samples with lower concentrations of radioactivity, making detection harder for her than it was for the Americans.
Nevertheless, she persisted.
Wishing everyone taking on the barriers of discrimination a particularly good week,
Hilda
You can see more photos and info about Katsuko Saruhashi’s life and work in my Twitter thread.