How do you teach novice scientists to be thoughtful, critical researchers?
I recently had two experiences that spoke to this issue. First, I ran into my colleague Carl Johnson in the BioSci department at Vanderbilt. He said that he was getting ready to present at journal club and needed to take another look at the paper "to challenge them to be hypercritical." I was intrigued by the notion of forcing the other readers of the paper into the uncomfortable position of being hypercritical, so I stopped to ask Carl what he meant. He said, "We basically piss on any weak aspect of the paper. We're trying to teach them to think like reviewers," that is, to look at every piece of data with a very critical eye. I haven't been to Carl's journal club, but I suspect that graduate students and post-docs quickly come to expect that readers will view their work with that same critical eye, and thus learn to apply a fairly rigorous lens to their own work before submitting it for presentation or publication.
Later the same day, I attended presentations by graduate students on teaching-as-research projects that they are completing as part of CIRTL Network programs. Teaching-as-research, a core tenet of CIRTL, bears a lot of resemblance to the scholarship of teaching and learning (SoTL) and discipline-based education research (DBER); I'm not sure I'm smart enough to parse out the differences, but all investigate questions of teaching and learning. Thus these grad student presentations, which could be works-in-progress, investigated some facet of student learning, usually in response to an intervention (e.g., do active learning techniques improve learning in a graduate science class? Does cognitive processing facilitated by think-pair-share improve student retention of content?). The tone during these presentations was notably different from the one Carl had described earlier in the day, with facilitators (including me) celebrating the fact that graduate students and post-docs were doing this type of research rather than casting a truly critical eye on the results. In this case, the goal seemed to be to support and applaud the effort rather than to critically review the result.
As all experienced mentors know, both approaches are important to the development of budding scientists. Excessive early criticism drives people away from a field—be that a disciplinary research field or a discipline-based education research field. In the early stages of development, it’s vital to provide support and gentle, helpful criticism. In the case of discipline-based education research, however, how do we move from relatively uncritical support to Carl’s training to “piss on any weak aspect”? For scientists to trust the results of the work, it’s vital that we be able to apply the latter lens: we have to be our excessively, exuberantly critical selves. And yet, few of us are trained in this sort of research. How do we move from the gentle support that the CIRTL Network is providing to more rigorous, critical support of the work?
CBE-Life Sciences Education is providing one mechanism to make this transition with its recent implementation of a "Research Methods" feature in the journal. This feature seeks to make discipline-based education researchers (or SoTL scholars…or teaching-as-researchers) more familiar with good practice in the sort of social science research that many of us want to do but are not sure how to do well. The first feature focused on the importance of calculating the effect size as well as the significance when characterizing the effect of an intervention on learning, while the most recent one addressed using linear regression to control for student characteristics in undergraduate STEM education research.
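To make the effect-size idea concrete, here is a minimal sketch (not drawn from the CBE-LSE feature itself) of Cohen's d, a common effect-size measure: the difference between two group means divided by their pooled standard deviation. The exam scores below are invented for illustration.

```python
# Cohen's d: standardized difference between two group means.
# The scores are hypothetical (e.g., an intervention section vs. a control section).
from math import sqrt
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    var_a, var_b = stdev(group_a) ** 2, stdev(group_b) ** 2
    pooled_sd = sqrt(((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2))
    return (mean(group_a) - mean(group_b)) / pooled_sd

intervention = [78, 85, 82, 90, 88, 84]  # made-up post-test scores
control      = [72, 75, 80, 74, 79, 76]

d = cohens_d(intervention, control)
print(round(d, 2))
```

Unlike a p-value, which only says whether a difference is unlikely to be chance, d says how large the difference is in standard-deviation units, which is what matters when judging whether an intervention is worth adopting.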
There are also excellent local resources to help us be more rigorous as we investigate student learning. The CFT’s SoTL Guide, authored by Assistant Director Nancy Chick, provides a fantastic introduction to the frameworks and approaches common in assessing teaching and learning, walking the reader through the process of identifying evidence, planning the project, and analyzing data.
Finally, some of our science colleagues are leading by example. From Richard Hake’s classic meta-analysis comparing the effects of interactive engagement to traditional instruction on learning in physics, to Carl Wieman’s careful analysis of the efficacy of deliberate practice, to Erin Dolan’s careful construction of a tool to measure the effects of science inquiry experiences (and many more that I am omitting), we have excellent examples of rigorous analysis of student learning by scientists employing their inherently critical natures.
And so I think the answer is this: we must support novice scientists in their research through praise, example (from our own work or others’), thoughtful questioning…and the expectation of careful criticism. We owe them this support no less in discipline-based educational research (or SoTL, or TAR) than in their disciplinary research.