Formal and Informal Research

I have been thinking a lot about formal and informal research: how both have a place, but how they shouldn’t be confused with each other. One of my longstanding objections to “action research” is that it confuses the informal and formal.

Andrew Gelman discusses this problem (from a statistician’s perspective) in an illuminating interview with Maryna Raskin on her blog Life After Baby. It’s well worth reading; Gelman explains, among other things, the concept of “forking paths,” and acknowledges the place of informal experimentation in daily life (for instance, when trying to help one’s children get to sleep). Here’s what I commented:

[Beginning of comment]

Yes, good interview. This part is important too [regarding formal and informal experimentation]:

So, sure, if the two alternatives are: (a) Try nothing until you have definitive proof, or (b) Try lots of things and see what works for you, then I’d go with option b. But, again, be open about your evidence, or lack thereof. If power pose is worth a shot, then I think people might just as well try contractive anti-power-poses as well. And then if the recommendation is to just try different things and see what works for you, that’s fine but then don’t claim you have scientific evidence for one particular intervention when you don’t.

One of the biggest problems is that people take intuitive/experiential findings and then try to present them as “science.” This is especially prevalent in “action research” (in education, for instance), where, with the sanction of education departments, school districts, etc., teachers try new things in the classroom and then write up the results as “research” (which often gets published).

It’s great to try new things in the classroom. It’s often good (and possibly great) to write up your findings for the benefit of others. But there’s no need to call it science or “action research” (or the preferred phrase in education, “data-driven inquiry,” which really just means that you’re looking into what you see before you, but which sounds official and definitive). Good education research exists, but it’s rather rare; in the meantime, there’s plenty of room for informal investigation, as long as it’s presented as such.

[End of comment]

Not everything has to be research. There’s plenty of wisdom derived from experience, insight, and good thinking. But because research is glamorized and deputized in the press and numerous professions, because the phrase “research has shown” can put an end to conversation, it’s important to distinguish clearly between formal and informal (and good and bad). There are also different kinds of research for different fields; each one has its rigors and rules. Granted, research norms can also change; but overall, good research delineates clearly between the known and unknown and articulates appropriate uncertainty.

Update: See Dan Kahan’s paper on a related topic. I will write about this paper in a future post. Thanks to Andrew Gelman for bringing it up on his blog.

Action Research and Its Misconceits

When I was in education school and teaching at the same time, we were all required to complete an “action research” project. I objected to this on ethical grounds but ended up doing it anyway. (I made my project as harmless and unobtrusive as possible.) At the time, I ransacked the Web for critiques of action research. The only substantial critique I found was an intriguing article by Allan Feldman, professor of science education at the University of South Florida, who suggested that action research should be more like teaching practice and less like traditional research. (Feldman has written extensively on this subject; he supports action research vigorously but recognizes its pitfalls.)

Action research is research conducted by a practitioner (say, a teacher, nurse, or counselor) in action—in his or her normal setting. It is often informal and does not have to follow standard research protocol. The researcher poses a question, conducts a study or experiment to investigate it, and arrives at conclusions. The action research study may or may not ever be published.

To some degree, teachers conduct action research every day. They frequently try things out and make adjustments. The difference is that (a) they don’t call this research and (b) they don’t have to stick with a research plan (if something isn’t working out, they can drop it immediately). Action research, by contrast, calls itself research and requires more sustained experimentation (if it is to have meaning at all). Therein lie its problems.

First of all, action research (that I have seen) adheres neither to traditional nor to alternate standards. To call it research is to muddy up the very concept, unless one clearly states what it is. What can action research be? It clearly cannot follow traditional research design. First, it is difficult, if not impossible, for a practitioner to conduct a scientifically valid experiment while performing his or her regular duties. Second, most practitioners bring confounding influences: their prior experience in that setting, their knowledge of the individuals, and their preferences and tendencies. These almost inevitably shape the findings. If action research is to follow an alternate protocol, then that protocol must be defined with care.

Now for the second problem. Although most action research projects probably do little or no harm, it is ethically problematic to require them. First of all, the teacher may distort her work more than she otherwise would; she may find the project more distracting than helpful. Second, because of the sheer number of required action research projects, there is rarely much supervision. Teachers conducting such “research” are not required to obtain permission from parents or even notify the students. If education schools were to institute a protocol for action research, they’d double or triple their own work and that of the teachers. That would be impractical. Thus many novice teachers conduct their experiments without even the schools’ knowledge.

I originally thought that the ethical problem was the primary one. I am no longer sure. Most teachers conducting these projects are just getting their bearings in the classroom. An action research project usually amounts to a mild distraction at worst and an interesting investigation at best. However, there should be a standard protocol for such experiments, and they should be voluntary.

The greater problem, in my view, is the intellectual one, with all its implications for policy. We already have enough trouble with the phrase “research has shown.” Again and again we find out that research hasn’t quite shown what people claim it has shown. Because few people take time to read the actual research (which can be cumbersome), researchers and others get away with distorted interpretations of it. Add to this a huge body of quasi-research, and anyone can say that “research has shown” just about anything.

Proponents of educational fads can almost always find “research” to support what they do. Some of it is action research of dubious quality. For instance, the Whole Brain Teaching website cites, on its “Research” page, a “study” titled “Integrating Whole Brain Teaching Strategies to Create a More Engaged Learning Environment.” (I am linking to Google’s cached version of the “Research” page, since the original is now blank.) As it turns out, the study took place over the course of a week. The author was testing the effect of “Whole Brain Teaching” on student engagement. She made a list of nine student “behaviors” and observed how they changed between Monday, October 19, 2009, and Friday, October 23, 2009.

One could write off Whole Brain Teaching as some fringe initiative, yet it made its way into an elementary school where I previously taught. It touts itself as “one of the fastest growing education reform movements in the United States.” (I have written more about it in my book and in my blog “Research Has Shown—Just What, Exactly?”) Important or not, it cites shaky research in support of itself—and so do many initiatives. One way to combat this is to insist on basic research standards.

Now, I recognize Dr. Feldman’s argument that action research should try to be less like traditional research and more like actual teaching practice. But in that case, its claims should be different. Its purpose should be to inform the practitioner, not to produce findings that can be generalized. Even in that case, it should have some sort of quality standard. In addition, those conducting the research should exercise caution in drawing conclusions from it, even for themselves. Any action research paper should begin with such cautionary statements.

I am not suggesting that action research be abolished; it has plenty of useful aspects. Of course, teachers should test things out and question their own practice—but voluntarily and perspicaciously. Should such investigation be called research? I’d say no, but the name isn’t really the problem. The challenge here, and for education research overall, is to dare to have a modest and uncertain finding.
