Adult Literacy Practitioners Use Research
TCALL Occasional Research Paper No. 2
St. Clair, Chia-Yin Chen, and Lyndsay Taylor
Texas Center for Adult Literacy & Learning
As staff members of the Texas literacy resource center, we are always interested in how practitioners use the research-related materials our center distributes. Do they have any impact on practice, and is this what practitioners expect when they call us up to borrow materials, or visit our website? In 2002 we conducted a research project to discover what practitioners really think of research, and what kind of use they make of it. Even though the authors are all trained academic researchers, we entered the project determined to bracket our professional perspective and build bridges across the gulf we perceived between practitioner and researcher views of research.
The central idea of this discussion is application: the way that research gets applied to practice in an educational context. As Hammersley (2002) points out, looking at issues of application involves considering three questions: what the role of research is and has been in the past, what it could be, and what it ought to be. In this discussion we mainly tackle the first question, though we discuss the implications of our findings and make recommendations in the spirit of the third. We view the relationship of research to practice as a serious problem, especially in the light of current US policy calling for high-quality research information to underpin all educational provision (USDOE, 2002). It is extremely pressing for researchers and practitioners to get together and work out what reasonable expectations of research are, and how they can be met.
Perspectives on the use of research
Is the role of research to provide information or to produce change?...It is both. Research findings that are not shared with practitioners in ways that foster application are ineffective. Unfortunately, traditional models of research to practice assume that the transfer of knowledge can take as long as 50 years. First, scholars conduct the research, then they publish findings in academic journals, then the academic articles form the basis for similar research and at the same time make their way into the syllabi of preservice academic training for teachers, and slowly the knowledge makes its way into classroom practice (Garner et al., 2001, p.8).
Different people have different perspectives on why research matters, or indeed whether it matters at all. Researchers, practitioners, and policy makers each have their own views on why research is important. Researchers typically "…want research studies that address global issues in a way that is broadly applicable to understanding" (Fox, 2000, p.239). There is a tendency for researchers to see and understand the world on a conceptual level, and they are trained to enjoy working with cloudy concepts and ideas. Practitioners have a diametrically opposed view, tending to "…want studies that will guide them through everyday problems of practice rather than studies that address broad areas of interest" (Fox, 2000, p.239). Practitioners want useful information and clear suggestions, and they want them immediately, right when they need them (Fox, 2000). This tension is probably recognizable to anybody who has tried to cross the divide between practice and theory.
Even attempts to discuss the issues of research utilization can founder on the differences between researchers and practitioners. Researchers often want to know "why" practitioners are not using results from educational studies in their classrooms, while the practitioners are asking "how" they are supposed to incorporate research findings into their everyday practice (Warby et al., 1999). There is a gap in communication and understanding between the providers of research and those attempting to apply research-based information. Researchers and practitioners view the purposes of conducting research and methods for utilizing research differently.
Furthermore, practitioners are frequently unsure of the validity of research (Gersten, 2001) because they are not experienced in, or comfortable with, methods of determining which studies are valid and which are not. Often the validity of a study can be determined by the reputation of the source in which the research was published or by careful review of data collection and analysis procedures, but making these judgments requires a fairly sophisticated understanding of research structures (such as journal reputations). A further complication is that highly respected people often have opposing views on very significant issues, which leaves practitioners confused and untrusting.
Although many professions contend with cynicism about the validity and utility of research, cynicism toward educational research is extraordinary by comparison...[V]irtually every college of education in this country contains faculty members who advocate diametrically opposed approaches to teaching reading and mathematics, all of which are presumably based on research. As a result, many students of education leave universities feeling bewildered, betrayed, or both (Gersten, 2001, p.45).
As graduates leave the university setting and pursue a career in education, they continue to struggle with the uncertainty of research-based information (Gersten, 2001). "The research community often has conflicting opinions [about the 'best way' for research to be conducted], which can confuse and mislead practitioners" (Gersten, 2001, p.45).
Researchers and practitioners hope to improve our current knowledge and educational practice. However,
Research has not led to widespread practice changes, and persistent practice problems have not generated many research projects. We have not changed the practice very much and the practice has not changed us very much. That is partly because practitioners and researchers have different perspectives as to the needs of the field (Fox, 2000, p.238).
Practitioners often believe that researchers could be doing more to make their work accessible and relevant to everyday issues, while other commentators argue for continued separation of research and practice:
Some people consider that the researchers do not need to think of their work in terms of answers to questions, thereby making research legitimate; instead, researchers should view research as something that gives rise to new ideas, questions old assumptions, and contributes concepts that can be tools to better understand everyday problems (Hultman, 1995, p.347).
There appears to be no simple answer regarding the best relationship between research and practice, which leads us to ask how research is currently used by those who find it useful.
Studies on the Use of Research
The director of the Educational Resources Information Center (ERIC) and his graduate assistant conducted a study in 1998 to determine the utility of the ERIC database. This study arose from the observation that "[t]here is a large body of literature claiming that most end-users obtain poor results when searching for themselves" (Hertzberg, 1999, p.4). There are 950,000 citations available in the ERIC database (Hertzberg, 1999), yet users are not consistently obtaining the information they are seeking. The study, therefore, focused specifically on the end-users of the ERIC database. Questions addressed included who was searching the database, what strategies they were using for their search, the quality of their results, and their overall experience.
In November 1998, the ERIC clearinghouse initiated its study with an online questionnaire and tracking system. Analysis was based on 3,420 users, generating a substantial data set. The purposes for searching the ERIC database, as stated by online users, were (from most to least frequently given): "research report preparation (53.4%), class assignment (17.6%), professional interest (16.2%), lesson planning (5.2%), background for policy making (5.1%), [and] classroom management (2.6%)" (Hertzberg, 1999, p.6). The users of the ERIC database are strongly oriented towards research, and most identified themselves as researchers (Hertzberg, 1999). The interesting insight from this study is that few users were practitioners accessing research reports for assistance with practical issues; the ERIC database, this study suggests, is mainly used by researchers talking to other researchers.
Another commentator, himself a researcher, is clear about the pointlessness of the research/practitioner divide:
One of the great ironies of traditional education is that teaching and learning are so inherently concerned with knowledge but those engaged in the education process have so little ownership over the "knowledge products" on which they build their careers…the majority of teachers, counselors, and administrators spend their careers at the receiving end of "manufactured" research products produced in remote university "factories" by unseen research experts. Seen this way, the conventional role of the teaching system is merely to buy and use the products of others (Quigley, 1997, p.3).
The Research and Teacher Learning (RTL) study was conducted in an effort to address this irony, and understand how research could influence teacher learning (Kennedy, 1997). The two questions used as focal points were "What and how do teachers learn from reading research?" and "What and how do teachers learn from conducting research of their own?" (Kennedy, 1997, p.25). One hundred teachers who were involved with some form of professional development or continuing education that involved use of research results (such as graduate school or a teacher-research program) were interviewed.
The interview consisted of four sections addressing teachers' previous beliefs and values about teaching, beliefs and values in regard to research, experience in conducting their own research (which 75% of participants had), and responses given after reviewing specific research articles (Kennedy, 1997). The results of the study focused on the last of these sections. The researchers argued that in order for teachers to learn from research, they must "understand what the main message is from the study, test the validity of the message somehow, [and] connect that message in their own situation" (Kennedy, 1999, p.26), and that responses to research readings could provide a great deal of insight into this process of application. Teachers in this study were asked a series of questions after reading a research article. Researchers found that teachers gave the same kinds of reasons for agreeing or disagreeing with articles, even though each article was based on a different type of study. Many teachers tended to use their own values, beliefs, and experiences as their method for deciding whether or not a research study was "valid," rather than the evidence presented in the study.
...[F]indings suggest that teachers connect research to their practice in two very different ways. On one hand, they use their own beliefs, values, and experiences to evaluate the validity of the study, but on the other hand, they also take something new from the study, as it stimulates their thinking and prompts them to reinterpret their own experiences and to reconsider their practices (Kennedy, 1999, p.27).
Overall, this review suggests that educators tend not to access research as a way to modify practice, and that when they do read research there is a tendency to filter it through their own experiences. The implication is that the impact of research on practice is far lower than might be hoped for by researchers. The aim of our research was to discover if these perceptions held true for adult literacy and ESL practitioners in Texas, and how the relationship between research and practice could be improved.
Methodology

This project used a mixed methodology, combining a mail survey to collect quantitative data and short responses with telephone interviews designed to deepen our understanding of the responses. The survey was a four-page form laid out in a regular font with lots of white space, which we hoped would make it less intimidating than a typical survey form. The relatively small size of the adult education workforce in our state means that its members are regularly asked to respond to surveys, focus groups, and interviews, and we were concerned that the survey look both appealing and simple to complete. Included with the survey instrument were a letter to the program director (explained below), a letter to the respondent, and an interview form (explained below).
The survey was mailed out to all 404 adult education and ESL program directors in the state in late 2001, with a request that directors pass the survey on to one instructor who was interested in completing it. This rather cumbersome method was necessary because a list of instructional staff was not available. The cover letter to directors contained all the details necessary for informed consent, because we needed them to feel comfortable asking their staff to participate. We also invited the directors to examine the survey itself so that they were fully familiar with the task.
After two rounds of follow-up (once by mail and once by internet listserv) we received 143 responses (a 35% response rate). Given the nature of the population, and their understandable reluctance to commit a great deal of time to research, we judged this a satisfactory response. In almost every case the respondents completed all of the multiple-response questions, though the open-ended questions were less fully addressed. Respondents also had the option to complete and return an interview form, which asked them to supply contact details so that they could be contacted for an hour-long telephone interview. Forty-two people returned this form, of whom 16 were selected for interview based on geographical and professional diversity. The interviews were transcribed and analyzed using the NUD*IST qualitative software program. Comments about research were listed by interview question and then specific themes were identified.
Of the 143 respondents to the survey, 66% were involved in instruction, whether in the classroom or individual tutoring. The remainder (34%) were involved only in administration. We had expected to hear exclusively from instructional staff, but our interest was in the research utilization of practitioners, and we concluded that administrators fall into this category. Not only are they intimately involved with practice, they usually have substantial instructional experience. In terms of educational focus, 68% of respondents worked in ESL, 51% in Adult Basic Education, 52% in GED preparation, 5% in prison education, and 13% in employment preparation. Seven individuals also selected "other." Respondents could select more than one focus, and generally did so.
The employment setting for respondents was usually a state agency (community college, school district, etc.), with only 34% based in voluntary agencies. Forty-six percent worked in a family literacy program. Fifty-five percent of respondents were paid full-time staff, and another 29% were paid part time, for a total of 84% paid staff. The remainder were volunteers, one of whom was full time.
Respondents were generally experienced in adult literacy education: only 9% had been in the field under a year, and the biggest category (45%) had been in adult education more than seven years. Survey respondents were overwhelmingly female (82%), and more than half (53%) were over the age of 50. Eighty-two percent were over forty years of age. When asked how they described their ethnic background, 71% of respondents stated White or Caucasian (non-Hispanic), 21% stated Hispanic, 4% stated African American, 2% Native American, and 1% Asian. Though there are no figures available for the ethnic background of adult education practitioners in this state, based on observation we would claim that this breakdown is not too far from the population level, though non-Hispanic Whites may be over-represented. Care is needed in interpreting our findings in the light of this issue.
Education is likely to have a bearing on the use of research. Ten percent of our respondents identified high school as the highest level of education they had completed, 44% had a bachelor's degree, 42% had a master's degree, and 4% had a doctorate. Respondents were almost evenly split between those currently qualified to teach in the state's public schools (49.3%) and those who were not (50.7%). When asked what specialized training in adult literacy they had received, only 9% stated they had none. The most common response was an in-service course, with 73% stating they had gone through such training. The second most common response was pre-service training, which 39% had experienced. Sixteen percent had been through a graduate course in adult literacy and 7% had been involved in an adult literacy graduate program. Other forms of training mentioned included Literacy Volunteers of America and Laubach Literacy Action training.
When we asked about training in research, 37% of respondents replied that they had such training, and comments indicated that this was usually part of a graduate degree. A slightly smaller group (34%) had experience conducting research, and this experience was less likely to be linked to a university. One respondent mentioned examining "learning needs in the Hispanic community" and another mentioned "county-wide research." We also asked about people's future research and education intentions. Forty-nine percent said they were very or somewhat interested in doing more research in the future, with 28% not at all interested. Forty-nine percent were also interested in further education in adult literacy or ESL, with 36% not at all interested.
One important issue to consider was practitioners' pattern of consumption of research information. Twenty-two percent read research-related items at least weekly, 28% monthly, 38% several times a year, and 12% replied that they rarely read research. We also asked how many research-related resources they typically read in a year. Seventy-six percent indicated that they read fewer than 10, which sits somewhat awkwardly alongside the frequency responses (only 24% say they read more than ten a year, yet 50% report reading research at least monthly). Sixteen percent estimated 11-20 items per year, and 8% read more than 20 per year. So research consumption, while relatively frequent, is not particularly extensive.
These responses indicate a reasonable level of engagement with research, and we were interested to discover where they found it. Multiple answers were encouraged in this category. State and national conferences (77%) and state and national newsletters (77%) were identified as the most common sources, though internet sources (67%), professional development meetings (54%), and academic journals (48%) were also frequently mentioned. The least frequently mentioned were academic books (23%) and research reports (25%). When asked which format they prefer, respondents identified the internet (49%), newsletters (58%), and conferences (48%) as their favorite means of finding research information, with academic books (9%) and research reports (18%) being the least popular. Reasons included accessibility, convenience, speed, and brevity. It is worth noting that no respondent mentioned validity of the research.
One dimension that struck us as important was practitioners' affective perspective on research. Eighty-four percent of respondents reported that they find research to be somewhat or very interesting, with only 2% replying "not at all interesting." This pattern held for relevance, with 85% saying they found research somewhat or very relevant to their work. Four percent selected "not at all relevant." We were quite surprised by the proportion of practitioners stating they found research somewhat or very interesting given the relatively small number of resources they were reading each year.
Changes in Practice
The impact of research on practice is an important component of the relationship between the two areas. We were interested to discover whether research led to meaningful changes in practice, and whether the sources for those changes were what we, as university-based researchers, might expect. We asked respondents whether they had ever changed their practice as a result of research-based information, and 66% indicated that they had. The vast majority of changes were within the last two years, and we judged that almost all were significant. The least far-reaching change on the entire list, and it is far from trivial, was to use more visual aids in teaching.
The source of information behind the change ranged from a general reference to "reading" to specific examples of conference presentations or publications. Conferences and in-service training were particularly frequently mentioned, with newsletters and books referred to much less often. Of 63 respondents providing details of the changes, 36 (57%) mentioned some form of face to face interaction, whether with colleagues, trainers or presenters. Newsletters were also widely mentioned.
Reasons for not making changes based on research also varied widely, though two themes within the responses were lack of time and the feeling that current practices were effective enough to maintain without change. One interesting comment came from a practitioner who said they "can't recall any earth-shocking research that was life-changing." Overall, the responses suggest that for these practitioners, making changes in practice, whether based on research or not, was not a high priority. This observation is not a criticism, but simply reflects the context in which they were working.
Finally, we were interested in how practitioners assess the value of research they read. This was an open ended question, so we categorized and scored the responses. The response category with the highest score was source credibility-whether the practitioner knew the author's work, the journal, or distributor of the research and found them credible. The second highest score was relevance and applicability, or the degree to which the research findings could be applied to the specific arena of practice. The third was similarity of setting, or the extent to which the population and framing of the research resembled that of the reader. The last two factors receiving significant mention were quality of research design and whether the reader agreed with the conclusions of the research based on their experience.
Factors affecting the use of research
In an attempt to understand more about the factors coming into play when practitioners read research, we tested a number of hypotheses about relationships between various influences. This was done with the Pearson correlation, a statistical test that examines the linear relationship between variables measured at the interval level. For all of the tests we used a significance cut-off of 0.05: if the given results would occur by chance more than one time in twenty, we did not consider them significant. It is worth noting that while these statistics can describe the relationships within the sample, their generalizability may be somewhat compromised by the small sample size.
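As an illustration of the procedure described above, the sketch below computes a Pearson correlation coefficient in plain Python and applies the one-in-twenty standard. The data are entirely hypothetical (invented values for years of experience and research items read per year, echoing the kind of relationship reported in this study), and the tabulated critical value is only approximate; an actual analysis would use a statistical package that reports an exact p-value.

```python
import math

def pearson_r(x, y):
    """Pearson correlation: covariance divided by the product of the
    standard deviations (computed here from sums of squared deviations)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: years in the field vs. research items read per year
years = [1, 2, 3, 5, 7, 8, 10, 12, 15, 20]
items = [2, 3, 2, 5, 6, 8, 9, 10, 14, 18]

r = pearson_r(years, items)
# For n = 10 (8 degrees of freedom), the two-tailed critical value of r
# at p = .05 is roughly 0.632; an |r| above that would occur by chance
# less than one time in twenty.
print(round(r, 3), abs(r) > 0.632)
```

With these invented numbers the correlation is strongly positive and clears the cut-off, which is the shape of result the analysis below reports for experience and research consumption.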
We had speculated that newer instructors would use research resources more as they attempted to become familiar with the field. In fact, there was a strongly significant trend of practitioners reading more research as they became more experienced in the field. In addition, more experienced practitioners tended to read more research reports and academic journals, which usually report research at greater length and in more detail. Rather than new practitioners relying on the products of research, it appears that established practitioners work with it more. This is supported by the finding that more experienced practitioners are more likely to have made a recent change due to research, suggesting that the research they read is more fully integrated into their practice.
We were also interested in how formal education changes the relationship with research. Somewhat to our surprise, there was no significant effect. Practitioners with more education did not use research resources more, were not more likely to use academic sources (journals and reports), and were not more likely to have made changes based on research. In effect, an established practitioner with a high school education is as likely to read research as a new instructor with a PhD.
What did matter, however, was whether people had specific training or experience in conducting research. These two factors are themselves strongly related, as might be expected, and each has a strong correlation to frequency of reading about research and the amount of research read. In fact, the more training and experience they have, the more they use research. A further finding was that practitioners with training and/or experience in research find it more relevant to their work. We also tested whether practitioners with more training or experience in research were the ones with the most formal education, and found there was no significant correlation. Once again, formal education is not a predictor of the use of research.
We were also concerned about the ethnic breakdown of respondents, and whether this factor affects the relationship to research. We checked whether ethnicity was related to length of time in the field or education and found no significant link. We found no significant correlation between ethnicity and the source of research information, or the frequency with which it is read (this was only tested for "White non-Hispanic vs. other" and "Hispanic vs. other" due to the small number of respondents in other categories). Gender also showed no significant correlation with these factors. Among the group of hypotheses we tested, the only significant correlations suggested that White non-Hispanic practitioners tend to prefer research reports less than other groups, and that Hispanic practitioners tend to read fewer research-related resources each year.
Two other significant findings emerged. First, there is a very clear correlation indicating that as practitioners view research as more useful and relevant, they are more likely to have recently made a change based upon it. Second, there is a strong correlation suggesting that more consumption of research makes it more likely that practitioners will have made a change based upon it. These two correlations suggest that practitioners making changes based on research tend to do so as informed consumers-they read a lot of research and regard it as relevant.
Information from the interviews
The practitioners we interviewed over the telephone were predominantly female (87%), white (94%), and over fifty (50%). The two overarching themes emerging time and again in the discussions were application and credibility: how does this research change the way I approach my work, and how do I know I can trust it? When asked about the use of research, the responses fell into three categories: validation, design, and improvement. Validation is using research to add credibility to a course of action. This can occur inside a program when educational decisions are made, or externally, when defending a program to others. An example of the first use would be choosing a curriculum to follow, where having research showing the effectiveness of the chosen system can help justify its cost. An example of the latter use is grant writing:
. . . there are a lot of useful things and mainly it's towards the grant writing entity, or proving to United Way why or why not you can make certain gains, or to business or companies that think literacy can be a business (practitioner 11)
Research was often described as a resource for designing programs or instruction, making decisions "as to what works and doesn't work" (practitioner 15). One interviewee talked about how it "helped direct me in my teaching strategies or curriculum or whatever" (practitioner 14). Another talked at length about her application of research:
. . . it helps me to go about with a program plan. You know, what do we implement? And what kind of materials are we going to use to implement a specific process because we usually look into, how is this going to work for an adult in a learning environment situation? How do we reach this person? How do we reach that person? What is this learning technique or style? (practitioner 9)
The most commonly cited use for research, however, was program improvement. "It just helps the program to run a little more smoothly and a little more cheaply and a little more efficiently, and all of those things" (practitioner 9). However, skepticism was also expressed about the ability of research to improve programs: "And so much in the teaching field, it's not all that new, it's something somebody did a long time ago and they are reviving it or they are calling it by a different name" (practitioner 3).
Since application is clearly an important concern, we were interested in what kind of research practitioners were drawing on, and how it was concretely used to inform practice. The results were consistent with the findings of the survey, with interviewees citing many different forms of information and a variety of ways of gaining access to it. Several people talked about large scale curriculum guidelines such as the "Bridges to Practice" material on adult learners' learning disabilities and the "Equipped for the Future" material. Some practitioners were clear that their practices were informed by several sources: "Well, cooperative learning, that theory and the research that went along with it that I was exposed to through articles, reading about it, and also through these seminars . . ." (practitioner 14).
The content of the research mentioned by interviewees was often strikingly eclectic. One practitioner talked of her interest in playing background music in the classroom as a way to help people learn, and another talked about the usefulness of knowing about alopecia. Practitioner 16 stated that the most valuable thing she had learned from research was the importance of patience: "You have to go down to their level and bring them up. And, you have to research that to find that." Nonetheless, some strong examples of applying academic research information to practice did emerge, one of the most interesting being the incorporation of student interests into the educational process.
. . . what we did with that research is we, the teachers who deal with GED students, looked it over and I think the bottom line of what we found is GED students are very focused. They want very focused instruction . . . while we want to have the instruction based on real-world application they don't want a lot of side-routes to be run. And by side-routes I mean too much career readiness that takes them away from learning their math, for example . . . GED students are really, really interested in practicing, they want to prepare for the test. (practitioner 10)
This information directly influenced practices within the program:
So the teachers learned when they instruct they want to make sure that the principles that they're teaching, that the student clearly understands that this is going to help you do well on that test and, by the way, you're going to learn a few other things that might help you once you get your GED. (practitioner 10)
These comments reflect one of the most significant struggles within contemporary adult literacy provision: the relationship of the academic to the vocational. The instructors within this program were able to take a research-based insight, the focus of GED students, and meaningfully apply it to the issues they were facing in the classroom. The instructors have taken a pragmatic insight and applied it pragmatically and effectively to their practices, and by doing so have avoided the immobilization arising from the vocational/academic divide. This is highly consistent with the comments by Fox (2000) quoted earlier, where he suggests that practitioners want research to provide resources for dealing with everyday problems.
This is reinforced once more by the responses practitioners gave when asked what they would like to research. Several interviewees mentioned areas where a fair amount of research has already been conducted, such as how to keep learners coming to classes and the relationship between adult literacy and children's degree of preparation for school-based literacy learning. Others mentioned issues that are becoming pressing at the national or state level, and that researchers will have to turn their attention to, such as "the hard facts of progress, why and how the adult makes progress and how you justify the dollars" (practitioner 11). A number of interviewees identified areas where more research is needed, such as "the issue of children being brought up in bilingual households" (practitioner 14) and cross-cultural instruction: "you have to understand the culture of those individuals in order to present whatever you're presenting" (practitioner 16). What these research interests have in common is an emphasis upon solving a dilemma of practice.
There was one exception to this pattern. Practitioner 4 was interested in issues on a broader scale.
We're facing an awful lot of policy issues that are fairly troublesome to me. I would have questions about what's going to happen if we begin to back away from funding adult education programs . . . since the President's administration is heavily invested in early childhood education, what's going to happen when we pull money away from good, practical, working programs that support families being first teachers (practitioner 4)
While this question may not lend itself to empirical research, it demonstrates that the overarching interest in issues of practice is not seamless. Some practitioners are interested in wider issues of the field.
We asked our interviewees how universities and their research communities could help with their work. A common theme in the responses was a call for researchers to "tie up with practitioners so that the research is focussed on real issues" (practitioner 10). Researchers needed to "focus on the need that's presented rather than going off in a direction . . . just not where we need to be going" (practitioner 9). One practitioner went further, suggesting researchers learn to demonstrate true responsiveness to the field by being:
. . . open to request[s] for research. Maybe access, you know, a more open access to results of research they carry out. And maybe even the possibility to participate in the conducting of research . . . and I'm sure this stuff must be going on, I've just never been drawn into it before (practitioner 14)
This quote helpfully opens up the question of dissemination of research results, mentioned by several interviewees. Practitioners sometimes sounded frustrated as they talked about how difficult it was to get information already generated by research, and how helpful it would be "just getting that information out, you know, when they find something, or you know they prepare something maybe, just to have some means of getting [it]" (practitioner 3). One interviewee commented that "it would help me if I could just tell someone 'oh please let me know what you have on A,B,C topic' and someone would provide me with the materials" (practitioner 8). This suggests that practitioners can be distanced not only from research itself but also from the mechanisms used to manage research information: digests, research newsletters, and search procedures.
Reflecting on research
As we stated earlier in this discussion we did not approach this topic with a pre-determined notion of what was good or valuable research. What we learned from this project is that there really is no such thing as a single universal object called "research," and that practitioners hold significantly different views from the researchers whose perspectives we quoted in the literature review. At the same time, there is common ground. Practitioners and researchers alike, we suggest, see research as producing high status knowledge (hence its use as a validation tool by practitioners) with the potential for useful application. In this final section we would like to discuss how research and practice can be brought closer together.
The survey results suggest that people with experience in both research and their field of practice tend to make more use of research and, therefore, to make changes based upon it. Practitioners have to know enough about both sides to be able to build the bridge from the research to the practice. Researchers cannot possibly anticipate every implication of their work, and it will always be necessary for practitioners to work out how research-generated insights fit into practice. One helpful strategy may be for practitioners to learn more about how research is conducted, and perhaps even to conduct some themselves if the resources are available.
It is also important for practitioners to know what they want from the literature when they start looking around for research. Professional researchers rarely just "read research": they have a specific question in mind and exclude everything not directly relevant to that question. Practitioners need not take this task entirely upon themselves. Research on the ERIC database shows that librarians tend to do a better job of searching than any other group, even professional researchers (Hertzberg & Rudner, 1999). Resources such as state literacy centers and clearinghouses can provide invaluable assistance in focusing and conducting searches of the research literature.
Practitioners should be cautious about using research findings to validate their programming decisions. Claims made by researchers about the generalizability of their research tend to be very careful, especially in the case of qualitative research. What improves a family literacy program in New York may not be as effective in New Mexico, even if both programs work with Hispanic learners. Research data is no easy, universal guarantee that a particular approach will work. Practitioners still need to apply their own expertise to judge the merits of research claims in their area of practice.
Researchers need to give more thought to the transferability of their work, and to how to provide the information practitioners need to assess the utility of research findings. One important step would be for researchers to be clear about the implications of their work. These may be entirely theoretical, which is valuable in itself, but if an insight into practice is being claimed it should be laid out clearly. An important component of this process is for researchers to bear in mind the ways practitioners use research: validation, design, and improvement. Which of these uses does the research address, and what is the critical point to bear in mind?
The stereotypical divide between research and practice does seem to have some basis in the lived experience of practitioners in our state. The professionals we talked to were clear that they had issues that were not being addressed by the research they were reading, and were sometimes a little frustrated by this disjuncture. While we understand this frustration, it also makes sense to us that researchers and practitioners have different interests and worldviews, since they do different jobs in different contexts. The idea that researchers would focus entirely on problems of practice seems both unrealistic and limiting, and the idea that all practitioners would become research experts seems a little naïve. However, we do believe that valuable work can be accomplished if both sides continue to do what they are good at.
We have suggested a number of ways that the two areas of work can converge, but we remain cautious about the degree of integration implied in government mandates that all curricula must be research-based. The test of practical resources must be in practical use, and it is unlikely that any one resource will be tested in every possible context by the research community. In the end the skills and experience of the practitioner, albeit informed by research, must be the ultimate criterion of what knowledge is most useful to their practice.
References

Fox, R. D. (2000). Using theory and research to shape the practice of continuing professional development. Journal of Continuing Education in the Health Professions, 20(4), 238-239.
Garner, B., Bingman, B., Comings, J., Rowe, K., & Smith, C. (2001). Connecting research and practice. Focus on Basics, 4(D).
Gersten, R. (2001). Sorting out the roles of research in the improvement of practice. Learning Disabilities Research and Practice, 16(2), 45-50.
Hammersley, M. (2002). Educational Research, Policymaking, and Practice. London: Paul Chapman.
Hertzberg, S., & Rudner, L. (1999). Quality of researchers' searches of the ERIC database. Education Policy Analysis Archives, 7(25).
Hultman, G., & Hörberg, C. (1995). Teachers' informal rationality. Science Communication, 16(3).
Kennedy, M. M. (1997). How teachers connect research and practice. Mid-Western Educational Researcher, 10(1).
USDOE. (2002). Information Quality Guidelines. Washington, DC: U.S. Department of Education/U.S. Office of Management and Budget.
Warby, D. B., Greene, M. T., Higgins, K., & Lovitt, T. C. (1999). Suggestions for translating research into classroom practices. Intervention in School and Clinic, 34(4), 205-206.