Background, Process & Overall Outcomes
Although states, districts, and schools implement educational technology programs that appear to improve teaching and learning, education stakeholders often struggle to provide data and research demonstrating those programs' effectiveness and impact.
In many cases, research or data exists to support a program's effectiveness, but it is not identified or communicated well because the research is not readily consumable or understandable.
SETDA identified a need to help education stakeholders identify and analyze research and results rigorous enough to make the case for continued funding of educational technology programs and to inform policy and program decisions.
Through a grant from the Intel Foundation, the “Showing Evidence of Educational Technology’s Effectiveness” project was designed to engage SETDA members in a year-long collaborative effort to assemble and communicate research data.
SETDA members attended the “Becoming a Better Consumer and Communicator of Research” session led by Jason Osborne, PhD, of North Carolina State University at the 2005 Leadership Summit. The group used this and other resources to develop rubrics that provide criteria for ranking research.
The rubrics are provided as tools to aid in the evaluation of research: the first addresses the type of research, and the second provides ratings based on peer review.
Finally, the online “Showing Evidence” thinking tool allowed participants to work in teams to evaluate evidence that supports or refutes a claim.
Participants in the Showing Evidence project noted that the discussions about research methodology and results enhanced their knowledge about research.
Opportunities to discuss research design and apply the language learned through the rubric development and the “Becoming a Better Consumer and Communicator of Research” session increased participants’ comfort with research.
A primary challenge was finding the time to devote to a thorough analysis of research design and results. With this in mind, participants chose to develop tools that let other SETDA members learn from their experience, along with templates and examples that members can readily use to communicate research, program, and evaluation findings.
Download the entire toolkit and more.
"Learning to use the tool for the purpose of reviewing research was valuable in that it called upon team members to develop a much deeper understanding about what quality research really looked liked including rigor of the evaluation, quality of reporting, and the ability to replicate the conditions of the study."
Oregon Dept of Education
"State leaders across the country using collaborative, online workspace to share and analyze original research is an excellent demonstration of 21st Century learning."
US Program Manager
Intel Education Initiative
“Developing the tool served as a powerful professional development experience for state educational technology leaders. It provided a hands-on, job-embedded approach to increasing our ability to critically analyze studies conducted on the use of technology in schools.”
Technology & Innovation in Education (TIE)