Drs. Peter Rich and Geoffrey Wright, School of Technology
Overall Evaluation
The goal of this project was to bring students together to research the use of video analysis in varied contexts. Through two different projects, we successfully carried out research in four different contexts: seminary, a private school for autistic children, the English Language Center at BYU, and BYU-Provo. We initially set up and began a study at LDS Business College but, due to unexpected medical emergencies, were unable to continue our work with the college. Overall, we feel this has been a successful experience, as we have discovered an approach that seems to work best for conducting videoclubs.
Results
In addition to multiple papers and presentations, this research has revealed that teachers who engage in small-group videoclubs on a regular basis report higher teaching self-efficacy and are able to point to actual changes in their teaching. Current external coding reveals that, after only a single videoclub, there is a trend toward asking more higher-order questions (specifically, application- and analysis-type questions). Perhaps the most impactful result, though, was a student who indicated that this research “changed her life,” as she decided to change her major from actuarial science to mathematics education. Consequently, we have since hired her to work with the current cadre of participants.
Budget
All expenses on this MEG grant went toward hourly wages for graduate and undergraduate students to meet together and to work individually on video analysis.
Principal Research Activities
The research developed over the course of the grant in three phases.
1. Literature Review
To familiarize students with the literature on video analysis, I asked them to help complete a table cataloging various aspects of studies on video analysis. The criterion for inclusion in the review was that a study must have involved teachers reviewing video of their own practice. We collaborated on this using a Google spreadsheet. These data played a key role in a literature review that is currently under review at the Journal of Technology and Teacher Education.
Students are currently helping to create a similar review by culling data on the effectiveness of teacher questioning strategies on student achievement. Students share and discuss these data through a Google spreadsheet. These data will form the basis for a literature review in an article that shares the findings from three semesters of our current videoclub research.
2. Inter-rater Training
A key aspect of this project was for Statistics 221 TAs to analyze their teaching using Bloom’s taxonomy. Using several videos from prior semesters, we trained students to independently code a video for each instance of a question and to rate each question using Bloom’s revised taxonomy as a framework. Over five videos, we established 85% agreement (reliability > .72), which was our goal before proceeding with independent coding of videos.
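As an illustration of how such agreement figures can be computed, the following is a minimal sketch in Python, using hypothetical codes rather than our actual data or analysis procedures, of one common way to calculate percent agreement and a chance-corrected statistic such as Cohen’s kappa from two raters’ question-level Bloom codes.

    # Minimal sketch: inter-rater agreement on Bloom's taxonomy codes.
    # The rater lists below are hypothetical; in practice, each item would be
    # the Bloom level assigned to one question instance in a TA's video.
    from collections import Counter

    def percent_agreement(rater_a, rater_b):
        """Proportion of question instances coded identically by both raters."""
        matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
        return matches / len(rater_a)

    def cohens_kappa(rater_a, rater_b):
        """Chance-corrected agreement (Cohen's kappa) for two raters."""
        n = len(rater_a)
        p_observed = percent_agreement(rater_a, rater_b)
        counts_a = Counter(rater_a)
        counts_b = Counter(rater_b)
        # Expected agreement if both raters coded independently at their base rates.
        p_expected = sum(
            (counts_a[level] / n) * (counts_b[level] / n)
            for level in set(rater_a) | set(rater_b)
        )
        return (p_observed - p_expected) / (1 - p_expected)

    # Hypothetical codes for ten questions in one video (Bloom's revised taxonomy).
    rater_1 = ["remember", "apply", "analyze", "remember", "understand",
               "apply", "analyze", "remember", "apply", "evaluate"]
    rater_2 = ["remember", "apply", "analyze", "understand", "understand",
               "apply", "analyze", "remember", "apply", "apply"]

    print(percent_agreement(rater_1, rater_2))  # 0.8 for these hypothetical codes
    print(cohens_kappa(rater_1, rater_2))       # approximately 0.74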
3. Video Coding
Fifteen TAs were video recorded at least twice during Winter semester 2009, five statistics TAs were recorded four times in Winter 2010, and eight statistics TAs are currently (Fall 2010) being recorded for all three lessons they teach during their training semester. Even after establishing a regular inter-rater reliability of .7 or higher, two mentored students code each video. Video analyses are shared with participating TAs in monthly videoclub meetings of three to four statistics TAs.
Collaborative Activities
1. Statistics TA teaching and training
Research participants (Statistics TAs) needed to be trained on Bloom’s taxonomy as a pedagogy. The mentored students and Dr. Rich planned and delivered this training, teaching the TAs how to use MediaNotes as well as Bloom’s taxonomy in a 45-minute class. True to our goal of increased sustainability, this training is now given by the TA supervisor.
2. Weekly Meetings
To help the students become part of a video analysis community, mentored students meet weekly with me and a doctoral student whose research focuses on using teacher video analysis in different contexts. Students share their progress, and we discuss the possible meaning of our tentative and, later, more conclusive findings. Through these meetings, we determined changes to be made in the research for subsequent iterations, which has led to more effective videoclub meetings. Perhaps the most powerful result we have noted thus far is that one statistics TA was so affected by regularly reviewing her teaching that she changed her major from actuarial science to mathematics education. We subsequently hired her to work with the current iteration of the videoclub.
3. Transcription
Mentored students aided Tonya Tripp with her dissertation research by transcribing upwards of 25 interviews and videoclub meetings. These transcripts lent insight into the different ways video analysis is used across situations, as well as how change might come about through participation in videoclubs. The students were then able to participate more fully in the weekly discussions by adding insights gained during transcription.
Resulting Products
Dissertations/Theses
Tripp, T. (2010). The influence of video analysis on teaching. Unpublished dissertation, Brigham Young University, Provo, UT.
Refereed Articles
Rich, P., & Tripp, T. (under review). Video analysis tools: Choosing the right tool for the right job. Submitted to TechTrends.
Tripp, T., Rich, P., Lane, E., & Chavez, M. (under review). Using video to analyze teaching. Submitted to the Journal of Technology and Teacher Education.
Conference Presentations
Rich, P., & Garrett, J. (2010). Improving teacher questioning strategies through collaborative video self-analysis. Presentation given at the annual meeting of the Association for Educational Communications and Technology. Anaheim, CA.
Rich, P. (2010). Choosing the “right” video analysis tool: A conceptual framework. Presentation given at the annual meeting of the Association for Educational Communications and Technology. Anaheim, CA.
Rich, P., & Tripp, T. (2010, April-May). The process of change: Video feedback in varied contexts. Presentation given at the annual meeting of the American Educational Research Association. Denver, CO.
Rich, P., & Tripp, T. (2010). Choosing the right video annotation tool for the job: A conceptual framework. In D. Gibson & B. Dodge (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2010 (pp. 1171-1178). Chesapeake, VA: AACE.