Do interactive video tools lead to more active and deeper learning?

Introduction

Last year I completed a Master’s research paper with the University of Waikato (NZ) called “Digital Technologies in the Classroom: New pedagogies, new practices”. I was tasked with choosing and analysing an emerging technology within a teaching and learning context. I chose to explore how interactive digital video tools, used in the classroom, could enable more active and engaging learning. My interest in this topic stems from the increasing prevalence of video use in education (Stigler, Geller & Givvin, 2015), along with the recent rapid emergence of online tools that afford all educators the ability to add interactivity to online videos.

Why interactive video?

Different brain processes are required when watching video as opposed to reading. While reading requires active involvement and deeper cognitive effort, watching video is far more passive, less cognitively demanding, and more automatic (Margalit, 2015); it therefore will not necessarily lead to deep learning (Clark, 1994; Mayer, 2009; Chi, 2009, as cited in Stigler et al., 2015).

The main claim of these interactive video technologies, then, is that they “turn online videos into interactive learning experiences that engage students and deepen understanding”. This type of technology claims to turn students from passive into active learners. My objective for this investigation was therefore to assess the claims that interactive video makes learning more active and enables higher levels of thinking. Specifically, I wanted to know:

  1. Does interactive video better engage students?
  2. Do students operate at higher levels of thinking?
  3. Does it improve learning?
  4. Is this type of tool worth implementing?

Previous attempts to move video viewing from passive to active learning

Previous attempts to make video viewing more active have tended towards overlaying interactive elements on the video or building interactive elements in during the design process (Wijnants, Leen, Quax & Lamotte, 2014). Others have focused on enhancing learning through adult mediation, such as regularly stopping the video for class discussion (Linebarger, 2009 as cited in Golos & Moses, 2011). Hoxley & Rowsell (2006) concluded that a paper-based quiz for students to complete during a viewing was highly effective, and even more so when the subsequent focus of learning was based on feedback from the student results.

The possibility of a relationship between a student’s viewing style (how they view the video) and their learning style or short-term memory ability was ruled out by a study by de Boer, Kommers and de Brock (2011), whose students were given multiple-choice comprehension tests to complete along with the video. Cheon et al. (2014) investigated both passive and interactive spatial sequencing techniques while viewing animated sequences, given the assertion that “people learn better when a multimedia message is presented in user-paced segments rather than as a continuous unit” (Mayer, 2009 as cited in Cheon et al., 2014). Their results showed that active pauses, such as answering questions during a viewing, resulted not only in better recall during later testing but also in a better ability to apply the information. Moore (2013) describes effective techniques for using video that go beyond a guided lesson to become a springboard for in-depth discussion and critical thinking, and to strengthen online research skills while driving conceptual understanding.

Most of the research focuses on learners viewing multimedia individually in fully online or blended environments, but there is still a large amount of class-based group viewing. Its benefits are that it ensures all students actually view the same material, and that interactions are instantly collaborative. In class-based situations where students have set questions or a task to answer along with the video resource, however, it may be beneficial to give them the flexibility to interact with the video in their own way, rather than watching or interacting with the same viewing as everyone else.

Choosing the Technology

Criteria for choosing the tool

Technology Acceptance Model (Venkatesh et al., 2003)

Before choosing a technology, I decided to review the currently available tools. I used the Technology Acceptance Model (Venkatesh et al., 2003) to help shape my criteria, settling on specific criteria that cover the most useful functionality along with considerations for both end-users: the teacher (creator) and the student (user):

  1. Teachers can build the interactivity into online videos that they currently use (from YouTube, Vimeo and other major sources)
  2. Make the interactivity available either individually (online) or to a whole class audience (face to face)
  3. Is easy to learn and use
  4. Is robust and reliable
  5. Integrates well with other tools & systems
  6. Has a useful analytics component
  7. Has functionality (question types) that enable higher levels of thinking

Interactive Video Tools Review

A review of currently available technologies confirmed that interactive online video is a fast-growing phenomenon within the education sector. I discovered a range of tools:

  • Those with built-in interactivity – eg KhanAcademy
  • Note-taking tools – eg VideoNot.es
  • Those built into video hosting systems – eg YouTube editor
  • Commercial tools – eg hapyak
  • Tools that enable adding text, buttons and links – eg Wirewax
  • Products for “flipping” the classroom so students can source and annotate their own videos before submitting them – eg VideoANT, Vialogues and MoocNote
  • Interactive closed complex solutions – eg playposit
  • Interactive stand-alone complex solutions – eg Zaption

My choice – Zaption

Playposit and Zaption were the two that best matched my criteria. Playposit was built into a larger LMS-style system, whereas I preferred something more stand-alone. Zaption is made specifically for education and allows a teacher not only to crop online videos but also to combine a collection of video segments into a single viewing, called a “Learning Tour”. It is a stand-alone tool but integrates well with the likes of eTV (NZ), Google Classroom and Learning Management Systems. Zaption grew out of the University of California’s desire to address issues surrounding classroom practice, based on research on how students learn (Stigler et al., 2015). They even developed a Zaption implementation model based on the teacher’s learning goal.


Zaption was launched in May 2013 with the catchphrase “Don’t just watch. Learn” (Karlin, 2013). Zaption’s CEO, Chris Walsh, says that “everyone loves to use video for learning, but there isn’t a lot of research saying that it makes a difference” and that “Zaption aspires to turn video from a passive to an active viewing experience” (Locke, 2015). It is designed to address issues surrounding classroom practice based on research on how students learn, and the goal of Zaption is not to re-create existing content, but to “augment the quality of teaching by scaffolding a rich and engaging conceptual learning opportunity into a teacher’s instructional practices” (Stigler et al., 2015).

Research Data Gathering

I decided to choose two video resources that I had used in the past. This gave me the advantage of familiarity with the subject content, and the ability to compare the quality of learning interactions using Zaption with those from classes I had taught earlier using different methods and tools. It also enabled me to have a clear learning goal for each resource, as recommended by the Zaption Implementation Model (Stigler et al., 2015), rather than using the tool in isolation. I was guided by Moore’s (2013) effective techniques for using video, which go beyond a guided lesson to become a springboard for in-depth discussion and critical thinking, and to strengthen online research skills while driving conceptual understanding.

Learning Task 1 – “What makes an effective documentary?”

Students will analyse a short student-made documentary (7 minutes), looking for techniques used to make it effective. This is an introductory lesson for a wider documentary-making project. Specifically, they will be asked to:

  1. Identify visual, audio and any other techniques that made the documentary effective
    Purpose: to help students understand effective production techniques and apply them to their own projects
  2. Identify 3 possible focus questions around which the student structured their documentary
    Purpose: to help students understand structure and write good focus questions for their own projects
  3. Identify each step that they themselves would take in producing a similar project based on a given topic.
    Purpose: for students to break a project down into manageable steps in a logical sequence

I am using the video as a guided lesson for analysis, applying Moore’s techniques of setting expectations and posing a question (built into the video) before each section. I will not, however, give an outline-like structure, as I want the students to identify it themselves.

Learning Task 2 – “Understanding the causes of World War One”

Students are to answer questions about the range of causes/factors put forward in the documentary “World History – Causes of World War I” (35 minutes). The aim is for students to achieve the following:

  1. Answer difficult comprehension questions during the video.
    Purpose: to help students develop a deeper understanding of key ideas and causes such as Industrialisation, Imperialism, Nationalism and various links or effects of each.
  2. Lead into Multi-choice and Discussion tools
    Purpose: to ask students to develop an opinion about the principal cause, and to develop arguments for their choice as opposed to others.

I will again use Moore’s techniques of setting expectations and posing a question at the start of the video. The video is also being used as a springboard for in-depth discussion and critical thinking.

Finding an appropriate model or framework

I needed a model or framework to evaluate the learning. I started by reviewing a number of technology-specific models:

Technology Models

SAMR

SAMR is often referred to and acknowledged as a good model for assessing how technology is integrated with learning. Applying SAMR, however, carries the underlying assumption, and message, that teachers were using lower-level teaching strategies prior to the intervention of technology (Moroder, 2013). I believe it will fast become outdated and be of little value as technologies, particularly the internet, become ubiquitous in our daily lives (Anderson & Rainie, 2014). SAMR assumes a progression from basic substitution of technologies by teachers through to redefined student-centred tasks, which often does not reflect the experience of teachers (Schwartz, 2014). It is also problematic that a range of teachers could justifiably place their use of technology at different levels on SAMR for exactly the same learning task with their students, depending entirely on the pedagogical position they had previously applied.

TPACK

I found the TPACK model to be of little value. It focuses on the knowledge teachers require (Content, Pedagogy and Technology) and the need to integrate each with the others, and ultimately all three combined. However, I discarded it as it offers no help in evaluating the quality of learning in a task.

Trudacot

The Trudacot model, an acronym for Technology-Rich Unit Design And Classroom Observation Template (McLeod & Graber), is a set of questions to help us move beyond the likes of “better student engagement” with technologies, to investigate more deeply the pedagogy behind any given task or activity (Schwartz, 2014). The tabulated version of Trudacot appears to be quite useful, particularly in a class observation scenario. However, I require a model to assess the quality of learning interactions within a given situation, which this model would struggle to do.

TIM (Technology Integration Matrix)

In my view the TIM (Technology Integration Matrix), produced by the Florida Center for Instructional Technology, appears far more helpful than SAMR. TIM gives a more realistic progression of how overall changes occur when technology is integrated into the classroom, allowing the focus to move from any individual task to the environment being created. The vertical axis of the matrix is built around five 21st Century Learner pillars (Active, Collaborative, Constructive, Authentic, Goal Directed), thus encouraging teachers to reflect on how their technology impacts the learner. Because TIM focuses more on a progressive change in environments, rather than on assessing learning in an individual task, I decided to look for something more appropriate and turned to pedagogical models.

Pedagogical Models

Bloom’s Taxonomy

When evaluating the impact of technology-integrated learning, the key question should be “what have students learned?” (Schwartz, 2014). For this reason I decided to explore two non-technology-based pedagogical models, Bloom’s and SOLO Taxonomy, that could help me assess the quality of learning interactions in the tasks.

Where Bloom’s Taxonomy was developed through a process of theorising by a group of educators, SOLO Taxonomy is a research- and evidence-based theory about teaching and learning. SOLO is easy to apply, understand and use to appraise learning in real situations, whereas Bloom’s is less realistic, implying a progression of understanding before moving to the next level (Hook). SOLO is built on “constructive alignment”, where “we start with the outcomes we intend students to learn, and align teaching and assessment to those” (Biggs). I have found SOLO easy to understand and apply in the past, as it is practical and makes sense to both educators and students (Hook).

Zaption Issues

I encountered a number of issues with the Zaption technology:

  • The question options were more limited than I had expected. I had prepared multi-choice and multi-select questions with more than 5 options, but had to cut a number of answers because the software only allows a maximum of 5 possibilities – I have not encountered this limit in previous quiz software I have used. A matching question type would have been very useful, and drag-and-drop even better.
  • There is no in-built grading of multi-select questions, only a graph and students’ individual answers. This means the “Distribution of Answers” does not give the full picture of how students answered the question (a possible manual workaround is sketched after this list).
  • While presenting with Zaption to a whole class, students were required to wait for everyone to complete even simple multiple-choice questions before moving on.
  • Analytics appeared quite limited. You could not analyse an individual student’s performance across the whole lesson, just completion and multi-choice results – and multi-choice are the only questions properly graded.
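
Since Zaption does not grade multi-select questions itself, one possible workaround is to grade the responses outside the tool. The following is a minimal sketch under that assumption; the CSV file name, column layout and answer key are all hypothetical illustrations, not an actual Zaption export format.

```python
# Grade multi-select answers outside the tool (hypothetical workaround).
# Assumes responses have been copied or exported into a CSV with the
# columns: student, question, selected (chosen options separated by ";").
import csv
from collections import defaultdict

ANSWER_KEY = {            # hypothetical correct option sets per question
    "Q3": {"A", "C", "D"},
    "Q7": {"B", "E"},
}

scores = defaultdict(int)
attempted = defaultdict(int)

with open("multiselect_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        question = row["question"]
        if question not in ANSWER_KEY:
            continue
        selected = set(row["selected"].split(";")) if row["selected"] else set()
        attempted[row["student"]] += 1
        # Full credit only when the selected options match the key exactly.
        if selected == ANSWER_KEY[question]:
            scores[row["student"]] += 1

for student in sorted(attempted):
    print(f"{student}: {scores[student]}/{attempted[student]} multi-select correct")
```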

Long preparation time for video questions

  • I found that the preparation time for making videos was far longer than I expected – for these activities it took hours to go through and write questions, check them, modify them, and so on.
  • You could not copy and paste a whole question and then change the answers, which would have saved a lot of time.
  • Overall the software was easy to learn, but clunky to use.

As I was making the video I kept asking myself: “Is this a good use of a teacher’s limited and valuable preparation time?”

Project Outcomes

Learning Task 1 – “What makes an effective documentary?”

Year 11 students (grade 10, age 15-16) were mostly asked a range of short paragraph-answer questions. While students rated the actual documentary video very highly, their overall responses were fairly average. For instance, students were asked to identify a range of documentary-making techniques (visual, audio & other) used to make the video documentary. The average number of techniques that students identified was unexpectedly low at 3.6, with a maximum of 6 by one student. The majority of answers were repeats of other responses and lacked any insight or real depth. I compared the results with the responses from a Year 10 (grade 9, age 14-15) class in 2014, who instead watched the video and were asked to write down their responses individually, then discuss them in groups, then contribute to the whole class. There was really no comparison. What was more noticeable back in 2014 was that students’ collaborative responses fed other students’ thinking. This was not possible with the Zaption viewing method, which was very mechanical and did not allow for student-student, student-teacher, or student-class interactions.

Students were also asked to identify 3 possible focus questions that the documentary maker may have used to guide their research/presentation. I rated the quality of their questions as Good, Adequate, Bad or No Response. The combined overall ratings show that only one third of the focus questions were of good quality (see graph).

The last question had students identifying the sequential steps that would have been used to carry out the project, which again resulted in mixed responses.

I graded each student according to their overall responses using the SOLO Taxonomy Assessment Grid (below).

As you can see (graph below) the results are very average, with only 5 of the 23 students registering in the desired Relational and Extended Abstract range.
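
For clarity, the distribution behind that graph is simply a tally of one overall SOLO rating per student. Below is a minimal sketch of how such a tally could be produced; the ratings list is a hypothetical placeholder rather than the actual class data.

```python
from collections import Counter

# SOLO levels in order, from lowest to highest.
SOLO_LEVELS = ["Prestructural", "Unistructural", "Multistructural",
               "Relational", "Extended Abstract"]

# One overall SOLO rating per student (hypothetical placeholders;
# in practice there was one rating for each of the 23 students).
ratings = ["Multistructural", "Unistructural", "Relational",
           "Multistructural", "Extended Abstract"]

tally = Counter(ratings)
for level in SOLO_LEVELS:
    print(f"{level}: {tally[level]}")

# Students in the desired Relational / Extended Abstract range.
desired = tally["Relational"] + tally["Extended Abstract"]
print(f"Relational or above: {desired} of {len(ratings)}")
```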

These less-than-desirable learning outcomes from the Zaption trial led me to run the same learning task without Zaption, this time in groups using an online collaborative question sheet (a Google Doc). A comparison of the focus questions constructed by each group, as opposed to those from the Zaption method, shows an overall improvement (below).

This clearly demonstrates that students working collaboratively in a group can share ideas to improve knowledge and skills, as is also shown by the average number of visual, audio and other techniques identified (graph alongside).


Learning Task 2 – “Understanding the causes of World War One”

This second task leaned towards comprehension, with mostly multi-choice or multi-select questions, apart from the final paragraph question in which students were asked to make a choice and justify it.

Only the Multi-choice answers, six in total, gave any accurate information as to performance. This is a typical Multi-choice question:

Over half the class performed very well, scoring 5 or 6 out of 6 correct. However, others in the class were only average or poor at comprehending and answering the questions. As discussed earlier, there was no point in analysing the multi-select questions because they are not graded by Zaption and would have taken too much time to analyse.

The final “open response” (paragraph) question asked students to choose which of the causes of World War I they thought was most significant and why.

These responses were graded according to SOLO Taxonomy (grading grid below).

The SOLO rating of their results showed that only 4 students were at the Relational or Extended Abstract level, whereas the majority of students were below Multistructural. This makes me question the Zaption tasks altogether, particularly when high-performing students cannot use knowledge to justify their position.

I compared the SOLO ratings of this very capable top Year 11 (age 15-16) History class with those of my average-to-good Year 10 Social Studies class from 2014. The two classes did not do the same activity when it came to comprehension. While this Year 11 class completed a Zaption lesson on WWI causes, my Year 10 (age 14-15) class in 2014 was given a range of content over time, then tasked with choosing the cause of WWII (a later war) that they thought was most significant and arguing their case with each other in an online forum. The performance of the less capable Year 10s back in 2014 was clearly superior to that of the very capable Year 11 History class. Although I am not comparing like with like, it does highlight the shortcomings of the individual, teacher-centric Zaption lesson approach compared to the depth of understanding achieved when students gain knowledge and apply it in a class forum discussion/debate, albeit over a period of time (see graph below).

How are other teachers using Zaption?

In these Zaption activities I believe I had put a lot of effort and thought into forming good interactive (quiz) questions. Despite this, the responses had not met my expectations for student comprehension or depth of understanding. I therefore decided to look at the kinds of lessons other educators were publishing to the Zaption public gallery. This gallery allows teachers to make a copy of a published quiz and either use it as it is or modify it.

I took a sample of 20 Zaption lessons from the Social Sciences Gallery, systematically choosing every 5th lesson. When analysing the lessons (videos) I found that the average length of the videos was 7:10 mins, with the average number of built-in interactions at 7.25 – note that an interaction included simple information or instruction slides.
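
These averages are simple arithmetic over the sampled lessons. The sketch below shows how they could be reproduced, assuming the durations and interaction counts were recorded by hand while browsing; the gallery listing shown is a hypothetical placeholder.

```python
# Reproduce the sample statistics above (hypothetical listing data).
gallery_listing = [
    # (title, duration in seconds, number of built-in interactions)
    ("Lesson 1", 430, 8),
    ("Lesson 2", 515, 6),
    # ...one entry per lesson shown in the gallery
]

# Systematic sample: every 5th lesson from the listing, capped at 20.
sample = gallery_listing[4::5][:20]

if sample:
    avg_seconds = sum(seconds for _, seconds, _ in sample) / len(sample)
    avg_interactions = sum(n for _, _, n in sample) / len(sample)
    print(f"Average length: {int(avg_seconds // 60)}:{int(avg_seconds % 60):02d} mins")
    print(f"Average interactions per lesson: {avg_interactions:.2f}")
```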

I then went through and rated each question on its quality. The multi-choice questions were overwhelmingly of poor quality or simple recall (below left); very few made the student think carefully about what they had viewed. The open-response questions (below right), however, seemed to have more depth, requiring students to think about their responses at a deeper level.

Having reviewed all 20 Zaption lessons, I came to the sad conclusion that only one was worth using, while the other 19 were of little or no real learning value.

Conclusion

According to Stigler et al. (2015) and the architects of the Zaption tool, “The goal of Zaption is to not re-create existing content, but to ‘augment the quality of teaching by scaffolding a rich and engaging conceptual learning opportunity into a teacher’s instructional practices’”. Given my experience and evidence, I challenge this statement along with the capability and worth of Zaption and other interactive video platforms in the classroom. I conclude that interactive video technologies are not worth the investment of preparation time and money. During my investigation I continually stuck to Schwartz’s (2014) principle of asking the key question: “what have students learned?” For me the answer from their Zaption lessons was plain and simple: “very little”. There were no real gains in student learning in a face-to-face environment that could justify the investment of the time and money required to set up interactive video lessons. In fact, the results showed students interacted with less depth than they would have with a non-technology solution, such as being given focus points or questions prior to a viewing, followed by group discussion and class questioning after it. This is certainly borne out by my own experiences with the same two video tasks used two years earlier in 2014.

Zaption’s stated advantage of having students interact with the video through built-in questions is, I think, overstated. If anything, this type of tool actually impeded worthwhile learning interactions in the class that could have come about by using effective techniques, such as those outlined by Moore (2013), and by choosing and incorporating other, more suitable technologies. One student showed particular insight when commenting on his experience, saying that

“Zaption didn’t help that much as we get disconnected from the teacher and we can’t get help or explanations if we need.”

Instead of using Zaption in face-to-face or online lessons, I could easily have used a Google Form (quiz) to collect question responses, or a Google Docs template where students have suitable devices (such as Chromebooks); Moodle Quiz has a far better range of question types, and Moodle Forums would be an excellent place for students to discuss and debate video content. There are likely many other discussion tools for learning, not to mention social media tools such as Twitter and Facebook.

One thing that Zaption did offer was the ability to include headers, crop parts of a video, and blend several different videos into the one lesson. But there are other, cheaper and even free options for doing this had I thought it necessary or worthwhile.

Note: After completing my research the Zaption technology was sold to another company, Workday, and ceased operating soon after.

Bibliography

Anderson, J., & Rainie, L. (2014, March 10). Digital Life in 2025. Retrieved March 18, 2016, from http://www.pewinternet.org/2014/03/11/digital-life-in-2025/

Biggs, J. (n.d.). Constructive Alignment. Retrieved March 18, 2016, from http://www.johnbiggs.com.au/academic/constructive-alignment/

Cheon, J., Chung, S., Crooks, S. M., Song, J., & Kim, J. (2014). An investigation of the effects of different types of activities during pauses in a segmented instructional animation. Journal of Educational Technology & Society, 17(2), 296.

de Boer, J., Kommers, P. A. M., & de Brock, B. (2011). Using learning styles and viewing styles in streaming video. Computers & Education, 56(3), 727-735. doi:10.1016/j.compedu.2010.10.015

Ferriter, B. (2010, January 20). Why I Hate Interactive Whiteboards. Retrieved May 20, 2016, from http://www.edweek.org/tm/articles/2010/01/27/tln_ferriter_whiteboards.html?tkn=Q%5BRFGmQux6XnMebDMl4nddRDutTae13KtmNE

Golos, D. B., & Moses, A. M. (2011). How teacher mediation during video viewing facilitates literacy behaviors. Sign Language Studies, 12(1), 98-118. Gallaudet University Press. Retrieved March 12, 2016, from Project MUSE database.

Hook, P. (n.d.). Advantages of SOLO Taxonomy. Retrieved March 18, 2016, from http://pamhook.com/wiki/Advantages_of_SOLO_Taxonomy

Hoxley, M., & Rowsell, R. (2006). Using video in the construction technology classroom: Encouraging active learning. Architectural Engineering and Design Management, 2(1), 115. doi:10.1080/17452007.2006.9684609

Karlin, M. (2013, May 9). Zaption: Bringing YouTube to Life in the Classroom. Retrieved March 19, 2016, from http://www.edtechroundup.org/editorials–press/zaption-bringing-youtube-to-life-in-the-classroom

Locke, C. (2015, May 19). With $1.5 Million, Zaption Offers Online Interactive Videos (EdSurge News). Retrieved March 19, 2016, from https://www.edsurge.com/news/2015-05-19-with-1-5-million-zaption-offers-online-interactive-videos

Marcovitz, D., & Janiszewski, N. (2015, March 2). Technology, Models, and 21st-Century Learning: How Models, Standards, and Theories Make Learning Powerful. Retrieved March 16, 2016, from http://www.learntechlib.org/p/150163/

Margalit, L. (2015, May 1). Video vs Text: The Brain Perspective. Retrieved March 13, 2016, from https://www.psychologytoday.com/blog/behind-online-behavior/201505/video-vs-text-the-brain-perspective

McLeod, S., & Graber, J. (n.d.). Trudacot. Retrieved March 16, 2016, from http://dangerouslyirrelevant.org/resources/trudacot

Moore, E. A. (2013, May 20). From Passive Viewing to Active Learning: Simple Techniques for Applying Active Learning Strategies to Online Course Videos. Retrieved March 13, 2016, from http://www.facultyfocus.com/articles/teaching-with-technology-articles/from-passive-viewing-to-active-learning-simple-techniques-for-applying-active-learning-strategies-to-online-course-videos/

Moroder, K. (2013, November 4). Push My Thinking: TPACK or SAMR or ? Retrieved March 9, 2016, from http://www.edtechcoaching.org/2013/11/ed-tech-frameworks-why-i-dont-use-tpack.html

Schwartz, K. (2014, September 10). Taking Classroom Tech Use to the Next Level: Specific Traits to Look For. Retrieved March 16, 2016, from http://ww2.kqed.org/mindshift/2014/09/10/taking-classroom-tech-use-to-the-next-level-specific-traits-to-look-for/

Staton, M. (2010, May 12). Why Smartboards are a Dumb Initiative. Retrieved May 20, 2016, from http://theinnovativeeducator.blogspot.ae/2010/05/why-smartboards-are-dumb-initiative.html

Stigler, J., Geller, E., & Givvin, K. (2015). Zaption: A Platform to Support Teaching, and Learning about Teaching, with Video. Journal of E-Learning and Knowledge Society, 11(2). Retrieved March 20, 2016, from http://www.learntechlib.org/p/151061/

Technology Acceptance Model. (n.d.). Retrieved March 18, 2016, from https://en.wikipedia.org/wiki/Technology_acceptance_model

TIM: The Technology Integration Matrix. (n.d.). Retrieved March 16, 2016, from http://fcit.usf.edu/matrix/index.php

Venkatesh, V., Morris, M., Davis, G., & Davis, F. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.

Wijnants, M., Leën, J., Quax, P., & Lamotte, W. (2014). Augmented video viewing: Transforming video consumption into an active experience. Conference paper, 164-167. doi:10.1145/2557642.2579368

Wilson, B. (2012, August 9). Breaking my Silence on Smartboards. Retrieved May 20, 2016, from http://www.21innovate.com/write/breaking-my-silence-on-smartboards

Wouters, P., Tabbers, H. K., & Paas, F. (2007). Interactivity in video-based models. Educational Psychology Review, 19(3), 327-342. doi:10.1007/s10648-007-9045-4
