
The Use of Learning Technologies and Student Engagement in Learning Activities

PhD Student, corresponding author, Department of Computer and Systems Sciences (DSV), Stockholm University
PhD, DDS, Professor, Department of Computer and Systems Sciences (DSV), Stockholm University
PhD, Department of Computer and Systems Sciences (DSV), Stockholm University
PhD, Associate Professor, Department of Computer and Systems Sciences (DSV), Stockholm University

As digitalisation spreads in education, it is vital to understand its relation to student engagement. We used student diaries and observation data to approach student engagement and explore the use of learning technologies on a lesson-to-lesson basis. Results show that a less thought-through use of technologies might lead to unconsidered effects. Positive indicators of the facilitation of student engagement included making the learning process accessible and visible to teachers.

Keywords: Student engagement, upper secondary school, ICT, learning technologies

Introduction

In the wake of the uptake and spread of digitalisation within education, research has continuously offered a more nuanced understanding of the consequences and potentials of technologies in education.

One of the more pressing aspects brought forward by a number of studies is the suggestion that learning technologies in education are not used as effectively as they might be (e.g. Gudmundsdóttir, Dalaaker, Egeberg, Hatlevik, & Tømte, 2014). These studies suggest that the strategies teachers use to engage students in traditional practice might not match those needed to engage students in activities that adopt learning technologies (e.g. Grissom, McCauley & Murphy, 2017), and that effective learning activities should be student-centred and promote students’ active learning (Goodyear & Retalis, 2010; Grissom, McCauley & Murphy, 2017; Laurillard, 2012).

Several studies argue that learning technologies in themselves can engage students. Studies with this focus have typically approached engagement through a technology brought into the learning situation. The findings indicate increased levels of student engagement when using clickers (Han & Finkelstein, 2013), blogs (Cakir, 2013), virtual worlds (Pellas, 2014) or gamified learning (da Rocha Seixas, Gomes, & de Melo Filho, 2016). However interesting these findings are, the novelty effect must be noted, as students’ enthusiasm for “getting to use a laptop in education” might decrease when the laptop (or other technology) is no longer a new sensation (Punie, Zinnbauer, & Cabrera, 2006). Moreover, as these studies typically approached engagement in one particular situation, they do not take into account the relation between the engagement they measured and potential individual variations in that engagement.

While the fast-paced development of digital technologies might spur beliefs that learning technologies may solve all educational problems, concerns have stressed that it is not the learning technology itself, but rather how it is used, that will affect learning (Chen & Jang, 2010; Giesbers et al., 2013). Some studies point out that even after implementing learning technologies, teaching practices have remained traditional, and that not applying deliberate pedagogical strategies in this implementation might even lead to poorer results (Chen & Jang, 2010; Håkansson-Lindqvist, 2015). More specifically, Grissom et al. (2017) refer to traditional teaching practices as those that are mainly instructor-centred and focus on lectures and summative assessments instead of student-centred interaction (which includes student-student or student-content interaction). Similar observations are brought forward by Håkansson-Lindqvist (2015), who found that even though access to learning technologies has increased in schools, there is no increase in the support offered to the teachers using them.

Although some studies show that learning technologies and effective strategies to increase student engagement exist, we have not found studies that have approached student engagement in relation to a school’s existing, everyday use of learning technologies. The purpose of this study was therefore to approach the variation of student engagement and the everyday use of learning technologies in an upper secondary school. With this in mind, we raised the following research questions:

  • How are learning technologies used in the classroom throughout the school day?

  • What challenges and possibilities are associated with engagement when learning technologies are used in the classroom?

Theoretical background

Student engagement

With its roots in drop-out prevention programs, engagement theory has sought to decrease the rates of students dropping out from K-12 education, and has been a focus of studies since the 1980s (Eccles & Wang, 2012; Finn, 1989). In everyday life, people sometimes talk about motivation and engagement without separating one from the other. Engagement theory and motivation theory are closely related, and theories of motivation can inform studies of engagement, and vice versa (Fredricks, Blumenfeld, & Paris, 2004; Reeve, 2012). However, these are different constructs (Fredricks & McColskey, 2012; Reeve, 2012). Motivational theories typically approach student interest (intrinsic motivation) and student responses to triggers such as parental expectations or grades (extrinsic motivation) (Ryan & Deci, 2000). However, some studies propose that motivation alone is not enough for students to persist in their learning (Boekaerts, 2016; Poskitt & Gibbs, 2010), and suggest that students who are unwilling or unable to use self-regulatory strategies might still not engage (Boekaerts, 2016). Others draw parallels between engagement and learning, declaring that if students do not engage, there will be no learning (Reeve, 2012). Following this argument, Reeve (2012) states that engagement is a full mediator for learning, and that engagement may even be a stronger predictor of school success than teacher instruction.

Engagement is both a visible and measurable outcome of motivation (Boekaerts, 2016; Fredricks & McColskey, 2012). Another, perhaps more compelling, reason to approach student engagement is that it has been shown to drive learning and predict school success for all students (e.g. Boekaerts, 2016; Fredricks et al., 2004).

In their review, Fredricks et al. (2004) brought together definitions and operationalisations of engagement, which are used for the purpose of this paper. They suggested that engagement is a multi-layered construct consisting of a behavioural, an emotional and a cognitive dimension. The behavioural dimension includes the actions taken in order to learn, the emotional dimension encompasses attitudes and feelings, and the cognitive dimension reflects the effort invested in learning. As such, student engagement includes situations when the student:

  • initiates action to learn (e.g. taking notes or asking questions; the behavioural dimension);

  • confirms acceptance of the teacher’s instruction and the learning situation, observable in students’ verbal or facial expressions or body language (e.g. displaying curiosity or leaning forward; the emotional dimension);

  • directs the attention or concentration towards the object of learning (e.g. listening or reading; the cognitive dimension, which reflects the effort invested to learn and master the material).

The standpoint of this study is that it is critical to engage students: when they stop engaging with the learning material at hand, the learning process comes to a halt. Hence, it is critical that learning design uses learning technologies to support engagement.

Technology-enhanced learning and learning technologies

Educational activities that use technologies are commonly referred to as technology-enhanced learning (TEL) (Goodyear & Retalis, 2010). Within TEL, many terms are used to describe the variety of tools used in the classroom: digital technologies, information and communication technologies (ICT), Web 2.0 technologies, handheld technologies and more. Goodyear and Retalis (2010) suggest that technologies can be categorised by focusing on their role. The categories they suggest, which also apply to this study, are:

  • Technologies to access and study learning materials (e.g. learning management systems [LMS]);

  • Technologies that enable learning communication and collaboration (e.g. Cloud services that allow the simultaneous revision of a shared document or presentation);

  • Technologies for assessing learners (e.g. online tests);

  • Technologies enabling a learning-by-doing approach through construction and programming (e.g. assembling and programming robotics);

  • Technologies for developing digital and multimedia literacy (e.g. multimedia tools such as video editing or image processing).

This study uses Goodyear and Retalis’ (2010) categorisation to focus on the use of the learning technology. We use the term learning technologies when referring to the technologies used within TEL, as learning indicates the role of the technology at hand.

Methods and data collection

To explore the variation of student engagement and the everyday use of learning technologies, we followed students throughout their school days, as this would enable us to follow lesson-to-lesson variations. Hence, we applied a mixed-methods approach and gathered both qualitative and quantitative data from each lesson (Creswell & Clark, 2011; Miles & Huberman, 1994). To approach student engagement and the use of learning technologies, we conducted classroom observations and asked students to fill in diaries. Furthermore, to approach the potential relation between 1) observed and reported student engagement, and 2) the learning technologies, we developed two dedicated classifying schemes, through which the data was analysed.

Context and participants

The study was conducted in one upper secondary school in Stockholm, Sweden. The students (aged 17) were in the second year of the Media & Information Technology program, and all had access to a personal laptop. After receiving approval from the school principal, one of the researchers shadowed four students throughout their school day. All students were male, which reflected the gender balance at the school, as more than 89% of the enrolled students were male at the time of the observation. All students filled in their diaries each lesson, except for Student D, who was absent from lessons 2 and 3.

Data was collected over five weeks during fifteen lessons, distributed as follows: one lesson on day one, two lessons on day two and four lessons on days three, four and five. Eight different subjects, taught by six different teachers, were involved: Mathematics, Physics, Programming, User Interface Design, Communication, English, Swedish and Technology. When conducting classroom observations, the researcher was present in the morning and followed the class until the students left the school building (e.g. when students left for Physical Education or went on excursions).

Ethics

The choice of upper secondary school was made using convenience sampling, as the research department already had established connections to that school. Informed consent was obtained from the teachers and students prior to the observations. Participants were informed about the purpose of the study, their right to withdraw at any point and that all data would be treated anonymously. In line with anonymity, participants are referred to by letter (Student A, B, C and D; Teacher L, T, U, V, and so on).

Classroom observations

An observation schema and field notes were used during all classroom observations. The field notes aimed to capture classroom life and the learning activities that took place throughout the entire lesson. The observation schema was developed inductively: it initially included a number of candidate categories that the researchers expected to notice, and after the initial observations these categories were reworked to match what was actually observed. During the initial observations, it also became clear that observing the dimensions of engagement separately did not help in answering the research question, as these dimensions were heavily intertwined (as also noted by Reeve, 2012). Instead, the researchers developed a dedicated classifying scheme for the observation of student engagement (see Table 1). The observation schema also informed the development of a second classifying scheme, aimed at reflecting the different ways technologies were orchestrated during the learning activities (see Table 2).

Student diaries

Student diaries were distributed during each lesson and consisted of three sections: 1) a five-point Likert scale, in which 1 equalled “not engaged at all” and 5 equalled “engaged all the time”; 2) a prompt to reflect on their learning; and 3) open questions (e.g. “What contributed to your engagement?”) (see Appendix A). The students filled in their diaries individually after each lesson they attended. The students answered in their mother tongue; the quotes used in this paper have been translated into English.
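
To make the diary structure concrete, a minimal sketch follows (in Python; the record fields and the example values are our own illustrative choices, not the wording of the instrument itself).

```python
# Illustrative sketch only: one diary entry represented as a simple record.
# Field names and the example values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class DiaryEntry:
    student: str                    # "A"-"D"
    lesson: int                     # 1-15
    engagement: int                 # Likert 1 ("not engaged at all") to 5 ("engaged all the time")
    reflection: str                 # free-text reflection on the learning
    open_answers: dict = field(default_factory=dict)  # e.g. {"What contributed to your engagement?": "..."}

entry = DiaryEntry(
    student="D",
    lesson=9,
    engagement=4,  # hypothetical rating
    reflection="I was on it. The music I listened to engaged me.",
)
```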

Classifying schemes

To observe engagement and the orchestration of learning technologies, we developed two classifying schemes through which the data was analysed. Different data types were used in the schemes to offer a rich and nuanced understanding. The first classifying scheme (see Table 1) reflected student engagement in learning activities by noting to what extent students engaged in their learning activity. The scheme could have included categories other than the selected ones; in this study, however, the categories were identified during the initial classroom observations, from which we developed the scheme. To reflect the extent to which engagement was observed, we used the letters A-E: E meant that one or two of the students engaged in some of their learning activities to some extent during that particular lesson, and A meant that all of the students were fully engaged in their learning activities for most of the lesson (see Table 1).

Table 1 Dedicated classifying scheme to analyse student engagement in the classroom

The activity types and groups reflect a process in which more and more students direct their attention and action toward the learning, taking three aspects into account: the number of students (none; one to two; two or more; all); the extent, which reflects whether students were fully concentrated and persistent or easily and repeatedly distracted during their activity; and the duration of their engagement. There is a qualitative progression between the rows: the difference between D and E, for example, reflects a change in duration, as all the observed engagement in rows A-D reflected that students were engaged most of the time.

The second classifying scheme (Table 2) was developed with the aim of reflecting the use of learning technologies and related learning activities, and indicators that support engagement (reflected in how learning was designed). Only learning technologies used in learning activities were included, and all lessons were coded using the scheme.

Table 2 Classifying scheme to analyse indicators of facilitation of student engagement

A-E reflect indicators of the facilitation of student engagement. In Table 2, N/A was noted when the lesson did not use learning technologies; “E” when the use of learning technologies was not orchestrated by the teacher but was taken on the individual’s initiative for individual work; “D” when one learning technology, designed for one user at a time, was shared by a group of students; “C” when the lesson contained one or more collaborative learning activities using technologies; “B” when learning technologies were orchestrated to engage all students; and “A” when arenas were offered in which every student’s participation was requested and seen by the teacher. By the term ‘arena’, we mean any space that students could use to express themselves (e.g. posting contributions on forums, or writing answers in text boxes in assessment applications).
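
Read as a coding key, the orchestration categories can be summarised as in the sketch below (Python). This is an illustration of the coding logic only, not a tool used in the study; the observed-engagement code in the example is invented.

```python
# Illustrative sketch: the orchestration categories of Table 2 as a lookup,
# plus a helper that summarises one coded lesson. Example values are hypothetical.
ORCHESTRATION = {
    "N/A": "No learning technologies used during the lesson",
    "E": "Technology use not orchestrated by the teacher; used on the individual's initiative for individual work",
    "D": "One technology, designed for one user at a time, shared by a group of students",
    "C": "One or more collaborative learning activities using technologies",
    "B": "Learning technologies orchestrated to engage all students",
    "A": "Arenas offered in which every student's participation was requested and seen by the teacher",
}

def describe_lesson(lesson: int, subject: str, orchestration: str, observed_engagement: str) -> str:
    """Return a one-line summary combining both classifying schemes."""
    return (f"Lesson {lesson} ({subject}): orchestration {orchestration} - "
            f"{ORCHESTRATION[orchestration]}; observed engagement {observed_engagement} (Table 1)")

# Lesson 11 (Technology, robotics group work) was coded as orchestration D;
# the observed engagement code "C" here is invented for illustration.
print(describe_lesson(11, "Technology", "D", "C"))
```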

Data analysis

To visualise observations of and variations in self-reported engagement over time, and to create and analyse patterns, the qualitative data was displayed in figures (Creswell & Clark, 2011; Miles & Huberman, 1994). Student diaries were analysed using thematic analysis. Data was coded while the data collection was still ongoing, in an iterative approach in which the researcher repeatedly switched between identifying candidate themes and gathering new data. After the final observation, the researcher returned to the data anew to rework the themes.
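
As an illustration of the kind of lesson-by-lesson display used to identify patterns, a minimal sketch follows (Python with matplotlib, assumed here as a suitable plotting tool; the ratings are invented, and None marks an absence, cf. Student D in lessons 2 and 3).

```python
# Minimal sketch of plotting self-reported engagement per lesson (cf. Figure 2).
# The Likert ratings below are invented for illustration; None marks an absence.
import matplotlib.pyplot as plt

lessons = list(range(1, 16))
diaries = {
    "Student A": [4, 3, 4, 5, 4, 3, 5, 5, 4, 3, 4, 5, 5, 4, 5],
    "Student D": [3, None, None, 5, 3, 3, 5, 4, 4, 3, 3, 5, 4, 4, 5],
}

for student, ratings in diaries.items():
    xs = [lesson for lesson, r in zip(lessons, ratings) if r is not None]
    ys = [r for r in ratings if r is not None]
    plt.plot(xs, ys, marker="o", label=student)

plt.xticks(lessons)
plt.yticks(range(1, 6))
plt.xlabel("Lesson")
plt.ylabel("Self-reported engagement (Likert, 1-5)")
plt.title("Self-reported engagement per lesson (illustrative data)")
plt.legend()
plt.show()
```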

Validity and credibility

As the study was conducted within one Media & Information Technology program with a small number of informants, the potential for generalisation is limited. Still, the authors believe the findings can inform considerations on how to use learning technologies to engage students in other settings. The researcher who conducted the observations has fifteen years of teaching experience, and it is likely that this experience affected how student engagement in learning activities was perceived. According to Braun and Clarke (2013), such researcher subjectivity should be embraced, as different means of analysis are both possible and needed.

Results

The use of learning technologies varied over time and related to both observed and self-reported levels of student engagement.

Figure 1 Observed engagement

Figure 1 shows that the observed engagement varied throughout the school day, and that the students engaged at least to some extent most of the time (see Table 1, “D”). When Teacher L and Teacher T orchestrated technologies in their lessons, all students engaged in their learning activities to a large extent most of the time (see Table 1, “B”). Teacher V and Teacher U each taught two subjects, with the same observed engagement regardless of subject.

Figure 2 Self-reported engagement.

Figure 2 reflects self-reported engagement spanning the fifteen lessons.

Figure 2 shows that students regularly reported being engaged during lessons, and that their reported engagement lay between levels 3 and 5 (engaging to a great extent, or almost always being engaged). While some lessons engaged the students at different levels, some were more engaging for all four students (lessons 4, 5, 7, 8, 9, 12, 13 and 15). The students’ self-reported engagement also shows that they engaged to some extent in all their learning activities, whether learning technologies were used or not. During lessons 2, 6 and 10, none of the students reported being engaged at the highest level, 5. These lessons coincided with orchestration type N/A; no technologies were used.

The results suggest that the higher levels (4-5) of engagement were related to lessons in which the use of learning technologies was orchestrated in certain ways. We identified three indicators of facilitation of student engagement: these were when the teacher: 1) offered an arena in which every student’s participation was requested; 2) witnessed the student’s contribution; and 3) requested insights into the student’s knowledge and learning process. These are presented below:

Observations of how learning technologies were used and themes from student diaries

We identified five ways (A-E) learning technologies were orchestrated during the 15 lessons (see Table 2). A-E thus reflect indicators of facilitation of student engagement. Approaching qualitative implications of these orchestration types on student engagement, four themes were identified in the student diaries: 1) “Invisible learning process”; 2) “Learning technology for one user at a time”; 3) “Motivated but not engaged”; and 4) “Request engagement and confirm”.

The different ways of orchestration could be associated with the themes in the following way:

Orchestration E: The use of learning technologies was not orchestrated by the teacher, but was used on the individual’s initiative for individual work.
Theme 1: Invisible learning process.

Subject: Swedish: lessons 5, 9 and 13.

Pedagogic context: In response to the instruction “read a book and write an analysis”, the student browsed the Internet instead of reading.

“I was on it. The music I listened to engaged me.” (Student D)

The researcher routinely compared students’ reported engagement with the engagement observed during class. Most often the two were related (the data types vary; students’ estimations ranged between 3 and 5, and the observed engagement between D and B). During lesson 9, however, the observed engagement was high, and Student D reported that he was more engaged than during the previous Swedish lesson (lesson 5). To understand more, the researcher approached him. The student explained that he did not have to read, as he could get “inspiration” from the Internet to write the analysis. Despite having access to learning technologies, and despite the teacher having requested an outcome, this student’s learning process not only remained invisible; it was uncertain whether it had even been initiated. After class, when asked in the diary to reflect on his engagement, the student referred to the music coming from his headphones, and not to the learning activity, when writing that he “was on it”, which in this context meant accepting the request to produce an outcome that the teacher would accept as proof of learning. That the student did not read the book did not become apparent to the teacher. The learning process remained invisible in lessons 5 and 13 as well. Students engaged in meeting the teacher’s requests, but without insight into students’ learning processes, those requests may be met in ways the teacher did not intend.

Orchestration D: Learning technologies intended for group work were only fit for one user at a time.

Theme 2: The learning technology at hand can only be used by one user at a time.

Theme 3: Motivated but not engaged.

Subject: Technology: lesson 11.

Pedagogic context: Four to five students were gathered around a laptop, or a robot, for collaborative work. However, the learning technologies were designed to be used by one person at a time, which led to some issues. After some time, the students who were not actively engaged with the technology started moving around the classroom, turning their attention to their mobile phones or talking to peers.

“It was a fun idea that we should build and discuss robotics. I got side-tracked midways.” [The student perceived that he engaged] “about as much as the rest” (Student D)

Despite expressing positive feelings about the learning activity, not all students were engaged in their learning. When approached, one student explained he thought it was a “fun idea” and it seemed as though he looked forward to presenting the final product to the teacher. However, that motivation did not make him engage in the learning activity. In group work, some students positioned themselves to be able to engage in the learning activity, while others displayed uncertainty; it seemed unclear to them in what ways they could engage in the learning. While it was potentially possible for a peer to sit next to the student who was using the learning technology, this required persistence, which not all the students possessed. These students more often than not walked out of the classroom, turned to their mobile phone or engaged in conversations with peers. That did not mean they had given up their intention or desire to participate. To varying degrees, they kept trying to re-enter the group or to find an opening for engagement. That the students thought it was a fun activity did not ensure they engaged in the learning process. The researchers interpreted the situation as one in which the students wanted to engage but could not, as the technology was only fit for one user at a time.

Orchestration C: The lesson contained several TEL activities intended for group collaboration.

Theme 3: Motivated but not engaged

Subjects: Communication and English: lessons 3 and 8.

Pedagogic context: Students worked on producing films; they wrote manuscripts, at times, in shared documents (using Cloud services), and chose the technologies to use in recording and editing the film themselves. Observations revealed that these group activities did not engage all students; rather, one student could write the manuscript and film or edit the film on behalf of the group. It was observed that this was often the same student.

“Mostly, I did nothing, as I did not want to challenge NN’s artistic touch. That I got to see input to our work made me committed.” (Student A)

When the researcher approached the student who had reported a lower level of engagement, the student explained that he “mostly did nothing”. During some lessons in which learning technologies were offered, the teachers had not guided students in their collaboration. Even though more technologies were accessible and activities could have been shared, this was not done. The comment “That I got to see input to our work made me committed” (Student A) can be understood in a context where the student perceived himself as an otherwise motivated and engaged student (see Figure 2); it also reflects a general desire to hand in projects, within a culture that accepts someone else doing the job. This was observed on a number of occasions, such as when students were working on projects and presentations. A recurring finding during instructed group work (with four to five students) was that only one or two students engaged in the learning activity. It was up to the students to divide tasks and access to the different learning technologies. A group could hand in their work without all members participating. The learning related to engaging with that particular task was limited to the one student engaging. Moreover, it seemed to be the same students who were always active, and the same ones who remained passive. This might reflect that students recognised which of the students in the group would provide the highest quality of work on behalf of the group as a whole.

Theme 4 (1st example): Request engagement and confirm.

Orchestration B: Learning technologies were orchestrated to engage all students.

Subject: Programming: lessons 7 and 15.

Pedagogic context: Students were working on their individual laptops on problem solving within programming. This was a learning activity with four or five different exercises randomly allocated to the students. Some students were given problems to match their special interest or to challenge them if they were above-average performers in the class.

“I’m excited to work with programming. I don’t know if I was more or less engaged than my peers, since I did not turn around [the student was seated at the front]. I may have been more engaged than the others.” (Student A)

The teacher walked around to check on student activity and to confirm students’ progress. The teacher was constantly moving about, but could only respond to one student at a time, which meant a line of students was waiting. This variation of interaction meant students were prompted repeatedly to engage in several slightly differing activities related to their learning.

Theme 4 (2nd example): Request engagement and confirm.

Orchestration A: Arenas were used in which all students’ participation was requested and seen by the teacher.

Subject: User Interface Design: lessons 1, 4 and 12.

Pedagogic context: An online assessment was introduced in which everyone’s participation was requested and noticed, online, in real time. Students and teacher received instant feedback. Later, students were instructed to work on tasks and asked to share their progress.

“Simply put, it was a task and I completed it. I feel...relieved, as a break is coming up.” (Student D)

Contributions in forums (Google Classroom™) and assessment systems (Socrative™) confirmed to the teacher that all students had engaged, and the teacher acknowledged student contributions as they occurred. Orchestrating technologies this way meant the teacher shared the students’ experience in the virtual learning environment (VLE), or gained access to their knowledge. The variation of interaction meant students were prompted repeatedly to engage in several slightly differing activities related to their learning.

Students with different dispositions toward learning (“I feel...relieved as a break is coming up” [Student D] and “I’m excited” [Student A]) engaged in the learning activity. However, when a teacher prompted the students to log in to arenas and requested insights into the students’ learning processes, all students, regardless of disposition, engaged in those learning activities. It was also observed that students reacted positively when their efforts were acknowledged, both when they received instant confirmation and when the teacher walked around the class to check on student activity.

The observations showed that learning technologies could be used for a variety of pedagogical solutions. These included individual work, student-student interactions and student-content interactions, which at times meant that there were multiple simultaneous, learning-centred dialogues going on in the classroom. While most of the lessons were designed to include one or two longer learning activities with little variation, some lessons (such as User Interface Design lessons 1, 4 and 12, reflecting orchestration type A) included several different types of interaction. During these, the students reported their highest levels of engagement, and it was observed that certain ways of orchestrating learning technologies provided indicators of facilitated student engagement. In other words, these lesson designs were successful, but rare.

Following Goodyear and Retalis’ (2010) classification, several categories of technologies could be used to orchestrate one lesson (e.g. during User Interface Design, both technologies to access and study learning material, such as the LMS, and technologies to assess learners were used). Using technologies from within one category did not automatically mean they were orchestrated to engage students in the same way. For example, technologies enabling a learning-by-doing approach through construction and programming were included in both the Technology class and the Programming class, but how the learning was designed supported student engagement in different ways.

Furthermore, the results suggest that teachers typically repeated the way they made use of the learning technologies (e.g. in Swedish, English and User Interface Design classes), even when teaching different subjects (one teacher taught Mathematics and Physics, and one teacher taught English and Communication). That the ways of orchestrating learning technologies were repeated was observed in all lessons taught by all teachers.

Discussion

In this paper, we explored challenges and possibilities associated with engagement when learning technologies were used in the classroom.

Our findings indicate that student engagement varied during the school day, and that there were differences between teachers in the orchestration of learning technologies. While students could be engaged in lessons without learning technologies, it also seemed that when learning technologies were used, they were not always orchestrated in ways that supported all students’ possibilities to engage. Previous research on students’ engagement and their use of learning technologies (Cakir, 2013; da Rocha Seixas, Gomes & de Melo Filho, 2016; Han & Finkelstein, 2013; Pellas, 2014) has claimed that using learning technologies increases student engagement. Our findings, however, suggest that merely implementing these technologies did not automatically mean all students reported the highest level of engagement. Rather, we propose that certain ways of orchestrating learning technologies support student engagement more than others.

Invisible learning process and motivated students who do not engage in learning

It seemed that when the teacher did nothing to orchestrate the use of learning technologies, the learning process remained invisible and it was not certain that all students had been given an opportunity to engage. The lowest levels of engagement were reported and observed when no learning technologies were used or when the learning design was not characterised by thought-through planning (orchestration types D-E, Table 2), even in cases where students reported that the learning activity was fun. This was also noted by Boekaerts (2016) and Poskitt and Gibbs (2010), who explained that a motivated student does not necessarily engage. Boekaerts (2016) furthered this by explaining that a student who is unwilling, or incapable of adopting self-regulatory strategies, might still not engage in the learning activity. While echoing the notion that motivation is not always translated into action, this study highlights another potential reason for not engaging: students are not always given the opportunity to engage, and may then delegate contributions to group members and accept not learning what was offered to those who did engage. Similar to Boekaerts (2016), our findings also indicate that students who reported being “bored” as well as those who reported being “excited” engaged in the learning activity. We found that one potential reason not to engage could be that only one tool was available, and it was fit for only one user at a time. We also found that, with constant access to the Internet, students might very well be engaged with the idea of presenting a finished product to the teacher, but not in learning. As such, the student commenced a search for facts and “inspiration”, which in this case could mean that learning is bypassed (along with the teacher’s request to engage in an individual interpretation of the novel). Matched with an invisible learning process, the risk is that teachers remain unaware.

Arenas were used, in which all students’ participation was requested and seen by the teacher

Our findings suggest that learning technologies offer possibilities to support student engagement. The results indicate that high levels of engagement were associated with learning activities in which the teacher requested and noticed the participation of all students, and in which technologies were orchestrated to gain insight into students’ knowledge and learning processes. Some orchestration types meant several technologies were used in requesting and noticing all students’ participation. This variation in interactions meant students were prompted repeatedly to engage in several slightly differing activities related to their learning.

Implications

For the most part, it seemed the learning technologies were not orchestrated to support student engagement. Instead, there were plenty of instances in which the appointed learning technology was only fit for one user at a time, and during most lessons the teacher had no insight into the students’ learning processes. One interpretation of our findings is that teachers chose to orchestrate technologies the same way, instead of exploring or developing their use of learning technologies. During some lessons, we observed that the orchestration meant not all students engaged in the learning process. Such consequences are unfortunate, and pose a challenge that urgently needs to be overcome. We argue that when using learning technologies, teachers need to ensure their practices offer the possibility for all students to engage. This is in line with other studies (Chen & Jang, 2010; Giesbers et al., 2013; Warschauer et al., 2014) that have argued that how teachers integrate learning technologies and pedagogical thinking in aiming to promote engagement and learning is both fundamental and critical for successful learning. These findings are also echoed in practical suggestions on how to implement learning technologies for effective learning (Goodyear & Retalis, 2010; Grissom et al., 2017; Laurillard, 2012), which propose that both peer modelling and peer and teacher feedback are essential. We found that the indicators of the facilitation of engagement included orchestrating technologies to enable insight into the student learning process, and offering arenas in which student engagement was requested and confirmed by the teacher.

In the light of our findings, we speculate that teachers might not always be aware of how to orchestrate learning technologies to ensure all students engage. This is in line with Håkansson-Lindqvist (2015), who identified a gap between the use of digital technologies in society at large and their use in the classroom, owing to teachers’ lack of support and guidance to integrate learning technologies. In her study, Håkansson-Lindqvist (ibid.) suggested that teachers are responsible for planning learning activities in ways that sustain student motivation when using learning technologies, but that little training is provided to allow them to develop this competence. As technologies are continuously advancing, creating ever more advanced learning technologies, and with digital innovations accessible in society outside the classroom, it is likely that the gap between the use of digital technologies in society and in the classroom will continue to widen. Doing nothing to prevent this might lead to outdated teaching practices being perceived as irrelevant for and by the students of tomorrow.

Limitations

The limitations of this study are clear: it does not support claims of causality or correlation. We do not have hard data to provide evidence; instead, this study is explorative, aiming to understand the phenomena in context and to discuss qualitative implications for student engagement. It was noted that the students estimated their engagement to be at high levels, and it has been recognised that informants might overestimate their abilities (Cole, Martin, Peeke, Seroczynski & Fier, 1999). Yet, when examining studies that adopted scientific rigour and transparency, “substantive correlations between observation and self-report” have been found (Desimone, 2009, p. 189).

Conclusion

Prior research has suggested that bringing learning technologies into the classroom would have an effect on student engagement. This study contributes by sharing both challenges and possibilities associated with engagement when learning technologies were used in the classroom: not orchestrating learning technologies might, for example, allow students’ learning processes to remain invisible, and could mean that not all students engage in their learning. On the other hand, the possibilities include indicators of the facilitation of student engagement: on occasion, teachers orchestrated technologies to enable a variety of interactions to shape learning, engage all students and request insights into student learning processes. These pedagogical strategies were rare, but successful when implemented. We argue that teachers might not always be aware of how to use learning technologies to ensure students engage and learn in effective ways. Teaching practices need to ensure that learning activities offer the possibility for all students to engage. More research is needed to explore how the use of learning technologies might facilitate student engagement. Future design work could build on these findings to explore not only how technologies can be orchestrated to facilitate engagement, but also how students engage once less thought-through orchestration of technologies no longer poses a limitation.

Acknowledgement

This research forms part of the overall project “I use IT”, funded by the City of Stockholm. The aim of the project is to study the effects of digitalisation after the implementation of “one student per computer” in upper secondary schools in Stockholm.

APPENDIX A: Student diaries

These instructions were given in print:

After each lesson, the students estimated their engagement, and what motivated it:

An additional open question was also raised, varying for each lesson:

References

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45(5), 369–386. http://doi.org/10.1002/pits.20303

Boekaerts, M. (2016). Engagement as an inherent aspect of the learning process. Learning and Instruction, 43, 76–83. http://doi.org/10.1016/j.learninstruc.2016.02.001

Braun, V., & Clarke, V. (2013). Successful Qualitative Research: A Practical Guide for Beginners. London, UK: SAGE Publications.

Cakir, H. (2013). Use of blogs in pre-service teacher education to improve student engagement. Computers & Education, 68, 244–252. http://doi.org/10.1016/j.compedu.2013.05.013

Chen, K.-C., & Jang, S.-J. (2010). Motivation in online learning: Testing a model of self-determination theory. Computers in Human Behavior, 26, 741–752. http://doi.org/10.1016/j.chb.2010.01.011

Cole, D. A., Martin, J. M., Peeke, L. A., Seroczynski, A. D., & Fier, J. (1999). Children’s Over- and Underestimation of Academic Competence: A Longitudinal Study of Gender Differences, Depression, and Anxiety. Child Development, 70(2), 459–473. http://doi.org/10.1111/1467-8624.00033

Creswell, J., & Clark, V. (2011). Designing and conducting mixed-methods research (2nd ed.). Thousand Oaks, CA: SAGE Publications.

da Rocha Seixas, L., Gomes, A. S., & de Melo Filho, I. J. (2016). Effectiveness of gamification in the engagement of students. Computers in Human Behavior, 58, 48–63. http://doi.org/10.1016/j.chb.2015.11.021

Desimone, L. M. (2009). Improving Impact Studies of Teachers’ Professional Development: Toward Better Conceptualizations and Measures. Educational Researcher, 38(3), 181–199. http://doi.org/10.3102/0013189X08331140

Eccles, J., & Wang, M.-T. (2012). So What is Student Engagement Anyway? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 133–145). New York, NY: Springer. http://doi.org/10.1007/978-1-4614-2018-7_6

Finn, J. D. (1989). Withdrawing From School. Review of Educational Research, 59(2), 117–142. Retrieved from http://gse.buffalo.edu/gsefiles/documents/alumni/Fall09_Jeremy_Finn_Withdrawing.pdf

Fredricks, J., Blumenfeld, P., & Paris, A. (2004). School Engagement: Potential of the Concept, State of the Evidence. Review of Educational Research, 74(1), 59–109. http://doi.org/10.3102/00346543074001059

Fredricks, J. A., & McColskey, W. (2012). The Measurement of Student Engagement: A Comparative Analysis of Various Methods and Student Self-report Instruments. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 763–782). Boston, MA: Springer US. http://doi.org/10.1007/978-1-4614-2018-7_37

Giesbers, B., Rienties, B., Tempelaar, D., & Gijselaers, W. (2013). Investigating the relations between motivation, tool use, participation, and performance in an e-learning course using web-videoconferencing. Computers in Human Behavior, 29(1), 285–292. http://doi.org/10.1016/j.chb.2012.09.005

Goodyear, P., & Retalis, S. (Eds.). (2010). Technology-Enhanced Learning: Design Patterns and Pattern Languages. Rotterdam: Sense Publishers.

Grissom, S., McCauley, R., & Murphy, L. (2017). How Student Centered is the Computer Science Classroom? A Survey of College Faculty. ACM Transactions on Computing Education, 18(1), 1–27. http://doi.org/10.1145/3143200

Gudmundsdóttir, G. B., Dalaaker, D., Egeberg, G., Hatlevik, O. E., & Tømte, K. H. (2014). Interactive technology. Traditional practice? Nordic Journal of Digital Literacy, 2014(1), 23–43.

Han, J. H., & Finkelstein, A. (2013). Understanding the effects of professors’ pedagogical development with Clicker Assessment and Feedback technologies and the impact on students’ engagement and learning in higher education. Computers and Education, 65, 64–76. http://doi.org/10.1016/j.compedu.2013.02.002

Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. http://doi.org/10.1016/j.compedu.2015.09.005

Laurillard, D. (2012). Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology. New York, NY: Routledge.

Miles, M. B., & Huberman, A. M. (1994). Qualitative Data Analysis: An Expanded Sourcebook (2nd ed.). Thousand Oaks, CA: SAGE Publications.

Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Computers in Human Behavior, 35, 157–170. http://doi.org/10.1016/j.chb.2014.02.048

Poskitt, J., & Gibbs, R. (2010). Student Engagement in the Middle Years of Schooling (Years 7–10): A Literature Review. Wellington, New Zealand: Ministry of Education.

Reeve, J. (2012). A Self-determination Theory Perspective on Student Engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 149–172). Boston, MA: Springer US. http://doi.org/10.1007/978-1-4614-2018-7_7

Ryan, R. M., & Deci, E. L. (2000). Intrinsic and Extrinsic Motivations: Classic Definitions and New Directions. Contemporary Educational Psychology, 25(1), 54–67. http://doi.org/10.1006/ceps.1999.1020

Warschauer, M., Zheng, B., Niiya, M., Cotten, S., & Farkas, G. (2014). Balancing the one-to-one equation: Equity and access in three laptop programs. Equity & Excellence in Education, 47(1), 46–62. http://doi.org/10.1080/10665684.2014.866871
