In 2015 there is broad agreement that teacher education has to reflect what is going on in practice, develop sustainable partnerships with schools, and continuously strive to increase teaching quality for student teachers. The bigger challenge is how to realize these good intentions in the digital era. The articles in this and the previous issue (4/2014) show examples of the important steps we need to take. We need research-based knowledge within teacher education on how to develop sustainable partnerships and how to improve teaching quality in lectures, seminars, tutorials and periods of practical training alike. This implies a need for methodological improvement and innovation within teacher education research, with a greater degree of user involvement and with technology seamlessly integrated into teaching interventions. An interesting example of such a research design is Allan Collins's and Ann Brown's design experiments (Collins, 1992; Brown, 1992), later developed into design-based research (DBRC, 2003). Such research designs, combined with dedicated teacher educators who value teaching and research equally, can provide new research knowledge of how to improve teaching quality when ICT is seamlessly integrated in teaching interventions. A good example of such a dedicated lecturer is Professor Carl Wieman of Stanford University, a renowned scientist who, after winning the Nobel Prize in Physics in 2001, devoted his career to undergraduate science education at his university (he is also science advisor to President Obama). In Wieman and colleagues' study "Why Peer Discussion Improves Student Performance on In-Class Concept Questions" (Smith et al., 2009), published in the prestigious journal Science, we find examples of how technology and peer-discussion interventions are seamlessly integrated in the design.
To address the study's overall research question and carry out the teaching interventions, they used a digital student response system ("clickers") in large lectures (N = 350). Interestingly, they found that peer discussion enhances understanding even when none of the students in a discussion group initially knows the correct answer. In a similar study published in Science, "Improved Learning in a Large-Enrollment Physics Class" (Deslauriers, Schelew & Wieman, 2011), Wieman and colleagues found an effect of 2.5 standard deviations for interactive instruction with active use of clickers. A core message from this study is that teaching methods matter and have a great impact on undergraduate students' learning outcomes (Deslauriers, Schelew & Wieman, 2011). These studies are examples of dedicated lecturers carrying out research in their own courses in order to improve teaching quality, based on solid research findings rather than anecdotal evidence. They also show the potential new technology offers for both teaching and research: without the digital technology (clickers), Wieman and colleagues could not have carried out this kind of teaching intervention, or collected the data in situ "there and then", with several hundred students. The studies first of all enhance our understanding of the value of peer discussion and interactive teaching for students' learning processes in large lectures, and they build on previous research showing that clickers in large lectures have an effect size of 0.40 (Mayer et al., 2009; Hattie, 2012). Mayer et al. (2009) explain why this kind of technology, with adjunct questions, seems to enhance learning outcomes:

The act of trying to answer sample questions and then receiving immediate feedback may encourage active cognitive processing in three ways:

  (a) before answering questions, students may be more attentive to the lecture material;

  (b) during question answering, students may work harder to organize and integrate the material;

  (c) after receiving feedback, students may develop metacognitive skills for gauging how well they understood the lecture material and for how to answer exam-like questions (Mayer et al., 2009, p. 56).

In many ways, we find several elements of formative assessment in this kind of teaching intervention, building on the finding that constructive feedback and formative assessment have great value for teaching quality and students' learning outcomes (Hattie & Timperley, 2007). In a broader sense, the studies by Wieman and colleagues and by Mayer and colleagues indicate that "no technology is in itself formative but almost every type of technology can be used in a formative way" (Pachler et al., 2009, p. 21). We find the same in our own studies of student response systems (clickers), which have shown an interesting potential for handling elements of formative assessment in lectures, provided the teaching design is well developed and pedagogically solid (Krumsvik & Ludvigsen, 2012). To gain more in-depth knowledge of formative assessment and student learning in digital learning environments, learning analytics seems likely to become a significant area in the years to come.

Teacher education today has the opportunity to take some of the same steps, revitalizing our traditional perception of teaching and lectures and "bringing the reality into the lecture halls", for example by having student teachers analyze authentic teaching situations from school classrooms through video cases. We might also reconsider our traditional perceptions: is the concept of a "lecture" today more a stereotype with certain negative connotations than a description of what actually goes on in some of the auditoriums of higher education? Eric Mazur (Harvard University), Richard Mayer (University of California) and Carl Wieman (Stanford University) have shown that it is possible to turn lectures almost into large-class workshops, with peer discussion and technology seamlessly integrated, which is quite a step away from the traditional perception of the lecture as "chalk and talk": monologue, no interactivity and passive students. To achieve this, however, teacher education must examine and design lectures not in isolation, but as a coherent part of a larger pedagogical teaching design, along a continuum with seminars, peer groups, periods of practical training, tutorials, etc. as vital elements (e.g. using teaching methods such as flipped learning). This requires more planning and collaboration, both within teacher education and in partnerships with teachers and schools. It also requires new competencies, both for student teachers and for teacher educators. There is today a gap between what new teachers need in the field of practice (Krumsvik et al., 2013) and what teacher education prepares them for concerning digital competence (Tømte et al., 2013).
Thus, digital competence among student teachers and teacher educators within teacher education seems to be an area that needs more attention, as it has become a requirement for handling academic skills, teaching interventions and research (as shown above), as well as for coping in working life afterwards as new teachers. In our recently published literature review in this journal on digital competence among student teachers in teacher education (Røkenes & Krumsvik, 2014), we find that the area is still in its infancy, with a wide range of interpretations of what digital competence among student teachers is (or should be). In order to use the range of different technologies pedagogically and increase teaching quality in teacher education, the development of professional digital competence seems to be an important precondition (as examined in both this and the previous issue). Manuel Castells reminds us not to forget that in the end "…educational technology in general, is only as good as the teacher who uses it" (Castells, 2001, p. 258).


Brown, A.L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141–178.

Castells, M. (2001). The Internet Galaxy. New York: Oxford University Press.

Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15–22). New York: Springer.

Design-Based Research Collective (2003). Design-Based Research: An Emerging Paradigm for Educational Inquiry. Educational Researcher 32 (1), 5–8.

Deslauriers, L., Schelew, E. & Wieman, C. (2011). Improved Learning in a Large-Enrollment Physics Class. Science, 332, 862–864.

Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81–112.

Hattie, J. (2012). Visible Learning for Teachers: Maximizing Impact on Learning. New York: Taylor and Francis.

Krumsvik, R., & Ludvigsen, K. (2012). Formative assessment in plenary lectures. Nordic Journal of Digital Literacy, 7(1), 36–54.

Krumsvik, R.J., Egelandsdal, K., Sarastuen, N.K., Jones, L. & Eikeland, O.J. (2013). Sammenhengen mellom IKT-bruk og læringsutbytte (SMIL) i videregående opplæring. Sluttrapport. Oslo/Bergen: KS/Universitetet i Bergen.

Mayer, R.E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., Bulger, M., Campbell, J., Knight, A., & Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34(1), 51–57.

Pachler, N., Mellar, H., Daly, C. et al. (2009). Scoping a vision for formative e-assessment: A project report for JISC. Version 2, April.

Røkenes, F. M. & Krumsvik, R. J. (2014). Development of Student Teachers' Digital Competence in Teacher Education – A Literature Review. Nordic Journal of Digital Literacy, 9(4), 250–280.

Smith, M.K., Wood, W.B., Adams, W.K., Wieman, C., Knight, J.K., Guild, N., & Su, T.T. (2009). Why Peer Discussion Improves Student Performance on In-Class Concept Questions. Science, 323, 122–124.

Tømte, C., Kårstein, A. & Olsen, D.S. (2013). IKT i lærerutdanningen. På vei mot profesjonsfaglig digital kompetanse? Rapport 20/2013. Oslo: NIFU.