
Evolution of a portfolio-based design in ecology: a three-year design cycle

Department of Education, University of Oslo
Department of Biological Sciences, University of Bergen

Portfolio-based designs are among the most popular student-centered approaches in higher education. While the pedagogical literature typically provides generic advice on ideal portfolio-based designs, there is little empirical data on how such designs develop over time and what design decisions teachers make in response to challenges in practice. This article provides an empirical account of a three-year design cycle of a portfolio-based ecology course at a Norwegian university. It investigates how the design changed over the years and how these changes related to the challenges the teacher met during the enactment of the course. To that end, a thematic analysis of course plans, evaluations, and interviews with the designing teacher was conducted. The findings show how the teacher introduced, removed, and (re-)configured different course components in order to address challenges related to the limited coherence between portfolio items and organized class meetings, and to students' limited engagement with the disciplinary knowledge. Thereby, the course design gradually evolved from a portfolio-based design towards a hybrid design combining portfolio, traditional exam and team-based learning. This study illustrates that portfolio-based designs offer great potential, but no guarantee, of supporting students' active engagement with knowledge, and that teachers need to actively maintain the student-centered focus of their designs by responding flexibly to emerging challenges.

Keywords: portfolios, course design, biology education

Portfolio-based course designs are among the most popular student-centered approaches in higher education. While the pedagogical literature mostly offers general advice on how portfolio assessment should be implemented, little research has examined how portfolio-based course designs develop over time and what challenges this may involve in practice. This article provides an empirical account of a three-year development of a portfolio-based ecology course at a Norwegian university. The study examines how the course design changed over these three years and what adjustments the assessment form itself triggered in the shaping of the course as a whole. Methodologically, the study is based on a thematic analysis of course plans and course evaluations, as well as interviews with the teacher. The findings show how the teacher repeatedly had to rework different course components in order to create coherence between portfolio items, teaching, and ways of working that encourage the active use of disciplinary knowledge in the portfolio work. The course design thereby gradually evolved from a purely portfolio-based course towards a hybrid solution in which portfolio assessment is combined with a traditional exam and team-based learning. The study illustrates that portfolio assessment has clear potential for learning, but that realizing this potential requires active adaptation by the teaching staff along the way. The method is therefore not in itself a guarantee of successful implementation of student-centered ways of working.

Keywords: portfolio assessment, course planning, teaching in biology

Introduction

In recent decades the use of portfolio-based design has become popular in higher education. This popularity emerged in the wake of an increased focus on creating 'student-centered learning environments' that emphasize opportunities for students to actively engage with knowledge in a student-driven manner (Land & Jonassen, 2012). Portfolios in higher education are commonly defined as 'a purposeful collection of student work that exhibits the student's efforts, progress, or achievements in one or more areas' (Paulson et al., 1991, p. 60). This focus on the students and the documentation of their learning process makes portfolio-based course designs a prime example of how one might design for student-centered learning environments.

While it is widely acknowledged that the use of portfolios is an effective way to involve students actively with knowledge, we have only limited understanding of the processes teachers actually engage in while planning and creating portfolio-based designs. Empirical research on portfolios in higher education has mostly focused on understanding how portfolios foster student learning and reflection (e.g. Lam, 2014; Tillema & Smith, 2000; Yancey & Weiser, 1997) and what tasks and assessment strategies portfolios are ideally composed of (e.g. Clarke & Boud, 2016; Dysthe & Engelsen, 2004; Tigelaar, Dolmans, Wolfhagen, & van der Vleuten, 2005). We argue that it is important to shed more light on the way teachers create and plan portfolio-based designs and what challenges teachers typically need to overcome in planning for portfolio-based courses and for activating their students more generally.

To address this gap, this article investigates the iterative changes made during the design process in a portfolio-based course over the period of three years, and the ways in which the responsible teacher addressed the challenges he met during this process. Thereby we aim to generate practical insights for practitioners who are interested in employing a portfolio-based design in their course but uncertain about how to do so. At the same time, we aim to provide an empirical account of the processes involved in designing for student-centered learning environments more generally. Drawing on a longitudinal case study of a portfolio-based undergraduate course in ecology at a Norwegian university, we address the following research questions:

  • How did the portfolio-based design change over the period of three years?

  • How did the changes in the design relate to the challenges the teacher met during the enactments of the course?

The second author was the teacher and designer of the course, while the first author studied the design process as an independent researcher. In the following, we present a review of the literature on portfolios in higher education before outlining the conceptual framework and methodology we used to address our research questions. After presenting the findings of our study, we conclude with a discussion and a number of practical recommendations for teachers who wish to design for student-centered learning environments and, in particular, portfolio-based courses.

Portfolios in higher education

Portfolios have received considerable attention in the higher education literature. The growing use of portfolios has led to a complex diversity of idiosyncratic practices and a lack of consensus concerning what exactly portfolio-based designs entail (Dysthe & Engelsen, 2011). Meeus et al. (2006) identified nearly fifty different nomenclatures describing portfolios in the literature. In the Norwegian context, the most common portfolio form is the so-called ‘disciplinary-based course work portfolio’, which usually comprises a collection of portfolio items (e.g. written assignments) that focus on the students’ mastery of disciplinary course content and ability to communicate (Dysthe & Engelsen, 2011). These portfolio items are usually closely linked to the course objectives and the disciplinary knowledge content (i.e. the syllabus), and are provided with formative feedback before the final submission of the portfolio.

A range of studies has focused on the ways portfolios foster learning and reflection. Portfolios have been shown to support active knowledge construction and self-regulation of learning among students (Lam, 2014; Tillema & Smith, 2000; Yancey & Weiser, 1997). They also help students and teachers to form communities in which they think, work and learn (Darling-Hammond & Snyder, 2000). This contributes to the development of students as continuous and reflective learners (Boud & Falchikov, 2006; Darling-Hammond & Snyder, 2000; Zeichner & Wray, 2001). Another opportunity emerging from portfolio-based designs is related to the insight that teachers gain into the learning progress of their students, which can be used to adapt the teachers’ design and teaching approaches for future course iterations.

Finally, a number of studies have focused on the components portfolio-based designs are ideally composed of in order to support student learning. Dysthe and Engelsen (2004) argue that ideal portfolio-based designs entail three phases. The first phase should engage students in various activities that result in different products, such as written assignments. In the second phase, students select which of these products they want to present, and the final phase comprises the summative assessment of the completed portfolio. Another study suggests that portfolios should be assessed in a transparent way that includes both summative and formative assessment (Tigelaar et al., 2005). Following the same argument, Clarke and Boud (2016) argue for including a peer review component in portfolio-based designs to foster the students' competence to judge their own work by receiving and giving feedback. The ability to self-assess has long been recognized as an important skill needed not only in higher education, but also in the workplace setting and for lifelong learning (Boud, 2000). Including peer review would also make portfolio-based designs more manageable for teachers, who often struggle with the high workload related to providing formative feedback on all portfolio items of their students.

The reviewed literature provides important insights into portfolio-based designs and how they contribute to making higher education more student-centered. There remains, however, a gap between our theoretical understanding and our knowledge of the actual practices of teachers when developing portfolio-based designs. In the following, we outline how our study addresses this gap by investigating the changes in a portfolio-based design in an ecology course.

Conceptual framework: course design as a continuous process

A recent strand of conceptual research on design in higher education provides compelling arguments that designing student-centered courses such as portfolio-based courses is a complex, iterative process that involves "configuration, orchestration, reflection and redesign as its main phases" (Goodyear & Dimitriadis, 2013, p. 9). This theoretical perspective has been termed 'design for learning' rather than 'design of learning'; it suggests that learning and activities in themselves cannot be designed, but can only be designed for by creating and configuring 'course components' that make such learning activities possible (Goodyear & Dimitriadis, 2013).

These designable components include tasks and the physical and social environment. Tasks are suggestions of what students should do (communicated both in written and oral form) and have a strong influence on the learning activities students are likely to engage in during the course. Physical environments refer to the material and digital objects that constitute the setting of the course activities. Social environments refer to the things that influence how course participants work together during activities (Goodyear & Dimitriadis, 2013). These ideas suggest that student-centered learning environments can be generated by configuring course components in ways that generate opportunities for students to engage actively with knowledge (for more detail, see Damşa and de Lange in this special issue). In our study, these course components serve as analytical tools to describe the characteristics of the different design iterations and how they changed over the three years.

Another central idea in the design-for-learning perspective is that design is not a project delimited in time and space. Design work starts in media res – in the middle of things – and it should be seen as a normal part of the regular flow of ongoing educational practice (Goodyear & Dimitriadis, 2013). In other words, time plays an important role in the non-linear design process that comprises several feedback loops and entry points over time. For analytical purposes, Goodyear and Dimitriadis (2013) distinguish several phases in a design lifecycle: configuration of the course components to create a specific learning environment; orchestration or real-time management of the activities during the enactment of the course; and reflection/redesign of future course iterations based on data collected about the past course activities. In this article, we focus on the reflection/redesign phase in the design lifecycle, as we are interested in exploring the changes in design that were implemented in between three course iterations, and how they related to the challenges experienced by the teacher during the enactment of the course.

An important part of the reflection/redesign phase is the identification and addressing of challenges that may have emerged during the enactment of the course (i.e. the orchestration phase). In student-centered designs, challenges are often related to the multiple and sometimes competing intended learning outcomes and learning activities students are supposed to engage in. These challenges are expected to be particularly salient in the context of portfolio-based designs, which require the teacher to design a number of portfolio items that need to be aligned with each other as well as with the course's learning outcomes, the assessment forms, and the organized class meetings. The conceptual notions outlined here serve as a framework to analyze our empirical data and will be further elaborated in the methods section.

Methods

Case study of an ecology course

The data are drawn from a case study of a semester-long introductory ecology course (10 ECTS) at a Norwegian university that used a portfolio-based design. This case was part of the wider QNHE[1] project, which aimed at identifying issues that matter for the quality of educational practices in higher education. The course was designed and taught by the second author, and offered yearly in the spring semester. It was first introduced as a portfolio assessment in spring 2015 and repeated annually in the consecutive years. In each iteration, the second author was supported in his teaching by up to four co-teachers, who each held a PhD with a specialization in ecology. Due to its introductory nature, the course had a diverse student intake and enrolled students from the local bachelor's and master's programs in biology, student teachers, students from other natural science degrees, and exchange students. The number of students varied from 10 in the first year to 27 in the second and 38 in the third.

Data collection

The data were collected by the first author only, while the second author remained in a participant role during the data collection phase. The dataset comprises interview data, course plans, course evaluations and observational data. The second author was interviewed three times, twice between the design iterations and once after the third iteration, to gain information about design decisions and experiences. Each interview lasted about one hour and was transcribed verbatim. The course plans were the official documents used to inform students about the aims, learning activities and schedules of each course iteration. The evaluation reports refer to the official documents each teacher is required to issue as part of the quality assurance system[2] at the university. The reports also comprised student evaluations collected through a survey at the end of each iteration, as well as the second author's and his co-teachers' reflections on the challenging aspects of each teaching phase and the design changes planned for the next year. This data set is supplemented by observational data from the enactment of the second design iteration, which was the central data collection period in the overarching QNHE project. Further supplementary data stem from interviews with three student groups about their experiences with the course activities in the second design iteration. These data sources allowed us to gather information about the teacher's design decisions and the challenges he experienced.

Analytical strategy

The data were analyzed in several steps. To address the first research question, the first author mapped the course components through a thematic analysis (Braun & Clarke, 2006) of the course documents and interview transcripts. Each design iteration was described in terms of its tasks and its physical and social environment. This analysis revealed the changes in the way the different course components were (re-)configured during each reflection/redesign phase. The generated overview was validated and supplemented by the second author.

For the second question, the first author analyzed the interview data to distill the core challenges the teacher experienced during the enactment of the course and how they related to the configuration of course components during the design process. This comprised two steps: first, the analysis of which challenges were identified by the teacher during the enactment of the course; and second, the analysis of how these challenges were addressed through the changes in the redesign phases. In dialogue with the second author, these findings were further elaborated and validated.

Findings

Changes of the portfolio-based design across the three iterations

Across all iterations, the teacher introduced changes aimed at facilitating active student engagement with the disciplinary knowledge content of the course. The tasks – or suggestions of activities students were to engage in – took the form of different portfolio items (see Appendix 1 for a more detailed overview). These items required students to engage in activities such as developing problem statements, solving problems, searching, reading and making meaning of relevant literature, using software to analyze given data sets, and writing academic texts. As the portfolio items were the basis for the final assessment, the students had to engage in all of these activities in order to pass the course. In addition, the teacher designed optional tasks, such as participating in the organized class meetings (lectures, tutorials) and making use of the offered feedback. These activities, however, were not mandatory and not required for the final assessment.

In order to support students in the suggested activities, the teacher planned the physical environment by organizing class rooms for lectures and tutorials and giving access to a learning management system, software, textbooks and other relevant knowledge resources; and he arranged the social environment by inviting several co-teachers to teach and offer feedback in the course, and by specifying for each task whether students were to work individually or in groups.

In each design iteration, these student-centered learning environments took slightly different forms, which created different conditions for the activities students eventually engaged in during the enactment of the course. These design decisions were influenced by the challenges the teacher experienced during the enactment of the course and, in turn, every decision influenced the kind of challenges that emerged in future iterations.

Based on our thematic analysis, we created a tabular overview (see Table 1) of the course components that emerged as most relevant for illustrating the changes across the three iterations of the portfolio-based design: teachers, learning outcomes, portfolio items, final assessment, feedback, and organized class meetings. The first column lists these components, the second shows their initial configuration in the first design iteration (2015), and the remaining two columns highlight the changes made in the two subsequent iterations (2016 and 2017).

Table 1: Overview of course components and their changes across the three design iterations

| Course component | Initial design iteration (2015) | Second design iteration (2016) | Third design iteration (2017) |
|---|---|---|---|
| Involved teachers | Main teacher; 4 co-teachers | No change: same teachers remain | Change: from 4 to 2 co-teachers |
| Learning outcomes | Establish basic knowledge of ecological theories; develop skills in analytical methods | Change: more focus on transferrable skills like academic writing and critical thinking | Change: more focus on disciplinary content knowledge |
| Portfolio items | Portfolio with up to 6 items: 3 quantitative group exercises with individual reports afterwards; 6 individual assignments (3 topics from the textbook, 3 from group projects) | Change: increase to 12 items: 1 open essay; 3 group assignments; 5 textbook assignments; oral presentation; peer review of one open essay; reflection piece (not realized) | Change: decrease to 4 items: 1 open essay; 2 group assignments; peer review of two open essays |
| Final assessment | Final grade based on the 4 best submissions of portfolio tasks | Change: final grade based on all items (weighted average) | Change: introduction of a final oral exam; final grade based on the final exam (60%) and the portfolio items (40%) |
| Feedback | Written feedback on the final version of all assignments; general video feedback on one group report | Change: more written feedback on draft versions during the course; introduction of peer review on the open essay; introduction of oral feedback sessions; no more video feedback | Change: more continuous dialogue online; no more oral presentation; fewer oral feedback sessions |
| Organized class meetings | 32 h of lectures; 12 h of tutorials on software | Change: decrease to 16 h of lectures; decrease to 8 h of tutorials on software | Change: increase to 36 h of class meetings; introduction of the TBL format |
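To make the evolution of the final assessment concrete, the following minimal sketch (in Python) illustrates the three grading schemes summarized in Table 1. The scores, the equal weighting of portfolio items within each scheme, and the function names are illustrative assumptions on our part; only the 'four best items', 'all items', and '60/40' rules are taken from the course documents.

```python
"""Minimal sketch of the three final-assessment schemes summarized in Table 1.

Illustrative assumptions (not specified in the course documents): portfolio
items are scored on a 0-100 scale and weighted equally within each scheme.
"""


def grade_2015(item_scores):
    """Initial design: final grade based on the 4 best submitted portfolio items."""
    best_four = sorted(item_scores, reverse=True)[:4]
    return sum(best_four) / len(best_four)


def grade_2016(item_scores):
    """Second design: final grade based on all portfolio items (equal weights assumed)."""
    return sum(item_scores) / len(item_scores)


def grade_2017(exam_score, item_scores):
    """Third design: 60% final oral exam, 40% portfolio items."""
    portfolio_average = sum(item_scores) / len(item_scores)
    return 0.6 * exam_score + 0.4 * portfolio_average


if __name__ == "__main__":
    # Hypothetical scores for a single student.
    six_items = [72, 65, 80, 90, 55, 60]
    four_items = [75, 68, 82, 70]
    print(grade_2015(six_items))                              # 76.75 (mean of the four best items)
    print(grade_2016(six_items))                              # 70.33... (mean of all items)
    print(grade_2017(exam_score=78, item_scores=four_items))  # 76.3 (0.6 * 78 + 0.4 * 73.75)
```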

Two challenges and how they were addressed in the design iterations

The interviews with the second author and his reflections showed his continuous effort to identify challenges and to address them through adjustments of the course design. Among the most prominent challenges were a) the limited coherence between portfolio items and organized class meetings and b) the limited student engagement with the disciplinary knowledge. The following sections outline the way these challenges manifested themselves and how the teacher addressed them through changes in the course design.

Challenge 1: Limited coherence between portfolio items and organized class meetings

This challenge was related to the coherence between portfolio items and organized class meetings. The teacher's original idea to use a portfolio-based design was related to the possibility of creating several tasks that required students to engage in a variety of learning activities. The intended learning outcomes in the initial iteration included the establishment of basic knowledge of ecological theories and the development of skills in analytical methods important for ecology. To meet these outcomes, the teacher had designed six portfolio items that required students to write three individual reports based on quantitative modeling exercises (solved in groups) and three other individual essays on broader topics related to the textbook and additional journal articles. The organized class meetings in the first iteration comprised lectures that employed mostly traditional forms of frontal teaching and that served to provide (1) disciplinary knowledge complementing the knowledge students engaged with through the textbook, and (2) physical meeting points for the students, who were otherwise to work mostly on their own.

In his evaluation of the first iteration, the teacher identified the challenge that students struggled to see the connection between the lectures and the assignments they were working on. In response, the teacher explored the possibility of reducing the traditional lectures and including more "possibilities for on-demand lecturing" (course plan 2016), in which students could ask for help and further instruction if needed. This was intended to generate more time for the students' independent work on the portfolio items, which the teacher considered a more valuable activity than making students attend lectures whose relevance they could not see. The lectures were not completely abandoned, however, as they were still considered important meeting points for the class. Based on these considerations, the teacher reduced the number of frontal lectures in the second iteration and integrated a formative feedback loop in which students had the opportunity to get on-demand feedback from the teacher on their work.

In the evaluation of the second design iteration, the challenge had shifted slightly but remained the same at its core. The increased number of portfolio items resulted in students spending most of their time on learning activities of an independent and solitary nature. While students appreciated the opportunity to receive tailored feedback on their assignments, it remained difficult for them to see the relevance of the remaining lectures for their portfolio tasks, resulting in decreased lecture attendance. One student explained that 'the lectures didn't really encourage you to attend them, because … you didn't really get valuable information that you could put into the … assignments'. The teacher remarked that the amount of face-to-face contact with the students eventually became so limited, and the use of online communication so extensive, that the course had inadvertently taken on the character of an online course.

For the third design iteration, therefore, the teacher attempted to solve the challenge of the limited coherence between portfolio tasks and organized class meetings by taking two measures: adding a final exam to the portfolio items, and using a new interactive lecture format based on team-based learning (TBL) (Michaelsen & Sweet, 2008). This new format made use of frequent quizzes, Immediate Feedback Assessment Technique (IF-AT) scratch cards and group discussions around topics from the textbook. Introducing a final exam made it possible for the teacher to establish a clear link between the final assessment and the – now weekly – interactive class meetings that were aimed at helping students prepare for the exam. The positive student evaluations of this third design iteration indicated that the students found it easier to see the relevance of the class meetings.

Challenge 2: Limited student engagement with the disciplinary knowledge

Another main challenge was related to the limited student engagement with the disciplinary knowledge in the course. The whole idea of letting students work on different portfolio items had been aimed at ensuring that students would become familiar with the diverse disciplinary knowledge that forms the basis of the field of ecology. During the first iteration, however, the teacher experienced that students did not engage as thoroughly with these basic ecological concepts and theories as he had intended in the learning outcomes of the course. He realized that this was related to the way the final assessment was organized (grade calculated based on the four best submitted portfolio items), which led most students to engage with the disciplinary knowledge only to the extent necessary to submit the minimum number of items required to pass the course.

In response, in the second iteration the teacher increased the number of portfolio items and changed the final assessment into a final grade based on the weighted average of all portfolio items. This was an attempt to make it necessary for students to engage more deeply with the course’s disciplinary knowledge. While the teacher observed that the students did indeed engage more with the assigned reading than in the previous iteration, the challenge remained to a certain extent due to the students’ tendency to cherry-pick relevant parts from the textbook material instead of studying and reading more coherently.

Another consequence of the additional tasks was the increased complexity of the different activities students were to engage in. On the one hand, some portfolio items required students to show autonomy and independence in developing their skills to work scientifically, perform analyses and solve problems on their own. At the same time, other items were about certifying or ensuring the students’ understanding of ecological concepts. The relation between these two purposes remained unclear to many students, who wondered whether they were supposed to ‘learn about ecology or learn how to do ecology?’

After becoming aware of these partly conflicting learning outcomes of the course (i.e. transferrable skills vs. basic ecological knowledge), the teacher attempted to re-calibrate the alignment between the course's intended learning outcomes, the portfolio tasks and the assessment. In the third iteration, therefore, the teacher opted for a profound change in the assessment form. On the one hand, he kept part of the original portfolio structure (i.e. the open essay and two group assignments) to assess the students' development of transferrable skills such as scientific thinking and academic writing. On the other hand, he introduced a final oral exam to assess how students had engaged with the basic knowledge content of ecology. The evaluation of the third design iteration indicated that students did indeed engage more comprehensively with the textbook – and presumably the disciplinary knowledge content – in preparation for the exam.

Evolution from a portfolio-based design into a hybrid design

This study has provided insight into the iterative changes made during the design process in a portfolio-based course over a period of three years and into how the teacher addressed the challenges he met during this process. Addressing the first research question, the findings revealed that the design changed in the following way over the three design iterations. The first iteration was characterized by a two-fold focus on frequent lecture-based class meetings and a six-item portfolio, of which only the four best items were included in the final portfolio assessment. In the second iteration, the teacher expanded the number of portfolio items and introduced a formative feedback loop through which students could get help on demand during the semester. At the same time, he reduced the number of lectures considerably and changed the final assessment to an average grade across all portfolio items. In the third iteration, finally, the teacher introduced a final exam and regular class meetings employing new work forms (i.e. team-based learning). He concurrently reduced the number of portfolio items, which turned the design into a hybrid form that draws on various pedagogical principles, i.e. traditional exam, portfolio and TBL.

These findings show the gradual evolution of a course design that had started out as a 'disciplinary-based course work portfolio' (Dysthe & Engelsen, 2011). In a constant effort to maintain a student-centered focus, a 'watering down' of the principles of the originally intended portfolio-based design took place and an organic integration with other pedagogical approaches (e.g. exam, TBL) occurred. This example of design evolution provides an interesting illustration of how the wide variety of portfolio types we find in today's higher education (Meeus et al., 2006) has emerged out of the continuous (re-)design processes teachers engage in across our higher education institutions (Goodyear & Dimitriadis, 2013).

Concerning the second research question of how the changes in the design related to the challenges the teacher encountered during the enactments of the course, the findings showed two challenges that were at the core of many changes: (1) the limited coherence between portfolio items and organized class meetings, and (2) the limited student engagement with the disciplinary knowledge of the course. The teacher addressed these challenges by either adjusting the envisioned learning outcomes, introducing new course components (e.g. formative feedback, final exam) or adjusting existing course components (e.g. expanding or decreasing extent of portfolio items).

These challenges may be considered common for portfolio-based designs, which are typically characterized by an array of different tasks and learning activities (Dysthe & Engelsen, 2011). This complexity makes it particularly demanding for teachers to calibrate the relations between the different course components and to predict how the students will respond to certain calibrations during the enactment of the course. These findings illustrate that portfolio-based designs may have great potential, but offer no guarantee of supporting students' active engagement with knowledge.

As shown by our case study, the original portfolio-based design that had been chosen as a way to create a student-centered learning environment did not quite serve that purpose in the way the teacher had intended. He therefore modified the design by extending, reducing, or eliminating different course components in an attempt to create learning environments that were truly tailored to the students' needs. In that sense, only those components that proved to be relevant for engaging students in the desirable learning activities 'survived' throughout the design iterations. This design process was further complicated by the fact that one component could not be changed without having an effect on the other components. For example, the introduction of a final exam changed the relevance of the other portfolio items and thereby the ways students engaged with those items. These reciprocities were not always apparent beforehand, and re-configurations often took the form of pedagogical experiments. These findings are in line with notions of 'design for learning' (Goodyear & Dimitriadis, 2013) and show that an important part of creating student-centered learning environments is to experiment and respond flexibly to emerging challenges, rather than to select a student-centered design from a textbook template and then follow its principles religiously.

Our findings illustrate how important it is for the teacher to maintain a focus on the students in redesigning the course. This focus requires a continuous orientation towards what students need and how they have experienced the previous course iterations. It requires the teacher to experiment with different tasks and social and physical structures, and to critically evaluate whether a specific configuration leads to the desired learning activities among students or not. When some students do not engage with the activities as envisioned, it is important to search for new and more productive design configurations rather than to assume the fault rests with the disengaged students. This student-centered mindset also involves opening up to new and unfamiliar pedagogical practices when it becomes clear that students do not respond to the previous design decisions as expected.

Concluding recommendations

Based on our findings we now outline a number of recommendations for teachers who are interested in taking a student-centered approach and in employing a portfolio-based design. In line with our conclusions above, we do not attempt to provide an ideal design template, but rather wish to highlight relevant aspects that should be taken into account during the design process.

First, designing a portfolio-based course requires more than developing different tasks. It is also important to envision the different activities students are likely to engage in when working on these tasks and what students will need in order to engage with the relevant disciplinary knowledge. By maintaining the focus on the students' activities, it becomes possible to envisage and calibrate the elements in the physical and social environment that are required for students to succeed. One may ask questions such as: What resources will students need? What forms of work will help them understand a task? Or, what forms of feedback will be useful at what times during the course? In responding to such questions, it may become clear that certain changes will require adaptations of other elements in order to serve their intended purpose. Especially in the case of portfolio-based designs, we argue, it is therefore important to think of course components such as the portfolio items and the organized class meetings as an inseparable whole in which any change will have an effect on the other elements.

Second, designing a student-centered course implies that decisions should not be based on what teachers know best and which pedagogical approaches they are most familiar with. Instead, it is important to design for the students' learning, and design decisions should be based on what works best for the students. This also implies that challenges should not be formulated in relation to what the teacher finds difficult, but in relation to what students seem to find difficult. We argue that taking such a student-centered mindset makes it easier to identify the core issues in a course design and how they might be addressed productively.

Third, challenges and unexpected student responses during the course should not be shut down or ignored, but instead embraced as important learning opportunities that can help develop the course further towards becoming more student-centered. By engaging deliberately with design work and reflecting on the components of each course iteration that worked or did not work, teachers might develop new insights into their own pedagogical conceptions, and into the role their course could play in the wider study program and in preparing students for the workplace.

Finally, new ideas for alternative course components and ways of configuring them can be found in the literature on higher education pedagogy and through discussions with colleagues. However, our study has also shown that every 'ideal version' of a pedagogical approach or assessment design found in the pedagogical literature needs to be tailored to the concrete context. In conclusion, we argue that sharing the challenges we encounter in our design work, and how they might be overcome, is an important element of improving the quality of higher education.

Acknowledgements

This study was carried out in the context of the QNHE project (Quality of Norwegian Higher Education; www.qnhe.no), funded by the Norwegian Research Council. We would like to thank the participating students in the study for allowing us to gain insight into their work.

Appendix 1

References

Boud, D. (2000). Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151–167.

Boud, D., & Falchikov, N. (2006). Aligning assessment with long-term learning. Assessment & Evaluation in Higher Education, 31(4), 399–413.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. DOI: https://doi.org/10.1191/1478088706qp063oa.

Clarke, J. L., & Boud, D. (2016). Refocusing portfolio assessment: Curating for feedback and portrayal. Innovations in Education and Teaching International, 0(0), 1–8. DOI: https://doi.org/10.1080/14703297.2016.1250664.

Darling-Hammond, L., & Snyder, J. (2000). Authentic assessment of teaching in context. Teaching and Teacher Education, 16(5), 523–545. DOI: https://doi.org/10.1016/S0742-051X(00)00015-9.

Dysthe, O., & Engelsen, K. (2004). Portfolios and assessment in teacher education in Norway: a theory‐based discussion of different models in two sites. Assessment & Evaluation in Higher Education, 29(2), 239–258. DOI: https://doi.org/10.1080/0260293042000188500.

Dysthe, O., & Engelsen, K. S. (2011). Portfolio practices in higher education in Norway in an international perspective: macro‐, meso‐ and micro‐level influences. Assessment & Evaluation in Higher Education, 36(1), 63–79. DOI: https://doi.org/10.1080/02602930903197891.

Goodyear, P., & Dimitriadis, Y. (2013). In medias res: reframing design for learning. Research in Learning Technology, 21(1), 19909. DOI: https://doi.org/10.3402/rlt.v21i0.19909.

Lam, R. (2014). Promoting self-regulated learning through portfolio assessment: testimony and recommendations. Assessment & Evaluation in Higher Education, 39(6), 699–714. DOI: https://doi.org/10.1080/02602938.2013.862211.

Land, S., & Jonassen, D. (Eds.). (2012). Theoretical Foundations of Learning Environments (2nd ed.). New York: Routledge.

Meeus, W., Van Petegem, P., & Van Looy, L. (2006). Portfolio in higher education: Time for a clarificatory framework. International Journal of Teaching and Learning in Higher Education, 17(2), 127–135.

Paulson, F. L., et al. (1991). What makes a portfolio a portfolio? Educational Leadership, 48(5), 60–63.

Tigelaar, D. E. H., Dolmans, D. H. J. M., Wolfhagen, I. H. A. P., & van der Vleuten, C. P. M. (2005). Quality issues in judging portfolios: implications for organizing teaching portfolio assessment procedures. Studies in Higher Education, 30(5), 595–610. DOI: https://doi.org/10.1080/03075070500249302.

Tillema, H. H., & Smith, K. (2000). Learning from portfolios: Differential use of feedback in portfolio construction. Studies in Educational Evaluation, 26(3), 193–210. DOI: https://doi.org/10.1016/S0191-491X(00)00015-8.

Yancey, K., & Weiser, I. (Eds.). (1997). Situating Portfolios: Four Perspectives. Logan: Utah State University Press. Retrieved from https://digitalcommons.usu.edu/usupress_pubs/118.

Zeichner, K., & Wray, S. (2001). The teaching portfolio in US teacher education programs: what we know and what we need to know. Teaching and Teacher Education, 17(5), 613–621. DOI: https://doi.org/10.1016/S0742-051X(01)00017-8.

[1] QNHE: Quality of Norwegian Higher Education. For more information, see www.qnhe.no.
[2] Evaluation reports and course plans of this course can be accessed via https://kvalitetsbasen.app.uib.no/, under BIO201.
