Using Student-Centered Assessment to Enhance Learning
Essays on Teaching Excellence
Toward the Best in the Academy
Vol. 17, No. 8, 2005-2006
For many years now, postsecondary educators have used a variety of student-centered learning methodologies to enhance student learning (DeBoer, 2002; Norte, 2005; Scott & Buchanan, 1998). Unfortunately, many instructors who incorporate these approaches still rely on assessment methods designed for traditional teaching. Research shows, however, that assessment methods should also be student-centered (Ma & Zhou, 2000). Indeed, one can hold students to higher performance standards when they play a role in establishing assessment criteria that are clear and reasonable (Shepard, 2000).
To be considered student-centered, an assessment technique should directly involve students in examining their own cognitive development by having them focus on learning first and the grade second (Pedersen & Liu, 2003). Strategies should be engaging and interactive while incorporating sharing, trusting, teambuilding, reflecting, helping, and coaching (Pitas, 2000). Thought must also be given to whether student-centered assessments are individual, team-based, or a combination of the two.
As examples of student-centered assessment, consider how two courses were implemented. In the course Introduction to Hospitality and Tourism Management at Paul Smith’s College (Jacobs, La Lopa, & Sorgule, 2001) and in the senior capstone course Tourism Business Feasibility Studies at Purdue University (La Lopa, 2004), students were required to use a team-based structure in the classroom to comprehend, synthesize, apply, and evaluate the course content while developing a tool to help establish their own grades.
The student teams in the introductory course developed two summative assessment tools during the semester (as part of the larger assessment strategy the teacher had planned). They created an exam that was used to evaluate their overall learning in the course, as well as a tool to evaluate the contributions made by everyone involved. The instructor introduced students to the basics of sound exam development, various examination models, and test-question development exercises. When the exams were administered, a rich dialogue about course materials and content ensued. The students enjoyed the challenge of the assignment so much that they requested the chance to develop another one.
In the capstone course the student teams worked on a semester-long feasibility project for the local Convention and Visitors Bureau to create a marketing campaign designed to attract tourists to the area. The student teams presented their ideas to a panel of industry experts at the end of the semester. The presentations were then graded by the panel members using an assessment tool created by the students. In creating the tool, students were required to work through all six levels of Bloom’s (1956) cognitive domain to learn about and develop higher-order thinking.
In another team-based approach, students read whichever assigned chapters they like as long as all are covered by the team. Members from each team are then asked to present their understanding of the chapter to the rest of the class, in any creative or traditional way they like, so that the students and instructor can openly assess what was learned. The instructor uses this feedback to determine whether the students have read the chapter, synthesized the information, and applied what was learned in the context of their assignment.
One of the challenges of adopting student-centered assessment strategies is students’ misconceptions about what it entails (Hewitt-Taylor, 2001). The teacher needs to orient students to the learning and assessment methodologies so that they understand why student-centered approaches are being used, with the intended benefits clearly articulated. In addition to getting students excited about innovative education, this orientation helps deflate the common complaint that the teacher is not teaching anything in the traditional sense.
Another challenge relates to whether one’s peers are engaging in innovative assessment approaches as well. “Lone wolf” reformers may experience more difficulties than successes. According to Huba & Freed (2000), the lone reformer may wind up distancing himself or herself from the other faculty members; confusing students who have their own ideas about how they should be taught; or finding out that student-centered assessment is harder than previously imagined, especially when implementing it for the first time. With regard to student challenges, some students took advantage of the assignment by not doing their fair share of the work. Also, the Purdue students balked initially at being required to develop the assessment tool, mostly because they had never done it before.
A final challenge is addressing the concerns of those who contend that taking class time to develop student-centered assessment somehow detracts from the content of the course. If done properly, however, time spent in class on assessment development is a necessary prelude to deeper learning. Indeed, the greater danger is not spending enough time: such under-preparation can cause frustration and disorientation among students who are accustomed to organized learning environments (Brush & Saye, 2001).
The best advice for those who might want to create a student-centered assessment is simply to pilot one. One way to approach it is to use a continuous improvement tool developed by Deming (1982) known as Plan, Do, Study, Act (PDSA). A meta-assessment tool, PDSA helps determine if a given assessment is effective. With PDSA, the teacher must first Plan the means by which to develop a student-centered assessment activity and reflect upon its potential benefits to the student. The next step is to Do the assessment exactly as planned. Once the assessment has been piloted the next step is to Study whether or not it provided the intended benefits to the students. The teacher then Acts on what was learned in the Study step before using it again. For example, if the student-centered assessment activity delivered the intended benefits to the students, the teacher might implement it again for the next class, or add another one like it to the assessment repertoire. Conversely, if the pilot version had problems, the teacher would make the appropriate adjustments before implementing it a second time via Plan, Do, Study, Act.
The second and third pieces of advice are to inform the department chair and to identify others in the department who already incorporate, or who might be open to experimenting with, similar methods. Meeting with the chair to explain the reasons for adopting student-centered assessment may help him or her defend the new pedagogy (or so it is hoped) should colleagues and students question or complain about the change. Identifying peers on campus, or even on other nearby campuses, who employ similar pedagogies will also provide a sounding board and a support network to sustain one’s efforts to put students increasingly in charge of assessing, and thus guiding, their own learning.
Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook 1: Cognitive domain. New York, NY: McKay.
Brush, T., & Saye, J. (2001). The use of embedded scaffolds with hypermedia-supported student centered learning. Journal of Educational Multimedia and Hypermedia, 10(4), 333-356.
DeBoer, G. E. (2002). Student-centered teaching in a standards-based world: Finding a sensible balance. Science & Education, 11, 405-417.
Deming, W. E. (1982). Out of the crisis. Cambridge, MA: Massachusetts Institute of Technology, CAES.
Hewitt-Taylor, J. (2001). Self-directed learning: Views of teachers and students. Journal of Advanced Nursing, 36, 496-504.
Huba, M. E., & Freed, J. E. (2000). Learner-centered assessment on college campuses. Needham Heights, MA: Allyn and Bacon.
Jacobs, J. M., La Lopa, J. M., & Sorgule, P. (2001). Pilot-testing a student-designed team exam in an introduction to hospitality and tourism education course. The Journal of Hospitality and Tourism Education, 13, 113-120.
La Lopa, J. M. (2004, Winter). Developing a student-based evaluation tool for authentic assessment. In M. V. Achacoso & M. D. Svinicki (Eds.), Alternative strategies for evaluating student learning (pp. 31-36). New Directions for Teaching and Learning, 100.
Ma, J., & Zhou, D. (2000, May). Fuzzy set approach to the assessment of student-centered learning. IEEE Transactions on Education, 43, 237-241.
Norte, M. B. (2005). Self-access study and cooperative foreign language learning through computers. Linguagem & Ensino, 8, 145-169.
Pedersen, S., & Liu, M. (2003). Teachers’ beliefs about issues in the implementation of a student-centered learning environment. Educational Technology Research & Development, 51(2), 57-76.
Pitas, P. A. (2000, Winter). A model program from the perspective of faculty development. Innovative Higher Education, 25, 97-110.
Scott, J., & Buchanan, B. (1998). Student centered learning in a large, first year management class: History, reflections, and future directions. VITAL Event Day, University of Waikato (pp. 177-200). Hamilton, New Zealand.
Shepard, L. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4-14.
Joseph “Mick” La Lopa (Ph.D., Michigan State University) is associate professor in the Department of Hospitality and Tourism Management at Purdue University.
This publication is part of an 8-part series of essays originally published by The Professional & Organizational Development Network in Higher Education. For more information about the POD Network, visit http://www.podnetwork.org.