Learning to love assessment
A saying goes around in academic circles: “I teach for free; they pay me to grade.” There is truth in that — teaching is often fun, and grading is usually tedious. But as every scientist knows, there is no point in doing an experiment if you don’t have a way to assess the result. So assessment is a crucial step in teaching and learning.
An enormous literature exists on assessment. And yet, reviewing materials from different sources, I see far too many exam items that simply test recall, a low-level and increasingly obsolete skill.
I recall the first time I was faced with an exam that wasn’t simple regurgitation of facts. Like many biochemistry students, I had memorized the Krebs cycle: reactants and products, enzymes, coenzymes and effectors. But when I walked into the exam, I was presented with a page that outlined all that information. I thought, “But everything I know is on this piece of paper — what else can they ask me?” Then I realized that the information was just the starting point; the real test was understanding it in context and being able to apply that understanding.
Throughout my teaching career, I sought that moment of realization for my students. I used to tell them that every exam was a learning experience; one student told me that she wished there were not so much learning on my exams.
Assessing collaboration
I found that figuring out what I wanted students to learn and how to measure that learning is an essential and creative activity. I experienced an epiphany when I realized that if I wanted students to value collaboration, I would have to include teamwork on exams in some way — if something doesn’t count toward a grade, many students believe it has no value. Of course, it is one thing to decide what you want to measure, another to design an instrument to do the measuring.
I first heard about pyramid testing from a group of math faculty at Smith College. They gave their exams in three parts (individual, small group, full class), with a diminishing number of points for each part. I tried that but found it took enormous amounts of class time and the students’ enthusiasm diminished as the points went down. My goals and my testing scheme then went through multiple iterations.
I finally settled on giving a take-home individual exam followed by a class period spent discussing the exam in small groups and ending with giving the students an opportunity to revise one section of the exam after this discussion. These group discussions had a greater level of engagement than any other classroom activity I assigned, which I saw as a first step toward recognizing and rewarding collaboration.
Similarly, my colleagues Don Elmore and Adam Matthews and two of their students, Valentina Alvarez and Julie Bocetti, developed an assessment of discussion-based activities in introductory courses. Their approach melds quantitative survey and qualitative student response data to determine whether the activities achieve the goals of building student community and perseverance.
Over the years, I began to think that constructing an exam was as intellectually stimulating and time-intensive an activity as writing a research grant proposal, and it had much the same goal: testing a hypothesis.
Authentic assessment
Assessments, especially in K-12 education, often have been used by politicians, school boards and accrediting agencies as a tool to reward and punish teachers as well as students. Whatever the original worthy goals of these bodies in setting standards, many frontline educators see such assessments as tests imposed by agencies external to the school and classroom. Just using the word “assessment” can get faculty up in arms. Yet the same principles of assessment should apply to teachers and institutions as to student learning. Defining goals is the first step; figuring out how to assess whether you have met them is the next.
The American Society for Biochemistry and Molecular Biology has developed a set of requirements for biochemistry and molecular biology undergraduate programs and outlined a method to assess programs’ success in meeting those goals as part of accreditation. Resentment can arise when standards are set by outsiders, but our goals and rubrics are the result of discussion and sometimes argument among our members, so the assessment tools feel authentic.
The economist James Gustave Speth wrote, “We tend to get what we measure, so we should measure what we want.” He was talking about economic growth and environmentalism and how we get sidetracked by measuring what is measurable rather than what we really care about, but the statement is true of all kinds of assessment. We need to figure out what we want students to learn as well as what we value about our institutions and programs and then set about developing ways to measure how close we come to achieving our goals.
Jenny Loertscher made valuable contributions to this essay.
National assessment projects
Several national organizations are working to set academic goals and ways to assess student progress toward those goals.
The Association of American Colleges and Universities developed the Liberal Education and America’s Promise, or LEAP, rubrics with broad input from faculty across disciplines. They measure learning outcomes in several areas, including quantitative literacy, teamwork and ethical reasoning. Training on the rubrics ensures fidelity of evaluation of student work.
More directly relevant to biochemistry and molecular biology is the BioSkills guide. These rubrics were designed to align with the learning goals of Vision and Change, the American Association for the Advancement of Science report on transformation of biology education. Another project is the move toward specifications grading, an assessment system based on students’ arrival at defined learning outcomes, by some in the chemistry community.
The committee that developed the ASBMB certification exam recently published a paper on its origins and evolution. I urge all instructors to read that paper and look at sample questions on the ASBMB website, even if your program is not yet accredited. The questions on the exam test conceptual knowledge yet are amenable to large-scale evaluation. The committee’s work writing questions, editing, developing grading rubrics and evaluating answers not only has created an exam that tests what we value in a biochemistry program but also has created a community of educators who have a deep understanding of authentic assessment.