An end to grade grubbing? Novel assessments help engineering students gain confidence and competence by focusing on what they need to learn—not their GPAs.
Inspiration can strike at the darnedest times. For Matthew Siniawski, associate professor of mechanical engineering at Loyola Marymount University in Los Angeles, the aha! moment arrived while reviewing his daughter’s kindergarten report card at a parent-teacher conference. Instead of comments or scores, the document listed the various skills pupils should develop over the year, with checkmarks indicating which ones his child had met, excelled at, or still needed to reach.
“Wow, this is really kind of informative!” Siniawski recalls thinking. He looked for a higher-education equivalent of this increasingly popular K-12 assessment model, but found little. Thus began his quest to introduce standards-based grading (SBG) to undergraduate engineering education—an odyssey that started with a revamped sophomore-level mechanics and materials course in 2011 and recently resulted in a two-year, $249,000 National Science Foundation grant to establish and evaluate such systems with three other investigators from vastly different engineering schools. Early evidence suggests this new assessment method spurs motivation, confidence, and professional abilities along with deeper learning and fewer complaints about grades, say the researchers, who plan to hold a workshop on the topic at the ASEE annual conference in New Orleans in June.
Faculty have fretted about gauging students’ progress ever since Yale’s president rolled out what may have been the world’s first university grading system in 1785. As the standards movement took hold in K-12 classrooms over the past two decades, teachers began retooling lessons around specific learning outcomes and measuring proficiency levels. Yet except for a handful of pioneers—notably Alverno College and the University of Wisconsin—that have ditched grades or created degree programs based on competencies rather than credit hours, most of higher education still clings to the traditional formula: Sum the scores from multiple exercises and then calculate the final mark on a predetermined scale.
Trouble is, the resulting letter or numerical grade reveals nothing about whether students nailed the objectives outlined in the syllabus, only that they performed at a certain level on separate assignments. “A teacher never gets a handle on what students do well on, and what they don’t do well on,” explains Heidi Diefes-Dux, a professor of engineering education at Purdue University who teaches a large first-year introductory course and is one of the NSF study’s co-investigators. Students, she adds, are equally at sea, knowing only that they flamed out on a homework assignment but not what it was supposed to teach them.
Backwards Course Design
SBG flips the focus from scores to skills, with ample time to build them. Instructors begin by defining what they want students to learn and be able to do—such as follow the engineering design process or create a MATLAB plot that includes clear labeling. All activities, labs, group projects, and assessments are then mapped to those objectives, with a rubric outlining what approaching, meeting, or exceeding them entails. This allows instructors to provide targeted, meaningful feedback while producing a fairer, more transparent grading process that encourages learning, regardless of the class’s overall performance. Students get a clear idea of what they’re supposed to glean from each assignment and what quality work looks like. They can go at their own pace without fear of tanking their GPA if circuits or another core concept doesn’t instantly click. And since individual performance is measured over time, differences in levels of preparation cease to be a factor. In essence, students shift from obsessing about earning A’s to understanding the material.
This method of “backwards course design” can help instructors “get a handle on those design and project grades that tend to be more subjective,” says SBG co-investigator Sara Atwood, an assistant professor of engineering and physics at Elizabethtown College in Pennsylvania, who developed an SBG system for the year-long introduction to engineering design sequence in 2012. Learning objectives focused on such skills as CAD, fabrication techniques, and effective written and oral communication, with weekly exercises and individual quizzes to assess progress, culminating in a final group project each semester. “There was a lot of conversation about student outcomes, but we hadn’t made that link explicit to students,” she explains.
SBG “forces you to develop assignments that are relevant to the learning process and make sure things match what you’re trying to measure,” agrees Siniawski, who teaches a variety of courses, including introduction to engineering, machine design, and senior capstone. He recently tweaked the syllabus and now gives out detailed grading rubrics, so students are clear about what learning goals are being assessed. He also counts only the most recent grade received for each outcome over the semester, allowing students to recoup from initial stumbles. “We give them more time to learn something and demonstrate mastery,” says Siniawski. “Isn’t that what we want?”
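For readers who think in code, the "most recent grade wins" policy Siniawski describes can be sketched in a few lines of Python. The outcome names and scores below are invented for illustration; the article does not specify his bookkeeping.

```python
# Hypothetical sketch of "most recent grade wins" scoring: each time an
# outcome is reassessed, the newer proficiency score replaces the old one,
# so only the latest demonstration of each skill counts toward the grade.
def latest_scores(assessments):
    """assessments: list of (outcome, score) pairs in chronological order."""
    latest = {}
    for outcome, score in assessments:
        latest[outcome] = score  # a later attempt overrides earlier ones
    return latest

# A student who stumbles early but improves is graded on the later work.
record = [("free-body diagrams", 2), ("shear stress", 1),
          ("free-body diagrams", 4), ("shear stress", 3)]
print(latest_scores(record))  # {'free-body diagrams': 4, 'shear stress': 3}
```

The early 2 and 1 simply disappear from the record once the student demonstrates mastery, which is the point: initial stumbles carry no penalty.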
That question cuts to what could be an underlying philosophy of SBG: The point of teaching—and therefore grading—is to develop talent, not to distinguish high from low achievers. As Thomas Guskey, a University of Kentucky education professor, elaborates in a 2011 paper, educators who aim to develop talent first clarify what they want the class to master and then “do everything possible to ensure that all students learn those things well.”
Clarity Builds Confidence
Kylee Burgess, a second-year engineering student at Arizona State University’s Polytechnic School, “definitely felt the difference” between her conventional, lecture-based classes and assistant professor Adam Carberry’s SBG use-inspired design course this past fall. In traditionally graded sections, she thinks, “How can I get this assignment done, just to have it over with and get the best grade on it?” By contrast, with SBG, “even if your project doesn’t go as planned, you’re not going to fail the class or make your GPA a disaster.” Nina Lepp, a Loyola Marymount senior majoring in mechanical engineering who took Siniawski’s first- and second-year design courses and now has him for her capstone project, “liked the clarity of knowing what was expected of me.” The abilities and confidence that SBG nurtured, she adds, “solidified” her decision to study engineering.
Research suggests that switching to this nontraditional form of assessment can bring the same benefits to engineering education that have been documented in K-12 settings. In a 2014 ASEE paper examining the impact of SBG, Carberry, Siniawski, and Atwood reported that cornerstone design students perceived significant growth in their ability to conduct a variety of engineering design tasks. The open-ended, applied nature of project-based courses makes them “great fits” for this system, the authors concluded. A 2012 study by Siniawski, Carberry, and Loyola Marymount computer science professor John David Dionisio found that 89 percent of students thought SBG was more conducive to learning than traditional summative scores, and 86 percent preferred it.
SBG seems particularly well-suited to supporting international and nontraditional students. Elizabethtown senior Martin Fevre, a soccer standout concentrating in mechanical engineering, credits SBG’s individual feedback and emphasis on presentation skills with improving his ability to write and speak English—which he barely could do upon arriving from France in 2012. His focus has shifted from “you need to get this done” to “you need to know this,” and he’s now keen to attend grad school. “I spend a lot more time doing weekly assignments—mostly homework—and a lot less time [studying for] quizzes and exams,” says Fevre. At the end of the semester, “you can go through all the learning objectives of the course and see how far you’ve come.”
Design Courses Redefined
If SBG seems like more work for instructors, it is, acknowledges SBG investigator Carberry, who teaches back-to-back foundational courses in ASU’s second-year engineering design sequence. “But it benefits the students,” he adds. “The [traditional] grading system complements a pedagogy [in which] students leave the class wondering what they’re learning.”
In revamping the intro course, Carberry identified “a small set of skills” for students to develop over the semester, such as the ability to create a theoretical model, and then developed a zero-to-four-point scale to show which aspects they were struggling with. Three mini-projects culminate in a design project, giving students “opportunities to practice, apply, try, and fail—and not have it affect their final grade too much,” says Carberry. This year, he implemented a rubric for each assignment, pinpointing what a top score of 4 would require. “If you don’t learn from mistakes, yeah, then your grade will suffer in the end,” he cautions newbies, adding that most end up doing quite well.
Randi Taylor, a 2015 graduate now earning a Ph.D. in mechanical engineering at the University of Maryland, College Park, offers proof that SBG can work. The mother of four entered ASU’s engineering program with a bachelor’s in linguistics from Rice University and seven years as a social worker. Walking into her first class as a second-year student, she felt sick to her stomach. “I felt like I was coming in with bigger deficits than others had, not being familiar with the terminology or the design process,” recalls Taylor, adding that all the formulas were new and electrical concepts “didn’t make a bit of sense.” Then, during a project to build a multimeter, things suddenly clicked. “Oh, my gosh! I actually understand what’s going on,” she realized. When her team “just crashed and burned” on another project—their robot worked fine outside but never inside the maze because its sensors were in the wrong place—she discovered that setbacks in one area didn’t mean overall failure. “It was a learning experience throughout; I had documentation of what I could do,” says Taylor, who earned an A+ in the class. Rather than dread the required electrical class that followed, she decided, “OK, I’m just going to learn a lot this semester. That was a turning point for me.”
Like the engineering design process, there’s no one correct way to implement SBG. Diefes-Dux’s expectations for students learning MATLAB, for example, include being able to create plots suitable for technical presentation, meaning with a reasonable title and axes appropriately labeled. In computer-assisted design, Elizabethtown’s Atwood wants her 50 freshmen to know how to create a part, dimension it properly, put multiple pieces together, and do a drawing with certain characteristics. Students can retake a quiz on any skill, with final projects such as reorganizing a laundry closet or designing a community garden counting the most. “It helps students to spend time where they need to,” she says. Atwood also has upped the writing—“something we really have to work on,” she says—starting with documenting the taking apart of a toaster in the first lab and encompassing four presentations plus one-minute talks about what engineering is. ASU’s Carberry gives only one quiz a year and no final, preferring written and oral reports. To promote critical inquiry versus “plug and chug,” he gives students a document with six equations—but only two or three are needed to solve the problems.
Loyola Marymount’s Siniawski modified the syllabus of his senior capstone design course around a few learning outcomes, each with several core competencies. He includes whatever ABET student outcomes an assignment covers, and gives out grading rubrics showing how each ability or skill is being assessed. The same skills—for instance, the ability to analyze—crop up in different assignments, providing snapshots of each student’s progress. The course grade is based on the most recent grade for each outcome. “It eliminates the time pressure and doesn’t penalize the slower learner,” says Siniawski. Instructors from industry who teach the senior capstone course say SBG “is exactly what we do” as professional engineers, “so it’s really relevant to what students need to know in the workplace,” he concludes.
For Purdue’s Diefes-Dux, the impetus for SBG grew out of a first-year engineering course decision to go completely paperless, forcing faculty to figure out how they could explain grades to students without “scribbling” on their papers. “It was a practical issue but also an opportunity to change the grading system,” she says. “We’ve always had learning objectives in front of the class, but we hadn’t been assessing them.” Progress toward SBG comes in “baby steps,” starting with meatier homework problems, deep discussion about defining objectives, and setting proficiency benchmarks.
For students, the new grading system can be confusing at times, says ASU grad Taylor. “Things didn’t always add up.” For example, a 2 on one assignment and 3 on another didn’t average to a grade of 2.5. “Students are sort of frustrated that every class uses something different,” says Carberry, who creates visuals to help students compare traditional and standards-based grades. Siniawski’s newbies typically ask if an early poor mark will “count” toward their final grade. “Keep working,” he reassures. “Your next grade will override it.”
SBG can be a labor saver for faculty in some respects. Because he’s already assessing student outcomes, it takes Carberry no more than 10 minutes to extract, cut, and paste data for an ABET report. Siniawski finds SBG a boon when writing job recommendations, since he can pull details on the specific areas in which a student excelled. Another plus: Fewer students ask him why they got a B and not an A on an assignment. “The conversation changes from ‘why did you take this point off’ to ‘gosh, I don’t know this very well,’” confirms Diefes-Dux.
For SBG enthusiasts, such merits outweigh the work required to convert those standards-based grades to the university’s transcript standard. Diefes-Dux does so by tallying points generated from the assessment of learning objectives for the two projects, 10 homework assignments, and three exams her students complete. Carberry must award letter grades, but counts the final project—building an exhibit for the local science museum—as 50 percent of the total, helping him streamline grading.
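The article doesn’t spell out either instructor’s conversion formula, but the general shape of the task can be illustrated in Python. The objective names, equal weighting, and letter-grade cutoffs here are all invented for the example.

```python
# Hypothetical illustration of converting standards-based proficiency
# scores (0-4 per learning objective) into a transcript letter grade.
# The equal weighting and cutoff fractions are assumptions, not the
# instructors' actual formulas.
def letter_grade(scores, cutoffs=((0.9, "A"), (0.8, "B"), (0.7, "C"), (0.6, "D"))):
    """scores: final proficiency score (0-4) for each learning objective."""
    fraction = sum(scores.values()) / (4 * len(scores))  # share of max points
    for cutoff, letter in cutoffs:
        if fraction >= cutoff:
            return letter
    return "F"

final = {"modeling": 4, "CAD": 3, "communication": 4, "teamwork": 3}
print(letter_grade(final))  # 14/16 = 0.875 -> "B"
```

In practice an instructor like Carberry would weight some objectives more heavily—his final project counts for half the grade—but the basic move is the same: collapse per-objective proficiency into a single mark the registrar will accept.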
Challenges abound, however. These include a dearth of trained graders among teaching assistants, whose turnover is high. Diefes-Dux finds herself “writing a rubric of some kind every year” and wrestles with how to state clear objectives and train staff with the needed precision. Then there are course-management systems, which at some schools can make it hard for instructors and students alike to track trends in performance.
SBG proponents hope their research will persuade other engineering faculty to adopt the practice. Siniawski believes the NSF grant will “add a lot of credibility” to the team’s efforts to expand the assessment model to other colleagues. Ideally, says Diefes-Dux, the whole campus would embrace SBG, so students could see how they were doing across all their classes.
Impossible? Consider how rapidly such innovations as flipped classrooms and massive open online courses (MOOCs) have spread. Meanwhile, engineering educators looking for proof of concept might find Loyola Marymount senior Nina Lepp’s experience persuasive. Still months from graduation, she already has landed a job in propulsion design at Boeing, not far from her Seattle home. “You carry yourself very well,” the interviewer told her. Lepp attributes that to the multiple presentations required in Siniawski’s course, where what counted was steady progress—not how she performed each time.
By Mary Lord
Mary Lord is deputy editor of Prism.
Design by Michelle Bersabal
Images Courtesy of Purdue University School of Engineering Education and Loyola Marymount University