Creating New Pathways for Students into STEM

Note: My blog = my viewpoints and opinions, not necessarily those of my employer.

One key issue preventing broader participation in STEM (science, technology, engineering, and mathematics) is that our nation’s two-year institutions are oversubscribed while its four-year institutions are underutilized by certain demographic groups. For example, there is a robust system of community colleges in the Los Angeles area that serves approximately half a million students annually, most of whom are Hispanic/Latin@. Compare that to the 6,000 students at the consortium of colleges and universities where I currently work and where students of color are underrepresented.

Many two-year institutions are Hispanic-Serving Institutions (HSIs) and many of their students are the first in their families to go to college. Unfortunately, because the demand for STEM courses is so great and counselors have enormous caseloads, the time to transfer to a local state school can be very long. I spoke to a colleague at a local community college who told me that the average time for students at his institution to transfer to a California State University or University of California is about six years! This long time to transfer is one of the most important reasons why the rate of successful transfer is low, and it also erodes students’ engagement and confidence in their ability to transfer and complete a four-year degree in STEM.

Across the U.S., most highly-selective four-year colleges and universities typically don’t accept many transfer students because (1) specialized core curricula at these schools make it difficult to transfer in community college coursework and (2) they have not equipped themselves with the capacity to work with students with different lived experiences than their majority populations. (Don’t forget that these institutions are typically PWIs–predominantly White institutions.) Yet, many of these community college students could thrive in these kinds of four-year institutions because of their talent and resilience.

I teach at one of these highly-selective four-year institutions, Harvey Mudd College (HMC). It’s a STEM-focused liberal arts college with a highly specialized common core curriculum consisting of courses in mathematics, biology, chemistry, physics, computer science, and engineering. Though we’ve made great strides in diversifying our student body over the last decade, particularly in reaching gender parity, we still have a long way to go. African-American students make up only 5% of the student body, and Hispanic/Latin@ students roughly 20%. Both of the points in the previous paragraph apply to us. Because of our common core curriculum, it is very difficult for students to transfer in from two-year institutions. We typically get one or two students who transfer to HMC from other selective four-year institutions each year–that’s it. And, we still have lots of work to do to make HMC a place that is welcoming and inclusive for everyone.

In this post, I’d like to propose an idea for increasing the number of historically underrepresented students successfully completing STEM degrees. This idea takes advantage of the fact that in many metropolitan areas of the United States, oversubscribed two-year institutions and underutilized selective four-year institutions are often in close proximity to each other.

Idea: Create alternative pathways for STEM-interested students at two-year colleges to transfer into selective four-year institutions as accelerated first-year students instead of as third-year students, which is the more typical pathway. This alternative pathway would result in (1) higher rates of successful degree completion, (2) faster time to degree completion, and (3) lower overall cost to students.

First, let me be absolutely clear that I do not propose this plan because I think that four-year institutions are innately better than two-year institutions at preparing students for STEM careers. I need to own my four-year-college privilege here. Yes, it is probably true that on average four-year institutions have more resources than nearby two-year institutions, but I am not making any claims about the quality of STEM teaching and learning at two-year institutions. I know plenty of extremely dedicated and talented faculty at two-year institutions who would teach circles around us privileged four-year college people. The argument that I am making here has to do with capacity, not quality.

For this idea to work, like-minded two-year and four-year institutions would need to get together to work out an articulation agreement: a sequence of courses at the two-year institution that would help students prepare for admission to the corresponding four-year institution. Students with an interest in STEM and in being at a four-year institution would be advised early on to enter in this track of courses. The four-year institution would agree to mentor and work with these students to prepare them to apply to their institution, but also to other similar four-year institutions. The four-year institution would also commit to giving generous financial aid packages to students with financial need.

Other bells and whistles to make this plan more compelling: (1) early research experiences, (2) mentoring and community, (3) cross-enrollment at the four-year institution.

(1) Imagine that this two-year sequence of courses also includes, at the end of the first year, a paid summer research experience at the four-year institution. There is lots of research that early research experiences in STEM have all kinds of benefits for students, including increased likelihood of graduation, attainment of an advanced degree, etc. In addition, this arrangement would allow faculty from the four-year institution to get to know students in this accelerated transfer pathway. And the students in this accelerated transfer pathway get to see what the environment is like at the four-year institution.

(2) There could be all kinds of interesting mentoring and community-building opportunities for students at both institutions. Joint research symposia, travel to the SACNAS annual conference or discipline-specific conferences, and social gatherings would be great ways for students to learn more about each other’s lived experiences. What if students from both institutions form a club to mentor local area high-school students participating in robotics competitions? Or a math club? How cool would that be?

(3) If the two-year and four-year institutions are close to each other, it might also be possible for students in the accelerated pathway to take courses at the four-year institution. This is especially helpful if there are certain courses that the two-year institution doesn’t offer that the four-year institution does. (For example, computer science is a booming area right now and there is great demand for those courses.) If a student eventually transfers to the four-year institution, then that student wouldn’t need to take that course and can accelerate on to more advanced courses. And again, this cross-enrollment strategy is another opportunity for the two groups of people to get to know each other.

If at the end of this accelerated transfer program a student decides she isn’t interested in going to a highly-selective four-year institution, she could still continue at the two-year institution so as to transfer as a junior to a larger state school.

I’m not suggesting that all or even most STEM-interested students at two-year institutions should transfer to highly-selective four-year institutions as first-year students. I am merely advocating for there to be more options and opportunities made available to them. Most students at two-year institutions aren’t even aware of the opportunities that exist at highly-selective four-year institutions. Cost of attendance is often misunderstood to be a deal-breaker, whereas the reality is that many schools are competing with one another over a relatively small pool of talented students of color who are interested in applying to them. And, there are lots of schools that have the means to offer generous financial aid packages.

This accelerated transfer pathway idea addresses the problem of “scale up” in two ways. First, the idea is relatively easy to replicate because there are many other selective four-year institutions around the U.S. that are underutilized relative to nearby two-year institutions. The resources required to keep this pathway operating are relatively modest. (The financial aid resources required are another matter, but I’ll get to that later.) Second, it is much easier for four-year institutions to forge alliances with surrounding two-year institutions than with high schools because of the sheer number of high schools and the fact that high school counselors and administrators turn over more quickly.

Finally, another important feature of this idea is that it would truly increase the number of students of color and first-generation students into STEM disciplines rather than poach them from one program to another program. Other efforts focused on reaching talented students in high schools through summer programs and the like are important but in many cases attempt to reach students already considering attending four-year institutions directly after high school. In contrast, many community college students don’t think of transferring to these kinds of institutions. So, this program could truly broaden participation.

To get this to grow organically, a two-year and a four-year institution would start the program on a small scale. If it goes well, then they could create a network of schools in that area that would cooperate. They would need careful and systematic program evaluation to identify whether the program is working, what specific pieces of the program make it work well, and what could be improved. Over time, these institutions would share their work with others and help them start similar programs around the country.

One fly in the ointment is that if this were truly to scale up, highly-selective four-year institutions would need to translate their desires for greater diversity and inclusion into cold hard cash–financial aid cash. Many colleges and universities, HMC included, need to do a better job of increasing the number of low-income students that they admit. I think there is a growing collective will to do this. Malcolm Gladwell’s Revisionist History podcast has been highlighting the issue of money in higher education in episodes 4, 5, 6. And here’s a great article in the Chronicle of Higher Education about how some schools are trying to increase the economic diversity of their student bodies.

If you know of other institutions already doing things like this, I’d really like to hear about it. Please let me know your comments too. Would this work in your area or for your institution? And, if you’re with a granting agency (like the NSF), I’d love to know if this sounds like a fundable idea.


Designing the Morning Math Course for the IAS/Park City Mathematics Institute Teacher Leadership Program

I’ve had the tremendous honor of attending the IAS/Park City Mathematics Institute (PCMI) every summer since 2003. I’ve gained a “math camp family” that I treasure dearly. I have learned so much from the people that I’ve met. It’s affected my career in major ways*. I can’t say enough good things about it. If you’re a math person of any flavor, you should definitely try to attend if you’ve never been.

Since 2008, I’ve also had the honor of co-facilitating the morning math course for the Teacher Leadership Program (back then known as the Secondary School Teacher Program or High School Teacher Program). Side note: You should watch this awesome video about TLP. Our work has resulted in a series of books published through the American Mathematical Society.


Bowen Kerins has been my main co-conspirator in this course, but in 2008 I also worked with Ben Sinwell. The work that we do for this course is really intense while we’re at PCMI, but it’s one of my favorite teaching experiences.

Many people who’ve experienced the course are curious about how we design and implement it. So, this post is meant to give a bit of insight into our ridiculous process.

First, a bit about the course itself. PCMI is a three-week residential program. There are usually 14 or 15 class meetings, each about two hours long. The participants in the program are elementary, middle, and high school teachers. There is always a different research theme for PCMI each year, and this math course for the Teacher Leadership Program tries to connect to that research theme in some way. In 2016, the research theme for PCMI was big data, and the mathematics content for our course had to do with probability, games, and Markov Chains. We have the luxury that we don’t need to follow any specific math content standards–we get to teach a course on whatever we want and we don’t need to assess the learning in any formal way.

The math course is designed to help teachers deepen their mathematical content knowledge, to experience authentic mathematics in a collaborative environment, and to feel joy and wonder while doing mathematics. We want teachers to walk away with an understanding of how mathematics is socially co-constructed so as to make sense of the world around us and solve problems. We want teachers to experience the beauty and power of mathematics.

During the two-hour class itself, Bowen and I do very little talking. We start the course by passing out a problem set, and we spend 95% of the time walking around, taking notes, answering questions (obliquely), and encouraging people to work well with each other. Teachers work together in groups of five or six at tables; each table has a table facilitator (a past participant who has been invited back to serve in this capacity). Sometimes we end class with a short (~5 minute) wrap-up. These wrap-ups usually consist of a carefully curated and sequenced set of observations from participants in the room. We only allow participants to share a mathematical observation about a problem with the whole class if we’re sure that everyone has already worked on that problem and had the opportunity to make that observation. The wrap-up is usually orchestrated so as to help participants make connections between different mathematical ideas or representations.

Typically, participants work on the problem set non-stop during the entire class. They are often disappointed about having to stop at the end of class. But, we never expect the participants to continue working on the problem sets outside of class and actively dissuade them from doing so.

July 6, 2012 PCMI Teacher Leadership Program Morning Math Course

The design of the problem sets has evolved over the years, and I cannot take credit for their genius. At first, when Ryota Matsuura and Bowen led the course in 2001**, their design was inspired by the problem sets of the PROMYS program, which were themselves inspired by the problem sets used in the Ross Mathematics Program. Over time, the design of the course evolved to its current state. The current form of the problem sets is also somewhat similar to the Phillips Exeter Academy math problems in that they are a carefully sequenced set of tasks. Program leadership has tried to use the term “problem-based approach” to describe how the problem sets work, but I think we need to come up with something less clunky.

The problem sets are divided into three sections: Important Stuff, Neat Stuff, and Tough Stuff. Because we have elementary school teachers working alongside Calculus teachers, there is a huge heterogeneity of mathematical expertise among the participants of the class. But, we view this heterogeneity as an asset instead of a challenge.

The problem sets are written in such a way that the main arc of the course from the first day to the last is woven through the Important Stuff problems. We design the problem sets so that all participants get through the Important Stuff problems, but they never get through all of the problems for the day. If we notice that any participants don’t make it through all of the Important Stuff problems on a problem set, we design the next problem set with this in mind by repeating problems. The Neat Stuff problems build on the Important Stuff problems–they’re neat, but not essential. They often extend the main ideas into other interesting cases, use alternative representations, or involve other areas of application. And the Tough Stuff problems are notoriously difficult. Sometimes they involve unsolved mathematical problems.

So how are these problem sets written? Before PCMI begins, Al Cuoco, Glenn Stevens, Bowen Kerins, and others at Education Development Center brainstorm about ways to make the PCMI research theme accessible to teachers. They develop some of the important problems and concepts that should be part of the course and think about the mathematical prerequisites that would be required to reach those problems and concepts. In some years, they have even tested out some problems with groups of teachers in Boston.

When Bowen and I arrive at PCMI, we begin working immediately. Knowing the general idea for the theme of the course, we decide on the aha moments that we want to build into the course. Since the course generally fills up three weeks, we write the problem sets so that there is a medium-sized aha moment at the end of the first and second weeks, then a big aha moment at the end of the last week. We then think about the prerequisite mathematical ideas and skills that are needed for all of the aha moments and use these learning outcomes as targets for each day’s problem set.

Each day, our routine goes something like this: Class starts at 8:30 a.m. Class ends at 10:40 a.m. Bowen and I debrief about how class went and come up with a plan for the next problem set. We eat lunch at noon. We work in the afternoon. Bowen often takes a nap in the late afternoon and I keep working. We eat dinner. We keep writing. I go to bed around midnight. Bowen keeps writing into the wee hours of the morning. I wake up at around 6:30 a.m. and make a final pass through the problem set for that morning. I print it and head to class. Repeat ad lassitudinem.

The three weeks are truly grueling, but also immensely fun. Bowen is a great partner in crime and there is a lot of laughter involved in the writing. He has a wicked sense of humor and inserts all manner of jokes, puns, and obscure pop culture references. We also make an effort to work the names of every participant in the course into the problem sets somehow. References to events that take place outside of class time (like cowboy karaoke parties or escape room antics) often find their way into the problem sets. These references enhance the sense of community in the room.

Working for nearly 22 hours each day to prepare for a 2-hour class is not sustainable. I do not advocate that school teachers work in this manner. It is somewhat frightening that we approach PCMI each year with so little of the problem sets written in advance. But, one of the main reasons that the participants in the course seem to enjoy it so much is that it really is written on the fly to meet them exactly where they are. Bowen and I take copious notes about what strategies participants are using, what they find interesting, and what they struggle with. The problem set for the next day is written with that in mind. The design process of the course demonstrates the importance and effectiveness of formative assessment.

There are several other important principles that we try to adhere to in the construction of the problem sets.

  1. Every problem is deliberate and almost always has connections back to a previous problem and forward to later problems. It is not uncommon for us to write problems in the afternoon only to toss them later that evening.
  2. The Important Stuff problems, and especially the first problem (which we call the Opener), are constructed to have multiple entry points. We write problems that can be attacked via multiple approaches: enumeration, calculation, modeling (particularly with manipulatives), algebra, or more sophisticated approaches. This gives everyone a way into the problem and also encourages participants to compare their approaches and make connections with each other.
  3. Problems are often sequenced in such a way that participants can’t help but notice patterns. If you’ve done three problems in a row and the answer each time turns out to be 47, you’re going to notice that and wonder why that’s the case. We avoid being heavy handed about pointing out those patterns but instead leave lots of breadcrumbs for participants to notice. In many cases, the trail of breadcrumbs leads back to previous problem sets so there is a revisiting and reexamination of what has been learned before. We do a lot of foreshadowing of key ideas by building up a bank of examples before the key idea is revealed.
  4. Wording of problems is very important. Bowen is a master of efficiency and clarity. I have learned so much from him about how to say exactly what you mean and no more. We try our best to fit the problems onto 2 sheets of paper (4 pages total). Also, language is important and sometimes we need to obfuscate–if there are mathematical ideas that we think folks might recognize and be tempted to look up on Wikipedia, we sometimes purposely obscure those ideas. For example, in the 2007 problem set the Farey sequence appeared as the “godmother sequence”. In 2014, participants played with Penrose tilings for two weeks without us mentioning those words explicitly until they had discovered many of their properties through the problem sets themselves. We obfuscate in this way so as to create a more even playing field for participants and to encourage participants to make their own discoveries.
  5. We use computers and technology sparingly for two reasons: open computer screens often put up a barrier that impedes communication between people, and participants often have a wide range of experiences with technology. Technology should be used when it meets the learning objectives in a compelling way, not just because it’s whiz-bang neat. When technology is used, we craft problems carefully so that participants who are unfamiliar with the technology learn what they need to know in small chunks. Over time, their fluency with the technology builds. We are also big fans of using Google Sheets as a way for participants to pool their data with everyone in the class.
An example of what it looks like in the room when participants use Google Sheets to pool data with each other during class (July 15, 2016). Groups of participants enter data into a Google Sheet that is projected at the front of the room. This technique is especially useful for collecting large sets of experimental probabilities for later analysis.

Over the years, we have noticed that the table facilitators play a crucial role in making sure that the participants at each table work well with each other. Sometimes there is a participant who knows a lot of mathematics and rushes through the problems to get to the Tough Stuff. In contrast, others at the table will take their time to “smell the roses” and make deep insights. These table facilitators try their best to make sure that the speed-racers are sufficiently intrigued by the deep insights made by the rose-smellers that they slow down and savor things too. It’s especially wonderful when a high school teacher with a lot of mathematical experience is humbled by the mathematical prowess of a neighboring elementary school teacher.

Part of the reason why I enjoy this course so much is that it is so gratifying to know that participants enjoy themselves and walk away with meaningful mathematical experiences. Here are some of the things they have to say.

“That’s where it’s had the biggest impact—in how I teach. Now I encourage and allow students to collaborate more in groups, to communicate mathematically with each other and collaborate to solve problems. I used to just always show my students how to solve problems, and now I encourage them to try on their own—to practice problem solving and arrive at their own conclusions, and then we come back to discuss it.”

“When I reflect on the morning sessions, the effect was positive. It was a touchstone. I think about the kinds of experiences we had and how to get more of that into my classes—sharing about math, giving kids time, allowing kids to dig in and take ownership, rather than, we have to cover this and that and this. [PCMI] helps me to stay student centered. So my experience of being in that room in those days resonates in that way.”

So, if you’re reading this post not having gone to PCMI, please take a look at previous problem sets and seriously consider applying to attend.


* In 2003 I participated in the Undergraduate Faculty Program, which is where I first met Peg Cagle, a math teacher in Los Angeles. We became fast friends and started something like a math teacher circle in Los Angeles. That math circle is how I got to meet Pam Mason, whom I then asked to be the director of Math for America Los Angeles when I helped to start it in 2007. Fast forward to 2016…MfA LA has supported over 120 math teachers in the Los Angeles area and is going strong.

** Some more history: The morning math course dates back to 1981 when PCMI started as a regional geometry institute. The course took many different forms. In 2001, EDC was contracted to design the course. Ryota and Bowen led the course, each running it for half the time. In 2002, Bowen and David Offner ran the course. In 2003, Bowen and Ben Sinwell ran the course, each for half the time. In 2004, the TLP got a large NSF grant and the number of teachers doubled, so Bowen and Ben ran the program together for all three weeks. I was a table facilitator prior to helping to lead the math course. In 2006, I had the good fortune of being put in the same housing unit as Bowen and Ben, and that led to me helping with the writing.



Reading list for “Social Justice and Equity: STEM and Beyond”

This past spring, I had the pure joy of co-teaching a course with Sumi Pendakur at Harvey Mudd College entitled “Social Justice and Equity: STEM and Beyond”. Even though we created the course one week before the start of the semester, we got over 50 students to sign up to take it! It was so uplifting to see students grappling deeply with these issues.

In case others are interested, here are the topics for each of the class meetings, and the readings that we assigned. There are some that we would definitely switch up the next time we do this, but I hope this is a helpful list for anyone thinking about teaching a similar course.

Link to our course description:

Class 1 (Jan 25, 2016): Patterns of Underrepresentation and Bias in STEM

Class 2 (Feb 1): The Importance of Diversity in Higher Education

  • HMC’s “Why Diversity” and “What is Diversity” documents
  • Chapters 1 and 2 of “Diversity’s Promise for Higher Education: Making it Work” by Daryl Smith
  • 1-page article “Diversity Challenge” in Nature, Sep 2014 (optional: read the other articles included in that first article)

Class 3 (Feb 8): Meritocracy in STEM and Affirmative Action

Class 4 (Feb 15): Racism, Prejudice, Bias, Stereotypes

Class 5 (Feb 22): Slavery, Jim Crow, and Racism Post-Jim-Crow

Class 6 (Feb 29): Immigration, Segregation, Desegregation, Resegregation

We cancelled our Mar 7 class to hear Keith Knight’s talk.

Class 7 (Mar 21): Critical Race Theory and the “Others” Claim to Higher Education

  1. “Just What is Critical Race Theory and What’s It Doing in a Nice Field Like Education?” by Gloria Ladson-Billings
  2. “Toward a Critical Race Theory of Chicana and Chicano Education” by Solorzano and Yosso
  3. “Camouflaging Power and Privilege: A Critical Race Analysis of University Diversity Policies” by Iverson
  4. “From Jim Crow to Affirmative Action and Back Again: A Critical Race Discussion of Racialized Rationales and Access to Higher Education” by Yosso, Parker, Solorzano, and Lynn

Note: We did a jig-saw activity with these four readings that turned out well. These two were additional optional readings.

Class 8 (Mar 28): Marxism and Higher Education

Class 9 (Apr 4): Social Capital Theory and the Hidden Curriculum

Class 10 (Apr 11): Feminism

Class 11 (Apr 18): Queer Theory & Disability Justice

Class 12 (Apr 25): Campus Climate

Making Equity an Integral Part of All Instructional Decisions

A few days ago, CFED (Corporation for Enterprise Development) and the Institute for Policy Studies published a report entitled The Ever-Growing Gap: Without Change, African-American and Latino Families Won’t Match White Wealth for Centuries. It’s a worthwhile read, both illuminating and depressing at the same time. The authors argue that the growing wealth divide in our nation is not an accident but the result of past and present policies that widened the difference between the wealth of White households and households of color. They then advocate for an audit of current Federal policies to determine their impact on the racial wealth divide.

This got me thinking about teaching and equity. What can I do in the context of where I work and teach? I believe that the racial gap in educational outcomes (in all forms: degree attainment, participation in advanced courses, productive disposition toward learning mathematics, learning outcomes, test scores) is not an accident but the result of past and present injustice. The fact that there is a growing wealth divide also means that there are deep divides in educational outcomes along class lines as well. And of course, there are clear correlations between class and race in the United States.

At the same time, I am also moved by Rochelle Gutiérrez’s (and others’) writings about our gap-gazing fetish in education. (See this and this.) We need to walk a fine line: advocating for more equitable outcomes while avoiding privileging the performance of White and Asian students as the standard to aspire to, propagating a culture of deficit when referring to other students of color, and having only school-based conceptions of what it means to do mathematics.

All of these things were swirling in my head, and that got me wondering…

Imagine if instructors were equity-minded and routinely made instructional decisions taking into account the potential impact of those decisions on any patterns of difference in educational outcomes across groups of students. Imagine what would happen if measures of educational outcome differences were a regular part of all teaching evaluations at your institution.

What would it take to get there? In this post I am mostly thinking about how this would play out in my own teaching. That seems like the natural place to start.

I would need a range of measures of educational outcomes (not necessarily standardized tests) that I could use at the beginning and end of each of our courses. These could include measures of disposition toward mathematics, metacognitive skills, problem-solving skills, or pre-/post-assessments of mathematics content knowledge and skills. I could then use these measures to look for preexisting patterns of difference in outcomes for students entering a course, and the same or similar measures used at the end of the course could show whether those patterns of difference got bigger, got smaller, or stayed the same. Or, I could use the same instruments for different courses over time to see whether there are any systematic patterns. All of our instructional decisions could then be analyzed over time so as to reveal the instructional strategies that hit that sweet spot where educational outcomes are high and differences between groups of students are lowered or even eliminated.
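To make that bookkeeping concrete, here is a minimal sketch of a pre/post gap comparison. Everything in it is hypothetical: the group labels, the scores, and the simple mean-difference measure are stand-ins for whatever self-reported demographic data and validated instruments a real course would use.

```python
import statistics

# Hypothetical pre- and post-course scores, keyed by group label.
# These numbers are illustrative only, not real course data.
pre = {
    "group_a": [52, 61, 58, 64, 49, 70],
    "group_b": [48, 55, 60, 51, 57, 62],
}
post = {
    "group_a": [78, 85, 80, 88, 74, 91],
    "group_b": [75, 82, 86, 77, 84, 89],
}

def gap(scores):
    """Difference in group means (group_a minus group_b)."""
    return statistics.mean(scores["group_a"]) - statistics.mean(scores["group_b"])

pre_gap, post_gap = gap(pre), gap(post)
print(f"pre-course gap:  {pre_gap:+.1f}")
print(f"post-course gap: {post_gap:+.1f}")
print(f"change in gap:   {post_gap - pre_gap:+.1f}")
# A shrinking |gap| suggests the course narrowed a preexisting difference;
# a growing |gap| flags an instructional decision worth revisiting.
```

With real data, one would of course pair this with appropriate significance testing and with richer measures than a simple difference in means.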

Here are three paired histograms showing the distribution of men’s and women’s performance in my introductory differential equations course over the last three years. The course content and assessments stayed almost exactly the same, though between 2015 and 2016 we went from a flipped/unflipped experimental design to a uniform treatment (a hybrid course) for all students.

[Figure: paired histograms of men’s and women’s scores in the course, 2014–2016]

What do these data reveal? I’m not sure. The differences in means are not statistically significant, but I wonder whether the shapes of the distributions can reveal more. The means are heavily influenced by students in the tails of the distribution. It seems that the mode for men is the same as or slightly higher than that for women in each case. That doesn’t seem good to me. A similar analysis of the pre-test assessment scores shows no difference in performance between men and women going into the course. My conclusions are that (1) my course didn’t do any harm by creating a gender difference (whew!), and (2) moving to a hybrid design had no impact on gender equity in my courses.
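
One way to look past the means at the shapes of two distributions is the two-sample Kolmogorov–Smirnov statistic: the largest vertical gap between the two empirical CDFs. Here is a standard-library-only sketch with invented scores; in practice `scipy.stats.ks_2samp` computes the same statistic and also returns a p-value (and with class-sized samples, its power is low, so this is exploratory at best).

```python
def ks_statistic(xs, ys):
    """Two-sample Kolmogorov–Smirnov statistic: the largest gap
    between the empirical CDFs of the two samples."""
    pooled = sorted(set(xs) | set(ys))

    def ecdf(sample, v):
        # fraction of the sample at or below the value v
        return sum(1 for s in sample if s <= v) / len(sample)

    return max(abs(ecdf(xs, v) - ecdf(ys, v)) for v in pooled)

# Invented exam scores, purely for illustration
men   = [72, 75, 80, 81, 85, 90]
women = [70, 74, 79, 82, 86, 91]
print(ks_statistic(men, women))
```

A statistic near 0 says the two distributions track each other closely; a statistic near 1 says they barely overlap.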

(Note #1: I did a really bad thing by manually coding each student as male or female based on what I knew of them instead of using self-report data. I classified any transgender students according to their gender expression at the time of the course. Note #2: I didn’t have access to the metacognitive and attitudinal survey data that was collected in these courses. That would also be interesting to analyze in this way. The assessment used here consisted of five multi-part questions that address computational skills, conceptual understanding, and the ability to apply knowledge to model physical situations.)

Some practical problems to consider: (1) Norm-referenced assessments won’t work here and we would need criterion-referenced assessments instead. We want to make sure to encourage a growth mindset and to give students clear learning targets. (2) In most cases there are too few students in my courses to allow for meaningful statistical analysis. And there are all kinds of privacy and FERPA regulations to worry about if I want to get detailed demographic information about my students. (3) Sometimes we over-assess our students. How do we avoid that? (4) If I am at all a decent teacher and I use the same instrument as pre- and post-assessment of content knowledge, I should expect the distribution to narrow after a whole course-full of learning. That means that unless I have access to a good measure of prior knowledge, the comparison can only be made from course to course. (5) If I only use instruments once per course, results would be difficult to compare across courses unless there were some way to know that the groups of students are consistent in meaningful ways over time.

Clearly there are a lot of problems with the ideas I’ve mentioned above, which is probably why people don’t do this more regularly. And all of this takes time and effort. (The analysis that I did above took me a few hours.) The key is to find ways to reduce the barriers so that it becomes easier to gather the right kinds of data and use those data to promote equity in learning. I want to get to a place where we can make data-driven decisions in pursuit of equity-promoting instructional practices. Is this a good idea? And if so, what tools, practices, and systems do we need to develop to get there? I welcome your thoughts!


Experiments with point-of-view video cameras

I’ve been playing around with using multiple cameras to capture my classroom this semester. I’m using more than one camera so that I can capture what’s happening in the class as a whole and also what’s happening as I interact with different groups of students. I want to be able to capture video from my point of view for several reasons: (1) I want to record (approximately) what I was seeing and noticing so as to help uncover the signals that led me to decide to intervene with a particular group of students or not; (2) I want to record the kinds of questions that students are asking and how I responded to them–is there something that I could have said that would have helped them more? (3) I want to be able to capture the way students and I talk to each other–how did I position myself relative to them (crouching or bending down to talk with them), and did it make any difference in their body language or response to me?

Here’s a sample still from a produced video. You can see four students sitting together in a group. (Ideally, you would be able to clearly make out what is on their paper.) In the corner is a wide-angle view of several groups.


Here’s my current setup:

1) Swivl camera base with my Nexus 5X phone as a recording source
2) GoPro Hero4 Silver mounted on my chest with a Sony ECMCS3 microphone.

Good points: both cameras capture at full HD (1920×1080) resolution, and the microphones are by and large capturing important conversations. The video is giving me lots of things to look at.

However, I’ve been plagued by lots of (mostly technical) problems. I have been trying for weeks but I still haven’t found a good method for capturing the video reliably.

Problems encountered so far

  • GoPro on my chest is mounted in such a way that if I bend down to talk to students, I just get a shot of the floor or table. I constantly have to remember to position my torso so I get a good view of students’ work and faces.
  • The Swivl camera is pretty disappointing: it tracks an IR remote that I wear on a lanyard, so it needs line of sight to me. In theory it should always have me in the shot, but in practice I’m out of frame half the time. The only solution I’ve found so far is to mount it higher above the students, tilted down slightly.
  • Audio quality is also pretty bad from the Swivl camera.
  • The GoPro is supposed to have a feature where I can mark “highlight” moments using a remote control that I wear on my wrist. I haven’t figured out how to export those markers in a useful way. (I experimented with the “video loop” recording mode, but that isn’t really helpful because you only get 5 minute segments.)
  • Watching the footage from the GoPro makes me nauseous. I am using ffmpeg vidstab to stabilize the video, but it takes hours to process the footage.
  • Adobe Premiere Pro has a really steep learning curve. If I have to watch another video to figure out how to do something (that I think should be) simple I am going to scream! I’m also behind one version (CS6 instead of CC) and this older version doesn’t have the nifty feature of aligning two clips of video automatically based on the audio tracks.
  • HD video is great, but the processing time is long, in general. Even moving video files between devices is slow. (Protip: Instead of plugging the GoPro into the computer using a USB cable, take out the SDHC card and stick it in the USB adapter and plug in directly to the computer. Waaay faster. Whoa.. And about 10x faster than that is to put the MicroSDHC card into a SDcard adapter, then put it into a SDcard slot in your computer. I get about 42MB/s transfer rate that way on my computer.)
  • I am encoding the finished video using H.264. However, if I set the encoding compression too low, the video ends up being huge (15GB). If I set the encoding compression too high, then I can’t make out what is on students’ papers.
  • Both cameras split long recordings into roughly 4GB chunks (a holdover from the FAT32 file-size limit), which adds complications to video processing. On my Nexus 5X, I think it skips a few seconds of video when it transitions from one file to another while recording.
  • I look pretty ridiculous wearing all this equipment.
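
Several of the items above (rejoining chaptered files, vidstab stabilization, H.264 encoding) can be scripted so they run unattended. Here is a sketch of such a pipeline, not my exact setup: the clip filenames are hypothetical, and the vidstab filters require an ffmpeg build that includes libvidstab.

```python
import os
import shutil
import subprocess

# Hypothetical chapter files from one recording; substitute your own.
clips = ["GOPR0001.MP4", "GP010001.MP4"]

# 1) Write a concat-demuxer list so ffmpeg can rejoin the ~4GB chunks losslessly.
with open("clips.txt", "w") as f:
    for c in clips:
        f.write(f"file '{c}'\n")

cmds = [
    # Rejoin the chapters without re-encoding.
    ["ffmpeg", "-f", "concat", "-safe", "0", "-i", "clips.txt",
     "-c", "copy", "joined.mp4"],
    # Pass 1: analyze camera shake (needs ffmpeg built with libvidstab).
    ["ffmpeg", "-i", "joined.mp4",
     "-vf", "vidstabdetect=shakiness=8:result=transforms.trf",
     "-f", "null", "-"],
    # Pass 2: stabilize, then encode H.264 at a middling quality (CRF 21)
    # as a compromise between file size and legibility of students' papers.
    ["ffmpeg", "-i", "joined.mp4",
     "-vf", "vidstabtransform=input=transforms.trf:smoothing=30",
     "-c:v", "libx264", "-crf", "21", "-preset", "slow",
     "-c:a", "copy", "stabilized.mp4"],
]

# Only run the heavy lifting if ffmpeg and the source clips are present.
if shutil.which("ffmpeg") and all(os.path.exists(c) for c in clips):
    for cmd in cmds:
        subprocess.run(cmd, check=True)
```

The two-pass vidstab analysis is what takes hours on full-HD footage; scripting it at least means it can run overnight instead of while I watch.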


This is a work-in-progress post. I hope to have more technical issues sorted out soon.

Explanatory Power of the Hierarchy of Student Needs

In a previous post, I wrote about how a tweet from Lisa Bejarano led me to adapt Maslow’s Hierarchy of Needs to better describe the needs that students have in our classrooms.

Since that time, I’ve been thinking about this hierarchy of student needs a lot. I’ve been trying to find scholarly writings on the subject (haven’t found much), and I’ve given several talks featuring these ideas. I’m ready to share these ideas more widely because I think these ideas could be much more fruitful than I originally realized.

A Hierarchy of Student Needs

Fig 1. Mapping of Maslow’s Hierarchy of Needs to a hierarchy of student needs in the classroom.

According to several colleagues who are psychologists, Maslow’s Hierarchy of Needs is a well-respected theory for human motivations. It’s included in most introductory psychology courses. One colleague told me that the theory is so sensible that he finds it difficult to imagine how it might not be true.

Here is how each category of human needs might translate into student needs in our classes.

Physiological. Thankfully, many of us are blessed to teach in relatively comfortable environments where we are properly sheltered from the elements. Sometimes the air conditioning makes my classroom too hot or too cold, but by and large I don’t have to worry about my Mudd students’ physiological needs, with the exception that they are often sleep deprived. However, we should remember that far too many students in the United States are food insecure and worry about having a comfortable place to call home.

Safety. Maslow’s original hierarchy had to do with personal and financial safety. While you wouldn’t think that personal safety is an issue that we instructors have to worry about, we must not forget that sexual violence is a big problem at many colleges and universities around the country.

Besides the need for safety from bodily harm, I think there are two other forms of safety to consider in the mathematical classroom: emotional and intellectual. Lisa’s tweet shows that her student isn’t afraid of being made fun of, being criticized, or being outed to others (except for the students’ mother!). That kind of emotional safety is crucial for students to be open to learning in a classroom. For example, would a student who wears a hijab feel safe in your classroom from ridicule or teasing? Respect for others begins with the instructor. Does the instructor make jokes that single out certain students or groups of students? Does the instructor speak disparagingly about certain groups of people (intentionally or not) or send messages about which groups of students are more or less competent (intentionally or not)? How does the instructor respond to microaggressions and “larger” instances of aggressions, perpetrated by students or the instructor?

All of the protests that have been taking place at colleges and universities over the last few months have underscored the fact that students need emotional safety. It’s disheartening to see how some people are disparaging the need for “safe spaces” by equating them to cocoons where students can never get hurt by disagreements or criticism. The truth is that at many colleges and universities, students of color and other marginalized groups of students go to school in fear of ridicule and scorn. Clearly, that is not an optimal environment in which students can learn.

Intellectual safety is possible when you feel that your ideas are valued by others even if they are incorrect or there is disagreement. Instructors wield a lot of power and respect in the classroom, and that power and respect can be misused. For example, one student shared with me his painful experience asking his professor a question in class. Students were working on a worksheet in groups, and this student had a question that none of his classmates could answer. The student raised his hand, and the professor came over and exclaimed “Oh c’mon!” in a condescending and dismissive tone. According to the student, the professor’s response was so loud that other students in the room noticed and a few gasped in shock. The student got into a verbal altercation with the professor about why the professor felt the need to get mad when asked that question. Things did not turn out well. The student was very discouraged and highly unmotivated to learn in class. The student failed the class.

This anecdote reminds me that the way that I respond to incorrect answers and student questions is extremely important. No matter what teachers say to students, the ugly truth is that, yes, students can ask stupid questions. However, it is usually not appropriate to let students know that they have asked a question that they should be able to answer given what they know, especially when students are not confident in their own abilities. I may not respond as severely as the professor in the anecdote above, but I might still send subtle signals to students just from my body language or tone. I’ve learned to develop a poker face that hides my internal reactions to students’ answers or questions. I try to react to all students’ questions and answers in the same way. (That means modulating my reaction so that I don’t smile approvingly at a correct answer and then fail to smile at an incorrect one.) By the way, this poker-face approach also has the nice side effect that students don’t know whether their answers are right or not, so I am able to more authentically press them for their justifications.

If you use group work in your classes, intellectual safety is a prerequisite for students to participate. Group work can be scary because it’s an opportunity for students to reveal to each other what they know and understand. I learned this lesson in my course on partial differential equations last semester. On an exit ticket one day, a student wrote this: “Right now, I’m insecure enough about solving problems that the pressure of group work makes me shut down, which only makes it worse.” This student’s lack of intellectual safety prevented him/her from working with others on in-class tasks. Ouch! I felt horrible.

It’s inevitable that students will have different levels of preparation and skill in our classes. What we need to do is to make it clear to our students that there are many different ways to be successful in our classes. It’s not just speed of calculation that makes one better at math. One can be great at observing patterns. One can be great at visualizing functions. One can be great at generalizing ideas. However, if most of our mathematical tasks are procedural and computational, then we risk collapsing mathematical proficiency into a single axis. Valuing all of those different axes is difficult to do in every single task that we assign, but I think it is very possible to make explicit the different ways that students can be successful in mathematics at different points during a semester or unit. (Also, I have written here about how speed and automaticity are often conflated.)

One important thing I’ve realized since my last post is that different groups of students can have very different needs in my classroom. It is well known that men tend to have more confidence in their ability to complete tasks based on their own assessment of the necessary skills, whereas women tend to doubt themselves even when they have the same skill level. This effect leads to a tendency for women to feel less intellectually safe in a mathematics classroom than men.

Love/Belonging. One could perhaps talk about love in a way that makes sense in the mathematics classroom, but I choose instead to focus on students’ sense of belonging, since that seems to me a more straightforward idea. (Please see my previous post about radical inclusivity.) I believe that there are at least three different levels of belonging that are relevant here. Students need to feel that they belong to (1) the mathematics classroom, (2) a larger community of practice of mathematicians, and (3) any groups that they’ve been assigned to. (The latter will only apply to you if you do group work in your classes.)

I think that there are lots of things that instructors can do to help students feel a sense of belonging. If you care about your students and look after their well-being, you are probably doing things for them that increase their sense of belonging. And, I bet that many of us do these things instinctively without realizing how much they positively affect our students. Here are some examples.

  • Simply by learning students’ names (and learning how to pronounce them correctly), we give our students a sense that they belong in our class.
  • A teacher’s sense of humor can sometimes be a great tool for helping students feel a sense of belonging–when you’re in on a joke that only those in your class can understand, that helps you feel like you’re part of a community. This video of a manatee features prominently in all my classes. It’s silly, but it’s also memorable and super effective at building community.
  • Group work can lead to amazing results, but when implemented poorly it can also lead to disastrous results. When left untreated, status issues in a group of students can lead to students feeling excluded from the group. (Lani Horn has suggestions for addressing status issues here.)

As with emotional/intellectual safety, I have thought a lot about how different groups of students experience different levels of belonging in my classrooms. Any group of students that is poorly represented in my classroom will naturally feel less belonging than the majority students do. Underrepresented students will probably also feel less belonging to the mathematics community as a whole because they don’t see many people like them getting tenure, attending conferences, winning professional awards, or publishing research.

The messages that we send to our students about belonging are subtle but important. Sometimes we have the right intentions but we do things that make students feel like they don’t belong. For example, Lauren Aguilar, Greg Walton and Carl Wieman point out that if you continually tell a student that she can succeed and you don’t tell that to other students, that student might begin to wonder whether you doubt her ability to succeed. Or if you set up additional office hours and make a special effort to invite women and minorities to attend, you might be sending the message that you think all women and minority students have less adequate preparation.

Esteem. The Wikipedia page on Maslow’s Hierarchy of Needs describes esteem as “a need to feel respected… to have self-esteem and self-respect.” The analog of esteem in the mathematics classroom is a student’s self-concept as a learner of mathematics. Every time a student is presented with a mathematical task, that student’s self-concept is activated in the form of an appraisal of her/his own abilities as a learner of mathematics based on prior achievements, comparisons with peers’ abilities, and perceptions of the mathematical task at hand. That appraisal gives the student confidence or reluctance to take on the task. When I taught high school, I noticed that students with low self-concept would become disruptive or disengaged when presented with a task that they thought they would not be able to complete. These behaviors, unconsciously or consciously, help the student avoid the possibility of failure and public or private shaming from the teacher so as to preserve his/her self-esteem. (I found this old blog post from 2009 that gives a specific example of this.) At Harvey Mudd, instead of disengaging or becoming disruptive, students with low self-concept will procrastinate on their work and rationalize their poor performance as due to lack of time devoted to the task.

It is important to note that self-concept correlates very strongly with student performance, and may be among the variables most strongly correlated with it. (See John Hattie’s book Visible Learning.) One of the most important ways that we can attend to students’ self-concept is by using formative assessment to build a detailed understanding of students’ skills and understanding, and to give them tasks that are appropriate to their level of mathematical development.

Self-actualization. Ok, if you’re like me, you probably approach this word with a little hesitation about sounding self-helpy… But according to Maslow himself, self-actualization refers to the desire to accomplish everything that one can, to become the most that one can be. That makes a lot of sense to me. When one has achieved a certain level of success with mathematics, I think it is natural to wonder what else one can achieve and then to try to do it. That need to test one’s boundaries is a wonderful human characteristic.

Maslow theorized that the four foundational needs (physiological, safety, love/belonging, esteem) are “deficiency” or “basic” needs (see Fig 1) in the sense that those needs must be met before an individual will strongly desire any higher-level needs. In other words, if I do not have enough food or water to survive, I am going to spend most of my energy making sure that I can meet those needs first before I think about my physical safety and love/belonging. People who don’t have these deficiency needs met will feel anxious. I imagine that the same is true about students’ needs in the classroom. If a student doesn’t feel intellectually safe, it’s a safe bet that the student will feel anxious and that anxiety will weigh on the student’s ability to learn.

How the Hierarchy of Student Needs Relates to Equity and Inclusion

Since my last post, I’ve thought a lot about how this hierarchy of student needs relates to equity and inclusion. The key insight that I come to over and over again is how different groups of students in my classes have different levels of needs. Majority students tend to come to my classes with a more secure sense of intellectual and emotional safety and sense of belonging to the classroom. Perhaps that is one reason why we often find differences in educational outcomes when we aggregate students into different groups.

One of the unsolved questions in education research is why certain classroom interventions seem to have disproportionate effects for different groups of students. In their paper “Active learning increases student performance in science, engineering, and mathematics,” Freeman, et al., present one of the strongest pieces of evidence to date about the positive effects of active learning:

“In addition to providing evidence that active learning can improve undergraduate STEM education, the results reported here have important implications for future research. The studies we metaanalyzed represent the first-generation of work on undergraduate STEM education, where researchers contrasted a diverse array of active learning approaches and intensities with traditional lecturing. Given our results, it is reasonable to raise concerns about the continued use of traditional lecturing as a control in future experiments … The data suggest that STEM instructors may begin to question the continued use of traditional lecturing in everyday practice, especially in light of recent work indicating that active learning confers disproportionate benefits for STEM students from disadvantaged backgrounds and for female students in male-dominated fields.”

A general consensus is building that many active learning strategies improve learning outcomes for all students, but that they also improve learning outcomes disproportionately for women and underrepresented students. Here are three examples of documented cases of exactly that:

(Each of these articles is worthwhile to read. I would appreciate it if others can point out additional examples of this kind of research.)

Of course, these studies are empirical. They observe that these disproportionately positive effects occur for disenfranchised or marginalized groups of students, but they can’t explain why that happens. Could this hierarchy of student needs be that explanation?

Suppose you have a teacher who implements group work effectively in a mathematics classroom. In this scenario, imagine what happens to a woman who initially comes into the class doubting her abilities. During the course of doing mathematics with other students in class, this woman realizes that others are having the same struggles too, or that she’s actually more capable than she realized. That realization increases her sense of intellectual safety, sense of belonging to the class, and self-concept as a learner of mathematics. The same thing could happen if an instructor used clickers/plickers/etc in class along with questions that generate meaningful dialogue and surface common misconceptions.

Suppose you have a teacher who is really great at orchestrating classroom discourse. Imagine an African-American student in this teacher’s class, who has received lots of signals that previous teachers have doubted his mathematical ability. This teacher is great at making sure every student’s idea is taken seriously and is worthy of consideration. One day, the student tosses out an idea that some students initially dismiss, but the teacher carries it out to its logical conclusion and finds it to be innovative and correct. The student’s self-concept as a learner of mathematics increases as he begins to realize that perhaps he’s skilled in mathematics in a way that he and others have never appreciated, and he begins to call into question all of the previous signals he’s received from others. Other students take notice of his abilities too, and that increases his sense of belonging in the class.

I’m sure you could come up with similar scenarios too.

Once we accept that certain teaching practices (for example, active learning strategies) that happen to be very effective also happen to promote greater equity and inclusion, we arrive at this question: Is inclusive teaching the same as effective teaching? I believe the answer is yes, but only in part.

Inclusive teaching is a set of principles, goals, and practices, grounded in research, experience, and commitments to social justice. A large subset of these principles, goals, and practices could easily also be described as effective teaching. And in fact, it may be difficult to distinguish one from the other simply by looking at a sample of teaching practices. (I wrote more about this here.)

Inclusive teaching adds to effective teaching a framework for understanding why teaching is effective, along with an intentionality of producing more equitable outcomes for students. A faculty member may teach effectively without consciously considering inclusiveness, but by being more intentional about the desired outcomes of learning and designing every aspect of the learning to address students’ needs, they could help to create even better results.

These ideas seem so natural to me and yet I feel like I’ve just scratched the surface. There is more to uncover and think about, I’m sure. For example, if this hierarchy of student needs can help to explain why different teaching strategies lead to different results for different groups of students, then perhaps researchers should measure students’ sense of safety, belonging, and self-concept along with their learning outcomes when they compare different interventions.

Added Aug 2016: Here are some references that connect to this topic.

Baran, E., Correia, A. P., & Thompson, A. “Tracing successful online teaching in higher education: Voices of exemplary online teachers.” Teachers College Record, 115:3 (2013), 1-41.

Call, Carolyne M. “Defining Intellectual Safety in the College Classroom.” Journal on Excellence in College Teaching 18.3 (2007): 19-37.

Cohen, G.L., Garcia, J., Purdie-Vaughns, V., Apfel, N. and Brzustoski, P. “Recursive processes in self-affirmation: Intervening to close the minority achievement gap.” Science 324.5925 (2009): 400-403.

Cook, J.E., Purdie-Vaughns, V., Garcia, J. and Cohen, G.L. “Chronic threat and contingent belonging: protective benefits of values affirmation on identity development.” Journal of personality and social psychology 102.3 (2012): 479.

Demirdag, Seyithan. “Management of Errors in Classrooms: Student Mistakes and Teachers.” International Journal of Humanities and Social Science 5:7 (2015): 77-83.

Edmondson, Amy. “Psychological safety and learning behavior in work teams.” Administrative science quarterly 44.2 (1999): 350-383.

Guzzetti, Barbara J., and Wayne O. Williams. “Gender, text, and discussion: Examining intellectual safety in the science classroom.” Journal of Research in Science Teaching 33.1 (1996): 5-20.

Harackiewicz, J. M., Canning, E. A., Tibbetts, Y., Giffen, C. J., Blair, S. S., Rouse, D. I., & Hyde, J. S. “Closing the social class achievement gap for first-generation students in undergraduate biology.” Journal of Educational Psychology, 106:2 (2014), 375.

Johnson, Christopher M. “A survey of current research on online communities of practice.” The internet and higher education 4.1 (2001): 45-60.

Kolb, Alice Y., and David A. Kolb. “Learning styles and learning spaces: Enhancing experiential learning in higher education.” Academy of management learning & education 4.2 (2005): 193-212.

Kunc, Norman. “The need to belong: Rediscovering Maslow’s hierarchy of needs.” Restructuring for caring and effective education: An administrative guide to creating heterogeneous schools (1992): 25-39.

Paunesku, D., Walton, G. M., Romero, C., Smith, E. N., Yeager, D. S., & Dweck, C. S. “Mind-set interventions are a scalable treatment for academic underachievement.” Psychological Science 26:6 (2015): 784-793.

Schrader, Dawn E. “Intellectual safety, moral atmosphere, and epistemology in college classrooms.” Journal of Adult Development 11.2 (2004): 87-101.

Sherman, D.K., Hartson, K.A., Binning, K.R., Purdie-Vaughns, V., Garcia, J., Taborsky-Barba, S., Tomassetti, S., Nussbaum, A.D. and Cohen, G.L. “Deflecting the trajectory and changing the narrative: how self-affirmation affects academic performance and motivation under identity threat.” Journal of Personality and Social Psychology, 104:4 (2013) 591.

Steuer, Gabriele, Gisela Rosentritt-Brunn, and Markus Dresel. “Dealing with errors in mathematics classrooms: Structure and relevance of perceived error climate.” Contemporary Educational Psychology 38.3 (2013): 196-210.

Walton, Gregory M., and Geoffrey L. Cohen. “A brief social-belonging intervention improves academic and health outcomes of minority students.” Science 331.6023 (2011): 1447-1451.

Yeager, David S., and Gregory M. Walton. “Social-psychological interventions in education: They’re not magic.” Review of Educational Research 81.2 (2011): 267-301.

PDEs Course: Wrap Up and Reflection

I haven’t been posting much because of the busy-ness of last semester. Now that grades have been submitted, I’ve been reflecting on the partial differential equations (PDEs) course that I taught. (All previous posts about this class can be found here.)

I firmly believe that we must evaluate our own teaching if we want to improve, and that one of the best ways to gather data on our teaching is to ask our students. Students aren’t always the best judge of how much they have learned, but I trust my Mudd students’ ability to tell me about their experiences and opinions about the course. Here is what students said in their comments on an end-of-semester evaluation survey.

Comments about students’ perceptions of the course and their overall experiences:

I really appreciate your comments at the beginning of class that you realized that most of the applications you were planning on presenting were thought up by dead white guys and that that might cause some people dismay. I think that recognizing that that’s a problem in math/science that permeates into classrooms is important, and you saying that out loud helped me feel more like I belonged in the classroom even if I am not a white male.

This class somehow made me enjoy solving PDEs even though the past three DE classes I’ve taken convinced me I just really hate DEs. Nice job.

I started solving random PDEs in my free time, from which I deduce the class was pretty interesting.

I also want to thank you for being such a dedicated teacher. Your lectures and notes in numerical analysis and PDEs were effective in delivering material and it never felt like there was anything “hidden” about the subjects that I couldn’t figure out without some closer reading. I also think that the structure of this class was awesome, because it encourages you to learn all aspects of the subject, even if you missed that specific part of the semester.

I feel that [Prof. Yong] is extremely understanding and is very approachable to students, regardless of how comfortable they are with the material.

Overall, I noticed that student engagement was high. I was really pleased that I was able to change some students’ opinions of differential equations.

On the end-of-course survey, I asked students to indicate their affinity for the following statements on a scale from 1 (strongly disagree) to 5 (strongly agree). (A total of 32 students responded to the survey, though not every student responded to every question.)

  • “The students and instructor for this course created a welcoming community of learners.” Average response: 4.59 / 5
  • “In this class, I was able to express myself (whether it was to answer a question, or to say that I didn’t know how to do something) openly without judgment or ridicule from my instructor.” Average response: 4.81 / 5
  • “I generally felt secure and confident to speak in this class (to answer a question or ask a question or something else) when I wanted to.” Average response: 4.44 / 5
  • “I feel secure and confident to speak in my classes at Mudd in general.” Average response: 4.13 / 5
  • “I feel like an outsider in this class.” Average response: 1.68 / 5
  • “The instructor was respectful to all students in the class.” Average response: 6.91 / 7

Student comments about the proficiency assessment system:

Since the proficiency assessment (PA) system was a big change for my students and me, there were naturally lots of comments about this part of the course. I tallied up all of the comments I received about the system: 16 favorable, 4 negative, and 1 mixed.

Here’s a selection of the positive feedback:

The proficiency assessments are spot on in encouraging learning–I feel like I’ve learned from them while satisfactorily representing my understanding.

I’ve never had a class where I could schedule tests like this one, but wish I had! I have gotten a lot out of this freedom to prepare when I have time, and to take more advanced versions once ready for them.

Also, there seems to be the philosophy that students deserve credits when he/she knows how to solve the problems and not only when he/she can solve the problems within the exam time. I feel like this grading philosophy is more applicable to the work in real life.

Since I could take it many times, I didn’t care much about getting the right answer. Instead I was able to focus more on improving myself each time I take the PA.

I like how the proficiency assessments encourage me to understand the material at my own pace in a less stressful way.

There were several classes that I’ve taken at Mudd where I didn’t learn the material by the time the exam came, and so there was no reason for me to learn it afterwards. I also like the ways in which [the PA system] discouraged cheating: since you have the opportunity to retake exams, there is less of an incentive to be dishonest about it. For me it was also great because it meant that it was never too late for me to try to catch up. One of the most depressing situations that I encountered at Mudd was having several exams during one week and feeling like I could have performed better if the schedule had just worked out differently. I personally think that your curriculum is the right direction for a lot of the classes at Mudd, so thanks for being willing to try something new out.

On the whole, I think I was able to meet my objective of coming up with a system to assess students’ understanding but in a way that gave students more agency and flexibility, and that promoted students’ growth mindset about learning PDEs.

On the end-of-course survey, I asked students to indicate their affinity for the following statements on a scale from 1 (strongly disagree) to 5 (strongly agree).

  • “The proficiency assessments fairly assessed my understanding of PDEs.” Average response: 4.56 / 5
  • “I was able to demonstrate my understanding of the course material through the proficiency assessments and final exam.” Average response: 4.50 / 5

I will definitely use this system again in the future, but I need to figure out how to reduce the impact on my time and on students’ time. Also, I need to think more carefully about how students’ grades are calculated based on their assessment scores.

I ended up writing four sets of proficiency assessments, on four different subtopics, with a standard and an advanced assessment for each subtopic. The maximum score attainable on a standard assessment was 23/25, and the maximum on an advanced assessment was 25/25. The standard assessments contained tasks that I would have used as final exam questions; the advanced assessments contained more challenging tasks involving novel situations that students had not encountered before, but that they could handle by digging deeper into the course content. My rationale for this arrangement is that the standard level of attainment corresponds to 92%, which in my mind is an A-. To get an A in a course, I feel that some effort above and beyond the normal set of expectations is required. I required students to reach the standard level of attainment before attempting a more advanced PA. This had the effect of requiring students to take at least eight separate assessments if they wanted to reach the advanced level of attainment on all four subtopics.
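The arithmetic behind this two-tier scheme can be sketched in a few lines. This is purely my own hypothetical illustration of the caps and the gating rule; none of these names or functions come from the actual course materials.

```python
# Hypothetical sketch of the two-tier PA scoring scheme.
# Constants and function names are illustrative, not from the course.

STANDARD_MAX = 23   # max score on a standard PA, out of 25 (= 92%, roughly an A-)
ADVANCED_MAX = 25   # max score on an advanced PA, out of 25 (= 100%)
SUBTOPICS = 4

def percent(score: int, out_of: int = 25) -> float:
    """Convert a raw PA score to a percentage."""
    return 100.0 * score / out_of

def may_attempt_advanced(standard_score: int) -> bool:
    """A student must reach the standard level before trying the advanced PA."""
    return standard_score >= STANDARD_MAX

# Maxing out a standard PA yields 92% for that subtopic and unlocks
# the advanced version; full advanced attainment on every subtopic
# requires at least two sittings per subtopic.
print(percent(STANDARD_MAX))        # 92.0
print(may_attempt_advanced(23))     # True
print(SUBTOPICS * 2)                # 8 assessments minimum for full advanced credit
```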

And here’s a selection of the critical and mixed feedback:

I’m really mixed about the PA format. On the one hand, I think I’ll probably have learned the material better than usual by the end of this course, simply because I’ll have had more tests on the subject scattered throughout the semester. However, I also feel like the PAs took up so much time (both for the student and the instructor grading them) that they caused more stress than a midterm would have.

I feel that the course had too many disparate requirements. The presence of PAs, homeworks, a final test, and a final project made it an overwhelming experience. I would advocate for a more efficient PA system that doesn’t require as much time outside of class, since the current PA system feels more like it’s testing students for how much of their free time they can sacrifice as opposed to their actual knowledge.

I liked the concept behind PAs, but found it somewhat annoying that in order to get full credit on all of the PAs you would have to schedule 8 different time slots. One idea I had to ease up on this sheer time commitment would be to have on each PA the option to choose between the 23 point problem or the 25 point problem.

I don’t feel that the PA system ended up fairly assessing my understanding. I feel that they got close to providing a good assessment, but with the time restrictions I’ve faced this semester, I was unable to take the advanced PAs. I had a very busy schedule this semester, and I was sick for multiple weeks in the middle of the semester, so I didn’t get to take my last regular PA until the end of finals week.

I got perfect scores on the 3 PAs I’ve gotten back so far, and I suspect I also did perfectly on the last PA I took this morning. But I just didn’t have the time to attempt the advanced PAs, especially given that my score isn’t even guaranteed to increase, so it was very hard to justify adding something else into my already exhausting schedule. I feel that I have a very strong understanding of the material, but my time restrictions only allowed me to demonstrate a “basic” understanding.

Personally, I feel that scheduling time for four 90-minute assessments isn’t a huge burden on students, but some students clearly felt otherwise. One thing I will try in the future is to set aside some class time for students to take the proficiency assessments, so that they don’t have to use out-of-class time.


Overall, I’m really pleased with how the course went, even though it was my first time teaching the course and I was experimenting with lots of new ideas. I tried to attend to students’ sense of belonging in the class all semester, and I think I was successful at that.

I consider the proficiency assessment experiment a strong success, and I want to continue to refine and improve it. One important side effect of the proficiency assessment system is that I got to know all of my students much better than I normally would have. Another side effect is that the system enabled some students who would probably have failed the course otherwise to pass and do well. For example, one student who had some family issues and was absent for almost 2/3 of the class was able to finish the course on time.