I’m sure the title got your attention: I have some troubling data.

For years, in my principles of macroeconomics course, I’ve given a pre-test on the first day: ten questions drawn from the whole semester the students have yet to take (or maybe just pass the first time). It counts towards their grades, so I think they take it seriously. But honestly, I expect them to guess on most questions.

There has never been a male/female pattern in points scored on that quiz.

This year, for the first time, I also gave FINRA’s six-question Financial Literacy Quiz under similar conditions. I think this has broad applicability to macroeconomics too.

Here are the disturbing results. In a simple regression of test scores on a dummy variable that took a value of one if the student was female (plus an intercept term), I got the following estimate for the coefficient on that dummy from the “comprehensive” pretest:

-0.21 (s.e. = 0.32, t = -0.67)

So, no evidence of a gender difference. But for the FINRA quiz I got:

-0.42 (s.e. = 0.20, t = -2.13)

This indicates not only that my female students averaged fewer correct answers (on a six-question quiz), but also that the standard error is smaller, indicating that for this material the estimate is sharper. The overall class average was 4.4 out of 6, so a difference of –0.4 is pretty large.
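For readers less familiar with dummy-variable regressions, here is a minimal sketch of the setup (using made-up scores, not my class data): with only an intercept and a female dummy, the estimated coefficient on the dummy is exactly the female-minus-male difference in mean scores.

```python
import numpy as np

# Hypothetical quiz scores out of 6 -- illustrative only, not the actual data.
male   = np.array([5.0, 4, 6, 5, 3, 5, 4, 6])
female = np.array([4.0, 5, 3, 5, 4, 4])

scores = np.concatenate([male, female])
# Dummy variable: 1 if the student is female, 0 otherwise.
dummy = np.concatenate([np.zeros(len(male)), np.ones(len(female))])

# Regress scores on an intercept and the dummy via ordinary least squares.
X = np.column_stack([np.ones_like(dummy), dummy])
(intercept, coef), *_ = np.linalg.lstsq(X, scores, rcond=None)

# With a single dummy regressor, OLS fits the two group means exactly:
# the intercept is the male mean, and the coefficient is the
# female mean minus the male mean.
print(intercept)  # equals male.mean()
print(coef)       # equals female.mean() - male.mean()
```

So a coefficient of –0.42 on the FINRA quiz says the female students’ average score was 0.42 points below the male students’ average.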

This is a disturbing result. The FINRA quiz is supposed to measure the basic financial literacy needed by adults. My “comprehensive” pretest serves as a decent control, suggesting that the FINRA quiz is picking up a real weakness in the financial literacy of female college students.