If you haven’t read this editorial about “What ‘good’ dads get away with,” please do. It’s about the “Myth of Equal Partnership.”
The best (and worst) ways to respond to student anxiety
Someone measured the disregard that natural scientists hold for research in the social sciences. You can imagine how this article is being received by the people they studied.
Here’s a column explaining how many peer reviews are formally accepted by PIs when the journal invites them, but are actually ghostwritten by lab members. I think journals could do a lot more to acknowledge this and deal with it explicitly. (As an editor, I often try to directly invite postdocs and doctoral candidates to perform these peer reviews, but sometimes when I’ve invited PIs, the PIs will write back and ask if it’s okay to do the review together with a member of their lab. How often do they do this and just not say anything?)
One of the better contributions to Quit Lit (in small part because it acknowledges its place in the genre of quit lit)
A comprehensive study examines who the successful “bullshitters” are: people who claim expertise in fields they know nothing about (in fact, in fields that do not exist). Guess who bullshits the most.
The editorial board of the Dartmouth paper publishes an important editorial that is relevant to so many other institutions.
How can one get duped into publishing in a predatory pseudojournal? One person explains their story. It’s not as farfetched as you might imagine.
For example, this article lays out a very long and horrific history of a professor at the University of Georgia. “In his defense, Kazez said the undergrad had been dressed provocatively.” It’s absolutely disgusting that the university could fail to act over so many years, with such a preponderance of evidence.
A belief in meritocracy is not only false: it’s bad for you
We’ve run out of elections to waste
Scimeter: custom author metrics
How about a W.E.B. Du Bois ggplot theme?
An interesting research study on the impact of exam study guides, and of their absence. The results are counter to what I’d have expected. But you know, when the findings of an experiment disagree with what we would expect, we typically say the study needs to be replicated, rather than take it to heart. I think how a teaching method is implemented is critical for its success, so I really do wonder about replication here. Hmm.
I’d be curious to see the exams and study guides used in that study. My feeling has been that the problem with “concept” study guides is that they tend to be mismatched with the exam: general concepts on the guide and very specific fact regurgitation on the test itself, which makes it easy to study the wrong thing or misjudge the level of detail and understanding expected. This would work fine if most exams focused on concepts and on explaining your knowledge in your own words, but it fails when exams emphasize memorization of specific facts or the professor’s precise wording (e.g., the study guide says “understand the process of photosynthesis” and the exam asks you to draw the structure of a specific molecule produced). I suspect it’s less about student study habits (though study skills are important!) and more about mismatched study guides providing inadequate information for students with limited time and resources to prioritize their efforts.