Confusion, in Praise Of

Estimated read time (minus contemplative pauses): 22 min.

On this website’s “About” page and in many blog posts, I praise confusion. I’m not alone. Here are four currently active philosophers endorsing confusion (or close enough). At the third philosopher, I’ll share some thoughts of my own on the topic.

Quayshawn Spencer

In an illuminating interview on the UPenn-based OMNIA podcast, in an episode called “Philosophy of Race” (3/27/19), philosopher Quayshawn Spencer closes with the following words in reference to a course he teaches on philosophy of race at UPenn:

“Ok, what is racism? What is race? Convince me that you’re right. Give me the argument.” And when you put the challenge to students like that and you give them these rules, “this is how you deductively, validly make your argument,” it definitely makes it more humble to see how difficult this is. A lot of times, students flip. It’s like, “I came here thinking that race was just a non-biological social construct, now I see I can’t defend that”… “I came in thinking that race was this biological thing that’s obviously real.”

In most cases, they start to appreciate the other side and to see what sort of evidence they need to make their argument more tight, to make it deductively valid. And if they can’t find it, then a lot of times they just end up in this gray haze of “I don’t really know what to think,” which is usually kind of where we want you to be at the end of a philosophy class.

[To hear more from Spencer, check out this excellent SCI PHI podcast interview, in which he discusses his personal background, the evolution of his views on the science and metaphysics of race, and his development as a scientist turned philosopher (specializing in science, biology, and race): Episode 37: Quayshawn Spencer (4/12/18).

I’ve also recently enjoyed and learned a lot from his (and others’) contributions to the thought-provoking collection What Is Race?: Four Philosophical Views.]

Bears repeating: a lot of times they just end up in this gray haze of “I don’t really know what to think,” which is usually kind of where we want you to be at the end of a philosophy class.

David Albert

Spencer’s not the first philosopher I’ve heard, or heard of, talking that way. A 2010 student review of David Albert’s philosophy of science course (at Columbia University) quotes Albert (a philosopher with a PhD in theoretical physics):

At the beginning of the semester, Prof. Albert told the class, “You know you are in a good philosophy class if you leave the class more confused than when you came in.” I left Albert’s class absolutely more confused than I had come in, but the reason for my confusion was due to a radical confrontation of previously unexamined ideas I had about science and the way we experience the world.

(See this footnote for the full review from Culpa.com, Columbia’s internal teacher review site.1)

[To hear more from Albert, I recommend these recent discussions (with no shortage of disagreements) he’s had with physicist Sean Carroll: “Science Saturday: Time’s Arrow | Sean Carroll & David Albert” (from Bloggingheads.tv on YouTube, 6/17/2008); “Science Saturday: Problems in Quantum Mechanics | Sean Carroll & David Albert” (from Bloggingheads.tv on YouTube, 7/22/2008); “David Albert on Quantum Measurement and the Problems with Many-Worlds” (episode 36 of Carroll’s Mindscape podcast, 3/4/19).]

Amy Olberding

In the above quotes, Spencer and Albert are speaking of what happens in a good philosophy course. I assume the sentiment also applies outside the classroom. Which brings me to the third philosopher I’ll mention today.

In March 2018, University of Oklahoma professor Amy Olberding published on her blog a wonderful essay called “Resignation,” in which she discusses confusion and uncertainty in a non-classroom—or what we might call ‘real-world’—context.

The essay2 is a public resignation from blogging at Feminist Philosophers (or “FP,” which itself ceased to publish new content in May 2019) and, more broadly, from “high-traffic online philosophy culture” in general.

I can sorta relate. I disabled my Facebook account many months ago. And I’ve tried to curate a stimulating Twitter stream, with some success, but every time I go there I quickly find myself stepping into some noxious pool of vitriol that sends me racing for the ‘close’ button. Most troubling is that there’s something intoxicating about the vitriol. I have to force out the temptation to go back for more, have to force whatever I saw there, usually at the top of my feed, out of my thoughts.

It’s especially disheartening to see philosophers engaged in this putrefying enterprise. My personal experience as a semi-recent (I finished in 2015) philosophy student at two different institutions—a community college and an Ivy League3—gives me good reason to hope that this behavior doesn’t represent where the field in general has gone or is headed any time soon. As does Olberding’s blog post, in which I detect no desire to resign from the field itself.

At any rate, it’s not hard to see why Olberding in particular would want to retreat. Her most recent work, as she notes on her website’s “About” page (accessed 2/27/20), “aims to use early Confucian work on li as a foundation for thinking through contemporary issues surrounding ordinary manners and civility.”

I obviously want you to read Olberding’s essay itself, but will share some of my favorite bits here with some commentary. This is my second pass at this. The first pass went 5,500 words over what’s here now. Anything cut worth saying will be said another day.

In “Resignation,” Olberding characterizes the above-noted focus as the idiom from which she blogged during her five years at FP, where her posts “most often focus[ed] on intersections in feminism and civility or [were] spurred by [her] own commitments to civility.”

That idiom’s poor fit with—or at least its inability to provide a balancing element for, or positively influence—the website’s prevailing ethos seems to have eventually become too painfully glaring to ignore. Not to mention the mounting frustration that dissonance must have produced.

I’d venture to guess that she did hope to have a good influence, given that, even before she began blogging at FP, she had observed in the broader world of online philosophy heated and even inhumane conversations that then, as they do now, often favored “the quick and agile, the aggressive and insistent, people who like (or at least can ably engage) the rough and tumble of agonistic back and forth—and most of all those who are confidently certain.”

Pre-FP, she had avoided entering those conversations: “the rough and tumble mostly makes me sad and I often have a shortage of certainty.”

“Shortage of certainty” is rather close to what I mean by confusion (more on what I don’t mean by the word below). I take this to be a bold admission in Olberding’s world—in my and probably your world—in which

people who speak most and most insistently seem not only to be absolutely clear about what they think, but think there is no other legitimate, respectable, or even moral way to think… In too many contexts, to confess confusion or uncertainty is to confess deficiency—sometimes in philosophical acumen, sometimes in ‘smarts,’ sometimes in moral clarity, sometimes even in basic humanity.

“Too many contexts” indeed. Where this tendency comes from and why it seems to have become so concentrated and powerful in recent years, I’m not sure. But I will offer one possibility, and with some reservation (I’ve been tired of hearing theories about this at least since 2016).

My understanding is that the following sort of pedagogy has not always been standard, at least not everywhere. I don’t know the history of it in this country, but I do know that I’ve seen it taken for granted among American students from elementary school onward in recent years, and I know that college students from other countries have told me they find the approach awkward. It bothered me while in college. It shows up on standardized tests, in classroom discussions, and in writing assignments:

You are given a topic. You are asked to concoct a thesis. Preferably a strong and clear thesis; bonus points for a surprising one. You are asked to defend the thesis. You are not allowed to choose “I don’t know” as your thesis. My suggestion is that this sort of training has made its way into the broader discourse, particularly on social media, where it is very easily implemented. From there, it is organically creeping its way into everyday speech. And this is harmful.

1.1 Digression: A Thesis Thesis

Here’s some elaboration on the suggestion. It’s the one digression I’ll allow myself today, while disciplining myself to keep the word count, personal anecdotes, and external references to a minimum. The result is a partially formed rib cage of an in-progress skeleton of a larger argument.

I think we generally know we’re inventing stories when tasked to, say, describe the contents of a long-dead person’s mind. Especially when the story is concocted lazily or impatiently. At the same time, it’s easy to imagine that we can’t help but cultivate some partiality to any thesis we defend. There’s evidence that cognitive dissonance resolves in favor of behavior. I suppose this would be especially true when social norms encourage us to match our attitudes to the behavior in question.

Research has also suggested that those who observe us defending a thesis are likely to assume at least some credence on our part for the thesis, as in the classic 1967 Jones & Harris pro-/anti-Castro study. From there, it’s easy to imagine that the expectation to defend—to (pretend) to believe—a thesis, coupled with being treated as though we believe it, could affect our credence in it.

My suggestion is that the social norm at the moment is to match our attitudes to some concocted, politically motivated thesis (or at least to pretend to do so, should the cognitive dissonance not quite resolve), and to defend the thesis in roughly the same way, and with the same (apparent) conviction, as we are trained to do in school. And if not with (apparent) conviction, then at least with a demonstrable willingness to go through the obligatory motions (‘each student must respond to two other students…’ etc.).

The thesis need not be a personal or original one (and, again, you really only need to appear to believe it); in fact, it often shouldn’t be original: rather, it should be one currently in fashion. (I might like the game far more were it not for the element of fashionability; the game-ness of it would be more transparent, for starters.)

I’m probably overblowing the social impact of thesis-concoction pedagogy. Or maybe it’s even worse and more prevalent than I’m aware—has been, is, and always will be the case. Maybe it falls under a timeless practice of bad-faith or lazy or super-credulous (i.e., too fashion-prone) or intellectually unsubtle or power-hungry rhetoricians saying whatever it takes to win or to get themselves to sleep at night or whatever it is. Maybe I’m misreading things. Maybe I’m confused.

Whatever’s going on, it does seem to be going on more, or to be magnified, through online channels. As Ezra Klein put it in a recent public discussion with Ta-Nehisi Coates, shared as an episode of Klein’s podcast (The Ezra Klein Show, “Ta-Nehisi Coates on my ‘cold, atheist book’,” 2/16/20):

I think about this [the “weird way in which social media weaponizes the worst or dumbest thing happening at any given moment anywhere”] all the time in the campus political correctness fight. I went to UC Santa Cruz … and I adore UC Santa Cruz… but dumb shit happened there all the time, including by me personally. But it didn’t have a mechanism to go national in the way it does now.

Not to suggest Klein would agree with my present theory (though he’s no stranger to theory-weaving: careful and sincere theories, often prefaced with “I’ve been thinking about this a lot lately,” including a book-length one of his own in Why We’re Polarized (I haven’t yet read it)).

What I can safely say we—and by “we” I mean more than Klein and I—agree on is the concern around “weaponizing.” It doesn’t take a great portion of our population wielding that weapon to have a tremendous impact on lived lives and culture. How a relatively small group can get away with this would take yet more theorizing, which I’ll resist.*

[*It would have to do not just with complacency or fear or being otherwise busy, but also—and perhaps especially—with the spectacle of the thing, in what strikes me as the modern-day equivalent of gladiator fights, wherein spectators not only take pleasure in the suffering of others, but are granted moral permission to indulge that pleasure and, if they want it, to contribute to that suffering. Pointing to all the ways our society is set up to morally permit the creation and enjoyment of suffering in others is a big project in its own right, of which social media is but a subset, though perhaps the most efficient ever. I’ve kept track and made note of particularly egregious examples when I see them, and will one day share some portion of the list (these are not the usual instances of public shaming, of the sort Jon Ronson profiles in So You’ve Been Publicly Shamed (2015); they are uglier than this, in the deeper and danker corners of a vein whose more popular manifestation is the so-called Darwin Awards—the deep opposite of compassion, well beyond mere, pedestrian schadenfreude).

What’s special about social media, even more so than traditional media outlets, is its ability to cater to the thrill people (myself included) get when meeting their ideal enemy. This is the enemy we’re dying to face, the person who holds the straw man version of every evil belief we hate and who produces every argument we’ve been trained to defeat. The ideal enemy is the person for whom the otherwise fallacious logic holds that, if they believe A, they also believe B through Z: they hold the entire package of all the beliefs we hate and, like everyone, they are their beliefs.

This is the very sort of person whose suffering we understand ourselves to have permission to create, contribute to, and enjoy. Social media is designed in such a way—for bare starters, with its discretizing of personal attributes that Jaron Lanier warns us against in You Are Not a Gadget (2010)—that, if a person of interest is not that ideal enemy (they rarely are), we can shape them to be that through a kind of harmonizing between the virtual reality on our screen and that in our mind.]

Instead, here’s a final defense and the conclusion of my suggestion—of my thesis, fine, but one I believe has merit.

I’ve often seen the sincere-or-not concocting of elaborately argued theses built into academic training. A key feature there is to treat confusion as either irrelevant (I’ve more than once been advised to omit strong counterexamples) or as a sign of a not-yet-polished argument, rather than as a sign that the argument or its supporting assumptions are themselves confused. Some of this has to do with the demands of a publish-or-die profession, or with making it easier to grade (with high marks), or with activism, or with the mentor’s theoretical commitments and reputation (cradled perhaps in decades of delicate theory-weaving), or with who knows what. But there it is.

And for it to be there with the leaders (young or old) in a field is, I suppose, as old as epicycles. What strikes me as somehow more worrisome is its being demanded of every student, starting young. My untested hunch is that this is new. If not, then in a certain local sense, I may well be overblowing the impact of the phenomenon.

But social media, which is new, is not local. And that’s really the point here. It doesn’t take a lot of (conscious or unconscious) concoct-and-defend devotees—particularly loud ones, particularly of the sort described above by Olberding4 (though I don’t mean to imply she’d agree with anything I say here)—for consequences to ripple into the rest of our real-world lives. Particularly as we carry around those voices in our pockets twenty-four seven.

The upshot, finally, is this. I worry that the taken-for-granted practice of concoct-and-defend has moved from the classroom to the real world, where Tweets and such are the stuff to be interpreted, and those interpretations are viewed as a kind of mind-reading (in the sense of inferring a mental state, or worse: a system of beliefs, attitudes, dispositions). Add to this the casualness and spontaneity of the social media format—enabling us to Tweet while on the move, while tired, without thinking, while drunk, while drugged—and expressions there seem so close to being the verbalized utterances of private thoughts that they might as well be.

In this way, the exegetical training we get in school has, for many of us, made its way into actual human speech: “You claim to sincerely believe in and support and be dedicated to xyz, but what you really mean by your claim is obviously the opposite of all that.”** Why it’s so obvious may be grounded in specific behaviors, but they need not be subtle ones (e.g., the facile “Only someone who believes the opposite of xyz would claim xyz” distortion of over-protesteth-ing), as this would make the game too difficult to play and, perhaps more importantly, for spectators to follow.

[**I’m reminded of being told once over dinner: “Anything you believe is true, due to your believing it. For me, God exists because I believe in Him. But for you, God doesn’t exist because you don’t believe in him. So it’s true that God exists and doesn’t exist.” My response: “I believe that you don’t believe in God.” This earned a chuckle and ended the game. I’ve encountered plenty of less reasonable people who would have said “no problem” to that and doubled down on their bullshit. Or maybe it’s not bullshit, they really mean it. Which is worse? Some other day, I’ll share some of the examples of doubling-downers.]

It also means that academic practitioners of subjects that have until recently been obscure are put at an advantage: they are the trainees and the trainers. The experts. (Or expert enough to make the old Sokal hoax possible.) The chmess-masters in a world of players minimally schooled in the game, but schooled enough to take the game for granted. These experts write the rulebook and supply the referees for resolving the rulebook’s in-built contradictions.

But here is where the real problem perhaps lies. The rules must be popularized within 140-character and five-second-soundbite restraints. This isn’t Susan Sontag on photography and camp, or Edward Said on Orientalism and the intellectual in exile (both of whom are dear to me; choose your own examples†). It’s far simpler than this, something more like a gutless but artificially valorized shell whose elaborate decorations falsely advertise a complex, internally coherent system.

I hope I’ve said enough to suggest why such a thing—given to slogans and interpretive whims (where the loudest whim wins)—is dangerous. For now. I think people will tire of the game, grow bored of the spectacle, soon enough. But here we are.

[†For another day: It’s thought-provoking to notice how many towering public intellectuals passed away within a handful of years following the 9/11/2001 attacks, among whom Sontag (died 2004) and Said (died 2003) aren’t the only ones sorely missed. For a sense of what we’re missing, see Said’s 2003 preface to the 25th-anniversary reprint of his influential 1978 book Orientalism.]

On the other hand, perhaps the game is as old as old. My even deeper suggestion is that exegetical license is a requirement for social stability, most fundamentally in the interpretation of the great monotheist texts. And not just in order to get away from literalism. It’s astonishing to see the skill with which, for example, the scholastic philosophers applied their craft and trade.‡

[‡An excellent less-technical example is Anselm’s On the Fall of the Devil (also available here under De Casu Diaboli). Be sure to see Anselm’s final answer to the question on the last page. Keep in mind the mortal pressure that philosophers must have felt to arrive at an answer that doesn’t threaten the integrity of relevant theological frameworks (though I don’t know how much of that pressure Anselm himself felt).

“Integrity of frameworks” reminds me of an even older example. I recall learning that, in some ancient story or another, the ship of Theseus had to be the same ship, kept indefinitely in harbor, in order to please the gods. Philosophers found, to the relief of all, that it was indeed the same ship, even once every plank had been replaced. I don’t know where to find that, but the original story has its beginnings in article 23 of Plutarch’s The Life of Theseus, which you can find here. Hobbes later adds the question of what to make of the situation when the removed planks are used to build a separate ship.

And, for yet another day, is the thought that so many of the ancient questions aimed at sustaining the integrity of theological frameworks survive today, after much of that framework has deteriorated, as our Big Questions.]

So maybe I am observing symptoms here of a deeper, human cause, and not diagnosing the causes themselves. But anyway, there you have the half-assembled ribcage of a thesis towards a hermeneutics of social-media-age exegesis. (He says, leaning into the irony.)

1.2 What I Mean by Confusion

My antidote to the above-blathered phenomena is to embrace confusion (a natural state for me, I’m lucky to find). By which I don’t mean semantic confusion (e.g., imprecise language) or confusion for its own sake (e.g., intentional self-contradiction designed to interrupt or interrogate common sense, or even just in order to appear interesting in a boring world).

I mean, rather, a species of good-faith confusion: You’ve done your best to think a thing through clearly and with precise definitions and with due charitability (whatever that means for your conscience) granted to views from all sides5.

I mean by confusion whatever was meant by so many of the impressive students I met in college who would often preface their in-class questions with, “I’m confused…”

This is not to be confused with a confusion of a sort that amounts to a coy putdown from philosophers who say things like, “When philosophers talk about xyz, I haven’t the foggiest idea what they mean.” The implication being that there is nothing there to be understood, that it’s all nonsense, it’s not my fault but theirs. In putdown form, this generally means, I think, that the ideas in question are unsalvageable.

When not meant as a putdown, the sentiment might be more constructive. An admission of confusion can help root out poor accounts or less than ideally worded explanations. And, sure, sometimes nonsense, or close enough (for which I’ll thank you, as I have done more than once when rightly told “I think you’re confused about the meaning of such and such a word or concept, so fix it or snip it…”).

Decent stuff. But my favorite sort of confusion is an appropriate response to difficult complexes of ideas that, by virtue of their complexity, resist clear accounting.

Before returning to Olberding’s essay, I might as well point out a broader sort of confusion I don’t mean to praise here (though if you’re confused about something, I think it’s for the best for all involved that you feel safe saying so). Rather than try (and fail) to define this, I’ll give some intuitive examples. I don’t mean to praise confusion about certain general moral claims (e.g., “It’s wrong to murder innocent people”) or about basic sense data (e.g., about whether I’m sitting here typing this).

A call for confusion is a call for specificity, clarity, and nuance. If we come out the other side of such a process unconfused, that’s great, too. I suppose there’s a temptation to view this as a kind of Popperian attempt at falsifiability. Not even close. But I’ll leave this here.

I hope I’ve by now given a good sense of why I applaud the idea, to paraphrase Spencer and Albert, of being more confused now than when you started. And why I often end posts on this website with, “Ok, I’ve sufficiently confused myself.” I’m suspicious of myself whenever I start to feel like a complicated topic makes clear sense to me. The world isn’t simple. Humans certainly aren’t.

1.3 Back to Olberding’s Essay

It should be unsurprising, then, that, even when I agree with the certain person’s view, I share Olberding’s

despair of the quick condemnation, scorn, and contempt that so often animates the commentary offered by the certain, whatever the direction of their certainty.

And I certainly share her

worry that we incentivize both certainty and hiding confusion. Or, more accurately, that we encourage people to *perform* their engagement in online conversations as if their views are confidently, firmly settled—worse, as if all alternatives are justly derided and scorned. We also thereby suppress contributions by those who can’t or won’t do this.

Olberding rightly acknowledges that calls for civility can be “weaponized” to stifle legitimate views against an unjust status quo or for tone policing and, though she doesn’t use this term, pressuring others into a potentially oppressive politics of respectability. And I think she is right to “worry that we grow so cynical about civility that we assume its only motivation can be to stifle and police,” as well as to “doubt that few of us are as righteous as we think when we eagerly and aggressively assail.”

“Maybe it’s sometimes good to punch for justice,” she writes, “but maybe doing it too much and too often just cultivates an appetite to punch.” An interesting point to highlight. I found the punch-a-Nazi slogan/injunction bizarrely counterproductive, as it seems to command me to revel in the toxic masculinity of which I’m simultaneously exhorted (by the same people?) to purge myself. Not to mention practical worries about the unconsidered and almost euphoric looseness with which the term is applied. (I’m reminded of the man who, during the Obama years, used to hang around Chicago’s Wellington Brown Line entrance with a sandwich board alerting commuters to Obama’s Nazism. The sign’s portrait of Obama with a little mustache made the message vivid and memorable.)

At the end of her goodbye letter, Olberding beautifully recounts a real-world, clarity-inducing tragedy (in a literal sense of that word whose impact I don’t aim to diminish here) that nudged her to finally give in to a growing temptation “to just recede into handling my confusions elsewhere, off the FP blog, with others or in solitude,” where she may “attend to life’s big confusions gently, with trepidation, and away from the hastening, importunate ire of agonistic contests between those already wholly certain.”

Right there with you. I occasionally think maybe I’m missing out by keeping away from social media, but, as Olberding puts it, “The meanness, the derision and shaming, the inhumanity of our interactions online are too difficult to absorb into the life I really want.”

[Read more about Olberding’s work on this topic in the non-academic publications listed on her website (where, as of today, 2/26/20, she has published exactly one post since “Resignation,” aptly titled “20 Theses Regarding Civility”). See also her 2019 book The Wrong of Rudeness: Learning Modern Civility from Ancient Chinese Philosophy (I haven’t yet read it).

For another pro-civility perspective—one from a philosopher who started out on the contra side—see this Philosophy Bites episode: “Teresa Bejan on Civility” (8/20/18).]

Joshua Glasgow

It’s not totally unthinkable that a student could write an argumentative essay without defending a particular thesis. Especially if the student does a good job of laying out some of the competing views while giving some sense of why—i.e., while arguing that—choosing is tough. One of the reasons I majored in philosophy was an increased openness to such an approach.

I once even encountered reverence for what I was told by an instructor to be the (now antiquated?) British way of teaching philosophy: “what makes undergraduates think they yet have the right to an opinion?” The idea there being that students would have to wait until a certain point in their training before putting together an argument around a thesis. That’s harsh. I’d just like papers to be more than an exercise in how to defend an insincere claim. Better to be honest and humbly admit to confusion.

I’ll wrap up here with someone doing just that. Joshua Glasgow, professor at Sonoma State University, doesn’t explicitly use the word confusion in this excerpt, but close enough. It’s from the book I linked above that Spencer contributed to. Glasgow is the volume’s editor and a contributor: What Is Race?: Four Philosophical Views.

In his contribution, Glasgow argues first for a certain conception of race. Then he introduces a second, competing conception that, after it was brought up to him by a graduate student, confounded his first position. He was thus moved to close his essay as follows:

On this question, I’m afraid that I am at a loss. All I have are weak and wavering leanings about which of these commitments is entrenched in the meaning of the word ‘race.’ It may be that we have some conversations in which we deploy one meaning of ‘race’ and other conversations where we deploy the other, allowing basic racial realism to be true for some conversations while racial anti-realism is true for others. It might be that we have not taken a stand either way, in any conversation, in which case ‘race’ is semantically indeterminate on this question, meaning that there simply is no fact of the matter whether basic realism or anti-realism better fits what we mean by ‘race.’ Or it may be, instead, that there is a determinate, decisive answer in one direction that I am not seeing. Perhaps you can do better at navigating through this particularly heavy fog. (p 144)

Right there with you!

[For more from Glasgow, see his website, where you’ll find links to publications. See also his 2009 book, A Theory of Race (I haven’t read it).]


Footnotes:

  1. Dec 11, 2010
    Albert, David 
[PHIL W3551] Philosophy of Science
    Please keep in mind that this review is more than 5 years old.

    Prof. Albert approaches teaching with a loose and fluid style that more resembles storytelling than lecturing. I did not want to take philosophy of science and had no interest in the topic, but it was a requirement for my major. Nevertheless, I left Albert’s class with a completely different view of science and a profound appreciation for the subject. At the beginning of the semester, Prof. Albert told the class, “You know you are in a good philosophy class if you leave the class more confused than when you came in.” I left Albert’s class absolutely more confused than I had come in, but the reason for my confusion was due to a radical confrontation of previously unexamined ideas I had about science and the way we experience the world. If you are the type of person who doesn’t like going to class. If you want to be spoon-fed equations and procedures and take plug-n’-chug style exams, don’t take Albert’s class. You won’t get anything out of it. On the other hand, if you are willing to take new material seriously and aren’t afraid of a real philosophy class, jump on this opportunity. Prof. Albert’s teaching style is unconventional. He is less concerned about participation points, homework, and exams than he is about the material itself. He is an expert in his field, and if you approach this class with due respect, you won’t regret taking it. Appreciate the time you get to spend in the presence of genius.

    Workload: Reading that I was not tested on. Midterm exam. Final 15 page paper.

  2. Which I learned about from episode 12 of the Two Psychologists Four Beers podcast episode “Everybody Hates Social Media” (10/24/18).
  3. I got a lot from both. I started at the community college in 2009, just after the 2008 crash that led to the Occupy Movement, which ended in 2012, the same year I started at the Ivy League school. It’s fascinating to notice how different, due to a given moment’s official crises, the content and thrust of a college education can be for two people born a few years apart. That education can influence a person’s worldview for the rest of their life.
  4. Or what I take to be a similar or related sort of person who fits the hedgehog—in contrast to fox—description found in Philip Tetlock’s Superforecasting: The Art and Science of Prediction (2015).
  5. What we might beautifully imagine as a dragonfly-eyed fox’s-eye view, in keeping with Tetlock’s account, as referenced in the Digression.
