sigmaleph: (Default)
sigmaleph ([personal profile] sigmaleph) wrote2018-12-07 11:23 pm
Entry tags:

wherein i try to talk about a show and sort of go off rails

The Good Place is a show that is clearly *about* philosophy in a way very little television is (or at least, television I've been exposed to). Which makes it all the more frustrating when the show is just... terrible. at analysing philosophical questions.

The comparison that comes to mind is that it's to philosophy what science fiction is to actual science, except, well... that's kind of awfully arrogant of me isn't it. I am not a scientist but I was a science student for a while. I typically have reason to be confident when I call out some bit of technobabble as making no sense whatsoever, and I have the backing of the scientific establishment behind me.

But like, me calling out Chidi for not being a consequentialist, or calling out whatever that nonsense about free will vs determinism was for not even mentioning compatibilism? Lots of real actual philosophers think consequentialism and compatibilism are wrong. I am not even an amateur philosophy student, just some girl who spends too much time on the internet, so who the heck cares what I think

and yet compatibilism is the obviously correct answer to the philosophical question of determinism vs free will. like being 100% honest I'm not entertaining much reasonable doubt about this. I have the sense that I *should be* less confident that I've got this question right when clearly lots of very smart people spent longer than I've been alive thinking about this and came to different conclusions, but I'm not.

that sure sounds like it makes me arrogant, which is interesting because I tend to be a very epistemically anxious person (also an anxious person in general). why am i not on this, of all things?

anyway yeah don't watch The Good Place for philosophical instruction, and don't watch sci-fi to learn science.
fibonacci_reminder: (Default)

[personal profile] fibonacci_reminder 2018-12-08 02:48 am (UTC)(link)
This is pure speculation, but I'd guess that your confidence on this is because, at least to a degree, you have spent a lot of personal time thinking about and teasing out the potential problems with the idea and creating personal nuance to answer those problems, and because the nature of free will vs. determinism means that it's a very, like, definitions-contingent thing, so if you're comfortable and explicit about your internal definitions there usually isn't that much to learn about the situation from external observation or experimentation (unless those very definitions point towards some testable parameterization of 'free will'), and thus you can overall come to what feels like a 'solid' conclusion purely by thinking about it and tweaking your belief until it no longer feels like it needs tweaking.
theaudientvoid: (Default)

[personal profile] theaudientvoid 2018-12-08 03:04 am (UTC)(link)
and yet compatibilism is the obviously correct answer to the philosophical question of determinism vs free will. like being 100% honest I'm not entertaining much reasonable doubt about this.

If by compatibilism, you mean "redefining 'free will' to be functionally meaningless" then yes, you are correct.
Edited 2018-12-08 03:06 (UTC)
oligopsony: (Default)

[personal profile] oligopsony 2018-12-08 03:37 am (UTC)(link)
Chidi not mentioning compatibilism seems much worse than his not being a compatibilist. As you point out, plenty of philosophers are incompatibilists (i.e., wrong.) But none of them would fail to be aware of it as an option, if only to knock it down.

But then Chidi has always been a frustrating character, because the basic idea of the character is neat and compelling and the actor delivers it well, but he's served such inconsistent characterization by the script. He's an academic philosopher but shows almost zero interest in the bizarre metaphysical situation he finds himself in, except for one reboot where he gets depressed. His defining character traits are being extremely smart and pathologically scrupulous, but he misses whole strains of ethical considerations that should be obvious.

This is implicit in your post, I think, but TGP might be the rare fiction that would be better by its own criteria if it became more like a rationalist fic.
unknought: (Default)

[personal profile] unknought 2018-12-08 04:11 am (UTC)(link)
There's a model of philosophical disagreement which has become compelling to me recently, which I don't fully endorse:

We don't come to philosophical positions by pure logic alone, but by thinking about our intuitions about the world and trying to fit them into a coherent picture of the thing we're trying to understand. No matter how we do this some of the conclusions we reach are going to be counterintuitive; we're going to need to bite some bullets to get anywhere. But which intuitions we start with and which counterintuitive claims we're willing to accept is something that varies a fair amount from person to person. Philosophical analysis can help us expand on a particular viewpoint and give a detailed and precise picture of the world based on that intuition, but it can't really resolve differences between people coming from different starting points. So you can get to a point where you can be pretty confident that no further philosophical argument could convince you to change your mind while still knowing a lot less about the topic than some philosopher with a different view.

But why should you trust your own intuitions? I think the only answer I can give here is that it's not really possible to do otherwise. No matter what approach or meta-approach you take to the problem, you're ultimately starting from a place of "What makes sense to me?" and you can't get away from that. In Bayesian terms: Two rational Bayesian predictors with very different priors can reach different conclusions from the same evidence, but it doesn't ever become rational for either predictor to change their priors in response to the disagreement.

Like I said I don't completely endorse this model, but for me it's given an explanation for how I can be justifiably confident in a philosophical position even while acknowledging that there is no consensus among people who understand the issue much better than I do.
packbat: A bat wearing a big asexual-flag (black-gray-white-purple) backpack. (backpack bat)

[personal profile] packbat 2018-12-19 07:37 pm (UTC)(link)
*glances at the other comments*

I really enjoyed the college class I audited on the free will question, but I think this is one of those philosophy topics that highlights how much of the disagreements are intractable differences of intuition.

(...huh - "Free will" article on Wikipedia looks like a pretty solid overview, at least in the intro part.)

For what it's worth, I mostly agree with you? Sometimes in conversation with people whose conception of free will is profoundly different from my own, I will slide into a semi-compatibilist "we still have moral responsibility" position, but that's more "I'm often willing to speak someone else's language" than anything else. On my own, I compatibilism.

Popping back up to the attempted point about philosophy in fiction: I agree with you. A lot of people who tell stories are not students of philosophy, even on the level of "spends too much time on the Internet", and they tend towards obvious questions and obvious-to-them answers.