March 24th, 2012
03:14 am - Iain M. Banks' Culture novels and disagreements about Utopias and Dystopias
I enjoyed reading most of the novels in Iain M. Banks' Culture series. Banks has stated that one of the reasons he started writing about the Culture was to create a space opera for progressives, rather than yet another example of interstellar feudalism, and he has described the setting as "a Socialist utopia written as a Lefty wish fulfillment". Add in the increasing number of transhumanist technologies that have appeared in those novels and you have a setting that I greatly enjoy reading about (in a number of exceedingly well-written novels).
However, moving beyond the novels' admitted literary merits, I enjoy reading about the Culture because it's fairly close to what I would consider not merely a utopia, but one of the most appealing utopias that I've ever encountered (although it's also a place that's clearly non-ideal to live near, given the vast amount of well-meaning interventionism). It's definitely on the rather small list of places that I'd move to in a heartbeat if given the choice. I recently had a lengthy discussion of history and fiction with alephnul, amberite, and alephnul's relatively new housemate (and relatively new friend of mine), Ben. What puzzled me was that Ben had read several Culture novels and considered the Culture a fairly unpleasant dystopia, in large part because humans and beings with human-level intelligence are essentially pets of the AI Minds. This idea doesn't merely not bother me; government by Minds seems to me inherently superior to government by humans. To Ben, however, the combination of the lack of self-determination and the fact that the Minds are sufficiently smarter that they've already thought of and imagined everything a human could possibly think or imagine means that humans are entirely irrelevant, which makes the Culture a dystopia to him.
I realize that my own view on politics is odd, because while I find democracies clearly superior to other existing systems of government in practice, the fact that I can vote is utterly irrelevant to me, since my vote is one in hundreds of thousands (in local elections) or hundreds of millions (in national elections), and so individually pointless. Also, on Jonathan Haidt's six moral foundations, I score by far the highest in care (i.e. keeping others from being harmed) and high in fairness, with all of the others, including liberty/oppression, considerably lower, and with loyalty, authority, and sanctity all exceptionally low.
Yesterday, Ben and I were talking with other people about gaming, and I was amused to learn that by far my favorite type of game is one where the GM is solely in charge of creating and controlling the world and comes up with the plots and stories for the PCs, while Ben prefers non-traditional RPGs, where power is shared somewhat; even in a game where the GM is excellent at creating engaging stories, he has problems with that style of gaming and finds it too constricting. It immediately struck me that this difference in preference came from the same basis as my viewing the Culture as a utopia and Ben considering it a dystopia. In any case, I'm intrigued by this difference and am curious as to how other people feel about it.
Do you consider Banks' Culture to be
a utopia you'd enjoy living in
a dystopia you'd prefer not to live in
a mildly pleasant place
a mildly unpleasant place
other (please explain)
ticky boxes are fun
Current Mood: thoughtful
Current Music: Steely Dan - Hey Nineteen
My definition of "utopia" is "a place people are happy". And The Culture definitely is.
Mind you, I consider Brave New World to be a utopia, by the same reasoning, so I may not be considered normal in that respect.
Date: March 24th, 2012 10:46 am (UTC)
I have similar ideas, although I'd consider Brave New World to be ambiguous rather than either a utopia or a dystopia.
Date: March 24th, 2012 11:08 am (UTC)
My definition of "utopia" is "a place people are happy".
Out of curiosity - would Failed Utopia #4-2 count as a utopia?
I've been told, by a very smart AI, that the people will be exceptionally happy, approximately one week later. Presuming that it is accurate, then yes, it's a utopia.
Of course, that doesn't mean I'd choose to be instantly transported there. I wouldn't choose to live in Brave New World either.
Okay, guess it's time to crack open these books...
Date: March 24th, 2012 10:04 pm (UTC)
I have not read the books, so I cannot say. But I will take the risk, having read the linked wikipedia article, of venturing to guess: I not only have no idea, it would not be possible for me to formulate any idea.
One thing that the last three years have really deeply confirmed for me is the fact that most people are really, truly, awfully terrible at knowing what would make them happy. And while through my training and through my deliberate efforts at cultivating that ability I am, I believe, a hell of a lot better at that task than most people, I'll be the first to point out how little ability I would have to answer the question posed. To successfully answer it (i.e. to correctly anticipate whether I would personally be happy to live in such a social environment) would require me to have access to information that not only don't I have, but which it would be impossible to get.
For instance, the psychological relationship to work and to the necessity of work manages to be both much more complex than most people know, and much less studied, scientifically, than it needs to be for us to assert any meaningful universals. Thus, we have lots of not very rigorous evidence that in this culture, which is most definitely not post-scarcity, taking someone raised in this culture and providing for all their basic needs but forbidding them to be an employee is psychologically deleterious. We don't know why, though heaven knows both the Right and the Left are quick to supply presumed answers.
We don't know if this effect is a product of comparison effects and would go away if everyone was post-scarcity. We don't know if this effect is a product of selection bias and there's something wrong with the sort of people this happens to. We don't know if this effect is because the provision is not complete, and there are still unmet wants and needs in the people thus provided for. We don't know if this effect is a product of a sickness in our culture, such as, say, an unhealthy, early-inculcated identification with one's role in the capitalist economy, or a schooling system which indoctrinates people with a very external locus of control, or advertising-industry based manufactured desire for material goods, or PCBs in the water, or who knows what all else. We don't know if this effect has to do with something intrinsic to human minds, either some of them or all of them.
Meanwhile, we don't know if what we can learn from these very specific and culture-bound examples maps to any specific hypothetical example, because we have no idea what specifics matter. We don't know the answer to the question, "Do humans need to work to be happy?", but worse, we don't know the answer to the question, "If so, would the things available as 'work' in the Culture meet whatever the definition of 'work' turns out to be?"
This is generally true. When I, or you, or Ben, attempt to answer the question, "Would this make me happy?" we draw on past experiences which we think are analogous and judge from them. The problem is that our selection of which past experiences we consider analogous is entirely based on emotional projection. To you is salient the liberty of not having to meet basic needs and not having to keep social order, so that you can get on with the creative work you so enjoy, work that flourishes in peace and prosperity, so you compare the Culture to periods of peace and prosperity in your life. To Ben is salient the experience of being controlled, or of being in situations (e.g. childhood) in which meaningful work was not available, only make-work, and that was alienating and oppressive, so he compares the Culture to that. But in both cases, the points of analogy seem to me to be very superficial. It is not possible to tell from this great distance which, if either, has any meaningful congruence.
So not only don't I know, neither does anybody else.
That I'm pretty sure of.
And before anybody asks, why, yes, I have read Gilbert's Stumbling on Happiness.
Edited at 2012-03-24 10:07 pm (UTC)
Date: March 25th, 2012 08:06 am (UTC)
Re: Militant Agnosticism
This pretty much is why I answered 'other', but put much better than I would have managed. The only aspect I'd add, having actually read the books, is that they don't really describe the aspects of the society that actually seem to matter in terms of human happiness. We get a lot about post-scarcity and the effective elimination of death and so on, and a lot about the adventures of the dissatisfied (and kind of murderous) badasses within the Culture who get recruited to go off and do imperialism on the Culture's neighbors, but very little on what it is like to live and love within the core of the Culture (as far as I can remember).
Date: March 25th, 2012 09:00 am (UTC)
Re: Militant Agnosticism
You see a bit of actual life in the Culture in the first part of Player of Games.
Date: March 25th, 2012 09:16 pm (UTC)
Re: Militant Agnosticism
taking someone raised in this culture and providing for all their basic needs but forbidding them to be an employee, is psychologically deleterious.
Huh, I had no idea this sort of response was remotely common. From my own experiences, I know that if I have good books and video entertainments, RPGs to play, loved ones to cook and care for, intelligent and interesting friends to talk to, and sufficient funds to meet my basic needs and enjoy myself moderately, I can be exceedingly happy for at least a year (the longest I've managed this) and feel no need to work in any remotely productive fashion. OTOH, I also know that my own ingrained work ethic is far lower than that of many people I know.
Date: March 26th, 2012 05:59 am (UTC)
Re: Militant Agnosticism
Not lower. Higher. And "ethic" is exactly the wrong word for this.
Normal people don't consider reading recreational.
Normal people don't play RPGs. Frankly, it looks to them like a whole lot of trouble for the level of reward.
Normal people watch TV, socialize, and do drugs.
This all comes under the aforementioned possible "sickness in our culture": vast numbers of people have been convinced that effort is "work" and "work" is nasty and not enjoyable. They are convinced anything which requires effort can't be fun (excepting a small number of them who got way into exercising, usually while in prison).
You and I, and the sort of people we hang out with? We're not normal. We're game-playing, beer-brewing, blog-writing, scarf-knitting, book-reading, costume-designing, rabble-rousing, feast-cooking, show-staging, bon-vivant freaks.
Normal is "let's not and say we did."
Normal people literally do not believe me except where they literally do not understand me when I explain "open source software." "Wait, you mean, there are programmers who... give away their programs? For free? Why would they do that? Who pays them? Then what do they do for a living? No, wait, I'm not getting something.... That can't be right...."
The inability to entertain one's self is in fact so endemic and so problematic, that the prison system I work with actually has it as an agenda item we're supposed to cover in therapy with inmates, so they don't return to crime from sheer boredom.
You see the problem.
ETA: Also, your comment I think illustrates some of the difficulties with our concept of "work". Cooking and caring for loved ones is work. (I say this with love: Male privilege check.) Often uncompensated, monetarily, but still work. Often a pleasure, but still work. Something being a pleasure doesn't mean it's not "work", by several definitions of "work". Only one definition of "work" is, "a service which can be traded on the labor market for cash", i.e. something is only work if someone is willing to pay you enough for you to live on for you to do it. So knitting a scarf for your own use isn't "work" by that definition; running a scarf factory is. We use that one a lot in our culture, but it's a wickedly provincial, culture-bound definition of "work".
And another problem there is that "work" can also mean "contribution to society". A volunteer at a rape crisis center may not be getting paid, so their volunteering isn't "work" by that definition, but obviously it is work by various other definitions, especially the rarely used one of being a meaningful contribution.
So if you lived in a society in which you could cook for your loved ones, but your best prepared meals could be "replayed" on the matter replicator any time you want, would that change how you feel about cooking, and when you do it? Would the fact that your loved ones didn't need you to cook change how you would feel about cooking for them?
Are you familiar with Tanith Lee's Don't Bite the Sun and Drinking Sapphire Wine?
Edited at 2012-03-26 06:13 am (UTC)
Date: March 26th, 2012 09:12 am (UTC)
Re: Militant Agnosticism
Very well thought out answer!
"its motives aren't always particularly laudable on its own terms"
Also curious about the Bay Area filter; Banks is a Scot living in Edinburgh...
Motives of wanting to improve people's lives, and defend the Culture, are outre?
What I can find on the Californian Ideology isn't obviously relevant to the Culture to me.
Can you name a case where SC has exploited any outside cultures?
They're called the dirty tricks division, and lots of mainstream Culturniks are uncomfortable with them or warn against them. Maybe it's Banks chickening out or whitewashing them, but I note that their actual behavior tends to be pretty ethical and defensible. The last book was pretty hilarious there, with a 'nasty' warship turning out to have progressively more bark than bite, a goody two-shoes pretending to be bad until actually attacked.
If anything they seem more actively moral than the mainstream, having thought about the boundaries of their behavior and the hard cases -- which they exist to deal with -- rather than hiding in safe nostrums.
One thing humans do in the Culture is vote. It's rarely focused on, so one can miss or forget about it, but it seems clear to me that Banks envisions the Culture as a democratic anarchy, not a benevolent despotism by the Minds. There are votes on little things like the disposition of a train system on an Orbital and on big things like the Idiran war. So humans get a voice in determining what is to be done. Minds will do most of the work of determining how to do it, because they're better at that.
The Culture is actually hard to distinguish from an Asimovian Machine 'rule' where the First Law was a bill of rights and the Second Law gave priority to democratic majorities.
Folks are totally free to go beyond human or vary it up, and the technology to do so is advanced and freely available.
We know Culture people are free to vary it up, and join Subliming groups; we haven't seen one upload and expand in a posthuman way, but it doesn't seem like it'd be banned. Opting out is possible too, on a big scale (Peace Faction or Zetetic Elench) or small scale (guy in Inversions, guy in State of the Art, woman in Gift From the Culture, mention of the ability to get a non-Mind ship and go off somewhere, though also that Contact would keep an eye on you to make sure you didn't try to play God with the natives.)
I find Ben's position bizarre, as it is not as though any of us have any influence on national politics as it is. Having a national politics run by ridiculously powerful and intelligent AIs isn't going to change the amount of influence I have on national politics, and the Culture AIs largely don't seem interested in influencing how I and my friends and loved-ones set up our own lives, which is where we actually do have control over our own lives now.
I have a lot of problems with the Culture because it is basically imperialist, but I wouldn't hate my powerlessness as a Culture citizen any more than I hate my powerlessness as a US citizen (and at least the Culture isn't horrible to the majority of its own citizens as the US is).
Date: March 25th, 2012 09:12 am (UTC)
I have a lot of problems with the Culture because it is basically imperialist
Absolutely. From my PoV, it would be an absolutely awesome place to live, but also one whose foreign policy many remotely moral people would seriously object to.
at least the Culture isn't horrible to the majority of its own citizens as the US is
Also, while the means of the Culture's imperialism are no better than those of the US (merely vastly higher tech), the ends seem mostly to be considerably better (which is admittedly not saying much at all, given the hideously self-serving nature of US imperialism).
From my PoV, overturning governments and starting wars in an honest and probably not-doomed attempt to increase freedom and reduce oppression is ghastly and wrong, but also less so than doing the exact same thing, with the same stated goals, but with the actual goal of looting the now-war-torn nation of its natural resources and installing a brutal puppet power to assure access to both cheap labor and further resources.
Given that you're consequentialist enough to dismiss democracy if Minds are running things, I'm curious as to why you call interventionism by them ghastly and wrong.
Date: March 27th, 2012 08:49 am (UTC)
In part, it's a question of timing. The Culture starts wars and collapses societies from within. We see that these efforts cause large amounts of death and suffering (usually not directly at the hands of SC agents, but horrors happen nonetheless). The Minds believe that these efforts will reduce suffering in those societies in the long term, and that total suffering will be reduced, but in addition to their being occasionally wrong, I find the idea of balancing possible (or even likely, but definitely not certain) future suffering against the certainty of causing present-day suffering to be impressively dubious.
Well, they're also interrupting current day suffering. Azad had its own native horrors. The Idirans nuked cities as a matter of course. Don't do anything, and those are certain too.
Date: March 27th, 2012 10:49 am (UTC)
Yes, but (and it's been quite a while since I read Player of Games, so I could be misremembering) from what I remember, the cultural upheavals caused by Gurgeh seem likely to do more short-term harm (and possibly considerably more). Similarly, the Idiran war killed vast numbers of sentients, considerably more than it sounded like the Idirans were killing before the war.
Date: March 26th, 2012 09:51 am (UTC)
There is a complex question involved in evaluating a fictional civilization, because I think it should be evaluated both on Consequentialist grounds and on Systemic grounds.
In the real world, I would scrap systemic grounds for evaluating a civilization: the consequences are far more important to me than the actual system itself. Systems that are abhorrent are so because they produce awful (intended or unintended) consequences, not because they are objectively "bad."
The problem is that in a fictional world, the author dictates the consequences. You can write a system that would naturally result in horrible consequences and also write it so that it doesn't. In fact, it's easier to do that than to write something that would be objectively valid.
The problem is that when you read a story you get immersed in it. And in that immersion, it feels like the consequences of the civilization are a result of the systems in the society rather than a result of the machinations of the author (who has his own biases and values, which do not necessarily conform to nature's).
So there are really two questions here: Do you consider the social system described in Banks' Culture to be utopian/dystopian, and do you consider the consequences to be utopian/dystopian?
When someone raises the objection that humans are irrelevant in that society, they are making a statement about the system more than the consequences: it is not directly clear that humans being irrelevant to the workings of society is harmful or awful in and of itself. Rather, it is harmful because in practice we have seen that systems that make segments of society irrelevant result in bad consequences for those groups.
When someone says that the consequences are things like happiness or greater technology or a certain social perspective, they are talking about consequences. But those consequences are a result of the author, not the system described by the author.
Further, fictionalized systems do not usually operate like their real-world counterparts. I am reminded of the Group Selection vs. Individual Selection debate, in which attempting to artificially create an environment where Group Selection occurred (in fruit flies) resulted in widespread cannibalism and not the sort of "natural harmony" that defenders of Group Selection thought would occur. Creating a real version of a fictionalized system doesn't indicate that you get the fictionalized consequences.
So for me, it isn't even clear what I should evaluate or how. I could evaluate the system, but since any mental model I create is likely to be as biased as that of the author it's not really valid. I can't CREATE a fake Culture in a realistic environment to base my modeling on. (And even if I could, it would probably be horribly unethical to do so.) Assuming that the stated consequences are a result of the stated system strikes me as epistemologically dishonest. And just evaluating the stated consequences doesn't really tell you anything since the fictional characters will always have "whatever consequences the author damn well feels like."
There's also that the Culture isn't described all that closely. People tend to see domination by the benevolent Minds, but votes get mentioned at multiple points; it's far from clear that humans don't get a say in what's done.
Date: March 28th, 2012 12:37 am (UTC)
I am basically distrustful of utopias. It is the nature of living things to strive and grow. Utopias tend to be stagnant and lack adequate levels of striving for really intelligent beings. So, an AI-run human zoo could be great... for the AIs.