We are going to need to go far beyond a merely sharp knife. We are going to need a Vorpal blade. Let’s dive in and see what happens.
Genocide is deeper than Game A. (etc)
We definitely see eye to eye on genocide generally speaking, but I’m not convinced it’s deeper than Game A.
I think it’s important to note that genocide does show up in the ants (the goal of ant war is to kill the babies), but as far as I’m aware it doesn’t show up in many other places in nature. Just the social insects. Possibly in how dolphins treat sharks, although that might be debatable. Perhaps in lion prides?
The ants are the most successful mobile organism, and genocide is part of their plan for the reasons you lay out.
Do monkeys practice genocide? I don’t know the answer to this question. If humans are unique among primates in our practice of it, then that would lead me to believe that genocide wasn’t originally baked into our hard code, but rather that it exists within Game A enculturation, and it appeared there because the enculturation evolutionary track is fast. Whichever culture socially incorporated that bit of ant code first would dominate until that bit of ant code was found in all the surviving cultures.
If genocide is deeper than Game A, then is there anything Game B can do about it?
Game A rarely prosecutes genocide in the crudest sense. Domination, yes. Always domination in some fashion. If in doubt, my spelling wins. But far more useful to enslave (however subtly) than to slaughter. Even more useful to enlist.
Sure. Ants do the slavery thing too.
Slave-making ants are brood parasites that capture broods of other ant species to increase the worker force of their…
I found the second half of your response a bit hard to follow. Instead of trying to respond to it, I’d like to quote it, and interpret it, and for you to tell me if I’m interpreting it correctly.
But here is where we come to my proposition that the Nash Equilibrium of War is a stumble. I certainly don’t mean to say that “War” is a chance or random element of Game A. Far from it. The opposite in fact. What I am saying is that the presence and dynamics of War in the context of a species that is an innovation machine *forces* a very specific (though meandering and largely unwilling) directionality to the evolution of culture. A practical teleology of Power.
Sounds to me like:
Humans as a species are defined by their capacity for innovation, and the ones who innovate the best run out the ones who innovate less. Since war is a way to rub people out and win the cultural Darwinism game, war becomes the most important thing in which to innovate.
But I’m not sure that’s what you meant at all.
It is a stumble because of the three distinct tensions I mentioned earlier (integrity of society vis a vis nature, itself, and other societies). The conservative impulse knows that change is disruptive and risky. This is particularly true for those at the top of the ‘internal’ domination hierarchy in a society that is itself at the top of the ‘external’ domination hierarchy (Egyptian aristocrats in 1200 BC, Roman aristocrats in 200 AD, British aristocrats in 1600 and 1900, American aristocrats right now). If in doubt, so long as nothing changes, their spelling wins. But the asymmetry of innovation *requires* change. And not just change in a random direction. Change towards ever greater Power. Ever greater capacity to work your will on the world.
Sounds to me like:
People at the top of social hierarchies are scared of change because the change might displace them. But change is essential to keep from getting rubbed out, culturally speaking.
Which I buy, but I’d like to throw in some important qualifiers. There are people at the bottom of pyramids who are change-averse as well, because change is just as likely to screw them over as it is to screw over the people at the top. You mention AI, which in modern times invokes fear of automation, but automation has been screwing folks at the bottom forever, and this gets lost in the Silicon Valley cocktail party talk. I bang on that a bit at the end of this:
Modeling the Socioeconomic Future with Dungeons and Dragons
Gary Gygax: (1), Charles Murray: (0), Artificial Intelligence: (?)
I also feel the need to point out that the conservative impulse isn’t just about preserving one’s place within an existing hierarchy. It’s also about preserving a known cultural fitness, in a struggle against other cultures with known cultural fitnesses. I cover this towards the end of “Ants” way up top. If you tinker too much with your car engine and the race car one lane over doesn’t, you might lose in the short term and not get the opportunity to race again. Possibly in dramatic fashion, during something like a war or a genocide.
Pity the conservative impulse. Innovation stands on the shoulders of giants. It is a curve, not a straight line. This is because in the game of Power innovation in ‘what’ is far less Powerful than innovation in ‘how’. And that is far less powerful than innovation in ‘who’. Culture eats strategy for lunch.
I’ll be honest: I’m not sure I understand what you’re saying here at all, and I need you to help me through it. Seems to me that all innovation is innovation in “how.” Webster says an innovation is a “new idea, method, or device,” and those are all basically means of accomplishing tasks.
Culture and strategy have each eaten the other’s lunch historically, so I don’t see how one obviously dominates the other. In fact, strategies to infect or affect cultures are quite common, and very often effective.
Moving from catapults to cannons is one thing and profoundly shifts the Feudal equilibrium. But once the meta-game of ‘innovation on innovation’ sometimes known as Capitalism gets going, Feudalism as a whole is done.
The pathway by which transcendent innovation kills feudalism is not clear to me. I think a decent argument could be made that we’re not far past feudalism now, and that the wealth concentrations we’re seeing develop among those who wield automation (via innovation) are moving us closer to feudalism, not further from it.
There’s a pretty easy argument to be made that we’re basically running the “land owners, serfs, and peasants” program right now, just with a wider definition of what counts as land. That’s certainly how the communists see it, and while I’m not a communist, some of their baseline analysis is pretty sound.
Now the conservative impulse is in trouble (and in the West at least there have been no true conservatives for several hundred years). We are no longer using Power to compete on the landscape of spellings. We are using an increasingly conscious intent to compete on the landscape of Power. Acceleration.
This is self evident to you, but not to me. Can you give me a real world example of what you mean here, outside of the AI race?
Do you mean, for instance, Facebook or Twitter or China wielding technology to curate the media feeds people see, to concentrate power in places they prefer?
This dynamic is self-ramifying. No matter how much ground you gained in the last war, in a landscape of accelerating power, if you don’t run faster than your neighbors, the Barbarians will be breaking down the gates in a few decades. Or years. This process results in increasingly heedless acceleration into increasing Power (see the current AI race). And that, of course, is inexorably self-terminating.
I’m not getting this either. I need you to apply this “inexorably self-terminating” prediction to an example of “Power to compete on the landscape of Power,” to show me how you think this is going to happen.
But herein lies the rub. Each move in the accelerating game of Power *must* be a move towards a “who” that is more capable of a “how” that is more capable of a “what”. And here is the thing that we have known since the beginning of our species: collaboration is always more innovative than domination. In many ways, the essence of Game A has been the myriad efforts of domination to maximize the innovative capacity of collaboration while maintaining the context of domination. Hence its increasing subtlety and ‘softening’. From Egyptian slavery to Feudal serfdom to the Liberal Republic to the modern welfare state. game~b is simply the result of conscious inquiry into the fundamentals of this dynamic and a realization that if you can remove the constraint of ‘conservation of domination,’ if you can move from the anti-rivalrous always in service to the rivalrous to the rivalrous always in service to the anti-rivalrous, you simultaneously innovate at a level that Game A can’t possibly (structurally) achieve and you get off the road to self-termination.
Let’s run with this for a second.
- Tribe A: anti-rivalrous in service to rivalrous
- Tribe B: rivalrous in service to anti-rivalrous
In a shooting war, I’m having a hard time believing that Tribe A doesn’t win. In a corporate war, I’m also having a hard time believing that Tribe A doesn’t win. Seems to me that Tribe B only really out-competes Tribe A within this framework down under the Dunbar number, at village-scale or family-scale interactions.
I’m an engineer. I need some examples to attach the concepts to, to get me where you’re at conceptually.
More generally speaking, we were talking about tools. I listed a toolkit, and your response appears to have been to issue a concept as your tool. And that’s fine. If I can get the concept, then we can pivot back to talking about tools for spreading that concept, which is really the level at which I’m trying to focus. But I’m a pretty smart dude, and if I can’t get the concept, you’re going to have a really hard time spreading the concept to the Walmart shoppers. And that’s the task in front of you.