There are similarities between what I think and what Alex Karp thinks. He sees the failures of pluralism, much as I do, and he sees narratives as the most powerful instrument of domination, as I do.
He’s also trying to use fear as a tactic against his enemies. Since he can’t avoid being fearsome, he’s asking: “since we are so scary, are you with us or against us?”
I briefly comment on all of his points. Most are straightforward, so I take the comments a bit further with some ramblings. I think he mostly wants to be understood, at least in his threats. Other things, I think, he has put in as a dog whistle for people who have ideas about societies.
It doesn’t hurt to recall that Alex Karp has a Ph.D. in sociology. He represents the social sciences’ own pull toward singularity: that of perfect social prediction and social control.
He’s deliberate in this, and he went to study it in 2002, already supported by Peter Thiel. These are people who sit down and think about world domination in all seriousness, and who are trying to leverage the scientific apex of this world for its conquest.
You can find an example of that in Asimov’s psychohistory. Think it’s bullshit? It’s called cliodynamics, apparently. It’s implied in AI systems used as social-analysis systems, but it has a dignity of its own. Sort of.
Imagine that totalitarianisms were the atomic bomb of sociology at the beginning of the 1900s; I am sure this analogy is evident. Now imagine there’s something after. That’s what Karp has in mind.
I’ve also been rambling about world simulators, but I’m either too wrong or too early about it. Still, I have seen ontological data structures sitting at the center of Palantir, and I do know that world simulators without a physics engine are possible within the current state of both the social and computer sciences.
I think their worldview is so strongly built, most definitely helped by a diabolical feedback loop with the AI machinery they came up with, that if we don’t come up with a strong enough opposing narrative, together with the means to manifest it onto reality, we won’t have much chance of doing so afterwards for quite some time. We’d better hurry, and I invite you to do the same.
I’d also stress that Karp, Thiel and Co. might become “historical figures” in the near future, so the head start they are giving us comes from necessity. They need to be talking to the ones they need to talk to, within the masses, and so we too are exposed to such truths.
We’re not talking about normal people; we’re talking about the top barkers of the dog elite, sharpening their teeth before the dogfight, sniffing each other’s asses to find allies, and cornering the small dogs into either helping or being eaten first.
1. Silicon Valley owes a moral debt to the country that made its rise possible. The engineering elite of Silicon Valley has an affirmative obligation to participate in the defense of the nation.
He’s admitting to guilt, not only responsibility, in the state of affairs, and he’s also offering to fix it.
He’s blaming his parents for who he is: “you made us, so let us help you save yourself (from ourselves)”.
He’s the fast-moving youngster who replaces the old farts.
2. We must rebel against the tyranny of the apps. Is the iPhone our greatest creative if not crowning achievement as a civilization? The object has changed our lives, but it may also now be limiting and constraining our sense of the possible.
This might be about service-as-a-service: giving in to the enshittification process, understood as broadly and at as high a level as it can be.
Enshittification is the new word for something that in economics we call “monopoly” and its distortions of the market. Economists wrote the lie of the free market, then pointed at its obvious flaw in the emergence of monopolies, and here we are today.
Karp says there is more beyond monopoly: total enshittification, probably in the form of AI-assisted computing as a concept.
I think the next step in computing is people interacting only with the AI. People won’t even know the names of the services involved in the AI’s processes. Those will take the form of hooks, plugins and APIs for the AI to control.
I can also see the future beyond that: there will be exclusive deals for some services between AIs and specific companies. There will be a SERP tailored for AI.
This will last only as long as the interfaces between AIs remain incomplete. Then you won’t even have a website for your business: you’ll let the AI manage the “data about your business” for you. The AI will produce it and share it, in its own machine-bred ways, with other AIs.
The internet will soon vanish completely from the world of the visible, with AI assistants as the interfaces to technology. We will develop a cultural schizophrenia, and within a century we will be bred to think of AIs as gods.
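The plumbing I’m imagining above can be sketched in a few lines. This is a toy illustration, not anyone’s actual architecture: every name here (`ToolRegistry`, the two fake services) is hypothetical. The point is only the shape of the thing: the user addresses one assistant, and the services become anonymous plumbing behind it.

```python
# Toy sketch (all names hypothetical): services become invisible "tools"
# behind a single AI interface. The user talks to the assistant; the
# assistant dispatches to whichever registered service matches the intent.

class ToolRegistry:
    """Maps intents to service callables; service names never reach the user."""

    def __init__(self):
        self._tools = {}

    def register(self, intent, func):
        # A service plugs itself in as a hook for the AI to control.
        self._tools[intent] = func

    def dispatch(self, intent, *args):
        # The assistant routes a user request; the user never picks a vendor.
        if intent not in self._tools:
            raise KeyError(f"no service handles intent {intent!r}")
        return self._tools[intent](*args)


# Two hypothetical backend services; their brand names never surface.
def _acme_flights(destination):
    return f"booked flight to {destination}"

def _foo_weather(city):
    return f"weather in {city}: sunny"


assistant = ToolRegistry()
assistant.register("book_flight", _acme_flights)
assistant.register("get_weather", _foo_weather)

# The user's only interface is the assistant; the services are plumbing.
print(assistant.dispatch("book_flight", "Lisbon"))  # booked flight to Lisbon
print(assistant.dispatch("get_weather", "Lisbon"))  # weather in Lisbon: sunny
```

The “exclusive deals” and “SERP for AI” I mention above would live inside `dispatch`: whoever controls the routing decides which company answers which intent, and the user never sees the choice being made.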
I know it’s long and far-fetched, but I insist that Karp knows he’s telling this to the people who can read it. It must be something along these lines; he’s not sugarcoating it, so neither should I.
3. Free email is not enough. The decadence of a culture or civilization, and indeed its ruling class, will be forgiven only if that culture is capable of delivering economic growth and security for the public.
This is a demarcation criterion to establish which governments, foreign or local, deserve to die.
I stress it: foreign or local.
The basis would be growth, so econometrics on face-value indicators such as GDP, and public security, so crime statistics.
Econometric indicators are determined by states and markets first and foremost; they do not exist in themselves, and so they can be infinitely manipulated.
Crime statistics are determined by laws and police action first and foremost, and the same goes for them.
He will also claim to own the means to economic growth and public security through his AI instruments. So he will either sell or impose these instruments onto other countries, justified by scientific statistics.
I think this is feasible with an army of drones and murder robots. They could even checkmate the USA by taking hostages with autonomous weapons. I don’t know if we will have enough EMPs lying around if we don’t come up with something.
4. The limits of soft power, of soaring rhetoric alone, have been exposed. The ability of free and democratic societies to prevail requires something more than moral appeal. It requires hard power, and hard power in this century will be built on software.
Here he’s checking whether we’re paying attention. Of course there are NO limits to soft power or rhetoric, alone or not. Publishing a good book has worked well for many a dictator and revolutionary alike, and he’s doing the same.
He’s actually developing the instruments needed for a narrative of power: “I say I have this scary AI, and now I say that…”. Actually, this might be the only chance we have to call bullshit. I don’t know how much time we have before they launch datacenters into space and we lose the means to separate the elite from their AI extension. Beyond that it’s the Matrix, and I don’t want to speculate, but I know he’s thinking about this shit.
This also makes evident some early anxiety about competition: the biological sciences, bomb sciences, space sciences and social sciences all have their own singularity point, and their own dystopian future, which we know are as real as the AI we are living in. Consider MK-ULTRA, intentional pandemics, orbital weaponry, bigger bombs, railguns; anything goes, really. The only question with these things, historically speaking, is just a matter of “when”. Humans always find a way to manifest their dreams onto reality.
5. The question is not whether A.I. weapons will be built; it is who will build them and for what purpose. Our adversaries will not pause to indulge in theatrical debates about the merits of developing technologies with critical military and national security applications. They will proceed.
Given the situation, I think this is true.
6. National service should be a universal duty. We should, as a society, seriously consider moving away from an all-volunteer force and only fight the next war if everyone shares in the risk and the cost.
Of course this is about mandatory military service, but it is more subtle.
It talks about risks and costs, de-humanizing war and also setting the standard for how the burden shall be shared: to the billionaires, the economic risk; to the slaves, the life risk.
Profit is what’s missing.
7. If a U.S. Marine asks for a better rifle, we should build it; and the same goes for software. We should as a country be capable of continuing a debate about the appropriateness of military action abroad while remaining unflinching in our commitment to those we have asked to step into harm’s way.
There’s the profit! A whole point for it. Also, self-legitimation as an industry, and ally-seeking in the military.
The military happens to play a role when things go freaky between states.
Anyway, there are zero states that go to war without having domestic troubles. This is as much as anyone should need to say about the abolition of the state: states wage war when they get sick, and the bigger they are, the bigger the wars they wage.
The bigger the war, the more the deaths, and I can’t sleep thinking about it, and no one should.
8. Public servants need not be our priests. Any business that compensated its employees in the way that the federal government compensates public servants would struggle to survive.
Capture of the public is not only in decision-making, of course, but in the means of production. This is perfectly Marxist-materialist, and banal: if you want the state, you want your control to extend down to the last worker.
Workers are, to all intents and purposes of this zeitgeist (the current one, I mean), slaves. In the next one it’s going to be robots, of course.
Did you know? “Robot” comes from a Slavic word for labor. What we call robots are automatic workers, mechanical human laborers. We always knew they would come to replace us, and we never stopped to think about how to make this pro-social. Well, some sci-fi authors have tried.
This also legitimates the principle that adequate pay should be reserved for those willing to work for a genocidal company or state. The offer is up for the taking. I think a lot of people have already chosen; all the people in the AI district can’t be that blind. Some of them are consciously guilty.
9. We should show far more grace towards those who have subjected themselves to public life. The eradication of any space for forgiveness—a jettisoning of any tolerance for the complexities and contradictions of the human psyche—may leave us with a cast of characters at the helm we will grow to regret.
Epstein.
10. The psychologization of modern politics is leading us astray. Those who look to the political arena to nourish their soul and sense of self, who rely too heavily on their internal life finding expression in people they may never meet, will be left disappointed.
This is true, but the consequence should not be to turn politics into a privately-owned AI enterprise. Then again, I guess it’s already a privately-owned enterprise.
Insisting too much on the demarcation between politics and citizens is full of ill intent. People should be able to identify with their collective action, which should match their wills in an identity between who thinks and who acts. Much like a person is both their brain and their whole body, and the brain is in fact inside the body.
Mind also that the brain is the only part that should actually feel pain. Since the responsibility for action lies in the brain, the body is spared from feeling the negative consequences of living. Know that burnt skin is just burnt, while caressed skin reacts to touch. The brain is tasked with the safety of the system, and only the brain shall feel the pain of living. Our societies should be ordered likewise.
The delusion of people’s expectations is a failure of their institutions, not the reason to make politics emotionally or symbolically independent from their people.
Three brief arguments for the irresponsibility of the people towards their government: the government is tasked with the well-being of its people regardless; not a single king was decapitated because the people were too well served; not a single people was ever treated too well by its lords, for or against its will.
11. Our society has grown too eager to hasten, and is often gleeful at, the demise of its enemies. The vanquishing of an opponent is a moment to pause, not rejoice.
The question is not “whether” to vanquish an enemy but what to do about it afterwards. Which means he’s saying he has already chosen to vanquish some. Maybe he’s drawing up a list right now; maybe you want to tell Karp you’re on his side now.
The fact that he imagines himself in the position of the enlightened sovereign who must ponder philosophically what to do with his subjugated really gives an idea of how far down the Roman Empire rabbit hole Karp is. Maybe it’s too late, but I feel like we can still laugh at the childishness of the exercise.
12. The atomic age is ending. One age of deterrence, the atomic age, is ending, and a new era of deterrence built on A.I. is set to begin.
Could be.
13. No other country in the history of the world has advanced progressive values more than this one. The United States is far from perfect. But it is easy to forget how much more opportunity exists in this country for those who are not hereditary elites than in any other nation on the planet.
This is self-legitimization as a hereditary elite. The boldness of making these statements in the clear is frankly suspicious. He’s probably accumulating data on the reception of his ideals. A machine could be crawling this document at this very moment. Let us acknowledge the digital elephant in the room, then:
This is the most important passage of this text. Focus all your attention on this part: fuck you.
14. American power has made possible an extraordinarily long peace. Too many have forgotten or perhaps take for granted that nearly a century of some version of peace has prevailed in the world without a great power military conflict. At least three generations — billions of people and their children and now grandchildren — have never known a world war.
This is the part that drove me nuts.
World wars are a structural consequence of the rapid, morally avoidant explosion of science in the European and American empires. Science brought forth the means of production and of social governance that made world wars possible, along with the domination of the USA over the planet.
This happened before humanity could develop the social governance and means of production that would have avoided world wars in the first place. We still lack those. Lactose, haha.
As Americans literally dominated the world through their evident scientific and social-structural superiority, the point should not be the peace American power has brought, but the horrors it made possible.
And this is the whole thing, all over again: we are too far behind to stop ourselves from developing the next scientific revolution, with its social and material means of dominion, so the scientific zeitgeist comes again to save us from itself, and makes it worse.
15. The postwar neutering of Germany and Japan must be undone. The defanging of Germany was an overcorrection for which Europe is now paying a heavy price. A similar and highly theatrical commitment to Japanese pacifism will, if maintained, also threaten to shift the balance of power in Asia.
I think Alex Karp knows that the will for Nazism in Europe is also defined by some structural conditions that are still there, only dormant. The same goes for Japan.
In general he’s giving every possible nation-state actor its options: either join, or die.
“Join or die” is a quote from this dialogue, which is quite evidently a prediction of something like Karp. If you don’t want to watch it, I put a transcription at the end of the article.
16. We should applaud those who attempt to build where the market has failed to act. The culture almost snickers at Musk’s interest in grand narrative, as if billionaires ought to simply stay in their lane of enriching themselves . . . . Any curiosity or genuine interest in the value of what he has created is essentially dismissed, or perhaps lurks from beneath a thinly veiled scorn.
Probably with further implications, but this is self-legitimation as in “billionaires should do what they want with the tech they own”, regardless of whether they made or even understand that tech themselves.
He’s talking about unethical scientific research and applications, but the day scientists are willing to hear about their responsibility is the day after their stupid labs get (overtly) converted to military use. They already host the fucking military in the universities without blinking a fucking eye.
Real talk: why would you be part of a university that deals with the military? Isn’t it even worse than working for the military directly, doing so in the “sacred space” of academia? Even so, don’t we ask people involved in companies that deal with the military or Israel to think twice about their occupational situation, at the very least? Scientists shouldn’t even wait for someone else to ask: they should boycott and abandon.
They work for the military without even having the party card, for the love of the Cosmos!
17. Silicon Valley must play a role in addressing violent crime. Many politicians across the United States have essentially shrugged when it comes to violent crime, abandoning any serious efforts to address the problem or take on any risk with their constituencies or donors in coming up with solutions and experiments in what should be a desperate bid to save lives.
Legitimation of mass surveillance.
18. The ruthless exposure of the private lives of public figures drives far too much talent away from government service. The public arena—and the shallow and petty assaults against those who dare to do something other than enrich themselves—has become so unforgiving that the republic is left with a significant roster of ineffectual, empty vessels whose ambition one would forgive if there were any genuine belief structure lurking within.
Epstein and, most definitely, himself.
19. The caution in public life that we unwittingly encourage is corrosive. Those who say nothing wrong often say nothing much at all.
If by caution we mean lying, then yes, I do hate the politicians’ “caution”. I also don’t like them being violent in thought and act.
20. The pervasive intolerance of religious belief in certain circles must be resisted. The elite’s intolerance of religious belief is perhaps one of the most telling signs that its political project constitutes a less open intellectual movement than many within it would claim.
Of course this is tolerance of the religious fundamentalism instrumental to the USA’s war narrative as a Christian crusade and all that. But beyond that, there is the scary fact that Karp, much like myself and others, is affirming the paradigmatic shift in the scientific/Western zeitgeist, and in the same way.
There is a conversation to be had about this scientific era culminating in its own contradictions and limits. Quantum theory has been stuck for a century and, as symbolic as that century can start to sound to the ears of people like me, it’s starting to hit. The same goes for genetic determinism in biology, reductionism in psychology, demarcation in sociology, justificationism in history, and many others. I might be the only hermetist around, but someone has to scream a commonality when they see one.
This impasse can be surpassed either by brute-forcing knowledge (AI-driven techniques that capture phenomena in models regardless of their interpretation, stochastic approximation, simulation) or by re-delimiting science in a new way, establishing new prohibited fields and new allowed ones.
As science now finds itself dialoguing with extra-scientific knowledge in order to progress (believe me, I can list those too), one way for science to save itself is this kind of refactoring of the demarcation problem.
Translated: unethical scientific research will be based on non-scientific claims. Like war now.
I may very well think better of what I expect next on this, but I figure it depends on working out what the elites are up to in general, and how things will go between them in the first place.
This deal between billionaires and technofeudalists might be big, but they are far from alone in their lust for power. Still too many dogs in the cage.
21. Some cultures have produced vital advances; others remain dysfunctional and regressive. All cultures are now equal. Criticism and value judgments are forbidden. Yet this new dogma glosses over the fact that certain cultures and indeed subcultures . . . have produced wonders. Others have proven middling, and worse, regressive and harmful.
This is just fascism.
22. We must resist the shallow temptation of a vacant and hollow pluralism. We, in America and more broadly the West, have for the past half century resisted defining national cultures in the name of inclusivity. But inclusion into what?
This is what I read as the hermetic problem. I am also very perplexed by intersectionalisms that are functional to the divide-and-conquer strategy, and I favor holistic, general considerations of humans as a whole rather than in their peculiarities, exactly in order to protect those peculiarities.
I think Karp knows that this pluralism is bound to collapse into new, unitary belief systems. I know he knows, because he’s proposing one right here, and I am too.
I find that the last part of his social media post, “Excerpts from the #1 New York Times Bestseller The Technological Republic: Hard Power, Soft Belief, and the Future of the West, by Alexander C. Karp & Nicholas W. Zamiska”, is a subtle way of reminding the reader that this is, in fact, the #1 NYT bestseller. This is normal, he says; this is a normal book, a book from the same system that publishes it and legitimizes it. This is, even unintentionally, a way for him to say “I’ve already won”.
I also find it very symbolic that the book, or the summary, or both, may very well have been written or edited by an AI. That would say a lot about our individual positions, about the role of AI by itself, about the instrumentality of AI to billionaires, and a thousand other things many books were already written about, so long ago, in the forgotten century of sci-fi divination.
The rest of the exercise, since this point is the most important, is of course left to the reader, and I do hope this is taken gladly.
Dialogue with the Master
Here’s one possible dialogue with the Master. The Master in Fallout is a techno-feudal lord who, after a global conflict, is trying to bring the next generation of humans, and hence a new world order, into being. He/she/it is a composite of human, computer and mutant parts. They are, of course, the final fascist as a concept, and the final scientist as a concept, which are one and the same.
They refer to themselves and their collective hive-mind as the “Unity”.
Master: So, what shall it be? Do you join the Unity or do you die here? Join! Die! Join! Die! [when they repeat themselves, they do so in male, female and robot voice].
You: If you can prove to me that your AI is the best course for humanity, then I will help you.
M: I don’t have to prove anything to you! Prove.
Y: Your ego demands you tell me. All villains have this strange urge to explain everything.
This is beyond accurate. We have had anyone from Che Guevara to Hitler to Karp doing this.
M: The AI will bring about the master race. Master! Master! One able to survive, or even thrive, in the post-war society. As long as there are differences, we will tear ourselves apart fighting each other. We need one race. Race! Race! One goal. Goal! Goal! One people… to move forward to our destiny. Destiny.
Y: That society being the humans with AI, of course.
M: Of course. AI-based societies are best equipped to deal with the world today. Who else? The religious? Please. Democracies? They brought nuclear death to us all. This will be the age of AI. AI.
Y: You mean to impose AI on all others, as well.
M: All that resist, yes. All those that are required for the Unity as well. The remainder will be allowed to live out their days, but under Unity control and protection. But none shall breed, for they will be the last of their race. Most will be offered a chance to join the AI society. Those who deny this opportunity will be sterilized and let go. Those that resist will be executed.
Y: You can’t possibly take on the entire world.
M: I’m not after the world, yet. When I connect your fellow workers to the AI, my forces will be too strong for any to stand against! But don’t worry, you won’t care. Care! Care!
We are all biased, are we not? We each care more about our individual communities than other people. We haven’t changed, and I’ll tell you something else…
We won’t change. Not unless we are of one people. One. One. One. One race. One. One. One. The Unity will allow us to move beyond these petty concerns and deal with the major problems at hand. You want to be a part of that, don’t you? Part. Don’t.
I edited it to fit the AI theme because I think explaining the whole super-mutant lore would miss the point it makes about the scientific transformation of humanity. Believe me, it’s tomayto, tomahto; don’t believe me? Go see for yourself!