promethea.incorporated

brave and steely-eyed and morally pure and a bit terrifying…


The Basilisk of Phil Sandifer, part 2/10

The Rabbithole’s Event Horizon

The idea of a red pill features prominently in Sandifer’s book, the concept itself having taken a detour through the neoreactionary movement before finding its natural home in a subculture of fedora-wearing programmers who watch My Little Pony and love to complain that women never invented anything important (not that there’s anything wrong with being a fedora-wearing programmer who watches My Little Pony, it’s just that the last assertion is quite trivially incorrect).

Unfortunately, Sandifer’s treatment of the topic falls short of the original inspiration. While The Matrix is an extended metaphor, sprawling temptingly and brilliantly into meta-five, meta-six and onwards into as deep a recursion as a human brain is capable of and hypnotizing me with its intertextual insight pornography into a state not unlike being on acid despite being fully and legally sober, Neoreaction a Basilisk is the sort of book that is best enjoyed with half a tab of acid, being utterly hilarious and making a massive amount of sense mostly because absolutely anything is utterly hilarious and makes a massive amount of sense on half a tab of acid. Sadly I was not on half a tab of acid while reading it (because that would’ve been illegal, drugs are bad if you live in such a jurisdiction mmmkay) so I did not have access to the state of mind where I could have just leaned back and enjoyed the ride.

Leaning back and enjoying the ride is what the book ultimately is about. It is not a sophisticated argument or an honest attempt at genuine discourse (and neither is this review, frankly, just to clarify the issue to those who haven’t picked up that obvious fact yet). It is a confused amateur ethnography on cultures that haven’t earned enough mainstream respectability that writing confused amateur ethnographies on them would be considered distasteful. In that sense it could be best compared to the works of european colonialists traveling to Africa and reporting back on the barbaric disorganized nature of the locals’ communities, because african cities were organized according to a structure europeans didn’t understand, instead of the simple cartesian system white people considered the pinnacle of civilization.

This particular type of error is one of the biggest ones underlying Sandifer’s liberal arts eschatology, and it shows up again and again. The idea that all there is to know can be known by a marxist English major, and that the world is obligated to be fundamentally comprehensible to one, leaks through constantly yet remains forever unaddressed in explicit terms, thus leading to pattern-matching and deeply unsatisfying arguments.

For example, it is quite a cliche, faithfully repeated in the book, that transhumanism is “merely” a symptom of people’s fear of death. (Nothing is “mere”, my friend, nothing is “mere”.) This is nigh-universally accepted to the degree that no justification is considered necessary. Transhumanism is unwillingness to accept death, end of discussion. Nowhere is it actually explained why unwillingness to accept death would be a bad thing. The critical reader would obviously begin to suspect that perhaps they cannot explain it. The median reader would obviously get outraged at how a fundamental part of their worldview is not accepted as obvious.

One would expect a marxist to be more sympathetic to such ideas, as marxism itself is popularly dismissed as “envy, end of discussion”. The laborer in the dark satanic steel mills asks “why exactly should Mr. Carnegie have so much money while I have so little, and why exactly should the Pinkertons be allowed to shoot us if we protest while we aren’t allowed to shoot them?” and the popular opinion answers “haha, he is just envious that Carnegie has money and he doesn’t”. Marxism starts from the assumption that maybe things should not be that way, and like any movement it ultimately devolves into a cherished set of excuses for the parts of the status quo one doesn’t want to think about too deeply. Once again the red pill remains a blue pill with an instagram filter on top.

In fact, the book’s opening reveals the deeply corrupt nature of Sandifer’s modern marxism. “Let us assume we are fucked,” says Sandifer. “Let us not,” says Marx. “Let us assume that capitalism will indeed continue to disrupt every single industry until we each are gig contractors, languishing under the iron hand of the algorithmic management of the Uber of Whatever.” (Of course, Marx originally did not know about the Uber of Whatever, but translating his original observations into modern language is mostly a simple search-replace operation.) “Let us consider what might be done about this.”

Marx’s answer is obviously (spoiler alert to anyone who hasn’t been alive in the last 160 years) “historical inevitabilities will result in communism”. In fact, so is Sandifer’s, with one crucial difference: “communism” gets replaced by “extinction” which is even more revealing about this liberal arts eschatology. The world is ready, the answer is written on the first line, “Let us assume we are fucked”. Everything else is commentary. One may approach the conclusion in a “decelerationist” way, or in an “accelerationist” way, or shy away from it entirely, but rejecting the inevitability of this idea altogether is on the wrong side of the event horizon of comprehensibility, and thus the shadow it casts against the accretion disk must be pattern-matched into the nearest comprehensible thing.

(Or, to be more precise, the proper analogy to capture the true magnitude of the abomination this is would be Sandifer observing ideas from inside the event horizon and seeing something that claims it will escape the superluminal gravitational pull of the inevitable future that is the black hole’s singularity; rejection of the limits of the comprehensible is to the liberal arts eschatology as magnificent a violation of the laws of reality as breaking the lightspeed barrier would be to a physicist. An astute reader might notice that only one of these rules seems to be hard-coded into the universe itself, and that even the rules that are hard-coded into the universe barely flicker within the boundaries of the comprehensible. What this says about the merits of each might as well be left as an exercise for the reader, as I do not believe a person who believes in the dogma of mandatory comprehensibility would be willing to change their mind on this topic.)

Thus, rejecting death is seen as a personal flaw, for one cannot comprehend a reasonable mind that might not accept death. A universal feature of such a liberal arts eschatology seems indeed to be the unsolvability of problems, at least of problems that are not fundamentally social in their nature. Dramatically restructuring the entire society and economy is seen as an obvious and laudable goal, for it’s “only” social, and the universe’s unwillingness to play along is unfair and unreasonable no matter how much the means of pursuing the goals conflict with the iron laws of incentives.

Now, Marx himself seemed to be quite aware of the iron laws of incentives; his predictions about where they might lead just happened to be subtly incorrect in a hard-to-immediately-anticipate way. Indeed, this attachment to the conclusions and rejection of the methods is a fundamental characteristic of Sandifer’s marxist liberal arts eschatology, and if reanimating the dead were possible I would be willing to bet money that old man Marx would readjust his beliefs in the present day while many of his followers would be left in the somewhat embarrassing position of wanting to die on the hill their idol has withdrawn from.


Part 1: A False Manhattan

Part 2: The Rabbithole’s Event Horizon

Part 3: Hubris

Part 4: The Marvels of Duct Tape

Part 5: The Darkening

Part 6: A Game to End All Games

Part 7: The Players of Games

Part 8: Men, Machines, Monsters

Part 9: The True Basilisk of Phil Sandifer

Part 10: Denouement

1 month ago · tagged #the basilisk of phil sandifer #basilisk bullshit #nrx cw · 16 notes · .permalink


The Basilisk of Phil Sandifer, part 1/10

A False Manhattan

I once freaked out when a computer program from the future threatened to hurt me.

Now, this obviously sounds preposterous and utterly ridiculous. Nonetheless, the truth is that some people, including myself, do not find it as immediately rejectable as most people do, and in some parts of the internet these people have become quite the subject of debate, vigorous and vicious alike.

If you are anything like me, you are probably looking for a book that would tell you the basics of what exactly is going on with two of possibly the strangest subcultures of the last ten years. A book that would point and laugh, mock relentlessly, and savagely eviscerate their beliefs with the brilliance only someone who truly understands what they are talking about can muster. A book that would force even the most ardent supporters of those ideas to recognize that there is a certain absurdity in them, and laugh along for the ride. A book that would neatly tie together the triptych of the good, the bad, and the ugly that Eliezer Yudkowsky, Mencius Moldbug, and Nick Land personify and emerge victorious with some impressive insight into the human condition.

If so, keep on looking and tell me if you find it, because ‘Neoreaction a Basilisk’ is not that book.

But in its attempt to be that book it provides a fascinating and frightful perspective on an unwritten ideology that pervades every aspect of western popular thought in the postmodern day: that of the liberal arts eschatology and the dogma of mandatory comprehensibility; and reveals Sandifer as an unwitting lovecraftian protagonist in a classic example of the genre: the writer who studies the diaries of others who have encountered something outside everyday comprehension, and follows them into something he did not expect to encounter, either recoiling at the last minute back into a reality whose trustworthy foundations have been fundamentally shattered, or succumbing to it completely.

Our protagonist, Phil Sandifer, is a marxist English major at the Miskatonic University of Arkham, Massachusetts, who has stumbled upon the collected texts of three controversial eccentrics and seeks to study their works to understand the dark truths beneath the superficially serene consensus reality we share. For this purpose he made a kickstarter starting at $2000. The book mostly talks about those three, but make no mistake; Sandifer is the true main character whose descent into classic lovecraftian horror we perceive through his writings.

The book begins bleakly, setting the tone and conclusion in advance: “Let us assume we are fucked. The particular nature of our doom is up for any amount of debate, but the basic fact of it seems largely inevitable. My personal guess is that millennials will probably live long enough to see the second Great Depression, which will blur inexorably with the full brunt of climate change to lead to a massive human dieback, if not quite an outright extinction. But maybe it’ll just be a rogue AI and a grey goo scenario. You never know.”

Of course, this is an assertion of an assumption, which is mainly founded on the mainstream dogma of ~capitalism~ destroying the ~ecosystem~ so that ~we are fucked~ and ~nothing can actually be done about it~. A clear case of liberal arts eschatology, an eco-material fatalism of a degraded marxism that lost its will to live somewhere in the last 50 years. But a head-on assault on this liberal arts eschatology would be difficult (or at the very least, deeply unsatisfying), so let us instead head back in time a bit, seek out pieces of its origins, and piece together a terrifying vision of the whole. Our first stop shall be in 1987.

In Alan Moore’s ‘Watchmen’, Doctor Manhattan is a brilliant scientist whose physical body gets accidentally taken apart and who consequently becomes a disembodied consciousness living in a magical quantum dimension, able to manipulate matter on a fundamental level however he wishes. To Moore’s credit, he mostly does a splendid job of keeping the idea together; the universe of Watchmen operates on a different set of natural laws than ours, and the few glimpses the work reveals (prudently; just enough to maintain credibility while avoiding self-contradiction) fit together well enough to let the reader fill in the gaps. ESP, telepathy, mind over matter, and the superscience which produced Doctor Manhattan neatly form a coherent whole.

But where it falls apart is Manhattan’s psychology. Superintelligent characters are hard to write, because one needs to convincingly fake a level above one’s own. If you knew how AlphaGo would play, you would be just as superhumanly skilled, but because you aren’t, you are always at risk of making a move that vaguely seems like a move AlphaGo might make, but which does not fit the underlying logic by which AlphaGo plays. And if you are an amateur, making such a move may fool other amateurs, but Lee Sedol would recognize that something is off and AlphaGo itself would facepalm quite thoroughly if it had a palm. And a face. And a psychology.

This is basically exactly what Moore does to Manhattan. He is not actually a superpowered being to whom the world’s smartest man is little more than the world’s smartest termite, and thus when he needs to write Manhattan out of the story he does something that to him seems perfectly sensible, but to someone who is closer to what Manhattan would actually be than Moore himself is (I never promised to be humble), it is clearly a terrible move. A person’s father is someone unexpected, and Manhattan is like “woah, humans are way too random and unlikely, doc out”.

Unfortunately, Moore doesn’t understand what else is random and unlikely: the exact pattern of decay from a piece of plutonium, for example. And literally everything else as well. It is highly preposterous that Doctor Manhattan would so privilege the unlikely things of human psychology when he is completely unfazed by the unlikely things of nuclear decay, and it is especially grating because one can so obviously see a better answer.

“In this event, nothing was technically beyond my understanding. I could see the neurons, the axons, the transmitter chemicals, down to every single quark, with perfect clarity and the inevitability was obvious. Yet there is one thing I couldn’t know: the subjective experience of having this happen. This neuron sends this signal to that one, and it outputs actions, speech, thoughts, but I was not her, and from my own position I could never truly comprehend what was going through her head in that moment. Humans are the only thing in this universe that I can’t understand, they are way too fascinating for me, doc out.”

Of course, the weaknesses of this approach are still visible: Thomas Nagel could bring forth an impressive objection as to why exactly Manhattan wonders what it is like to be a human, but not a bat, which surely must be an even more foreign experience. Nonetheless, this is defensible, and far stronger than Moore’s original; it is easy to imagine Manhattan’s mechanistic perspective, superhuman but still bound to his fundamentally human mind, shaken when this one event makes him consider unexpected ideas, and not having wondered what it’s like to be a bat is obviously a simple oversight in Manhattan’s cognition, which is all-seeing but not really all-knowing.

This idea that all human minds are fundamentally intercomprehensible underlies the works of Moore and Sandifer alike, and leads them to latch onto convenient stereotypes when they don’t know the more sophisticated reasons why people would believe different things (of course, as a marxist the author surely must have no experience in having his views misinterpreted by people who lack the background information with which they make a lot more sense; suffice to say, the very concept of ‘inferential distance’ gets its own dose of mockery early on because it was used in a less-than-optimal way in the early LessWrong community (yet again something marxists are obviously unfamiliar with)).

The Wachowski sisters’ masterpiece ‘The Matrix’ is another example of a work falling prey to inferential distances. The eponymous Matrix is a simulated reality which holds people in a consistent state of non-awareness, keeping them alive and sane so that the machine overlords of Earth can secretly run their processes on unused neurons (“You only use 10% of your brain, the rest runs the system that keeps you imprisoned”) because it’s a really convenient source of computing power in a world where humans destroyed other easy sources of computing power (and the reason the simulated society is specifically late-90s american capitalism is obviously that, in its unironic embrace of “the end of history” and other ideas that would prove really embarrassing in just a couple of years, it was the least cognitively challenging period in humanity’s history for your average corporate drone; convincingly faking the subjective experience of endless cubicle misery is far less computationally expensive than simulating the vibrant “life-or-death, doesn’t matter I’m living to the fullest” challenges of hunter-gatherer societies or the unpredictable synchronized global hivemind of the 2010s; and if someone questions why exactly they have been doing the same exact pointless, intellectually unchallenging things in cubicles for what feels like fifty years, the perfect excuse is already there: this is the end of history, get used to it).

Of course, Hollywood wasn’t going to have any of that. They needed something that ~made sense~, so they switched the backstory away from stealing processing cycles from a brilliantly energy-efficient computer that can replicate itself even if semiconductor fabs are destroyed to the utter nonsense of using humans as batteries. Because with a form of fusion, the machines could satisfy all their energy needs with human bodies. Yes, you read that right, the machines have fusion but for some reason are still extracting energy from humans. The physicists in the audience are now facepalming really hard, the amateur physicists understand what I’m talking about, and the non-physicists demonstrate the validity of the crucial concept of inferential distance.
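(For the non-physicists who would rather not demonstrate the concept: a back-of-the-envelope sketch, using deliberately rough figures that are my own ballpark assumptions rather than anything from the film. A resting human radiates something on the order of 100 watts of heat, but only because it is being fed roughly 100 watts’ worth of nutrient goo in the first place, so by conservation of energy the “battery” can never return more than what gets poured into it, and the machines would do strictly better burning the feedstock directly, never mind just running the fusion they canonically already have.)

```python
# Back-of-the-envelope energy balance for the "humans as batteries" scheme.
# All figures are rough, assumed approximations, not canon from the film.

FOOD_INPUT_W = 100.0        # ~2000 kcal/day of nutrient goo fed to each pod occupant
BODY_HEAT_OUTPUT_W = 100.0  # resting metabolic heat a human radiates back out
HARVEST_EFFICIENCY = 0.3    # generously assumed efficiency of capturing that heat

net_power = BODY_HEAT_OUTPUT_W * HARVEST_EFFICIENCY - FOOD_INPUT_W
print(f"Net power per human battery: {net_power:.0f} W")
# Prints a solidly negative number: the farm consumes more energy than it yields,
# which is why burning the goo directly (or using the fusion reactors the machines
# already have) would always win.
```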

Naturally, Hollywood did not explicitly, consciously decide that the movie should not be about ‘the things we don’t think about upholding an oppressive system that keeps us bound to serve it’, but that’s kind of exactly the point I am making here. Of course, the awakening to “reality” where people can be brave freedom-fighters against the evil system to liberate themselves from being squishy duracells is once again simply yet another layer of The Matrix itself. The red pill is the ultimate blue pill, placating those who need to believe that they have some secret knowledge the rest of humanity lacks in order to be willing to be placated. In actual reality the escape is no escape. Buy a Che t-shirt from Amazon. Identify as an objectively rational atheist whom absolutely nothing could convince of fairytales. Discard ideologies about gender and join the red pill movement. The Matrix is ultimately about ethics in gaming journalism.


Part 1: A False Manhattan

Part 2: The Rabbithole’s Event Horizon

Part 3: Hubris

Part 4: The Marvels of Duct Tape

Part 5: The Darkening

Part 6: A Game to End All Games

Part 7: The Players of Games

Part 8: Men, Machines, Monsters

Part 9: The True Basilisk of Phil Sandifer

Part 10: Denouement

1 month ago · tagged #the basilisk of phil sandifer #basilisk bullshit #nrx cw · 27 notes · .permalink


wackd:

philsandifer:

socialjusticemunchkin:

What does Phil Sandifer have in common with a respected literary genius of modern pop culture?

What does it really mean that his book is “stellar”?

Is 20 pages, with digressions into, among other things, the revolution of 1800, short fix-fics of two masterpieces, recursive meta-paranoia, and the implications of half a tab of acid, the right length for a book review?

Why is it vital for the fate of the universe to convince Sandifer to install Linux?

What is the horrible secret that would make Karl Marx and AlphaGo alike facepalm if one weren’t dead and the other a cold unfeeling machine without a palm, a face, or a psychology for that matter?

And what if the true sneer culture was ourselves all along?

…the tantalization shall continue until friendliness improves!

New favorite review.

This really should be the blurb.

Just you wait until I finish the real thing I’m putting the final polishing touches on…

1 month ago · tagged #basilisk bullshit #nrx cw · 16 notes · source: socialjusticemunchkin · .permalink


What does Phil Sandifer have in common with a respected literary genius of modern pop culture?

What does it really mean that his book is “stellar”?

Is 20 pages, with digressions into, among other things, the revolution of 1800, short fix-fics of two masterpieces, recursive meta-paranoia, and the implications of half a tab of acid, the right length for a book review?

Why is it vital for the fate of the universe to convince Sandifer to install Linux?

What is the horrible secret that would make Karl Marx and AlphaGo alike facepalm if one weren’t dead and the other a cold unfeeling machine without a palm, a face, or a psychology for that matter?

And what if the true sneer culture was ourselves all along?

…the tantalization shall continue until friendliness improves!

1 month ago · tagged #basilisk bullshit #nrx cw #drugs cw · 16 notes · .permalink


nostalgebraist:

NAB notes: Sandifer’s negative evaluations as wards against horror

This is turning into a series, so for reference: Part 1, Part 2, Part 3

While reading Neoreaction A Basilisk, I kept wondering how much it was meant as a takedown of the trio, and how much it was meant as a creative, conceptual riff which simply used the trio for some raw material.

Of course, the answer is “it’s both.”  But that isn’t quite right, either.  As a takedown, it’s scattered and not especially useful for someone who just wants to know what’s wrong with the trio.  I’ve seen a few posts from people saying they wanted it to be some sort of primer for fighting neoreaction, and it clearly isn’t that – saying “Moldbug’s use of Satanic negation reveals his unacknowledged sympathy for Satan as represented in Paradise Lost” is not the kind of idea that will help you out in direct political scuffles with Moldbug fans.

As a conceptual riff, though, it’s continually limited by the invasion of takedown-related material.  The book presents itself as an examination of strange internet (pseudo-)philosophers who – like classic horror story protagonists – are confronted with the unintended, disturbing, mind-searing implications of their own work.  This sounds like a good story, and it seems as though Sandifer wants to tell it.  But whenever the story starts to get interesting, whenever a bit of real narrative develops, whenever Sandifer starts tying the literary resonances here to his own literary interests like Milton or Blake … it all quickly runs aground, usually within a page, because Sandifer switches back to an evaluative mode.

Any attempt to build a mood, to dim the lights and get the audience spooked, is quickly interrupted as the lights flip back on and the storyteller starts haranguing you about how our mad philosopher protagonist made a totally shit point in this one blog post, oh my god, how are people so wrong on the internet.

It’s clear that Sandifer does not see Yudkowsky or Moldbug as intellectuals worth taking seriously (Land is a bit more complicated).  So it would be easy for him to just say at the outset: “look, I don’t think these people’s actual ideas are worth the virtual paper they’re printed on.  I do find them interesting as characters, and I’m going to tell a story about their journeys that I find potent in itself, like so many other good stories about awful or risible people.”

Indeed, this is sort of what he does, in the early parts of the book.  As @psybersecurity writes:

One problem is that Sandifer can’t help but continue to use Moldbug and Yudkowsky as punching bags. It’s a bit of an issue - after presenting legitimately good, concise criticisms of the two in the book’s introductory segment, he seemingly feels justified in adopting a smug attitude towards them as easily ignorable figures that no respectable intellectual would take seriously. And yet he can’t help but bring up qualms with them again and again, as if he’s not quite as secure in his dismissal as he wishes he was. 

This is not just some little infelicity, I think.  It’s a major problem which holds the book back a great deal in its ambitions to do something creative and legitimately chilling.  The “story” is so stop-and-go that it’s barely there: the book is so wedded to the takedown format that any flights of fancy Sandifer wants to attempt must be weighed down with great ponderous loads of potshots.

Why is the book like this?

My bet is that the conceptual/narrative riff, not the takedown, was Sandifer’s driving motivation.  His descriptions of the book are heavily slanted in that direction, after all.  Take this paragraph from the Kickstarter:

Neoreaction a Basilisk is a work of theoretical philosophy about the tentacled computer gods at the end of the universe. It is a horror novel written in the form of a lengthy Internet comment. A savage journey to the heart of the present eschaton. A Dear John letter to western civilization written from the garden of madman philosophers. A textual labyrinth winding towards a monster that I promise will not turn out to be ourselves all along or any crap like that.

IMO, this is a great pitch.  It also sounds far more interesting and fun than the actual book.  The description suggests literary game-playing, genuine induction of unease in the reader, a work of creative writing by someone who, incidentally, doesn’t think much of the people who served as its inspirations.

Why couldn’t Sandifer have just written that book?  I suspect – and I could be wrong – that Sandifer has realized that his intended audience won’t look kindly at any book about neoreaction and Less Wrong unless it’s a takedown.  Sandifer is not aiming this book at fans of these ideas, and his target audience is either already hostile to the ideas or likely to become hostile when made aware of them.

He’s clearly interested in writing something that takes concepts like “Red Pills” and “democracy will destroy itself” seriously, and doing creative work within that framework.  But that framework comes from people whose other views he abhors.  Writing a book of riffs on the aesthetic potential of “the Red Pill” runs the risk of looking like you’re sympathetic to “the Red Pill” as conceived of by Moldbug and PUAs.  “Roko’s Basilisk” makes Less Wrong a readily dismissible laughingstock to various parts of the internet; it’s also “a really spectacular story,” as Sandifer puts it, but if you push that angle to the point of admitting the idea really is chilling, you risk looking like you’re no savvier than the folks who freaked out about it in the first place.

So Sandifer must continually reassure his readers: “it’s OK, I think these people are ludicrous, I’m not taking them seriously.”  This explains why he keeps on taking potshots against Yudkowsky and Moldbug long after he’s fully dismissed them as serious thinkers.  He knows that a book that treats these people even as serious literary characters is going to strike a lot of people as conceding too much to them.  So he tries to treat them as serious literary characters, because that’s his fundamental project, but he still keeps worrying that he might be taking them too seriously for his audience’s tastes, and so he keeps interrupting the story with more disses, until the cancerous tissue of the disses occupies so much space that the story is a mere shadow of what it might have been.

This also explains why his disses are so half-hearted.  That’s not to say he’s too nice: he’s perfectly willing to call these people idiots.  If anything, though, he still pulls his punches.  He’s willing to call the trio some nasty names – because that’s a cheap, easy way to convey antipathy – but he doesn’t delve into their work far enough to identify its true (and vast and deep) flaws, sometimes ignoring obvious and damning critiques in favor of much weaker ones.  You can get a far more damning primer on Moldbug’s failings from the Anti-Reactionary FAQ (published Oct. 2013), and as sweet Yudkowsky dunks go, he has nothing on someone like @argumate​.

I don’t think this is because Sandifer can’t write a takedown.  I think it’s because his heart isn’t in it.  He’d never countenance this kind of laziness when it comes to Milton and Blake, because he actually cares about Milton and Blake.

But nonetheless, the half-hearted dunks interrupt the action again and again, insistently, compulsively.  Because if he went too long without them, he’d be writing an actual treatise on the serious literary potential, the horror and beauty, of “Red Pills” and “basilisks,” of silly and possibly evil internet ephemera.

I don’t want to go too far here, but I hope this way of going-too-far is in the spirit of all of this: it seems like his decision to send review copies to neoreactionaries and Less Wrong rationalists would fit naturally into this defense.  Presumably these people will get bees in their bonnets and write some infuriated words, which will reinforce the impression that Sandifer’s book is a takedown, which will neutralize any remaining sense that he’s fraternizing with the enemy.

I should be clear.  I’m not saying that Sandifer agrees with the trio’s substantial claims, any more than one has to endorse Humbert Humbert’s self-presentation to enjoy Lolita.  But there are some people who, understandably, can’t enjoy Lolita anyway, because they simply and for good reason want nothing to do with people like H.H., and are emphatically opposed to exploring his emotional complexities, his pathos, what can be done with him from a playful ironic literary remove.  They don’t want to explore his possibilities; they just want to say “fuck that guy” and be done with it.  So, too, with some people and neoreaction.  But Sandifer is not one of these.  He’s interested in the pathos and the playful possibilities.  He wants to write Lolita, not a manual on the prevention of child abuse.

And so, in the book itself, like one of the horror protagonists he discusses, Sandifer continually, compulsively – and less and less convincingly – says no, asserts that nothing is wrong, that he’s in control, that he’s not unhealthily interested in his subjects, that he knows they’re wrong and evil (did you know he thinks they’re wrong and evil?  let’s say it again to make sure), that he may be gazing into the abyss but – rest easy – it’s not gazing into him, that nothing is off here, dear reader, oh no, that the trio is just as dismissible as you thought when you began reading, let me just reiterate that once again for clarity, no there is not anything going on over there in the shadows –

He’s of the Devil’s party, but he doesn’t know it.

(via argumate)

1 month ago · tagged #basilisk bullshit #nrx cw · 53 notes · source: nostalgebraist · .permalink


Neoreaction a Basilisk

(kickstarter.com)

philsandifer:

Any neoreactionary types who would like a review copy of the book in exchange for a promise of a fair and public thrashing of it, please let me know and give a link to the blog you’ll review it on. I’m happy to give you a PDF if you’ll promise to trash-talk it honestly and with quotations. 

The offer also applies to Yudkowskian rationalists, but you have to promise to say more than just “it’s sneer culture.” It’s totally sneer culture, and you can point that out, but that can’t be the main thrust. 

Yes I am doing this for the money. Will sorcelate for food. 

So I heard someone was giving neoreactionaries and people who once freaked out when a computer program from the future threatened to hurt them the John Oliver treatment, except that this person is not John Oliver but instead some guy with a vaguely similar-ish sounding name who started his Kickstarter campaign at $2000. As someone who once freaked out when a computer program from the future threatened to hurt them, and who always enjoys the John Oliver treatment of anything, I’m very interested in finding out the facts of the matter.

Now, whether or not this one is sneer culture is obviously not the relevant fact of the matter, but instead whether or not this one is good sneer culture.

In addition, this is the rare treat of sneer culture actually directed at me without the highly visible hand of meatspace violence backing up the sneering, and thus, unlike with the works of bioethicists or terfs, I expect that even in the worst case I would receive a highly unusual opportunity to read something in the vein of “this is what these people actually think of people like me” that doesn’t make me feel like writing a vengeful computer god just to feel safe in a universe which contains such people.

And more, some people have alleged that this sneer culture is endangering the very fate of our universe by making a fanfiction writer who turns people trans and takes their money for no reason whatsoever through a text-only communication channel and once was very overconfident on quantum mechanics appear less seriously-takeable by Serious People, which is quite a fascinating prospect and I am very intrigued to find out more.

However, I’m unlikely to spend an actual $5 on the book, so I need to find a way to read it for free. Conveniently, I’m allegedly quite good at producing value for people by writing things, and it just so happens that the guy whose name sounds just a bit like “John Oliver” is offering free copies to people who could create value by writing about it. In light of this information I believe that it would be a mutually beneficial transaction to engage in such an exchange.

In addition, I am a person who chooses to like exercise so that I would have received a better set of genes than otherwise, and I do Actually Believe in computer gods, programs from the future that threaten people, and living forever by dying from severe rapid hypothermia and turning into a number.

(At least for some values of “Actually Believe” that to most people, such as the ones who use phrases like “Actually Believe”, are utterly indistinguishable from other values of “Actually Believe”.)

As such, I believe that I am uniquely qualified to review this book: as an overconfident neophyte dropout with a lot of raw talent and weird ideas and a disrespect for the established and respected authorities and their sensible commonly accepted ideas, and an utter absence of actual accomplishments other than convincing many people of those weird ideas, I might be the closest thing to basilisk-era Yudkowsky the guy with a really small kickstarter could ever hope to get to read his book. The most important difference is that I openly display a substantial degree of self-awareness and do a lot of countersignaling on the topic of credibility; whether that makes me more or less fun to interact with on this matter shall be left to the readers.

I even promise not to leak the book to the pirate bay just because information wants to be free, because the weird computer god decision theory says that my promises should be reliable even when I totally could flake on them. In addition, I believe that incentivizing other people to purchase this piece of sneer culture (if it is worth purchasing; something I’d expect to have an answer to a few days after I receive a copy) for their entertainment might be a good thing: support your friendly local sneer culture instead of faceless corporate Big Sneer!


Does promethea get a free book out of this?

Do they write the review as promised, earning them the effective monetary value per work-hour of Bangladeshi minimum wage?

Is your friendly local sneer culture truly friendly and worth supporting over Big Sneer? Or is this all a ploy by the Unfriendly Sneer Culture in an attempt to blackmail us into bringing it into existence?

And most importantly of all: will the ultimate fate of the universe be decided for good (or evil, as it may be) by a guy who wrote a silly book and started his kickstarter campaign at $2000?

Tune in eventually to find out!

1 month ago · tagged #basilisk bullshit · 64 notes · source: philsandifer · .permalink


argumate:

@socialjusticemunchkin:

[…]

I’m not against the book and I don’t really want to be mean and I was entertained by all the excerpts I’ve seen of it, but I just think that there is ~complexity~ at play which pattern-matches to the kinds of things that have empirically been very harmful to people and ideas I care about and thus there is some cause for concern in how said ~complexity~ is addressed.

That’s fair enough, but I feel there is a certain hypersensitivity issue that comes up repeatedly around these topics.

And I realise that even by framing it in those terms I’m sort of playing into the narrative, eg. it sounds like I’m accusing people of caring too much and hence implicitly endorsing deaths and genocide, which is far from the truth. But there seems to be a greased waterslide where anything that smacks of mockery gets associated with every act of mockery ever, particularly the really nasty ones.

Ultimately it always boils down to whether the mockery is detected as coming from inside the tent or outside the tent, because mockery from outsiders cannot be tolerated or it will lead to gulags and terrible suffering.

But mockery is a common part of criticism, and forbidding criticism from outside the tent unless it can be expressed in more respectful and restrained terms even than those used by insiders basically shuts down all possible criticism.

I mean you give various examples of how mockery can have bad consequences, but they are not all very compelling. Many critics of cryopreservation mock it out of sheer frustration that people keep persisting with methods that cannot work, arguably a misallocation of resources that can lead to deaths from opportunity cost alone. (And of course proponents of cryopreservation can mock those who oppose it, when they aren’t being outraged about Deathists).

While the dudes in dresses rhetoric often accompanies violence, I think it would be simplistic to say that it causes the violence. You could say that letting it pass unchallenged excuses the violence and sends a signal about what is acceptable.

But now the discussion has suddenly shifted from a mocking tone being used in philosophical discussions to actual violence and murder! I’ve seen plenty of jokes made at the expense of P-zombies, but as far as I know none of them have resulted in acts of aggression against people who lack qualia.

This isn’t intended as a defence of mocking people or being a jerk. But I think that some humour is justifiable in response to published texts pushing a political or philosophical worldview explicitly intended to convince others, and that forbidding any attempt at humour would make for a poorer world.

(And you know, someone would have to go searching through LessWrong and edit out any sarcastic remarks about talking snakes in the garden of Eden, and that sounds like way too much work).

My argument is basically that mocking the weird-sounding arguments that are low-status is significantly more harmful than mocking the weird-sounding arguments that are commonly accepted, and that I’d really appreciate it if people did less of the first type of mocking, and if they are unable to tell the difference, then doing less mocking altogether would be nice. (Heuristic for determining mocking: is it likely to feed into a pattern where people dismiss something out of hand based on the stereotype: “dudes in dresses lol” is a common dismissal of arguments for why trans people should be taken seriously, and “human popsicles lol” is a common dismissal of arguments for why attempts at cryonics (or alternative technologies pursuing the same goals) should be taken seriously, and “robot gods lol” is a common dismissal of arguments for why AI should be taken seriously.)

If I could achieve such an equilibrium by reducing the amount of “talking snakes lol” in the world I’d take the deal. And I want to scorn the people who only focus on mocking (not really blaming the author so much, but rather the people who take it as an excuse to engage in “robot gods lol” and “rationalists are nazis lol”) without having adequately engaged the arguments; it’s one thing to have “here’s my thorough argument for why I don’t believe in cryonics working: (…) in summary, human popsicles lol” because at least it has some actual arguments to address (and Cthulhu knows I’m sometimes snarky in my own writing), while just “human popsicles lol” makes it way too easy to dismiss attempts at addressing it with simple repetition of “human popsicles lol”.

I don’t know if this makes any sense as written, but it does in my head, and the brainpattern-to-language translation is at fault if it doesn’t.

1 month ago · tagged #cissexism cw #basilisk bullshit #transmisogyny cw #death cw #status games cw · 122 notes · source: argumate · .permalink


argumate:

@socialjusticemunchkin:

That fucking basilisk story was totally misrepresented though.

Sure, it is entertaining to say “freaked out when a computer program from the future threatened to hurt him” and I always enjoy such entertainment, but I enjoy it as cheap self-deprecating humor while many others seem to actually take it as argumentation and that is a bad thing. The basilisk was a security hole in the software of some human brains that needed investigating and patching so that it would not present a potential issue later.

I’m no stranger to seemingly unintuitive ideas that are trivial to mock despite being actually way more serious and thus anything that smells like an attempt to avoid addressing such things by pointing out how superficially ridiculous they appear puts The One Which Watches The Watchers into Defcon 3. I don’t think I should need to point out that “haha basilisk lol look at these fucking bayesians” is exactly the same kind of argument as “haha look at this scrawny dude who thinks he can be a lesbian just by popping some magic pills and wearing skirts lolnope”.

The Basilisk story fits in with various themes of the book, such as “red pill” ideas that drive one to madness and/or reveal the hidden horror at the heart of things, plus the end of humanity and various attempts to hasten or avoid it.

For the Basilisk to be an issue in the first place requires accepting a whole bunch of propositions about the nature of consciousness, artificial intelligence, and future recoverability of information. But although this particular formulation is highly specific to the LessWrong community, as a literary phenomenon it crops up elsewhere in a variety of other guises, which is interesting.

Finally, please remember that Eliezer proposed giving lectures in a clown suit to avoid building up a cult of unnecessary formality and respect for appearances. People are awfully sensitive about the merest hint of sneer culture, and it is actually possible to have mild teasing that doesn’t result in pogroms.

Because “haha human popsicles lol” has never [possibly] resulted in gross misallocation of humanity’s resources and massive amounts of [possibly] unnecessary deaths, enforced by both coercively banning and culturally scorning this silly thing no sane person would engage in

Because “haha insect rights lol” has never resulted in people dismissing the [possible] horrible utilitarian catastrophe that might be going on, enforced by culturally scorning this silly thing no respectable person would engage in (and it can be argued that ag-gag laws are also coercively trying to ban animal rights work)

Because “haha dudes in dresses lol” has never resulted in people trying to morally and violently mandate vulnerable populations out of existence, enforced by both coercively banning and culturally scorning this silly thing no sane person would engage in

Because “haha adults watching cartoons lol” has never resulted in genuinely non-conforming people suffering unnecessarily, enforced by culturally scorning this silly thing no respectable person would engage in (and it can be argued that some laws are also coercively banning parts of it)

Because we live in a libertarian utopia where the vox populi can’t eradicate unpopular ideas by scorning them and voting the scorn into violent enforcement

Because insiders being self-deprecating and outsiders being mocking is the exact same thing

Because none of us have ever had experience from living at the bottom of the status ladder

Because such a ladder definitely doesn’t exist and any attempts to claim that there are positions of informal power which dramatically influence the actual material effects of teasing are cultural marxism and sjw propaganda

I’m not against the book and I don’t really want to be mean and I was entertained by all the excerpts I’ve seen of it, but I just think that there is ~complexity~ at play which pattern-matches to the kinds of things that have empirically been very harmful to people and ideas I care about and thus there is some cause for concern in how said ~complexity~ is addressed.

1 month ago · tagged #basilisk bullshit #cissexism cw #transmisogyny cw #neckbeards are my ingroup #death cw #status games cw · 122 notes · source: argumate · .permalink


argumate:

The reaction (ha!) to Neoreaction a Basilisk from the local rationalist(-adjacent) community has been narrowly focused on these core issues:

1. Is this book accusing Yudkowsky of being neoreactionary?

2. No really, is it? I mean why else would it group him with Moldbug?

3. That fuckin’ Basilisk story, that was totally misinterpreted.

Having read it, I think it’s helpful to understand that this book is not attempting to be the annotated history of Internet politics circa 2k10, and the claims that it does make in service of its overall trajectory are modest and reasonable.

It is also worth remembering that not every work of literature is a textbook intended to be interpreted as a sequence of logical propositions. A community that sees value in communicating information in the form of fanfiction, poetry, and jokes should be well aware of this.

Finally the book does not just discuss Yudkowsky, Moldbug, and Land, but also the Matrix, Hannibal, and the works of Milton and Blake, among other things. Tying these topics together in no way implies that Yudkowsky is neoreactionary, any more than it implies that Nick Land is one of the Wachowski siblings or that Moldbug is a good writer.

That fucking basilisk story was totally misrepresented though.

Sure, it is entertaining to say “freaked out when a computer program from the future threatened to hurt him” and I always enjoy such entertainment, but I enjoy it as cheap self-deprecating humor while many others seem to actually take it as argumentation and that is a bad thing. The basilisk was a security hole in the software of some human brains that needed investigating and patching so that it would not present a potential issue later.

I’m no stranger to seemingly unintuitive ideas that are trivial to mock despite being actually way more serious and thus anything that smells like an attempt to avoid addressing such things by pointing out how superficially ridiculous they appear puts The One Which Watches The Watchers into Defcon 3. I don’t think I should need to point out that “haha basilisk lol look at these fucking bayesians” is exactly the same kind of argument as “haha look at this scrawny dude who thinks he can be a lesbian just by popping some magic pills and wearing skirts lolnope”.

1 month ago · tagged #basilisk bullshit #cissexism cw #transmisogyny cw · 122 notes · source: argumate · .permalink

