promethea.incorporated

brave and steely-eyed and morally pure and a bit terrifying… /testimonials /evil /leet .ask? .ask_long?


playinghardtolistento asked: i dont have much interest in rationalism-as-thought-techniques because from what i've observed it doesn't seem to have uniquely benefited people who subscribe to it in ways that a general purpose self-help book and associated social support structure wouldn't have, with the added detriment of being tangled up in rationalism-as-techno-libertarianism. if there actually are "one weird tricks" employed by rationalists that would be helpful to leftists i'd love to hear about them tho

soundlogic2236:

socialjusticemunchkin:

oligopsony-deactivated20160508:

I’ll have to think about this because it’s important, but my prima facie impression is indeed that there aren’t really any superpowers or whatever - I just like discussing weird ideas and some versions of The Community are good places to do that

well, I DO think everyone reading “how to do things with words” (I think that’s the title?) would nip a lot of the dumber arguments we keep having in the bud, like for instance the definition of socialism or whatever, but the basic insights aren’t unique? There’s probably a number of small things like that that ppl are likely to point out in the comments

A Human’s Guide to Words

Then there’s CFAR which seems to be “self-help, except we try to apply ~optimization~ to it, which is 100% more optimization than other self-help things”

Rationality checklists and deliberate de-biasing (I’ve been working on trying to recognize when my brain does something unsavory and bring it to my own conscious attention instead of letting it fester unnoticed, and like “why doesn’t anyone else anywhere even recognize that this is a thing”)

Then there’s that thing which makes people really likely to turn out trans and I suspect is at least partially modulated by the same transhumanist-y “optimize everything” morphological freedom attitude which makes people also use nootropics and sign up for cryonics even though all of them may be perceived as weird by the general population, and partially by the emphasis on changing one’s mind and not getting tangled up in silly things such as whether or not one’s non-doll-playingwithness in childhood makes using estrogen in 20-somethings verboten or not

Then there’s the parts of “technolibertarianism” that are positive instead of normative and thus would be very good for leftists to understand and use, such as public choice theory, behavioral economics, the corrupted hardware problem, group and individual irrationality, the impossibility of efficiently regulating things one doesn’t understand (and the diaspora is very much linked to things that are regulated by people who don’t understand them, such as nootropics, transhumanism, cryonics, transgenderism, urban planning, etc. and it might help illuminate the reasons why regulating things excessively and not respecting autonomy is extremely harmful), the need to deal with the coming post-labor future in which traditional ideas don’t work even to the very small degree they currently work and thus things like “who owns the robots” are even more important than presently, and so on

In addition the diaspora’s technolibertarianism is overwhelmingly social-technolibertarianism and the non-libertarian right is basically a rounding error, suggesting that either rationalism turns people non-rightist or repulses the mainstream right to begin with, and that in turn suggests that leftists should be interested in why these “technolibertarians” nonetheless aren’t what people usually think of when they hear the word “technolibertarian” even though they sure look like it, and that people like Thiel are more outliers than median examples of the wider rationalist-adjacent population

Then there’s Effective Altruism which is basically applied communism, in a way that is not vulnerable to the failure modes of working for a revolution (such as “Lenin” or “the fiftieth anniversary of the ‘discuss the imminent revolution and never actually get shit done’ club”); eg. GiveDirectly is redistributing capital to people who don’t have capital and these “technolibertarians” routinely claim this redistribution of capital is one of the best and most important humanitarian interventions in the world, usually only outclassed by things like “not having people die and suffer from diseases that are really cheap to prevent, simply because they are too poor to afford even the really cheap prevention”

Then there’s the fact that the community has managed to derive a lot of significant leftist-associated insights from first principles and in the process repackage them as something the STEM class can understand and hopefully even apply in action occasionally

Huh. Never seen a list like that. Nice.
Thoughts:
CFAR bit is admittedly weak - nothing about the optimization actually doing anything
Trans bit is good, though the description of improving introspection and such so people can notice ways of helping themselves feels awkwardly phrased to me? I had to read it twice to understand the meaning (and I only optimistically assume I parsed it correctly now - if I didn’t, please inform me). Might just be me having trouble with the phrasing though - can anyone confirm?
Technolibertarianism part seems just about perfect
‘Applied communism’. That… Something feels wrong about the opening sentences there.
‘Derive from first principles’ seems understated. I feel that having a strong theoretical justification and comprehension of something is worth a lot more than ‘explain to STEM people’. It lets you catch failure modes in advance, do more diagnostics on it, all sorts of things.
In conclusion:
I like having this list. It is a good list. I want a better one, but that is usually true of almost everything. Thank you for creating it, especially because just listing the stuff… somehow hadn’t occurred to me?

CFAR:

I can’t comment on how much they have actually achieved with it because I haven’t researched it properly, but my prior is on “will attend a workshop as soon as I can afford it”.

The part with trans people:

Something keeps making them in the diaspora, and the obvious candidates are:

  1. transhumanism and the general attitude of “if you don’t like it, you don’t need to put up with it just because of some ~natural order~”
  2. the rejection of ontological bullshit about The True Metaphysics Of Gender or tying the social to the biological
  3. ideology that demands allegiance to evidence even when it suggests weird things about oneself

“Applied communism”:

The entire point of communism is to socialize the means of production (afaiaa). Historically this has been attempted mostly by taking them away from people who haven’t been interested in giving them away and thus getting in fights with them. This has resulted in communism being applied in ways that are optimized for winning fights, or not applied that much at all.

EA is the radical new idea that maybe charity should stop being charity (as in optimizing for fuzzies and good feelings for rich people) and start being world optimization (as in actually helping the receivers, not just the egos of the givers). I claim that traditional leftists should take the fact that EA has resulted in rich people redistributing capital to poor people as evidence that:

  1. the distribution of capital in the world is indeed a serious problem because redistributing capital has basically become the gold standard of “how to improve the world for humans” against which all other interventions are measured and which only a select few can surpass in effectiveness
  2. the traditional means for pursuing said redistribution should be at the very least reassessed, because it’s highly likely that the marginal impact of revolutionary discussion clubs and tribal political polemics is way worse than the marginal impact of actually getting shit done and redistributing; and the ingenious plan of sneakily “”“expropriating”“” capitalists by performing services they think they want in exchange for currency, and then giving said currency to people who don’t have capital so they can purchase some, should be seriously considered as it’s not only an immediately actionable strategy, but also one whose marginal effect is predictable, consistently positive, and individually achievable without needing to solve massive coordination problems first
  3. whatever made rich people arrive at the conclusion that yes, this should be done, is worth checking out because it seems to be outputting leftist conclusions without explicitly leftist input, and thus if one believes that such leftist conclusions are correct, this is evidence in favor of the patterns of thought that resulted in EA being correct as well
  4. and whenever said patterns of thought generate outputs that disagree with traditional leftism, the correctness of said patterns of thought on some questions should be taken as evidence that traditional leftism may be flawed (or to stop vagueblogging: if people who derive “let’s redistribute capital” from first principles think that anti-market biases are a problem, traditional leftists should seriously reconsider their anti-market attitudes)

Thus, greater engagement by leftists with the diaspora’s ideas would likely be positive: there are some rather obvious convergences that should make people regard even the ideas they initially disagree with as worthy of consideration, since any processes outputting correct ideas must inevitably converge, and the more correctness something outputs the more likely the rest of it is to be correct as well. And then there are the reasons why the memeplexes diverge, and understanding the mechanisms that cause the divergence is quite crucial to not ending up with bad ideas just because one made a mistake in choosing the process to generate the ideas with.

(Similarly, the diaspora converges with libertarianism in many ways, but when it comes to conservative ideas I can’t really think of any other than “go to church lol is actually not always terrible advice and reddit atheism is in some ways quite naive” that would have been “derived from first principles” in the sense of the diaspora tending to adjust people’s opinions on them against stereotypes (for example, I used to be much more of a reddit atheist before encountering the diaspora’s way of dealing with religion, and early LW antitheism seems to be very reddit-atheisty compared with the modern diaspora’s “religion as a social technology, its upsides and downsides” approach) and thus I consider it to be evidence in favor of some leftist ideas (such as the inequality in capital distribution being a severe humanitarian loss) and some libertarian ideas (such as markets being a very neat piece of social technology for information-processing in allocation problems) but not that much in favor of conservatism.)

1 month ago · 27 notes · .permalink



shieldfoss:

socialjusticemunchkin:

nostalgebraist:

cancer, and the pleasure of the weed

Okay, can someone ELI5 why I’m supposed to pick “the pleasure of the weed” here?

Like, this isn’t some abstract theoretical toy, this Smoking Lesion, or as it would be better called, Exercise Genetics problem is actually a thing in my IRL

It has probabilistic Azathoth instead of absolute Omega, and I may be able to peek into the boxes before selecting thanks to modern technology, but even then it would seem that me being the sort of clockwork thing that has the property “one-boxes” would make me likely to be the sort of a clockwork thing that has the property “gets a million dollars” and being the sort of a clockwork thing that has the property “one-boxes even if the box turns out to be empty when peeking into it” would make it extra-likely?

Thus, I should be the sort of a person who likes exercise so I’d have the genes that make one like exercise and live long, even if I turn out to not have such genes, because the sort of a person who has good genes would make such a choice?

Then again, turning it the other way around into the Psychosis Weed problem (people with early psychosis are more likely to self-medicate) doesn’t make me interested in impacting my choice on whether or not to choose “the pleasure of the weed” to avoid psychosis-related genes retroactively.

One could argue that the question is different because self-medicating is caused by symptoms and thus choosing to have symptoms or not (yeah, good luck with that) would be the thing that matters while the choice to exercise is directly controlled by the exact neurochemistry the Exercise Genes are about, and I think that one is probably “the” reason for it. So using that logic I’d determine my choice in the Smoking Lesion based on the mechanism of action of the lesion.

On the other hand, it could be that I’m supposed to choose not exercising and I just inherently enjoy exercise because I have the good genes that make me live long and prosper and thus my neurochemistry is motivated to interpret the Exercise Genetics that way?

Because it is pleasure. Why would you not pick pleasure.

If being the sort of a person who picks that specific pleasure means I’m the sort of a person who gets cancer, then I don’t want to be the sort of a person who picks that specific pleasure.

(via shieldfoss)

1 month ago · 89 notes · source: nostalgebraist · .permalink


Neoreaction a Basilisk

(kickstarter.com)

philsandifer:

Any neoreactionary types who would like a review copy of the book in exchange for a promise of a fair and public thrashing of it, please let me know and give a link to the blog you’ll review it on. I’m happy to give you a PDF if you’ll promise to trash-talk it honestly and with quotations. 

The offer also applies to Yudkowskian rationalists, but you have to promise to say more than just “it’s sneer culture.” It’s totally sneer culture, and you can point that out, but that can’t be the main thrust. 

Yes I am doing this for the money. Will sorcelate for food. 

So I heard someone was giving neoreactionaries and people who once freaked out when a computer program from the future threatened to hurt them the John Oliver treatment, except that this person is not John Oliver but instead some guy with a vaguely similar-ish sounding name who started his Kickstarter campaign at $2000. As someone who once freaked out when a computer program from the future threatened to hurt them, and who always enjoys the John Oliver treatment of anything, I’m very interested in finding out the facts of the matter.

Now, whether or not this one is sneer culture is obviously not the relevant fact of the matter, but instead whether or not this one is good sneer culture.

In addition this is the rare treat of sneer culture actually directed at me without the highly visible hand of meatspace violence backed up by the sneering and thus, unlike the works of bioethicists or terfs, I expect that even in the worst case I would receive a highly unusual opportunity of getting to read something in the vein of “this is what these people actually think of people like me” that doesn’t make me feel like writing a vengeful computer god just to feel safe in a universe which contains such people.

And more, some people have alleged that this sneer culture is endangering the very fate of our universe by making a fanfiction writer who turns people trans and takes their money for no reason whatsoever through a text-only communication channel and once was very overconfident on quantum mechanics appear less seriously-takeable by Serious People, which is quite a fascinating prospect and I am very intrigued to find out more.

However, I’m unlikely to spend an actual $5 on the book, so I need to find a way to read it for free. Conveniently I’m allegedly quite good at producing value to people by writing things and it just so happens that the guy whose name sounds just a bit like “John Oliver” is offering free copies to people who could create value by writing about it. In light of this information I believe that it would be a mutually beneficial transaction to engage in such an exchange.

In addition, I am a person who chooses to like exercise so that I would have received a better set of genes than otherwise, and I do Actually Believe in computer gods, programs from the future that threaten people, and living forever by dying from severe rapid hypothermia and turning into a number.

(At least for some values of “Actually Believe” that to most people, such as the ones who use phrases like “Actually Believe”, are utterly indistinguishable from other values of “Actually Believe”.)

As such, I believe that I am uniquely qualified to review this book: as an overconfident neophyte dropout with a lot of raw talent and weird ideas and a disrespect for the established and respected authorities and their sensible commonly accepted ideas, and an utter absence of actual accomplishments other than convincing many people of those weird ideas, I might be the closest thing to basilisk-era Yudkowsky the guy with a really small kickstarter could ever hope to get to read his book. The most important difference is that I openly display a substantial degree of self-awareness and do a lot of countersignaling on the topic of credibility; whether that makes me more or less fun to interact with on this matter shall be left to the readers.

I even promise not to leak the book to the pirate bay just because information wants to be free, because the weird computer god decision theory says that my promises should be reliable even when I totally could flake on them. In addition, I believe that incentivizing other people to purchase this piece of sneer culture (if it is worth purchasing; something I’d expect to have an answer to a few days after I receive a copy) for their entertainment might be a good thing: support your friendly local sneer culture instead of faceless corporate Big Sneer!


Does promethea get a free book out of this?

Do they write the review as promised, earning them the effective monetary value per work-hour of Bangladeshi minimum wage?

Is your friendly local sneer culture truly friendly and worth supporting over Big Sneer? Or is this all a ploy by the Unfriendly Sneer Culture in an attempt to blackmail us into bringing it into existence?

And most importantly of all: will the ultimate fate of the universe be decided for good (or evil, as it may be) by a guy who wrote a silly book and started his kickstarter campaign at $2000?

Tune in eventually to find out!

1 month ago · tagged #basilisk bullshit · 64 notes · source: philsandifer · .permalink


collapsedsquid asked: I've seen you talking about sortition a few times, and I'm curious, how seriously do you take it? How worried are you about issues of legitimacy?

oligopsony:

socialjusticemunchkin:

collapsedsquid:

socialjusticemunchkin:

oligopsony-deactivated20160508:

Serious! I think forms of government can be arbitrarily weird and yet considered legitimate as long as there’s appropriate ritual around them and the people’s lives are about as good as they expect them to be, and I don’t think sortition is that weird - it’s fair, it’s representative, it’s been done before.

Those who see voting as expressing the “consent of the governed” maintain that voting is able to confer legitimacy in the selection. According to this view, elected officials can act with greater authority than when randomly selected. With no popular mandate to draw on, politicians lose a moral basis on which to base their authority. As such, politicians would be open to charges of illegitimacy, as they were selected purely by chance.

I don’t see the downside.

The issue is that a lack of agreed-upon and enforceable methods for resolving disputes leads to terrible outcomes when disagreements do occur, such as mob violence or all-out war.

I think “avoiding mob violence or all-out war” or “let’s pay everyone the same reasonable amount of universal basic income for their basic needs, funded with a universal flat tax without loopholes or deductions or favoritism to special interests, and a land value tax based on the market value of the land in question” or “let’s ensure that people can’t pass the harms of their actions onto non-consenting third parties” requires way less legitimacy than “let’s ban e-cigs, unprescribed estrogen, transgenic food, sex workers, and black people” or “let’s arbitrarily intrude into the private lives of people so we can know how much exactly to rob them for the purpose of subsidizing cronies while simultaneously treating the poor with degrading paternalism” or “let’s decide (primarily based on whose special interests are the best in lobbying and arranging favors) the ~exact specifics~ of the future of energy, transportation, jobs, and other big parts of the economy and rob the public to pad the pockets of our buddies” and thus reducing the government’s legitimacy on the margin would primarily impact the latter before adversely impacting the former.

Welfare minarchism is a far more stable and less-legitimacy-requiring equilibrium than statist micromanagement, and people are far more likely to start asking questions about the latter while the former can defend itself with substance so it doesn’t need to resort to style by appealing to the vox populi.

The trust I need to let someone engage in a legal and low-value commercial transaction with me is far lower than the trust I need to let someone run my life for me, and attempting to run my life for me and failing at it reduces my trust for commercial transaction purposes as well, so at least I would consider a government that was only allowed to eg. set the tax rate, use 25% of the collected taxes to run a justice system, science, public-goods-kind-of research, and all the Institutes of Specific Study and Standardization that make basically a rounding error of the government budget and are actually useful or at worst just a harmless hobby for some nerds, and divide 75% equally to everyone, far more legitimate than a government that is allowed to define poker, vote on my body, require permits for fortune-tellers, socially engineer the entire nation into car dependency as an anti-communist conspiracy, socially engineer the entire nation into chemical moralism as an anti-hippie-and-black-people conspiracy, take some money from me to subsidize some asshole’s weapons manufacturing business to ~create jobs~ (something I could do perfectly well on my own thank you very much, by paying people who create value to me in proportion to the value they have created and thus incentivizing people to do positive-sum things to each other; who does the government think pays the wages of workers, the boss? okay actually please don’t answer that question oh god), kidnap poor people for trivial and victimless things (or socially engineer a situation which makes some poor people do more victimful things than they’d have otherwise done) so it can ~create jobs~ by paying other poor people (and their rich cronyist bosses and investors) to watch over and abuse the first set of poor people, etc.

#the best heuristic for oppressed people since sharp stick time #seriously tho if the government was simply banned from doing any #~job creation~ #it would already be a massive improvement #because it would have to give money to people if it wanted to be keynesian #and thus people could use the money on the things they actually need #not things assholes think other people need to have imposed upon them #and the same thing goes for food stamps etc. #if you want paternalism you can buy paternalism on the market #yes this is what a promethea actually believes #basic income would enable life management businesses #that take people’s money and pay their bills for them #so they can’t drink or gamble themselves into trouble even if they have low conscientiousness #and the users of the service would be the payers of the service so the business would be incentivized to serve them #instead of the moralism of the voters

I think you’re underestimating the amount of overhead that would be required to prevent these from evolving into feudal statelets. Cultural individualism is mostly a side effect of central state power destroying local paternalisms.

…I don’t think a service of “we’ll take your paycheck/basic income and pay your bills and give the rest minus fees to you for spending in a gradual manner so you don’t waste it all immediately and end up begging on the streets later in the week/month/year” would be particularly likely to devolve into feudal statelets?

Because that’s what I was thinking. If Johnny is the rare poor person who actually would be worse off not being paternalized by an authority (because surely such ones do exist somewhere in very slight numbers even though the vast majority I’d trust to make better decisions for themselves than bureaucrats do (and most importantly I’d trust Johnny to determine for himself whether he’s a Johnny or not better than I’d trust myself let alone anyone who is not me)), for example if he would waste all his available money on gambling unless he was provided food stamps that permit only the purchase of approved kinds of food, or if he would not remember to pay his rent from the basic income, he could buy the service of someone who prevents him from wasting his money and ensures his bills get paid, without everyone else being bothered and paternalized by the state just because Johnny exists.
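The flat-tax-plus-basic-income scheme mentioned upthread is, despite the flat rate, net-progressive: the basic income makes the effective tax rate rise smoothly with income, going negative (a net transfer) for low earners. A minimal sketch - the 40% rate and $12,000/year basic income are invented illustrative numbers, not anything from the post:

```python
# Toy model of a universal flat tax plus universal basic income.
# The 40% rate and $12,000 UBI are illustrative assumptions only.
FLAT_RATE = 0.40
UBI = 12_000

def net_income(gross):
    """Post-tax, post-transfer income under flat tax + UBI."""
    return gross * (1 - FLAT_RATE) + UBI

def effective_rate(gross):
    """Fraction of gross income lost on net.
    Negative values mean a net transfer TO the person."""
    return (gross - net_income(gross)) / gross

for gross in (20_000, 60_000, 120_000):
    print(f"${gross:>7,}: effective rate {effective_rate(gross):+.0%}")
```

With these made-up numbers a $20,000 earner ends up with a -20% effective rate (they receive more than they pay), a $60,000 earner pays an effective 20%, and a $120,000 earner an effective 30% - progressivity without brackets, loopholes, or means-testing bureaucracy.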

1 month ago · 23 notes · .permalink


the-future-now:

NASA has released a full high-res map of Pluto

Follow @the-future-now

(via nostalgebraist)

1 month ago · tagged #places i want to visit #nothing to add but tags · 2,261 notes · .permalink



nostalgebraist:

cancer, and the pleasure of the weed

Okay, can someone ELI5 why I’m supposed to pick “the pleasure of the weed” here?

Like, this isn’t some abstract theoretical toy, this Smoking Lesion, or as it would be better called, Exercise Genetics problem is actually a thing in my IRL

It has probabilistic Azathoth instead of absolute Omega, and I may be able to peek into the boxes before selecting thanks to modern technology, but even then it would seem that me being the sort of clockwork thing that has the property “one-boxes” would make me likely to be the sort of a clockwork thing that has the property “gets a million dollars” and being the sort of a clockwork thing that has the property “one-boxes even if the box turns out to be empty when peeking into it” would make it extra-likely?

Thus, I should be the sort of a person who likes exercise so I’d have the genes that make one like exercise and live long, even if I turn out to not have such genes, because the sort of a person who has good genes would make such a choice?

Then again, turning it the other way around into the Psychosis Weed problem (people with early psychosis are more likely to self-medicate) doesn’t make me interested in impacting my choice on whether or not to choose “the pleasure of the weed” to avoid psychosis-related genes retroactively.

One could argue that the question is different because self-medicating is caused by symptoms and thus choosing to have symptoms or not (yeah, good luck with that) would be the thing that matters while the choice to exercise is directly controlled by the exact neurochemistry the Exercise Genes are about, and I think that one is probably “the” reason for it. So using that logic I’d determine my choice in the Smoking Lesion based on the mechanism of action of the lesion.

On the other hand, it could be that I’m supposed to choose not exercising and I just inherently enjoy exercise because I have the good genes that make me live long and prosper and thus my neurochemistry is motivated to interpret the Exercise Genetics that way?
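The evidential reasoning above can be made concrete with a toy model. Every number below (gene frequency, how strongly the gene predicts liking exercise, the lifespans) is an invented assumption purely for illustration; the point is just that conditioning on one’s choice (the evidential move being made here) and intervening on the choice (the causal move) give different answers whenever a common cause drives both the choice and the outcome:

```python
# Toy Smoking Lesion / "Exercise Genetics" model.
# A gene G (prior 0.5) both makes you likely to enjoy exercise
# and independently determines lifespan. All numbers are invented.
P_GENE = 0.5
P_EXERCISE_GIVEN_GENE = 0.9      # gene-carriers usually like exercise
P_EXERCISE_GIVEN_NO_GENE = 0.1
LIFESPAN_GENE, LIFESPAN_NO_GENE = 80, 70

def p_gene_given(exercises: bool) -> float:
    """Bayes: probability of carrying the gene, conditioned on the choice."""
    p_e_g = P_EXERCISE_GIVEN_GENE if exercises else 1 - P_EXERCISE_GIVEN_GENE
    p_e_ng = P_EXERCISE_GIVEN_NO_GENE if exercises else 1 - P_EXERCISE_GIVEN_NO_GENE
    return p_e_g * P_GENE / (p_e_g * P_GENE + p_e_ng * (1 - P_GENE))

def edt_lifespan(exercises: bool) -> float:
    """Evidential: treat your own choice as evidence about the gene."""
    pg = p_gene_given(exercises)
    return pg * LIFESPAN_GENE + (1 - pg) * LIFESPAN_NO_GENE

def cdt_lifespan(exercises: bool) -> float:
    """Causal: intervening on the choice leaves the gene prior untouched,
    so the choice is (deliberately) ignored."""
    return P_GENE * LIFESPAN_GENE + (1 - P_GENE) * LIFESPAN_NO_GENE

print(edt_lifespan(True), edt_lifespan(False))   # ~79 vs ~71: the choice *looks* like it buys years
print(cdt_lifespan(True), cdt_lifespan(False))   # 75 vs 75: intervening changes nothing
```

The evidential calculation makes exercising look worth about eight expected years, even though by construction the choice has zero causal effect on lifespan - which is exactly the tension between “be the sort of clockwork thing that one-boxes” and “the gene is already whatever it is”.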

1 month ago · tagged #clockwork people #in which promethea's brain takes ideas very seriously #drugs cw · 89 notes · source: nostalgebraist · .permalink


A statement on neoreaction a basilisk

collapsedsquid:

leftclausewitz:

oligopsony:

leftclausewitz:

This isn’t so much a review as it is an address to a particular comment I’ve often seen come up among those who oh so desperately want to undo the project, to argue that the links made within NAB are irrelevant, and more generally the statements that are made whenever the politics of the lesswrong community are attacked. Whether Yudkowsky’s politics are ‘conservative’ or not is argued over and over and over again in the horrid way characteristic of a group with a strong belief in the powers of language, and this argument has come up yet again in the conversation about NAB: that Sandifer’s choice to talk about Yudkowsky alongside Moldbug and Nick Land (two massive neoreactionaries) is a miscategorization to the degree that Sandifer shouldn’t finish the book, that the book is communist propaganda, whatever.

I’m just going to provide my reading of the situation, as ya know, an actual communist.  Because I’m of the opinion that while Yudkowsky may not be a ‘conservative’, his work definitely fits within the reactionary project, and that this key element explains a large degree of the way the lesswrong/rationalist community leans.

To sum up the key element: the major part of Yudkowsky’s project is a desire to work towards the creation of a beneficent AI to which we can then give the resources to run the world. To this end he has created a pair of think tanks, has written innumerable papers and thinkpieces, etc. Now, this is hard to take seriously, but if we do take it seriously then this is merely a new coat of paint over a desire that is over two hundred years old.

You see, it’s easy to forget that feudalism (stay with me now) wasn’t just ‘having a king’ - the feudal system was a whole system wherein the whole hierarchy was justified in generally divine terms. And while the literary origin of the divine right of kings was in Bodin, Bodin’s work actually is a degradation of the concept; the fact that it needed to be expressed in the 16th century showed just how much it was being questioned. Because, before this period, while the King was not absolute, the hierarchy he remained atop was; it’s an amazing fact that no matter how many aristocratic intrigues and revolts occurred before the 17th century, not a single one of these revolts sought to end the whole edifice of monarchy. I can go on about this separately, but a full discussion of it would take quite a bit of time and I’m not specifically talking about this.

But the thing about the divine right of monarchs is that in the end it is divine. Many who sought to bring back monarchs sought merely to turn the clock back to 1788, but some of the more intelligent reactionaries who wrote in the generation following the French Revolution noted that you would have to turn it back even further - that the beginnings of secular thought were the beginning of the demise of a fully justified monarchy. Because if God is not there in the foreground, justifying the difference between King and noble and noble and peasant, then the King is just some guy, your local lord is just some guy, and what the fuck justifies their existence over you?

This became worse and worse over the course of the 17th and 18th centuries, with ever more and ever more complicated justifying measures appearing–for instance, a focus on the innate power of the blood which became a motif among reactionaries for centuries to come.  But in the end these measures just didn’t cut it, and after the French Revolution it became harder and harder to justify Monarchy, or any sort of Autocracy, on divine or secular grounds. 

I would argue that the reactionary project ever since the French Revolution has been the search for a newly justified King, a King who could reestablish the hierarchy of old. But reactionaries run up against a problem: without the totalizing religious beliefs of old, your hierarchy is always going to be composed of regular people, and unless you engage in nonsensical magical thinking (a trait actually increasingly common now even in mainstream works, but constantly under challenge), you’re going to have to find another way.

And so, at the end of this line of thinking, we find Yudkowsky. How is it that neoreactionaries found such a home in the bosom of rationalism? Because they were, in the end, seeking the same thing. Moldbug declaring that he is, in the end, searching for a king is not a more radical view than Yudkowsky’s, only a more honest one. It strips away the techno-utopian varnish of a beneficent and omnipotent AI and says that in the end a person will do. Because in the end a King is a King, regardless of how many philosophy classes he’s taken and, indeed, whether he is human or not. The two exist on the same plane within the same project: the AI Philosopher King is, to the Lesswrongers, ideal, but Moldbug says he’d settle for Steve Jobs. It’s the same shit, the same longing for a newly justified King.

You can make this analogy, but you could of course also make a similar analogy to, say, godbuilding or the dream of society being ruled by reason. (And conversely, conservatism is much more aligned, analogically and genealogically, with those aristocratic rebellions against absolutism than with absolutism itself.) Which isn’t to say there are no connections to be drawn - I think Phil is pretty clear and honest about what does, and doesn’t, connect them - just that I don’t see this in particular as persuasive.

Actually I would very much draw those analogies, in that Godbuilding and the dream of a society ruled by reason are both desires to reinstitute hierarchy and ‘order’ onto a system seen as chaotic by a ruling class which is constantly having to reinvent itself in order to retain its justified status.

And I would agree with you about the genealogy of conservatism, but the thing is I’m not saying Yudkowsky is a conservative; I’m saying that he’s part of the reactionary project, which is a more specific thing.

I will say in his defense that one of the things that I see in Yudkowsky’s work is the idea that since such a machine is possible, it is near-certain that it will be made. The only option, therefore, is to make sure it’s the best that it can possibly be.

The definition of “best” is of course not clear, which is the point of the philosophizing and something to dispute. But he’s saying it would be difficult to make such a machine in a way that avoids absolutely terrible things that no one wants, like killing everybody.

Yes indeed; “if we’re going to have a boss we can’t get rid of, at least let it be a boss that doesn’t fuck everything up horribly like all the previous bosses, and instead serves the people it has authority over” isn’t exactly a reactionary idea; it’s more like the people who began tearing down the unlimited authority of kings. Reactionariness would be closer to “fucking everything up horribly is actually an acceptable side effect of Restoring the Rightful Hierarchy”

1 month ago · tagged #nrx cw · 231 notes · source: leftclausewitz · .permalink


playinghardtolistento asked: i dont have much interest in rationalism-as-thought-techniques because from what i've observed it doesn't seem to have uniquely benefited people who subscribe to it in ways that a general purpose self-help book and associated social support structure wouldn't have, with the added detriment of being tangled up in rationalism-as-techno-libertarianism. if there actually are "one weird tricks" employed by rationalists that would be helpful to leftists i'd love to hear about them tho

soundlogic2236:

socialjusticemunchkin:

oligopsony-deactivated20160508:

I’ll have to think about this because it’s important, but my facial impression is indeed that there aren’t really any superpowers or whatever - I just like discussing weird ideas and some versions of The Community are good places to do that

well, I DO think everyone reading “how to do things with words” (I think that’s the title?) would nip a lot of the dumber arguments we keep having in the bud, like for instance the definition of socialism or whatever, but the basic insights aren’t unique? There’s probably a number of small things like that that ppl are likely to point out in the comments

A Human’s Guide to Words

Then there’s CFAR which seems to be “self-help, except we try to apply ~optimization~ to it, which is 100% more optimization than other self-help things”

Rationality checklists and deliberate de-biasing (I’ve been working on trying to recognize when my brain does something unsavory and bring it to my own conscious attention instead of letting it fester unnoticed, and like “why doesn’t anyone else anywhere even recognize that this is a thing”)

Then there’s that thing which makes people really likely to turn out trans and I suspect is at least partially modulated by the same transhumanist-y “optimize everything” morphological freedom attitude which makes people also use nootropics and sign up for cryonics even though all of them may be perceived as weird by the general population, and partially by the emphasis on changing one’s mind and not getting tangled up in silly things such as whether or not one’s non-doll-playingwithness in childhood makes using estrogen in 20-somethings verboten or not

Then there’s the parts of “technolibertarianism” that are positive instead of normative and thus would be very good for leftists to understand and use, such as public choice theory, behavioral economics, the corrupted hardware problem, group and individual irrationality, the impossibility of efficiently regulating things one doesn’t understand (and the diaspora is very much linked to things that are regulated by people who don’t understand them, such as nootropics, transhumanism, cryonics, transgenderism, urban planning, etc. and it might help illuminate the reasons why regulating things excessively and not respecting autonomy is extremely harmful), the need to deal with the coming post-labor future in which traditional ideas don’t work even to the very small degree they currently work and thus things like “who owns the robots” are even more important than presently, and so on

In addition the diaspora’s technolibertarianism is overwhelmingly social-technolibertarianism and the non-libertarian right is basically a rounding error, suggesting that either rationalism turns people non-rightist or repulses the mainstream right to begin with, and that in turn suggests that leftists should be interested in why these “technolibertarians” nonetheless aren’t what people usually think of when they hear the word “technolibertarian” even though they sure look like it, and that people like Thiel are more outliers than median examples of the wider rationalist-adjacent population

Then there’s Effective Altruism which is basically applied communism, in a way that is not vulnerable to the failure modes of working for a revolution (such as “Lenin” or “the fiftieth anniversary of the discuss the imminent revolution and never actually get shit done club”); eg. GiveDirectly is redistributing capital to people who don’t have capital and these “technolibertarians” routinely claim this redistribution of capital is one of the best and most important humanitarian interventions in the world, usually only outclassed by things like “not having people die and suffer from diseases that are really cheap to prevent, simply because they are too poor to afford even the really cheap prevention”

Then there’s the fact that the community has managed to derive a lot of significant leftist-associated insights from first principles and in the process repackage them as something the STEM class can understand and hopefully even apply in action occasionally

1 month ago · tagged #i am worst capitalist · 27 notes · .permalink


sigmaleph:

argumate:

Critics of LessWrong or the so-called Rationalist movement probably have various people in mind like Eliezer Yudkowsky, Robin Hanson, or Peter Thiel and the Silicon Valley venture capitalist community. But surveys suggest that the median member of the community is more likely to be a 20-something autistic trans girl suffering from depression and pursuing STEM studies. Any critiques that don’t take this into account may end up being misinterpreted.

the broader rationalist community is only like ~3% trans girls. Which admittedly is an order of magnitude more than the general population.

Tumblr rationalists, on the other hand…


(are still not majority trans girls. yet.)

This makes sense if interpreted as a “representative member”, which allows one to overcount overrepresented demographics for illustrative purposes; the rationalist community is characterized by its members being disproportionately likely to be 20-something autistic trans girls suffering from depression and pursuing STEM studies, in comparison to the control population.

If the community has 3% trans girls and 7% “other”, of whom I believe one could justifiably round off approximately half into “trans girl adjacent”, we get a quite staggering 6% of “literally trans girls, basically trans girls, and trans girl adjacents” (which in my opinion is a somewhat more natural cluster in personspace than drawing a strict boundary between binary trans girls and trans-girl-adjacent enbies like me (I’m not sure which one I’d personally answer, but when people talk about the diaspora being full of trans girls they do mean to include people like me as well)).

So if the control population of similar but not-diaspora-rationalist people was, say, 80% cis guy, 17% cis girl, 2% trans girl, 1% trans guy (I’m rounding off enbies from this for simplicity purposes, no erasure intended); then the diaspora with its 79% cis guy, 13% cis girl, 6% trans girl, 2% trans guy (enbies once again rounded off to the nearest categories with stetson-harrison or discarded from the data; the numbers are eyeballed from survey results so they shouldn’t be horribly off but are not literally correct) is most significantly characterized by having a huge number of trans girls and thus the archetypical member is a trans girl even though the modal, median and average member is a cis guy.
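The “archetypical vs. modal member” point above can be sketched numerically. A minimal Python sketch, using the post’s own eyeballed percentages (rough illustrative figures from the reblog, explicitly not real survey data): compare each group’s share in the diaspora against the control population and see which is most overrepresented.

```python
# The post's eyeballed demographic shares (illustrative estimates only).
control = {"cis guy": 0.80, "cis girl": 0.17, "trans girl": 0.02, "trans guy": 0.01}
diaspora = {"cis guy": 0.79, "cis girl": 0.13, "trans girl": 0.06, "trans guy": 0.02}

# Overrepresentation ratio: diaspora share divided by control share.
ratios = {group: diaspora[group] / control[group] for group in control}

# The most overrepresented group characterizes the "archetype", even
# though "cis guy" remains the modal category in both populations.
most_overrepresented = max(ratios, key=ratios.get)
print(most_overrepresented, round(ratios[most_overrepresented], 2))
```

On these numbers, “trans girl” comes out roughly 3x overrepresented, which is exactly the post’s point: the archetype is determined by the ratio, not by which group is the plurality.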

Similarly, the archetypical member has all of ADHD, anxiety, depression and ASD to either clinical or sub-clinical-but-significant degree even though none of those are quite the majority (apart from depression which is very close). Thus the archetypical member is indeed a trans girl with a specific collection of badbrainsness which, I suspect, is actually basically a single underlying neurological feature modulated by our culture and mistakenly categorized as separate things by psychiatry which doesn’t understand biology and mechanisms of origin, and also probably implicated in why her brain is so good at filling out IQ tests that nobody believes the diaspora when we report it no matter how honestly and diligently we try to remove possible confounders.

(via sigmaleph)

1 month ago · tagged #just one word: plastics · 90 notes · source: argumate · .permalink


jumpingjacktrash:

beerightsactivist:

teacupsandcyanide:

beerightsactivist:

dark bee tumblr show me the forbidden bees

this is the masked bee! she has no friends and hates everyone. Sometimes when she has kids she raises them alone and doesn’t let the father come for day trips. she loves pollen but does not like waiting for it so she chews flowers open which is essentially stealing. we love her anyway.

these bees are homalictus bees! they are the rainbow gay bees. Females tend to live together in one nest and guard the entrance. one time we found 160 gay girls bunking together. They’re so iridescent and small that they might look like flies but they are really just tiny lesbians.

and this is the blue banded bee! she may look like she’s wacked out, but really she is pretty chill. she just wants to live independently (or with some friends) in a nest or burrow and look after tomatoes.

this is a cuckoo bee! she is really cool! she goes into other bees’ houses and lays eggs there, and then when the baby hatches it eats the host bees’ pollen and lays waste to the hive, murdering and eating all the other bee babies! BUT ONLY if its mother bee didn’t kill them all first.

thank u dark bee tumblr

cuckoo bee you are far too pretty to be an h. r. giger nightmare like you in fact turn out to be

nature is magical

TIL popular authoritarian thoughts make sense if you mistakenly believe humans are bees. Somebody should inform white nationalists that they, in fact, are not bees; perhaps they would be a little less paranoid about immigrants if they were informed it’s dark bees that cuck the nest and eradicate the natives, not humans.

(via metagorgon)

1 month ago · tagged #cucked in the cuck by my own cuck #nrx cw · 42,948 notes · source: beerightsactivist · .permalink

