• 1 Post
  • 11 Comments
Joined 1 year ago
Cake day: July 17th, 2023

  • My optimistic read is that maybe OP will use their newfound revelations to separate themselves from LW, rejoin the real world, and become a better person over time.

    My pessimistic read is that this is how communities like TPOT (and maybe even e/acc?) grow - people who are disillusioned with the (ostensible) goals of the broader rat community but can’t shake the problematic core beliefs.

    The cosmos doesn’t care what values you have. Which totally frees you from the weight of “moral imperatives” and social pressures to do the right thing.

    Choose values that sound exciting because life’s short, time’s short, and none of it matters in the end anyway… For me, it’s curiosity and understanding of the universe. It directs my life not because I think it sounds pretty or prosocial, but because it’s tasty.

    Also lmfao at the first sentence of one of the comments:

    I don’t mean to be harsh, but if everyone in this community followed your advice, then the world would likely end.



  • From the comments:

    Effects of genes are complex. Knowing a gene is involved in intelligence doesn’t tell us what it does and what other effects it has. I wouldn’t accept any edits to my genome without the consequences being very well understood (or in a last-ditch effort to save my life). … Source: research career as a computational cognitive neuroscientist.

    OP:

    You don’t need to understand the causal mechanism of genes. Evolution has no clue what effects a gene is going to have, yet it can still optimize reproductive fitness. The entire field of machine learning works on black box optimization.

    Very casually putting evolution in the same category as modifying my own genes one at a time until I become Jimmy Neutron.

    Such a weird, myopic way of looking at everything. OP didn’t appear to consider the downsides brought up by the commenter at all, and just plowed straight on through to “evolution did it without understanding, so we can too.”



  • The cool thing to note here is how badly Yud misunderstands what a normal person means when they say they have “100% certainty” in something. We’re not fucking infinitely precise Bayesian machines; 100% means exactly the same thing as 99.99%. It means exactly the same thing as “really really really sure.” A conversation between the two might go like this:

    Unwashed sheeple: Yeah, 53 is prime. 100% sure of that.

    Ellie Bayes-er: (grinning) Can you really claim to be 100% sure? Do not make the mistake of confusing the map with the territory, [5000 words redacted]

    Unwashed sheeple: Whatever you say, I’m 99% sure.

    Eddielazer remains seated, triumphant in believing (epistemic status: 98.403% certainty) he has added something useful to the conversation. The sheeple walks away, having changed exactly nothing about his opinion.