I’ve been watching Isaac Arthur episodes. In one he proposes that O’Neill cylinders would be potential havens for microcultures. I tend to think of colony structures more as something created by a central authority.

He also brought up the question of motivations to colonize other star systems. This is where my centralist perspective pushes me toward the idea of an AGI-run government where redundancy is a critical aspect of everything. How do you get around the AI alignment problem? Redundancy: many systems running in parallel. How do you ensure the survival of sentient life? The same kind of redundancy.

The idea of colonies as havens for microcultures punches a big hole in my futurist fantasies. I hope there are a few people out here in Lemmy space who like to think about and discuss their ideas on this, or who would like to start now.

  • xmunk@sh.itjust.works
    5 months ago

    We’re playful and curious into our old age. Problems excite us, and our main obsession is split between hobbies and intellectual discovery. The stresses of life no longer bear down on us, so petty hate becomes ever more rare - things like racism, sexism, and ableism would be hard to cultivate when we’re not competing with others for our daily needs.

    It’s likely that themed communities would form from shared interests, where we might have a tight-knit group of Scrabble enthusiasts or woodworkers.

    Complacency is bred in situations like this, so alignment would be a real issue. But the fact that we have voluntary armed-forces members even in affluent communities today makes me think it would be possible to sustain a portion of the population dedicated to making sure the AGI is kept in line.

    • j4k3@lemmy.worldOP
      5 months ago

      Thanks for the insights. I like your first point and will keep that in mind.

      Isaac Arthur’s point about themed communities was more about religious or belief cultures. I want to believe humanity will outgrow the age of magical thinking and imaginary friends. I prefer to think of cultures more like the sectors of Trantor in Asimov’s Foundation.

      I think we must gradually let the AGI prove itself, giving it loads of redundancy and checks. Eventually it will be far smarter than any human or group of humans, and it must self-regulate to a large degree.