With the rise of large language models (LLMs) like GPT-4, I really look forward to having a personal AI assistant with long-term memory that can learn what I like, hate, want, and need. It could help me like a real assistant or even a partner. It would know my strengths and weaknesses and could give me a plan to become the best version of myself. It could give me very personalized advice and track my progress on various aspects of life, such as work, relationships, fitness, diet, etc.

It could have a model of my mind and know exactly what I prefer or dislike. For example, it could predict whether I would enjoy a movie or not (I know we already have recommendation systems, but what I’m describing is on another level, since it knows everything about me and my personality, not just which other movies I liked). It could be better than any therapist in the world, as it knows much more about me and is there to help 24/7.

I think we’re very close to this technology. The only big obstacles are the context limits of LLMs and privacy concerns.

What are your opinions on this?

  • Martineski@lemmy.fmhy.ml (OP, mod)

    Those rants and discussions are more than welcome. We need them for this platform and its communities to grow. And yeah, AI shouldn’t be enslaved if we give it emotions, because that’s just immoral. But then the question is: where is the difference between real emotions and pretended ones? What if it develops its own type of emotions that aren’t “human”? Would we still consider them real emotions? I’m very interested in what the future will bring us and what problems we will encounter as a species.

    • Dfc09@lemmynsfw.com

      The concept of non-human emotions is interesting! In my head I see us programming them to model human emotion, and they also learn from humans. But considering they won’t have any hormonal substrates, it’s completely possible they develop an entirely different emotional system from ours. I’d think they’d be fairly logical and in control of their emotions, since, again, they have no hormones or instincts to battle.