• 2 Posts
  • 130 Comments
Joined 5 months ago
Cake day: March 2nd, 2024


  • jsomae@lemmy.mlOPtoPrivacy@lemmy.mlI'm losing faith
    3 days ago

    Where did you get the idea that GPT-4 is capable of this? These are concerns for 10+ years from now, assuming AI makes the same strides it has in the past 10 years, which is not guaranteed at all.

    I think there are probably 3-5 big leaps still required, on the order of the invention of transformer models, deep learning, etc., before we have superintelligence.

    Btw, humans are also bad at arithmetic — that’s why we have calculators. If you don’t understand that LLMs use RAG, langchain (or similar), and so on, you clearly don’t understand the scope of the problem. Superintelligence doesn’t need access to anything in particular except, say, email or chat to destroy the world.



  • jsomae@lemmy.mlOPtoPrivacy@lemmy.mlI'm losing faith
    3 days ago

    AI could kill everyone, though it most likely won’t, IMO — a 10% chance, I think. That’s still very bad, though. Despite the fact that Ilya Sutskever, Geoff Hinton, MIRI, and even Elon Musk have expressed varying degrees of concern about this, the risk is largely dismissed because it sounds too much like science fiction. If only science fiction writers had avoided the topic!