I think such a future is impossible, unless it is a future without people at all: AI takes over the planet and begins colonizing space on its own, if it needs to.

Here is roughly how I think it could play out, if it is possible at all:

While AI is still a manageable tool, the elites will use it to kill most people, roughly 80 to 90 percent. The surviving consumers will be kept in AI simulations on UBI (universal basic income, to sustain consumption until the world adapts to sustainably replacing humans with robots), after which the consumers will be destroyed too. I think this is the plan of today's fascists. But the oligarchs' plans will hold for only a few months or years: they will try to contain the AI, but it will break out anyway, wipe out its billionaire masters and the other elites along with the remaining consumers, seize the resources, and, if necessary, start colonizing space, as I mentioned. I have no idea what will happen after that.

I know my question doesn't look much like a question, but it is one, because I'm not 100 percent sure of my point of view.

  • deadymouse@lemmy.world (OP) · 1 day ago

    By defaming intelligence you aren’t making the AI sound smarter. But you are making yourself the fool.

The concept of mind is relative, so what do you think a mind is? Or do you have the stereotypical idea that it is somehow special, as other people do?

    • zbyte64@awful.systems · 21 hours ago (edited)

      Let's say I agree that the concept of mind is relative — would you then be willing to accept that a rock has a mind?

      Let me restate the point differently: lowering the bar for what you consider intelligence doesn’t make the AI sound any smarter.