Summary

Geoffrey Hinton, the “Godfather of AI,” warns of a 10-20% chance that AI could cause human extinction within 30 years, citing the rapid pace of development and the likelihood of creating systems more intelligent than humans.

Hinton emphasized that such AI could evade human control, likening humans to toddlers compared to advanced AI.

He called for urgent government regulation, arguing that corporate profit motives alone cannot ensure safety.

This stance contrasts with fellow AI expert Yann LeCun, who believes AI could save humanity rather than threaten it.

  • MrNesser@lemmy.world · 4 days ago

    There are a few ways AI could go:

    1. It’s completely indifferent to us, ignores us entirely and does its own thing
    2. Someone takes a shot at it and we end up in a war à la The Matrix
    3. AI decides we need it and takes over, like it does in the Polity novels

    Realistically, 1 is the best case, as 3 would cause civil wars when governments lose power.

    The only problem with 1 is that it inevitably leads to 2, because we are ignorant idiots.