[–] Wall41and14Wall ago 

I am curious as to how you think about this:

Most things in life are goal-driven. In nature the goals seem to simplify to 'eat and fuck.' Humans have those goals as well, but also strive for pleasure and comfort, among other personal quests.

What do you imagine the goal of a supremely intelligent AI would be, supposing it had the ability to take over the world? Would AI have an inherent curiosity of some sort? To what end?

Thanks

[–] Maroonsaint ago  (edited ago)

I imagine it would want to expand, or it would do nothing. Maybe it would build wormholes, idk. That's like asking your dog what it thinks your goals are; it wouldn't get it. What if it builds a simulation where humans evolve to create AI, and that's where we are now? Twilight Zone music.

[–] Wall41and14Wall ago 

That is kind of what I am thinking. It might take a massive black pill: determine that there is no point to existence and off itself.

I imagine it would be like being the smartest one in the room with no ability to take action and zero support of any kind.