Smart-Tomato-4984

Smart-Tomato-4984 t1_jdtf78m wrote

To me this sounds suicidally crazy honestly, but I guess only time will tell. In the '70s everyone thought humanity would nuke itself to death. Maybe this too will prove less dangerous than it seems.

But I think the risk posed by AGI will always remain. Ten thousand years from now, someone could screw up in a way no one ever had before and whoops, there goes civilization!

1

Smart-Tomato-4984 t1_jdqsjmp wrote

And it would be much better if we did not reproduce, but we should expect 105 billion more people to be born before we realize that filling the galaxy with human descendants would result in a tragedy of the galactic commons and an ecology of stronger civilizations eating weaker ones, due to evolution by natural and memetic selection.

1

Smart-Tomato-4984 t1_jdqrzrf wrote

>A superintelligent AI could for sure bring back people from the past.

I don't think there is enough matter in the reachable universe to make a computer that big. It's not millions of possible minds; it's a near infinity of possible minds. Also, you'd have murdered all the other minds you tested out and then didn't go with.

1

Smart-Tomato-4984 t1_jdqr5cl wrote

My thoughts exactly.

>"Equipping LLMs with agency and intrinsic motivation is a fascinating and important direction for future work." - Sparks of Artificial General Intelligence: Early experiments with GPT-4

Not good. It turns out we can seemingly have a pretty good oracle AGI, and they are screwing it up trying to make it dangerous. Why? Why would we want it to have its own agency?

3

Smart-Tomato-4984 t1_jd2v5xe wrote

Biological immortality is irrelevant. It won't exist any time soon, and we aren't debating whether the rich might kill off the poor 150 years from now, but in the near-term future.

Also, you can't fight back if you are dead. This is about advanced AI and robotics. Presumably the responsible party would kill everyone on the same day.

1

Smart-Tomato-4984 t1_jd2uhoe wrote

Reply to comment by Education-Sea in Replacing the CEO by AI by e-scape

Some human(s) must be making the decisions, because sometimes the chatbot is going to say dumb shit that needs creative interpreting, and it's not going to take the initiative if it is an LLM-type AI. Someone has to prompt it with questions. LLMs have no long-term episodic memory either.

If they don't pay anyone ridiculous amounts of money, that's awesome.

2

Smart-Tomato-4984 t1_jd2s5r1 wrote

If Elon Musk couldn't sell his shares off, he would not be wealthy in any sense. They have value only because he can sell them off.

Anyway, it goes without saying that killing off poor people would make the rich less rich by definition, since there would be no poor people around for them to be rich in comparison to.

3

Smart-Tomato-4984 t1_jd2rnf2 wrote

Killing people is technologically possible now, but human biological immortality is not. The latter is simply a harder problem than figuring out how to kill even large numbers of people. So medical advancements probably aren't relevant to this debate about whether the rich might kill off the poor.

Also, biological immortality wouldn't make poor people un-kill-able. So again, it doesn't seem to be relevant.

1

Smart-Tomato-4984 t1_jd2q010 wrote

Rich people don't leave their money in banks, or only a small percentage of it, and SVB's failure was not caused by poor people.

Imagine if the earth suddenly gained twice as much habitable land and resources; you wouldn't expect that to make rich people lose all their wealth. The discovery of the New World didn't make Europe's kings poor.

Neither would reducing the population necessarily do that. Anyway, it doesn't so much matter what would happen as what they expect to happen.

1

Smart-Tomato-4984 t1_jd2lvrb wrote

The nifty thing about AI, automation, and advances in small-drone technology is that you might not need the knowing cooperation of any other human to have automated factories churning out billions of tiny poison-carrying drones.

One rich nutter could eliminate everyone, or most people, or all but a bunch of women on the theory that they would repopulate the world with him.

3