Professional-Noise80

Professional-Noise80 t1_jbahmbz wrote

This show is helpful to me when I analyze a work of art: when people say something is deep and interesting and I don't think it is, I can just think of the Eva fandom and rest assured that a lot of people agreeing on something doesn't make them right.

I call these kinds of shows "all-aesthetics". All they're doing is trying to look cool, and all people are doing when they call them deep and interesting is also trying to look cool.

Works like Berserk, Eva, Cowboy Bebop, and Blade Runner just aren't as deep as people adamantly make them out to be. They're visually interesting, even great, but that's it. I guess the issue is a relative lack of culture, which leads people to mistake things that sound deep for things that actually are.

1

Professional-Noise80 t1_j7gk3kk wrote

Increased consumption of anything that makes you obese will be associated with disease risk. Obese people tend to eat more of everything, and so do fit people, who are actually healthier. What are we supposed to take away from this? That we should stop eating carbs? Sounds like a bad idea to me.

9

Professional-Noise80 t1_j6o0d5z wrote

Was this written by Jordan Peterson or something? How is happiness not meaningful? Don't we pursue meaning because it makes us happier? Why the fetishisation of suffering?

Even hedonism isn't meaningless or a bad philosophy. It recognizes that pleasure is sometimes attained through some suffering, for example by making efforts to achieve goals.

I suspect that the Peterson-like people aren't actually pursuing meaning itself; they're pursuing a grandiose idea of themselves (or, more plainly, status), which is why they express so much contempt. The same could be said about Nietzsche.

But Nietzsche and Peterson, I suspect, were and are deep down miserable, unhappy, lonely people.

12

Professional-Noise80 t1_j3o2frz wrote

Intelligence is a tool for producing and applying knowledge, but it's not everything. For humanity to make progress in science we need to actually test hypotheses in the real world, and that takes time and resources. Without real-world data, intelligence is basically useless.

And obviously scientific progress is a very long process that wouldn't be that much faster with the aid of better AI, imo.

1

Professional-Noise80 t1_ivpk1av wrote

I don't see why AGI is so hard.

The way I understand it, something minimally constitutes an AGI if it can use knowledge gathered from learning one task to train itself on another task faster than it could without that prior training.
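That minimal criterion is basically transfer learning, and it can be sketched with a toy experiment (my own illustration, not a claim about any particular AGI attempt; all names like `train`, `w_task_a`, `warm_steps` are invented here): pretrain a linear model on task A with plain gradient descent, then warm-start training on a related task B and compare the step count against training from scratch.

```python
import numpy as np

def train(X, y, w0, lr=0.1, tol=1e-4, max_steps=10_000):
    """Plain gradient descent on least squares; returns (weights, steps taken)."""
    w = w0.copy()
    for step in range(1, max_steps + 1):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
        if np.linalg.norm(grad) < tol:  # stop once the gradient is tiny
            return w, step
    return w, max_steps

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_task_a = rng.normal(size=5)                   # "task A" ground truth
w_task_b = w_task_a + 0.1 * rng.normal(size=5)  # related "task B"

# Pretrain on task A, then warm-start task B from the pretrained weights.
w_pre, _ = train(X, X @ w_task_a, np.zeros(5))
_, warm_steps = train(X, X @ w_task_b, w_pre)

# Train on task B from scratch (cold start) for comparison.
_, cold_steps = train(X, X @ w_task_b, np.zeros(5))

print(warm_steps, cold_steps)  # warm start should need fewer steps
```

This is the weakest possible version of "prior learning speeds up a new task": the warm start begins closer to task B's solution, so gradient descent converges in fewer steps. Real transfer happens across neural networks and genuinely different tasks, not two nearly identical regressions.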

AGI has been attempted this year, and although the results were inconclusive, I think we might see something approaching it in the near future.

I think the line isn't clear-cut for AGI; there's going to be gradual improvement, like with language models.

But I really don't know. What do you think?

1

Professional-Noise80 t1_istlwt1 wrote

This is definitely human. AIs don't feel emotions; they don't know how "safe" feels.

If you ask any human to imagine a scene, I guess they would conjure an image pretty similar to this one. But they would actually "conjure an image" in their mind; they would have feelings about it.

That's what AIs don't have and probably never will. Subjectivity.

AIs only imitate that. But hey, that's good enough for me.

1