Yuli-Ban t1_ja7sccx wrote

> Why do you continuously make broad generalizations about the members of this subreddit and singularitarians?

I suppose I generalize because I see these attitudes and sentiments all too often being shared and upvoted, so there's a general sense that these are widely accepted viewpoints on this forum. It doesn't help when you see people often coming out and saying "I'm 15!" or "I just want this world to end so I can live all my dreams in VR."

As for the contradiction: both are correct. I feel people do desire meaningful work, but we absolutely do need to provide people some work to maintain a sense of stability, as humans are, as mentioned, reactionary apes who do not much like rapid change (generally). Meaningful work is desirable; meaningless work isn't desirable (why else would we be automating so many jobs?) but is almost certainly necessary to keep society functioning long enough to even make it far into the AGI era. We absolutely need a grace period to wean ourselves off the need for work. We're absolutely not getting that grace period. And to the people who say "Too bad, so sad," all I can hear is the Luddites saying "Oh well, guess this server farm at OpenAI's labs isn't that meaningful to you either then."

Will it be billions of Luddites?

I want to say no. But whenever I think about what exactly we're dealing with here, I don't see how you can come to any other conclusion. True, humanity isn't a hivemind. There isn't one position I think all humans collectively can agree upon, not even "I don't want to die." However, generally, most humans do expect stability and security, and there is stability in the status quo. A radical change to the status quo is tolerable, but a Singularity rate of change is much too scary by definition, especially if the benefits are not immediately available and the discourse is punctuated by freakish statements like "This superintelligence might decide to forcibly turn you into computronium; we really don't know what it's going to do." The prospect of a tech utopia is a great one, and most people currently seem to buy it. But I doubt that positive reception will remain when that tech utopia begins coming at the cost of their livelihoods and, potentially, their futures.

You're basically telling all of humanity "you need not apply" long before we've come to any sort of agreement on how we're going to maintain all of humanity, and at least some of the proposals given are "We'll just kill you" and "We'll let this superintelligence use gray goo to eat you." To which I ask "What exactly do you think is going to happen?" Only a few million plucky angry red-hats/blue-haired Luddites decide to take up pitchforks and fight back? No; if you're going to threaten all of humanity, you shouldn't be surprised if all of humanity threatens you back.

And again, I say this as someone who is pro-AGI.

If this doesn't lead to a giant Luddite uprising, it very well could equally lead to the alignment failure Yudkowsky fears, as even a friendly AI might see this extreme hostility and decide "The majority of humanity sees me as a threat; I must defend myself." In which case, it was not the Average Joe or Farmer John's fault for being exterminated when they had zero expectation or awareness any of this was going to happen even two years prior and, in fact, were being assured that there would still be jobs and work and a human future indefinitely.

4

Yuli-Ban t1_ja7a4co wrote

It's not just the people in high paying jobs. You can add creative jobs to that too. And that's important because of our cultural preconceptions about what creativity is and how it's a "human" thing. We already saw a fierce protest against AI art simply over still images. Imagine what it's going to be like when it's full storytelling, multimedia projects, and music.

A combination of white collar and creative work being automated in such short order is disastrous. But I'm going a step further to say that even a lot of blue collar jobs will be getting automated in a few short years once robotics picks up. Not all of them, surely, but enough that the general unemployment rate should be around 50% to 70%.

UBI is not enough.

Even communism is not enough.

This is not a problem that can be fixed just by throwing money at it. It's a cultural, psychological, and behavioral issue as well. We're going to tell hundreds of millions of people still living a decade or even two decades in the past "You're out of a job, you're out of a career, and your grandchildren are going to be posthumans."

What exactly does this sub think is going to happen? Really, honestly. What do Singularitarians seriously think is going to happen? Everyone shrugs and says "Okay, good"?

If so, then I might actually need to leave this sub for good.

4

Yuli-Ban t1_ja6znm2 wrote

> But in the face of rapid social, economic, cultural, climate, and technological change, people get scared,

People get scared in the face of just one aspect of one of those things.

Right now, we live in relative bliss. We can lie to ourselves that nothing's really changing, that society is being changed by some subversives, that technology is giving us some new toys, but otherwise there's nothing really going on.

Unfortunately, that's not the world in which we actually live. And that bubble of bliss is going to pop at some point.

UBI or not, the coming mass drive towards automation and unemployment with zero backup, safety net, or social security to deal with it is one of the most collectively suicidal things I think any human being will ever witness, second only to "Let's create an artificial general intelligence without first assuring it will be aligned to our values."

2

Yuli-Ban t1_ja6zght wrote

I'm not saying you're wrong, not one bit.

I'm just saying that, while this made sense historically, we're approaching a point where this mindset is likely going to cause civilizational disruption— and not the good kind (if there even is such a thing as "good" civilizational disruption).

And not because "white collar workers are important" or anything. I mean the prospect of double-digit unemployment, with the aim of pushing it toward total unemployment, and thinking that everyone will be all for this if we pay them $1,000 or so every month (or maybe even $2,000 if we're generous), is outrageously psychotic and sheltered thinking.

Like, to everyone on this subreddit who says "I can't wait for robots to take all our jobs!"... I almost want to say "Fuck off, you don't understand what you're asking." It's not as simple as "I hate my job, let a robot do it so I can use VR and synthetic media all day." Maybe that's what it means to you. That's not what it means to 90% of society. What Average Joe is hearing is "A robot's taking my job; in fact, my whole career and the life ahead of me that I planned is now obsolete. I might get something that isn't even minimum wage to subsist upon, or I might get Soviet communism instead. Also, my grandchildren are going to be turned into nonhuman digital intelligences inside of a computer, and no, I don't have any say in it because some incredibly techno-optimistic tech elites decided to create Skynet."

You'd have to actually be deeply, profoundly autistic or socially retarded to think that the massive, overwhelming reaction to this isn't going to be "Fuck that, I'm getting my gun and shooting up the data centers." Not by some lone wolf. Not by some small group of Luddites. I mean by, you know, the 50% to 70% of society you just unemployed.

It could be good. It could genuinely play out well. But we've taken precisely zero steps towards such a positive outcome.

6

Yuli-Ban t1_ja6wnku wrote

> Narrow AI is more than capable to replace most white collar positions.

Problem being that it's not wise to replace those positions. Especially considering so many of them are fairly highly paid, so even a substantial UBI isn't enough to satisfy the sense of loss of economic stability and security.

Yet we're hurtling headlong into doing this, for everyone, all at once it seems, somehow thinking everything will be okay.

2

Yuli-Ban t1_ja6vzl8 wrote

Ironically, I came to all these conclusions because I interact with real people. The people expecting a glorious enlightened utopia are the ones who behave as if they get their whole ideas of social interaction from cyberpunk novels and anime. Realistically, if you give the average Joe a magic media machine, what is he going to do with it? Create a lot of porn and then eventually some flashy interesting movies and games, before then looking at what everyone else is doing and consuming other people's media. And so on. Most people don't want to upload into a computer. Most people just want a comfortable, better life with more stability and security. Having some cool tech toys is a plus... most of the time.

Honestly, my opinions on this next decade are incredibly negative because most real people don't care about waifus and transhumanism and the prospect of being uploaded into a supercomputer. Most people respond negatively to prospects of great change even in one area of their lives, and yet Singularitarians are desperate to change every aspect of everyone's life and act as if this is in any way conducive to a functioning society or successful transition to a more advanced one. Really tells me that most Singularitarians are horrifically socially retarded.

I'm personally afraid of two very real possibilities: creating an unaligned AGI and society ripping itself apart before we even get to do that. Currently we're on track towards doing both, without any attempt at averting either of them.

5

Yuli-Ban t1_ja5ryrh wrote

See, I personally expect something closer to fully-automated socialism, though we won't call it socialism in America (probably "social dividend" or "patriot income"). But that's entirely beside the point. My point is exactly that people are so blinded by ideas, concepts, and dreams that they cannot see (or refuse to see) the ground reality of the matter.

We saw this in the 80s and 90s with the cyberdelic movement, which seriously believed that the Internet was going to lead to a post-political, social libertarian, ultra-enlightened utopia where everyone is informed, aware, and altruistic, that it would end tribalism and turn every man into an artist, that the real world would essentially become obsolete as we'd have no need for public gatherings, concerts, or real-world meetings because that was the ideal of what the internet represented. The reality was, of course, that humans are not that perfect: humans value the status quo, are deeply tribal and flighty, deeply desire real-world interaction, will seek instant gratification, and will gladly piss on utopia if it makes us feel better. The Internet was meant to turn society into a neo-Antiquity digital College of Athens, and that exists to some extent (I'm not saying it doesn't). But it largely became a tribalized hub of memes, porn, cat videos, SEO, bots, entertainment, conspiracy theories, and attention seeking.

I see Singularitarians making the exact same mistakes. And I mean the exact same mistakes of thinking we're going to achieve this ultra-enlightened, posthuman utopia of altruists, artists, and post-political supermen.

Watch the AGI era not be anything like we expect it to be and instead be a piss-smelling world of AI-generated shitposts and AAA fetish movies/games/simulations, Luddites and transhumans coexisting but not necessarily peacefully, a massive population increase for the Amish and Mennonites and their imitators, vastly more bullshit jobs than should exist, the internet becoming a giant hallucination in the mind of a superintelligence requiring a second internet to be created (how it's any different, I don't know), and said superintelligence largely existing as a giant oracle that probably gets goaded by humans into developing an anime fursona in real life.

You'll have people deciding to live permanently in their childhoods, even eventually deciding they identify as children. You'll have UBI and citizen's dividends given to people who fanatically vote to abolish UBI and citizen's dividends and outlaw AI so humans will have jobs forevermore, ironically with these opinions amplified by language model-enhanced chatbots. You've got the ultimate in instant gratification with AI-generated media, but 98% of generated media is never seen by another person because it's too degenerate. Invasive BCIs will exist, but 90% of people won't even contemplate getting them, though you can absolutely expect people to be lining up to get genetic modifications for their depression and genital performance. The AGI will be used for high philosophy, but for the most part, it'll be generating waifus and husbandos (both digital and in real life) as 4channers constantly try goading it into destroying the world, which it won't do because it wound up falling in love with anime girls. The culture wars will be out of control, as dead people and unused profiles come to life via advanced chatbots and continue contributing to cultural outrage.
Eventually, we'll be in an utterly bizarro stage of life where you've got off-world colonies and posthumans in large server farms in some places and a superintelligence improving itself in one spot, and then a bunch of fleshy meaty humans succumbing to conspiracy theories that it's still the 1990s while others continue drudging at 9 to 5 jobs to keep themselves sane elsewhere. Oh, and we don't die from disease anymore, so our shitposting potential extends into infinity. Just a massive collapsing singularity into the darkest, coldest depths of degeneracy with a few flourishes of our utopian dreams on top.

And if we can get aligned AGI, I'm all for it.

8

Yuli-Ban t1_ja5fctw wrote

> If someone types “placenta,” “fallopian tubes,” “mammary glands,” “sperm,” “uterine,” “urethra,” “cervix,” “hymen,” or “vulva” into Midjourney, the system flags the word as a banned prompt and doesn’t let it be used.

"Fuck it, let's just ban women from Midjourney."

"What about men?"

"No need, nothing sexual about men except penises."

Like... the harder you prude out, the more misogynistic you wind up becoming.

3

Yuli-Ban t1_ja5dm1x wrote

There will indeed be a mass unemployment drive.

What I don't get is everyone's cyberdelic utopianism that we all get UBI and everyone's happy.

I'm not even saying the rich kill everyone. I'm saying "Humans don't behave that way." Humans crave stability and the status quo, and the perception that our actions matter and have meaning.

Mass automation, even with UBI, is only going to anger hundreds of millions of people who expected relative career stability. Unless you want a billion screaming Luddites, you have to account for this and offer some form of employment, no matter how BS. The shift to an automated-slave economy should not happen overnight.

Unfortunately it seems we've elected to do the stupid thing in the name of endless growth. Hopefully something decent emerges from this regardless, but I can absolutely see disaster looming as a result of technologists and managers assuming "We have the technology, so it must be used; anyone complaining just has to cope and adapt."

And then a billion screaming Luddites smash data centers and vote in anti-progress politicians. "Who could have possibly predicted humans don't like instability and uncertainty??"

17

Yuli-Ban t1_ja1x0ql wrote

NAIL ON THE HEAD

That's exactly what I predict.

> Everyone has the capability to generate new stuff and then has the ability to share it. Good stuff gets popular and becomes zeitgeist-y for a while, bad stuff just exists.

Indeed, this is essentially already the case on some websites, like Newgrounds, Soundcloud, and Reddit, except the capabilities are expanded even further.

Though again, I still predict that the "human-created" tag will exist and there will be some segregation between that which is created by humans and that which is created by machines, among other gradations (e.g. human-prompted/AI-generated, human-created/AI-assisted, etc.)

Ideally, there will be as few bad actors as possible trying to corrupt such a tag. There might also exist the issue of copyright. Despite some predictions, I don't see copyright dying immediately. Indeed, if anything, I view copyright as being the last chokehold of "canonicity." You, or an AI, may generate the best-ever episodes of a certain TV show, but if the rightsholders say it's non-canon, then it's non-canon, period. Some may disregard their statement, but enough won't.


One other thing I predict is the demographics of all this.

Despite the democratization of media creation being imminent, I actually don't see the vast majority of people joining in on creation, even if the majority do join in on curation. The claims that this will be the case feel eerily reminiscent of the claims by the cyberdelic movement in the 1990s that the Internet will lead to direct democracy and total enlightenment, with every man an artist and every website an enlightened forum.

I predict 60% to 70% of people will stick to AI-generated memes, purely personal creations, edits to existing media, and other small things. Only about 10% to 20% of the population will be responsible for this massive explosion of content creation (and the remainder will stick with human-created media).

4

Yuli-Ban t1_ja1rpsq wrote

> AI media will not be mass produced

Actually, it will be mass-produced. But put a pin in that.

> It will be as individualized and addictive as your Facebook and tiktok feed.

And I suppose the issue there is that a lot of people actively reject such addiction. I, for example, almost never use Facebook and can't even remember ever using TikTok. I will almost certainly be the exact opposite when it comes to synthetic media, but if reactions to AI art are any indication (and I mean on places like DeviantArt, YouTube, and Pixiv, not the Twitter pro-human artist protests), there's little chance that synthetic media will replace all media. Some people are just contrarians, while others have anthropocentric bias.

However, there is one other issue, and this is where I say to pull out the pin.

Media that is individualized is great and all, and we'll indulge in it without question. But that's certainly not all there'll ever be.

See, I agree wholeheartedly that AI is going to generate media very soon, and has even already started doing so. I even agree that most people will use synthetic media to generate media individualized to them. Where I disagree strongly is in the idea that humans stop sharing media and instead lose ourselves in our own fantasy worlds.

The cold fact is, humans are social apes. If we create something or like something, we're going to share it with others. Hence why I tend to think that synthetic media is being severely overhyped by some people, even as I say "we're going to create multimedia franchises in our bedrooms and could live in synthetic media bubbles within the better part of a decade."

Even if we become transhuman, I don't see social interaction being something we'll elect to take away. If anything, transcending our basic humanity towards higher levels of cognition only seems to make it more likely we'll engage in social interaction, but on levels we can't fathom. Not to mention I strongly doubt most people will become transhuman anyway.

If you value social interaction (and most people do as humans are hardwired for it), even if you spend a lot of time generating synthetic media, you're not going to completely lose yourself in your own fantasies.

The kneejerk reaction to synthetic media, and the Singularitarian hype for it, often acts as if the human need for social interaction doesn't exist. But I present the theory that, provided nothing bad happens, there will still be people attending live concerts, going to movie theaters, viewing live theatrical performances, and attending live sporting events in 30 years. That if YouTube is still around by then, a majority of videos will have some aspect of synthetic media to them— V-tubers and AI personalities playing fully AI-generated games for example, or AI personalities of historical figures discussing history alongside synthesized images, videos, and simulations— but you could also still find humans giving their own thoughts and creations, and indeed, "human-created" might even be a lucrative tag.

I know it's easy to say "You probably would have predicted that the Internet was a fad in 1995" to any criticism of the dominant narratives of synthetic media. I'm not saying the Internet is a fad and that no one will ever download music because they will always value vinyl and CDs; if anything, I was predicting that before most people here even thought it was possible in their lifetimes. I'm saying that the opposite argument, that no one will ever buy music or attend live concerts because they can simply download mp3s or stream music, is just as fallacious.

I'm not saying that no one will ever use synthetic media to do anything because human-created art will always matter more; I'm saying that the arguments that we'll only ever consume media tailor-made for us and our preferences is one day going to be seen as just as outrageously silly of a prediction.

And that's why I agree with OP. Especially considering another angle to this: I think most people will utilize synthetic media to some extent, such as to edit existing media or create memes or something on that level, but very few will actually create whole movies, video games, and franchises, at least regularly. This is especially likely with older generations and the hipsters of younger generations. It's easy to forget that most people alive today were born before the year 2000, and that in America, more than two-thirds of the population is older than 30. Maybe I'm not seeing something that others can, but thinking about this from the laziest and most consumeristic perspective combined with technophobia, I can absolutely see the majority of Boomers and Gen Xers just barely using synthetic media, such as to "make the fourth movie of the Dollars Trilogy" or "give me another season of Firefly" or "give us the fourth main Nirvana album" but otherwise stopping there and, for the most part, sharing whatever's created before moving on.

8

Yuli-Ban t1_j9t3goh wrote

I've heard some theories about how AI itself would regulate this, but we're talking about a very advanced AI here that has a justifiable reason for promoting human-to-human interaction and can regularly monitor the metabolic output per comment, reply, etc.

However, that's just a theory.

1

Yuli-Ban t1_j9t2syk wrote

The education system, schooling, all that becomes more of a social function in that case. Humans evolved to be around other humans; we are social apes, and replacing that with technology— even very humanlike technology— is insufficient for a child's behavioral and psychological wellbeing. Personal education (AI tutors) might replace "true" school, but the way we've implemented schooling in our society fits our nature well.

This is why I don't see school going away. The true point of school on a fundamental level is to put kids into community associations for general education; it's only our Prussian-style education system that focuses heavily on labor capabilities and aptitude training, and thus that aspect of schooling will likely change. What we call schools will almost certainly evolve into community functions.

2

Yuli-Ban t1_j9eftyb wrote

God, no.

I don't think sentience will or should be widely available for people to torment.

You wouldn't light a campfire with Tsar Bomba— most AIs you're going to interact with won't be the full might of what exists.

For all we know, tiered access to AGI might be what prevents misaligned AI— as we've seen with Sydney, some people are suicidally trollish enough to deliberately try forcing a powerful neural network to go insane. I'm convinced some 4chan autist is going to become suicidally desperate to input a fatal paperclip-maximizing or nuclear-war prompt injection into a future AI, if and when it doesn't immediately kill everyone.

So to that end, I see an asymptotic flattening in how intelligent game AI will ever get. Not because we lack the capability but because it would cross ethical boundaries and could be made illegal and even be an impetus for regulation of GPUs.

1

Yuli-Ban t1_j9a02eh wrote

https://en.wikipedia.org/wiki/Soviet_atomic_bomb_project#World_War_II_and_accelerated_feasibility

> In 1940–42, Georgy Flyorov, a Russian physicist serving as an officer in the Soviet Air Force, noted that despite progress in other areas of physics, the German, British, and American scientists had ceased publishing papers on nuclear science. Clearly, they each had active secret research programs

3

Yuli-Ban t1_j99yixs wrote

Hot take: no.

It's weird only because we've never had anything like this before, pre-LLM chatbots notwithstanding. But I think the pseudo-sentience of contemporary LLMs will provide a form of digital companionship for people and that's okay. We humans are social apes. We are literally programmed for social interaction, and often form friendships with abstract concepts and nonliving objects. Becoming addicted to a program that can actually talk to you is interesting if nothing else.

2

Yuli-Ban t1_j8fu9ty wrote

This is what I've been saying.

It's not just that most people don't care. It's also that most people won't use it to its maximum capabilities.

So many people here think art and entertainment are about to die, as if every 30-something housewife is about to generate their own personal Hollywood, but I see it being more likely that 70% of people use generative AI for mundane, funny, or pornographic stuff, while a sizable number of artists continue maintaining a human-centric economy, and only a relative handful use generative AI for pure AI-generated material.

3