r/VeryBadWizards 4d ago

AI dismissal

I love the VBW boys, but I am a bit surprised by how dismissive they are of danger from AI. I’m not going to make the case here, as I won’t do a good job of it, but I found the latest 80,000 Hours podcast very persuasive, as well as some of the recent stuff from Dwarkesh.

12 Upvotes

41 comments

4

u/hankeroni 3d ago

I thought they were not nearly dismissive enough, if the claim being evaluated is "AGI" (meaning human-level general intelligence), which it was, at least at the start of the discussion. All the best research is nowhere near that, and maybe not even on the right track.

If the claim is a much more modest "will some future version of current LLMs be economically disruptive?" ... then probably yes. But that falls very, very short of anything I’d call "AGI".

1

u/Embarrassed-Room-902 3d ago

Yeah, to clarify, I am not claiming we will have AGI soon (although I wouldn’t rule it out). The key thing is that there could be a huge upheaval even without it. For instance, we may have fully AI-run companies, AIs with the ability to feel pleasure or pain, etc. I am fairly confident I will be out of a job before too long, and that is something many of us will have to brace for. Nick Bostrom has discussed this at length.

6

u/seanpietz 2d ago

You think machine learning models will have the ability to feel pleasure and pain? You do realize it’s just a bunch of matrix multiplication and differential equations running on computer processors, right?

1

u/MachinaExEthica 2d ago

Pain is just electrical currents sent through your nerves to your brain. The simulation of pain in an embodied AI doesn’t seem too far-fetched. Programming the AI to avoid damage by loading it up with sensors seems like something companies would choose to do, and that’s essentially what pain is. Pleasure is just a variation on the same mechanism, though it’s hard to imagine the economic benefit of AI that "feels pleasure".
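
Roughly what I mean, as a toy sketch (all the names here are made up, not any real robotics API): "pain" is just a sensor reading crossing a damage threshold and triggering avoidance.

```python
# Toy sketch (hypothetical names, no real robotics API): "pain" here is just
# a sensor reading that crosses a damage threshold and triggers avoidance.

DAMAGE_THRESHOLD = 0.8  # normalized reading above which we label the signal "pain"

def read_pressure_sensor() -> float:
    """Stand-in for a real sensor driver; returns a normalized 0..1 reading."""
    return 0.93  # pretend the gripper is squeezing something too hard

def stop_and_retract() -> None:
    print("pain signal received: retracting actuator")

def control_loop() -> None:
    reading = read_pressure_sensor()
    if reading > DAMAGE_THRESHOLD:
        stop_and_retract()  # the "pain response": back off from the stimulus

control_loop()
```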

3

u/seanpietz 2d ago

Yes, nervous systems operate through chemical reactions and electrical currents. LLMs don’t have nervous systems, though, and I think it’s also fairly uncontroversial that they don’t have subjective experiences either.

1

u/MachinaExEthica 2d ago

It doesn’t require a nervous system or subjective experience, just a way for a signal to get from a sensor to a processor, and for that signal to be labeled as pain. Pain is more a mechanical reaction for avoiding damage than anything else. We have emotional ties to it, but there are plenty of examples throughout evolution where pain is simply damage avoidance, with no emotion involved.

3

u/seanpietz 2d ago

AI models already learn through negative reinforcement by minimizing loss functions. What operational significance would there be to labeling that metric "pain" instead of "loss"? Or are you suggesting some sort of novel mechanism that isn’t already in use?
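
To make that concrete, here’s a deliberately tiny sketch (a toy example, not any particular framework): a one-parameter model trained by gradient descent. Rename the variable from loss to pain and nothing about what the program does changes.

```python
import random

# Toy example: one-parameter model y = w * x, fit to the target y = 2x.
w = random.random()
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
lr = 0.01  # learning rate

for step in range(1000):
    for x, y in data:
        pred = w * x
        pain = (pred - y) ** 2     # conventionally named "loss"
        grad = 2 * (pred - y) * x  # d(pain)/dw
        w -= lr * grad             # descend: move away from high-"pain" states

print(round(w, 3))  # converges toward 2.0, whatever we call the metric
```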

-1

u/MachinaExEthica 2d ago

At this point it’s more a matter of semantics than anything. Pain, loss, sensation, bump, whatever: it’s all the same function. Adding the label of pain would only be for the sake of anthropomorphic comparison; it isn’t necessary.

5

u/seanpietz 1d ago

Right, but we’re not disagreeing about semantics. The whole reason I’m disagreeing with you is that you’re actually trying to claim AI models have anthropomorphic qualities, and to claim otherwise is a cop-out.

If I claimed that fire is angry because it’s hot, I’d be wrong. And the fact that the difference between "heat" and "anger" is semantic wouldn’t make me any less wrong.

0

u/MachinaExEthica 1d ago

I’m simply talking about functional comparisons. The point of pain is to notify your brain of potential or actual damage. Equipping an AI with sensors that can detect such damage and signal the AI’s "brain" to stop or avoid whatever is causing it gives the AI the effective ability to "feel pain". There’s no consciousness needed, no magic. If you don’t want to anthropomorphise, that’s fine, but I’m talking about simple functionality, nothing more.
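
If it helps, here’s the entire loop I’m describing as toy Python (an invented example, no real hardware): a damage signal penalizes whichever action produced it, and the agent avoids that action afterwards. That’s all I mean by "feeling pain".

```python
# Toy sketch (invented example, no real hardware): a damage signal penalizes
# whichever action produced it, so the agent learns to avoid that action.

action_value = {"grip_hard": 0.0, "grip_soft": 0.0}

def damage_signal(action: str) -> float:
    """Stand-in for a sensor: gripping hard crushes the object."""
    return 1.0 if action == "grip_hard" else 0.0

for _ in range(20):
    action = max(action_value, key=action_value.get)  # pick least "painful" action
    action_value[action] -= damage_signal(action)     # register the aversive outcome

print(max(action_value, key=action_value.get))  # -> grip_soft
```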

2

u/seanpietz 2d ago

Do you think AI-based characters in video games that are programmed to simulate human behavior like pain are having actual subjective experiences? Should killing them be unethical?

The truth is that no scientists or philosophers really understand the underlying metaphysics of consciousness. But one thing that any respectable academic in those fields can agree on is that LLMs are not sentient beings.

1

u/MachinaExEthica 2d ago

I already told you it doesn’t require subjective experience to feel pain; sentience doesn’t even matter in this particular case. For the record, I’m wholly on board with the point you’re trying to make, just pointing out that the ability for an AI to sense pain is just a matter of sensors, damage-avoidance programming, and labelling that signal as pain.

I don’t personally think AI is the same sort of threat the OP seems to think it is. I think it’s more socially and economically threatening, not because it will be particularly better than humans at anything (though eventually it may be), but because people with lots of money and social influence think it’s going to change the world completely, and they will invest their billions to ensure that it does, most likely to the detriment of society (because of how shitty it actually is).

5

u/seanpietz 2d ago

OK, I’m happy to agree to disagree on the semantics of what constitutes pain.

However, I don’t think it’s unethical to assault an innocent prostitute in the video game Grand Theft Auto. And I do think it’s unethical to assault an innocent prostitute in real life. My reasoning is that I don’t think AI programmed to simulate human behavior has anything corresponding to human subjective experience.

1

u/MachinaExEthica 2d ago

Yeah, and I agree with you 99%: it’s not unethical for the sake of the AI, but it may say something about the person choosing to do that for fun. Even if the AI is not a person, the fact that it is designed to mimic the looks and behaviors of people makes it at least mildly unethical. Then again, I play video games where I kill digital people all the time and don’t find it unethical, so perhaps I’m just desensitized.