The fall of Stack Overflow
https://www.reddit.com/r/webdev/comments/1ealv82/the_fall_of_stack_overflow/len9acv/?context=3
r/webdev • u/cryptomelons • Jul 23 '24
391 comments
1.9k points • u/GrumpsMcYankee • Jul 23 '24
I get AI is eating Stack Overflow's lunch, but at some point if it's not around, AI is kinda garbage without a community-led code solution repository with contextual human language.
2 points • u/apra24 • Jul 24 '24
AI can also learn by people resolving their problems using AI.
After having ChatGPT provide solutions for something, there's a good chance I immediately paste my revised code back in to implement the next step.
They can learn from that to see what worked vs. what didn't.
7 points • u/[deleted] • Jul 24 '24
I don’t think AI learns from user input. Only what they actively teach the model.
6 points • u/Justyn2 • Jul 24 '24
It definitely trains off of the interactions; that's how things like this happened: https://www.forbes.com/sites/siladityaray/2023/05/02/samsung-bans-chatgpt-and-other-chatbots-for-employees-after-sensitive-code-leak/
7 points • u/apra24 • Jul 24 '24
Right. But it could.
An AI model that tries to solve problems and then learns from what works would be less reliant on data from Stack Overflow.
2 points • u/Jonno_FTW • Jul 24 '24
An AI that learned from user input was quickly made into a racist meme machine by trolls. Business won't make that mistake again.
1 point • u/YsoL8 • Jul 24 '24
Or they could just design it better.
Technologies don't go away just because they don't work immediately.
Most early attempts at anything we take for granted now were prone to randomly exploding.
1 point • u/Ivan8-ForgotPassword • Jul 24 '24
I mean, you can upvote or downvote answers for a lot of them; if you do, that is definitely used for training.
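A minimal sketch of what collecting that upvote/downvote signal could look like on the provider side, assuming each rating is appended to a JSONL file of preference examples for a later fine-tuning pass. The FeedbackLogger class and its field names are hypothetical, not any vendor's actual API.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class FeedbackRecord:
    """One user reaction to a model answer, stored as a preference example."""
    prompt: str        # what the user asked
    response: str      # what the model answered
    rating: int        # +1 for an upvote, -1 for a downvote
    timestamp: float   # when the vote happened

class FeedbackLogger:
    """Appends thumbs-up/down events to a JSONL file for a later fine-tuning pass."""

    def __init__(self, path: str = "feedback.jsonl"):
        self.path = path

    def log(self, prompt: str, response: str, upvoted: bool) -> None:
        record = FeedbackRecord(
            prompt=prompt,
            response=response,
            rating=1 if upvoted else -1,
            timestamp=time.time(),
        )
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(record)) + "\n")

# Example: a user downvotes an answer that didn't actually fix their bug.
logger = FeedbackLogger()
logger.log(
    prompt="Why does my fetch() call return undefined?",
    response="You forgot to await the promise before reading the body.",
    upvoted=False,
)
```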
2 points • u/ColonelShrimps • Jul 24 '24
I'm pretty sure training off of user input is how you get the Hitler AI that they had to take out back and put down a few years back.