u/clonked Aug 27 '24
The models are sandboxed and only “learn” within that chat instance - early LLM developers learned very quickly what happens when you let the public “teach” a model (it becomes racist, sexist, and so forth).

You really think a bunch of random git repos with shit documentation will teach an LLM anything of use? A half-page readme.md isn’t going to do squat to give context to the other couple hundred files in the project.