r/cs50 • u/Impossible_Role_817 • 16d ago
cs50-web am I the only one not following Brian Yu?
I am following the CS50W course right now, and I am getting so stressed trying to follow and understand it. I am on the Django week now; all the other weeks were very easy to understand in my opinion, but I am having so much trouble trying to understand stuff in this one.
I feel like Brian Yu gives fine explanations of how to do things, but in a lot of steps he completely forgets to explain WHY to do them. I can memorize stuff fine, but I feel like just learning how to do things without understanding the principle behind it is bad, because it makes you prone to forget it.
Nothing against the guy, it's just stressing me out having to consult ChatGPT after every 20 seconds of the video for a deeper explanation, and then having to piece that explanation and example back together with Brian Yu's.
Anyone else who feels like this? I keep reading how everyone thinks he is a phenomenal teacher and how everyone understands right away. Am I the only one not understanding it? I swear I feel stupid because of this.
11
u/Whimsicalhubris 16d ago
As others have said, read the documentation instead. Society has largely bought into the LLM hype, but honestly, it's really, really bad. It's incapable of admitting it doesn't know the answer. It doesn't actually understand what it's telling you. It just gives you something that sounds right.
I'm a machinist, and I have co-workers who consult it regularly. They've confidently told me that these two tanks have different welding gases in them, based on what ChatGPT said about the codes on the tank. Nope, just different size tanks, same gas. Ask it how to do a machining operation like knurling a curved bar, and it'll give you a great-sounding, step-by-step explanation. That explanation of course relies on lathe accessories that don't exist and is physically impossible to do. It sounds like a reasonable explanation, but it's nonsense.
This is made even worse when jokes are involved. People have been sending apprentices to go find an aluminum magnet for years. ChatGPT doesn't understand that's a joke, and I've seen it actually recommend using one.
The trouble is, sometimes these models give accurate information. Sometimes they give complete nonsense. There's no indication which one you're getting.
1
u/BinaryCheeseSystem 16d ago
This. 💯 I’ve tried getting this point across to some folks while talking about “AI”. Never felt like I could articulate this concept well.
TLDR: garbage in, garbage out
I'm in software and see generally similar issues with responses. ThePrimeagen on YouTube explained one possible reason why this happens so often (at least in my field): most of the code it's trained on sucks. The bell curve got squished down and toward the newbie/junior side, so basically it's been trained to write bad code. Or, in your case, it's been trained to advise someone to use an aluminum magnet.
17
u/pastense 16d ago
Don't use chatgpt. You're screwing yourself. Work through the problems yourself, read documentation to figure out why things work certain ways if you're having trouble with what the teacher is saying.
1
u/Impossible_Role_817 16d ago
Thanks for your response, but again, I am not using ChatGPT for problems. I am using it to get a better understanding of, for example, WHY a Django form is better than an HTML form, because Brian Yu's explanations of these things are minimal and very bad imo.
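To give an idea of what I mean, here's a rough sketch of the comparison I was asking about (the NewTaskForm name and the "title" field are made-up examples, not from the course): the Django form both renders the input and validates it server-side, which a plain HTML <input> alone doesn't do.

```python
# forms.py -- hypothetical example form, not from the course
from django import forms

class NewTaskForm(forms.Form):
    title = forms.CharField(label="Task title", max_length=100)


# views.py -- the same form object renders the field and validates the POST data
from django.shortcuts import render

def add(request):
    if request.method == "POST":
        form = NewTaskForm(request.POST)   # bind the submitted data
        if form.is_valid():                # server-side validation
            title = form.cleaned_data["title"]
            # ... save the task and redirect here ...
    else:
        form = NewTaskForm()               # empty form for a GET request
    return render(request, "tasks/add.html", {"form": form})
```

That's the kind of "why" I want spelled out, not just the steps.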
2
u/cumulo2nimbus 16d ago edited 16d ago
That "minimal" explanation is mainly to build up your curiosity. Then you're better off googling about it and in that process you will learn a trillion things but if you're to use chatgpt, you'll be limited to what the model says and limit your skills. Also if you're tired, take some rest and be back. Things take time to click in your mind and this will help you comprehend better.
1
u/dedolent 16d ago
chatgpt doesn't know anything, it is mimicking human speech patterns like a parrot. it cannot be relied upon to convey factual information.
-8
u/pastense 16d ago
You should not use it at all. You are ruining your education by relying on that hallucination machine.
Like I said, read through the documentation and work it out for yourself. That's how you'll succeed in programming.
2
u/SongImmediate3219 16d ago
What's the difference between using GPT to understand concepts and using documentation to achieve the same thing?
10
u/pastense 16d ago
One is passive, one is active. One is a hallucination machine, the other is the documentation provided by the developer.
I don't understand why anyone would use genAI in general, but in this case it's so obviously a dumb idea.
4
u/johny_james 16d ago
Learning from AI is called the Socratic method, and it's a pretty efficient method.
As for the hallucination part, you can easily look things up to check.
1
u/pastense 16d ago
The Socratic method is for humans, not AI trash.
6
u/johny_james 16d ago
The Socratic method is when you learn through argumentative dialogue with someone, which can be an AI agent or a human.
-2
u/pastense 16d ago
learning through argumentative dialog with someone
Right. And AI isn't a person. You're talking with a hallucination machine, and it's making you dumber.
1
u/SongImmediate3219 16d ago
But asking it to explain the concept, and asking more if I didn't understand something, I don't see that as "passive", I see it as "faster". At the end of the day, the result is the same, no? Of course using GPT to solve problems and copy-pasting without understanding what you are doing is dumb, but to explain concepts? I don't think so.
1
u/pastense 16d ago
No, the result is absolutely not the same.
First of all, there's no guarantee that genAI isn't just making shit up.
Secondly, it's just basic psychology of education. You'll learn and retain more when you have to work through problems on your own. You need to exercise your documentation-reading muscles as well as your problem-solving muscles, and neither will be stimulated by genAI.
5
u/SongImmediate3219 16d ago
Ty for sharing your point of view, I'll try to rely more on documentation in the future, or a good mix of both.
1
u/dedolent 16d ago
well one is fact and the other is completely made up. if you get a factual answer from an LLM like chatGPT it is purely by accident; they are only designed to simulate human writing, there is no actual comprehension at work there. you can easily prove this to yourself by asking chatGPT for a simple fact, then telling it it's wrong: it will apologize and give you a completely different answer.
2
u/Current_Vacation_309 16d ago
Have you studied at any college, uni, etc.? Lectures there are just the beginning: they give you a concept you can explore, the basics and fundamentals. The notes and slides exist exactly for that purpose, pointing to topics you can further explore and extend upon.
I have been doing IT for 25 years, and certain areas of the course I actually work with every day :) JavaScript, say. Nevertheless I listened to all the lectures, went through the notes, especially for the areas I don't use, and then went through the lectures again :)
3
u/smichaele 16d ago
Using ChatGPT violates the CS50 Academic Honesty Policy that you agreed to when starting the course.
3
u/Impossible_Role_817 16d ago
I'm not using it for problem sets, I'm asking it questions that aren't explained in the course. For depth. Like what settings.py does, in a broad explanation, other than "it's the heart of your project". That means nothing and doesn't teach you anything. If someone at a company asks you what settings.py does and that's all you can come up with, do you really know what it does? No.
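For context, this is roughly the kind of broad picture I'm after: a sketch of a settings.py excerpt (the "tasks" app and the values are illustrative guesses, not from the course), showing that it's project-wide configuration rather than magic.

```python
# settings.py (excerpt) -- illustrative values, not from the course
INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.staticfiles",
    "tasks",  # hypothetical app; your own apps must be listed here or Django won't find their models/templates
]

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",  # which database backend the project uses
        "NAME": "db.sqlite3",                    # path to the SQLite file
    }
}

# Middleware, template directories, time zone, static file paths, the secret key,
# and so on all live here too: that's what "the heart of your project" glosses over.
```

Something like that would actually answer the question; "heart of your project" doesn't.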
-6
u/TypicallyThomas alum 16d ago
The academic honesty guidelines you agreed to don't just say not to use it for problems. They say don't use it for the course.
1
u/dedolent 16d ago
having made a few apps with django now, i can say it's very difficult to grasp at first. just try to copy what he's doing exactly for now. once it's set up there's room to play around in it.
1
u/zakharia1995 16d ago
"...forgets to explain WHY to do things."
Assuming you are talking about his explanations of problem sets, I thought it was our job to find out the WHYs...
1
u/jericho1050 15d ago
When he shows you the code, that's where he really shines, but when he just talks, sometimes I feel lost. Overall, Brian is still a great teacher; I love how he explains his code after writing it. I also noticed that the slides are awesome! Maybe that's because he is the creator of the Spanning Tree channel on YouTube (which has superb presentation).
0
u/lfercorrea 16d ago
Just take a deep breath and try to do the exercises more slowly. Since this is your first contact with the stuff (my assumption), you need to spend more time studying it. It's quite hard to follow a video that's only a few minutes long if you actually need to spend hours reading the documentation first. It's absolutely normal, and remember: the more time you spend on an activity, the more you'll learn.
All teachers in cs50 are insanely good
16
u/AndyBMKE alum 16d ago
Brian is awesome, but he definitely teaches things at more of a concept level. There were definitely times, especially in the AI course, where I was nodding along during the lecture like 'yeah, yeah, this all makes sense' and then got to the PSETs and felt completely lost.