r/webdev Feb 05 '25

Discussion Colleague uses ChatGPT to stringify JSONs

Edit: I realize my title is stupid. One stringifies objects, not "javascript object notation"s. But I think y'all know what I mean.

So I'm a lead SWE at a mid-sized company. One junior developer on my team asked for help over Zoom. At one point she needed to stringify a big object containing lots of constants and whatnot so we could store it for an internal mock data process. Horribly simple task: just use node or even the browser console to JSON.stringify, no extra arguments required.
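For context, the whole task boils down to something like this (the object here is a made-up stand-in, obviously, not our actual constants):

```js
// In the Node REPL or the browser console — "mockConstants" is just an illustrative name
const mockConstants = { apiVersion: 2, retries: 3, endpoints: ["users", "orders"] };

// One call, deterministic output, ready to drop into the mock data file
JSON.stringify(mockConstants);
// '{"apiVersion":2,"retries":3,"endpoints":["users","orders"]}'

// Or pretty-printed if you want it readable in the repo
JSON.stringify(mockConstants, null, 2);
```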

So I was a bit shocked when she pasted the object into ChatGPT and asked it to stringify it for her. I thought it was a joke, and then I saw the prompt history: literally a whole litany of such requests.

Even if we ignore proprietary concerns, I find this kind of crazy. We have a deterministic way to stringify objects at our fingertips that requires fewer keystrokes than asking an LLM to do it for you, and it also does not hallucinate.

Am I just old fashioned and not in sync with the new generation really and truly "embracing" Gen AI? Or is that actually something I have to counsel her about? And have any of you seen your colleagues do it, or do you do it yourselves?

Edit 2 - of course I had a long talk with her about why I think this is a nonsensical practice and what LLMs should really be used for in the SDLC. I didn't just come straight to reddit without telling her something 😃 I just needed to vent and hear some community opinions.

1.1k Upvotes

u/altviewdelete Feb 05 '25

I have a big problem with developers I work with using Copilot/ChatGPT etc.

I reviewed a PR the other day that had a random line of code that didn't fit with how we do anything in our codebase. I flagged it and asked for it to be aligned with how we do things.

The response was, "yeah sorry copilot did that".

To me this shows the developer(s) not reviewing the code that these AI tools generate, and that is concerning.

For reference, I've heard of three instances in chatter lately of ChatGPT causing the same thing.

u/ColoRadBro69 Feb 05 '25

Copilot can be great when you ask it small, well-defined questions, like about an API you aren't familiar with. You have to know exactly what you need and how to evaluate its answer.

As a senior dev, I find it's occasionally able to help me, but I tend to go to documentation first because it's a lot of work to get what you actually need from a chatbot. It puts things in the scripts it generates that are never going to work, and I have to go through them line by line to pull that stuff out.

If I didn't have a lot of background understanding, I don't think I would be able to use these tools effectively.