Even then nothing is learned. Every HN thread on AI coding is the same: "I am using $model for writing software and it's great." "I am using $model for writing software and it sucks and will never do it." 800 comments of that tit for tat, all in the present tense. Still nothing learned.
Doesn't help that no one talks about exactly what they are doing and exactly how they are doing it, because capitalism wins out over the open-technology discussions that are meant to uplift the species.
> Doesn't help that no one talks about exactly what they are doing and exactly how they are doing it
Let me try a translation:
> I am using $model for writing software and it's great.
I have generated an extremely simple JavaScript application that could have mostly been done by copy/paste from StackOverflow or even Geeks4Geeks, and it runs.
This is true. I have a PWA on my phone right now that I generated with an LLM. It works. Pretty sure even w3schools would be ashamed to post that code.
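For scale, a trivial PWA is roughly the sketch below: one page, a manifest, and a copy/paste service worker. The file names and caching strategy are illustrative stand-ins, not the actual generated code:

```js
// sw.js: a minimal "cache on install, serve from cache" service worker,
// about the level of code an LLM produces for a trivial PWA.
// The asset list and cache name are illustrative.
const CACHE = 'pwa-demo-v1';
const ASSETS = ['/', '/index.html', '/app.js', '/manifest.json'];

self.addEventListener('install', (event) => {
  // Pre-cache the handful of static files that make up the whole app.
  event.waitUntil(caches.open(CACHE).then((c) => c.addAll(ASSETS)));
});

self.addEventListener('fetch', (event) => {
  // Serve from cache if we have it, otherwise fall through to the network.
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});

// app.js: registering the worker; that plus a manifest is the whole "app".
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js');
}
```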
> I am using $model for writing software and it sucks and will never do it.
This is also true. At work I have a 15-year-old codebase where everything is custom.
You can't get an LLM to take it all as context because you simply don't have the RAM for it, so you can't even test the quality of the advice it gives on it.
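To put a rough number on that, a back-of-envelope script like the one below shows how fast a large custom codebase blows past any context you can actually hold. The file extensions, the ~4 characters/token ratio, and the window size are all assumptions, not measurements of any particular model:

```js
// estimate-tokens.js: rough check of whether a codebase fits in a model's
// context. Run with: node estimate-tokens.js /path/to/repo
const fs = require('fs');
const path = require('path');

const CHARS_PER_TOKEN = 4;      // rough average for source code (assumption)
const CONTEXT_WINDOW = 200_000; // tokens; adjust for the model/RAM you have

// Sum the byte size of source files under a directory, recursively.
function totalChars(dir) {
  let sum = 0;
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) sum += totalChars(full);
    else if (/\.(c|cc|cpp|h|hpp|js|ts)$/.test(entry.name))
      sum += fs.statSync(full).size;
  }
  return sum;
}

const tokens = totalChars(process.argv[2] || '.') / CHARS_PER_TOKEN;
console.log(
  `~${Math.round(tokens).toLocaleString()} tokens`,
  tokens > CONTEXT_WINDOW ? '(does not fit in one context window)' : '(fits)'
);
```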
You can't train an LLM on it because you don't have the budget for it.
You could maybe get an LLM to generate code by prompting it with "this is the custom object allocation function, these are the basic GUI classes, now generate me the boilerplate for this new dialog I'm supposed to do". Unfortunately that takes as long as or longer than doing it yourself, and you can't trust the output to boot.
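That workflow looks roughly like the sketch below: read the relevant definitions, paste them ahead of the request, and send it to whatever OpenAI-compatible endpoint you have access to. The file names, model name, and dialog description are hypothetical stand-ins for the custom codebase; the point is that assembling this context and then reviewing the output is often slower than writing the dialog by hand:

```js
// gen-dialog.js: sketch of the "paste the relevant definitions, ask for
// boilerplate" workflow (Node 18+ for global fetch). Everything named here
// is a placeholder, not a real file or a recommended model.
const fs = require('fs');

const context = [
  'src/core/alloc.h',   // the custom object allocation function (hypothetical path)
  'src/gui/widgets.h',  // the basic GUI classes (hypothetical path)
].map((f) => `// ${f}\n${fs.readFileSync(f, 'utf8')}`).join('\n\n');

const prompt =
  `${context}\n\n` +
  'Using the allocation function and GUI classes above, generate the ' +
  'boilerplate for a new settings dialog with OK/Cancel buttons.';

fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: 'gpt-4o', // placeholder; any chat model on a compatible endpoint
    messages: [{ role: 'user', content: prompt }],
  }),
})
  .then((r) => r.json())
  .then((data) => console.log(data.choices[0].message.content));
```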