Humans are unreliable in predictable ways. This makes review relatively painless since you know what to look for, and you can skim through the boilerplate and be pretty confident that it's right and isn't redundant/insecure, etc.
LLMs can use linters and type checkers, but getting past them often times leads it down a path of mayhem and destruction, doing pretty dumb things to get them to pass.
LLMs can use linters and type checkers, but getting past them often times leads it down a path of mayhem and destruction, doing pretty dumb things to get them to pass.