Yeah I mean all that is basically true – for code. The tools work, if you know how to work them.
Of course, is this going to put programmers out of work? Yes. Is it profitable for the same companies that are causing unemployment in other fields? Yes. So it’s not as though there isn’t blood on your hands.
TBH “AI” is going to create more jobs for devs. I tried using “AI” once for coding, and it took me more time to debug the generated code than it would have taken to google an alternative solution. It might be good for boilerplate, summarizing Stack Exchange answers, etc. But in reality you can’t code anything original or worthwhile with statistics.
If you use LLMs, you should use them primarily in ways where it’s easier to confirm the output is valid than it is to create the output yourself. For instance, “what API call in this poorly-documented system performs <some task>?”
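A minimal sketch of that “easy to verify” workflow (the specific question here is just an illustrative stand-in, not something from the thread): say an LLM claims that `os.path.commonpath`, not `os.path.commonprefix`, gives you the shared directory of two paths. You don’t have to trust the claim; a two-line check settles it.

```python
# Verifying an LLM's API suggestion is cheap compared to finding the
# right call yourself in poorly documented code.
import os.path

paths = ["/usr/lib", "/usr/local"]

# commonprefix is a plain character-wise string operation, so it can
# return something that isn't a real directory at all.
print(os.path.commonprefix(paths))  # "/usr/l"

# commonpath is path-aware, which is what was actually asked for.
print(os.path.commonpath(paths))    # "/usr"
```

The asymmetry is the whole point: confirming the answer took seconds, while digging the right call out of the docs might not have.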
There is no consistent definition of AI so you might as well drop the quotation marks, lest you be prescriptivist.