
"But LLMs took it a notch even further, coders have started morphing into LLM prompters today, that is primarily how software is getting produced. They still must baby sit these LLMs presently, reviewing and testing the code thoroughly before pushing it to the repo for CI/CD. A few more years and even that may not be needed as the more enhanced LLM capabilities like "reasoning", "context determination", "illumination", etc. (maybe even "engineering"!) would have become part of gpt-9"
"The problem is that even though the end result would be a very robust running program that reeks of creativity, there won't be any human creativity in that. The phrase dismal science was first used in reference to economics by medieval scholars like Thomas Carlyle. We can only guess their motivations for using that term but maybe people of that time thought that economics was somehow taking away the life force from society of humans, much similar to the way many feel about AI/LLM today?"
Programmer roles were becoming commoditized even before AI, with recruiters labeling people as 'Python coder', 'PHP scripter', 'dotnet developer' and the like. Large language models have accelerated that trend by turning many coders into LLM prompters who orchestrate code generation. Current practice still requires programmers to review, test and baby-sit generated code before it reaches CI/CD. Future LLMs may gain reasoning, context determination and other advanced capabilities that reduce the need for human oversight. The resulting software may be robust and creative in its output yet contain no human creativity. Economic analogies liken this loss of human agency to earlier critiques of the commodification of social life. Surviving the shift means adapting and learning LLM skills.