This is AI, but it felt a lot more like a guy with broken gear.
The article certainly feels blasé ^^, I think the most objectionable part is:
Large language models shift even more of that time into investigation, because the moment the team gets a chance to build, they turn around and ask ChatGPT (or Copilot, or Devin, or Gemini) to do it. When we learn that we need to integrate with Google Cloud Storage, or spaCy, or SQS queues, or Firebase? Same thing: turn around and ask the LLM to draft the integration.
Now clearly (to me) the author isn’t happy about this, but I think they are giving up hope on the direction of the profession too soon. There are still plenty of people happy enough to implement things themselves.