pleroma.debian.social

One consequence I thought would follow from vibecoding was that systems are not going to get simpler: when two bureaucratic processes interact, and both are aided by automation, there's less incentive to cut through the bureaucracy and simplify the underlying processes

What I hadn't realised was that things will get worse, not better: I'm now seeing systems that are complex beyond human reasoning, because humans *aren't reasoning about them*. >1kloc bash scripts embedded within >2kloc YAML files, neither independently developed nor tested, stuff like that
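For illustration, the pattern described above tends to look like this hypothetical, heavily abbreviated CI fragment (job names and commands are made up; the real offenders run to thousands of lines): the script exists only as a YAML string, so it cannot be run, shellcheck'd, or tested on its own.

```yaml
# Hypothetical GitLab-CI-style job, for illustration only.
# The shell logic lives inline in the YAML rather than in a
# standalone, independently testable script file.
deploy:
  stage: deploy
  script:
    - |
      set -euo pipefail
      for region in eu-west us-east; do
        # ...imagine hundreds more lines of embedded bash here...
        echo "deploying to ${region}"
      done
```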

@jmtd And therefore easier for malicious code to appear unnoticed ...

@jmtd This is possibly what distresses me the most about the rampant spread of Gen AI in general -- not just vibe coding: its propensity to seemingly ease the pain of complying with what is already perceived as a purely performative task, compulsory yet ultimately meaningless: filling in your quarterly performance review, writing a cover letter, posting on LinkedIn, etc...
But if it is indeed pointless, why even bother in the first place? And if it had not been completely devoid of purpose before, you can be certain that it is now that any hint of agency has been removed from the humans involved, so why continue?

I don't know how popular this cartoon was outside of France, but I find myself thinking about it a lot these days:
https://en.wikipedia.org/wiki/Les_Shadoks

@fred perhaps there is hope: for those of us inclined and sufficiently skilled, there remains a need for taking a machete to processes, and understanding how to do things simply. We’ll be drowning in work!

@jmtd yeah, I'm sure the pendulum will swing the other way sooner or later.
But god am I annoyed that we have to live through such incredibly stupid times!!!

@jmtd @fred and, at the same time, I see fellow developers that I respect wonder whether there's still a future in programming…
I can't fathom how people do not realize that LLMs only somewhat work because they are trained on existing stuff. If "original" stuff ceases to be created, the whole system collapses (and it has been shown that model quality degrades significantly when LLMs are trained on LLM output)…

@fred @jmtd it might take a high-profile incident with fatalities, though… Imagine vibe-coded nuclear power plant control code or an airplane autopilot system (though I guess Boeing "innovated" there already)…