> We are rapidly moving from deterministic engineering to probabilistic engineering, and our tools, our training, and our organizational instincts are still built for the old paradigm.
There's some bad news: it has never been non-probabilistic engineering.
The key word is fault tolerance.
Engineering has never been fully deterministic, and the same goes for running the systems.
So nothing's changed in this area.
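The fault-tolerance point can be made concrete with a minimal sketch: wrap an unreliable operation in a retry loop, which is how we've always turned probabilistic components into acceptably reliable systems. The `flaky_op` function here is hypothetical, standing in for any network call or disk write with a made-up 70% success rate.

```python
import random

# Hypothetical flaky operation: succeeds ~70% of the time,
# standing in for any network call, disk write, etc.
def flaky_op(rng):
    if rng.random() < 0.7:
        return "ok"
    raise IOError("transient failure")

def with_retries(op, rng, attempts=5):
    """Retry a probabilistic operation until it succeeds or we give up."""
    last_err = None
    for _ in range(attempts):
        try:
            return op(rng)
        except IOError as err:
            last_err = err  # transient: try again
    raise last_err

rng = random.Random(42)
print(with_retries(flaky_op, rng))
```

With a 30% per-attempt failure rate, five attempts drive the overall failure probability down to 0.3^5, about 0.24% — deterministic-looking behavior built on top of a probabilistic substrate.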
Ancalagon 14 hours ago [-]
Interesting note: Those seniors with the old-guard set of skills will also become more valuable to the sectors of software still defined by determinism, as the juniors trained from the probabilistic era will find they are unable to transition effectively to the deterministic jobs.
Seniors today will effectively command the highest value for the foreseeable future.
joshkel 19 hours ago [-]
Reading this article has given me a profound realization: As a software user, I don't want the products I use to ship new features at "three, five, or ten times what they shipped a year ago," at the expense of "probabilistic" stability and reliability.
I want software that works.
Yet the economics of the software industry, and perhaps the economy as a whole, seem to inexorably push toward the former.
7777777phil 15 hours ago [-]
>Generation became cheap, validation didn't
This is basically the whole post, imo. It also maps to why productivity hasn't really moved despite 93% adoption: the oversight bandwidth eats the generation gains.
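The "oversight eats the gains" claim is an Amdahl's-law argument: if only the generation part of the work accelerates, the review part caps the end-to-end speedup. A back-of-envelope sketch with illustrative (not measured) numbers:

```python
# Back-of-envelope: if a feature used to be, say, 60% writing code and
# 40% review/validation, and generation gets 10x faster while review
# doesn't, the end-to-end speedup is capped well below 10x (Amdahl's law).

def end_to_end_speedup(gen_frac, gen_speedup):
    """Overall speedup when only the generation fraction accelerates."""
    review_frac = 1 - gen_frac
    return 1 / (gen_frac / gen_speedup + review_frac)

# Illustrative numbers, not measurements:
print(round(end_to_end_speedup(0.6, 10), 2))  # ~2.17x, not 10x
```

Even infinite generation speed with that split only yields 1 / 0.4 = 2.5x, which is why review, not typing, is the bottleneck.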
Jevons for code is right but the bottleneck is review, not typing: https://philippdubach.com/posts/93-of-developers-use-ai-codi...