Dawid Kedzierski's Newsletter

February 11, 2024

The future of work with AI

It's yet another day when I haven't needed the special product formulas from algebra class - (a + b)² and friends. Does this mean that our school education was a complete waste of time? Perhaps, perhaps not.

What all those math lessons, including algebra, geometry, and maybe even statistics, were meant to teach us was critical thinking.

With the rise of modern AI, it's easy to fall into the trap of thinking that learning a new programming language or even studying Computer Science is no longer worth it. After all, AI is bound to do most, if not all, of the work eventually, right?

Mainstream media is flooded with articles describing how our productivity will increase. New research is published daily on this topic. Apparently, in a few years, all desk jobs will be somehow affected (improved?) by AI.

I don't know what the future will look like, and neither do the so-called experts. However, it seems to me that in this whole discussion, we have forgotten what we already know: our basic human nature. We are lazy. If there's a shortcut available, we choose it. If we don't need to think about something, we won't. If someone else can do the job and we can afford it, we hire them. This is how evolution has shaped us, regardless of what productivity gurus try to tell us.

Because of this, in software engineering we observe several recurring phenomena.

Cargo cult programming - emerges when developers write code without fully understanding what it does. A developer borrows code from elsewhere and, through trial and error, tweaks and tests it until it appears to function - or nearly so. Afraid to break what barely works, they stop adjusting and leave behind redundant lines of code, like packing a suitcase with items you'll never use.
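
A hypothetical sketch of what that can look like; parse_user_id and its redundant steps are invented purely for illustration:

```python
import json

# A cargo-cult helper: it "works", but half of it was copied from a forum answer
# and nobody dares to remove the parts that do nothing.
def parse_user_id(raw):
    data = str(raw).strip()
    data = data.encode("utf-8").decode("utf-8")  # redundant round-trip, kept "just in case"
    try:
        json.loads('"{}"'.format(data))          # borrowed "validation" that validates nothing useful
    except ValueError:
        pass                                     # silently swallowed; removing it once "broke something"
    return int(data)
```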

Shotgun debugging - involves making random, unsystematic changes to software, hoping to accidentally fix a bug.
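
A made-up miniature of the same habit; the average_rating function and its "attempts" are invented for illustration:

```python
# Shotgun debugging in miniature: the comments record random tweaks,
# none guided by an actual diagnosis of why the result was off.
def average_rating(ratings):
    # attempt 1: sum(ratings) / len(ratings)        -> "sometimes looks wrong"
    # attempt 2: sum(ratings) / (len(ratings) - 1)  -> "now it's worse"
    # attempt 3: add 0.5 and truncate               -> "passes my one test case"
    return int(sum(ratings) / len(ratings) + 0.5)
```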

Copy-and-paste programming - involves repeatedly using existing code snippets by copying and pasting them instead of creating reusable functions or components. This practice leads to code duplication, increased error rates, and maintenance challenges.
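
A hypothetical sketch of the duplication this produces; the handlers and the email check are invented for illustration:

```python
# The same validation pasted into three handlers. A bug in the check
# now has to be found and fixed three times.
def create_user(payload):
    if not payload.get("email") or "@" not in payload["email"]:
        raise ValueError("invalid email")
    # ... create the user ...

def update_user(payload):
    if not payload.get("email") or "@" not in payload["email"]:
        raise ValueError("invalid email")
    # ... update the user ...

def invite_user(payload):
    if not payload.get("email") or "@" not in payload["email"]:
        raise ValueError("invalid email")
    # ... send the invite ...
```

The reusable alternative would be a single validate_email helper that all three handlers call.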

Golden Hammer - occurs when a familiar technology or solution is used for every problem, regardless of its suitability, under the belief that it is universally effective. This approach can lead to inefficient or inadequate solutions.
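
A small invented sketch in which the familiar tool is regular expressions, reached for even where a simpler or more suitable option exists:

```python
import re

# Golden hammer: regex for everything.
def is_positive_int(s):
    return re.fullmatch(r"\d+", s) is not None  # str.isdigit() would be clearer

def parse_csv_line(line):
    return re.split(r",", line)                 # breaks on quoted fields; csv.reader handles them
```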

And many, many more.

My prediction for the future of work with AI is that, despite the initial increase in productivity, it will not bring about significant long-term changes. As AI models continue to advance, we will place more trust in their outputs. This will undermine the critical thinking required to evaluate those outputs. Cargo cult programming and other anti-patterns will become more prevalent, counteracting the initial productivity gains. As a result, these opposing forces will eventually balance each other out.

In conclusion, while AI may deliver an initial boost in productivity, critical thinking and human judgment remain essential. Relying on AI outputs without questioning them invites anti-patterns like cargo cult programming and hinders long-term progress. The challenge for each of us is to keep developing our skills and to strike a balance between leveraging AI and preserving our own ability to think critically.
