AI is a big fat lie!

I think it really depends on how you utilize AI.

If someone treats it like magic and blindly vibe codes everything into production, then yes, they’re going to hit that 1% wall hard. Especially in ecosystems like WordPress or Envato where a tiny architectural mistake, security issue, or guideline violation can mean instant rejection. In those environments, deep understanding absolutely matters.

But if AI is used as an assistant rather than a replacement for thinking, it’s a different story. Drafting boilerplate, refactoring repetitive patterns, generating test scaffolding, exploring alternative implementations, speeding up documentation: that’s where it shines. The senior dev still needs to review, structure, and make final decisions. The tool doesn’t remove skill; it amplifies it when skill is already there.

About AGI and curing cancer, I agree the hype is out of control. Bold claims get headlines and funding. That doesn’t mean the current models are useless; it just means they’re not what marketing says they are. They’re probabilistic systems, not reasoning minds. Expecting them to replace engineering judgment is unrealistic.

Also, regarding the “unsalvageable codebase” point, that’s less about AI and more about process. If AI-generated code isn’t reviewed, versioned, and incrementally integrated like any other contribution, it will absolutely become chaos. But that’s true for human-written rushed code too. Version control, code reviews, and architecture discipline still apply.

I don’t think the industry is falling apart. It’s shifting. Junior roles may change, but fundamentals are still fundamentals. The devs who understand systems, constraints, performance, and tradeoffs will remain valuable. AI doesn’t remove the need for that. It just changes how fast we can move.

So yeah, these tools are useful. They’re not magic. They’re not AGI. They’re not the end of developers either. It really comes down to how responsibly and strategically we use them.


my man never had the luxury of doing an entire codebase refactor of a human-written project every 1-2 years because it became just as unmanageable, regardless of whether AI was involved :sob:

(@makc3d made me reply in this thread since he’s super impressed with my AI-waifu army coding 2 games per week - I have absolutely no opinion on AI, but I do have a bit of one about the claim that people write better code than AI :sob:)


@Umbawa_Sarosong

I can see that you’ve been using agentic coding, and you clearly have a deep understanding of it. I agree with your thoughts.

Again, I find AI really useful. My issue is with the way it’s advertised. These CEOs are literally selling us lies — and it works. That’s my problem. And of course, they know this better than we do.

They’ve been saying since 2024 that in six more months software engineering won’t be needed anymore, and that anybody — literally anybody — will be able to create any app or idea with just a few words, like “Build me YouTube.” This will never happen, and we all know it, even with a detailed description, which is ridiculous if you think about it.

I am sick and tired of these deregulated CEOs who will do literally anything to get more money. I know it’s business, but it’s unfair and disgusting. That’s why I try to speak up here and there, trying to do my humble part.

I have been fighting Lyme disease since 2014, and because of that I’ve developed a deeper understanding of how the human body works. I had to learn — this condition is not recognized by many doctors in my country. There are only two doctors who treat it and understand its implications. But if I go to any hospital in the UK, they will all say I am healthy, even though this illness has almost killed me more times than I can remember.

We are incapable of precisely killing a virus or bacterium that has invaded the immune system, and yet they claim they will cure cancer. Really?

The way they advertise that AI will cure cancer, considering that it does not truly understand the actual issues, is both funny and sad at the same time. Our body is a divine marvel. When you have cancer, the issue is not just the tumor, it’s a whole range of underlying problems that led you there. So the approach itself, “cure cancer,” is flawed.

Anyway, this will never happen with LLMs. And if they do achieve AGI, which I doubt will happen in our lifetime, we all know what tends to happen with new technology: it is often used to do harm before anything else.


You could say that, in a way, AI averages all the code it was trained on, and then its output is average. As in, half of the people write better code than that. And the other half, well, you know.
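There is a small statistical nuance hiding in that quip: “half of people write better than that” describes the *median*, while a true average (mean) can be dragged around by outliers, so the split need not be half and half. A minimal sketch with made-up quality scores (the numbers are purely illustrative):

```python
from statistics import mean, median

# Made-up "code quality" scores for ten developers; assumed to be
# skewed upward by a couple of experts, as skill distributions often are.
scores = [2, 3, 3, 4, 4, 5, 5, 6, 9, 10]

med = median(scores)  # 4.5 -- by definition, half the scores sit above this
avg = mean(scores)    # 5.1 -- dragged upward by the top scorers

above_median = sum(1 for s in scores if s > med)  # 5 of 10
above_mean = sum(1 for s in scores if s > avg)    # only 3 of 10

print(med, avg, above_median, above_mean)
```

So if a model really did produce "mean" code from a skewed corpus, fewer (or more) than half of developers might beat it; the fifty-fifty split only follows if its output sits at the median.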


I get where you’re coming from about the marketing. A lot of the messaging from big tech definitely exaggerates what AI can actually do today. Saying “anyone can build YouTube with a prompt” is obviously hype. Real software is messy, full of edge cases, architecture decisions, debugging, performance work, and long maintenance cycles. No model is replacing that level of engineering anytime soon.

But I also don’t think AI itself is a lie or a bubble. Underneath the hype there is real capability. If you look at things like code assistance, documentation help, refactoring ideas, shader experimentation, or exploring APIs faster, it’s genuinely useful. It’s not magic and it’s not autonomous engineering, but it’s a powerful tool if you already know what you’re doing. I treat it more like a very fast research assistant or pair programmer that sometimes gives good ideas and sometimes gives nonsense.

The “average code” point is actually interesting and partly true. Models tend to generate safe, common patterns. That means experienced developers can absolutely write better code in many cases. But the productivity gain comes from speed. You can sketch systems faster, test approaches quickly, and iterate more. Then the human still has to refine, optimize, and make the real design decisions.

So I think the reality is somewhere in the middle. The CEOs oversell it, but the technology itself is not fake. It’s already changing how people prototype, learn libraries, and build tools. It just doesn’t eliminate the need for engineers, especially in complex areas like graphics, engines, and real-time systems like the stuff we build with three.js.

In the end it’s still the developer driving the process. AI just shortens some of the distance between idea and implementation.
