The True Danger of AI and Why 'On Track' Means Nothing
Published on 30.04.2026
TLDR
HackerNoon's April 29 edition centers on two problems that look different on the surface but share something underneath: systems that hide what's really happening until it's too late to act. One is civilization-scale, one is a project status meeting. Both deserve more honest attention than they usually get.
The Real AI Risk Is Not What Science Fiction Prepared Us For
I keep thinking about how much mental energy the tech industry has poured into Skynet scenarios, misaligned superintelligences, and robot uprisings. Meanwhile, the actual risk might be far quieter and harder to reverse.
Han Be, a professor of engineering physics who also writes sci-fi, puts it this way: AI's true danger isn't a dramatic takeover; it's a slow collapse of human purpose. Work isn't just income. It's dignity. It's structure. It's a reason to get out of bed. If automation removes the need for most humans in production, societies face a question they've never had to answer at this scale: what do people do? What gives life meaning when the economy no longer needs most of us to participate?
There's a companion piece by Anton Voichenko on the tax implications of replacing workers with AI, which is actually the more concrete thread of the same conversation. If you automate away jobs, the workers don't pay income tax, don't spend money the same way, and the whole fiscal foundation that funds social services starts to crack. Experts have been dancing around this, but the numbers aren't hard to see. The economic plumbing of this transition matters as much as the philosophical questions about purpose and survival.
Neither of these pieces is doom-mongering. They're honest about a transition that's already underway. I'd rather read this kind of grounded concern than another press release about AI changing everything for the better.
"On Track" Is a Status That Tells You Nothing
Constantine, a project manager at Wargaming, makes a point that every developer and every PM has lived through: "on track" is the least useful status report you can give.
Think about it. "On track" sounds reassuring, but it answers none of the questions that matter. On track compared to what baseline? On track by whose estimate? What's the risk if one dependency slips? The problem is that "on track" is comfortable for everyone in the room. No one gets uncomfortable questions. No one has to admit uncertainty. The PM gets to move on to the next item.
Game production is a particularly brutal context for this because games are combinatorially complex. Art, engineering, narrative, audio, QA, and platform certification are all coupled in non-obvious ways. A status of "on track" for the character animation team tells you nothing about whether the combat system is actually ready to receive those animations. The failures don't announce themselves early. They accumulate quietly behind a wall of optimistic status reports and then materialize as a crisis two weeks before a milestone.
The fix isn't complicated but it does require a bit of courage: make status reports answer specific questions. What got done? What didn't? What's the actual risk between now and the next checkpoint? "On track" answers none of those. Replace it with something honest, even if the honest answer is "we're probably fine but here's the thing that worries me."
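To make that concrete, here is a minimal sketch of what a question-driven status update might look like if you forced it into a data structure. This is purely illustrative: the class and field names are my own invention, not anything from the article, but they capture the three questions a useful report has to answer.

```python
from dataclasses import dataclass

@dataclass
class StatusUpdate:
    """A status report that forces answers to specific questions
    instead of a bare 'on track'. All names here are hypothetical."""
    done: list[str]      # what actually got finished since last checkpoint
    not_done: list[str]  # what was planned but slipped
    risks: list[str]     # concrete risks between now and the next milestone

    def is_informative(self) -> bool:
        # A report that says nothing at all is the structured
        # equivalent of "on track" and should be rejected.
        return bool(self.done or self.not_done or self.risks)

update = StatusUpdate(
    done=["character idle animations"],
    not_done=["hit-reaction blend tree"],
    risks=["combat system API not frozen; animations may need rework"],
)
print(update.is_informative())  # True
```

The point of the structure isn't the code; it's that an empty report is now visibly empty, and "we're probably fine but here's the thing that worries me" has an obvious place to live.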
Key Takeaways
- AI's most serious threat isn't dramatic or sudden. It's the gradual removal of the economic and social role that gives most people structure and dignity.
- The tax revenue question around AI-driven job displacement is underexplored and genuinely consequential. Governments can't fund welfare systems on a shrinking income tax base.
- "On track" as a project status is a false comfort. It signals that no one in the room wants to have the harder conversation.
- Specific, risk-aware status updates are harder to give but far more useful to everyone who has to make decisions based on them.
- Both articles this week are really about the same thing: the gap between what systems report and what's actually true underneath.