Europe's Open Source Legacy, React Stack 2026, and Why AI Won't Replace Developers Yet

Published on 27.01.2026

Why Is There No European Google?

TLDR: Europe hasn't produced tech giants like Google or Facebook, but it has created fundamental infrastructure that powers the modern internet — the World Wide Web, Linux, Git, and more — all released as open source rather than privatized for profit.

Europe's relationship with technology is one of the most fascinating and misunderstood stories in our industry. The standard narrative goes something like this: America builds the big companies, and Europe just regulates them. But that framing completely misses what Europe actually contributed. And what it contributed is, frankly, more important than any single company.

Think about it. The World Wide Web — HTTP and HTML — came from Tim Berners-Lee at CERN. Linux, the operating system that runs basically everything from Android phones to cloud servers to supercomputers, was created by Linus Torvalds in Finland. Git, the version control system that every developer on the planet uses daily, also came from Torvalds. VLC, Mastodon, LibreOffice — all European. And here's the crucial difference: every single one of these was released as open source, as a common good for humanity, rather than locked behind a corporate paywall.

This raises an uncomfortable question about what we actually value in tech. We celebrate the billionaires who privatized innovations, but the Europeans who gave away arguably more foundational technology get treated as footnotes. CERN released the Web's core software into the public domain in 1993. Linux is open source. Git is open source. Without these three things, Google, Facebook, and Amazon could not exist in their current form.

Now, there are legitimate structural reasons why Europe hasn't produced the same kind of mega-corporations. Fragmented markets across different languages and regulatory environments, less aggressive venture capital culture, and different attitudes toward work-life balance all play a role. But framing this as a failure misses the point entirely. Europe chose a different model — one focused on shared infrastructure rather than private monopolies — and that model arguably created more lasting value.

For architects and team leads, this is worth reflecting on when making build-vs-buy decisions. The open source model that Europe championed has become the backbone of modern software development. Every time you spin up a Linux container, push to a Git repo, or serve a web page, you're standing on European open source shoulders.

Key takeaways:

  • Europe created the foundational infrastructure of the modern internet: the Web, Linux, Git, and more
  • These were all released as open source commons rather than privatized for profit
  • The absence of European tech giants doesn't mean absence of European tech innovation — it reflects a different value system
  • Structural factors like fragmented markets and different VC cultures also contribute to the difference

Tradeoffs:

  • Open source maximizes global impact and adoption but sacrifices the concentrated wealth that funds further R&D
  • Privatization enables rapid corporate growth but limits universal access to foundational tools

Link: Why there's no European Google?


My React Ecosystem Stack in 2026

TLDR: A developer's preferred React ecosystem for 2026 looks remarkably similar to 2025 — Zustand, TanStack Query, Tailwind, shadcn/ui, Vitest, and Vite remain the go-to choices, signaling that the ecosystem has reached a period of stability.

Here's something that should make every React developer feel a little better about their technology choices: the React ecosystem in 2026 is largely unchanged from 2025. That's not a sign of stagnation — it's a sign of maturity. The wild churn of frameworks and libraries that defined the React world for years seems to be slowing down, and the winners are becoming clear.

The stack breakdown is telling. Zustand for client state management — clean, minimal, no boilerplate. TanStack Query (formerly React Query) for server state — because mixing server cache with client state was always a terrible idea, and TanStack Query solved it elegantly. Tailwind CSS paired with shadcn/ui for styling — utility-first CSS with copy-paste components that you actually own. Vitest for testing, which has effectively replaced Jest in the React ecosystem thanks to its speed and Vite-native integration. And Vite itself for bundling, which continues to be the default choice.
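To make the client/server state split concrete, here's a minimal sketch of what this pairing looks like in practice. The store shape, endpoint, and component names are illustrative assumptions, not from the article:

```tsx
// Sketch assumes React 18+, zustand v4+, and @tanstack/react-query v5,
// with a <QueryClientProvider> mounted higher in the tree.
import { create } from "zustand";
import { useQuery } from "@tanstack/react-query";

// Client state: ephemeral UI state lives in a small Zustand store.
interface FilterState {
  showArchived: boolean;
  toggleArchived: () => void;
}

const useFilterStore = create<FilterState>((set) => ({
  showArchived: false,
  toggleArchived: () => set((s) => ({ showArchived: !s.showArchived })),
}));

// Server state: remote data is fetched and cached by TanStack Query,
// never copied into the client store.
function useProjects(showArchived: boolean) {
  return useQuery({
    queryKey: ["projects", { showArchived }],
    queryFn: async () => {
      const res = await fetch(`/api/projects?archived=${showArchived}`);
      if (!res.ok) throw new Error("Failed to load projects");
      return res.json() as Promise<{ id: string; name: string }[]>;
    },
  });
}

function ProjectList() {
  const { showArchived, toggleArchived } = useFilterStore();
  const { data, isPending, error } = useProjects(showArchived);

  if (isPending) return <p>Loading…</p>;
  if (error) return <p>Something went wrong.</p>;

  return (
    <>
      <button onClick={toggleArchived}>
        {showArchived ? "Hide" : "Show"} archived
      </button>
      <ul>{data.map((p) => <li key={p.id}>{p.name}</li>)}</ul>
    </>
  );
}
```

Notice that the query result never touches the Zustand store; the filter flag simply becomes part of the query key, and TanStack Query handles caching and refetching.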

What's notable is what's NOT on this list. No Redux. No CSS-in-JS solutions like styled-components or Emotion. No Create React App (obviously). No Webpack. The ecosystem has collectively moved past these tools, and the replacements are genuinely better where it counts: simpler APIs, better performance, smaller bundles, a faster dev experience.
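As a taste of the "simpler APIs" point, here's what a typical Vitest test looks like: no separate transform or Babel configuration, because tests run through the same Vite pipeline as the app. The formatPrice helper is a made-up example:

```ts
// formatPrice.test.ts
import { describe, expect, it } from "vitest";

// A trivial helper under test (hypothetical, for illustration only).
function formatPrice(cents: number): string {
  return `$${(cents / 100).toFixed(2)}`;
}

describe("formatPrice", () => {
  it("formats whole dollars", () => {
    expect(formatPrice(500)).toBe("$5.00");
  });

  it("keeps cents precision", () => {
    expect(formatPrice(1999)).toBe("$19.99");
  });
});
```

The API is deliberately Jest-compatible, which is a big part of why migration to Vitest has been so painless for most teams.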

For teams evaluating their stack, this stability is actually great news. If you adopted these tools in 2024 or 2025, you made solid choices that are holding up. If you're still on older tools, you have clear migration targets that the community has validated. The React ecosystem has finally reached a place where you can make technology choices with reasonable confidence that they'll still be the right choices a year or two from now.

The one area that continues to evolve is the meta-framework layer — Next.js, Remix, and others are still actively competing and changing. But at the library level, the stack has stabilized in a way that's genuinely productive for teams trying to ship products rather than chase trends.

Key takeaways:

  • The React ecosystem has stabilized — 2026 choices are nearly identical to 2025
  • Zustand, TanStack Query, Tailwind/shadcn, Vitest, and Vite form the consensus modern stack
  • Redux, CSS-in-JS, and Webpack have been largely replaced by simpler alternatives
  • Ecosystem stability means teams can invest confidently in current tooling

Link: My React ecosystem stack in 2026


Software as Fast Fashion

TLDR: The software industry is adopting a "fast fashion" approach through AI-generated code, prioritizing quick, disposable solutions over quality and maintainability — and this mirrors fast fashion's environmental and quality problems in concerning ways.

This is one of those analogies that lands so well it's almost uncomfortable. The software industry's embrace of AI-generated "vibe coding" is starting to look a lot like fast fashion — cheap, quick, disposable, and with hidden costs that someone else ends up paying for.

The pattern is unmistakable. LLM-powered code generation produces solutions that work in the moment but are inconsistent, inefficient, and fundamentally designed to be thrown away rather than maintained. Developers prompt an AI, get something that passes the immediate test, ship it, and move on. When it breaks or needs changes, it's often easier to regenerate than to understand and fix what's there. Sound familiar? It's the same logic that makes it cheaper to buy a new fast-fashion shirt than to repair the one you have.

The problem with this model — in both fashion and software — is that the costs don't disappear. They just get displaced. In fashion, it's environmental waste and exploitative labor. In software, it's technical debt, security vulnerabilities nobody reviewed, and systems that become incomprehensible to the teams that depend on them. The "disposable code" mindset works fine for prototypes and experiments. It falls apart catastrophically when that code ends up in production, which it always does, because temporary solutions have a way of becoming permanent.

For architects and engineering leaders, this trend demands a deliberate response. You need clear policies about where AI-generated code is acceptable and where it isn't. You need review processes that account for the specific failure modes of generated code — inconsistent patterns, cargo-culted solutions, subtle misunderstandings of your domain. And you need to resist the organizational pressure to treat faster output as automatically better output.

The most productive teams will be the ones that use AI code generation strategically — for boilerplate, for exploration, for first drafts — while maintaining human ownership of architecture, design decisions, and code quality standards. The worst outcome is sleepwalking into a codebase that nobody understands because nobody actually wrote it.

Key takeaways:

  • AI-generated "vibe coding" produces disposable code with hidden maintenance costs
  • The fast fashion analogy is apt: quick and cheap upfront, expensive in aggregate
  • Technical debt from generated code is harder to manage because nobody deeply understood it to begin with
  • Teams need explicit policies about where AI generation is appropriate vs. where human craftsmanship is required

Tradeoffs:

  • AI code generation maximizes initial velocity but sacrifices long-term maintainability and team understanding
  • Treating code as disposable enables faster iteration but accumulates invisible technical debt

Link: Software as Fast Fashion


AI Will Replace Developers in 6 Months. Again...

TLDR: Tech CEOs at Davos are again claiming AI will replace developers within 6-12 months — the same prediction they made a year ago. Current transformer architectures have fundamental limitations that make this unlikely.

If you're experiencing déjà vu reading this headline, that's because you should be. The same prediction — "AI will replace developers in 6 months" — was made at Davos last year. And the year before. The developers are still here. The predictions keep coming back like a bad recursive function with no base case.

The latest round features the usual suspects: tech CEOs and executives at the World Economic Forum making bold claims about the imminent obsolescence of software developers. It's worth asking why these predictions keep being wrong, because understanding the failure mode is more instructive than just laughing them off.

Current transformer-based AI architectures have a fundamental limitation that gets glossed over in these conversations: they rearrange existing knowledge probabilistically. They're extraordinarily good at pattern matching, interpolation, and generating plausible-looking outputs based on training data. What they cannot do is generate genuinely novel ideas, reason about truly new problem domains, or handle the kind of ambiguous, context-dependent decision-making that makes up the hard part of software engineering. Writing code is maybe 20% of what developers do. The other 80% — understanding requirements, making architectural decisions, debugging novel issues, navigating organizational complexity — is exactly what current AI architectures are worst at.

That said, dismissing AI entirely is also wrong. AI is genuinely transforming how developers work, even if it's not replacing them. It's handling boilerplate, accelerating exploration, catching obvious bugs, and making documentation more accessible. The honest answer — which doesn't make for good Davos soundbites — is that AI is making developers more productive, not obsolete. The ratio of what one developer can accomplish is changing, but the need for human judgment, creativity, and ownership is not.

For engineering leaders and architects, the practical takeaway is to invest in AI-augmented workflows without reorganizing around the assumption that developers become unnecessary. Upskill your teams on effective AI tool usage. Build processes that leverage AI for what it's good at. And be deeply skeptical of anyone selling you on a future that conveniently requires buying their product.

Key takeaways:

  • "AI replaces developers in 6 months" is now an annual prediction that keeps not happening
  • Transformer architectures excel at pattern matching but cannot generate truly novel solutions
  • AI is genuinely making developers more productive, even if it's not replacing them
  • Writing code is a small fraction of software engineering — the hard parts remain deeply human

Link: AI will replace developers in 6 months. Again...


Migrate to ESLint v10.x

TLDR: ESLint v10.0.0 brings breaking changes including mandatory flat config (eslint.config.js), Node.js v20.19+ requirement, and new configuration lookup behavior starting from each linted file's directory.

ESLint v10 has landed, and it's the kind of major version update that actually matters — not just a version bump, but a genuine architectural shift in how the tool works. If you've been putting off the migration from eslintrc to flat config, your procrastination grace period is officially over.

The headline changes are significant. First, the old eslintrc configuration format — the one with extends, plugins arrays, and overrides — is completely gone. Not deprecated, not legacy, gone. You must use the flat config format with eslint.config.js. Second, the minimum supported Node.js versions are now v20.19+, v22.13+, and v24+. If you're running anything older, you'll need to upgrade your runtime before you can upgrade your linter. Third, configuration file lookup now starts from each linted file's directory and walks up toward the root, rather than starting from the current working directory. This is actually a much more intuitive behavior, but it can break existing setups that depended on the old lookup order.

The flat config format is genuinely better than what it replaces. It's just JavaScript — no special JSON schema, no magical string-based plugin resolution, no confusing inheritance chains. You import plugins directly, compose configurations with spread operators, and the whole thing is transparent and debuggable. But migration can be painful, especially for large codebases with complex eslintrc configurations involving multiple extends, shared configs, and custom rules.
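For reference, a minimal flat config might look something like the sketch below. The specific plugin, file globs, and rule choices are illustrative assumptions rather than a prescribed setup; the structural points are that it's plain JavaScript, plugins are imported directly, and the exported array is just composed objects:

```js
// eslint.config.js: flat config is just an exported array of config objects.
import js from "@eslint/js";
import globals from "globals";

export default [
  // Start from the recommended base; no string-based "extends".
  js.configs.recommended,

  // Project-specific settings, composed as plain objects.
  {
    files: ["src/**/*.js"],
    languageOptions: {
      ecmaVersion: "latest",
      sourceType: "module",
      globals: { ...globals.browser },
    },
    rules: {
      "no-unused-vars": "warn",
      "prefer-const": "error",
    },
  },

  // What used to be "overrides" is just a later entry in the array.
  {
    files: ["tests/**/*.js"],
    languageOptions: {
      globals: { ...globals.node },
    },
  },
];
```

If you're starting from an existing eslintrc, running npx @eslint/migrate-config .eslintrc.json will generate a first draft of a file like this automatically.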

For teams planning the migration, the practical advice is: don't try to do it all at once. Start with the ESLint migration tool, which handles the straightforward cases. Then manually address the edge cases — custom rules, workspace-specific overrides, and CI pipeline changes. Test thoroughly, because linting behavior changes can quietly alter which code passes review. And update your Node.js version first if needed, so you're not fighting two battles simultaneously.

Key takeaways:

  • eslintrc format is completely removed — flat config (eslint.config.js) is now mandatory
  • Minimum Node.js requirement is v20.19+, v22.13+, or v24+
  • Config lookup now starts from the linted file's directory, not the working directory
  • The flat config format is more intuitive but migration from complex eslintrc setups takes effort

Tradeoffs:

  • Flat config is simpler and more transparent but requires a one-time migration effort that can be significant for large codebases
  • Stricter Node.js requirements ensure modern runtime features but force infrastructure upgrades

Link: Migrate to v10.x