
Snowflake on Why Data Engineering Fundamentals Define AI Success or Failure

Snowflake emphasizes that AI advancement depends on foundational data engineering practices as 72% of tech leaders recognize data engineers as integral to business success.


Decoded

Published Nov 26, 2025


4 min read

Data engineering, once dismissed as unglamorous infrastructure work, has become headline material as organizations rush to deploy AI systems. The sudden attention reflects a harsh reality: AI initiatives fail when data pipelines, governance, and lineage aren't solid. According to research cited by Snowflake, 72% of 400 surveyed technology leaders now view data engineers as integral to their business, marking a significant shift in how organizations value the discipline.

Veronika Durgin, writing for Snowflake's blog, applies a risk analysis framework to explain why AI projects struggle. The "knowns and unknowns" model divides organizational knowledge into four critical quadrants: what teams understand and rely on, what they recognize they don't know, what answers exist but remain hidden in silos, and what surprises await that nobody anticipated. Each quadrant presents distinct challenges that require different responses, from reinforcing fundamentals to fostering cross-team communication.

The most dangerous category isn't what organizations don't know, but what they think they know. Models hallucinate because nobody validated training data. Pipelines break silently, feeding stale information to production systems. Quick prototypes become business-critical dependencies without proper safeguards. These failures trace directly back to skipped fundamentals: the lineage jobs, schema tests, and safety switches that prevent small problems from cascading into system-wide outages.
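The fundamentals named above can be made concrete. The following is a minimal, hypothetical sketch (not Snowflake's implementation; all column names and thresholds are invented) of a pre-load guard that combines a schema test with a freshness "safety switch," halting the pipeline rather than silently feeding stale or malformed data downstream:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical contract for an event table; columns and types are assumptions.
EXPECTED_SCHEMA = {"user_id": int, "event": str, "ts": datetime}
MAX_STALENESS = timedelta(hours=6)  # illustrative threshold


def validate_batch(rows: list[dict]) -> list[dict]:
    """Reject a batch that fails schema or freshness checks."""
    for row in rows:
        # Schema test: every expected column present with the right type.
        for col, col_type in EXPECTED_SCHEMA.items():
            if not isinstance(row.get(col), col_type):
                raise ValueError(f"schema check failed on column {col!r}")
    # Safety switch: refuse stale data instead of loading it silently.
    newest = max(row["ts"] for row in rows)
    if datetime.now(timezone.utc) - newest > MAX_STALENESS:
        raise RuntimeError("freshness check failed: batch is stale")
    return rows
```

Checks like these are cheap to write but are exactly the safeguards that quick prototypes tend to skip.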

Organizations face expanding questions they recognize but can't answer: measuring explainability when models make unauditable decisions, tracing lineage when models retrain themselves, governing synthetic data, and handling drift that happens in milliseconds. Meanwhile, critical answers often hide in plain sight across team boundaries. Vendors optimize their systems until performance tanks, while withholding the diagnostic information teams would need to understand why. Upstream teams change schemas without notifying downstream dependencies. Teams solve identical problems independently because knowledge never flows between them.
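The unannounced-schema-change problem in particular is mechanically detectable. A hedged sketch (the column names here are invented for illustration) of a contract check that flags upstream changes before they break downstream consumers:

```python
def schema_diff(expected: set[str], actual: set[str]) -> dict[str, set[str]]:
    """Compare an upstream table's columns against the downstream contract."""
    return {
        "missing": expected - actual,      # downstream will break on these
        "unexpected": actual - expected,   # new columns worth reviewing
    }


# Hypothetical example: upstream renamed "event" to "event_name".
contract = {"user_id", "event", "ts"}
upstream = {"user_id", "event_name", "ts", "region"}
diff = schema_diff(contract, upstream)
# diff["missing"] contains "event" -> alert before the pipeline runs
```

Running a check like this at pipeline boundaries turns a surprise outage into a routine notification between teams.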

Durgin argues the solution isn't perfection but resilience. Organizations must design systems assuming failure will occur, building rollbacks and retries into architecture. They need to contain blast radius so single failures don't compromise entire platforms. Most importantly, they must create cultures where engineers feel safe flagging problems they observe and where post-mortems become learning opportunities rather than blame sessions. AI has compressed years of technology maturity into months, but the fundamentals that make systems trustworthy haven't changed.
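"Design for failure" has a standard mechanical shape: retries with backoff for transient faults, plus a circuit breaker that fails fast once a dependency looks unhealthy, containing the blast radius. A minimal sketch, with all class names and thresholds assumed for illustration:

```python
import time


class CircuitBreaker:
    """Retries with exponential backoff; opens after repeated failures."""

    def __init__(self, max_failures: int = 3):
        self.failures = 0
        self.max_failures = max_failures

    @property
    def open(self) -> bool:
        # Once open, callers fail fast instead of hammering a sick dependency.
        return self.failures >= self.max_failures

    def call(self, fn, retries: int = 2, base_delay: float = 0.1):
        if self.open:
            raise RuntimeError("circuit open: failing fast")
        for attempt in range(retries + 1):
            try:
                result = fn()
                self.failures = 0  # success resets the breaker
                return result
            except Exception:
                if attempt == retries:
                    self.failures += 1  # exhausted retries counts as a failure
                    raise
                time.sleep(base_delay * 2 ** attempt)  # exponential backoff
```

The point is not this particular implementation but the posture it encodes: the system assumes its dependencies will fail and degrades one call at a time instead of platform-wide.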


Decoded Take


This renewed emphasis on data engineering fundamentals arrives as Snowflake positions itself against competitors offering AI-native platforms. The timing aligns with Snowflake Summit 2025's core message that "there is no AI strategy without a data strategy."

While vendors race to release AI features, Snowflake is making a strategic bet that organizations will ultimately choose platforms emphasizing governance, observability, and reliability over raw AI capabilities. The knowns-unknowns framework serves as both practical guidance and market positioning: it acknowledges AI's complexity while asserting that disciplined data engineering, not just model sophistication, determines which AI initiatives succeed.

For enterprises burned by rushed AI deployments, this message resonates. The question is whether organizations will invest in unsexy infrastructure work when competitors promise faster paths to AI value.
