5 min read | Saved February 14, 2026
Do you care about this?
This article explores a shift in data modeling from rigid orthodoxies to a more pragmatic approach. It emphasizes starting with simple structures, adding complexity only when necessary, and leveraging semantic clarity for flexibility across different modeling techniques.
If you do, here's more
The article introduces a new newsletter format focused on sharing insights on data architecture and modeling. The author emphasizes a shift away from outdated practices of the 2000s and 2010s toward more pragmatic approaches. The core idea is to keep things simple initially and add complexity only when necessary, illustrated with examples from several experts in the field.
One key insight comes from Zach Wilson and Sahar Massachi, who argue against the traditional Slowly-Changing Dimension Type 2 patterns. Instead, they suggest using date-stamped data snapshots to save time and engineering effort. Their proposed architecture involves starting with virtual views and only materializing tables when performance data indicates a need. This shifts the focus from storage costs to optimizing human time.
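The snapshot idea can be sketched in a few lines. This is a hypothetical illustration, not Wilson and Massachi's actual implementation: instead of maintaining SCD Type 2 rows with effective/expiry dates, each day's full dimension state is stored under its snapshot date, and "state as of" queries simply pick the latest snapshot at or before the requested date.

```python
from datetime import date

# Hypothetical date-stamped snapshots: a full copy of the dimension is
# stored per snapshot date, replacing SCD Type 2 effective/expiry rows.
snapshots = {
    date(2026, 2, 1): {"cust_1": {"tier": "basic"}},
    date(2026, 2, 8): {"cust_1": {"tier": "premium"}},
}

def state_as_of(snapshots, as_of):
    """Return the snapshot taken on the latest date <= as_of, or None."""
    eligible = [d for d in snapshots if d <= as_of]
    if not eligible:
        return None
    return snapshots[max(eligible)]

print(state_as_of(snapshots, date(2026, 2, 5))["cust_1"]["tier"])   # basic
print(state_as_of(snapshots, date(2026, 2, 10))["cust_1"]["tier"])  # premium
```

The trade-off matches the article's point: snapshots cost more storage than SCD2's change-only rows, but the lookup logic is trivial, which spends cheap storage to save expensive engineering time.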
Robert Anderson's work highlights the importance of a semantic foundation that supports different data modeling approaches without separate implementations: various projections are created from a single model. Juha Korpela adds that data modeling should follow both design and discovery paths, building knowledge over time rather than attempting a comprehensive model upfront. The overarching theme is a move toward evidence-based complexity, where changes are made in response to real-world data rather than anticipated needs.
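The "one model, many projections" idea can be sketched as follows. This is a toy analogy under assumed data (the `orders` records, `project_flat`, and `project_dimension` are all invented for illustration), not Anderson's method: a single semantically named dataset is projected into both a flat analytics view and a dimension-style view, with no separate modeling effort per consumer.

```python
# A single canonical dataset (hypothetical example data).
orders = [
    {"order_id": 1, "customer": "acme", "region": "EU", "amount": 100},
    {"order_id": 2, "customer": "acme", "region": "EU", "amount": 50},
    {"order_id": 3, "customer": "zen", "region": "US", "amount": 75},
]

def project_flat(rows, columns):
    """Wide, denormalized projection: just the columns a consumer needs."""
    return [{c: r[c] for c in columns} for r in rows]

def project_dimension(rows, key, attrs):
    """Dimension-style projection: one row per distinct key value."""
    dim = {}
    for r in rows:
        dim.setdefault(r[key], {a: r[a] for a in attrs})
    return dim

flat = project_flat(orders, ["order_id", "amount"])
customer_dim = project_dimension(orders, "customer", ["region"])
```

Because both projections derive from the same source, the semantic definitions live in one place, and each downstream style (flat table, star-schema dimension) is just a cheap view over it.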