A database can be measured by six properties: four determined by the design (Usability, Integrity, Performance/Scalability, and Extensibility) and two that are more a function of implementation than design (Availability and Security).
Of these six factors, I’d argue that in the long run extensibility is the most expensive to repair. All six are necessary, but a database that has become brittle (touch one thing and everything else breaks) is either impossible or extremely expensive to correct. Database extensibility is bought with three techniques: a strongly enforced data abstraction layer, data-driven design, and normalization. Normalization is typically credited with the integrity of the database, and rightly so. But normalization also influences the extensibility of the database. Here’s why:
A poorly designed, anti-normalized database typically needs multiple layers of convoluted code to make it work, because it takes a lot of extra code to keep anti-normalized data consistent. All of that code is more expensive to maintain, and it makes changes and new features much harder to implement.
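To make that consistency cost concrete, here is a minimal sketch using an in-memory SQLite database. The table and column names are hypothetical, invented for illustration: in the anti-normalized version a customer name is copied into every order row, so a rename must find every copy, and any code path that misses one leaves the data inconsistent. In the normalized version the name lives in exactly one row, so no extra consistency code exists to maintain.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Anti-normalized: the customer name is duplicated into each order row.
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, customer_name TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?)",
                [(1, "Acme Ltd"), (2, "Acme Ltd"), (3, "Acme Ltd")])

# A rename that misses one code path leaves a stale copy behind.
cur.execute("UPDATE orders_flat SET customer_name = 'Acme Inc' "
            "WHERE order_id < 3")
stale = cur.execute(
    "SELECT COUNT(DISTINCT customer_name) FROM orders_flat").fetchone()[0]
print(stale)  # 2 -- two "truths" for the same customer

# Normalized: the name is stored once; every order references it by key.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER, "
            "customer_id INTEGER REFERENCES customers(id))")
cur.execute("INSERT INTO customers VALUES (1, 'Acme Ltd')")
cur.executemany("INSERT INTO orders VALUES (?, 1)", [(1,), (2,), (3,)])

# One UPDATE in one place; no extra consistency code to write or maintain.
cur.execute("UPDATE customers SET name = 'Acme Inc' WHERE id = 1")
names = cur.execute(
    "SELECT COUNT(DISTINCT c.name) FROM orders o "
    "JOIN customers c ON c.id = o.customer_id").fetchone()[0]
print(names)  # 1 -- a single source of truth
```

The point is not the two extra lines of UPDATE logic in this toy case; it is that every duplicated fact in a real schema multiplies this bookkeeping across every feature that touches it.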
It’s not even a matter of pay-me-now or pay-me-later. An anti-normalized database doesn’t just cost a bit more to fix later; it costs so much to fix later that companies, or careers, fail because of the poor design. The bottom line is still that a man-day spent on database design saves at least a man-month of extra work later.