To succeed at anything, you need to use the right tools, at the right times and in the right ways. That is true of almost any business, from IT providers to football clubs. Big Data, and everything that goes with it, is currently a hot topic in IT, and choosing and knowing how to implement the best data modeling methodology has become critical to successful Big Data projects. Persisting with an outdated data modeling methodology is like continuing to play Wayne Rooney at number 10 for Everton: once brilliant, but he just doesn't have the legs any more!
The Data Vault is a new-ish model used in Big Data analysis: a hybrid data modeling methodology that provides historical data representation from multiple sources and is designed to be resilient to environmental changes. Its creator, Dan Linstedt, originally conceived it in 1990 and released it in 2000 as a public domain modeling methodology. He describes a resulting Data Vault database as:
“A detail oriented, historical tracking and uniquely linked set of normalized tables that support one or more functional areas of business. It is a hybrid approach encompassing the best of breed between 3NF and Star Schemas. The design is flexible, scalable, consistent and adaptable to the needs of the enterprise.”
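The "uniquely linked set of normalized tables" in that definition is typically realized through Data Vault's standard building blocks: hubs, which hold unique business keys, and satellites, which hold their detail-oriented, historically tracked attributes. The sketch below is a minimal illustration of that idea in Python; the table and column names are illustrative and not taken from any particular implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class HubCustomer:
    # Hub: one row per unique business key, plus load metadata.
    customer_key: str       # business key (e.g. customer number)
    load_date: datetime     # when the key was first seen
    record_source: str      # which source system supplied it

@dataclass(frozen=True)
class SatCustomerDetails:
    # Satellite: descriptive attributes keyed back to the hub.
    customer_key: str       # references HubCustomer.customer_key
    load_date: datetime     # each change is a new row, preserving history
    name: str
    email: str

# Loading the same customer twice with different details keeps both
# rows -- the historical tracking the definition describes.
history = [
    SatCustomerDetails("C001", datetime(2016, 1, 1), "Ann Smith", "ann@old.example"),
    SatCustomerDetails("C001", datetime(2016, 6, 1), "Ann Smith", "ann@new.example"),
]
latest = max(history, key=lambda s: s.load_date)
print(latest.email)  # -> ann@new.example
```

Because satellites only ever append rows, the model stays consistent when a source system changes: new attributes become a new satellite rather than a schema rewrite.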
by Benjamin Blackburn