In this article we present the minimum requirements necessary to qualify for enrollment in the VTC Program. These requirements are vetted once the enrollment submission form has been completed.
Required Features / Focus
The tool must be Data Vault 2.0 focused and must produce Data Vault 2.0 artifacts. Any tool submitted that does not work with, generate, or deal with Data Vault 2.0 will be denied entry. We currently offer two certifications: Modeling Tool Certification and Code Generation Certification.
As stated in the legal contract, we certify tools that are focused on enhancing or enriching Data Vault-specific productivity. We want to ensure that the tools produce artifacts that comply with the Data Vault 2.0 standards and, in some cases, the recommended best practices. Any submission of a tool that does not contain Data Vault 2.0 specific features will be rejected and not allowed to proceed with the vendor certification process.
Tools That Do Not Qualify for Our VTC Program
The list below is not exhaustive. DataVaultAlliance Holdings LLC reserves the right to assign qualification status to any tool being enrolled in the program. Any vendor whose tool is disqualified will receive DVA's reasoning as to why the decision was made. The list below shares some of the top reasons why a tool may not be accepted into the VTC Program.
- Tools that generate source system Data Vault Models do not qualify
- Tools that leverage source system primary and foreign keys to auto-forward-engineer Data Vault models do not qualify
- Tools that cannot demonstrate an ability to integrate multiple models across multiple source applications to a single enterprise view of a Raw Data Vault do not qualify
- Tools that do not allow designers and architects to change naming conventions in the GUI do not qualify
- Tools that do not leverage a template engine for code-generation do not qualify
- Tools that do not allow marking of Business Keys (composite and non-composite fields) do not qualify
- Tools that do not support standard data vault objects (as documented in the DV Modeling standards) do not qualify
- Tools that do not generate the standard / required columns for Data Vault model objects do not qualify
Modeling Tool Certification Required Features
The tool must produce physical data model artifacts in the form of one or more text files. Tools that do not generate physical data model artifacts will not qualify. The following items must be generated in order to meet the minimum requirements (an illustrative sketch follows the list):
- Physical Data Model (DDL) table structures for:
- Staging areas
- Raw Data Vault Objects
- Business Data Vault Objects
- Information Mart Objects
- Physical Constraint Commands (including Referential Integrity)
- Primary Key Declarations
- Foreign Key Declarations
- Optional: Alternate Key Declarations
- Optional: Index Declarations
- Augmentative Metadata for Data Modeling Objects
- Source-to-target (one-to-one, field-to-field) mappings
- Business Key declarations
- Naming Convention Settings (prefixes and suffixes for: tables, columns, constraints)
- Schema Definitions (assigning schema names a constant “type”: stage, raw vault, etc.)
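To make the expectations concrete, here is a minimal sketch of what such DDL artifacts might look like for one hub and one satellite, assuming PostgreSQL syntax; the table, column, and constraint names are purely illustrative and are not the mandated DVA naming standard.

```sql
-- Illustrative only: names and data types are hypothetical, not the official standard.
CREATE TABLE rdv.hub_customer (
    hk_customer     CHAR(32)      NOT NULL,   -- hash key (standard column)
    customer_bk     VARCHAR(50)   NOT NULL,   -- business key
    load_dts        TIMESTAMP     NOT NULL,   -- standard load date column
    record_source   VARCHAR(100)  NOT NULL    -- standard record source column
);

-- Physical constraint command: primary key declaration
ALTER TABLE rdv.hub_customer
    ADD CONSTRAINT pk_hub_customer PRIMARY KEY (hk_customer);

CREATE TABLE rdv.sat_customer_details (
    hk_customer     CHAR(32)      NOT NULL,
    load_dts        TIMESTAMP     NOT NULL,
    record_source   VARCHAR(100)  NOT NULL,
    hash_diff       CHAR(32)      NOT NULL,
    customer_name   VARCHAR(200),
    CONSTRAINT pk_sat_customer_details PRIMARY KEY (hk_customer, load_dts),
    -- Referential integrity: foreign key declaration back to the hub
    CONSTRAINT fk_sat_customer_details FOREIGN KEY (hk_customer)
        REFERENCES rdv.hub_customer (hk_customer)
);
```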
Code Generation Certification Required Features
Code generation refers to the sets of processes, known as Data Manipulation Language (DML) instructions, that move data from one structure to another. The Data Modeling certification only covers the structures; to be Data Vault 2.0 standards compliant, the data movement processes must be validated as well. To validate the data movement processes, we test the data (i.e., the source data is balanced to the target data, and so on). This type of validation allows us to run the DML code and test the results. The DML code must be 98% ANSI-SQL compliant.
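As a rough illustration of what "balancing the source data to the target data" can mean, the check below compares distinct business key counts between a hypothetical staging table and the hub it loads; the object names are assumptions, not part of the certification criteria.

```sql
-- Hypothetical balancing check: every distinct business key staged
-- should be represented exactly once in the target hub.
SELECT
    (SELECT COUNT(DISTINCT customer_bk) FROM stg.customer)     AS source_key_count,
    (SELECT COUNT(DISTINCT customer_bk) FROM rdv.hub_customer) AS target_key_count;
-- A certification test run would flag any difference between the two counts.
```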
We do NOT accept the following:
- External Scripts (non-SQL based)
- Operating System Scripts
- Reliance on non-SQL execution engines (i.e., ETL or ELT engines outside the DB)
- Java Applications
- Javascript applications
- In short: anything non-SQL based that executes external to the database
- NOTE: we do not allow stored procedures, database functions, or UDFs (user-defined functions)
The following artifacts must be generated in order to meet the certification criteria (an illustrative example follows the list):
- ANSI-SQL DML commands that are database native, delivered in text files
- The currently supported database is PostgreSQL (we are working on adding Snowflake soon)
- INSERT INTO … SELECT FROM statements
- UPDATE statements
- WITH (CTE) … statements
- SELECT statements
- Optional: Views
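For illustration only, here is a minimal sketch of a hub-load statement combining a WITH CTE with an INSERT INTO … SELECT, written in PostgreSQL-compatible ANSI SQL; the object names and the MD5 hashing choice are assumptions, not mandated by the certification.

```sql
-- Hypothetical hub load: names and hashing approach are illustrative only.
WITH staged AS (
    SELECT
        MD5(UPPER(TRIM(customer_bk))) AS hk_customer,
        customer_bk,
        MIN(load_dts)                 AS load_dts,
        MIN(record_source)            AS record_source
    FROM stg.customer
    GROUP BY customer_bk
)
INSERT INTO rdv.hub_customer (hk_customer, customer_bk, load_dts, record_source)
SELECT s.hk_customer, s.customer_bk, s.load_dts, s.record_source
FROM staged AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM rdv.hub_customer AS h
    WHERE h.hk_customer = s.hk_customer
);
```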
Common Q&A
- Can I submit a tool that’s focused on data modeling only?
- Yes, but today it must include physical models and physical artifacts. Logical modeling and conceptual modeling tools must wait until we have the appropriate certification available for those features to be validated.
- Can I submit an automated testing tool for Data Vault 2.0?
- Not today. We have plans for a future certification that includes verifying / certifying automated testing tools and the artifacts they generate.
- Can I submit a tool that has nothing to do with Data Vault specifically?
- No. Our testing and certification processes only apply to Data Vault 2.0 specific features, standards, and best practices.
We recommend (but do not require) that the tool utilize a template-driven code generator. This way, when a test fails, there is a faster iteration time to fix the generated code (by adjusting the template rather than the code base).
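Purely as an illustration of the idea, a template fragment along the lines of the hypothetical Jinja-style sketch below lets a vendor correct a failing pattern in one place and regenerate the SQL for every affected object; the placeholder syntax and names are assumptions, since any template engine is acceptable.

```sql
-- Hypothetical template fragment (Jinja-style placeholders) for a hub load.
INSERT INTO {{ target_schema }}.{{ hub_table }} ({{ hash_key }}, {{ business_key }}, load_dts, record_source)
SELECT s.{{ hash_key }}, s.{{ business_key }}, s.load_dts, s.record_source
FROM {{ stage_schema }}.{{ stage_table }} AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM {{ target_schema }}.{{ hub_table }} AS h
    WHERE h.{{ hash_key }} = s.{{ hash_key }}
);
-- A failing test is fixed once in the template, then regenerated for every hub.
```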
The metadata requirements include augmentative metadata that most DDL (data definition language) and DML (data manipulation language) scripts do not contain. Extra metadata is required in order to validate different technical aspects of the tool. For instance, to validate that business keys are properly named, extra metadata about the defined naming conventions is required; otherwise there is nothing to validate against.
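One possible shape for that augmentative metadata is sketched below as relational tables holding naming-convention settings and business key declarations; this is only an assumption for illustration, and the format the tool actually exports may differ as long as the settings are explicit.

```sql
-- Hypothetical metadata structures; the export format is the vendor's choice.
CREATE TABLE meta.naming_convention (
    object_type  VARCHAR(30) NOT NULL,  -- e.g. hub, link, satellite, stage
    prefix       VARCHAR(20),           -- e.g. 'hub_'
    suffix       VARCHAR(20)            -- e.g. '_s'
);

CREATE TABLE meta.business_key (
    table_name   VARCHAR(100) NOT NULL, -- e.g. 'rdv.hub_customer'
    column_name  VARCHAR(100) NOT NULL, -- e.g. 'customer_bk'
    key_ordinal  INTEGER      NOT NULL  -- position within a composite business key
);
```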
NOTE: once enrolled in the program, the vendor is granted access to online, self-paced training classes and example downloads that walk through each artifact: what it means, why we need it, and how it's used (as a Baseline). Once the vendor can prove the tool generates the Baseline (and passes certification on the Baseline), a final model with several iterations is provided to complete the certification process. More about this process can be found in the Process documentation here.
Recommended Staffing Aspects
While we do not (today) require specific staffing to qualify for the program, we do recommend certain staffing knowledge before agreeing to engage with the program. A few of the major staffing knowledge recommendations are outlined below:
- Engineer: a minimum of 2 years of experience building Data Vaults; it is highly encouraged that the engineer hold CDVP2 certification
- Manager overseeing the engineer: basic knowledge of what Data Vault 2.0 is and what its value is to the vendor's customer base
- Sales Engineer: CDVP2 certification, in order to properly build the right solution once the tool has been certified and aligned with Data Vault 2.0
- Sales Person: introductory knowledge of Data Vault 2.0 and what its value is to the vendor's customer base, along with an understanding of where to get proper training (i.e., the CDVP2 course)