World Wide Data Vault Consortium (WWDVC)

Unlocking Global Insights With Invaluable Data Vault Knowledge

Gain a competitive edge with access to past conference videos through our Professional Membership. Explore expert insights and innovative solutions in data management, staying ahead in this dynamic field. Whether you’re a seasoned pro or new to the game, these videos provide valuable knowledge to boost your career and enhance your organization’s data strategy. Don’t miss out – unlock the power of data today!

About Your Access

A yearly subscription to our conference videos featuring Data Vault thought leaders provides exceptional value, offering access to cutting-edge insights and strategies.  Stay at the forefront of data management and continuously enhance your knowledge and skills. Invest in your professional growth with a Professional Membership today!

Conference Details

Welcome to WWDVC 2018 – World Wide Data Vault Consortium

We hold this conference yearly. In 2018 we had over 110 attendees and over 28 presentations, including 4 deep-dive hands-on build sessions. Our presenters talked about business successes, breaking down barriers with management, how to build real-time environments, and more! If you've never been able to attend, now is your chance to find out what it is all about!

Not a Member Yet?

Unlock the power of knowledge and transform your data management expertise with our conference membership. Gain exclusive access to invaluable insights, strategies, and innovations shared by top industry leaders.

Don’t miss out – seize this opportunity to supercharge your professional growth and gain the competitive edge with your conference membership!

Check out these incredible numbers

The amount of knowledge packed into these sessions is phenomenal. Don't miss this opportunity to learn fantastic new ideas.


Conference Sessions


Speaker: Bill Inmon

The father of data warehousing presents and discusses how Data Vault can be extended with Textual Data, along with how Data Vault is the next big thing for Big Data management.

Download Slides: 08_KeyNote_BillInmon

Speaker: Kent Graziano

With the increasing prevalence of semi-structured data from IoT devices, web logs, and other sources, data architects and modelers have to learn how to interpret and project data from things like JSON.

While the concept of loading data without upfront modeling is appealing to many, ultimately, in order to make sense of the data and use it to drive business value, we have to turn that schema-on-read data into a real schema! That means data modeling!

In this session I will walk through both simple and complex JSON documents, decompose them, then turn them into a representative data model using Oracle SQL Developer Data Modeler. I will show you how they might look using both traditional 3NF and data vault styles of modeling.
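As a rough illustration of the kind of decomposition this session covers (the session itself uses Oracle SQL Developer Data Modeler; the JSON document and field names below are hypothetical, not taken from the talk), a nested document can be split along Data Vault lines, with business keys landing in a hub, descriptive attributes in a satellite, and relationships in a link:

```python
import json

# Hypothetical source document of the schema-on-read kind discussed above.
doc = json.loads("""
{
  "customer_id": "C-1001",
  "name": "Acme Corp",
  "address": {"city": "Boise", "country": "US"},
  "orders": [
    {"order_id": "O-1", "total": 99.50},
    {"order_id": "O-2", "total": 12.00}
  ]
}
""")

# Business key -> hub row
hub_customer = {"customer_id": doc["customer_id"]}

# Descriptive attributes (including flattened nested objects) -> satellite row
sat_customer = {
    "customer_id": doc["customer_id"],
    "name": doc["name"],
    "city": doc["address"]["city"],
    "country": doc["address"]["country"],
}

# The customer-order relationship -> one link row per order
link_customer_order = [
    {"customer_id": doc["customer_id"], "order_id": o["order_id"]}
    for o in doc["orders"]
]

print(hub_customer)
print(sat_customer)
print(link_customer_order)
```

The same decomposition could instead target a traditional 3NF model, where the nested `orders` array would become a child table keyed by the parent's primary key.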

Download Slides: 09_MakingSenseOfSchemaOnRead

Speaker: Michael Magalsky

Building Earth's largest known commercial Data Vault implementation on Teradata at Micron Technology was chock-full of rich learning. Sharing lessons learned through each of our organizations' unique information journeys results in net positives for our industry.

In this interactive session, learn from our Data Vault implementation challenges and successes at Micron. Then share your thoughts, ideas, successes, and failures. The most elegant solution is usually simple, and seldom the most obvious.

Download Slides: 10_WorldsLargestDV

Speaker: Bruce McCartney

The purpose of this presentation is to examine the DV 2.0 architecture in light of big data architectures, including the Lambda Architecture for data processing and other streaming technologies used in Big Data.

There will be an examination of the definition of what streaming is, and how it relates to data warehousing. A discussion of the motivations for using streaming to populate a data warehouse in general, and a data vault specifically, will be introduced.

Download Slides: 11_DV2ArchitecturalAlignment_BigData

Speaker: Richard Jordan

For thirty years the data warehouse has remained a mysterious “behind the IT curtain” technology to the typical business unit manager. The business value of a traditional DW was hampered by two key factors: cost & complexity.

Even the most technology-savvy business leaders shy away from building a DW because the payoff for developing one is over the horizon, and their primary focus is on the current quarter.

The advances offered by DV 2.0 present the opportunity for the payoff of implementation to begin to be realized in that same current quarter. To get there, a business has to make two key choices: what data to include, and whether to build or buy.

Download Slides: 12_BusinessOfDV

Speaker: Sanjay Pande

Sanjay Pande will be discussing Data Vault on Hadoop from a practical implementation perspective. What it means, what to do, and so on.

Speaker: Sam Williams

In this video we meet users, and hear from them about their efforts (both successes and challenges) with Data Vault.

Speaker: Scott Ambler

In this video Scott shares his vision for what it takes to truly be agile in today's BI and Big Data Environment.

Download Slides: 14_DisciplinedAgile

Speaker: Nols Ebersohn

Information is an asset; however, every organisation faces the bleak and foreboding prospect of a business intelligence equivalent of the GFC (global financial crisis).

Why is it that most data warehouses last, on average, only 3-5 years, at which point they are either duplicated or replaced? How and where will the next major investment failure occur for Business Intelligence? How is this preventable, and what are the key success factors in dealing with organisational technical debt?

Somebody has to pay this debt. As in the sub-prime crisis, someone will have to find a way to base information assets on real assets instead of fictitious ones. We will explore ways to quantify, manage, and remediate these challenges.


Speaker: Beat Bannwart

The speaker walks us through how Data Vault enables Live / Real-time Credit Card Authorizations in a banking environment.

Download Slides: 16_NearRealTime


Speakers: Christiane Chagnon & France Pare

In order to acquire and make data available faster, the enterprise information system needed to review the development life cycle from every angle: people, process, and tools. The development team adopted the agile methodology to shorten time to market and to work more closely with the product owners.

The data vault methodology helped us build and deploy more quickly by automating the generation of code from the mapping document. With this improvement, the test process became the bottleneck, and we needed to reduce the duration of test execution in order to deliver in weeks instead of months.

Download Slides: 17_TestingAutomation

Speaker: Neil Strange

A causal loop analysis of organisational barriers to implementing a data vault (paralleling attempts to improve data management as a whole) – identifying critical points of leverage that can be used to reduce these barriers.

The point here is that often there are hidden feedback loops within the social structure of an organisation that resist new ideas. If you try to tackle the barriers head on the feedback loops strengthen and build resistance, the stronger you push the stronger they push back.

You have to find ways to weaken the feedback loop, often using simple and surprising left-field tactics, so that you increase the chances of success.

Download Slides: 18_ReducingBarriersToDV


Speaker: Mike Economou

It would require an act of Congress! Adopting agile development principles and taking an enterprise-wide view of data is a difficult transition for any large organization: processes need to adapt, teams need to be restructured, metrics need to be redefined, and traditional expectations need to change.

Implementing change in the federal sector can be made even more complicated, as the relationship between contracted software development teams and the government is both highly regulated and designed with more traditional approaches in mind.

As a federal prime contractor tasked with building a data warehouse for an agency in the Intelligence Community, Quadrint has worked to meet the unique needs and regulations of federal customers while still remaining agile, and has learned plenty of lessons along the way.

Download Slides: 19_DVIntheFederalSector

Speaker: Russell Searle

Hospital clinicians and administrators are starved of data. What data they do get is typically stale and usually unrecognisable, leading to low trust. Hospitals generate vast amounts of data, yet so little of it is leveraged outside the systems that created it.

Clinicians are highly dedicated scientists, always seeking to improve patient outcomes through research, but research needs to be fed with data. Administrators need to anticipate the future and manage the present, not explain what happened last month or, more commonly, the month before.

Download Slides: 21_DV_In_Healthcare

Speakers: Ed Comer & Geoff McElhanon

The Michael & Susan Dell Foundation (MSDF) is dedicated to improving K-12 education outcomes by supporting teachers, students, and parents in making the best decisions for success using data that is timely, correct, and understandable.

Over the last eight years, MSDF has sponsored the development of an education data standard and a set of supporting open source technologies, branded as Ed-Fi® and governed by a community-driven Ed-Fi Alliance. The Ed-Fi data standard is built around a conceptual model, known as the Unifying Data Model (UDM), supporting a family of consistent representations: bulk XML data interchanges; a transactional REST API with JSON payloads; and a normalized SQL-based Operational Data Store.

The supporting technologies are generated from the data model, thus allowing the standard to easily evolve and be extended without incurring large technology maintenance costs. The Ed-Fi Alliance invested in a compatible Ed-Fi dimensional data warehouse to support the needs of the community for longitudinal data for reporting and analytics.

However, because the schema and its large amount of ETL were hand-crafted, the high continuing investments limited its use in the community. Enter the Data Vault. Mapping the Ed-Fi UDM model of Entities, Associations, and Attributes to the Data Vault’s Hubs, Links, and Satellites was straightforward.

This presentation will discuss the approach that generates the Data Vault schema from an Ed-Fi semantic model, generates the stored procedures to load the Data Vault from the Operational Data Store, and also generates a set of Data Vault views that hide the complexities of the Data Vault schema. By completely automating the API-to-ODS-to-DV data pipeline, education organizations can focus their resources on building data marts and visualizations to drive student performance.
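A minimal sketch of this model-driven generation idea, using a toy semantic model (the entity names, attributes, and table-naming conventions below are assumptions for illustration; the actual Ed-Fi generator and its conventions differ): each entity yields a hub plus a satellite, and each association yields a link.

```python
# Toy semantic model: entities with attributes, plus associations
# between entities. These names are illustrative only.
entities = {
    "Student": ["FirstName", "LastName"],
    "School": ["Name", "District"],
}
associations = [("Student", "School")]

def generate_ddl(entities, associations):
    """Emit one hub and one satellite per entity, and one link per association."""
    ddl = []
    for name, attrs in entities.items():
        cols = ", ".join(f"{a} VARCHAR(100)" for a in attrs)
        ddl.append(
            f"CREATE TABLE Hub_{name} ({name}Key CHAR(32), {name}Id VARCHAR(50));"
        )
        ddl.append(
            f"CREATE TABLE Sat_{name} ({name}Key CHAR(32), LoadDate DATE, {cols});"
        )
    for a, b in associations:
        ddl.append(
            f"CREATE TABLE Link_{a}{b} ({a}{b}Key CHAR(32), {a}Key CHAR(32), {b}Key CHAR(32));"
        )
    return ddl

for stmt in generate_ddl(entities, associations):
    print(stmt)
```

Because the DDL is derived from the model rather than hand-crafted, changes to the semantic model propagate to the warehouse schema automatically, which is the maintenance-cost argument the presentation makes.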

Speaker: Nols Ebersohn

Part 2:

Information is an asset; however, every organisation faces the bleak and foreboding prospect of a business intelligence equivalent of the GFC (global financial crisis). Why is it that most data warehouses last, on average, only 3-5 years, at which point they are either duplicated or replaced? How and where will the next major investment failure occur for Business Intelligence?

How is this preventable, and what are the key success factors in dealing with organisational technical debt? Somebody has to pay this debt. As in the sub-prime crisis, someone will have to find a way to base information assets on real assets instead of fictitious ones.


Speakers: Moritz von Ketelhodt & Michael Olschimke

Beyond the lasting damage caused to individuals' personal finances, the erosion of trust within certain corporations, and a feeling of caution within the sector, the financial crises that struck a number of countries left something else in their wake: higher levels of mandatory regulation. Due to this increase in mandatory oversight and the creation of new protocols governing the sector, a new tool for transparency and compliance became necessary.

For that reason, Berenberg IT, together with Scalefree consultants, stepped forward to offer their business users a solution: a Data Vault 2.0 business intelligence system built on a centralized enterprise data warehouse. The system supports self-service clients, giving them the ability to build and adapt their own business intelligence platforms, while still offering dedicated-service clients the ability to facilitate audits without risk of falling out of compliance. With it, Berenberg allows businesses to leverage applicable business intelligence and benefit from the system.

Download Slides: 15_ManagedSSBI

Get Started Today

Connect with other professionals and unlock a world of unparalleled opportunities. Elevate your expertise with access to a wealth of self-paced e-learning content. Join your peers in our dynamic community and enhance your skills today!