Collibra Catalog

Instructor:

Peter Princen, Product Manager

Objectives:

  • Explain approved data sets
  • Summarize ingestion of data
  • Demonstrate shopping for data

Description:

Learn everything about how to use the Collibra Catalog, which helps you find, understand, and trust the data you need. Collibra Catalog is a trusted, single source of intelligence for data experts and other data citizens who need quick access to enterprise data. By cataloging approved and trusted data sets and making them easily discoverable through semantic search, Collibra Catalog provides a new way for data users to find and access the data they need, evaluate its lineage, and even enrich its value.

Getting Started: Communities, Domains and Stewardship

Instructor:

Ben Brendle, Presales Engineer

Objectives:

  • Outline organizational structure
  • Compare communities and domains
  • Classify the roles and responsibilities

Description:

This course is designed for members of the Data Governance Council as they begin to implement their program. Members of the council are representatives from business and technical data stakeholder functional groups. We will review organizational concepts and the data governance structure, create communities and subcommunities, and assign roles and responsibilities.

Getting Started: Assets and Asset Types

Instructor:

Ben Brendle, Presales Engineer

Objectives:

  • Explain the metamodel
  • Classify assets with asset types
  • Relate characteristics with attributes and relations

Description:

This course is designed for Data Stewards, who are responsible for using an organization’s data governance processes to ensure the fitness of data elements, both their content and their metadata. Data Stewards have a specialist role that incorporates the processes, policies, guidelines and responsibilities for administering all of an organization’s data in compliance with policy and/or regulatory obligations. We’ll review assets and asset types, and we will also demonstrate how to create characteristics using attributes and relations.
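
As a rough mental model of these concepts, the sketch below (an illustration only, not Collibra’s internal implementation) represents an asset that is typed by an asset type and carries characteristics in the form of attributes and relations to other assets; all names and values are invented for the example.

```python
# Illustration only: a toy model of assets, asset types, attributes,
# and relations, mirroring the metamodel concepts described above.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    asset_type: str                                   # e.g. "Business Term", "Column"
    attributes: dict = field(default_factory=dict)    # characteristics: attributes
    relations: list = field(default_factory=list)     # characteristics: relations

customer = Asset(
    name="Customer",
    asset_type="Business Term",
    attributes={"Definition": "A person or organization that buys goods."},
)
customer_id = Asset(name="CUSTOMER_ID", asset_type="Column")

# Relate the column to the business term it represents.
customer_id.relations.append(("represents", customer))
```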

Getting Started: Business Glossary

Instructor:

Ben Brendle, Presales Engineer

Objectives:

  • Explain Business Glossary domain
  • Summarize good definitions
  • Demonstrate importing business terms

Description:

This course is designed for Data Stewards, who are responsible for using an organization’s data governance processes to ensure the fitness of data elements, both their content and their metadata. Data Stewards have a specialist role that incorporates the processes, policies, guidelines and responsibilities for administering all of an organization’s data in compliance with policy and/or regulatory obligations. We will start creating business terms and evaluate what makes a good definition. We’ll also apply a round-trip export-import process to import multiple business terms from an Excel spreadsheet.
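
To make the round-trip idea concrete, here is a minimal sketch, independent of Collibra, that reads an exported spreadsheet of business terms with pandas, flags terms whose definitions are missing or too short, and writes the reviewed sheet back for re-import; the file names and the Name/Definition column headers are assumptions, not Collibra’s actual export layout.

```python
# Illustrative sketch of preparing business terms for a round-trip
# export-import via Excel. File names and the "Name"/"Definition"
# column headers are assumptions, not Collibra's actual export layout.
import pandas as pd

# Read the exported business terms (.xlsx files use the openpyxl engine).
terms = pd.read_excel("business_terms_export.xlsx")

# Flag terms whose definitions are missing or too short to be useful.
terms["Definition"] = terms["Definition"].fillna("")
needs_work = terms[terms["Definition"].str.len() < 20]
print(f"{len(needs_work)} of {len(terms)} terms need a better definition:")
print(needs_work["Name"].tolist())

# Write the reviewed sheet back out, ready to be imported again.
terms.to_excel("business_terms_import.xlsx", index=False)
```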

Getting Started: Data Dictionary and Reference Data

Instructor:

Nandini Rajagopalan, Solution Engineer

Objectives:

  • Explain registering a data source with Data Dictionary
  • Relate physical data dictionary to schemas, tables and columns
  • Outline logical data models, data entities and data attributes with Reference Data

Description:

This course is designed for Technical Stewards, who are involved in daily data-related decisions, execute business decisions, and implement business requirements in a technology platform. We’ll cover the following topics: using the Data Dictionary to register a data source, viewing physical data models including schemas, tables and columns, examining data profiling, and identifying logical data models using Reference Data.
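
As a concrete illustration of the physical metadata a data dictionary captures, the sketch below, which is independent of Collibra, uses SQLAlchemy’s inspector to walk a relational source and print its schemas, tables and columns; the connection string is a placeholder for whichever data source you register.

```python
# Minimal sketch of the physical metadata a data dictionary captures:
# schemas, tables, and columns. Independent of Collibra; the connection
# string below is a placeholder for whatever source you register.
from sqlalchemy import create_engine, inspect

engine = create_engine("postgresql://user:password@host:5432/sales_db")
inspector = inspect(engine)

for schema in inspector.get_schema_names():
    for table in inspector.get_table_names(schema=schema):
        print(f"{schema}.{table}")
        for column in inspector.get_columns(table, schema=schema):
            # Each column entry holds the name, data type, and nullability.
            print(f"  {column['name']}: {column['type']} "
                  f"(nullable={column['nullable']})")
```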

Getting Started: Catalog, Collibra Everywhere and Lineage

Instructor:

Paul Windom, Presales Engineer

Objectives:

  • Demonstrate creating data sets in Collibra Catalog
  • Explain features of Collibra Everywhere
  • Relate technical lineage to business assets in traceability diagrams

Description:

This course is designed for Data Stewards, who are responsible for using an organization’s data governance processes to ensure the fitness of data elements, both their content and their metadata. We’ll begin by explaining the features of Collibra Catalog and creating the Customer Product Sales data set. We will describe how to interpret the business context, data elements, systems, and business processes related to the Customer Product Sales data set, and explain lineage diagram functions and configurations. We will also demonstrate the use of Collibra Everywhere, specifically to identify sales territory.

Getting Started: Data Helpdesk, Policy Manager and Workflows

Instructor:

Ben Brendle, Presales Engineer

Objectives:

  • Explain automatic notifications for Data Helpdesk issues
  • Relate policies and standards to business assets
  • Demonstrate workflows to model your data processes

Description:

This course is designed for Privacy Stewards, who are responsible for using an organization’s data governance processes to ensure compliance. Privacy Stewards have a specialist role that incorporates the processes, policies, guidelines and responsibilities for determining which policy and/or regulatory privacy obligations an organization should comply with. We’ll begin the course by using the Data Helpdesk to log an issue for missing policy descriptions. Next, we’ll search Policy Manager for relevant policies and standards to use, and we will create relations between governance assets and business assets. We’ll describe the use of workflows and show workflow configurations in Settings. These processes build towards our final step of completing the Getting Started lineage: we will create relations between our data asset, the End date contract + 2 years policy standard, and our Customer Product Sales data set, and confirm that all assets are used appropriately for the Customer Lifetime Value report.

Getting Started in BCBS 239

Instructor:

Simon Hankinson, Global Financial Services Market Lead

Objectives:

  • Examine the operating model
  • Categorize the critical data elements
  • Inspect the BCBS 239 principles

Description:

In this course you will be introduced to the BCBS 239 principles and explore how to establish priorities in preparing and creating your report. We will also explore how BCBS 239 regulatory adherence relates to, and adds value to, other regulatory requirements. Upon completion of this course you will be able to: examine your organization and interpret an approach to data governance, establish priorities to start your BCBS 239 data governance initiative, and describe the BCBS 239 principles.

Prescriptive Path Introduction

Instructor:

Lowell Fryman, Practice Principal & Capability Manager

Objectives:

  • Outline the Prescriptive Path steps
  • Explain the objective of each step
  • Summarize the key capabilities of each step

Description:

This course presents an overview of the Prescriptive Path, steps one through ten. Each step covers its objectives, the key roles involved, the notable prerequisites, the use case, the Collibra applications that may be involved, and the key capabilities and activities that can help you implement that step. The sequencing of the steps is also addressed: most users will not start at step one and progress linearly through step ten. Your team may skip a step here or there, but it’s important to address all of the steps at some point to improve your adaptability and maturity. The steps trace the progression from defining your strategy to reviewing it, capturing all of your assets and activities in between.

Move Data to AWS for Tableau Analytics

Instructor:

Peter Princen, Sr. Product Manager


Objectives:

  • Explain onboarding ERP cloud data into data lake built on AWS S3
  • Utilize Amazon Athena to access data in AWS S3 data lake
  • Examine complete lineage of Tableau workbook and source systems

Description:

In this course, we will follow the user journey of a business analyst who needs to build a report on sales forecasts in the supply chain domain. The use case shows that Catalog is used not only to find the correct data, but also to manage the whole process: looking for data, requesting that new data be onboarded into the data lake where our analyst can access it, and finally creating a report in the BI tool of choice. In our example the BI tool will be Tableau, and the data lake will be built on AWS S3. At the end of our journey, the analyst can access the data stored on AWS S3.
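
For readers who want a concrete picture of the Amazon Athena step, here is a minimal sketch that uses boto3 to run a query against a table backed by the S3 data lake and print the results; the region, database, table, and result-bucket names are assumptions made for the example.

```python
# Illustrative sketch of querying the AWS S3 data lake through Amazon
# Athena with boto3. Region, database, table, and bucket names are
# assumptions for the example, not part of the course materials.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Start the query; Athena writes its result files to the given S3 location.
execution = athena.start_query_execution(
    QueryString="SELECT region, SUM(forecast_amount) AS total "
                "FROM sales_forecasts GROUP BY region",
    QueryExecutionContext={"Database": "supply_chain"},
    ResultConfiguration={"OutputLocation": "s3://analytics-query-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then fetch and print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```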
