Developer Portal, APIs and CLI

Instructor:

Julien Gilon, Product Manager

Objectives:

  • Plan to adapt to API v2
  • Utilize Collibra REST API
  • Develop using Collibra Command Line Interface (CLI)

Description:

We are excited to introduce the new site, Developer.collibra.com, a single source of truth for your Collibra development. It offers tutorials, API documentation and updates covering Collibra development capabilities. We’ll also review documentation for automatically generating a REST client from the Collibra REST API. In addition, you can now automate the creation and upload of workflows from the development environment. Discovery of API v1 usage is also automated, allowing for faster conversion to API v2. Finally, the Collibra Command Line Interface (CLI) is a lightweight application that lets you bootstrap and manage your projects.
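As a minimal sketch of the kind of call a generated or hand-written REST client might make, the snippet below composes an asset-search URL. The instance host name, endpoint path, and parameter names here are assumptions for illustration, not taken from the Collibra API reference; consult developer.collibra.com for the actual endpoints.

```python
from urllib.parse import urlencode

def asset_search_url(base_url: str, name: str, limit: int = 10) -> str:
    """Compose a v2-style asset search URL.

    The /assets path and the name/limit parameters are illustrative
    assumptions; check the official API reference for real signatures.
    """
    return f"{base_url}/assets?{urlencode({'name': name, 'limit': limit})}"

url = asset_search_url("https://your-instance.collibra.com/rest/2.0", "Customer")
print(url)  # https://your-instance.collibra.com/rest/2.0/assets?name=Customer&limit=10

# Sending the request would require a live instance and credentials, e.g.:
# resp = requests.get(url, auth=("username", "password"))
```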

Report Certification and Watermarking

Instructor:

Tudor Borlea, Presales Engineer

Objectives:

  • Apply report certification process
  • Identify critical data elements
  • Select standards to apply to critical data elements

Description:

Report certification and watermarking addresses how to determine which reports can be used as trusted documents. We begin by identifying the critical data elements. The next step is to identify the lineage and the standards that need to be applied to each of the critical data elements. The third level of certification monitors the data quality of each critical data element. If, at the end of the process, the certification is approved, the report carries a seal or stamp of approval. This mark indicates to business users that the report is safe to use for decision making. A watermarked report indicates curated data, which helps create a culture of governance from the ground up. Empowered users recognize the value of working through formal channels to publish reports and trust the data.

Data Governance Best Practices

Instructor:

Kash Mehdi, Customer Success Manager

Objectives:

  • Implement the Operating Model
  • Identify data domains
  • Identify critical data elements within domains

Description:

We are going to review the four best practices for launching a Data Governance program. The first step is to focus on the operating model. This is a key element of your Data Governance journey, so in this phase you are going to define roles and responsibilities. The next step is the identification of data domains. This involves managing key information assets through the Collibra Platform, where you define domains, glossaries, dictionaries, the data catalog, and the business processes around managing customer information. The third step is identifying the critical data elements within your domains. The fourth and final step is defining control measurements. Now that you have defined the structure and domains and identified what is really critical, you arrive at control measurement, which includes some key activities.

Data Quality Configurations and Dashboards

Instructor: 

Joyce Snelders, Manager, Analytics and Cognitive, Deloitte Consulting LLP

Objectives:

  • Interpret data quality components within Collibra Platform
  • Determine an aggregation path in the data quality rule configuration
  • Explain a data quality dashboard for a single asset

Description:

This is the second course in a series based on the use case of a retail distribution company using Collibra to cleanse and monitor the quality of their data. In this course we will perform a one-time configuration per asset type to ingest data quality scores into Collibra and view how a score has evolved over time. We will create a community and a rulebook domain, and then configure the data quality rules in Collibra Platform. Once all configurations are complete, we will link each data quality rule to the asset types we have created for it. Lastly, we will discuss how to create a data quality dashboard for a single asset.

 

Interpreting Data Quality in Collibra Platform

Instructors:

  • Joyce Snelders, Manager, Analytics and Cognitive, Deloitte Consulting LLP
  • Darshana Galande, Senior Consultant, Deloitte Consulting LLP

Objectives:

  • Appraise data quality for business purposes
  • Assess relationships between assets and metrics
  • Support accountability through data quality dashboards

Description:

This course is the first in a series based on a retail distribution company’s need to cleanse and monitor the quality of their data. This course will help business users trust their data by determining what data quality is and explaining data quality dimensions such as timeliness, completeness and accuracy. We will highlight the relationships between a business term, a business rule, a data quality rule and a data quality metric. Additionally, we will explain how data quality rules, metrics and dimensions are aggregated and assigned to an asset. To conclude, we will show you examples of a data quality dashboard for a single asset, giving you an idea of how it will look in your organization.
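One simple way rule scores could roll up into a single asset-level score is a weighted average. The weighting scheme below is our illustration of the aggregation idea, not Collibra's actual calculation:

```python
def aggregate_scores(rule_scores, weights=None):
    """Aggregate individual data quality rule scores (0-100) into one score.

    Uses a weighted average with equal weights by default. This scheme is
    illustrative only; the platform's real aggregation may differ.
    """
    if weights is None:
        weights = [1] * len(rule_scores)
    total = sum(weights)
    return sum(s * w for s, w in zip(rule_scores, weights)) / total

# e.g. completeness 95, accuracy 80, timeliness 70
print(aggregate_scores([95, 80, 70]))  # equal-weight average, about 81.7
```

Weighting lets a dimension that matters more to the business (say, accuracy) dominate the combined score without hiding the individual metrics.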

Using Dashboards and Data Lineage to Derive Insights from Data

Instructor: 

Joyce Snelders, Manager, Analytics and Cognitive, Deloitte Consulting LLP

Objectives:

  • Select relationships and add characteristics to data quality rules and dimensions
  • Determine accumulated quality dashboards for multiple assets
  • Assess data impact across assets

Description:

This is the final course based on the use case of the retail distribution company using Collibra to cleanse and monitor the quality of their data. In this course we’re going to create data quality dashboards that accumulate scores for a table or for a certain domain. Following that we will look at the full life cycle of the data quality score through end-to-end lineage on a traceability diagram. You have created an aggregation path, so now you can see the relationships between your scores, the rule itself, and the columns to which the rule applies. This enables a business user or a stakeholder to look at the business terms they are responsible for, creating accountability and giving them insight into the status of the data quality score. This course will also discuss importing data quality rules and metrics.
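Assessing data impact across assets via lineage is, at its core, a reachability question over a graph. A toy sketch of that idea follows; the asset names and edges are invented for illustration and do not come from the course:

```python
from collections import deque

# Illustrative lineage edges: asset -> downstream assets (names are made up)
LINEAGE = {
    "src.orders": ["stg.orders"],
    "stg.orders": ["dm.sales", "dm.returns"],
    "dm.sales": ["report.revenue"],
}

def downstream(asset):
    """All assets reachable from `asset`: everything a quality issue could impact."""
    seen, queue = set(), deque([asset])
    while queue:
        for nxt in LINEAGE.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(downstream("src.orders")))
# ['dm.returns', 'dm.sales', 'report.revenue', 'stg.orders']
```

A traceability diagram renders the same reachability visually, so a stakeholder can see at a glance which reports inherit a failing score.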

Collibra Data Office: Data as a Strategic Asset

Instructor:

Lisa Kaiser, Operations, Product & Engineering

Objectives:

  • Explain how a data office can unlock value from data
  • Compare the data office key collaborations and partnerships
  • Assess the steps needed to create a data office

Description:

Over the last decade, we have seen countless organizations strive to become data-driven. Key to this mission is the data office. Led by the Chief Data Officer, this department champions data as a strategic, competitive asset by increasing the company’s data intelligence. We’ll begin by discussing how a data office can help unlock value from data. Next, we’ll cover the various organizational approaches to a data office and the key partnerships and collaborations it must have within the organization. Finally, we’ll review some examples of different data offices, including the Collibra Data Office.

Collibra Data Office: Data Products

Instructor:

Alexandre t’Kint, Data Scientist

Objectives:

  • Examine what a data product is
  • Analyze how a data product provides value 
  • Distinguish the steps needed to create a data product

Description:

At Collibra, we see a data product as the output of a data science activity that creates actionable insights from big data. Data products are valuable, information-enriched solutions that can take different forms, such as data assets, virtual data assets, SQL queries, reports, and dashboards. This course will discuss what a data product is and the ways to create and derive value from one, including the necessary approval steps.

SysAdmin: Workflow Configurations v5.x

Instructor:

Kristof Depypere, Chief Architect

Objectives:

  • Test for deploying a new workflow
  • Examine configurations for workflows
  • Classify roles to participate in workflows

Description:

Learn everything about how to deploy and configure a new workflow in Data Governance Center. To deploy a new workflow, begin on the Settings page and navigate to Workflows. Select the upload button to choose the files you want to deploy. Every workflow is defined in a .bpmn file, so to deploy a new one, select the file from your list and upload it; the new workflow will appear in the list underneath. As a last step, enable the workflow so that it becomes available to all users.
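A .bpmn file is a BPMN 2.0 XML document, so a quick sanity check before uploading is to confirm the root element really is a BPMN `definitions` element. The check below is our own illustration, not part of the product; the namespace is the standard BPMN 2.0 model namespace:

```python
import xml.etree.ElementTree as ET

# Standard BPMN 2.0 model namespace from the OMG specification
BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

def is_bpmn_definition(xml_text: str) -> bool:
    """Return True if the XML root is a BPMN 2.0 <definitions> element."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return root.tag == f"{{{BPMN_NS}}}definitions"

sample = f'<definitions xmlns="{BPMN_NS}"><process id="approval"/></definitions>'
print(is_bpmn_definition(sample))       # True
print(is_bpmn_definition("<html/>"))    # False
```

Catching a malformed or non-BPMN file locally avoids a failed upload in the Workflows settings page.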

Data Governance Strategy – Data Governance Principles and Policies

Instructor:

Lowell Fryman, Customer Success Practice Principal

Objectives:

  • Identify data governance principles and policies
  • Apply principles to develop roadmap
  • Utilize policies for roles and responsibilities

Description:

A key aspect of creating Data Governance principles is to make them short and sweet so everyone can remember them! The principles establish the data culture, drive desired outcomes and form the basis of the Data Governance roadmap. Data Governance policies can then be developed to cover multiple principles and are used to measure Data Governance compliance. The policies outline the What, the Why and the Who, and support the operating procedures and standards. Putting both Data Governance principles and policies into practice provides your team with the guidance and knowledge necessary to succeed.
