Collibra Catalog

Instructor:

Peter Princen, Product Manager

Objectives:

  • Explain approved data sets
  • Summarize ingestion of data
  • Demonstrate shopping for data

Description:

Learn everything about how to use the Collibra Catalog, which helps you find, understand, and trust the data you need. Collibra Catalog is a trusted, single source of intelligence for data experts and other data citizens who need quick access to enterprise data. By cataloging approved and trusted data sets and making them easily discoverable through semantic search, Collibra Catalog provides a new way for data users to find and access the data they need, evaluate its lineage, and even enrich its value.

Developer Portal, APIs and CLI

Instructor:

Julien Gilon, Product Manager

Objectives:

  • Plan to adapt to API v2
  • Utilize Collibra REST API
  • Develop using Collibra Command Line Interface (CLI)

Description:

We are excited to introduce our new site, Developer.collibra.com, which provides a single source of truth for your Collibra development. Tutorials, API documentation, and updates are available for Collibra development capabilities. We’ll also review the documentation for automatically generating a REST client with the Collibra REST API. In addition, you can now automate the creation and upload of workflows from the development environment. The discovery of API v1 is also automated, allowing for faster conversion to API v2. Finally, the Collibra Command Line Interface (CLI) is a lightweight application that lets you bootstrap and manage your projects.
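
As a taste of what the documentation and generated clients cover, here is a minimal Python sketch that lists communities through the REST API v2. The instance URL, credentials, and the exact endpoint and response shape are illustrative assumptions; check Developer.collibra.com for the authoritative reference.

    import requests

    # Placeholder instance URL and basic-auth credentials for illustration only.
    BASE_URL = "https://your-instance.collibra.com/rest/2.0"
    AUTH = ("your-username", "your-password")

    # Retrieve the first page of communities; the /communities resource and the
    # paged "results" envelope are assumptions based on the v2 conventions.
    response = requests.get(f"{BASE_URL}/communities", params={"limit": 10}, auth=AUTH)
    response.raise_for_status()

    for community in response.json().get("results", []):
        print(community["id"], community["name"])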

Collibra Catalog 5.x: Rapid Data Discovery

Instructor:

Peter Princen, Product Manager

Objectives:

  • Organize ingesting from data source
  • Experiment with Smart Catalog
  • Make use of the Data Dictionary

Description:

Learn everything about how to use the Collibra Catalog, including registering data sources such as CSV, Excel, and JDBC. Schedule a data refresh for newly registered data sources using encrypted credentials. Take advantage of Smart Catalog, which suggests recommended data sets and assets based on the browsing history of logged-in users and their peers. The Catalog also allows for shopping for data, with recommendations of data sets that may interest you. Add data sets to your shopping basket and request access from the data owners. Access the Data Dictionary to review all the schemas you have registered within the Collibra Platform and add them to existing data sets or create new ones. Finally, review the available data profiling and data sampling to assess parameters of importance and view the corresponding graphs.
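
To give a feel for how registered schemas surface in the Data Dictionary, here is a minimal Python sketch that lists the Column assets Catalog created for a registered schema via the REST API. The instance URL, credentials, domain ID, and asset type ID are placeholder assumptions, not values taken from this course.

    import requests

    BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # placeholder instance
    AUTH = ("your-username", "your-password")

    # Hypothetical IDs for illustration: the domain created when the JDBC schema was
    # registered, and the UUID of the "Column" asset type in your metamodel.
    SCHEMA_DOMAIN_ID = "00000000-0000-0000-0000-000000000000"
    COLUMN_TYPE_ID = "00000000-0000-0000-0000-000000000001"

    # List the Column assets that Catalog created for the registered schema.
    resp = requests.get(
        f"{BASE_URL}/assets",
        params={"domainId": SCHEMA_DOMAIN_ID, "typeIds": [COLUMN_TYPE_ID], "limit": 50},
        auth=AUTH,
    )
    resp.raise_for_status()

    for column in resp.json().get("results", []):
        print(column["name"])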

Collibra Catalog and Tableau Integration

Instructor:

Yulia Prylypko, Inbound Product Manager

Objectives:

  • Explain Tableau integration process
  • Show lineage of Tableau assets
  • Certify Tableau reports

Description:

We will register Tableau Server as a data source system. Registering all physical data sources in Catalog makes them more easily discoverable, because their profiles and samples are stored. Next, we will assign all Tableau Projects, Workbooks, and Views to a community. Tableau sites will be synchronized to Catalog, allowing you to view the lineage of the assets using Traceability diagrams. As a result, data sets can be combined from different sources and Tableau reports can be certified. The integration makes your Tableau data readily available in the Collibra Platform.

Collibra API: REST and Java

Instructor:

Mathisse De Strooper, Product Manager

Objectives:

  • Examine creating assets via the REST API
  • Analyze Groovy script to develop workflows
  • Test for Relation API

Description:

We will provide an introduction to the Collibra API. We have introduced API Version 2 in our new Collibra Platform 5.1 release. We will review our Java API, which can be used inside workflows to automate business processes, and our REST API, which can be used to communicate with the Collibra Platform from external clients. All operations that you can perform in the Collibra Platform on resources such as assets, domains, communities, relations, and attributes can be found in the corresponding API.
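
As a hedged illustration of the first objective, creating assets via the REST API, here is a minimal Python sketch posting a new asset to the v2 assets resource. The instance URL, credentials, and UUIDs are placeholders, and the request body shape is an assumption based on the documented add-asset operation.

    import requests

    BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # placeholder instance
    AUTH = ("your-username", "your-password")

    # Hypothetical UUIDs for illustration: the target domain and the asset type to create.
    payload = {
        "name": "Customer Lifetime Value",
        "domainId": "00000000-0000-0000-0000-000000000000",
        "typeId": "00000000-0000-0000-0000-000000000001",
    }

    # Create the asset; on success the response contains the new asset's ID.
    resp = requests.post(f"{BASE_URL}/assets", json=payload, auth=AUTH)
    resp.raise_for_status()
    print(resp.json()["id"])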

Getting Started: Assets and Asset Types

Instructor:

Ben Brendle, Presales Engineer

Objectives:

  • Explain the metamodel
  • Classify assets with asset types
  • Relate characteristics with attributes and relations

Description:

This course is designed for Data Stewards who are responsible for utilizing an organization’s data governance processes to ensure the fitness of data elements, both content and metadata. Data Stewards have a specialist role that incorporates the processes, policies, guidelines, and responsibilities for administering all of an organization’s data in compliance with policy and/or regulatory obligations. We’ll review assets and asset types, and we will also demonstrate how to create characteristics with attributes and relations.

Getting Started: Business Glossary

Instructor:

Ben Brendle, Presales Engineer

Objectives:

  • Explain Business Glossary domain
  • Summarize good definitions
  • Demonstrate importing business terms

Description:

This course is designed for Data Stewards who are responsible for utilizing an organization’s data governance processes to ensure the fitness of data elements, both content and metadata. Data Stewards have a specialist role that incorporates the processes, policies, guidelines, and responsibilities for administering all of an organization’s data in compliance with policy and/or regulatory obligations. We will start by creating business terms and evaluate what makes a good definition. We’ll also apply a round-trip export-import process to import multiple business terms from an Excel spreadsheet.
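
As an illustrative sketch of preparing such a spreadsheet, the short Python example below writes two business terms to an Excel file. The column headers are assumptions; in practice you would match them to the headers produced by your own Collibra export during the round trip.

    import pandas as pd  # writing .xlsx files also requires the openpyxl package

    # Illustrative rows; the "Name" and "Definition" headers are assumptions and should
    # be aligned with the headers of the spreadsheet exported from Collibra.
    terms = pd.DataFrame(
        [
            {"Name": "Customer", "Definition": "A person or organization that purchases goods or services."},
            {"Name": "Churn Rate", "Definition": "The percentage of customers lost over a given period."},
        ]
    )

    # Write the spreadsheet that will be re-imported into the Business Glossary domain.
    terms.to_excel("business_terms.xlsx", index=False)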

Getting Started: Data Dictionary and Reference Data

Instructor:

Nandini Rajagopalan, Solution Engineer

Objectives:

  • Explain registering a data source with Data Dictionary
  • Relate physical data dictionary to schemas, tables and columns
  • Outline logical data models, data entities and data attributes with Reference Data

Description:

This course is designed for Technical Stewards who are involved in daily data-related decisions, execute business decisions, and implement business requirements in a technology platform. We’ll review the following topics: using the Data Dictionary to register a data source, viewing physical data models including schemas, tables, and columns, examining data profiling, and identifying logical data models using reference data.
