Online Course

[L3-DE] Productionizing Data Pipelines - Online

You have created a data pipeline with KNIME Analytics Platform. But how do you put it into production and make the data available to end users? In this course, we will show you how to use KNIME Software to test and deploy a data transformation workflow, automate its deployment, and enable subsequent data monitoring and maintenance.

We will consider a use case: creating a data pipeline to manage order data for a restaurant franchise that receives data from various branches. We will demonstrate how to deploy the data transformation workflow manually or automatically, and how to schedule and trigger the execution of data pipelines in a production environment.

In the first session of this course, you will learn how to prepare a data transformation workflow for deployment. In the second session, you will be introduced to KNIME Business Hub and will learn how to deploy a data pipeline as a scheduled or triggered execution. In the third session, you will learn about the two types of data pipelines, ETL and ELT, and how to use the Continuous Deployment for Data Science (CDDS) extension framework to enable automated deployment on KNIME Business Hub. Finally, in the fourth session, you will learn best practices for productionizing data pipelines: the principles of data governance (quality, security, and cataloging), orchestration, and performance optimization.

This is an instructor-led course consisting of four 75-minute online sessions run by our KNIME data scientists. Each session has an exercise for you to complete at home, and we will go through the solution together at the start of the following session. The course concludes with a 15 to 30-minute wrap-up session.

Course Content

  • Session 1: Preparing a Data Pipeline for Deployment
  • Session 2: Introduction to KNIME Business Hub
  • Session 3: ETL and ELT; Data Pipeline Validation and Deployment Automation
  • Session 4: Best Practices when Productionizing Data Pipelines
  • Session 5: Optional Follow-up Q&A (15-30 min)


I don’t see the course when I click on “Register now.” How can I register for the course?

You first need to create an account on the KNIME Learning Store. Once you are logged in to the KNIME Learning Store, clicking the “Register now” button will take you to the course web page.

What level of KNIME experience is needed for this course?

You should already know how to build workflows, access databases and files, and use flow variables and components in KNIME Analytics Platform. We recommend taking the L1-DW and L2-DW courses before attending this course.

How do I join the course?

You can join the course using the Zoom links found on your LearnUpon course page. You will also receive an email with the Zoom link one day prior to each session. Please note that each Zoom link is specific to a particular session. Make sure you have a stable internet connection!

What if I miss a session? Will I be able to watch a replay?

Sure! The sessions will be recorded, and you will have access to each recording for one month after the session ends.

Will I be able to ask questions?

Absolutely - fire away!

What do I need to have?

Your own laptop, ideally with the latest version of KNIME Analytics Platform and the required extensions pre-installed (details will follow in the reminder email).

Where do I find the latest version of KNIME Analytics Platform?

Download the latest free, open source version of KNIME Analytics Platform here:

Do I need KNIME Business Hub to join this course?

You will be granted temporary access to KNIME Business Hub during this course to work on the exercises. The credentials to access KNIME Business Hub will be given to you on the first day of the course. You do not need to use your organization’s KNIME Business Hub for this course.
