Source Systems, Data Ingestion, and Pipelines

This course is part of the DeepLearning.AI Data Engineering Professional Certificate

Instructor: Joe Reis

What you'll learn

  •   Gather stakeholder needs and translate them into system requirements.
  •   Implement a batch and a streaming ingestion process on AWS to ingest data from various source systems.
  •   Integrate aspects of security, data management, DataOps, and orchestration into the data systems you build.
Skills you'll gain

  •   Apache Airflow
  •   NoSQL
  •   Amazon CloudWatch
  •   Terraform
  •   Infrastructure as Code (IaC)
  •   CI/CD
  •   Data Pipelines
  •   Data Quality
  •   Extract, Transform, Load
  •   Data Integration
  •   Amazon Web Services
  •   Data Processing
There are 4 modules in this course

    In this course, you will explore various types of source systems, learn how they generate and update data, and troubleshoot common issues you might encounter when trying to connect to these systems in the real world. You’ll dive into the details of common ingestion patterns and implement batch and streaming pipelines. You’ll automate and orchestrate your data pipelines using infrastructure as code and pipelines as code tools. You’ll also explore AWS and open source tools for monitoring your data systems and data quality.
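The batch ingestion pattern mentioned above can be sketched in plain Python. This is a minimal, library-free illustration of the extract-transform-load flow, not the AWS-based implementation the course builds; the source table, the high-water-mark column, and the helper names (`extract_batch`, `transform`, `load`) are all hypothetical:

```python
import sqlite3

def extract_batch(conn, since_id):
    # Incremental batch extract: pull only rows newer than the last
    # high-water mark so repeated runs don't re-ingest old data.
    cur = conn.execute(
        "SELECT id, name, amount FROM orders WHERE id > ? ORDER BY id",
        (since_id,),
    )
    return cur.fetchall()

def transform(rows):
    # Light in-flight transform: normalize names, convert cents to dollars.
    return [(rid, name.strip().lower(), cents / 100) for rid, name, cents in rows]

def load(target, records):
    # Append the transformed records to the (stand-in) destination.
    target.extend(records)

# Hypothetical source system: an in-memory SQLite table standing in
# for a transactional database you would connect to in practice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, name TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, " Alice ", 1250), (2, "Bob", 300), (3, "Cara", 990)],
)

warehouse = []                             # stand-in for the target store
rows = extract_batch(conn, since_id=1)     # resume after the last ingested id
load(warehouse, transform(rows))
print(warehouse)  # [(2, 'bob', 3.0), (3, 'cara', 9.9)]
```

A streaming pipeline differs mainly in that records arrive continuously and are processed per event or in micro-batches rather than on a schedule; the extract step would read from a stream (e.g., a Kinesis or Kafka consumer) instead of querying a table with a watermark.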

    Data Ingestion

    DataOps

    Orchestration, Monitoring, and Automating Your Data Pipelines

