A Pub/Sub-for-Tables data integration platform to discover, publish, modify, and consume data effortlessly.
Updated Dec 2, 2025 - Python
An end-to-end data pipeline that extracts Divvy bikeshare data from the web, loads it into a data lake and data warehouse, transforms it with dbt, and visualizes it in a Looker Studio dashboard; the pipeline is orchestrated with Prefect.
A dbt-like modeling tool that uses SQLAlchemy Core with a DataFrame-style interface.
This Repo contains all study, lab and supportive materials for Udemy course on "Google Cloud Professional Data Engineer - A Complete Guide".
💻💛 Fundamental Data Engineering Course 2024, Week 4: learn dbt, transform data with models and macros, and build an ELT pipeline with Dagster 🌎
🌄📈📉 A Data Engineering Project 🌈 implementing an ELT data pipeline with Dagster, Docker, dbt, Polars, Snowflake, and PostgreSQL, using data sourced from Kaggle 🔥
🛸 This project showcases an Extract, Load, Transform (ELT) pipeline built with Python, Apache Spark, Delta Lake, and Docker. The objective of the project is to scrape UFO sighting data from NUFORC and process it through the Medallion architecture to create a star schema in the Gold layer that is ready for analysis.
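The Medallion layering described above (raw Bronze → cleaned Silver → analytics-ready Gold) can be sketched in plain Python. This is an illustrative stand-in for the Spark/Delta Lake implementation; the field names and example records are hypothetical, not the repo's actual schema.

```python
# Medallion-architecture sketch: Bronze ingests raw records as-is,
# Silver cleans and normalizes them, Gold aggregates for analysis.

def bronze(raw_rows):
    # Bronze layer: land raw data untouched
    return list(raw_rows)

def silver(bronze_rows):
    # Silver layer: drop malformed records, normalize fields
    return [
        {"state": r["state"].strip().upper(), "shape": r.get("shape", "unknown")}
        for r in bronze_rows
        if r.get("state")
    ]

def gold(silver_rows):
    # Gold layer: aggregate sightings per state, ready for a fact table
    counts = {}
    for r in silver_rows:
        counts[r["state"]] = counts.get(r["state"], 0) + 1
    return counts

raw = [{"state": " tx ", "shape": "disk"}, {"state": "TX"}, {"state": None}]
print(gold(silver(bronze(raw))))  # {'TX': 2}
```

In the real project each layer would be a Delta table written by Spark jobs; the point here is only the progressive refinement from raw to analysis-ready data.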
🍺 A data engineering project showcasing an ELT pipeline built with modern technologies such as Delta-rs and Apache Airflow.
An ELT data pipeline that tracks the activity of an e-commerce website across orders, reviews, deliveries, and shipment dates, built with technologies such as Airflow, AWS RDS for PostgreSQL, and Python.
ETL process for BGG cloud data warehouse
An ELT pipeline that orchestrates dbt transformations on Snowflake with Apache Airflow.
Enterprise ELT Framework using Airbyte, dbt, Prefect, and Power BI for seamless data extraction, transformation, and visualization. This project showcases a scalable pipeline integrating SQL Server, GCP, and tabular models in Power BI for real-time analytics and business intelligence. Ideal for data engineers and analysts seeking efficient ETL/ELT.
This repository demonstrates an end-to-end ELT (Extract, Load, Transform) pipeline that extracts data from a source PostgreSQL database, loads it into a destination PostgreSQL database, and performs data transformations using dbt (Data Build Tool).
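The extract-load-transform split described above can be sketched with the stdlib `sqlite3` module standing in for the two PostgreSQL databases, and a SQL view standing in for a dbt model. Table and column names here are hypothetical.

```python
import sqlite3

# ELT sketch: Extract from a "source" DB, Load raw rows into a
# "destination" DB, then Transform in-database with SQL (dbt-style).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")

# Extract + Load: move rows across untransformed
rows = src.execute("SELECT id, amount FROM orders").fetchall()
dst.executemany("INSERT INTO raw_orders VALUES (?, ?)", rows)

# Transform: a view plays the role of a dbt model over the raw table
dst.execute(
    "CREATE VIEW order_totals AS "
    "SELECT COUNT(*) AS n, SUM(amount) AS total FROM raw_orders"
)
print(dst.execute("SELECT n, total FROM order_totals").fetchone())  # (2, 35.5)
```

The key ELT idea is that rows land in the destination unchanged; transformation happens afterward inside the warehouse, which is exactly the niche dbt fills.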
Custom ELT pipeline for scraping job listings from 'Welcome to the Jungle' (France), transforming and cleaning the data, and visualizing it for job market analysis.
Data Engineering project which involves ETL using PostgreSQL and Python
End-to-end ELT pipeline for Jumia laptop data using Python, PostgreSQL, Airflow & Docker.
Modern ELT with Snowflake, dbt, and Star Schema for E-Commerce
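A star schema like the one mentioned above joins a central fact table to dimension tables on surrogate keys. A minimal stdlib `sqlite3` sketch follows; the table and column names are illustrative, not the project's actual model.

```python
import sqlite3

# Star-schema sketch: one fact table (fact_sales) surrounded by a
# dimension table (dim_product), joined on a surrogate key.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER
);
INSERT INTO dim_product VALUES (1, 'laptop'), (2, 'phone');
INSERT INTO fact_sales VALUES (1, 3), (2, 5), (1, 2);
""")

# Typical analytics query: join fact to dimension, then aggregate
result = con.execute("""
    SELECT p.name, SUM(f.qty)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(result)  # [('laptop', 5), ('phone', 5)]
```

In a Snowflake/dbt setup the dimensions and fact would each be dbt models; the shape of the join-then-aggregate query is the same.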
End-to-end data pipeline that extracts Amazon product data via API, transforms it with dbt in Snowflake, and identifies market opportunities through Power BI analytics
This project was created as part of an assessment for DigitalXC AI. It demonstrates a cloud-based ELT pipeline using AWS MWAA, Airflow, dbt, PostgreSQL, and Superset. The pipeline automates data ingestion from S3, transformation with dbt, and visualization through Superset, following modern data engineering practices on a scalable AWS architecture.