Business Intelligence and Data Warehousing Project
Updated Dec 4, 2019 - TSQL
Syracuse University, Masters of Applied Data Science - IST 722 Data Warehouse
Designed and implemented a complete data warehouse solution: defined the data architecture, built multi-layer (Bronze, Silver, Gold) ETL pipelines, and modeled star schemas in SQL scripts to transform and load data from multiple CRM and ERP systems, with the final data visualized in Tableau.
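To illustrate the star schema modeling, here is a minimal T-SQL sketch of what a Gold-layer dimension and fact table plus a Silver-to-Gold load could look like. All schema, table, and column names (gold.DimCustomer, gold.FactSales, silver.crm_cust_info, silver.erp_loc) are hypothetical and not taken from the project.

```sql
-- Hypothetical Gold-layer star schema: one dimension and one fact table.
-- Assumes the gold/silver schemas and the silver staging tables already exist.
CREATE TABLE gold.DimCustomer (
    CustomerKey   INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    CustomerID    NVARCHAR(20)  NOT NULL,         -- business key from CRM
    CustomerName  NVARCHAR(100) NULL,
    Country       NVARCHAR(50)  NULL              -- enriched from ERP
);

CREATE TABLE gold.FactSales (
    SalesKey     INT IDENTITY(1,1) PRIMARY KEY,
    CustomerKey  INT NOT NULL REFERENCES gold.DimCustomer (CustomerKey),
    OrderDateKey INT NOT NULL,                    -- FK to a date dimension
    Quantity     INT NOT NULL,
    SalesAmount  DECIMAL(18,2) NOT NULL
);

-- Silver -> Gold load: conform CRM and ERP rows into the customer dimension.
INSERT INTO gold.DimCustomer (CustomerID, CustomerName, Country)
SELECT s.cst_id, s.cst_name, e.cntry
FROM   silver.crm_cust_info AS s
LEFT JOIN silver.erp_loc AS e ON e.cst_id = s.cst_id
WHERE  NOT EXISTS (SELECT 1 FROM gold.DimCustomer d WHERE d.CustomerID = s.cst_id);
```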
Data Lakehouse course final project (5th semester). This project implements an ETL (Extract, Transform, Load) pipeline using Pentaho Data Integration (Kettle) to build a data warehouse focused on new student admissions data from three sources.
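The consolidation itself is done in Kettle transformations; as a rough sketch, the equivalent SQL of merging three sources into one warehouse table might look like the following, with all table and column names (stg.admissions_source_a/b/c, dw.FactAdmission) being hypothetical.

```sql
-- Illustrative only: union three hypothetical admissions staging tables
-- into a single warehouse fact table, tagging each row with its source.
INSERT INTO dw.FactAdmission (ApplicantID, ProgramCode, AdmissionDate, SourceSystem)
SELECT applicant_id, program_code, admission_date, 'source_a' FROM stg.admissions_source_a
UNION ALL
SELECT applicant_id, program_code, admission_date, 'source_b' FROM stg.admissions_source_b
UNION ALL
SELECT applicant_id, program_code, admission_date, 'source_c' FROM stg.admissions_source_c;
```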
An ETL system built with Pentaho to transfer multiple data tables between servers in a single execution. Alongside each table transfer, the system can perform additional tasks, such as generating a file and sending it by email and/or SFTP.
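For comparison, a cross-server transfer plus an email notification could be sketched in T-SQL as below. The linked server name, tables, mail profile, and recipient are hypothetical; the actual project orchestrates these steps in Pentaho jobs, including the SFTP upload, which T-SQL does not cover.

```sql
-- Illustrative only: copy a table from a hypothetical linked server,
-- then send a notification via SQL Server Database Mail.
INSERT INTO dbo.SalesOrders_Copy (OrderID, CustomerID, OrderDate, Amount)
SELECT OrderID, CustomerID, OrderDate, Amount
FROM   [SourceServer].SalesDb.dbo.SalesOrders;

EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'EtlNotifications',          -- hypothetical mail profile
     @recipients   = 'dw-team@example.com',
     @subject      = 'Table transfer completed',
     @body         = 'SalesOrders copied from SourceServer in this execution.';
```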
An ETL system that cleans up the FTP folder by removing unused files or folders whose names contain dates matching the date ranges and date patterns configured in a parameter table.
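A rough sketch of the parameter-driven part of that logic is shown below: a hypothetical parameter table and a query that expands each configured date range into the concrete date-stamped file names to remove. The actual deletion from the FTP folder is performed by the ETL job itself, not by T-SQL, and the fixed yyyyMMdd pattern here is an assumption.

```sql
-- Illustrative only: hypothetical cleanup parameter table.
CREATE TABLE dbo.CleanupParameter (
    FilePrefix  NVARCHAR(50),   -- e.g. 'daily_report_'
    DateFormat  NVARCHAR(20),   -- date pattern; this sketch assumes 'yyyyMMdd'
    StartDate   DATE,
    EndDate     DATE
);

-- Expand each parameter row into the file names whose embedded dates
-- fall inside the configured range.
;WITH Dates AS (
    SELECT p.FilePrefix, p.StartDate AS D, p.EndDate
    FROM   dbo.CleanupParameter AS p
    UNION ALL
    SELECT FilePrefix, DATEADD(DAY, 1, D), EndDate
    FROM   Dates
    WHERE  D < EndDate
)
SELECT FilePrefix + CONVERT(CHAR(8), D, 112) + '.csv' AS FileNameToDelete
FROM   Dates
OPTION (MAXRECURSION 366);   -- allow up to a year of daily dates
```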