
Hi there, I'm Jarrod 👋

  • I'm currently working on the data analytics side, but I've been intrigued by Data Engineering lately. I'm always excited to learn cool new stuff, so this is where I keep my personal projects.

Certifications 📚:

AWS Certified Solutions Architect - Associate

Projects:

  • 👨‍⚕️ FDA-Adverse-Events-Pipeline 💊

    • 🏥 Fun project looking at adverse events reported to the FDA, focusing on statin drugs and potentially associated effects (a minimal sketch of the dbt + Airflow orchestration follows this list).
      • Tech used: dbt, Airflow, Snowflake, AWS S3, Astronomer Cosmos, Python, SQL
  • 🎮 Video-Game-PySpark 🎮

    • Project to practice PySpark and Athena skills, specifically looking at video game data using the RAWG API.
      • Tech used: PySpark, Amazon Athena, AWS S3, Jupyter Notebooks
  • 📜 Planning-Center-Data-Pipeline 😄

    • This one actually gets used every month to support a data analytics project in my own personal community. This project started with automating some repetitive monthly tasks with Python and quickly evolved into a mini-DE project.
      • Tech used: Airflow, Docker, AWS S3, Python, Beautiful Soup (web scraping), and Google Apps API
  • 🧑‍✈️ OpenSky-Flight-Data-Pipeline (In Progress...) 🌏

    • Planning a trip to Japan 🗾, so I wanted to do a near-real-time analysis of flights between the US and Japan using the OpenSky API. I'm thinking of experimenting with Kafka / streaming on this one (more to come; a rough sketch of the idea is below this list).
  • 🌱 I'm currently learning:

    • Exploring Modern Data Tools / Platforms (dbt, Airflow, Databricks, Snowflake)
    • Everything cloud. There's always more to learn here; AWS is what I know best, but I'd like to explore Azure and GCP more as well
    • Evolving Data Architecture best practices
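
For a flavor of how the FDA pipeline hangs together, here's a minimal sketch of orchestrating a dbt project against Snowflake with Astronomer Cosmos on Airflow. The project path, connection id, schedule, and database/schema names below are placeholders, not the repo's actual values.

```python
# Minimal sketch: run a dbt project on Airflow via Astronomer Cosmos.
# Paths, connection id, schedule, and database/schema names are placeholders.
from datetime import datetime

from cosmos import DbtDag, ProfileConfig, ProjectConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

profile_config = ProfileConfig(
    profile_name="fda_adverse_events",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",  # Airflow connection holding Snowflake credentials
        profile_args={"database": "FDA", "schema": "ADVERSE_EVENTS"},
    ),
)

# Cosmos expands each model in the dbt project into its own Airflow task.
fda_adverse_events_dbt = DbtDag(
    dag_id="fda_adverse_events_dbt",
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt/fda_adverse_events"),
    profile_config=profile_config,
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
```

In the real project, the extract-and-stage-to-S3 work runs ahead of the dbt step; this sketch only covers the Cosmos piece.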
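And for the OpenSky idea, a first-pass sketch of what the streaming side could look like: poll the OpenSky REST API for state vectors over a bounding box and publish them to a Kafka topic. The broker address, topic name, and bounding box are placeholders, and this is just an experiment I'm considering, not something built yet.

```python
# First-pass sketch: poll OpenSky state vectors and publish them to Kafka.
# Broker address, topic name, and the bounding box are placeholders.
import json
import time

import requests
from kafka import KafkaProducer  # kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Rough bounding box around Japan (lat/lon in degrees).
PARAMS = {"lamin": 24, "lomin": 122, "lamax": 46, "lomax": 154}

while True:
    resp = requests.get(
        "https://opensky-network.org/api/states/all", params=PARAMS, timeout=30
    )
    resp.raise_for_status()
    for state in resp.json().get("states") or []:
        producer.send("opensky.states", state)  # one message per aircraft state vector
    producer.flush()
    time.sleep(60)  # stay well within OpenSky's anonymous rate limits
```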

Always looking to connect with folks who enjoy similar stuff - to discuss new opportunities, challenges, or whatever! 👍

Email: jarrod.wadej@gmail.com

https://www.linkedin.com/in/jarrod-wade/

Pinned

  1. fda-adverse-events-pipeline

    Data Engineering pipeline to extract drug adverse event data, stage it in AWS S3, transform it with dbt, and house it in Snowflake for further analysis. All orchestrated with Astronomer Cosmos-managed Airflow.

    Python

  2. planning-center-data-pipeline

    Data pipeline to extract Planning Center Online API data, perform validations with web scraping, and upload to S3 and Google Sheets for further analysis. All orchestrated with Airflow inside a Docker image (a simplified sketch of this flow is just below).

    Python
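
For the planning-center-data-pipeline, here's a simplified sketch of the monthly extract → validate → upload flow using Airflow's TaskFlow API. The endpoint, credentials, CSS selector, record shape, and bucket name are placeholders rather than the project's real values.

```python
# Simplified sketch of the monthly extract -> validate -> upload flow.
# Endpoint, credentials, selector, record shape, and bucket name are placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@monthly", start_date=datetime(2024, 1, 1), catchup=False)
def planning_center_pipeline():
    @task
    def extract() -> list:
        # Pull records from the Planning Center Online API (basic auth assumed).
        import requests

        resp = requests.get(
            "https://api.planningcenteronline.com/people/v2/people",
            auth=("APP_ID", "SECRET"),  # placeholder credentials
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["data"]

    @task
    def validate(records: list) -> list:
        # Cross-check a public roster page with Beautiful Soup before loading.
        import requests
        from bs4 import BeautifulSoup

        page = requests.get("https://example.org/roster", timeout=30)  # placeholder URL
        soup = BeautifulSoup(page.text, "html.parser")
        names_on_page = {tag.get_text(strip=True) for tag in soup.select(".name")}
        return [r for r in records if r["attributes"]["name"] in names_on_page]

    @task
    def upload(records: list) -> None:
        # Land the validated records in S3 as JSON; a similar task writes to Google Sheets.
        import json

        import boto3

        boto3.client("s3").put_object(
            Bucket="planning-center-data",  # placeholder bucket
            Key="monthly/records.json",
            Body=json.dumps(records),
        )

    upload(validate(extract()))


planning_center_pipeline()
```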