
Moto_Crawler

My CodersLab final project: a web scraper for car-selling websites, built with Scrapy and Celery, with a Django RESTful API back end and a ReactJS front end.
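
For a feel of how the crawling side works, here is a minimal Scrapy spider sketch. It is illustrative only: the spider name, start URL, and CSS selectors are placeholder assumptions, not MotoCrawler's actual code.

```python
import scrapy


class OfferSpider(scrapy.Spider):
    """Illustrative spider: crawls a car-selling portal's listing pages
    and yields one item per offer. All names and selectors are placeholders."""

    name = "offer_spider"
    start_urls = ["https://example.com/used-cars"]

    def parse(self, response):
        # Each listing card becomes a dict that a downstream item pipeline
        # can persist to the PostgreSQL database.
        for card in response.css("article.offer"):
            yield {
                "title": card.css("h2::text").get(),
                "price": card.css(".price::text").get(),
                "url": response.urljoin(card.css("a::attr(href)").get()),
            }
        # Follow pagination, if the portal exposes a "next" link.
        next_page = response.css("a.next::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```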

Screenshots: Home page, Login page, car card close-up, and Favourites view.

Video Presentation

YoutubeLink <-- Click!

Prerequisites

You will find a requirements.txt file in the project's root. Main technologies used:

  • Django (Python web framework)
  • Scrapy (Python scraping framework)
  • React.js (JavaScript front-end library)
  • Axios (for data transfer between the back end and the front end; see the API sketch below)
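
As a rough picture of what Axios talks to, below is a minimal Django REST Framework sketch of an endpoint serving scraped offers. The CarOffer model, field names, and route are assumptions for illustration, not the project's actual API.

```python
from rest_framework import routers, serializers, viewsets

from .models import CarOffer  # hypothetical model (see the model sketch further down)


class CarOfferSerializer(serializers.ModelSerializer):
    """Serializes scraped offers into JSON for the React front end."""

    class Meta:
        model = CarOffer
        fields = ["id", "title", "price", "url"]


class CarOfferViewSet(viewsets.ReadOnlyModelViewSet):
    """Read-only endpoint: offers are written by the scraper, not by clients."""

    queryset = CarOffer.objects.all()
    serializer_class = CarOfferSerializer


router = routers.DefaultRouter()
router.register(r"offers", CarOfferViewSet)
# The React app would then GET something like /api/offers/ via Axios
# (the exact prefix depends on the project's urls.py).
```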

Installation

To run locally, I recommend using a virtual environment; mine lives in the root, in the same directory as docs/, MotoCrawler/, and react_frontend/.

  • First, run pip install -r requirements.txt.
  • Then, go to core/settings and fill in the necessary data there (I removed the sensitive data for security reasons).
  • For the front end, you will need npm, core.js, React, Material-UI (including the icons package), and axios.
  • Be sure to run migrations before you try to fire off the Django project; the model sketch below shows the kind of tables they create.
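
For reference, migrations create tables for the project's Django models along these lines. The CarOffer model below is a hypothetical sketch of a scraped offer, not MotoCrawler's actual schema.

```python
from django.db import models


class CarOffer(models.Model):
    """Hypothetical shape of a scraped offer stored in PostgreSQL."""

    title = models.CharField(max_length=255)
    price = models.DecimalField(max_digits=12, decimal_places=2, null=True, blank=True)
    url = models.URLField(unique=True)
    scraped_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.title
```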

License

This project is released under the standard GPL-3 license.

Developer's journal

I am currently working on the documentation, so this README will also undergo a major overhaul. If you are interested in the whole development process of MotoCrawler, have a read through the Developer's Journal here: MotoCrawler's Dev Journal

Documentation, How-To?

MotoCrawler's documentation is hosted on GitHub Pages, here:

View MotoCrawler's documentation

Contact

Drop me a message on LinkedIn
