A smart crawler that collects all USP disciplines and their prerequisites, grouped by department.
| ❗ | Note: An operating system with a GUI is required. |
|---|---|
With Node.js and Yarn installed, choose one of the execution types below.
All the results will be stored as a list of departments in a single file called `data.json` under the `/data` directory. This execution type is the most expensive one.
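As a rough sketch of what that list of departments might look like, the TypeScript interfaces below outline one plausible shape; the type and field names (`Department`, `Discipline`, `code`, `name`, `disciplines`, `prerequisites`) are assumptions for illustration only, and the authoritative structure is the UML class diagram at the end of this README.

```typescript
// Hypothetical shape of data.json: an array of departments, each listing
// its disciplines and their prerequisite discipline codes.
// All names below are assumptions, not the crawler's actual classes.
interface Discipline {
  code: string;            // discipline code
  name: string;
  prerequisites: string[]; // codes of the disciplines required beforehand
}

interface Department {
  code: string;            // the DEP_CODE used to name the multifile outputs
  name: string;
  disciplines: Discipline[];
}

type DataJson = Department[]; // top-level content of data.json
```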
Run:

```sh
$> yarn install
$> yarn start singlefile
```
A Firefox window will open and the crawler will start; this can take a few hours.
For each department, a file called `DEP_CODE.json` will be generated under the `/data` directory, storing all the data for the department with code `DEP_CODE`.
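If you later need the multifile output combined into a single list (like the singlefile run produces), a small Node.js script along these lines can merge the files. This is only a sketch under the assumption that every `*.json` file under `/data` holds one department object.

```typescript
import { readdirSync, readFileSync, writeFileSync } from "fs";
import { join } from "path";

// Merge every DEP_CODE.json under /data into a single data.json,
// mirroring the singlefile output. Assumes one department per file.
const dataDir = join(__dirname, "data");

const departments = readdirSync(dataDir)
  .filter((file) => file.endsWith(".json") && file !== "data.json")
  .map((file) => JSON.parse(readFileSync(join(dataDir, file), "utf8")));

writeFileSync(join(dataDir, "data.json"), JSON.stringify(departments, null, 2));
console.log(`Merged ${departments.length} department files into data.json`);
```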
Run:

```sh
$> yarn install
$> yarn start multifile
```
A Firefox window will open and the crawler will start; this can take a few hours.
The class structure is shown in the following UML class diagram: