modified StateofTexas to pull all datasets - Challenge 1 #9

Open
wants to merge 2 commits into main

Conversation

vivaan998

  • Used Promise.all() to make multiple requests in parallel (fetching collection and resource data)
  • Performed some basic operations to reduce time and storage overhead
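The parallel pattern the description refers to can be sketched roughly as follows. This is a sketch, not the PR's actual code: `fetchImpl` is injected so it runs without network access (pass the global `fetch` in real use), and the resources endpoint path is an assumption, not taken from the diff.

```javascript
// Sketch of the parallel-request pattern: both requests are issued
// before either is awaited, so they run concurrently, and the two
// JSON bodies are then parsed in parallel as well.
// `fetchImpl` is injected so the sketch runs without network access;
// the resources path is an assumed example, not from the PR.
const baseUrl = 'https://api.tnris.org/api/v1';

async function fetchCollectionAndResources(id, fetchImpl) {
  const [collectionsResponse, resourcesResponse] = await Promise.all([
    fetchImpl(`${baseUrl}/collections/${id}`),
    fetchImpl(`${baseUrl}/resources/?collection_id=${id}`),
  ]);
  // Parse both JSON payloads concurrently too
  return Promise.all([
    collectionsResponse.json(),
    resourcesResponse.json(),
  ]);
}
```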

Nefsen402 (Contributor) left a comment:

Nice work. It's a little more complex than I would like, but I see the reasoning.

]);

// Parsing the response data from JSON to objects
const [collectionsData, resourcesData] = await Promise.all([collectionsResponse.json(), resourcesResponse.json()]);
Nefsen402 (Contributor):

I would prefer the fetch/JSON parsing to be in series instead of parallel.
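What the reviewer is asking for might look like the following sketch: each request is fetched and parsed to completion before the next one starts. The helper name and the injected `fetchImpl` are illustrative, not from the PR.

```javascript
// Sketch of the sequential form the reviewer suggests: the second
// request does not start until the first has been fetched and parsed.
// `fetchImpl` is injected so the sketch runs without network access.
const baseUrl = 'https://api.tnris.org/api/v1';

async function fetchInSeries(id, fetchImpl) {
  const collectionsData =
    await (await fetchImpl(`${baseUrl}/collections/${id}`)).json();
  const resourcesData =
    await (await fetchImpl(`${baseUrl}/resources/?collection_id=${id}`)).json();
  return [collectionsData, resourcesData];
}
```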

if (!collectionIds) {

// Fetching collection_id data from the API and mapping the results to an array
const idData = await (await fetch('https://api.tnris.org/api/v1/collections_catalog/')).json();
Nefsen402 (Contributor):

Please reuse the baseUrl here.
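Applied to the line above, the suggestion amounts to building the catalog URL from the existing constant instead of repeating the host, roughly:

```javascript
// Build the catalog URL from the shared baseUrl constant rather
// than hard-coding the host a second time.
const baseUrl = 'https://api.tnris.org/api/v1';
const catalogUrl = `${baseUrl}/collections_catalog/`;
```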

@@ -1,32 +1,53 @@
import fetch from 'node-fetch';
import moment from 'moment';

const baseUrl = 'https://api.tnris.org/api/v1';
let collectionIds = null;
Nefsen402 (Contributor):

This cache isn't a great idea; it's overly complex and confusing. The scrapers aren't run constantly, only on an infrequent but still reasonable schedule.

// Looping over each collection_id
for (const id of collectionIds) {
// Building the URLs for the collections and resources endpoints
var collections = `collections/${id}`;
Nefsen402 (Contributor):

Please use let/const over var.
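The distinction the reviewer is after: `var` is function-scoped and hoisted, while `let`/`const` are block-scoped, and `const` signals the binding is never reassigned. Applied to the loop above (the wrapper function and sample ids are illustrative):

```javascript
// const instead of var inside the loop: each iteration gets its
// own block-scoped binding, and the binding is never reassigned.
function buildCollectionPaths(collectionIds) {
  const paths = [];
  for (const id of collectionIds) {
    const collections = `collections/${id}`; // was: var collections = ...
    paths.push(collections);
  }
  return paths;
}
```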

collectionIds = idData.results.map(function(item) { return item["collection_id"]; });
}

const data=[];
Nefsen402 (Contributor):

Please use proper whitespace around operators to be more consistent with the file and the rest of the code base.
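Concretely, the nit is the missing spaces around `=` in `const data=[];` above; with the spacing the rest of the file uses, it would read:

```javascript
// Spaces around the assignment operator, matching the rest of the file.
const data = [];
```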
