Missing credentials for scraper. CompanyId: hapoalim #268

Merged
1 change: 1 addition & 0 deletions .eslintrc.js
@@ -66,6 +66,7 @@ module.exports = {
    'arrow-body-style': 'off',
    'object-curly-newline': 'warn',
    'semi': productionError,
+   'padded-blocks': 'off',
    'no-return-assign': 'off',
    'no-restricted-imports': ['error', {
      name: 'electron-log',
7 changes: 1 addition & 6 deletions src/backend/import/bankScraper.ts
@@ -24,19 +24,14 @@ type EmitProgressEventFunction = (eventCompanyId: string, message: string) => Pr
  export async function scrape({
    companyId, credentials, startDate, showBrowser = false
  }: ScrapeParameters, emitProgressEvent: EmitProgressEventFunction, chromePath: string) {
-   if (!credentials || (!credentials.username && !credentials.num && !credentials.id) || !credentials.password) {
-     throw new Error(`Missing credentials for scraper. CompanyId: ${companyId}`);
-   }
Comment on lines -27 to -29
Collaborator (Author):

I removed this. This is a complicated condition, and maintaining it costs more than the value it provides.

Reviewer:

Suggestion: place the validation logic at the input forms.

Owner:

This validation was added because, before we had it, the error in this case was cryptic.

Do you want to remove it because it fails too often in cases where the credentials are actually valid? If so, I suggest replacing the exception with a console log.

Owner:

> Suggestion: place the validation logic at the input forms.

That's a good idea. It doesn't have to block this PR; it's actually a good first issue.

Collaborator (Author):

> Do you want to remove it because it fails too often in cases where the credentials are actually valid? If so, I suggest replacing the exception with a console log.

My point is that it failed because of a bug in the code, and it is not that important to fail earlier, before the scraping. I think this is very complicated logic for an almost useless use case. It is complicated because you need to validate the loginFields, which differ for each scraper.

> Suggestion: place the validation logic at the input forms.

And anyway, you can't submit an empty field through the UI.
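For illustration, the "log instead of throw" idea raised in this thread could look something like the sketch below. This is a hypothetical alternative, not the code merged in this PR; the `hasUsableCredentials` helper and the `Credentials` shape are assumptions modeled on the removed condition.

```typescript
// Hypothetical sketch of the "warn instead of throw" alternative discussed
// above. Field names mirror the condition removed in this PR; the helper
// names are invented for illustration.
interface Credentials {
  username?: string;
  num?: string;
  id?: string;
  password?: string;
}

function hasUsableCredentials(credentials?: Credentials): boolean {
  if (!credentials) return false;
  // The removed check required at least one identifier field plus a password.
  const hasIdentifier = Boolean(credentials.username || credentials.num || credentials.id);
  return hasIdentifier && Boolean(credentials.password);
}

// Instead of throwing before the scrape, log a warning and let the scraper
// surface its own (now less cryptic) error.
function warnIfCredentialsMissing(companyId: string, credentials?: Credentials): void {
  if (!hasUsableCredentials(credentials)) {
    console.warn(`Missing credentials for scraper. CompanyId: ${companyId}`);
  }
}
```

This keeps the early diagnostic that the Owner wanted without aborting scrapes whose credentials are in fact valid, though it still shares the maintenance cost the Author objected to, since the field list differs per scraper.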


  const options = {
    companyId, // mandatory; one of 'hapoalim', 'discount', 'otsarHahayal', 'leumiCard', 'isracard', 'amex'
    startDate, // the date to fetch transactions from (can't be before the minimum allowed time difference for the scraper)
    combineInstallments: false, // if set to true, all installment transactions will be combined into the first one
    showBrowser, // shows the browser while scraping, good for debugging (default false)
    verbose: false, // include more debug info in the output
-   executablePath: chromePath,
-   prepareBrowser: async (_browser) => { },
-   preparePage: async (_page) => { }
+   executablePath: chromePath
  };
const scraper = createScraper(options);
scraper.onProgress((eventCompanyId: string, payload: { type: string }) => {