This repository has been archived by the owner on Mar 25, 2021. It is now read-only.
I remember reading a book (JavaScript: The Good Parts? Clean Code? The Art of Unit Testing?) by one of the big influential programming folks and encountering a passage arguing in favor of this type of rule. The justification was something like these points:
- It's easy to type `+i` or `-i` instead of `++i` or `--i`
- Standardizing the widths of increments/decrements makes them easier to read
- It's easy to forget the difference between `++i` and `i++`
Perhaps this rule should be named increment-decrement and only ban the pre- operators, to offend fewer people? The first point is the only one of the three I feel would withstand a real debate.
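A minimal sketch of the third point above (variable names are illustrative): the pre- and post- forms only differ when the expression's value is actually used.

```typescript
let i = 5;
const pre = ++i;  // increments first, then yields the new value: pre === 6, i === 6

let j = 5;
const post = j++; // yields the old value, then increments: post === 5, j === 6

console.log(pre, i, post, j);
```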
When used as an ExpressionStatement, `no-unused-expression` would warn you about that.
This is only an issue when assigning the result.
But even then you'd get an error from `prefer-const`, because you never reassign the variable (if this is the only place you might have modified it).
See #1142.
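A sketch of the interplay described above (rule names as in TSLint; the variables are illustrative): a mistyped `+i` as a statement is a no-op, which is exactly what `no-unused-expression` catches, so the typo only slips through when the result is assigned.

```typescript
let i = 0;

// Intended statement: increments i.
++i;

// Mistyped statement: `+i` evaluates to a number and discards it -- a no-op.
// As an ExpressionStatement, no-unused-expression would flag it.
// +i;

// The typo only matters when the result is assigned. Here no-unused-expression
// stays silent, but if this assignment were the only place `i` could have been
// modified, prefer-const would report that `i` is never reassigned.
const copy = +i; // copy === 1, and i is unchanged

console.log(i, copy);
```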