"Warn" ability, instead of fail #1480

Closed
aseemk opened this issue Dec 22, 2014 · 21 comments
Labels
type: feature enhancement proposal

Comments

aseemk commented Dec 22, 2014

We at @fiftythree ❤️ Mocha. We use it to test Mix and more.

As our test codebase has grown, there are a few test suites we'd like to keep running as usual, but without failing the overall run when they fail. We just want to be warned, sort of like pending tests.

I emailed the group about this a year ago, so I'm opening a formal issue for it now:

https://groups.google.com/d/topic/mochajs/zpCTMFD5EQ8/discussion

Strawman proposal: `describe.warn('foo')` / `it.warn('should bar')`. Or, if you like cuteness, `it.ideally('should bar')`. =)

WDYT? Thanks for the consideration!
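
For illustration, a user-land sketch of roughly what `it.warn` might do (hypothetical; not an existing Mocha API): run the body, catch a failure, and log it as a warning instead of failing the run.

```js
// Hypothetical sketch of the proposed it.warn (synchronous tests only):
// run the body, but downgrade any failure to a console warning.
it.warn = (title, fn) => {
  it(title, function () {
    try {
      fn.call(this);
    } catch (err) {
      console.warn(`WARN (allowed failure): ${title}: ${err.message}`);
    }
  });
};

// usage
it.warn('should bar', function () {
  // assertions that may fail without failing the build
});
```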

rlidwka commented Dec 22, 2014

Why not use pending tests (it.skip)?

@boneskull
Contributor

@rlidwka In the discussion he mentions why this is not desirable.

@boneskull boneskull added the type: feature enhancement proposal label Dec 22, 2014
@dasilvacontin
Contributor

This could be useful for the HTTP tests.

@travisjeffery
Contributor

Not really, you're basically saying 'I don't care about those tests' at that point.

@travisjeffery
Contributor

A feature I've been thinking about that would help with these kinds of tests is an eventually/retry feature per test case: if a test fails, it'll automatically retry a few times after a delay.
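
A user-land sketch of that idea (the helper and its parameters are hypothetical, not an existing Mocha API):

```js
// Hypothetical helper: re-run a failing test up to `retries` times,
// waiting `delayMs` between attempts before giving up for real.
function retryIt(title, fn, retries = 3, delayMs = 500) {
  it(title, async function () {
    for (let attempt = 1; ; attempt++) {
      try {
        return await fn.call(this);
      } catch (err) {
        if (attempt >= retries) throw err; // out of attempts: real failure
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    }
  });
}
```

For what it's worth, Mocha did eventually ship a built-in `this.retries(n)` along these lines.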

@dasilvacontin
Contributor

Yeah, that would be way better. Having the Travis output repeat 3x just because of the HTTP test is a bit annoying.

kpheng commented Jan 27, 2015

@travisjeffery

Retry support would be awesome

@dasilvacontin
Contributor

Just throwing out some ideas:

```js
// per-test modifiers
try(3).it('should ...', fn);
retry(2).it('should ...', fn);

// or suite- and test-level settings
describe('Suite', function () {
  this.retry(3);

  it('should ...', fn);

  it('should ...', function (done) {
    this.retry(2);
    // ...
  });

  it('should ...', function (done) {
    this.timeout(700).retry(5);
    // ...
  });
});
```

@danielstjules
Contributor

So what sort of status code should the process exit with if a spec failed, but was only a "warning"? How would you notice it? If it's a non-zero code, it's going to cause the build in your CI system to fail. If it just exits with zero, then you're expected to look at the test results and notice the warning?

If this is in the context of CI, then one solution is to split up your test suite, e.g. test/unit, test/functional, test/allowed-failures. You would run the first two directories normally, but break the third into a separate command: `mocha test/allowed-failures || true`. The suite will run as normal, but won't fail your build on error.
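
One way to wire that up with npm scripts, assuming the directory names above and a Unix-like shell (the script names are illustrative):

```json
{
  "scripts": {
    "test": "npm run test:strict && npm run test:flaky",
    "test:strict": "mocha test/unit test/functional",
    "test:flaky": "mocha test/allowed-failures || true"
  }
}
```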

@boneskull
Contributor

For reference, here's the text of the forum post:

Longtime user and fan of Mocha. Really nice work and thanks as always for all the amazing contributions to the Node world.

I'm running into a tricky timing bug that causes a few of my tests (just 1-2%) to fail randomly. This breaks my CI, which is cumbersome since there are many tests and we haven't optimized CI and test performance.

So for now, I'd love to be able to mark those test cases as "okay to fail". I know I can skip them, but I'd like to still run them, so I stay aware of whether (and how often) they're still failing. But, I wouldn't like Mocha to return a non-zero exit code if only those tests fail. Sort of like "warnings" rather than "failures".

Is that possible? If not, is that a feature you'd be open to adding?

@boneskull
Contributor

@aseemk Would wrapping each of the finicky tests in a try/catch work? Dump something to STDOUT...
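
Concretely, that suggestion might look something like this (`runTimingSensitiveAssertions` is a hypothetical test body):

```js
// Sketch of the suggestion above: swallow the failure, but make it
// visible on STDOUT so it still shows up in local/CI logs.
it('finicky timing-dependent test', function () {
  try {
    runTimingSensitiveAssertions(); // hypothetical test body
  } catch (err) {
    console.log(`WARNING (allowed failure): ${err.message}`);
  }
});
```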

@jbnicolai

Closing this in favor of #1773, which it seems would also support your use case.

If you feel strongly about this feature, feel free to comment. However, since you're the only person asking for it, you'll probably have to put in the work of a PR with documentation and tests yourself ;-)

aseemk (Author) commented Jul 6, 2015

Thanks @jbnicolai, and everyone for chiming in. Sorry I've been unresponsive here.

#1773 doesn't really address our need, but I won't argue — if no one else seems to find value in this idea, alas. =) Thanks for the consideration anyway!

If you're interested in why / more details, tl;dr:

  • As we move to more microservices, some service dependencies can be temporarily down; retrying right away won't help in those cases.
  • We do know about Travis's `allow_failures`, as well as `.skip` (we actually use `--grep '#skipci' --invert` instead, for flexibility; see the example after this list). Those are good hammers, but not everything is a nail.
  • In particular, we're interested in this feature for individual tests (rather than entire suites). Running just those tests separately isn't straightforward, nor is re-running entire suites efficient.
  • Yes, we do view test output even if it passes — when we run tests locally.
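
For reference, here's the tagging convention the `--grep '#skipci' --invert` trick relies on (both flags are real Mocha CLI options; `#skipci` is just a naming convention in test titles):

```js
// Tag CI-excluded tests by title; CI then runs:
//   mocha --grep '#skipci' --invert
// which excludes every test whose title contains #skipci.
it('survives an upstream outage #skipci', function () {
  // assertions against a dependency that may be down
});
```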

I guess this extended beyond a tl;dr. =)

Our failure rate these days is low enough that we're okay living with this right now, so I don't anticipate myself getting to a documentation+tests PR, but thanks for letting me know that's an option!

@jbnicolai

@aseemk thanks for the thorough explanation; it could be useful if we ever revisit this in the future. And if you ever change your mind about implementing this yourself, feel free.

aseemk (Author) commented Jul 6, 2015

👍

Btw, I appreciate your warm and courteous attitude, @jbnicolai. =)

@danielstjules
Contributor

> In particular, we're interested in this feature for individual tests (rather than entire suites). And running just those tests separately isn't straightforward; nor is repeating the entire suites efficient.

@aseemk Gotcha. That's where both retries and my suggestion fall apart, sorry :( However, I think we've got another PR that might help: #1618. With dynamic skip, the resulting specs would look like:

```js
it('spec that might fail', function (done) {
  someAsyncTaskThatMightFail(function (err, res) {
    if (err) return this.skip(); // dynamically mark the test as pending

    // otherwise make assertions, then:
    done();
  }.bind(this));
});
```

It doesn't have the benefit of a separate section in the reporter output (e.g. passing, failing, pending, warnings), but it seems to achieve the desired behaviour?

aseemk (Author) commented Jul 7, 2015

That does sound like it'll help, @danielstjules. Thanks!

It can even get abstracted out into a generic helper which intercepts `done`... and that could then get patched onto `it` or `describe`. =)

Agreed that having a separate "warnings" section would be ideal, but even just seeing it in "pending" will be helpful. Thanks again! Looking forward to this.
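
A rough sketch of the helper described above, building on the dynamic `this.skip()` from #1618 (the helper name is hypothetical):

```js
// Hypothetical helper: intercept done, and convert an async failure
// into a dynamic skip so it reports as pending instead of failing.
function itIdeally(title, fn) {
  it(title, function (done) {
    const test = this;
    fn.call(test, function (err) {
      if (err) {
        console.warn(`warning: ${title}: ${err.message}`);
        return test.skip(); // marks the test pending (per #1618)
      }
      done();
    });
  });
}
```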

gmcdev commented Jun 26, 2017

We are using local testing as a way of communicating requirements to the individuals developing our production web-components. It would be very helpful to have a "warnings" highlight in the output for artifacts that are not desirable, but still pass the boilerplate spec.

@kking937

In my opinion, and on my own project, it would be nice to get a warning for tests that I know I need to implement but don't have time for right now, so that I can run `npm test`, not have it fail, and still see the specific tests I need to write.

@boneskull
Contributor

@kking937 you can do this by not passing a callback to `it()`:

```js
it('should be written later');
```

@kellyselden

Here's a simplified workaround:

```js
// Run the test, but convert any failure into a skip, so it reports as
// pending instead of failing the run.
it.allowFail = (title, callback) => {
  it(title, function () {
    return Promise.resolve().then(() => {
      return callback.apply(this, arguments);
    }).catch(() => {
      this.skip(); // failure becomes "pending" rather than "failing"
    });
  });
};

// usage
it.allowFail('skip on error', function () {
  assert.ok(false);
});
```

or use it here if you want.
