
exception in DataProvider doesn't fail test run #217

Closed
cjyar opened this issue May 10, 2012 · 24 comments · Fixed by #1579, #2258 or #2748

@cjyar

cjyar commented May 10, 2012

If my @DataProvider method throws an exception, the TestNG report correctly shows that exception. However, the exit status from the TestNG execution is zero, indicating that all tests passed. If a data provider fails, the build should fail as well.

@cbeust
Collaborator

cbeust commented Jun 16, 2012

There is something a bit odd going on. I added the following trace:

System.out.println("Exiting with status " + testng.getStatus());
System.exit(testng.getStatus());

I run a test with a failing data provider and I get:

===============================================
SingleSuite
Total tests run: 1, Failures: 1, Skips: 0
===============================================

Exiting with status 2

However, the shell reports:

$ echo $?
0
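
For context, here is a minimal sketch of such a programmatic run, assuming a test class named DataProviderTest whose data provider throws (setTestClasses, run, and getStatus are the standard TestNG entry points):

import org.testng.TestNG;

public class Main {

  public static void main(String[] args) {
    TestNG testng = new TestNG();
    // DataProviderTest is an assumed test class whose @DataProvider throws.
    testng.setTestClasses(new Class[] {DataProviderTest.class});
    testng.run();
    // getStatus() is non-zero when TestNG recorded failures or skips.
    System.out.println("Exiting with status " + testng.getStatus());
    System.exit(testng.getStatus());
  }
}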

@sabaatworld

I am facing the same issue. What is the solution for this? I am using testNG.hasFailure() to determine if there are any failures. Even though the report and console show a failure, hasFailure() is false.

@orange-buffalo

I am facing this issue with version 6.9.10.
Consider the following test:

import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class DataProviderTest {
    public static final String DP_NAME = "dataProvider";

    @DataProvider(name = DP_NAME)
    public static Object[][] getData() {
        throw new IllegalStateException("guess me!");
    }

    @Test(dataProvider = DP_NAME)
    public void test() {
    }
}

The result of its execution is a skipped test instead of a failure.
The cause is in Invoker.java:1084:

ParameterBag bag = createParameters(...);
if (bag.hasErrors()) {
      ...
      tr.setStatus(ITestResult.SKIP);    // should be FAILURE

Guys, would you accept a pull request with a change that sets the test status to FAILURE in this case?

krmahadevan added a commit to krmahadevan/testng that referenced this issue Oct 18, 2017
krmahadevan added a commit to krmahadevan/testng that referenced this issue Oct 19, 2017
Closes testng-team#217

The gist of the changes:
* Deprecated the nested ExitCodeListener and added a new one.
* Added null checks to ensure that an ITestResult object created from a Description object is not null. The Description object is typically null when JUnit encounters problems parsing a method as a test method.
* The new ExitCodeListener now considers passed/failed/skipped methods throughout the entire suite of suites to determine the effective return code (status), so some tests were altered to assert on the new status.
* Removed an invalid JUnit @Test (the annotation on a static method is not valid in the JUnit world).
* Since the new ExitCodeListener is a reporter-backed listener, altered a test that asserts on the reporter count (UniqueReporterInjectionTest.java).
@orange-buffalo

This is still reproducible with 7.1.0 (same test case as in my previous comment).

@krmahadevan
Member

@orange-buffalo - In your previous comment you had mentioned that you would like to submit a PR. Please feel free to raise a PR to fix the issue and we will be more than glad to get it merged.

@krmahadevan krmahadevan reopened this Jan 29, 2020
@krmahadevan krmahadevan added this to the 7.2 milestone Jan 29, 2020
orange-buffalo added a commit to orange-buffalo/testng that referenced this issue Jan 30, 2020
… handling and always marking the test as failed in case of any exception
@juherr
Member

juherr commented Jan 30, 2020

I don't know what the expected test status is here.
If I remember @cbeust's approach well, a test is failed only when the test itself fails. When a configuration method fails, the test cannot be failed because it didn't run.
@cbeust Could you confirm?

If "skipped" is not the expected status, maybe an option could be added to configure the behavior. We could also imagine stopping the run, since a configuration method is not supposed to fail.
@krmahadevan @cbeust Thoughts?

@krmahadevan
Member

@juherr - I concur.

A data provider is kind of like a prerequisite for the test method, so if the prerequisite fails, I also feel the test should be skipped.

@orange-buffalo

The thing is that in the real world, data provider methods do fail, even when they are not supposed to. If a data provider is more complicated than just returning constants (as in integration testing, where more complex data might be required), there are chances of mistakes that cause exceptions in data providers. The current behaviour of skipping the tests totally hides these problems (the Maven / Gradle / etc. build will be successful). Perhaps you can share some use cases with us, but I can't imagine when a user would like a test to be silently skipped because their data provider failed with an NPE.

Maybe the TestNG team could consider an explicit declaration of a precondition that causes the test to be skipped? For instance, if test code detects that a precondition is not met, it could throw a "skip test exception", which would be a clear and distinct indicator (unlike "any exception") for the engine to skip the test.
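
Worth noting: TestNG already ships a mechanism along these lines. Throwing org.testng.SkipException from a test method marks that test as deliberately skipped, which is a distinct, explicit signal rather than "any exception". A minimal sketch, where the precondition check is a hypothetical placeholder:

import org.testng.SkipException;
import org.testng.annotations.Test;

public class PreconditionTest {

  @Test
  public void dependsOnExternalService() {
    if (!externalServiceAvailable()) {
      // SkipException is TestNG's explicit "skip this test" signal.
      throw new SkipException("Precondition not met: external service unavailable");
    }
    // ... actual test logic ...
  }

  // Hypothetical availability check, for illustration only.
  private boolean externalServiceAvailable() {
    return false;
  }
}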

orange-buffalo added a commit to orange-buffalo/testng that referenced this issue Feb 29, 2020
… handling and always marking the test as failed in case of any exception
@krmahadevan
Member

krmahadevan commented Mar 1, 2020

@orange-buffalo @juherr - I think we can fix this without any changes within TestNG.

The following TestNG Listener can be added (maybe as a mandatory listener)

import org.testng.IExecutionListener;
import org.testng.TestNG;

public class NoTestsFoundWarning implements IExecutionListener {

  @Override
  public void onExecutionFinish() {
    int status = TestNG.getDefault().getStatus();
    if (status == 2) {
      throw new IllegalStateException("No tests were executed.");
    }
  }
}

The added project demonstrates the solution in action.

issue_217.zip
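
For completeness, one way to wire the listener in programmatically (it can equally be registered through a <listeners> block in the suite XML or via the service loader); the test class name is assumed:

import org.testng.TestNG;

public class Runner {

  public static void main(String[] args) {
    TestNG testng = new TestNG();
    testng.setTestClasses(new Class[] {DataProviderTest.class});
    // Register the listener so onExecutionFinish() can inspect the status.
    testng.addListener(new NoTestsFoundWarning());
    testng.run();
  }
}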

krmahadevan added a commit to krmahadevan/testng that referenced this issue Mar 3, 2020
Closes testng-team#217

Currently TestNG doesn't throw an exception that build tools can leverage to fail a build when all the "@Test" methods get skipped and nothing was run. TestNG returns the proper status, but build tools may not read that status and take appropriate action.

This PR introduces a new configuration through which a user can instruct TestNG itself to trigger an exception when no tests were executed.

If you are using the Surefire plugin, your configuration would look like this:

<configuration>
  <properties>
    <property>
      <name>failwheneverythingskipped</name>
      <value>true</value>
    </property>
  </properties>
</configuration>

If you are using the command line, this feature can be turned on by passing
-failwheneverythingskipped=true
krmahadevan added a commit to krmahadevan/testng that referenced this issue Mar 3, 2020
krmahadevan added a commit to krmahadevan/testng that referenced this issue Mar 3, 2020
krmahadevan added a commit to krmahadevan/testng that referenced this issue Mar 3, 2020
krmahadevan added a commit that referenced this issue Mar 3, 2020
@gorbysbm

gorbysbm commented May 15, 2020

I think we can fix this without any changes within TestNG. The following TestNG Listener can be added (maybe as a mandatory listener) …

Thanks for the suggestion, but this doesn't seem to report the failure at the level of the @Test that had the failure. It's just a broadly thrown failure in setup that doesn't mark that specific test as failed; i.e., the test execution doesn't reach my @AfterMethod listener, so I can't mark the test as failed there.

@sskorol
Contributor

sskorol commented Jul 26, 2021

@krmahadevan @juherr I believe the above fix doesn't solve the issue that folks have described.

A data provider may contain steps to put the system into an initial state for testing, so if any failure occurs in this phase, it's important to notify the build and reporting systems that something went wrong with a particular test. But when a test is skipped, the build is green.

I have nothing against the actual skipping logic. But I agree with the other folks that it's important to let users override the state when the configuration method has failed.

There might also be logic where we throw a custom exception to achieve the same goal. In this case, we do want to see a red test with the root cause of the preparation step failure. On the other hand, sometimes we may want to intentionally skip tests, if e.g. there are open issues attached to them.

Technically, it could be achieved with the existing ITestListener, e.g.:

    @Override
    public void onTestSkipped(final ITestResult result) {
        StreamEx
                .iterate(result.getThrowable(), Objects::nonNull, Throwable::getCause)
                .dropWhile(cause -> !cause.toString().startsWith(AssertionError.class.getName()))
                .collect(last())
                .ifPresent(cause -> {
                    result.setThrowable(cause);
                    result.setStatus(ITestResult.FAILURE);
                });
    }

But I noticed a race condition somewhere in the TestNG core, as despite the actual status change to FAILURE, TestNG still somehow mutates it back to SKIPPED.

You can run the following test N times to see the flakiness:

    @DataProvider
    public Iterator<Object[]> data() {
        throw new AssertionError("some message");
    }

    @Test(dataProvider = "data")
    public void shouldFail(String data) {}

Sometimes it's marked as skipped, sometimes as failed, with the above listener.
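
(For reference, the snippet above relies on the StreamEx library and its last() collector. A plain-JDK sketch of the same cause-chain walk, using only TestNG's public API; in TestNG 7 the ITestListener methods are default methods, so a single override compiles:)

import org.testng.ITestListener;
import org.testng.ITestResult;

public class SkipToFailureListener implements ITestListener {

  @Override
  public void onTestSkipped(ITestResult result) {
    // Walk the cause chain; once a cause rooted in AssertionError is seen,
    // keep walking and remember the last cause in the chain.
    Throwable candidate = null;
    boolean seen = false;
    for (Throwable cause = result.getThrowable(); cause != null; cause = cause.getCause()) {
      if (!seen && cause.toString().startsWith(AssertionError.class.getName())) {
        seen = true;
      }
      if (seen) {
        candidate = cause;
      }
    }
    if (candidate != null) {
      result.setThrowable(candidate);
      result.setStatus(ITestResult.FAILURE);
    }
  }
}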

@krmahadevan
Member

@sskorol

But when a test is skipped, the build is green.

So does it mean that the build tool is not recognising the TestNG return code and taking appropriate action, or is it the TestNG integration (e.g., Surefire for Maven) that is perhaps at fault here? I didn't quite understand how TestNG would be at fault for not failing a build.

In this case, we do want to see a red test with the root cause of the preparation step failure. On the other hand, sometimes I may want to intentionally skip tests, if e.g. there are open issues attached to them.

I was under the impression that the behaviour is now consistent with what a configuration failure results in. To me, a data provider is kind of like a prerequisite for running a test method: a configuration establishes the prerequisite setup, and a data provider establishes the prerequisite test data.
Are you saying that you are seeing something different from what happens when a configuration failure is involved?

@sskorol
Contributor

sskorol commented Jul 26, 2021

@krmahadevan

I didn't quite understand how TestNG would be at fault for not failing a build.
Are you saying that you are seeing something different from what happens when a configuration failure is involved?

Skipped tests don't affect the exit code the way failed tests do. But the whole discussion is not about someone's fault. It's about the way people interpret failures in the data provider and what options TestNG provides to customize this behaviour. There are lots of scenarios where users DO want to override the status code (some of them are described above). I wouldn't focus on exit codes. Personally, it's more important to me to see the status I set when the data provider fails. Even with the above listener it doesn't work as expected, and I still see a skipped test in the report.

@juherr
Member

juherr commented Jul 26, 2021

@sskorol @krmahadevan
IMO the return code should be relevant for the CLI integration only.
For the other integrations, listeners like IConfigurationListener should be enough to define the expected behavior on test methods.
Maybe finding the link between a test method and its configuration methods is not easy enough, right?
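
For reference, a minimal sketch of the configuration-side listener @juherr mentions; the reaction (logging here) is just a placeholder:

import org.testng.IConfigurationListener;
import org.testng.ITestResult;

public class ConfigFailureLogger implements IConfigurationListener {

  @Override
  public void onConfigurationFailure(ITestResult result) {
    // Invoked when a configuration method (e.g. @BeforeMethod) fails.
    System.err.println("Configuration failure in "
        + result.getMethod().getQualifiedName() + "()");
  }
}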

@krmahadevan
Member

@juherr

Maybe finding the link between a test method and its configuration methods is not easy enough, right?

You mean a test method and its corresponding data provider, right? Yeah, I guess this needs a bit more investigation. Let me write up some samples to see if I can get more info on this, and I will post them back.

Personally, it's more important to me to see the status I set when the data provider fails. Even with the above listener it doesn't work as expected, and I still see a skipped test in the report.

It should be the case. But let me dig in and see what is going on. Maybe there's a bug lurking somewhere that resets statuses back to what TestNG knows of.

@juherr
Member

juherr commented Jul 26, 2021

@krmahadevan I mixed up configuration methods and data provider methods. What do you think about adding an IDataProviderListener to handle events around the data provider?

@krmahadevan
Member

@juherr - Guess what: we already have an org.testng.IDataProviderListener which has a method to eavesdrop on data provider failures :)

@sskorol

Please check whether the below sample is what you are looking for. It fails the build when I run it via Maven.

// MyTestClass.java
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class MyTestClass {

  @Test(dataProvider = "dp")
  public void testMethod(int i) {
  }

  @DataProvider(name = "dp")
  public Object[][] getData() {
    throw new RuntimeException("Failing the data-provider");
  }
}

// SimpleDPListener.java
import org.testng.IDataProviderListener;
import org.testng.ITestContext;
import org.testng.ITestNGMethod;

public class SimpleDPListener implements IDataProviderListener {

  @Override
  public void onDataProviderFailure(ITestNGMethod method, ITestContext ctx, RuntimeException t) {
    System.err.println(
        method.getMethodName() + "() failed with the exception " + t.getCause().getMessage());
  }
}

// TestFailingListener.java
import java.util.List;
import java.util.stream.Collectors;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
import org.testng.TestNGException;
import org.testng.internal.TestResult;

public class TestFailingListener implements ITestListener {

  @Override
  public void onFinish(ITestContext context) {
    List<ITestResult> dataDrivenTestMethods = context.getSkippedTests()
        .getAllResults().stream()
        .filter(each -> each.getMethod().isDataDriven())
        .filter(each -> each.getSkipCausedBy().isEmpty())
        .filter(each -> (each instanceof TestResult))
        .peek(each -> each.setStatus(ITestResult.FAILURE))
        .collect(Collectors.toList());
    if (!dataDrivenTestMethods.isEmpty()) {
      String msg = prettify(dataDrivenTestMethods)
          + " were skipped due to a failure in the underlying data provider. Failing the test execution";
      throw new TestNGException(msg);
    }
  }

  private static String prettify(List<ITestResult> results) {
    return results.stream()
        .map(each -> each.getMethod().getQualifiedName() + "()")
        .collect(Collectors.joining(", "));
  }
}

<!-- testng.xml -->
<suite name="ExcerciseSuiteA" verbose="2" preserve-order="false">
  <listeners>
    <listener class-name="com.rationaleemotions.github.issue217.SimpleDPListener"/>
    <listener class-name="com.rationaleemotions.github.issue217.TestFailingListener"/>
  </listeners>
  <test name="TestA" verbose="2">
    <classes>
      <class name="com.rationaleemotions.github.issue217.MyTestClass"/>
    </classes>
  </test>
</suite>

Execution output

...
... TestNG 7.4.0 by Cédric Beust (cedric@beust.com)
...

testMethod() failed with the exception Failing the data-provider
[Utils] [ERROR] [Error] java.lang.RuntimeException: Failing the data-provider
        at com.rationaleemotions.github.issue217.MyTestClass.getData(MyTestClass.java:16)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:133)
        at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:77)
        at org.testng.internal.MethodInvocationHelper.invokeMethodNoCheckedException(MethodInvocationHelper.java:46)
        at org.testng.internal.MethodInvocationHelper.invokeDataProvider(MethodInvocationHelper.java:146)
        at org.testng.internal.Parameters.handleParameters(Parameters.java:798)
        at org.testng.internal.Parameters.handleParameters(Parameters.java:740)
        at org.testng.internal.ParameterHandler.handleParameters(ParameterHandler.java:59)
        at org.testng.internal.ParameterHandler.createParameters(ParameterHandler.java:38)
        at org.testng.internal.TestInvoker$MethodInvocationAgent.invoke(TestInvoker.java:791)
        at org.testng.internal.TestInvoker.invokeTestMethods(TestInvoker.java:146)
        at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:146)
        at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:128)
        at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
        at org.testng.TestRunner.privateRun(TestRunner.java:794)
        at org.testng.TestRunner.run(TestRunner.java:596)
        at org.testng.SuiteRunner.runTest(SuiteRunner.java:377)
        at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:371)
        at org.testng.SuiteRunner.privateRun(SuiteRunner.java:332)
        at org.testng.SuiteRunner.run(SuiteRunner.java:276)
        at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:53)
        at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:96)
        at org.testng.TestNG.runSuitesSequentially(TestNG.java:1212)
        at org.testng.TestNG.runSuitesLocally(TestNG.java:1134)
        at org.testng.TestNG.runSuites(TestNG.java:1063)
        at org.testng.TestNG.run(TestNG.java:1031)
        at org.apache.maven.surefire.testng.TestNGExecutor.run(TestNGExecutor.java:284)
        at org.apache.maven.surefire.testng.TestNGXmlTestSuite.execute(TestNGXmlTestSuite.java:75)
        at org.apache.maven.surefire.testng.TestNGProvider.invoke(TestNGProvider.java:119)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428)
        at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162)
        at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548)

[ERROR] 
com.rationaleemotions.github.issue217.MyTestClass.testMethod() were skipped due to a failure in the underlying data provider. Failing the test execution
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  3.460 s
[INFO] Finished at: 2021-07-27T09:32:26+05:30
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test (default-test) on project testng_playground: There are test failures.
[ERROR] 
[ERROR] Please refer to /githome/playground/testng_playground/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] There was an error in the forked process
[ERROR] 
[ERROR] com.rationaleemotions.github.issue217.MyTestClass.testMethod() were skipped due to a failure in the underlying data provider. Failing the test execution
[ERROR] org.apache.maven.surefire.booter.SurefireBooterForkException: There was an error in the forked process
[ERROR] 
[ERROR] com.rationaleemotions.github.issue217.MyTestClass.testMethod() were skipped due to a failure in the underlying data provider. Failing the test execution
[ERROR]         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:733)
[ERROR]         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:305)
[ERROR]         at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:265)
[ERROR]         at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1314)
[ERROR]         at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:1159)
[ERROR]         at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:932)
[ERROR]         at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
[ERROR]         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
[ERROR]         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
[ERROR]         at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
[ERROR]         at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
[ERROR]         at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
[ERROR]         at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
[ERROR]         at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
[ERROR]         at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
[ERROR]         at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
[ERROR]         at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
[ERROR]         at org.apache.maven.cli.MavenCli.execute(MavenCli.java:957)
[ERROR]         at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:289)
[ERROR]         at org.apache.maven.cli.MavenCli.main(MavenCli.java:193)
[ERROR]         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR]         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[ERROR]         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR]         at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[ERROR]         at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
[ERROR]         at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
[ERROR]         at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
[ERROR]         at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
[ERROR] 
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

Note:
I believe that the line

[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] ------------------------------------------------------------------------

could be misleading, because the underlying reporter org.testng.reporters.TextReporter:

  • computes this data during onFinish(ITestContext)
  • the listener I wrote also works in the same onFinish(ITestContext), and there's no listener ordering right now.

But I think this demonstrates the two things that you had asked for:

  • I would like to know when a data provider fails (we have a listener for that)
  • I would like to control the outcome of the build when a test skips due to a data provider failure.

@krmahadevan
Member

Ok, that approach is not going to work. Please ignore the above comment, @sskorol.
It aborts the TestNG execution, causing downstream reporting listeners NOT to be executed, which makes for a pretty morbid user experience. I will dig in more.

@krmahadevan
Member

The below version of the listener should do the trick. (I have tried this with TestNG 7.4.0 and I don't see any inconsistencies; I ran it continuously more than 10 times and the result is consistent: the build fails.)

import org.testng.ITestListener;
import org.testng.ITestNGMethod;
import org.testng.ITestResult;
import org.testng.internal.TestResult;

public class TestFailingListener implements ITestListener {

  private final Object lock = new Object();

  @Override
  public void onTestSkipped(ITestResult result) {
    ITestNGMethod tm = result.getMethod();
    if (!tm.isDataDriven()) {
      // Only data provider backed test methods are of interest.
      return;
    }
    if (!result.getSkipCausedBy().isEmpty()) {
      // Skips with an explicit cause (e.g. failed upstream methods) are left alone.
      return;
    }
    if (!(result instanceof TestResult)) {
      return;
    }
    result.setStatus(ITestResult.FAILURE);
    synchronized (lock) {
      // Move the result from the skipped bucket to the failed bucket.
      result.getTestContext().getSkippedTests().removeResult(result);
      result.getTestContext().getFailedTests().addResult(result);
    }
  }
}
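
A sketch of wiring it onto a test class via the @Listeners annotation (it can also go into the suite XML as before):

import org.testng.annotations.DataProvider;
import org.testng.annotations.Listeners;
import org.testng.annotations.Test;

// Attach the listener to this class; suite-level XML registration works too.
@Listeners(TestFailingListener.class)
public class MyTestClass {

  @DataProvider(name = "dp")
  public Object[][] getData() {
    throw new RuntimeException("Failing the data-provider");
  }

  @Test(dataProvider = "dp")
  public void testMethod(int i) {
  }
}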

@sskorol
Contributor

sskorol commented Jul 27, 2021

@krmahadevan

I recorded a demo where it still reports a skipped status once in several runs: https://youtu.be/A0hY2wtk72A

The same behaviour occurred with the listener I posted in one of my previous messages. I'm still wondering what can change the actual status. Moreover, in some reports a test is marked as skipped, in others as failed. So it seems like the problem is in the listeners' execution order: each build tool/reporter implements its own set of listeners, so if our custom one is executed after the results have been written by the other listeners, we see this inconsistency. I wonder if TestNG could provide a special configuration for listeners to adjust the execution order, e.g. some sort of priority?

@krmahadevan
Member

@sskorol - Thanks for sharing that video.

After I shared the sample, I noticed that when I opened a terminal from within IntelliJ and ran mvn clean test, the build ALWAYS passed, even though the results were being altered to a failure.

But I wouldn't see that behavior when I opened a terminal on the Mac and ran the same test using mvn clean test.

So yes, I see the discrepancy in results, but I don't know yet what is causing it. Will have to dig in more.

So it seems like the problem is in the listeners' execution order.

This has always been the behavior with TestNG: listener execution order is NOT guaranteed. So it looks like the only option is to enhance IDataProviderListener (maybe this, or maybe some other listener) to let the user decide what to do with the ITestResult object of the yet-to-be-executed test method, and just use that. I will dig deeper to find out if this is possible.

In terms of listener execution order, that is being asked about in another issue as well; there the ask was "insertion order". Since listeners can also be wired in using service loaders, priority is going to be a challenge.

So let's deal with that part separately. For now, I want to focus on what can be done to give the listener implementor the ability to decide what to do with a failed data provider.

I am re-opening this issue and will keep the comments warm with whatever I find.

@krmahadevan krmahadevan reopened this Jul 27, 2021
@krmahadevan krmahadevan modified the milestones: 7.2, 7.6.0 Jan 7, 2022
@scottashipp

Wow. I just ran across this issue today and it's a doozy. In one of our regression test suites, we load JSON bodies for requests to a web service from a flat file. A misformatted flat file was causing an exception to be thrown during this load, but our test job continued to pass with green, and nobody caught it for about a month.

I just wanted to make that comment to indicate that the impact of the default behavior on real-world users can be significant.

krmahadevan added a commit to krmahadevan/testng that referenced this issue Apr 12, 2022
Closes testng-team#217

Enable a data provider to propagate its failure and have it be considered a test failure.

This can be done at an individual data provider level by using the "@DataProvider" annotation's new attribute "propagateFailureAsTestFailure".

This can be done for the entire TestNG run by using the configuration parameter "-propagateDataProviderFailureAsTestFailure true".

Also regrouped the data provider tests into a single test class.
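
Based on that commit message, per-provider usage would presumably look like the sketch below (written against the then-upcoming API, so treat it as illustrative):

import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class PropagatingDataProviderTest {

  // With propagateFailureAsTestFailure set, an exception thrown here should
  // surface as a test failure instead of a skip.
  @DataProvider(name = "dp", propagateFailureAsTestFailure = true)
  public Object[][] getData() {
    throw new IllegalStateException("broken test data");
  }

  @Test(dataProvider = "dp")
  public void testMethod(int i) {
  }
}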
@baflQA
Contributor

baflQA commented Apr 13, 2022

Hey - this is an awesome change. Would it be possible to enable this at runtime, programmatically? Or enable it in the suite file?

krmahadevan added a commit that referenced this issue Apr 13, 2022
@krmahadevan
Member

Hey - this is an awesome change. Would it be possible to enable this at runtime, programmatically? Or enable it in the suite file?

You can set this value via code by using the setter available in the TestNG main class.
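
The thread doesn't name the setter, so the method call below is a hypothetical stand-in; check the TestNG class for the exact signature:

import org.testng.TestNG;

public class Runner {

  public static void main(String[] args) {
    TestNG testng = new TestNG();
    testng.setTestClasses(new Class[] {PropagatingDataProviderTest.class});
    // Hypothetical setter name mirroring the CLI flag; verify against the API.
    testng.propagateDataProviderFailureAsTestFailure();
    testng.run();
  }
}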
