exception in DataProvider doesn't fail test run #217
There is something a bit odd going on. I added the following trace:
I run a test with a failing data provider and I get:
However, the shell reports: `$ echo $?`
I am facing the same issue. What is the solution for this? I am using testNG.hasFailure to determine if there are any failures. Even though the report and console show a failure, hasFailure is false.
I am facing this issue with version 6.9.10.
The result of its execution is a skipped test instead of a failure.
Guys, would you accept a pull request with the change to set the test status to Failure for this case?
Closes testng-team#217. Following is the gist of changes done:
* Deprecated the nested ExitCodeListener and added a new one.
* Added null checks to ensure that an ITestResult object created based on a Description object is not null. The Description object is typically seen to be null when JUnit encounters problems parsing a method as a test method.
* The new ExitCodeListener now considers passed/failed/skipped methods throughout the entire suite of suites to determine the effective return code (or status), so altered some tests to now assert on the new status.
* Removed an invalid JUnit @Test (an annotation on a static method is not valid in the JUnit world).
* Since the new ExitCodeListener is a reporter-backed listener, altered a test that asserts on the reporter count (UniqueReporterInjectionTest.java).
This is still reproducible with
@orange-buffalo - In your previous comment you had mentioned that you would like to submit a PR. Please feel free to raise a PR to fix the issue and we will be more than glad to get it merged.
… handling and always marking the test as failed in case of any exception
I don't know what the expected test status is here. If "skipped" is not the expected status, maybe an option could be used in order to configure the behavior. We can also imagine that the run will stop, because a configuration method is not supposed to fail.
@juherr - I concur with you as well. A data provider is kind of like a prerequisite needed for the test method. So if the prerequisite failed, then the test should be skipped, is what I feel as well.
The thing is that in the real world data provider methods do fail, even if they are not supposed to. If a data provider is more complicated than just returning constants (as in integration testing, where more complex data might be required), there are chances of mistakes that cause exceptions in data providers. The current behaviour of skipping the tests totally hides these problems (the Maven / Gradle / etc. build will be successful). Perhaps you can share some use cases with us, but I can't imagine when a user would like the test to be silently skipped if their data provider failed with an NPE. Maybe the TestNG team could consider an explicit declaration of a precondition that causes the test to be skipped? For instance, if test code detected that a precondition is not met, it could throw a "skip test exception", which would be a clear and distinct indicator (unlike "any exception") for the engine to skip the test.
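For what it's worth, TestNG already ships a distinct skip signal of this kind: throwing `org.testng.SkipException` from test code marks the test as deliberately skipped. A minimal sketch (the class name, method name, and `serviceAvailable()` helper are illustrative, not from this thread):

```java
import org.testng.SkipException;
import org.testng.annotations.Test;

public class PreconditionTest {

  @Test
  public void dependsOnExternalService() {
    // Skip explicitly when a known precondition is unmet, instead of
    // letting an arbitrary exception be silently swallowed as a skip.
    if (!serviceAvailable()) {
      throw new SkipException("Backend unavailable; skipping intentionally");
    }
    // ... actual assertions go here ...
  }

  // Illustrative stub standing in for a real availability check.
  private boolean serviceAvailable() {
    return false;
  }
}
```

This keeps "intentionally skipped" distinguishable from "skipped because something blew up".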
@orange-buffalo @juherr - I think we can fix this without any changes within TestNG. The following TestNG listener can be added (maybe as a mandatory listener):

```java
import org.testng.IExecutionListener;
import org.testng.TestNG;

public class NoTestsFoundWarning implements IExecutionListener {

  @Override
  public void onExecutionFinish() {
    int status = TestNG.getDefault().getStatus();
    if (status == 2) {
      throw new IllegalStateException("No tests were executed.");
    }
  }
}
```

The added project demonstrates the solution in action.
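One caveat with the `status == 2` check above: TestNG's exit status is a bit mask (the constants in `org.testng.TestNG` are, to my knowledge, `HAS_FAILURE = 1`, `HAS_SKIPPED = 2`, `HAS_NO_TEST = 8` — verify against your version), so an equality test misses runs that both failed and skipped (status 3). A plain-Java sketch of checking the flag instead:

```java
// Plain-Java sketch of interpreting a TestNG-style exit status as a bit mask.
public class ExitStatusDemo {

  // Values mirror the constants in org.testng.TestNG; treat them as an
  // assumption and verify against the TestNG version you use.
  static final int HAS_FAILURE = 1;
  static final int HAS_SKIPPED = 2;
  static final int HAS_NO_TEST = 8;

  // Bit-mask check: catches any run that skipped tests, including runs
  // that also had failures (status 3), which `status == 2` would miss.
  static boolean hasSkipped(int status) {
    return (status & HAS_SKIPPED) != 0;
  }

  public static void main(String[] args) {
    System.out.println(hasSkipped(HAS_SKIPPED));               // true
    System.out.println(hasSkipped(HAS_FAILURE | HAS_SKIPPED)); // true
    System.out.println(hasSkipped(HAS_FAILURE));               // false
  }
}
```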
Closes testng-team#217. Currently TestNG doesn't throw an exception which build tools can leverage to fail a build when all the @Test methods get skipped and nothing was run. TestNG returns the proper status, but sometimes build tools may not be reading that status and taking appropriate action. Introduced a new configuration through this PR using which a user can instruct TestNG itself to trigger an exception when no tests were executed. If one is using the surefire plugin, then your surefire plugin configuration would look like below:

```xml
<configuration>
  <properties>
    <property>
      <name>failwheneverythingskipped</name>
      <value>true</value>
    </property>
  </properties>
</configuration>
```

If you are using the command line, then this feature can be turned on by passing -failwheneverythingskipped=true
Thanks for the suggestion, but this doesn't seem to report the failure at the level of the @Test that had the failure. It's just a broadly thrown failure in setup that doesn't mark that specific test as failed; i.e., the test execution doesn't make it to my @AfterMethod listener so that I can mark the test as failed.
@krmahadevan @juherr I believe the above fix doesn't solve the issue that folks have described. A data provider may contain steps to put the system into its initial state for testing. So if any failure occurs during this phase, it's important to notify the build and reporting system that something went wrong with a particular test. But when a test is skipped, the build is green.

I have nothing against the actual skipping logic. But I agree with the other folks that it's important to let users override the state when the configuration method has failed. There might also be logic where we throw a custom exception to achieve the same goal. In this case, we do want to see a red test with the root cause of the preparation step failure. On the other hand, sometimes we may want to intentionally skip tests, e.g. if there are open issues attached to them. Technically, it could be achieved with the existing:

```java
@Override
public void onTestSkipped(final ITestResult result) {
  StreamEx
      .iterate(result.getThrowable(), Objects::nonNull, Throwable::getCause)
      .dropWhile(cause -> !cause.toString().startsWith(AssertionError.class.getName()))
      .collect(last())
      .ifPresent(cause -> {
        result.setThrowable(cause);
        result.setStatus(ITestResult.FAILURE);
      });
}
```

But I noticed a race condition somewhere in the TestNG core: despite the actual status change to FAILURE, TestNG still somehow mutates it back to SKIPPED. You can run the following test N times to see the flakiness:

```java
@DataProvider
public Iterator<Object[]> data() {
  throw new AssertionError("some message");
}

@Test(dataProvider = "data")
public void shouldFail(String data) {}
```

Sometimes it's marked as skipped, sometimes as failed, with the above listener.
So does it mean that the build tool is not recognising the TestNG return code and taking an appropriate action, or is it the TestNG integration (e.g., surefire for Maven) that is perhaps at fault here? I didn't quite understand this line as to how TestNG would be at fault for not failing a build.

I was under the impression that the behaviour is now consistent with what a configuration failure results in. To me, a data provider is kind of like a pre-requisite for running a test method. A configuration establishes the pre-requisite setup; a data provider establishes the pre-requisite test data.
Skipped tests don't affect the exit code the way failed tests do. But the whole discussion is not about someone's fault. It's about the way people interpret failures in the data provider and what options TestNG provides to customize this behaviour. There are lots of scenarios where users DO want to override the status code (some of them are described above). I wouldn't focus on exit codes. Personally, to me it's more important to see the status I set if the data provider fails. Even with the above listener it doesn't work as expected, and I still see a skipped test in the report.
@sskorol @krmahadevan
You mean a test method and its corresponding data provider, right? Yeah, I guess this needs a bit more investigation. Let me write up some samples and see if I can get more info on this, and I will post them back.

It should be the case. But let me dig in and see what is going on. Maybe there's a bug lurking there somewhere which resets statuses back to what TestNG knows of.
@krmahadevan I mixed up configuration methods and data provider methods. What do you think about adding an
@juherr - Guess what, we already have a

Please check if the below sample is what you are looking for. This fails the build when I run it via Maven.

```java
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class MyTestClass {

  @Test(dataProvider = "dp")
  public void testMethod(int i) {
  }

  @DataProvider(name = "dp")
  public Object[][] getData() {
    throw new RuntimeException("Failing the data-provider");
  }
}
```

```java
import org.testng.IDataProviderListener;
import org.testng.ITestContext;
import org.testng.ITestNGMethod;

public class SimpleDPListener implements IDataProviderListener {

  @Override
  public void onDataProviderFailure(ITestNGMethod method, ITestContext ctx, RuntimeException t) {
    System.err.println(
        method.getMethodName() + "() failed with the exception " + t.getCause().getMessage());
  }
}
```

```java
import java.util.List;
import java.util.stream.Collectors;
import org.testng.ITestContext;
import org.testng.ITestListener;
import org.testng.ITestResult;
import org.testng.TestNGException;
import org.testng.internal.TestResult;

public class TestFailingListener implements ITestListener {

  @Override
  public void onFinish(ITestContext context) {
    List<ITestResult> dataDrivenTestMethods = context.getSkippedTests()
        .getAllResults().stream()
        .filter(each -> each.getMethod().isDataDriven())
        .filter(each -> each.getSkipCausedBy().isEmpty())
        .filter(each -> (each instanceof TestResult))
        .peek(each -> each.setStatus(ITestResult.FAILURE))
        .collect(Collectors.toList());
    if (!dataDrivenTestMethods.isEmpty()) {
      String msg = prettify(dataDrivenTestMethods)
          + " were skipped due to a failure in the underlying data provider. Failing the test execution";
      throw new TestNGException(msg);
    }
  }

  private static String prettify(List<ITestResult> results) {
    return results.stream()
        .map(each -> each.getMethod().getQualifiedName() + "()")
        .collect(Collectors.joining(", "));
  }
}
```

```xml
<suite name="ExcerciseSuiteA" verbose="2" preserve-order="false">
  <listeners>
    <listener class-name="com.rationaleemotions.github.issue217.SimpleDPListener"/>
    <listener class-name="com.rationaleemotions.github.issue217.TestFailingListener"/>
  </listeners>
  <test name="TestA" verbose="2">
    <classes>
      <class name="com.rationaleemotions.github.issue217.MyTestClass"/>
    </classes>
  </test>
</suite>
```

Execution output:

Note: could be misleading because the underlying reporter

But I think this demonstrates the two things that you had asked for.
Ok, that approach is not going to work. Please ignore the above comment, @sskorol.
The below version of the listener should do the trick (I have tried this with TestNG ):

```java
import org.testng.ITestListener;
import org.testng.ITestNGMethod;
import org.testng.ITestResult;
import org.testng.internal.TestResult;

public class TestFailingListener implements ITestListener {

  private final Object lock = new Object();

  @Override
  public void onTestSkipped(ITestResult result) {
    ITestNGMethod tm = result.getMethod();
    if (!tm.isDataDriven()) {
      return;
    }
    if (!result.getSkipCausedBy().isEmpty()) {
      return;
    }
    if (!(result instanceof TestResult)) {
      return;
    }
    result.setStatus(ITestResult.FAILURE);
    synchronized (lock) {
      result.getTestContext().getSkippedTests().removeResult(result);
      result.getTestContext().getFailedTests().addResult(result);
    }
  }
}
```
I recorded a demo where it still reports a skipped status once in several runs: https://youtu.be/A0hY2wtk72A The same behaviour occurred with the listener I posted in one of my previous messages. Still wondering what can change the actual status. Moreover, in some reports the test is marked as skipped, in others as failed. So it seems like the problem is in the listeners' execution order. Each build tool/reporter implements its own set of listeners, so if our custom one is executed after the results are written by the other listeners, we see this inconsistency. I wonder if TestNG could provide a special configuration for listeners that would adjust the execution order, e.g. some sort of priority?
@sskorol - Thanks for sharing that video. After I had shared the sample, I noticed that when I opened up a terminal from within IntelliJ and ran But I wouldn't see that behavior when I opened up a terminal on the Mac and ran the same test using So yes, I see the discrepancy in terms of results, but I don't know yet what is causing it. Will have to dig in more.

This has always been the behavior with TestNG: listener execution order is NOT guaranteed. So it looks like the only option is to enhance the In terms of the listener execution order, that is something being asked about in another issue as well. There the ask was "insertion order". Since listeners can also be wired in using service loaders, priority is going to be a challenge. So let's deal with that part separately. For now, I want to focus on what can be done to give the listener implementor the ability to decide what to do with a failed data provider. Am re-opening this issue and will keep the comments warm with whatever I find.
Wow. I just ran across this issue today and it's a doozy. In one of our regression test suites, we load JSON bodies for requests to a web service from a flat file. A misformatted flat file was causing an exception to be thrown during this load, but our test job continued to pass green and nobody caught this for like a month. I just wanted to make that comment to indicate that the impact of the default behavior on real-world users can be significant.
Closes testng-team#217. Enable a data provider to propagate its failure and be considered as a test failure. This can be done at an individual data provider level by using the @DataProvider annotation's new attribute "propagateFailureAsTestFailure", or at the entire TestNG level by using the configuration parameter "-propagateDataProviderFailureAsTestFailure true". Also regrouped the data provider tests into a single test class.
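Based on the attribute named in this commit, per-provider usage would look roughly like the sketch below. Hedged: the attribute only exists in TestNG builds that include this change, and the class and method names here are illustrative.

```java
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

public class FailingProviderTest {

  // With propagateFailureAsTestFailure set, an exception thrown here is
  // reported as a test failure instead of a skip (per the change above).
  @DataProvider(name = "dp", propagateFailureAsTestFailure = true)
  public Object[][] data() {
    throw new RuntimeException("data provider blew up");
  }

  @Test(dataProvider = "dp")
  public void testMethod(int i) {
  }
}
```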
Hey - this is an awesome change. Would it be possible to enable this at runtime, programmatically? Or enable it in the suite file?
You can set this value via code by using the setter available in the TestNG main class.
If my @DataProvider method throws an exception, the TestNG report correctly shows that exception. However, the exit status from the TestNG execution is zero, indicating that all tests passed. If a DataProvider fails, the build should fail as well.