During automation regression cycles, some defects are carried over through the iterations, so the corresponding tests appear as failed in ReportNG. Since newly added tests may introduce new defects that also appear as failed in the report, it is sometimes difficult and time consuming to distinguish the old defects from the new ones. In this post we describe a technique for displaying the tests with known defects in a separate column in ReportNG.
The first step is to 'mark' the test steps in which the known defects appear. There are several ways to do this using features of the TestNG framework. One way is to use the expectedExceptions attribute of the @Test annotation.
expectedExceptions holds the list of exceptions that a test method is expected to throw. If no exception, or a different one from those on the list, is thrown, the test is marked as a failure. Assuming that the test step with the known defect fails at an assertion point, the code should look like this:
@Test(expectedExceptions = AssertionError.class)
public void testStepX() {
    // Code
    // Assertion point that fails
}
After this code is executed, the test step with the known defect is considered passed.
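The semantics can be illustrated with a small, self-contained sketch in plain Java reflection (no TestNG involved; the nested @Test annotation and all method names below are stand-ins for illustration): a method "passes" only if it throws one of its declared expected exceptions.

```java
import java.lang.annotation.*;
import java.lang.reflect.Method;

public class ExpectedExceptionsDemo {

    // Minimal stand-in for TestNG's @Test(expectedExceptions = ...)
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Test {
        Class<? extends Throwable>[] expectedExceptions() default {};
    }

    @Test(expectedExceptions = AssertionError.class)
    public void knownDefectStep() {
        // Known defect still present: the assertion fails
        throw new AssertionError("known defect still present");
    }

    @Test(expectedExceptions = AssertionError.class)
    public void fixedStep() {
        // Defect fixed: no exception thrown, so this step now fails
    }

    // Pass only if one of the declared expected exceptions is thrown.
    static boolean run(Object instance, String name) throws Exception {
        Method m = instance.getClass().getMethod(name);
        Class<? extends Throwable>[] expected =
                m.getAnnotation(Test.class).expectedExceptions();
        try {
            m.invoke(instance);
        } catch (java.lang.reflect.InvocationTargetException e) {
            for (Class<? extends Throwable> c : expected) {
                if (c.isInstance(e.getCause())) {
                    return true;  // expected exception -> pass
                }
            }
            return false;         // unexpected exception -> fail
        }
        return expected.length == 0; // no exception thrown at all
    }

    public static void main(String[] args) throws Exception {
        ExpectedExceptionsDemo d = new ExpectedExceptionsDemo();
        System.out.println(run(d, "knownDefectStep")); // true
        System.out.println(run(d, "fixedStep"));       // false
    }
}
```

This also shows why a fixed defect makes the test fail again: once the assertion no longer throws, the expected exception is missing.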
The second step is to collect all these tests, through a listener or a setup class.
If you have a super class that all your test classes extend, you can include the following code in it. (This technique uses the native dependency injection of the TestNG framework:
http://testng.org/doc/documentation-main.html#native-dependency-injection)
@AfterMethod(alwaysRun = true)
protected void ignoreResultUponExpectedException(ITestResult result) {
    // Note: this check relies on @Test being the first declared annotation
    // and on the string form of its expectedExceptions attribute.
    if (result.isSuccess() && result.getMethod().getMethod()
            .getDeclaredAnnotations()[0].toString().contains("expectedExceptions=[class")) {
        // Move the test from the passed results to the skipped ones,
        // with a distinctive message.
        result.getTestContext().getPassedTests().removeResult(result.getMethod());
        result.setThrowable(new Throwable("MARKED AS TEST WITH KNOWN DEFECT!!"));
        result.getTestContext().getSkippedTests().addResult(result, result.getMethod());
    }
}
An alternative way is to do this through a TestNG Listener:
import org.testng.IInvokedMethod;
import org.testng.IInvokedMethodListener;
import org.testng.ITestResult;

public class MyListener implements IInvokedMethodListener {

    @Override
    public void beforeInvocation(IInvokedMethod method, ITestResult testResult) {
    }

    @Override
    public void afterInvocation(IInvokedMethod method, ITestResult testResult) {
        if (method.isTestMethod()
                && testResult.isSuccess()
                && testResult.getMethod().getMethod().getDeclaredAnnotations()[0]
                        .toString().contains("expectedExceptions=[class")) {
            testResult.getTestContext().getPassedTests().removeResult(testResult.getMethod());
            testResult.setThrowable(new Throwable("MARKED AS TEST WITH KNOWN DEFECT!!"));
            testResult.getTestContext().getSkippedTests().addResult(testResult, testResult.getMethod());
        }
    }
}
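For TestNG to pick up the listener, it must be registered. One common way is through the testng.xml suite file (the suite name below is arbitrary, and the class-name assumes MyListener is in the default package; adjust it to your package):

```xml
<suite name="RegressionSuite">
  <listeners>
    <listener class-name="MyListener"/>
  </listeners>
  <!-- test definitions as usual -->
</suite>
```

Alternatively, you can annotate your test classes (or a common super class) with @Listeners(MyListener.class).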
At this point the tests with the known defects appear as skipped (yellow in ReportNG), with a specific exception message.
One could stop here, since the tests with the known defects are now in the skipped column. Nevertheless, this may also be confusing, because in some cases there are additional skipped tests: this happens when a TestNG configuration method fails (e.g. when @BeforeClass fails).
The point here is that if we want to separate the tests with the known defects further, we must make changes to the native ReportNG code and rebuild ReportNG.
The code is available under:
https://github.com/dwdyer/ReportNG/downloads
So we must import the ReportNG project into our IDE (Eclipse, NetBeans or IntelliJ IDEA). It is a good idea to convert it to a Maven project, in order to build the jar that we will use more easily.
The basic ReportNG class responsible for gathering and post-processing the TestNG results is HTMLReporter.java. The change in this class is in the createResults method, in order to put the tests with the known defects under a separate key in the Velocity context. ReportNG uses the Apache Velocity template engine to generate the results. For more info about Apache Velocity you can visit:
http://velocity.apache.org/
public static final String KNOWN_DEFECTS_TESTS_KEY = "knownDefects";

@SuppressWarnings("deprecation")
private void createResults(List<ISuite> suites, File outputDirectory, boolean onlyShowFailures) throws Exception
{
    int index = 1;
    for (ISuite suite : suites) {
        int index2 = 1;
        for (ISuiteResult result : suite.getResults().values()) {
            boolean failuresExist = result.getTestContext().getFailedTests().size() > 0
                    || result.getTestContext().getFailedConfigurations().size() > 0;
            if (!onlyShowFailures || failuresExist) {
                // Move the skipped tests that carry an expectedExceptions
                // attribute into their own result map.
                IResultMap skippedTests = result.getTestContext().getSkippedTests();
                IResultMap knownDefects = new ResultMap();
                for (ITestResult tr : skippedTests.getAllResults()) {
                    if (tr.getMethod().getMethod().getDeclaredAnnotations()[0]
                            .toString().contains("expectedExceptions=[class")) {
                        skippedTests.removeResult(tr.getMethod());
                        knownDefects.addResult(tr, tr.getMethod());
                    }
                }
                VelocityContext context = createContext();
                context.put(RESULT_KEY, result);
                context.put(FAILED_CONFIG_KEY, sortByTestClass(result.getTestContext().getFailedConfigurations()));
                context.put(SKIPPED_CONFIG_KEY, sortByTestClass(result.getTestContext().getSkippedConfigurations()));
                context.put(FAILED_TESTS_KEY, sortByTestClass(result.getTestContext().getFailedTests()));
                context.put(KNOWN_DEFECTS_TESTS_KEY, sortByTestClass(knownDefects));
                context.put(SKIPPED_TESTS_KEY, sortByTestClass(skippedTests));
                context.put(PASSED_TESTS_KEY, sortByTestClass(result.getTestContext().getPassedTests()));
                String fileName = String.format("suite%d_test%d_%s", index, index2, RESULTS_FILE);
                generateFile(new File(outputDirectory, fileName), RESULTS_FILE + TEMPLATE_EXTENSION, context);
            }
            ++index2;
        }
        ++index;
    }
}
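Note that the overview template below also calls $utils.getKnownDefects(...). There is no such method in stock ReportNG; the $utils variable in the templates is bound to the ReportNGUtils class, so a matching helper has to be added there as well. The following is a minimal sketch, assuming the same annotation-based marking as above (the method name getKnownDefects is simply the one the template expects):

```java
// In ReportNGUtils.java (the class bound to $utils in the templates).
// Returns the subset of the skipped tests that were marked via the
// expectedExceptions attribute, without modifying the original map.
public IResultMap getKnownDefects(IResultMap skippedTests)
{
    IResultMap knownDefects = new ResultMap();
    for (ITestResult tr : skippedTests.getAllResults())
    {
        if (tr.getMethod().getMethod().getDeclaredAnnotations()[0]
                .toString().contains("expectedExceptions=[class"))
        {
            knownDefects.addResult(tr, tr.getMethod());
        }
    }
    return knownDefects;
}
```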
The final changes should be done in the overview.html.vm, reportng.properties and reportng.css files:
In reportng.properties add the property:
knownDefects=Known Defects
In reportng.css add the style for the known-defect tests (I have set the color to pink):
.knownDefects {background-color: #ff3399;}
.test .knownDefects {background-color: #ff99cc;}
Finally, in overview.html.vm we make the most changes, in order to display the tests with the known defects in a separate column (see the changes and additions underlined). If you perform these changes, build a ReportNG jar (through mvn install) and add this jar to your project's classpath when generating the results, you will have a separate column with the known defects. Note that when a defect is fixed, the corresponding test will fail with a message like:
expected exception was ... but...
This is an indication that you must remove the expectedExceptions attribute from that test step. You can see below a preview of how the report should look.
#foreach ($suite in $suites)
<table class="overviewTable">
#set ($suiteId = $velocityCount)
#set ($totalTests = 0)
#set ($totalPassed = 0)
#set ($totalSkipped = 0)
#set ($totalFailed = 0)
#set ($totalKnownDefects = 0)
#set ($totalFailedConfigurations = 0)
<tr>
<th colspan="8" class="header suite">
<div class="suiteLinks">
#if (!$suite.invokedMethods.empty)
##<a href="suite${suiteId}_chronology.html">$messages.getString("chronology")</a>
#end
#if ($utils.hasGroups($suite))
<a href="suite${suiteId}_groups.html">$messages.getString("groups")</a>
#end
</div>
${suite.name}
</th>
</tr>
<tr class="columnHeadings">
<td> </td>
<th>$messages.getString("duration")</th>
<th>$messages.getString("passed")</th>
<th>$messages.getString("skipped")</th>
<th>$messages.getString("failed")</th>
<th>$messages.getString("knownDefects")</th>
<th>$messages.getString("failedConfiguration")</th>
<th>$messages.getString("passRate")</th>
</tr>
#foreach ($result in $suite.results)
#set ($notPassedTests = $result.testContext.skippedTests.size() + $result.testContext.failedTests.size())
#set ($total = $result.testContext.passedTests.size() + $notPassedTests)
#set ($totalTests = $totalTests + $total)
#set ($totalPassed = $totalPassed + $result.testContext.passedTests.size())
#set ($totalKnownDefects = $totalKnownDefects + $utils.getKnownDefects($result.testContext.skippedTests).size())
#set ($totalSkipped = $totalSkipped + $result.testContext.skippedTests.size() -$utils.getKnownDefects($result.testContext.skippedTests).size())
#set ($totalFailed = $totalFailed + $result.testContext.failedTests.size())
#set ($totalFailedConfigurations = $totalFailedConfigurations + $result.testContext.failedConfigurations.size())
#set ($failuresExist = $result.testContext.failedTests.size()>0 || $result.testContext.failedConfigurations.size()>0)
#if (($onlyReportFailures && $failuresExist) || (!$onlyReportFailures))
<tr class="test">
<td class="test">
<a href="suite${suiteId}_test${velocityCount}_results.html">${result.testContext.name}</a>
</td>
<td class="duration">
$utils.formatDuration($utils.getDuration($result.testContext))s
</td>
#if ($result.testContext.passedTests.size() > 0)
<td class="passed number">$result.testContext.passedTests.size()</td>
#else
<td class="zero number">0</td>
#end
#if ($result.testContext.skippedTests.size() - $utils.getKnownDefects($result.testContext.skippedTests).size() > 0)
#set ($skipped = $result.testContext.skippedTests.size() - $utils.getKnownDefects($result.testContext.skippedTests).size())
<td class="skipped number">$skipped</td>
#else
<td class="zero number">0</td>
#end
#if ($result.testContext.failedTests.size() > 0)
<td class="failed number">$result.testContext.failedTests.size()</td>
#else
<td class="zero number">0</td>
#end
#if ($utils.getKnownDefects($result.testContext.skippedTests).size() > 0)
<td class="knownDefects number">$utils.getKnownDefects($result.testContext.skippedTests).size()</td>
#else
<td class="zero number">0</td>
#end
#if ($result.testContext.failedConfigurations.size() > 0)
<td class="failed number">$result.testContext.failedConfigurations.size()</td>
#else
<td class="zero number">0</td>
#end
<td class="passRate">
#if ($total > 0)
#set ($passRate = (($total - $notPassedTests) * 100 / $total))
$passRate%
#else
$messages.getString("notApplicable")
#end
</td>
</tr>
#end
#end
<tr class="suite">
<td colspan="2" class="totalLabel">$messages.getString("total")</td>
#if ($totalPassed > 0)
<td class="passed number">$totalPassed</td>
#else
<td class="zero number">0</td>
#end
#if ($totalSkipped > 0)
<td class="skipped number">$totalSkipped</td>
#else
<td class="zero number">0</td>
#end
#if ($totalFailed > 0)
<td class="failed number">$totalFailed</td>
#else
<td class="zero number">0</td>
#end
#if ($totalKnownDefects > 0)
<td class="knownDefects number">$totalKnownDefects</td>
#else
<td class="zero number">0</td>
#end
#if ($totalFailedConfigurations > 0)
<td class="failed number">$totalFailedConfigurations</td>
#else
<td class="zero number">0</td>
#end
<td class="passRate suite">
#if ($totalTests > 0)
#set ($passRate = (($totalTests - $totalSkipped - $totalFailed -$totalKnownDefects) * 100 / $totalTests))
$passRate%
#else
$messages.getString("notApplicable")
#end
</td>
</tr>
</table>
#end