To illustrate how you can customize your intaQt test case executions, we'll use two Feature Files from a small project, which you might recognize from intaQt Studio Tips and Tricks:
The second Feature File tests a Smart Home interface, which reads the temperature from a thermostat display and compares against an expected value:
This Feature File includes two examples from which we read values. First, we check that `display1.jpg` shows a temperature of 20 degrees Celsius. Then we check that `display2.jpg` shows a temperature of 10.5 degrees Celsius.
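A Feature File like this could look roughly as follows. This is an illustrative sketch only: the Feature name and Step wording are assumptions, not the project's actual steps; only the two display images and expected temperatures come from the example above.

```gherkin
Feature: Smart Home temperature check

  Scenario Outline: Read the temperature from a thermostat display
    # Step wording is hypothetical; your project's step definitions may differ
    Given the thermostat display image <image>
    When the temperature is read from the display
    Then the temperature is <temperature> degrees Celsius

    Examples:
      | image        | temperature |
      | display1.jpg | 20          |
      | display2.jpg | 10.5        |
```

Using a Scenario Outline with an Examples table lets both display checks share a single set of Steps.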
We'll execute the Temperature Feature File by right-clicking inside the intaQt Studio editor window and choosing the run option from the context menu:
We see that all tests have passed, but we also want to see which Scenarios and Steps were executed. Therefore, we'll click the Show Passed Test button:
As expected, we see the two Scenarios with the different thermostat display values that were defined for our test case. We can expand these further by clicking on the arrow buttons to reveal the results from each Step executed by intaQt:
As you can see, the Webtest Scenario failed on the `on browser, search for query` Step. When a Step fails, all subsequent Steps are skipped:
We want to exclude this failing test from future runs. We can do this by going to the Run Configurations menu and selecting
Edit Configurations..., which will bring up a new window:
We had two test runs, and for each of those intaQt Studio created a default Run Configuration that looks something like this. We'll go through each item within the configurations and explain how they work:
- The Run Configuration Name can be chosen freely, but it must be unique; otherwise it may conflict with other configurations.
- The Feature File or directory is the relative path, starting from the project root, that points to the Feature File or directory we want intaQt to execute.
- The hostname and port identify the intaQt instance we want our tests to run on. If you have a custom intaQt installation, you may choose a different hostname and port.
- The Tags configuration is very important. By default, you will see `~ignored`, which means intaQt will only execute tests that are not tagged with the `@ignored` annotation. If we tag the failing Scenario with `@ignored`, then the next time we run our entire `features` directory, we see that this Scenario is no longer executed:
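Tagging a Scenario works the same way as in standard Gherkin: the tag goes on the line directly above the Scenario keyword. A minimal sketch, where the Scenario name is a hypothetical stand-in for the failing Webtest Scenario:

```gherkin
@ignored
Scenario: Search Google and verify the first result
  # This Scenario is skipped whenever the Tags field contains ~ignored
```

With the default `~ignored` Tags value, intaQt filters this Scenario out of the run; removing the tilde inverts the filter so that only tagged Scenarios execute.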
On the other hand, we might want to execute a test based on its annotations. For example, we might want to execute all the ignored Scenarios inside our project. We'll select Edit Configurations... again, delete the tilde so that the Tags field reads `ignored`, and then save the configuration:
When we run the project again, it only runs the annotated test, as expected. Of course, this test fails, but it's interesting for us to see why it fails:
The failing Step takes two arguments: the `query` and the `expectedFirstResultName`. We expected both arguments to be references to Context Objects for our test run. By clicking on the failed Step, we can see an excerpt of the intaQt log, which contains an error message:
The problem, as you can see above, is that `query` and `expectedFirstResultName` are not yet defined inside our project. That's why those properties cannot be resolved. One approach to fixing this problem is to define `query` and `expectedFirstResultName` as Context Objects inside our project configuration.
In our project configuration file, we'll define two Context Objects:
For the `query`, we specify the search term `qitasc`. For the `expectedFirstResultName`, we expect the first search result to be the QiTASC website, which has the title "QiTASC - the magic of testing".
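In a HOCON-style project configuration file, the two values could be sketched roughly like this. This is illustrative only: the exact Context Object syntax depends on your intaQt version, so consult the intaQt documentation for the precise format.

```hocon
# Hypothetical sketch -- check the intaQt docs for the exact Context Object syntax
query = "qitasc"
expectedFirstResultName = "QiTASC - the magic of testing"
```

Once both values resolve, the `on browser, search for query` Step can look them up instead of failing on undefined properties.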
When we run the Google Webtest again, we'll see a browser window open and the word `qitasc` typed into the Google search bar. The first search result is what we wanted:
Would you like more instructions about creating and executing test cases? Visit intaQt Tutorials on the QiTASC Resource Center for out-of-the-box test case examples!