Spiratest Best Practices
Guidelines for writing Test Cases
The test cases are used by people of varying technical backgrounds and familiarity with the product, so they should be usable by anyone with minimal technical knowledge of the system. The following guidelines should be followed when creating test cases. Please refer to the following test case as an example: http://spiratest.pentaho.com/21/TestCase/7069.aspx. Versioning test cases is explained in the following document: http://iwiki.pentaho.com/display/QA/Versioning Spiratests.
Test Cases
Organization
Organize Test Cases by depth and module. Test Cases should be grouped into folders by depth of testing and then by module. However, try to avoid over-organizing; we do not need 12 levels of folders. Sub-folders should only be used if a folder holds an excessive number of test cases. The same test cases will be used across multiple versions of the products, so organizing by version would only unnecessarily expand the test case library. The following folders should be used:
- In Progress - Holds test cases that have not been finished.
- In Review - Holds test cases that have not yet been accepted into the Test Plan.
- Sniff Test - Placed in the "Smoketest" folder. A single test per project that covers the major functionality of the project. This test should take 15-30 minutes to run.
- Smoketest - No sub-folders. Only one smoketest per module. A module is a major section of the project such as "Filtering" or "Import / Export".
- Create Smoketest Test Cases for each module. Smoketest Test Cases are needed for all modules so that each module can be quickly tested for basic functionality. Only create one smoketest test case per module!
- Smoketest Test Cases should only hit the critical path. This means that we need to determine what steps the average user performs (Ex. log in, save, edit). It is best to test only the default options. The exception is build items that would cause problems if they are not tested. Ex. the correct build or version number.
- Don't test the little stuff. Smoketests should not include non-critical tests. Do not test items that do not block the user from proceeding. Ex. help links, resizing columns, bells and whistles. These can be tested in Functional tests.
- Smoketests are designed to be run as quickly as possible. For this reason, try to structure the steps to provide the most coverage with as little backtracking or repetition as possible. Think of the smoketest as a script of someone trying to accomplish a basic but common task. Ex. Start server, login, connect to repo, create new transformation, add input step, add filter, add hop, add output step, add hop, run transform, save transform, new job, add transform to job, run job, schedule job, sign out, stop server, etc.
- Functional - Folders for each module. Test cases should cover functionality that is not covered in the smoketest. For example, the smoketest could test the ability to create a SQL data source, while a functional test could test using specific types of SQL queries.
- Exception - Folders for each module. Test cases should cover scenarios that don't happen often. ex. negative tests or edge cases.
- Helper - Test cases that are used by other test cases. These are not run by themselves but are frequently referenced by other test cases. Ex. setting up a specific configuration.
- Trash - Only admins have the ability to delete test cases. If you wish a test case to be deleted from the system, please place it in this folder.
New Test Cases
When creating new test cases, please follow this process:
- Create a new test case in the "In Review" folder for the project that applies to the test case.
- Set the Active status to "No"
- Write the Test Case using the Best Practices listed below.
- Set the "Estimated Duration" to a reasonable approximation of how long it will take to perform the test.
- List any related JIRA tickets in the description. Make sure that the Test case number is listed in the "Spiratest" field of the JIRA ticket it covers.
- When you are finished, please email bbruce@pentaho.com and gdavid@pentaho.com with a link to the test case.
- Brandon or Golda will reply with any edits that are needed.
- Make the fixes and inform them that the test case is ready to be reviewed again.
- Once the Test Cases are approved, they will be set as Active and added to the Test Plan.
Modifying existing Test Cases
- If an existing test case needs to be modified, copy the Test Case using the Copy button.
- Move the Copy to the "In Progress" folder and rename by adding "edit" to the end of the name.
- Set the Active status to "No"
- When the test case is ready for review, move the test case to the "In Review" folder
- When you are finished, please email bbruce@pentaho.com and gdavid@pentaho.com with a link to the test case.
- Brandon or Golda will reply with any edits that are needed.
- Make the fixes and inform them that the test case is ready to be reviewed again.
- Once the Test Cases are approved, they will be set as Active and added to the Test Plan. The new test will replace the old as the active version. Golda and Brandon will handle this.
Best Practices
Test Cases
- Be Descriptive but not overly specific.
- Use descriptive names.  The purpose of a Test Case or Test Set should be obvious by reading the test case name.  This will make it easier to assign Test cases to Test Sets or other Test Cases. However, try to keep the name as short as possible. ex. "Create DB connection - Wizard" instead of "Testing creating database connections using the Wizard"
- Use descriptive descriptions.  The Test Case description should include a narrative description of the test case to enable a tester to best understand the purpose of the test. You do not need to specify everything being tested, just the general purpose of the test. Ex. The purpose of this test is to identify whether the data source wizard is correctly identifying content.
- List JIRA tickets affected. The description should include a list of JIRA tickets that are covered by the test case.
- Keep it simple. To avoid redundancy, each test case should have a very simple purpose. A Test Case should not test multiple different types of things. Complex test cases can be created by combining test cases if necessary. Multiple tests can be part of the same test case, but only if the tests are very basic and closely related. Smoketests are an exception, but they should still only contain tests for a specific module. Ex. a "basic UI" test case could test that scrollbars, icons, etc. are visible for a module.
- Avoid redundancy. We do not need multiple tests that test the same thing. If a test is checking the functionality of a module, you do not need to test whether you can log in. Each test step in a test case should contain a single specific test. If a step does not need to be verified for the current test, include it in the description but do not give it an expected result. As a rule, we should have smoketests for basic functionality and functional tests that build on the base case.
- Enter accurate execution times. Once you have finished creating each test case, perform the test and record the time taken to perform it (assuming the test passes). Entering arbitrary execution times does not help in planning. Ex. 10 minutes to stop the server is excessive if it is working correctly.
- Provide any needed test data. If a specific file or resource is needed for a test, attach the resource to the test case or provide a description of how to acquire the resource. For example, if an MS Access database is needed to test creating an MS Access Input step, provide directions on how to get the database.
- Connect Test Cases with JIRA when possible. When a new issue in JIRA is being tested, we should document which Spiratests apply to the issue. This can be done using the following steps:
- Open the JIRA issue for which you are writing the test case.
- Edit the JIRA Issue using the "Edit" Button at the top.
- Find the "Spiratests" field at the bottom of the page, directly above Comments
- Paste the links of all related Spiratests into that text box, separated by commas.
- Save the changes. Notice that the Spiratest links appear in the Details section of the JIRA case.
- Do not delete Test Cases.  A Trash folder should be available in the list.  Move any obsolete test cases to this folder IF THEY HAVE NOT BEEN EXECUTED. The Spiratest admin will delete the test cases from this folder periodically. If the tests have been executed, move them to the Archive folder under the last version the test was applicable to. This way we can keep records of tests that were run even after they are obsolete.
- Test Cases should only cover the module or project being tested. Each validation should be limited to the modules being tested as much as possible. We do not want to be affected by failures in other modules when testing a module.  This includes other projects and other areas of the same project. Consider publishing to the BIServer as an example. Other modules use the BIServer, but we do not want our test to be affected by them. The test would be that the tested module publishes correctly. The following test cases would be needed:
- BIServer - test that published items can be viewed and run - Only test in BIServer
- PDI - test that items can be published - Only test in PDI and filesystem
- State the expected starting point. When a test case starts, you assume the user has done certain things. If actions are expected to have been performed, add a "Precondition" section to the first step of the test case that explains what needs to be done. Otherwise, specify where in the project the tester should be when the test starts. Ex. Click on the "New" button in the Data Source Wizard.
- <Obsolete - We don't encourage parameterization as much anymore> Parameterize when possible! There are times when the same tests need to be run using different parameters. For example, each chart type in Analyzer needs to have the same tests performed against it. To prevent having to write and manage a separate test case for each chart type, you can create the test case using a parameter and then create test cases that call the general test case and assign the parameter. For a reference, see http://spiratest.pentaho.com/8/TestCase/6492/TestSteps.aspx. Use the following steps to add parameters:
- Create the general test case in Spiratest. Ex. Test chart options
- Click the "Edit parameters" link
- Add parameters ex. Chart type
- Add test steps with the parameter placeholder in place of the parameter value. The placeholder is a dollar sign followed by the parameter name in curly brackets, ex. ${ChartType}. See the example after these steps.
- Create a new test case to test a specific parameter value. Ex. Test options for Bar chart
- Choose the "Insert link" option and choose the previous test case
- Click the Edit button to the right of the test step
- Set the parameter to the desired value. Ex. ChartType = "Bar chart"
- Assign the new test case to a test set
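For illustration, the result of the steps above might look like the following (the step text is hypothetical; the test and parameter names follow the examples above):
- The general test case "Test chart options" contains a step such as: Create a new ${ChartType} chart and verify that the chart options are applied correctly.
- The calling test case "Test options for Bar chart" links to "Test chart options" and sets ChartType = "Bar chart". When the test is run, ${ChartType} is replaced with "Bar chart" in each linked step.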
Test Steps
- Test Step descriptions should be in the form of an action.  Each test step should be a command to do something. Statements that provide information but do not describe an action should never be their own step; they should be combined with the action they refer to. We should not have statements like "Note: the file used to be named something else" since they are not commands. If a note is absolutely necessary, add it to the expected results.
- The expected result should be a statement.  The expected result is meant to be a list of results.  It should not include actions. Use statements like "ABC should be present" instead of actions like "Verify that ABC is present". See the example after this list.
- Be concise! The more text someone has to read, the less likely they are to finish reading it. Each test step should be as short and clear as possible. Do not provide explanation unless it is important to the test step. For example, if you tell a tester to select a file from a folder, do not provide an example filename unless they need to select that specific file. Examples should be descriptive, not filler.
- Be specific!  As a general rule, if the tester has to guess about what to do next, it is a poor test.  Generalized steps should only be used when the specifics do not affect the item being tested. Notice in Step 4 of the example that the specific details of the connection settings are listed but the password is labeled as "user dependent" since it depends on what the user enters as the original password.
- Be clear! It is important that you proofread your test cases for grammatical accuracy. Typos and poor grammar make test cases hard to follow and can sometimes change the intended meaning drastically. For example, typing "now" when you meant "not" can reverse the intended meaning of a result.
- Format for clarity. Use formatting to make the test steps more readable when multiple items need to be listed in a step, such as multiple actions to be performed or multiple aspects of a validation. A long run-on sentence can be hard to decipher. If you have a collection of items, use a bulleted list. If you are describing things that must happen in a specific order, use an ordered list. When using formatting in Spiratest, use the built-in formatting; if you manually number a list, it becomes hard to maintain when adding and removing steps. Notice in Step 4 of the example that each line of the connection credentials is bulleted instead of shown on one line. This makes it easier to read.
- Do not refer to other steps by number! If you say things like "do the same as steps 2-4", the test will be unusable if more steps are added since the numbers won't match.
- Don't repeat yourself!  Each test case should only test the focus of the test.  Do not include tests that are covered elsewhere.  If an action or actions must be performed before a test step can be performed, the actions can be entered in the same step as the test, but only the expected result should be tested. Step 1 of the example states that you must create a database, but there is no test for this: creating a database is tested in another test case, so it can be used as setup for creating a database connection without needing its own test.
- Show Preconditions in the first Test Step.  If actions need to be performed before a test case is run, add a precondition block in the first step, not the description.  The necessary steps can be entered in the block.  A link to another test case can also be used. See Step 1 of the example.
- Only test one thing per Test Step.  Each test step in a test case should exist for a specific purpose.  Test Steps should only have multiple steps if each step must be performed before testing the desired item.  If the test step has multiple acceptance criteria, it is difficult to track why the step has failed.  Each step should have one specific reason for being run. If the events happen at the same time, the description should specify where to look for the event. Ex. Look at the File Browser.
- Do not show steps for things that we do not control.  For example, never mention "Download x product from x build location" in any test step.  The ability to download a file to your computer is not a function of our product, so it does not need to be tested.  You would start any test case from the point of having the build. Do not have tests for items like the following: editing configuration files, setting up databases, etc.
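As an illustration of the action/statement rules above (the step text is made up):
- Test Step description (an action): Click the "Save" button on the toolbar.
- Expected Result (a statement): A confirmation message is displayed and the file is listed in the File Browser.
Avoid phrasing the expected result as an action such as "Verify that the file was saved."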
Guidelines for writing Test Sets
- Test Sets are used to organize test cases for easier assignment and tracking. Test Sets will primarily be created by the QA Lead/Manager.
- Set environment parameters by Test Set. Each test will need to be performed on multiple operating systems, browsers, databases, etc. Every Test Set has custom properties for OS, Browser and Database. Each Test Set should set these properties accordingly. To set these properties, do the following (see the example after these steps):
- Create a Test Set
- Add the desired test cases
- Choose the Custom Props tab
- Set the appropriate properties
- Copy the Test Set
- Edit the copied Test Set and change the properties to the next configuration
- Repeat this for all desired test environments
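For illustration (the Test Set name and property values are hypothetical): a Test Set named "PDI Smoketest - Windows / Chrome / Oracle" could set OS = Windows, Browser = Chrome, Database = Oracle. Its copy would then be edited to OS = Linux, Browser = Firefox, Database = MySQL, and so on until every supported configuration has its own Test Set.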
- Organize Test Sets by version. Test Sets are a scheduled and assigned list of test cases. Since they are specific to the configuration being tested, they should be versioned. When a new version is created, a new folder should be created with the name of the version. All previous Test Sets should be copied to the new version unless they are no longer valid. Any new Test Sets should be added to this version, and any modifications of existing Test Sets for the new version should be done here. Test Sets should not be modified once they have been completed.