18649 - Testing Requirements


Please submit all project-related questions to {email} -- thanks!
This document describes the procedure and requirements for unit, integration, and acceptance testing.  Over the course of the project, you will use these testing requirements to test your entire elevator.  Be sure to read this document carefully.

Introduction:  A Note on the Usefulness of Testing

Testing is an important, but often neglected, part of the design process.  It's not as fun as design or writing code, so it is even more important that you start early and allocate enough time to be thorough.  Like the rest of the process, its usefulness is directly related to the amount of effort you put into writing your tests.  If you write bad tests, the lesson you should take away from that experience is "it is important to write good tests", NOT "testing is a waste of time".

If you patch together a unit or integration test file that will "probably work," and then run the test, see where it fails, and adjust the test accordingly, then you probably won't find any bugs in your tests because you've tuned your test to match your design.  In the real world, it's common for testing to be done by a separate team for just this reason.  Although you do not have the luxury of a separate testing team, you can certainly have someone other than the designer write the tests.  Write unit tests while looking only at statecharts, and integration tests while looking only at sequence diagrams.  Be as thorough as you can.  If you write good tests, you'll catch bugs and be able to refine the design early on.  If you just go through the motions of unit and integration testing, you may still be able to grind out the major bugs by debugging acceptance tests, but it will take a long time, and there is a good chance you'll overlook some subtle bugs.

Simulator Testing Framework

Unit and Integration Testing Framework

The simulator can be configured to read a plain text file describing messages (and physical inputs) to be sent, and to actually send them out at specified times.  It can also be configured with assertions that check the values of physical state and network messages at specified times.  Use the -cf flag to specify the configuration (which object is instantiated for the unit test), and the -mf flag to specify the message inputs for the unit test.

The message input file format is described in the simulator command line reference (run the simulator with no parameters), but there are five basic line types:

Because of jitter in the network and the way the assertions are checked, you must wait longer than two message periods to be sure that the assertion sees the updated value.  That is to say, if the verbose output of your car call controller indicates that it has set the output to "true" at time 3.0s and the message (or framework value) is being updated every 100ms, you are not guaranteed that the assertion will read the updated value until after 3.2s (e.g. 3.21 s).
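The two-message-period rule above can be captured in a small helper when you are scheduling assertion times in a test file.  This is a sketch of the arithmetic only; the function name is ours and is not part of the simulator framework:

```python
def earliest_assertion_time(output_change_s, message_period_s):
    """Return the boundary time after which an assertion is guaranteed
    to observe an updated value, per the two-message-period rule.
    Schedule the assertion strictly LATER than the returned time."""
    return output_change_s + 2 * message_period_s

# Example from the text: the controller sets its output at 3.0 s and the
# message is updated every 100 ms, so assert only after 3.2 s (e.g. 3.21 s).
print(earliest_assertion_time(3.0, 0.1))  # 3.2
```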

The assertion format only allows you to check the value of one field.  If you wish to check multiple values, you can create multiple assertions.

The message injector file has a simple macro syntax:  #DEFINE <macro> <value>.  There is also an include directive:  #INCLUDE <filename>.  These are explained in more detail in the command line documentation.  You can use the -pd option to print a list of #DEFINE statements for all network and framework messages.  We STRONGLY advise you to use this list in an include file, since your CAN ids will change throughout the semester, and it will make it much easier to keep your tests up to date.
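As an illustration of the macro and include directives, an include file generated from the -pd output might look like the fragment below.  The macro names and values here are hypothetical, and the ";" comment syntax is assumed from the other examples in this document; use the actual list printed by -pd for your system:

```
; message_ids.mf -- hypothetical macros captured from the -pd output
#DEFINE CAR_CALL_1_FRONT_CAN_ID   0x0123
#DEFINE DOOR_MOTOR_FRONT_LEFT_ID  0x0456

; At the top of each unit or integration test file:
#INCLUDE message_ids.mf
```

Because the test files reference only the macro names, regenerating this one include file is enough to keep all of your tests current when your CAN ids change.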

You can refer back to the test files in the testlight example to see a working example of message injection and of message and controller state assertions.  The example is a unit test, but this framework can also be used for integration testing by using injections for arcs that come from system objects (these are test inputs) and assertions to check the arcs that come from control objects (these are test outputs).

Acceptance Testing Framework

Acceptance testing consists of instantiating the entire elevator, including the network and framework message architecture, all system objects and smart sensors, and all distributed controllers.  This is handled automatically by the simulator.  Once the elevator is instantiated, passenger objects interact with the physical interface by pressing buttons, passing through doors (sometimes triggering reversals), and adding weight to the elevator.

Acceptance tests can be started using the -pf flag.  Acceptance test files consist of a list of passengers.  Each passenger is defined by the landing at which they enter the system and their destination.  Read the simulator command line reference for more information.

Unit Testing

If you have not already done so, go read the Introduction and Simulator Testing Framework sections at the beginning of this document.

Unit testing for this project means testing the behavior of a single controller, such as a door controller or the dispatcher.  These tests are derived from the system design and shall completely cover all states and transitions in your state diagrams, to ensure that the code you write actually implements the intended design.

You are responsible for implementing and testing the objects you've designed.  You are not responsible for testing environmental objects like the Safety object or the smart sensors.  You can assume that we will test these objects.  By the end of the project, you will design tests for all seven of your controllers.

Summary of Unit Testing:

Writing Unit Tests

The idea with unit tests is to generate a set of files that is sufficient to completely test a single controller implementation. We are going to leave the details of how to accomplish this to you, but to be correct a test shall address the following points:

Traceability for Unit Tests

Unit tests must cover all states and transitions in the statechart of the object being tested.  To accomplish this, you will add two specific types of traceability comments to your message injector (.mf) files.  This is in addition to the comments in the header that describe the test.

Peer Reviews for Unit Tests

Note that it is possible (likely!) that your unit tests will have bugs.  For that reason, someone in the group other than the author must review each unit test for accuracy.  The review shall include the following checks:

For each test, add the checklist, the unit test author's name, and the name of the person who performed the review to the peer review section of the Unit Test Log in your portfolio.
Note:   If a peer review results in changes to the unit tests, those changes shall be logged in the issue log.

Unit Test Log

Instead of simply running tests, finding bugs, and fixing them, we will ask you to create a test log. This log will allow you and us to quickly see what tests you have run, their outcomes, and any changes you had to make. A test log serves as a simple tool to measure the coverage of your tests and to highlight areas of your design where there were problems.

A template for the unit test log has been included in the portfolio template.  Be sure that you include the following items:
Note:  For the unit tests that you have written but are not required to execute, you may put "N/A" in the boxes for test results.

Unit Test Summary File

In order to facilitate automated testing, we require a machine-readable text input format that summarizes your unit tests.  The requirements for the test summary file are as follows.  Note that this requirement is in addition to the HTML summary of your tests in the Unit Test Log of the portfolio.
The portfolio template contains a placeholder file with the correct name and location and an example entry.

You can use the test_verify.sh script to check your summary file.  The syntax is
    ./test_verify.sh   <summary file name>
This script must be run from the unit_test/ directory of your portfolio.

Integration Tests / Sequence Diagram Tests

If you have not already done so, go read the Introduction and Simulator Testing Framework sections at the beginning of this document.

Throughout the project writeup, we will use the terms Integration Test and Sequence Diagram Test interchangeably.  With respect to the simulator framework, integration tests work much the same way as unit tests.  For each sequence diagram, you will generate a .cf file with a list of controllers for that test.  You will also generate a .mf file which defines messages to be injected and assertions to be checked.  Consult the command line documentation and the discussion in the unit testing section above for more details.

Summary of Integration Testing:

Writing Integration Tests

The idea with integration tests is to generate a test that is sufficient to completely test one sequence diagram.  Although you may instantiate only one controller (depending on which sequence diagram you choose), this is a form of integration testing because it tests the interaction between various objects in the system.  We're going to leave the details of how to accomplish this to you, but to be correct a test must address all of the points below:

Traceability for Integration Tests

Each network or framework message injection corresponds to an arc originating from a system object.  Each arc that originates from a controller object is tested with an assertion.  Each of these injections or assertions shall be preceded by a comment of the form:

  ;#arc '1A/1b'

where 1A refers to the sequence diagram being tested and 1b is the number of the arc being tested or injected.

Peer Reviews for Integration Tests

For each sequence diagram test, have someone other than the test author perform the following peer review on each test:

For each test, add the checklist, the integration test author's name, and the name of the person who performed the review to the peer review section of the Integration Test Log in your portfolio.

Integration Test Log

Instead of simply running tests, finding bugs, and fixing them, we will ask you to create a test log. This log will allow you and us to quickly see what tests you have run, their outcomes, and any changes you had to make. A test log serves as a simple tool to measure the coverage of your tests and to highlight areas of your design where there were problems.

A template for the integration test log has been included in the portfolio template.  Be sure that you include the following items:
Note:  For the sequence diagram tests that you have written but are not required to execute, you may put "N/A" in the boxes for test results.

Integration Test Summary File

In order to facilitate automated testing, we require a machine-readable text input format that summarizes your integration tests.  The requirements for the test summary file are as follows.  Note that this requirement is in addition to the HTML summary of your tests in the Integration Test Log of the portfolio.
The portfolio template contains a placeholder file with the correct name and location and an example entry.

You can use the test_verify.sh script to check your summary file.  The syntax is
    ./test_verify.sh   <summary file name>
This script must be run from the unit_test/ directory of your portfolio.

Acceptance Tests

If you have not already done so, go read the Introduction and Simulator Testing Framework sections at the beginning of this document.

Acceptance testing involves instantiating the entire elevator control system and testing its operation with simulated passenger objects.  Throughout the project, you will be provided with various acceptance tests.  Occasionally, you will also be required to write some of your own acceptance tests. 

In addition to the required tests, you are encouraged to make up your own tests to exercise your elevator system.  Very simple (e.g. one-passenger) acceptance tests that target a specific problem may be easier to analyze and debug.  More complex acceptance tests can help you identify additional problems with your implementation.

The acceptance test file (.pass file) defines the basic parameters for one or more passengers.  The system acceptance test file format is:

<entry time> <start_floor> <start_hallway> <destination_floor> <destination_hallway>  ;  <comment_field>
The command line reference contains a more complete description of these fields in the description of the -pf flag.  Tests with this input file are run with all modules in the system loaded.  The entire system is initialized before the testing begins.  An example input for the system integration test is:

15s 1 BACK 3 FRONT    ; person A going up
15s 7 BACK 1 FRONT    ; person B going down

This means that at the same time, t = 15 s, two people show up: one person on the 1st floor back hallway wanting to go to the 3rd floor front hallway, and one person on the 7th floor back hallway wanting to go down to the 1st floor front hallway.  Keep in mind that certain floors do not have certain hallways, and if you mistakenly assign people to hallways that do not exist, you will receive an error!  Here is the hoistway setup.  Keep this handy when performing your acceptance tests.

Floor #   Hallways (FH = front hallway, BH = back hallway)
8         FH
7         FH and BH
6         FH
5         FH
4         FH
3         FH
2         BH
1         FH and BH
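The hoistway table above can be encoded in a small checker to catch invalid passenger entries before you run a test.  This is a sketch of our own, not part of the simulator; the hallway names follow the FRONT/BACK spelling used in .pass files:

```python
# Hallways that exist at each floor, per the hoistway table above
# (FH -> "FRONT", BH -> "BACK").
HOISTWAY = {
    1: {"FRONT", "BACK"},
    2: {"BACK"},
    3: {"FRONT"},
    4: {"FRONT"},
    5: {"FRONT"},
    6: {"FRONT"},
    7: {"FRONT", "BACK"},
    8: {"FRONT"},
}

def hallway_exists(floor, hallway):
    """Return True if the given floor/hallway pair exists in the hoistway."""
    return hallway in HOISTWAY.get(floor, set())

# The example passengers above use valid pairs:
print(hallway_exists(1, "BACK"))   # True
print(hallway_exists(7, "BACK"))   # True
# Floor 2 has no front hallway, so this entry would be rejected:
print(hallway_exists(2, "FRONT"))  # False
```

Running every (floor, hallway) pair in a .pass file through a check like this is a quick way to avoid the "hallway does not exist" error described above.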

There are no peer reviews for acceptance tests.

Because of the pseudorandom behavior of the simulator, you may find that some bugs occur in a test when run with one random seed, but not when run with another.  For this reason, you should consider running tests multiple times with different random seeds.

Acceptance Test Log

Instead of simply running tests, finding bugs, and fixing them, you will create an Acceptance Test Log. This log will allow you, your teammates, and the course staff to quickly see what tests you have run, their outcomes, and any changes you had to make as a result of testing. A test log serves as a simple tool to measure the coverage of your tests and to highlight areas of your design where there were problems.

All test files and output files must be included in the acceptance_test/ directory, and may NOT be placed in a subdirectory or any other location.

You shall save the results of the acceptance tests and include them in your project portfolio.  A template for the acceptance test log has been included in the project portfolio.  Be sure that you include the following items:
