18-649 Project 12
Check the course webpage for due dates.
Please direct all project-related correspondence to the course staff.
Changelog:
- 11/14/2014 Added Changelog
- 11/14/2014 Must justify acceptance tests that do not pass
In this project, you will finish your runtime monitoring, complete unit and integration testing, and begin acceptance testing.
Assignment:
1. Acceptance Stress Testing
1.1 Run Some More Acceptance Tests
We've provided a few more acceptance tests for you. You should look at
these tests as good examples of tests that your generator (in part 1.2)
might generate.
You should run these tests and update your Acceptance Test Log. This
includes linking the test files and filling in a row of the table with
the test statistics. You must also update the test results for the
acceptance tests from previous projects with the results from your new
elevator design.
Any test that you do not pass should have a justification in the notes
section of your Acceptance Test Log that describes the bug causing the
failure. This justification should include the number of the bug in the
Issue Log (create a new entry in the log if you need to).
1.2 Write an Acceptance Test Generator
For this project you will need to write an acceptance test generator.
Your acceptance test generator:
- Must be a command line tool
- Must be called AcceptTestGenerator
- Must take at least 1 argument: the number of passengers in the test
- May take more arguments if you decide to do so
- If called with no arguments, must print usage details, along with an example command line call
- Should make tests that roughly follow the criteria given in project 8
- Every landing should be used at least once as a source for a
passenger, and at least once as a destination. Note that some floors
have more than one landing (e.g. 1 FRONT and 1 BACK), so both the
1 FRONT and 1 BACK landings must be used at least once in the test.
- Landings that do not exist (e.g. 2 FRONT) must not appear in the
test. (Such landings will cause the passenger injector to fail and the
simulator to crash.)
- You do not need to add passengers to the car at the beginning of the
test, as is done in proj12acceptance2.pass, but you may do so if you
wish. You cannot add passengers to the car at any time after 0s; doing
so will cause the elevator to throw a runtime error.
Since your test generator will use some form of random number
generation, it is okay if not every generated test satisfies these
conditions; however, your generator must be able to create tests that
satisfy all of the criteria. For example, a generator that only sends
passengers from 1 FRONT to 2 BACK is not acceptable.
Your acceptance test generator may be written in any programming language you like; however, it must run on the ECE machines. Additionally, if the generator is written in a compiled language, you must include the source as well as an executable that will run on the ECE machines. For this reason, we highly suggest a scripting language such as Bash, Perl, or Python.
Here is a template of a bash script that will generate some tests; as written, the tests it generates do not yet meet the criteria above.
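This is a minimal sketch only. The LANDINGS list and the passenger-injection line format are assumptions made for illustration: copy the real landing set from your elevator configuration and the real injection syntax from the provided .pass files (e.g. proj7acceptance1.pass).

    #!/bin/bash
    # AcceptTestGenerator -- sketch of an acceptance test generator.
    # ASSUMPTIONS: the LANDINGS set and the injection-line format are
    # placeholders; replace them with your elevator's real landings and
    # the real syntax used in the provided .pass files.

    if [ $# -lt 1 ]; then
        echo "Usage: $0 <num_passengers>"
        echo "Example: $0 20 > stress_test_1.pass"
        exit 1
    fi

    NUM_PASSENGERS=$1

    # Placeholder landing set (floor 1 has FRONT and BACK landings).
    LANDINGS=("1 FRONT" "1 BACK" "2 BACK" "3 BACK" "4 BACK" \
              "5 BACK" "6 BACK" "7 BACK" "8 BACK")

    # Pick a landing uniformly at random.
    pick_landing() {
        echo "${LANDINGS[$((RANDOM % ${#LANDINGS[@]}))]}"
    }

    for ((i = 0; i < NUM_PASSENGERS; i++)); do
        src=$(pick_landing)
        dst=$(pick_landing)
        # Re-draw the destination until it differs from the source.
        while [ "$src" = "$dst" ]; do
            dst=$(pick_landing)
        done
        # Placeholder injection line; all passengers are injected at
        # 0s, since later injections cause a runtime error.
        echo "0s PASSENGER ${src} ${dst}"
    done

Because the draws are random, a single run will not always use every landing as both a source and a destination; spot-check the output (or extend the script to verify coverage and retry), since your generator must at least be capable of producing tests that meet the criteria.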
Include your test generator in your /acceptance_test folder.
1.3 Create and Run 40 Tests
Using your test generator, create and run 40 tests. Link them in your
Acceptance Test Log in a separate table below the first. Use
this table for stress tests, and the original table for the acceptance
tests that have been provided (e.g. proj7acceptance1.pass) or that you
have created yourself.
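If it helps, a small driver script can generate and run all 40 tests in one pass. This is a sketch only: the simulator invocation is a placeholder for whatever command line you already use to run the provided acceptance tests.

    #!/bin/bash
    # Sketch: generate and run 40 stress tests with 20 passengers each.
    for i in $(seq 1 40); do
        ./AcceptTestGenerator 20 > stress_test_${i}.pass
        # Placeholder -- substitute your usual simulator command, e.g.:
        # <your simulator command> stress_test_${i}.pass > stress_test_${i}.log
    done

Record the statistics from each run in the stress-test table of your Acceptance Test Log.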
2. Write a Monitor
Write runtime requirements monitors for RT-8.1, RT-8.2, and RT-8.3, and
add them to the RuntimeRequirementsMonitor Java file. Information about
runtime monitoring can be found in Project 8. If there is a group
member who has not yet written a monitor, they must write one for this
project. If you are in a group of three and everyone has already
written a monitor, then this task can be assigned to any group member.
We highly advise that you run your monitors on your mid-semester
elevator (Sabbath mode) to make sure your monitors do in fact catch
violations of the high-level requirements. If your project 7 elevator
does meet any of the high-level requirements, we advise that you
introduce behaviors that cause violations, so as to ensure your monitor
is not generating false negatives.
For the final demos, you will be required to pass your acceptance tests
with no warnings generated by the course monitor. The course monitor is
quite lenient, but you should still make sure your monitors are
actually detecting faults, to give yourself plenty of time to debug
your code prior to the demos.
3. More Testing
3.1 Complete Unit Testing
Run all unit tests. For this project, all unit tests must pass
(0 failed assertions).
You should have already passed all of your unit tests for the last
project, so this should be a freebie.
3.2 Complete Integration Tests
First, double-check your sequence diagram to requirements traceability
to make sure it is up to date. Make sure your sequence diagrams are
consistent with the design changes introduced in project 8 (and any
updates you made in projects 9 and 10), as well as with your updated
message dictionary from project 11.
If needed, review the Integration Testing section of the Testing
Requirements document.
Once your sequence diagrams are fully up to date, you will finish
updating your integration tests, completing the remaining integration
tests that you didn't update last project. If you have been keeping up
with your issue log, it should be easy to use it to identify which
integration tests need to be created or updated. You should also make
sure that the traceability comments in all of your integration tests
are complete and up to date.
Update your Integration Test Summary File with the new integration
tests. Use the verification script to ensure that the summary file is
correct and complete.
When you have completed this part of the project, you should have at
least one integration test for every sequence diagram. Have someone
other than the test author complete a peer review of the updated tests
and update the peer review section of the Peer Review Log. Remember
that any peer review that results in unfixed issues should be added to
the Issue Log.
Execute your integration tests and record the results in the Integration Test Log.
Note that you do not need to specify the "-b 200" flag for
integration tests.
For this project, you must pass (0 failed assertions) *all* integration tests, including those you wrote for this project.
4. Peer Review
For this project, you should peer review all of your updated integration
tests and your acceptance test generator. Additionally, if you have not
been peer-reviewing your monitor designs, you should add peer reviews
for them as well.
Team Design Portfolio
The portfolio you submit should contain the most up-to-date design
package for your elevator system, organized and formatted according to
the portfolio guidelines. You are going to update your portfolio every
week, so be sure to keep an up-to-date working copy.
Ensure your design portfolio is complete and consistent.
The following is a partial list of the characteristics your
portfolio should exhibit:
- Changes requested by the TAs in previous projects have been
applied.
- All required documents are complete and up-to-date to the
extent required by the projects (you do not need to update files
or links related to future projects).
- All documents include group # and member names at the top of
the document. (This includes code, where this information
should appear in the header field)
- Individual documents have a uniform appearance (i.e., don't
look like they were written by 4 individual people and then
pieced together)
- The issue log is up to date and detailed enough to track
changes to the project.
Handing In Results
- Each team shall submit exactly one copy of the assignment.
- Follow the handin instructions detailed in the Project FAQ to
submit your portfolio into the afs handin directory
(/afs/ece/class/ece649/Public/handin/project12/group#/ontime/).
- Be sure you follow the
required format for the directory structure of the portfolio
and its location in the handin directories.
- Be sure to follow ALL
the portfolio guidelines detailed in the Portfolio Layout
page.
- Any submission that contains files with modification dates
after the project deadline will be considered late and subject
to a grade deduction (see course
policy
page for more information).
Grading Criteria:
The Minimum Requirements spreadsheet is located
here
This project counts as one team grade. Points are assigned as
follows. A detailed grading rubric is available
here (PDF).
This project assignment is worth 130 points:
- 25 points for a test generator that generates tests that meet the
criteria. It must be able to run on the ECE machines.
- 10 points for an up-to-date Acceptance Test Log.
- 10 points for successfully completing all unit tests (all tests must
pass).
- 35 points for writing and successfully completing all integration
tests (all tests must pass).
- 10 points for the RT-8 requirement monitors.
- 5 points for the peer review of the acceptance test generator.
- 15 points for the peer reviews of updated or newly created
integration tests. You should have at least 5 for this project. Each
team member must complete at least one peer review.
- 15 points for the peer reviews of runtime monitors. It is acceptable
for these to have been created in previous projects. Each team member
must complete at least one peer review.
- 5 points for an entry in the Improvements Log that tells us what can
be improved about this project. If you encountered any minor bugs that
we haven't already addressed, please mention them so we can fix them.
If you have no suggestions, say so in your entry for this project.
Each team member must satisfy the minimum stated per-member requirements
(e.g., one object for each activity). Team members who omit any required
per-member activity will be penalized as described on the
course admin page.