
Grading Student Programs: A Software Testing Approach


Presentation Transcript


1. Grading Student Programs: A Software Testing Approach
14th CCSC Southeastern Conference, November 3-4, 2000
Edward L. Jones
CIS Department, Florida A&M University, Tallahassee, FL
ejones@cis.famu.edu

2. Outline
• Motivation
• Tester’s Approach
• Assignment Life Cycle
• An Example
• Lessons Learned
• Future Work

3. In 25 Words or Less
• Automation is possible
• Requires up-front teacher investment
• Requires behavior adjustment by students
• The problem is well-structured
• High potential for reuse
• Just Do It!!

4. Motivation
• Enhance the learning experience
• Grading is labor-intensive
• Difficulty of grading consistently
• Class size increases workload
• Redundancy of explaining deductions

5. A Tester’s Approach
• Attitude -- finding what’s wrong!
• Consistent, repeatable error search process
• Careful documentation of results/findings
• Automation to reduce time investment
• Black-box testing based on specification

6. Programming Environment
[Diagram: the Teacher posts an assignment to the public area (PR) with postassignment, and the Student retrieves it with getassignment; the Student turns work in to the submission area (SR) with submit; the Teacher pulls submissions with getcopy and posts grades with postgrade; the Student views results with viewgrade.]
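The slide gives only the command names, not their internals. As a hedged illustration of the plumbing, a minimal submit could look like the csh sketch below; the SR path, the file-naming convention, and the use of $USER are assumptions, not the talk's actual tooling.

#!/bin/csh -f
# Hypothetical sketch of the student "submit" command.
# Usage: submit <assignment> <sourcefile>
# Assumes SR is a drop directory writable by students and that
# submissions are keyed by login name ($USER).
set SR = /grading/$1/SR
cp $2 $SR/${USER}.cob
echo "Submitted $2 for assignment $1 as $USER"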

7. An Example
• COBOL programming assignment
• Example of HOW TO do it
• Good sense of the up-front effort
• Repeatable/reusable pattern
• Expose difficulties
• Insights into possibilities

8. Program Grading Life Cycle
[Diagram: Prepare Assignment produces the Assignment Specification and Test Plan; the student uses the specification to Write Program, producing the Student Program; Implement Grader turns the test plan into the Test Driver, Checker Scripts, and Test Cases that make up the Automated Grader; Grade Programs runs the student program through the grader, producing the Grading Log and Grade Report.]

9. Lessons Learned
• Student shock & outrage at exactness
• Specification must be better than average!
• Front-end loaded: assure testability up front
• First-time automation is costly
• Amortized via similar assignment styles
• Students need the grader before submitting
• Students need to learn how to read, interpret, and satisfy a specification

10. Automation Challenges
• Does the teacher have the time?
• Is the automated grader TOO STRICT for CS1/CS2?
• What to ignore?
• Deduction schedules become complex -- aggregate vs. specific search strategy
• The “just a little more automation” trap!!
• The guts to do it!!

11. Future??
• Do it again!
• Do it in another class, CS1-CS3!
• Investigate table-driven checker scripts -- reduce the costliest step (see the sketch after slide 24)
• Distribute the grader along with the assignment!
• Sell colleagues on the idea!

12. Questions? Questions? Questions?

13. Thank You

14. Interactive Input Requirements (1)
(1) The program must be menu-driven based on TRANSACTION CODE.
-----------------------------------------------
ENTER TRANSACTION CHOICE:
  1 -- Open Account
  2 -- Deposit
  3 -- Withdraw
  4 -- Close Account
  5 -- Check Balance
  9 -- QUIT PROGRAM
-----------------------------------------------

15. Interactive Input Requirements (2)
(2) Sequence of interactive data entry, per transaction code:
Open (01):     Account# → Amount → LastName → FirstName
Deposit (02):  Account# → Amount
Withdraw (03): Account# → Check# → Amount
Close (04):    Account# → LastName → FirstName
BalCheck (05): Account#

16. Output Requirements - Messages
A. VALIDATION ERRORS
- BAD TRANSACTION CODE
- DEPOSIT AMOUNT TOO HIGH
- DEPOSIT AMOUNT TOO LOW
- MISSING ACCT OWNER NAME
- MISSING INITIAL DEPOSIT
B. UPDATE ERRORS
- ACCOUNT ALREADY EXISTS
- ACCOUNT DOES NOT EXIST
- ACCOUNT HAS NON-ZERO BALANCE
- INSUFFICIENT FUNDS
C. UPDATE COMPLETION
- DEPOSIT MADE
- WITHDRAWAL MADE
- BALANCE DISPLAYED
- ACCOUNT OPENED
- ACCOUNT CLOSED

17. Processing Requirements
BUSINESS RULES (excerpts)
2. An initial deposit AMOUNT must be given for OPEN.
3. The initial deposit AMOUNT must be at least $20.00.
5. The deposit AMOUNT must NOT exceed $10,000.00.
6. A withdrawal AMOUNT may not exceed the Balance.
7. An account cannot be closed unless it has a ZERO balance.
10. An OPEN must FAIL if the account already exists.
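Rules like these map one-for-one onto the validation messages of slide 16, and each pairing yields a checker probe. As a hypothetical fragment in the style of the checker scripts shown later on slides 23-24 (the corresponding test data, not shown, would attempt an OPEN with a deposit just under $20.00):

#-| SPECIFIC Search: rule 3 violation reported (hypothetical probe).
#-| Assumes the test data set tried an OPEN with a 19.99 deposit.
grep "DEPOSIT AMOUNT TOO LOW" $2 >> $log
if ($status) then
    @ pen = $pen + 1
    echo "$1 - WRONG - missing DEPOSIT AMOUNT TOO LOW MESSAGE" >> $log
endif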

18. Test Plan (1)
1. There will be three test runs:
1.1 Nominal -- valid transactions, successful updates.
    Goal: Demonstrate correct behavior in a "perfect" world.
1.2 Update Errors -- valid transactions, update failures.
    Goal: Demonstrate ability to detect and correctly respond to master file conditions that prevent update.
1.3 Transaction Errors -- invalid transactions.
    Goal: Demonstrate ability to detect and correctly respond to invalid transactions.

19. Test Plan (2)
2. All test runs use the same master file.
3. Test Execution & Results Verification
3.1 For each test run there will be an input test data set.
3.2 For each test run there will be a verification script that scans the program output file (p7_audit.rpt) for expected results.
3.3 Each verification script tallies deductions for failure to produce expected output.
3.4 The verification scripts write results to log file p7_user.log.

20. Test Plan (3)
4. File Naming Conventions
Test run             1             2             3
Test Data Set        testdata1     testdata2     testdata3
Verification Script  checkscript1  checkscript2  checkscript3

21. Test Driver
# -------------------------------------------------
# Test Driver: Perform 3 test runs.
# -------------------------------------------------
set slog=p7_{$1}.log
echo "p7 Grading Log for student $1" > $slog
set totpen=0
foreach run ( 1 2 3 )
    #-| Run program & check results.
    cobrun $1 < testdata$run
    csh checkscript$run $1 p7_audit.rpt
    set runpen = $status
    @ totpen = $totpen + $runpen
    echo "$1 - RUN $run PENALTY = $runpen"
end # foreach run
echo "Student $1 - TOTAL PENALTIES = $totpen" >> $slog
exit
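The driver grades one student; the talk does not show how a whole class is processed. A hypothetical wrapper under the same conventions could loop the driver over every submission directory; the paths, the one-directory-per-login layout, and the script name testdriver are all assumptions.

#!/bin/csh -f
# Hypothetical class-wide wrapper: run the slide-21 driver once per
# student. Assumes one subdirectory per student login under $SR and
# that the driver above is saved as "testdriver".
set SR = /grading/p7/SR
foreach d ( $SR/* )
    set student = $d:t            # directory name is the login
    pushd $d > /dev/null
    csh testdriver $student       # writes p7_$student.log
    popd > /dev/null
end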

22. Test Cases
Run #1 NOMINAL Test Cases: VALID transaction, SUCCESSFUL update
Test Sequence / Expectation:
1) BAL CHECK -- ACCT (1111) / Bal = 250.55
2) DEPOSIT   -- ACCT (1111) 249.45
3) BAL CHECK -- ACCT (1111) / Bal = 500.00
4) CLOSE     -- ACCT (8888) Wine Brandy
5) WITHDRAW  -- ACCT (9999) 26.00
6) BAL CHECK -- ACCT (9999) / Bal = 175.00
7) OPEN      -- ACCT (5555) 876.54 Jones Ed
8) BAL CHECK -- ACCT (5555) / Bal = 876.54
-------------------------------
When Master File is:
-------------------------------
1111 000025055 010190 McNair Stub
2222 000560000 020293 Simmons Joe
3333 000000100 123199 Fisher Kelly
4444 000000750 012500 Broke Eyem
6666 000998900 071092 Beuche Bobby
8888 000000000 011595 Wine Brandy
9999 000020100 020100 Faulk Mark
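The transcript never shows a test data file itself. Under the entry sequences of slide 15, the nominal run's testdata1 would plausibly look like the sketch below, one value per interactive prompt; the one-token-per-line format and the check number 101 (the slide omits the check number for the WITHDRAW) are assumptions.

5
1111
2
1111
249.45
5
1111
4
8888
Wine
Brandy
3
9999
101
26.00
5
9999
1
5555
876.54
Jones
Ed
5
5555
9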

23. Checker Script (1)
# -------------------------------------------------------
# Filename:   checkscript1
# Purpose:    Determine deductions for wrong output.
#             RETURN # points deducted.
# Invocation: checkscript1 student resultsfile
# -------------------------------------------------------
set pen=0
set log=p7_{$1}.log
#-| AGGREGATE counts -- transaction completion messages.
#-| Search results written to student's grading log.
foreach msg (OPENED CLOSED DISPLAY WITHDRAW DEPOSIT)
    grep $msg $2 >> $log
    if ($status) then
        @ pen = $pen + 1
        echo "$1 - WRONG - missing $msg MESSAGE" >> $log
    endif
end

24. Checker Script (2)
#-| SPECIFIC Search: 249.45 DEPOSIT accepted.
egrep "[0]*2 " $2 | grep 249.45 >> $log
if ($status) then
    @ pen = $pen + 1
    echo "$1 - WRONG - missing DEPOSIT 249.45 MESSAGE" >> $log
endif
#-| SPECIFIC Search: 1111 DEPOSIT applied correctly.
egrep "1111" $2 | grep 500.00 >> $log
if ($status) then
    @ pen = $pen + 1
    echo "$1 - WRONG - BALANCE after DEPOSIT not 500.00" >> $log
endif
…
#-| Return the penalty points.
echo "$1 - script1 PENALTY POINTS = $pen" >> $log
exit $pen
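Slide 11 lists table-driven checker scripts as future work. A minimal sketch of that idea, assuming a file checks.tbl holding one single-word expected message per line (the file name and format are invented here, not from the talk):

#!/bin/csh -f
# Hypothetical table-driven checker (the slide-11 future-work idea).
# checks.tbl holds one expected output token per line, e.g.:
#   OPENED
#   CLOSED
#   DISPLAY
# The hard-coded foreach list of checkscript1 becomes data.
set pen = 0
set log = p7_{$1}.log
foreach msg ( `cat checks.tbl` )
    grep $msg $2 >> $log
    if ($status) then
        @ pen = $pen + 1
        echo "$1 - WRONG - missing $msg MESSAGE" >> $log
    endif
end
echo "$1 - table PENALTY POINTS = $pen" >> $log
exit $pen

Specific searches like the 249.45 pipeline above would need a richer table format, which is presumably why slide 11 calls this the costliest step to attack.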

25. Program Grading Log (1)
p7 Grading Log for tmorris
____________________________
007 SUCCESS ACCOUNT OPENED
tmorris - WRONG - missing CLOSED MESSAGE
001 SUCCESS BALANCE DISPLAYED
003 SUCCESS BALANCE DISPLAYED BALANCE DISPLAYED
005 SUCCESS WITHDRAWAL POSTED
006 SUCCESS BALANCE DISPLAYED
007 SUCCESS ACCOUNT OPENED
008 SUCCESS BALANCE DISPLAYED
001 VALID
002 VALID
003 VALID
004 INVALID MISSING AMOUNT
008 VALID
002 2 1111111 249.45
003 1111111 500.00 McNair Stub
007 1 5555555 30300 876.54 Jones Ed
5555555 876.54 000000 Jones Ed
tmorris - script1 PENALTY POINTS = 1

26. Program Grading Log (2)
003 VALID
4 ACCOUNT CLOSED
tmorris - WRONG - extraneous CLOSED MESSAGE
tmorris - WRONG - missing FAIL MESSAGE
001 INVALID ACCT DOES NOT EXIST
002 INVALID ACCOUNT ALREADY EXI
tmorris - WRONG - CLOSE -- missing NON-ZERO BALANCE message
004 INVALID INSUFFICIENT FUNDS
005 INVALID ACCT DOES NOT EXIST
tmorris - script2 PENALTY POINTS = 3
001 SUCCESS ACCOUNT OPENED
003 SUCCESS ACCOUNT OPENED
tmorris - WRONG - extraneous OPENED MESSAGE
004 INVALID DEPOSIT AMOUNT TOO HIGH
005 INVALID DEPOSIT AMOUNT TOO LOW
tmorris - WRONG - extraneous DEPOSIT MESSAGE
004 INVALID DEPOSIT AMOUNT TOO HIGH
001 1 7777777 30300 5.00 OPENDeposit TooRLOW
001 7777777 5.00 000000 OPENDeposit TooRLOW
tmorris - WRONG - OPEN -- missing MISSING ACCT OWNER NAME message
tmorris - WRONG - OPEN -- missing MISSING ACCT OWNER NAME message
004 INVALID DEPOSIT AMOUNT TOO HIGH
005 INVALID DEPOSIT AMOUNT TOO LOW
tmorris - WRONG - OPEN -- missing MISSING INITIAL DEPOSIT message
tmorris - script3 PENALTY POINTS = 5
tmorris - TOTAL RUN PENALTY POINTS = 9

27. Program Grading Life Cycle (recap)
[Diagram repeated from slide 8: Prepare Assignment → Implement Grader → Grade Programs, with the Assignment Specification, Test Plan, Test Driver, Checker Scripts, Test Cases, Grading Log, and Grade Report as the work products.]

28. SPRAE - A Tester’s Framework
• Specification: the basis for testing
• Premeditation: no plan, no test
• Repeatability: deterministic outcome
• Accountability: full disclosure
• Economy: cost-effectiveness
