Lecture 10: Test the System Against User and System Requirements

Summary of the Session
• Review of the library visit last week
• Recap of last week's session (T-SQL queries for joining tables)
• Testing tutorial
• Review of assignment and feedback

Software Development Process Cycle (PDCA)
Plan – Do – Check – Action

• PLAN (P): Devise a plan. Define your objective and determine the strategy and supporting methods required to achieve that objective.
• DO (D): Execute the plan. Create the conditions and perform the necessary training to execute the plan.
• CHECK (C): Check the results. Determine whether work is progressing according to the plan and whether the expected results are being obtained.
• ACTION (A): Take the necessary and appropriate action if the check reveals that the work is not being performed according to plan, or the results are not as anticipated.

Software Development Life Cycle (SDLC) – Phases
• Requirement specification and analysis
• Design
• Coding
• Testing
• Implementation
• Maintenance

Introduction to Testing
• What is software testing?
• Why is testing necessary?
• Who does the testing?
• What has to be tested?
• When is testing done?
• How often should we test?

Most Common Software Problems
• Incorrect calculations
• Incorrect or ineffective data edits
• Data searches that yield incorrect results
• Incorrect processing of data relationships
• Confusing or misleading data
• Inconsistent processing
• Unreliable results or performance
• Incorrect or inadequate interfaces with other systems
• Inadequate performance and security controls
• Incorrect file handling

Objectives of Testing
• Executing a program with the intent of finding errors.
• To check that the system meets its requirements and can be executed successfully in the intended environment.
• To check that the system is "fit for purpose".
• To check that the system does what it is expected to do.

Verification and Validation
Verification typically involves reviews and meetings to evaluate documents, plans, code, requirements, and specifications. It can be done with checklists, issues lists, walkthroughs, and inspection meetings.
Validation typically involves actual testing, and takes place after verification is completed.
Verification and validation continue in a cycle until the software becomes error-free.

Testing
The testing process involves developing a test plan, executing the plan, and documenting the test results.

Testing Methodologies
• Black box testing
• White box testing

Black box testing
• No knowledge of the internal design or code is required.
• Tests are based on requirements and functionality.

White box testing
• Knowledge of the internal program design and code is required.
• Tests are based on coverage of code statements, branches, paths, and conditions.
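To make the contrast between the two methodologies concrete, here is a minimal sketch in Python (the language is our choice for illustration; the grade() function and its 40/70 thresholds are hypothetical, not from the lecture). The black-box tests are written from the stated requirement alone; the white-box tests are written from the code itself, targeting each branch and its boundary values.

```python
import unittest

# Hypothetical function under test: returns a result band for a mark 0-100.
def grade(mark):
    if mark < 0 or mark > 100:
        raise ValueError("mark must be between 0 and 100")
    if mark >= 70:
        return "Distinction"
    if mark >= 40:
        return "Pass"
    return "Fail"

class BlackBoxTests(unittest.TestCase):
    """Derived from the requirement alone (70+ Distinction, 40-69 Pass,
    below 40 Fail). No knowledge of the code is used."""
    def test_distinction(self):
        self.assertEqual(grade(85), "Distinction")
    def test_pass(self):
        self.assertEqual(grade(55), "Pass")
    def test_fail(self):
        self.assertEqual(grade(10), "Fail")

class WhiteBoxTests(unittest.TestCase):
    """Derived from the code: exercise every branch, including the
    boundary values where each condition flips."""
    def test_branch_boundaries(self):
        self.assertEqual(grade(70), "Distinction")  # >= 70 branch, true side
        self.assertEqual(grade(69), "Pass")         # >= 70 branch, false side
        self.assertEqual(grade(40), "Pass")         # >= 40 branch, true side
        self.assertEqual(grade(39), "Fail")         # >= 40 branch, false side
    def test_validation_branch(self):
        with self.assertRaises(ValueError):
            grade(-1)                               # input-validation branch

if __name__ == "__main__":
    unittest.main()
```

Note how the same function yields two different test suites: the black-box suite would survive a rewrite of the function body unchanged, while the white-box suite is tied to the branches that actually exist in the code.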
Black Box Testing Technique
Aims to find:
• Incorrect or missing functions
• Interface errors
• Errors in data structures or external database access
• Performance errors
• Initialisation and termination errors

Black Box / Functional Testing
• Based on requirements and functionality
• Not based on any knowledge of internal design or code
• Covers all combined parts of a system
• Tests are data driven

White Box / Structural Testing
• Based on knowledge of the internal logic of an application's code
• Based on coverage of code statements, branches, paths, and conditions
• Tests are logic driven

Functional testing
– Black box testing geared to the functional requirements of an application.
– Done by testers.

System testing
– Black box testing based on the overall requirements specification; covers all combined parts of the system.

End-to-end testing
– Similar to system testing; involves testing a complete application environment in a situation that mimics real-world use.

Acceptance testing
– Final testing based on the specifications of the end user or customer.

Integration testing
– Software testing in which modules are combined and tested as a group. It occurs after unit testing and before validation testing.

Stress testing
– Testing under unusually heavy loads: heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database, etc.
– The term is often used interchangeably with 'load' and 'performance' testing.

Performance testing
– Testing how well an application complies with its performance requirements.

Alpha testing
– Testing done when development is nearing completion; minor design changes may still be made as a result of such testing.

Beta testing
– Testing when development and testing are essentially complete and final bugs and problems need to be found before release.

White Box Testing Technique
Ensure that:
• All independent paths within a module have been exercised at least once
• All logical decisions are exercised on their true and false sides
• All loops execute at their boundaries and within their operational bounds
• Internal data structures are exercised to ensure their validity
(A loop- and branch-coverage sketch follows the table below.)

Testing Levels / Techniques

Level               | White Box | Black Box
--------------------|-----------|----------
Unit testing        |     X     |
Integration testing |     X     |
System testing      |           |     X
Acceptance testing  |           |     X
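The white-box checklist above can be illustrated with a small sketch in Python (again our choice of language; the total_fine() function and its rate/cap parameters are hypothetical). The tests drive the loop through zero, one, and many iterations, and hit both sides of each decision, including the exact boundary where the fee cap takes effect.

```python
import unittest

# Hypothetical function: totals late fees for a list of overdue-day counts,
# charging a rate per day and capping each item's fee.
def total_fine(overdue_days, rate=0.50, cap=10.00):
    total = 0.0
    for days in overdue_days:       # loop to be exercised at its boundaries
        if days <= 0:               # decision: exercise true and false sides
            continue
        total += min(days * rate, cap)
    return total

class WhiteBoxLoopTests(unittest.TestCase):
    def test_loop_zero_iterations(self):
        self.assertEqual(total_fine([]), 0.0)        # loop body never runs
    def test_loop_one_iteration(self):
        self.assertEqual(total_fine([4]), 2.0)       # loop runs exactly once
    def test_skip_branch(self):
        self.assertEqual(total_fine([0, -3]), 0.0)   # 'days <= 0' true: skipped
    def test_cap_boundary(self):
        self.assertEqual(total_fine([20]), 10.0)     # 20 * 0.50 hits cap exactly
        self.assertEqual(total_fine([21]), 10.0)     # above the cap
    def test_loop_many_iterations(self):
        self.assertEqual(total_fine([2, 2, 2]), 3.0) # loop within bounds

if __name__ == "__main__":
    unittest.main()
```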
TEST PLAN

Objectives
• Create a set of testing tasks.
• Assign resources to each testing task.
• Estimate the completion time for each testing task.
• Document testing standards.

A test plan is a document that describes the
– scope
– approach
– resources
– schedule
of the intended test activities. It identifies the
– test items
– features to be tested
– testing tasks
– task allotment
– risks requiring contingency planning.

Purpose of Preparing a Test Plan
• Validate the acceptability of a software product.
• Help people outside the test group understand the 'why' and 'how' of product validation.
• A test plan should be
– thorough enough (overall coverage of the tests to be conducted)
– useful and understandable by people inside and outside the test group.

Scope
– The areas to be tested by the QA team.
– The areas that are out of scope (screens, database, mainframe processes, etc.).

Test Approach
– Details of how the testing is to be performed.
– Any specific strategy to be followed for testing (including configuration management).

Entry Criteria
– The steps to be performed before the start of a test, i.e. prerequisites, e.g.
• timely environment set-up
• starting the web server / app server
• successful implementation of the latest build.

Resources
– The people involved in the project, their designations, etc.

Tasks / Responsibilities
– The tasks to be performed and the responsibilities assigned to the various team members.

Exit Criteria
– Tasks such as
• bringing down the system/server
• restoring the system to the pre-test environment
• database refresh, etc.

Schedule / Milestones
– The final delivery date and the various milestone dates.

Hardware / Software Requirements
– The PCs/servers required to install the application or perform the testing.
– The specific software needed to get the application running or to connect to the database, etc.

Risks & Mitigation Plans
– The possible risks during testing.
– Mitigation plans to implement in case a risk actually turns into reality.

Tools to Be Used
– The testing tools or utilities, e.g. WinRunner, LoadRunner, Test Director, Rational Robot, QTP.

Deliverables
– The deliverables due to the client at various points in time (daily, weekly, start of the project, end of the project, etc.).
– These include test plans, test procedures, test metrics, status reports, test scripts, etc.

Annexure
– Links to documents that have been or will be used in the course of testing, e.g. templates used for reports, test cases, etc.
– Referenced documents can also be attached here.

Sign-Off
– Mutual agreement between the client and the QA team, with both leads/managers signing their agreement to the test plan.

Good Test Plans
• Developed and reviewed early.
• Clear, complete, and specific.
• Specify tangible deliverables that can be inspected.
• Staff know what to expect and when to expect it.
• Set realistic quality levels for goals.
• Include time for planning.
• Can be monitored and updated.
• Include user responsibilities.
• Based on past experience.
• Recognise learning curves.

TEST CASES
A test case is defined as:
• A set of test inputs, execution conditions, and expected results, developed for a particular objective.
• Documentation specifying inputs, predicted results, and a set of execution conditions for a test item.

Test Case Contents
– Test plan reference ID
– Test case
– Test condition
– Expected behaviour

Good Test Cases
• Have a high probability of finding a new defect.
• Produce an unambiguous, tangible result that can be inspected.
• Are repeatable and predictable.
• Are traceable to requirements or design documents.
• Push the system to its limits.
• Can have their execution and tracking automated.
• Do not mislead.
• Are feasible.

Test Metrics
Expected vs. Actual Defects Detected
– An analysis of the number of defects found against the number expected from the planning stage.
Defects Detected vs. Corrected Gap
– A line graph showing the number of defects uncovered versus the number corrected and accepted by the testing group.
Average Age of Detected Defects by Type
– The average number of days defects remain outstanding, by severity type or level. The planning stage provides the acceptable open days by defect type.

Exercise
• Create a test template for the Student_Course database.
• Include at least 8 testing processes.
• Use the sample template from slide 33.
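As a starting point for the exercise, here is one possible shape for such a template, sketched in Python. The table names (Student, Course, Student_Course), the eight test conditions, and the field layout are assumptions for illustration only; the fields mirror the test case contents listed above, but slide 33's actual template and the database's real schema may differ.

```python
# Hypothetical test-case template for the Student_Course database exercise.
# Adapt the names and conditions to the real schema and the slide 33 template.

TEST_TEMPLATE = [
    # (test plan ref, test case, test condition, expected behaviour)
    ("TP-10/TC01", "Insert student", "Insert a valid row into Student",
     "Row is committed and returned by SELECT"),
    ("TP-10/TC02", "Duplicate key", "Insert a Student row with an existing ID",
     "INSERT fails with a primary-key violation"),
    ("TP-10/TC03", "Referential integrity", "Enrol a student on a non-existent course",
     "INSERT fails with a foreign-key violation"),
    ("TP-10/TC04", "Join query", "Join Student to Course via Student_Course",
     "Each enrolled student appears once per enrolment"),
    ("TP-10/TC05", "Mandatory fields", "Insert a Course row with a NULL title",
     "INSERT fails with a NOT NULL violation"),
    ("TP-10/TC06", "Update", "Change a student's name and re-query",
     "SELECT returns the updated name"),
    ("TP-10/TC07", "Delete", "Delete a course that has enrolments",
     "DELETE is rejected (or cascades, per the design)"),
    ("TP-10/TC08", "Empty result", "Query enrolments for a student with none",
     "Query returns zero rows, not an error"),
]

# Print the template as a simple review checklist.
for ref, case, condition, expected in TEST_TEMPLATE:
    print(f"{ref} | {case}: {condition} -> expected: {expected}")
```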