Introduction:
Embedded systems are becoming more complex in modern technology. This is because of the integration of various technologies and bolt-on hardware with the main embedded system, as well as new software upgrades to existing systems at consumer and client sites. New methodologies and processes are emerging to improve embedded software quality. This white paper describes how a QA team works towards effective methods of V&V and a process approach.
This paper describes a stage-by-stage QA approach for the iDTV/Set-top box embedded system. The different stages are Project Proposal, Requirement Grounding, Test Plan & Strategy, QA Design, Test Case Preparation, Test Case Execution and Defect Capturing, Change Request Handling, Integrating Offshore Testing and Field Test Results, Regression Execution, and Software Maintenance.
In the above stages, QA team involvement is measurable and accountable. It can be split into three questions: 1) what is in scope in the system, 2) when does it have to be accomplished, and 3) how are we going to do it.
In any iDTV/Set-top box verification and validation, the QA team can adopt this approach to attain high-quality results.
Project Kickoff:
Does the QA team have knowledge of the delta requirements of the new system? The delta requirements in the current trend are new broadcast technologies, new user-interactive applications, and different connectivity methods.
Broadcast technologies include DVB-T2, OpenCable, and multi-tuner support. New user-interactive applications include MHEG HD, MHP BB, Ginga, HbbTV, time shifting, and video store. Different connectivity methods include Wi-Fi support, IPTV, IP EPG, IP UPG, and DLNA.
If QA engineers do not have knowledge of the delta requirements during kickoff, a training plan should be in place for them to understand the system.*
Requirement Grounding:
During requirement preparation, the customer or requirement engineer (RE) releases draft versions of the requirements for review by the team (development and QA). During draft preparation they interact heavily with customers, country standards bodies, country broadcast authorities, certification bodies, private broadcasters, and other suppliers (hardware or software). During this phase, the requirements may have multiple open points.
The QA team should attend all the overview sessions provided by the customers or RE. If the QA team has any doubts about a requirement, they should raise and clarify them in the overview session.
The QA team should record all peer review comments in the appropriate tool. During the requirements draft stage, there is a chance the requirements are missing negative use case behavior, event action behavior, performance behavior, and localization-related settings. The QA team should attend the requirement walkthrough session. During the walkthrough, the customer or RE explains the requirement statement by statement, and QA team members can point out uncovered or incomplete statements in the requirements, if any exist.
The QA team should 1) attend overview sessions about the requirements, 2) review the specification and enter comments with valid data using the appropriate recording procedure, and 3) question the requirement statements and capture unclear requirements during the walkthrough.*
Test Plan and Strategy:
Test planning and strategy are a critical phase in the software test life cycle. During test planning, all the mandatory data should be captured: resource planning, competence matrix, number of test cycles, project risks, and QA team deliverables. The test strategy should contain the validation team scope, QA approach, critical features, test design methods, and other dependencies. The test plan should be prepared by the Test Manager/Lead. The test strategy is a highly technical document, so it should be prepared by the Test Architect/Technical Test Lead.
The QA team should understand and clearly visualize the overall system from the test plan and test strategy documents.
In digital TV/set-top box validation, there are many dependencies, such as hardware delivery, test environment setup, test stream/data preparation, tools, and testware availability; some scenarios cannot be executed at the development site (they can be executed only in the field or at the customer site).
The test plan and strategy should be aligned with customers and the software development team, and the review should be done with all the respective stakeholders.
During test plan and strategy preparation, the QA team members' contribution should be very high, because the Test Manager and Test Architect prepare these documents partly based on QA team input. For example: what is the test case estimate for a new delta feature (like MHP or Ginga), what hardware, tools, and software are required for a feature (like IP streaming, USB, DLNA servers, a high-resolution video generator), and which features will not be tested or validated (i.e., covered in the requirements but not possible to execute at the test site).
The QA team should provide all the necessary data to the respective test lead (management/technical), and the QA team should verify whether their inputs are captured in the documents.*
QA Design:
There are many QA design approaches available for different types of systems. As a general practice, black box and white box testing are common.
For digital television and set-top box QA design, the following approaches can be used:
1) White box testing
2) Black box testing
3) Automation and Regression testing
White Box Testing:
During initial (functional block) releases, white box testing is an important aspect, because the entire set of modules is not always connected; drivers and stubs are used to connect and disconnect the modules with the required inputs and outputs. A white box checklist is used for cyclomatic complexity measurement.
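To make the driver/stub idea concrete, here is a minimal Python sketch. The lock_frequency() module function, the UHF band limits, and the stub class are all illustrative assumptions: the stub stands in for the not-yet-integrated tuner driver, while the unittest class acts as the test driver feeding the required inputs.

    import unittest

    # Hypothetical module under test: in a real build this would call
    # into the tuner driver, which may not yet be integrated.
    def lock_frequency(hw, freq_khz):
        """Return True when the tuner hardware reports a signal lock."""
        if not 474000 <= freq_khz <= 858000:   # assumed UHF band limits
            return False
        hw.set_frequency(freq_khz)
        return hw.is_locked()

    class StubTunerHw:
        """Stub standing in for the not-yet-connected tuner driver."""
        def __init__(self, locks=True):
            self.locks = locks
        def set_frequency(self, freq_khz):
            self.last_freq = freq_khz
        def is_locked(self):
            return self.locks

    class TunerWhiteBoxDriver(unittest.TestCase):
        """Test driver exercising the module through the stub."""
        def test_lock_inside_band(self):
            self.assertTrue(lock_frequency(StubTunerHw(), 474000))
        def test_reject_outside_band(self):
            self.assertFalse(lock_frequency(StubTunerHw(), 100000))

    if __name__ == "__main__":
        unittest.main()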
Black Box Testing:
Any iDTV/STB application uses the following black box testing approaches:
1) Pre-release testing
2) Pre-Integration testing
3) Functional testing
4) System testing
5) Stress and Stability testing
6) Pre-Certification testing
7) Field testing
8) Performance testing
9) Consumer forum and Usability testing
Pre-release testing:
Pre-release testing is a basic functional check that verifies whether the system is usable after loading the new software (firmware). It is done before the official software release. In the iDTV/STB test environment, the software (firmware) is loaded onto the correct target and tested via RFS mode / USB mode.
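As an illustration of this step, the following Python sketch loads a build and then watches the UART console for a boot marker. The stb_flash CLI, the console port, and the boot-banner strings are assumptions for illustration only; pyserial is used for the serial console.

    import subprocess
    import serial  # pyserial

    FLASH_CMD = ["stb_flash", "--usb", "firmware.bin"]  # hypothetical flashing CLI
    CONSOLE_PORT = "/dev/ttyUSB0"                       # assumed UART console

    def flash_and_boot_check():
        """Load the new firmware, then watch the UART console for a
        boot marker to confirm the set is basically usable."""
        subprocess.run(FLASH_CMD, check=True)
        with serial.Serial(CONSOLE_PORT, 115200, timeout=60) as console:
            for _ in range(200):             # scan early boot output
                line = console.readline().decode(errors="replace")
                if "login:" in line or "Main UI started" in line:
                    return True              # assumed boot-complete markers
        return False

    if __name__ == "__main__":
        print("PASS" if flash_and_boot_check() else "FAIL: no boot marker seen")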
Pre-Integration Testing:
Pre-integration testing is a systematic approach; it is a kind of smoke and sanity testing. During pre-integration, the platform, middleware, UI software, and third-party deliverables (if any) are covered. It has a well-structured checklist and test cases to cover all the important use cases in the system. It is sometimes called the feature-wise major checklist, e.g., audio, video, program control, data, OAD, installation, MHEG, content browser, and open internet. This testing is executed by a black box QA member.
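A minimal sketch of such a feature-wise checklist runner follows; the check functions here are placeholders that a real setup would replace with actual interactions with the target (key presses, status queries, log inspection).

    # Feature-wise major checklist for a pre-integration (smoke/sanity)
    # run. The check functions are placeholders for real target checks.
    SMOKE_CHECKLIST = {
        "Audio":           lambda: True,   # e.g. audio PTS advancing
        "Video":           lambda: True,   # e.g. decoder reporting frames
        "Program control": lambda: True,   # e.g. zap to the next service
        "OAD":             lambda: True,   # e.g. OAD descriptor parsed
        "MHEG":            lambda: True,   # e.g. red-button app launches
    }

    def run_smoke():
        results = {name: check() for name, check in SMOKE_CHECKLIST.items()}
        for name, ok in results.items():
            print(f"{name:16s} {'PASS' if ok else 'FAIL'}")
        return all(results.values())

    if __name__ == "__main__":
        raise SystemExit(0 if run_smoke() else 1)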
Functional Testing:
In any software application, functional testing always plays a very important role. Irrespective of product or project, the system objectives are what get tested. All functional test cases are created based on the Functional Requirement Specification (FRS), and all positive and negative use cases are tested with reference to it. Different test design methods are used:
a) Control flow technique (e.g., menu navigation in the TV, song selection from a list of songs)
b) Data flow technique (e.g., setting the volume in the iDTV/STB and checking whether the value is applied in the system)
c) Decision tables (e.g., any fallback mechanism during logical channel number assignment)
d) State transition (e.g., program zap, program watch mode to internet streaming mode)
e) Boundary value analysis (e.g., checking the starting and ending frequencies supported by the TV; see the sketch after this list)
f) Equivalence partitioning (e.g., progress bar display during auto installation)
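Here is a minimal boundary value analysis sketch for example (e), assuming an illustrative UHF tuning range of 474,000-858,000 kHz and a stand-in tune() helper; the real limits come from the product's requirement specification.

    # Boundary value analysis for the supported tuning range (example e).
    # Assumed band: 474,000-858,000 kHz; take the real limits from the
    # product's requirement specification.
    MIN_KHZ, MAX_KHZ = 474_000, 858_000

    def tune(freq_khz):
        """Illustrative stand-in for the real tuner API."""
        return MIN_KHZ <= freq_khz <= MAX_KHZ

    # Classic boundary values: just below, at, and just above each limit.
    cases = [
        (MIN_KHZ - 1, False), (MIN_KHZ, True), (MIN_KHZ + 1, True),
        (MAX_KHZ - 1, True),  (MAX_KHZ, True), (MAX_KHZ + 1, False),
    ]
    for freq, expected in cases:
        assert tune(freq) == expected, f"boundary case failed at {freq} kHz"
    print("all boundary cases pass")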
System Testing:
System testing differs slightly from functional testing. In the iDTV/STB environment, system testing takes the user's approach with positive and negative use cases, i.e., how the user will actually use a feature on this target. Test cases are prepared based on the Customer Requirement Specification, and all the definitions and statements of that document are evaluated.
Stress and Stability Testing:
For any consumer application, nobody knows how many hours or days the system will be used continuously. The navigation and selection of menus or applications differ from user to user, and users try to use the system in all possible ways. For this, the system must always remain stable (without crashing or malfunctioning). To address this, test cases and use cases are designed to fulfill this requirement: put the system through overnight and long runs with valid use cases (e.g., program zap, navigation, source selection).
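A minimal soak-run sketch of this idea follows, with an assumed action list and a placeholder perform() hook where a real harness would send remote-control keys and verify the target is still responsive.

    import random
    import time

    ACTIONS = ["program_zap", "menu_navigate", "source_select"]  # valid use cases
    SOAK_HOURS = 12   # overnight run

    def perform(action):
        """Placeholder: send the corresponding remote-control keys to the
        target and return whether it is still responsive."""
        return True

    def soak():
        deadline = time.time() + SOAK_HOURS * 3600
        step = 0
        while time.time() < deadline:
            action = random.choice(ACTIONS)   # vary usage like a real user
            if not perform(action):
                print(f"FAIL: target unresponsive at step {step} ({action})")
                return False
            step += 1
            time.sleep(2)                     # pacing between key presses
        print(f"PASS: {step} actions over {SOAK_HOURS} h without malfunction")
        return True

    if __name__ == "__main__":
        soak()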
Pre-Certification Testing:
The system is certified against certification test cases provided by the certifying agencies (e.g., DLNA, DTG PSI/SI, DTG MHEG HD, Allegro, Dolby, Ziggo, UPC, HbbTV). After all the test cases pass in the development environment, that software version can be sent for certification. Once certification passes, the iDTV/STB is allowed to carry the corresponding logo on that particular model.
Field Testing:
Field testing is done after the software matures. The system is taken to the respective regions (supported countries) and tested with live broadcasts. For field testing, the QA tester executes the field testing checklist.
Performance testing:
During performance testing, all the defined performance parameter values are validated, e.g., the time a program zap takes to display video, installation time with many services in the broadcast, and internet access and timeout behavior under network congestion. This testing is done in the validation environment.
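As an illustration, here is a sketch that times repeated program zaps against an assumed Critical-to-Quality (CTQ) limit; zap_and_wait_for_video() is a placeholder for the real measurement hook on the target.

    import statistics
    import time

    ZAP_TIME_LIMIT_S = 2.0   # assumed CTQ limit for program zap

    def zap_and_wait_for_video():
        """Placeholder: send a channel-up key and block until the first
        decoded video frame is reported by the target."""
        time.sleep(1.2)      # stand-in for the real measurement hook

    samples = []
    for _ in range(20):                      # repeat for a stable figure
        start = time.monotonic()
        zap_and_wait_for_video()
        samples.append(time.monotonic() - start)

    worst, mean = max(samples), statistics.mean(samples)
    print(f"zap time mean={mean:.2f}s worst={worst:.2f}s limit={ZAP_TIME_LIMIT_S}s")
    print("PASS" if worst <= ZAP_TIME_LIMIT_S else "FAIL")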
Consumer forum and Usability Testing:
This testing is done by private consumer forums and usability test organizations. They are autonomous bodies: there is no defined set of test cases; they have only their own checklists. They provide the results and publish them in their forums.
Understand the test methods and the scope of validation, then map them to the requirements, apply the techniques, and design the test cases. Apart from this, the QA team should also keep an eye on the results of out-of-scope testing; it gives a better understanding of the system and its features.*
Test Case Preparation:
Test cases can be prepared using the requirement use cases and the QA design approach. During test case preparation, the test setup milestone timeline should be considered. For example, in functional testing all the test case design techniques can be used; in performance testing, Critical to Quality (CTQ) parameters should be verified; and in field testing, all the field scenarios should be captured and executed. For example, in NIT (Network Information Table) testing: multiple NITs, a missing NIT, and a NIT present in the second loop.
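A small sketch of how those NIT scenarios can be parameterised follows; the stream names and expected behaviours are illustrative only.

    # Parameterised NIT (Network Information Table) scenarios: each entry
    # pairs a prepared test stream with the expected installer behaviour.
    # Stream names and expectations are illustrative.
    NIT_SCENARIOS = [
        ("nit_normal.ts",      "services installed with LCNs from the NIT"),
        ("nit_multiple.ts",    "installer picks the actual-network NIT"),
        ("nit_missing.ts",     "installation completes without LCN data"),
        ("nit_second_loop.ts", "NIT data in the second loop is parsed"),
    ]

    for stream, expected in NIT_SCENARIOS:
        # A real harness would play the stream through a modulator and
        # read back the installed channel map from the target.
        print(f"stream={stream:20s} expect: {expected}")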
Choose the right QA design for the scope of the system and adopt the test techniques related to that QA design approach.*
Test case Execution and Defect Capturing:
After a certain level of functional code is reached (as defined in the software deliverables mentioned in the planning), the software is released to the validation team. The QA team loads the software in NFS mode or UPG mode using a UART, USB, or network cable. All functional malfunctions should be captured and entered in the bug tracking tool. During test case execution, QA team members should not execute all the test cases, only the test cases related to the scope.
During test case execution, the preconditions and post-results must always match what is defined in the test cases. If a post-result does not match the expected result, it is considered a malfunction and should be captured and raised in the defect tracking tool (e.g., during MHEG invoke, the application reboots; translated dumps and logs should be attached in the bug tracking tool). During execution there can be a need to compare the results with a competitor product, which may give a clearer picture of the system.
Test execution results indicate the maturity of the software, so this is an important process method.**
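Here is a sketch of assembling such a defect record, with illustrative field names rather than any specific bug-tracking tool's API; the translated dump and log paths are attached as recommended above.

    import json
    import time

    def capture_defect(summary, steps, logs_path, dumps_path):
        """Assemble a defect record in the shape a bug-tracking tool
        typically expects; the field names are illustrative."""
        return {
            "summary": summary,
            "steps_to_reproduce": steps,
            "observed_at": time.strftime("%Y-%m-%d %H:%M:%S"),
            "attachments": [logs_path, dumps_path],  # translated dumps + logs
            "severity": "major",
        }

    defect = capture_defect(
        summary="Application reboots during MHEG invoke",
        steps=["Tune to an MHEG service", "Press the red button", "Observe reboot"],
        logs_path="logs/uart_console.log",
        dumps_path="dumps/crash_translated.txt",
    )
    print(json.dumps(defect, indent=2))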
Change Request handling:
A software change request (CR) can be raised against the system at any phase of the project by the customer, and adopting and validating customer change requests is mandatory. Plan and create the test cases for the CR and prepare the test data for the system. For example, consider adding MHP (Multimedia Home Platform) functionality to a set-top box: there should be an execution plan in place based on the customer change request; the CR should be understood and its impact on existing test cases captured; new test cases should be prepared; and the test data and standards can be gathered from the specific country and certification authorities.
Understand the CR, adapt the existing tests and create new ones,* and execute them with proper understanding.**
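A minimal sketch of the impact-analysis step follows, with an illustrative test case inventory and CR feature areas.

    # Illustrative impact analysis for a change request: map the CR's
    # affected feature areas onto the existing test case inventory.
    EXISTING_CASES = {
        "TC-101": {"area": "installation"},
        "TC-205": {"area": "program control"},
        "TC-310": {"area": "data services"},
    }
    CR_AREAS = {"mhp", "installation"}   # e.g. adding MHP support

    impacted = [tc for tc, meta in EXISTING_CASES.items()
                if meta["area"] in CR_AREAS]
    covered = {meta["area"] for meta in EXISTING_CASES.values()}
    print("existing cases to review:", impacted)
    print("new cases needed for areas:", sorted(CR_AREAS - covered))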
Integrating the Offshore Test and Field Test Results:
How do we measure the quality and reliability of the software? This is addressed using the test results: each phase's results indicate the maturity of that stage. Specifically, the pre-release and pre-integration test results show how many functions and modules are working in the current release (e.g., MHEG, OAD, channel map cloning, time shift). The functional reports show how the overall system works with the required features. Stress and stability results show the reliability and robustness of the software on the embedded hardware. Field test results provide the localization results; for example, for the iDTV/STB, the country-specific and regional footprint and its behavior with the latest live broadcast streams. Together, the offshore and field tests demonstrate the strength of the system.
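A small sketch of rolling per-phase results into simple maturity figures; the counts here are illustrative only.

    # Roll offshore and field results per phase into simple maturity
    # figures; the counts are illustrative.
    phase_results = {
        "pre-release": {"passed": 48,  "total": 50},
        "functional":  {"passed": 410, "total": 450},
        "stress":      {"passed": 18,  "total": 20},
        "field":       {"passed": 33,  "total": 40},
    }
    for phase, r in phase_results.items():
        rate = 100.0 * r["passed"] / r["total"]
        print(f"{phase:12s} {r['passed']:4d}/{r['total']:<4d} {rate:5.1f}% pass")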
Regression Execution:
Regression execution is performed after the product maturity phase. During regression testing, about 40% of the core test cases are executed as the level of testing. During the regression phase, testing can be executed by automated or manual methods, decided based on availability and the criticality of the product and resources. In the regression phase, the critical test cases should be executed; for example, in the regression phase of an iDTV/STB, the right hardware and the right test data should be used. In the regression phase there is a high chance that the system may suddenly malfunction, because many code changes and CR implementations may affect other modules as well.
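Here is a sketch of selecting the roughly 40% core regression set by priority; the suite and priorities are illustrative.

    # Select the core regression set: roughly 40% of the suite, taken
    # from the highest-priority cases first. Priorities are illustrative.
    suite = [
        ("TC-001", "critical"), ("TC-002", "critical"), ("TC-003", "high"),
        ("TC-004", "high"),     ("TC-005", "medium"),   ("TC-006", "medium"),
        ("TC-007", "low"),      ("TC-008", "low"),      ("TC-009", "low"),
        ("TC-010", "low"),
    ]
    RANK = {"critical": 0, "high": 1, "medium": 2, "low": 3}
    core_count = max(1, int(0.4 * len(suite)))
    regression_set = sorted(suite, key=lambda tc: RANK[tc[1]])[:core_count]
    print("core regression set:", [tc for tc, _ in regression_set])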
Regression testing is a critical testing phase. It should contain the critical test cases, and the right hardware should be used.**
Software (Firmware) Maintenance:
After a successful release of the product to the market, there is a chance of issues coming in from the field. These issues can be gathered via the customer call centre and should be addressed by the product development company. The project company analyses the issues with the requirement engineer, the development team, and the test team, and checks whether each issue is valid. Based on the criticality, they fix the issue and release the software to the validation team, which verifies with the new software whether the issue is properly resolved. The software is then shared with the customer through different modes (e.g., over-the-air download transmission, the product company's website, IP UPG update).
In maintenance testing, check the user-reported problem thoroughly and test around the feature.**
Conclusion:
The verification and validation team contributes to the success of the product. iDTV and STB testing needs an effective and specific V&V approach. If the above points and methods are followed and processes specific to the embedded system are in place, validation will not fail and can attain high-quality results.
*Verification Approach
** Validation Approach