User Acceptance Testing
Determine the degree to which delivered software meets the agreed user requirements
specifications. User Acceptance Testing (UAT) confirms whether the software supports the existing business processes and any
new initiatives expected as part of the delivery. User requirements may evolve during
implementation as the early benefits and shortcomings of a new system begin to be realized.
During UAT, the way the software is intended to perform and behave upon release for general use is assessed. This includes the:
- Accuracy and utility of user documentation and procedures.
- Quality and accuracy of data being produced.
- Layout and content of reports and audit trails.
- Release and installation procedures.
- Configuration management issues.
Ensure that all software to be put into the ebc production environment is tested for compliance with statutory regulations and organizational business
processes and rules, and is observed for how it will behave under operational conditions.
- break/fix solutions tested in pre-production environment within 2 days, 95% of the time
- full regression testing completed in pre-production environment within 20 days, 90% of the time
- new systems tested within pre-production environment within 30 days, 90% of the time
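As an illustrative sketch (the figures below are hypothetical, not actual service data), compliance with targets of this kind can be measured as the fraction of tests completed within the target window:

```python
# Hypothetical durations (in days) of recent break/fix tests in pre-production.
breakfix_durations = [1, 2, 1, 3, 2, 1, 2, 1, 1, 2]

def sla_compliance(durations, target_days):
    """Fraction of tests completed within the target number of days."""
    within = sum(1 for d in durations if d <= target_days)
    return within / len(durations)

# Target: break/fix solutions tested within 2 days, 95% of the time.
rate = sla_compliance(breakfix_durations, target_days=2)
print(f"Break/fix SLA compliance: {rate:.0%}")  # 9 of 10 within 2 days -> 90%
```

With the sample data above, one test out of ten exceeded the 2-day window, so the measured rate (90%) falls short of the 95% target and would be flagged in service reporting.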

How are we doing?
- Business unit participation in developing testing scripts
- Business unit participation in all User Acceptance Testing
- Business unit sign-off for Release to Production processes
Costs are paid through an annual appropriation to the ebcIT Division through the Ministry of Labour as negotiated during the annual government-wide fiscal budgeting process.
6. Dependencies
- application being handed over for UAT should be complete according
to the agreed specification, and should have been thoroughly and effectively system tested by any vendors. If this is not the case, a great deal of UAT resource and time can be spent finding functional errors that should have been discovered and fixed during system testing.
- the existence of a Definitive Software Library (DSL) and a quality release management process are considered best practices for release of software into the production environment
- maintenance of pre-production servers for mission-critical applications in the Peterborough UAT lab, so that they remain in synchronization with production servers, in order to expedite testing and resolution of break/fix problems encountered in production
- determining requirements - usually covering:
- Functional - ensuring all business functions are performed as defined and satisfy user’s needs.
- Operational - ensuring requirements for data capture, processing, distribution and archiving are met.
- Performance - ensuring performance criteria such as response times, transaction throughput and simultaneous processing are achieved.
- Interface - ensuring all business systems linked to the software system in UAT pass and receive data or control as defined in the requirements specification.
- Security - ensuring that the application is properly configured to prevent unauthorised or inappropriate access.
- Disaster Recovery - ensuring that in the event of the application being abnormally terminated, that it can be reset and re-established without losing or corrupting data.
- Data Migration - ensuring that any legacy data has been correctly and accurately converted and migrated to the new system including the ability of the application to carry out day-to-day functions on that data.
- recording of any known errors encountered during system or UAT testing for which solutions have not been implemented
- preparing risk analysis associated with promoting tested software into the production environment
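The Data Migration requirement above lends itself to an automated reconciliation check. The following is a minimal sketch only, assuming hypothetical legacy and migrated data sets keyed by record id; none of these names come from the actual systems:

```python
# Hypothetical legacy and migrated data sets, keyed by record id.
legacy = {101: {"name": "Acme", "balance": 250.0},
          102: {"name": "Beta", "balance": 0.0}}
migrated = {101: {"name": "Acme", "balance": 250.0},
            102: {"name": "Beta", "balance": 0.0}}

def reconcile(legacy, migrated):
    """Return a list of discrepancies between legacy and migrated records."""
    issues = []
    if legacy.keys() != migrated.keys():
        issues.append(f"record key mismatch: {len(legacy)} vs {len(migrated)} records")
    for key in legacy.keys() & migrated.keys():
        if legacy[key] != migrated[key]:
            issues.append(f"record {key} differs: {legacy[key]} vs {migrated[key]}")
    return issues

print(reconcile(legacy, migrated) or "migration reconciled cleanly")
```

In practice such a check would be run against full extracts of both systems, with any discrepancies recorded as UAT incidents before sign-off.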
- Requirements Analysis/Test Planning - identify the essential elements of the software to be tested, with any wider coverage being considered desirable or a bonus. With larger or extremely complex software, total test coverage may not be possible or practicable. Whilst it may be possible to calculate all the
possible combinations and permutations of test conditions, many of these will not actually occur in practice or not be significant enough to warrant testing time and effort.
- Test Design/Test Planning - Suitable test design enables the requirements to be put into a form whereby the application is broken down into testable chunks. Requirements are usually at a high-level and the design process seeks to further refine these into tangible processes that can be specified as test cases.
- Once each test case has been quantified, it can be broken down into three parts:
- Pre-requisites - what data and other set up is required to run the test
- Test Steps - what actions within the application should the user take to run the test
- Verification - how will it be verified that each test has actually worked
- Test Execution - the tester must record the conditions prior to starting a
test, the actions taken, and the results that occurred. The tester must produce physical evidence, for example screen prints, and be able to repeat the problem. Otherwise, the developer charged with fixing the problem will not understand it, will not be able to repeat it, and will reject it as a problem. The test will then have to be repeated and recorded properly, a waste of everyone's time and effort.
- Managing Incidents - testers will come across incidents involving clarification and
enhancement. Ambiguities in specifications are quite common and may not be discovered until UAT. These are clarified and resolved between the teams involved and agreed to be either a software problem to be cleared now or an enhancement to be provided in a future release.
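The three-part test case breakdown and the execution-recording rules above can be sketched as simple structures. This is a hedged illustration only; all names, fields and sample values are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    prerequisites: list   # data and other set-up required to run the test
    steps: list           # actions the user takes within the application
    verification: str     # how the tester confirms the test actually worked

@dataclass
class ExecutionRecord:
    case: TestCase
    preconditions_noted: bool = False
    actions_taken: list = field(default_factory=list)
    results: list = field(default_factory=list)
    evidence: list = field(default_factory=list)  # e.g. screen-print filenames

    def is_reportable(self):
        """A failure is only worth raising if the developer can repeat it."""
        return self.preconditions_noted and bool(self.evidence) and bool(self.results)

tc = TestCase("UAT-042",
              prerequisites=["test user account", "sample claim record"],
              steps=["open claim", "edit amount", "save"],
              verification="saved amount appears on the audit-trail report")
rec = ExecutionRecord(tc, preconditions_noted=True,
                      actions_taken=tc.steps,
                      results=["amount not saved"],
                      evidence=["screen_042.png"])
print(rec.is_reportable())  # True: recorded well enough for a developer to repeat
```

An execution record with no evidence or results attached would return False from `is_reportable`, mirroring the point above that an unrepeatable, undocumented failure will simply be rejected and retested.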
8. More Information
- Responsible for Service Management, delivery and accuracy of this Catalogue item: Ray Nakano, Director, Application Systems
- User Acceptance Test Standards - British Columbia
- UAT Process description
