The main considerations in selecting a tool for an organization include:
- Assessment of organizational maturity, strengths and weaknesses, and identification of opportunities for an improved test process supported by tools.
- Evaluation against clear requirements and objective criteria.
- A proof-of-concept to test the required functionality and to determine whether the product meets its objectives.
- Evaluation of the vendor (including training, support and commercial aspects).
- Identification of internal requirements for coaching and mentoring in the use of the tool.
Introducing the selected tool into an organization starts with a pilot project, which has the following objectives:
- Learn more detail about the tool.
- Evaluate how it fits with existing processes and practices, and determine what would need to change.
- Decide on standard ways of using, managing, storing and maintaining the test assets.
- For example, deciding on naming conventions for files and tests, creating libraries and defining the modularity of test suites.
- Assess whether the benefits will be achieved at a reasonable cost.
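As a concrete illustration of the naming and modularity decisions a pilot team might make, here is a minimal sketch using Python's `unittest`; the convention `test_<feature>_<purpose>` and all class names are illustrative assumptions, not a prescribed standard:

```python
import unittest

# Illustrative sketch: a shared naming convention (test_<feature>_<purpose>)
# and a modular suite assembled from a common test library.

class LoginTests(unittest.TestCase):
    def test_login_valid_user(self):
        self.assertEqual("user", "user")  # placeholder assertion

    def test_login_locked_account(self):
        self.assertNotEqual("locked", "active")  # placeholder assertion

def smoke_suite():
    """A small, fast suite reused across pilot runs."""
    suite = unittest.TestSuite()
    suite.addTest(LoginTests("test_login_valid_user"))
    return suite

if __name__ == "__main__":
    unittest.TextTestRunner(verbosity=0).run(smoke_suite())
```

Agreeing such conventions during the pilot, before the wider rollout, keeps every later team's test assets consistent and easy to maintain.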
Success factors for the deployment of the tool within an organization include:
- Rolling out the tool to the rest of the organization incrementally.
- Adapting and improving processes to fit with the use of the tool.
- Providing training and coaching/mentoring for new users.
- Defining usage guidelines.
- Implementing a way to learn lessons from use.
- Monitoring use and benefits.
A Technique for Changing the Underlying Tool During a POC
The user interface objects are mainly standard controls, such as:
- Option buttons
Many types of controls may be implemented in a standard application. They need to be automated via wrapper functions, which are then called from higher-level scripts or components. Building a wrapper around a tool's controls gives you more flexibility when changing tools: the scripts themselves do not have to be touched; for a tool change, only the wrappers change. This way you can run the same flows with multiple tools and compare their efficiency.
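A minimal sketch of this wrapper pattern is shown below. The adapter and wrapper class names are hypothetical, not taken from any real tool's API; the point is only that high-level scripts depend on the wrapper, so swapping the tool means swapping only the adapter:

```python
# Sketch of the wrapper pattern: high-level scripts call the wrapper,
# and only the tool-specific adapter changes when the tool changes.
# All class and method names here are hypothetical.

class ToolAAdapter:
    """Adapter for one hypothetical automation tool."""
    def click(self, locator):
        return f"ToolA clicked {locator}"

class ToolBAdapter:
    """Adapter for an alternative tool with a different native API."""
    def click(self, locator):
        return f"ToolB clicked {locator}"

class Button:
    """Wrapper around a button control; scripts depend only on this class."""
    def __init__(self, adapter, locator):
        self.adapter = adapter
        self.locator = locator

    def press(self):
        # Delegate to whichever tool adapter was plugged in.
        return self.adapter.click(self.locator)

# High-level script: unchanged whichever tool is plugged in.
login = Button(ToolAAdapter(), "id=login")
print(login.press())
```

Running the same `Button`-based flow against `ToolBAdapter` requires changing only the adapter argument, which is exactly how the same flows can be exercised against each candidate tool during the POC.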
Rolling out Automation POC
Rolling out an automation POC across a large group of teams in a company is a good idea. While rolling out POC automation ideas, we need to keep the following in mind:
- Create a single structure/framework/standard – We need a single structure for all the automation teams that will be part of the POC exercise. This avoids ambiguity around the new tool and for new people, and makes development of the POC scripts much faster. Since the framework or structure is the same for all tools under evaluation, the tools can be judged more fairly.
- Set of reviewers – There should be a predefined set of reviewers before rolling out POC automation for a set of tools at the organization level. Each person with the reviewer tag can spend 40% to 50% of their time reviewing the code. This ensures all teams follow the same standard and process.
- Responsibility to test correctly – During the POC of any tool, testers should not bring their old baggage with them. The reviewers will ensure the review process for each tool is purely neutral, so the result is unbiased.
- Reusability – The automation test engineers need to be educated to build the code with reusability in mind. Reusable code drives down development and maintenance cost in the long run, and also gives insight into how the tool behaves when reusable components are called.
- Resources – With the same structured code, people quickly feel at home: the learning curve is lower and productivity is higher. On the other hand, if any team needs separate, extra or replacement team members, they can be pulled into the POC activities without much effort. This minimizes the risk of a sudden resource need during the POC.
- Core framework building team – Surprised? Yes, even though this is a POC-building activity, I always prefer to bring the framework building team into POC development. It is always better to segregate the framework building team from the functional and automation testers. The objective is very simple: as framework builders, they can go the extra mile to find the shortfalls of a tool. Moreover, being a separate team, they can concentrate on wrapper building while the automation team looks after the POC delivery.
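The points above (a single structure, reusability, a dedicated framework team) can be sketched as a shared base class that the framework team owns and every POC team subclasses; all names here are illustrative assumptions, not a real framework:

```python
# Sketch of a shared POC framework layer (all names are illustrative).
# The framework team owns BaseTest; functional teams only subclass it,
# so every team follows the same structure and reuses the same helpers.

class BaseTest:
    """Common skeleton every POC test follows, regardless of the tool."""
    def __init__(self, tool_name):
        self.tool_name = tool_name
        self.log = []

    def setup(self):
        self.log.append(f"setup with {self.tool_name}")

    def teardown(self):
        self.log.append("teardown")

    def run(self):
        self.setup()
        self.steps()          # each team implements only its own steps
        self.teardown()
        return self.log

    def steps(self):
        raise NotImplementedError

class LoginPocTest(BaseTest):
    """One team's POC test: only the steps differ between teams."""
    def steps(self):
        self.log.append("enter credentials")
        self.log.append("verify landing page")

result = LoginPocTest("ToolA").run()
```

Because setup and teardown live in one place, reviewers compare like with like across teams, and a new team member only needs to learn the `steps()` contract.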
Other Advantages of POC
- A lot of automation code can be obtained before actual automation even starts. Is that not cool?
- The wrapper functions are available quickly to start test script development. Being tested over and over, they become highly reliable.
- A framework/guideline or structure creates a solid foundation for further framework development, which leads to better automation.
Potential benefits of using tools include:
- Repetitive work is reduced (e.g. running regression tests, re-entering the same test data, and checking against coding standards).
- Greater consistency and repeatability (e.g. tests executed by a tool, and tests derived from requirements).
- Objective assessment (e.g. static measures, coverage).
- Ease of access to information about tests or testing (e.g. statistics and graphs about test progress, incident rates and performance).

Risks of using tools include:
- Unrealistic expectations for the tool (including functionality and ease of use).
- Underestimating the time, cost and effort for the initial introduction of a tool (including training and external expertise).
- Underestimating the time and effort needed to achieve significant and continuing benefits from the tool (including the need for changes in the testing process and continuous improvement of the way the tool is used).
- Underestimating the effort required to maintain the test assets generated by the tool.
- Over-reliance on the tool (e.g. using it as a replacement for test design, or where manual testing would be better).
A study report by the US NIST says:
- Automation coverage was 5% in 2004 and 20% in 2006, but today it is almost 50%.
- Software product companies lose $21.2 billion per year due to insufficient testing.
- $931 million was invested in 1999, and by 2004 this went up to $2.6 billion.
The solution may be….