Will Automated Testing Tools Make Testing Easier?
  • Possibly. For small projects, the time needed to learn and implement them may not be worth it unless personnel are already familiar with the tools. For larger projects, or ongoing long-term projects, they can be valuable.
  • A common type of automated tool is the ‘record/playback’ type. For example, a tester could click through all combinations of menu choices, dialog box choices, buttons, etc. in an application GUI and have them ‘recorded’, with the results logged by the tool. The ‘recording’ is typically text in a scripting language that the testing tool can interpret, and the recorded script is usually modified and enhanced by hand. If new buttons are added, or some underlying code in the application is changed, the application can then be retested by simply ‘playing back’ the recorded actions and comparing the logged results to check the effects of the changes. The problem with such tools is that if the system under test changes continually, the ‘recordings’ may have to change so much that keeping the scripts up to date becomes very time-consuming. Additionally, interpretation and analysis of results (screens, data, logs, etc.) can be difficult. Note that record/playback tools also exist for text-based interfaces, and for all types of platforms.
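The record/playback idea can be sketched in a few lines. This is a minimal, self-contained illustration only: the `Recorder` and `DemoApp` classes are hypothetical stand-ins, and real tools record GUI events rather than method calls.

```python
class Recorder:
    """Records (action, args) pairs so they can be 'played back' later."""
    def __init__(self):
        self.script = []          # the 'recording': a list of steps

    def record(self, action, *args):
        self.script.append((action, args))

    def playback(self, app):
        """Replay each recorded step against `app` and log the results."""
        return [getattr(app, action)(*args) for action, args in self.script]


class DemoApp:
    """Hypothetical application under test, with two 'GUI' actions."""
    def click(self, button):
        return f"clicked {button}"

    def type_text(self, text):
        return f"typed {text}"


recorder = Recorder()
recorder.record("click", "OK")
recorder.record("type_text", "hello")

baseline = recorder.playback(DemoApp())   # first run: log the expected results
rerun = recorder.playback(DemoApp())      # after a change: replay and compare
print(rerun == baseline)                  # → True when behavior is unchanged
```

The comparison at the end is the crux of the approach: any difference between the replayed log and the baseline flags a behavioral change worth investigating.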
  • Another common approach to automating functional testing is ‘data-driven’ or ‘keyword-driven’ testing, in which the test drivers are separated from the data and/or actions used in testing (an ‘action’ would be something like ‘enter a value in a text box’). Test drivers can be automated test tools or custom-written testing software. Because the data and actions are separate from the test drivers, they can be maintained more easily – for example, in a spreadsheet. The test drivers ‘read’ the data/action information to perform the specified tests. This approach can enable more efficient control, development, documentation, and maintenance of automated tests/test cases.
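A keyword-driven driver can be sketched as follows. This is an illustrative assumption, not any particular tool's API: the keyword names and the `FakeForm` class are hypothetical, and the list of rows stands in for a spreadsheet.

```python
class FakeForm:
    """Hypothetical stand-in for the application under test."""
    def __init__(self):
        self.fields = {}

    def enter_value(self, field, value):
        self.fields[field] = value

    def verify_value(self, field, expected):
        assert self.fields.get(field) == expected, f"{field} != {expected}"


# The 'spreadsheet': each row is (keyword, arguments...), maintained
# separately from the driver code below.
test_table = [
    ("enter_value",  "username", "alice"),
    ("enter_value",  "password", "s3cret"),
    ("verify_value", "username", "alice"),
]


def run_table(form, table):
    """Generic test driver: look up each keyword and apply its arguments."""
    for keyword, *args in table:
        getattr(form, keyword)(*args)


form = FakeForm()
run_table(form, test_table)
print("all steps passed")
```

Note that adding or changing test cases only means editing `test_table`; the driver itself stays untouched, which is the maintenance benefit described above.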

Other automated tools can include:

  • Code Analyzers – monitor code complexity, adherence to standards, etc.
  • Coverage Analyzers – these tools check which parts of the code have been exercised by a test, and may be oriented to code statement coverage, condition coverage, path coverage, etc.
  • Memory Analyzers – such as bounds-checkers and leak detectors.
  • Load/Performance Test Tools – for testing client/server and web applications under various load levels.
  • Web Test Tools – to check that links are valid, HTML code usage is correct, client-side and server-side programs work, and a web site’s interactions are secure.
  • Other Tools – for test case management, documentation management, bug reporting, configuration management, file and database comparisons, screen captures, security testing, macro recording, etc.
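To make the coverage-analyzer idea concrete, here is a minimal statement-coverage sketch using Python's standard `sys.settrace` hook; production tools (e.g. the stdlib `trace` module or coverage.py) do this far more robustly. The `branchy` example function is hypothetical.

```python
import sys


def measure_coverage(func, *args):
    """Return the set of line numbers of `func` executed by one call."""
    executed = set()

    def tracer(frame, event, arg):
        # Record only 'line' events belonging to the function under test.
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed


def branchy(x):
    if x > 0:
        return "positive"
    return "non-positive"


lines_pos = measure_coverage(branchy, 1)    # exercises the 'positive' branch
lines_neg = measure_coverage(branchy, -1)   # exercises the other branch
# Together the two calls cover more statements than either call alone:
print(len(lines_pos | lines_neg) > len(lines_pos))  # → True
```

Condition and path coverage, mentioned above, require tracking which branch outcomes and which sequences of branches were taken, not just which statements ran.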
image credit: clixmarketing.com