Possibly. For small projects, the time needed to learn and implement automated testing tools may not be worth it. For larger projects, or ongoing long-term projects, they can be valuable.
A common type of automated tool is the 'record/playback' type. For example, a tester could click through all combinations of menu choices, dialog box choices, buttons, etc. in an application GUI and have them 'recorded' and the results logged by a tool. The 'recording' is typically text in a scripting language that the testing tool can interpret. If new buttons are added, or some underlying code in the application is changed, the application can then be retested by simply 'playing back' the 'recorded' actions and comparing the logged results to check the effects of the changes. The problem with such tools is that if the system under test changes continually, the 'recordings' may have to be changed so much that keeping the scripts up to date becomes very time-consuming. Additionally, interpreting results (screens, data, logs, etc.) can be difficult. Note that record/playback tools also exist for text-based interfaces, and for all types of platforms. A minimal sketch of the playback idea appears below.
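
To make concrete the point that a 'recording' is just an interpretable script, here is a toy playback interpreter in Python. The recording format (type/click/assert lines), the handler functions, and the simulated application state are all invented for illustration; a real tool records far richer events and drives a live GUI.

    # A toy 'playback' interpreter: it replays a text-based recording of UI
    # actions against a simulated application and logs the results.
    # The recording format below is hypothetical, for illustration only.
    RECORDING = """
    type username alice
    type password secret
    click login
    assert status logged-in
    """

    # Simulated application state standing in for a real GUI under test.
    app = {"username": "", "password": "", "status": "logged-out"}

    def do_click(widget):
        # Pretend clicking 'login' signs the user in if both fields are filled.
        if widget == "login" and app["username"] and app["password"]:
            app["status"] = "logged-in"

    def do_type(field, text):
        app[field] = text

    def do_assert(field, expected):
        actual = app.get(field)
        result = "PASS" if actual == expected else f"FAIL (got {actual!r})"
        print(f"assert {field} == {expected!r}: {result}")

    def playback(script):
        # Interpret each recorded line and dispatch to the matching action.
        for line in script.strip().splitlines():
            action, *args = line.split()
            {"click": do_click, "type": do_type, "assert": do_assert}[action](*args)

    playback(RECORDING)  # prints: assert status == 'logged-in': PASS

Commercial record/playback tools layer object recognition, synchronization with the application, and result comparison on top of this basic interpret-and-dispatch loop.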
Other automated tools can include:

code analyzers - monitor code complexity, adherence to coding standards, etc.
coverage analyzers - check which parts of the code have been exercised by a test; may be oriented to statement coverage, condition coverage, path coverage, etc. (a toy example follows this list)
memory analyzers - such as bounds-checkers and leak detectors.
load/performance test tools - for testing client/server and web applications under various load levels.
web test tools - to check that links are valid, HTML code usage is correct, client-side and server-side programs work, and a web site's interactions are secure (a link-checker sketch also appears below)
other tools - for test case management, documentation management, bug reporting, and configuration management.
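
The coverage analyzer idea can be made concrete with a toy statement-coverage tracer. This sketch uses Python's sys.settrace hook to record which lines of a function a test actually executes; real coverage tools do the same bookkeeping across a whole codebase and report statement, branch, or path metrics.

    # A toy statement-coverage analyzer built on sys.settrace.
    import sys

    executed = set()  # (function name, line number) pairs seen during the run

    def tracer(frame, event, arg):
        if event == "line":
            executed.add((frame.f_code.co_name, frame.f_lineno))
        return tracer  # keep tracing line events inside each called frame

    def classify(n):  # hypothetical function under test
        if n < 0:
            return "negative"  # never reached by the 'test' below
        return "non-negative"

    sys.settrace(tracer)
    classify(5)  # the 'test': exercises only the non-negative branch
    sys.settrace(None)

    lines = sorted(lineno for name, lineno in executed if name == "classify")
    print("executed lines in classify:", lines)

Running this shows that the line returning "negative" never executed, which is exactly the kind of gap a coverage analyzer exists to expose.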
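For the web test tools entry, link validity is the easiest check to illustrate. The sketch below fetches one page, extracts its anchors, and reports the HTTP status of each absolute link, using only Python's standard library; the target URL is a placeholder, and real web test tools additionally validate HTML, exercise client-side and server-side code, and probe security.

    # A minimal link checker: fetch a page, extract hrefs, report each status.
    from html.parser import HTMLParser
    from urllib.error import HTTPError, URLError
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":  # collect every anchor's href attribute
                self.links += [v for k, v in attrs if k == "href" and v]

    def check_links(page_url):
        html = urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            url = urljoin(page_url, href)  # resolve relative links
            if not url.startswith(("http://", "https://")):
                continue  # skip mailto:, javascript:, fragment-only links
            try:
                status = urlopen(Request(url, method="HEAD"), timeout=10).status
            except HTTPError as err:
                status = err.code  # e.g. 404 flags a broken link
            except URLError as err:
                status = f"unreachable ({err.reason})"
            print(f"{status}  {url}")

    check_links("https://example.com/")  # placeholder URL for illustration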