Welcome to the Software Testing Spot!
This spot has some ramblings
about software testing (and software design, development and other
information technology related stuff too), some useful tips, an
exercise in design, usability and testing and
a lot of web places to investigate.
If you've got no idea
how you ended up here, don't panic. For light amusement, visit
MagPortal. For a slightly stronger
diversion, try a googlewhack or two. For
heavier entertainment, try teaching yourself a little about software
design, quality and testing with this.
If you are interested in some of Erik Petersen's ideas about exploratory
testing and bug clusters, have a read of the award-winning Back
to the Beginning paper, or choose a link below to read more.
In September 1997, in an online discussion
on the purpose of testing, Boris Beizer said
that software was tested 1] to find bugs and 2] to check quality.
David Gelperin said a higher priority
purpose (when achievable) was 3] to prevent bugs being born. None of
these areas need be the exclusive domain of software testers,
particularly the last 2.
People always make mistakes. When computer
systems are built, it is often from scratch and in a rushed time
frame so the possibility of mistakes is significant. What's worse is
many people are completely unaware that they make so many mistakes
and never bother to check what they have done. When these mistakes
occur in software, they are called "bugs". According to
Boris Beizer, this is a very old English word
(from the Welsh "bwg") that meant a problem or a
difficulty. It was later used to describe problems with machines, then
computers. A bug causes people problems or difficulties when they use
the software.
While "testing" traditionally (up
until the early 1980s) referred to what was done to a system once
working code was delivered (now often referred to as system testing),
testing around 2002 was "greater testing", where a tester
could be involved in almost any aspect of the software development
life cycle. Once code is delivered to testing, it can be tested and
checked but if anything is wrong, the process involved to fix it is
quite detailed and time-consuming. Whether an error was caused by a
design ambiguity or a programmer oversight, it is simpler to find
problems as soon as they occur rather than wait until an actual
working product is produced. Studies have shown that about 50% of
bugs are created at the requirements (what do we want the software to
do?) or design stages, and these can have a compounding effect and
create more bugs during coding. The earlier a bug or issue is found
in the life cycle, the cheaper it is to fix (by exponential amounts).
Rather than test a program and look for bugs in it, requirements or
designs can be workshopped or documents can be reviewed.
Anyone working in software development (not
just testers) needs to check their own work for mistakes, and check
the work of other team members, so everyone is involved in "greater
testing". We need to check what and how we design, develop and
test, and better design and development practices can help
mistake-proof our application for the people who will use it. As
agile software development has become popular, the line between
developer and tester has blurred and disappeared in some cases.
Developer testing tends to be structure-focussed, while system testers
tend to focus on software behavior.
Software is typically built to a model that
is documented in some detail (from a specification, to index cards,
to a conversation that creates an example of how a system is to
function), and the testers compare the model to software as it is
designed. They need to learn about the model (through reading
documentation or investigation) then compare the expected model
actions with the actual system actions. This comparison of actual
versus expected can move beyond software testing into general quality
assurance for the whole software development life cycle. For example:
Is a project plan meeting expectations - is it
realistic; does it have short tasks and regular milestones to track
progress; does it have checking tasks such as reviews, risk
assessments, audits or replans; does it allow for slippage and
rework of defective code; is it agreed to by the team working to it, etc
Is a design specification (whether for a large
system or a prototype) meeting expectations - is it clear, concise,
unambiguous, without contradictions, understandable by both the
customer and the technical team that will work with it, etc
Is code meeting expectations - following
standard practices (for the industry, organization, department or
team); containing enough explanatory comments; as simple as possible
to allow easy modifications and maintenance; using Test Driven
Development (TDD); containing tracing or debugging code if required, etc
Is a test plan meeting expectations -
following standard practices (for the industry, organization,
department or team); explaining the required test environment; the
type and amount of test data needed; roles and responsibilities;
change management; bug raising, resolution and retesting processes, etc
With the popularity of the internet,
software was often developed without a specific model, making it much
more difficult to test. Just as documents could be reviewed without
specifically defining each expected result of each step of the
review, so could tests be performed without explicitly defining
everything that had to be tested in advance. Testing approaches to
this problem are becoming known as "Exploratory
Based Techniques". The techniques include risk based
exploratory testing, rapid testing, attack or taxonomy based testing, etc.
Testing is a lot simpler than it seems once
the basic principles are understood, but software quality is going
backwards. One study
found that most software was tested, though about three quarters of
all software was only tested informally. Another study (Meta Group
Dec 2001) found only 20% of web sites are tested at all. Hopefully
this site may help in some way to correct that appalling figure.
We already know one effective approach. In the last decade, practices
built around small teams (known as agile methodologies) have been
documented and described, focussing on minimizing the overhead
involved in delivering software by trying to deliver regularly
working software to customers, and focussing on human to human
communications when determining the models that are used. I think
these approaches will replace the traditional ones within a decade. I
hope so at least....
My name is Erik Petersen. I'm based in
Melbourne, Australia, and have a consultancy.
I've worked in all facets of software development since the mid 80s,
concentrating on quality since the middle 90s. This occasionally
biographical site contains
some of the things I have found since I came online in
1996. During a short career break in 2002, I created this
software testing site, different from other testing sites on the web.
I wanted it to be like a book of information that I could use and
share with other people. I based it on a folder I had in the early
90s when I worked in software services and moved regularly between
roles across the software development lifecycle. The folder had
reference material with tips and articles of relevance to work, or
things I may do occasionally like give a presentation, or help out in
an interview, etc.
The Internet was and remains a major source
of knowledge for me beyond hands-on experiences. I had a good
collection of good reference sites on testing and quality, and I have
added to those for other relevant areas such as life skills and
project management. This site has introductory material for people
who want to learn more, but it is also a useful resource for
experienced testers, developers, designers, managers and others. It
also has an exercise in design, usability and
testing that both novices and experts should find rewarding.
I invite you to wander through and discover things. The Spot is one
large page so you can search it all using your browser, or use the
heading links or just page through it. The
linked sites are all free, and you can grab the information as you go
(but respect the legalese at each site please). Most sites have their
own links page too. Sometimes I have pointed out particular articles,
sometimes you can investigate for yourself by searching the site or
looking through the articles. Some files are PDFs, so I presume you
have Acrobat Reader. Some files are MS-WORD documents, but most word
processors would handle them. If you find an unreferenced article
very useful, or have a favorite unlisted site, send me a note
and I may mention it by name. There are other sites that I will list
when I get the time (and others that I have yet to discover). For the
time being, try these.
The information here spreads across the
entire software development lifecycle from requirements to design to
development to testing to release. Once you finish here, don't forget
to try the exercise in design, usability and testing.
The software testing and quality engineering
site (acronym STQE, pronounced sticky). When I started at the ANZ
Bank in 1997, one of our goals was to create an internal database of
testing knowledge. We never got around to it, but this cornucopia
did. It has the best articles about almost anything from hundreds of
sources. It also has some other good features like weekly "editorials"
and discussions. StickyMinds is part of SQE,
which Dave Gelperin used to give as his home page before he sold his
share (but you'll still find some of his articles at StickyMinds).
This software engineering magazine crosstalkonline.org (with a
defense slant) had all the issues online and searchable. Each issue
focused on a particular area of interest with articles from leading
experts. If anyone finds it archived, please let me know.
All about Reviews
This is an 89 slide presentation, from the
people behind CrossTalk, covering all types of reviews across the software lifecycle.
This magazine has all issues online and
searchable. They also have great newsletters.
Search here at the Magazine Portal for
software development, design and testing articles from the technical
Dr Dobbs Journal, to big-picture CIO and other mags as well. Search
All categories to include other IT magazines, but also diverse mags
from Scientific American to Wired, and many others. There were 101
magazines that had articles containing "software testing"
though quite a lot would be passing references (like "Car and
Driver", "Bank Director" and "Askmen.com").
Software defect reduction top 10
I consider this short article by Barry Boehm
and Victor Basili to be the closest thing yet to a holy grail of
software quality, and if you understand each item and the
implications of it, and you are able to implement a process to
leverage it, you will be producing great software.
The Software Engineering Body of Knowledge is
a brave experiment, modelled on the Project
Management BOK. The Association for Computing Machinery pulled
out of the SWEBOK process based on the findings in this report.
While the BOK approach may work for some aspects of the
software lifecycle, software testing is another matter. There are so
many different situations and variations on testing that a BOK
cannot cater for. Some University of Calgary students have created a
view of the testing SWEBOK section (slightly more agile than the original).
I voted against approving the draft versions of the
SWEBOK. I was happy to note that Australia had the largest number of
reviewers outside of the US and Canada, but 16 people out of 600 or
so is a tiny sample of the possible reviewer base.
of context-driven quality
The realist's manifesto for quality software
systems. Rather than a testing BOK, what is required is a context
driven approach that changes with the type of software, or project,
or supporting documentation or deadline involved. With minimal doco
or planning time, we utilize exploratory
based techniques. The techniques include risk based exploratory
testing, rapid testing, attack and taxonomy based testing, etc. See
James Bach's or Cem Kaner's
sites for more information, or read some of James's articles on
StickyMinds. Also check out James
Lyndsay's great exploratory testing page
Completely unrelated to Fred Flintstone's
"Yabbadabbadoo", the UPEDU or Unified Process for
EDUcation is a fully featured software development methodology, but
designed to teach good practices in software development (and not
for commercial software production!).
It is based on Rational
Software's RUP process, and a text book that describes the process.
Many companies have proprietary methodologies that they will only
distribute to paying customers. RUP is probably leading the world
because it is built around Use Cases that represent what software is
meant to do, where diagrams supplement words. Search yoopeedoo for
the Use Case artifact for more information, or try StickyMinds.
A cynical but often true view of the software
development life cycle
Mostly human error
Once you are finished here, you'll also
find some examples in the exercise in design,
usability and testing if you look carefully.
Composers of music often write pieces of music
that contain variations on a musical phrase. This archived page
lists variations of a different sort.
don't need to spell to sell
An example of the confusion caused by the
Britney (and other) variations, and a clever piece of functionality
to get around it.
Ray Panko has collated the results of many
studies that show how feeble we really are. One of the most
dangerous aspects of this now is users creating spreadsheet macros
and not realizing how easily they can get it wrong. This rates a
mention as the last item in the Software defect reduction top 10
Engineer's view of Human Error
Article about Trevor Kletz on various sorts of
human error. He believes better training or supervision can prevent
some errors but the most effective action is to reduce opportunities
for error, or minimise their effects, by changing designs or methods
of working. This is a presentation
of his as well.
to Human Failure
This talks about different aspects of human
errors. There's also other reference
material as well
This is a course on errors and failures from a
English quarrying site, with lecture notes and slides.
No matter how perfect a computer system may
be, a fundamental flaw we usually cannot avoid is it needs to be
used by humans. Thanks to Gerard Yvanovich for the link.
Once you finish here, don't forget to try
the exercise in design, usability and testing
Well, sort of. This is my "Back to the
Beginning" discussion of testing axioms from 1976 and their
impact on testing today. It won Best Paper at STARwest in California.
If you are after some basic information, you
may find it in the Software Testing Frequently Asked Questions.
System Testing Lifecycle
This was a good introduction to the classic
system testing life cycle. There were also 10
System Testing Commandments with explanations.
Luckily this is on the web archive. (There was a typo in the
explanation but there are probably a few on this page too.) It's
also worth looking at 16
Testing Life Cycle
This link actually takes you to a book called
"Rapid Testing". The sample chapter is a good introduction
to testing activities from design through to code, and defines some
useful buzzwords like validation, verification, V model, static
testing, dynamic testing, etc. Use these terms in job interviews.
risk and quality
This paper, "A Software Quality Model and
Metrics for Identifying Project Risks and Assessing Software Quality
" is a good introduction from NASA on quality, requirements and
risk, though all the ideas on metrics may not be relevant. Check out
the typo in the URL as well. Oops. [grin]
Another good article is "verification and validation implementation at
NASA". It includes the risk criteria that NASA use.
A Testing Introduction
A brief but meaty university introduction to
software testing by Kevin Sullivan. Testing wasn't even available
when I was at uni, and I don't believe you need a uni education to
be a good tester, but it certainly wouldn't hurt.
data for tests
You can't test everything, so you need to be
choosy. Equivalence Partitioning is a key technique and this article
is one of the best I've seen. It will help you to choose a good
cross-section of your data. Craig Borysowich does not seem to
mention tester or trainer in his bio but he writes great articles
about testing. The slides by Gail Kaiser at Columbia University were
a long-time reference, but they have vanished into the ether. So
have George Corliss' slides from Marquette U.
Another introduction to equivalence partitioning is on this University
of Maryland page. Follow it up with this California
Polytechnic EP example from Dr John Dalbey.
A guide from the Eclipse team.
A great example of using equivalence classes
from Debra Richardson (reposted). Could also be called "How to
thrash a Find function". Study the example, then try the
partitioning exercise then compare your answer to the provided one.
More great uni material.
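The core idea of equivalence partitioning can be sketched in a few lines of Ruby. The `valid_age?` rule below is invented purely for illustration: the domain is split into classes the program should treat identically, and one representative from each class (plus the boundaries) is tested rather than every possible value.

```ruby
# Equivalence partitioning: instead of testing every possible input,
# split the input domain into classes the program should treat the
# same way, then test one representative of each class plus boundaries.
# (The valid_age? rule is a made-up example.)

def valid_age?(age)
  age.is_a?(Integer) && age >= 0 && age <= 120
end

# One representative per partition, with the expected result.
partitions = {
  "below range (invalid)"  => [-1,   false],
  "lower boundary (valid)" => [0,    true],
  "typical value (valid)"  => [35,   true],
  "upper boundary (valid)" => [120,  true],
  "above range (invalid)"  => [121,  false],
  "wrong type (invalid)"   => ["35", false]
}

partitions.each do |name, (input, expected)|
  actual = valid_age?(input)
  puts "#{name}: valid_age?(#{input.inspect}) => #{actual} (expected #{expected})"
end
```

Six test cases cover an infinite input space; adding a seventh value from inside an existing partition (say, 36) would tell you nothing new.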
James is a leader in the new school of
software testing moving away from the traditionally heavily
engineered approach to testing software to agile methods drawing
from other disciplines such as human problem solving and scientific
method. The "exploratory based techniques" link in the
opening paragraphs of this page links to this
article by James that sums up his ideas. You'll find a lot of
interesting reading at his web site. I recommend "Exploratory
Testing Explained". Try to solve his dashboard puzzle (under
Cem says he "tests software, legislation,
and people's patience (not always in that order)". Shift.com
reported that Network Computing magazine called him "one of the
sharpest thorns in the side" of software makers. Cem is a
Professor in Computer Science at Florida Institute of Technology,
and is looking at educating the next generation of software testers.
Back in 1996, when I first started participating in online testing
discussions, I had no idea who he was and blithely gave him some
testing advice. I have used a lot of his advice and wisdom in my
testing. Look at "Paradigms of Black Box testing" for a
great overview of the variety of test approaches.
If you watch the deleted scenes with
commentary in the 2nd Blade movie, you'll see they try the Michael
Bolton experiment with the Vampire King, putting a long blonde
curly wig on him. [grin] This is not that singing MB, but this one
is into experiments of many kinds. One of the deepest thinking
testers and definitely one of the most productive in terms of
regular output, Michael has many great conference presentations and
regular columns to read. Start by reading "A Map By Any Other
James was the professor at FIT who lured Cem
from consulting in California and into education in Florida. James
is breeding "Jedi Testers" that will routinely crash bad
software, just as advanced martial artists smash bricks and wood. If
you haven't heard the mantra of "input, output, data,
computation", you have some catching up to do. With a teaching
team headed by Whittaker and Kaner, can you guess where Microsoft
used to go to hire testers? Then they hired James! Now James has
moved on to Google. James's web site seems to have gone offline, so
I've linked to a Breaking Software presentation.
Brian has his own articles on testing and
development and some surveys of new testing ideas here. Brian has a
foot in both the developer and tester camps, and was the original
tester signatory for the Agile
Alliance. He has probably the best set of agile
testing links on the web.
One general article of Brian's,
"Classic Testing Mistakes", has since become a minor
classic in itself, and I am very proud of the fact that I was one of
the reviewers of the original draft.
After providing your email address and name,
you'll find great articles on risk based testing amongst others.
Paul Gerrard probably has the largest number of web hits for his
name of any person in testing. Paul is a British tester, but also
shares his name with a footballer. I wonder if that has anything to
do with it. [grin]
Elisabeth, Ms Test Obsessed, has some great
articles about software testing. Check out "Why are my pants on
Rex has some great material on software
testing too. Check out "Risk Based Testing: What It Is and How
You Can Benefit".
Rob is always writing great pieces. Search the web to read Rob's "I am a bug" presentation
to learn a little about bugs!
Jonathon is interested in exploratory testing
amongst other things. In the fourth
part of his ET intro , he presents strategies for test ideas.
Many of these qualify as quicktests, short test ideas designed to
highlight a particular behaviour of interest.
Danny is an interesting contributor to the testing space and has some great content.
Harry specializes in model based testing but
he is also interested in testing humor like testing
(broken) Boris Beizer
Boris concentrates on writing books and hasn't
got a web site. He's been heavily involved in testing and quality
since the mid 60s (after getting a testing doctorate!). He takes an
engineering approach to testing and doesn't like to cut corners, but
is now semi-retired. This book extract highlights the best and worst
of 13 test practices. Note that Jakob Nielsen has disproved Boris's
thoughts on usability testing (to my satisfaction at least but
probably not Boris's). Boris also presumes that management always
allows lots of time to develop and test....
The saqa domain
routinely vanishes then returns so if the above link is broken,
you'll also find the extract elsewhere.
This site has some excellent training material
and links to American testing and quality courses.
TPI was created by Martin Pol and Tim Koomen,
and is probably the best process improvement tool to measure where
you are presently and target actions to produce improvements. You'll
probably need to buy the books to do it properly (now called TPI
Next with new tools
), but this is an introduction at least.
This is Randy
Rice's approach when you're not sure what you are testing.
After 50 years of software projects, they
still seem to fail at a rate that would be completely unacceptable in
any other industry. I think the role of risk management will be more
important over the next decade in both testing and software
development. As well as these links, search the page for other risk
links including one from NASA.
Negative Side of Positive Thinking
Payson Hall discusses why risk is so important
in software development. You'll also find some other great risk
articles by Payson here
and at StickyMinds.
In 1997, IEEE Software magazine had a special
issue on "Managing Risk". When the guest editor, Tom
DeMarco, and the author of the lead article, Tim Lister, write a
book on risk "Waltzing with Bears", you would expect a
lot, and they have delivered. There is also a related risk
planning tool (in an Excel spreadsheet). There is more like this
at their ASG site.
This essay by Mark Cashman looks at the risks
that technology development and integration projects face, and
suggests some mitigation strategies.
A summary of a speech by Les Hatton.
guerilla guide to risk management
A speech by Rear Admiral Kathleen Paige, with
the amazing title of Chief Engineer, Assistant Secretary of the Navy
for Research, Development and Acquisition/ Director, Theater Air and
Missile Defense and Systems Engineering. Please respect the
conditions of use. This is now broken, so I'm trying to relocate it.
This is the Software Engineering Institute's
take on risk. There are some (long) documents to download relating
to the 3 letter acronyms SRE, CRM and TRM. Happy reading.
A lesson on ranking risk for testing,
including sample questions.
A double sided reference on risk. Please
respect the legalese.
Once you finish here, don't forget to try
the exercise in design, usability and testing
Jakob Nielsen's useit.com (Useful Information
Technology) site is a treasure trove of information on building
user-focused systems. He has proved it is quite a simple exercise to test
how usable a system is. If you want to do it too, start out with
"Why you only need to test with 5 users", "Cost of
user testing a website", and "Instructions for branch
office testing". Also check out Bob Stahl's "Usability
Testing" article at StickyMinds.
Alan Cooper wrote some software that Microsoft
bought off him and renamed Visual Basic. Alan is into engineering
simple mistake-proof human oriented designs that allow us to do our
work easily. His articles have gone from his site, but I recommend
the archived Fourteen
principles of polite applications
This is an extract of Paul's "The
elements of friendly software design". It is a hyperlinked
document, and the legalese states it is for reading only and not for distribution.
Karl focuses on the design side of software,
with some good articles on requirements and reviews amongst others.
They claim "If you build software,
chances are that you and your organization are using some technique
developed by The Atlantic Systems Guild." They have a lot on
requirements here, articles, a template and links too.
Thomas has a nice discussion on kinds of
requirements, ways to group them, and who could be interested in
them. There's also various other articles to explore.
Developers sometimes will ask testers for
assistance or sources of information, so try these. Also, the Agile
Alliance has seen development testing become much more advanced, and
the line between tester and developers is becoming less clear in many
teams. (This section is under development!)
Estimates are invariably wrong
on traditional IT projects for many reasons. This simple agile
approach to a more formal method (a.k.a. Delphi)
gets all the team involved, is fast and fun. This site is Mike
Cohn's free web-based planning tool. For a more detailed explanation
of how it works, read this.
For non-web p-poker, you can make your own cards if you want to
(here is an example)
or use this template
pack or this set of very visual cards.
Read this piece
for ideas on further improving estimates.
The Agile Alliance is an amazing group of
people creating better ways to design, develop and deliver software
that does what customers want and need. It will take me a long time
to collate all the links I want here, so in the interim, this site
has audio interviews with many of the key players. Alternatively,
try the Open Directory's agile
The Atlassian folk blogging on how they do it.
Here is Brad Appelton's suggested approach for
a presentation slide.
tips on agile software development
This is taken from a conference session by
Test Driven Development is a poor name for an
amazing practice. By building small unit tests that assert the
validity of small parts of programs as the program is built, the
tests as a whole become a regression test checking that everything
still works as changes are made. The requirement to plan tests in
advance of creating the code also forces simplicity of design on the
code, which makes long-term changes much simpler and less
time-consuming. If newly added code breaks existing code, an assert test
should fail, and relevant code needs to be corrected so that the
failing test passes. The tests remain part of the code and can only
be moved to system test when everything works. Properly done, this
covers off lots of the risks of regression that system testers
typically face. All new software development should be done using
TDD, and as the assert code is free, there is no obstacle to it.
Here is some more TDD info
and case studies involving money
handling and reading
a data file (both in the original TDD language, Java). If you
are interested in trying TDD, why not try
it with Ruby. Alternatively, here is a non-programmer 1 hour
involving Excel macros.
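The TDD rhythm described above (write a small failing assert, write just enough code to make it pass, keep every test as a permanent regression check) can be sketched with Ruby's bundled Minitest. The `Wallet` class is invented for the example; in real TDD the test class below would be written first, fail because `Wallet` does not yet exist, and drive the design of the code.

```ruby
# Test Driven Development in miniature, using Ruby's bundled Minitest.
# In real TDD the WalletTest asserts are written FIRST, fail, and then
# just enough Wallet code is written to make them pass.
require "minitest/autorun"

class Wallet
  def initialize
    @cents = 0
  end

  def deposit(cents)
    raise ArgumentError, "amount must be positive" unless cents > 0
    @cents += cents
  end

  def balance
    @cents
  end
end

class WalletTest < Minitest::Test
  def test_new_wallet_is_empty
    assert_equal 0, Wallet.new.balance
  end

  def test_deposits_accumulate
    w = Wallet.new
    w.deposit(250)
    w.deposit(100)
    # This assert now guards against any future change breaking deposits.
    assert_equal 350, w.balance
  end

  def test_rejects_non_positive_amounts
    assert_raises(ArgumentError) { Wallet.new.deposit(0) }
  end
end
```

Every later change to `Wallet` reruns these asserts for free, which is exactly the regression safety net the paragraph above describes.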
If you are wanting to learn more about
development, try Ruby (go on, try
it now!). If you want more, learn
to program with Ruby, or just learn
Ruby, or even read the amazing illustrated
Ruby book. Ruby will enable you to create utilities to help make
testing easier, and has libraries to support test automation and
other application areas (look up watir
). Ruby is free, as are a lot of the references. Brian
Marick's Ruby book is a great source of information too (but
you'll have to buy it).
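As a taste of the kind of testing utility Ruby makes easy, here is a sketch that generates a small file of test data. The field names and values are invented for the example; only the standard library's CSV module is used.

```ruby
# A tiny Ruby testing utility: generate CSV test data in a few lines.
# Field names and sample values are invented for this example.
require "csv"

NAMES = %w[Alice Bob Carol Dave]

# Build `count` predictable customer records to feed into a test.
def test_customers(count)
  (1..count).map do |i|
    { id: i, name: NAMES[(i - 1) % NAMES.size], balance: i * 100 }
  end
end

csv_text = CSV.generate do |csv|
  csv << %w[id name balance]                 # header row
  test_customers(4).each { |c| csv << c.values }
end

puts csv_text
```

Swap the generator for random values, boundary values, or thousands of rows, and the same few lines become a reusable test-data tool.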
To make things easier, use an IDE (Integrated
Development Environment) like EasyEclipse
for Ruby. Read this
short introduction to using it (read from Creating a Sample Ruby
Application From Eclipse), or a longer
introduction if you prefer.
While a program may work for a user, the actual
under-the-covers implementation may be overly complex, not tested
properly, hard to understand and maintain, contain duplicated code,
and basically be unfinished. Refactoring is all about cleaning this
up. The stuff that refactoring cleans up is called technical
debt. This material is from Martin Fowler. There's another good
article by Chuck Connell on The
missing theory of refactoring
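A before-and-after sketch in Ruby shows what refactoring means in practice (the report code is an invented example): behaviour stays identical, only the duplicated, cluttered implementation is cleaned up.

```ruby
# Refactoring in miniature: same behaviour before and after, but the
# duplication (technical debt) is cleaned up. Invented example.

# Before: the tax loop duplicates the total calculation.
def report_before(orders)
  total = 0
  orders.each { |o| total += o[:price] * o[:qty] }
  tax = 0
  orders.each { |o| tax += o[:price] * o[:qty] / 10.0 }  # 10% tax
  "total: #{total}, tax: #{tax}"
end

# After: the duplicated calculation is extracted into a named method.
# External behaviour is unchanged, so existing tests still pass.
def subtotal(orders)
  orders.sum { |o| o[:price] * o[:qty] }
end

def report_after(orders)
  total = subtotal(orders)
  "total: #{total}, tax: #{total / 10.0}"
end

orders = [{ price: 10, qty: 2 }, { price: 5, qty: 1 }]
puts report_before(orders)
puts report_after(orders)   # identical output to the line above
```

Because the output is unchanged, a regression test written before the refactoring proves the cleanup broke nothing.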
I once did some technical writing for IBM, and
one of the challenges was writing help text or explanations of
complex functions, while trying to use the language of a grade 6
student. Explaining dependency injection to a non-coder is a similar
challenge. Imagine you run a window washers collective. Each worker
has their own basic gear, buckets, sponges, scrapers, and can handle
smaller jobs unassisted. Larger jobs need scaffolding or cherry
pickers, so they need to be provided for the workers in those cases.
Dependency injection provides complicated material to a worker
function to help it do its job. This simplifies the design, and
makes it more flexible allowing different types of materials to be
provided in different cases. Another advantage is the ability to
provide fake materials, when we are only interested in testing the
function to check it still works properly. Another introduction to
DI is the first chapter
of a book all about it, or this shorter article.
Here's a Ruby-focussed intro.
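The window-washer analogy above maps onto code like this Ruby sketch (all class names invented for the example): the washer does not build its own lifting gear, it is handed whatever it depends on, so a test can hand it a fake.

```ruby
# Dependency injection in miniature: the worker is handed ("injected")
# its heavy equipment rather than constructing it itself.
# All class names are invented for this example.

class CherryPicker
  def raise_to(floor)
    "lifted to floor #{floor}"
  end
end

# A fake we can inject during testing, so no real machinery is needed.
class FakeLift
  attr_reader :floors_visited

  def initialize
    @floors_visited = []
  end

  def raise_to(floor)
    @floors_visited << floor
    "pretended to lift to floor #{floor}"
  end
end

class WindowWasher
  # The lift is injected via the constructor, not hard-coded inside.
  def initialize(lift)
    @lift = lift
  end

  def wash(floor)
    @lift.raise_to(floor)
    "washed windows on floor #{floor}"
  end
end

# Production wiring vs. test wiring: same WindowWasher, different lift.
puts WindowWasher.new(CherryPicker.new).wash(3)
fake = FakeLift.new
puts WindowWasher.new(fake).wash(7)
puts fake.floors_visited.inspect
```

Because `WindowWasher` never names a concrete lift, the test can verify it asked for the right floor without any real equipment, which is exactly the "fake materials" advantage described above.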
Jean Debell says "May the source be with
you". These 10 items serve as lessons to code by for
developers, and evidence that "developers love acronyms".
How To Write Unmaintainable Code
While this is part of a large site dedicated
to Java, it is of value to all programmers, especially in procedural
or object oriented languages. I hope your developers will realize
that this is really a "how not to" guide!!
Teach yourself programming in ten years
The Amazon search in this article no longer
works. Amazon give you a message saying use Advanced Search, but
provide no links or clues to find it! Maybe they need to hire some
of the people the article talks about!
Testing, Debugging and Reviews
This university book chapter is a technical
introduction aimed at software developers, and it includes example
programs in C++.
This parable is a clever look at management
styles, and the importance of a comfortable chair....
An article based on the famous book.
Johanna writes about managing people and
projects as well as testing. Her new Manage It! book is fantastic.
Mike's Life Cycle, Project &
Infrastructure Management site is a mix of artifacts and links
(across the SDLC including testing and risk), and some engaging web
logs (under Content We Update Daily). Thanks to Chris Holt for the link.
project management skills
These articles are written from an engineering
point of view but apply to IT as well.
The PMBOK is the standard internationally
accepted guide to managing a project, but it is viewed as
information that needs to be paid for, hence the wikipedia entry. In
2002, old versions of the PMBOK were available at the PMI site, then
it was only extracts and now, nothing. This dead link
had the entire 1996 PM Body of Knowledge here, then they followed
the example of the English language sites and removed it. Oh well,
suffice to say that the PMBOK doc ("A guide to the project
management body of knowledge") is available as both a pdf and zip
file, and if you experiment
with a search engine you may find a copy. If you wanted to, you
could buy the latest PMBOK at the PMI site......
Busy Person's Project Management book
Rob Thomsett's plain speaking guide to all
things projecty. Rob's from Melbourne Australia like me.
based project management
James Chapman adds 10 principles to focus on
beyond the PMBOK. I like his 8th: "If it hasn't been tested, it
doesn't work." He also has some tips for project management newbies.
This is a good summary of IT project
management, including Steve McConnell's classic project mistakes.
There are 100 rules for project managers here,
based on experiences at NASA.
There is a tutorial and summaries of a variety
of books here (down to a single sentence or even word!), including
The One Minute Manager, and lists from Martin Luther King and Benjamin Franklin.
Corner Office is a section of the Business
pages of the New York Times. It features interviews with senior
managers on leadership and management. Some interviews (like Steve
Ballmer's) also have audio highlights to listen to.
Boss magazine (Australian based but
internationally focussed) is a general management magazine that
comes out in print on the second Friday of each month as an insert
in the Australian Financial Review newspaper. All the content is
online as well, and it has great articles, information and links.
A survey of free and commercial Test Tools
Rather than reinvent the wheel, I'll let Danny
Faught take over on this.
Open Source Test Tools
I haven't played with many of the tools here
but they look promising.
Just Enough Test Automation
Read these articles extracted from the book of
the same name.
An (archived) site "dedicated to exploring and
supporting the area of software test automation and how it fits into
your software testing cycle", and doing it well...
Alan Richardson has some tool lists, a
presentation, a paper and even some movies of the tools in action.
I've had freemind on my PC for several years now and track most of
my work activities with it.
A good list with several free tools. Also see
the tools links pages at QA City, Testing Stuff and Workroom Productions.
You'll find free requirements, design and life
cycle tools here.
Once you finish here, don't forget to try
the exercise in design, usability and testing
A dozen bug writing tips
This is short, sharp and to the point. Also
see Rex Black's "The fine art of writing a
good bug report", and Elisabeth Hendrickson's
"Writing Effective Bug Reports".
A useful article from the programmer point of view.
Origin of the species
Not Darwin, but Fred Shapiro on the origins of the word "bug".
Apparently, the popular origin story is not altogether accurate.
Grace didn't actually find the bug, but was involved with the team
that did. They knew of the term "bug" already, which is why they
made the pun (first ACTUAL bug) in the report.
Ben Simo asks "Is there a problem here?"
about software that struggles to function, or fails to tell the user
when it has not functioned according to expectations.
The Bad User Interface gallery highlights
screens and messages that are a little less than "user friendly".
A neat introduction to retrospectives from
Retrospectives - Appreciative Inquiry
An overview of the core material in the book
Retrospectives by Esther Derby and Diana Larsen.
and learn from Retrospectives
Another introduction to retrospectives, from
A framework for retrospectives, with lots of
links, from Idia Computing.
To quote Rob Bowley's site, "a resource
for sharing retrospective plans, tips & tricks, tools and ideas
to help us get the most out of our retrospectives".
Alan Dayley wrote this in the wash-up of a
twitter thread I started about the one key agile practice to adopt
first, based on my blog
post. For me it was standups because of the impact on quality,
but the bulk of commenters thought it was retrospectives!
and 5 Whys
Using the 5 Whys to avoid the blame game.
Mostly Life Skills
This site also has information on creativity
and memory skills, problem solving, management, etc.
This site has tips on writing and language.
The material on conciseness is similar to a poster presentation I
did while I was working for Computer Sciences of Australia. If you
follow the conciseness rules when writing, your message should be
understood every time.
Online tutorials to improve skills for public
speaking, interviewing, meetings, teamwork and more.
Dog's Biscuit Bowl
Don't let the name fool you, there is some
great material here on training, leadership (including presentation
skills), an amazing links page (under Library) and more.
Continuing on with strange site names, this
grew out of a consultancy that provided juggling lessons as part of
team building. This site has all sorts of gems, covering team
building, human resources, leadership, communication, humor, even juggling.
Creativity and problem solving
Some ways to generate ideas or solve problems
in teams and by yourself.
This is a comprehensive guide to a range of
problem solving techniques.
in People Skills article
The first half of this article is a short
people skills introduction. Good people skills are essential for
testers, who are typically criticizing other people on a regular
basis and need to be good at sugar-coating the message and building relationships.
to Connect: A Beginning Frame
An approach to improving your listening
skills, by Sally Ann Roth (scroll down till you see her name). I
wish I could follow the advice here more often, but just as my
fingers keep producing typos, my mouth keeps on talking!
Genie Laborde's book "Influencing
with Integrity" is one of the most amazing books I have
read (in the library of one of the companies I worked for), but
unfortunately I don't remember enough of it.
I have to buy a
copy soon. If you don't want to buy it, the downloads at the bottom
of this link page introduce some of the core ideas.
This holistic list is something to aspire to,
and not that hard to attain if you set your mind to it.
Links, blogs and more
Once you finish here, don't forget to try
the exercise in design, usability and testing
A series of tester and developer blogs,
focussed on testing and quality.
My blog at testing reflections.
Project Failures Blog
Michael Krigsman muses on the sad realities of project failures.
A comprehensive list of links, maintained by
Bret Pettichord. At the very bottom of the list you'll find a link
to his own papers and presentations, or just go to his website
to see them.
Kerry Zallar has a neat name for his site, and
a great disclaimer, "This may not be the best testing web site
out there ... but, it links to many that are! "
I was looking for James Lyndsay's site for
some articles, but was pleasantly surprised at the great links list
he also had. James has won several best paper awards, and you'll
find those papers here, plus an exploratory testing timer amongst
other things. We used the timer at an ET workshop we
co-presented at AsiaSTAR 2003 & 2004 in Sydney, Australia. We
have since independently delivered ET training sessions around the
world. Also check out James's great exploratory testing page and his
shockwave black box testing machines.
QA city has now merged into another website.
Go to the page of Downloads and look for "Common software
errors" to see a great (long) list of error types, after you
provide your details.
of online training
While it is preferable to learn technical
skills from a human, there are many introductory courses available
for free on the web. All testers should know XML and SQL, and you'll
find interactive courses that let you practice your programming as well.
Design and development links
A great (archived) page of design,
architecture, patterns, use cases and other links.
A great page of usability and user interface links.
A page of useful links, aimed at non-profit organizations.
Project Management Links
Another page of useful links, mostly templates
and reference material, with some (archived) content.
Research and analysis pages
These links are to assorted magazine articles.
The Open Directory's software testing links
The Open Directory sites are listed by
popularity as sites and links. Will this page ever make it? Yep. It
appeared in mid 2002, and cracked the top 30 in Sept 2002, but has
since dropped out (it is now on the Directories page). Maybe I
should have updated those broken links quicker... :)
The name may indicate Quality Assurance, but
in their own words, this is "The online community for software
testing & quality assurance professionals". There are
nearly 40 forums covering process, approaches, tools and logistics.
There's also an associated links page.
Here you'll find odds and sods including
some sites that look back at the past, a site that looks to the
future, and some web related sites.
Australasian functional testers, with a nimble
twist; rather dormant, but may revive.
This is an archived version of this page
translated electronically into Arabic. Read my Arabic
on-the-fly translations blog entry for more, and a description of
how to generate translations yourself.
Hand Signal Technique
I've blogged (is this a dictionary word yet?)
on this. It is an incredibly simple technique to use in meetings to
wrap up all the discussion on a topic before moving on to the next
topic, or alternatively jump to a new topic to move on from a stale discussion.
James Lyndsay got me interested in using
shockwave programs running in browsers for teaching testing. He prefers
to write his own testing machines, but I
prefer to reuse other people's (with real bugs). I have shown
hundreds of people my exploratory testing of clocks demo. I have
written up but not posted a simple intro to testing using shockwave
programs (stay tuned!). If the following game and clock don't work,
you will need to download shockwave. So here is an interesting game
you could try and a corresponding clock.
I challenged James Bach with the game as a
testing challenge, and he ended up beating my high score of 6481 I
think (can you find the strategy that lets you score in the
thousands?). If you get an extremely high score, send me a
screenshot. The clock has no bugs but you can watch it forever....
Turning Points in Computing (up till 1999 at least)
A special 1999 issue of the IBM Systems
Journal, this is a good source of historical information.
Folks in IT
Computerworld's interviews with significant IT
This is an interesting source of information
covering many aspects of technology.
history of Programming
David Chisnall takes us from Turing to
The rapidly changing face of computing is the
subject of this report (and that was also its name until recently).
People from around the world assist Jeff Harrow with items
for his newsletter. You can subscribe via email or just read it
here. Thanks to Craig Spendlove for this.
A must-visit site for the serious net
searcher, with a great monthly email newsletter.
All work and no play makes for an unhappy day.
If Magportal can't amuse you, try
googlewhacking as an intellectual challenge. What is googlewhacking?
Try it and see. A related activity is using Google Autocomplete,
or creating your own Autocomplete application.
Try this out
Try this exercise in design, usability and
quality. Here are some examples of perpetual calendars (most ordinary
calendars are only valid for a month or a year at a time). A perpetual
calendar is designed to stay current indefinitely. It may also be used
for future planning, or investigating historical dates. Try out each
calendar, then try to answer these questions. Which are the best
designed? Which are the most usable? Which are the simplest? How
would you justify your choices? If you redefined your concept of
good design or usability or simplicity, would your choices change?
What if you had to choose a calendar for a child to use?
What functionality does each
calendar offer? Is it a yearly, monthly, weekly or daily calendar?
Is it obvious how to use it? Do you think people would need to be
trained to use it? Can you tell what day of the week a date falls
on? Can you easily move to adjacent months or years? Could you use
your browser to print a calendar for a month or a year and hang it
on a wall? If you print it, do you have space to write notes for
particular days? Which calendar would you choose to use? Would this
change if you had to meet a particular requirement (e.g need space
for notes, or need to easily view next or previous month), or some
combination of requirements?
What is the quality of the
software like? Are the calendar headings correct? Do they overlap?
Are you doing all the calculation instead of the computer?
Alternatively, could you print one out as fallback (perhaps during a
power strike) to calculate a future date? Try doing incorrect things
as well. What happens if you enter unexpected information, e.g.
letters for numbers or vice versa, or just spaces or nothing at all?
Or is it mistake-proofed to prevent you? What happens if you try to
enter a year with 10 digits, or a negative year, or a year of zero?
Which calendar would you choose to use in terms of quality?
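To give the quality questions above something concrete to test against, here is a minimal sketch (not taken from any of the linked calendars; the function name and validation rules are my own illustration) of the day-of-week calculation at the heart of a perpetual calendar, using Zeller's congruence, with some of the mistake-proofing the exercise probes for:

```python
# Zeller's congruence: the classic formula a perpetual calendar can use
# to find the day of the week for any Gregorian date.
# (Illustrative sketch; names and checks are my own, not from the site.)

DAYS = ["Saturday", "Sunday", "Monday", "Tuesday",
        "Wednesday", "Thursday", "Friday"]

def day_of_week(year, month, day):
    """Return the weekday name for a Gregorian date, or raise ValueError."""
    if not all(isinstance(v, int) for v in (year, month, day)):
        raise ValueError("year, month and day must be integers")
    if year <= 0 or not 1 <= month <= 12 or not 1 <= day <= 31:
        raise ValueError("date out of range")  # rejects negative or zero years
    # Zeller treats January and February as months 13 and 14 of the prior year
    if month < 3:
        month += 12
        year -= 1
    k, j = year % 100, year // 100  # year within century, and the century
    h = (day + (13 * (month + 1)) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    return DAYS[h]

print(day_of_week(2000, 1, 1))   # Saturday
print(day_of_week(2020, 3, 20))  # Friday
```

Try the exercise's nasty inputs against it: a negative year or a year of zero is rejected, and letters for numbers raise an error, but a 10-digit year slips through the checks, and the formula assumes the Gregorian calendar throughout, so dates before the Gregorian changeover come out wrong. Exactly the kind of edge the exercise asks you to hunt for.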
If you want to, try the exercise again with these perpetual calendars.
While these calendars all work for both Internet Explorer and
Netscape Navigator, calendar E does not fully function with W3C
browsers like Opera, Mozilla, etc (though all the other calendars
are OK for W3C), and Opera does ugly things to calendar D.
Was the spot "Some Preaching On
Testing"? Or "Some Polemically Overblown Tirade"? Or
something in between? Send comments to Erik Petersen at
If you took the challenge in the Mostly Bugs section, you may
have already seen the first
In case you are wondering why some links are
listed as the something site, or Mr Somebody Useful, and others are
not, HTML won't let you nest a site link inside a page link, so the
extra text at the start of the site link lets you do it (but it looks
a little clumsy all the same).
this site is under (perpetual?) construction. Please visit again.
Thanks to the wayback machine site for preserving lost sites.
Copyright © 2002-2020 Erik Petersen
First created - March 25 2002
Last read - What
time is it now?
Last updated - 20 Mar 2020
Ongoing Task - updating 50 broken links
Last updated with archived sites - 21 Aug 2011
Word clouds added - 6 Nov
First deleted - On June 2 2004 for 10 days by an annoying
(hey IR, this is a site to help people so please don't
Also hacked with random text and links for expensive european watches, WTF?
Disclaimer : My fingers ignore my brain as I
handcraft HTML so please ignore typos. The reliable links were all
live last time I checked.