TEST Magazine - February-March 2012


Innovation for Software Quality | Volume 4: Issue 1: February 2012

Testing in New Zealand – the ‘number-8 fencing wire’ approach

Inside: Test automation | Outsourcing | Data-driven testing. Visit TEST online at www.testmagazine.co.uk




Is software quality on the curriculum?

There was a largely positive response to the UK Government’s recent announcement that it intends to change the way IT is taught in schools; indeed, many have been calling for a new approach for some time.


© 2012 31 Media Limited. All rights reserved. TEST Magazine is edited, designed, and published by 31 Media Limited. No part of TEST Magazine may be reproduced, transmitted, stored electronically, distributed, or copied, in whole or part without the prior written consent of the publisher. A reprint service is available. Opinions expressed in this journal do not necessarily reflect those of the editor or TEST Magazine or its publisher, 31 Media Limited. ISSN 2040-0160

www.testmagazine.co.uk

A pilot running in 20 schools is set to help determine three critical aspects of the new IT GCSE qualification. First, the right balance of skills and knowledge: ensuring coding and computational principles, software development and logic skills are combined with creativity, design and teamwork. Secondly, that a robust and rigorous assessment process is in place. And finally, that resources to help teachers deliver the new qualification, and students achieve it, are comprehensive, inspiring, challenging, and based on real-world application.

I’m sure few of us would argue with any of that; it all seems very positive. However, is it too much to ask that quality is stressed when we are talking about software development? I note that ‘a robust and rigorous assessment process is in place’, but I suspect that this refers to the assessment of pupils rather than software.

The intention is to create a qualification that students find stimulating, and that is at the same time highly regarded by universities and employers. Stephen Leonard, chief executive of IBM UK and Ireland, commented: “We are long overdue a completely new approach to teaching IT as a subject. With our work, we will make IT inspiring to young people and put the UK on the world stage in educating the technologists of the future. We are putting the weight of industry behind a transformation in education, working with schools and universities to create courses of academic substance and industry relevance.”

Other employers participating include Blitz Games, BT, Cisco, Microsoft, John Lewis Partnership, the BBC, SAS, Capgemini, Deloitte, Google, A1-Technologies, Storythings, BAFTA, Interactive Opportunities, Accenture, DXW, The Open Rights Group, Nokia and Autonomy. Clearly there is an emphasis on the glamorous, ‘media’ end of the spectrum, but it is encouraging to see Capgemini in the mix – I fervently hope and expect them to be making the case for a testing-based approach!

While I’m sure no one would argue with the aims of this approach to IT in education, it would surely reassure those who know a little about software development – and the often catastrophic consequences of neglecting quality – to see testing and QA woven in to this qualification from the start.

Matt Bailey, Editor

Editor: Matthew Bailey, matthew.bailey@31media.co.uk, Tel: +44 (0)203 056 4599. To advertise contact: Grant Farrell, grant.farrell@31media.co.uk, Tel: +44 (0)203 056 4598. Production & design: Toni Barrington, toni.barrington@31media.co.uk; Dean Cook, dean.cook@31media.co.uk

Editorial & Advertising Enquiries: 31 Media Ltd, Three Tuns House, 109 Borough High Street, London SE1 1NL. Tel: +44 (0) 870 863 6930. Fax: +44 (0) 870 085 8837. Email: info@31media.co.uk. Web: www.testmagazine.co.uk. Printed by Pensord, Tram Road, Pontllanfraith, Blackwood NP12 2YA

February 2012 | TEST




Contents | February 2012

1 Leader column – Is software quality on the curriculum?

4 News

6 Cover story – Testing in New Zealand – the ‘number-8 fencing wire’ approach
New Zealand has a can-do attitude. The challenges of its geography and often sparse population have fostered an improvisational approach exemplified by the ‘number-8 fencing wire’ analogy, where the ubiquitous material is used for any number of purposes in addition to fencing. Chris and Naomi Saunders spoke to a group of their friends in the industry for TEST magazine.

12 Software testing as a growth market
Steve Fice reports on a study by Pierre Audoin Consultants into perceptions of software testing and quality across Europe and discovers that, with test optimisation on the rise, organisations are increasingly looking for managed testing services.

16 Testing services – it’s not just a cost thing
As outsourcing and offshoring develop, they are adding more to the testing process than simply cost savings, as Raja Neravati explains.

18 Supplier profile – Top performance
TEST magazine speaks to Olivier Hanoun of Neotys about testing services, mobile app testing and automation, and other topical testing issues.

20 Top ten test management tips
With over a decade of experience in test and QA management, Nadia McKay offers her top ten test management tips.

24 Testing the message layer
Andrew Thompson offers his top tips for testing web-service-enabled applications in an article designed to give you an idea of what sort of tests to consider.

26 How to test mobile apps instead of being tested by them
Mobile testing doesn’t have to make you testy. George Mackintosh examines the app market and reveals how automated graphical user interface (GUI) testing tools can make life considerably easier.

30 Test automation can make you happy
With a long history in software testing, Theofanis Vassiliou-Gioles describes how use of the correct test automation tools can bring a smile to a weary tester’s face.

32 The complexities of data-driven testing – Testing voice biometric authentication solutions
Ashley Parsons explains the benefits of voice biometrics in strong authentication solutions and the complexities behind data-driven testing.

36 Training corner – Professionally speaking...
The subject of email etiquette and ‘netiquette’ is a tricky one; even after more than a decade of use, we still haven’t really cracked it. Angelina Samaroo makes it her resolution to have a go.

37 Design for TEST – Survival of the fittest
Taking his inspiration from nature with a Darwinian approach, Mike Holcombe is breeding test sets using evolutionary testing.

38 Does getting a test certification make you a better tester?
It’s experience and enthusiasm that make for a good tester, argues Ramanath Shanbhag, but he can see a day when testers will yearn to be certified in order to qualify to do the job and get the respect they deserve.

41 TEST directory

48 Last Word – One man wolf pack
Dave Whalen is facing a lone wolf in his testing pack. Should he break out the wolfsbane and silver bullets or tolerate a singular talent?


IBM buys Green Hat

Global computing giant IBM has announced that it has acquired Green Hat, the UK/US-based testing solutions company. According to IBM, Green Hat will join its Rational software business which, the company says, when combined with its Rational solution for Collaborative Lifecycle Management, will allow developers and testers to achieve unprecedented levels of efficiency, effectiveness and collaboration while delivering quality software to their business. “IBM and Green Hat will help customers maximise continuous integration of an application, including creating virtual protocols, message formats, services, customisation and engagement with third-party software,” says the company. “Development teams can avoid scrap and rework and dramatically reduce costly delays while achieving greater business agility and accelerating the delivery of software applications.”

“This acquisition extends IBM’s leadership in driving business agility and software quality by changing the way enterprises can manage software development cost, test cycle time and risk,” said Kristof Kloeckner, general manager, IBM Rational. “Together, we offer the most complete solution available today for Agile software development and testing, with flexible options such as the cloud. Green Hat’s application virtualisation capabilities will help our customers accelerate their delivery of business critical software.”

“We’ve been focused on transforming our customers’ software development processes through innovative testing and quality improvements,” said Peter Cole, CEO, Green Hat. “We are looking forward to bringing Green Hat’s innovative application virtualisation and continuous integration testing expertise to IBM customers who have a growing business need to better manage their complex testing environments.” The Green Hat software testing solutions will also be offered through IBM Global Business Services’ Application Management Services (AMS).

“Despite possessing a state-of-the-art Rational division that already offers a wealth of software development and testing tools, IBM has decided to cut down on all software development expenses and employ Green Hat products instead as a cost-effective solution,” commented testing industry insider and Bugfinders director Martin Mudge. IBM’s decision certainly seems sound in present circumstances: according to the US National Institute for Standards and Technology, the national annual cost of software testing ranges approximately between $22.2 and $59.5 billion. Martin Mudge says he is convinced software testing should not be overlooked even in these difficult times and supports IBM’s decision to invest in this field: “Development is becoming an increasingly faster process and traditional software testing methods just can't keep up.”

Debugging tools play key role at Saudi Arabian supercomputing facility

Tools from Allinea Software will play a key role in the development of software at a new supercomputing facility at the aeronautical engineering department, King Abdulaziz University (KAU), in Saudi Arabia that will extend research capabilities in topics such as turbulent flow modelling. KAU, which has advanced facilities encompassing a broad range of research areas, is an established institution of higher education in Saudi Arabia, with over eighty-two thousand students. Allinea DDT with CUDA and Allinea OPT software – which enable users to debug code running on the GPU in addition to the CPU, and to perform comprehensive analysis of complex codes, respectively – will run on a 4,300-core IBM system installed at the facility.

Ibraheem Al-Qadi, assistant professor at KAU, commented: “For our new HPC facility, we required debugging and optimisation software that would allow us to work more efficiently in accomplishing our goal to carry out advanced turbulent flow simulations. The impressive scalability, support for GPU and intuitive user interfaces would enable us to achieve this, and as such, these were the deciding factors for choosing Allinea DDT with CUDA and Allinea OPT products.” Jacques Philouze, worldwide VP of sales and marketing at Allinea Software, commented: “We are delighted that King Abdulaziz University have decided to use Allinea Software for their development. Allinea focuses on empowering developers with tools that can debug quickly and easily, as well as offering tremendous scalability. The acquisition of Allinea DDT with CUDA and Allinea OPT tools will contribute greatly to computational fluid dynamics research at the university. We are also proud to announce that this deal with KAU marks our first partnership with a client in Saudi Arabia.”

Testing partnership supports Agile development

Web load and performance testing specialist Reflective Solutions has signed a solutions partner deal with Capgemini UK. Capgemini has integrated Reflective’s performance testing tool, StressTester, into its UK Accelerated Delivery Centres (ADCs). The ADCs provide on-demand client and project hosting and, according to the company, its use of StressTester enables it to support the Agile development process to provide clients with correctly tested applications, in shorter timescales and with less risk.

Scott Davies, engineering director, Systems Development and Integration at Capgemini UK, comments: “StressTester has lowered the barrier to entry for performance testing for our sprint teams. It has also enabled us to put a performance testing capability in each Agile team, giving us performance testing capability within the sprint.”




First cloud-based full-service test management tool

Automation Consultants, the IT services and software company, has launched a full-service, cloud-based test management tool, TestWave. Priced at £75 per user per month, and instantly accessible through any Java-enabled web browser, TestWave fills a gap in the market for a full-service test tool that is affordable and easy to use, according to the company. The software has been developed in-house by Automation Consultants' team of engineers, who have consulted on testing projects and IT transformations for clients that include BSkyB and Vodafone. It manages the testing of an IT system by enabling teams to store test scripts, analyse results, and record and track defects. Test managers can see the progress of testing in real time through intuitive dashboards, and map the testing to releases and requirements. TestWave integrates with HP's test automation programme, QuickTest Pro (QTP), enabling testers to store QTP scripts in TestWave. Additionally, QTP tests can be run in large numbers directly from TestWave, eliminating the need to start up QTP, and the results are returned to and stored in TestWave, where they can be statistically analysed across multiple test runs.

Francis Miers, TestWave director at Automation Consultants, comments: "Having worked in this sector for a decade, we knew there was a need for a test management tool that could reduce waste in IT projects without breaking the bank. TestWave enables the test manager to keep track of the progress of the many tests involved in any significant IT project, regardless of the scale of the project or whether it is spread over multiple locations."

Compared to tracking testing by spreadsheets and e-mail, TestWave reduces duplication and omission, and makes management easier by showing the current state of testing in a series of intuitive real-time dashboards. It is therefore ideal for software development or major changes to existing systems such as upgrades.

Capgemini ranked number one for outsourced testing

Outsource tester Capgemini Group has announced it has topped Ovum’s 2011 benchmarking study of testing services, ranking above other world-leading technology service providers. The combined testing practice of the Capgemini and Sogeti business units was recognised particularly for its test process expertise as well as its customer intimacy and responsiveness. Ovum in particular noted its structured approach to testing through Sogeti’s Test Management Approach (TMap) and Test Process Improvement (TPI) methodologies, which are even used by other (competing) testing services vendors. Furthermore, in many organisations worldwide, TMap is viewed as the standard for testing, containing practical methods for a risk-based testing approach, allowing testers to optimise the cost and benefits of testing and making it easier for CIOs and test managers to obtain buy-in from C-level decision-makers.

The Ovum Services Guide: Outsourced Testing benchmarks software and systems testing services providers across the world. Ovum bases its study on 20 key criteria ranging across cost and value, service portfolio, domain expertise, innovation and talent pool. Ovum benchmarked 13 software and systems testing services providers, ranking Capgemini Group at number one, ahead of other major players in the market. Dr Alexander Simkin, lead analyst at Ovum and author of the study, said: “For a vendor of Capgemini Group’s size, its ability to build deep, enduring customer relationships with its testing customers is impressive. No other testing services vendor has managed to establish a global presence and strong connections with its customers the way Capgemini Group has.”

UK Government plans for high-speed broadband could kick start flexible working revolution

With an estimated £530m to be invested in the UK’s rural broadband connectivity, and a further £100m to upgrade broadband in the ten biggest cities, improved broadband could be a driving force for changes in modern working practices, according to John Drover, chief executive of virtual offices provider Executive Offices Group. “Broadband provision in rural areas could facilitate seamless business operation for professionals on the move, allowing work to be virtually uninterrupted by trips across the country,” says Drover. “However, the greatest impact will be evident for those people who wish to live rurally, and operate and manage their business on location, where the option is currently not available. In this case, better broadband provision will allow them to efficiently conduct their business from home, significantly improving their quality of life by eliminating the need to commute into cities.

“More people than ever are choosing to work from home through the means of virtual offices, which allows them to fit their working life around their private lives. The provision of better broadband is an evolutionary step in the path to better and more flexible working practices required by the modern working professional.”



Cover story

Testing in New Zealand – the ‘number-8 fencing wire’ approach

New Zealand has a can-do attitude. The challenges of its geography and often sparse population have fostered an improvisational approach exemplified by the ‘number-8 fencing wire’ analogy, where the ubiquitous material is used for any number of purposes in addition to fencing. Kiwi testers Chris and Naomi Saunders spoke to a group of their friends in the industry for TEST magazine.

New Zealand’s test industry reflects the culture of the country. It is a country with a land area (268,021 km²) one tenth bigger than the United Kingdom, but it has only 4.4 million inhabitants, with another estimated half a million Kiwis abroad; imagine the UK with the population of Yorkshire.

The world knows all about the mighty All Blacks, the huge agriculture industry and the beautiful landscape where Lord of the Rings was filmed. Last year rugby fans watched the Rugby World Cup as it unfolded in New Zealand but they also saw on their TVs the devastation of the Christchurch earthquakes. This is a part of what the world sees and knows about New Zealand and New Zealanders, but what type of people make up the New Zealand test industry and what influences the test profession of this pair of small islands in the south Pacific? We interviewed three test professionals, who shared their personal views on the influencing factors behind the testing industry in New Zealand.

People, programming, testing

Clare McLennan is an independent test automation consultant and a native New Zealander. She describes herself: “People, programming, testing – it all interests me. What I basically do is work with teams, listen to what their testing headaches are and help them develop their own process that means they can reliably release quality software on a regular basis.”

Ian Ross is the principal consultant for the Software Quality Practice at Clarus and is on the board of the ANZTB (Australian and New Zealand Testing Board); he is also a native New Zealander. He describes himself as, “Geeky – that is to say technical – agile, with strong test analysis leanings, motivated by the big picture (that is often bigger than the product). I love the ability to dive into the detail and I’m generally context driven.”

Ian Wells is a test manager for the GIS Data Collection Division of Trimble Navigation and immigrated to New Zealand five and a half years ago from Boston (United States). He originally comes from Canada. “I have a passion for testing high-tech systems,” he says, “for improving how we test and, perhaps paradoxically, developing processes that can reduce the need for testing. I am motivated by the desire to make products that delight customers.”

New Zealand culture – outward looking

New Zealand is a melting pot of different ethnicities: Maori, European, Polynesian, South African, East Asian and others. They have all originally come from somewhere, so as a nation New Zealanders tend to be outward looking. This can clearly be seen in what is known as the New Zealanders’ OE (overseas experience). Young people often leave New Zealand to explore the world, usually after graduating from university in their early to mid twenties and with some industry work experience behind them. Currently over ten percent of Kiwis are abroad. Most return home over time, bringing with them a range of work and personal overseas experience, and in the process they are redefining the culture of New Zealand.

In 2005 Clare McLennan embarked on her three-year OE before returning to New Zealand. During this time, she white-water kayaked (grade 4/5), mountaineered and rock climbed in many different countries. “Some of the biggest professional growth I experienced came from learning to analyse risk better through doing so much outdoor stuff. The outdoors is a particularly challenging place to learn about risk, because whenever you are staring at a hard rapid and deciding whether you’re going to paddle it, you are facing a potentially life and death decision,” she says. “A tester’s job is to reduce the risk of the software doing something bad in the hands of the customer. Since exhaustive testing is impossible we need to be continually thinking about how likely it is that something will go wrong and any consequences.




Assessing such risks accurately is something as people we aren't naturally very good at, so I would definitely recommend aspiring test professionals work on getting better at it.” Travelling overseas not only impacted Clare with the consequences of risk, but also highlighted her own cultural self-awareness. “Going to Sweden, learning Swedish and being the ‘dumb foreigner who couldn't understand’ was a really interesting experience. It has certainly helped me understand the challenges non-native English speakers face. But perhaps my most eye-opening cultural experience was some volunteer work I did in Nepal on a Nepali-Swedish project. As an outsider to both cultures I found it fascinating to observe how easily miscommunication can occur between peoples from different cultures.” While Ian Ross has not lived overseas, he draws on experiences and material outside the testing industry, particularly with his involvement with Search and Rescue. He draws on “concepts and ideas from operations research, systems modelling, games theory and AI. I think we, as professionals, should be looking outside of our own little patch,” he argues. “Perhaps that's a New Zealand take on things, we are used to being a small country, used to looking overseas, looking out and bringing ideas in. I think, as a profession testing is still young, we need to be looking at other professions and other ways of problem solving. With my test teams I bring a lot of what I do with the search and rescue teams into what I do in testing. For example if we are talking about exploratory testing, I bring concepts regarding breaking down and searching a problem space like those we would use to search a large park. I like bringing those other things into the profession.” The modern technological advances for communication and travelling have allowed Ian Wells access to the global test community. “New Zealand is more tightly connected to the rest of the world than its geographic distance may suggest. 
Increasing ease of communication is a trend that will continue to make it easier to live in such a beautiful country and still be tightly connected to businesses anywhere else in the world,” he says. “That being said, personal contacts and face-to-face meetings are critical to professional development and keeping up with accelerating

technological change. Events are important to get to, like testing conferences; from New Zealand, one just has to travel further, making a bigger point to do trips to the US or to Europe, to meet more people doing testing face-to-face.”

Photo: Ian Ross, principal consultant for the Software Quality Practice at Clarus, who also works in Search and Rescue.

Test culture

New Zealand’s testing culture is varied. Ian Ross describes it as “diverse, ranging from the person who knows the application very well, or the person who has complained the most – so congratulations, you are the ‘company tester’. The range extends from there right through to people who are passionate about the creation of good software and about improving the creation of software and culture, not just of testing but of development teams in companies. A lot of testers I run into in New Zealand are people who have tended to come to testing through other means, and it is only now that it is becoming an established industry that people are starting to go, ‘I want to be a software tester’.

“In some places in New Zealand I think there is definitely a testing culture,” adds Ross, “for example in Wellington, where you have a pool of contractors that move around. These people get exposed to a lot of different companies and because they take this with them, companies

all start to become similar. A lot of Wellington work is for the government, so naturally there is a bias to becoming similar anyway. Christchurch seems to be a little bit more company-focused.” This means in general Wellington has a state service-oriented test culture, compared to the corporation-oriented test culture in the rest of New Zealand. “As Kiwis,” says Ross, “we don’t mind sharing ideas and thoughts on how to get better at doing stuff, we are used to helping each other out. That does not mean to say we can’t keep industrial secrets, we recognise intellectual property, but there is also a community.” With New Zealand sitting in Asia Pacific, Ian Wells has this to say about the New Zealand test culture. “It depends more on the organisation than the country. It has more to do with company culture, probably the emphasis on testing by the managers of that company. New Zealand is on the interface, along with Australia, of western cultures and Asian cultures. From my perspective, I see in western business cultures an increasing emphasis on exploratory testing. I see a move towards improving the efficiency of testing, because of the need to balance time to market with increasingly complex product and higher customer quality expectations. This has led to

growth in exploratory testing, agile development/testing teams and lean approaches in general. The more conventional ‘check list’ style testing, although important in places, is no longer sufficient to get quality products shipped on time. Good testing requires inquisitiveness, excellent powers of observation, clear communication and the ability to question. These are all attributes of the New Zealand culture and the New Zealand test culture.”

Photo: Native New Zealander Clare McLennan is an independent test automation consultant and avid mountaineer.

Home grown – inventors in the shed

The attitude of many New Zealanders is ‘you can turn your hand to anything, usually inventing something useful in your Kiwi shed’. Ian Ross says, “I think the number-8 fencing wire metaphor is a little tired these days, but to a degree it is still true; I’ve been in departments where we have bolted solutions together, without spending hundreds of thousands on third-party product suites. The final result has been better than if we had purchased a solution.”

The number-8 fencing wire mentality relates to a typical Kiwi being a problem-solving, lateral-thinking type, with an ability to invent or fix anything with what’s available in their shed; a farmer would typically use number-8 fencing wire to fix any number of farming problems. For example, one Kiwi farmer has patented a bent piece of the wire, with the purpose of holding up plastic containers to feed animals on the farm.

“Kiwis, when compared to places overseas that I have had exposure to, seem more accepting of testing,”

says Ross. “We are used to mucking in, and therefore a tester is not there so much to tell the developer he has got it wrong; you’re a second pair of hands to make sure you’re doing it right. I think that attitude fits well within the New Zealand culture.”

Clare McLennan has similar views on the testing practice in New Zealand. “We do come up with some interesting new stuff in New Zealand as we like to do things our own way,” she says. “We tend to look to world experts for new practices, but then mould their ideas to our current situation. I see quite a few good ideas developed in NZ but many just stay within the company or city where they were developed. Possibly, we are not so good at pushing our ideas back to the world through blogs or conference presentations etc. However, we do have a few people who are world leaders; in my area of ATDD (Acceptance Test Driven Development), we have Rick Mugridge. He's quite an expert and wrote one of the first books about it – ‘Fit for Developing Software’.”

Ian Wells has introduced his own invention for testers’ career paths, by accommodating a person’s testing persona. “There are two kinds of people. There are people that are career testers, who are kind of rare, and then there are people who see testing as an avenue into working somewhere else. It is good to recognise these two different kinds, treat their aspirations accordingly and set up career paths that match. I have career testers, but they are the senior testers; and I have those who come in mainly as interns or temporary people. They

www.testmagazine.co.uk


come in to get industry experience in a whole broad range of areas – testing, development, product management – and to see the whole development cycle. They stay for a shorter period of time. "I structured my testing groups differently, into two levels," adds Wells. "Master testers love testing, they are good at it and they have domain knowledge. They can anticipate the customers' needs because they talk to customers regularly. They hold in their heads an amazing amount of knowledge about the technology under test that is not written down. They are career testers, and retaining them is key to the organisation's success. If they had a lot of repetitive work, the same work for every release, they would get tired of it and want to leave. My goal is to keep them concentrated on strategy, planning, talking with customers and organising, as well as hands-on testing. Staff-wise, I complement them with interns; they may or may not become master testers – they may end up being senior people in other roles. They are working in testing for a different reason – they are there to learn first-hand all the basics of the R&D process and to find out what good testing is all about. I have found interns bring fresh ideas and vitality to the testing group, and prevent the testing mindset getting stale."

Test community
Clare McLennan sees the sharing of ideas as helping to develop the test community. "Talking and sharing ideas with others is something that's important to me, so I attend a lot of local meetings: TPN (Tester Professional Network), APN (Agile Professional Network), and the Canterbury Software Cluster. I talk a lot about what I do (ATDD) as I am very passionate about it. I also write a blog on software quality, regularly add features to FitNesse, and have spoken at conferences, including the Australia New Zealand Testing Board Conference in Auckland in 2011. But I think most of my best ideas come when I collaborate with people in real life, so I am glad there is a strong IT community in Christchurch." Developing the test community is also on the minds of both Ian Ross and Ian Wells, with Wells saying, "The majority of my impact is directly with people that I work with, in particular the testers working for me, and also the people that I am working with, product and project managers. The point is to make sure ideas are promoted that reduce the amount of testing that we have to do, by building in quality early on in the product lifecycle."
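McLennan's ATDD practice, popularised by tools such as Fit and FitNesse, starts from concrete examples agreed with the business before the code is written. The following Python sketch is only an illustration of the idea, not FitNesse itself; the discount rule, function name and example table are all hypothetical:

```python
# Acceptance Test Driven Development (ATDD) in miniature: the team agrees
# business-readable examples first, and those examples then drive the code.
# The discount rule and function below are hypothetical illustrations.

def apply_discount(order_total):
    """Discounted total: 10% off orders of 100 or more (hypothetical rule)."""
    if order_total >= 100:
        return round(order_total * 0.9, 2)
    return order_total

# Examples agreed with the business, in the spirit of a Fit/FitNesse table:
# | order_total | expected_total |
acceptance_examples = [
    (50.00, 50.00),    # below threshold: no discount
    (100.00, 90.00),   # at threshold: 10% off
    (250.00, 225.00),  # above threshold: 10% off
]

def run_acceptance_tests():
    """Return the failing examples; an empty list means all pass."""
    return [(total, expected, apply_discount(total))
            for total, expected in acceptance_examples
            if apply_discount(total) != expected]
```

The point of the style is that the table of examples is readable by non-programmers, so the business can confirm the rules before any code exists.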


Ian Wells, test manager for the GIS data collection division of Trimble Navigation, and beekeeper: "I have a passion for testing high-tech systems, for improving how we test and, perhaps paradoxically, developing processes that can reduce the need for testing. I am motivated by the desire to make products that delight customers."

Immigrating testers
The New Zealand testing industry is made up of a diverse mix of native New Zealanders and immigrants. Lifestyle seems to be the primary reason test professionals are drawn to New Zealand. Ian Wells immigrated to New Zealand five and a half years ago. "For a long time we thought about immigrating to New Zealand, for the lifestyle. The time came right with our family and our children for us to make a big move, so we upped and moved and came here. Our whole family is glad we did." Extreme outdoor adventurer and tester Clare McLennan has worked with many testers and development teams. "There are definitely quite a few immigrants working here as testers, although you have to ask at what point do they become a Kiwi? I think people mainly come here for the lifestyle, so if they are from Britain maybe they

like the open spaces, being able to own their own house, being able to get into the mountains and go skiing, and having a short commute. Basically people come here for the lifestyle; if you want to get rich then you probably wouldn't head here. But there is much more to life than that!" Ian Ross echoes this idea on why immigrants are drawn to New Zealand. "Generally I think it is lifestyle. Dollar for dollar you can earn a lot more money overseas. That said, the lifestyle that you get in New Zealand for the dollars you earn stacks up very well. I can finish work and be climbing in the hills, or surfing at the beach, an hour later."

Influence of international experts
Agile software development is growing in popularity in New Zealand's software industry. Ian Ross says he is

February 2012 | TEST





influenced by "almost everything I read and everyone I talk to. It has been an evolving journey, which hopefully is continuing to evolve. Two of the biggest documents to affect me personally are the Agile Manifesto and the ISTQB Glossary. To me Agile makes so much sense, and the glossary provides a common set of terms for testers." Ian Wells reveals that "because I have seen testing take too long and I have seen bugs discovered after release, I look for better ways to test. Checklists are helpful and management loves the metrics they bring. I look for ways to make testing really reduce the number of problems found by customers and to help make products that delight. For software, I keep abreast of the latest trends in Agile and lean development methodologies, as these seem closest to understanding the nature and complexity of testing software. I network with testers within the Trimble worldwide community. Probably the biggest influence has been James Bach, a testing consultant and frequent presenter at testing conferences. He has trained my staff in exploratory testing approaches, which has led to a rework of, and demonstrable improvement in, our test strategies. For exploratory testing to be effective, it is crucial to train managers and the rest of the team in this approach."

The future of testing in New Zealand
Ian Ross sees New Zealand in the future as a place other countries will outsource to. "New Zealand has been a primary industry-based economy and our IT industry has largely been in a supportive role, but I think times are changing; the world is shrinking. I think that for New Zealand IT, outsourcing is an obvious market to focus on. New Zealand is not the cheapest place to outsource to, but we are not the most expensive by a long shot. We offer the ability to develop solutions to hard problems, rather than just providing many 'cheap' workers. We have a western business culture and natively speak English. Even the time difference is sometimes an advantage. I think there is a niche here that will be exploited in the years to come. I have already seen small examples; however, these have been due to pre-existing relationships as opposed to going out to get work from the international market." Ian Wells sees a future where location is immaterial. "In my experience, the best testing is done within a collocated team, where testers

have deep technical and domain knowledge, the collocated team follows a self-improving process such as Agile or lean, and testing has the support of upper management. Today, because of the interconnected and distributed nature of technological products, not all testers can be collocated. A clear trend is the improvement in communication and collaboration technologies. I expect the effectiveness of non-collocated testing teams to improve in the near future, which means that testers here in New Zealand will be more and more part of global teams. I would expect commonality in testing and development cultures to be a greater factor than physical location in future. And because New Zealand sits on the interface between Western and Asian test cultures, we should expect cross-pollination of testing ideas between Asia and the West." However, Clare McLennan sees the cultural problems of outsourcing testing activities. "I do not believe in splitting testing and development to different locations for projects which will be maintained over a long time period. It is much more effective if developers have a stake in making the system more testable; how the code is written makes a huge difference to how easily tests can be automated. If you are paying someone to do the testing, they can't easily ask you to write a code hook to allow them to test such and such. And even if they tried, the company-to-company cultural gap means you probably wouldn't understand what they wanted!" And finally, all three have emphasised that the testing profession is still very young, with Ian Wells offering an analogy about the education of testers. "Testers today are challenged by finding possible faults in complex systems. These systems will get increasingly complex. I fully expect that testers will require more training in future to test complex information systems. When I think about other professionals who diagnose and probe complex systems, I think of doctors.
Testers are doing the same kind of thing as a doctor; you have some symptoms of dysfunction in a very complicated system and you are trying to figure out what is causing that; it takes the same kind of intellectual power and experience, but doctors have many years' formal training in the human body (a very complex system). Let's make sure future master testers have similar educational depth as doctors do today."
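McLennan's remark about developers writing "a code hook to allow them to test such and such" can be made concrete. The sketch below is a hypothetical illustration, not from the article: the testability hook is an injectable clock, letting a tester drive the code to any time of day instead of waiting for one:

```python
import datetime

# Hard to test: the dependency on the current time is buried inside.
def is_business_hours_hardwired():
    return 9 <= datetime.datetime.now().hour < 17

# Testable: the "code hook" is an injectable clock, so a tester can
# drive the function to any time of day deterministically.
def is_business_hours(clock=datetime.datetime.now):
    return 9 <= clock().hour < 17

# A tester exercises both branches through the hook:
lunchtime = lambda: datetime.datetime(2012, 2, 1, 12, 30)
midnight = lambda: datetime.datetime(2012, 2, 1, 0, 5)
```

When testing is outsourced, the tester has no easy way to ask for such a seam; when developers share a stake in testability, hooks like this tend to exist from the start.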

Chris Saunders & Naomi Saunders, testRUN New Zealand Ltd, www.testrun.co.nz





Software testing as a growth market Steve Fice reports on a study by Pierre Audoin Consultants into perceptions of software testing and quality across Europe and discovers that with test optimisation on the rise, organisations are increasingly looking to managed testing services.

Companies around the world currently invest more than €50 billion annually in applications testing and quality assurance. Awareness of the commercial added value of flawless, fail-safe corporate applications is increasing, and as a result companies are actively seeking opportunities to improve both software and the organisation of testing while meeting budget constraints. This article is based on a PAC (Pierre Audoin Consultants) study sponsored by SQS Software Quality Systems, which surveyed 309 managers and IT decision-makers in companies across Europe and North America ranging from 1,000 to over 5,000 employees. The study concluded that 91 percent of the managers surveyed recognise software testing and quality assurance as two of the most important IT disciplines within their companies. It also explores what approaches to software testing and quality management lend themselves to


achieving the greatest possible added value; how expenditure and quality of testing can be harmonised; what standards companies expect from both testing activities and external testing service providers and how collaboration can be best organised.

Boom in optimisation of test activities
The majority of the companies surveyed are currently involved in optimising their testing activities. Approximately a third of companies were in the optimisation phase at the time of the survey, while a quarter of interviewees confirmed that optimisation would commence within the next twelve months, and a further eleven percent within the next two years. The motives prompting increasing numbers of companies to opt for test optimisation include enhancing product quality (61 percent), increasing quality and transparency within testing processes (58 percent) and advancing the objectivity of testing activities (35 percent).





Almost half the respondents expect the level of testing automation to increase as a result of these improvement measures but, interestingly, comparatively few respondents – around just a quarter – stated reducing costs as a reason for optimising testing activities, showcasing that companies see quality as more significant. The interviewees explained that benefits such as greater objectivity in the company’s testing activities can be best achieved in collaboration with a dedicated, independent testing service provider. Additionally, many feel that an increase in automated testing can be realised via long-term managed test services engagements.

Increasing acceptance of external testing support There is growing interest in collaboration with professional service providers for software testing and quality management. The integration of a professional service provider, either on a project or ongoing basis, is a way of achieving high quality through a more objective approach while also increasing efficiency and effectiveness in testing. Longer-term testing engagements in the form of managed testing services partnerships based on strictly-defined Service Level Agreements are increasingly important, experiencing double-digit annual growth and becoming one of the most dynamic segments within IT services. While two thirds of the companies surveyed involve external service providers in their testing activities, the intensity of the collaboration varies from occasional collaboration on projects through to permanent


cooperation in long-standing managed testing services agreements. Where longer-term engagements are involved, the service provider often also assumes responsibility for results and risks via robust Service Level Agreements. The goals of a long-term collaboration usually also include the optimisation and standardisation of test processes, increasing the level of testing automation and the use of consistent and proven methods, best practices and suitable testing tools.

Adding value
The benefits that companies associate with the testing services model differ depending on the level of external support. For companies that already use external support, a clear advantage is improved access to qualified testing specialists, as well as a reduction of internal IT expenditure. The majority (85 percent) confirm that collaboration with external service providers for software testing and quality management cuts costs, reduces the burden on internal IT teams, freeing them up for other projects, and increases standardisation of testing processes. Most expenditure is incurred for testing activities that are part of more comprehensive IT service agreements. As well as testing, these often incorporate the development, operation and maintenance of IT systems. In contrast, dedicated contracts for testing and quality management account for more than a third of the budget for external testing services, a proportion that is on the increase. Ninety-one percent of managers surveyed consider the independence

of the testing team and the product development team to be an important or very important success factor, indicating that collaboration with specialised partners in the field of software testing will rise as customers increasingly demand independent, objective tests of the application landscape. The dedicated award of contracts, particularly in the scope of longer-term managed test services agreements, is also increasing in significance as the market matures. Companies that separate software development and testing benefit from a clear allocation of roles, improved transparency of services and test quality, and the objectivity of testing performance guaranteed by the fact that the testing specialist is independent of the development team. As experts in their field, independent testers also bring special competencies to the table, such as involvement from an early stage in the development process, so contributing to the efficiency and effectiveness of overall testing performance.

Blended service delivery
Global service provision is a factor that impacts the quality, flexibility and cost of a testing engagement. The study shows that 'blended service delivery', i.e. the use of local resources in combination with near- and offshore testing centres, is highly important. 76 percent of companies have outsourced part or all of their testing activities to an external service provider and currently perform software tests in collaboration with near- and offshore facilities.




When asked about the ideal distribution of testing activities, the number of companies that desire a very high proportion of near- and offshore collaboration is currently relatively low: only 14 percent can imagine more than half of their testing activities being performed near- or offshore. None of the interviewees want to outsource all testing tasks to a near- or offshore testing centre, as they consider it important to continue to perform a certain proportion of testing activities internally. For the companies surveyed, the use of external testing centres serves a range of purposes. For more than half of the respondents, reducing costs (61 percent) and increasing the quality of service provision (54 percent) play the most important roles. 48 percent cited the increased availability and flexibility that can be achieved through global distribution of teams across different time zones. Around a quarter of those surveyed consider access to additional competences to be an important argument for the integration of near- and offshore resources into their testing activities.

Success factors

Steve Fice, UK Managing Director, Software Quality Systems, www.sqs.com


A main challenge raised in the study is identifying the ideal approach for optimising testing activities in collaboration with an external service provider. The top consideration factors for choosing an external software testing and quality assurance provider are:

Availability and flexibility: Speed and flexibility are at the forefront for companies in terms of their testing activities. The majority of companies surveyed – 60 percent – expect high to very high availability and flexibility from a testing team, including permanent, "around the clock" readiness (24/7).

Objective evaluation: More than 90 percent of respondents consider a longer-term, partnership-based collaboration in testing to be indispensable and view the agreement of clearly defined performance indicators, which should be standard in managed test services agreements, to be crucial for success. Two thirds regularly measure whether the testing team is achieving its agreed objectives by using selected key performance indicators (KPIs) and, of these, 76 percent monitor whether deadlines are met.

Results-orientated remuneration: 89 percent consider flexible billing based on results achieved, not simply days worked, to be important or very important.

Linguistic capabilities: Language provides an additional facet to communication, so multi-lingual capabilities are indispensable, above all in globally-operating companies and for international testing activities. The principal requirement is for contacts who can speak the language of the respective country and English.

With more companies looking to optimise testing, whether via on- or off-shore testing teams, increased accountability through robust SLAs or results-orientated remuneration, the competition to provide specialist testing services will no doubt become fiercer. In my opinion, this study demonstrates that testing providers with round-the-clock availability, a blended services offering and innovative output-based pricing will become increasingly critical to the success of any large software testing engagement. The findings from the study on the 'Growth market on software testing: market trends, service providers and success factors' can now be downloaded from http://go.sqs.com/market-research-2011.







Testing services – it's not just a cost thing
As outsourcing and offshoring develop they are adding more to the testing process than simply cost savings, as Raja Neravati explains.

With the impact of failure moving way beyond the boundaries of an organisation, in today's world companies need to verify and validate everything. Little wonder application testing services have shown tremendous growth for several years now. On one hand, this trend has led to more vendors adding testing services to their portfolios, the emergence of new test-centric vendors, and existing vendors extending their capabilities, staff and investments.

On the other, more and more enterprises (buyers) are trying to separate testing from development. A good number of these enterprises


entrust their application development work to development specialists and separate testing from development by handing over testing to independent testing services providers. The reasons attributed to this include that testing-only vendors hire people specifically for testing, have tools and processes that focus on testing, and have lower attrition rates. Unlike in the past, we are seeing many companies with a strong and robust testing strategy in place. They have separate budgets and RFPs when looking for a testing vendor. Application testing services will continue to grow and increase traction for enterprises. Cost savings is among the reasons for this growth, but it is no longer the most pervasive reason. That worked a couple of years ago. Today,




it is about the processes and tools an application testing service provider can bring to bear, and the level of assurance provided in achieving more predictable application functionality and performance. Understanding the key practices and tools that can reduce risk in projects and enable higher-quality software delivery is imperative. In this context the focus is on skills, and on applying standards and automation to ensure testing is a 'continuous process'. Many organisations have benefited, with significant improvements to their business strategies. These benefits include risk mitigation, validation of new products/services, the ability to support faster time-to-market with reduced test cycles, and improved real-time business performance and monitoring.

The value of automation
Many organisations today understand the value software test automation can provide to an enterprise. These benefits include reuse, better labour utilisation and cost savings. Having said that, there are companies that have not yet considered automation, or that believe they are doing a great job on software test automation by just automating test execution. Automating test execution is definitely the right first step, and by utilising an outside party to automate test scripts that were otherwise being run manually, a company will at least realise some of the immediate benefits of automation. But the truth is that there is more to test automation – benefits to be had by automating aspects of test data creation and management, test infrastructure management, test processes, and so on. Companies that focus all their energies on automating test execution do not get what they are looking for from test automation: optimum test coverage, minimum manual steps, fast "go/no go" decisions and excellent ROI.
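The distinction drawn above, between automating test execution alone and also automating test data creation, can be sketched briefly. The validation rule and the generated boundary cases below are hypothetical illustrations, not taken from any particular tool:

```python
# Hypothetical system under test: a username validation rule.
def validate_username(name):
    return (3 <= len(name) <= 12
            and name[0].isalpha()
            and all(c.isalnum() or c == "_" for c in name))

# Automating execution only: a fixed, hand-written list of cases.
manual_cases = [("alice", True), ("ab", False), ("1bad", False)]

# Automating test data creation too: boundary cases are generated
# programmatically from the rule's limits instead of typed by hand.
def generated_cases():
    yield ("a" * 3, True)     # minimum valid length
    yield ("a" * 12, True)    # maximum valid length
    yield ("a" * 2, False)    # one character too short
    yield ("a" * 13, False)   # one character too long
    yield ("_lead", False)    # must start with a letter
    yield ("ok_name1", True)  # underscore allowed after the first character

def run_all():
    """Execute every case; True means the rule matched all expectations."""
    cases = manual_cases + list(generated_cases())
    return all(validate_username(n) == expected for n, expected in cases)
```

Generating the data means new boundary cases appear automatically when the rule's limits change, which is exactly the coverage that hand-written suites tend to miss.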

Consulting
Testing consulting is another area that is definitely coming up. Clients are now saying: we don't know what kind of testing we need to do, so rather than going out, choosing a vendor and selecting what we want done, we are going to first get some consulting help, so that we have an idea about what is the most important kind of testing maybe we


should be doing. With limited funding and bandwidth, where should we concentrate our money and effort? Enterprises today understand that testing is required to verify and validate that a business objective is being realised through a project or development effort, throughout the life cycle, not just at the user acceptance phase. Today they want their service providers not just to provide them with solutions that meet their business objectives, but also with measures, metrics and executive dashboards. Metrics that minimise the delay in defect detection and resolution, and that focus on continuous improvement while providing a view of the 'total cost of quality', have become key. Typically, companies that keep testing in-house do not use testing professionals; they use people who are doing testing on top of their day job. They also do not have the tools and processes in place. Today, the most prevalent test outsourcing deals include staff augmentation and execution support for specific types of testing. But this is fast changing, with outsourced testing becoming a mature discipline and some companies having testing budgets of $100 million per year. They have begun to realise the need to test their own applications and third-party applications alike; in the past, people were misled to believe that software coming from a third party was completely tested, and that is false.
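One concrete metric of the kind described, offered here only as an illustration since the article names no specific formula, is defect removal efficiency: the share of all known defects that were caught before release rather than by customers:

```python
def defect_removal_efficiency(found_in_testing, found_in_production):
    """DRE = defects caught before release / total defects found."""
    total = found_in_testing + found_in_production
    if total == 0:
        return 1.0  # no defects found anywhere: nothing escaped
    return found_in_testing / total

# Example: 90 defects caught in test, 10 escaped to customers.
dre = defect_removal_efficiency(90, 10)  # 0.9, i.e. 90 percent caught pre-release
```

Tracked release over release, a metric like this gives the executive dashboard a single number for how much defect-finding work the testing process is actually doing.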

The growth of outsourcing
There is tremendous potential for growth in this market, as most companies today outsource less than 50 percent of their work. This means that these companies are leaving money on the table and are not achieving the efficiencies that many of their competitors are: reduced cycle time in testing, increased resource utilisation, reduced defects in UAT, improved return on investment and, as a given, cost savings. The way application testing is sold today is less about the money to be saved and more about what poor quality will cost enterprises without adequate testing; it is about improving quality across an organisation, and about how a testing service provider can translate that into a dollar value, rather than just better software. When defects get into production the costs can be massive. This is what most enterprises buy into.

Raja Neravati, SVP, AppLabs, www.applabs.com





Top performance
TEST magazine speaks to Olivier Hanoun of Neotys about testing services, mobile app testing, automation and other topical testing issues...

Since 2005, Neotys has been helping more than 1,000 customers in over 60 countries enhance the reliability, performance, and quality of their web and mobile applications. Its main product, NeoLoad, is a load and performance testing solution that is flexible and easy to use, with infinite scalability from the cloud and support for all web 2.0 technologies. With a growing customer base in the United States, NeoLoad is fast becoming a key web load and performance testing tool for many testers.

TEST: What are the origins of the company; how did it start and develop; how has it grown and how is it structured?

Olivier Hanoun: The founders were previously testers working for a web application development team. During a major project for the British government they recognised the need for a load and performance testing solution, but none of the available solutions met their requirements. The tools available at the time fell into one of two categories: they were either big, expensive and cumbersome, or cheap and so lightweight that they


lacked the features to meet the project requirements. Frustrated by the lack of adequate tools available, the application testers saw an opportunity. Shortly thereafter, Neotys was founded and the first easy yet powerful load and performance testing solution was born. The NeoLoad product has fuelled tremendous growth for Neotys since hitting the market in 2005, and with over 1,000 customers in 60 countries Neotys has been named to the Deloitte EMEA Fast 500 two years running.

TEST: Does the company have any specialisations within the software testing industry?

OH: Since 2005, Neotys has specialised in helping companies enhance the reliability, performance, and quality of their web and mobile applications. NeoLoad is an enterprise-class load and performance testing solution that enables teams to efficiently apply best practices for load testing with the cloud. It is integrated with multiple cloud platforms, and supports realistic, large-scale tests across multiple geographical zones with bandwidth simulation and parallelised requests. It complements cloud testing with full support for internal lab-based testing, and enables engineers to reuse test scripts across these domains.
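Load and performance testing of the kind discussed here can be illustrated generically. The sketch below is emphatically not NeoLoad code or its API; it is a minimal, in-process illustration of the core idea, concurrent virtual users generating load while per-request latencies are collected, with a stand-in function in place of real HTTP requests:

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

# Stand-in for the system under test; a real load test would issue
# HTTP requests against a deployed application instead.
def handle_request():
    time.sleep(0.001)  # simulate 1 ms of server-side work
    return 200

def run_load_test(virtual_users=20, requests_per_user=5):
    """Run concurrent virtual users and collect per-request latencies."""
    latencies = []  # list.append is atomic in CPython, safe across threads

    def one_user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            handle_request()
            latencies.append(time.perf_counter() - start)

    # Each virtual user runs in its own worker thread.
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(one_user) for _ in range(virtual_users)]
        for f in futures:
            f.result()  # propagate any worker exception

    return {
        "requests": len(latencies),
        "mean_ms": statistics.mean(latencies) * 1000,
        "max_ms": max(latencies) * 1000,
    }
```

Commercial tools layer ramp-up profiles, bandwidth simulation, distributed load generators and real-time monitoring on top of this basic skeleton.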

NeoLoad is a full-featured, secure, and easy-to-use solution that accelerates load testing with advanced real-time analysis, agentless monitoring, scheduling, scripting, and reporting capabilities. It also provides extensive support for a wide range of web technologies including Adobe Flex and AIR, Silverlight, RTMP, SAP, and Siebel among others. NeoLoad enables testing organisations to make the most of what the cloud makes possible, and helps them use the cloud to make substantive improvements in load testing. Neotys solutions are backed by a dedicated team of Neotys professional services and a broad network of certified services partners to ensure customer success.

TEST: What are the key market dynamics Neotys has observed?

OH: Macro-economic: competition is getting more intense, which means developers need to reduce time to market for application functionality. The increasing importance of information systems in all companies, employee/user productivity, and the high cost of failure or poor performance mean applications must perform better than ever before to maintain or gain a competitive advantage in the market place. Demand for more interactive and compelling user experiences



is driving the trend to adopt new application technologies (Flex, Silverlight, AJAX, etc.), while cost pressures push companies to exchange capital expense for operating expense. Application complexity is also increasing, driven by heterogeneous, multi-tiered, virtualised, distributed architectures and migration to the cloud. The challenge for testing departments is that the expertise to test new application functionality with these new technologies and architectures often lags behind the requirement to do it, which results in the increased trend to call on experts to assist with load and performance testing.

TEST: What do you recommend companies do with regards to outsourced testing?

OH: The key is for organisations to recognise the uncertainty of the times we live in and to embrace it. They must develop IT strategies that allow them to be flexible and nimble regardless of what the future brings. Practically, with regards to performance testing, there are a couple of key options to explore:

Outsourcing the testing services: This approach can help organisations address some of the market dynamics mentioned above and is something we help many of our customers with. In some cases we provide testing expertise directly and in other cases we leverage certified partners to provide these services.

Load generation from the cloud: With this approach, whether you have migrated your applications to the cloud or not, there are significant capital expense savings to be had by using cloud infrastructure to generate the load as opposed to in-house systems.

With either of these options, we believe one of the most important success factors is to use a load and performance testing solution that is a good fit for the organisation's capabilities.
Even if the decision is made to outsource, or to leverage services to set up an in-house testing capability, it is important for companies to find a solution that will meet their requirements for the future as well. We have found that organisations that outsource their testing often bring the capability in-house at a later date, making careful solution selection even more important. If the load and performance testing solution is easy to use and can grow as the IT organisation grows, both in scale and in sophistication, then they retain the option to bring the capability in-house. If the tools are cumbersome, with a steep learning curve, the options for the future are reduced. TEST: Can you give an example of companies you've seen who have taken this approach? OH: One of our key UK services partners, The Test People, has had tremendous success using NeoLoad on behalf of its customers, both because it is a modern tool designed to test today's complex applications more efficiently and because it is easy to use: it significantly reduces the time spent on non-value-added tasks, enabling users to concentrate on testing, analysing results and providing recommendations, thereby delivering more value to their customers' projects. It's also easier for their customers to use, so they can (and often do) opt to bring the load and performance testing function in-house once their in-house resources catch up. In one instance a large multinational financial services company, BNP Paribas, used the Neotys professional services team to help them reach a "go/no go" decision on a critical million-dollar project. Satisfied not only with the services from the consultants, but also with the NeoLoad product itself, they decided to bring the capability in-house. TEST: You mentioned earlier that the cloud is another way for IT organisations to reduce costs. Given all the hype around cloud, can you elaborate on that? OH: Yes. Load and performance testing from the cloud can save time and also reduce capital costs with a 'pay as you go' approach. But again, the key is to select the right solution that will help you take advantage of some of the benefits promised by the cloud. When considering a cloud testing solution, in addition to evaluating the load and performance testing capabilities themselves, it is also important to consider to what extent the solution is integrated with the cloud.
There are many solutions on the market that allow you to test from the cloud provided you handle setting up an account with each cloud provider, do the provisioning and so on, but there are a few (including NeoLoad) that have a fully integrated solution that makes the underlying cloud provider transparent to the user. Another key consideration for a cloud testing solution is whether or not it can support unified tests inside and outside the firewall. A solution with this capability allows you to reuse scripts from lab testing for your cloud testing and facilitates the comparison of results to identify performance regressions as the application load scales. TEST: Who are the company's main customers today and in the future? OH: The list includes companies like Toyota, Xerox, Unisys, Siemens, Motorola, Ericsson, Shutterfly, Axa, ArcelorMittal, Hilton International, Terremark, ING, Direct Energy, Merrill Lynch and BNP Paribas. We expect many more companies to begin using NeoLoad because of the product direction, which features enhanced capabilities for collaboration, mobile, and new and legacy protocols, among others. We're excited to see that many of our current customers are able to significantly elevate their game by using NeoLoad. The value they deliver to their respective organisations is impressive and we're just happy to be able to help them get to that 'next level' of load and performance testing.

Olivier Hanoun Sr. Performance Engineer Neotys www.neotys.com

Olivier Hanoun is a senior performance engineer at Neotys. Previously, he worked as a technical marketing manager at a Taiwanese semiconductor company. Prior to that, Olivier was an engineer at STMicroelectronics. Having been at Neotys since the early development of the company five years ago, Olivier has been involved in load and performance testing projects as well as delivering professional services and training. He graduated from the Ecole Centrale de Marseille with an MS in Computer Science, and from the French Air Force Academy with an MS in Mechanics and Aeronautics.

February 2012 | TEST


20 | Test management

Top ten test management tips
With over a decade of experience in test and QA management, Nadia McKay offers her top ten test management tips...

TEST | February 2012


Having been in a test or QA management role for more than a decade, in various types of organisations and on projects large and small, I can definitely say the challenges that I have faced do not differ greatly, and meeting these demands may often seem like the labours of Hercules. There is no magical solution, unfortunately (I tried pixie dust and I'm still waiting for my refund). However, my advice below will hopefully help test managers and test leads meet some of their challenges, or at the very least help to keep them sane.

1. One size does not fit all
Time and time again test personnel are asked to produce a test plan for their project and the 'Blue Peter' approach is taken: 'here's one I prepared earlier'. It does not seem to matter that the test plan was written in 1988 for a maintenance project using Waterfall methodology and the project under test now is Agile for a brand new system! Look at what can work from previous plans and modify it to suit this project. For example, the fluidity of requirements during Agile development will require more emphasis on exploratory test techniques, which is about adapting your tests throughout the various iterations. In a traditional software development cycle, where the requirement is fully known at the outset (allegedly), it is more applicable to have a pre-prepared test pack.

2. Juggling is for the circus
It is true that a test manager needs to manage many aspects of the test process, and some may see it as their job to keep all the balls in the air at the same time. However, this need not be the case; a good test manager should recruit a skilled, trusted team to spread the load and allow the test manager to concentrate on the key priorities.

3. Lies, damned lies, and statistics
Don't baffle your audience with meaningless numbers, screeds of text and numerous complicated graphs. Produce metrics that are at the right level for your intended recipient. A project sponsor only wants to know 'are we on track?' whereas the project manager also needs detail of issues and what the blockers are. Your reports should show at a glance the state of testing and allow you to continually review and alter your approach where required.

4. One team, one goal
A successful test manager needs to build and maintain strong relationships not only within testing but with other IT departments and the business stakeholders. Ultimately, getting the deliverable under test 'live' and the business benefits realised is in everyone's interest. Get to know the 'go to' people that make things happen. The power people are not always obvious. You never know: when you desperately need a new test server, the wee bloke you bought a pint for last week may turn out to be the guy in charge of building it.

5. Test management is not for the faint-hearted
Yes, we know you've made friends (see above). However, there will be times when you will be challenged. Be prepared to defend your estimates (which others will read as actuals, set in stone), your decisions on testing scope and why the fixed deadline is impossible within the previously agreed scope. Strong soft skills are required as well as objective arguments if you are


proposing time, cost or scope changes to deliver the necessary solution.

6. Be a pragmatist, but don't cut corners
Yes, we must follow a process and yes, we are the gatekeepers of quality, but don't be sidetracked by checking off a long list of 'must haves' that are not always necessary. What may have been important at the outset may now be a lesser priority. Continually re-evaluate where you are and assess which of your quality criteria are still valid and of most importance. Do not be frightened to change the test approach: this may be necessary as the risks and priorities change.

7. Change is good
As your business develops and processes evolve, make sure you are evolving your test practices: review your regression sets, re-assess your team's skill set; do you need to consider new test techniques, new tools, training? This quote from Alfred Edward Perlman says it all: "After you've done a thing the same way for two years, look it over carefully. After five years, look at it with suspicion. And after ten years, throw it away and start all over."

8. Change is bad?

Nadia McKay Edge Testing www.edgetesting.co.uk


Well, for the test manager's sanity it can be. Imagine this: you are near completion of your test phase and somehow it looks as though you are going to meet the deadline. The phone rings; it's the project manager with a critical late change. "It's only a one-line code change," he says. Sound familiar? The important thing here is to remember what we have previously said – review the impact on testing and ultimately the deadline. Inform the PM of the risks, consider what else can be done, how critical the issue is, and whether it is a knee-jerk reaction. Stay calm and manage the change; make sure all the stakeholders are aware of the change and the impact on them.

9. The early bird catches the worm
What do I mean by this? We all know that the single biggest issue for testers is 'specification and requirements': requirements and specifications that are lacking, incomplete, ambiguous or constantly changing, to name but a few problems. It's a broken record, but you wouldn't build a house without having plans for it first, would you? Building plans and designs are scrutinised by planning professionals, builders and architects, and must meet certain standards before anyone goes near a trowel and cement mixer. We should learn from this and start our testing early by inspecting our requirements against criteria such as whether the requirement is unambiguous and consistent. Like any other test technique, this is a skill that can be learned and, like the house-building project, get the professionals to identify and assess the standards that need to be met to ensure the final product is of high quality.

10. There's always the coffee shop
Quite regularly I have heard test managers at the end of their tether (well, to be honest, this one is not unique to TMs!) say "that's it, I'm done, I'm off to work in the sandwich shop/coffee shop/garden centre" (delete as appropriate) because these professions seem less stressful to them… really? Take a closer look. These industries are no different; there are still deadlines to be met ("I want my coffee and I want it now, with sprinkles on the top!") and challenges in meeting them ("the coffee machine's broken and there is a queue out the door"). Try and remember why you chose to work in IT and why you continue to do so. Think about when your project successfully delivered, how much value you and your team added to this success, how you felt and the people you inspire. Remember what you have learned and continue to develop. Good luck!



Industry-leading Cloud, CEP, SOA and BPM test automation
Putting you and the testing team in control
Since 1996, Green Hat, an IBM company, has been helping organisations around the world test smarter. Our industry-leading solutions help you overcome the unique challenges of testing complex integrated systems such as multiple dependencies, the absence of a GUI, or systems unavailable for testing. Discover issues earlier, deploy in confidence quicker, turn recordings into regression tests in under five minutes and avoid relying on development teams coding to keep your testing cycles on track.

GH Tester ensures integrated systems go into production faster:
• Easy end-to-end continuous integration testing
• Single suite for functional and performance needs
• Development cycles and costs down by 50%

GH VIE (Virtual Integration Environment) delivers advanced virtualized applications without coding:
• Personal testing environments easily created
• Subset databases for quick testing and compliance
• Quickly and easily extensible for non-standard systems

Every testing team has its own unique challenges. Visit www.greenhat.com to find out how we can help you and arrange a demonstration tailored to your particular requirements. Discover why our customers say, “It feels like GH Tester was written specifically for us”.

Support for 70+ systems including: Web Services • TIBCO • webMethods • SAP • Oracle • IBM • EDI • HL7 • JMS • SOAP • SWIFT • FIX • XML


24 | Web Testing

Testing the message layer Andrew Thompson offers his top tips for testing web service enabled applications in an article designed to give you an idea of what sort of tests to consider.

So you've been given the task of testing your new web service enabled application, but where do you start? If you have never tested this type of application before it can be a daunting task. Gone are the friendly client interfaces that you are used to; now you are expected to understand technologies such as XML, SOA, JMS, ESB, MQ and other such acronyms. This article is designed to give you an idea of what sort of tests to consider. Not all may be relevant to your application, and I cannot describe them all fully here, but hopefully you will get a flavour and use this as a starting point in your endeavours.

Check the WSDL is correct
Web services are commonly defined within either a WSDL or an XSD file. These files define the format of the requests and responses. It is critical, therefore, that tests are performed on these files to ensure they are correct and conform to company standards. The following types of tests are suggested.
Schema tests: WSDL and XSD files will have schema definitions that the XML must conform to. Failure to comply with these definitions could mean WS calls have unpredictable results.
Semantic tests: It is common practice to 'include' other files within a WSDL or XSD file. These may define operations common to other applications. We need to test that these 'import' statements can find the required files.
Regression: Has the file defining the service been amended since running the last set of tests? If so, what has changed, and does it affect your functional tests?
WS-I interoperability: The Web Services Interoperability Organisation (WS-I) is an open industry organisation chartered to establish best practices for web services interoperability. The definition of your web services may need to be checked against these standards, as well as the messaging for all functional tests. Find out if your company policy is to comply with WS-I Basic Profile 1.1, 1.2, or 2.0.
Internal policies: Has your company laid down internal governance policies with which the messaging must comply? An example of this might be a standard naming convention for the fields of a house address.
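A lightweight automated check along these lines can be sketched in a few lines of Python. Full XSD validation needs a schema-aware library, so this sketch settles for well-formedness plus a required-element check; the sample message and element paths are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical response message for a book-store service; in practice you
# would load the real payload and also run it through a full XSD validator.
RESPONSE = """
<searchResponse>
  <book>
    <id>2</id>
    <title>Java How to Program (4th Edition)</title>
  </book>
</searchResponse>
"""

REQUIRED_PATHS = ["book", "book/id", "book/title"]

def check_message(xml_text, required_paths):
    """Return a list of problems: malformed XML or missing elements."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return [f"not well-formed: {exc}"]
    return [f"missing element: {path}"
            for path in required_paths
            if root.find(path) is None]

print(check_message(RESPONSE, REQUIRED_PATHS))             # []
print(check_message("<searchResponse/>", REQUIRED_PATHS))  # three 'missing element' problems
```

A real regression suite would re-run checks like this whenever the WSDL or XSD changes, flagging removed or renamed elements before any functional tests break.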

Functional unit tests
A web service will comprise one or more operations. For example, a simple service for a book store may define search, order, payment and cancel operations. Each of these operations should initially be tested in isolation, before integrating them into scenario-based end-to-end tests. Each operation should have the following tests:
• Each field accepts 'good' data – your 'go right' path.
• Check how each field handles negative data. This may be an incorrect field length or incorrect format, eg String instead of Integer data, etc.
• WS-I interoperability tests for the 'go right' path.
• The response message must comply with the WSDL/XSD definitions.
• And of course, did you get the response you were expecting?
• Run each test with a range of data to ensure good coverage.
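The good-data and negative-data checks above lend themselves to a data-driven loop. In this sketch the validation rule and the case table are invented stand-ins for the real operation under test:

```python
# Data-driven field tests: each row is (candidate value, should it be accepted?).
# The rule here is hypothetical - a 'quantity' field that must be a positive integer.
def validate_quantity(raw):
    """Stand-in for the system under test's handling of one field."""
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return False
    return value > 0

FIELD_CASES = [
    ("1", True),      # go-right path
    ("999", True),    # large but valid value
    ("0", False),     # zero is not orderable
    ("-3", False),    # negative quantity
    ("two", False),   # String instead of Integer
    ("", False),      # empty field
]

failures = [(raw, expected) for raw, expected in FIELD_CASES
            if validate_quantity(raw) != expected]
print(failures)  # [] - every case behaved as expected
```

Extending coverage then becomes a matter of adding rows to the table rather than writing new test code.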

Scenario testing
Now each operation has been tested in isolation we need to ensure they all work together. Taking our bookstore example, the customer may search for a book, retrieve the ID number of the book they are after, and use this to place an order. They then require the order number to make a payment. Now of course if we are doing this manually then we just need to make a note of the values returned in the 'id' and 'order' number fields, and use these as input into the following request. For example, this is part of the response to our search for a book:
<i xsi:type="n3:Book"> <id xsi:type="xsd:int">2</id> <title xsi:type="xsd:string">Java How to Program (4th Edition)</title> <quantity_in_stock
And here is part of our next request message placing an order:
<SOAP-ENV:Body> <placeOrder xmlns="http://www.parasoft.com/wsdl/store-01/"> <itemId>2</itemId> <quantity>1</quantity> </placeOrder>
Of course this is a very simple example, and the complexity of some services only lends weight to the argument for automating this transfer of data between tests.
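Automating that transfer of data might look something like the sketch below, which extracts the 'id' from the search response and splices it into the order request. The namespaces from the SOAP example are omitted for brevity, and the helper names are invented:

```python
import xml.etree.ElementTree as ET

# Simplified search response (SOAP envelope and namespaces omitted).
SEARCH_RESPONSE = "<result><id>2</id><title>Java How to Program</title></result>"

ORDER_TEMPLATE = "<placeOrder><itemId>{item_id}</itemId><quantity>1</quantity></placeOrder>"

def extract_item_id(response_xml):
    """Pull the book id out of the search response."""
    return ET.fromstring(response_xml).findtext("id")

def build_order_request(response_xml):
    """Chain the extracted id into the next request message."""
    return ORDER_TEMPLATE.format(item_id=extract_item_id(response_xml))

print(build_order_request(SEARCH_RESPONSE))
# <placeOrder><itemId>2</itemId><quantity>1</quantity></placeOrder>
```

A message-layer test tool does essentially this for you: capture a value from one response, feed it into the next request, and repeat across the whole scenario.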

Negative testing

Andrew Thompson Managing director Parasoft UK www.parasoft.com

Are you testing the GUI, or the back-end application (BEA)? This is an interesting point to make, as you cannot test both properly at the same time. Negative testing is one very important area where using the browser-based GUI to test the BEA cannot work, as you will be testing that the GUI does not allow incorrect field data rather than how the application handles incorrect field data. We have to bear in mind that at some point in time the web service may be used by multiple different input methods. So whilst our browser-based GUI is the only interface at this moment in time, this could change to include mobile phones, thick clients, other applications etc. Therefore we need to test what will happen if we send an incorrect date format or perhaps a String in an Integer field. The GUI is (or should be) designed to constrain the user, and therefore will constrain the tester, and this is the primary reason to use a test tool rather than the GUI for message-layer testing. The next user interface may not be as strict in what it accepts as this one! Another type of negative test is where we deliberately send a request that will cause a failure message to be returned. Obviously we want to check that the 'go wrong' path works as expected as well!

Security testing
This is not some dark art that has to be left to the specialists. Any response to a test that you make, especially negative tests, should be checked to see if it contains any information you are not meant to receive. Examples of what to look out for include:
• Application exception messages;
• 'Stack traces', ie information related to lines of code that had an error;
• Information related to a database query;
• Error messages that specifically identify which component – userid or password – was incorrect when you tried to log on.
For systems that require a user to log on, can you perform tests without having done so? Do consecutive messages require a valid session ID? What happens if you do not provide this? Are any elements of the message encrypted? Provide negative tests that prove that the encryption process is working, ie invalid security keys.

How does the system handle invalid security keys – do you get a clean error message, or an application crash? Basic penetration tests should be applied to every field in every message sent. How does the application respond to SQL injection attacks that attempt to bypass log-on steps? Does an XML bomb crash the application? This is not something I would suggest doing manually; there are tools available to run these tests for you. Again, early detection of these issues is vital to delivering your project on time and to budget. The security consultants should only be verifying that the application passes all their tests – they should not be finding anything! If they do find something, create a test that checks for this failure in the future.
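A simple automated leak check along these lines can be sketched as follows; the indicator patterns and sample responses are illustrative only, and a real list would be tuned to your own technology stack:

```python
import re

# Indicator patterns for information leakage in responses; extend to suit
# your stack (these examples are illustrative, not exhaustive).
LEAK_PATTERNS = [
    r"Exception",                 # application exception names
    r"at [\w.$]+\([\w.]+:\d+\)",  # Java-style stack-trace lines
    r"SQL syntax",                # database error text
    r"ORA-\d{5}",                 # Oracle error codes
]

def find_leaks(response_body):
    """Return the patterns that matched - anything here is a finding."""
    return [p for p in LEAK_PATTERNS if re.search(p, response_body)]

clean = "<fault><message>Order could not be processed</message></fault>"
leaky = ("java.lang.NullPointerException\n"
         "    at com.shop.OrderService.place(OrderService.java:42)")

print(find_leaks(clean))  # []
print(find_leaks(leaky))  # two findings: exception name and stack-trace line
```

Running a check like this over every response, including the deliberately bad requests, turns the 'did we leak anything?' question into a routine automated assertion.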

Performance testing
For the purpose of this article I am going to suggest a two-stage performance testing approach, the first step being to create 'low volume' load tests that act as a warning flag. What defines 'low volume' will depend on your project, but I would suggest something in the region of 100 virtual users for a business application. These tests are there to highlight areas of immediate concern, and can be run by the ordinary test team as opposed to the performance specialists. The second phase is in the realms of the specialist, and is outside the scope of this article. Performance tests can be set up at unit level just the same as a functional test can. You do not need to wait until all the pieces are in place to run a performance test end-to-end. Once you have a working functional test, create a load test for it. Record the metrics for this test so you can see whether further development degrades the performance of this function. Forewarned is forearmed, as they say: if you can show a degradation of this service over time, the development team will more easily be able to tie that degradation to the work they have done and judge whether it was acceptable or not.
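A minimal 'low volume' warning-flag load driver might be sketched like this. The stubbed call_service function stands in for a real web-service request, and the user counts are arbitrary; a real run would use your load tool and the ~100-virtual-user figure suggested above:

```python
import concurrent.futures
import statistics
import time

def call_service():
    """Stub for one web-service request; replace with a real call."""
    start = time.perf_counter()
    time.sleep(0.001)  # simulated service latency
    return time.perf_counter() - start

def low_volume_load(virtual_users=20, requests_per_user=5):
    """Fire concurrent requests and collect per-request latencies."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(call_service)
                   for _ in range(virtual_users * requests_per_user)]
        return [f.result() for f in futures]

latencies = low_volume_load()
# Record these baseline metrics so later runs can be compared against them.
print(f"requests: {len(latencies)}")
print(f"median latency: {statistics.median(latencies) * 1000:.1f} ms")
print(f"worst latency:  {max(latencies) * 1000:.1f} ms")
```

Storing the median and worst-case figures per build is what lets you show the degradation-over-time trend the paragraph above describes.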



26 | Test automation

How to test mobile apps instead of being tested by them Mobile testing is hard. With hundreds of variables to consider, and with a need to come to market quickly, it’s no wonder many organisations struggle. But mobile testing doesn’t have to make you testy. George Mackintosh examines the app market and reveals how automated graphical user interface (GUI) testing tools can make life considerably easier.




If there's one thing that defines the modern age, it's choice. This is particularly true when it comes to mobile technology. Consumers have access to different manufacturers, platforms, operating systems, colours, capacities – options everywhere. And for customers, it's fantastic. For quality assurance teams, it's an enormous headache. Demand for mobile applications has exploded in the last few years, but all this choice means that software testers have to consider hundreds of variables, from compatibility across different platforms to how an app will look on different screen resolutions. A quick look at two of the larger marketplaces and the iOS and Android applications available shows that some companies are better at it than others. There's a fascinating inconsistency in how developers manage mobile quality assurance across industries and platforms. The overall quality of applications is higher on iOS – Apple is rather draconian about what it'll allow in the App Store, and if the software isn't fit for purpose, it's likely to be pulled very quickly. Even so, many will say that iOS still has its fair share of buggy or poorly performing apps floating around. Android is a less restrictive platform and it's this openness that makes it hugely attractive for many developers. But with freedom comes less quality control, which means more of the bad apps – the ones with barely any testing – get through.

Being the next Angry Birds
Of all industries, it's games developers who seem to be putting the most effort into mobile testing. It's understandable – everyone wants to be the next Angry Birds, and that's not going to happen if their game has more bugs, or crashes more often, than your average NASCAR driver. Financial applications also benefit from extensive testing. Banks can't afford to allow the security of their mobile banking and online banking applications to be compromised, so thorough testing processes are essential. If there's anyone that's dragging their feet when it comes to mobile applications, it's social media. Facebook took its time to get a stable mobile application, and Twitter was also relatively late to the party. I've used many other social media applications that have been laggy or unreliable. It's possible these apps didn't have a particularly high budget for quality assurance, but as better software starts coming out, the teams that have skimped on testing are going to get a pretty brutal wake-up call, as users abandon them in droves.

Speed to market
In some ways, a lack of consistency across mobile applications is unsurprising. Mobile testing is hard, constantly throwing up new challenges that quality assurance teams are forced to adapt to. A big one is that mobile apps often need to be brought to market very quickly – much faster than is really possible using traditional test tools. Regression testing is also a big issue for mobile app quality assurance teams. Updates roll around pretty quickly on phones and tablets, so testers need to be able to get their heads round running and scheduling regression tests very quickly. All this means mobile testing is complicated. Even if apps are developed for a single platform,


developers need to ensure the app works across different versions of the device, and earlier versions of the operating system at the very least. And if the app is designed for multiple platforms… well, you have the potential for the kind of horror story that keeps testers awake at night.

The real thing or emulation?
When it comes to mobile testing, it's important to test on the device itself. Many organisations struggle because they have to do this manually. Sitting for hours pounding away at a phone or a tablet is nobody's idea of a good time. Even if it was, it's horrifyingly inefficient and expensive. Some developers find creative alternatives – using an emulator, for example – but it's not an ideal solution because it's unlikely to be as accurate as testing on the device itself. So mobile testing is challenging. But here's the thing – it doesn't have to be. Many of the problems that quality assurance teams encounter when testing these applications come from the use of standard methodologies and traditional testing tools. But mobile technology is still relatively new, and it requires its own approach. Test automation is a good way to greatly simplify mobile testing. Graphical user interface (GUI) testing tools in particular can be invaluable. These work by comparing images at pixel level to identify bugs and other issues. It's a quick, easy and extremely reliable way to manage mobile testing – if the application throws up a different image to the one expected, you know there's a problem. These tools automatically collect and collate results in a single place, so identifying the cause of the bug is remarkably straightforward. A GUI testing tool is particularly useful because it lets teams automate tests on the device itself. People aren't forced to spend hours manually prodding a phone or tablet – they set a single test script and run it across multiple platforms and devices simultaneously. That means testing times are dramatically reduced – in some cases, from weeks to hours – so apps can come to market extremely quickly. It also eliminates the need to bring in extra contractors to manage manual testing. Manual testing traditionally eats up a hugely significant portion of testing budgets, so eliminating the need for extra manpower is extremely useful, particularly with so many businesses tightening their belts.
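The pixel-level comparison such tools rely on can be illustrated in a few lines. This is a generic sketch of the idea, not any particular vendor's algorithm, with screenshots represented as flat lists of RGB tuples:

```python
def pixel_diff(expected, actual, tolerance=0.0):
    """Compare two equal-sized images (flat lists of RGB tuples).

    Returns (fraction of mismatched pixels, pass/fail); a GUI test
    would fail when the fraction exceeds the allowed tolerance.
    """
    if len(expected) != len(actual):
        raise ValueError("screenshots have different sizes")
    mismatched = sum(1 for e, a in zip(expected, actual) if e != a)
    fraction = mismatched / len(expected)
    return fraction, fraction <= tolerance

# Two 2x2 'screenshots': identical except for one pixel.
baseline = [(255, 255, 255), (0, 0, 0), (255, 0, 0), (0, 255, 0)]
current  = [(255, 255, 255), (0, 0, 0), (255, 0, 0), (0, 254, 0)]

print(pixel_diff(baseline, baseline))  # (0.0, True)
print(pixel_diff(baseline, current))   # (0.25, False) - one pixel in four differs
```

A non-zero tolerance is what lets real tools absorb harmless rendering differences (anti-aliasing, clock digits) while still catching genuine layout bugs.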

Salvation with automation
Automating testing with a GUI-driven test tool also solves the problem of regression testing. With the ability to schedule and run tests with minimal input, development teams are able to respond to new updates extremely quickly. In fact, system updates simply cease to be an issue for testers. Ultimately, testers can stop being challenged by all the choice offered to users if they realise they have a choice themselves. They can choose to test in a more efficient and intelligent way with automated testing tools, for example. They can choose easier analysis and bug reporting. And they can choose to release apps that are so robust, so user-friendly, that choosing to use something else won't even enter a customer's head.

George Mackintosh CEO TestPlant www.testplant.com



Join the Revolution
Don't let your legacy application quality systems hamper your business agility
At Original Software, we have listened to market frustrations and want you to share in our visionary approach for managing the quality of your applications. We understand that the need to respond faster to changing business requirements means you have to adapt the way you work when you're delivering business-critical applications. Our solution suite aids business agility and provides an integrated approach to solving your software delivery process and management challenges.

Find out why leading companies are switching to Original Software by visiting: www.origsoft.com/business_agility


30 | Test automation

Test automation can make you happy With a long history in software testing, Theofanis Vassiliou-Gioles describes how use of the correct test automation tools can make you happy.

Y

ou must know the adage that as system complexity increases special attention must be paid to the testing approach used. Furthermore, you are certainly aware of the possibility to reduce the risk of product failure by introducing test automation into the software development process. I am not talking about the classical unit testing approach here, I am referring to professional testing being part of the whole product or service development process. I am sure you have heard lots of stories about the ‘dark side’ of test automation. Read some examples and add your worst nightmare to this list of ‘My test automation project has failed because… … I have not had enough people to develop and/or maintain my test system.’ … the costs were exploding when developing my test system.’ … I could not justify the maintenance costs of my test environment to my management.’ There are many other reasons for failure, but there are also lots of success stories about test automation. Take this one for example: A small team of a Belgian operator for the radio


communication network managed to automate its testing needs in a relatively short time, right on schedule and on budget. How can this be? Why are certain projects successful? How do successful projects differ from failed ones? A lot of academic papers have been written about this topic, and lots of studies have been carried out. But still, the first step towards success and happiness in your project is to consider some practical views.

Typical test automation

A typical test automation project almost never starts on a green field. Either there are already tools available that have reached the end of their life cycle, or there is a collection of different test tools that have been developed in-house. The most common situation, however, is to have both types of legacy system, as test engineers tend to create their own tooling landscape when they are unsatisfied with the performance of the existing test system, if there is one. On the one hand, I am happy to come upon this type of project, since it shows that the team and management already value the benefit of a test automation solution, if it works. But on the other hand, it complicates my task because

every new approach is strictly compared with the existing solution. Specific issues like the requirement to keep existing functionality, or to easily add new functionality, are of particular importance, but the question of how secure the proposed new investment is will also be key. To be honest, managing these requirements alone is already a challenge in itself. What makes it even more ambitious, though, are the people involved. As a matter of fact, I have identified at least two types: the software developer, who has excellent skills in software design and implementation, and the test engineer, who has excellent skills in test design and test analysis.

Tool landscape

Going into this in detail, you will see that if the project was driven by software people you will find a sprawling tool landscape which nevertheless lacks support for 'real' testing challenges. Obviously these types of tools have been enhanced bit by bit with required features over the years. Since the former activities were driven by software developers, some (from a test perspective) basic features have been overlooked, making it one day virtually impossible to maintain or enhance the existing test tools any




further. You remember? One of the nightmares!

If the former project was driven by test engineers, the situation is comparably awkward. In respect of testing issues the tools come up pretty well, but there are now shortcomings from a software design perspective, as the tools have been 'designed' by test engineers. Senior software architects and developers are very, very seldom found in internal test automation software projects, especially in software companies! Why? Simply because they work on the 'real' products or services that keep the company alive. Is this not a nightmare for a tester?

Another typical approach to overcoming such problems is to purchase or license ready-to-use test tools. The valid assumption is that commercial or even open source tools are designed by professional software engineers, whose absence, as you know, is one of our internal shortcomings. A test automation software company spends a lot of engineering money on developing the optimal testing tool for a specific domain. Being strictly focused on a particular domain or technology is the greatest strength of every commercial tool, and unfortunately its weakness too. As efficient as the solution may be in the focused domain, it will be poor if applied outside the intended area. Is this not again a nightmare?

The solution

The only way out of this situation is to find a test technology that is well designed by testers for testers, with a solid test system architecture designed by software developers for software developers. And if you could add some commercial tool and service support, backed by a strong community in your particular domain, that would get you a long way down the path to personal happiness. At this point, let us get back to the above-mentioned operator's case study to find out how they managed to become happy with test automation.

Theofanis Vassiliou-Gioles
CEO
Testing Technologies
www.testingtech.com

In practice

The QA process for a new service contains nearly all of the same components as are needed when


introducing test automation at an operator. First of all, it involves a new service, which implies new technologies beyond the usual. Secondly, there are limited engineering resources, which suggest the need for an off-the-shelf solution, but in a new scenario. And finally, introducing a new service typically also means tight schedules, as the motivation for a new service is to have happy old and new customers.

While scouring the existing commercial tool market, the Belgian operator took notice of Testing Technologies' TTworkbench, which had already been applied in their particular domain. Its unique features, especially how test progress is visualised and reported, convinced them to give TTworkbench a try. But unfortunately no off-the-shelf tool support was available for their particular problem within TTworkbench. This could have been the end of a promising business relationship, but not in this case. TTworkbench offers extension capabilities via open and standardised APIs, enabling the implementation of additional functionalities. Testing Technologies provided all the missing functionalities as an off-the-shelf solution just in time, so the operator's testers could immediately start to create their specific test environment. Implementing the same functionality themselves would have been an alternative option.

As a result, the Belgian service provider could deploy a test environment that was highly customised yet still off-the-shelf, and all their investments in building this particular test infrastructure can be reused in the future too, as the integrated TTworkbench is based on an internationally standardised test technology called TTCN-3. This technology was created by testers for testers in an international, technology-independent environment, and implemented by software developers for software developers.

You see, test automation can be easy. It helps you to assure the quality of your products and services while saving you time and money, and thus it makes you feel happy.

Spare yourself further nightmares with support from the experts in test automation to be found at www.testingtech.com or at www.ttcn-3.org.


February 2012 | TEST


32 | Test data

The complexities of data-driven testing

Testing voice biometric authentication solutions: Ashley Parsons explains the benefits of voice biometrics in strong authentication solutions and the complexities behind data-driven testing.

The human voice is a key means by which people identify other people already known to them. As such, the ease with which speech is used by most people, and the strength of speech as a biometric, make it an exciting technology which is becoming more mainstream in its application and usage within security and authentication solutions. This is becoming crucial as the need to secure the


mobile world grows at an increasingly rapid rate. While a significant amount of research has already been conducted into the use of voice biometrics for a wide spectrum of security possibilities, speaker verification can be said to have aligned itself with the future of mobile banking and cloud solutions.

Multi-factor authentication

The advances of mobile banking and other cloud-based services, coupled with fraudulent activity




and user perception, are providing evidence that current two-factor authentication techniques are proving less successful at protecting us than they have been for many years in, for instance, authenticating internet banking transactions. The addition of speaker verification, which represents something you are (a characteristic unique to you), to a combination of other factors, something you know (an OTP/PIN), something you have (a mobile) and somewhere you are (as location and jurisdiction are crucial when providing point-of-sale solutions), can help to provide three- or four-factor authentication solutions to assist in securing the mobile banking world.

It is due to the advances being made by research into voice biometrics, the evolution of server hardware and the progression to the increasing use of smartphones and mobile banking that speaker verification has moved from being a niche market to a more mainstream biometric for authentication.

Testing considerations

As with all solutions, the greater the amount of testing and coverage completed, the higher the level of confidence that can be achieved for the specific solution. Biometric solutions are no different, but have many more complexities than a standard software solution. In order to test a voice biometric solution we have to consider a variety of factors: for example, the demographic of the end user, the importance of accents, and the impact of telephony and codecs.

Another complexity is that, as research into the potential use of speaker verification is evolving, new techniques and optimisations are of great significance to the technology becoming mainstream. At the forefront of this is research into the compatibility of voice biometrics with easy-to-use and practical end solutions. For example, an EU-funded project is currently ongoing, involving ValidSoft (as consortium lead) and a number of other European companies alongside Laboratoire en Informatique d'Avignon.


The objective of this project is to help develop and refine the approach taken to determine the strongest solutions for the use of voice biometrics. The potential test coverage of any voice biometric solution will not represent full regression coverage for a long time yet. With each advance in voice biometric technology being integrated into end-user solutions, an increase in the scope of the testing will be a necessity.

This project has shown that research and development at an academic level relies heavily on testing. Not only can we as testers assist in the lifecycle of a product or solution, as seen in Agile or more stringent V-model methodologies, but our skills should also be used to assist advancements within the academic research and development studies of our partners. Testing, as a service, can also assist in the organisation of data into well planned and structured test cycles or iterations, in order to assist researchers in continuing this drive forward.

Understanding the key concepts and conditions under which speaker verification should be used to provide the highest levels of performance, for instance the level of background noise or the impact of using different phones, is not only of great importance to researchers but, perhaps more importantly, to the end solution in commercial applications.

The necessity of data

One critical element for testing a successful solution is good quality data, and the testing of speaker verification, for both research and end-solution development, is no exception. The considerable amount of progress being made by research teams is owed to a greater amount of data being made available. With ever-growing speech databases, research has been able to develop innovative approaches based on statistical modelling of speech variations (how voices vary from one day to another, and from one speaker to another), resulting in the design of more robust voice biometrics systems.

The addition of speaker verification, which represents something you are (a characteristic which is unique to you), to a combination of other factors, something you know (a OTP/ PIN), something you have (a mobile) and somewhere you are (as location and jurisdiction are crucial when providing pointof-sale solutions) can help to provide three or four factor authentication solutions to assist in securing the mobile banking world.




The implementation of voice biometrics into real-world solutions requires data collection not only for the testing of an end solution; it is also critical to the calibration and tuning of the biometrics engine. It is during the calibration and tuning stages that the more detailed performance and error-rate targets are made achievable. The confidence gained by performing this work against a large data set is vital to the statistical relevance of such rates and the overall success of the solution.

For example, when working with a system running at a two percent false negative rate (when a genuine speaker is wrongly rejected), the chance of such an error is one in 50. This means there must be a large enough number of different speakers and test trials in the testing phase to make sure that enough errors are observed, and consequently that the subsequent result analysis is statistically significant.
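To make the arithmetic concrete, here is a minimal back-of-envelope sketch. The function names and the illustrative target of 30 observed errors are assumptions for the example, not figures from the deployment described here:

```python
import math

def expected_errors(trials, error_rate=0.02):
    # At a 2% false-negative rate, an error occurs on average once in 50 trials.
    return trials * error_rate

def trials_needed(target_errors, error_rate=0.02):
    # Minimum number of trials so that, on average, at least `target_errors`
    # genuine-speaker rejections are observed.
    return math.ceil(target_errors / error_rate)

print(expected_errors(50))   # -> 1.0, i.e. one error per 50 trials
print(trials_needed(30))     # -> 1500 trials to expect roughly 30 errors
```

A commonly cited rule of thumb in biometric evaluation is to observe at least 30 errors before trusting the measured rate; at two percent that implies on the order of 1,500 genuine-speaker trials.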

Heuristically, the best candidates for the provision of data are those that provide the greatest level of commitment and that have a vested interest in the success of the product, either directly or indirectly: for example, the staff of the customer, or those for whom being part of something exciting and forward-looking is significant enough in itself. It is this approach that was taken, and proved successful, when undertaking system, performance and user acceptance testing. The levels of resource available to our client, alongside a desire to see the project succeed, provided us with a comprehensive bank of audio file data. This data set enabled us to perform extensive test cycles at all levels of testing using real-life data. The data was all the more valuable as it was heavily representative of the target demographic of the deployed solution.

Data collection challenge

ValidSoft's latest deployment, combining both text-dependent and text-independent biometric checks, required a large data collection exercise in order to achieve the level of confidence required by our own high standards and the customers' needs. But what is good data? How do you gather a large amount of good quality speech input data tailored to meet a specific end solution? Good data, in terms of testing for voice biometric solutions, is data that is representative of the operational conditions and of the end-user demographics. For example, if your solution is designed for residents of Eire, you are going to receive stronger confidence levels testing with data from residents of Eire than with that of Scotland or England. There are, of course, notions that all data is good data, as even data which does not match the demographic can provide good quality negative test analysis. For speaker verification, however, this data can skew or alter results negatively when used for tuning and training verification models.

The final hurdle

It is essential that you do not waste the hard work of collecting a large quantity of good data by making it difficult to use and analyse. The complexity of categorising data should not be underestimated. Good data utilises structured naming conventions for the files collected; ways to identify data as individual items, and to group related data items, are critical to being able to make good use of the collected audio. Identifying the metadata that can be captured to assist further analysis is also key to the overall success of the gathering exercise. Preparing the requirements for the analysis in advance of the gathering exercise will ensure that the data you collect remains good quality throughout the lifecycle of the project. Admittedly, you will find that candidates from unexpected sources can provide good quality data; you should never limit yourself to any one means of gathering, or let such data slip through. This data is invaluable and, thanks to a high quality data gathering exercise, the testing of such solutions is made that little bit easier.
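As a sketch of what a structured naming convention might look like in practice, each audio file can encode its key metadata directly in the filename, so items can be identified individually and grouped during analysis. The scheme below is purely hypothetical, not ValidSoft's actual convention:

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical convention: <speaker>_<session>_<channel>_<date>.wav
# e.g. "spk042_s01_mobile_20120115.wav"

@dataclass
class AudioItem:
    speaker: str
    session: str
    channel: str
    date: str

def parse_name(filename):
    # Split the filename stem into its metadata fields.
    stem = filename.rsplit(".", 1)[0]
    speaker, session, channel, date = stem.split("_")
    return AudioItem(speaker, session, channel, date)

def group_by(filenames, field):
    # Group collected files by any metadata field, e.g. "channel" or "speaker".
    groups = defaultdict(list)
    for name in filenames:
        groups[getattr(parse_name(name), field)].append(name)
    return dict(groups)

files = ["spk042_s01_mobile_20120115.wav", "spk042_s02_landline_20120116.wav"]
print(group_by(files, "channel"))
```

Deciding on the fields, and on the analysis they must support, before the gathering exercise is precisely the preparation the paragraph above recommends.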


Ashley Parsons
Test manager
ValidSoft
www.validsoft.com






36 | Training Corner

Professionally speaking...

The subject of email etiquette and 'netiquette' is a tricky one; even after more than a decade of use, we still haven't really cracked it. Angelina Samaroo makes it her resolution to have a go.

This is my first communication with you for 2012, so let me begin by saying Happy New Year to you all. In the UK we have the Queen's Diamond Jubilee celebration (not to mention the extra bank holiday) and the London Olympics to keep up the good cheer until the first fall of leaves reminds us of the longer nights beckoning, before we can ring in the good cheer again.

Angelina Samaroo
Managing director
Pinta Education
www.pintaed.com


On the subject of communications, over the last year or so I have become increasingly aware of a growing tendency towards possibly not-quite-polite ways of communicating with others. I am, of course, guilty of a few of these myself. So, following the philosophy that if you write down your goals (resolutions?) you're more likely to achieve them, here goes.

Let us begin with the opening email on a subject thread. How many now begin with just your name? No 'Hello', 'Hi', or the very continental 'Dear'. I must have missed the memo saying that it is unprofessional to say hello first. If you met someone you know in the street, they would probably be rather alarmed if you simply repeated their name – they probably know that gem already. 'Hi Joe, how nice to bump into you' may set the warm and friendly tone for the rest of the conversation. In business, warm and friendly may well reap better rewards than a cold 'Let's get down to business, shall we?'

In the same vein, how many begin with just 'Hi'? This time, no name. Having said that I must already know my name, why does this concern me? To me, it suggests a lack of real interest: I'm having to talk to you because I

want something, not because I really want to. Again, the warmth for me is missing.

Let's now look at the subject header. The opening email header generally suggests the content, and my experience is that we're pretty good at it. What we're not so good at is the reply. Generally, we hit the reply button without adding to the subject header. The reply to the email may well have new content. For instance, an email with the subject header 'What time would you like to meet?' could be replied to with an updated header 'What time would you like to meet – how about 10am?' I know this will be in the content, but this is rather more convenient, especially when you're travelling and living off the smartphone.

The other issue with the reply button is that often there have been many replies from many recipients, with the eventual content diverging somewhat from the original subject. So 'What time would you like to meet?' remains as the subject header, with the content moving from 'time to meet', to 'meeting notes', to 'here's my proposal', to 'here's my quote', to 'travel arrangements' and so on. All OK to a point. Consider though that emails may need to be responded to by someone else in your absence, or that you have the infernal plethora of emails to sort through, or the fatal blow, where you have forwarded to a third party, revealing not just travel arrangements (intended), but the entire thread from initial conversation to contract being awarded (unintended, unhelpful). If forwarded, then the last attachment may well be intact too. Yes, you can recall the message, but that does not stop the detail of your transactions getting

out; it just announces that you've made a mistake and are asking the jury to 'please ignore my last message'. At some point, it can be useful to start afresh with the contact list, not the last email. Then you're actively thinking about the relevance of the target recipient to the message.

When it comes to the content, we often forget the niceties. The niceties, like the arts, remind us of the higher orders of life, and for me they are one of the most tangible and accessible ways to lift my head out of the sand, every day, many times over the course of it. 'How are you?'; 'Trust you had a good weekend'; 'It's Friday – yippeeee'. The last one should probably be saved for those you are on friendly terms with but, again, it may well set up the tone necessary for the longevity of a business partnership.

As for the ending, most of us now have an auto signature. This leads us to forget to end the conversation properly. 'Thanks'; 'Have a great weekend'. And back to where we started: being bothered to type your own name. It's in the autopilot. And does the full signature need to be on every email to the same person? Formality is required on initial contact but, as the saying goes, 'less is more'. The subtle act of deciding who needs the autopilot and who doesn't suggests active management of an online partnership.

Now let's look at what may happen when you don't hit the reply button. Some have the out-of-office assistant to do it for them. This one works for me, provided you remember to switch it off on your return to work. But what if you're not away? The (bona fide) sender is left to wonder. Are you hiding? Are you the very convenient 'too busy'? Are you acting on the email?



Design for TEST | 37

Any option could be true, but how does the other party interpret silence? Perhaps you have a pattern between you. I mentor those applying for registration as professional test engineers, and this requires significant activity on both sides, so long turnaround times have become the norm for us. Whilst it may do your ego a world of good to be so very busy, this could be interpreted as either 'too busy, for you', or just 'too disorganised, for anyone'. Perhaps a holding reply – 'got it, I'll get back to you next week' – may well do the trick. Of course, if you really don't want to hear from this person, perhaps it would be more professional to just say no thanks; then you're in a position of real strength – 'got it, read it, don't want it'.

On to social media; to quote Michael Jackson: 'All the little birdies on Jaybird Street, love to hear the robin go tweet tweet tweet'. Make your online contributions make others sing. We all have a need to vent sometimes, but consider whether a tweet or a wall post is the best way to do this. Also, censor your own work – watch out for profanity, for statements which could be interpreted as libellous, for mere gossip. Once out there, it can be difficult to take it all back, especially when others have responded and things have escalated, possibly out of your control. Remember #IAmSpartacus in 2010? It raises a smile for the rest of us, but at what price to the author for his fame? As for Wikileaks, the lesson here for me is to make sure you've got the strength of character to see your way all the way to the top of the Google rankings. There may be no such thing as bad publicity for a business in the long run, but the long arm of the law will reach the individual, rename him (or her) the perpetrator, and so turn him from hero to zero in the eyes of those in power – where it all matters to them.

Just got to go now and apologise to a few people for my lack of netiquette.
So, ‘till next time, here’s to 2012 being our most professional year yet.


Survival of the fittest

Taking his inspiration from nature with a Darwinian approach, Mike Holcombe is breeding test sets using evolutionary testing.

One approach to finding effective test sets is to 'breed' them. The idea, inspired by biology and in particular genetics, is based on two fundamental steps. First we need to find a way to represent test inputs as genes. Then we define some way of measuring how good a fit the ensuing genes are, based on choosing a suitable criterion. If we are interested in exercising particular branches in a program, then the criterion would evaluate against this aim.

We start with the control flow graph. This describes the different ways an execution of a program can proceed through decision points, taking different choices and branches and carrying out functions on those branches. The program will require a number of input values and, depending on these, will execute different paths through the program. One way to test the program is to exercise every possible path through it, or exercise every function on every path. For a large program this is a challenging task and needs to be automated.

Evolutionary testing starts with a vector representing the input values – a test case. What we need to do is to generate these vectors automatically, with the aim of finding enough of them to exercise every path. If we choose a particular branch of the program and want to find a test case that will exercise it, we could randomly generate a test vector and then check whether it reaches the desired path; this will require checking what the program with this particular input vector does at each

Mike Holcombe
Founder and director
epiGenesys Ltd
www.epigenesys.co.uk

decision point. If these decisions mean that it goes down alternative paths and cannot reach the section that we want, we throw this vector away. A new vector is generated to try again.

The insight in evolutionary testing is that we 'breed' these vectors in a way inspired by genetics. Some people will encode the input vectors into binary strings – eg, one vector might be 0011001010, another might be 1110000011. Suppose that these test cases get quite close to the desired part of the program path; we then breed a couple of new ones. The method is to mix the two binary vectors up. We could, for example, choose new ones by splitting them in two and recombining them, using the first five digits of one with the last five digits of the other, thus:

0011000011 and 1110001010

This is called a crossover mutation and here takes place in the middle of the vectors. We could also introduce some randomness by flipping a digit:

0011100011 and 1110001110

There are other techniques, again based on nature, to create new populations of vectors that 'preserve' the best features of the vectors so far found, in the hope that better ones will result. When we have some new vectors we check them against the criterion of how close they get to the required part of the program, and continue this process until we manage to get some vectors to exercise the branch we want.

For more information take a look at: Search-Based Software Test Data Generation: A Survey, by Phil McMinn: http://philmcminn.staff.shef.ac.uk/papers/2004-stvr.pdf
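The breeding loop described above can be sketched in a few lines of Python. This is a minimal illustration, not a production search-based testing harness: the target branch is simulated by a toy fitness function that counts how many digits of the vector already satisfy a hypothetical branch condition, while the crossover and digit-flip mutation follow the examples in the text:

```python
import random

random.seed(1)
VECTOR_LEN = 10

def fitness(bits, target=(0, 0, 1, 1, 1, 0, 0, 0, 1, 1)):
    # Toy branch-distance criterion: count how many digits already satisfy
    # the (hypothetical) branch condition; VECTOR_LEN means the branch is hit.
    return sum(b == t for b, t in zip(bits, target))

def crossover(a, b, point=5):
    # Single-point crossover: the first five digits of one parent are
    # recombined with the last five of the other, as described above.
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits, rate=0.1):
    # Introduce randomness by flipping each digit with a small probability.
    return [1 - x if random.random() < rate else x for x in bits]

def evolve(pop_size=20, generations=200):
    # Start from random test vectors, then repeatedly keep the fittest half
    # and refill the population with mutated crossover offspring.
    pop = [[random.randint(0, 1) for _ in range(VECTOR_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == VECTOR_LEN:
            break  # found a vector that exercises the target branch
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            c1, c2 = crossover(random.choice(parents), random.choice(parents))
            children += [mutate(c1), mutate(c2)]
        pop = parents + children[:pop_size - len(parents)]
    return max(pop, key=fitness)

best = evolve()
print("best vector:", "".join(map(str, best)), "fitness:", fitness(best))
```

In a real harness the fitness function would be derived from branch distances measured on the instrumented program under test, as surveyed by McMinn.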



38 | Test qualifications

Does getting a test certification make you a better tester?

It's experience and enthusiasm that make for a good tester, argues Ramanath Shanbhag, but he can see a day when testers will yearn to be certified in order to qualify for the job and get the respect they deserve.






I was on a panel a few weeks back to discuss the above topic and, to be honest, my first thought was: "It's an oxymoron!" I asked myself how someone can become a better tester just by getting a certificate. Interestingly, we had esteemed panel members as part of this discussion, and during our initial sync-up it sounded like everyone was backing the idea. I was the odd man out! So I had to do my homework.

I remember reading a blog from James Whittaker (currently working as test director at Google) which said: "I have yet to meet a single tester at Microsoft who is certified. Most don't even know there is such a thing. They've all learned testing the old fashioned way: by reading all the books and papers they can get their hands on, apprenticing themselves to people at the company that are better at it than they are and critiquing the gurus and would-be gurus who spout off in person and in print." This is what he said when he was at Microsoft (more than three years ago), and I don't think his thoughts would have changed even at Google.

So I read literature from many other testing gurus and every one


of them seems to believe that these certifications don't make you a better tester. One even said: "I don't think these certifications are really certifications at all. It's just training. For most of the certifications, it's not even training. It's just passing an exam."

Having read and heard from these people within the industry, I was happy that I was not alone in thinking this way. But, after talking to others, I realised that:

- Things change by the time you prepare for and pass the certification, so you are always playing catch-up. Hence the inquisitiveness to always learn new things and keep up to date does matter.
- Most people take these exams because it looks good on their CV. I read somewhere that when an HR person is short-listing similar profiles, the one with a certification is weighted higher than the other. However, I am not sure to what extent this is true.
- Working towards getting certified requires a lot of discipline. Some said that they wouldn't have learnt things if they hadn't taken the certification. However, I have my own views on this: if you are interested in learning something and are really committed to it, I don't think you

will wait for a certification exam. Although, of course, you might take the certification just to validate your learning.

The big picture
In my opinion, what is important for a tester is to get the big picture quickly, and apply the concepts they know in order to discover defects in the product or application. How quickly they can identify the use cases and apply them to make the product better will take them further in their career. Let us now look at this from two different angles: maturity of testing, and testing community support.

Maturity of testing
Those who have been in the IT industry for over a decade will appreciate the way software testing has emerged as a discipline, when it used to be an activity done mostly by developers to validate what they wrote. There was no specific process or methodology for how a programme should be tested; it all came down to what the developer felt was important at the time – or where they expected the programme to be vulnerable. Today there are various schools of thought on whether a certain way of testing is better than another. For

February 2012 | TEST


40 | Test qualifications



example, exploratory testing will have the upper hand when you don’t have a spec and need to quickly discover defects in a given app. There are various heuristics and oracles, once known only to a few experienced testers, that have been developed to enable testers to find bugs at an enormous pace. These have all made testing much more sophisticated than it used to be. Unlike longstanding disciplines like mechanical or civil engineering, software doesn’t have as much history in which certain concepts have matured and been used in a similar way. For example, even today a hammer is used to hit a nail into the wall in much the same way as it was 100 years ago. And, although the material used to make hammers is more refined, and various alloys or carbon are used to make them stronger and lighter, the basic function of the tool remains the same. The irony is that, even today, we develop different software for different customers without much reuse. The above examples suggest that things have evolved, and evolved quite quickly, in the software/IT world. Hence, we see tools, software and utilities being developed to solve specific needs – rather than growing with human evolution.

Testing community support

Ramanath Shanbhag
General Manager, MindTree
www.mindtree.com

I am of the opinion that, for anything to grow and mature, an active community plays an important role. Nowadays there are forums where you can post a query and someone from the community will quickly respond. With the advances in technology and the tools at our disposal, some of these communities have grown better than others.

However, these communities still operate in silos – they exist to solve a specific problem. For example, take a tool like Selenium, which is widely adopted for automating web applications. HP’s FT (Functional Tester, aka QTP) is another popular tool for test automation, and HP encourages testers to be FT certified. As you might expect, most of the discussions in such forums are around how a specific web component can be automated with the tool in question. However, they provide little scope for creativity and innovation and, even if innovation does take place, it is marginal. My suggestion is that if such active communities can be leveraged by testers who have the big picture of the industry, then the possibilities they can create are enormous!

Becoming better testers
Now, what have these two aspects to do with our topic, “Does getting a test certification make you a better tester?” My view is that, as this industry makes strides from an infant to a toddler to a youth to a grown-up adult, some of these communities will play an important role in how we become better testers. In addition, the bar for getting certified will rise and hence we will expect a tester to have experienced the nuances of the trade before being certified. Testers will yearn to be certified in order to qualify to do the job – from being an apprentice to a certified tester! Being certified will make them think differently and enable them to apply the concepts imbibed to the work at hand – and be respected by the community and their peers. That day is not too far off, considering the rate at which we are growing.



TEST company profile | 41

Facilita
Facilita load testing solutions deliver results
Facilita has created the Forecast™ product suite, which is used across multiple business sectors to performance test applications, websites and IT infrastructures of all sizes and complexity. With class-leading software and unbeatable support and services, Facilita will help you ensure that your IT systems are reliable, scalable and tuned for optimal performance.

Forecast™ is proven, effective and innovative
A sound investment: Choosing the optimal load testing tool is crucial, as the risks and costs associated with inadequate testing are enormous. Load testing is challenging and, without the right tool and vendor support, it will consume expensive resources and still leave a high risk of disastrous system failure. Forecast has been created to meet the challenges of load testing now and in the future. The core of the product is tried and trusted, incorporates more than a decade of experience, and is designed to evolve in step with advances in technology.
Realistic load testing: Forecast tests the reliability, performance and scalability of IT systems by realistically simulating from one to many thousands of users executing a mix of business processes using individually configurable test data.
Comprehensive technology support: Forecast provides one of the widest ranges of protocol support of any load testing tool.
1. Forecast Web thoroughly tests web-based applications and web services, identifies system bottlenecks, improves application quality and optimises network and server infrastructures. Forecast Web supports a comprehensive and growing list of protocols, standards and data formats including HTTP/HTTPS, SOAP, XML, JSON and Ajax.
2. Forecast Java is a powerful and technically advanced solution for load testing Java applications. It targets any non-GUI client-side Java API, with support for all Java remoting technologies including RMI, IIOP, CORBA and Web Services.
3. Forecast Citrix simulates multiple Citrix clients and validates the Citrix environment for scalability and reliability, in addition to the performance of the published applications. This non-intrusive approach provides very accurate client performance measurements, unlike server-based solutions.
4. Forecast .NET simulates multiple concurrent users of applications with client-side .NET technology.

6. Forecast can generate intelligent load at the IP socket level (TCP or UDP) to test systems with proprietary messaging protocols, and also supports the OSI protocol stack.
Powerful yet easy to use: Testers like using Forecast because of its power and flexibility. Creating working tests is made easy with Forecast's application recording and script generation features, and the ability to rapidly compose complex test scenarios with a few mouse clicks.
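Forecast's own socket API is proprietary, but socket-level load generation as described above is a general technique: many simulated clients open raw TCP connections and exchange protocol messages directly, so any proprietary message format can be exercised. A minimal, hypothetical Python sketch of the idea (a trivial in-process echo service stands in for the system under test):

```python
# Illustrative sketch only: generic socket-level load generation,
# not Facilita Forecast's actual API.
import socket
import threading

def start_echo_server(host="127.0.0.1"):
    """A trivial TCP echo server standing in for the system under test."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # pick a free port
    srv.listen()
    port = srv.getsockname()[1]

    def serve():
        while True:
            try:
                conn, _ = srv.accept()
            except OSError:      # server socket closed: stop serving
                return
            with conn:
                data = conn.recv(1024)
                conn.sendall(data)

    threading.Thread(target=serve, daemon=True).start()
    return srv, port

def virtual_user(host, port, message, results):
    """One simulated client: open a socket, send a message, check the echo."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(message)
        reply = sock.recv(1024)
    results.append(reply == message)

srv, port = start_echo_server()
results = []
threads = [
    threading.Thread(target=virtual_user,
                     args=("127.0.0.1", port, b"ping", results))
    for _ in range(20)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
srv.close()

print(sum(results), "of", len(threads), "virtual users got a correct echo")
```

A real load tool adds what this sketch omits: per-user test data, pacing, timing measurement and reporting.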


Supports Waterfall and Agile (and everything in between): Forecast has the features demanded by QA teams, like automatic test script creation, test data management, real-time monitoring and comprehensive charting and reporting. Forecast is successfully deployed in Agile "Test Driven Development" (TDD) environments and integrates with automated test (continuous build) infrastructures. The functionality of Forecast is fully programmable and test scripts are written in standard languages (Java, C# and C++). Forecast provides the flexibility of open source alternatives along with comprehensive technical support and the features of a high-end commercial tool.
Monitoring: Forecast integrates with leading solutions such as dynaTrace to provide enhanced server monitoring and diagnostics during testing. Forecast Virtual User technology can also be deployed to generate synthetic transactions within a production monitoring solution. Facilita now offers a lightweight monitoring dashboard in addition to integration with comprehensive enterprise APM solutions.
Flexible licensing: Our philosophy is to provide maximum value and to avoid hidden costs. Licenses can be bought on a perpetual or subscription basis, and short-term project licensing is also available with a “stop-the-clock” option.

Services
Supporting our users: In addition to comprehensive support and training, Facilita offers mentoring by experienced consultants, either to ‘jump start’ a project or to cultivate advanced testing techniques.
Testing services: Facilita can supplement test teams or supply fully managed testing services, including Cloud-based solutions.

5. Forecast WinDriver is a unique solution for performance testing Windows applications that are impossible or uneconomical to test using other methods, or where user-experience timings are required. WinDriver automates the client user interface and can control from one to many hundreds of concurrent client instances or desktops.

Facilita Tel: +44 (0) 1260 298109 Email: enquiries@facilita.co.uk Web: www.facilita.com





Parasoft
Improving productivity by delivering quality as a continuous process
For over 20 years Parasoft has been studying how to efficiently create quality computer code. Our solutions leverage this research to deliver automated quality assurance as a continuous process throughout the SDLC. This promotes strong code foundations, solid functional components, and robust business processes. Whether you are delivering Service-Oriented Architectures (SOA), evolving legacy systems, or improving quality processes – draw on our expertise and award-winning products to increase productivity and the quality of your business applications.

What we do
Parasoft's full-lifecycle quality platform ensures secure, reliable, compliant business processes. It was built from the ground up to prevent errors involving the integrated components – as well as reduce the complexity of testing in today's distributed, heterogeneous environments.

Parasoft's SOA solution allows you to discover and augment expectations around design/development policy and test case creation. These defined policies are automatically enforced, allowing your development team to prevent errors instead of finding and fixing them later in the cycle. This significantly increases team productivity and consistency.

End-to-end testing: Continuously validate all critical aspects of complex transactions which may extend through web interfaces, backend services, ESBs, databases, and everything in between.

Advanced web app testing: Guide the team in developing robust, noiseless regression tests for rich and highly-dynamic browser-based applications.

Specialised platform support: Access and execute tests against a variety of platforms (AmberPoint, HP, IBM, Microsoft, Oracle/BEA, Progress Sonic, Software AG/webMethods, TIBCO).

Security testing: Prevent security vulnerabilities through penetration testing and execution of complex authentication, encryption, and access control test scenarios.

Trace code execution: Provide seamless integration between SOA layers by identifying, isolating, and replaying actions in a multi-layered system.

Continuous regression testing: Validate that business processes continuously meet expectations across multiple layers of heterogeneous systems. This reduces the risk of change and enables rapid and agile responses to business demands.

Multi-layer verification: Ensure that all aspects of the application meet uniform expectations around security, reliability, performance, and maintainability.

Policy enforcement: Provide governance and policy-validation for composite applications in BPM, SOA, and cloud environments to ensure interoperability and consistency across all SOA layers.

Please contact us to arrange either a one-to-one briefing session or a free evaluation.

Application behavior virtualisation: Automatically emulate the behavior of services, then deploy them across multiple environments – streamlining collaborative development and testing activities. Services can be emulated from functional tests or actual runtime environment data.

Load/performance testing: Verify application performance and functionality under heavy load. Existing end-to-end functional tests are leveraged for load testing, removing the barrier to comprehensive and continuous performance monitoring.

Parasoft
Email: sales@parasoft-uk.com
Web: www.parasoft.com
Tel: +44 (0) 208 263 6005

Spirent Communications plc
Email: Daryl.Cornelius@spirent.com
Web: www.spirent.com
Tel: +44 (0)7834752083





Seapine Software™

With over 8,500 customers worldwide, Seapine Software Inc is a recognised, award-winning, leading provider of quality-centric application lifecycle management (ALM) solutions. With headquarters in Cincinnati, Ohio and offices in London, Melbourne, and Munich, Seapine is uniquely positioned to directly provide sales, support, and services around the world. Built on flexible architectures using open standards, Seapine Software’s cross-platform ALM tools support industry best practices, integrate into all popular development environments, and run on Microsoft Windows, Linux, Sun Solaris, and Apple Macintosh platforms. Seapine Software's integrated software development and testing tools streamline your development and QA processes – improving quality, and saving you significant time and money.

TestTrack RM
TestTrack RM centralises requirements management, enabling all stakeholders to stay informed of new requirements, participate in the review process, and understand the impact of changes on their deliverables. Easy to install, use, and maintain, TestTrack RM features comprehensive workflow and process automation, easy customisability, advanced filters and reports, and role-based security. Whether as a standalone tool or part of Seapine’s integrated ALM solution, TestTrack RM helps teams keep development projects on track by facilitating collaboration, automating traceability, and satisfying compliance needs.

TestTrack Pro
TestTrack Pro is a powerful, configurable, and easy-to-use issue management solution that tracks and manages defects, feature requests, change requests, and other work items. Its timesaving communication and reporting features keep team members informed and on schedule. TestTrack Pro supports MS SQL Server, Oracle, and other ODBC databases, and its open interface is easy to integrate into your development and customer support processes.

TestTrack TCM
TestTrack TCM, a highly scalable, cross-platform test case management solution, manages all areas of the software testing process including test case creation, scheduling, execution, measurement, and reporting. Easy to install, use, and maintain, TestTrack TCM features comprehensive workflow and process automation, easy customisability, advanced filters and reports, and role-based security. Reporting and graphing tools, along with user-definable data filters, allow you to easily measure the progress and quality of your testing effort.

QA Wizard Pro
QA Wizard Pro completely automates the functional and regression testing of Web, Windows, and Java applications, helping quality assurance teams increase test coverage. Featuring a next-generation scripting language, QA Wizard Pro includes advanced object searching, smart matching, a global application repository, data-driven testing support, validation checkpoints, and built-in debugging. QA Wizard Pro can be used to test popular languages and technologies like C#, VB.NET, C++, Win32, Qt, AJAX, ActiveX, JavaScript, HTML, Delphi, Java, and Infragistics Windows Forms controls.
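QA Wizard Pro's scripting language is its own, but the data-driven pattern it supports is general: one test routine is run once per row of an external data table. A hypothetical Python sketch of the pattern (the function under test and the data rows are invented for illustration):

```python
import unittest

# Hypothetical function under test (invented for illustration).
def is_valid_username(name: str) -> bool:
    return 3 <= len(name) <= 12 and name.isalnum()

# The test data lives apart from the test logic; in a real data-driven
# suite it would come from a spreadsheet, CSV file or database table.
TEST_DATA = [
    ("bob", True),
    ("al", False),           # too short
    ("a" * 13, False),       # too long
    ("spaced name", False),  # illegal character
    ("tester42", True),
]

class DataDrivenExample(unittest.TestCase):
    def test_username_validation(self):
        # One test routine, many data rows: each row runs as its own subtest,
        # so a failing row is reported individually without stopping the rest.
        for name, expected in TEST_DATA:
            with self.subTest(name=name):
                self.assertEqual(is_valid_username(name), expected)

suite = unittest.TestLoader().loadTestsFromTestCase(DataDrivenExample)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("tests run:", result.testsRun, "failures:", len(result.failures))
```

Adding a new test case then means adding a data row, not writing new test code.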

Surround SCM
Surround SCM, Seapine’s cross-platform software configuration management solution, controls access to source files and other development assets, and tracks changes over time. All data is stored in industry-standard relational database management systems for greater security, scalability, data management, and reporting. Surround SCM’s change automation, caching proxy server, labels, and virtual branching tools streamline parallel development and provide complete control over the software change process.

Web: www.seapine.com
Phone: +44 (0) 208-899-6775
Email: salesuk@seapine.com
United Kingdom, Ireland, and Benelux: Seapine Software Ltd., Building 3, Chiswick Park, 566 Chiswick High Road, Chiswick, London, W4 5YA, UK
Americas (Corporate Headquarters): Seapine Software, Inc., 5412 Courseview Drive, Suite 200, Mason, Ohio 45040, USA. Phone: 513-754-1655





Micro Focus
Deliver better software, faster
Software quality that matches requirements and testing to business needs. Making sure that business software delivers precisely what is needed, when it is needed, is central to business success. Getting it right first time hinges on properly defined and managed requirements, the right testing and managing change. Get these right and you can expect significant returns: costs are reduced, productivity increases, time to market is greatly improved and customer satisfaction soars. The Borland software quality solutions from Micro Focus help software development organizations develop and deliver better applications through closer alignment to business, improved quality and faster, stronger delivery processes – independent of language or platform. Combining Requirements Definition and Management, Testing and Software Change Management tools, Micro Focus offers an integrated software quality approach that is positioned in the leadership quadrant of Gartner Inc’s Magic Quadrant. The Borland solutions from Micro Focus are both platform and language agnostic – so whatever your preferred development environment, you can benefit from world-class tools to define and manage requirements, test your applications early in the lifecycle, and manage software configuration and change.

Requirements
Defining and managing requirements is the bedrock for application development and enhancement. Micro Focus uniquely combines requirements definition, visualization, and management into a single '3-Dimensional' solution, giving managers, analysts and developers precise detail for engineering their software. By cutting ambiguity, the direction of development and QA teams is clear, strengthening business outcomes. For one company this delivered an ROI of 6-8 months, a 20% increase in project success rates, a 30% increase in productivity and a 25% increase in asset re-use. Using Micro Focus tools to define and manage requirements helps your teams:
• Collaborate, using pictures to build mindshare, drive a common vision and share responsibility with role-based review and simulations.
• Reduce waste by finding and removing errors earlier in the lifecycle, eliminating ambiguity and streamlining communication.
• Improve quality by taking the business need into account when defining the test plan.
Caliber® is an enterprise software requirements definition and management suite that facilitates collaboration, impact analysis and communication, enabling software teams to deliver key project milestones with greater speed and accuracy.

Software Change Management
StarTeam® is a fully integrated, cost-effective software change and configuration management tool. Designed for both centralized and geographically distributed software development environments, it delivers:
• A single source of key information for distributed teams
• Streamlined collaboration through a unified view of code and change requests
• Industry-leading scalability combined with low total cost of ownership

Testing
Automating the entire quality process, from inception through to software delivery, ensures that tests are planned early and synchronise with business goals even as requirements and realities change. Leaving quality assurance to the end of the lifecycle is expensive and wastes improvement opportunities. Micro Focus delivers a better approach: highly automated quality tooling built around visual interfaces and reusability. Tests can be run frequently, earlier in the development lifecycle, to catch and eliminate defects rapidly. From functional testing to cloud-based performance testing, Micro Focus tools help you spot and correct defects rapidly across the application portfolio, even for Web 2.0 applications. Micro Focus testing solutions help you:
• Align testing with a clear, shared understanding of business goals, focusing test resources where they deliver most value
• Increase control through greater visibility over all quality activities
• Improve productivity by catching and driving out defects faster
Silk is a comprehensive automated software quality management solution suite which enables users to rapidly create test automation, ensuring continuous validation of quality throughout the development lifecycle. Users can move away from manual-testing-dominated software lifecycles, to ones where automated tests continually test software for quality and improve time to market.

Take testing to the cloud
Users can test and diagnose Internet-facing applications under immense global peak loads in the cloud without having to manage complex infrastructures. Among other benefits, SilkPerformer® CloudBurst gives development and quality teams:
• Simulation of peak demand loads through onsite and cloud-based resources for scalable, powerful and cost-effective peak load testing
• Web 2.0 client emulation to test even today’s rich internet applications effectively
Micro Focus, a member of the FTSE 250, provides innovative software that enables companies to dramatically improve the business value of their enterprise applications. Micro Focus Enterprise Application Modernization, Testing and Management software enables customers’ business applications to respond rapidly to market changes and embrace modern architectures with reduced cost and risk.

For more information, please visit www.microfocus.com/solutions/softwarequality





Original Software
Delivering quality through innovation
With a world-class record of innovation, Original Software offers a solution focused completely on the goal of effective software quality management. By embracing the full spectrum of Application Quality Management (AQM) across a wide range of applications and environments, we partner with customers and help make quality a business imperative. Our solutions include a quality management platform, manual testing, test automation and test data management software, all delivered with the control of business risk, cost, time and resources in mind. Our test automation solution is particularly suited for testing in an agile environment.

Setting new standards for application quality
Managers responsible for quality must be able to implement processes and technology that will support their important business objectives in a pragmatic and achievable way, and without negatively impacting current projects. These core needs are what inspired Original Software to innovate and provide practical solutions for Application Quality Management (AQM) and Automated Software Quality (ASQ). We have helped customers achieve real successes by implementing an effective ‘application quality eco-system’ that delivers greater business agility, faster time to market, reduced risk, decreased costs, increased productivity and an early return on investment. Our success has been built on a solution suite that provides a dynamic approach to quality management and automation, empowering all stakeholders in the quality process, as well as uniquely addressing all layers of the application stack. Automation has been achieved without creating a dependency on specialised skills and by minimising ongoing maintenance burdens.

An innovative approach
Innovation is in the DNA at Original Software. Our intuitive solution suite directly tackles application quality issues and helps you achieve the ultimate goal of application excellence.

Empowering all stakeholders
The design of the solution helps customers build an ‘application quality eco-system’ that extends beyond just the QA team, reaching all the relevant stakeholders within the business. Our technology enables everyone involved in the delivery of IT projects to participate in the quality process – from the business analyst to the business user, and from the developer to the tester. Management executives are fully empowered by having instant visibility of projects underway.

Quality that is truly code-free
We have observed the script maintenance and exclusivity problems caused by code-driven automation solutions and have built a solution suite that requires no programming skills. This empowers all users to define and execute their tests without the need to use any kind of code, freeing them from the automation-specialist bottleneck. Not only is our technology easy to use, but quality processes are accelerated, allowing for faster delivery of business-critical projects.

Top to bottom quality
Quality needs to be addressed at all layers of the business application. We give you the ability to check every element of an application – from the visual layer, through to the underlying service processes and messages, as well as into the database.

Addressing test data issues
Data drives the quality process and, as such, cannot be ignored. We enable the building and management of a compact test environment from production data, quickly and in a data-privacy-compliant manner, avoiding legal and security risks. We can also manage the state of that data, so that it is synchronised with test scripts, enabling swift recovery and shortening test cycles.
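Original Software's own tooling is proprietary, but the masking step behind a privacy-compliant test environment can be sketched generically. In this hypothetical Python illustration (the rows, field names and mask() helper are all invented), personally identifiable fields are replaced with stable, irreversible tokens before the data reaches testers:

```python
# Illustrative sketch only: not Original Software's actual product behaviour.
import hashlib

# Stand-in "production" rows containing personally identifiable data.
PRODUCTION_ROWS = [
    {"id": 1, "name": "Alice Smith", "email": "alice@example.com", "balance": 120.50},
    {"id": 2, "name": "Bob Jones", "email": "bob@example.com", "balance": 99.99},
]

def mask(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token.
    Stability matters: the same input always yields the same token,
    so referential integrity across tables is preserved."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

def extract_test_data(rows):
    """Copy rows into a compact test set, masking PII fields while
    keeping the non-sensitive values testers actually need."""
    return [
        {
            **row,
            "name": mask(row["name"]),
            "email": mask(row["email"]) + "@test.invalid",
        }
        for row in rows
    ]

test_rows = extract_test_data(PRODUCTION_ROWS)
for row in test_rows:
    print(row["id"], row["email"], row["balance"])
```

The masked set keeps realistic shape and volume characteristics without exposing real customer identities.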

A holistic approach to quality
Our integrated solution suite is uniquely positioned to address all the quality needs of an application, regardless of the development methodology used. Being methodology neutral, we can help in Agile, Waterfall or any other project type. We provide the ability to unite all aspects of the software quality lifecycle. Our solution helps manage the requirements, design, build, test planning and control, test execution, test environment and deployment of business applications from one central point, giving everyone involved a unified view of project status and avoiding the release of an application that is not ready for use.

Helping businesses around the world
Our innovative approach to solving real pain-points in the Application Quality Life Cycle has been recognised by leading multinational customers and industry analysts alike. In a 2011 report, Ovum stated: “While other companies have diversified, into other test types and sometimes outside testing completely, Original Software has stuck more firmly to a value proposition almost solely around unsolved challenges in functional test automation. It has filled out some yawning gaps and attempted to make test automation more accessible to non-technical testers.” More than 400 organisations operating in over 30 countries use our solutions and we are proud of partnerships with the likes of Coca-Cola, Unilever, HSBC, Barclays Bank, FedEx, Pfizer, DHL, HMV and many others.

www.origsoft.com Email: solutions@origsoft.com Tel: +44 (0)1256 338 666 Fax: +44 (0)1256 338 678 Grove House, Chineham Court, Basingstoke, Hampshire, RG24 8AG





Green Hat
The Green Hat difference
In one software suite, Green Hat automates the validation, visualisation and virtualisation of unit, functional, regression, system, simulation, performance and integration testing, as well as performance monitoring. Green Hat offers code-free and adaptable testing from the User Interface (UI) through to back-end services and databases. Reducing testing time from weeks to minutes, Green Hat customers enjoy rapid payback on their investment. Green Hat’s testing suite supports quality assurance across the whole lifecycle, and different development methodologies including Agile and test-driven approaches. Industry vertical solutions using protocols like SWIFT, FIX, IATA or HL7 are all simply handled. Unique pre-built quality policies enable governance, and the re-use of test assets promotes high efficiency. Customers experience value quickly through the high usability of Green Hat’s software. Focusing on minimising manual and repetitive activities, Green Hat works with other application lifecycle management (ALM) technologies to provide customers with value-add solutions that slot into their Agile testing, continuous testing, upgrade assurance, governance and policy compliance. Enterprises invested in HP and IBM Rational products can simply extend their test and change management processes to the complex test environments managed by Green Hat and get full integration. Green Hat provides the broadest set of testing capabilities for enterprises with a strategic investment in legacy integration, SOA, BPM, cloud and other component-based environments, reducing the risk and cost associated with defects in processes and applications. The Green Hat difference includes:
• Purpose-built end-to-end integration testing of complex events, business processes and composite applications. Organisations benefit by having UI testing combined with SOA, BPM and cloud testing in one integrated suite.
• Unrivalled insight into the side-effect impacts of changes made to composite applications and processes, enabling a comprehensive approach to testing that eliminates defects early in the lifecycle. • Virtualisation for missing or incomplete components to enable system testing at all stages of development. Organisations benefit through being unhindered by unavailable systems or costly access to third party systems, licences or hardware. Green Hat pioneered ‘stubbing’, and organisations benefit by having virtualisation as an integrated function, rather than a separate product.

• ‘Out-of-the-box’ support for over 70 technologies and platforms, as well as transport protocols for industry vertical solutions. Also provided is an application programming interface (API) for testing custom protocols, and integration with UDDI registries/repositories.
• Helping organisations at an early stage of project or integration deployment to build an appropriate testing methodology as part of a wider SOA project methodology.

Corporate overview
Since 1996, Green Hat has constantly delivered innovation in test automation. With offices that span North America, Europe and Asia/Pacific, Green Hat’s mission is to simplify the complexity associated with testing, and make processes more efficient. Green Hat delivers the market-leading combined, integrated suite for automated, end-to-end testing of the legacy integration, Service Oriented Architecture (SOA), Business Process Management (BPM) and emerging cloud technologies that run Agile enterprises. Green Hat partners with global technology companies including HP, IBM, Oracle, SAP, Software AG, and TIBCO to deliver unrivalled breadth and depth of platform support for highly integrated test automation. Green Hat also works closely with the horizontal and vertical practices of global system integrators including Accenture, Atos Origin, CapGemini, Cognizant, CSC, Fujitsu, Infosys, Logica, Sapient, Tata Consulting and Wipro, as well as a significant number of regional and country-specific specialists. Strong partner relationships help deliver on customer initiatives, including testing centres of excellence. Supporting the whole development lifecycle and enabling early and continuous testing, Green Hat’s unique test automation software increases organisational agility, improves process efficiency, assures quality, lowers costs and mitigates risk.

Helping enterprises globally

Green Hat is proud to have hundreds of global enterprises as customers, a number that does not include the consulting organisations that are party to many of these installations through their own staff or outsourcing arrangements. Green Hat customers enjoy global support and cite outstanding responsiveness to their current and future requirements. Green Hat's customers span industry sectors including financial services, telecommunications, retail, transportation, healthcare, government and energy.

• Scaling out these environments, test automations and virtualisations into the cloud, with seamless integration between Green Hat's products and leading cloud providers, freeing you from the constraints of real hardware without the administrative overhead.
• 'Out-of-the-box' deep integration with all major SOA and enterprise service bus (ESB) platforms, BPM runtime environments, governance products, and application lifecycle management (ALM) products.
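The 'stubbing' (service virtualisation) idea described above can be illustrated with a minimal sketch: a stand-in HTTP endpoint that returns a canned response, so integration tests can proceed while the real third-party system is unavailable or too costly to call. This is a generic illustration in Python, not Green Hat product code; the `/account` path and the payload are invented for the example.

```python
# A minimal service-virtualisation "stub": a local HTTP endpoint that
# stands in for an unavailable third-party system by returning a canned
# response. Path and payload are hypothetical, for illustration only.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED = {"status": "OK", "balance": 100.0}  # canned third-party reply

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test runs quiet
        pass

def start_stub():
    """Start the stub on an ephemeral port in a background thread.
    Returns (server, port); call server.shutdown() when done."""
    server = HTTPServer(("127.0.0.1", 0), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]

def fetch_stub(port):
    """Query the stub exactly as the system under test would."""
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/account") as resp:
        return json.loads(resp.read())
```

In a real toolchain the stub would be generated from a recorded or specified interface rather than hand-written, but the testing benefit is the same: the system under test exercises its integration code without the real dependency being present.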

sales@greenhat.com www.greenhat.com

TEST | February 2012

www.testmagazine.co.uk


TEST company profile | 47

T-Plan

Since 1990, T-Plan has supplied best-of-breed testing solutions. The T-Plan method and tools allow both the business unit manager and the IT manager to manage costs, reduce business risk and regulate the process. By providing order, structure and visibility throughout the development lifecycle, from planning to execution, they accelerate time to market for business solutions. The T-Plan Product Suite allows you to manage every aspect of the testing process, providing a consistent and structured approach to testing at the project and corporate level.

What we do

Test Management: The T-Plan Professional product is modular in design, clearly differentiating between the analysis, design, management and monitoring of the test assets. It helps answer questions such as:
• What coverage back to requirements has been achieved in our testing so far?
• What requirement successes have we achieved so far?
• Can I prove that the system is really tested?
• If we go live now, what are the associated business risks?

Test Automation: Cross-platform test automation is integrated into the suite via T-Plan Robot, creating a full testing solution. T-Plan Robot Enterprise is a highly flexible and universal black-box test automation tool. Providing a human-like approach to testing the user interface, and uniquely built on Java, Robot performs well in situations where other tools may fail.
• Platform independence (Java). T-Plan Robot runs on, and automates, all major systems, such as Windows, Mac, Linux, Unix and Solaris, and mobile platforms such as Android, iPhone, Windows Mobile, Windows CE and Symbian.
• Test almost any system. Because automation runs at the GUI level, via VNC, the tool can automate any application: Java, C++/C#, .NET, HTML (web/browser), mobile and command-line interfaces, as well as applications usually considered impossible to automate, such as Flash/Flex.
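GUI-level automation of the kind described above typically works by matching a small template image (say, a button) against a screenshot captured over VNC, then sending input events at the matched coordinates. The matching step can be sketched as a naive exact search over pixel grids; this is an illustrative assumption, not T-Plan Robot's actual algorithm.

```python
def find_template(screen, template):
    """Return the (row, col) of the top-left corner where `template`
    exactly matches a region of `screen`, or None if it is absent.
    Both arguments are 2D lists of pixel values (e.g. grayscale ints)."""
    sh, sw = len(screen), len(screen[0])
    th, tw = len(template), len(template[0])
    # Slide the template over every possible position in the screenshot.
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            if all(screen[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None
```

A real tool would capture `screen` from the VNC framebuffer, tolerate small pixel differences rather than demanding exact equality, and translate the returned coordinate into a mouse click or key event sent back over the same connection.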

Incident Management: Errors or queries found during test execution can be logged and tracked throughout the testing lifecycle in the T-Plan Incident Manager.

"We wanted an integrated test management process; T-Plan was very flexible and excellent value for money." Francesca Kay, Test Manager, Virgin Mobile





48 | The last word...

The last word... One man wolf pack

Dave Whalen is facing a lone wolf in his testing pack. Should he break out the wolfsbane and silver bullets, or tolerate a singular talent?

I was watching one of my favourite movies, The Hangover, the other day. One line from the movie really brought home a situation at work: "I tend to think of myself as a one-man wolf pack."

One of the major factors in the success of any team, especially Agile teams, is teamwork. We succeed or fail as a team. There is no place for a lone wolf; they are a cancer on the entire team. Nothing can kill a successful team faster, and worse still is when management coddles them.

Our highly successful Agile team is currently experiencing lone wolf syndrome. We recently expanded our team for a couple of reasons. First, our delivery date was moved up. Second, and most important, we needed people with specialised skills. As the end of the project nears, our focus has shifted from the backend to the user interface, so the team now includes a UI developer and a security specialist.

I was concerned about expanding the team. It is actually our second team expansion. The last time, we were able to hand-pick the new team members, and that expansion worked very well. This time, because of the specialised skills needed, we didn't have that luxury. So the lone wolf was added to the team. The sad part is that we knew it, but had little power to do anything about it.

It was obvious from the beginning that we were going to have issues. One of the new team members came to us from a previous, somewhat dysfunctional, team. This was a typical waterfall team: very isolated, very siloed, scattered all over the building. The team members only talked to each other when a formal meeting was called or there was a problem. It was hardly collaborative. But if you're a one-man wolf pack, it's the perfect environment.

So our team was expanded. Right away there were issues. The lone wolf won't work in our collaborative work


space; in fact, he usually telecommutes from home. He picks and chooses which meetings, formal or informal, are worth his time and only attends when he feels like it. He is directive rather than collaborative: an expert in his area, and he knows it. He is not open to discussing his approach or any ideas about ways to improve. Things are getting tense.

I brought the subject up with our project manager. Basically, I was told to let it go and tolerate it, because his skills are too valuable. This article, then, is not only informative but a way to vent my frustrations.

The burning question: should any single team member be that important to the team's success? I don't think so. I think I'm pretty darned important, but I know I'm replaceable. That knowledge keeps me humble. No one is irreplaceable. Sure, it may take some time to get someone new up to speed on our application, but in the long run we may be better off.

My real concern is the cohesiveness of our team. As I said earlier, these lone wolves can be a cancer. They can infect other team members by killing off any collaboration. Special treatment of these new team members may anger or upset current team members, and older team members may begin to isolate the non-collaborative member. If the lone wolf feels isolated and attacked, he may sabotage team efforts. I may be exaggerating a bit. Let's hope so!

What to do? So far I'm taking a wait-and-see attitude. I hope I'm wrong. I've shared my concerns with the project manager. I'd like to think it's just me, but I'm sensing it isn't. We haven't really talked about it, but there is definitely tension in the air.

So when it's time to expand your wolf pack, be sure to find someone willing to be a member of the pack. There's no room for the one-man wolf pack... Now let's go to Las Vegas – c'mon red 14!!!


Dave Whalen

President and Senior Software Entomologist, Whalen Technologies
softwareentomologist.wordpress.com



Subscribe to TEST free!


For exclusive news, features, opinion, comment, directory, digital archive and much more visit

www.testmagazine.co.uk

Published by 31 Media Ltd
Telephone: +44 (0) 870 863 6930
Facsimile: +44 (0) 870 085 8837
Email: info@31media.co.uk
Website: www.31media.co.uk



BORLAND SOLUTIONS FROM MICRO FOCUS DESIGNED TO DELIVER BETTER SOFTWARE, FASTER

Borland Solutions are designed to:

Align development to business needs

Strengthen development processes

Ensure testing occurs throughout the lifecycle

Deliver higher quality software, faster

Deliver stable mobile applications, even under peak loads

Borland Solutions from Micro Focus make up a comprehensive quality toolset for embedding quality throughout the software development lifecycle. Software that delivers precisely what is needed, when it is needed, is crucial for business success. Borland Solutions embed quality into software delivery from the very beginning of the development lifecycle, whichever methodology you use – traditional, Agile or a mixture – from requirements through regression, functional, performance and load testing. The result: you meet both business requirements and quality expectations with better quality software, delivered faster.

Micro Focus Ltd. Email: microfocus.communications@microfocus.com © 2011 Micro Focus IP Development Limited. All rights and marks acknowledged.

