Industry Practices Sessions

WEDNESDAY, September 24

8 Steps to Risk-Based Testing
Akemi Micallef, Compuware Corporation

Track 1:  11:30 - 12:30

Risk-based testing allows testing organizations to reduce testing time without compromising quality.  The days of long testing turnarounds are over.  No one is given the time to test "everything" anymore.  Yet quality is more important than ever.  In this presentation, Akemi shows how testing organizations can refine their existing testing processes to keep up with the ever-increasing demand for speed.  She will explain the key concepts of risk-based testing and the various risk factors that should be considered in test case design, as well as how testers can solicit greater business and technical input.  No complex formulas or special tools are required.  You will learn how risk-based testing can be implemented in your existing processes in 8 steps with minimal impact on the current testing model.  This simple technique of test case development and execution can give testing organizations the flexibility and speed they need to survive in today's market.
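
The session's 8 steps are not reproduced here, but the core mechanic of risk-based prioritization can be sketched in a few lines.  In this illustrative Python sketch (the test names, ratings, and 1-5 scales are hypothetical, not Akemi's), each candidate test is scored by likelihood of failure times business impact, and tests are executed in descending order of risk:

```python
# Illustrative sketch of risk-based test prioritization (not the
# speaker's actual 8-step method): score each candidate test by the
# likelihood of failure and the business impact of a failure, then
# execute in descending order of risk.

def risk_score(likelihood, impact):
    """Combine two 1-5 ratings into a single risk score."""
    return likelihood * impact

test_candidates = [
    # (test name, likelihood 1-5, impact 1-5) - invented examples
    ("login", 2, 5),
    ("report export", 4, 2),
    ("payment processing", 4, 5),
    ("help page rendering", 1, 1),
]

prioritized = sorted(
    test_candidates,
    key=lambda t: risk_score(t[1], t[2]),
    reverse=True,
)

for name, likelihood, impact in prioritized:
    print(f"{risk_score(likelihood, impact):>2}  {name}")
```

Because the riskiest tests run first, a schedule cut short still leaves the highest-risk areas covered.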

About the speaker...
Akemi Micallef has fifteen years of experience in the IT industry with ten years in software testing and quality assurance.  She has led testing projects through the entire software development life cycle and currently performs quality process assessments and implementations for various organizations across many different categories of projects. In her role as a QA architect for Compuware Corporation, Akemi has been the driving force behind the creation of a project delivery methodology and training.  Akemi holds a CSTE certification.

Test Metrics:  A Practical Approach to Tracking & Interpretation
Shaun Bradshaw, Questcon Technologies

Track 2:  11:30-12:30

Test metrics can be used to track and measure the efficiency, effectiveness, and success or shortcomings of many activities on a software development project.  While it is important to recognize the value of gathering test metrics data, it is the interpretation of that data that makes the metrics meaningful.  This session is designed to help test analysts and lead testers learn how to establish a test metrics program.  Shaun will describe a metrics quick start program and provide examples of metrics that can be tracked during the testing effort.  Most importantly, you will discover how to decipher these metrics in a way that makes the information meaningful to the overall quality of the project.
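
As a taste of what such a program tracks, here is a minimal, hypothetical Python sketch (the figures and metric names are illustrative, not Questcon's) computing a daily pass rate from raw execution counts; it is the trend of such a metric over time, not any single value, that carries the interpretation the session emphasizes:

```python
# Hypothetical example of tracking a simple test execution metric.
# The daily counts below are invented; a real program would pull
# them from the test management tool.

daily_results = [
    # (day, tests executed, tests passed, defects found)
    (1, 20, 15, 5),
    (2, 40, 33, 7),
    (3, 60, 54, 4),
]

for day, executed, passed, defects in daily_results:
    pass_rate = passed / executed * 100
    # A rising pass rate with falling defect discovery suggests the
    # product is stabilizing; a flat or falling trend is the signal
    # to investigate, not the raw number itself.
    print(f"day {day}: pass rate {pass_rate:.1f}%, defects found {defects}")
```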

About the speaker...
Shaun Bradshaw joined Questcon Technologies in 1997.  As the Director of Quality Solutions, he works with clients in various industries; advising, teaching, and mentoring them on the use of effective testing and test management techniques such as modular test case design, test metrics, the S-Curve, and the Zero Bug Bounce.  Shaun is the co-author and editor of the QuestAssured® Service Methodologies, as well as the primary creator of the methodology training classes offered by Questcon.  He has been a featured speaker at various local and national QA and Testing conferences.  Shaun received his BS in information systems from the University of North Carolina at Greensboro.

Why is Test Planning So Hard?
Elizabeth D'Angelo, PhD, CGI

Track 3:  11:30-12:30

Organizations often view test planning as overhead and, in some cases, an optional activity.  After all, the project is already late and we need to start executing!  The result is more often than not a "vanilla," generic test plan that does not truly communicate what will actually be tested and how that testing will be done.  Elizabeth will present a practical approach to avoiding this pitfall and successfully producing a project-specific test plan.  The presentation will include a framework for facilitating discussion of specific high-risk testing items and incorporating those items into a detailed plan and schedule.  You will also learn how to use metrics during test execution to revisit plan assumptions and how to develop options to avoid schedule overruns.

About the speaker...
Elizabeth D'Angelo has broad experience testing software for complex applications and managing remote testing teams.  She has a proven track record of leading software test teams and software process improvement initiatives.  Elizabeth currently serves as Director of Testing Services at CGI, Inc.  Prior to this role, she held various testing positions in the financial industry and high technology sector.  She has solid experience setting strategic direction for large teams through all phases of testing and extensive experience in automation initiatives.

Keyword Test Automation Framework
Kai Chiu, IBM Canada

Track 4:  11:30 - 12:30 

Implementing a successful test automation framework can result in significant benefits, including improved quality through broader and deeper test coverage, and shorter test cycles, freeing subject matter experts to focus their efforts on more complex testing scenarios.  However, the opportunity to automate testing must be weighed against the costs involved, such as tool and framework setup, script creation, and maintenance.  This presentation highlights the design concepts and implementation techniques used in adopting a keyword test automation framework.  Kai will share his experiences in creating and setting up a keyword test automation framework for a Java-based application involving multiple builds and releases.  He will discuss the obstacles encountered and the solutions used.  You will see a short demo of the framework using Rational Functional Tester as the test engine and will understand the ROI you can realize by using these techniques on your own projects.
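
The general shape of a keyword-driven framework can be sketched independently of any tool.  In this minimal, illustrative Python sketch (the keyword names and actions are invented; the session's actual framework uses Rational Functional Tester as its engine, which is not shown here), keyword names map to action functions, so a test case becomes a table of data rows rather than a script:

```python
# Minimal keyword-driven framework sketch. Each action function
# takes the shared test state plus the keyword's arguments; the
# runner dispatches each row of the test table to its action.

def open_page(state, url):
    state["page"] = url

def type_text(state, field, value):
    state.setdefault("fields", {})[field] = value

def verify_field(state, field, expected):
    actual = state.get("fields", {}).get(field)
    assert actual == expected, f"{field}: expected {expected!r}, got {actual!r}"

# The keyword vocabulary: adding a keyword means adding one function.
KEYWORDS = {"open": open_page, "type": type_text, "verify": verify_field}

def run_test(steps):
    """Execute a test case expressed as (keyword, *args) rows."""
    state = {}
    for keyword, *args in steps:
        KEYWORDS[keyword](state, *args)
    return state

# A test case expressed purely as data, maintainable without coding:
login_test = [
    ("open", "https://example.test/login"),
    ("type", "username", "qa_user"),
    ("verify", "username", "qa_user"),
]
```

The design choice this illustrates is the separation of concerns: automation engineers maintain the keyword library, while testers compose and maintain the tables.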

About the speaker...
Kai Chiu is a tools and automation specialist with IBM Canada.  He has extensive quality assurance and test automation experience in the equities trading and the telecom industries. Kai has successfully spearheaded the deployment of automation tools and frameworks at a number of organizations.  He has presented on test automation to industry peer groups and at industry conferences including the Rational User Conference in 2006.

THURSDAY, September 25 - MORNING

Managing a Software Quality and Testing Group
Steven Rakitin, Software Quality Consulting, Inc.

Track 1:  10:15 - 11:15

Not everyone is cut out for a management role, and managing a Software Quality/Testing Group presents its own unique set of challenges.  Maintaining positive relationships with development teams is an on-going struggle.  Obtaining upper management support and securing resources, equipment, and enough time to do a reasonably good job are critical but difficult to achieve.  Early involvement on project teams is essential but often does not happen.  It is not surprising, then, that both team and individual motivation and morale become overriding challenges for software quality/test group managers.  Drawing on over 30 years of experience in the software quality field, Steve will present his own "lessons learned" on ways to effectively manage, challenge, and motivate a Software Quality/Testing group.

About the speaker...
Steve has over thirty years of experience in the software and quality field.  He has written extensively on software quality and is the author of Software Verification & Validation for Practitioners and Managers.  Steve helped write the first IEEE Software Engineering Standard for Software Quality Assurance Plans and is currently a member of the IEEE Standard 1012 (Software Verification & Validation) Working Group.  He has earned certifications as a Software Quality Engineer (CSQE) and Quality Auditor (CQA).  Steve is a member of the IEEE Computer Society and the ASQ Software Division, and is on the Editorial Review Board for the ASQ journal Software Quality Professional.  Steve is President of Software Quality Consulting, Inc.

Integrated Compliance Platform: An Implementation Experience
Thiyagarajan Ganesh Ganesan, Cognizant Technology Solutions

Track 2:  10:15 - 11:15

Organizations worldwide have realized the increased importance of regulations such as SOX, Basel II, and GLBA.  They have also understood that the cost of ongoing compliance activities has become a necessary element of doing business today.  Thiyagarajan's session is focused on optimizing the management of these compliance activities by developing an integrated compliance platform using the COBIT 4.0 framework.  You will learn how this integrated platform helps create synergies in the assessment of an organization's internal control environment across multiple regulatory requirements.  COBIT, an internationally recognized IT governance framework, not only enables unified control documentation but also facilitates effective IT governance in the organization.  Thiyagarajan will highlight a case study providing details on how this integrated compliance platform enabled one standard for the assessment, monitoring, and reporting of the operational and financial risks of an actual organization.

About the speaker...
Thiyagarajan Ganesh Ganesan, Senior Associate with Cognizant Technology Solutions, has over nine years of experience in software quality assurance, risk management, and software testing.  He has consulted for and trained over twenty-five different organizations in the areas of risk management, SOX & Basel compliance, project management, software quality assurance, software testing, configuration management, and software estimation.  Thiyagarajan's professional designations include Project Management Professional (PMP) and Certified Information Systems Auditor (CISA), and he is a specialist in the Rational Unified Process and ITIL.  Thiyagarajan holds a BE in Mechanical Engineering from Madras University.

Deriving Test Cases from Use Cases
Margaret Harris, Computer Sciences Corporation

Track 3: 10:15 - 11:15

In this object-oriented age, an important requirements and design articulation tool is the use case.  The contents of use cases are a gold mine of information for authoring test cases.  This presentation shows test case authors how to derive tests from use cases.  Join Margaret as she discusses how the primary content of a use case is related to the primary content of a test case. She will also explain how to analyze a sample use case to create a complete set of test cases and how to avoid some pitfalls in test case derivation. Multiple examples drawing on actual use cases and test case derivation approaches will be presented.  You will learn techniques that can be immediately applied on your own application testing team.
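
One common derivation rule, shown here as a hypothetical Python sketch (the use case, flows, and step names are invented for illustration and are not Margaret's examples), is that the main success scenario yields one happy-path test case and each alternate or exception flow yields at least one additional test case:

```python
# Illustrative mapping from a use case to test cases: one test for
# the main success scenario, plus one per alternate/exception flow.

use_case = {
    "name": "Withdraw Cash",
    "main_flow": ["insert card", "enter PIN", "select amount", "dispense cash"],
    "alternate_flows": {
        "invalid PIN": ["insert card", "enter wrong PIN", "show error"],
        "insufficient funds": ["insert card", "enter PIN", "select amount", "decline"],
    },
}

def derive_test_cases(uc):
    """Turn a use case into a list of named test cases with steps."""
    cases = [{"id": f"{uc['name']} - main", "steps": uc["main_flow"]}]
    for flow_name, steps in uc["alternate_flows"].items():
        cases.append({"id": f"{uc['name']} - {flow_name}", "steps": steps})
    return cases

for case in derive_test_cases(use_case):
    print(case["id"], "->", len(case["steps"]), "steps")
```

Real derivation also considers preconditions, postconditions, and data variations within each flow, which this sketch omits.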

About the speaker...
A software test manager with Computer Sciences Corporation (CSC), Margaret Harris has twenty years of professional software engineering and management experience including integration and system testing, quality assurance, project management, requirement analysis, GUI design, and relational database development in client-server, web, and COTS applications.  She has spoken at several conferences including the Practical Software Quality and Testing conference and the Verify 2007 conference.  Margaret is a Certified Software Tester and a Certified Test Manager.  She has been involved in process improvement activities such as Software Engineering Institute Capability Maturity Model (CMM) and Capability Maturity Model Integration (CMMI) evaluations, Total Quality Management (TQM), and Defect Causal Analysis (DCA) activities.

An Integrated Performance Engineering Approach
Subhash Mukherjee, Deloitte, Inc. and Edmond Chan, HP Software Canada

Track 4:  10:15 - 11:15

The majority of today's web applications are not meeting performance requirements despite the fact that the revenue derived from these systems continues to increase significantly.  Most IT organizations are unable to predict performance problems in their web-based applications and are therefore unable to meet the business performance expectations.  Generally, performance testing is conducted late in the application development lifecycle and in a scaled-down environment where the results may be misleading.  To combat these issues, there is a need for an Integrated Performance Engineering Approach that spans the entire development lifecycle, starting performance activities early in the lifecycle and continuing past go-live.  Such an approach requires continual validation against performance acceptance criteria covering all aspects of people, process, and technology.  This session will introduce an Integrated Performance Engineering Approach to software delivery and maintenance that will arm you with answers to the critical questions of managing performance during the lifetime of an application.

About the speaker...
Subhash Mukherjee is a project lead and enterprise application architect at Deloitte.  With over seventeen years of software implementation experience, Subhash's primary expertise is in enterprise and application architecture, performance engineering methodology, diagnosis and optimization of software applications,  and custom-development and test automation, all using diverse technology platforms.  Subhash holds a PMP certification as well as Sun programmer and developer certifications.

Edmond Chan is a Sr. Solutions Architect for HP Software Canada.  He has over fifteen years of experience in IT, working with Fortune 1000 companies and government organizations to improve development life cycles and to help management understand IT from a business perspective.  Since joining Mercury Interactive/HP, Edmond has been focused on helping customers with quality management, performance testing, and application security testing.  Edmond is a certified ScrumMaster and has been ITIL Foundation certified.


Quality Management: Based on Models - Implemented in Reality
Barbara Ainsworth, Process Plus International LLC

Track 1:  1:30 - 2:30

Experience demonstrates that, no matter the end product, the foundation for and focus on implementing process improvement remains consistent.  Using models as the basis for improving processes makes success more likely. However, no single model has all the right answers; rather, a combination of models is needed.  This presentation contains an overview and comparison of popular models including the Software Engineering Institute's SW-CMM and CMMI, the Project Management Institute's Project Management Body of Knowledge (PMBOK), the Quality Assurance Institute's Bodies of Knowledge for Quality Assurance and Testing...and more!  You will also see the influence of both the "old" and "new" masters (Juran, Crosby, Humphrey) and receive the benefits of "gems" collected over the years to help you determine which models are the best fit for your organization's needs.  Examples of starting and reinvigorating process improvement initiatives will also be provided.

About the speaker...
Barbara Ainsworth, PMP, CSQA, CSTE, and ITIL Service Management certified, is a Managing Member and Principal Consultant for Process Plus International LLC.  Barbara serves as a consultant to global IT groups across a broad range of industries to help them meet their organization's objectives by advancing their IT capabilities.  She has held an assortment of IT roles, and her experience includes over ten years of implementing models and utilizing supporting frameworks from the Software Engineering Institute (SEI), Quality Assurance Institute (QAI), Project Management Institute (PMI), International Organization for Standardization (ISO), Malcolm Baldrige National Quality Award (MBNQA), and Six Sigma.  Barbara is certified to perform SEI Interim Profile Assessments and has completed many such assessments for a variety of organizations.

Testable Requirements - Separating Fluff from Substance
Ray Stacey, ACS

Track 2:  1:30 - 2:30

What is it that leads us to consider a requirement one of good quality?  The answer to this question varies by organization and by discipline.  For the quality assurance professional, however, the answer is quite simple.  Can I test it effectively?  In his presentation, Ray will examine the practice of requirements analysis and how to use the Intent, Context, and Evidence approach to determine testability.  You will look at examples of both good and bad requirements in both traditional requirements specification and use case formats.  Ray will also touch briefly on application requirements decomposition.

About the speaker...
Ray Stacey is a QA manager at ACS.  He has been in the quality assurance profession for over ten years and has been a CSTE since 2002.  In his career, he has been responsible for all facets of quality process improvement, including the creation of a department from the ground up.  He regularly presents to business units on the methods of optimizing testing efficiency and has also presented at several QAI Conferences.

Establishing a Quality Assurance Regression Test Bed
Donald Mark Haynes, Synova

Track 3:  1:30-2:30

The purpose of an independent regression test environment is to provide the quality assurance test team with a stable and predictable test arena free from unexpected alteration.  A testing environment operations manual facilitates coordination of operational testing activities and maximizes control over the physical test environment and regression test bed.  An often overlooked aspect of managing quality assurance activities is the test environment itself and the maintenance of test data.  Pre-production test environments have many unique considerations and should be maintained in a similar fashion to production environments.  Typically, however, quality assurance test environments are treated as just another development environment.  If processes and controls are not in place, compliance becomes inconsistent, yet environment-related QA issues are often given low priority.  To maximize reuse potential, test data requires the same maintenance and proper documentation as test cases.  In this presentation you will gain a deeper understanding of the issues involved in establishing, maintaining, and managing an independent quality assurance test environment and a robust regression test bed.

About the speaker...
Donald Mark Haynes is a software engineering specialist with over twenty-two years of experience implementing software development solutions.  Don has in-depth expertise in quality assurance, software development, project management, development methodologies, and software metrics.  He has led development and QA teams on web, client-server, and mainframe platforms and has performed in various project roles in development, infrastructure, process, audit, and training.  Currently a QA Project Leader at Ford (Synova), Don supports Engineering Applications.  He holds a BS in Biology and Data Processing from Northern Michigan University in Marquette, Michigan.

Testing Web Applications for Security Vulnerabilities
Jimmy Xu and Karim Moosa, CGI, Inc.

Track 4: 1:30 - 2:30

As organizations increasingly adopt the Internet as a venue for conducting essential business operations, web application vulnerabilities become a more and more serious issue. This is an issue that not only security professionals, but also software development and testing teams will need to address as a critical element of the software development lifecycle. In this presentation, Jimmy and Karim will discuss the methodology, processes, and techniques that a testing team should adopt to check for vulnerabilities within web based applications. Based on the view that security is vital to software quality, the session will give software quality assurance and testing professionals the necessary background information to participate in web security testing without acquiring sophisticated technical skills or tools.

About the speaker...
Jimmy Xu has been working in the IT industry for the past 16 years with various companies including i2, IBM/DWL, and CGI. His experience encompasses development of enterprise applications for the manufacturing, financial, and telecom industries.  Jimmy's technical expertise includes application security, performance, tuning, and testing.  He also has experience in network infrastructure and security.  Jimmy holds a master's degree in information systems from the University of Arizona. He is a CISSP, CSTE, HP LoadRunner Specialist, Sun Certified Java Programmer, and BEA WebLogic System Administrator.  Jimmy's current job responsibilities and research interests are in application security, performance, and tuning.

Karim Moosa has been working in the IT industry for the past four years. He has led or participated in 20+ enterprise application security, performance, and tuning projects for large organizations in Canada and the U.S. Karim Moosa holds a Bachelor of Applied Science in Computer Engineering from the University of Toronto and a number of certifications including CSTE, Sun Certified Java Programmer, ITIL Foundation, and a Certificate in Preventive Engineering and Social Development. His current job responsibilities and interests include enterprise application security, performance, and tuning.

FRIDAY, September 26

Agile Testing and the Role of an Agile Tester
Declan Whelan, Whelan & Associates

Track 1 - 10:15-11:15

Agile development has crossed the chasm and is now mainstream in many software organizations.  However, most of the agile literature focuses on development practices, so there is confusion about the role of testers.  On agile teams, customers write stories and automated story tests while agile developers write comprehensive unit and integration tests; therefore, much of the testing is done without skilled testers.  Moreover, agile teams deliver working software on a weekly or bi-weekly basis, which puts great time pressure on testing.  This situation provides new challenges and opportunities for testers on agile teams.  In this session, Declan will offer insights into the role of a tester on an agile team and will discuss how agile teams need to break the traditional barriers between development and testing to truly deliver value.

About the speaker...
Declan Whelan is an active software developer and agile coach.  He is a professional engineer with twenty-five years of experience in the software industry supporting many types of businesses including financial, medical, educational, manufacturing, and utilities.  He was co-founder and CTO of Innovasys, a start-up company that developed electronic imaging and workflow products for the financial market.  He successfully guided the company from start-up to profitable venture and eventual sale.  Declan is a certified Scrum Master and a member of the IEEE Computer Society, Agile Alliance, and Scrum Alliance.  Declan's focus is on working in the trenches with teams to deliver better value, quality, and time-to-market through agile principles and practices.

How to Improve Test Processes and Become CMMI Compliant
Richard Bechtold, PhD, Abridge Technology

Track 2 - 10:15-11:15

Join Richard as he explains an overall strategy for improving processes critical to achieving compliance with the Capability Maturity Model Integration (CMMI) v1.2.  This is a goal that is readily achievable by the test organization, even if other groups, such as systems or software engineering, are not yet interested in process improvement or CMMI compliance.  Topics discussed will include managing test-related requirements, planning and managing test projects, managing test-related configuration items, performing objective quality assurance oversight of key testing activities and artifacts, test measurements and metrics, and establishing agreements with suppliers of test-related services or products.  Richard will also present a process improvement implementation lifecycle that is a proven strategy for transitioning the test organization from its current state to an improved state that is fully compliant with all CMMI requirements at and below the targeted maturity level.  This lifecycle focuses on rapid, incremental change and on ensuring successful, effective, and measurable progress toward enhanced efficiency, effectiveness, and total compliance.

About the speaker...
Richard Bechtold is a senior consultant for Abridge Technology, a Virginia-based company he founded in 1996.  Richard provides consulting, training, and support services in the areas of risk management, project management, process improvement, statistical techniques, and organizational change.  He has assisted government and industry with implementing CMM and CMMI methodologies.  Richard is a regular presenter at conferences and has been published over 70 times.  He has taught for George Mason University, the University of Maryland, and the Software Engineering Institute at Carnegie Mellon University.  Richard's latest book is Essentials of Software Project Management, 2nd Edition.

TSP Teams Transform Test
James McHale, Software Engineering Institute

Track 3:  10:15-11:15

As pervasive as software has become in today's world, it is remarkable that software development is the only modern technology that depends upon testing as a primary method of defect removal.  The current expectation in most organizations, large and small, from initial development and major enhancements through lifecycle maintenance, is that finding and fixing defects during testing typically consumes half or more of the allocated software budget.  Why is this so?  What would a different reality look like?  Could that reality be achieved using known methods?  The Team Software Process (TSP) and the Personal Software Process (PSP) offer the potential for a very different reality for test.  This session presents an overview of the TSP and PSP with actual results from classroom and real-world usage, showing how a coherent packaging of known methods used at the individual and team levels can transform testing phases from high-cost defect removal to high-value-added functional verification.

About the speaker...
James McHale is a Senior Member of the Technical Staff at the Software Engineering Institute.  Prior to joining the SEI in 1999, he spent over 20 years in industry as a software engineer, system designer, project leader, and manager working on control systems for diverse applications such as steel mills, power plants, robotics, and transportation.  James is an authorized TSP coach, a SCAMPI Lead Appraiser candidate, and he teaches the PSP and TSP course suite at the SEI as well as Introduction to CMMI.  He has also co-authored several technical reports relating TSP to process improvement models such as CMM and CMMI.

A Practical Approach to Enterprise SOA Application Integration Testing
Dr. Miroslav Kis, KISM Consulting and Mirko Latkovic, Creative Solutions

Track 4:  10:15-11:15

The reality of SOA implementation in an enterprise environment is that it is never simple. Interfaces change as new service providers and consumers join, while the number of interconnections increases exponentially. Adding features or fixing defects can cause an avalanche of new problems.  When we add to that the issues related to interoperability of heterogeneous development and production environments, it is obvious that SOA applications need new testing methods.  Mirko and Miroslav will present an approach to SOA application testing that is based on the System Integration Test Service (SITS) concept.  The objectives of the approach are to drive business use case based testing, minimize dependency between delivery teams, and provide the capability to simulate and test system integration early in the project.  The approach also makes it possible to leverage testing efforts and capture regression scenarios while eliminating retesting for service consumers.  The key elements of this approach (the SITS framework, the simulator, and the Interoperability Health-Check Service) will be described.  Finally, an actual experience from a successful SOA implementation will be discussed.
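
The SITS details belong to the session, but the underlying idea of a service simulator can be sketched simply.  In this hypothetical Python sketch (the operation names, payloads, and responses are invented and do not come from the presenters' framework), consumers integrate against a stand-in that returns canned responses, so integration testing can start before the real provider is ready:

```python
# Hypothetical service simulator in the spirit of simulating system
# integration early: canned request/response pairs stand in for a
# service provider that does not exist yet.

CANNED_RESPONSES = {
    # (operation, payload) -> canned response; entries are invented.
    ("get_quote", "ACME"): {"symbol": "ACME", "price": 101.5},
}

def simulated_service(operation, payload):
    """Return the canned response for a request, or a clear error."""
    try:
        return CANNED_RESPONSES[(operation, payload)]
    except KeyError:
        # An unrecognized request is itself useful test feedback:
        # it flags an interface the teams have not yet agreed on.
        return {"error": f"no canned response for {operation}({payload!r})"}
```

The same canned pairs later double as regression fixtures once the real provider comes online, which is one way a simulator lets testing effort be leveraged rather than thrown away.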

About the speaker...
Mirko Latkovic is a Sun Microsystems Solution and Integration Partner.  Mirko has been working with leading financial institutions based in Canada and across the globe.  Mirko consults with enterprise architecture and project delivery teams on setting up enterprise wide application and technology strategies and delivering mission critical, high availability, solutions.  Mirko holds a BS in computer science.

Miroslav Kis has more than twenty years of experience in high assurance systems architecture, information security, and process improvement both in Canadian chartered banks and global technology companies.  Miroslav is a regular speaker at international and government conferences, presenting on system development, information security, and quality assurance topics.  He holds a PhD in computer science, TOGAF Certification, and is a Certified Information Systems Security Professional.

Quality Engineered Software & Testing (QUEST) Conference - Copyright © 2008