QUEST Chicago 2008
Last updated: 05/05/2009


Industry Practices Sessions




WEDNESDAY, APRIL 22

Management's Role in Achieving Predictable Software Development
Steven Rakitin, Software Quality Consulting, Inc.

Track 1: 11:00 - 12:00

Many software development organizations lack discipline, credibility, and predictability. As a result, these organizations are unable to accurately predict when products will be released. The goal of Predictable Software Development is simply to deliver what was promised, when it was promised, and with the level of quality that customers expect. Becoming more predictable usually means changing the culture. To do this, management must provide leadership, support, and commitment. This talk describes the critical role management must play to help the organization achieve Predictable Software Development. Join Steve and learn how managers can influence the development and testing process in positive ways, including specific areas where management focus can improve process effectiveness and time to market. In short, learn how to lead, not dictate, process improvement efforts.

About the speaker...
Steve Rakitin has over 30 years of experience as a software engineer and software quality manager. He frequently speaks on topics related to software development and software quality at conferences worldwide. He has published several papers on the subject of software quality and has written a book titled Software Verification & Validation for Practitioners and Managers. As President of Software Quality Consulting, Inc., he works with clients who are interested in improving the predictability of their development processes and the quality of their products.


Test Metrics: A Practical Approach to Tracking & Interpretation
Shaun Bradshaw, Questcon Technologies

Track 2: 11:00 - 12:00

Test metrics can be used to track and measure the efficiency, effectiveness, and success or shortcomings of many activities on a software development project. While it is important to recognize the value of gathering test metrics data, it is the interpretation of that data that makes the metrics meaningful. This session is designed to help test analysts and lead testers learn how to establish a test metrics program. Shaun will describe a metrics quick start program and provide examples of metrics that can be tracked during the testing effort. Most importantly, you will discover how to decipher these metrics in a way that makes the information meaningful to the overall quality of the project.
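
To make the idea concrete, here is a minimal sketch (in Python) of the kind of quick-start measures a test metrics program might track, such as execution progress, pass rate, and defect detection percentage. The formulas and field names are illustrative assumptions, not taken from Shaun's material.

    # Illustrative sketch only: simple test-tracking metrics from hypothetical counts.
    def test_metrics(planned, executed, passed, defects_in_test, defects_in_prod):
        execution_progress = executed / planned * 100    # % of planned tests run so far
        pass_rate = passed / executed * 100              # % of executed tests that passed
        # Defect Detection Percentage: share of known defects caught before release
        ddp = defects_in_test / (defects_in_test + defects_in_prod) * 100
        return {"execution_progress_%": round(execution_progress, 1),
                "pass_rate_%": round(pass_rate, 1),
                "defect_detection_%": round(ddp, 1)}

    if __name__ == "__main__":
        print(test_metrics(planned=200, executed=150, passed=135,
                           defects_in_test=45, defects_in_prod=5))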

About the speaker...
As Director of Quality Solutions at Questcon Technologies, Shaun Bradshaw is responsible for managing Questcon's team of Senior Practice Managers in the areas of quality solutions development and service delivery. In his role, Shaun works with clients in various industries, advising, teaching, and mentoring them on the use of effective testing and test management techniques. He is the co-author and editor of Questcon's QuestAssured® suite of service methodologies. Shaun has been a featured speaker at various local and national quality assurance and testing conferences. Shaun received a BS in Information Systems from the University of North Carolina at Greensboro.


How Can a Tester Cope With the Fast Paced Iterative/Incremental Process?
Timothy Korson, PhD, Qualsys Solutions

Track 3: 11:00 - 12:00

About the speaker...
Dr. Timothy Korson has more than a decade of significant experience working on a large variety of systems developed using modern software engineering techniques. This experience includes distributed, real-time, embedded systems as well as business information systems in an n-tier, client-server environment. Tim's typical involvement on a project is as a senior management consultant with additional technical responsibilities to ensure high quality, robust test and quality assurance processes and practices. Tim has authored numerous articles and co-authored the book Object Technology Centers of Excellence. He has given frequent invited lectures at major international conferences and has contributed to the discipline through original research. The lectures and training classes he presents receive uniformly high marks.


Building a Software Testing Strategy
Karen Johnson, Software Test Management, Inc.

Track 4: 11:00 - 12:00

If you need to build a test strategy, this presentation will offer you some practical ideas for getting started. Karen will provide an in-depth look at the elements that should be included in a test strategy. She will also cover techniques for soliciting ideas from other leads and members of your project team. You will learn how to update your strategy through the course of the project and how to discuss and gain input and acceptance throughout your organization. Karen will draw on past experiences to illustrate why building a test strategy isn't about creating a document, but about thinking and planning strategically for a product release. Karen offers ideas for uncovering testing challenges and tackling those challenges both in hands-on project work and by incorporating those insights into a test strategy. Geared toward beginning and experienced test managers alike, this presentation will benefit anyone who needs to build a test strategy.

About the speaker...
Karen Johnson is an independent software test consultant based in Chicago, Illinois. She views software testing as an intellectual challenge and believes in the context-driven school of testing. Karen teaches and consults on a variety of topics in software testing, frequently speaking at software testing conferences. Karen's work has been published in Better Software and Software Test and Performance magazine and on InformIT.com and StickyMinds.com. She is the co-founder of WREST, the Workshop on Regulated Software Testing. For the past two years, Karen has served as an executive board member for the Association for Software Testing (AST).


Testing Services Within a SOA Architecture
Chip Crawford, HP Software

Track 5: 11:00 - 12:00

SOA-based applications present their own set of difficulties to testers, but they also provide exceptional opportunities. In this session, you will learn about the unique quality challenges of SOA-based applications, how an application's web services are tested, and how the HP solution can facilitate the testing and test-management process around web services. This session will provide an overview of HP Software's solution for managing quality and test automation around SOA-based applications. Chip will demonstrate the HP Service Test and Service Test Management products, which are part of the widely used Quality Center suite.

About the Speaker…
Chip Crawford is a Solution Architect for the Application Quality Management products group at Hewlett Packard Software. Formerly with Mercury, Chip works in a pre-sales role with the Quality Center and Performance Center product line, which includes hands-on work with the Service Test and Service Test Management products used to manage and validate SOA application quality. Prior to his tenure at Mercury, Chip spent over 10 years working in the enterprise applications space with Oracle.


THURSDAY, APRIL 23 - MORNING

Avoid Creep: Discover the REAL Requirements
Robin F. Goldsmith, JD, Go Pro Management, Inc.

Track 1: 9:45 - 10:45

There is a simple, although not easy, way to avoid much of the requirements scope creep that many developers assume is normal and unavoidable. "That's not what I expected" and "Users don't know what they want" indeed are repeatedly predictable outcomes of the inadequate way requirements are defined conventionally, but such problems can be avoided. Creep mainly occurs when products, systems, or software requirements fail to meet the REAL business requirements. This usually happens because developers don't understand the important differences between business requirements and system requirements and do not know how to discover them. In this interactive session based on his recent book, Discovering REAL Business Requirements for Software Project Success, Robin will discuss powerful techniques for discovering the REAL requirements and documenting scope in ways that can dramatically reduce creep.

About the speaker...
Robin F. Goldsmith, JD, has been President of Go Pro Management, Inc., a consultancy, since 1982. He works directly with and trains business and systems professionals in requirements, quality and testing, metrics, ROI, software acquisition, and project and process management. Previously he was a developer, systems programmer/DBA/QA, and project leader with the City of Cleveland, leading financial institutions, and a "Big 4" consulting firm. A member of the IEEE Software Test Documentation Std. 829-2008 Revision Committee, and formerly International Vice President of the Association for Systems Management and Executive Editor of the Journal of Systems Management, Robin is the author of the Proactive Testing methodology and the recent Artech House book, Discovering REAL Business Requirements for Software Project Success.


TSP Teams Transform Test
James McHale, Software Engineering Institute

Track 2: 9:45 - 10:45

As pervasive as software has become in today's world, it is remarkable that the development of that software is the only modern technology that depends upon testing as a primary method of defect removal. The current expectation in most organizations, large and small, from initial development and major enhancements through lifecycle maintenance, is that finding and fixing defects during testing typically consumes half or more of allocated software costs. Why is this so? What would a different reality look like? Could that reality be achieved using known methods? There is a potential for a very different reality for test using the Team Software Process (TSP) and the Personal Software Process (PSP). This session presents an overview of the TSP and PSP with actual results from classroom and real-world usage, showing how a coherent packaging of known methods used at the individual and team levels can transform testing phases from high-cost defect removal to high-value-added functional verification.

About the speaker...
James McHale is a Senior Member of the Technical Staff at the Software Engineering Institute.  Prior to joining the SEI in 1999, he spent over 20 years in industry as a software engineer, system designer, project leader, and manager working on control systems for diverse applications such as steel mills, power plants, robotics, and transportation. James is an authorized TSP coach, a SCAMPI Lead Appraiser candidate, and he teaches the PSP and TSP course suite at the SEI as well as Introduction to CMMI. He has also co-authored several technical reports relating TSP to process improvement models such as CMM and CMMI.


Automation and the Magic of Metrics
Bob Crews, Checkpoint Technologies, Inc.

Track 3: 9:45 - 10:45

Metrics, when gathered and analyzed correctly, can provide invaluable information and assist in better decision making. This is especially true in the world of test automation. How do we determine if our current process is the "best" process? Which manual test cases should we automate and which should we leave manual? What strategy should we use in implementing test automation? Are we realizing a return on investment? Has our testing process improved with automation? These are all important questions when comparing processes, tracking progress, and determining ROI. This presentation will provide key metrics that should be tracked to answer these questions and more. A practical worksheet will be provided to you that will assist in the analysis of amazing, magical metrics!
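
One recurring question in the abstract is whether automation is paying back its investment. Below is a minimal, hypothetical sketch of one common way to frame that comparison: cumulative manual cost versus build-plus-maintenance cost of automation, with a break-even point. The cost figures and categories are assumptions for illustration, not values from Bob's worksheet.

    # Illustrative sketch: cumulative manual vs. automated regression cost,
    # used to estimate an automation break-even point. All figures are hypothetical.
    def break_even_run(runs, manual_cost_per_run, build_cost, maint_cost_per_run):
        for run in range(1, runs + 1):
            manual_total = manual_cost_per_run * run
            automated_total = build_cost + maint_cost_per_run * run
            if automated_total <= manual_total:
                return run            # first run where automation is cheaper overall
        return None                   # no break-even within the horizon examined

    if __name__ == "__main__":
        print("Break-even at run:",
              break_even_run(runs=20, manual_cost_per_run=2000,
                             build_cost=15000, maint_cost_per_run=400))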

About the speaker...
Bob Crews, President of Checkpoint Technologies, is a consultant and trainer with over eighteen years of IT experience spanning the full life cycle, including development, requirements management, and software testing. He has consulted and trained for over 160 different organizations in areas such as effectively using automated testing solutions, test planning, implementing automated frameworks, and developing practices which ensure the maximum return on investment with automated solutions. Bob has presented at numerous conferences and user groups throughout the world including QAI, EuroStar (Copenhagen), HP Software Universe, and LatinStar (Mexico City). Bob was named one of the top five speakers at the QAI Annual Software Testing Conference in 2004.


Measuring Test Effectiveness
Pradeep Chennavajhula, QAI's Edista Testing Institute

Track 4: 9:45 - 10:45

Every organization spends a great deal of time, effort, and money on testing. Yet even after spending these resources, organizations grapple with the lack of a deterministic approach for justifying the value of testing. The issue becomes even more pressing in the current uncertain environment. Interestingly, organizations have created many variations of approaches and measures for understanding when to stop testing.

Using the NFR (non-functional requirements) approach published by Chung, the presentation focuses on evolving a deterministic approach to test design. The four main elements of interest to managers making decisions about testing are Rigor, Speed, ROI, and Defects. For each of these elements, the presentation proposes a set of attributes and influencers to be considered and suggests metrics for test effectiveness. Using these metrics, the author suggests specific, measurable, and predictable criteria for making the most important decision: when to stop testing.

About the speaker...
Pradeep Chennavajhula (PC) is CEO of the Edista Testing Institute (ETI), an independent venture of QAI Global Services focused on software testing. His current focus is enabling IT organizations with workforce development strategy and execution in the software testing domain. At ETI, PC leads the design and delivery of products and services for the professional development of testers at all levels. He is well known on the IT conference circuit in India, particularly at events focused on software testing. PC's technical interests include software testing, estimation, and project management; his business interests include talent management, economics, and strategy. His clients include Oracle, Microsoft, Accenture, Infosys, Unisys, Deloitte, Deutsche Bank, Emirates, Logica, and many others.


The Final Quality Gate: Software Release Readiness
Nancy Kastl, Kaslen Group, Inc.

Track 5: 9:45 - 10:45

The implementation of a software release brings a level of risk to the stability of a business. The challenge for Information Technology is to deliver constant application software changes to meet business needs while ensuring minimal disruption of service to internal and external customers. A company faced with frequent or highly visible production problems impacting customers needs improved control over the introduction of software releases. How do we determine if a software release is ready to "go live"? Are QA test results sufficient evidence for the release deployment decision? This presentation will address multi-dimensional criteria for evaluating readiness beyond product quality, including customer, deployment, support, operational, and security readiness. A simple yet effective process for evaluating multi-dimensional readiness will also be provided for the final quality gate within the software release life cycle. This approach integrates with any formal or informal methodology to achieve a better informed and more confident "go live" decision.
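
As a hypothetical illustration of a multi-dimensional readiness check of the kind the abstract describes, the sketch below scores several readiness dimensions and gates the "go live" decision on a minimum score in each. The dimensions echo those listed above; the scores, weights, and threshold are invented for illustration and are not Nancy's process.

    # Hypothetical sketch: gating a release on minimum readiness scores
    # across several dimensions. Scores and threshold are invented.
    READINESS_THRESHOLD = 3   # each dimension scored 1 (not ready) to 5 (fully ready)

    def release_decision(scores):
        weak = {dim: s for dim, s in scores.items() if s < READINESS_THRESHOLD}
        return ("go" if not weak else "no-go", weak)

    if __name__ == "__main__":
        scores = {"product_quality": 4, "customer": 5, "deployment": 3,
                  "support": 2, "operational": 4, "security": 5}
        decision, gaps = release_decision(scores)
        print(decision, gaps)   # no-go, because support readiness is below threshold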

About the speaker...
Nancy Kastl, CSQA, is President of the Kaslen Group, a Chicago-based IT consulting firm. She is an accomplished professional with over twenty-five years of experience as a consultant, manager, facilitator, and instructor. She brings her expertise to management teams in strategic planning, measurement, quality management, software testing, process re-engineering, program management offices, and project management. She has established, staffed, and managed independent testing teams. Nancy is a former VP of Quality Assurance at Harris Bank and is the founder and President of the Chicago Quality Assurance Association. Nancy is a frequent conference speaker and is currently the QUEST conference chairperson. She has served on Purdue University's IT Advisory Committee and the QAI Certification Board.


THURSDAY, APRIL 23 - AFTERNOON

The Power of Inspections and Peer Reviews
Rebecca Staton-Reinstein, PhD, Advantage Leadership, Inc.

Track 1: 1:00 - 2:00

Most organizations rely on traditional testing to determine whether software works or has bugs.  Yet, for over 30 years, the formal Inspection process developed by Michael Fagan has proven to be more effective and efficient at finding defects.  If you use inspections early in the life cycle when defects are easier and less costly to find and fix, you generate major savings in time and money.  Learn how to unleash the power of Inspections (Static Testing) while implementing a technique that has proven strategic value for finding defects and improving processes.  Use Inspections as your peer review procedure to improve maturity. Show your management significant bottom-line value and ROI.  Learn from case studies demonstrating millions of dollars in savings in companies like yours.  Get the information you need to make a persuasive case to management.

About the speaker...
As President of Advantage Leadership, Inc., Rebecca Staton-Reinstein, Ph.D., CSQA, works with companies to improve the quality and productivity of software-related efforts. She helps IT organizations assess the current situation and create strategic plans to engineer successful processes, establish business-oriented measurement, and improve bottom-line results. She works with both technical and managerial staff to discover hidden costs and demonstrate ROI. Rebecca has successfully established three QA organizations, has an international client base, and is the author of books on improving software quality and strategic planning, including Get Great Requirements, The Hard Job of Making Software Work: Building the QA Function Step-by-Step, Success Planning: A 'How-To' Guide for Strategic Planning, and Conventional Wisdom: How Today's Leaders Plan, Perform, and Progress Like the Founding Fathers.


Getting to Consensus Quickly
Sandra Lamartine, James Hardie Building Products

Track 2: 1:00 - 2:00

How many times have you had project delays because new ideas emerged late or the key stakeholders couldn't reach consensus? This session will help you learn different techniques you can employ when bringing together people with different agendas. Sandra will discuss establishing team charters and delineating roles early in a project. You will also review meeting management, including building agreements, voting and ranking techniques, managing strategic moments, and capitalizing on energy and creativity. Finally, we will discuss steps you can take to influence those who seem reluctant to take on the group goal or charter.

About the speaker...
Sandra Lamartine is currently Organizational Development Manager at James Hardie Building Products. Sandra has over 13 years of experience in organizational development, change management, performance management, employee and customer satisfaction, and cross-cultural and company culture studies. Prior to James Hardie, Sandra worked as an internal OD manager and consultant with Tellabs, Arthur Andersen, Hughes Aircraft, and the Southern California Gas Company. She holds a master's degree in Industrial and Organizational Psychology from California State University and a bachelor's degree in Psychology from McGill University in Canada. She is the organizer of the Western Suburbs Organization Development Network, a group of internal OD consultants who meet monthly to discuss topics of interest.


A Quick Automation Guide for Testing Managers
Robby Green, Infosys

Track 3: 1:00 - 2:00

Almost all test managers face the dilemma, "Should I or shouldn't I go for automation?" There are many pros and cons, and the answer does not come easily. Several factors need to be considered before making a decision. In his presentation, Robby will discuss how to make this choice. What set of information should be taken into account? What are the risks in automation? How should automation be presented to senior management? Join Robby as he presents both automation projects that were successful and some that failed, and discloses the reasons behind these outcomes. An overview of the categories of automation tools and cost estimates of automation projects will be given. Finally, the soft side of an automation project will be covered: how to gain the cooperation of the existing test team and overcome their fear of becoming redundant.

About the speaker...
Robby Green has over 15 years of experience in IT management, most of it in the testing arena. During his career Robby has managed multimillion-dollar testing programs in a diverse range of industries including telecom, insurance, banking, retail, and media. Robby has developed testing methodologies and has taken part in performance testing for complex multi-component systems and in academic research based on queuing theory. Robby specializes in test management and planning, automation, and performance. Currently, Robby is a Business Manager in the testing practice at Infosys Technology Ltd.


Assuring Certainty through Effective Regression Testing
Vishvesh Arumugam, TATA Consultancy Services

Track 4: 1:00 - 2:00

Effective regression testing is a mainstream IT initiative that ranks among the top priorities of management. This presentation is based on Vishvesh's experience testing various web-based applications for a global market leader in automotive systems and facility management & controls. During his discussion, Vishvesh will outline the regression testing approach adopted to optimize the business outcome. There will be a focus on best practices followed during the test cycle, highlighting some of the unique practices that proved to be the most successful. The importance of a repeatable process for reducing testing cycle time and improving overall quality will also be stressed. Finally, a methodology that embraces Six Sigma principles, reduces project risk, and streamlines test delivery will be covered.

About the speaker...
Vishvesh Arumugam holds a master's degree in computer applications, is a Certified Software Tester, and currently works as a Quality Analyst for TATA Consultancy Services. He has been part of the software testing industry for the past seven years. As a quality analyst responsible for managing the Software Release and Testing Team, Vishvesh has experience in test architecture, planning, effort estimation, and reporting, as well as the management of system and regression testing for the software release as a whole.


Performance Testing Best Practices
Lee Barnes, Utopia Solutions

Track 5: 1:00 - 2:00

In his presentation, Lee will focus on general best practices and ideas for those looking to implement performance testing in their organization. High-level topics covered will include resource requirements, planning, scripting, data management, execution, and analysis. Lee will particularly discuss the importance of planning in performance testing. You will come away with a performance testing roadmap that you can use in your own organization to avoid common performance testing pitfalls, enabling you to focus your efforts on the specific performance details of the systems you are testing.

About the speaker...
Lee Barnes has over 16 years of experience in the software quality assurance and testing field. He has successfully implemented test automation and performance testing solutions in hundreds of environments across a wide array of industries. He is a recognized thought leader in his field and speaks regularly on related topics. As founder and CTO of Utopia Solutions, Lee is responsible for the firm's delivery of software quality solutions, including process improvement, performance management, and test automation.


FRIDAY, APRIL 24

360° Project Lifecycle Health Assessments
Anthony Mattucci, Milano, Inc.

Track 1: 9:45 - 10:45

Now, more than ever, businesses cannot afford costly surprises, flawed information, and limited visibility into project performance. Proactive, preventive, "early warning" processes must be put in place to help diagnose and manage project health and reduce the cost of poor quality. This presentation will address how the rigorous application of project assessment and independent verification and validation (IVV) processes, tools, and techniques throughout the project lifecycle can increase an organization's ability to detect, measure, and manage risk more effectively. This reduces the likelihood of budget overruns, late deliveries, sponsor dissatisfaction, and poor work product quality. Leveraging best-practice knowledge to customize scorecards and conduct health assessments that combine subjective intelligence with statistical data helps solve immediate problems and prevents their root causes from degrading future performance.

About the Speaker…
Anthony Mattucci, founder and Chairman of the Board of Milano, Inc., is a former Chief Quality Officer and VP of Project Management with a 25-year track record of accomplishment in project and process performance improvement. He is the author of the Calibra® Performance System, an enterprise project performance improvement process and toolset. Anthony has trained hundreds of senior IT professionals in project assessment, coaching, mentoring, and turnaround and recovery. He is a certified Calibra® Master Practitioner.


Using CMMI for Services for IT Excellence
Pradeep Chennavajhula, QAI's Edista Testing Institute

Track 2: 9:45 - 10:45

About the Speaker…
Pradeep Chennavajhula (PC) is CEO of the Edista Testing Institute (ETI), an independent venture of QAI Global Services focused on software testing. His current focus is enabling IT organizations with workforce development strategy and execution in the software testing domain. At ETI, PC leads the design and delivery of products and services for the professional development of testers at all levels. He is well known on the IT conference circuit in India, particularly at events focused on software testing. PC's technical interests include software testing, estimation, and project management; his business interests include talent management, economics, and strategy. His clients include Oracle, Microsoft, Accenture, Infosys, Unisys, Deloitte, Deutsche Bank, Emirates, Logica, and many others.


The Importance of Securing Test Data
John Miner, Original Software

Track 3: 9:45 - 10:45

A widespread and crucial issue facing organizations during the testing of mission-critical applications is how to handle test data in a secure way. Testing directly on live production systems is, of course, out of the question; therefore, separate and isolated testing environments are created. Because of recent regulations like PCI and Sarbanes-Oxley, providing copies of production data for testing purposes is no longer an option unless the data is de-identified. In his presentation, John will highlight the dangers and pitfalls of testing insecurely and will suggest approaches to addressing this critical issue.
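
As a rough illustration of the de-identification the abstract mentions, the sketch below masks personally identifiable fields before production records are copied into a test environment. The field names and masking rules are hypothetical assumptions, not a description of Original Software's product.

    # Illustrative sketch: de-identifying production records before they are
    # loaded into a test environment. Field names and rules are hypothetical.
    import hashlib

    def mask_record(record):
        masked = dict(record)
        # Replace the name with a stable pseudonym from a one-way hash, so
        # relationships between records survive without exposing identity.
        masked["name"] = "user_" + hashlib.sha256(record["name"].encode()).hexdigest()[:8]
        # Keep only the last four digits of the card number.
        masked["card_number"] = "*" * 12 + record["card_number"][-4:]
        return masked

    if __name__ == "__main__":
        prod_row = {"name": "Jane Example", "card_number": "4111111111111111", "balance": 120.50}
        print(mask_record(prod_row))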

About the speaker...
With a career spanning more than 24 years, John Miner has been helping software application vendors like Compuware and, most recently, Original Software bring their products to market in roles including Pre-sales Manager, Product-line Sales Director, Product Launch Director, Partner Business Development Manager, and Operations Director. John began his career as a systems control engineer in the chemical industry. Following this, he participated in the design, development, and delivery of large mission critical applications for companies like DuPont and i2 Technologies. John majored in Computer and Electrical Engineering at Purdue University.


Port 80 Is Wide Open: Scanning for Application-Layer Vulnerabilities
Joshua Burton, PhD, IBM

Track 4: 9:45 - 10:45

If your web application or service is vulnerable to corrupt or unsanitized user data, no conventional firewall will protect you. Learn how hackers can exploit vulnerabilities in your web application and how you can fight back. IT security testing has focused overwhelmingly on events at the hardware, network, and session levels, but most actual attacks now occur at the application level. This session will provide a quick tour of cross-site scripting, SQL injection, server-side injection, and other attacks that can sail right through the window your network administrator explicitly leaves open for your users. Joshua will offer advice on the current state of the malware ecosystem and will describe testing methods to detect vulnerabilities before your software is deployed.
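
To make one of the named attack classes concrete, here is a small generic illustration of SQL injection and the standard parameterized-query defense, using Python's built-in sqlite3 module. It is a teaching example, not material from the session or from IBM Rational's scanning tools.

    # Generic illustration of SQL injection and its standard mitigation,
    # using Python's built-in sqlite3 module.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
    conn.executemany("INSERT INTO users VALUES (?, ?)", [("alice", 1), ("bob", 0)])

    malicious_input = "nobody' OR '1'='1"

    # Vulnerable: user input concatenated directly into the SQL string.
    unsafe_query = "SELECT name FROM users WHERE name = '%s'" % malicious_input
    print("unsafe:", conn.execute(unsafe_query).fetchall())   # returns every row

    # Safer: a parameterized query treats the input strictly as data.
    safe_query = "SELECT name FROM users WHERE name = ?"
    print("safe:", conn.execute(safe_query, (malicious_input,)).fetchall())  # returns no rows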

About the speaker...
Dr. Joshua Burton supports IBM Rational customers in all aspects of the testing role. During his ten years in the software industry, he has worked on every side of the quality assurance problem. He has published technical articles in Dr. Dobb's Journal and Software Developer & Publisher, and has implemented memory management strategies for sustained uptime in critical telecom, aerospace, and financial applications on a variety of platforms. Before 1997, Dr. Burton worked for a decade as a theoretical physicist and still holds an adjunct appointment at Northwestern University as a visiting scientist.




Quality Engineered Software & Testing (QUEST) Conference - Copyright © 2009
www.qaiquest.org