David Dang, Zenergy

One-Day Class: Leveraging Selenium Open Source for Test Automation

Monday, April 15, 2013: 8:30 AM - 4:30 PM

Instructor: David Dang, Zenergy

Abstract

This one-day course focuses on the major aspects that QA managers and test automation engineers must consider before adopting Selenium open source for test automation. A combination of lecture, classroom discussion, and experiential techniques provides attendees with a thorough understanding of the strategy and effort necessary to implement Selenium. This includes the ideal uses of Selenium, the level of automation supported by Selenium, the analysis needed to define a test automation framework, the effort and timeline to implement Selenium, and its integration with other tools.

Course Objectives

The primary objective of this class is to instruct test automation architects, senior automation engineers, and QA/test managers on the strategy needed to implement Selenium.

Attendees will learn:

  • The ideal uses of Selenium for automation
  • The components of Selenium
  • The level of automation supported by Selenium
  • The factors that impact the design of a test automation framework
  • The effort and timeline to implement Selenium
  • The integration with other tools

Course Outline

  • Ideal uses of Selenium for automation
    Describes the types of situations where Selenium is the best fit for test automation. This includes application technologies, supported environments, costs, and resource skill sets.
  • Components of Selenium
    Describes the functionality of, and the differences between, Selenium IDE, Selenium RC, the Selenium Client API, Selenium WebDriver, and Selenium Grid (a brief WebDriver sketch follows this outline).
  • Level of automation supported by Selenium
    Explains how to use Selenium for functional, system, and regression testing. This also includes the impact of development methodologies such as Agile and Waterfall.
  • Design of a test automation framework
    Covers the factors that impact the design of the test automation framework, such as short-term and long-term goals for test automation, the end users of test automation, execution frequency, and the size of the automation suite.
  • Effort and timeline needed to implement Selenium
    Provides examples of the effort and timeline needed to implement Selenium over two projects. This includes resource needs, environments, and duration.
  • Integration with other tools
    Details how Selenium works with continuous integration and behavior-driven development tools.
  • Open Discussion
    Participants will engage in discussion on real-world scenarios and experiences in using Selenium.
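
A minimal sketch of the WebDriver component referenced in the outline above, written against Selenium's Java bindings. The page URL and element locator are hypothetical, and a locally installed Firefox is assumed; in practice a test like this would run from JUnit or TestNG under a continuous integration server, with Selenium Grid distributing the browser sessions.

    // Minimal WebDriver sketch (assumes the Selenium Java bindings on the classpath
    // and a local Firefox installation; URL and locator are illustrative only).
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class SearchSmokeTest {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();  // a RemoteWebDriver pointed at a Grid hub could be swapped in
            try {
                driver.get("http://www.example.com/");                // hypothetical application under test
                WebElement query = driver.findElement(By.name("q"));  // hypothetical search field
                query.sendKeys("selenium webdriver");
                query.submit();
                // A framework would normally route this check through an assertion library and a reporter.
                if (!driver.getTitle().toLowerCase().contains("selenium")) {
                    throw new AssertionError("Unexpected page title: " + driver.getTitle());
                }
            } finally {
                driver.quit();                                        // always release the browser session
            }
        }
    }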

Mike Lawler, BCBS Association

Half-Day Tutorial: Performance Testing: Engineering for Success

Tuesday, April 16, 2013: 8:30 AM - 12:00 PM

Instructor: Mike Lawler, BCBS Association

Applications are built from a variety of infrastructure hardware and software components, and each component can affect the performance of the overall system. Performance testing is the one method in the software quality portfolio that enables us to identify the effect of each component. Discover what is required to successfully integrate performance testing into your organization. Successful performance testing of your critical applications will help you identify and correct system bottlenecks before going to production. It will also lead to less production downtime and the opportunity to identify defects that are impossible to detect with functional testing alone. Join Mike to discuss the use of technology to support performance testing. You will explore proven implementation approaches for performance testing processes and test automation tools for a variety of technology platforms, including web, data warehouse, and others.

  • Learn the concepts and use of technology to support performance testing
  • Understand how to analyze and derive performance requirements
  • Investigate proven implementation approaches for performance testing
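
As a rough illustration of the kind of measurement the tutorial discusses, the sketch below fires a fixed number of concurrent HTTP requests and reports median and 95th-percentile response times. The URL, thread count, and request count are hypothetical, and a real engagement would use a dedicated load-testing tool rather than hand-rolled code.

    // Minimal latency-sampling sketch (illustrative only; not a substitute for a load-testing tool).
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class LatencyProbe {
        public static void main(String[] args) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(10);               // 10 concurrent "virtual users"
            Callable<Long> request = () -> timeRequest("http://www.example.com/"); // hypothetical target
            List<Future<Long>> futures = new ArrayList<>();
            for (int i = 0; i < 200; i++) {
                futures.add(pool.submit(request));
            }
            List<Long> millis = new ArrayList<>();
            for (Future<Long> f : futures) {
                millis.add(f.get());
            }
            pool.shutdown();
            Collections.sort(millis);
            System.out.printf("median = %d ms, 95th percentile = %d ms%n",
                    millis.get(millis.size() / 2), millis.get((int) (millis.size() * 0.95)));
        }

        static long timeRequest(String url) throws Exception {
            long start = System.nanoTime();
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.getResponseCode();                         // force the full request/response round trip
            conn.disconnect();
            return (System.nanoTime() - start) / 1_000_000; // elapsed milliseconds
        }
    }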

Daniel Miessler, HP

Half-Day Tutorial: Security Testing for QA Professionals: Enabling the Versatile Tester

Tuesday, April 16, 2013: 1:00 PM - 4:30 PM

Instructor: Daniel Miessler, HP

Functional testers and QA professionals are much more readily equipped to perform software security testing than you may think. The difference between confirming known functionality and discovering unknown, unintended functionality in a piece of software may seem like night and day – but in reality it’s simply a matter of tools, mindset, and enablement. This tutorial will teach you these key components and couple them with your deep understanding of the application’s design, use cases, and test data to help you become far more effective in the enterprise’s software security testing strategy. Join Daniel to learn why we do security testing, what characterizes security testing versus other types of testing, and the difference between static and dynamic testing. Understand the testing toolkit and see real-world examples of its use.

  • Understand how the security testing perspective differs from other testing types
  • Explore the methodology used in performing black-box testing
  • Learn common tools used by security testers
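
One small, hedged example of the tools-and-mindset shift: rather than confirming that a feature works, a tester can probe what the application exposes. The sketch below (the target URL is hypothetical) inspects a response for a few security-relevant headers, the sort of quick dynamic observation a QA tester can fold into existing checks.

    // Minimal dynamic-inspection sketch (illustrative only; the URL is hypothetical).
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class SecurityHeaderCheck {
        public static void main(String[] args) throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL("https://www.example.com/").openConnection();
            conn.connect();
            String[] headers = {"X-Frame-Options", "Strict-Transport-Security", "Content-Security-Policy"};
            for (String header : headers) {
                String value = conn.getHeaderField(header);
                // A missing header is not automatically a defect, but it is a finding worth investigating.
                System.out.println(header + ": " + (value == null ? "MISSING" : value));
            }
            conn.disconnect();
        }
    }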

Michael Portwood

Presentation: The Importance of Code Coverage in the Internet Age

Wednesday, April 17, 2013: 9:45 AM - 10:45 AM

Speaker: Michael Portwood

With the proliferation of mobile devices, cloud computing, and client-side scripting coupled with web services, how do you guarantee code coverage? Many of these components can easily go uncovered, leading to defects and disappointing user experiences. Michael will discuss the importance of unit test coverage and show techniques, tips, and tricks that simplify the process of guaranteeing complete coverage for Internet-enabled solutions. He will highlight subtle but common unit testing issues that allow defects to slip into the field, and he will illustrate specific quick-start and real-world rollout strategies that help you identify, isolate, and then remove latent uncovered code before your customers tell you about it. Unit testing is an important part of a comprehensive quality program. Join Michael to improve this phase of your testing program today.
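
As a hedged illustration of the subtle gaps described here, the JUnit 4 sketch below (the class and its logic are hypothetical) exercises only the common path of a small method; a coverage tool such as JaCoCo or Cobertura would flag the untested branch before it ships.

    // Hypothetical example: the happy-path test leaves the coupon-cap branch uncovered,
    // which a branch-coverage report would reveal. Assumes JUnit 4 on the classpath.
    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class DiscountTest {

        // Production code under test (hypothetical).
        static double discountedPrice(double price, int couponPercent) {
            if (couponPercent > 50) {      // cap oversized coupons; never exercised by the test below
                couponPercent = 50;
            }
            return price * (100 - couponPercent) / 100.0;
        }

        @Test
        public void appliesCouponToPrice() {
            assertEquals(90.0, discountedPrice(100.0, 10), 0.001);  // covers only the common case
        }

        // Adding a test with couponPercent = 80 would close the gap before a customer finds it.
    }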


Jim Holmes, Telerik

Presentation: That Sounds Great in Practice, But…

Wednesday, April 17, 2013: 11:00 AM - 12:00 PM

Speaker: Jim Holmes, Telerik

This talk doesn’t even pretend to give you simplistic answers on how to effect change around quality in your organization; however, you will learn practical tips on how to start making that change happen. We’ll discuss forming and refining a clear vision, getting stakeholders on board, and dealing with forces resisting the changes. You’ll also learn critical concepts like clarifying your idea, speaking the right language, creating a good pitch, and figuring out who owns the money you’ll need for your idea. Some of the real world examples you’ll hear include working to set expectations around improvements in your organization’s approach to quality, getting appropriate hardware and software for testing environments, dealing with offshore/outsourced testing teams, and creating an organization-wide culture that cares about quality. You’ll leave this session with ideas on avoiding pitfalls based on places where Jim’s fallen short, and also approaches to try based on his successes.


Christin Wiedemann, PhD, PQA

Workshop: Critical Thinking in Software Testing: Leveraging the Scientific Method

Wednesday, April 17, 2013: 1:00 PM - 2:30 PM

Workshop Leader: Christin Wiedemann, PhD, PQA

Strong parallels between a software tester and a scientific researcher are evident; they both employ their intelligence, imagination, and creativity to gain empirical information about the property or system being investigated. Science is credible, curiosity-driven, critical, impartial and dynamic – can’t we strive for our testing to be the same? Science continuously challenges and questions methods, techniques and core beliefs – how can you question and improve your own approach to testing in this way? Christin will introduce the core steps of the scientific method, describe how good software testing adheres to those principles, and then explore how truly understanding and embracing the scientific method can make us better at questioning our assumptions, staying impartial and being more credible testers. In smaller groups you will get to practice your critical thinking on a set of examples. You will also learn to recognize bias, and experience the importance of questioning assumptions hands-on.

  • Learn how to use the scientific method in software testing
  • Understand the difference between induction and deduction, and the risks associated with using these reasoning processes
  • Practice questioning assumptions and thinking critically

Clyneice Chaney, Quality Squared

Presentation: Verification and Validation Methods: Should You Be Using Them Today?

Wednesday, April 17, 2013: 3:00 PM - 4:00 PM

Speaker: Clyneice Chaney, Quality Squared

Verification and validation are often confusing, misunderstood, misused, and seldom applied appropriately. In this presentation, Clyneice will discuss verification and validation in light of today’s market needs and development and testing approaches. Can verification and validation be used to enhance quality? How and when should you consider them? What about verification and validation automation tools and techniques? These questions and more will be answered in this timely session.


Peter Varhol, Seapine Software

Presentation: The Battle of the Bug: Working Together in the Pursuit of Quality

Thursday, April 18, 2013: 9:45 AM - 10:45 AM

Speaker: Peter Varhol, Seapine Software

Software defects become more costly to address as a software project moves from design to development to test and deployment. Because of difficulties in characterizing, reproducing, and prioritizing defects, struggles over defect management often consume valuable project time and can slow the release of new features. Project participants have different motivations in the defect process, and these differences cause much of the conflict. This presentation describes the conflicts inherent in defect management and provides a framework for defect processes across a variety of projects. It looks at how clear information on requirements, user stories, defect identification, characterization, and status helps teams avoid disagreements, resulting in higher quality software delivered more quickly. Attend Peter’s presentation and learn how a well-defined defect management process and equal availability of information can help all parties come to satisfying and productive agreements.


TBA

Presentation: Testing Collaboration Sites and Content Management Solutions

Thursday, April 18, 2013: 11:00 AM - 12:00 PM

Speakers: TBA

As companies rely more on their collaboration sites and enterprise content management solutions for line-of-business support, it is becoming increasingly important to test these solutions. Many of them support internal, external, and mobile access by employees, customers, and partners through intranet, extranet, and internet sites. Although the robust out-of-the-box features of these solutions may not need to be tested, the company-specific configuration and customizations should be verified. Like applications, these collaboration and content management solutions are subject to changes, including new versions, upgrades, and patches, that can impact the functionality of the business applications built on them. In this session, you will learn how to approach testing collaboration and content management solutions by identifying core functionality, building and organizing a reusable test repository, and aligning testing scope to risk mitigation. SharePoint will be used as an example of a collaboration and content management solution. Learn about this new opportunity for QA to provide added value to your company.


Peggy Schretter, Trustmark Insurance Company

Workshop: Integrating Exploratory Testing with Traditional Testing Methods

Thursday, April 18, 2013: 1:00 PM - 2:30 PM

Speaker: Peggy Schretter, Trustmark Insurance Company

Exploratory software testing is a powerful approach that leverages the tester’s intuitive knowledge of the business or product being developed. It provides guidance to the journey of uncovering defects and begins to answer the questions of “what if.” Exploratory testing is not just ad hoc testing, though: it is a scientific process and requires a skilled tester who understands the process and the expectations for the desired result. Exploratory testing carries out learning, test design, and test execution in parallel. In other words, exploratory testing allows the tester to shape the design of the tests as those tests are performed and to use the information gained to validate new and improved test scenarios. This workshop will focus on the benefits of exploratory testing, explain the situational activities, and provide hands-on practice with the concepts. It will provide insights into how to manage exploratory testing, where it is best suited, the productivity gains, and how to put exploratory testing into action.

  • Learn to manage the expectations of exploratory testing
  • Explore the situational practice of exploratory testing
  • Understand how to leverage exploratory testing in the real world

Linda Westfall, The Westfall Team

Presentation: Risk-Based Configuration Control: Balancing Flexibility with Stability

Thursday, April 18, 2013: 3:00 PM - 4:00 PM

Speaker: Linda Westfall, The Westfall Team

There is a dichotomy in software configuration management. On one side, individual developers and testers need the flexibility necessary to do creative work, to modify code and tests, to try what-if scenarios, to make mistakes and learn from them to evolve better software solutions. On the other side, teams need stability to allow code and tests to be shared, to create builds and perform testing in a consistent environment, and to ship high-quality products with confidence. This requires an intricate balance. Too much flexibility can result in problems. On the other hand, enforcing too much stability can result in costly bureaucratic overhead, delays in delivery, and may even require developers and testers to ignore the process in order to get their work done. This presentation explores risk-based software configuration control and techniques that can be used to help maintain the balance between flexibility and stability as software moves through the life cycle.


TBA

Practice Exam: CSQA/CSTE Practice Certification Exam

Friday, April 19, 2013: 9:45 AM - 12:30 PM

Proctor: TBA

Gain first-hand experience to aid in your preparation for the Certified Software Tester (CSTE) or the Certified Software Quality Analyst (CSQA) exam. Please arrive promptly at the beginning of this session to participate.