Industry Insights: Friday


Unit Testing: Getting Started with Test Driven Development
Stephen Vance, uTest
Friday, May 4
9:45 AM – 10:45 AM
We are often told, especially within the agile community, that Test Driven Development (TDD) is a great way to “build the quality in.” The benefits sound incontrovertible, but how do you get started? If you have ever tried TDD only to wonder where or how to start, or if you have used TDD and feel it just isn’t “clicking” and you are not realizing the benefits you expected, you will want to attend this session. Steve will begin with a spec for a simple but realistic software development project and walk through it in a test-driven way, with the goal of helping you understand how to successfully start and execute a project with TDD. Steve will cover where to start, how to proceed, and how to look at code like a devtester. Examples will be in Java with JUnit but are easily translated to other languages.
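As a taste of the red-green-refactor loop the session walks through, here is a minimal JUnit sketch; the ShoppingCart class and its behavior are hypothetical, not taken from Steve’s example project:

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;

// Red: write a small failing test that pins down one behavior.
public class ShoppingCartTest {

    @Test
    public void newCartHasZeroTotal() {
        ShoppingCart cart = new ShoppingCart();
        assertEquals(0, cart.totalCents());
    }

    @Test
    public void totalIsSumOfItemPrices() {
        ShoppingCart cart = new ShoppingCart();
        cart.addItem("apple", 150);
        cart.addItem("bread", 275);
        assertEquals(425, cart.totalCents());
    }
}

// Green: write just enough production code to make the tests pass,
// then refactor with the tests as a safety net.
class ShoppingCart {
    private int totalCents = 0;

    void addItem(String name, int priceCents) {
        totalCents += priceCents;
    }

    int totalCents() {
        return totalCents;
    }
}
```

The point of the discipline is that the failing test comes first, and the production class grows only enough to make it pass.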

Regression Testing: How to Optimize Your Existing Regression Testing
Arthur Hicken, Parasoft
Friday, May 4
9:45 AM – 10:45 AM
This presentation introduces tips for ensuring that your regression testing process and system identify regression issues as soon as possible without bombarding your team with false positives. Arthur will cover how to configure the regression system so that it provides the optimal level of results, how to best integrate regression testing into various build processes and development infrastructures, and, finally, how to establish a minimally intrusive workflow that keeps the regression test suite in sync with the evolving application. Join Arthur and learn how to do your very best work during the regression testing phase.
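One common way to integrate a regression suite into the build process, in the spirit of the session, is to tier it so that a fast subset runs on every commit and the full suite runs nightly. A sketch using JUnit 4 categories; the class names and checkout logic are illustrative, not from the talk:

```java
import static org.junit.Assert.assertTrue;

import org.junit.Test;
import org.junit.experimental.categories.Categories;
import org.junit.experimental.categories.Categories.IncludeCategory;
import org.junit.experimental.categories.Category;
import org.junit.runner.RunWith;
import org.junit.runners.Suite.SuiteClasses;

// Marker interfaces used as category labels.
interface FastRegression {}
interface FullRegression {}

public class CheckoutRegressionTest {

    @Test
    @Category(FastRegression.class)
    public void checkoutSucceedsForASingleItem() {
        assertTrue(checkout(1));
    }

    @Test
    @Category(FullRegression.class)
    public void checkoutSucceedsForALargeOrder() {
        assertTrue(checkout(500)); // the slower, end-to-end path
    }

    // Stand-in for driving the real system under test.
    private boolean checkout(int itemCount) {
        return itemCount > 0;
    }
}

// Run only the fast tier on every commit; a nightly job runs everything.
@RunWith(Categories.class)
@IncludeCategory(FastRegression.class)
@SuiteClasses(CheckoutRegressionTest.class)
class PerCommitRegressionSuite {}
```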

User Acceptance Testing: The UAT Chess Game: Playing Your Pieces to Win
Glenn Stout, PhD
Friday, May 4
9:45 AM – 10:45 AM
User acceptance testing is a challenge for many organizations. What is it? What are its goals, scope, and structure? Is the organization mature enough to handle a “true” UAT? Testing organizations must answer these questions to ultimately deliver a validated product. This session provides the audience with a few best practices, “chess moves” if you will, for working with the user community and maximizing the value of user acceptance testing so that everyone wins! Glenn will focus on people, process, and technology, using the chess game as a backdrop to show which moves to make. You will take away answers to the questions above, and you will learn how to initially engage the UAT team, how to work with them, ways to motivate and coach them, and, most importantly, how to give guidance on determining what constitutes success.

Performance: Comprehensive Performance Monitoring of Large Enterprise Applications
Vinkesh Mehta and Vic Soder, Deloitte Consulting LLP
Friday, May 4
9:45 AM – 10:45 AM
A performance test is only as good as its monitoring, and the right performance monitoring setup can be complex: it takes time, money, and skills. For a very large enterprise application, test and production operations may rely on a collection of more than ten different monitoring tools. In this session, Vinkesh will share an approach for developing requirements, designing the architecture, and implementing a performance monitoring system for large applications. He will explain the pros and cons of build vs. buy and of single-vendor vs. best-of-breed solutions for performance monitoring. You will learn how to use the monitoring system to identify and analyze performance issues. Participate in this session and discover the skills required to build, maintain, and use performance monitoring systems.
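At its core, each of those tools collects samples and aggregates them into actionable numbers. A toy Java sketch of that idea; the class and the latency figures are invented for illustration, not taken from the speakers’ system:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

// Collect per-request latency samples and summarize them as percentiles,
// the kind of raw aggregation a monitoring system performs at scale.
public class LatencyMonitor {
    private final ConcurrentLinkedQueue<Long> samplesMillis = new ConcurrentLinkedQueue<>();

    public void record(long latencyMillis) {
        samplesMillis.add(latencyMillis);
    }

    public long percentile(double p) {
        List<Long> sorted = new ArrayList<>(samplesMillis);
        Collections.sort(sorted);
        int index = (int) Math.ceil(p / 100.0 * sorted.size()) - 1;
        return sorted.get(Math.max(index, 0));
    }

    public static void main(String[] args) {
        LatencyMonitor monitor = new LatencyMonitor();
        for (long latency : new long[] {120, 85, 430, 95, 250, 1800, 110}) {
            monitor.record(latency);
        }
        System.out.println("p50 = " + monitor.percentile(50) + " ms");
        System.out.println("p95 = " + monitor.percentile(95) + " ms");
    }
}
```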

Lifecycle Testing: Closing the Specifications Quality Gap through Behavior Driven Development
Chris Kozak, ThoughtWorks
Friday, May 4
11:00 AM – 12:00 PM
Many quality issues stem from poor communication between a project’s various stakeholders. Behavior Driven Development (BDD) attempts to address this problem by clarifying desired software behavior through discussion among those stakeholders: the BA who thinks about the business’s objectives, the QA who thinks about the edge cases, and the developer who thinks about the implementation. This can be a significant change for many organizations, since the process involves redeploying QAs as analysts of a system’s negative space and developers as analysts of the system’s non-functional requirements. The outputs of these discussions are specifications that describe a system’s requirements in high-level natural language. These specifications then guide the implementation of the system, eventually becoming test cases that lock in the desired behavior. They also serve as “living documentation” for the life of the application. Join Chris to understand why quality is much easier and cheaper to bake in at the beginning of a project than to apply at the end.
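To make the “specifications become test cases” step concrete, here is a minimal sketch using Cucumber-JVM, one common BDD tool; the session does not prescribe a specific tool, and the scenario, the FakeStore class, and the step names are invented for illustration:

```java
// The natural-language specification agreed by the stakeholders
// (normally kept in its own .feature file):
//
//   Scenario: Returning customer sees saved address at checkout
//     Given a customer with a saved shipping address
//     When the customer starts checkout
//     Then the saved address is pre-filled
//
// Step definitions bind each line of the specification to code.
import static org.junit.Assert.assertEquals;

import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

public class CheckoutSteps {
    private final FakeStore store = new FakeStore(); // illustrative stand-in
    private String prefilledAddress;

    @Given("a customer with a saved shipping address")
    public void savedAddress() {
        store.saveAddress("42 Main St");
    }

    @When("the customer starts checkout")
    public void startCheckout() {
        prefilledAddress = store.checkoutAddress();
    }

    @Then("the saved address is pre-filled")
    public void addressPrefilled() {
        assertEquals("42 Main St", prefilledAddress);
    }
}

class FakeStore {
    private String address;
    void saveAddress(String a) { address = a; }
    String checkoutAddress() { return address; }
}
```

Because the scenario text drives the test, the specification and the regression suite can never silently drift apart, which is what makes it “living documentation.”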

Web Testing: The Many Faces of Web Testing at Google
Greg Dennis and Simon Stewart, Google
Friday, May 4
11:00 AM – 12:00 PM
Google lives and dies by the quality of its web apps, so testing is an essential part of product development. As in every company, however, not every developer and tester at Google has the same level of technical ability. In this session, you’ll find out which tools Google uses to make test automation as painless as possible and how you can put these same tools to work on your project. Most of these tools are open source; some may already be familiar to you, while others might be new. Tools include Selenium, WebDriver, Web Puppeteer, and the Closure test runner. Of course, the tools are only half the story. Greg and Simon will also cover mistakes they’ve made, give tips on writing stable tests, and talk a little about the secret sauce of Google’s web testing infrastructure.
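For a sense of what WebDriver automation looks like, here is a minimal Java sketch; the URL and the assertion are placeholders, not examples from the talk:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

// A tiny smoke test: open a page, find an element, check its text.
public class SearchSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver(); // requires a local chromedriver
        try {
            driver.get("https://www.example.com"); // placeholder URL
            WebElement heading = driver.findElement(By.tagName("h1"));
            if (!heading.getText().contains("Example")) {
                throw new AssertionError("Unexpected heading: " + heading.getText());
            }
        } finally {
            driver.quit(); // always release the browser, pass or fail
        }
    }
}
```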

Test Reusability: RSTAR™: Achieving Reusable Tests and Data
Karen Johns, Mosaic, Inc.
Friday, May 4
11:00 AM – 12:00 PM
Reusability is the key to cost-effective manual and automated testing, yet most testing approaches achieve only partial reuse because they ignore a critical component: test data. RSTAR™ is a testing and test automation framework that manages your test data as a reusable asset to enable Full Reuse™ of your manual and automated tests. It implements a complete test architecture, from test planning and test data management through test execution, with a single shared repository. Using the reusable manual tests and test data, RSTAR™ then enables a powerful test automation framework with a wide variety of automation tools. This demonstration will illustrate the benefits of Full Reuse™ and the critical role test data plays in enabling it. Karen will also demonstrate the features of RSTAR™ that support reusable tests, test data, and test execution, and show how these features enable a powerful test automation framework, using Selenium as the test automation tool.
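RSTAR™ itself is proprietary, so its API cannot be shown here, but the underlying idea of driving one reusable test from externalized data rows can be sketched generically with a JUnit parameterized test; the discount rule and the data rows are invented for illustration, and a framework like the one described would pull the rows from a shared test-data repository instead of inlining them:

```java
import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

// The same test logic runs once per data row, so the test is written
// once and reused across every data variation.
@RunWith(Parameterized.class)
public class DiscountRuleTest {
    private final int orderTotal;
    private final int expectedDiscount;

    public DiscountRuleTest(int orderTotal, int expectedDiscount) {
        this.orderTotal = orderTotal;
        this.expectedDiscount = expectedDiscount;
    }

    @Parameters(name = "total={0} -> discount={1}")
    public static Collection<Object[]> data() {
        return Arrays.asList(new Object[][] {
            { 50, 0 }, { 100, 5 }, { 500, 50 },
        });
    }

    @Test
    public void discountMatchesRule() {
        assertEquals(expectedDiscount, discountFor(orderTotal));
    }

    private int discountFor(int total) { // stand-in business rule
        return total >= 500 ? 50 : total >= 100 ? 5 : 0;
    }
}
```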

Performance: Real World Production Performance Testing from the Cloud
Dan Bartow, SOASTA, and Lee Barnes, Utopia Solutions
Friday, May 4
11:00 AM – 12:00 PM
Online application performance is critical; no one would challenge this statement. Yet in the web and mobile world, the amount of performance testing done on applications is appallingly low. When performance testing is done, it is usually conducted in a test lab. Even with thorough lab-based testing, applications very frequently topple under the pressure of real-world users. Results from lab testing alone are not delivering the performance answers that leadership needs for business-critical systems. Testing in production is an essential component of world-class performance methodologies. However, this approach has its own set of challenges, including security, test data, and live customer impact. In this session, you’ll learn how businesses from the New York Stock Exchange to Netflix use cloud-based performance testing to gain deep insight into how their systems will perform. You will also take away key elements of the methodology these companies use to address the unique challenges of performance testing in production.
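To ground the idea of load generation, here is a toy Java load driver; the URL and parameters are placeholders, and real cloud-based testing distributes this kind of loop across many machines and geographies with far richer measurement and reporting:

```java
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// N virtual users issue GET requests concurrently and time the responses.
public class MiniLoadTest {
    public static void main(String[] args) throws InterruptedException {
        final String target = "https://www.example.com/"; // placeholder URL
        final int virtualUsers = 20;
        final int requestsPerUser = 10;
        final AtomicLong totalMillis = new AtomicLong();
        final AtomicLong completed = new AtomicLong();

        ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);
        for (int u = 0; u < virtualUsers; u++) {
            pool.submit(() -> {
                for (int i = 0; i < requestsPerUser; i++) {
                    long start = System.currentTimeMillis();
                    try {
                        HttpURLConnection conn =
                            (HttpURLConnection) new URL(target).openConnection();
                        conn.getResponseCode(); // block until the response arrives
                        conn.disconnect();
                        totalMillis.addAndGet(System.currentTimeMillis() - start);
                        completed.incrementAndGet();
                    } catch (Exception e) {
                        // a real test would count and report failures as well
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
        System.out.println("avg latency = "
            + totalMillis.get() / Math.max(completed.get(), 1) + " ms over "
            + completed.get() + " requests");
    }
}
```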
