Running Test Automation? Follow These Best Practices.
When it comes to test automation, making decisions on what to automate and when to run the suite of automated tests is a critical part of the process.
But an underrated part? Full-team involvement.
That’s according to Mike Levan. He’s a lead QA automation engineer at Procare Solutions, and during his time in the role, he’s learned that the more people who participate in the process, the better.
“Just because this is test automation doesn’t mean only QA resources perform the tasks,” Levan said. “Everyone on the team needs to have an active role. Back-end and front-end developers can write unit-level, integration and even some basic ‘happy path’ test automation to ensure that the product is high-quality.”
Levan and his team use a variety of tools to make this process happen as seamlessly as possible. So Built In Colorado spoke with him to learn more about what kind of tests his teams automate, his test automation best practices and some of his favorite tools.
What they do: Procare Solutions provides child care management and parent engagement software, as well as integrated payment processing technology and services.
Describe your top three test automation best practices.
My first test automation best practice is choosing which tests to automate. ROI is always important when automating tests. Automating test cases that run infrequently, or that cover rarely used features, does not help your efforts. Developing and maintaining automated test cases that run after each build, or that cover core features, is where you get the best bang for your buck.
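That ROI reasoning can be sketched as a simple scoring rule. The weights, field names and numbers below are hypothetical illustrations, not Procare’s actual process:

```javascript
// Hypothetical sketch: prioritize automation candidates by expected ROI.
// A test that runs on every build of a core feature scores far higher
// than one covering a rarely used feature.
function automationScore({ runsPerWeek, isCoreFeature }) {
  const frequencyWeight = runsPerWeek;        // more runs -> more manual effort saved
  const coreWeight = isCoreFeature ? 10 : 1;  // core features carry more risk
  return frequencyWeight * coreWeight;
}

const candidates = [
  { name: "login happy path", runsPerWeek: 20, isCoreFeature: true },
  { name: "legacy export wizard", runsPerWeek: 1, isCoreFeature: false },
];

// Sort best-bang-for-buck first.
candidates.sort((a, b) => automationScore(b) - automationScore(a));
console.log(candidates.map((c) => c.name)); // highest-value candidate first
```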
The second best practice is getting involvement from your whole team. Everyone on the team needs to have an active role.
The third best practice is choosing when to run your suite of automated tests. Not all tests are created equal. There are smoke tests, end-to-end, browser-specific, performance and more. A good approach is to start with your smoke tests to ensure the environment is stable enough to continue with more automated tests, as well as manual ones. From there you can either run end-to-end tests or the rest of your suite — it’s whatever makes sense for you.
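The “smoke first, then the rest” ordering amounts to a simple gate. A framework-agnostic sketch (the suite shapes and stand-in runner here are hypothetical, not a specific CI tool’s API):

```javascript
// Hypothetical sketch: run smoke tests first, and only continue to the
// heavier suites if the environment proves stable.
function runPipeline(suites, runSuite) {
  const results = [];
  const smokePassed = runSuite(suites.smoke);
  results.push({ stage: "smoke", passed: smokePassed });
  if (!smokePassed) {
    // Environment is unstable: skip end-to-end and the rest of the suite.
    return results;
  }
  for (const stage of suites.afterSmoke) {
    results.push({ stage: stage.name, passed: runSuite(stage) });
  }
  return results;
}

const demo = runPipeline(
  { smoke: { name: "smoke" }, afterSmoke: [{ name: "end-to-end" }, { name: "full regression" }] },
  (suite) => true // stand-in runner: pretend every suite passes
);
console.log(demo.map((r) => r.stage));
```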
What kind of tests does your team automate?
Our team is currently automating unit, integration and regression tests for the web. Select tests are pulled from our integration and regression suite to make our smoke tests. We like this approach because it allows us to effectively test our system at different levels. Smoke tests let us quickly and effectively determine whether the deployed software build is stable or not. Our developers focus mainly on unit and integration tests to ensure what they have coded works within itself and with other parts of the system.
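Pulling select tests out of a larger suite is commonly done with tags. A minimal sketch of the idea (the test names and tag scheme are hypothetical, not Procare’s actual suite):

```javascript
// Hypothetical sketch: build a smoke suite by selecting tagged tests
// from the full integration/regression suite.
const fullSuite = [
  { name: "user can sign in", tags: ["regression", "smoke"] },
  { name: "tuition report totals", tags: ["regression"] },
  { name: "parent messaging loads", tags: ["integration", "smoke"] },
];

const smokeSuite = fullSuite.filter((test) => test.tags.includes("smoke"));
console.log(smokeSuite.map((test) => test.name));
```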
Our QA engineers focus on regression tests and primarily UI tests. Automating UI tests allows us to continually test the basic parts of our application and frees up time for us to do more exploratory testing. It also ensures consistency, increases efficiency and gives us reusable test scripts, which saves a lot of precious testing time. Soon, we are going to be diving into iOS and Android automated testing. We will be doing each natively: XCUITest for iOS, and either Espresso or a Kotlin-based framework for Android. Just like on the web, we will focus on unit, integration and UI automated tests.
What are your team’s favorite test automation tools? How do they match up with your existing tech stack?
Our team really likes Cypress for front-end and UI automation tests. One reason it’s our favorite is the framework’s ability to capture snapshots at the time of execution, which allows us to debug and solve issues at that particular step. Another reason is how simple and fast it was to set up the framework. We currently have a lot of UI automation tests in Protractor, and converting them to Cypress has been easy. During the conversion, we have also been able to drop a lot of our page and element wait commands. Our front-end developers maintain their Cypress tests, and QA works with them to ensure that we aren’t duplicating tests.
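Those explicit wait commands could be dropped because Cypress automatically retries its queries until an element appears or a timeout elapses. The underlying pattern, in a framework-agnostic sketch (nothing here is Cypress’s actual API):

```javascript
// Hypothetical sketch: retry a check until it succeeds instead of
// sleeping for a fixed wait -- the pattern behind auto-retrying queries.
function retryUntil(check, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (check(attempt)) {
      return { found: true, attempts: attempt };
    }
  }
  return { found: false, attempts: maxAttempts };
}

// Simulate an element that only "renders" on the third poll: the test
// succeeds without any hard-coded sleep.
const result = retryUntil((attempt) => attempt >= 3);
console.log(result);
```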
For back-end testing, we are using RSpec, since the back end is written in Ruby and is maintained by the developers. Our team likes to use JMeter for load and stress testing. Using production traffic, we can establish our 1x baseline and measure our application’s performance and response times. Once the conversion of UI tests from Protractor to Cypress is complete, we can focus on load testing in earnest. At that point, we’ll be able to easily ramp up traffic and ensure our app can meet the needs of our customers.
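The 1x-baseline idea reduces to simple arithmetic: measure production traffic once, then multiply it for ramped load tests while holding the response-time budget fixed. A sketch with hypothetical numbers and field names (not real Procare metrics):

```javascript
// Hypothetical sketch: derive ramp-up targets from a production-traffic
// baseline, e.g. for a 2x or 3x load test.
const baseline1x = { requestsPerMinute: 1200, p95ResponseMs: 350 };

function rampTarget(baseline, multiplier) {
  return {
    requestsPerMinute: baseline.requestsPerMinute * multiplier,
    // The response-time budget stays fixed: the app should still meet
    // customer needs at the higher load.
    p95BudgetMs: baseline.p95ResponseMs,
  };
}

console.log(rampTarget(baseline1x, 2)); // { requestsPerMinute: 2400, p95BudgetMs: 350 }
```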