LayerX’s QA Initiative: Don’t be Tempted by the Ice Cream Cone
Sep 01, 2021
This is a translated article originally written by LayerX.
Here is the original.
Hello! I’m Kaji, one of the guys at LayerX.
Today, I’d like to write about our current efforts towards test automation.
I want to share how our approach towards software testing has changed as we moved from developing an MVP (Minimum Viable Product) to achieving PMF (Product/Market Fit).
MVP development was backed by manual E2E testing
The convention: Climb the Testing Pyramid from the bottom
The Testing Pyramid is often talked about as the ideal of software testing.
Unit tests sit at the bottom, API/integration tests are built on top of them to verify logic across multiple layers, and UI/E2E tests sit at the top to verify everything else and ensure a good UX.
The closer a test is to the bottom, the faster it executes, the more frequently it can run, and the easier it is to pinpoint the cause of a failure. The opposite holds as you move up.
That's why unit tests, at the very bottom, should generally be written earlier in development and run more frequently.
I've heard that in a healthy development structure, unit tests make up about 80% of the suite, or that the ratio from the base up is 7:2:1.
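To make the base of the pyramid concrete, here is a minimal sketch of a unit test. The `invoice_total` helper is hypothetical, chosen only because LayerX Invoice deals with accounting; it is not the product's actual code. Tests like these run in milliseconds, which is why they can run on every commit.

```python
# A minimal base-of-the-pyramid sketch. `invoice_total` is a
# hypothetical helper, not actual LayerX Invoice code.

def invoice_total(subtotal: int, tax_rate: float = 0.10) -> int:
    """Return the tax-inclusive total in yen, truncating fractional yen."""
    return subtotal + int(subtotal * tax_rate)

# Unit tests: fast, frequent, and easy to diagnose when they fail.
assert invoice_total(1000) == 1100
assert invoice_total(0) == 0
assert invoice_total(999) == 1098  # 99.9 yen of tax truncates to 99
```

In a real codebase these assertions would live in a test runner such as pytest, but the point is the same: the logic is verified in isolation, with no browser or server involved.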
Our Official Release: Don't Climb the Testing Pyramid from the Bottom!
The beta version of LayerX Invoice launched a few months after its development started last fall and was officially released in January.
In the inception deck that we created when we started the business, speed was our #1 priority. We also developed the initial product without trying to make it perfect.
* Of course, since the product deals with accounting, it works correctly. What I mean is that we didn't try to write every test up front in a test-driven style, or create a QA table detailed enough that even a new part-time staff member could run the tests.
Two months after release, we continued to add features very quickly while listening to customer feedback. Although we wrote tests for critical and challenging areas for humans to verify, quality assurance was mainly done by manual E2E testing.
By sticking to DDD (Demo Driven Development), building the system customers need while showing it to them, we gathered a lot of feedback and problems to solve and were able to develop an MVP early.
From a business development perspective, I believe prioritizing speed was the right decision.
Breaking Away from the Ice Cream Cone
However, from a testing point of view, this situation is often called an anti-pattern and likened to an Ice Cream Cone. The sweet temptation is the idea that manual E2E testing alone can ensure quality, so engineers wouldn't have to spend their precious time writing tests.
The workload and man-hours required for testing increase in proportion to elapsed time and the number of features added. Taking one bite won't do any harm, but eat too much and you get fat.
You have to stop giving into temptation at some point.
We changed our ways when we released additional features two months after the official release.
Immediately after the stable release, we found three bugs. They had slipped through because of human error during manual E2E testing and because we failed to verify the affected parts after a hotfix. Fortunately, there was no direct impact on customers, but I felt an enormous sense of responsibility for releasing something that contained three bugs.
The MVP is becoming more widely accepted, and the number of customers is increasing. We have plans for adding even more features as we work towards PMF, but if we continue doing things the same way, there will be a critical bug at some point that will affect customers.
The number of items we check in E2E testing has grown to 900. So far, we've managed to reach a stable release within two days of deploying to the staging environment, but this release cycle could come crashing down on us at any point.
Given all of those reasons, we decided to allocate 20% of the development resources to quality assurance and write tests.
Ensuring Quality while Maintaining Speed
Aim for 70% Unit Testing?
If you decide to write tests, convention says to climb the Testing Pyramid from the bottom and aim for 70% unit tests first. However, aiming for 70% wasn't the right move for us, considering our phase.
I think aiming for 70% would be fine if the product is highly likely to win in the market and already has enough QA staff. It would also be an option for business systems where you can’t roll back very easily or systems where even a single bug would be a catastrophe, such as systems in medicine, government, or elections.
Since SaaS is used in daily business operations, it would cause a great deal of inconvenience to customers if our service stopped. Quality is vital to a product, but if you can't add useful features or make improvements, the customer's experience will suffer, and that could kill the business.
Startups may be full of dreams, but they are always short on engineering resources. Dealing with bugs and regressions is time-consuming, so it's important to inch forward while identifying the areas where writing a test would accelerate development.
We don't have a defined percentage. Instead, we identify the areas prone to bugs and regressions, and the features where bugs would be critical. We then spend 20% of our resources implementing unit tests and API tests in those areas.
By running E2E testing on all other areas, we believe we can keep up the speed while maintaining a certain level of quality.
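An API test sits one level above a unit test: it exercises an endpoint end to end over HTTP, but still runs in-process and in seconds. The sketch below is illustrative only; the `/invoices/1` endpoint and its payload are made up for this example, not LayerX Invoice's actual API.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical read-only invoice endpoint, standing in for a real API.
class InvoiceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/invoices/1":
            body = json.dumps({"id": 1, "status": "approved", "total": 1100}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):  # keep test output quiet
        pass

# Start the server on a free port, call the endpoint, assert on the JSON.
server = HTTPServer(("127.0.0.1", 0), InvoiceHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{port}/invoices/1") as resp:
    payload = json.loads(resp.read())

assert payload["status"] == "approved"
assert payload["total"] == 1100
server.shutdown()
```

Because the whole round trip runs locally, a test like this can cover a bug-prone endpoint on every commit without waiting for a staging deploy.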
Introducing an E2E Test Automation Platform
Aside from implementing unit tests, we also needed to figure out how to manage the bloated E2E test suite. Manually testing 900 items in two days was already overwhelming, and we were asking people outside the development team for help.
So, we introduced a no-code E2E test automation tool: a capture-replay tool that records E2E test operations and executes them automatically.
After using it, I found that a no-code E2E test automation tool can change the way we climb the traditional Testing Pyramid and how we allocate resources. Traditionally, only an engineer could implement tests, so the only way to climb the pyramid was from the bottom.
However, with the test automation tool, engineers can climb from the bottom while someone else descends from the top. There are also areas where it can substitute for unit tests, and engineers can adjust their testing accordingly.
Some people have told us not to use capture-replay tools, but I think what they actually mean is that we shouldn't rely on one just because it looks convenient; we should think of it as tempting ice cream. Although it's automated, it's still an E2E test: it's prone to breaking, it's slow, and it's hard to identify the cause of an error. Our way of doing things at LayerX is to know the limitations of tools and SaaS and use them anyway.
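The capture-replay idea itself is simple: record a sequence of UI actions once, then replay it automatically on each run. The toy sketch below illustrates only the concept; real tools like Autify drive an actual browser, whereas here a plain dict stands in for the page and all names are invented.

```python
# Toy illustration of capture-replay. A dict stands in for the page;
# the step format and the login flow are invented for this sketch.

recorded_steps = [
    ("fill", "email", "user@example.com"),
    ("fill", "password", "secret"),
    ("click", "login", None),
]

def replay(steps, page):
    """Re-execute each recorded action against the (fake) page."""
    for action, target, value in steps:
        if action == "fill":
            page[target] = value
        elif action == "click" and target == "login":
            page["logged_in"] = page.get("email") == "user@example.com"
    return page

page = replay(recorded_steps, {})
assert page["logged_in"] is True
```

The fragility mentioned above comes from exactly this structure: if the recorded targets change (a renamed field, a moved button), the replay breaks and the cause can be hard to trace back.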
We carefully considered four test automation tools. In the end, we chose Autify. There are many reasons for this choice, but the main ones were that it was intuitive and straightforward to use, and that the Japanese customer support staff were quick and thorough in their responses.
We have only just completed the implementation stage, so it's still early days, but we have already automated tests for more than 50 items, and it has already reduced QA hours. Next month, we plan to automate about 300 items.
Build a Team that can Steadily Develop Quality Products
Guaranteeing quality isn’t limited to software testing.
We also have to fix, quickly and before release, any regressions or bugs that our tests find. To guarantee quality is to promise that you will provide reliable quality.
As we work on test automation, we are starting to think about the next phase: creating a system and structure that allows development while maintaining high quality. We have begun developing a mechanism to aggregate bugs found in QA, analyze the pattern behind each cause, and fix them systematically.
However, our business is still in its infancy.
We are continuing to develop new functions quickly. It would be counterproductive to be so careful about QA that it hurts the business's progress. I would like us to take a balanced approach, choosing measures that accelerate development speed.
There are many things we want to work on. If you are interested, we are hiring. You can apply by running the following test:
Precondition: You are interested in LayerX.
- Click the job link at the bottom.
- Visually check the job description.
- Press the Apply button.
- Expected results
Thank you. Please share this article if you found it interesting!