It’s not unreasonable to assume that going for speed during development means sacrificing quality. I don’t think that’s entirely true. The advent of new tools, better automation practices, and a tightly knit team culture all contribute to speed without any real downsides. Here’s how we manage this at Bothrs.
Focusing on reusability
Using npm packages is common practice in a lot of programming languages, but we've taken it a step further by creating our own open-source packages for functionality we use all the time. These contain no business logic; they're the Bothrs version of default things like a date picker, a modal asking for notification permissions, or the way we set up translations in an app.
This allows us not only to go faster, but also to minimize the risk of introducing bugs when building an application. Because these packages are used by many different developers across different projects, they are battle-tested and we can be confident of their quality. If a bug does show up, we fix it in one place and ship the fix to our applications as a small patch release, instead of changing it in many different places. Doing a task repeatedly increases the chances of making a mistake tremendously, and it wastes valuable time we could spend building great features for your app.
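As a sketch of the idea (the helper and its API are invented for illustration, not one of Bothrs' actual packages), a shared utility package might expose a small, well-tested function like this:

```typescript
// Illustrative sketch of a utility as it might live in a shared package.
// The function and its name are hypothetical.

// Clamp a date into a valid range — the kind of logic a date picker
// needs, fixed once here instead of re-implemented in every project.
function clampDate(date: Date, min: Date, max: Date): Date {
  if (date.getTime() < min.getTime()) return new Date(min.getTime());
  if (date.getTime() > max.getTime()) return new Date(max.getTime());
  return date;
}

const min = new Date("2024-01-01T00:00:00Z");
const max = new Date("2024-12-31T00:00:00Z");

// A date before the range snaps to the lower bound.
console.log(clampDate(new Date("2023-06-15T00:00:00Z"), min, max).toISOString());
// → "2024-01-01T00:00:00.000Z"
```

A bug fix in this one function reaches every consuming app as a single patch release.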
Low and no-code
By keeping our eyes open to how the landscape is evolving, and using existing tools wherever we can, we make sure we don't reinvent the wheel. A lot of the concepts from the previous paragraph apply here as well, only now it's not a package but an application or environment used to build software, like **Webflow**. Because a lot of work has gone into making sure their features work, we can be confident our application (built on theirs) will also work great.
Focusing on testing and quality
Automated testing allows us to create test suites with complex scenarios that verify a component, a screen, or an entire app feature can handle everything a user might throw at it. Most of the time we follow the pattern of prioritizing integration tests over unit tests, meaning we prefer to test an entire user flow. For example:
👉 Logging in as a user,
👉 Adding some products to the basket,
👉 Attempting a checkout with a discount,
👉 Then going back to the home screen
This makes sure the app will work for the user, because it is tested the way a user would actually use it. Unit tests focus on smaller things, like making sure an input field can't be abused or testing the code that calculates the final price of the basket. While these things are important, a perfect submit button doesn't help if the rest of the checkout flow is broken.
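To make the contrast concrete, here's a minimal sketch of an integration-style test over the flow above. The store model, function names, and discount logic are invented for illustration; they are not our real application code:

```typescript
// Tiny in-memory shop model, invented purely to illustrate testing a
// whole user flow end to end rather than one unit at a time.
type Session = { user: string | null; basket: number[] }; // prices in cents

function login(user: string): Session {
  return { user, basket: [] };
}

function addProduct(session: Session, priceCents: number): void {
  session.basket.push(priceCents);
}

function checkout(session: Session, discountPct: number): number {
  if (!session.user) throw new Error("must be logged in to check out");
  const total = session.basket.reduce((sum, p) => sum + p, 0);
  return Math.round(total * (1 - discountPct / 100));
}

// Integration-style test: the steps from the list above, exercised
// as one flow instead of as isolated units.
const session = login("alice");
addProduct(session, 1000); // €10.00
addProduct(session, 2000); // €20.00
const paid = checkout(session, 10); // 10% discount on €30.00
console.assert(paid === 2700, "the full flow should yield the discounted total");
```

A unit test would only cover `checkout`'s arithmetic; the flow test also catches a broken `login` or `addProduct` along the way.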
Aside from automated tests, we also do a lot of manual testing. Whenever we finish a feature, the product team (design and project manager) tests it themselves on an actual device. This keeps the human element involved and catches visual glitches or awkward user experiences that an automated test can't detect. The same principle applies: we want to test the app as an actual user would, to make sure we deliver the best experience.
Meanwhile, we use tools like Prettier and ESLint to automatically format the code to clean industry standards, keeping it readable, easy to collaborate on, and easy to hand off to our clients.
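As an example (the specific options here are common illustrative defaults, not necessarily the ones we use), a shared Prettier config checked into every repo keeps formatting debates out of code review:

```json
{
  "semi": false,
  "singleQuote": true,
  "trailingComma": "all",
  "printWidth": 100
}
```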
Focusing on continuous integration and delivery
Focusing on CI/CD is still heavily undervalued in a lot of organizations. Immediately integrating a new feature and shipping it allows us to gather insights and metrics on usage, performance, and potential bugs. Because you ship one thing at a time, it's easy to figure out what went wrong and when, and then fix it immediately.
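As an illustration of the idea (the article doesn't name a CI provider, so the GitHub Actions syntax and npm scripts below are assumptions), a minimal pipeline that integrates and ships every change could look like:

```yaml
# Illustrative CI/CD pipeline sketch — provider and script names assumed.
name: ci
on: [push]
jobs:
  build-test-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm test          # unit + integration suites
      - run: npm run deploy    # ship each change as soon as it's green
        if: github.ref == 'refs/heads/main'
```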
Keeping each other accountable
Part of our process is making sure multiple pairs of eyes look at code before it is added to the application we're building. Code reviews weed out issues with code structure, odd naming, and other inconsistencies or sub-par setups. They are also a powerful tool for knowledge sharing and for guiding less experienced developers, by demonstrating best practices and giving input right in the code.
Testing by product team
As mentioned in the manual-testing section above, we put heavy emphasis on having non-developers test our features. As a developer, you will unconsciously be careful and stick to "the golden path" where everything works. Knowing that other people on the team will test the feature makes a programmer more conscious of this while building, and more issues are found during testing.
Checking in with the client and their end user a lot
Embedded in our company culture is the weekly demo: we show what we've done and how it currently works, even going so far as to give the client the current state of the app so they can test it while we are still working on it. We believe this complete transparency builds trust and lets us focus much more on outcome than output, by listening to the client's feedback and adjusting our focus during the track rather than after. This results in a product that our client and their users are happiest with.
Laser focus and user testing
Finally, we put a lot of focus on user testing: showing actual users the app and interviewing them about exactly what they need and how they want it. Understanding our client and the use case this intimately allows us to build the right features in the right way from the start as much as possible. This laser focus results in code that is more logically structured and cleaner overall, as opposed to a feature that's been changed and adapted five times because a stakeholder was unhappy. That always leads to junk code.