Disclaimer: I’m presenting this post without any figures or math, so it’s just my opinion, based on what I’ve witnessed professionally. This post is not about hate, but about the weaknesses in how recent trending technologies deliver.
Recently I’ve seen a dangerous trend in software development, specifically around web and desktop technologies. I don’t wish to bash software frameworks such as Electron, but I think there are hidden costs that both junior developers and enterprises are ignoring. Let’s dig a bit deeper.
Whenever you make the technical choice to abstract your codebase behind a multiplatform framework, you need to be sure about the hidden costs of using it, and I’m not talking just about the implied cost in software performance. If you provide a poor user experience because your app feels awful to use, that’s a potential paying user you’ll lose, or at the very least one who will be harder to acquire up front.
Sometimes software development teams are on a budget and need to make a choice like this in order to increase market access and thereby reach product-market fit faster. I can understand that, but what I find hard to reconcile is that the cost you save equals the cost your customers end up paying. Ideally, you wouldn’t cut costs here, because after all, you need to provide a best-in-class experience to your users if you’re to retain them over several billing renewal cycles.
To put it bluntly: as a company, you might think you’re saving on software development costs, but your users don’t save anything. What actually happens is that they end up paying the costs you tried to avoid (more on this later), many times over.
Did you deliver an app to the marketplace quickly? Yes. Does your app solve the problem? To some degree. Does the app perform painlessly fast? Muddy waters, my friend. Can it run Crysis? I don’t think so.
The expected outcome is that you save on costs and extend your startup’s runway, but let’s not forget: poor software performance can be considered a bug. If you’ve read Steve McConnell’s Code Complete, 2nd ed., you’ll know that proper software development is hard, and that most of the time it carries five, six, or even seven figures in costs. And if your ideal resource-saving scenario doesn’t materialize (in time or money), your company will eventually find itself reimplementing the app in whatever technology fad is trending at that moment, feeding the vicious cycle.
Conversely, paying high costs for software development, while it doesn’t guarantee business success, certainly helps in capturing better engineering for your product or service. Sure, you had to fund a team for iOS development, another for Android, and another for Windows, macOS, and Linux, but you delivered a superior experience. You’ll need to strike a balance in this tug-of-war to meet both business requirements and good engineering.
Imagine a world where mobile devices and personal computers ran on public clouds such as AWS or Azure (much like Google Stadia). In such a scenario, your users would have to provision more expensive compute instances just to run your 300 MB social media application (not counting the data it stores locally). Storage and RAM keep growing, but here’s a little secret nobody tells you: there’s no need for your app to take it all up!
Boring technology is great not only because of its predictability, but also because most of the time it has been battle-tested for performance. The defense of bloated technologies leans on the fact that some applications built on them seem to run really fast, but those are very few; so few they can be counted on the fingers of one hand. All other applications range from overweight to downright heavy on the OSes they run on. Perhaps the framework is simply an easily abused trap that produces underperforming applications. Hat tip to the engineers behind those few fast applications: achieving those results is a great feat of debugging, profiling, and engineering.
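To make the “easily abused” part concrete, here is a minimal sketch of what a typical Electron entry point looks like (the file names main.ts and index.html are illustrative, not taken from any particular project). A handful of lines is enough to ship a desktop application, but those lines also bundle an entire Chromium runtime alongside your page:

```ts
// main.ts — minimal sketch of an Electron main process (illustrative file names)
import { app, BrowserWindow } from 'electron';

app.whenReady().then(() => {
  // The "desktop app" is really a web page hosted by the bundled Chromium runtime.
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html');
});
```

The ease is real, and so is the cost: the runtime you ship dwarfs the code you actually wrote, which is how underperforming applications end up on users’ machines almost by default.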
Yes, I can hear you… “money talks.” But if that’s the case, then consider technologies that reduce the overall cost both to implement and to run. For example, Sciter’s learning curve is steeper than Electron’s, since it actually requires you to learn a native or intermediate language to implement your business logic, but it also has roughly a fifth of the carbon footprint. Hey, you’re a proper engineer shipping serious code to production environments; after all, you choose what’s best for your customers. Or do you? 😉
On a side note: I consider the origin of this dangerous trend to be the specific situation where junior software developers skip their computer architecture, software architecture, and software design classes, and then irresponsibly deliver bloated software to the marketplace.
“Ignorance, the root and stem of all evil,” as Plato once said.
Source of inspiration: A Plea for Lean Software (Niklaus Wirth, 1995)