Technical Aspects

Platform agnosticism and abstraction layers can be combined with micro data centers

The growth of both frontend and backend compilation and script execution was one of the biggest changes in how a server was expected to operate. For the first time, it really was an interactive process that went far beyond the initial model of a server in a restaurant. Now the user and the website work together to create something rather than just requesting or serving it. To be sure, a basic request for data and its later delivery was still the backbone of what a server did. But instead of being the totality, it was now just one part among many. For a long time, this necessitated more powerful hardware. Eventually, the ability to work with code on the frontend caught up with the powerful backends within data centers. This is really the moment when progress shifted from a steady march forward into remarkably fast exponential growth.
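
To make that shift concrete, here is a minimal TypeScript sketch of the division of labor, assuming a hypothetical /api/orders endpoint and Order shape: the backend's job stops at serving raw data, and the browser does the calculation and rendering that the server once handled.

```typescript
// Hypothetical shape of the raw data the server returns.
interface Order {
  id: string;
  items: { name: string; price: number }[];
}

// The server only delivers data; everything below runs in the user's browser.
async function renderOrderTotal(orderId: string): Promise<void> {
  const response = await fetch(`/api/orders/${orderId}`); // a plain data request
  const order: Order = await response.json();

  // Work that older architectures pushed to the backend now happens client-side.
  const total = order.items.reduce((sum, item) => sum + item.price, 0);

  const target = document.querySelector('#order-total');
  if (target) {
    target.textContent = `Total: $${total.toFixed(2)}`;
  }
}
```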

Even the most underpowered device running a browser was now capable of some pretty impressive feats. This took a lot of the pressure off the backend. And for the first time in quite a while, developers found themselves with far more power than they had any use for. Earlier development had, of course, always relied on having some extra power. But the changes during this period gave them a wealth of spare storage, processing capacity, and memory. One of the first changes that happened as a result was a rise in virtualization. One big physical server would use software to present itself as multiple smaller servers. This usually involved sharing some parts of the machine. For example, a single operating system kernel might be shared by every virtual instance running on the server. It was also becoming quite common for a machine to fully emulate separate servers, even to the point of running multiple different operating systems.
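
One quick way to see the difference between the two approaches is to ask an instance which kernel it is running on. The sketch below is a small Node.js/TypeScript script, not tied to any particular hypervisor: in a shared-kernel setup it reports the host's kernel, while a fully emulated server reports whatever operating system that virtual machine booted with.

```typescript
import * as os from 'node:os';

// In a shared-kernel setup, this release string matches the host machine,
// because every virtual instance reuses the same underlying kernel.
// In a fully emulated server, the guest boots its own kernel, so the values can differ.
console.log(`Platform: ${os.platform()}`);       // e.g. 'linux', 'darwin', 'win32'
console.log(`Kernel release: ${os.release()}`);  // the kernel this instance actually sees
console.log(`Architecture: ${os.arch()}`);       // e.g. 'x64', 'arm64'
```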

This process began to get some people thinking about how they were using their resources. The idea of having a large server was so embedded within people’s perceptions of a data center that it didn’t even occur to them that things could be different. This was the birth of the search for true micro data centers. Interestingly enough, it became possible for much the same reason that the extra resources were now available. The commercial push for more powerful smartphone models is what put JIT compilation of frontend content into everyone’s hands. This freed up a huge amount of resources for the average server. A side effect of that push for more powerful smartphones was the creation of very small but powerful processors. In particular, ARM processors went from an afterthought in most people’s minds to a primary development target.

While there’s the occasional exception, for the most part the very first smartphones and the majority of current models are built around ARM-based processors. Meanwhile, servers tended to use x86-64 based processors. There was just too much of a difference in what each could do at that point. If this had happened even a few years earlier, people would have been in a very different position. But thankfully, by that point abstraction was becoming as popular as virtualization. There were plenty of ready-made software solutions already out in the world to help people migrate code from one system to another.
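
As a minimal sketch of what that abstraction buys you, again in Node.js/TypeScript and assuming only the runtime's standard library: the script reports whichever processor architecture it happens to be running on, but nothing in the code itself depends on it, since the runtime papers over the ARM versus x86-64 difference.

```typescript
import * as os from 'node:os';

// The same source runs unmodified on an ARM-based micro server or an x86-64 rack server;
// the runtime's abstraction layer absorbs the instruction-set difference.
function describeHost(): string {
  const arch = os.arch();        // 'arm64' on most modern ARM servers, 'x64' on x86-64
  const cores = os.cpus().length;
  return `Running on ${arch} with ${cores} logical cores`;
}

console.log(describeHost());
```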