This is a static website running on a server I got for $3 a month. It has 1,000 megabits per second of bandwidth up and down, and these are the specs:
(Debian ASCII-art logo from neofetch)
root@cool-website.xyz
OS: Debian 11 bullseye
Kernel: x86_64 Linux 5.10.0-11-amd64
Uptime: 6h 13m
Packages: 657 (thanks BloatJS)
Shell: bash 5.1.4
Disk: 21G
CPU: AMD Ryzen 9 3900X 1-Core @ 3.8GHz
GPU: Cirrus Logic GD 5446
RAM: 976MiB
It uses traditional technologies (Debian GNU/Linux, Nginx, no JavaScript, no real backend), and it can handle about 20,000 people at the same time. It also does my email.
I did it this way because it's easy and because I'm not really a web developer, but once I was finished, I thought, "Hmm... I wonder how slow it would be if I made this the modern way."
So that's what I did. I remade this website using what I assume are modern, fashionable technologies, and then I ran an experiment. The technologies in question are NodeJS, ReactJS, and MariaDB. (MongoDB wasn't in Debian's package manager and it was too confusing to download).
For the experiment, I used Loadster to have 1000 bots in a loop doing the following (sketched as a script after the list):
- Visit my website
- Wait a second
- Click on the link to my list of blog articles
- Wait another second
- Click on my blog post about Returning to the Gold Standard
- Stay there for three seconds.
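For reference, here's roughly what each bot's loop looks like written out as a script. This is only an illustrative sketch in Node.js using plain HTTP requests and made-up URL paths; Loadster itself drives real browsers.

```js
// Rough sketch of one bot's loop using Node 18+'s built-in fetch.
// Loadster drives real browsers; this only mimics the timing and the
// three page requests. The paths below are placeholders, not my real URLs.
const BASE = "https://cool-website.xyz";
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function botLoop() {
  await fetch(`${BASE}/`);                   // visit the website
  await sleep(1000);                         // wait a second
  await fetch(`${BASE}/blog/`);              // click the list of blog articles
  await sleep(1000);                         // wait another second
  await fetch(`${BASE}/blog/gold-standard`); // click the gold standard post
  await sleep(3000);                         // stay there for three seconds
}

// Run the loop forever, like one of the 1000 bots.
(async () => {
  while (true) await botLoop();
})();
```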
This was meant to simulate my best blog post becoming popular. The great thing about Loadster is that its bots use actual browsers, which more accurately simulates a real person. I ran the experiment twice, once on the static website and once on the modern website. Other than the technologies used to present the website, I tried to keep everything the same between the two experiments.
How much does a Modern JS stack slow you down?
| Resource | Slowdown |
| --- | --- |
| CPU | 4.7x CPU per page load |
| Load time | 3.8x page load time |
| Bandwidth | 4.4x as much bandwidth per page load |
| RAM | 2.7x as much RAM |
| Max concurrent users on cheap server | 16,000 fewer before the server overloads |
The chart but as an image: click here
Results in detail:
Average load time per page:
- Normal: 120 milliseconds
- Modern: 450 milliseconds

3.75 times as long.
The load times didn't increase as more bots joined the site, since neither site came close to maxing out the CPU. I would attribute the extra lag on the modern site to the database lookups, and to the fact that the client has to request the page's content after the initial load, so the actual page takes two round trips to appear.
Note: Since the load time was higher for the React site, the bots went through the loop more slowly, so the two servers were doing different amounts of work. I'm therefore adjusting the "x times as slow" values by page views per second. I was getting 390 page views per second on the static site and 51 page views per second on the React site. However, since the React site is a single-page "application" and each loop visits three pages, I'm multiplying its value by 3 to get 153 page views per second.
CPU usage:
- Normal: 0.3% base, 36.6% peak
- Modern: 0.7% base, 68.0% peak

4.7 times as much CPU per page view.
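That figure works out from a simple back-of-the-envelope calculation, assuming the base CPU is subtracted from the peak and the adjusted page views per second from the note above are used:

```js
// CPU cost per page view: (peak CPU - base CPU) / adjusted page views per second
const staticCpuPerView = (36.6 - 0.3) / 390; // ~0.093% CPU per page view
const modernCpuPerView = (68.0 - 0.7) / 153; // ~0.440% CPU per page view
console.log((modernCpuPerView / staticCpuPerView).toFixed(1)); // "4.7"
```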
RAM usage (whole operating system was measured):
- Normal: 90 MB base, 132 MB peak
- Modern: 292 MB base, 351 MB peak

2.7x as much RAM.
Bandwidth usage:
- Normal: 9.45 megabits/second max instantaneous, 122.7 megabytes total, 8,028 bytes per bot loop
- Modern: 16.99 megabits/second max instantaneous, 245.5 megabytes total, 35,191 bytes per bot loop

4.4 times as much bandwidth.
Estimated maximum concurrent users:
- Normal website: around 20,000
- Modern website: around 4,000
Here is the code for the modern website.
So we are looking at around a 400% to 500% slowdown. That's actually not as bad as I was expecting, but this represents the best case scenario for modern websites. Aside from page loading time, I didn't take client-side performance into consideration, but we all know who would win that one. I've never had my computer's fans turn on for a simple website. And this was an unusually lean modern website, at around a few kilobytes per page. If I had included even one multi-megabyte image, as is common these days, all these metrics would go out the window. Either way, the fact that I can have many thousands of simultaneous visitors in each scenario on a three-dollar-a-month server really shows how powerful modern computers have gotten.
I never understood why people think they need to build complicated frontends with tons of JavaScript. All people really care about is how it looks, which is CSS. You don't need a loading screen if your website is that fast. And look how far I can go with CSS alone.
Here is the code if you want to use/modify this for a menu bar on your website.
If I were making a "web app", I would spend a week making the CSS look really cool and just have all interaction be submitting HTML forms to a basic API, maybe using JS sparingly where it's really needed. Then in a few years, when some new design trend is in, I would just switch out the CSS and be done. But that's just me.
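To make that concrete, here's a minimal sketch of the idea using nothing but Node's built-in http module: the page is plain HTML with a form, the "API" is a single POST handler, and there is zero client-side JavaScript. The guestbook example, route, and field names are made up for illustration.

```js
// Minimal sketch of "HTML forms + a basic API": no client-side JavaScript.
const http = require("http");
const { parse } = require("querystring");

const messages = []; // in-memory only, just for the sketch

const escapeHtml = (s) =>
  s.replace(/[&<>"']/g, (c) =>
    ({ "&": "&amp;", "<": "&lt;", ">": "&gt;", '"': "&quot;", "'": "&#39;" }[c]));

const page = () => `<!DOCTYPE html>
<html>
  <body>
    <h1>Guestbook</h1>
    <form method="POST" action="/guestbook">
      <input name="message" placeholder="Say something" required>
      <button type="submit">Post</button>
    </form>
    <ul>${messages.map((m) => `<li>${escapeHtml(m)}</li>`).join("")}</ul>
  </body>
</html>`;

http
  .createServer((req, res) => {
    if (req.method === "POST" && req.url === "/guestbook") {
      let body = "";
      req.on("data", (chunk) => (body += chunk));
      req.on("end", () => {
        messages.push(String(parse(body).message || "").slice(0, 200));
        res.writeHead(303, { Location: "/" }); // redirect back; the browser does the rest
        res.end();
      });
      return;
    }
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(page());
  })
  .listen(8080);
```

Restyling it later really would just mean swapping out the CSS, since all the state lives on the server.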
P.S. The way I calculated 20,000 people is that when there was 36.6% CPU usage, I was getting around 8 thousand bot loops per minute. Assuming that someone would spend a minute on my website, 20,000 people seemed reasonable for my website under full load.
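The same back-of-the-envelope math also roughly reproduces the 4,000 figure for the modern site, assuming its 51 page views per second correspond to 51 loops per second (each loop triggers only one full page load on the single-page version):

```js
// Scale bot loops per minute up to 100% CPU, then assume each visitor
// spends about a minute on the site.
const staticLoopsPerMin = (390 / 3) * 60; // ~7,800 loops/min at 36.6% CPU
const modernLoopsPerMin = 51 * 60;        //  3,060 loops/min at 68.0% CPU (assumed: 1 loop = 1 page load)
console.log(Math.round(staticLoopsPerMin / 0.366)); // ~21,300 -> "around 20,000"
console.log(Math.round(modernLoopsPerMin / 0.68));  // ~4,500  -> "around 4,000"
```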
P.P.S. My article on the gold standard is really good though. Click here to read it.