Other Things... Like Coding, And Random Stuff...

Production Sites & Micro-Services on $50 Per Month


First off, I'm not about to sell you anything. I'm just sharing how I run multiple quasi-production websites and microservices on a budget. $50 per month buys you a decent server "rented" at a data center, which can be cheaper than colocating your own server.

For this example, I'm using a dedicated server from reliablesite.com with an i7 CPU, 32GB of memory and an SSD. Not all that powerful compared to the Windows servers I run for true production sites, but sufficient for some websites and microservices. And, did I mention, cheap?

Why am I doing this? For one, a number of microservices are in use by my production sites, and the Windows/Docker combination just didn't work out.

I have used Linux since 1994 to host numerous sites over the years. Then, in the mid-2000s, I jumped ship and switched to Windows servers. Switching back to Linux for Docker, Docker Compose, Docker Swarm, etc. was simply a no-brainer. Using cloud services proved too expensive (small company, no million-dollar budget for infrastructure :-) so I set out to find alternatives.

Linux now hosts the quasi-production sites: sites clients use to review projects before they're finalized, demo sites, and of course microservices. No, this won't get you the high availability/scalability that Kubernetes or cloud services would, but you do get some of those benefits (better than my Windows servers) without breaking the bank.


But getting there is a bit harder with Linux because (you'll hate me) you're not using "real" products. You're using tools, pieced together like Legos or an Erector Set. You build it. You may say "Docker is a product". Well, no. It solves/implements a solution, but giving me a command line to manage the whole thing? That's not a product. Being a command-line jockey is not my idea of knowing a product, understanding what it does, or seeing the big picture. Portainer, on the other hand, is what I consider a product: it gives me point-and-click access to pretty much everything in Docker. Portainer is what Docker should have been. Portainer + Docker is a real product, worth paying for. And with this combination, the learning curve is substantially reduced.

Piecing Things Together

So my $50 server runs Docker and Docker Compose. Nginx, Portainer and Certbot (for SSL certs) run in containers. This setup is sufficient to manage and maintain the websites and microservices. The sites use SQL, Redis and some other services (all running in containers).
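A minimal sketch of such a compose file might look like the following. To be clear, this is my illustration, not the actual file from this server: the image tags, volume names and paths are assumptions.

```yaml
services:
  nginx:
    image: nginx:stable
    ports:
      - "80:80"      # the only ports published on the host
      - "443:443"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
      - certs:/etc/letsencrypt:ro        # certs written by certbot, read by nginx
    restart: unless-stopped

  portainer:
    image: portainer/portainer-ce
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets Portainer manage Docker
      - portainer_data:/data
    restart: unless-stopped              # no ports published; reached via nginx

  certbot:
    image: certbot/certbot
    volumes:
      - certs:/etc/letsencrypt

volumes:
  certs:
  portainer_data:
```

Note that only the nginx service publishes host ports; everything else stays reachable only on the internal Docker network.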


Nginx is essential for running multiple sites on the same server, as all sites are (normally) accessed through ports 80 and 443. Multiple sites cannot "share" the same port, so Nginx has to sit there, divvy up the traffic and send it to the correct site, each of which runs in its own Docker container.
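Conceptually, each site gets its own server block that terminates SSL and proxies traffic to the site's container by name. A rough sketch (the domain, container name and port here are examples, not my actual configuration):

```nginx
server {
    listen 443 ssl;
    server_name site-a.example.com;      # example domain

    ssl_certificate     /etc/letsencrypt/live/site-a.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/site-a.example.com/privkey.pem;

    location / {
        proxy_pass http://site-a:5000;   # container name on the Docker network
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

# One server block like the above per site/domain.
```

Nginx matches the incoming request's hostname against `server_name`, which is how many sites share ports 80/443 on one server.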

But I quickly realized that configuring Nginx is a pain in the butt. And Blue/Green deployment, which I have used for years on Windows, was not really an option out of the box. Yes, I could build a bunch of bash scripts to switch, but come on, this is not the 1990s.

Nginx Config

nginx.conf can quickly become unwieldy with many sites and SSL certs. With Blue/Green deployment, each site essentially has three URLs/domains (production, Blue and Green). And the Blue and Green sites should only be accessible from certain IP addresses, as I did not want them publicly accessible. And each URL has its own SSL cert. So many opportunities to make mistakes in the configuration...
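To make the shape of the problem concrete, here is a sketch of what the generated config amounts to for one site. Again, the domains, container names and IP address are illustrative assumptions:

```nginx
# Production URL: proxies to whichever color is currently live.
server {
    listen 443 ssl;
    server_name www.example.com;
    location / { proxy_pass http://site-blue:5000; }    # Blue is live right now
}

# Green (idle) URL: IP-restricted, used for testing before the switch.
server {
    listen 443 ssl;
    server_name green.example.com;
    allow 203.0.113.17;     # example admin IP
    deny  all;              # everyone else gets 403
    location / { proxy_pass http://site-green:5000; }
}
```

Releasing a new version means deploying to the idle color, testing it via its restricted URL, then repointing the production server block at it. Hand-editing this for every site is exactly the error-prone work the tool below automates.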

So I set out to develop a web-based tool to configure Nginx. It is, of course, based on YetaWF and is simply a website running in a container. It lets me configure a site with SSL and Blue/Green deployment, including URLs for accessing both the Blue and Green sites for testing before one is released as the active site.

Below is a screenshot of a server showing a few domains:

After changing a site's properties, the tool generates a new nginx.conf file. It actually merges its generated code into a model file, so predefined Nginx settings can be combined with the generated settings.
And clicking "Reload Nginx Configuration" activates the new settings in Nginx. No command line needed.


Sites, particularly the Blue/Green URLs, are only accessible from defined IP addresses. All of this is generated automatically.

The entire server only exposes ports 80 and 443. No SQL, Redis, Portainer, etc. ports can be accessed from outside, as those services all run in containers and none of their ports is published.
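In docker-compose terms this just means the backend services join a private network and have no `ports:` section at all. An illustrative fragment (images and names are examples):

```yaml
services:
  sql:
    image: postgres:16        # example; any SQL server works the same way
    networks: [backend]       # no "ports:" section, so nothing is published

  redis:
    image: redis:7
    networks: [backend]

  nginx:
    image: nginx:stable
    ports: ["80:80", "443:443"]   # the only published ports on the host
    networks: [backend]           # can still reach sql/redis by name internally

networks:
  backend:
```

Containers on the same network reach each other by service name (`sql:5432`, `redis:6379`), while the host's public interface exposes nothing but Nginx.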

iptables is also used to lock down all ports except 80/443. This, of course, was a manual process. It's a shame there is no iptables UI. Oh well.
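For reference, a lockdown like this boils down to a default-deny ruleset along these lines. This is a generic sketch, not the rules from this server, and the admin IP is an example; note also that Docker installs its own iptables chains for published ports, so rules like these govern the host's non-Docker traffic.

```shell
# Default-deny inbound; allow replies, loopback, SSH from one IP, and 80/443.
iptables -P INPUT DROP
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -p tcp -s 203.0.113.17 --dport 22 -j ACCEPT   # example admin IP
iptables -A INPUT -p tcp --dport 80  -j ACCEPT
iptables -A INPUT -p tcp --dport 443 -j ACCEPT
```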


With this extremely cheap and easy-to-use setup I get better availability (containers restart automatically), scalability for microservices and, of course, super simple website deployments. With Blue/Green deployment you'll never publish a site that doesn't work, because you can test a site before it becomes the active site. All for around $50 per month. Of course, none of this helps if the entire server dies. Finding an equally cheap solution for that is a problem for another day.

2 Comments - 1 Comment Pending Approval

billp@squaretree.com
Powershell and going backwards
I'm with you on the whole movement to command line, or in Linux's case, the lack of movement to a UI.  After years of Unix back in the early 80's where everything was vi and bash scripts and command line, I was so happy to be abstracted away from all that.  Why waste precious synapses remembering the exact sequence of switches needed for everything you want to do?  Don't get me wrong, there are times when I definitely go to the command line for something I can't do in the UI and that's fine.  But it's the exception, not the rule, except in the Linux world.  And if someone tells me it's because you have more control, I just ask them to load any package, watch the hundreds of lines of stuff it's doing to your computer, and tell me they know all the ramifications of all those additions or subtractions or settings or changes.  Seriously.  I'm not a Docker user yet, but as I get more back into Linux, because yes it is cheap, I will be, just so I can try loading different functionality and not worry that I'm going to have to rebuild my whole host machine, like almost happened installing nodeJS.
Back to the title: we've had it great in Windows for years with excellent interfaces.  Now the THRILL is to be conversant with Powershell, and people do things with it that they could do just as easily with the interface.  I guess it makes them feel like real computer geeks.  I got news: a real computer geek does as many things as easily as possible and saves the powerful tools for what they're meant for, complex things you can't do otherwise.
Can you upgrade your captcha so I don't have to guess what they think a boat is
Bill Pennock
