My goal:
- serve multiple Python Flask applications under the same development server
- no server-side caching
- run the Gunicorn applications as background tasks
- nginx as a reverse proxy that resolves application instances by hostname, all running on port 80
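For context, the nginx side I have in mind looks roughly like this: one server block per hostname, each proxying to that app's unix socket. This is a sketch, not my working config; the socket path matches the unit file below, and `include proxy_params` is the Debian/Ubuntu helper that sets the usual forwarding headers:

```nginx
server {
    listen 80;
    server_name website.com;

    location / {
        # forward requests to the gunicorn master listening on the unix socket
        include proxy_params;
        proxy_pass http://unix:/var/www/website.com/website.com.sock;
    }
}
```

A second application would get its own server block with a different server_name and socket, all sharing port 80.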
I've realised I can run my Flask/Gunicorn application in the background using systemd. The problem is that it keeps serving the version of the code that was loaded at startup, and I'd like it to serve a fresh build every time I commit new work while it's running.
systemd/system/website.com.service
[Unit]
Description=Gunicorn instance to serve website.com
After=network.target

[Service]
User=melcma
Group=www-data
PIDFile=/var/tmp/gunicorn.pid
WorkingDirectory=/var/www/website.com
Environment="PATH=/var/www/website.com/env/bin"
# gunicorn only writes the PID file if --pid is passed explicitly
ExecStart=/var/www/website.com/env/bin/gunicorn --workers 1 --pid /var/tmp/gunicorn.pid --bind unix:website.com.sock -m 007 wsgi:app
ExecReload=/bin/kill -s HUP $MAINPID
ExecStop=/bin/kill -s TERM $MAINPID

[Install]
WantedBy=multi-user.target
From what I've learned, the line "ExecReload=/bin/kill -s HUP $MAINPID" is supposed to reload my server, but how do I trigger it when files change? I've tried adding "--reload" to the ExecStart command, but it had no effect.
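The manual workaround I have right now is sending the reload myself after each push, and I've been sketching a git post-receive hook to automate it. The paths and the service name are the ones from the unit above; the hook itself is hypothetical, not something I have working:

```shell
#!/bin/sh
# hypothetical post-receive hook on the server's bare repo:
# check the new commit out into the served directory, then ask
# systemd to run ExecReload (i.e. HUP the gunicorn master)
GIT_WORK_TREE=/var/www/website.com git checkout -f
sudo systemctl reload website.com.service
```

Is that the intended way to hook the reload into my commits, or is there a cleaner mechanism?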
And a bonus question:
Is using systemd for this purpose a good approach, or would it rather butcher the server? (i.e. running a background task that launches a Gunicorn instance and reloads itself whenever the files change)
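One direction I've been wondering about is letting systemd itself watch for changes, with a path unit plus a oneshot helper service that triggers the reload. This is only a sketch: the unit names are made up, the service name comes from above, and I'm watching wsgi.py as a stand-in for whatever actually changes on deploy:

```ini
# systemd/system/website.com-watch.path
[Unit]
Description=Watch website.com sources for changes

[Path]
# fires whenever the watched file is modified and closed
PathModified=/var/www/website.com/wsgi.py

[Install]
WantedBy=multi-user.target

# systemd/system/website.com-watch.service
[Unit]
Description=Reload gunicorn when website.com sources change

[Service]
Type=oneshot
ExecStart=/bin/systemctl reload website.com.service
```

Would that be considered sane, or is it exactly the kind of self-refreshing background task that would butcher the server?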