— Feature Card
Description
In the process of looking through the documentation, I’ve realized that there is no “simple” way to do a 1-click deploy of a fresh Frontity instance and WordPress. The documentation is quite detailed and walks you through many different choices, but it falls short there. I know that many people use Frontity with existing systems, but we could really benefit from being able to go straight to production with a sample application like mars-theme.
User Stories
As a Frontity developer
I want to be able to deploy both Frontity and WordPress to a single hosting provider with one click
so that I have as few concepts to learn as possible (hosting, etc.)
I think we could create 1-click installs for Heroku (with buildpacks) and Google Cloud Run (with Docker). It looks like DigitalOcean doesn’t allow this; all their 1-click installs are created and maintained by them.
Apart from that, maybe we can add tutorials on configuring WP + Frontity on the same server using Apache/Nginx with forever/pm2.
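To make the forever/pm2 part of such a tutorial concrete, here is a minimal sketch of a pm2 ecosystem file that keeps the Frontity server running; the app name, path, and port are placeholders, and it assumes the project has already been built with npx frontity build:

// ecosystem.config.js: a minimal pm2 sketch (name, path, and port are placeholders).
module.exports = {
  apps: [
    {
      name: "frontity",                   // process name shown in pm2 list
      cwd: "/var/www/frontity",           // placeholder path to the Frontity project
      script: "npx",
      args: "frontity serve --port 3000", // serve the production build on port 3000
      env: {
        NODE_ENV: "production",
      },
    },
  ],
};

With a file like that in the project root, pm2 start ecosystem.config.js brings the process up, and pm2 startup plus pm2 save keep it running across reboots.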
Yes, that’s exactly the kind of thing I’m talking about. I think both are very valid options.
Well, more than Heroku and Google Cloud Run, I’d say we need to create a buildpack and a Docker image. I don’t know if buildpacks support more than one language (PHP + Node).
Is there any proof-of-concept of this?
Not that I know of.
But from the conversations in Proof of concept on Pantheon and Deploy on Google Cloud Platform, and from the experience of some of the big publishers using Frontity, it seems like the Theme Bridge is going to be a better solution.
I’ve built this several times from scratch recently and have been planning a full guide. Will update you once I have it.
Nginx with pm2 specifically
That’s really nice, thanks @403page.
Did you have any problems? From what we have heard from other developers, it seemed easy at first but turned out to be tricky because of the number of exceptions you need to take into account in your Nginx config.
This is my current Frontity configuration with pm2 and Nginx. Let me know if you need help regarding Nginx, pm2, or forever.
upstream frontity {
    server 127.0.0.1:3000 max_fails=0 fail_timeout=10s weight=1;

    # Send visitors back to the same server each time.
    ip_hash;

    # Maximum number of idle keep-alive connections to the upstream.
    keepalive 512;
}

server {
    root /oscod/webapps/frontitydemo/build;
    index index.html index.htm;
    server_name example.com;

    location / {
        proxy_pass http://frontity/;

        # Headers to pass to the proxy server.
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_set_header X-NginX-Proxy true;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_cache_bypass $http_upgrade;
        proxy_http_version 1.1;
        proxy_redirect off;

        # Go to the next upstream server if this one is down.
        proxy_next_upstream error timeout http_500 http_502 http_503 http_504;
        proxy_connect_timeout 5s;

        # Gateway timeouts.
        proxy_read_timeout 20s;
        proxy_send_timeout 20s;

        # Define buffers, necessary for proper communication and to prevent 502s.
        proxy_buffer_size 128k;
        proxy_buffers 4 256k;
        proxy_busy_buffers_size 256k;
    }

    location ~* \.(css|js|ico|gif|svg)$ {
        expires 168h;
        add_header Pragma public;
        add_header Cache-Control "public, must-revalidate, proxy-revalidate";
    }

    location ~* \.html$ {
        expires -1;
    }

    keepalive_timeout 10;
    client_max_body_size 200M;

    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
}

server {
    if ($host = example.com) {
        return 301 https://$host$request_uri;
    }

    server_name example.com;
    listen 80;
    return 404;
}
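For completeness, the Frontity app behind that proxy still needs to know where the WordPress REST API lives. Here is a rough sketch of a frontity.settings.js for this kind of same-server setup, mirroring the mars-theme starter; the domains are placeholders, since WordPress is often exposed on a subdomain or a separate server block on the same machine:

// frontity.settings.js: a sketch based on the mars-theme starter.
// The URLs below are placeholders for wherever this WordPress lives.
const settings = {
  name: "mars-demo",
  state: {
    frontity: {
      url: "https://example.com",
      title: "Frontity + WordPress on one server",
    },
  },
  packages: [
    "@frontity/mars-theme",
    "@frontity/tiny-router",
    "@frontity/html2react",
    {
      name: "@frontity/wp-source",
      state: {
        source: {
          // REST API of the WordPress install on this machine.
          api: "https://wp.example.com/wp-json",
        },
      },
    },
  ],
};

export default settings;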