Hosting a Ghost blog on OVH with Docker

For years I had been looking for a low-budget, Europe-hosted solution for web applications built with technologies other than PHP/HTML. Then came the OVH VPS product: 2.99€/month for a 10 GB VPS with a large choice of operating systems. That seems promising for learning server management!

Docker is magic

At the time of writing, I don't have much knowledge of Linux or server management, but one thing in the OVH offer immediately caught my attention: they provide servers built with the trendy container engine of the moment, Docker.

Quick summary: Docker describes itself as "an open platform for distributed applications for developers and sysadmins". In other words, Docker lets developers publish application images to the hub (containing everything from the core OS to the application itself), and lets sysadmins deploy those images with one simple command line.

That saved my life! No OS configuration, no installation of node, git, etc. Just deploy an app image of my choice and it is up and running.

Running Ghost with Docker

Knowing that, it took me two minutes to get my first blog application running, by typing in a console:

> docker run -d -p 80:2368 ptimof/ghost npm start --production
  • -d: Run the image as a daemon (do not stop after the console is closed)
  • -p 80:2368: Map port 80 of the VPS to port 2368 of the docker image
  • ptimof/ghost: The name of the application image
  • npm run --production: The command to run inside the container

Fine: after a little DNS configuration, visiting my domain shows a nice and fresh Ghost blog.

I chose ptimof/ghost instead of the official ghost image because it provides an easier configuration.

Data persistance

A question quickly arises: where is my data, and how do I ensure it persists if I restart my image?

Docker can share a folder between the host (the VPS) and the docker image. When the image runs, the shared folder's files override the image's files at that path, and the application creates its files in the folder if they are not already present there.

Our command line now becomes:

> docker run -d -p 80:2368 -v /root/ghost:/var/lib/ghost ptimof/ghost npm start --production

where /root/ghost is the host directory in which I want to keep my data.
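To keep each flag's role explicit, the full command above can be assembled from named pieces. A minimal sketch (the ports, paths, and image name are the ones used in this post):

```shell
# Build the docker run command from named pieces so each flag's role is clear.
HOST_PORT=80          # port exposed on the VPS
CONTAINER_PORT=2368   # port Ghost listens on inside the container
DATA_DIR=/root/ghost  # host folder that will hold the blog's data
IMAGE=ptimof/ghost    # application image from the hub

CMD="docker run -d -p ${HOST_PORT}:${CONTAINER_PORT} -v ${DATA_DIR}:/var/lib/ghost ${IMAGE} npm start --production"
echo "$CMD"
```

Changing HOST_PORT or DATA_DIR is then a one-line edit rather than a hunt through the command.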

After creating this folder and running my container, I can see that it has been populated with some files (theme, database, etc.). Perfect!

Docker compose

OK, all of this is really nice, but I really don't want to redo it every time I launch my image, on the server or on my local computer.

Here docker-compose comes to my rescue: a nice CLI that runs a set of images based on a docker-compose.yml file, which contains all the image configuration.

  blog:
    image: ptimof/ghost
    command: npm start --production
    volumes:
      - .:/var/lib/ghost
    env_file:
      - ./variables.env
    restart: always
    ports:
      - "2368"

As you can see, docker compose lets us store our docker run configuration in a file for easier use.

This file should be placed in our shared folder /root/ghost.

I added some useful options:

  • env_file: Allows defining environment variables in a separate file
  • restart: Automatically restarts the container if it crashes
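For reference, variables.env is just a plain KEY=value file, one entry per line. The exact variable names the ptimof/ghost image understands depend on that image, so the entries below are purely illustrative assumptions:

```
# variables.env — one KEY=value per line
# NOTE: these names are illustrative only; check the image's documentation
NODE_ENV=production
BLOG_URL=http://example.com
```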

That lets us simplify the deployment command to:

> docker-compose run -d -p 80:2368 blog

To use it, don't forget to install docker-compose on your VPS. That can easily be done by downloading the binary from the docker GitHub repository:

> curl -L`uname -s`-`uname -m` > /usr/local/bin/docker-compose
> chmod +x /usr/local/bin/docker-compose
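If you want to sanity-check the download-then-chmod pattern without touching the real binary, here is a sketch using a stand-in script (the /tmp path and echoed text are made up for illustration):

```shell
# Stand-in for the real download: write a tiny script, mark it executable, run it.
BIN=/tmp/fake-compose
printf '#!/bin/sh\necho compose-ok\n' > "$BIN"  # stands in for the curl download
chmod +x "$BIN"                                 # same step as for docker-compose
"$BIN"                                          # prints: compose-ok
```

With the real docker-compose binary in place of the stand-in, `docker-compose --version` is the equivalent sanity check.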


Docker made the deployment of my blog awesomely easy: with a few commands I have a working and robust environment, runnable on any machine (OVH VPS, local machine), without struggling with any dependencies (node js, etc.).

And bonus: It's very fun to play with !