Python is the oldie-but-goodie of programming languages. In development since the late ’80s, it has provided developers with reliability and ease of use for decades.

No matter how reliable your coding language is, however, you need to implement continuous integration and delivery (CI/CD) to detect and remedy errors quickly. When you have confidence in the accuracy of your code, you can ship updates faster and with fewer mistakes.

By the end of this hands-on guide, you’ll understand how to build, test and deploy a Python website. We’ll then show you how to use a continuous integration and delivery platform, Semaphore, to automate the whole process. The final CI/CD pipeline will look like this:

What we’re building: a multi-stage continuous integration and deployment pipeline for Python web apps

Demo application

In this section we will play with a demo application. It’s a simple task manager: we can create, edit and delete tasks. We also have a separate admin site to manage users and permissions. The website is built with Python and Django, and the data is stored in MySQL.

Django is a web application framework based on the MVC (Model-View-Controller) pattern. As such, it keeps a strict separation between the data model, the rendering of views, and the application logic, which is managed by the controller. This approach encourages modularity and makes development easier.
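The pattern is easier to see outside a framework. Here is a bare-bones sketch of the three roles in plain Python (not Django code; the names are made up for illustration):

```python
# A minimal, hypothetical MVC sketch -- not Django code.

class TaskModel:
    """Model: owns the data and nothing else."""
    def __init__(self, title, done=False):
        self.title = title
        self.done = done

def task_view(task):
    """View: renders a model; knows nothing about storage or input."""
    status = "done" if task.done else "pending"
    return f"[{status}] {task.title}"

def complete_task(task):
    """Controller: application logic that updates the model."""
    task.done = True
    return task

task = complete_task(TaskModel("write tests"))
print(task_view(task))  # [done] write tests
```

In Django, models live in models.py, views render templates, and the framework itself plays much of the controller role by routing requests.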

Prerequisites

Before getting started you’ll need the following:

Get the code

  1. Create an account on GitHub.
  2. Go to Semaphore Django demo and hit the Fork button on the top right.
  3. Click on the Clone or download button and copy the provided URL.
  4. Open a terminal on your computer and paste the URL:
$ git clone https://github.com/your_repository_url

What do we have here?

Exploring our new project we find:

  • README.md: instructions for installing and running the app.
  • requirements.txt: list of Python packages required for the project.
  • tasks: contains the main code for our app.
  • pydjango_ci_integration:
    • settings.py: main Django config, includes the DB connection parameters.
    • urls.py: URL route config.
    • wsgi.py: web server config.
  • .semaphore: contains the continuous integration config.

Examining the contents of requirements.txt reveals some interesting information:

  • nose and coverage: unit testing.

Developers use unit tests to find errors. They validate code behavior by running small pieces of it and comparing the results against expectations. The nose package runs the test cases, and coverage measures their effectiveness: it can figure out which parts of the code are tested and which are not.

  • pylint: static code analysis.

pylint scans the code for anomalies: bad coding practices, missing documentation, unused variables, and other dubious things. It adheres to the official PEP 8 style guide. By following a standard, we get better readability and easier collaboration within a development team.
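To see the kind of thing pylint catches, here is a contrived function (hypothetical, not from the demo app) with issues it would flag:

```python
# Contrived example of code pylint complains about: among other things,
# it flags missing docstrings and W0612 (unused-variable).

def add_task(title):
    unused = "never read"  # W0612: unused-variable
    return {"title": title, "done": False}

print(add_task("write docs"))  # {'title': 'write docs', 'done': False}
```

The code runs fine, which is exactly the point: static analysis finds smells that tests and the interpreter let slide.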

  • selenium: browser testing.

Selenium is a browser automation tool primarily used to test websites. Tests done in the browser can cover parts that otherwise can’t be tested, such as JavaScript running on the client.

Run the demo on your computer

To see the application in action, we still have some work ahead of us.

Create a database

Tasks are stored in a database called pydjango:

$ mysql -u root -ANe"CREATE DATABASE pydjango;"

If your MySQL root user has a password, add -p or --password= to the command above.

Create a virtualenv and install dependencies

A virtualenv is an isolated directory that holds the project’s Python packages. Create a virtualenv and activate it:

$ python -m venv virtualenv
$ source ./virtualenv/bin/activate

Install the packages as usual:

$ pip install -r requirements.txt

Django should now be installed on your computer.

Django setup

Our pydjango database is empty. Django will take care of that:

$ python manage.py migrate

manage.py is Django’s main administration script. migrate creates and updates all db entities automatically. Thus, each time we modify our data model, we need to repeat the migration.

We should also create an administration user. It will allow us to manage users and permissions:

$ python manage.py createsuperuser

Fire it up

We’re all set. Start the application. Django ships with a development server, so for local testing we don’t need a web server such as Apache or nginx.

$ python manage.py runserver

Open a browser and contemplate your shiny new website in all its glory. The main site is found at http://127.0.0.1:8000. The admin backoffice should be located at http://127.0.0.1:8000/admin.

Screenshot of Python Django Initial Admin Dashboard
What you should see when you launch the Django application.

Testing the app

Now that the application is up and running, we can take a few minutes to do a little bit of testing. We should start with the code analysis:

$ pylint --load-plugins=pylint_django tasks/*.py
************* Module tasks.views
tasks/views.py:11:0: R0901: Too many ancestors (8/7) (too-many-ancestors)
tasks/views.py:18:4: W0221: Parameters differ from overridden 'get_context_data' method (arguments-differ)
tasks/views.py:24:0: R0901: Too many ancestors (11/7) (too-many-ancestors)
tasks/views.py:38:0: R0901: Too many ancestors (8/7) (too-many-ancestors)
tasks/views.py:46:0: R0901: Too many ancestors (11/7) (too-many-ancestors)
tasks/views.py:60:0: R0901: Too many ancestors (10/7) (too-many-ancestors)

------------------------------------------------------------------
Your code has been rated at 8.97/10 (previous run: 8.38/10, +0.59)

pylint gives us some warnings and an overall code rating. We got some refactoring suggestions (R) and warnings (W). Not too bad, although we may want to look into them at some point in the future.

The testing code is located in tasks/tests:

  • test_browser.py: checks that the site is up and its title contains “Semaphore”.
  • test_models.py: creates a single sample task and verifies its values.
  • test_views.py: creates 20 sample tasks and checks the templates and views.

All tests run on a separate, test-only database, so they don’t conflict with any real user’s data.

If you have Google Chrome or Chromium installed, you can run the browser test suite. During the test, the Chrome window may briefly flash on your screen:

$ python manage.py test tasks.tests.test_browser
nosetests tasks.tests.test_browser --with-coverage --cover-package=tasks --verbosity=1
Creating test database for alias 'default'...
[07/May/2019 13:48:01] "GET / HTTP/1.1" 200 2641
[07/May/2019 13:48:03] "GET /favicon.ico HTTP/1.1" 200 2763
.
Name Stmts Miss Cover
-----------------------------------------------------------------
tasks/__init__.py 0 0 100%
tasks/apps.py 3 3 0%
tasks/migrations/0001_initial.py 5 0 100%
tasks/migrations/0002_auto_20190214_0647.py 4 0 100%
tasks/migrations/0003_auto_20190217_1140.py 4 0 100%
tasks/migrations/__init__.py 0 0 100%
tasks/models.py 14 14 0%
-----------------------------------------------------------------
TOTAL 30 17 43%
----------------------------------------------------------------------
Ran 1 test in 4.338s

OK
Destroying test database for alias 'default'...

We can also run the unit test suites, one at a time:

$ python manage.py test tasks.tests.test_models
$ python manage.py test tasks.tests.test_views

Finally, we have the Django checklist to look for security issues:

$ python manage.py check --deploy

System check identified some issues:

WARNINGS:
?: (security.W004) You have not set a value for the SECURE_HSTS_SECONDS setting. If your entire site is served only over SSL, you may want to consider setting a value and enabling HTTP Strict Transport Security. Be sure to read the documentation first; enabling HSTS carelessly can cause serious, irreversible problems.
?: (security.W006) Your SECURE_CONTENT_TYPE_NOSNIFF setting is not set to True, so your pages will not be served with an 'x-content-type-options: nosniff' header. You should consider enabling this header to prevent the browser from identifying content types incorrectly.
?: (security.W007) Your SECURE_BROWSER_XSS_FILTER setting is not set to True, so your pages will not be served with an 'x-xss-protection: 1; mode=block' header. You should consider enabling this header to activate the browser's XSS filtering and help prevent XSS attacks.
?: (security.W008) Your SECURE_SSL_REDIRECT setting is not set to True. Unless your site should be available over both SSL and non-SSL connections, you may want to either set this setting True or configure a load balancer or reverse-proxy server to redirect all connections to HTTPS.
?: (security.W012) SESSION_COOKIE_SECURE is not set to True. Using a secure-only session cookie makes it more difficult for network traffic sniffers to hijack user sessions.
?: (security.W016) You have 'django.middleware.csrf.CsrfViewMiddleware' in your MIDDLEWARE, but you have not set CSRF_COOKIE_SECURE to True. Using a secure-only CSRF cookie makes it more difficult for network traffic sniffers to steal the CSRF token.
?: (security.W019) You have 'django.middleware.clickjacking.XFrameOptionsMiddleware' in your MIDDLEWARE, but X_FRAME_OPTIONS is not set to 'DENY'. The default is 'SAMEORIGIN', but unless there is a good reason for your site to serve other parts of itself in a frame, you should change it to 'DENY'.

System check identified 7 issues (0 silenced).

We got some warnings but no showstoppers; we’re good to go.

Deploy to PythonAnywhere

Websites are meant to run on the internet. PythonAnywhere is a hosting provider that, as the name suggests, specializes in Python. In this section, we’ll use it to publish our app for the world to enjoy.

Sign up with PythonAnywhere

Head to PythonAnywhere and create an account. The free tier allows one web application and a MySQL database, which is plenty for our immediate needs.

Create database

Go to the Databases tab:

Screenshot of Databases tab in PythonAnywhere application
Navigate to the Databases tab in PythonAnywhere.

Set up a database password. Avoid using the same password as the login:

Screenshot of setting a database password in PythonAnywhere
Screenshot of setting a database password in PythonAnywhere.

Take note of the database host address.

Create a database called pydjango_production

Screenshot of database creation in PythonAnywhere
Give your new database a name and click the Create button.

You’ll notice your username has been automatically prefixed to the database name; that’s just how PythonAnywhere does things.

Screenshot of username prefix to database name in PythonAnywhere
You should see your username prefixed to your database name.

Create an API Token

An API Token is required for the next automation step. To request one:

  1. Go to Account.
  2. Click the API Token tab.
  3. Hit the Create button.
  4. Take note of the API Token shown.

Create the website

There are a couple of alternatives for editing files in PythonAnywhere. From the Dashboard you can:

  • Under Files: use Browse files to edit and Open another file to create.
  • Or click on the Bash button under New console, where we can use vim or emacs.
Screenshot of new console creation in Python Anywhere
In the Dashboard, click the Bash button to create a new console.

Create a file called .env-production:

# ~/.env-production

# This value is found on PythonAnywhere Accounts->API Token.
export API_TOKEN=<PYTHON_ANYWHERE_API_TOKEN>

# Django Secret Key - Use a long random string for security.
export SECRET_KEY=<DJANGO_SECRET_KEY>

# These values can be located on PythonAnywhere Databases tab.
export DB_HOST=<DATABASE_HOST_ADDRESS>
export DB_USER=<USERNAME>
export DB_PASSWORD=<DATABASE_PASSWORD>

# The name of the DB is prefixed with USERNAME$
export DB_NAME='<USERNAME>$pydjango_production'
export DB_PORT=3306

Source the environment variables to make them available in your session:

$ source ~/.env-production

Now we’re ready to create the website. Luckily for us, there is an official helper script. If you own a domain and wish to use it for your site, use the following command:

$ pa_autoconfigure_django.py --python=3.7 --domain=<YOUR_WEBSITE_ADDRESS> <GITHUB_REPOSITORY_URL>

If you don’t have a domain, just skip the --domain option to use the default: USERNAME.pythonanywhere.com.

$ pa_autoconfigure_django.py --python=3.7 <GITHUB_REPOSITORY_URL>

The script should take a few minutes to complete. Take a cup of coffee and don’t forget to stretch.

Create a CNAME

This step is only required if you’re using your own domain. Go to Web, copy the value under DNS Setup.

Screenshot of CNAME set up in Python anywhere.
Use the value under DNS setup to create a CNAME record for your domain.

Now, head to your domain’s DNS provider to create a CNAME record pointing to that address.

Edit WSGI

WSGI is the interface Python web applications use to talk to the web server. We need to modify its configuration file to make the environment variables available inside the application.
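WSGI itself is a small contract: the server calls a single application callable, passing the request environment and a start_response function. A minimal example, independent of Django:

```python
# A minimal WSGI application. What Django's get_wsgi_application()
# returns is a more elaborate callable honoring this same contract.

def application(environ, start_response):
    body = b"Hello, WSGI"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

You can serve it locally with the standard library, e.g. wsgiref.simple_server.make_server("", 8000, application).serve_forever().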

Go to Web and open the WSGI configuration file link.

Screenshot of the WSGI configuration file in PythonAnywhere
The WSGI configuration file link you’ll modify in the next step.

We need three lines added near the end of the file:

# This file contains the WSGI configuration required to serve up your
# Django app
import os
import sys

# Add your project directory to the sys.path
settings_path = '/home/tomfern/staging.tomfern.com'
sys.path.insert(0, settings_path)

# Set environment variable to tell django where your settings.py is
os.environ['DJANGO_SETTINGS_MODULE'] = 'pydjango_ci_integration.settings'

# -------> ADD THESE NEXT THREE LINES <-------
from dotenv import load_dotenv
env_file = os.path.expanduser('~/.env-production')
load_dotenv(env_file)
# --------------------------------------------

# Set the 'application' variable to the Django wsgi app
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()

Go Live!

Time for all the hard work to pay off. Go back to Web and click on the Reload button. Welcome to your new website.

The importance of Continuous Integration

Testing is the bread and butter of development. When done badly, it is tedious, ineffective and counter-productive. But proper testing brings a ton of benefits: stability, quality, fewer conflicts, fewer errors and confidence in the correctness of the code.

Continuous integration (CI) is a programming discipline in which the application is built and tested each time code is modified. By making multiple small changes instead of a big one, problems are detected earlier and corrected faster. Such a paradigm, clearly, calls for an automated system to carry out all the steps. In such systems, code travels over a path, a pipeline, and it must pass an ever-growing number of tests before it can reach the users.

In the past, developers had to buy servers and manage infrastructure in order to do CI, which obviously increased costs beyond the reach of small teams. Fortunately, in this cloud enabled world, everyone can enjoy the benefits of CI.

Running continuous integration on Semaphore

Semaphore adds value to our project sans the hassle of managing a CI infrastructure.

The demo project already includes a Semaphore config. So we can get started in a couple of minutes:

Sign up with Semaphore

Go to SemaphoreCI.com and click on the Sign up with GitHub button.

Connect your repository

Under Projects, click on New. You’ll see a list of your repositories:

Screenshot of repository list for configuring continuous integration in Semaphore
Click on the Add repository button.

Push to GitHub

To start the pipeline, edit or create any file and push to GitHub:

$ touch test_pipeline.md
$ git add test_pipeline.md
$ git commit -m "added semaphore"
$ git push origin master

That’s it! Go back to your Semaphore dashboard and there’s the pipeline:

Screenshot of Python Django ci/cd pipeline in Semaphore
Python continuous integration pipeline on Semaphore

The Continuous Integration Pipeline

This is a good chance to review the config. Take a look at .semaphore/semaphore.yml:

# .semaphore/semaphore.yml

# Use the latest stable version of Semaphore 2.0 YML syntax:
version: v1.0

# Name your pipeline. In case you connect multiple pipelines with promotions,
# the name will help you differentiate between, for example, a CI build phase
# and delivery phases.
name: Semaphore Python / Django Example Pipeline

# An agent defines the environment in which your code runs.
# It is a combination of one of available machine types and operating
# system images.
# See https://docs.semaphoreci.com/article/20-machine-types
# and https://docs.semaphoreci.com/article/32-ubuntu-1804-image
agent:
  machine:
    type: e1-standard-2
    os_image: ubuntu1804

# Blocks are the heart of a pipeline and are executed sequentially.
# Each block has a task that defines one or more jobs. Jobs define the
# commands to execute.
# See https://docs.semaphoreci.com/article/62-concepts
blocks:
  - name: "Install Dependencies"
    task:
      prologue:
        commands:
          - sem-version python 3.7
          - sudo apt-get update && sudo apt-get install -y python3-dev && sudo apt-get install default-libmysqlclient-dev
      jobs:
        - name: pip
          commands:
            - checkout
            - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt),requirements-$SEMAPHORE_GIT_BRANCH-,requirements-master-
            - pip download --cache-dir .pip_cache -r requirements.txt
            - cache store requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt) .pip_cache
  - name: "Run Code Analysis"
    task:
      prologue:
        commands:
          - sem-version python 3.7
          - checkout
          - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
          - pip install -r requirements.txt --cache-dir .pip_cache
      jobs:
        - name: Pylint
          commands:
            # git ls-files lists the files in the index and working tree;
            # grep -v excludes files we don't want to lint;
            # grep -E keeps only files with the .py extension.
            # This passes the relevant Python files to pylint with the
            # pylint_django plugin; the -E option reports errors only.
            - git ls-files | grep -v 'migrations' | grep -v 'settings.py' | grep -v 'manage.py' | grep -E '.py$' | xargs pylint -E --load-plugins=pylint_django
  - name: "Run Unit Tests"
    task:
      prologue:
        commands:
          - sem-version python 3.7
          - sem-service start mysql
          - checkout
          - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
          - pip install -r requirements.txt --cache-dir .pip_cache
      jobs:
        - name: Model Test
          commands:
            - python manage.py test tasks.tests.test_models
        - name: View Test
          commands:
            - python manage.py test tasks.tests.test_views
  - name: "Run Browser Tests"
    task:
      env_vars:
        - name: DB_NAME
          value: 'pydjango'
      prologue:
        commands:
          - sem-version python 3.7
          - sem-service start mysql
          - sudo apt-get install -y -qq mysql-client
          - mysql --host=0.0.0.0 -uroot -e "create database $DB_NAME"
          - checkout
          - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
          - pip install -r requirements.txt --cache-dir .pip_cache
          - nohup python manage.py runserver &
      jobs:
        - name: Browser Test
          commands:
            - python manage.py test tasks.tests.test_browser
  - name: "Run Security Tests"
    task:
      jobs:
        - name: Deployment Checklist
          commands:
            - checkout
            - sem-version python 3.7
            - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
            - pip install -r requirements.txt --cache-dir .pip_cache
            - python manage.py check --deploy --fail-level ERROR

A lot to unpack here. I’ll go step by step. It starts with the config version and a name.

version: v1.0
name: Semaphore Python / Django Example Pipeline

The pipeline runs on an agent, which is basically a virtual machine paired with an operating system. The machine is automatically managed by Semaphore. We’re using an e1-standard-2 machine (2 vCPUs, 4 GB RAM, 25 GB disk) with an Ubuntu 18.04 LTS image.

agent:
  machine:
    type: e1-standard-2
    os_image: ubuntu1804

Blocks define the pipeline actions. Each block has a task, and each task contains one or more jobs. All jobs within a block run concurrently; blocks, on the other hand, run sequentially. Once all jobs in a block are completed, the next block starts.

The first block installs Linux and Python packages.

blocks:
  - name: "Install Dependencies"
    task:
      prologue:
        commands:
          - sem-version python 3.7
          - sudo apt-get update && sudo apt-get install -y python3-dev && sudo apt-get install default-libmysqlclient-dev
      jobs:
        - name: pip
          commands:
            - checkout
            - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt),requirements-$SEMAPHORE_GIT_BRANCH-,requirements-master-
            - pip download --cache-dir .pip_cache -r requirements.txt
            - cache store requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt) .pip_cache

Commands used here are:

  • prologue: commands listed here run before each job in the block.
  • sem-version: sets the active Python version.
  • checkout: clones the code from GitHub.
  • cache: stores and retrieves files between jobs; here it’s used for the Python packages.

The “Run Code Analysis” block uses pylint to review the code. Each job runs in a clean environment, so we need to retrieve the code and packages each time.

- name: "Run Code Analysis"
  task:
    prologue:
      commands:
        - sem-version python 3.7
        - checkout
        - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
        - pip install -r requirements.txt --cache-dir .pip_cache
    jobs:
      - name: Pylint
        commands:
          - git ls-files | grep -v 'migrations' | grep -v 'settings.py' | grep -v 'manage.py' | grep -E '.py$' | xargs pylint -E --load-plugins=pylint_django

The next block runs the Django models and views unit tests. The tests run in parallel, each with its own separate MySQL database, started with sem-service.

- name: "Run Unit Tests"
  task:
    prologue:
      commands:
        - sem-version python 3.7
        - sem-service start mysql
        - checkout
        - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
        - pip install -r requirements.txt --cache-dir .pip_cache
    jobs:
      - name: Model Test
        commands:
          - python manage.py test tasks.tests.test_models
      - name: View Test
        commands:
          - python manage.py test tasks.tests.test_views

In order to run the browser tests, the application and a database need to be started; the prologue takes care of this. Once started, a Selenium test is executed in a Google Chrome instance.

- name: "Run Browser Tests"
  task:
    env_vars:
      - name: DB_NAME
        value: 'pydjango'
    prologue:
      commands:
        - sem-version python 3.7
        - sem-service start mysql
        - sudo apt-get install -y -qq mysql-client
        - mysql --host=0.0.0.0 -uroot -e "create database $DB_NAME"
        - checkout
        - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
        - pip install -r requirements.txt --cache-dir .pip_cache
        - nohup python manage.py runserver &
    jobs:
      - name: Browser Test
        commands:
          - python manage.py test tasks.tests.test_browser

The last block does the security checklist. It will tell us if the app is ready for deployment.

- name: "Run Security Tests"
  task:
    jobs:
      - name: Deployment Checklist
        commands:
          - checkout
          - sem-version python 3.7
          - cache restore requirements-$SEMAPHORE_GIT_BRANCH-$(checksum requirements.txt)
          - pip install -r requirements.txt --cache-dir .pip_cache
          - python manage.py check --deploy --fail-level ERROR

Continuous deployment for Python apps

Deployment is a complex process with a lot of moving parts. It would be a shame if, after painstakingly writing tests for everything, the application crashes due to a faulty deployment.

Continuous Deployment (CD) is an extension of the CI concept; in fact, most integration tools don’t make a great distinction between CI and CD. A CD pipeline performs all the deployment steps as a repeatable, battle-hardened process.

Even the best tests in the world can’t catch all errors. Moreover, some problems may only surface when the app is live. Think, for example, of a website that passes every test but crashes in production because the hosting provider has the wrong database version.

To avoid this kind of problem, it is a good strategy to have at least two copies of the app: production for our users, and staging as a guinea pig for developers.

Staging and production ought to be identical, including all the infrastructure: operating system, database and package versions.

Automating deployment with Semaphore promotions

We’re going to write two new pipelines:

  • Production: deploys manually at our convenience.
  • Staging: deploys to the staging site every time all the tests pass.

Pipelines are connected with promotions. Promotions allow us to start other pipelines, either manually or automatically on user-defined conditions. Both deployments will branch out of the CI pipeline.

SSH Access

From here on, we need a paid account on PythonAnywhere; there’s no way around it, since we need direct SSH access. If you are subscribing, consider paying for two websites, as the second one will be used for staging. You can easily upgrade your plan from your account page: switch to the “Hacker” plan and bump the number of websites from 1 to 2.

If you don’t have an SSH key on your machine already, generating a new one takes just a few seconds. Leave the passphrase blank when asked:

$ ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/home/tom/.ssh/id_rsa):
Created directory '/home/tom/.ssh'.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/tom/.ssh/id_rsa.
Your public key has been saved in /home/tom/.ssh/id_rsa.pub.
The key fingerprint is:
SHA256:c1zTZkOtF79WD+2Vrs5RiU4oWNImt96JkQWGiHAnA38 tom@ix
The key's randomart image is:
+---[RSA 2048]----+
| oo+... .o .. |
| o.+. .o . o ..|
| . E o = .o =o+|
| . B.+..++oB|
| .S=o. o.*=|
| .o= + .+o|
| o o oo |
| ... |
| .o |
+----[SHA256]-----+

Now we just need to let the server know about our key. Use your PythonAnywhere username and password:

$ ssh-copy-id <USERNAME>@ssh.pythonanywhere.com

Try logging in now; no password should be required:

$ ssh <USERNAME>@ssh.pythonanywhere.com
<<<<<<:>~ PythonAnywhere SSH. Help @ https://help.pythonanywhere.com/pages/SSHAccess

Storing credentials with secrets

The deployment process needs some secret data, for example the SSH key to connect to PythonAnywhere. The environment file also has sensitive information, so we need to protect it.

Semaphore provides a secure mechanism to store sensitive information. We can easily create secrets from Semaphore’s dashboard. Go to Secrets under Configuration and use the Create New Secret button.

Screenshot of secret creation in Semaphore
Click Create New Secret button in Semaphore to create a secret for your PythonAnywhere SSH key.

Create the secret for the SSH key by uploading your .ssh/id_rsa file to Semaphore.

Screenshot showing how to upload the SSH key to Semaphore
Input your Secret name, and upload the SSH key file.

Now we need a copy of the environment file. It’s the same file created when we were publishing the website:

# ~/.env-production

# This value is found on PythonAnywhere Accounts->API Token.
export API_TOKEN=<PYTHON_ANYWHERE_API_TOKEN>

# Django Secret Key - Use a long random string for security.
export SECRET_KEY=<DJANGO_SECRET_KEY>

# These values can be located on PythonAnywhere Databases tab.
export DB_HOST=<DATABASE_HOST_ADDRESS>
export DB_USER=<USERNAME>
export DB_PASSWORD=<DATABASE_PASSWORD>

# The name of the DB is prefixed with USERNAME$
export DB_NAME='<USERNAME>$pydjango_production'
export DB_PORT=3306

Upload the production environment file:

Screenshot of adding a Secret in Semaphore for the production environment file.
Input the Secret name, and upload the production environment file.

Production deployment pipeline

To deploy the website, create a new pipeline file called .semaphore/deploy-production.yml.

# .semaphore/deploy-production.yml
version: v1.0
name: Deploy Django to PythonAnywhere
agent:
  machine:
    type: e1-standard-2
    os_image: ubuntu1804
blocks:
  - name: "Deploy production"
    task:
      # make secrets accessible for this job
      secrets:
        - name: env-production
        - name: ssh-key
      env_vars:
        # Your PythonAnywhere username
        - name: SSH_USER
          value: <USERNAME>
        # host for ssh connection
        - name: SSH_HOST
          value: ssh.pythonanywhere.com
        # Your website URL without HTTP or HTTPS
        - name: APP_URL
          value: <PRODUCTION_APP_URL>
        # Environment file on the hosting provider
        - name: ENV_FILE
          value: ~/.env-production
      jobs:
        - name: Push code to production
          commands:
            - checkout
            # prepare deploy script, substitute environment variables
            - envsubst < deploy.sh > ~/deploy-production.sh
            # ensure correct permissions for ssh key
            - chmod 0600 ~/.ssh/id_rsa_pa
            # add remote host to known_hosts
            - ssh-keyscan -H $SSH_HOST >> ~/.ssh/known_hosts
            # ensure ssh will use the key
            - ssh-add ~/.ssh/id_rsa_pa
            # copy environment file to remote machine
            - scp ~/.env-production $SSH_USER@$SSH_HOST:~/.env-production
            # copy deploy script to remote machine
            - scp ~/deploy-production.sh $SSH_USER@$SSH_HOST:~/deploy-production.sh
            # execute deploy script remotely
            - ssh $SSH_USER@$SSH_HOST bash deploy-production.sh

The pipeline has only one block: “Deploy production”. It begins by invoking the secrets we just created: the environment file and the SSH key. We only need to call the secrets by name to summon the files into the job:

secrets:
  - name: env-production
  - name: ssh-key

Environment variables are created as a list of name and value pairs. Just replace the values with your account details:

env_vars:
  - name: SSH_USER
    value: <USERNAME>
  - name: SSH_HOST
    value: ssh.pythonanywhere.com
  - name: APP_URL
    value: <PRODUCTION_APP_URL>
  - name: ENV_FILE
    value: ~/.env-production

Finally, we’re going to write a helper script, deploy.sh, to run commands on the PythonAnywhere server. envsubst will help us by replacing all the environment variables with their values. Thus, the script itself is kept secret-free and we can check it into source control:

- checkout
- envsubst < deploy.sh > ~/deploy-production.sh
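If envsubst is new to you: it expands $VARIABLE references in its input using the current environment. Python’s string.Template does roughly the same thing, sketched here with illustrative values:

```python
# Rough Python equivalent of: envsubst < deploy.sh > deploy-production.sh
import os
from string import Template

os.environ["SSH_HOST"] = "ssh.pythonanywhere.com"  # illustrative value
line = Template("ssh user@$SSH_HOST bash deploy-production.sh")
print(line.safe_substitute(os.environ))
# ssh user@ssh.pythonanywhere.com bash deploy-production.sh
```

Because the substitution happens inside the CI job, the expanded script never needs to be committed to the repository.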

Here we correct the SSH key permissions and prevent a confirmation prompt from the ssh client:

- chmod 0600 ~/.ssh/id_rsa_pa
- ssh-keyscan -H $SSH_HOST >> ~/.ssh/known_hosts
- ssh-add ~/.ssh/id_rsa_pa

The next step is to copy files to the remote server with scp.

- scp ~/.env-production $SSH_USER@$SSH_HOST:~/.env-production
- scp ~/deploy-production.sh $SSH_USER@$SSH_HOST:~/deploy-production.sh

The last command executes the deploy script on the remote machine:

- ssh $SSH_USER@$SSH_HOST bash deploy-production.sh

I mentioned deploy.sh but haven’t shown it yet. Here it is:

# deploy.sh

# pull updated version of branch from repo
cd $APP_URL
git fetch --all
git reset --hard origin/$SEMAPHORE_GIT_BRANCH

# perform django migration task
source $ENV_FILE
source ~/.virtualenvs/$APP_URL/bin/activate
python manage.py migrate

# restart web application
touch /var/www/"$(echo $APP_URL | sed 's/\./_/g')"_wsgi.py

In short, the script does three things:

  • Updates the app code from the repository.
  • Executes manage.py migrate, in case the new code changes the data model.
  • Restarts the web application.
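The only non-obvious part is the last line of the script: PythonAnywhere reloads a site when its WSGI file is touched, and that file is named after the domain with dots replaced by underscores. The sed expression just performs that replacement (the domain below is illustrative):

```python
# What the sed expression in deploy.sh computes, in Python:
app_url = "staging.example.com"  # illustrative domain
wsgi_file = "/var/www/" + app_url.replace(".", "_") + "_wsgi.py"
print(wsgi_file)  # /var/www/staging_example_com_wsgi.py
```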

Now all that remains is to link the pipelines. This is achieved by adding a promotion at the end of .semaphore/semaphore.yml:

promotions:
  - name: Production deploy
    pipeline_file: deploy-production.yml

Push all the updated files to your repository:

$ git add .semaphore
$ git add deploy.sh
$ git commit -m "added manual deployment to production"
$ git push origin master
Screenshot of updated Python / Django ci/cd pipeline in Semaphore
Python continuous deployment pipeline ready to be triggered on Semaphore

Hit the Promote button to start the deployment.

Staging Website and Pipeline

I’ll leave the creation of the staging website to you. It won’t be hard, I promise; just repeat the steps we’ve already done:

  1. Create a pydjango_staging database.
  2. Create an .env-staging environment file that connects to the pydjango_staging database.
  3. Upload env-staging as a secret to Semaphore.
  4. On PythonAnywhere: source the staging environment and create a new website with pa_autoconfigure_django; you’ll need a different address than production.
  5. Modify the WSGI configuration file for the new site to load the staging environment file.
  6. If using a custom domain, add a CNAME for the new site on your DNS provider.
  7. Reload the application.

The only tricky thing is that you can’t use the same address as in production.

Once the staging site is up, create a new staging pipeline .semaphore/deploy-staging.yml:

# .semaphore/deploy-staging.yml
version: v1.0
name: Deploy Django to PythonAnywhere
agent:
  machine:
    type: e1-standard-2
    os_image: ubuntu1804
blocks:
  - name: "Deploy staging"
    task:
      # make secrets accessible for this job
      secrets:
        - name: env-staging
        - name: ssh-key
      env_vars:
        # Your PythonAnywhere username
        - name: SSH_USER
          value: <USERNAME>
        # host for ssh connection
        - name: SSH_HOST
          value: ssh.pythonanywhere.com
        # Your website URL without HTTP or HTTPS
        - name: APP_URL
          value: <STAGING_APP_URL>
        # Environment file on the hosting provider
        - name: ENV_FILE
          value: ~/.env-staging
      jobs:
        - name: Push code to staging
          commands:
            - checkout
            # prepare deploy script, substitute environment variables
            - envsubst < deploy.sh > ~/deploy-staging.sh
            # ensure correct permissions for ssh key
            - chmod 0600 ~/.ssh/id_rsa_pa
            # add remote host to known_hosts
            - ssh-keyscan -H $SSH_HOST >> ~/.ssh/known_hosts
            # ensure ssh will use the key
            - ssh-add ~/.ssh/id_rsa_pa
            # copy environment file to remote machine
            - scp ~/.env-staging $SSH_USER@$SSH_HOST:~/.env-staging
            # copy deploy script to remote machine
            - scp ~/deploy-staging.sh $SSH_USER@$SSH_HOST:~/deploy-staging.sh
            # execute deploy script remotely
            - ssh $SSH_USER@$SSH_HOST bash deploy-staging.sh

And modify the promotions at the end of .semaphore/semaphore.yml:

promotions:
  - name: Staging deploy
    pipeline_file: deploy-staging.yml
    auto_promote_on:
      - result: passed
  - name: Production deploy
    pipeline_file: deploy-production.yml

Finally, push the updated files to your repository.

$ git add .semaphore/semaphore.yml .semaphore/deploy-staging.yml
$ git commit -m "Add staging pipeline"
$ git push origin master

We use auto_promote_on to start the staging pipeline as soon as the tests pass:

Screenshot of a Python ci/cd pipeline to PythonAnywhere
The full Python continuous integration and deployment pipeline

Excellent! No errors, we can deploy to production safely.

Conclusion

We’ve discovered the incredible potential of a CI/CD platform. I hope that the tools and practices discussed here can add value to your projects, improve your team effectiveness and make your life easier.

For the next steps, I suggest learning more from Semaphore’s docs and, of course, setting up continuous integration and deployment for your own Python apps. Good luck!