Using SQLAlchemy with Flask to Connect to PostgreSQL

Modern webapps are meant to be stateless. They don’t have something which needs to be remembered - a state. This is the reason why horizontal scaling, spinning up multiple instances of your app, is possible without breaking everything. The data is saved in another service, which is made to be really good at handling it and making it accessible - a database.

Let’s connect our Flask app from the previous article to the PostgreSQL database which we are able to launch in a Docker container, and start doing useful stuff instead of just emitting “Hello World”.

Our starting point is the Flask app here. Boring, right? That’s about to change.

Installing Modules

We will need a few additional Python modules in our project to talk to the PostgreSQL database. Namely the following:

- flask-sqlalchemy, the Flask extension which integrates SQLAlchemy with our app
- psycopg2, the Python driver for PostgreSQL

If you're on Ubuntu, you will need a few more system libraries to install those with pip. They are called libpq-dev and python-dev.

$ sudo apt-get install libpq-dev python-dev

Once those are in place, we can install the Python modules with pip in the following way:

pip install flask-sqlalchemy psycopg2

Using a virtualenv is a good idea, so your development machine is not littered with all kinds of development dependencies for different projects. Also, you will not break one project by upgrading another.

Usually, you save the dependencies of a project in a requirements.txt file, so it’s easy to get it to run on a new machine eventually. The currently installed packages can be written to a file without much effort:

$ pip freeze --local > requirements.txt

That one’s for future-you to save time and be less angry :)

If you look into the file, you'll see that not only the package names but the precise versions are saved. And not only of the packages you installed explicitly, but also of their dependencies. Neat! Completely reproducible from now on.
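For illustration, a freshly frozen requirements.txt might look something like the following. The version numbers are just illustrative; yours will be whatever pip installed on your machine:

```
Flask==1.0.2
Flask-SQLAlchemy==2.3.2
SQLAlchemy==1.2.7
psycopg2==2.7.4
```

To recreate the same environment on another machine, you'd run pip install -r requirements.txt inside a fresh virtualenv.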

Environment Variables

Secrets are everywhere. Or at least you need to tell your app what database to connect to, and what the credentials are. Those can be hardcoded, which is considered to be in poor taste. Imagine you want to develop your app locally, and use your trusty development database with user test and password test. What happens if you want to deploy the app to a production server? You’ll want to have a proper username and a password which is an actual password. You’d need to have two different versions of the code, and that’d be messy as hell.

Luckily, you can always access "environment variables" from within programs. You just look up a value by name, trust that it will be available, and use it. This way, your app does not really change nor care where it's running. The environment tells it what it needs to know.

In Ubuntu, you can set those variables using bash. For the development environment, I usually just add them to the bin/activate of the virtualenv, which is a bash script. Just append lines like the following one to it:

export VARIABLE_NAME=value

The export means that the variable is available to programs started afterwards, instead of just the current bash session.

Here are the values those variables need to have, in case you are using a Docker-based setup as described in the last post:

export POSTGRES_URL=""
export POSTGRES_USER="postgres"
export POSTGRES_PW="dbpw"
export POSTGRES_DB="test"

In the simplest case, we want our app to check if they’re there. If not, we could rely on default values or complain. I like to choose the crash and burn approach, as that makes it very easy to notice if the environment is not configured correctly and fix it.

The function which is used to get the value of an environment variable, and STOP EVERYTHING otherwise, looks like this:

import os

def get_env_variable(name):
    try:
        return os.environ[name]
    except KeyError:
        message = "Expected environment variable '{}' not set.".format(name)
        raise Exception(message)

# the values of those depend on your setup
POSTGRES_URL = get_env_variable("POSTGRES_URL")
POSTGRES_USER = get_env_variable("POSTGRES_USER")
POSTGRES_PW = get_env_variable("POSTGRES_PW")
POSTGRES_DB = get_env_variable("POSTGRES_DB")

We use it to get all the vars we will need to access PostgreSQL.
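To see the crash-and-burn behavior in isolation, here's a minimal sketch that exercises the function with a variable we set ourselves, and with one that doesn't exist (MISSING_VAR is a made-up name for the demonstration):

```python
import os

def get_env_variable(name):
    try:
        return os.environ[name]
    except KeyError:
        message = "Expected environment variable '{}' not set.".format(name)
        raise Exception(message)

os.environ['POSTGRES_USER'] = 'postgres'  # simulate an exported variable
print(get_env_variable('POSTGRES_USER'))  # postgres

try:
    get_env_variable('MISSING_VAR')
except Exception as error:
    print(error)  # Expected environment variable 'MISSING_VAR' not set.
```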

Adding SQLAlchemy Superpowers

The module we’ll use to deal with data in databases is called SQLAlchemy. With it, we can define classes with the data fields we need, and get lists of those objects with queries. SQLAlchemy takes care of the database busywork if you don’t feel like diving into it. It makes it possible to create the right tables, gets you the data you need and also takes care of many more technical details.

To use it with Flask, there’s an extension to make them play together nicely. We need to import it at the top of the file.

from flask_sqlalchemy import SQLAlchemy

Now we need to tell it what to do. For this, we will use the variables from the environment and their values. We’re setting Flask configs which the extension will in turn use to configure SQLAlchemy correctly. The SQLALCHEMY_DATABASE_URI expects a string in a very particular format, which you can see below.

We tell it what database and communication-library to use, the username, password, url and even database to connect to.

In the end we create a db variable, which points to the SQLAlchemy Extension object.

DB_URL = 'postgresql+psycopg2://{user}:{pw}@{url}/{db}'.format(
    user=POSTGRES_USER, pw=POSTGRES_PW, url=POSTGRES_URL, db=POSTGRES_DB)

app.config['SQLALCHEMY_DATABASE_URI'] = DB_URL
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False  # silence the deprecation warning

db = SQLAlchemy(app)
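To make the format concrete, here's what the connection string looks like when filled in with the trusty development credentials mentioned earlier (user test, password test, a local database named test — all placeholder values, including the localhost:5432 address):

```python
# Placeholder values, standing in for the environment variables.
DB_URL = 'postgresql+psycopg2://{user}:{pw}@{url}/{db}'.format(
    user='test', pw='test', url='localhost:5432', db='test')

print(DB_URL)  # postgresql+psycopg2://test:test@localhost:5432/test
```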

Defining the First Model

The app will need to make logging-in with Spotify possible. That means we need to save the data somewhere. For this, we’ll define a User model with all required fields.

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    spotify_id = db.Column(db.String(200), unique=False, nullable=True)
    spotify_token = db.Column(db.String(200), unique=False, nullable=True)

It contains columns with types and flags, which tell the database how to create the right corresponding columns. The model will be used to access a database table. SQLAlchemy takes care of the necessary boilerplate by itself.
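For intuition about what happens behind the scenes, the User model corresponds roughly to a table like the one sketched below. This uses Python's built-in sqlite3 module purely for illustration — PostgreSQL's column types differ slightly, and SQLAlchemy generates the real DDL for you:

```python
import sqlite3

# Roughly the table that would be generated for the User model above.
conn = sqlite3.connect(':memory:')
conn.execute('''
    CREATE TABLE user (
        id INTEGER PRIMARY KEY,
        spotify_id VARCHAR(200),
        spotify_token VARCHAR(200)
    )
''')

# Inserting and reading back a row, as the model will do for us later.
conn.execute("INSERT INTO user (spotify_id, spotify_token) VALUES (?, ?)",
             ('some_spotify_id', 'some_token'))
row = conn.execute("SELECT id, spotify_id FROM user").fetchone()
print(row)  # (1, 'some_spotify_id')
```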

Management Commands

We hooked the new app with a way to talk to a database, which is also an abstraction layer. Now we need a way to make it create tables (as they are not there without us telling it to). This will be useful for local development, but also to bring the app online in production later on.

All of our data is disposable right now. That’s why we don’t really care about it being deleted completely every once in a while. This means we don’t care about backing the database up for now, nor do we care about migrations. The users would not really notice a db-wipe, they’d just need to log in with Spotify again, which happens whenever the token expires anyway (and that happens kinda often).

Here’s how you define a management command in Flask. We just need one single command to completely reset the PostgreSQL database: delete (drop) everything if it’s there, and create tables according to our models.

@app.cli.command('resetdb')
def resetdb_command():
    """Destroys and creates the database + tables."""

    from sqlalchemy_utils import database_exists, create_database, drop_database
    if database_exists(DB_URL):
        print('Deleting database.')
        drop_database(DB_URL)
    if not database_exists(DB_URL):
        print('Creating database.')
        create_database(DB_URL)

    print('Creating tables.')
    db.create_all()

When in the virtualenv, we can use our new db superpowers for the first time:

FLASK_APP=baa/main.py flask resetdb

Shiny! Make sure that the database is running and available :)


Here’s a commit state which contains all the database goodness we discussed in this post, and a bit more.

The database is set up and usable; we can check this by executing the management command. The app structure is kept simple - no blueprints or similar for now. We might add those if we need them eventually, given that it makes sense for the app or our own sanity.

But the app is STILL no more interesting than before from a user perspective - we can’t access the Spotify Web API without user tokens. It doesn’t seem to do much, if you don’t know about the database connection happening behind the scenes. We’ll need to make it possible for users to log in. That’s up in the next article. Stay tuned!