How to Downgrade PostgreSQL from 11 to 9.6 to fix "pg_dump: aborting because of server version mismatch" error

Gitlab Omnibus ships with Postgres 9.6.  Unfortunately for me, several online references suggest using an external Postgres database container with the Docker-based release of Gitlab, but fail to mention an extremely important requirement: Gitlab's bundled tooling only supports an external Postgres of the same version it ships internally.  This fact is buried in the install requirements document, and even there it says "9.6 or newer" when, for Docker installs, it should say "9.6 exactly."

Either way, if you find yourself running postgres:11.1-alpine, postgres:10, or similar, this is how to downgrade your data dump to be compatible with 9.6.

TL;DR:

Run a Python script against the v11 dump to remove the AS integer clause from CREATE SEQUENCE statements, then import the database into 9.6 normally.
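If you'd rather skip the script entirely, the same transformation can be sketched as a sed one-liner. This assumes GNU sed, and that pg_dump 10/11 wrote each AS integer clause on its own line directly below the CREATE SEQUENCE line (as it does by default); dump_11.sql stands in for whatever your dump file is actually called:

```shell
# On a CREATE SEQUENCE line, "n" prints it and loads the following line;
# that next line is deleted only if it is a bare "AS integer" clause.
sed '/^CREATE SEQUENCE /{n;/^[[:space:]]*AS integer$/d;}' dump_11.sql > dump_9_compatible.sql
```

The full Python script below does the same thing and is easier to adapt if your dump is formatted differently.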

Lay of the Land

Given a docker-compose.yml similar to this (note some values are redacted):

version: '3.5'

services:

  gitlab:
    image: 'gitlab/gitlab-ce:latest'
    container_name: "gitlab"
    restart: always
    hostname: 'git.sebastientaggart.com'
    environment:
      GITLAB_OMNIBUS_CONFIG: |
        external_url 'https://git.sebastientaggart.com'

        # set the SSH port
        gitlab_rails['gitlab_shell_ssh_port'] = 10022

        # Disable built-in NGINX and encryption, Caddy handles this
        gitlab_workhorse['listen_network'] = "tcp"
        gitlab_workhorse['listen_addr'] = "0.0.0.0:3000"
        nginx['enable'] = false
        letsencrypt['enable'] = false
        letsencrypt['auto_renew'] = false

        postgresql['enable'] = false
        gitlab_rails['db_username'] = "gitlab"
        gitlab_rails['db_password'] = "{PASSWORD}"
        gitlab_rails['db_host'] = "postgresql-gitlab"
        gitlab_rails['db_port'] = "5432"
        gitlab_rails['db_database'] = "gitlab"
        gitlab_rails['db_adapter'] = 'postgresql'
        gitlab_rails['db_encoding'] = 'utf8'

        gitlab_rails['time_zone'] = 'America/New_York'

        gitlab_rails['rack_attack_git_basic_auth'] = {
          'enabled' => true,
          'ip_whitelist' => ["127.0.0.1"],
          'maxretry' => 10, # Limit the number of Git HTTP authentication attempts per IP
          'findtime' => 60, # Reset the auth attempt counter per IP after 60 seconds
          'bantime' => 3600 # Ban an IP for one hour (3600s) after too many auth attempts
        }

        # Add any other gitlab.rb configuration here, each on its own line
    ports:
      - '3000:3000' # HTTP (workhorse, main UI)
      - '10022:22' # SSH port
    volumes:
      - './data/gitlab/config:/etc/gitlab'
      - './data/gitlab/logs:/var/log/gitlab'
      - './data/gitlab/data:/var/opt/gitlab'

  postgresql-gitlab:
    image: 'postgres:11.1-alpine'
    container_name: "gitlab_postgresql"
    restart: always
    environment:
      - POSTGRES_USER=gitlab
      - POSTGRES_PASSWORD={PASSWORD}
      - POSTGRES_DB=gitlab
    volumes:
      - ./data/postgresql-gitlab:/var/lib/postgresql/data

An attempt at backing up using the usual rake task, e.g. docker exec -t gitlab gitlab-rake gitlab:backup:create, will fail with:

Dumping PostgreSQL database gitlab ... pg_dump: server version: 11.1; pg_dump version: 9.6.11
pg_dump: aborting because of server version mismatch

Getting the DB Dump Out

In order to get the export to work at all, it's necessary to shell into the gitlab container (docker exec -it gitlab bash) and upgrade its postgresql client to 11 so that pg_dump matches the server version.  The gitlab/gitlab-ce image is Ubuntu-based, hence the xenial PGDG repository:

sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt/ xenial-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
apt install wget ca-certificates
wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add -
apt update
apt install postgresql-client-11
ln -s /usr/bin/psql /usr/bin/pg_dump /opt/gitlab/bin/

Now the export will work via:

docker exec -t gitlab_postgresql pg_dumpall -c -U gitlab > dump_11_`date +%d-%m-%Y"_"%H_%M_%S`.sql

Note this is a manually executed command against the postgresql container, NOT the usual rake command, since we are only attempting to downgrade Postgres back to 9.6.  As we'll see below, actually converting back to using the bundled Postgres doesn't work even after taking these steps.
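Before converting anything, it can also be worth confirming the dump actually contains the clause that 9.6 will reject.  A quick check, assuming your dump was saved as dump_11.sql (substitute your actual timestamped filename):

```shell
# pg_dump 10+ writes an "AS integer" clause on its own line inside
# CREATE SEQUENCE statements; Postgres 9.6 rejects it on import.
grep -c '^[[:space:]]*AS integer$' dump_11.sql
```

A non-zero count means the dump needs the downgrade treatment described below; a count of zero suggests it may import into 9.6 as-is.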

Cleaning Up Between Import/Export Operations

In my case, I'm mounting the Postgres data directory as a volume to preserve the database between container resets.  For this operation, that data needs to be erased before pointing docker-compose.yml at an older version of Postgres.

sudo rm -rf /opt/docker/gitlab/data/postgresql-gitlab/*

Downgrading from 10 to 9

Exporting from 11 and importing into 10 works fine; going from 10 to 9.6, however, fails with SQL errors, because pg_dump 10 and newer emit an AS integer clause in CREATE SEQUENCE statements that 9.6 doesn't understand.  The fix is a Python script from Stack Overflow:

#!/usr/bin/env python3
import sys

#
#  Downgrades pg_dump 10 script to 9.x
#  removing 'AS integer' from 'CREATE SEQUENCE' statement
#
#  Usage:
#       $ python3 pgdump_10_to_9.py < test10.sql > test9.sql
#  or:
#       $ cat test10.sql | ./pgdump_10_to_9.py > test9.sql
#
#  To obtain a compressed 9.x sql script from a compressed 10 sql script:
#
#       $ gunzip -c test10.sql.gz | ./pgdump_10_to_9.py | gzip > test9.sql.gz
#

inside_create_sequence = False
for row in sys.stdin.readlines():

    if inside_create_sequence and row.strip().lower() == 'as integer':
        pass
    else:
        print(row, end='', flush=True)

    inside_create_sequence = row.strip().startswith('CREATE SEQUENCE ')

Save this script as pgdump_10_to_9.py, make it executable (chmod +x pgdump_10_to_9.py), and run it:

# Assuming we have an existing dump named dump_10_12-04-2019_15_09_16.sql

cat dump_10_12-04-2019_15_09_16.sql | ./pgdump_10_to_9.py > dump_9_compatible.sql

# This would export a 9-compatible file to dump_9_compatible.sql
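To double-check that the script removed only the sequence clauses and nothing else, diff the two files; every line of output should be a deleted AS integer line:

```shell
# "<" lines exist only in the original dump; anything other than
# "AS integer" deletions would mean the script altered other statements.
diff dump_10_12-04-2019_15_09_16.sql dump_9_compatible.sql | grep '^[<>]'
```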

Now update your docker-compose.yml to use a Postgres 9.6 image (e.g. image: 'postgres:9.6-alpine'), recreate the container, and import:

cat dump_9_compatible.sql | docker exec -i gitlab_postgresql psql -U gitlab

And voilà!  You've downgraded back to an older version of Postgres, one that's compatible with Gitlab's bundled tooling.
