
I am trying to connect from a Python script to a Postgres database. Here's my docker-compose:

networks:
  backend:
    name: value_tracker_backend
    external: true
services:
  db:
    build:
      context: ./sql
      dockerfile: db.dockerfile
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: EntitiesAndValues
    ports:
      - "5431:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d EntitiesAndValues"]
      interval: 10s
      retries: 5
      start_period: 30s
      timeout: 10s
    networks:
      - backend
  dataseeder:
    build:
      context: .
      dockerfile: dataseeder/dataseeder.dockerfile
    depends_on:
      db:
        condition: service_healthy
        restart: true
    networks:
      - backend   

And here's a snippet of the Python code that's attempting to connect via Docker's internal network:

import psycopg2
from enum import Enum

class DatabaseConnector:
    def __init__(self, db_name, db_user):
        self.db_name = db_name
        self.db_user = db_user
    
    def __enter__(self):
        self.conn = psycopg2.connect(user=self.db_user,
                                     database=self.db_name,
                                     password="data",
                                     host="db",
                                     port=5432)
        return self.conn

    def __exit__(self, exc_type, exc_value, tb):
        self.conn.close()
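(As an aside, the hard-coded host and port could be pulled from the environment, so the same connector works both inside the Compose network and from the host. A minimal sketch; the DB_HOST/DB_PORT/DB_PASSWORD variable names are hypothetical, not something the project defines:)

```python
import os

def connection_params(db_name, db_user):
    """Build psycopg2.connect kwargs, letting environment variables
    override the in-network defaults (host "db", port 5432)."""
    return {
        "database": db_name,
        "user": db_user,
        "password": os.environ.get("DB_PASSWORD", "data"),
        "host": os.environ.get("DB_HOST", "db"),
        "port": int(os.environ.get("DB_PORT", "5432")),
    }
```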

I keep getting the following error:

0.394 dsn = 'user=data_seeder password=data host=db port=5432 dbname=EntitiesAndValues'
0.394 connection_factory = None, cursor_factory = None
0.394 kwargs = {'database': 'EntitiesAndValues', 'host': 'db', 'password': 'data', 'port': 5432, ...}
0.394 kwasync = {}
0.394 
0.394     def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs):
0.394         """
0.394         Create a new database connection.
0.394     
0.394         The connection parameters can be specified as a string:
0.394     
0.394             conn = psycopg2.connect("dbname=test user=postgres password=secret")
0.394     
0.394         or using a set of keyword arguments:
0.394     
0.394             conn = psycopg2.connect(database="test", user="postgres", password="secret")
0.394     
0.394         Or as a mix of both. The basic connection parameters are:
0.394     
0.394         - *dbname*: the database name
0.394         - *database*: the database name (only as keyword argument)
0.394         - *user*: user name used to authenticate
0.394         - *password*: password used to authenticate
0.394         - *host*: database host address (defaults to UNIX socket if not provided)
0.394         - *port*: connection port number (defaults to 5432 if not provided)
0.394     
0.394         Using the *connection_factory* parameter a different class or connections
0.394         factory can be specified. It should be a callable object taking a dsn
0.394         argument.
0.394     
0.394         Using the *cursor_factory* parameter, a new default cursor factory will be
0.394         used by cursor().
0.394     
0.394         Using *async*=True an asynchronous connection will be created. *async_* is
0.394         a valid alias (for Python versions where ``async`` is a keyword).
0.394     
0.394         Any other keyword parameter will be passed to the underlying client
0.394         library: the list of supported parameters depends on the library version.
0.394     
0.394         """
0.394         kwasync = {}
0.394         if 'async' in kwargs:
0.394             kwasync['async'] = kwargs.pop('async')
0.394         if 'async_' in kwargs:
0.394             kwasync['async_'] = kwargs.pop('async_')
0.394     
0.394         dsn = _ext.make_dsn(dsn, **kwargs)
0.394 >       conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
0.394                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
0.394 E       psycopg2.OperationalError: could not translate host name "db" to address: Name or service not known
0.394 
0.394 usr/local/lib/python3.13/site-packages/psycopg2/__init__.py:122: OperationalError

I've tried a number of things to get this to work, leading up to the pre-created external network. This seems useful because I'm starting the database first with docker compose up -d db. Then, once I know it's running, I start the dataseeder image as well. It runs some tests (RUN pytest ...) before building and starting the 'release' version.

The RUN pytest step is started in dataseeder.dockerfile:

FROM python:latest
# Create integration testing environment
RUN apt-get update
COPY dataseeder/ dataseeder/
COPY packages/ packages/
COPY tests/ tests/
COPY localtest.txt .
RUN pip install -r localtest.txt
RUN pytest -v tests/test_EntitiesValues.py

Pinging from dataseeder to db works:

# ping db
PING db (172.18.0.2) 56(84) bytes of data.
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=1 ttl=64 time=0.166 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=2 ttl=64 time=0.093 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=3 ttl=64 time=0.095 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=4 ttl=64 time=0.095 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=5 ttl=64 time=0.079 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=6 ttl=64 time=0.094 ms

What am I missing here?

  • Where is that script running? on docker host? Commented Aug 28 at 21:36
  • The script is running from the dataseeder dockerfile, so on the docker host? Commented Aug 28 at 21:37
  • Error pretty straightforward: psycopg2.OperationalError: could not translate host name "db" to address: Name or service not known. You seem to be assuming the db in db: is a host name and the evidence says it is not. Commented Aug 28 at 21:37
  • Docker host is the box where both docker containers are running. Open a shell on dataseeder and run ping db, if that does not work then the networks setup is not correct. Commented Aug 28 at 21:39
  • You can't run integration tests before you've built the image that you're going to test. The server isn't running, none of its container dependencies are guaranteed to be running, and the image build isn't running on any particular Docker network. Remove the RUN pytest line (it doesn't affect the image contents in any case) and run integration tests after you've deployed the entire system. Your CI system might refrain from pushing the build image until after the end-to-end tests pass. Commented Aug 28 at 23:55

1 Answer


OK, I solved it. You need two things: the database container running before building the other one, AND the build network set to host. New docker-compose:

networks:
  backend:
    name: value_tracker_backend
    external: true
services:
  db:
    build:
      context: ./sql
      dockerfile: db.dockerfile
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: EntitiesAndValues
    ports:
      - "5431:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d EntitiesAndValues"]
      interval: 10s
      retries: 5
      start_period: 30s
      timeout: 10s
    networks:
      - backend
  dataseeder:
    build:
      context: .
      dockerfile: dataseeder.dockerfile
      network: host
    depends_on:
      db:
        condition: service_healthy
        restart: true
    networks:
      - backend

Final dockerfile:

FROM python:latest AS base
# Create integration testing environment
RUN apt-get update
COPY packages/ packages/
COPY tests/ tests/
COPY localtest.txt .
COPY DataSeeder.py .
RUN pip install -r localtest.txt
RUN pytest -v tests/

FROM base AS release
RUN apt-get update
COPY packages/ packages/
COPY release.txt .
COPY DataSeeder.py .
RUN pip install -r release.txt
CMD ["python","DataSeeder.py"]

Then use a different connection string depending on context. The tests connect like this:

@pytest.fixture(scope='session')
def setup():
    with DatabaseConnector('EntitiesAndValues', 'data_seeder', "localhost", 5431) as conn:
        _entitiesValues = EntitiesValuesFunctions(conn)
        with _entitiesValues.conn.cursor() as cur:
            cur.execute('TRUNCATE TABLE public.entities, public.entity_values;')
            conn.commit()
        yield _entitiesValues
        with _entitiesValues.conn.cursor() as cur:
            cur.execute('TRUNCATE TABLE public.entities, public.entity_values;')
            conn.commit()

This works because the build stage uses the Docker host's network (reaching the database via the published port localhost:5431), while the running application uses the Docker-internal network:

if __name__ == "__main__":
    with DatabaseConnector('EntitiesAndValues', 'data_seeder', 'db', 5432) as conn:
        _entitiesValues = EntitiesValuesFunctions(conn)
        _dataSeeder = DataSeeder(_entitiesValues)
        _dataSeeder.run()
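The fixture and the main block above pass host and port into DatabaseConnector, so the connector from the question presumably gained two extra parameters. A minimal sketch of what that could look like (psycopg2 is imported lazily here only so the class definition can be shown standalone):

```python
class DatabaseConnector:
    """Context manager yielding a psycopg2 connection. Host and port are
    parameters so tests can target localhost:5431 (published port) while
    the running app targets db:5432 (Compose network)."""

    def __init__(self, db_name, db_user, db_host, db_port):
        self.db_name = db_name
        self.db_user = db_user
        self.db_host = db_host
        self.db_port = db_port

    def __enter__(self):
        import psycopg2  # imported here so defining the class needs no dependency
        self.conn = psycopg2.connect(user=self.db_user,
                                     database=self.db_name,
                                     password="data",
                                     host=self.db_host,
                                     port=self.db_port)
        return self.conn

    def __exit__(self, exc_type, exc_value, tb):
        self.conn.close()
```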

The external network lets you start one container, then the other, resolving that dependency without each compose invocation creating its own instance of a 'backend' network.
