I am trying to connect from a Python script to a Postgres database. Here's my docker-compose:
networks:
  backend:
    name: value_tracker_backend
    external: true

services:
  db:
    build:
      context: ./sql
      dockerfile: db.dockerfile
    environment:
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: EntitiesAndValues
    ports:
      - "5431:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres -d EntitiesAndValues"]
      interval: 10s
      retries: 5
      start_period: 30s
      timeout: 10s
    networks:
      - backend

  dataseeder:
    build:
      context: .
      dockerfile: dataseeder/dataseeder.dockerfile
    depends_on:
      db:
        condition: service_healthy
        restart: true
    networks:
      - backend
And here's a snippet of the Python code that's attempting to connect via Docker's internal network:
import psycopg2
from enum import Enum


class DatabaseConnector:
    def __init__(self, db_name, db_user):
        self.db_name = db_name
        self.db_user = db_user

    def __enter__(self):
        self.conn = psycopg2.connect(user=self.db_user,
                                     database=self.db_name,
                                     password="data",
                                     host="db",
                                     port=5432)
        return self.conn

    def __exit__(self, exc_type, exc_value, tb):
        self.conn.close()
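As an aside, and independent of the error below: since the seeder depends on the database accepting connections, the connector could wait for the port to open before connecting. This is only a sketch using the standard library; "wait_for_port" and its parameters are names I made up, not part of the original code:

```python
import socket
import time


def wait_for_port(host, port, timeout=30.0, interval=0.5):
    """Poll until a TCP connection to host:port succeeds, or the timeout expires.

    Returns True as soon as a connection is accepted, False on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection completes the TCP handshake, so success
            # means something is actually listening on that port.
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False
```

A caller could then do `wait_for_port("db", 5432)` before `psycopg2.connect(...)`, though the compose healthcheck plus `condition: service_healthy` already covers the same ground at the orchestration level.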
I keep getting the following error:
0.394 dsn = 'user=data_seeder password=data host=db port=5432 dbname=EntitiesAndValues'
0.394 connection_factory = None, cursor_factory = None
0.394 kwargs = {'database': 'EntitiesAndValues', 'host': 'db', 'password': 'data', 'port': 5432, ...}
0.394 kwasync = {}
0.394
0.394 def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs):
0.394 """
0.394 Create a new database connection.
0.394
0.394 The connection parameters can be specified as a string:
0.394
0.394 conn = psycopg2.connect("dbname=test user=postgres password=secret")
0.394
0.394 or using a set of keyword arguments:
0.394
0.394 conn = psycopg2.connect(database="test", user="postgres", password="secret")
0.394
0.394 Or as a mix of both. The basic connection parameters are:
0.394
0.394 - *dbname*: the database name
0.394 - *database*: the database name (only as keyword argument)
0.394 - *user*: user name used to authenticate
0.394 - *password*: password used to authenticate
0.394 - *host*: database host address (defaults to UNIX socket if not provided)
0.394 - *port*: connection port number (defaults to 5432 if not provided)
0.394
0.394 Using the *connection_factory* parameter a different class or connections
0.394 factory can be specified. It should be a callable object taking a dsn
0.394 argument.
0.394
0.394 Using the *cursor_factory* parameter, a new default cursor factory will be
0.394 used by cursor().
0.394
0.394 Using *async*=True an asynchronous connection will be created. *async_* is
0.394 a valid alias (for Python versions where ``async`` is a keyword).
0.394
0.394 Any other keyword parameter will be passed to the underlying client
0.394 library: the list of supported parameters depends on the library version.
0.394
0.394 """
0.394 kwasync = {}
0.394 if 'async' in kwargs:
0.394 kwasync['async'] = kwargs.pop('async')
0.394 if 'async_' in kwargs:
0.394 kwasync['async_'] = kwargs.pop('async_')
0.394
0.394 dsn = _ext.make_dsn(dsn, **kwargs)
0.394 > conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
0.394 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
0.394 E psycopg2.OperationalError: could not translate host name "db" to address: Name or service not known
0.394
0.394 usr/local/lib/python3.13/site-packages/psycopg2/__init__.py:122: OperationalError
I've tried a number of things to get this to work, leading up to the pre-created external network. This seemed useful because I'm starting the database first with docker compose up -d db. Then, once I know it's running, I start the dataseeder image as well. It runs some tests (RUN pytest ...) before building and starting the 'release' version.
The pytest run is started in dataseeder.dockerfile:
FROM python:latest
# Create integration testing environment
RUN apt-get update
COPY dataseeder/ dataseeder/
COPY packages/ packages/
COPY tests/ tests/
COPY localtest.txt .
RUN pip install -r localtest.txt
RUN pytest -v tests/test_EntitiesValues.py
Pinging from dataseeder to db works:
# ping db
PING db (172.18.0.2) 56(84) bytes of data.
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=1 ttl=64 time=0.166 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=2 ttl=64 time=0.093 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=3 ttl=64 time=0.095 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=4 ttl=64 time=0.095 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=5 ttl=64 time=0.079 ms
64 bytes from valuetrackersampleproject-db-1.value_tracker_backend (172.18.0.2): icmp_seq=6 ttl=64 time=0.094 ms
What am I missing here?
The error psycopg2.OperationalError: could not translate host name "db" to address: Name or service not known tells you that, at the moment the code runs, "db" is not a resolvable host name. You seem to be assuming the "db" in "db:" is always a host name, and the evidence says that here it is not: the RUN pytest step executes during docker build (note the "0.394" step timestamps in your traceback), and a build container is not attached to your compose network, so the service name "db" does not resolve there. Your ping db succeeded because you ran it inside the running dataseeder container, where the network is attached; if ping db had not worked there, the network setup would be incorrect, but it is fine.

Remove the RUN pytest line (it doesn't affect the image contents in any case) and run the integration tests after you've deployed the entire system. Your CI system might refrain from pushing the built image until after the end-to-end tests pass.
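Concretely, the Dockerfile would only build and install, with the test step dropped (a sketch based on the file shown in the question):

```dockerfile
FROM python:latest
# Create integration testing environment
RUN apt-get update
COPY dataseeder/ dataseeder/
COPY packages/ packages/
COPY tests/ tests/
COPY localtest.txt .
RUN pip install -r localtest.txt
# No RUN pytest here: at build time this container is not attached to
# the compose network, so the host name "db" cannot be resolved.
```

The tests could then run against the live database in a container that is on the network, for example with something like docker compose run --rm dataseeder pytest -v tests/test_EntitiesValues.py (assuming pytest is installed via localtest.txt, as in the question).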