
I inherited a working Docker Compose stack that used Bitnami/Kafka and a recompiled VerneMQ broker. Since Bitnami began deprecating parts of its catalogue, we decided to switch to Apache's official Kafka image. Now I can't figure out why the VerneMQ client fails to authenticate with user accounts that are confirmed to exist in the Kafka container.

The "before" state:

bitnami/kafka version: 3.6.1

vernemq version: 1.13.0 (recompiled to include the correct versions of brod and vmq_kafka required for this Kafka version)

Current state:

apache/kafka version: 4.1.0

vernemq version: 2.0.0 (recompiled to support the latest, and supposedly compatible, versions of brod and vmq_kafka)

The "before" (working) docker-compose.yaml:

  kafka-broker:
    image: bitnami/kafka:3.6.1
    ports:
      - 9092:9092
    environment:
      - KAFKA_CFG_NODE_ID=0
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka-broker:9093
      - KAFKA_CFG_LISTENERS=CLIENT://:9092,CONTROLLER://:9093,INTERNAL://:9094
      - KAFKA_CFG_ADVERTISED_LISTENERS=CLIENT://localhost:9092,INTERNAL://kafka-broker:9094
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CLIENT:SASL_PLAINTEXT,CONTROLLER:SASL_PLAINTEXT,INTERNAL:SASL_PLAINTEXT
      - KAFKA_CFG_SASL_ENABLED_MECHANISMS=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
      - KAFKA_CLIENT_USERS=admin,vernemq-user
      - KAFKA_CLIENT_PASSWORDS=password,vernemq-user-password
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_SASL_MECHANISM_CONTROLLER_PROTOCOL=SCRAM-SHA-512
      - KAFKA_CONTROLLER_USER=controller_user
      - KAFKA_CONTROLLER_PASSWORD=controller_password
      - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=INTERNAL
      - KAFKA_INTER_BROKER_USER=controller_user
      - KAFKA_INTER_BROKER_PASSWORD=controller_password
      - KAFKA_CFG_SASL_MECHANISM_INTER_BROKER_PROTOCOL=SCRAM-SHA-512
      - KAFKA_KRAFT_CLUSTER_ID=[valid_cluster_id]

  vernemq-broker:
    image: 123456789012.dkr.ecr.us-west-2.amazonaws.com/mycompany/cloud/vernemq:1.13.0-build_hash
    ports:
      - 1883:1883
      - 8888:8888
    environment:
      DOCKER_VERNEMQ_ALLOW_ANONYMOUS: off
      DOCKER_VERNEMQ_PLUGINS__VMQ_PASSWD: off
      DOCKER_VERNEMQ_PLUGINS__VMQ_ACL: off
      DOCKER_VERNEMQ_PLUGINS__VMQ_DIVERSITY: on
      DOCKER_VERNEMQ_VMQ_DIVERSITY__AUTH_REDIS__ENABLED: on
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__HOST: redis-cache
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__PORT: 6379
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__DATABASE: 1
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__USER: redis-user
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__PASSWORD: redis-password
      DOCKER_VERNEMQ_PLUGINS__VMQ_KAFKA: on
      DOCKER_VERNEMQ_VMQ_KAFKA__TOPIC_MAPPINGS_FILE: /etc/vernemq-broker/kafka-topics.json
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__BOOTSTRAP_SERVERS: kafka-broker:9094
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__AUTH__SASL_MECHANISM: scram_sha_512
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__AUTH__USER: vernemq-user
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__AUTH__PASSWORD: vernemq-user-password
    volumes:
      - ./compose/vernemq-broker/kafka-topics.json:/etc/vernemq-broker/kafka-topics.json:ro

The "current" (failing) docker-compose.yaml and supporting files:

  kafka-broker:
    image: apache/kafka:4.1.0
    hostname: kafka-broker
    restart: unless-stopped
    ports:
      - "9092:9092"   # CLIENT listener (host apps)
      - "9094:9094"   # INTERNAL listener (other containers)
    environment:
      - KAFKA_NODE_ID=0
      - KAFKA_PROCESS_ROLES=controller,broker
      - KAFKA_CONTROLLER_QUORUM_VOTERS=0@kafka-broker:9093
      - KAFKA_LISTENERS=CLIENT://:9092,CONTROLLER://:9093,INTERNAL://:9094,ADMIN://:9095
      - KAFKA_ADVERTISED_LISTENERS=CLIENT://kafka-broker:9092,INTERNAL://kafka-broker:9094,ADMIN://kafka-broker:9095
      - KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=CLIENT:SASL_PLAINTEXT,CONTROLLER:PLAINTEXT,INTERNAL:PLAINTEXT,ADMIN:PLAINTEXT
      - KAFKA_SASL_ENABLED_MECHANISMS=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
      #- KAFKA_CLIENT_USERS=[now defined in kafka-broker-setup sidecar script (see its 'KAFKA_CLIENT_*' env vars)]
      #- KAFKA_CLIENT_PASSWORDS=[now defined in kafka-broker-setup sidecar script]
      - KAFKA_CONTROLLER_LISTENER_NAMES=CONTROLLER
      #- KAFKA_CFG_SASL_MECHANISM_CONTROLLER_PROTOCOL=SCRAM-SHA-512
      #- KAFKA_CONTROLLER_USER=controller_user
      #- KAFKA_CONTROLLER_PASSWORD=controller_password
      - KAFKA_INTER_BROKER_LISTENER_NAME=INTERNAL
      #- KAFKA_INTER_BROKER_USER=controller_user
      #- KAFKA_INTER_BROKER_PASSWORD=controller_password
      #- KAFKA_CFG_SASL_MECHANISM_INTER_BROKER_PROTOCOL=SCRAM-SHA-512
      - KAFKA_LISTENER_NAME_CLIENT_SASL_ENABLED_MECHANISMS=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
      - KAFKA_LISTENER_NAME_INTERNAL_SASL_ENABLED_MECHANISMS=PLAIN,SCRAM-SHA-256,SCRAM-SHA-512
      - KAFKA_LOG_DIRS=/var/lib/kafka/data
      - KAFKA_NUM_PARTITIONS=3
      - KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1
      - KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR=1
      - KAFKA_TRANSACTION_STATE_LOG_MIN_ISR=1
      - KAFKA_AUTO_CREATE_TOPICS_ENABLE=false
      - KAFKA_OPTS=-Djava.security.auth.login.config=/opt/kafka/config/jaas.conf
      - KAFKA_LOG4J_LOGGERS="org.apache.kafka.common.security.scram=DEBUG"
      - KAFKA_LISTENER_NAME_CLIENT_SCRAM_SASL_JAAS_CONFIG=org.apache.kafka.common.security.scram.ScramLoginModule required;
      - CLUSTER_ID=[valid_cluster_id]
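
The apache/kafka image builds server.properties from the KAFKA_* environment variables, so one sanity check is confirming the listener/SASL settings the broker actually started with. A minimal sketch; the config path in the second command is an assumption, so substitute whatever the broker's command line reports:

# From the host: print the broker's java command line; the server.properties path
# the broker was started with should appear as its final argument
docker compose top kafka-broker

# Inspect the effective listener/SASL settings. The path below is an assumption;
# substitute the one reported above.
docker compose exec kafka-broker grep -iE 'listener|sasl' /opt/kafka/config/server.properties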

  kafka-broker-setup:
    image: apache/kafka:4.1.0
    entrypoint: ["/bin/bash","/opt/kafka/scripts/setup.sh"]
    depends_on:
      - kafka-broker
    environment:
      BROKER_BOOTSTRAP: "localhost:9095"
      CLIENT_BOOTSTRAP: "localhost:9092"
      KAFKA_CLIENT_USERS: admin,vernemq-user
      KAFKA_CLIENT_PASSWORDS: password,vernemq-user-password
    network_mode: "service:kafka-broker"
    volumes:
      - ./compose/kafka-broker/setup.sh:/opt/kafka/scripts/setup.sh:ro
      - ./compose/kafka-broker/client.properties:/opt/kafka/config/client.properties:ro
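
Because of network_mode: "service:kafka-broker", the sidecar shares the broker's network namespace, so localhost:9095 and localhost:9092 both terminate on the broker. A rough reachability check along both paths, run from inside the sidecar (sketch; the ADMIN listener is PLAINTEXT and needs no credentials, the CLIENT listener needs the mounted client.properties):

# ADMIN listener (9095) is PLAINTEXT; this should answer without any credentials
/opt/kafka/bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9095

# CLIENT listener (9092) is SASL_PLAINTEXT; the same call needs the mounted client.properties
/opt/kafka/bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092 \
  --command-config /opt/kafka/config/client.properties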

  vernemq-broker:
    image: 123456789012.dkr.ecr.us-west-2.amazonaws.com/mycompany/cloud/vernemq:2.1.1-build_hash
    ports:
      - 1883:1883
      - 8888:8888
    environment:
      DOCKER_VERNEMQ_ALLOW_ANONYMOUS: off
      DOCKER_VERNEMQ_PLUGINS__VMQ_PASSWD: off
      DOCKER_VERNEMQ_PLUGINS__VMQ_ACL: off
      DOCKER_VERNEMQ_PLUGINS__VMQ_DIVERSITY: on
      DOCKER_VERNEMQ_VMQ_DIVERSITY__AUTH_REDIS__ENABLED: on
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__HOST: redis-cache
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__PORT: 6379
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__DATABASE: 1
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__USER: redis-user
      DOCKER_VERNEMQ_VMQ_DIVERSITY__REDIS__PASSWORD: redis-password
      DOCKER_VERNEMQ_PLUGINS__VMQ_KAFKA: on
      DOCKER_VERNEMQ_VMQ_KAFKA__TOPIC_MAPPINGS_FILE: /etc/vernemq-broker/kafka-topics.json
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__BOOTSTRAP_SERVERS: kafka-broker:9092
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__AUTH__SASL_MECHANISM: scram_sha_512
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__AUTH__USER: vernemq-user
      DOCKER_VERNEMQ_VMQ_KAFKA__CLIENT__AUTH__PASSWORD: vernemq-user-password
      DOCKER_VERNEMQ_LOG__CONSOLE__LEVEL: debug
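
As I understand the docker-vernemq entrypoint, the DOCKER_VERNEMQ_* variables are rewritten into vernemq.conf keys (prefix stripped, names lowercased, double underscores turned into dots), so it's worth confirming what vmq_kafka actually receives. A sketch assuming the stock docker-vernemq config path; a recompiled image may put it elsewhere:

# Show the kafka-related settings the entrypoint generated for the plugin;
# /vernemq/etc/vernemq.conf is the docker-vernemq default path and may differ in a custom build
docker compose exec vernemq-broker grep -i 'vmq_kafka' /vernemq/etc/vernemq.conf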

setup.sh (excerpted):

BROKER_BOOTSTRAP="${BROKER_BOOTSTRAP:-localhost:9095}"

# create creds
IFS=',' read -r -a users <<< "${KAFKA_CLIENT_USERS}"
IFS=',' read -r -a passwords <<< "${KAFKA_CLIENT_PASSWORDS}"

for idx in "${!users[@]}"; do
  u="${users[$idx]}"
  p="${passwords[$idx]:-}"

  log "Creating: user '${u}' -- password '${p}' (SCRAM-SHA-512)"
  /opt/kafka/bin/kafka-configs.sh \
    --bootstrap-server "$BROKER_BOOTSTRAP" \
    --alter --add-config "SCRAM-SHA-512=[password=${p}]" \
    --entity-type users --entity-name "$u"
done

# verify creds exist
/opt/kafka/bin/kafka-configs.sh \
  --bootstrap-server kafka-broker:9095 \
  --describe --entity-type users \
  --entity-name admin
# output: "SCRAM credential configs for user-principal 'admin' are SCRAM-SHA-512=iterations=4096"

/opt/kafka/bin/kafka-configs.sh \
  --bootstrap-server kafka-broker:9095 \
  --describe --entity-type users \
  --entity-name vernemq-user
# output: "SCRAM credential configs for user-principal 'vernemq-user' are SCRAM-SHA-512=iterations=4096"

# ... create topics...
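
Note that the describe calls above go through the unauthenticated ADMIN listener on 9095, so they only prove the credentials exist; they don't exercise SCRAM on the SASL CLIENT listener. A sketch of the same check over 9092, authenticating as admin with the mounted client.properties shown below:

# Repeat the describe over the SASL_PLAINTEXT CLIENT listener, authenticating as admin;
# if this fails with the same "invalid credentials" error, the problem is SCRAM on the
# listener rather than the stored credentials
/opt/kafka/bin/kafka-configs.sh \
  --bootstrap-server localhost:9092 \
  --command-config /opt/kafka/config/client.properties \
  --describe --entity-type users --entity-name vernemq-user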

client.properties:

security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="password";
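
To separate a vernemq-user-specific problem from a listener-level one, a throwaway properties file for that user can also be fed to the console producer from inside the setup sidecar. This is only a sketch: the /tmp path is arbitrary and some-topic is a placeholder for a topic actually defined in kafka-topics.json:

# Hypothetical vernemq-user variant of client.properties
cat > /tmp/vernemq-user.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="vernemq-user" password="vernemq-user-password";
EOF

# "some-topic" is a placeholder; use a topic from kafka-topics.json
/opt/kafka/bin/kafka-console-producer.sh \
  --bootstrap-server localhost:9092 \
  --topic some-topic \
  --producer.config /tmp/vernemq-user.properties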

jaas.conf:

KafkaServer {
  org.apache.kafka.common.security.scram.ScramLoginModule required;
};

The kafka-broker logs show authentication failures soon after container startup:

[2025-10-06 19:27:07,224] INFO [SocketServer listenerType=BROKER, nodeId=0] Failed authentication with /172.18.0.18 (channelId=172.18.0.11:9092-172.18.0.18:41134-2-479) (Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512) (org.apache.kafka.common.network.Selector)

I've tried troubleshooting this with ChatGPT Pro and Claude Sonnet 4.5. Both fail to identify any misconfiguration and end up suggesting that the credentials being sent don't match the credentials stored, with no suggestion beyond recreating the user credentials and trying again.

Comments:
  • Your setup script's env vars point the bootstrap at itself, there's no need for a separate "client bootstrap", and port 9095 isn't set up anywhere. Commented Oct 6 at 22:29
  • Are you referring to this weirdness: BROKER_BOOTSTRAP="${BROKER_BOOTSTRAP:-localhost:9095}"? It's a remnant of various attempts at different things; I've since simplified it to BROKER_BOOTSTRAP=kafka-broker:9095. Can you elaborate on "no need to separate client bootstrap"? Port 9095 is set up as a "listener" and "advertised listener" in the compose file. Commented Oct 7 at 17:18
  • I see, but INTERNAL, CLIENT, and ADMIN are all the same. It's not necessary. Commented Oct 8 at 2:06
