
I am following this tutorial to configure security on my Kafka broker, and I got stuck after implementing SASL_SSL authentication. Here is what I have done:

  1. Downloaded this GitHub config project.
  2. Moved the keystore and truststore folders into my Apache Kafka config folder.
  3. Added a kafka_server_jaas.conf file in the config folder with these settings:

    KafkaServer {
        org.apache.kafka.common.security.scram.ScramLoginModule required
        username="admin"
        password="admin-secret";
    };

  4. Updated server.properties with this:

    ##### SECURITY using SCRAM-SHA-512 and SSL

    listeners=PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093,SASL_SSL://localhost:9094
    advertised.listeners=PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093,SASL_SSL://localhost:9094
    security.inter.broker.protocol=SASL_SSL
    ssl.endpoint.identification.algorithm=
    ssl.client.auth=required
    sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512
    sasl.enabled.mechanisms=SCRAM-SHA-512

    # Broker security settings
    ssl.truststore.location=truststore/kafka.truststore.jks
    ssl.truststore.password=password
    ssl.keystore.location=keystore/kafka.keystore.jks
    ssl.keystore.password=password
    ssl.key.password=password

    # ACLs
    authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
    super.users=User:admin

    # ZooKeeper SASL
    zookeeper.set.acl=false

    ##### SECURITY using SCRAM-SHA-512 and SSL
  5. Added an ssl-user-config.properties file in config:

    security.protocol=SASL_SSL
    sasl.mechanism=SCRAM-SHA-512
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="demouser" password="secret";
    ssl.truststore.location=truststore/kafka.truststore.jks
    ssl.truststore.password=password

  6. Ran ZooKeeper and then created the superuser with this command:

    ./bin/kafka-configs.sh --zookeeper localhost:2181 --alter --add-config 'SCRAM-SHA-512=[password='admin-secret']' --entity-type users --entity-name admin

    Completed Updating config for entity: user-principal 'admin'.

  7. Now I am trying to run the Kafka server with this .sh file, as described here:

    export KAFKA_OPTS=-Djava.security.auth.login.config=kafka_2.13-2.4.1/config/kafka_server_jaas.conf
    bin/windows/kafka-server-start.bat config/server.properties

and I get this output:

$ sh KafkSSLserver.sh
[2020-03-30 14:28:27,863] INFO Registered kafka:type=kafka.Log4jController MBean (kafka.utils.Log4jControllerRegistration$)
[2020-03-30 14:28:29,920] INFO starting (kafka.server.KafkaServer)
[2020-03-30 14:28:29,925] INFO Connecting to zookeeper on localhost:2181 (kafka.server.KafkaServer)
[2020-03-30 14:28:29,975] ERROR Fatal error during KafkaServer startup. Prepare to shutdown (kafka.server.KafkaServer)
org.apache.kafka.common.KafkaException: Exception while loading Zookeeper JAAS login context [java.security.auth.login.config=kafka_2.13-2.4.1/config/kafka_server_jaas.conf, zookeeper.sasl.client=default:true, zookeeper.sasl.clientconfig=default:Client]
        at org.apache.kafka.common.security.JaasUtils.isZkSecurityEnabled(JaasUtils.java:64)
        at kafka.server.KafkaServer.initZkClient(KafkaServer.scala:384)
        at kafka.server.KafkaServer.startup(KafkaServer.scala:207)
        at kafka.server.KafkaServerStartable.startup(KafkaServerStartable.scala:44)
        at kafka.Kafka$.main(Kafka.scala:84)
        at kafka.Kafka.main(Kafka.scala)
Caused by: java.lang.SecurityException: java.io.IOException: kafka_2.13-2.4.1/config/kafka_server_jaas.conf (No such file or directory)
        at sun.security.provider.ConfigFile$Spi.<init>(Unknown Source)
        at sun.security.provider.ConfigFile.<init>(Unknown Source)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at java.lang.Class.newInstance(Unknown Source)
        at javax.security.auth.login.Configuration$2.run(Unknown Source)
        at javax.security.auth.login.Configuration$2.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.login.Configuration.getConfiguration(Unknown Source)
        at org.apache.kafka.common.security.JaasUtils.isZkSecurityEnabled(JaasUtils.java:60)
        ... 5 more
Caused by: java.io.IOException: kafka_2.13-2.4.1/config/kafka_server_jaas.conf (No such file or directory)
        at sun.security.provider.ConfigFile$Spi.ioException(Unknown Source)
        at sun.security.provider.ConfigFile$Spi.init(Unknown Source)
        ... 17 more
[2020-03-30 14:28:29,989] INFO shutting down (kafka.server.KafkaServer)
[2020-03-30 14:28:30,024] INFO shut down completed (kafka.server.KafkaServer)
[2020-03-30 14:28:30,028] ERROR Exiting Kafka. (kafka.server.KafkaServerStartable)
[2020-03-30 14:28:30,038] INFO shutting down (kafka.server.KafkaServer)
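The `No such file or directory` failure above comes from the relative JAAS path: `kafka_2.13-2.4.1/config/kafka_server_jaas.conf` only resolves if the script happens to be launched from the directory *above* the Kafka install. A minimal sketch of a start script that builds an absolute path instead (the `KAFKA_HOME` value is an assumption; adjust it to your actual layout):

```shell
#!/usr/bin/env bash
# Hypothetical install location -- adjust to where Kafka actually lives.
KAFKA_HOME="/c/ApacheKafka/kafka_2.13-2.4.1"

# Point the JVM at the JAAS file with an absolute path, so the script
# works no matter which directory it is launched from.
export KAFKA_OPTS="-Djava.security.auth.login.config=${KAFKA_HOME}/config/kafka_server_jaas.conf"

# From Git Bash, call the .sh launcher (bin/windows/*.bat are for cmd.exe).
"${KAFKA_HOME}/bin/kafka-server-start.sh" "${KAFKA_HOME}/config/server.properties"
```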

Meanwhile, I tried to run it through a .bat file:

export KAFKA_OPTS="-Djava.security.auth.login.config=config/kafka_server_jaas.conf"
start bin\windows\kafka-server-start.bat config\server.properties

I got this

        advertised.listeners = PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093,SASL_SSL://localhost:9094
        advertised.port = null
        alter.config.policy.class.name = null
        alter.log.dirs.replication.quota.window.num = 11
        alter.log.dirs.replication.quota.window.size.seconds = 1
        authorizer.class.name = kafka.security.auth.SimpleAclAuthorizer
        auto.create.topics.enable = true
        auto.leader.rebalance.enable = true
        background.threads = 10
        broker.id = 0
        broker.id.generation.enable = true
        broker.rack = null
        client.quota.callback.class = null
        compression.type = producer
        connection.failed.authentication.delay.ms = 100
        connections.max.idle.ms = 600000
        connections.max.reauth.ms = 0
        control.plane.listener.name = null
        controlled.shutdown.enable = true
        controlled.shutdown.max.retries = 3
        controlled.shutdown.retry.backoff.ms = 5000
        controller.socket.timeout.ms = 30000
        create.topic.policy.class.name = null
        default.replication.factor = 1
        delegation.token.expiry.check.interval.ms = 3600000
        delegation.token.expiry.time.ms = 86400000
        delegation.token.master.key = null
        delegation.token.max.lifetime.ms = 604800000
        delete.records.purgatory.purge.interval.requests = 1
        delete.topic.enable = true
        fetch.purgatory.purge.interval.requests = 1000
        group.initial.rebalance.delay.ms = 0
        group.max.session.timeout.ms = 1800000
        group.max.size = 2147483647
        group.min.session.timeout.ms = 6000
        host.name =
        inter.broker.listener.name = null
        inter.broker.protocol.version = 2.4-IV1
        kafka.metrics.polling.interval.secs = 10
        kafka.metrics.reporters = []
        leader.imbalance.check.interval.seconds = 300
        leader.imbalance.per.broker.percentage = 10
        listener.security.protocol.map = PLAINTEXT:PLAINTEXT,SSL:SSL,SASL_PLAINTEXT:SASL_PLAINTEXT,SASL_SSL:SASL_SSL
        listeners = PLAINTEXT://localhost:9092,SASL_PLAINTEXT://localhost:9093,SASL_SSL://localhost:9094
        log.cleaner.backoff.ms = 15000
        log.cleaner.dedupe.buffer.size = 134217728
        log.cleaner.delete.retention.ms = 86400000
        log.cleaner.enable = true
        log.cleaner.io.buffer.load.factor = 0.9
        log.cleaner.io.buffer.size = 524288
        log.cleaner.io.max.bytes.per.second = 1.7976931348623157E308
        log.cleaner.max.compaction.lag.ms = 9223372036854775807
        log.cleaner.min.cleanable.ratio = 0.5
        log.cleaner.min.compaction.lag.ms = 0
        log.cleaner.threads = 1
        log.cleanup.policy = [delete]
        log.dir = /tmp/kafka-logs
        log.dirs = /tmp/kafka-logs
        log.flush.interval.messages = 9223372036854775807
        log.flush.interval.ms = null
        log.flush.offset.checkpoint.interval.ms = 60000
        log.flush.scheduler.interval.ms = 9223372036854775807
        log.flush.start.offset.checkpoint.interval.ms = 60000
        log.index.interval.bytes = 4096
        log.index.size.max.bytes = 10485760
        log.message.downconversion.enable = true
        log.message.format.version = 2.4-IV1
        log.message.timestamp.difference.max.ms = 9223372036854775807
        log.message.timestamp.type = CreateTime
        log.preallocate = false
        log.retention.bytes = -1
        log.retention.check.interval.ms = 300000
        log.retention.hours = 168
        log.retention.minutes = null
        log.retention.ms = null
        log.roll.hours = 168
        log.roll.jitter.hours = 0
        log.roll.jitter.ms = null
        log.roll.ms = null
        log.segment.bytes = 1073741824
        log.segment.delete.delay.ms = 60000
        max.connections = 2147483647
        max.connections.per.ip = 2147483647
        max.connections.per.ip.overrides =
        max.incremental.fetch.session.cache.slots = 1000
        message.max.bytes = 1000012
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        min.insync.replicas = 1
        num.io.threads = 8
        num.network.threads = 3
        num.partitions = 1
        num.recovery.threads.per.data.dir = 1
        num.replica.alter.log.dirs.threads = null
        num.replica.fetchers = 1
        offset.metadata.max.bytes = 4096
        offsets.commit.required.acks = -1
        offsets.commit.timeout.ms = 5000
        offsets.load.buffer.size = 5242880
        offsets.retention.check.interval.ms = 600000
        offsets.retention.minutes = 10080
        offsets.topic.compression.codec = 0
        offsets.topic.num.partitions = 50
        offsets.topic.replication.factor = 1
        offsets.topic.segment.bytes = 104857600
        password.encoder.cipher.algorithm = AES/CBC/PKCS5Padding
        password.encoder.iterations = 4096
        password.encoder.key.length = 128
        password.encoder.keyfactory.algorithm = null
        password.encoder.old.secret = null
        password.encoder.secret = null
        port = 9092
        principal.builder.class = null
        producer.purgatory.purge.interval.requests = 1000
        queued.max.request.bytes = -1
        queued.max.requests = 500
        quota.consumer.default = 9223372036854775807
        quota.producer.default = 9223372036854775807
        quota.window.num = 11
        quota.window.size.seconds = 1
        replica.fetch.backoff.ms = 1000
        replica.fetch.max.bytes = 1048576
        replica.fetch.min.bytes = 1
        replica.fetch.response.max.bytes = 10485760
        replica.fetch.wait.max.ms = 500
        replica.high.watermark.checkpoint.interval.ms = 5000
        replica.lag.time.max.ms = 10000
        replica.selector.class = null
        replica.socket.receive.buffer.bytes = 65536
        replica.socket.timeout.ms = 30000
        replication.quota.window.num = 11
        replication.quota.window.size.seconds = 1
        request.timeout.ms = 30000
        reserved.broker.max.id = 1000
        sasl.client.callback.handler.class = null
        sasl.enabled.mechanisms = [SCRAM-SHA-512]
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.principal.to.local.rules = [DEFAULT]
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.mechanism.inter.broker.protocol = SCRAM-SHA-512
        sasl.server.callback.handler.class = null
        security.inter.broker.protocol = SASL_SSL
        security.providers = null
        socket.receive.buffer.bytes = 102400
        socket.request.max.bytes = 104857600
        socket.send.buffer.bytes = 102400
        ssl.cipher.suites = []
        ssl.client.auth = required
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm =
        ssl.key.password = [hidden]
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location =
        ssl.keystore.password = [hidden]
        ssl.keystore.type = JKS
        ssl.principal.mapping.rules = DEFAULT
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location =
        ssl.truststore.password = [hidden]
        ssl.truststore.type = JKS
        transaction.abort.timed.out.transaction.cleanup.interval.ms = 60000
        transaction.max.timeout.ms = 900000
        transaction.remove.expired.transaction.cleanup.interval.ms = 3600000
        transaction.state.log.load.buffer.size = 5242880
        transaction.state.log.min.isr = 1
        transaction.state.log.num.partitions = 50
        transaction.state.log.replication.factor = 1
        transaction.state.log.segment.bytes = 104857600
        transactional.id.expiration.ms = 604800000
        unclean.leader.election.enable = false
        zookeeper.connect = localhost:2181
        zookeeper.connection.timeout.ms = 6000
        zookeeper.max.in.flight.requests = 10
        zookeeper.session.timeout.ms = 6000
        zookeeper.set.acl = false
        zookeeper.sync.time.ms = 2000
 (kafka.server.KafkaConfig)
[2020-03-30 14:30:39,969] INFO [ThrottledChannelReaper-Fetch]: Starting (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:39,969] INFO [ThrottledChannelReaper-Produce]: Starting (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:39,975] INFO [ThrottledChannelReaper-Request]: Starting (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:40,129] INFO Loading logs. (kafka.log.LogManager)
[2020-03-30 14:30:40,174] INFO Logs loading complete in 44 ms. (kafka.log.LogManager)
[2020-03-30 14:30:40,243] INFO Starting log cleanup with a period of 300000 ms. (kafka.log.LogManager)
[2020-03-30 14:30:40,260] INFO Starting log flusher with a default period of 9223372036854775807 ms. (kafka.log.LogManager)
log4j:ERROR Failed to rename [C:\ApacheKafka\kafka_2.13-2.4.1/logs/log-cleaner.log] to [C:\ApacheKafka\kafka_2.13-2.4.1/logs/log-cleaner.log.2020-03-30-13].
[2020-03-30 14:30:42,398] INFO Awaiting socket connections on localhost:9092. (kafka.network.Acceptor)
[2020-03-30 14:30:42,669] INFO [SocketServer brokerId=0] Created data-plane acceptor and processors for endpoint : EndPoint(localhost,9092,ListenerName(PLAINTEXT),PLAINTEXT) (kafka.network.SocketServer)
[2020-03-30 14:30:42,672] INFO Awaiting socket connections on localhost:9093. (kafka.network.Acceptor)
[2020-03-30 14:30:42,694] ERROR [KafkaServer id=0] Fatal error during KafkaServer startup. Prepare to shutdown (kafka.server.KafkaServer)
java.lang.IllegalArgumentException: Could not find a 'KafkaServer' or 'sasl_plaintext.KafkaServer' entry in the JAAS configuration. System property 'java.security.auth.login.config' is not set
        at org.apache.kafka.common.security.JaasContext.defaultContext(JaasContext.java:133)
        at org.apache.kafka.common.security.JaasContext.load(JaasContext.java:98)
        at org.apache.kafka.common.security.JaasContext.loadServerContext(JaasContext.java:70)
        at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:121)
        at org.apache.kafka.common.network.ChannelBuilders.serverChannelBuilder(ChannelBuilders.java:85)
        at kafka.network.Processor.<init>(SocketServer.scala:753)
        at kafka.network.SocketServer.newProcessor(SocketServer.scala:394)
        at kafka.network.SocketServer.$anonfun$addDataPlaneProcessors$1(SocketServer.scala:279)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:190)
        at kafka.network.SocketServer.addDataPlaneProcessors(SocketServer.scala:278)
        at kafka.network.SocketServer.$anonfun$createDataPlaneAcceptorsAndProcessors$1(SocketServer.scala:241)
        at kafka.network.SocketServer.$anonfun$createDataPlaneAcceptorsAndProcessors$1$adapted(SocketServer.scala:238)
        at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:553)
        at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:551)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:921)
        at kafka.network.SocketServer.createDataPlaneAcceptorsAndProcessors(SocketServer.scala:238)
        at kafka.network.SocketServer.startup(SocketServer.scala:121)
        at kafka.server.KafkaServer.startup(KafkaServer.scala:263)
        at kafka.server.KafkaServerStartable.startup(KafkaServerStartable.scala:44)
        at kafka.Kafka$.main(Kafka.scala:84)
        at kafka.Kafka.main(Kafka.scala)
[2020-03-30 14:30:42,713] INFO [KafkaServer id=0] shutting down (kafka.server.KafkaServer)
[2020-03-30 14:30:42,722] INFO [SocketServer brokerId=0] Stopping socket server request processors (kafka.network.SocketServer)
[2020-03-30 14:30:42,753] INFO [SocketServer brokerId=0] Stopped socket server request processors (kafka.network.SocketServer)
[2020-03-30 14:30:42,770] INFO Shutting down. (kafka.log.LogManager)
[2020-03-30 14:30:42,846] INFO Shutdown complete. (kafka.log.LogManager)
[2020-03-30 14:30:42,850] INFO [ZooKeeperClient Kafka server] Closing. (kafka.zookeeper.ZooKeeperClient)
[2020-03-30 14:30:42,967] INFO Session: 0x100006f57170000 closed (org.apache.zookeeper.ZooKeeper)
[2020-03-30 14:30:42,967] INFO EventThread shut down for session: 0x100006f57170000 (org.apache.zookeeper.ClientCnxn)
[2020-03-30 14:30:42,975] INFO [ZooKeeperClient Kafka server] Closed. (kafka.zookeeper.ZooKeeperClient)
[2020-03-30 14:30:42,977] INFO [ThrottledChannelReaper-Fetch]: Shutting down (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:43,975] INFO [ThrottledChannelReaper-Fetch]: Shutdown completed (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:43,975] INFO [ThrottledChannelReaper-Fetch]: Stopped (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:43,976] INFO [ThrottledChannelReaper-Produce]: Shutting down (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:44,976] INFO [ThrottledChannelReaper-Produce]: Stopped (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:44,976] INFO [ThrottledChannelReaper-Produce]: Shutdown completed (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:44,978] INFO [ThrottledChannelReaper-Request]: Shutting down (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:44,983] INFO [ThrottledChannelReaper-Request]: Stopped (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:44,983] INFO [ThrottledChannelReaper-Request]: Shutdown completed (kafka.server.ClientQuotaManager$ThrottledChannelReaper)
[2020-03-30 14:30:44,988] INFO [SocketServer brokerId=0] Shutting down socket server (kafka.network.SocketServer)
[2020-03-30 14:30:45,198] INFO [SocketServer brokerId=0] Shutdown completed (kafka.network.SocketServer)
[2020-03-30 14:30:45,230] INFO [KafkaServer id=0] shut down completed (kafka.server.KafkaServer)
[2020-03-30 14:30:45,236] ERROR Exiting Kafka. (kafka.server.KafkaServerStartable)
[2020-03-30 14:30:45,249] INFO [KafkaServer id=0] shutting down (kafka.server.KafkaServer)

C:\ApacheKafka\kafka_2.13-2.4.1>

Edit 1: I found that export is a Unix command, so I replaced export with set. Here is my new Kafka server start command in a batch file (double-click to run):

set KAFKA_OPTS="-Djava.security.auth.login.config=config/kafka_server_jaas.conf"
start bin/kafka-server-start.sh config/server.properties

but it opens a Git Bash window, and after some time it shows: "C:\Apachekafka\kafka_2.13-2.4.1/bin/kafka-run-class.sh: line 309: C:\Program: no such file or directory"
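The `C:\Program: no such file or directory` message at kafka-run-class.sh line 309 suggests an unquoted path containing a space, typically a JAVA_HOME under `C:\Program Files`, being word-split by the shell rather than anything to do with the JAAS file. A hedged Git Bash sketch that sidesteps the space via the 8.3 short name (the JDK path below is a placeholder, not necessarily your actual path):

```shell
# Assumption: JAVA_HOME points somewhere under "C:\Program Files", and the
# space in "Program Files" breaks kafka-run-class.sh. Re-export it using
# the 8.3 short form "Progra~1", which contains no space.
export JAVA_HOME="/c/Progra~1/Java/jdk1.8.0_241"   # placeholder JDK path

# Then launch as before, with the JAAS path anchored to the current dir.
export KAFKA_OPTS="-Djava.security.auth.login.config=$PWD/config/kafka_server_jaas.conf"
bin/kafka-server-start.sh config/server.properties
```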

1 Answer

I had the same error, kafka_server_jaas.conf (No such file or directory), and I just created the file using the classic Notepad app; everything is working now. =)

P.S. Creating the file with Notepad++ or with touch (Git Bash) did not work; I think it was some kind of encoding error.
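One plausible reading of the encoding theory: some editors prepend a UTF-8 byte-order mark (bytes EF BB BF), and the JVM's JAAS parser may then fail to recognize the first token of the file. A small sketch (file names are made up) that detects and strips a BOM from a config file:

```shell
# Create a JAAS-like file that starts with a UTF-8 BOM, imitating what a
# BOM-adding editor would produce. \357\273\277 is EF BB BF in octal.
printf '\357\273\277KafkaServer {\n' > jaas_with_bom.conf

# Show the first three bytes -- "ef bb bf" indicates a BOM is present.
head -c 3 jaas_with_bom.conf | od -An -tx1

# Strip the BOM by copying the file from byte 4 onward.
tail -c +4 jaas_with_bom.conf > kafka_server_jaas.conf

# The cleaned file now starts with plain ASCII "Kaf".
head -c 3 kafka_server_jaas.conf; echo
```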
