Confluent Certified Developer for Apache Kafka Certification Examination Questions and Answers
You are composing a REST request to create a new connector in a running Connect cluster. You invoke POST /connectors with a configuration and receive a 409 (Conflict) response.
What are two reasons for this response? (Select two.)
A consumer application runs once every two weeks and reads from a Kafka topic.
The last time the application ran, the last offset processed was 217.
The application is configured with auto.offset.reset=latest.
The current offsets in the topic start at 318 and end at 588.
Which offset will the application start reading from when it starts up for its next run?
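To reason about this scenario: the last committed offset (217) is now below the earliest available offset (318), so the stored position is out of range (and after two weeks it may also have been expired by offsets.retention.minutes), which means auto.offset.reset decides where to start. A minimal config sketch, with bootstrap address and group id as illustrative placeholders:

```java
import java.util.Properties;

public class OffsetResetExample {
    // Builds a consumer config matching the scenario above.
    public static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "biweekly-report");         // placeholder
        // The committed offset (217) is below the earliest available
        // offset (318), so the stored position is out of range and
        // auto.offset.reset applies: "latest" jumps to the log-end
        // offset (588 here), skipping records 318-587.
        props.put("auto.offset.reset", "latest");
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("auto.offset.reset"));
    }
}
```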
You create a topic with five partitions.
What can you assume about messages read from that topic by a single consumer group?
You need to configure a sink connector to write records that fail into a dead letter queue topic. Requirements:
Topic name: DLQ-Topic
Headers containing error context must be added to the messages.
Which three configuration parameters are necessary? (Select three.)
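For reference, a sketch of a sink connector configuration covering the three dead-letter-queue settings involved here (connector name, class, and source topic are illustrative placeholders; errors.tolerance=all is needed so failed records are routed rather than failing the task):

```json
{
  "name": "my-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "DLQ-Topic",
    "errors.deadletterqueue.context.headers.enable": "true"
  }
}
```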
You are designing a stream pipeline to monitor the real-time location of GPS trackers, where historical location data is not required.
Each event has:
• Key: trackerId
• Value: latitude, longitude
You need to ensure that the latest location for each tracker is always retained in the Kafka topic.
Which topic configuration parameter should you set?
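The retain-only-the-latest-value-per-key requirement points at log compaction. A CLI sketch for creating such a topic (topic name, broker address, and partition/replication counts are illustrative):

```shell
# Create a compacted topic: after compaction runs, only the newest
# value per trackerId key is retained.
kafka-topics.sh --create \
  --bootstrap-server localhost:9092 \
  --topic tracker-locations \
  --partitions 6 \
  --replication-factor 3 \
  --config cleanup.policy=compact
```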
A producer is configured with the default partitioner. It is sending records to a topic that is configured with five partitions. The record does not contain any key.
What is the result of this?
You have a Kafka client application that has real-time processing requirements.
Which Kafka metric should you monitor?
You need to send a JSON message on the wire. The message key is a string.
How would you do this?
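One common approach is a StringSerializer for the key and a JSON-aware serializer for the value. The sketch below assumes Confluent's KafkaJsonSerializer is on the classpath (an assumption; serializing the JSON to a String yourself and using StringSerializer also works):

```java
import java.util.Properties;

public class JsonProducerConfig {
    public static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        // String key -> StringSerializer.
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        // JSON value -> a JSON-aware serializer; Confluent's
        // KafkaJsonSerializer is one option (assumes the
        // kafka-json-serializer artifact is on the classpath).
        props.put("value.serializer",
            "io.confluent.kafka.serializers.KafkaJsonSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("key.serializer"));
    }
}
```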
You are experiencing low throughput from a Java producer.
Metrics show low I/O thread ratio and low I/O thread wait ratio.
What is the most likely cause of the slow producer performance?
You are developing a Kafka Streams application with a complex topology that has multiple sources, processors, sinks, and sub-topologies.
You are working in a development environment and do not have access to a real Kafka cluster or topics.
You need to perform unit testing on your Kafka Streams application.
Which should you use?
Which two statements are correct when assigning partitions to the consumers in a consumer group using the assign() API?
(Select two.)
You create a topic named stream-logs with:
A replication factor of 3
Four partitions
Messages that are plain logs without a key
How will messages be distributed across partitions?
This schema excerpt is an example of which schema format?
package com.mycorp.mynamespace;

message SampleRecord {
  int32 Stock = 1;
  double Price = 2;
  string Product_Name = 3;
}
What is the default maximum size of a message the Apache Kafka broker can accept?
You are building real-time streaming applications using Kafka Streams.
Your application has a custom transformation.
You need to define custom processors in Kafka Streams.
Which tool should you use?
You are writing to a Kafka topic with producer configuration acks=all.
The producer receives acknowledgements from the broker but still creates duplicate messages due to network timeouts and retries.
You need to ensure that duplicate messages are not created.
Which producer configuration should you set?
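The retry-caused-duplicates scenario is what the idempotent producer addresses: the broker deduplicates retried sends using a producer id and per-partition sequence numbers. A config sketch (bootstrap address is a placeholder):

```java
import java.util.Properties;

public class IdempotentProducerConfig {
    public static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        // Idempotence assigns the producer a PID and per-partition
        // sequence numbers, so the broker discards retried duplicates
        // caused by network timeouts.
        props.put("enable.idempotence", "true");
        // enable.idempotence=true requires acks=all and
        // max.in.flight.requests.per.connection <= 5.
        props.put("acks", "all");
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("enable.idempotence"));
    }
}
```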
An application is consuming messages from Kafka.
The application logs show that partitions are frequently being reassigned within the consumer group.
Which two factors may be contributing to this?
(Select two.)
Clients that connect to a Kafka cluster are required to specify one or more brokers in the bootstrap.servers parameter.
What is the primary advantage of specifying more than one broker?
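A sketch of a multi-broker bootstrap list (broker addresses are illustrative). The list is only used for the initial metadata fetch; after that the client connects to whichever brokers lead the partitions it needs:

```java
import java.util.Properties;

public class BootstrapConfig {
    public static Properties buildProps() {
        Properties props = new Properties();
        // Listing several brokers means the client can still complete
        // its initial connection and metadata fetch even if one of the
        // listed brokers is down.
        props.put("bootstrap.servers",
            "broker1:9092,broker2:9092,broker3:9092");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("bootstrap.servers"));
    }
}
```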
You need to explain the best reason to implement the ConsumerRebalanceListener consumer callback interface prior to a consumer group rebalance.
Which statement is correct?
You want to connect with username and password to a secured Kafka cluster that has SSL encryption.
Which properties must your client include?
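A sketch of client properties combining SSL transport encryption with SASL username/password authentication. PLAIN is one username/password mechanism (SCRAM-SHA-256/512 are common alternatives, depending on the cluster); the credentials, truststore path, and passwords below are placeholders:

```java
import java.util.Properties;

public class SaslSslClientConfig {
    public static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9093"); // placeholder
        // SSL-encrypted transport + SASL authentication.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"myuser\" password=\"mypassword\";"); // placeholders
        // Truststore for verifying the broker certificate
        // (path and password are placeholders).
        props.put("ssl.truststore.location",
            "/etc/kafka/client.truststore.jks");
        props.put("ssl.truststore.password", "changeit");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("security.protocol"));
    }
}
```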
Your application consumes from a topic configured with a deserializer.
You want the application to be resilient to badly formatted records (poison pills).
You surround the poll() call with a try/catch block for RecordDeserializationException.
You need to log the bad record, skip it, and continue processing other records.
Which action should you take in the catch block?
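RecordDeserializationException carries the partition and offset of the record that failed to deserialize, which enables the skip. A non-runnable sketch of the pattern (assumes a KafkaConsumer named consumer and the kafka-clients library; process() and log are hypothetical):

```java
// Sketch only: assumes KafkaConsumer<String, String> consumer,
// a logger named log, and a hypothetical process() method.
while (true) {
    try {
        ConsumerRecords<String, String> records =
            consumer.poll(Duration.ofMillis(500));
        records.forEach(r -> process(r));
    } catch (RecordDeserializationException e) {
        // The exception identifies exactly which record failed.
        TopicPartition tp = e.topicPartition();
        long badOffset = e.offset();
        log.warn("Skipping poison pill at {} offset {}", tp, badOffset);
        // Seek one past the bad record so the next poll() resumes
        // with the remaining records instead of failing again.
        consumer.seek(tp, badOffset + 1);
    }
}
```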
A consumer application needs to use an at-most-once delivery semantic.
What is the best consumer configuration and code skeleton to avoid duplicate messages being read?
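For at-most-once, the usual pattern is to commit offsets before processing, so a crash mid-processing loses those records rather than redelivering them. A config sketch with the poll-commit-process skeleton summarized in comments (bootstrap address and group id are placeholders):

```java
import java.util.Properties;

public class AtMostOnceConfig {
    public static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "amo-group");               // placeholder
        // Manual commits let us commit BEFORE processing; with
        // at-most-once, a crash during processing loses those records
        // instead of re-reading them on restart.
        props.put("enable.auto.commit", "false");
        props.put("key.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
            "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    // Skeleton (consumer calls elided, shown as comments):
    //   records = consumer.poll(timeout);
    //   consumer.commitSync();   // commit first...
    //   process(records);        // ...then process
    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("enable.auto.commit"));
    }
}
```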
Kafka producers can batch messages going to the same partition.
Which statement is correct about producer batching?
You are implementing a Kafka Streams application to process financial transactions.
Each transaction must be processed exactly once to ensure accuracy.
The application reads from an input topic, performs computations, and writes results to an output topic.
During testing, you notice duplicate entries in the output topic, which violates the exactly-once processing requirement.
You need to ensure exactly-once semantics (EOS) for this Kafka Streams application.
Which step should you take?
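Kafka Streams exposes EOS through the processing.guarantee setting, which wraps the consume-process-produce cycle in transactions so retries cannot produce duplicates in the output topic. A config sketch (bootstrap address and application id are placeholders; the string key below equals StreamsConfig.PROCESSING_GUARANTEE_CONFIG):

```java
import java.util.Properties;

public class EosStreamsConfig {
    public static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("application.id", "txn-processor");     // placeholder
        // "exactly_once_v2" is the current EOS mode (Kafka Streams
        // 3.0+); older versions use "exactly_once".
        props.put("processing.guarantee", "exactly_once_v2");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("processing.guarantee"));
    }
}
```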
Which statement describes the storage location for a sink connector’s offsets?
You are building a system for a retail store selling products to customers.
Which three datasets should you model as a GlobalKTable?
(Select three.)