Splunk® Connect for Kafka

Install and Administer Splunk Connect for Kafka


Install Splunk Connect for Kafka

The following steps have been tested on both Apache Kafka and Confluent Platform deployments.

To install Splunk Connect for Kafka, perform the following steps:

  1. Navigate to the Splunk Connect for Kafka repository on GitHub and download the latest splunk-kafka-connect-[VERSION].jar release.
  2. Start your Kafka cluster and confirm that it is running. To verify that the Kafka Connect REST endpoint is reachable, run:
    curl http://<KAFKA_CONNECT_HOSTNAME>:<KAFKA_CONNECT_PORT>
    For example: curl http://localhost:8083
  3. (Optional) Create a directory to store your Kafka Connect connectors. This will be used for your plugin.path setting.
  4. Navigate to your $KAFKA_HOME/config/ directory.
  5. Modify the connect-distributed.properties file to include the following information. A filled-in example appears after this procedure.
    
    #Required configurations for Splunk Connect for Kafka
    bootstrap.servers=<BOOTSTRAP_SERVER1,BOOTSTRAP_SERVER2,BOOTSTRAP_SERVER3>
    plugin.path=<PLUGIN_PATH>
    
    key.converter=<org.apache.kafka.connect.storage.StringConverter|org.apache.kafka.connect.json.JsonConverter|io.confluent.connect.avro.AvroConverter>
    value.converter=<org.apache.kafka.connect.storage.StringConverter|org.apache.kafka.connect.json.JsonConverter|io.confluent.connect.avro.AvroConverter>
    
  6. Place the Splunk Connect for Kafka jar file in the plugin.path directory for all Kafka Connect hosts.
  7. Restart your deployment's Kafka Connect services.
  8. Run ./bin/connect-distributed.sh config/connect-distributed.properties to start Kafka Connect.
  9. If this is a new install, create a test topic (for example, perf), and inject events into the topic using the Kafka data-gen-app or the kafka-console-producer. A kafka-console-producer example appears after this procedure.
  10. Run the following command to confirm that Splunk Connect for Kafka has been installed and configured correctly:
    curl http://localhost:8083/connector-plugins
    The command returns a list of the available connectors. Splunk Connect for Kafka is installed if the returned list contains com.splunk.kafka.connect.SplunkSinkConnector.

    $KAFKA_HOME is the home directory where your Kafka Connect deployment is located. This could be the same as your Kafka home directory if you are running on a shared system.
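
The following is a minimal sketch of the required connect-distributed.properties settings from step 5 with the placeholders filled in. The broker host names, plugin path, and converter choice shown here are assumptions for illustration only; substitute the values for your own deployment.

    # Assumed example values; replace with your own brokers and plugin directory
    bootstrap.servers=kafkabroker1:9092,kafkabroker2:9092,kafkabroker3:9092
    plugin.path=/opt/kafka-connect/plugins

    # StringConverter is shown here; use JsonConverter or AvroConverter if your topic data requires it
    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=org.apache.kafka.connect.storage.StringConverter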
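
If you use the kafka-console-producer for the test in step 9, the following sketch creates a test topic named perf and sends events to it. It assumes the Kafka command-line tools under $KAFKA_HOME/bin, a broker at localhost:9092, and Kafka 2.5 or later (older releases use --broker-list instead of --bootstrap-server); adjust for your environment.

    # Create a test topic named perf (assumed broker address)
    ./bin/kafka-topics.sh --create --topic perf --bootstrap-server localhost:9092 --partitions 3 --replication-factor 1

    # Send test events to the topic; type one event per line, then press Ctrl+C to exit
    ./bin/kafka-console-producer.sh --topic perf --bootstrap-server localhost:9092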

Splunk Connect for Kafka commands

Use the following commands to check the status of Splunk Connect for Kafka, to manage connectors, and to manage tasks:

List active connectors:
    curl http://localhost:8083/connectors

Get kafka-connect-splunk connector information:
    curl http://localhost:8083/connectors/kafka-connect-splunk

Get kafka-connect-splunk connector configuration information:
    curl http://localhost:8083/connectors/kafka-connect-splunk/config

Delete the kafka-connect-splunk connector:
    curl http://localhost:8083/connectors/kafka-connect-splunk -X DELETE

Get kafka-connect-splunk connector task information:
    curl http://localhost:8083/connectors/kafka-connect-splunk/tasks

Pause the kafka-connect-splunk connector:
    curl -X PUT http://localhost:8083/connectors/kafka-connect-splunk/pause

See the Confluent documentation for additional REST examples.
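
You can also create a connector instance and check its health through the same REST interface. The following is a minimal sketch: the connector name kafka-connect-splunk matches the commands above, and the keys splunk.hec.uri and splunk.hec.token are data ingestion parameters for this connector, while the HEC address, token, topic name, and task count shown are placeholder assumptions. See the data ingestion parameters topic for the full list of settings.

    # Create a kafka-connect-splunk connector instance (placeholder HEC address and token)
    curl -X POST http://localhost:8083/connectors \
      -H "Content-Type: application/json" \
      -d '{
            "name": "kafka-connect-splunk",
            "config": {
              "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
              "tasks.max": "3",
              "topics": "perf",
              "splunk.hec.uri": "https://<SPLUNK_HEC_HOST>:8088",
              "splunk.hec.token": "<SPLUNK_HEC_TOKEN>"
            }
          }'

    # Check the status of the connector and its tasks
    curl http://localhost:8083/connectors/kafka-connect-splunk/status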



