Splunk® Connect for Kafka

Install and Administer Splunk Connect for Kafka

This documentation does not apply to the most recent version of Splunk® Connect for Kafka. For documentation on the most recent version, go to the latest release.

Configuration examples for Splunk Connect for Kafka

Use the following configuration examples to configure Splunk Connect for Kafka for your deployment.

Enable HEC token acknowledgement to avoid data loss. Without acknowledgement, data may be lost, especially if the system restarts or crashes.
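
For reference, this is a minimal sketch of the acknowledgement-related settings used in the examples on this page; the interval, thread, and timeout values are illustrative rather than required defaults, and they assume the HEC token was created with indexer acknowledgement enabled on the Splunk platform:

"splunk.hec.ack.enabled": "true",
"splunk.hec.ack.poll.interval": "20",
"splunk.hec.ack.poll.threads": "2",
"splunk.hec.event.timeout": "300"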

If raw events need to go through the Splunk platform's index-time extraction, use the HEC /raw event endpoint. When you use the /raw HEC endpoint and your raw data does not contain a timestamp, or contains multiple timestamps or carriage returns, you must configure splunk.hec.raw.line.breaker and set up a corresponding props.conf on your Splunk platform that honors the line breaker setting. This helps the Splunk platform break the data into events correctly. For example, in the connector configuration, set "splunk.hec.raw.line.breaker":"####" for sourcetype "s1".

In props.conf, set up the line breaker for the s1 sourcetype as follows:

[s1]
LINE_BREAKER = (####)
SHOULD_LINEMERGE = false
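
A matching connector-side sketch pairs the line breaker with the sourcetype assignment; the s1 sourcetype name is only an example:

"splunk.hec.raw": "true",
"splunk.hec.raw.line.breaker": "####",
"splunk.sourcetypes": "s1"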

The auto-assigned timestamp will work for all deployments that use the /event HEC endpoint.


Splunk indexing with acknowledgment

Using HEC /raw endpoint

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
    "name": "splunk-prod-financial",
      "config": {
        "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
        "tasks.max": "10",
        "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
        "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
        "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
        "splunk.hec.ack.enabled" : "true",
        "splunk.hec.ack.poll.interval" : "20",
        "splunk.hec.ack.poll.threads" : "2",
        "splunk.hec.event.timeout" : "300",
        "splunk.hec.raw" : "true",
        "splunk.hec.raw.line.breaker" : "####"
    }
}'

Using HEC /event endpoint

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
     "name": "splunk-prod-financial",
       "config": {
         "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
         "tasks.max": "10",
         "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
         "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
         "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
         "splunk.hec.ack.enabled" : "true",
         "splunk.hec.ack.poll.interval" : "20",
         "splunk.hec.ack.poll.threads" : "2",
         "splunk.hec.event.timeout" : "300",
         "splunk.hec.raw" : "false",
         "splunk.hec.json.event.enrichment" : "org=fin,bu=south-east-us",
         "splunk.hec.track.data" : "true"
       }
   }'

Splunk indexing without acknowledgment

Using HEC /raw endpoint

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
     "name": "splunk-prod-financial",
       "config": {
         "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
         "tasks.max": "10",
         "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
         "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
         "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534"
         "splunk.hec.ack.enabled" : "false",
         "splunk.hec.raw" : "true",
         "splunk.hec.raw.line.breaker" : "####"
        }
    }'

Using HEC /event endpoint

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
     "name": "splunk-prod-financial",
       "config": {
         "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
         "tasks.max": "10",
         "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
         "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
         "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
         "splunk.hec.ack.enabled" : "false",
         "splunk.hec.raw" : "false",
         "splunk.hec.json.event.enrichment" : "org=fin,bu=south-east-us",
         "splunk.hec.track.data" : "true"
        }
    }'

Example of a connector with header support enabled

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d' {
    "name": "splunk-prod-financial",
      "config": {
        "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
        "tasks.max": "10",
        "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
        "splunk.sourcetypes": "collectd_http",
        "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
        "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
        "splunk.hec.ack.enabled": "true",
        "splunk.hec.ack.poll.interval": "20",
        "splunk.hec.ack.poll.threads": "2",
        "splunk.hec.event.timeout": "120",
        "splunk.hec.raw": "false",
        "splunk.header.support": "true",
        "splunk.header.index": "destination_storage",
        "splunk.header.source": "Financial_Application",
        "splunk.header.sourcetype": "ledger_format",
        "splunk.header.host": "finance.company.host"
        }
    }'

Example of a connector for custom Java keystore location

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d' {
    "name": "splunk-prod-financial",
      "config": {
        "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
        "tasks.max": "20",
        "topics": "t1",
        "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
        "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
        "splunk.hec.ssl.trust.store.path": "/keystore.jks",
        "splunk.hec.ssl.trust.store.password": "password"
  }
 }'
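
If the truststore does not exist yet, one way to create it, as a sketch (the certificate file name and alias are placeholders), is to import the HEC certificate with keytool:

keytool -importcert -alias splunk-hec -file hec_cert.pem -keystore /keystore.jks -storepass password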

Example of a connector for events already in HEC format

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d' {
    "name": "splunk-prod-financial",
      "config": {
        "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
        "tasks.max": "20",
        "topics": "t1",
        "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
        "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
        "splunk.hec.json.event.formatted": "true"
 }
 }'
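
With splunk.hec.json.event.formatted set to true, each Kafka record value is expected to already be a complete HEC event object. A minimal sketch of such a record value, with illustrative field values:

{"time": 1586275199, "host": "web01", "source": "app", "sourcetype": "app_log", "index": "main", "event": {"message": "user login", "status": "ok"}}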

Example of a connector to send collectd metrics to a Splunk metrics index

The Splunk metrics index is optimized for ingesting and retrieving metrics. For more information, see the Metrics manual.

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
    "name": "splunk-prod-financial",
      "config": {
        "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
        "tasks.max": "10",
        "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
        "splunk.sourcetypes": "collectd_http",
        "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
        "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
        "splunk.hec.ack.enabled" : "true",
        "splunk.hec.ack.poll.interval" : "20",
        "splunk.hec.ack.poll.threads" : "2",
        "splunk.hec.event.timeout" : "120",
        "splunk.hec.raw" : "true",
        "splunk.hec.raw.line.breaker" : "####"
    }
}'

Example of a connector with 10 topics and 10 parallelized tasks

Use the following command to create a connector called splunk-prod-financial for 10 topics and 10 parallelized tasks. The connector uses the HEC /event endpoint with acknowledgement enabled, and the data is ingested into a three-server Splunk platform indexer cluster.

curl <hostname>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
    "name": "splunk-prod-financial",
      "config": {
         "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
         "tasks.max": "10",
         "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
         "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
         "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534"
      }
    }'
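
After creating the connector, you can verify that its tasks are running with a standard Kafka Connect REST call; the host name is a placeholder:

curl <hostname>:8083/connectors/splunk-prod-financial/status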

Example of a connector with 20 parallelized tasks

Use the following command to update the existing splunk-prod-financial connector to use 20 parallelized tasks. Because the connector already exists, submit the new configuration to its config endpoint with a PUT request.

curl <hostname>:8083/connectors/splunk-prod-financial/config -X PUT -H "Content-Type: application/json" -d'{
     "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
     "tasks.max": "20",
     "topics": "t1,t2,t3,t4,t5,t6,t7,t8,t9,t10",
     "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
     "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534"
   }'

Example of load balancing with a list of HEC-enabled endpoints

 
curl <KAFKA_CONNECT_HOST>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
  "name": "splunk-prod-financial",
    "config": {
      "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
      "tasks.max": "1",
      "topics": "t1",
      "splunk.hec.uri": "https://idx1:8088,https://idx2:8088,https://idx3:8088",
      "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
      "splunk.hec.ack.enabled : "true",
      "splunk.hec.raw" : "true",
      "splunk.hec.raw.line.breaker" : "####"
    }
}'

Example of load balancing with a preconfigured load balancer

 
curl <KAFKA_CONNECT_HOST>:8083/connectors -X POST -H "Content-Type: application/json" -d'{
  "name": "splunk-prod-financial",
    "config": {
      "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
      "tasks.max": "1",
      "topics": "t1",
      "splunk.hec.uri": "https://elb-kafka:8088",
      "splunk.hec.token": "1B901D2B-576D-40CD-AF1E-98141B499534",
      "splunk.hec.ack.enabled : "true",
      "splunk.hec.raw" : "true",
      "splunk.hec.raw.line.breaker" : "####"
    }
}'