Splunk® Data Stream Processor

Getting Data In



On April 3, 2023, Splunk Data Stream Processor will reach its end of sale, and will reach its end of life on February 28, 2025. If you are an existing DSP customer, please reach out to your account team for more information.
This documentation does not apply to the most recent version of Splunk® Data Stream Processor. For documentation on the most recent version, go to the latest release.

Send events to a DSP data pipeline using a Splunk forwarder

You can forward data to your data pipeline using a Splunk heavy forwarder or a universal forwarder. A forwarder is a Splunk Enterprise instance that forwards data to another Splunk Enterprise instance, such as an indexer or another forwarder, or to a third-party system. In this case, the forwarder sends data to the Splunk Forwarder Service, from which your data pipeline ingests it.

The Splunk Forwarder Service identifies incoming connections from forwarders by the client certificate on the forwarder. Your forwarder must be configured to use a self-signed client certificate. Before using the Splunk Forwarder Service, you must:

  • Create and upload a self-signed certificate to Splunk Forwarder Service.
  • Configure your forwarder to use the certificate.

When using a heavy forwarder, the event's timestamp is set to the extracted event time. When using a universal forwarder, the timestamp is set to the event's ingestion time, not the time of the event itself.

Download and install a universal forwarder

  1. Confirm that your system meets the system requirements.
  2. Download the correct forwarder for your operating system and application.

Using a universal forwarder

A universal forwarder doesn't separate ingested data into individual events by itself. To create a stream of complete individual events, you must use the Group by and Merge Events functions. After following the steps listed on this topic to set up your universal forwarder, see Create a Splunk Universal Forwarder pipeline for instructions on how to build a DSP pipeline that successfully ingests universal forwarder data.

Create a self-signed certificate

You must secure the connection between the Splunk Data Stream Processor and your Splunk instance.

Prerequisites:

  1. Choose a name to identify your Splunk instance. Use this name for the generated key, certificate signing request (CSR), and PEM files.
  2. Choose an email address to associate with the certificate. The email address is included in the name of the certificate.

Steps

  1. From the command line, generate a private key to sign your certificates.
    openssl genrsa -out my_forwarder.key 2048
  2. Use your private key to generate a CSR file.
    openssl req -new -key "my_forwarder.key" -out "my_forwarder.csr" -subj "/C=US/ST=CA/O=my_organization/CN=my_forwarder/emailAddress=email@example.com"
  3. Sign your CSR file with your newly created private key and create a client certificate that expires in 2 years.
    openssl x509 -req -days 730 -in "my_forwarder.csr" -signkey "my_forwarder.key" -out "my_forwarder.pem" -sha256
  4. Use the SCloud tool to format your certificate and upload your certificate to your tenant. For more information about SCloud, see SCloud.
    scloud forwarders create-certificate my_forwarder.pem 

After the certificate is added to the tenant, the server response shows details about the certificate, including the hash, subject, issuer, and validity dates.

By default, you can add up to 100 certificates to your tenant.
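Before uploading, you can confirm locally what the Forwarder Service will see by inspecting the PEM file with openssl. The following sketch recreates the key and certificate from the steps above with the example names, then prints the subject, issuer, and validity window:

```shell
# Recreate the example key and client certificate from the steps above.
openssl genrsa -out my_forwarder.key 2048
openssl req -new -key my_forwarder.key -out my_forwarder.csr \
  -subj "/C=US/ST=CA/O=my_organization/CN=my_forwarder/emailAddress=email@example.com"
openssl x509 -req -days 730 -in my_forwarder.csr \
  -signkey my_forwarder.key -out my_forwarder.pem -sha256

# Print the subject, issuer, and validity dates so you can confirm
# them before uploading the certificate with scloud.
openssl x509 -in my_forwarder.pem -noout -subject -issuer -dates
```

The `-dates` output shows the `notBefore` and `notAfter` timestamps, which should match the validity dates the server reports after upload.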

Configure your forwarder to use the client certificate

  1. On the host that forwards the data that you want to collect, open a shell, command prompt, or PowerShell window.
  2. Concatenate your private and public keys into one file.
    cat my_forwarder.pem my_forwarder.key > my_forwarder-keys.pem
    
  3. Navigate to the configuration directory for the forwarder.
    cd $SPLUNK_HOME/etc/system/local
  4. Open outputs.conf in $SPLUNK_HOME/etc/system/local for editing and add the following stanza. If outputs.conf does not exist, create it.
    [tcpout:<group_name>]
    server = <ip-address-of-node>:30001, <ip-address-of-node>:30001, <ip-address-of-node>:30001, <ip-address-of-node>:30001, <ip-address-of-node>:30001
    clientCert = /path/to/my_forwarder-keys.pem
    sslVerifyServerCert = false
    useACK = true
    List every node in your DSP cluster in the server setting. Spreading initial traffic across all nodes helps if a node becomes temporarily unavailable or goes down. The DSP cluster load balances internal traffic automatically.
    Optionally, add sslCommonNameToCheck = <commonName1>, <commonName2>, ... to the stanza. When populated, Splunk software checks the common name of the client's certificate against this list of names. If there is no match, the Splunk instance is not authenticated. The requireClientCert attribute must be set to true to use this attribute.
  5. Configure your forwarder to trust the DSP Forwarder service's certificate, which is signed by DigiCert.
    1. From the "DigiCert Trusted Root Authority Certificates" page on the DigiCert website, download the DigiCert Global Root CA certificate as a PEM file.
    2. Open server.conf in etc/system/local for editing and add the following to the sslConfig stanza:
      [sslConfig]
      sslRootCAPath = /path/to/DigiCertGlobalRootCA.crt.pem
    3. Restart the forwarder to complete your changes.
  6. Verify that data is being forwarded to your pipeline.
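A common failure at this point is a badly concatenated certificate file from step 2. As a quick sanity check before restarting, you can count the PEM blocks in the combined file. This helper is a sketch (the grep patterns and file name assume the earlier example; the private-key header varies by OpenSSL version, which the pattern accounts for):

```shell
# Sanity-check a combined cert+key file: it should contain exactly one
# certificate block and one private key block.
check_combined() {
  f="$1"
  [ "$(grep -c 'BEGIN CERTIFICATE' "$f")" -eq 1 ] &&
  [ "$(grep -c 'BEGIN.*PRIVATE KEY' "$f")" -eq 1 ] &&
  echo "$f looks OK"
}

check_combined my_forwarder-keys.pem || echo "my_forwarder-keys.pem failed the check"
```

If the check fails, re-run the cat command from step 2 with the certificate (.pem) file first and the key file second.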

If you are sending data from a large number of forwarders and are experiencing throughput issues, increase the number of replicas for the ingest-s2s deployment in your Kubernetes cluster by adding more nodes to the cluster. See Install the Data Stream Processor for instructions on how to join a new node to your cluster.

Add permissions to users to send data with the Forwarders Service

By default, tenant administrators have permissions to use the Forwarders service. Perform the following steps to grant permissions to another user in your tenant to send data using the Forwarders service.

  1. Ask the user you want to grant permissions to for their token UUID.
    • Ask the user to log in to SCloud:
      scloud login
    • Ask the user to run the following SCloud command and send you their UUID:
      scloud identity validate-token
  2. Create a new group to give permissions to.
    scloud identity create-group <groupName>
  3. Create a new role to give permissions to.
    scloud identity create-role <roleName>
    
  4. Add write, read, and delete permissions to the role that you created or to an existing role.
    • To add write permissions, type the following command:
      scloud identity add-role-permission <roleName> default:*:forwarders.certificate.write
    • To add read permissions, type the following command:
      scloud identity add-role-permission <roleName> default:*:forwarders.certificate.read
    • To add delete permissions, type the following command:
      scloud identity add-role-permission <roleName> default:*:forwarders.certificate.delete
  5. Add the role to the group that you created.
    scloud identity add-group-role <groupName> <roleName>
  6. Add the user to the group.
    scloud identity add-group-member <groupName> <user-UUID>
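The permission grants above can be scripted. This sketch only prints the scloud commands rather than running them (pipe the output to sh to execute); the group name, role name, and UUID passed at the bottom are example values:

```shell
# Print the scloud commands that grant a user permission to send data
# with the Forwarders service. Arguments: group name, role name, user UUID.
grant_forwarder_perms() {
  group="$1"; role="$2"; uuid="$3"
  echo "scloud identity create-group $group"
  echo "scloud identity create-role $role"
  # Grant the three certificate permissions to the role.
  for perm in write read delete; do
    echo "scloud identity add-role-permission $role default:*:forwarders.certificate.$perm"
  done
  echo "scloud identity add-group-role $group $role"
  echo "scloud identity add-group-member $group $uuid"
}

# Example invocation with placeholder names and UUID.
grant_forwarder_perms dsp-forwarder-group dsp-forwarder-role 00000000-0000-0000-0000-000000000000
```

Printing the commands first lets you review the exact permissions before applying them to your tenant.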
Last modified on 07 August, 2020

This documentation applies to the following version of Splunk® Data Stream Processor: 1.0.0

