Splunk® SOAR (On-premises)

Administer Splunk SOAR (On-premises)

The classic playbook editor will be deprecated in early 2025. Convert your classic playbooks to modern mode.
After the classic playbook editor is removed, your existing classic playbooks will continue to run. However, you will no longer be able to visualize or modify them.

Create and download or upload a diagnostic file

In Splunk SOAR (On-premises) release 6.2.0 and higher, you can create diagnostic files that contain selectable categories of data to help Splunk Support diagnose issues with your deployment.

The following Splunk SOAR (On-premises) configurations are supported:

  • Single instance deployments
  • Deployments using warm standby

Splunk SOAR (On-premises) clustered deployments are not currently supported.

To upload the diagnostic file to Splunk Support, you need an active support case and credentials for the Support Portal. For more information about opening a support case, see the heading Splunk Technical Support in the topic Administer.

Create a diagnostic file

You can create a diagnostic file from either the web-based user interface or the command line.

Use the web-based user interface

From the Home menu, select Administration, then System Health, then Debugging.

  1. (Optional) Click the ► symbol next to Advanced.
  2. (Optional) Select the checkboxes for the categories you want to include in your diagnostic file: Instance, System, Database, Apps, Filesystem, and Cloud. By default, all sections except Filesystem are included.
  3. (Optional) Select the range of logs you want to include in your diagnostic file: All Logs or Recent Logs. The default is All Logs.
  4. To download the diagnostic file locally, click Download Logs.
  5. To upload your diagnostic file and attach it to your support case, click Upload to Support.
    1. Type your Support Portal username, password, and case number.
    2. Click Login and Upload.

Usernames must be submitted in all lowercase letters.

Use the command line

You can create a diagnostic file using the command line.

Use the command phenv python -m manage diag with the arguments you need to create and, optionally, upload your diagnostic file.
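
For example, to review the full list of available arguments before you build your command, you can run:

phenv python -m manage diag -h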

The diag command supports the following arguments:

-h, --help
  Show the help message, then exit.

-p <OUTPUT_DIR>, --path <OUTPUT_DIR>
  The path for the destination directory in which to write the diagnostics TAR file.

-s {instance,system,db,filesystem,apps,cloud} [{instance,system,db,filesystem,apps,cloud} ...], --sections {instance,system,db,filesystem,apps,cloud} [{instance,system,db,filesystem,apps,cloud} ...]
  Specify one or more diagnostic sections to gather. By default, all sections are gathered.
  Including filesystem can make generating the diagnostic file take longer.

-r, --recent-logs
  Include only recent log files in the diagnostic TAR file. If you do not use this argument, all logs are included.

--username <USERNAME>
  Your username for the Splunk Support Portal. Include this if you want to upload the diagnostic file to your support case.
  Usernames must be submitted in all lowercase letters.

--password <PASSWORD>
  Your password for the Splunk Support Portal. Include this if you want to upload the diagnostic file to your support case. You can also use the SPLUNK_PASSWORD environment variable instead of passing a password on the command line.

-c <CASE_NUMBER>, --case-number <CASE_NUMBER>
  The number of the support case to which you want to attach the diagnostic file. Include this if you want to upload the diagnostic file to your support case.

-d, --dry-run
  If specified, no files are created.

-v {0,1,2,3}, --verbosity {0,1,2,3}
  Set how verbose you want the command output to be:
  • 0 = minimal output
  • 1 = normal output
  • 2 = verbose output
  • 3 = very verbose output

--no-color
  Don't colorize the command output.

--skip-checks
  Skip system checks.

Examples

To create a diagnostic file on the local filesystem, run the command:

phenv python -m manage diag
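
To create a diagnostic file in a specific directory that contains only selected sections and only recent logs, combine the arguments described above. For example, where /tmp stands in for a destination directory of your choice:

phenv python -m manage diag --path /tmp --sections instance system db --recent-logs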

To create a diagnostic file and upload it to your support case, run the command:

phenv python -m manage diag --username <USERNAME> --password <PASSWORD> --case-number <CASE_NUMBER>
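
If you prefer not to type your password on the command line, you can set the SPLUNK_PASSWORD environment variable instead, as described for the --password argument. The following is a sketch assuming a bash shell:

export SPLUNK_PASSWORD='<PASSWORD>'
phenv python -m manage diag --username <USERNAME> --case-number <CASE_NUMBER>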

Sample output

phantom@soar1-i-0294e5a91dd236352:~$ phenv python -m manage diag
Writing metadata JSON.

Done.
JSON is located at /opt/phantom/private/phantom_logs_2023-04-13-1943/metadata.json.
Writing ingestion status JSON.

Done.
JSON is located at /opt/phantom/private/phantom_logs_2023-04-13-1943/ingestion_status_2023-04-13-1943.json.

Done.
Ingestion Status info is located at /opt/phantom/private/phantom_logs_2023-04-13-1943/ingestion_status_2023-04-13-1943.json.
Collecting CPU Info...
Collecting Disk Space...
Collecting Hostname...
Collecting Network Configuration...
Collecting Operating System...
Collecting Currently Running Processes...
Collecting Python3 Packages...
...  (pruned for brevity)
/opt/phantom/splunkforwarder/var/run/splunk/confsnapshot/tmpEtc_local/apps/soar_hec
/opt/phantom/splunkforwarder/var/run/splunk/confsnapshot/tmpEtc_local/apps/soar_hec/local
/opt/phantom/splunkforwarder/var/run/splunk/csv
/opt/phantom/splunkforwarder/var/run/splunk/conf-mutator.pid
/opt/phantom/splunkforwarder/var/run/splunk/dispatch
/opt/phantom/splunkforwarder/var/run/splunk/search_telemetry
/opt/phantom/splunkforwarder/var/run/splunk/appserver
/opt/phantom/splunkforwarder/var/run/splunk/appserver/modules
/opt/phantom/splunkforwarder/var/run/splunk/appserver/modules/static
/opt/phantom/splunkforwarder/var/run/splunk/appserver/modules/static/css
/opt/phantom/splunkforwarder/var/run/splunk/appserver/i18n
/opt/phantom/splunkforwarder/var/run/splunk/composite.xml
/opt/phantom/splunkforwarder/var/run/splunk/upload
/opt/phantom/splunkforwarder/var/run/splunk/splunkd.pid
/opt/phantom/private/phantom_logs_2023-04-13-1943
/opt/phantom/private/phantom_logs_2023-04-13-1943/metadata.json
/opt/phantom/private/phantom_logs_2023-04-13-1943/ingestion_status_2023-04-13-1943.json
/opt/phantom/splunkforwarder/etc/system/local/user-seed.conf
/opt/phantom/splunkforwarder/ftr
/opt/phantom/.soar
/opt/phantom/etc/logrotate.d/phantom_logrotate.conf
/opt/phantom/www/phantom_ui/settings.py
/opt/phantom/splunkforwarder/etc/auth expected '0o755', but actual is '0o700'
/opt/phantom/bin/spawn3 expected '0o4750', but actual is '0o750'
/opt/phantom/bin/worker_kill expected '0o4770', but actual is '0o770'
Writing diagnostics JSON.

Done.
JSON is located at /opt/phantom/private/phantom_logs_2023-04-13-1943/diag.json.
Copying the requested logs to /opt/phantom/private/phantom_logs_2023-04-13-1943.
Executing command: rsync -a --no-compress /opt/phantom/var/log/phantom /opt/phantom/private/phantom_logs_2023-04-13-1943/phantom_home/var/log --include=*/spawn.log --include=*/actiond.log --include=*/actiond.json.log --include=*/broker_*_localsplunk.log --include=*/app_install.log --include=*/spawn.log.* --include=*/actiond.log.* --include=*/actiond.json.log.* --include=*/broker_*_localsplunk.log.* --include=*/app_install.log.* --exclude=*.* --exclude=*_log.
Compressing logs to /opt/phantom/private/phantom_logs_2023-04-13-1943/phantom_logs_2023-04-13-1943.tgz.
Setting proper file permission attributes on /opt/phantom/private/phantom_logs_2023-04-13-1943/phantom_logs_2023-04-13-1943.tgz.
Log archive is created successfully in /opt/phantom/tmp/shared/phantom_logs_2023-04-13-1943.tgz.
Removing the /opt/phantom/private/phantom_logs_2023-04-13-1943 directory.

Done.
You have mail in /var/mail/phantom
phantom@soar1-i-0294e5a91dd236352:~$

This documentation applies to the following versions of Splunk® SOAR (On-premises): 6.2.0, 6.2.1, 6.2.2, 6.3.0

