Exporting Splunk dashboards with custom tokens

In the previous Splunk post, we went through the logic that allows us to export dashboards and store them on a remote storage server.

But there was a catch: the API endpoint we used for generating a PDF out of a chosen dashboard does not provide a way to insert custom tokens (if the dashboard uses any) - so we were pretty much limited to 'static' dashboards that get generated once reports are run. That is still pretty powerful and useful, since we can run reports on a scheduled basis and get near-real-time dashboards.

What if we have a dashboard that runs 'inline' queries and waits for user input - like time/date, usernames, source IPs, etc.?

Let's break it down.

Say we have this dashboard:

<form version="1.1" theme="dark" refresh="60">
  <label>Example dashboard</label>
  <description>Desc</description>
  <fieldset submitButton="false">
    <input type="time" token="time-picker">
      <label></label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <title>Example 1</title>
        <search>
          <query>
            <![CDATA[index=windows (EventCode="636" OR EventCode="4732") | fieldformat Time=strftime(Time, "%d.%m.%Y. %H:%M:%S") | table _time Subject_Account_Name signature Group_Name Member_Account_Name user Subject_Account_Domain | rename _time AS Time, Subject_Account_Name AS "Action by", Group_Name AS "Group Name", Subject_Account_Domain AS Domain, signature AS Description | sort -Time]]>
          </query>
          <earliest>$time-picker.earliest$</earliest>
          <latest>$time-picker.latest$</latest>
        </search>
        <option name="drilldown">cell</option>
        <drilldown>
          <set token="username_token">$click.value2$</set>
          <link target="_blank">search?q=<![CDATA[urlEncoded drilldown query...]]>
          </link>
        </drilldown>
      </table>
    </panel>
  </row>
</form>

If we open this kind of dashboard, by default it shows us events generated in the last 24h, but we can set a custom time frame using the tokens $time-picker.earliest$ and $time-picker.latest$.

If we were to give this dashboard to our previous script, it would complain with "unknown earliest time", as it could not parse these tokens. So here is a modified version that takes any XML dashboard, runs the search, waits for the results, and spits out a PDF.

#!/usr/bin/python

# pdfgenerator v2 - using static xml and passing custom tokens for earliest/latest time-picker vars

import requests
import datetime
import os

now = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
secret = os.environ.get('SplunkPass')

with open("/full/path/dashboard.xml", encoding="utf-8") as xml_file:
    xml_dashboard = xml_file.read()
    # Change tokens
    xml_dashboard = xml_dashboard.replace("$time-picker.earliest$","-24h@h")
    xml_dashboard = xml_dashboard.replace("$time-picker.latest$","now")

report_timestamp = ("/opt/splunkexports/daily_" + now + ".pdf")

#Load custom xml dashboard and change pdf footer img/logo
#search can be any app inside splunk and img is located at /opt/splunk/etc/apps/search/appserver/static/

params = {
    'input-dashboard-xml': xml_dashboard.encode(),
    'include-splunk-logo': 1,
    'pdf.logo_path': 'search:img.png'
}

response = requests.post(
    'https://your-splunk-instance.tld:8089/services/pdfgen/render',
    auth=('your-splunk-username', secret),
    verify=False,
    params=params,
)

# Maybe play with some error handling if you want to ensure continuity
if response.status_code == 200:
    with open(report_timestamp, 'wb') as pdffile:
        pdffile.write(response.content)

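As an aside, the "error handling" comment in the script can be fleshed out. A minimal sketch - note that `render_with_retry` is a hypothetical helper of our own, not a Splunk or requests API:

```python
import time

def render_with_retry(do_request, attempts=3, delay=5):
    """Call do_request() until it returns, retrying on any exception.

    do_request is a zero-argument callable, e.g. a lambda wrapping the
    requests.post() call from the script above. If every attempt fails,
    the last exception is re-raised.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return do_request()
        except Exception as err:
            last_error = err
            time.sleep(delay)
    raise last_error
```

Wrapping the POST like `render_with_retry(lambda: requests.post(...))` means a transient timeout or a splunkd restart does not kill a scheduled export run.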
Here we did a few things:

  1. We opened a locally saved XML dashboard (same as the one inside our Splunk instance) and changed the tokens to the time frame we want at the time of export,
  2. We changed the parameter to 'input-dashboard-xml',
  3. We changed the Splunk logo in the PDF's footer to our own image/logo.
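Step 1 can be generalized: instead of hard-coding one `.replace()` call per token, a small helper (hypothetical, our own - not part of any Splunk API) can fill any number of tokens from a dict:

```python
def fill_tokens(xml, tokens):
    """Fill $token$ placeholders in a dashboard XML string.

    `tokens` maps token names (as the dashboard spells them, e.g.
    "time-picker.earliest") to the literal values to substitute.
    """
    for name, value in tokens.items():
        xml = xml.replace("$" + name + "$", value)
    return xml
```

That way one script can serve dashboards with any mix of time pickers, hostname tokens, username tokens, and so on.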

What does this allow us to do?

Well, for example, we can programmatically get exports for any wanted time frame, using only one dashboard - or no Splunk dashboard at all. This runs in real time when issued, with no previously scheduled reports, making Splunk use fewer resources overall.
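For instance, to backfill one export per day, we can compute absolute earliest/latest pairs and substitute each pair into the time-picker tokens before calling the render endpoint. A sketch (the `daily_ranges` helper is hypothetical; the `%m/%d/%Y:%H:%M:%S` format is Splunk's default absolute-time format for earliest/latest):

```python
import datetime

def daily_ranges(days, today=None):
    """Return (earliest, latest) pairs covering each of the last `days`
    full days, oldest first, formatted as Splunk absolute times."""
    today = today or datetime.date.today()
    ranges = []
    for offset in range(days, 0, -1):
        day = today - datetime.timedelta(days=offset)
        nxt = day + datetime.timedelta(days=1)
        ranges.append((day.strftime("%m/%d/%Y:00:00:00"),
                       nxt.strftime("%m/%d/%Y:00:00:00")))
    return ranges
```

Each pair would replace $time-picker.earliest$ and $time-picker.latest$ in one render call, giving one PDF per day.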

We can also do a bunch of different exports for different users, organizational units, assets, etc...

Let's say you have to preserve a monthly export of events generated by your critical infrastructure, in a readable dashboard style, for audit purposes - and you have a bunch of CII assets. We can do something like this:

#!/usr/bin/python

# pdfgenerator v2.1 - using static xml and passing custom tokens

import requests
import datetime
import json
import os

now = datetime.datetime.now().strftime("%Y-%m-%d")
secret = os.environ.get('SplunkPass')

with open("/full/path/dashboard.xml", encoding="utf-8") as xml_file:
    xml_template = xml_file.read()
    # Change time tokens once - they are the same for every asset
    xml_template = xml_template.replace("$time-picker.earliest$", "-30d@d")
    xml_template = xml_template.replace("$time-picker.latest$", "now")

with open("ciiList.json", encoding="utf-8") as json_file:
    cii_list = json.loads(json_file.read())

for cii_hostname in cii_list:

    # Fill the hostname token on a fresh copy each run - replacing in
    # place would consume $hostname$ on the first pass and every later
    # asset would keep the first hostname
    xml_dashboard = xml_template.replace("$hostname$", cii_hostname["hostname"])

    report_timestamp = "/opt/splunkexports/" + cii_hostname["hostname"] + "_" + now + ".pdf"

    params = {
        'input-dashboard-xml': xml_dashboard.encode(),
        'include-splunk-logo': 1,
        'pdf.logo_path': 'search:img.png'
    }

    response = requests.post(
        'https://your-splunk-instance.tld:8089/services/pdfgen/render',
        auth=('your-splunk-username', secret),
        verify=False,
        params=params,
    )

    # Maybe play with some error handling if you want to ensure continuity
    if response.status_code == 200:
        with open(report_timestamp, 'wb') as pdffile:
            pdffile.write(response.content)

This generates a unique PDF for each CII asset from the .json list.

If we have 'generator1, generator2, hvac' in our CII list, this would generate 3 PDFs - each named uniquely after the CII asset name plus a timestamp, and each containing events about that asset.
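For reference, this is the shape of ciiList.json the loop assumes - a list of objects with a "hostname" key, to match the `cii_hostname["hostname"]` lookup (the sample below and the date in the filenames are made up for illustration):

```python
import json

# Example content of ciiList.json - one object per CII asset
sample = """
[
  {"hostname": "generator1"},
  {"hostname": "generator2"},
  {"hostname": "hvac"}
]
"""

cii_list = json.loads(sample)

# Same filename scheme as the script: asset name + timestamp
filenames = ["/opt/splunkexports/" + c["hostname"] + "_2024-01-31.pdf"
             for c in cii_list]
```

Extra keys per object (owner, organizational unit, a dedicated dashboard path) slot in naturally if you later want different dashboards per asset.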

Moore's law is the observation that the number of transistors in an integrated circuit doubles about every two years.
Splunk's law is the observation that their need for overcomplicating things doubles about every release.

Hopefully this helps you in your future endeavors.

Stay healthy, stay safe,
Cheers,
bigfella