Splunk Training + Certification

Implementing Splunk Data Stream Processor (DSP)

Course Description

This 4-day course is designed for experienced Splunk administrators who are new to Splunk DSP. This hands-on class provides the fundamentals of deploying a Splunk DSP cluster and designing pipelines for core use cases. It covers installation, source and sink configuration, pipeline design and backup, and monitoring of a DSP environment.

Course Prerequisites

Required:
  • Splunk Enterprise System Administration
  • Splunk Enterprise Data Administration
Recommended:
  • Architecting Splunk Enterprise Deployments
  • Working knowledge of:
    • Distributed system architectures
    • Apache Kafka (user level)
    • Apache Flink (user level)
    • Kubernetes (admin level)

Course Topics

  • Introduction to Splunk Data Stream Processor
  • Deploying a DSP cluster
  • Prepping Sources and Sinks
  • Building Pipelines - Basics
  • Building Pipelines - Deep Dive
  • Working with 3rd party Data Feeds
  • Working with Metric Data
  • Monitoring a DSP Environment

Course Objectives

Module 1 – Introduction to DSP
  • Review Splunk deployment options and challenges
  • Describe the purpose and value of Splunk DSP
  • Understand DSP concepts and terminology
Module 2 – Deploying a DSP Cluster
  • List DSP core components and system requirements
  • Describe installation options and steps
  • Check DSP service status
  • Navigate the DSP UI
  • Use the scloud command-line tool
Module 3 – Prepping Sources and Sinks
  • Ingest data with the DSP REST API service
  • Configure DSP source connections for Splunk data
  • Configure DSP sink connections for Splunk indexers
  • Create Splunk-to-Splunk pass-through pipelines (sketched below)
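
For orientation, a pass-through pipeline of the kind built in this module can be as short as the SPL2 sketch below. The connection ID "splunk_prod" is hypothetical, and the exact source and sink function signatures vary by DSP release, so verify them against the Function Reference for your version.

    | from splunk_firehose()              /* read events arriving on the DSP Firehose */
    | into index("splunk_prod", "main");  /* hypothetical sink: connection "splunk_prod", index "main" */
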
Module 4 – Building Pipelines - Basics
  • Describe the basic elements of a DSP pipeline
  • Create data pipelines with the DSP canvas and SPL2
  • List DSP pipeline commands
  • Use scalar functions to convert data types and schema
  • Filter and route data to multiple sinks (see the sketch below)
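
As a concrete illustration of filtering, the sketch below keeps only syslog events, casts the union-typed body field to a string with a scalar function, and writes to a single sink; routing to multiple sinks is done by branching the pipeline on the canvas. The connection and index names are hypothetical, and ucast and the sink signature should be checked against your version's Function Reference.

    | from splunk_firehose()
    | where source_type = "syslog"              /* keep only syslog events */
    | eval body = ucast(body, "string", null)   /* scalar function: cast the union-typed body to a string */
    | into index("splunk_prod", "network");     /* hypothetical connection and index */
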
Module 5 – Building Pipelines - Deep Dive
  • Manipulate data in the pipeline (see the sketch below):
    • Extract
    • Transform
    • Obfuscate
    • Aggregate and apply conditional triggers
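
For example, an obfuscation step of the kind practiced here might mask sensitive values with a regex replace before the data reaches a sink. This is a minimal sketch: the replace scalar function and /regex/ literal syntax follow the DSP documentation but should be confirmed for your release, and the SSN-style pattern is purely illustrative.

    | from splunk_firehose()
    | eval body = replace(ucast(body, "string", null),
        /\d{3}-\d{2}-\d{4}/, "xxx-xx-xxxx")     /* mask SSN-like values */
    | into index("splunk_prod", "main");
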
Module 6 – Working with 3rd party Data Feeds
  • Read data from and write data to pub-sub systems such as Kafka
  • List sources supported by the DSP Collect service
  • Transform and normalize data from Kafka (see the sketch below)
  • Write data to S3
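
A minimal Kafka-to-Splunk sketch, assuming a pre-configured Kafka connection named "kafka_dev" and JSON payloads: the read_kafka and deserialize_json_object function names follow DSP 1.2 documentation, but both the names and the Kafka record schema should be confirmed for your release. Writing to S3 works the same way, through an S3 connection and its sink function.

    | from read_kafka("kafka_dev", "web_events")    /* hypothetical connection ID and topic */
    | eval value = deserialize_json_object(value)   /* Kafka values arrive as bytes; parse JSON into a map */
    | into index("splunk_prod", "web");
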
Module 7 – Working with Metric Data
  • Onboard metric data into DSP
  • Transform metric data for Splunk indexers and SignalFx
  • Send metric data to Splunk indexers (sketched below)
  • Send metric data to Splunk SignalFx
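
As a sketch of metric routing, the pipeline below filters the firehose down to metric events and sends them to a metrics index; DSP marks metric events with kind = "metric" in the standard event schema. The connection and index names are hypothetical, and a SignalFx destination would swap the index function for the SignalFx connector's sink function.

    | from splunk_firehose()
    | where kind = "metric"                      /* keep only metric events */
    | into index("splunk_prod", "dsp_metrics");  /* hypothetical metrics index */
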
Module 8 – Monitoring a DSP Environment
  • Back up DSP pipelines
  • Monitor the DSP environment
  • Describe steps to isolate DSP service issues
  • Scale DSP
  • Replace DSP master node
  • Upgrade DSP cluster