CISA Emergency Directive 19-01: Doing Things the Easy Way in Splunk

On January 22nd, the US Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) released Emergency Directive 19-01.

In essence, this directive summarized recent DNS tampering activity affecting US government (and international) infrastructure.

Key Components of the DNS Tampering Activity

Compromised credentials:
The attacker compromises or gains access to legitimate credentials necessary to administer DNS records relevant to their target.

Altered DNS records:
The attacker alters some or all of the DNS records, replacing the legitimate IP address(es) with one that the attacker controls. This enables the attacker to direct users to their own infrastructure for inspection or modification before redirecting on to the legitimate service.

Valid certificates for traffic decryption:
Since the attacker has authority to update DNS records, they can also obtain valid encryption certificates for the domains they have gained administrative control over. This enables the attackers to decrypt intercepted traffic destined for these domains.

Required Actions

Directive 19-01 also specifies four ‘required actions’ for the heads of Executive Branch agencies and for other entities that operate relevant systems on their behalf.

Action One: This action requires that within 10 business days, agencies audit all public DNS records (on authoritative and secondary DNS servers) for .gov or other agency-managed domains to ensure that they resolve to the intended location. Any that do not should be reported to CISA.

Action Two: This action requires that within 10 business days, the passwords for all accounts on systems that can make changes to each agency’s DNS records be updated.

Action Three: This action requires that within 10 business days, multi-factor authentication (MFA) for all accounts on systems that can make changes to each agency’s DNS records be implemented. [Note: CISA provides additional guidance for cases where MFA cannot be achieved within the allotted time range].

Action Four: This action requires that within 10 business days, agencies begin monitoring Certificate Transparency logs, provided by CISA via the Cyber Hygiene service, and assess whether any newly issued certificates were unauthorized.

Fundamental Approaches

From an implementation perspective, the recommendations in Action Two (password changes) and Action Three (MFA implementation) are relatively straightforward. For Action One and Action Four, auditing DNS records and Certificate Transparency logs, respectively, a bit more consideration is warranted.

High-level perspective on auditing DNS records
At a very basic level, querying public DNS to resolve each of a given agency’s domain names to its corresponding IP address(es) is a first step in evaluating whether the DNS records point somewhere “unexpected.”

Once this list of IP addresses is generated through DNS interrogation, agency administrators can reduce the list of findings by eliminating blocks of “known” IP addresses. The resulting set of IP addresses would provide a good basis for initial assessment of “legitimacy.” Armed with the results of this evaluation, agencies would be well positioned to respond to Action One in the event of any anomalous findings.
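
As a minimal sketch of that elimination step (assuming a hypothetical lookup file named agency_domain_to_ip.csv that already holds domain-to-IP pairs, and a known agency-owned range of 192.0.2.0/24), the “known block” filtering could look something like this in SPL:

| inputlookup agency_domain_to_ip.csv
| where NOT cidrmatch("192.0.2.0/24", ip)
| table domain ip

Anything left in the table after the known ranges are removed is a candidate for closer inspection.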

High-level perspective on auditing Certificate Transparency logs
CISA has committed to begin providing regular updates on newly added certificates, flagged in Certificate Transparency (CT) logs, that correspond to agency domains. These will be delivered to agencies via the CISA Cyber Hygiene Service. Agencies will simply need to implement a recurring business process to review the CT logs, identify any entries for “newly added certificates” that correspond to their domains, and assess whether these new certificates were legitimately added.

Doing Things the Easy Way in Splunk

For auditing DNS records and identifying unusual or unexpected domain name-to-IP address correlation, agencies can leverage basic Splunk functionality: lookup files, the built-in dnslookup external lookup, and some lightweight SPL.

I ran some quick tests over coffee this morning and proved out the following basic approach in Splunk Enterprise:

  1. I needed a recent list of second, third, and fourth-level domains under the .gov TLD

  2. A quick Google search revealed that GSA maintains a listing of .gov domains and subdomains under their GitHub account

    • For testing, I used the domain column of the current-full.csv file and appended the other-websites.csv file to add in third- and fourth-level .gov domains

  3. A little text munging with sed in the shell before ingesting into Splunk left me a nice clean lookup file containing ~21,000 public-facing domains under .gov

  4. With this lookup file in Splunk, I was able to pipe the domain list into dnslookup to automate the DNS resolution and generate a new lookup file containing domain-to-IP address pairs by piping the results to outputlookup (a sketch of this search follows the list below)

  5. With these pairs in-hand, the auditing is simply an exercise of removing addresses you know and investigating the small remainder of unexpected addresses
    • Since I didn’t have immediate visibility into which IP addresses were “normal” and which ones weren’t, I plotted them on a map and found some interesting results
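
As a rough sketch of steps 4 and 5 (the lookup file names and field names here are assumptions; yours will reflect however you ingested the GSA lists), the whole resolution pass boils down to a few lines of SPL:

| inputlookup gov_domains.csv
| lookup dnslookup clienthost AS domain OUTPUT clientip AS ip
| outputlookup gov_domain_to_ip.csv

The built-in dnslookup external lookup handles the resolution, and outputlookup writes the resulting domain-to-IP pairs to a new lookup file for the auditing and dashboarding steps that follow.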

Let’s make things even easier by turning this into a basic dashboard.

Further Simplification with a Basic Splunk Dashboard

My next step was to implement a prototype dashboard that would be a little more efficient for real-world usage.  

This prototype dashboard essentially pulls in the domain-to-IP lookup file that I created earlier with outputlookup and provides a few search fields to make sifting through the data even quicker and easier. These search fields included:

  1. A domain search match - This enables users to enter wildcarded queries to limit the results down to just domains of interest for their analysis (e.g., *.state.gov)

  2. An IP address exclusion field (e.g., testing with 52.200.115.76 exclusion)

  3. A CIDR range exclusion (e.g., testing with 23.0.0.0/8 block exclusion)

As I mentioned earlier in the "Doing Things the Easy Way in Splunk" section of this post, with the above search fields, auditing is simply an exercise of searching for the subdomain you’re interested in, removing “known” CIDR range(s) (or individual IPs) and investigating the remainder of unexpected addresses.
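
A minimal sketch of the dashboard’s base search, assuming the lookup file generated earlier and Simple XML form tokens named domain_tok, ip_excl, and cidr_excl (all hypothetical names), might look like:

| inputlookup gov_domain_to_ip.csv
| search domain=$domain_tok$
| where ip!=$ip_excl|s$ AND NOT cidrmatch($cidr_excl|s$, ip)
| table domain ip

The |s token filter simply wraps the user-supplied values in quotes so the where clause evaluates them as strings.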

Some other key steps for real-world usage that can be easily implemented in Splunk include:

  1. Temporal change analysis for domain-to-IP pairs (see the sketch after this list)
    • This will provide insights into “what’s normal” versus “what’s new” from a DNS pointer perspective
  2. Auditing administrative ‘change,’ ‘modify,’ or ‘delete’ activity related to DNS by internal administrative accounts
    • This will provide insights into “normal” versus “abnormal” domain administrator activity and an easy mechanism to identify potential account takeovers
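
As one example of the temporal piece (a simplified sketch that assumes the resolution search is re-run on a schedule, that a dated copy of the previous run is kept in a hypothetical gov_domain_to_ip_previous.csv, and that each domain maps to a single row), comparing the current and previous runs is just a join:

| inputlookup gov_domain_to_ip.csv
| rename ip AS current_ip
| join type=left domain [| inputlookup gov_domain_to_ip_previous.csv | rename ip AS previous_ip]
| where current_ip != previous_ip OR isnull(previous_ip)
| table domain previous_ip current_ip

Any row that survives is either a brand-new domain or a domain whose resolution changed since the last run, which is exactly the “what’s new” population worth investigating.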

For auditing Certificate Transparency logs, the process is relatively simple in Splunk Enterprise as well:

  1. Working with a couple of colleagues on this one, Andy Price and Patrick Chu, we went to Splunkbase and grabbed the excellent Certificate Transparency Log add-on for Splunk created by Jorrit Folmer. This TA handles the heavy lifting of both collecting the CT logs and converting them to a Splunk-friendly format.

     

  2. The next step was to identify the actual sources of CT logs. Google (and several others) maintains lists of CT log servers. Using these lists, you can identify the API endpoint for each CT log source.
  3. Once you identify which CT log sources you want to collect, go into the Certificate Transparency TA and click the “Create New Input” button. Next, give the input a name, set a polling interval, choose which index you want to use, and finally put in the URL for the CT log API endpoint.

  4. Reviewing the resulting JSON, we focused our quick assessment on log entry type 0 entries, since log entry type 1 indicates a precertificate for a pending certificate issuance.

  5. Using some basic SPL to reduce our set of events, the final step is to add one more line of SPL that looks for certificates issued for just the agency subdomains of interest to your org. The important field to filter on is “LeafCertificate.x509_extensions.subjectAltName{}” (this contains the domain names the certificate is valid for).

 

Putting this all together in SPL looks like this:

index=ca sourcetype=ct:log LogEntryType=0
LeafCertificate.x509_extensions.subjectAltName{}=*.gov
| stats list(source) as "CTL Source" by ssl_issuer_common_name
LeafCertificate.signature_algorithm
LeafCertificate.x509_extensions.subjectAltName{}
LeafCertificate.serial
| dedup LeafCertificate.serial

From here, all you would need to do is update the *.gov in these fields to match your agency’s subdomains. This search will provide a table of all of the certificates issued for your domains. Comparing these certificates to your own records will set you up for easy evaluation of any certificates that you did not request and make reporting results back to CISA as easy as hitting the Export button in Splunk.
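
For instance, assuming your agency’s domains all sit under a hypothetical example.gov, the only change is to the subjectAltName filter at the top of the search:

index=ca sourcetype=ct:log LogEntryType=0
LeafCertificate.x509_extensions.subjectAltName{}=*.example.gov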


Not bad for a half-day’s prototyping…

Hopefully that helps chart an achievable path forward for any organization out there directly subject to CISA Directive 19-01.

Even if you don’t have Splunk, the methods outlined here provide a template of steps that you can follow to gain visibility into anomalous DNS resolutions and certificate registrations.

If you are already a Splunk customer and have the technical bandwidth, we’ve intentionally shared our prototype SPL for you so that you can implement this same approach for your agency with relative ease.

If you are already a Splunk customer and need help, reach out to us at gov_answers@splunk.com

If you don’t have Splunk and want to try it out, you can download a free trial version from our website.

Thanks for reading, hope this helps, and happy Splunking!

Posted by

Anthony Perez

Anthony is Director of Field Technology for Splunk’s public sector headquarters in McLean, Virginia. Prior to joining Splunk, Anthony spent several years at a global consulting firm where he led the development and execution of novel approaches for aggregating, analyzing, and assessing cyber threats to US interests.

Mr. Perez is a graduate of the Whiting School of Engineering at Johns Hopkins University and holds an M.S. in Information Systems specializing in Security.
