Creating a Custom Container for the Deep Learning Toolkit: Splunk + Rapids.ai

The Deep Learning Toolkit (DLTK) was launched at .conf19 with the intention of helping customers leverage additional deep learning frameworks as part of their machine learning workflows. The app ships with four separate containers: TensorFlow 2.0 (CPU), TensorFlow 2.0 (GPU), PyTorch, and spaCy. All of the containers provide a base install of Jupyter Lab and TensorBoard to help customers develop and create neural networks or custom algorithms.

Best Practices for Using Splunk Workload Management

Workload management is a powerful Splunk Enterprise feature that allows you to assign system resources to Splunk workloads based on business priorities. In this blog, I will describe four best practices for using workload management. If you want to refresh your knowledge of this feature or the use cases it solves, please read through our recent series of workload management blogs — part 1, part 2, and part 3.

The Daily Telegraf: Getting Started with Telegraf and Splunk

In this blog post, we discuss using Telegraf as your core metrics collection platform with the Splunk App for Infrastructure (SAI) version 2.0, the latest version of Splunk’s infrastructure monitoring app, which was recently announced at Splunk .conf19. This blog post assumes you already have some familiarity with Telegraf and Splunk. We provide steps and examples to make sense of everything along the way, and there are also links to resources for more advanced workflows and considerations.
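For readers who want a feel for what Telegraf data looks like once it reaches Splunk, here is a minimal sketch of posting a single metric to the HTTP Event Collector (HEC) in roughly the shape Telegraf’s splunkmetric serializer produces. The endpoint URL, token, host name, and metric values are placeholders for illustration, not settings taken from the post.

```python
import json

import requests

# Placeholder values — substitute your own HEC endpoint and token.
HEC_URL = "https://splunk.example.com:8088/services/collector"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

# A single metric event, shaped the way Telegraf's splunkmetric
# serializer emits data for HEC: "metric_name" and "_value" carry
# the measurement, and the remaining keys in "fields" become
# dimensions you can filter on in SAI.
payload = {
    "time": 1579600000,
    "event": "metric",
    "host": "web-01",
    "source": "telegraf",
    "fields": {
        "metric_name": "cpu.usage_user",
        "_value": 12.5,
        "cpu": "cpu-total",
        "region": "us-west-1",
    },
}

response = requests.post(
    HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    data=json.dumps(payload),
    verify=False,  # only for lab setups with self-signed certificates
)
response.raise_for_status()
print(response.json())  # expect {"text": "Success", "code": 0}
```

In a real deployment, Telegraf’s HTTP output plugin sends this payload for you; the sketch is only meant to show why metrics arriving this way can be queried in SAI by metric_name and the accompanying dimensions.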

CVE-2020-0601 - How to operationalize the handling of vulnerabilities in your SOC

Software vulnerabilities are part of our lives in a digitalized world. If anything is certain, it’s that we will continue to see vulnerabilities in software code! Recently, the CVE-2020-0601 vulnerability, also known as CurveBall or the “Windows CryptoAPI Spoofing Vulnerability”, was discovered and reported by the NSA, and it made headlines. The NSA even shared a Cybersecurity Advisory on the topic. Anthony previously talked about it from a public sector and vulnerability scanner angle.

Using Splunk Attack Range to Test and Detect Data Destruction (ATT&CK T1485)

Data destruction is an aggressive attack technique observed in several nation-state campaigns. This technique, catalogued as MITRE ATT&CK T1485, describes actions of adversaries that may “...destroy data and files on specific systems or in large numbers on a network to interrupt availability to systems, services, and network resources. Data destruction is likely to render stored data irrecoverable by forensic techniques through overwriting files or data on local and remote drives”.

Too Many Security Alerts, Not Enough Time: Automation to the Rescue

It’s 2020, which means it’s time to look back at 2019 and reminisce about the good times – fun with family and friends, good food, travel, and memories to last a lifetime. Who am I kidding? Everyone remembers the bad stuff. The increasing impacts of climate change; relentless fires in the Amazon, California, and Australia; political and social unrest around the globe; and the last season of Game of Thrones. Jon Snow... you still know nothing.

Splunk named Orange Business Services 'Digital and Data Partner of the Year'

With 2020 now well underway and the end of our financial year just around the corner, it’s a great moment to review some of the successes we have had in the EMEA Partner team over the past year. One particular highlight for us came in December when Orange Business Services, the digital transformation arm of Orange, named Splunk as its Digital & Data Partner of the Year at its annual awards in Paris.

Q&A Follow-Up: How DATEV uses MITRE ATT&CK & Splunk in its SOC

Hey everyone! We recently did a webinar with Christian Heger, technical head of the DATEV SOC, and Sebastian Schmerl, head of cyber defense at Computacenter. They shared their six-month path of modernizing their security operations with the help of Splunk technology and the MITRE ATT&CK framework. As we weren’t able to address all of the questions during the webinar, we answered them afterwards and are sharing them in this blog post as a Q&A follow-up.

Self-Service Analytics for the Shop Floor [Part I] - Splunk Core Concepts

Despite the hype around predictive maintenance, basic data collection and analysis are still high priorities for manufacturing companies and key criteria for the success of Industrial Internet of Things (IIoT) projects. It is crucial that the people who are most familiar with industrial assets, such as process or control engineers, have direct access to industrial data. That way, issues such as breakdowns can be resolved quickly.