Analytics

yellowfin

How to create a data culture through data storytelling

One of the core reasons organizations invest in analytics solutions is that they want to get everyone on the same page. They want everyone to understand what's happening and why, so that individuals know what they need to do to be successful and drive outcomes for the organization.

looker

5 Ways to Kickstart a Data-Driven Company Culture in 2020

If you’ve been looking for ways to get your colleagues excited about becoming more data-driven this year, you’ve landed on the right blog post. Getting people on board with data has a lot to do with two terms that were thrust into the limelight in 2019: ‘data culture’ and ‘data literacy’. In talking to customers about these initiatives, I found that (1) many felt they should have a data culture, and (2) not everyone knew what that meant.

cloudera

Real-time log aggregation with Flink Part 1

Many of us have experienced the feeling of hopelessly digging through log files on multiple servers to fix a critical production issue. We can probably all agree that this is far from ideal. Locating and searching log files is even more challenging when dealing with real-time processing applications where the debugging process itself can be extremely time-sensitive.

graylog

Implementing Geolocation with Graylog Pipelines

Geolocation can be built into the Graylog platform automatically by using the "GeoIP Resolver" plugin with a MaxMind database. However, you can extract even more meaningful and useful data by leveraging pipelines and lookup tables; these features allow you to do much more than the basic plugin does.
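For example, once a lookup table backed by a MaxMind database is wired up, a single pipeline rule can enrich incoming messages with geolocation fields. The rule below is only a minimal sketch: the lookup table name ("geoip") and the source field ("src_ip") are placeholder assumptions, and the exact keys the lookup returns depend on how the data adapter is configured.

```
rule "geoip: enrich src_ip"
when
  has_field("src_ip")
then
  // "geoip" is an assumed lookup table name backed by a MaxMind database
  let geo = lookup(lookup_table: "geoip", key: to_string($message.src_ip));
  // copy the returned values onto the message with a prefix,
  // e.g. src_ip_geo_country, src_ip_geo_city, ...
  set_fields(fields: geo, prefix: "src_ip_geo_");
end
```

Attach the rule to a stage in a pipeline connected to the relevant stream, and inspect a processed message to see which fields the lookup actually produced before building dashboards on them.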

splunk

Creating a Custom Container for the Deep Learning Toolkit: Splunk + Rapids.ai

The Deep Learning Toolkit (DLTK) was launched at .conf19 with the intention of helping customers leverage additional deep learning frameworks as part of their machine learning workflows. The app ships with four separate containers: TensorFlow 2.0 (CPU), TensorFlow 2.0 (GPU), PyTorch, and spaCy. All of the containers provide a base install of JupyterLab and TensorBoard to help customers develop neural nets or custom algorithms.

unravel

Unravel Earns Prestigious SOC 2 Security Certification

Security is top of mind for every enterprise these days. There are so many threats they can hardly be counted, but one commonality exists: data is always the target. Unravel’s mission is to help organizations better understand and improve the performance of their data-based applications. We’re a data business, so we appreciate the scope and implications of these threats.

elastic

Elastic on Elastic: Embracing our own technology

When making investments in our tech stack, we tend to have doubts about companies that don’t use their own products and services. At Elastic, we deploy the full suite of our technology across the enterprise. We do so because our technology not only works, but also makes us more efficient and flexible on so many levels. It can do the same for you and your business, too.

xpolog

DevOps Metrics: 7 KPIs to Evaluate Your Team's Maturity

Measuring the maturity of your DevOps team might sound difficult, but it isn’t at all. Simple key performance indicators (KPIs), such as the deployment success rate or mean time between failures (MTBF), give a good indication of how mature your team is. By “mature,” I mean that your team consistently and smoothly operates at a high level and can deploy several times a day with very little risk.
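Both of these KPIs are easy to compute from a simple deployment log. The snippet below is a minimal, tool-agnostic sketch in Python; the record format and numbers are made up for illustration, and a real pipeline would pull these events from your CI/CD or incident system.

```
from datetime import datetime

# Hypothetical deployment records: (timestamp, succeeded?)
deployments = [
    (datetime(2020, 3, 2, 9, 15), True),
    (datetime(2020, 3, 2, 14, 40), False),
    (datetime(2020, 3, 3, 11, 5), True),
    (datetime(2020, 3, 4, 10, 30), False),
    (datetime(2020, 3, 4, 16, 0), True),
]

# Deployment success rate: successful deployments / total deployments
success_rate = sum(1 for _, ok in deployments if ok) / len(deployments)

# Mean time between failures, here simplified to the average gap (in hours)
# between consecutive failed deployments
failure_times = sorted(ts for ts, ok in deployments if not ok)
gaps = [(b - a).total_seconds() / 3600 for a, b in zip(failure_times, failure_times[1:])]
mtbf_hours = sum(gaps) / len(gaps) if gaps else float("inf")

print(f"Deployment success rate: {success_rate:.0%}")   # 60%
print(f"MTBF: {mtbf_hours:.1f} hours")                  # 43.8 hours
```

Note that this treats MTBF as the average gap between consecutive failures, which is a common simplification; use whatever definition your team has agreed on and track the trend over time rather than any single number.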