
IoT Sensor Data into Graylog: A Lab Guide

Graylog has always been associated with log management, metrics, SIEM, and security monitoring—but it’s also a great tool for creative, low-cost experiments in a home lab. I wanted to use it for real-world sensor data, so I built a DIY temperature and humidity monitor using an ESP-WROOM-32 development board and a DHT22 sensor.
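To illustrate the ingest side of a setup like this, here is a minimal sketch in Python (standing in for the device firmware) that packages a sensor reading as a GELF 1.1 message and sends it over UDP. The hostname, port, and field names are assumptions for the example; it presumes a GELF UDP input is running on the Graylog server.

```python
import json
import socket

GRAYLOG_HOST = "graylog.lab.local"  # hypothetical lab hostname
GRAYLOG_PORT = 12201                # common default port for a GELF UDP input

def gelf_sensor_message(temperature_c, humidity_pct, sensor="dht22-livingroom"):
    """Build a minimal GELF 1.1 payload carrying one sensor reading."""
    return json.dumps({
        "version": "1.1",
        "host": sensor,
        "short_message": f"temp={temperature_c}C humidity={humidity_pct}%",
        # Custom fields in GELF are prefixed with an underscore.
        "_temperature_c": temperature_c,
        "_humidity_pct": humidity_pct,
    }).encode("utf-8")

def send_reading(temperature_c, humidity_pct):
    """Fire one reading at the Graylog GELF UDP input."""
    payload = gelf_sensor_message(temperature_c, humidity_pct)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (GRAYLOG_HOST, GRAYLOG_PORT))
```

Because the custom fields arrive as structured GELF data rather than free text, they land in Graylog as searchable fields without any extraction rules.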

How to Use MCP to Optimize Your Graylog Security Detections

Security teams face a critical question: “What logs should we collect, and what detections should we enable to protect against threats targeting our industry?” For a bank in the northeast, this isn’t academic. Threat groups like FIN7, Lazarus Group, and Carbanak specifically target financial institutions with sophisticated attacks ranging from SWIFT compromise to ransomware.

Understanding How a Log Correlation Engine Enables Real-Time Insights

Tax season is notorious for being most people’s least favorite time of year. For people who complete their own tax returns, the process becomes an agonizing cycle of sifting through small pieces of paper, matching numbers to the lines that ask for them, and comparing the various inputs. In essence, doing your taxes turns you into a correlation engine. Now, imagine applying this tedious process to the terabytes of data that your environment generates daily.
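To make the idea concrete, here is a toy correlation rule in Python: flag a source that produces several failed logins followed by a success inside a short window, a classic brute-force pattern. The event shape and thresholds are invented for illustration; a real correlation engine applies rules like this continuously across live streams.

```python
from datetime import datetime, timedelta

# Hypothetical, pre-parsed events; real input would come from a log stream.
EVENTS = [
    {"ts": datetime(2024, 4, 1, 9, 0, 0),  "src": "10.0.0.5", "action": "login_failed"},
    {"ts": datetime(2024, 4, 1, 9, 0, 20), "src": "10.0.0.5", "action": "login_failed"},
    {"ts": datetime(2024, 4, 1, 9, 0, 40), "src": "10.0.0.5", "action": "login_failed"},
    {"ts": datetime(2024, 4, 1, 9, 1, 0),  "src": "10.0.0.5", "action": "login_success"},
]

def correlate_bruteforce(events, failures=3, window=timedelta(minutes=2)):
    """Alert on sources with N+ failures followed by a success within the window."""
    alerts = []
    by_src = {}
    for ev in sorted(events, key=lambda e: e["ts"]):
        history = by_src.setdefault(ev["src"], [])
        # Drop failure timestamps that have slid out of the window.
        history[:] = [t for t in history if ev["ts"] - t <= window]
        if ev["action"] == "login_failed":
            history.append(ev["ts"])
        elif ev["action"] == "login_success" and len(history) >= failures:
            alerts.append(ev["src"])
            history.clear()
    return alerts
```

Each individual event here is unremarkable on its own; only the correlation across time turns them into a signal, which is exactly the tax-return matching exercise at machine scale.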

How to Speed Up Incident Response With Guided Remediation

Most teams picture incident response as a linear sprint from alert to resolution. A notification appears, an analyst pivots across screens, a decision gets made, and the workflow moves on. It works, but it is mechanical, tiring, and fragile. Graylog 7.0 aims for something more impactful. Guided remediation gives analysts clarity during the moments when pressure rises and context usually scatters. It takes raw detection data and turns it into a clear path forward. No theatrics.

What Is a Data Pipeline

In today’s tech world, IT and security technologies are the functional equivalent of Pokémon. To gain the insights you need, you “gotta catch ’em all” by ingesting, correlating, and analyzing as much security data as possible. Data pipelines organize chaotic information flows into structured streams, ensuring that data is reliable, processed, and ready for use.
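The ingest-process-deliver shape of a pipeline can be sketched as a chain of generator stages in Python. The stage names and the toy log format are assumptions for the example, not a description of any particular product's pipeline.

```python
def ingest(lines):
    """Stage 1: yield raw log lines (here from an in-memory list)."""
    yield from lines

def parse(records):
    """Stage 2: turn 'LEVEL message' lines into structured dicts."""
    for line in records:
        level, _, message = line.partition(" ")
        yield {"level": level, "message": message}

def filter_errors(records):
    """Stage 3: keep only records worth routing onward."""
    for rec in records:
        if rec["level"] == "ERROR":
            yield rec

RAW = ["INFO service started", "ERROR disk full", "INFO heartbeat"]
pipeline = filter_errors(parse(ingest(RAW)))
print(list(pipeline))  # [{'level': 'ERROR', 'message': 'disk full'}]
```

Because each stage only consumes the previous one's output, stages can be swapped, reordered, or scaled independently, which is the structural property that makes pipelines manageable at terabyte scale.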

Gobbling Up Insights: Graylog 7.0 Serves Up a Feast

A feast of new features. A cornucopia of new capabilities. A banquet of breakthroughs (and the T-day puns are just getting started). Graylog 7.0 brings a full plate of advancements that help security teams cut through noise, control cloud costs, and respond with confidence. We’re serving practical improvements across dashboards, automation, and AI support so analysts can focus on action instead of manual effort.

Sliding Through Log-Time Space

This post kicks off a new series written by the Graylog Development Team. In these updates, we’ll highlight the features and fixes that make daily work in Graylog smoother. We want to showcase the work we care so much about and share the challenges we faced and overcame. Today, we’re starting with one of those minor but useful enhancements: Graylog time-range stepping.
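The core idea of time-range stepping can be sketched in a few lines of Python: slide a search window forward or backward by whole multiples of its own length. This is a simplified illustration of the concept, not the Graylog implementation.

```python
from datetime import datetime, timedelta

def step_range(start, end, steps=1):
    """Slide a time range by whole multiples of its own length.

    steps=+1 moves one window forward, steps=-1 one window back,
    so a one-hour window always stays a one-hour window.
    """
    shift = (end - start) * steps
    return start + shift, end + shift

# Step a one-hour window back to the previous hour.
start = datetime(2024, 4, 1, 10, 0)
end = datetime(2024, 4, 1, 11, 0)
prev_start, prev_end = step_range(start, end, steps=-1)
```

Keeping the window length fixed while stepping is what makes side-by-side comparisons ("this hour vs. the previous hour") meaningful.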

Understanding Incident Response vs Incident Remediation

At a high level, incident remediation is part of the incident response process. An incident response plan manages the incident lifecycle across planning, detection, investigation, and recovery. Meanwhile, incident remediation focuses on identifying root causes and implementing measures to prevent future occurrences.

Caddy Webserver Data in Graylog

If you’re running Caddy Webserver on Ubuntu, Graylog now has a new way to make your access logs more actionable without tedious parsing or manual setup. The new Caddy Webserver Content Pack, available in Illuminate 6.4 with a Graylog Enterprise or Graylog Security license, delivers ready-to-use parsing rules, streams, and dashboards so you can quickly turn raw logs into structured, searchable insights.
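To give a feel for what "structured, searchable insights" means here, the sketch below flattens one JSON access-log entry into dashboard-ready fields. The sample line is a trimmed-down approximation of Caddy's structured access-log output, not a complete record, and the chosen output fields are illustrative.

```python
import json

# A simplified line modeled on Caddy's JSON access-log format (fields trimmed).
LINE = json.dumps({
    "ts": 1712000000.0,
    "status": 404,
    "duration": 0.0042,
    "request": {"method": "GET", "uri": "/missing", "remote_ip": "203.0.113.9"},
})

def extract_fields(raw):
    """Flatten the bits a dashboard typically charts from one log entry."""
    entry = json.loads(raw)
    req = entry.get("request", {})
    return {
        "status": entry.get("status"),
        "duration_ms": round(entry.get("duration", 0.0) * 1000, 1),
        "method": req.get("method"),
        "uri": req.get("uri"),
        "client": req.get("remote_ip"),
    }
```

The content pack ships equivalent extraction as ready-made pipeline rules, so fields like status and client address are immediately available for streams and widgets without writing this kind of code yourself.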