Operations | Monitoring | ITSM | DevOps | Cloud

June 2018

The Importance of Historical Log Data

Centralized log management lets you control who can access log data without granting them access to the servers themselves. You can also correlate data from different sources, such as the operating system, your applications, and the firewall. Another benefit is that users do not need to log in to hundreds of devices to find out what is happening. You can also use data normalization and enhancement rules to create value for people who might not be familiar with a specific log type.
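
As a minimal sketch of such a normalization and enhancement step, the snippet below parses a made-up firewall log line into common fields and adds a readable severity label. The log format and field names are purely illustrative assumptions, not any vendor's or Graylog's actual schema.

```python
import re

# Hypothetical raw firewall log line (illustrative format only).
RAW = "2018-06-01T12:34:56Z DENY src=203.0.113.7 dst=10.0.0.5 dport=22"

PATTERN = re.compile(
    r"(?P<timestamp>\S+)\s+(?P<action>\w+)\s+"
    r"src=(?P<src_ip>\S+)\s+dst=(?P<dst_ip>\S+)\s+dport=(?P<dst_port>\d+)"
)

def normalize(line: str) -> dict:
    """Turn a vendor-specific log line into a common field schema."""
    match = PATTERN.match(line)
    if not match:
        return {"message": line}          # keep unparsed lines as-is
    fields = match.groupdict()
    fields["dst_port"] = int(fields["dst_port"])
    # Enhancement: add a human-readable label for non-specialist readers.
    fields["severity"] = "alert" if fields["action"] == "DENY" else "info"
    return fields

print(normalize(RAW))
```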

Fishing for Log Events with Graylog Sidecar

Getting the right information at the right time can be a difficult task in large corporate IT infrastructures. Whether you are dealing with a security issue or an operational outage, the right data is key to preventing further breakdowns. With central log management, security analysts and IT operators have a single place to access server log data. But what happens if the one log file that is urgently needed is not collected by the system?

Using Trend Analysis for Better Insights

Centralized log collection has become a necessity for many organizations. Much of the data we need to run our operations and secure our environments comes from the logs generated by our devices and applications. Centralizing these logs creates a large repository of data that we can query to enable various types of analysis. The most common types are conditional analysis and trend analysis. Both have their place, but trend analysis is perhaps the more underutilized source of information.
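
To illustrate the difference, here is a minimal Python sketch using made-up log records; the IP address, event names, and thresholds are illustrative assumptions, not real data. Conditional analysis asks whether a specific condition occurred, while trend analysis looks at how the same signal develops over time.

```python
from collections import Counter
from datetime import date

# Hypothetical, simplified log records: (day, source_ip, event) tuples.
logs = [
    (date(2018, 6, 1), "203.0.113.7", "login_failed"),
    (date(2018, 6, 1), "198.51.100.2", "login_ok"),
    (date(2018, 6, 2), "203.0.113.7", "login_failed"),
    (date(2018, 6, 2), "203.0.113.7", "login_failed"),
    (date(2018, 6, 3), "203.0.113.7", "login_failed"),
    (date(2018, 6, 3), "203.0.113.7", "login_failed"),
    (date(2018, 6, 3), "203.0.113.7", "login_failed"),
]

# Conditional analysis: did a specific condition occur, and how often?
suspicious = [r for r in logs if r[1] == "203.0.113.7" and r[2] == "login_failed"]
print(f"failed logins from 203.0.113.7: {len(suspicious)}")

# Trend analysis: how does the same signal develop over time?
per_day = Counter(day for day, _, event in logs if event == "login_failed")
for day in sorted(per_day):
    print(day, per_day[day])   # a rising daily count is the trend worth acting on
```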