
Logging Kubernetes on GKE with the ELK Stack and Logz.io

An important element of operating Kubernetes is monitoring. Hosted Kubernetes services simplify the deployment and management of clusters, but the task of setting up logging and monitoring is mostly up to us. Yes, Kubernetes offers built-in monitoring plumbing that makes it easier to ship logs to either Stackdriver or the ELK Stack, but these two endpoints, as well as the data pipeline itself, still need to be set up and configured.

Rethink Analytics: Don't be fooled by cloud washing

The choices facing today's enterprise executives are far more complex than whether to adopt the cloud or not - that is mostly decided. The question is how to do it well. In particular, are the tools they are using to monitor performance and security truly built to run and scale in their cloud environment? Many vendors are "cloud washing" customers by simply adding the word "cloud" to their service offerings without truly being able to deliver on their promises.

Open Distro for Elasticsearch Review

Over the years, the adoption of Elasticsearch and its ecosystem of tools has positioned them as leaders in the time-series data management and analysis market. Strong search capabilities, a great analytical engine, Kibana as a flexible frontend, and a range of data shippers make it possible to build an end-to-end data processing pipeline from components designed to work with each other. Simple setup and configuration resulted in high adoption rates, with the whole stack gaining more and more users.

How to debug your Logstash configuration file

Logstash plays an extremely important role in any ELK-based data pipeline but is still considered one of the main pain points in the stack. Like any piece of software, Logstash has a lot of nooks and crannies that need to be mastered to be able to log with confidence. One super-important nook and cranny is the Logstash pipeline configuration file - not the software's settings file (/etc/logstash/logstash.yml), but the .conf file responsible for your data pipeline.
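To make the distinction concrete, here is a minimal sketch of what such a pipeline .conf file looks like, built from the standard input → filter → output sections. The port, host, and grok pattern are illustrative assumptions, not values from the article:

```conf
# pipeline.conf - a hypothetical minimal Logstash pipeline (values are examples)
input {
  beats {
    port => 5044          # assumed: Filebeat shipping to the default Beats port
  }
}

filter {
  grok {
    # assumed log format: parse Apache-style access logs into structured fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # assumed: a local Elasticsearch instance
  }
  stdout { codec => rubydebug }   # echo parsed events for debugging
}
```

A quick sanity check before starting a pipeline is Logstash's built-in syntax validation, e.g. `bin/logstash -f pipeline.conf --config.test_and_exit`, which parses the file and exits without processing any data.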