
Latest Posts

OpenTelemetry: The Key To Unified Telemetry Data

OpenTelemetry (OTel) is an open-source framework that standardizes and automates telemetry data collection, enabling you to collect, process, and distribute telemetry from your systems in a vendor-neutral way. Telemetry data has traditionally come in disparate, vendor-specific formats; OTel serves as a universal standard that supports data management and portability.

What's New With Mezmo: Real-Time Alerting

Here at Mezmo, we see the purpose of a telemetry pipeline as helping you ingest, profile, transform, and route data to control costs and drive actionability. There are many ways to do that, as we’ve previously discussed in our blogs, but today I’m going to talk about real-time alerting on data in motion: yes, on streaming data, before it reaches its destination.

Webinar Recap: Mastering Telemetry Pipelines - A DevOps Lifecycle Approach to Data Management

In our webinar, Mastering Telemetry Pipelines: A DevOps Lifecycle Approach to Data Management, hosted by Mezmo’s Bill Balnave, VP of Technical Services, and Bill Meyer, Principal Solutions Engineer, we showcased a unique data-engineering approach to telemetry data management that comprises three phases: Understand, Optimize, and Respond.

Open-source Telemetry Pipelines: An Overview

Imagine a well-designed plumbing system with pipes carrying water from a well, a reservoir, and an underground storage tank to various rooms in your house. It will have valves, pumps, and filters to ensure the water is of good quality and is supplied at adequate pressure. It will also have pressure gauges installed at key points to monitor whether the system is functioning efficiently. From time to time, you will check the pressure, test water purity, and look for issues across the system.

SRECon Recap: Product Reliability, Burn Out, and more

I recently attended SRECon in San Francisco on March 18 - 20, a conference for engineers who care deeply about site reliability, systems engineering, and working with complex distributed systems at scale. While there were a lot of talks, I’ll focus on the few areas that gave me the most insight into how having the right data impacts an SRE’s and an organization’s success.

Webinar Recap: How to Manage Telemetry Data with Confidence

In our recent webinar hosted by Bill Balnave, VP of Technical Services, and Brandon Shelton, our Solution Architect, we discussed how data's continuous growth and dynamic nature cause DevOps and security teams to lose confidence in their data. Uncertainty about the content of telemetry data, concerns about its completeness, and worries about sending sensitive PII in data streams all reduce trust in the collected and distributed data.

Webinar Recap: Myths and Realities in Telemetry Data Handling

Telemetry data is growing exponentially, but the business value isn’t increasing at a similar pace. Getting the right telemetry data is hard, so I recently had a conversation with Matt Aslett, Director of Research at Ventana Research, now a part of ISG, about five myths and realities in telemetry data handling.

Managing Telemetry Data Overflow in Kubernetes with Resource Quotas and Limits

One of the inherent challenges you'll face when working with Kubernetes is that a typical cluster includes many resources that produce telemetry data. Because producing and moving telemetry data itself consumes resources, you can end up in situations where different workloads compete for the resources necessary to manage telemetry data.
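As a sketch of the approach the title describes, a ResourceQuota can cap the aggregate CPU and memory a telemetry namespace may claim, while a LimitRange supplies per-container defaults so no single agent runs unbounded. The namespace name and all numbers below are illustrative assumptions, not values from the post:

```yaml
# Cap total resources for telemetry workloads in one (hypothetical) namespace.
apiVersion: v1
kind: ResourceQuota
metadata:
  name: telemetry-quota
  namespace: observability   # illustrative namespace for log/metric agents
spec:
  hard:
    requests.cpu: "2"
    requests.memory: 4Gi
    limits.cpu: "4"
    limits.memory: 8Gi
---
# Default per-container requests/limits so agents without explicit settings
# still count sanely against the quota above.
apiVersion: v1
kind: LimitRange
metadata:
  name: telemetry-defaults
  namespace: observability
spec:
  limits:
    - type: Container
      default:            # applied when a container sets no limits
        cpu: 500m
        memory: 512Mi
      defaultRequest:     # applied when a container sets no requests
        cpu: 100m
        memory: 128Mi
```

With both objects in place, a telemetry agent that tries to exceed the namespace budget is rejected at admission time rather than starving application workloads at runtime.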