
Latest Posts

Why Your Telemetry (Observability) Pipelines Need to Be Responsive

At Mezmo, we consider Understand, Optimize, and Respond to be the three tenets that help you control telemetry data and maximize the value derived from it. We have previously discussed data understanding and optimization in depth; this blog discusses the need for responsive pipelines and what it takes to design them.

How Data Profiling Can Reduce Burnout

Burnout is one of the most common sentiments across the industry, let alone the world. It is so prevalent that the World Health Organization (WHO) estimates it costs the global economy $1 trillion a year, and a Gallup poll equated that to $3,400 lost for every $10,000 of salary due to lost productivity. This problem isn't ending anytime soon either, with the global cybersecurity industry alone facing a talent shortage of 4 million people.

Data Optimization Technique: Route Data to Specialized Processing Chains

In most situations, you will have several sources of telemetry data that you want to send to multiple destinations, such as storage locations and observability tools. In turn, the data that you are sending needs to be optimized for its specific destination. If your data contains Personally Identifying Information (PII), for example, it will need to be redacted or encrypted before reaching its destination.
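As a rough illustration of that routing idea, here is a minimal sketch in Python. The chain names, event shape, and redaction rule are hypothetical stand-ins, not Mezmo's API:

```python
import re

# Illustrative processing steps; names and event fields are assumptions.
def redact_pii(event):
    # Mask anything that looks like an email address before any destination sees it.
    event["message"] = re.sub(r"[\w.+-]+@[\w-]+\.\w+", "[REDACTED]", event["message"])
    return event

def to_archive(event):
    print("archive <-", event)        # stand-in for cheap object storage
    return event

def to_observability(event):
    print("observability <-", event)  # stand-in for an analysis tool
    return event

# Each source gets its own specialized chain; PII-bearing sources are
# redacted before the data fans out to destinations.
CHAINS = {
    "billing": [redact_pii, to_archive, to_observability],
    "infra":   [to_observability],
}

def route(event):
    for step in CHAINS.get(event["source"], [to_archive]):
        event = step(event)

route({"source": "billing", "message": "invoice sent to jane@example.com"})
```

The point of the pattern is that optimization logic lives in the chain, not the source: adding a new destination with different requirements means registering a new chain, not rewriting every producer.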

Data Privacy Takeaways from Gartner Security & Risk Summit

A couple of weeks back, I had the opportunity to participate in the Gartner Security and Risk Summit held in National Harbor, MD. While my colleague, April Yep, has already shared insights on the sessions she attended, this blog will delve into the emerging data privacy concerns and explore how telemetry pipelines can effectively tackle these challenges. Two key drivers behind current privacy concerns are the adoption of Gen AI and increasing government regulations.

Mastering Telemetry Pipelines: Driving Compliance and Data Optimization

I had the opportunity to present with Michael Fratto, Senior Research Analyst at S&P Global Market Intelligence, at a virtual event hosted by Redmond. We discussed how telemetry pipelines are critical to controlling telemetry data (logs, metrics, events, and traces). Mike shared excellent insights from his recent research survey on the proliferation of observability tools in enterprises and the challenges organizations face in managing those tools.

A Recap of Gartner Security and Risk Summit: GenAI, Augmented Cybersecurity, Burnout

Last week, on June 3–5, I attended the Gartner Security and Risk Summit in National Harbor, MD, to learn about the latest trends and happenings in security. One thing was clear: artificial intelligence (AI) is the hot topic, along with the growing cybersecurity staff shortage due to burnout and a lack of talent.

Why Telemetry Pipelines Should Be A Part Of Your Compliance Strategy

In 2023, global regulatory fines exceeded a colossal $10.5 billion, and that is not an isolated story. For the past few years, data, privacy, and industry-specific regulations have been getting stricter, enforcement has become more rigorous, and non-compliance fines have gone through the roof. Just look at this list on CSO Online of the biggest data breaches and the subsequent fines that companies like Meta, Amazon, and Equifax have faced in recent history.

Telemetry Data Compliance Module

Telemetry data sent from applications often contains Personally Identifying Information (PII) such as names, user IDs, and phone numbers. To comply with corporate or government policies such as HIPAA in the US or the GDPR in the EU, this information must be obfuscated before the data is sent to storage or observability tools.
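As a rough sketch of what the obfuscation step does, consider the snippet below. The patterns are illustrative only; a production compliance module would cover far more PII types and use vetted rules:

```python
import re

# Illustrative PII patterns; real compliance rules would be far more thorough.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.\w+"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def obfuscate(message: str) -> str:
    # Replace each match with a type-tagged placeholder so downstream
    # tools can still tell that a value was present.
    for label, pattern in PII_PATTERNS.items():
        message = pattern.sub(f"<{label}:redacted>", message)
    return message

print(obfuscate("Reach Jane at jane@example.com or 555-123-4567."))
# -> Reach Jane at <email:redacted> or <phone:redacted>.
```

Running this step inside the pipeline, before any destination, means no storage location or observability tool ever receives the raw values.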

Pipeline Module: Event to Metric

At the most abstract level, a data pipeline is a series of steps for processing data, where the type of data being processed determines the types and order of the steps. In other words, a data pipeline is an algorithm, and standard data types can be processed in a standard way, just as solving an algebra problem follows a standard order of operations.
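To make the "standard data, standard steps" idea concrete, here is a minimal event-to-metric sketch; the event fields and the metric shape are assumptions for illustration, not a specific product's format:

```python
from collections import Counter

# Collapse individual events into metric data points. The "service" and
# "status" fields and the metric name are illustrative assumptions.
def events_to_metric(events):
    counts = Counter((e["service"], e["status"]) for e in events)
    return [
        {"name": "responses_total",
         "tags": {"service": svc, "status": status},
         "value": n}
        for (svc, status), n in counts.items()
    ]

events = [
    {"service": "api", "status": 200},
    {"service": "api", "status": 500},
    {"service": "api", "status": 200},
]
print(events_to_metric(events))
# -> [{'name': 'responses_total', 'tags': {...'status': 200}, 'value': 2}, ...]
```

The aggregation step is the whole module in miniature: many verbose events go in, and a handful of cheap, queryable data points come out.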

OpenTelemetry: The Key To Unified Telemetry Data

OpenTelemetry (OTel) is an open-source framework that standardizes and automates telemetry data collection, enabling you to collect, process, and export telemetry data from your systems to any vendor. Telemetry data has traditionally lived in disparate formats; OTel serves as a universal standard that supports data management and portability.
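For a flavor of what that looks like in practice, here is a minimal tracing setup using the OpenTelemetry Python SDK. It exports spans to the console for demonstration; a real deployment would typically swap in an OTLP exporter pointed at a collector or vendor backend:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Wire up a provider that batches finished spans and prints them to the console.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("example.app")

# Any code instrumented this way emits spans in the same vendor-neutral format.
with tracer.start_as_current_span("handle-request") as span:
    span.set_attribute("http.request.method", "GET")
```

Because the instrumentation API is decoupled from the exporter, changing where the data goes is a configuration change, not a code rewrite, which is exactly the portability the standard is after.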