
Mezmo Edge Explainer Video

Ensuring access to the right telemetry data - logs, metrics, events, and traces from all applications and infrastructure - is challenging in our distributed world. Teams struggle with a range of data management issues, such as security concerns, data egress costs, and compliance regulations that require specific data to stay within the enterprise. Mezmo Edge is a distributed telemetry pipeline that processes data securely in your environment based on your observability needs.

Data Optimization Technique: Route Data to Specialized Processing Chains

In most situations, you will have several sources of telemetry data that you want to send to multiple destinations, such as storage locations and observability tools. In turn, the data you are sending needs to be optimized for each specific destination. If your data contains Personally Identifiable Information (PII), for example, it will need to be redacted or encrypted before reaching its destination.
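To make the routing idea concrete, here is a minimal Python sketch - not Mezmo's implementation - of per-destination processing chains: one destination requires PII redaction before data leaves, while the archive receives raw events. The chain names, regex patterns, and send() stub are illustrative assumptions.

```python
import re

# Hypothetical PII patterns; real pipelines use richer detection.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b\d{3}[-.]?\d{3}[-.]?\d{4}\b")

def redact_pii(event: dict) -> dict:
    """Redact common PII patterns from the event's message field."""
    msg = event.get("message", "")
    msg = EMAIL_RE.sub("<REDACTED_EMAIL>", msg)
    msg = PHONE_RE.sub("<REDACTED_PHONE>", msg)
    return {**event, "message": msg}

# Each destination gets its own processing chain (a list of steps).
CHAINS = {
    "observability_tool": [redact_pii],  # third party: redact first
    "archive_storage": [],               # archive keeps the raw event
}

def send(destination: str, event: dict):
    """Placeholder sink; a real pipeline would ship to the destination."""
    print(destination, event)

def route(event: dict):
    """Run the event through every destination's chain, then emit it."""
    for destination, chain in CHAINS.items():
        processed = event
        for step in chain:
            processed = step(processed)
        send(destination, processed)

route({"message": "user jane@example.com called 555-123-4567"})
```

Keeping the chains per destination means you can tighten or relax processing for one sink without touching the others.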

Data Privacy Takeaways from Gartner Security & Risk Summit

A couple of weeks back, I had the opportunity to participate in the Gartner Security and Risk Summit held in National Harbor, MD. While my colleague, April Yep, has already shared insights on the sessions she attended, this blog will delve into the emerging data privacy concerns and explore how telemetry pipelines can effectively tackle these challenges. Two key drivers behind current privacy concerns are the adoption of Gen AI and increasing government regulations.

Mastering Telemetry Pipelines: Driving Compliance and Data Optimization

Telemetry (observability) pipelines play a critical role in controlling telemetry data (logs, metrics, events, and traces). However, the benefits of pipelines go well beyond log volume and cost reductions. In addition to using pipelines as pre-processors of data going to observability and SIEM systems, they can be used to support your compliance initiatives. This session will cover how enterprises can understand and optimize their data for log reduction while reducing compliance risk.
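As a taste of the kind of pre-processing the session covers, the sketch below - an illustration under assumed level names and an arbitrary sample rate, not Mezmo's product - reduces log volume by dropping debug logs and sampling info logs while preserving every warning and error.

```python
import random

# Arbitrary assumption: forward only 10% of info-level logs.
INFO_SAMPLE_RATE = 0.10

def reduce_volume(log: dict) -> dict | None:
    """Return the log to forward, or None to drop it."""
    level = log.get("level", "info").lower()
    if level == "debug":
        return None  # drop debug entirely
    if level == "info" and random.random() > INFO_SAMPLE_RATE:
        return None  # sampled out
    return log  # warnings, errors, and sampled info pass through

logs = [
    {"level": "debug", "message": "cache miss"},
    {"level": "info", "message": "request handled"},
    {"level": "error", "message": "upstream timeout"},
]
forwarded = [l for l in logs if reduce_volume(l) is not None]
print(forwarded)
```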

Mastering Telemetry Pipelines: Driving Compliance and Data Optimization

I had the opportunity to present with Michael Fratto, Senior Research Analyst at S&P Global Market Intelligence, at a virtual event hosted by Redmond. We discussed how telemetry pipelines are critical in controlling telemetry data (logs, metrics, events, and traces). Mike shared excellent insights from his recent research survey on the proliferation of observability tools in enterprises and the challenges organizations face in managing those tools.

A Recap of Gartner Security and Risk Summit: GenAI, Augmented Cybersecurity, Burnout

Last week, on June 3-5, I attended the Gartner Security and Risk Summit in National Harbor, MD to learn about the latest trends and happenings in security. One thing was clear: artificial intelligence (AI) is the hot topic, along with the growing cybersecurity staff shortage driven by burnout and a lack of talent.

Why Telemetry Pipelines Should Be A Part Of Your Compliance Strategy

In 2023, global regulatory fines exceeded a colossal $10.5 billion. This is not an isolated story. For the past few years, data, privacy, and industry-specific regulations have been getting stricter, enforcement has become more rigorous, and non-compliance fines are going through the roof. Just look at this CSO Online list of the biggest data breaches and the subsequent fines companies like Meta, Amazon, and Equifax have faced in recent history.

Telemetry Data Compliance Module

Telemetry data sent from applications often contains Personally Identifiable Information (PII) - names, user IDs, phone numbers, and other sensitive fields - that must be obfuscated before the data reaches storage or observability tools in order to comply with corporate or government policies such as HIPAA in the US or the GDPR in the EU.
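As one illustration of such obfuscation - a sketch, not the actual Mezmo compliance module - the snippet below deterministically hashes PII fields so events can still be correlated (grouped by the same user) without exposing raw values. The field names and salt are assumptions.

```python
import hashlib

# Illustrative assumptions: which fields count as PII, and the salt.
SALT = b"rotate-me-regularly"
PII_FIELDS = {"name", "user_id", "phone"}

def obfuscate(event: dict) -> dict:
    """Replace PII field values with truncated salted hashes."""
    out = {}
    for key, value in event.items():
        if key in PII_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            out[key] = digest[:16]  # stable stand-in for the raw value
        else:
            out[key] = value
    return out

print(obfuscate({"user_id": "u-8842", "phone": "555-123-4567", "path": "/checkout"}))
```

Deterministic hashing is one design choice among several: redaction destroys the value entirely, while encryption allows authorized recovery later.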

Pipeline Module: Event to Metric

At the most abstract level, a data pipeline is a series of steps for processing data, where the type of data being processed determines the types and order of the steps. In other words, a data pipeline is an algorithm, and standard data types can be processed in a standard way, just as solving an algebra problem follows a standard order of operations.
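Following that framing, an event-to-metric module is one such standardized step. Here is a minimal sketch, with an assumed event shape and window length, that collapses individual request events into per-status counter metrics.

```python
from collections import Counter

def events_to_metrics(events: list[dict], window: str = "1m") -> list[dict]:
    """Aggregate raw request events into one count metric per status code.

    The event shape and the one-minute window are illustrative assumptions.
    """
    counts = Counter(e.get("status", "unknown") for e in events)
    return [
        {"metric": "http_requests_total", "status": status,
         "value": count, "window": window}
        for status, count in counts.items()
    ]

events = [
    {"status": 200, "path": "/"},
    {"status": 200, "path": "/login"},
    {"status": 500, "path": "/checkout"},
]
print(events_to_metrics(events))
```

The payoff is volume: three events become two metric points here, and in production millions of events can collapse into a handful of time-series samples.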

OpenTelemetry: The Key To Unified Telemetry Data

OpenTelemetry (OTel) is an open-source framework designed to standardize and automate telemetry data collection, enabling you to collect, process, and distribute telemetry data from your system across vendors. Telemetry data is traditionally in disparate formats, and OTel serves as a universal standard to support data management and portability.
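For a concrete starting point, the snippet below uses the OpenTelemetry Python SDK (the opentelemetry-api and opentelemetry-sdk packages) to emit a span to the console. The service and span names are placeholders, and a production setup would typically swap ConsoleSpanExporter for an OTLP exporter pointed at your backend or pipeline.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Wire up a tracer provider that prints finished spans to stdout.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("demo-service")  # placeholder name

# Code instrumented this way emits spans in the same vendor-neutral
# format regardless of which backend eventually receives them.
with tracer.start_as_current_span("handle-request") as span:
    span.set_attribute("http.request.method", "GET")
    span.set_attribute("http.route", "/checkout")
```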