
Mezmo

More Value From Your Logs: Next Generation Log Management from Mezmo

Once upon a time, we thought “Log everything” was the way to ensure we had all the data we needed to identify, troubleshoot, and debug issues. But we soon had new problems: cost, noise, and the time spent sifting through all that log data. Enter log analysis tools, which refine volumes of log data and separate the signal from the noise, reducing the mental toil of processing it. Log beast tamed, for now…

A Day in the Life of a Mezmo SRE

What keeps an SRE at the top of his game? I had an insightful conversation with Jon Duarte, a Site Reliability Engineer (SRE) at Mezmo, and he walked me through his role and the various tasks he manages on a typical day. Here’s Jon offering a brief glimpse into the challenges he faces, the thought processes behind his approach, and the innovative solutions SREs come up with.

Dogfooding at Mezmo: How We Used Telemetry Pipeline to Reduce Data Volume

Like many other organizations, we at Mezmo struggle with a lot of telemetry data. For a while, our team configured our logs to be sent to a global Mezmo Log Analysis account in our SaaS so we would have a single pane of glass for viewing all of our logs. Our SRE team also wanted to make sure we had hands-on experience with our new pipeline product, so we set some goals before we started using Telemetry Pipeline.

Applying a Data Engineering Approach to Telemetry Data

The exponential growth of telemetry data presents a significant challenge for organizations, which often overspend on data management without fully capitalizing on the data’s potential value. To unlock that value, organizations must treat telemetry data as a valuable enterprise asset, applying rigorous data engineering principles to extract the critical insights and accelerate the investigations this data is meant to enable. A telemetry data platform approach democratizes access across disciplines and personas and fosters widespread utilization across the organization.

Unlocking Business Insights with Telemetry Pipelines

Imagine running a large company where data-driven decisions give you a competitive edge. You use a lot of business intelligence tools that tap into vast amounts of data, such as sales figures, inventories, and expenses. This analysis tells you how your company is performing. However, it does not reveal how your "company infrastructure" is performing. This crucial information comes from your systems in the form of telemetry data, such as logs and events.

Why Your Telemetry (Observability) Pipelines Need to Be Responsive

At Mezmo, we consider Understand, Optimize, and Respond the three tenets that help control telemetry data and maximize the value derived from it. We have previously discussed data Understanding and Optimization in depth. This blog discusses the need for responsive pipelines and what it takes to design them.

How Data Profiling Can Reduce Burnout

One of the most common afflictions across the industry, let alone the world, is burnout. It is so prevalent that the World Health Organization (WHO) estimates it costs the global economy $1 trillion a year; a Gallup poll equated that to $3,400 lost for every $10,000 of salary due to lost productivity. This problem isn’t ending anytime soon, either, with the global cybersecurity industry alone facing a talent shortage of 4 million people.

Mezmo Edge Explainer Video

Ensuring access to the right telemetry data - like logs, metrics, events, and traces from all applications and infrastructure - is challenging in our distributed world. Teams struggle with a range of data management issues, such as security concerns, data egress costs, and compliance regulations that require keeping specific data within the enterprise. Mezmo Edge is a distributed telemetry pipeline that processes data securely in your environment based on your observability needs.

Data Optimization Technique: Route Data to Specialized Processing Chains

In most situations, you will have several sources of telemetry data that you want to send to multiple destinations, such as storage locations and observability tools. In turn, the data you are sending needs to be optimized for its specific destination. If your data contains Personally Identifiable Information (PII), for example, it will need to be redacted or encrypted before reaching its destination.
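
To make the pattern concrete, here is a minimal Python sketch: events flagged as containing PII are redacted before any destination sees them, errors are routed to an observability tool, and everything is archived. The event fields, helper names, and routing rules are hypothetical illustrations of the technique, not Mezmo’s actual Telemetry Pipeline configuration.

```python
import re

# Illustrative only: a toy router that sends events down specialized
# processing chains. Names and rules are assumptions, not Mezmo's API.

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def redact_pii(event: dict) -> dict:
    """Mask email addresses before the event leaves the pipeline."""
    event = dict(event)
    event["message"] = EMAIL_RE.sub("<REDACTED>", event.get("message", ""))
    return event

def to_observability_tool(event: dict) -> None:
    print("observability <-", event["message"])

def to_archive_storage(event: dict) -> None:
    print("archive       <-", event["message"])

def route(event: dict) -> None:
    """Send each event down the chain its destination requires."""
    if event.get("contains_pii"):
        event = redact_pii(event)      # redact before any destination
    if event.get("level") == "error":
        to_observability_tool(event)   # errors go to the analysis tool
    to_archive_storage(event)          # everything lands in low-cost storage

if __name__ == "__main__":
    route({"level": "error", "contains_pii": True,
           "message": "login failed for jane.doe@example.com"})
    route({"level": "info", "contains_pii": False,
           "message": "health check ok"})
```

The same idea scales up in a real pipeline: each destination gets its own processing chain (redaction, sampling, reshaping), and the router decides which chains an event passes through.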

Data Privacy Takeaways from Gartner Security & Risk Summit

A couple of weeks back, I had the opportunity to participate in the Gartner Security and Risk Summit held in National Harbor, MD. While my colleague, April Yep, has already shared insights on the sessions she attended, this blog will delve into the emerging data privacy concerns and explore how telemetry pipelines can effectively tackle these challenges. Two key drivers behind current privacy concerns are the adoption of Gen AI and increasing government regulations.