
The latest News and Information on Log Management, Log Analytics and related technologies.

Setting Up a Data Loop using Cribl Search and Stream Part 2: Configuring Cribl Search

In the second video of our series, we get into the nuts and bolts of configuring Cribl Search to access the data we've stored in the S3 bucket. The video walks step by step through configuring the Search S3 Dataset Provider, using the Stream Data Lake destination as a model for the authentication settings. From there, we create a Dataset that accesses the Provider we've just established. To wrap up, we search the test data we previously stored in the S3 bucket.
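If you want to sanity-check the S3 side before wiring up the Dataset Provider, a quick boto3 listing can confirm that your credentials and prefix reach the test data. The bucket and prefix names below are placeholders, not values from the video:

```python
# A minimal sketch (not from the video): verify that the credentials and
# prefix you plan to give the Search S3 Dataset Provider can see the data
# the Stream Data Lake destination wrote. Names are hypothetical.
import boto3

s3 = boto3.client("s3")  # assumes AWS credentials are already configured

BUCKET = "my-cribl-data-lake"  # placeholder: your Data Lake bucket
PREFIX = "data-loop/"          # placeholder: your partitioning prefix

resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```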

Setting Up a Data Loop using Cribl Search and Stream Part 3: Send Data from Cribl Search to Stream

The third video of our series focuses on using Cribl Stream to manage the data. The presenter walks through configuring the Cribl Stream in_cribl_http Source in tandem with the Cribl Search send operator to collect data, and we watch live results arrive in Stream from Search. Afterward, we demonstrate creating a Route in Stream to direct the incoming data from Search (via the in_cribl_http Source) to the Data Lake using the Amazon S3 Data Lake Destination. This step employs a passthru Pipeline to ensure that the data is not altered in transit.
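If you want to confirm the Route and passthru Pipeline behave before connecting Search, you can hand-post a test event at the Source. Every value below (host, port, path, token) is a placeholder, not a documented default; copy the real values from your in_cribl_http Source configuration:

```python
# A hedged sketch: push one NDJSON test event at a Cribl Stream HTTP source
# and check that it lands in S3 unaltered. URL and token are placeholders;
# take the actual endpoint and auth settings from your Source config.
import json
import requests

URL = "http://stream-worker.example.com:10200/cribl/_bulk"  # placeholder
HEADERS = {
    "Authorization": "Bearer <your-source-token>",  # placeholder token
    "Content-Type": "application/x-ndjson",
}

event = {"_raw": "hello from the data loop", "source": "loop-test"}
resp = requests.post(URL, headers=HEADERS, data=json.dumps(event) + "\n")
print(resp.status_code, resp.text)
```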

Modernize Your SIEM Architecture

Join Ed Bailey from Cribl and John Alves from CyberOne Security as they discuss the struggles many SIEM teams face in managing their systems to control costs and extract optimal value from the platform. Bad data, or simply an overwhelming amount of it, degrades detections and drives costs higher and higher. Year-over-year cost increases of up to 35% are extremely common, which is clearly unsustainable.
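To put that growth rate in perspective, compound it; this is plain arithmetic, not figures from the episode:

```python
# Compounding a 35% year-over-year increase: the bill roughly doubles
# every 2.3 years and is about 4.5x the original after five years.
cost = 1.0
for year in range(1, 6):
    cost *= 1.35
    print(f"year {year}: {cost:.2f}x the original spend")
```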

A Step-by-Step Guide to Standardizing Telemetry with the BindPlane Observability Pipeline

Adding attributes to your telemetry not only provides valuable context to your observability pipeline but also enhances the flexibility and precision of your data operations. Consider, for example, the need to route data from a specific geographical region, such as the EU, to a designated destination. With a ‘Location’ attribute added to your logs, you can achieve this seamlessly.
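The routing idea is simple enough to sketch in plain Python. This illustrates the concept only; it is not a BindPlane configuration, and every name in it is hypothetical:

```python
# Concept sketch: enrich each log record with a 'Location' attribute, then
# route on it. Destination lists stand in for real exporters.
EU_DESTINATION = []       # stand-in for an EU-bound destination
DEFAULT_DESTINATION = []  # stand-in for everything else

def route(log_record: dict) -> None:
    attrs = log_record.setdefault("attributes", {})
    # In a real pipeline an enrichment step would set this upstream.
    attrs.setdefault("Location", "UNKNOWN")
    if attrs["Location"] == "EU":
        EU_DESTINATION.append(log_record)
    else:
        DEFAULT_DESTINATION.append(log_record)

route({"body": "user login", "attributes": {"Location": "EU"}})
print(len(EU_DESTINATION))  # -> 1
```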

Head in the Clouds (ft. Jo Peterson): Experts Dish on Cloud Strategy

Cloud is still a buzzword, but is it getting the attention you want it to get? Yeah, we thought so. The secret is layering revenue-generating language around cloud to grab the attention it so rightfully deserves. Hear more from Splunker Tom Stoner and Clarify360's Jo Peterson.

Rollouts in BindPlane OP

Learn how easy it is to edit and roll out changes to your configurations, deploying in batches while retaining the full version history. About observIQ: observIQ is developing a unified telemetry platform, a fast, powerful, and intuitive next-generation platform built for the modern observability team. Rooted in OpenTelemetry, the platform is designed to help teams reduce, simplify, and standardize their observability data.

Unraveling the Log Data Explosion: New Market Research Shows Trends and Challenges

Log data is the most fundamental unit of information in our XOps world, providing a record of every important event. Modern log analysis tools centralize these logs across all of our systems, and log analytics helps engineers understand system behavior, letting them search for and pinpoint problems. These tools offer dashboarding capabilities and high-level metrics for system health, and they can alert us when problems arise.

10 AWS Data Lake Best Practices

A data lake is the perfect solution for storing and accessing your data and enabling data analytics at scale, but do you know how to make the most of your AWS data lake? In this week’s blog post, we offer 10 data lake best practices that can help you optimize your AWS S3 data lake setup and data management workflows, decrease time-to-insights, reduce costs, and get the most value from your AWS data lake deployment.
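One practice in that vein, lifecycle tiering, is easy to sketch with boto3. The bucket name, prefix, and day counts below are illustrative placeholders, not recommendations from the post:

```python
# A sketch of one common data lake cost practice: lifecycle rules that tier
# aging objects to cheaper storage classes and eventually expire them.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-data-lake",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-then-expire-raw-zone",
                "Filter": {"Prefix": "raw/"},   # placeholder prefix
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},    # placeholder retention
            }
        ]
    },
)
```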

The OSI Model in 7 Layers: How It's Used Today

The Open Systems Interconnection (OSI) Model is a foundational concept that shapes how we build digital environments. It is a conceptual framework that describes how different computer systems communicate with each other across network, cloud, and internet environments. Today, let’s look at how the OSI Model affects our digital lives, applications, and networks.
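One way to make the layers concrete is to watch them in ordinary code. In the sketch below, Python's socket module hands us TCP at layer 4, the operating system supplies layer 3 and below, and the HTTP text we write is layer 7 application data:

```python
# OSI layers in everyday code: an HTTP request (layer 7) written onto a
# TCP socket (layer 4), with IP (layer 3) and below handled by the OS.
import socket

HOST, PORT = "example.com", 80  # a standard example host, plain HTTP

with socket.create_connection((HOST, PORT)) as sock:  # layers 3-4: TCP/IP
    request = f"GET / HTTP/1.1\r\nHost: {HOST}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))             # layer 7: HTTP text
    print(sock.recv(200).decode("ascii", "replace"))  # start of the reply
```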