Coralogix

2014
San Francisco, CA, USA

Latest News

Sep 25, 2019   |  By Ariel Assaraf
CloudTrail logs track actions taken by a user, role, or an AWS service, whether through the AWS console or API operations. In contrast to on-premises infrastructure, where something as important as network flow monitoring (NetFlow logs) could take weeks or months to get off the ground, AWS can capture flow logs with a few clicks at relatively low cost.
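As a concrete illustration (not taken from the post itself), the sketch below shows how VPC Flow Logs might be enabled programmatically with boto3; the VPC ID, CloudWatch Logs group, and IAM role ARN are placeholders.

```python
# Hypothetical sketch: enabling VPC Flow Logs with boto3.
# The VPC ID, log group name, and IAM role ARN are placeholders.
import boto3

ec2 = boto3.client("ec2")

response = ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0123456789abcdef0"],   # placeholder VPC ID
    TrafficType="ALL",                        # capture accepted and rejected traffic
    LogDestinationType="cloud-watch-logs",
    LogGroupName="vpc-flow-logs",             # placeholder log group
    DeliverLogsPermissionArn="arn:aws:iam::123456789012:role/flow-logs-role",
)
print(response["FlowLogIds"])
```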
Sep 4, 2019   |  By Yoni Farin
At Coralogix, we strive to ensure that our customers get a stable, real-time service at scale. As part of this commitment, we are constantly improving our data ingestion pipeline resiliency and performance. Coralogix ingests messages at extremely high rates — up to tens of billions of messages per day. Every one of these records needs to go through our entire pipeline at near real-time rates: validation, parsing, classification, and ingestion to Elasticsearch.
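To make those pipeline stages concrete, here is a minimal, hypothetical sketch of the flow in Python. The stage logic, index name, and cluster address are assumptions for illustration, not Coralogix's actual implementation; the final step uses the elasticsearch-py bulk helper.

```python
# Hypothetical sketch of the stages described above:
# validate -> parse -> classify -> index into Elasticsearch.
import json
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # placeholder cluster address

def validate(raw: str) -> bool:
    # Reject empty or oversized records (placeholder rule).
    return bool(raw) and len(raw) < 1_000_000

def parse(raw: str) -> dict:
    # Assume JSON payloads; wrap plain text as a fallback.
    try:
        return json.loads(raw)
    except ValueError:
        return {"message": raw}

def classify(doc: dict) -> dict:
    # Tag the record with a coarse severity bucket (placeholder rule).
    doc["severity"] = "error" if "error" in doc.get("message", "").lower() else "info"
    return doc

def ingest(raw_records):
    actions = (
        {"_index": "logs", "_source": classify(parse(r))}
        for r in raw_records
        if validate(r)
    )
    helpers.bulk(es, actions)

ingest(['{"message": "user login failed", "service": "auth"}'])
```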
Aug 13, 2019   |  By Mary Mats
Over the past few months, I have had the opportunity to work with and serve hundreds of Coralogix's customers. The challenges in performing efficient log analytics are numerous, spanning collecting, searching, visualizing, and alerting. What I have come to learn is that at the heart of each and every one of these challenges lies the challenge of data parsing. JSON-structured logs are easier to read, search, alert on, and visualize.
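As a quick illustration of why structure matters, the snippet below contrasts extracting fields from a free-text log line with reading the same fields from a JSON log line; the field names and regex are assumptions for the example, not a Coralogix format.

```python
# Unstructured vs. JSON-structured log parsing (illustrative only).
import json
import re

unstructured = "2019-08-13 12:01:45 ERROR auth-service login failed for user=alice"
json_line = ('{"timestamp": "2019-08-13T12:01:45Z", "level": "ERROR", '
             '"service": "auth-service", "message": "login failed", "user": "alice"}')

# Unstructured: every field has to be recovered with a hand-written pattern.
pattern = re.compile(
    r"(?P<timestamp>\S+ \S+) (?P<level>\w+) (?P<service>\S+) (?P<message>.*)"
)
fields = pattern.match(unstructured).groupdict()
print(fields["level"])            # 'ERROR'

# JSON: the structure is already there; fields are directly addressable.
doc = json.loads(json_line)
print(doc["level"], doc["user"])  # 'ERROR alice'
```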
Jul 25, 2019   |  By David Bitton
The key challenge with modern cloud visibility is that data originates from various sources across every layer of the application stack. This data is varied in its format, frequency, and importance. Logs, events, and metrics need to be monitored in real-time, in batch, and ad-hoc. The data needs to be made available to every team member, though some data may be more important than others for a particular role.
Jun 2, 2019   |  By David Bitton
Kafka is an open-source, real-time streaming messaging system and protocol built around the publish-subscribe model. In this model, producers publish data to feeds to which consumers subscribe. With Kafka, clients within a system can exchange information with higher performance and a lower risk of serious failure. Instead of establishing direct connections between subsystems, clients communicate via a server that brokers the information between producers and consumers.
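A minimal sketch of that publish-subscribe flow is shown below, using the kafka-python client as an assumed library (any Kafka client follows the same pattern); the broker address and topic name are placeholders.

```python
# Producer publishes to a topic on the broker; consumer subscribes and reads.
from kafka import KafkaProducer, KafkaConsumer

# Producer side: publish a record to the 'app-logs' topic.
producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("app-logs", b'{"level": "INFO", "message": "user logged in"}')
producer.flush()

# Consumer side: subscribe to the same topic and read records as they arrive.
consumer = KafkaConsumer(
    "app-logs",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive
)
for record in consumer:
    print(record.value.decode("utf-8"))
```

The producer and consumer never connect to each other directly; both talk only to the broker, which is what decouples the subsystems as the post describes.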