My holiday challenge to explain serverless InfluxDB to my family produced a useful Flux script anyone can put to work today. Before we dive into the code, let me outline the high-level approach to gathering and visualizing how long it takes a website to respond to an HTTP request.
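The article's script is written in Flux, but the core idea of timing an HTTP request can be sketched in plain Python. This is an illustrative stand-in, not the article's code; the function name and structure are assumptions for the example.

```python
import time
import urllib.request

def measure_response_time(url: str, timeout: float = 10.0) -> float:
    """Return the seconds taken for an HTTP GET to complete,
    including draining the response body."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # read the full body so we time the whole response
    return time.monotonic() - start
```

A scheduler (or InfluxDB's own task engine, in the Flux version) would call this periodically and write each measurement as a point in a time series for visualization.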
Today we announce InfluxDB 2.0 Open Source’s official move to Beta. This represents a huge step forward from where we started out nearly a year ago and is one step closer to general availability. You can download the latest version on our downloads page. Since we announced the first Alpha for InfluxDB 2.0 back in January ‘19, we have been working hard to build out and harden InfluxDB 2.0’s capabilities.
Logs that contain IP addresses are quite common across your infrastructure. Your firewalls, web servers, wireless infrastructure and endpoints can all record IP addresses from outside your organization. Enriching those logs with the geolocation of each IP address helps in your investigations and in understanding your traffic patterns. For example, if you can see logs on a world map, you immediately know whether you are communicating with a country you don't normally talk to.
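The enrichment step can be sketched as a small lookup-and-merge function. The lookup table below is a hard-coded stand-in with example networks from the documentation ranges; a real deployment would query a GeoIP database such as MaxMind's GeoLite2 instead.

```python
import ipaddress

# Hypothetical sample data standing in for a real GeoIP database.
GEO_TABLE = {
    ipaddress.ip_network("203.0.113.0/24"): {"country": "AU", "city": "Sydney"},
    ipaddress.ip_network("198.51.100.0/24"): {"country": "DE", "city": "Berlin"},
}

def geolocate(ip: str) -> dict:
    """Return geolocation fields for an IP, or 'unknown' placeholders."""
    addr = ipaddress.ip_address(ip)
    for net, geo in GEO_TABLE.items():
        if addr in net:
            return geo
    return {"country": "unknown", "city": "unknown"}

def enrich(record: dict) -> dict:
    """Attach geolocation fields to a log record that has a 'src_ip' key."""
    return {**record, **geolocate(record["src_ip"])}
```

Once every record carries `country` and `city` fields, plotting log volume by country on a world map is a straightforward group-by.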
2020 is finally here — and with the dawn of this new decade comes cutting-edge advancements to our platform as well as more opportunities to meet up and receive feedback from customers and community members at events around the world.
An emerging field of data science uses time series metrics to produce an educated estimate of future business outcomes such as revenue, sales, and demand for resources and product deliverables. A forecast is based on historical data for a given metric plus other relevant factors. Accurate forecasts are an important aspect of corporate planning.
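As a minimal illustration of forecasting from historical data, here is a simple moving-average forecast: each future point is estimated as the mean of the most recent observations. This is a toy baseline for the concept described above, not the method any particular platform uses.

```python
def moving_average_forecast(history, window=3, horizon=2):
    """Forecast `horizon` future points, each as the mean of the
    last `window` values (forecasts feed back into the window)."""
    values = list(history)
    forecasts = []
    for _ in range(horizon):
        avg = sum(values[-window:]) / window
        forecasts.append(avg)
        values.append(avg)  # treat the forecast as the next observation
    return forecasts
```

Real forecasting systems layer trend, seasonality, and exogenous factors on top of this basic idea, but the structure is the same: fit to history, then project forward.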
If money makes the world go round, then technology is becoming its engine. Financial services, in particular, have widely adopted big data technologies and analytics to inform better investment decisions. The past decade has seen the rise of digital investment platforms. Financial institutions — facing increasing competition, regulatory constraints and customer needs — are seeking ways to leverage technology to gain efficiency and competitive advantage.
As a company known for our anomaly detection, we know a thing or two about spotting irregularities. So as we reached the end of 2019, we couldn’t help but think back on the 2010s and the anomalies that shook the world. Once we got to listing them, it really became tough to pick just 10. Ultimately, after much debate, we ranked them based on their impact, newsworthiness and how utterly unexpected they were.
Cloud computing has changed the way we think about software and opened up many new possibilities in both business and software development. Log management tools have been affected too, which raises the question: what are the pros and cons of cloud log management compared to on-premises solutions? There are several key things to consider before opting for either one, so here is a brief overview of the most important aspects to help you make an informed decision.
Today’s post covers yet another log-related concept: log forensics. What is it, and why should your organization care? Logs are ubiquitous in the technology field; an IT organization that doesn’t generate many megabytes’ worth of logs each day would be a rare occurrence nowadays. Yet even though logs are omnipresent, specific terms around them are not always well-known. Not long ago we covered log analytics, and today it’s log forensics’ turn.