
Say Goodbye to ZooKeeper

Automated, Zero-Downtime KRaft Migrations Now Available on Aiven

The Apache Kafka ecosystem has been steadily moving toward a simpler, more scalable architecture with KRaft (Kafka Raft), leaving ZooKeeper behind. In March 2025, Kafka 4.0 dropped support for ZooKeeper entirely. Since June 2025, all new Aiven for Apache Kafka services have been deployed with KRaft by default, allowing our users to benefit from faster partition scaling and simplified cluster management.

Essential Features in Employee Recognition Platforms

Employee recognition has become a key part of building a positive workplace culture. Organizations are realizing that when employees feel valued for their work, they are more motivated, productive, and committed to the company's goals. This is why many businesses are investing in structured recognition systems instead of relying on occasional praise from managers.

From Reactive to Predictive: Preserving BESS Uptime at Scale

Battery Energy Storage Systems (BESS) operate as revenue-generating grid assets that capture surplus electricity, deploy power during demand spikes, and support frequency control. By shifting energy across time, they stabilize grid conditions, enable renewable integration, and execute market dispatch commitments. When systems respond as designed, stored capacity becomes a flexible, monetizable supply. But BESS performance depends on precision and availability.

Deterministic Simulation Testing in Diskless Apache Kafka

Aiven put Diskless Kafka through 2,200 logical hours of chaos testing using Antithesis. Here's how it held up.

Testing is a necessary pillar of any software development lifecycle. At the same time, it's a fundamentally incomplete process - a cat-and-mouse game. You can't test what you can't imagine, and distributed systems have a nasty habit of producing failures that no one on the team imagined.
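The core idea behind deterministic simulation testing can be sketched in a few lines: route all nondeterminism (message ordering, injected faults) through a single seeded random generator, so any failing run can be replayed exactly from its seed. This is a minimal illustrative sketch, not Antithesis's actual API; the function and message names are made up for the example.

```python
import random

def run_simulation(seed):
    """Deliver queued messages in a pseudo-random but seed-determined order."""
    rng = random.Random(seed)  # every nondeterministic choice flows through this one RNG
    inbox = ["A", "B", "C", "D"]
    log = []
    while inbox:
        # The simulated scheduler picks which in-flight message arrives next.
        msg = inbox.pop(rng.randrange(len(inbox)))
        if rng.random() < 0.25:  # inject a simulated fault: the message is dropped
            log.append(f"drop {msg}")
        else:
            log.append(f"deliver {msg}")
    return log

# The same seed always replays the exact same interleaving, so a failure
# found while sweeping thousands of seeds can be reproduced and debugged.
assert run_simulation(42) == run_simulation(42)
```

Sweeping many seeds explores many interleavings and fault patterns, which is how a simulator can pack thousands of "logical hours" of unlikely failure scenarios into a short wall-clock run.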

However you Postgres, we've got you covered

From free hobby projects to $5/month developer setups and enterprise-scale clusters, Aiven for PostgreSQL scales with you. Find the perfect tier for your data.

Our aim is to make your use of PostgreSQL in the cloud as easy as possible, however you’re using it. Whether you are just starting to learn the ropes or you're managing a global-scale enterprise database, the platform should adapt to you, not the other way around.

A Practical Guide to SCADA Security

Critical infrastructure is under siege. The systems that control our power grids, water treatment plants, and oil pipelines weren’t designed for a connected world. This post covers what security measures teams need to understand and how time series monitoring can help turn SCADA’s weaknesses into a security advantage.

Elephant in the Room, Episode 3: Building a CFP Review Platform with PostgreSQL & Django Live

In Episode 3 of Elephant in the Room, we move from discussion to delivery with a hands-on, live build of a real, community-focused application. Join Jay Miller, Abigail Mesrenyame Dogbe and Andres Pineda as they collaboratively design and build a CFP (Call for Proposals) review platform using PostgreSQL and Django. The aim: create a practical tool that helps speakers receive better feedback and helps organisers discover new and diverse voices.
Sponsored Post

What is a Real-Time Data Lake?

A data lake is a centralized data repository where structured, semi-structured, and unstructured data from a variety of sources can be stored in their raw format. Data lakes help eliminate data silos by acting as a single landing zone for data from multiple sources. But what's the difference between a traditional data lake and a real-time data lake? Some traditional data lakes use batch processing, which involves processing and analyzing a collection of data that has been stored over a specific timeframe. For example, payroll and billing systems that are handled on a weekly or monthly basis might use batch processing.
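The batch-versus-real-time distinction described above can be sketched with a toy example: a batch job waits for a period's worth of stored records and computes over them all at once, while a real-time pipeline updates its answer as each event lands. This is an illustrative sketch only; the names (`batch_total`, `RunningTotal`) are invented for the example, not part of any data lake product.

```python
# Batch style: process records accumulated over a period, all at once
# (e.g. a weekly payroll run over the week's stored data).
def batch_total(stored_records):
    return sum(r["amount"] for r in stored_records)

# Real-time style: update the result incrementally as each event arrives,
# so the answer is current after every single event.
class RunningTotal:
    def __init__(self):
        self.total = 0

    def ingest(self, event):
        self.total += event["amount"]
        return self.total

events = [{"amount": 10}, {"amount": 25}, {"amount": 5}]

# Batch: wait for all the data, then compute once.
assert batch_total(events) == 40

# Real-time: the total is queryable after each event is ingested.
rt = RunningTotal()
for e in events:
    rt.ingest(e)
assert rt.total == 40
```

Both approaches reach the same answer here; the difference is latency - the batch job's result only exists after the whole window closes, while the streaming total is fresh at all times.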