
Fault Tolerance in Distributed Systems: Tracing with Apache Kafka and Jaeger

Confluent

Using Jaeger tracing, I’ve been able to answer an important question that nearly every Apache Kafka® project I’ve worked on has posed: how is data flowing through my distributed system? The post covers distributed tracing with Apache Kafka and Jaeger and walks through an example Kafka project instrumented with Jaeger tracing.
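
For a concrete picture of what that instrumentation can look like, here is a minimal sketch (my own, not taken from the article) of a Java producer whose sends are traced with Jaeger, assuming the OpenTracing Kafka contrib interceptor (opentracing-kafka-client) and the Jaeger Java client; the service name, topic, and record contents are illustrative.

```java
import io.jaegertracing.Configuration;
import io.opentracing.Tracer;
import io.opentracing.util.GlobalTracer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class TracedProducer {
    public static void main(String[] args) {
        // Build a Jaeger tracer from the JAEGER_* environment variables and
        // register it globally so the interceptor below can find it.
        Tracer tracer = Configuration.fromEnv("orders-producer").getTracer();
        GlobalTracer.registerIfAbsent(tracer);

        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // The OpenTracing contrib interceptor opens a span per send() and
        // injects the trace context into the record headers, so downstream
        // consumers can continue the same trace.
        props.put(ProducerConfig.INTERCEPTOR_CLASSES_CONFIG,
                "io.opentracing.contrib.kafka.TracingProducerInterceptor");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-1", "{\"amount\": 42}"));
        }
    }
}
```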


Unknown Magic Byte! How to Address Magic Byte Errors in Apache Kafka

Confluent

If you've used Kafka Streams, Kafka clients, or Schema Registry, you’ve probably felt the frustration of unknown magic bytes. Here are a few ways to fix the issue.
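
As background for why the error appears: Confluent's Avro serializer frames every record value with a one-byte magic byte (always 0x0) followed by a four-byte schema ID, and "Unknown magic byte!" means the deserializer expected that framing and did not find it. Below is a small diagnostic sketch (my own, not from the article) that inspects a record's raw value bytes.

```java
import java.nio.ByteBuffer;

// Diagnostic for the Confluent Schema Registry wire format: byte 0 is the
// magic byte (0x0), bytes 1-4 are the schema ID, and the rest is the payload.
// KafkaAvroDeserializer throws "Unknown magic byte!" when a value does not
// start this way, e.g. because it was written with a plain StringSerializer
// instead of KafkaAvroSerializer.
public final class WireFormatCheck {
    static String describe(byte[] value) {
        if (value == null || value.length < 5 || value[0] != 0x0) {
            return "Not Schema Registry framed: consume with a non-Avro deserializer, "
                    + "or re-produce the data with KafkaAvroSerializer.";
        }
        int schemaId = ByteBuffer.wrap(value, 1, 4).getInt();
        return "Schema Registry framed record, schema id = " + schemaId;
    }

    public static void main(String[] args) {
        // A value written as plain UTF-8 text: the first byte is 'h', not 0x0,
        // which is exactly what triggers the "unknown magic byte" error.
        System.out.println(describe("hello".getBytes()));
        // A value with the expected framing (magic byte 0, schema id 42).
        System.out.println(describe(new byte[] {0, 0, 0, 0, 42, 10, 20}));
    }
}
```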


Kafka Connect Deep Dive – JDBC Source Connector

Confluent

One of the most common integrations that people want to do with Apache Kafka® is getting data in from a database. The existing data in a database, and any changes to that data, can be streamed into a Kafka topic. Here, I’m going to dig into one of the options available: the JDBC connector for Kafka Connect.
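
To give a flavor of what the setup involves, here is a hedged sketch (not from the article) that registers a JDBC source connector through the Kafka Connect REST API; the connection URL, credentials, table, and topic prefix are placeholders, and "incrementing" mode streams new rows by watching a monotonically increasing id column.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Registers a JDBC source connector with a Kafka Connect worker listening on
// the default REST port 8083. Rows from the "orders" table land in the topic
// "postgres-orders".
public class RegisterJdbcSource {
    public static void main(String[] args) throws Exception {
        String config = """
            {
              "name": "jdbc-source-orders",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "connection.url": "jdbc:postgresql://db:5432/shop",
                "connection.user": "kafka",
                "connection.password": "secret",
                "table.whitelist": "orders",
                "mode": "incrementing",
                "incrementing.column.name": "id",
                "topic.prefix": "postgres-"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(config))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```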


Apache Kafka Deployments and Systems Reliability – Part 1

Cloudera

There are many ways that Apache Kafka has been deployed in the field. In our Kafka Summit 2021 presentation, we gave a brief overview of the many different configurations that have been observed to date. Kafka as software falls more cleanly into the Parallel Systems Reliability model discussed below, but some parts of it can end up Serial.
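
For readers unfamiliar with the terminology, the standard reliability-engineering formulas (an illustration of the concept, not taken from the presentation) are: a serial chain is only as reliable as the product of its parts, while a parallel arrangement fails only if every replica fails.

\[
R_{\text{serial}} = \prod_{i=1}^{n} R_i,
\qquad
R_{\text{parallel}} = 1 - \prod_{i=1}^{n} \bigl(1 - R_i\bigr)
\]

For example, three components at 0.99 availability give roughly 0.99^3 ≈ 0.970 in series, but 1 - 0.01^3 = 0.999999 in parallel.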


Kafka Listeners – Explained

Confluent

Apache Kafka® is a distributed system. You need to tell Kafka how the brokers can reach each other, but also make sure that external clients (producers/consumers) can reach the broker they need to reach. Put another way, courtesy of Spencer Ruport: LISTENERS are what interfaces Kafka binds to; ADVERTISED_LISTENERS are how clients can connect.
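
To make that concrete, here is a hedged sketch (not from the article) of the broker settings involved, expressed as the Java Properties you would hand to a broker; the same keys go in server.properties, and the listener names, hostnames, and ports are placeholders.

```java
import java.util.Properties;

// Illustrative broker listener configuration. "listeners" controls which
// interfaces the broker binds to; "advertised.listeners" is what the broker
// hands back to clients in metadata, so it must be resolvable and reachable
// from wherever the clients run (e.g. a public hostname on AWS).
public final class ListenerConfig {
    public static Properties brokerProperties() {
        Properties p = new Properties();
        p.put("listeners", "INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:19092");
        p.put("advertised.listeners",
                "INTERNAL://broker0.internal:9092,EXTERNAL://broker0.example.com:19092");
        p.put("listener.security.protocol.map", "INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT");
        // Brokers talk to each other over the internal listener only.
        p.put("inter.broker.listener.name", "INTERNAL");
        return p;
    }
}
```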


Streaming Data from the Universe with Apache Kafka

Confluent

This data pipeline is a great example of a use case for Apache Kafka®. After researching formats, and reading about Confluent’s suggestion of using Avro with Kafka, we settled on Avro, an open source format that defines schemas in JSON and serializes data in a compact binary encoding, for the data in the alert messages.
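
To show what Avro with Schema Registry looks like in practice, here is a minimal sketch (my own, with an illustrative schema and field names rather than the article's actual alert schema) of producing a GenericRecord with Confluent's KafkaAvroSerializer.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

// Publishes one Avro-encoded alert, assuming a Schema Registry at
// localhost:8081. The serializer registers the schema and prefixes each
// value with the magic byte and schema ID described above.
public class AlertProducer {
    private static final String ALERT_SCHEMA = """
        {
          "type": "record",
          "name": "Alert",
          "fields": [
            {"name": "objectId",  "type": "string"},
            {"name": "magnitude", "type": "double"}
          ]
        }
        """;

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(ALERT_SCHEMA);
        GenericRecord alert = new GenericData.Record(schema);
        alert.put("objectId", "OBJ-0001");
        alert.put("magnitude", 17.3);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("alerts", "OBJ-0001", alert));
        }
    }
}
```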


Getting Started with Rust and Apache Kafka

Confluent

We’ll also take a look at some performance tests to see if Rust might be a viable alternative for Java applications using Apache Kafka®. In this case, that means a command is created for a particular action, which will be assigned to a Kafka topic specific to that action. On May 15, 2015, the Rust core team released version 1.0.
