Confluent: Event-Driven Microservices with Python and Apache Kafka

Our journey with microservices, so far…

Back in the days before microservices, we used to build applications that were often hosted in application servers such as WebLogic or WebSphere (apologies for stirring up any painful memories). Clients connected to these server applications using a web browser over HTTP. It worked well, for some definition of "well." It gave us the flexibility of not having to control the client hardware or software, and it allowed us to focus on our business logic.

This approach opened the door to doing more and more business transactions online instead of by mail or in person. As the amount of work these applications handled grew, so did the complexity of the applications themselves. Soon multiple teams were working on the same application, focusing on different feature sets. But those teams had to coordinate changes and deployments because they were all working on a single application, which soon became known as a monolith.

Microservices: the key to modern, distributed systems

Along came microservices: individual, smaller applications that could be changed, deployed, and scaled independently. After some initial skepticism, this architectural style took off. It really did solve several important problems. However, as is often the case, it brought new levels of complexity for us to deal with. We now had distributed systems that needed to communicate with and depend on each other to accomplish the tasks at hand.

The most common approach to getting our applications talking to each other was to use what we were already using between our clients and servers: HTTP-based request/response communication, perhaps using REST or gRPC. This works, but it increases the coupling between our independent applications by requiring them to know about APIs, endpoints, request parameters, and so on, making them less independent.

A typical, though greatly simplified, design can be seen above. Here a client application, perhaps a web browser or a mobile application, connects to our web server over HTTP. That makes sense, as that's what HTTP was intended for. Then our web server, service A, makes a call to service B, which makes a call to service C, and so on down the line. Another variation is to let one service act as an orchestrator, making calls to other applications, as shown below.

I've worked on systems like these several times over the years, and they work well at first. We can use careful API design and tools like Swagger or OpenAPI to help manage the coupling. But as new requirements emerge or existing requirements change, things tend to get more complicated quickly. And even with the best of intentions, we often end up with something more like this:

A system of many highly coupled components becomes nearly impossible to maintain or deploy piece by piece, and we lose many of the benefits we gained by moving to microservices.

But there is another approach to microservices: the event-driven architecture. Using events with a platform like Apache Kafka® can dramatically reduce the coupling between our applications and make it easier to keep our microservices from becoming a distributed monolith. Let's take a look at how we can use events to build a similar system.

Connecting applications with events

An event is a logical construct that contains both notification and state. In other words, it tells us that something happened, and it gives us information about what happened. In Kafka, events are posted to topics (logs) by a producer and received by a consumer. These producers and consumers don't need to know anything about each other. Let's see how we can take advantage of this when connecting our microservices.

Instead of one application calling another via HTTP, it can produce an event to a Kafka topic. Another application, or more than one, can consume that event and take some action. Those applications can then, in turn, produce new events to other topics, either to be acted upon downstream or as final results.
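To make that concrete, here's a minimal sketch of what one of these events might look like: a key identifying the order, and a JSON value carrying the state so far. The field names are illustrative, not taken from the project.

```python
import json

# A sketch of one event: key identifies the order, value carries state.
# Field names here are illustrative, not taken from the project.
key = "199467350823967746698504683131014209792"   # order id
value = json.dumps({"order_id": key, "sauce": "bbq"})

# Any consumer can rebuild the state from the value alone.
pizza = json.loads(value)
print(pizza["sauce"])   # bbq
```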

This type of architecture has many benefits, including reduced design-time coupling. None of the applications involved know anything about, or have any dependency on, the others. Each application listens to the topic(s) of interest and produces to topic(s) based on the action it performs. It also makes it easier to extend our systems by adding new applications that consume the same events without affecting the existing flow.

Because events in Kafka are durable, we can also replay them at a later time, or use them to produce data products that may be of value to other parts of the organization.
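As a quick sketch of what replaying looks like in practice: a consumer with a brand-new group id and auto.offset.reset set to 'earliest' will re-read a topic from the beginning. These are standard librdkafka/confluent_kafka property names; the broker address below is a placeholder.

```python
# Sketch: configuration that lets a new consumer replay a topic from the start.
replay_config = {
    'bootstrap.servers': 'localhost:9092',   # placeholder address
    'group.id': 'pizza-replayer',            # a new group id has no stored offsets
    'auto.offset.reset': 'earliest',         # so start from the oldest event
}

# Consumer(replay_config).subscribe(['pizza']) would then re-read every
# pizza event still retained in the topic.
```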

Let's try it out

Probably the best way to illustrate event-driven microservices is to build some together.

Our project

If you've ever ordered several pizzas for a team gathering, a birthday party, or something similar, you know how difficult it can be to decide what kinds of pizza to order. We're going to solve this problem with event-driven microservices by building a random pizza generator using Python, Flask, and Kafka. Here's our design diagram.

Clients will connect with our initial application, the PizzaService, using HTTP, since that's what it's best at. They will request a certain number of random pizzas, and our PizzaService will produce an event to the pizza topic for each one. Our SauceService will add a random sauce selection and produce events to the pizza-with-sauce topic. The CheeseService will consume from that topic, add a random cheese selection, and produce an event to the pizza-with-cheese topic. The MeatService and VeggiesService will operate similarly, and finally the PizzaService will consume from the pizza-with-veggies topic, which will contain a completed pizza. Then, in a separate call, clients will be able to retrieve their completed random pizza order. Let's get to work.

The pizza service

The PizzaService will be accessible to external clients. We'll use Flask, a lightweight and easy-to-use Python framework for building web applications.

Our application will have one endpoint, /order, with the POST method used to place an order and the GET method used to retrieve it.

Here's the POST method handler:

@app.route('/order/<count>', methods=['POST'])
def order_pizzas(count):
    order_id = pizza_service.order_pizzas(int(count))
    return json.dumps({"order_id": order_id})

When a user posts an order for a certain number of pizzas, this handler will call the pizza_service.order_pizzas() method and pass in the number desired. The function will return the order id, and we'll return that to the client. This id can later be used to retrieve the order with a GET call.

Next, let's take a look at the pizza_service module. In this module, as in all the others where we're using Kafka, we will import the producer and consumer from confluent_kafka. This package is based on librdkafka, the gold standard for non-Java Kafka clients.

We'll also use configparser to read our configuration files. For all the details, see the complete project in the GitHub repo.
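The exact layout of the config file lives in the repo, but here's a minimal sketch of what configparser is doing for us. The section name matches the snippet that follows; the property value is a placeholder.

```python
import configparser

# Sketch: a minimal config file of the shape configparser expects.
# 'bootstrap.servers' is a standard librdkafka key; the value is a placeholder.
SAMPLE_INI = """
[kafka_client]
bootstrap.servers = localhost:9092
"""

config_parser = configparser.ConfigParser()
config_parser.read_string(SAMPLE_INI)

# The same conversion the pizza-service performs:
producer_config = dict(config_parser['kafka_client'])
print(producer_config['bootstrap.servers'])   # localhost:9092
```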


from confluent_kafka import Producer

producer_config = dict(config_parser['kafka_client'])
pizza_producer = Producer(producer_config)

The pizza_producer will be used in the order_pizzas() function, along with Pizza and PizzaOrder, which are simple value objects.
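We won't reproduce the repo's definitions here, but a hypothetical sketch of these value objects, consistent with how they're used in this article, might look like:

```python
import uuid

# Hypothetical sketches of the two value objects; the real definitions are in
# the GitHub repo. The only behavior relied on below is that a PizzaOrder
# gets a unique id at construction time.
class Pizza:
    def __init__(self):
        self.order_id = None
        self.sauce = None
        self.cheese = None
        self.meats = None
        self.veggies = None

class PizzaOrder:
    def __init__(self, count):
        self.id = str(uuid.uuid4().int)   # unique id assigned in the constructor
        self.count = count
        self.pizzas = []                  # completed pizzas collected later

order = PizzaOrder(2)
print(len(order.id) > 0)   # True
```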

def order_pizzas(count):
    order = PizzaOrder(count)
    pizza_warmer[order.id] = order      # stash the in-progress order by its id
    for i in range(count):
        new_pizza = Pizza()
        new_pizza.order_id = order.id
        # topic, key, and value: the Pizza serialized as JSON
        pizza_producer.produce('pizza', key=order.id,
                               value=json.dumps(new_pizza.__dict__))
    pizza_producer.flush()
    return order.id

In this function we first create a PizzaOrder with the count that the user passed in. The constructor for this class assigns a unique id, which will be returned at the end of the function. We then stash the order in the pizza_warmer, a dictionary for holding PizzaOrders while they're in process. Next, based on the count, we enter a loop where we create a new Pizza instance, assign the order id to its order_id property, and then call the Producer's produce() function. The produce function takes the name of the topic, the key of the event, and the value. In our case, the topic will be pizza, the key will be the order id, and the value will be the Pizza instance as JSON. After completing the loop, we call Producer.flush() just to be sure all of the pizza events have been sent, and then return the order id. I still can't get over how easy it is to produce events to Kafka with Python!
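One optional refinement, not used in this project but worth knowing: produce() also accepts an on_delivery callback that the client invokes once the broker acknowledges (or fails) the write. A sketch:

```python
# A hypothetical delivery callback; confluent_kafka's produce() will call an
# on_delivery function with (err, msg) once the broker acks or the send fails.
def delivery_report(err, msg):
    if err is not None:
        return f'Delivery failed for key {msg.key()}: {err}'
    return f'Pizza event with key {msg.key()} delivered to {msg.topic()}'

# Usage (sketch):
# pizza_producer.produce('pizza', key=order.id, value=pizza_json,
#                        on_delivery=delivery_report)
```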

The sauce service

We'll use the sauce-service project as a model for the rest of the services in this system (cheese-service, meats-service, and veggies-service), since they all follow the same pattern.

The core of this application is a Kafka consumer loop, but before we look at that, let's see how simple it is to set up the consumer.

pizza_consumer = Consumer(consumer_config)
pizza_consumer.subscribe(['pizza'])


Similar to the producer, we construct the consumer instance by passing in the configuration properties that we loaded with configparser. In our example, these properties only contain the Kafka broker location and credentials, along with a consumer group id and an auto.offset.reset value. Again, you can get all the details in the GitHub repo.

There's an extra step with a consumer that we didn't have with the producer, and that is to subscribe to one or more topics. The subscribe method takes a list of topics, and in our case we're giving it a list of one: 'pizza'. Now, whenever we call poll() on our consumer instance, it will check the 'pizza' topic for any new events. Let's take a look at that now.

def start_service():
    while True:
        event = pizza_consumer.poll(1.0)
        if event is None:
            pass
        elif event.error():
            print(f'Bummer! {event.error()}')
        else:
            pizza = json.loads(event.value())
            add_sauce(event.key(), pizza)

Inside the loop we're calling pizza_consumer.poll(1.0), which will retrieve events from the subscribed topic, one at a time. As stated in the docs, the consumer fetches events in a more efficient way behind the scenes, but delivers them to us individually, which saves us from having to nest another loop to process collections of events. The number we're passing in, 1.0, is a timeout value in seconds: the consumer will wait up to one second before returning None if there are no new events.

If there is no event, we do nothing. If the event contains an error, we'll just log it; otherwise, we have a good event, so we'll get to work. First, we'll extract the value of the event, which is our pizza. Next, we'll pass that pizza, along with the event's key, to the add_sauce() function. Recall that the event's key is the order id.

def add_sauce(order_id, pizza):
    pizza['sauce'] = calc_sauce()
    sauce_producer.produce('pizza-with-sauce', key=order_id,
                           value=json.dumps(pizza))
The add_sauce() function sets the sauce property of our pizza to the result of the calc_sauce() function, which just generates a random sauce selection (light, extra, alfredo, bbq, none). After that's set, it uses the updated pizza as the value in a produce() call to the pizza-with-sauce topic.
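The repo has the real calc_sauce(), but given the description above, a sketch practically writes itself:

```python
import random

# Sketch of calc_sauce(): pick one of the five options described above at
# random. The real implementation lives in the GitHub repo.
SAUCES = ['light', 'extra', 'alfredo', 'bbq', 'none']

def calc_sauce():
    return random.choice(SAUCES)

pizza = {'order_id': '123'}
pizza['sauce'] = calc_sauce()
print(pizza['sauce'] in SAUCES)   # True
```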

Fast forward

The next service, the cheese-service, will consume from the pizza-with-sauce topic, do its work, and produce to the pizza-with-cheese topic. The meat-service will consume from that topic and produce to the pizza-with-meats topic. The veggie-service will consume from the pizza-with-meats topic and produce to the pizza-with-veggies topic. Hungry yet? Finally, the original pizza-service will consume from the pizza-with-veggies topic to collect the completed random pizza and add it to the pizza_order.

def load_orders():
    pizza_consumer = Consumer(consumer_config)
    pizza_consumer.subscribe([pizza_topic])  # topic name held in a variable
    while True:
        msg = pizza_consumer.poll(1.0)
        if msg is None:
            pass
        elif msg.error():
            print(f'Bummer - {msg.error()}')
        else:
            pizza = json.loads(msg.value())
            add_pizza(pizza['order_id'], pizza)

This works much the same way as the other consumer loops, but instead of adding something to the pizza retrieved from the event, we're passing it to the add_pizza() function. This function will look up the pizza_order in the pizza_warmer dictionary and add the pizza to its internal list. Also, notice that we're using a variable for the topic subscription instead of the string 'pizza-with-veggies'. That just gives us the flexibility to change the final topic in case we add more steps to our pizza-building process down the road.
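Here's a hypothetical sketch of that add_pizza() logic, with a plain dict standing in for the real PizzaOrder object so the example is self-contained:

```python
# Sketch of add_pizza(): look up the in-progress order in the pizza_warmer
# dict by order id and append the completed pizza to its internal list.
# The real version stores PizzaOrder objects; a dict stands in here.
pizza_warmer = {}

def add_pizza(order_id, pizza):
    order = pizza_warmer.get(order_id)
    if order is not None:
        order['pizzas'].append(pizza)

pizza_warmer['42'] = {'count': 2, 'pizzas': []}
add_pizza('42', {'order_id': '42', 'sauce': 'bbq'})
print(len(pizza_warmer['42']['pizzas']))   # 1
```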

Adding a consumer to a Flask application

Another important note here: because the pizza-service is a Flask application, we can't just run our infinite consumer loop on startup. That would prevent our application from receiving requests from clients. So, we put our loop into the load_orders() function, which we run in a separate thread. To do this we'll use Flask's @app.before_first_request decorator in our module.


@app.before_first_request
def launch_consumer():
    t = Thread(target=pizza_service.load_orders)
    t.start()


This won't start until the first request is made, but until then no pizzas have been requested, so there won't be anything to consume, and it all works out.

Retrieving the completed order

After a customer has requested some pizzas, they can make a GET call to the same endpoint, passing in the order id, to retrieve their order. Here's the GET request handler:

@app.route('/order/<order_id>', methods=['GET'])
def get_order(order_id):
    return pizza_service.get_order(order_id)

The get_order() function will find the pizza_order in the pizza_warmer and render it as JSON to be returned to the client.
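A sketch of what get_order() might do, again with a plain dict standing in for the real PizzaOrder:

```python
import json

# Sketch of get_order(): find the order in the pizza_warmer and render it as
# JSON for the client. The error handling here is an assumption.
pizza_warmer = {'42': {'id': '42', 'count': 1,
                       'pizzas': [{'order_id': '42', 'sauce': 'bbq'}]}}

def get_order(order_id):
    order = pizza_warmer.get(order_id)
    if order is None:
        return json.dumps({'error': 'order not found'})
    return json.dumps(order)

print(json.loads(get_order('42'))['count'])   # 1
```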

With all of our applications running, we can issue a simple curl command to order some pizzas.

curl -X POST http://localhost:5000/order/2

This will call our Flask application and cause it to send two pizza events to the pizza topic. The rest of the applications will do their work purely based on the events that are consumed and produced, and the user will get back an order id like this:

{"order_id": "199467350823967746698504683131014209792"}
Another curl command, this time piped to jq, will return our delicious ready-to-bake pizzas.

curl http://localhost:5000/order/1994673508239677466…792 | jq


{
  "id": "199467350823967746698504683131014209792",
  "count": 2,
  "pizzas": [
    {
      "order_id": "199467350823967746698504683131014209792",
      "sauce": "bbq",
      "cheese": "extra",
      "meats": "pepperoni & ham",
      "veggies": "tomato & pineapple & mushrooms"
    },
    {
      "order_id": "199467350823967746698504683131014209792",
      "sauce": "extra",
      "cheese": "goat cheese",
      "meats": "anchovies & bacon",
      "veggies": "peppers & mushrooms"
    }
  ]
}

Okay, I guess I can't vouch for the deliciousness of these particular pizzas, but I can say that we just connected five microservices with very little design-time coupling, using events stored in Kafka topics.

More thoughts on coupling

Let's draw your attention to the previous paragraph with regard to coupling. First, we said that we had very little coupling, not none. It's not possible to remove all coupling between systems that are involved in a single outcome, but we can and should try to reduce coupling where we can. In our example, we still have coupling at the event level: specifically, the schemas of the events. For example, the sauce-service will produce events, with a certain schema, to the pizza-with-sauce topic. The cheese-service will consume those events, and it needs to know about their schema. We can manage the impact of schema coupling by using a tool like Confluent Schema Registry.

The second thing we should highlight is the phrase "design-time coupling." Our example focused on reducing design-time coupling, which makes it easier to work on and deploy applications individually. There is still some runtime coupling between our applications, in that a failure in one of them will prevent the entire system from completing. We do avoid cascading failures, but we won't get any completed pizzas if any of the services are down.

Event-driven architecture can also help reduce runtime coupling, as we can see from a new service we added, called the cheese-reporter. This application will consume from the pizza-with-cheese topic, and whenever it receives an event it will update a report on the most popular cheese selections in our random pizzas. This service will always return a report when asked, regardless of the status of the cheese-service or any of the other services. If the cheese-service is down for any length of time, the report might become stale (to say nothing of the cheese), but the cheese-reporter will still fulfill its requests. We won't go into the code for this service here, but it's included in the GitHub repo mentioned earlier.
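We'll leave the service's real code to the repo, but the aggregation at its heart could be as simple as a running counter, sketched here without the Kafka plumbing:

```python
from collections import Counter

# Sketch of the kind of aggregation the cheese-reporter might keep: a running
# count of cheese selections, updated once per consumed event.
cheese_counts = Counter()

def on_cheese_event(pizza):
    cheese_counts[pizza['cheese']] += 1

# Feed it a few pizzas the way the consumer loop would:
for pizza in [{'cheese': 'extra'}, {'cheese': 'goat cheese'}, {'cheese': 'extra'}]:
    on_cheese_event(pizza)

print(cheese_counts.most_common(1))   # [('extra', 2)]
```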


This is obviously a trivial example, but it does show how simple it can be to use Apache Kafka to build event-driven systems in Python, and how that can help us build more loosely coupled, easier-to-maintain applications.

You can check out the complete project, along with some instructions for getting it running with Confluent Cloud, in the demo-scene GitHub repository. Then you can try some variations on this design to get some hands-on experience. One suggestion is to have the four downstream services all consume from the initial pizza topic, do their work, and produce to their own topics, and then have another service that consumes from those four topics and joins them to build the completed pizzas. Or, for a simpler challenge: add a service that consumes from one of the topics and performs some kind of aggregation. This will show how easy it is to extend our architecture without affecting the existing applications. If you do try something like this, drop me a note. I'd love to hear how it went.

Python and Kafka are both amazing tools for helping us get more done in less time, and have more fun doing it. There are tons of resources available for learning both, so dig in and enjoy the journey!

After 28 years as a developer, architect, project manager (recovered), author, trainer, conference organizer, and homeschooling dad, Dave Klein landed his dream job as a developer advocate at Confluent. Dave is marveling at and eager to help others explore the amazing world of event streaming with Apache Kafka.
