Publish-Subscribe and Python Message Bus

I am trying to create a central logging system for some of my python modules. I want multiple modules to be able to send log messages, and a central registrar to receive and process them.


For simplicity's sake, I want my module A to look something like this:

bus = connect_to_a_bus_that_is_always_there
while True:
    # publish a message to the message bus (pseudo code)
    bus.publish(topic="logs.a", message="example")
    time.sleep(1)

and the registrar (a single subscriber):

def actOnNewMessage(msg):
    if msg.topic.subtopic == "a":
        doSomethingForA(msg.data)

bus = connect_to_a_bus_that_is_always_there
bus.subscribe("logs", handler=actOnNewMessage)

while True:
    # wait for messages
    pass

Right now the Logger module acts like a library, so it doesn't persist; maybe I can inject something between the Logger and the message bus that constantly monitors for new messages.
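A minimal in-process sketch of that monitoring piece, using only the stdlib: a `queue.Queue` stands in for the bus, and a daemon thread plays the registrar. This does not cross process boundaries; it just illustrates the shape:

```python
import queue
import threading

bus = queue.Queue()   # stands in for the message bus
received = []

def publish(topic, message):
    bus.put((topic, message))

def handle_log(topic, message):
    received.append((topic, message))

def registrar():
    # constantly wait for new messages and dispatch by topic
    while True:
        topic, message = bus.get()
        if topic.startswith("logs."):
            handle_log(topic, message)
        bus.task_done()

threading.Thread(target=registrar, daemon=True).start()
publish("logs.a", "example")
bus.join()            # wait until the registrar has processed everything
print(received)       # → [('logs.a', 'example')]
```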

I've looked at PyPubSub, but its documentation doesn't seem to describe a persistent connection between separately running python modules. If anyone has used it between different modules and it works, that would suit me.

Another problem is that I may end up with modules not written in python, so I don't want direct communication between modules A, B and the Logger. In the end, my architecture might look like this:

I hope the above isn't confusing.

tl;dr: publish-subscribe with a persistent message bus in python, and a subscriber that constantly waits for new messages. Is there a ready-to-use solution?

EDIT: I am considering starting a websocket server that knows about the Logger module, while the other modules A and B know the websocket address. Are there any downsides to this design?



3 answers


I came across nanomsg. It's great for my needs: MIT-licensed, no additional server process required, and there are bindings for any language I would like to use.

from nanomsg import Socket, PUB

s = Socket(PUB)
s.bind('tcp://*:8080')         # one side has to bind; the publisher usually does
s.send(b'topic message text')  # subscribers filter on the leading bytes of the message



from nanomsg import Socket, SUB, SUB_SUBSCRIBE

s = Socket(SUB)
s.connect('tcp://localhost:8080')
s.set_string_option(SUB, SUB_SUBSCRIBE, 'topic')  # receive only messages starting with 'topic'
while True:
    print(s.recv())



OpenSplice is a message bus that allows for constant buffering of data. Don't roll your own message bus! They are complex beasts.

Why not just use syslog? There are syslog implementations that support logging from multiple nodes to a central collection point. Many programming languages support it, including python.



I would strongly recommend using the standard python logging framework. It lets you choose where the logs go via standard handlers such as SysLogHandler, SocketHandler, and DatagramHandler.

It even lets you write your own handler if you need to...
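The standard-framework route might look like the sketch below. The collector address ('localhost', 514) is an assumption; point it at whichever host runs your syslog daemon:

```python
import logging
import logging.handlers

# Route this module's logs to a central syslog collector.
# SysLogHandler uses UDP by default, so emits are fire-and-forget.
logger = logging.getLogger("module_a")
logger.setLevel(logging.INFO)
logger.addHandler(
    logging.handlers.SysLogHandler(address=("localhost", 514))  # assumed collector address
)

logger.info("example")  # delivered to the collector, if one is listening
```

Modules B, C, ... configure the same handler with the same address, and the collector becomes the single place where all logs arrive.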



You can use Redis as a broker and run logger.py in a separate process.

logger.py

import redis

r = redis.Redis()

while True:
    # blpop blocks until an item is pushed; it returns a (key, value) tuple
    _key, next_log_item = r.blpop(['logs'], 0)
    write_to_db(next_log_item)  # write_to_db is your own persistence function

a.py

import redis
import time

r = redis.Redis()

while True:
    r.rpush('logs', 'example log message')
    time.sleep(1)







