Send logs to RabbitMQ from Python/Django.

Overview

python-logging-rabbitmq

Logging handler that ships logs to RabbitMQ. Compatible with Django.

Installation

Install using pip.

pip install python_logging_rabbitmq

Versions

Version Dependency
>= 2.x Pika == 0.13
<= 1.1.1 Pika <= 0.10

Handlers

This package has two built-in handlers that you can import as follows:

from python_logging_rabbitmq import RabbitMQHandler

or (thanks to @wallezhang)

from python_logging_rabbitmq import RabbitMQHandlerOneWay
Handler Description
RabbitMQHandler Basic handler for sending logs to RabbitMQ. Every record is delivered directly to RabbitMQ using the configured exchange.
RabbitMQHandlerOneWay High-throughput handler. Records are stored temporarily in an internal queue and a background thread delivers them to RabbitMQ using the configured exchange, so your app doesn't wait until the log is delivered. Note that if the main thread dies you might lose logs.
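
For example, the one-way handler is attached like any other logging handler. This is a minimal sketch based on the standalone usage below; the time.sleep is only there to give the background thread a moment to deliver before the script exits:

import time
import logging
from python_logging_rabbitmq import RabbitMQHandlerOneWay

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

# Records are queued internally and delivered by a background thread.
logger.addHandler(RabbitMQHandlerOneWay(host='localhost', port=5672))

logger.debug('test debug')
time.sleep(2)  # naive pause so the background thread can flush the internal queue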

Standalone python

To use with plain Python, first create a logger for your app, then create an instance of the handler and add it to that logger.

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)

rabbit = RabbitMQHandler(host='localhost', port=5672)
logger.addHandler(rabbit)

logger.debug('test debug')

As a result, a message similar to the following will be sent to RabbitMQ:

{
	"relativeCreated":280.61580657958984,
	"process":13105,
	"args":[],
	"module":"test",
	"funcName":"<module>",
	"host":"albertomr86-laptop",
	"exc_text":null,
	"name":"myapp",
	"thread":140032818181888,
	"created":1482290387.454017,
	"threadName":"MainThread",
	"msecs":454.01692390441895,
	"filename":"test.py",
	"levelno":10,
	"processName":"MainProcess",
	"pathname":"test.py",
	"lineno":11,
	"msg":"test debug",
	"exc_info":null,
	"levelname":"DEBUG"
}

Sending logs

By default, logs are sent to RabbitMQ using the exchange 'log', which should be of type topic. The routing key is formed by concatenating the logger name and the log level. For example:

import logging
from python_logging_rabbitmq import RabbitMQHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
logger.addHandler(RabbitMQHandler(host='localhost', port=5672))

logger.info('test info')
logger.debug('test debug')
logger.warning('test warning')

The messages will be sent using the following routing keys:

  • myapp.INFO
  • myapp.DEBUG
  • myapp.WARNING

For an explanation of topics and routing keys, see https://www.rabbitmq.com/tutorials/tutorial-five-python.html
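
As an illustration only (this consumer is not part of the package and uses pika 1.x's blocking API), a queue can be bound to the 'log' topic exchange to receive these records; the queue name and binding pattern below are just examples:

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost', port=5672))
channel = connection.channel()

# Declare the exchange if it doesn't exist yet (parameters must match any existing declaration).
channel.exchange_declare(exchange='log', exchange_type='topic')

# Bind a server-named, exclusive queue to every routing key under the 'myapp' logger.
queue_name = channel.queue_declare(queue='', exclusive=True).method.queue
channel.queue_bind(exchange='log', queue=queue_name, routing_key='myapp.*')

def on_message(ch, method, properties, body):
	print(method.routing_key, body)

channel.basic_consume(queue=queue_name, on_message_callback=on_message, auto_ack=True)
channel.start_consuming()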

When creating the handler, you can specify different parameters in order to connect to RabbitMQ or to configure the handler's behavior.

Overriding routing-key creation

If you wish to override the routing-key format entirely, you can pass a routing_key_formatter function, which takes a LogRecord object and returns the routing key. For example:

RabbitMQHandler(
	host='localhost',
	port=5672,
	routing_key_formatter=lambda r: (
		'some_exchange_prefix.{}'.format(r.levelname.lower())
	)
)

Configuration

These are the allowed configuration parameters:

Parameter Description Default
host RabbitMQ server hostname or IP address. localhost
port RabbitMQ server port. 5672
username Username for authentication. None
password Password for the given username. None
exchange Name of the exchange used to publish the logs. This exchange is expected to be of type topic. log
declare_exchange Whether or not to declare the exchange. False
routing_key_format Customize how messages are routed to the queues. {name}.{level}
routing_key_formatter Function that customizes how the routing key is constructed. None
connection_params Extra parameters used to connect to RabbitMQ. None
formatter Custom formatter for the logs. python_logging_rabbitmq.JSONFormatter
close_after_emit Close the active connection after sending a log. A new connection is opened for the next log. False
fields Dict of fields added to each log sent to RabbitMQ. Useful when you want the same fields in every log without passing them each time. None
fields_under_root When True, each key in 'fields' is added as a top-level entry in the log; otherwise they are nested under the key 'fields'. True
message_headers Dictionary of headers published with the message. None
record_fields Set of attributes to preserve from the record object. None
exclude_record_fields Set of attributes to ignore from the record object. None
heartbeat Lower bound for the heartbeat timeout (in seconds). 60

Examples

RabbitMQ Connection

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	username='guest',
	password='guest',
	connection_params={
		'virtual_host': '/',
		'connection_attempts': 3,
		'socket_timeout': 5000
	}
)

Custom fields

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	fields={
		'source': 'MyApp',
		'env': 'production'
	},
	fields_under_root=True
)
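
Record fields

As documented in the configuration table above, record_fields and exclude_record_fields accept sets of LogRecord attribute names. A hedged sketch (which attributes you keep or drop is up to you):

rabbit = RabbitMQHandler(
	host='localhost',
	port=5672,
	# Drop a few low-value attributes from each published record.
	exclude_record_fields={'levelno', 'relativeCreated', 'msecs'}
)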

Custom formatter

By default, python_logging_rabbitmq implements a custom JSONFormatter, but if you prefer to format your own messages you can do it as follows:

import logging
from python_logging_rabbitmq import RabbitMQHandler

FORMAT = '%(asctime)-15s %(message)s'
formatter = logging.Formatter(fmt=FORMAT)
rabbit = RabbitMQHandler(formatter=formatter)

For a custom JSON Formatter take a look at https://github.com/madzak/python-json-logger
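
For instance, python-json-logger's JsonFormatter (the same class used in the Django example below) can be attached to the standalone handler; a minimal sketch:

import logging
from pythonjsonlogger import jsonlogger
from python_logging_rabbitmq import RabbitMQHandler

formatter = jsonlogger.JsonFormatter('%(name)s %(levelname)s %(asctime)s %(message)s')
rabbit = RabbitMQHandler(host='localhost', port=5672, formatter=formatter)

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
logger.addHandler(rabbit)
logger.info('test info')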

Django

To use with Django, add the handler to the LOGGING config.

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}
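
With that configuration in place, any module in your project can obtain the 'myapp' logger as usual; the view below is only an illustration (the extra dict is standard logging and, with the default JSONFormatter, ends up in the published record):

# myapp/views.py
import logging

from django.http import HttpResponse

logger = logging.getLogger('myapp')

def index(request):
	logger.info('index requested', extra={'path': request.path})
	return HttpResponse('ok')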

Configuration

As when using it with standalone Python, you can configure the handler directly when declaring it in the config:

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'port': 5672,
			'username': 'guest',
			'password': 'guest',
			'exchange': 'log',
			'declare_exchange': False,
			'connection_params': {
				'virtual_host': '/',
				'connection_attempts': 3,
				'socket_timeout': 5000
			},
			'fields': {
				'source': 'MainAPI',
				'env': 'production'
			},
			'fields_under_root': True
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Custom formatter

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'standard': {
			'format': '%(levelname)-8s [%(asctime)s]: %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'standard'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

JSON formatter

pip install python-json-logger

LOGGING = {
	'version': 1,
	'disable_existing_loggers': False,
	'formatters': {
		'json': {
			'()': 'pythonjsonlogger.jsonlogger.JsonFormatter',
			'fmt': '%(name)s %(levelname)s %(asctime)s %(message)s'
		}
	},
	'handlers': {
		'rabbit': {
			'level': 'DEBUG',
			'class': 'python_logging_rabbitmq.RabbitMQHandler',
			'host': 'localhost',
			'formatter': 'json'
		}
	},
	'loggers': {
		'myapp': {
			'handlers': ['rabbit'],
			'level': 'DEBUG',
			'propagate': False
		}
	}
}

Releases

Date Version Notes
Mar 10, 2019 1.1.1 Removed direct dependency on Django. Integration with Travis CI. Configuration for tests. Using pipenv.
May 04, 2018 1.0.9 Fixed exchange_type parameter in channel.exchange_declare (Thanks to @cklos).
Mar 21, 2018 1.0.8 Allowing message headers (Thanks to @merretbuurman).
May 15, 2017 1.0.7 Adding support to customize the routing_key (Thanks to @hansyulian).
Mar 30, 2017 1.0.6 Fix compatibility with python3 in RabbitMQHandlerOneWay (by @sactre).
Mar 28, 2017 1.0.5 Explicit local imports.
Mar 16, 2017 1.0.4 Added new handler RabbitMQHandlerOneWay (by @wallezhang).
Mar 14, 2017 1.0.3 Added config parameter close_after_emit.
Dec 21, 2016 1.0.2 Minor fixes.
Dec 21, 2016 1.0.1 Minor fixes.
Dec 21, 2016 1.0.0 Initial release.

What's next?

  • Let's talk about tests.
  • Issues, pull requests, suggestions are welcome.
  • Fork and improve it. Free for all.

Similar efforts

Comments
  • TypeError: unexpected kwargs: {'heartbeat_interval': 0}

    I always get this error.

    The error comes from line 101 in handler.py, but I think it is caused by line 62:
    self.connection_params.update(dict(host=host, port=port, heartbeat_interval=0))

    Going through the Pika documentation: the connection parameters do not have heartbeat_interval.

    bug 
    opened by raj-kiran-p 7
  • fix: handle thread shutdown

    Introduce two events (stopping, stopped) to interlock with the worker thread and cause a graceful shutdown.

    Add a 10s timeout to the Queue get; this means that a graceful shutdown will not be instantaneous.

    Switch to del on the Pika blocking channels.

    opened by donbowman 4
  • RabbitMQ server closes the connection because not receiving heartbeat

    Hi Albert, similar to issue https://github.com/pika/pika/issues/1104. After digging into Pika and RabbitMQ, I found that with BlockingConnection, pika will not automatically send out the heartbeat. The heartbeat event is only handled/sent in "start_consuming" and "process_data_events". For a consumer we use "start_consuming", so there is no such issue. But for a producer, we normally don't call "process_data_events" explicitly; it is only called when we call "basic_publish". Let's say we set "heartbeat" to 20s: if we don't log any message within 3x10s, the server will close the connection. (Different versions of RabbitMQ might behave differently; some might take 3x20s.) I didn't see anyone report this issue or talk about it on the internet, so I'm not sure if my understanding is correct. Looking forward to your response. Thanks in advance.

    bug wip 
    opened by yuanli-cn 4
  • Standalone not working

    Hello everybody,

    I'm trying to implement your lib in my Python app. We're not using Django, and this error is raised:

    Traceback (most recent call last):
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/test.py", line 1, in <module>
        import DarwinLogger
      File "/home/vgaugry/darwin/sms_v2_tools/sms_v2_tools/custom_logger/DarwinLogger.py", line 4, in <module>
        from python_logging_rabbitmq import RabbitMQHandlerOneWay
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/__init__.py", line 2, in <module>
        from .formatters import JSONFormatter  # noqa: F401
      File "/home/vgaugry/.virtualenvs/sms-v2_env/local/lib/python2.7/site-packages/python_logging_rabbitmq/formatters.py", line 5, in <module>
        from django.core.serializers.json import DjangoJSONEncoder
    ImportError: No module named django.core.serializers.json

    I simply followed the "standalone" part of the readme. Is this normal, or am I doing something wrong?

    Thx !

    bug 
    opened by Travincebarker 2
  • wait for logs to be sent in RabbitMQHandlerOneWay before exiting python ?

    Hi,

    Thank you for your great package.

    Is there any way to wait for logs to be sent in RabbitMQHandlerOneWay before exiting Python? A naive method would be to wait a few seconds (time.sleep(2)), but maybe there is a better way.

    Thanks a lot.

    enhancement planning 
    opened by BenjaminSchmitt 2
  • Unconfigurable Routing Key Format

    I need to be able to change the routing key format in my system, so I'd prefer that this file, python_logging_rabbitmq/handlers.py:

    line 115:

                routing_key ="{name}.{level}".format(name=record.name, level=record.levelname)
    

    to be changed to:

    line 14:

                ROUTING_KEY_FORMAT = "{name}.{level}"
    

    line 115:

                routing_key = self.ROUTING_KEY_FORMAT.format(name=record.name, level=record.levelname)
    

    so it will be configurable. Thank you.

    enhancement 
    opened by hansyulian 2
  • ImportError: No module named 'compat'

    When I use the library I see an Exception:

    File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/init.py", line 2, in from .formatters import JSONFormatter # noqa: F401 File "/usr/local/lib/python3.4/dist-packages/python_logging_rabbitmq/formatters.py", line 4, in from compat import json ImportError: No module named 'compat'

    are some wrong in ini?

    Regards and thank you for your library.

    bug 
    opened by sactre 2
  • Add content_type in pika.BasicProperties parameters

    https://github.com/albertomr86/python-logging-rabbitmq/blob/5d3ce4cc0b86b7303a2097d6acb46972d334e213/python_logging_rabbitmq/handlers.py#L164 The safest way is to add content_type = 'STRING', but it could also be exposed as a parameter key in the class method.

    wip 
    opened by TopperBG 1
  • Fix in publish(): the body is already formatted.

    In emit(), the record is formatted and then queued. The worker gets the record to be published from the queue. In publish(), that record was formatted again (a second time).

    Try a simple app like this:

    import time
    import logging
    from python_logging_rabbitmq import RabbitMQHandlerOneWay

    logger = logging.getLogger('myapp')
    logger.setLevel(logging.DEBUG)

    rabbit = RabbitMQHandlerOneWay(host='localhost', port=5672)
    logger.addHandler(rabbit)

    logger.debug('test debug')
    time.sleep(3)

    -- Error:
    File "python-logging-rabbitmq/python_logging_rabbitmq/formatters.py", line 22, in format
      data = record.__dict__.copy()
    AttributeError: 'str' object has no attribute '__dict__'

    opened by ghost 1
  • Returning batch of changes to upstream

    Hi, I'm pleased to say that we've been using your library in our project and it turned out very helpful. We've made some changes to fit our needs and thought to return them to upstream, you may find them useful. In summary, we've:

    • Updated .gitignore to include broader range of Python/Vim-related files
    • Made some stylistic tweaks; sorted imports, PEP8-ified some comments
    • Added routing_key_formatter option, which allows passing a lambda that overrides routing-key creation
    • Added support for serialization of Django's requests (this means that Rabbit handlers can handle errors logged to django.requests)
    • Added record_fields and exclude_record_fields options, which allow including/excluding specified LogRecord attributes (sometimes fields such as levelno are just not helpful)
    • Imported DjangoJSONEncoder into the JSON formatter in order to handle a broader range of objects (such as Decimal)
    • Updated README
    opened by IwoHerka 1
  • call of channel.exchange_declare modified

    According to the Pika source at https://github.com/pika/pika/blob/master/pika/channel.py#L658, the channel.exchange_declare method has no argument 'type'; the corresponding argument is 'exchange_type'.

    opened by cklos 1
  • fix: only mark task done when a task was dequeued

    task_done will fail if we mark a task as having finished when no task was dequeued. Since this can only happen after a task was retrieved from the queue, move the finally into an inner try so that we know task_done will work.

    Fixes #29 for the most part -- it does not address the leak regarding messages still in the queue when is_stopping is set.

    opened by klarose 0
  • Call queue.task_done() only after a successful get()

    queue.task_done() should be called only when an item was actually returned by get(). If get() raises an Empty exception, task_done() should not be called.

    Also, close the Pika connection only if it was actually opened.

    wip 
    opened by kmorwath 1
  •  self.queue.task_done() can be called when no message was get due to continue executing finally block anyway leading to ValueError exception

    The changes in version 2.2 for fix #25 in python_logging_rabbitmq/handlers_oneway.py may have introduced an issue. Before, the Queue.Empty exception was never raised because record, routing_key = self.queue.get() had no timeout. Now the exception is raised if no message arrives within 10s; the exception handler calls "continue", but the "finally" block is executed anyway, so queue.task_done() could be called more times than put(), which leads to a ValueError exception.

    queue.task_done() should be called in an inner "try..finally" block only after a message has actually been dequeued, for example:

    record, routing_key = self.queue.get(block=True, timeout=10)
    try:
        # Actually got a message ... try to send the message ...
    finally:
        queue.task_done()

    Moreover, when is_stopping is set, the loop is exited before queue.task_done() is called, and messages still in the queue are not processed. If something on the other side of the queue attempts to call queue.join(), it could never return.

    opened by kmorwath 0
  • `ujson` does not support `.dumps(cls=SomeEncoder)` `cls` parameter

    As per https://github.com/esnme/ultrajson/issues/124

    If you have a package that requires ujson, it is automatically picked up by compat.py and used in the JSONFormatter thereafter. Unfortunately, ujson is not fully compatible with the built-in json.dump and does not understand the cls parameter.

    opened by EivV 1
  • SSL configuration isn't working automatically

    As a workaround I initialize the following:

    SSLOptions(ssl.SSLContext(protocol=ssl.PROTOCOL_TLSv1_2))

    and pass it as connection_params under ssl_options

    Without a workaround I get a connection reset error.

    bug wip 
    opened by Ghost93 2