OpenTracing API for Python

This library is a Python platform API for OpenTracing.

Required Reading

In order to understand the Python platform API, one must first be familiar with the OpenTracing project and its terminology.

Status

In the current version, opentracing-python provides only the API and a basic no-op implementation that can be used by instrumentation libraries to collect and propagate distributed tracing context.

Future versions will include a reference implementation utilizing an abstract Recorder interface, as well as a Zipkin-compatible Tracer.
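
In the meantime, instrumentation can already be written against the global tracer: by default it is the bundled no-op Tracer, and a concrete implementation can be registered later. A minimal sketch (the concrete tracer named in the comment is a placeholder):

import opentracing

# The global tracer defaults to the bundled no-op Tracer, so instrumented
# code runs safely even before a real implementation is installed.
tracer = opentracing.global_tracer()
with tracer.start_active_span('operation') as scope:
    scope.span.set_tag('component', 'example')

# A concrete Tracer implementation can later be registered globally, e.g.:
# opentracing.set_global_tracer(concrete_tracer)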

Usage

The work of instrumentation libraries generally consists of three steps:

  1. When a service receives a new request (over HTTP or some other protocol), it uses OpenTracing's inject/extract API to continue an active trace, creating a Span object in the process. If the request does not contain an active trace, the service starts a new trace and a new root Span.
  2. The service needs to store the current Span in some request-local storage (this is called Span activation), where it can be retrieved when a child Span must be created, e.g. when the service makes an RPC to another service.
  3. When making outbound calls to another service, the current Span must be retrieved from request-local storage, a child span must be created (e.g., by using the start_child_span() helper), and that child span must be embedded into the outbound request (e.g., using HTTP headers) via OpenTracing's inject/extract API.

Below are code examples for the steps above. The implementation of request-local storage needed for step 2 is specific to the service and/or the frameworks / instrumentation libraries it uses; it is exposed through the Tracer's ScopeManager, available as Tracer.scope_manager. See details below.

Inbound request

Somewhere in your server's request handler code:

import opentracing
from opentracing import tags
from opentracing.propagation import Format


def handle_request(request):
    tracer = opentracing.global_tracer()
    span = before_request(request, tracer)
    # Store the span in some request-local storage using Tracer.scope_manager,
    # using the returned `Scope` as a context manager to ensure the
    # `Span` is deactivated and (in this case) `Span.finish()` is called.
    with tracer.scope_manager.activate(span, True) as scope:
        # actual business logic
        handle_request_for_real(request)


def before_request(request, tracer):
    span_context = tracer.extract(
        format=Format.HTTP_HEADERS,
        carrier=request.headers,
    )
    span = tracer.start_span(
        operation_name=request.operation,
        child_of=span_context)
    span.set_tag('http.url', request.full_url)

    remote_ip = request.remote_ip
    if remote_ip:
        span.set_tag(tags.PEER_HOST_IPV4, remote_ip)

    caller_name = request.caller_name
    if caller_name:
        span.set_tag(tags.PEER_SERVICE, caller_name)

    remote_port = request.remote_port
    if remote_port:
        span.set_tag(tags.PEER_PORT, remote_port)

    return span

Outbound request

Somewhere in your service that's about to make an outgoing call:

from opentracing import tags
from opentracing.propagation import Format
from opentracing_instrumentation import request_context

# create and serialize a child span and use it as context manager
with before_http_request(
    request=out_request,
    current_span_extractor=request_context.get_current_span):

    # actual call
    return urllib2.urlopen(out_request)


def before_http_request(request, current_span_extractor):
    op = request.operation
    parent_span = current_span_extractor()
    outbound_span = opentracing.global_tracer().start_span(
        operation_name=op,
        child_of=parent_span
    )

    outbound_span.set_tag('http.url', request.full_url)
    service_name = request.service_name
    host, port = request.host_port
    if service_name:
        outbound_span.set_tag(tags.PEER_SERVICE, service_name)
    if host:
        outbound_span.set_tag(tags.PEER_HOST_IPV4, host)
    if port:
        outbound_span.set_tag(tags.PEER_PORT, port)

    http_header_carrier = {}
    opentracing.global_tracer().inject(
        span_context=outbound_span.context,
        format=Format.HTTP_HEADERS,
        carrier=http_header_carrier)

    for key, value in http_header_carrier.items():
        request.add_header(key, value)

    return outbound_span

Scope and within-process propagation

To get and set the active Span in the request-local storage being used, OpenTracing requires that every Tracer contain a ScopeManager, which grants access to the active Span through a Scope. Any Span may be transferred to another task or thread, but not a Scope.

# Access to the active span is straightforward.
scope = tracer.scope_manager.active
if scope is not None:
    scope.span.set_tag('...', '...')
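
For example, a Span started in one thread can be handed to a worker thread and re-activated there through the ScopeManager, while the Scope itself never leaves the thread that created it. A minimal sketch, assuming `tracer` uses a thread-safe ScopeManager such as the thread-local one described below:

import threading

def worker(span):
    # Re-activate the transferred Span in this thread; finish_on_close=False
    # leaves the Span's lifetime with the thread that started it.
    with tracer.scope_manager.activate(span, False):
        pass  # do the threaded part of the work

span = tracer.start_span('someWork')
t = threading.Thread(target=worker, args=(span,))
t.start()
t.join()
span.finish()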

The common case starts a Scope that's automatically registered for intra-process propagation via ScopeManager.

Note that start_active_span('...') automatically finishes the span on Scope.close(), whereas start_active_span('...', finish_on_close=False) leaves it open.

# Manual activation of the Span.
span = tracer.start_span(operation_name='someWork')
with tracer.scope_manager.activate(span, True) as scope:
    pass  # Do things.

# Automatic activation of the Span.
# finish_on_close is a required parameter.
with tracer.start_active_span('someWork', finish_on_close=True) as scope:
    pass  # Do things.

# Handling done through a try construct:
span = tracer.start_span(operation_name='someWork')
scope = tracer.scope_manager.activate(span, True)
try:
    pass  # Do things.
except Exception as e:
    span.set_tag('error', '...')
finally:
    scope.close()
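
The finish_on_close=False variant mentioned above only deactivates the Span on Scope.close(), leaving its lifetime to the caller. A short sketch:

# Activation without finishing: the Scope only controls the active state.
with tracer.start_active_span('someWork', finish_on_close=False) as scope:
    scope.span.set_tag('phase', 'first')

# The Span is still open here and must be finished explicitly.
scope.span.finish()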

If there is an active Scope, it will act as the parent to any newly started Span unless the programmer passes ignore_active_span=True at start_span()/start_active_span() time or specifies a parent context explicitly:

scope = tracer.start_active_span('someWork', ignore_active_span=True)
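
A parent can also be given explicitly through the standard child_of reference, bypassing the active Scope; parent_span below stands for any previously created Span (or its SpanContext):

child = tracer.start_span('someWork', child_of=parent_span)
# A SpanContext is accepted as the reference as well:
child = tracer.start_span('someWork', child_of=parent_span.context)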

Each service/framework ought to provide a specific ScopeManager implementation that relies on their own request-local storage (thread-local storage, or coroutine-based storage for asynchronous frameworks, for example).
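
The sketch below shows the rough shape of such an implementation backed by threading.local; the class names are hypothetical, and the bundled scope managers in the next section should be preferred in practice.

import threading

import opentracing


class SimpleScope(opentracing.Scope):
    def __init__(self, manager, span, finish_on_close, to_restore):
        super(SimpleScope, self).__init__(manager, span)
        self._finish_on_close = finish_on_close
        self._to_restore = to_restore

    def close(self):
        if self._finish_on_close:
            self.span.finish()
        # Restore whatever Scope was active before this one.
        self.manager._store.scope = self._to_restore


class SimpleThreadLocalScopeManager(opentracing.ScopeManager):
    def __init__(self):
        self._store = threading.local()

    def activate(self, span, finish_on_close):
        scope = SimpleScope(self, span, finish_on_close,
                            getattr(self._store, 'scope', None))
        self._store.scope = scope
        return scope

    @property
    def active(self):
        return getattr(self._store, 'scope', None)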

Scope managers

This project includes a set of ScopeManager implementations under the opentracing.scope_managers submodule, which can be imported on demand:

from opentracing.scope_managers import ThreadLocalScopeManager

Implementations exist for thread-local storage (the default ThreadLocalScopeManager shown above, exposed directly by opentracing.scope_managers), gevent, Tornado, asyncio and contextvars:

from opentracing.scope_managers.gevent import GeventScopeManager  # requires gevent
from opentracing.scope_managers.tornado import TornadoScopeManager  # requires tornado<6
from opentracing.scope_managers.asyncio import AsyncioScopeManager  # for older asyncio applications, requires Python 3.4 or newer
from opentracing.scope_managers.contextvars import ContextVarsScopeManager  # for asyncio applications, requires Python 3.7 or newer

Note that for asyncio applications it is preferable to use ContextVarsScopeManager instead of AsyncioScopeManager, because the active span is automatically propagated to child coroutines, tasks and scheduled callbacks.
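
As a rough illustration of that propagation (MockTracer, described further below, is used only so the example is self-contained):

import asyncio

from opentracing.mocktracer import MockTracer
from opentracing.scope_managers.contextvars import ContextVarsScopeManager

tracer = MockTracer(scope_manager=ContextVarsScopeManager())

async def child():
    # The Span activated in main() is picked up automatically as the parent.
    with tracer.start_active_span('child'):
        pass

async def main():
    with tracer.start_active_span('parent'):
        await asyncio.create_task(child())

asyncio.run(main())
assert tracer.finished_spans()[0].parent_id is not None  # 'child' has a parent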

Development

Tests

virtualenv env
. ./env/bin/activate
make bootstrap
make test

You can use tox to run tests as well.

tox

Testbed suite

A testbed suite designed to test API changes and experimental features is included under the testbed directory. For more information, see the Testbed README.

Instrumentation Tests

This project has a working design of interfaces for the OpenTracing API. There is a MockTracer to facilitate unit-testing of OpenTracing Python instrumentation.

from opentracing.mocktracer import MockTracer

tracer = MockTracer()
with tracer.start_span('someWork') as span:
    pass

spans = tracer.finished_spans()
someWorkSpan = spans[0]
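
Typical unit-test assertions then inspect the recorded spans, for instance via MockSpan's operation_name and tags attributes:

assert someWorkSpan.operation_name == 'someWork'
assert 'error' not in someWorkSpan.tags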

Documentation

virtualenv env
. ./env/bin/activate
make bootstrap
make docs

The documentation is written to docs/_build/html.

LICENSE

Apache 2.0 License.

Releases

Before a new release, add a summary of changes since the last version to CHANGELOG.rst.

pip install 'zest.releaser[recommended]'
prerelease
release
git push origin master --follow-tags
make docs
python setup.py sdist upload -r pypi upload_docs -r pypi
postrelease
git push