✨️🐍 SPARQL endpoint built with RDFLib to serve machine learning models, or any other logic implemented in Python

Overview

SPARQL endpoint for RDFLib


rdflib-endpoint is a SPARQL endpoint based on an RDFLib Graph to easily serve machine learning models, or any other logic implemented in Python, via custom SPARQL functions.

It aims to enable Python developers to easily deploy functions that can be queried in a federated fashion using SPARQL. For example: using a Python function to resolve labels for specific identifiers, or running a classifier on entities retrieved via a SERVICE query to another SPARQL endpoint.

🧑‍🏫 How it works

The user defines and registers custom SPARQL functions in Python and/or populates the RDFLib Graph, then the endpoint is deployed with the FastAPI framework.

The deployed SPARQL endpoint can be used as a SERVICE in a federated SPARQL query from regular triplestore SPARQL endpoints. Tested with OpenLink Virtuoso and Ontotext GraphDB (RDF4J-based). The endpoint is CORS enabled.

Built with RDFLib and FastAPI. Tested with Python 3.7, 3.8, and 3.9.

Please create an issue or send a pull request if you are facing problems or would like to see a feature implemented.

📥 Install the package

Install the package from PyPI:

pip install rdflib-endpoint

🐍 Define custom SPARQL functions

Check out the example folder for a complete working app to get started with, including a Docker deployment. The easiest way to create a new SPARQL endpoint is to copy this example folder and build from it.

Create an app/main.py file in your project folder with your functions and endpoint parameters:

from rdflib_endpoint import SparqlEndpoint
import rdflib
from rdflib.plugins.sparql.evalutils import _eval

def custom_concat(query_results, ctx, part, eval_part):
    """Concat 2 strings in the 2 senses and return the length as additional Length variable
    """
    argument1 = str(_eval(part.expr.expr[0], eval_part.forget(ctx, _except=part.expr._vars)))
    argument2 = str(_eval(part.expr.expr[1], eval_part.forget(ctx, _except=part.expr._vars)))
    evaluation = []
    scores = []
    evaluation.append(argument1 + argument2)
    evaluation.append(argument2 + argument1)
    scores.append(len(argument1 + argument2))
    scores.append(len(argument2 + argument1))
    # Append the results for our custom function
    for i, result in enumerate(evaluation):
        query_results.append(eval_part.merge({
            part.var: rdflib.Literal(result), 
            rdflib.term.Variable(part.var + 'Length'): rdflib.Literal(scores[i])
        }))
    return query_results, ctx, part, eval_part

# Start the SPARQL endpoint based on an RDFLib Graph and register your custom functions
g = rdflib.graph.ConjunctiveGraph()
app = SparqlEndpoint(
    graph=g,
    # Register the functions:
    functions={
        'https://w3id.org/um/sparql-functions/custom_concat': custom_concat
    },
    # CORS enabled by default
    cors_enabled=True,
    # Metadata used for the service description and Swagger UI:
    title="SPARQL endpoint for RDFLib graph", 
    description="A SPARQL endpoint to serve machine learning models, or any other logic implemented in Python. \n[Source code](https://github.com/vemonet/rdflib-endpoint)",
    version="0.1.0",
    public_url='https://your-endpoint-url/sparql',
    # Example queries displayed in the Swagger UI to help users try your function
    example_query="""Example query:\n
```
PREFIX myfunctions: <https://w3id.org/um/sparql-functions/>
SELECT ?concat ?concatLength WHERE {
    BIND("First" AS ?first)
    BIND(myfunctions:custom_concat(?first, "last") AS ?concat)
}
```"""
)

🦄 Run the SPARQL endpoint

To quickly get started, you can run the FastAPI server from the example folder with uvicorn on http://localhost:8000:

cd example
uvicorn main:app --reload --app-dir app
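
The server can also be started programmatically instead of through the uvicorn command line. A minimal sketch, assuming uvicorn is installed and the snippet is placed at the bottom of your app/main.py:

import uvicorn

if __name__ == "__main__":
    # Serve the FastAPI app defined above on http://localhost:8000
    uvicorn.run(app, host="0.0.0.0", port=8000)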

Check out example/README.md for more details, such as deploying the endpoint with Docker.
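
Once the endpoint is running, you can query it over HTTP from any SPARQL client. A minimal sketch using the requests library, assuming the endpoint is served at http://localhost:8000/sparql and returns results in the SPARQL JSON format:

import requests

query = """PREFIX myfunctions: <https://w3id.org/um/sparql-functions/>
SELECT ?concat ?concatLength WHERE {
    BIND("First" AS ?first)
    BIND(myfunctions:custom_concat(?first, "last") AS ?concat)
}"""

response = requests.get(
    "http://localhost:8000/sparql",
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
)
# Print each row produced by the custom_concat function
for binding in response.json()["results"]["bindings"]:
    print(binding["concat"]["value"], binding["concatLength"]["value"])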

🧑‍💻 Development

📥 Install for development

Install from the latest GitHub commit to make sure you have the latest updates:

pip install rdflib-endpoint@git+https://github.com/vemonet/rdflib-endpoint.git

Or clone and install locally for development:

git clone https://github.com/vemonet/rdflib-endpoint
cd rdflib-endpoint
pip install -e .

You can use a virtual environment to avoid conflicts:

# Create the virtual environment folder in your workspace
python3 -m venv .venv
# Activate it using a script in the created folder
source .venv/bin/activate

✅️ Run the tests

Install additional dependencies:

pip install pytest requests

Run the tests locally (from the root folder):

pytest -s
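
As a starting point for your own tests, here is a minimal sketch using FastAPI's TestClient against the app defined in app/main.py (the import path and query are only examples, adjust them to your project):

from fastapi.testclient import TestClient

from main import app  # hypothetical import, point it to your SparqlEndpoint app

client = TestClient(app)

def test_select_query():
    # Send a simple SELECT query and check that the endpoint answers with SPARQL JSON results
    response = client.get(
        "/sparql",
        params={"query": "SELECT * WHERE { ?s ?p ?o } LIMIT 10"},
        headers={"Accept": "application/sparql-results+json"},
    )
    assert response.status_code == 200
    assert "results" in response.json()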

📂 Projects using rdflib-endpoint

Here are some projects using rdflib-endpoint to deploy custom SPARQL endpoints with Python:

Comments
  • rdflib.plugins.sparql.CUSTOM_EVALS["exampleEval"] = customEval

    I would like to know if it is possible to implement this kind of custom evaluation function, such as in this example: Source code for examples.custom_eval (see the sketch after this list).

    Also, do you know if this would work in a federated query from Wikidata or Wikibase? Thanks.

    enhancement 
    opened by rchateauneu 3
  • Fix error when dealing with AND operator (&&)

    Hello, there is an error when dealing with the AND operator (&&): parse_qsl incorrectly parses the request body, misinterpreting && as new parameters. I suggest unquoting the data later on. Best regards, Ba Huy

    opened by tbhuy 2
  • Should be able to use prefixes bound in graph's NamespaceManager

    Prefixes can be bound to a graph using the NamespaceManager. This makes them globally available, when loading and serializing as well as when querying.

    rdflib-endpoint, however, requires prefixes to be declared in the query input even for those bound to the graph, because prepareQuery is not aware of the graph.

    It would be convenient if the SPARQL endpoint could use the globally bound prefixes. My suggestion would be to drop the prepareQuery completely. I actually don't see why it is there in the first place, given that the prepared query is never used later on.

    opened by Natureshadow 0
  • SPARQL query containing 'coalesce' returns no result on rdflib-endpoint.SparqlEndpoint()

    Hello!

    My setup:

    I define a graph, g = Graph(), then I g.parse() a number of .ttl files and create the endpoint with SparqlEndpoint(graph=g). Then I use uvicorn.run(app, ...) to expose the endpoint on my local machine.

    I can successfully run this simple SPARQL statement to query for keywords:

    PREFIX dcat: <http://www.w3.org/ns/dcat#>
    SELECT 
    ?keyword
    WHERE { ?subj dcat:keyword ?keyword }
    

    However, as soon as I add a coalesce statement, the query does not return results any more:

    PREFIX dcat: <http://www.w3.org/ns/dcat#>
    SELECT 
    ?keyword
    (coalesce(?keyword, "xyz") as ?foo) 
    WHERE { ?subj dcat:keyword ?keyword }
    

    I tried something similar on Wikidata, which has no problems:

    SELECT ?item ?itemLabel 
    (coalesce(?itemLabel, 2) as ?foo)
    WHERE 
    {
      ?item wdt:P31 wd:Q146. # Must be of a cat
      SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } 
    }
    

    The server debug output for the keywords query looks ok to me.

    INFO:     127.0.0.1:43942 - "GET /sparql?format=json&query=%0APREFIX+dcat%3A+%3Chttp%3A%2F%2Fwww.w3.org%2Fns%2Fdcat%23%3E%0ASELECT+%0A%3Fkeyword%0A%28coalesce%28%3Fkeyword%2C+%22xyz%22%29+as+%3Ffoo%29+%0AWHERE+%7B+%3Fsubj+dcat%3Akeyword+%3Fkeyword+%7D%0ALIMIT+10%0A HTTP/1.1" 200 OK
    

    Putting COALESCE in the WHERE clause does not solve the problem:

    PREFIX dcat: <http://www.w3.org/ns/dcat#>
    SELECT
    ?keyword
    ?foo
    WHERE {
    ?subj dcat:keyword ?keyword
    BIND(coalesce(?keyword, 2) as ?foo)
    }
    

    Am I missing something?

    Thanks in advance

    opened by byte-for-byte 4
  • Construct query

    Using rdflib-endpoint more frequently, I ran into an issue with a SPARQL CONSTRUCT query.

    The SPARQL query (without prefixes):

    CONSTRUCT {
      ?work dcterms:date ?jaar .
    }
    WHERE {
      { select ?work (min(?year) as ?jaar)
        where {
          ?work dc:date ?year .
        }
        group by ?work }
    }
    

    The correct tuples are selected but no triples are created. Using rdflib directly does give the correct triples.

    opened by RichDijk 1
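
For reference, here is a minimal sketch of the kind of custom evaluation function mentioned in the first comment of this list, following RDFLib's examples.custom_eval pattern (the function name and registry key are only examples):

import rdflib
import rdflib.plugins.sparql
from rdflib.plugins.sparql.evaluate import evalBGP

def custom_eval(ctx, part):
    """Intercept basic graph patterns (BGP) and let RDFLib evaluate everything else."""
    if part.name == "BGP":
        # Rewrite or augment part.triples here before delegating to the default BGP evaluation
        return evalBGP(ctx, part.triples)
    raise NotImplementedError()

# Register the custom evaluation function with RDFLib
rdflib.plugins.sparql.CUSTOM_EVALS["exampleEval"] = custom_eval
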
Releases (0.2.6)
  • 0.2.6(Dec 19, 2022)

    Changelog

    • YASGUI now uses the provided example query as the default query

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.2.5...0.2.6

  • 0.2.5(Dec 19, 2022)

  • 0.2.4(Dec 19, 2022)

    Changelog

    • Added support for rdflib.Graph using store="Oxigraph" (via the oxrdflib package, https://github.com/oxigraph/oxrdflib), related to https://github.com/vemonet/rdflib-endpoint/issues/4 (a usage sketch follows the release list below)
    • Added a CLI option for defining the store

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.2.3...0.2.4

  • 0.2.3(Dec 19, 2022)

  • 0.2.2(Dec 19, 2022)

  • 0.2.1(Dec 19, 2022)

  • 0.2.0(Dec 19, 2022)

    Changelog

    • Migrated from setup.py to pyproject.toml with a hatch build backend and src/ layout
    • Added types
    • Added strict type compliance checks using mypy
    • Added tests for Python 3.10 and 3.11
    • Added CITATION.cff file and pre-commit hooks
    • Merged pull request https://github.com/vemonet/rdflib-endpoint/pull/2 "Fix error when dealing with AND operator (&&)"

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.1.6...0.2.0

  • 0.1.6(Feb 14, 2022)

  • 0.1.5(Jan 10, 2022)

  • 0.1.4(Dec 15, 2021)

  • 0.1.3(Dec 14, 2021)

  • 0.1.2(Dec 14, 2021)

    rdflib-endpoint, a library to quickly deploy a SPARQL endpoint based on an RDFLib Graph, optionally with custom functions defined in Python. Supports SELECT, CONSTRUCT, DESCRIBE, and ASK queries. No support for INSERT in RDFLib currently.

    This SPARQL endpoint can perform federated SERVICE queries and can be queried through a SERVICE query from another SPARQL endpoint (tested with RDF4J- and Jena-based triplestores, as well as Virtuoso).

    Changelog:

    • Added a CLI feature to quickly start a SPARQL endpoint based on a local RDF file: rdflib-endpoint server your-file.nt

    Full Changelog: https://github.com/vemonet/rdflib-endpoint/compare/0.1.1...0.1.2

  • 0.1.1(Dec 8, 2021)

    rdflib-endpoint, a library to quickly deploy a SPARQL endpoint based on an RDFLib Graph, optionally with custom functions defined in Python. Supports SELECT, CONSTRUCT, DESCRIBE, and ASK queries. No support for INSERT in RDFLib currently.

    This SPARQL endpoint can perform federated SERVICE queries and can be queried through a SERVICE query from another SPARQL endpoint (tested with RDF4J- and Jena-based triplestores, as well as Virtuoso).

    Changelog:

    • Minor improvements to tests and workflows
  • 0.1.0(Dec 7, 2021)

    First release of rdflib-endpoint, a library to quickly deploy a SPARQL endpoint based on an RDFLib Graph, optionally with custom functions defined in Python.

    Supports SELECT, CONSTRUCT, DESCRIBE, and ASK queries. No support for INSERT in RDFLib currently.

    This SPARQL endpoint can perform federated SERVICE queries and can be queried through a SERVICE query from another SPARQL endpoint (tested with RDF4J- and Jena-based triplestores, as well as Virtuoso).

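
As a usage sketch for the Oxigraph store mentioned in the 0.2.4 release above, assuming the oxrdflib package is installed alongside rdflib-endpoint:

import rdflib
from rdflib_endpoint import SparqlEndpoint

# oxrdflib registers the "Oxigraph" store plugin for RDFLib
g = rdflib.Graph(store="Oxigraph")
app = SparqlEndpoint(graph=g)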