"Inductive Entity Representations from Text via Link Prediction" @ The Web Conference 2021

Overview

This repository contains the code used for the experiments in the paper "Inductive Entity Representations from Text via Link Prediction", presented at The Web Conference 2021. To cite our work, please use the following:

@inproceedings{daza2021inductive,
    title = {Inductive Entity Representations from Text via Link Prediction},
    author = {Daniel Daza and Michael Cochez and Paul Groth},
    booktitle = {Proceedings of The Web Conference 2021},
    year = {2021},
    doi = {10.1145/3442381.3450141},
}

In this work, we show how a BERT-based text encoder can be fine-tuned with a link prediction objective, in a graph where entities have an associated textual description. We call the resulting model BLP (a minimal sketch of the idea follows the list below). A trained BLP model has three interesting properties:

  • It can predict a link between entities, even if one or both were not seen during training.
  • It produces representations that are useful for training an entity classifier, without retraining the encoder.
  • It improves an information retrieval system by better matching entities with questions about them.
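
For intuition, here is that sketch: descriptions are encoded with BERT, the [CLS] vector is projected into an entity embedding, and triples are scored with a relational model such as TransE. All names in the snippet are illustrative, not this repository's API.

import torch
from transformers import BertModel, BertTokenizer

# Minimal sketch of the BLP idea, assuming the Hugging Face transformers
# library; illustrative only, not the actual code of this repository.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
bert = BertModel.from_pretrained('bert-base-uncased')
project = torch.nn.Linear(768, 128)    # maps BERT output to entity space
rel_emb = torch.nn.Embedding(10, 128)  # one embedding per relation type

def encode_entity(description):
    """Embed an entity from its textual description."""
    inputs = tokenizer(description, return_tensors='pt', truncation=True)
    cls_vector = bert(**inputs).last_hidden_state[:, 0]  # [CLS] token
    return project(cls_vector)

def transe_score(head_desc, relation_id, tail_desc):
    """Score a triple: higher means more plausible under TransE."""
    h, t = encode_entity(head_desc), encode_entity(tail_desc)
    r = rel_emb(torch.tensor([relation_id]))
    return -torch.norm(h + r - t, p=1, dim=-1)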

Usage

Follow the instructions below to reproduce our experiments, or to train a model on your own data.

1. Install the requirements

Creating a new environment (e.g. with conda) is recommended. Use requirements.txt to install the dependencies:

conda create -n blp python=3.7
conda activate blp
pip install -r requirements.txt

2. Download the data

Download the required compressed datasets into the data folder:

Dataset                        Size (compressed)
UMLS (small graph for tests)   121 KB
WN18RR                         6.6 MB
FB15k-237                      21 MB
Wikidata5M                     1.4 GB
GloVe embeddings               423 MB
DBpedia-Entity                 1.3 GB

Then use tar to extract the files, e.g.

tar -xzvf WN18RR.tar.gz

Note that the KG-related files above contain both transductive and inductive splits. Transductive splits are commonly used to evaluate lookup-table methods like ComplEx, while inductive splits contain entities in the test set that are not present in the training set. Files with triples for the inductive case have the ind prefix, e.g. ind-train.txt.
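
As a quick check of the format, the splits can be inspected with a few lines of Python. This assumes the archives extract into folders like data/WN18RR, and that triples are tab-separated as in the "Using your own data" section below:

# Inspect an inductive training split; assumes one tab-separated
# (head, relation, tail) triple per line.
def read_triples(path):
    with open(path) as f:
        return [tuple(line.rstrip('\n').split('\t')) for line in f]

triples = read_triples('data/WN18RR/ind-train.txt')
print(len(triples), 'triples, e.g.', triples[0])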

3. Reproduce the experiments

Link prediction

To check that all dependencies are correctly installed, run a quick test on a small graph (this should take less than a minute on a GPU):

./scripts/test-umls.sh

The following table is adapted from our paper. The "Script" column contains the name of the script that reproduces the experiment for the corresponding model and dataset. For example, to reproduce the results of BLP-TransE on FB15k-237, run

./scripts/blp-transe-fb15k237.sh

Model         WN18RR                            FB15k-237                           Wikidata5M
              MRR    Script                     MRR    Script                       MRR    Script
GloVe-BOW     0.170  glove-bow-wn18rr.sh        0.172  glove-bow-fb15k237.sh        0.343  glove-bow-wikidata5m.sh
BE-BOW        0.180  bert-bow-wn18rr.sh         0.173  bert-bow-fb15k237.sh         0.362  bert-bow-wikidata5m.sh
GloVe-DKRL    0.115  glove-dkrl-wn18rr.sh       0.112  glove-dkrl-fb15k237.sh       0.282  glove-dkrl-wikidata5m.sh
BE-DKRL       0.139  bert-dkrl-wn18rr.sh        0.144  bert-dkrl-fb15k237.sh        0.322  bert-dkrl-wikidata5m.sh
BLP-TransE    0.285  blp-transe-wn18rr.sh       0.195  blp-transe-fb15k237.sh       0.478  blp-transe-wikidata5m.sh
BLP-DistMult  0.248  blp-distmult-wn18rr.sh     0.146  blp-distmult-fb15k237.sh     0.472  blp-distmult-wikidata5m.sh
BLP-ComplEx   0.261  blp-complex-wn18rr.sh      0.148  blp-complex-fb15k237.sh      0.489  blp-complex-wikidata5m.sh
BLP-SimplE    0.239  blp-simple-wn18rr.sh       0.144  blp-simple-fb15k237.sh       0.493  blp-simple-wikidata5m.sh

Entity classification

After training for link prediction, a tensor of embeddings for all entities is computed and saved in a file named ent_emb-[ID].pt, where [ID] is the ID of the experiment in the database (we use Sacred to manage experiments). Another file, called ents-[ID].pt, contains the entity identifier for every row in the tensor of embeddings.
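
These files can be inspected directly, assuming they are regular tensors saved with torch.save, as the .pt extension suggests (the ID 199 here is taken from the table below):

import torch

# Load the entity embeddings produced by link prediction training.
ent_emb = torch.load('output/ent_emb-199.pt')  # shape: (num_entities, dim)
ents = torch.load('output/ents-199.pt')        # entity IDs, aligned with rows
print(ent_emb.shape, ents[:5])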

To ease reproducibility, we provide these tensors, which are required in the entity classification task. Click on the ID in the table below, download the file into the output folder, and decompress it. An experiment can then be reproduced using the following command:

python train.py node_classification with checkpoint=ID dataset=DATASET

where DATASET is either WN18RR or FB15k-237. For example:

python train.py node_classification with checkpoint=199 dataset=WN18RR

Model         WN18RR            FB15k-237
              Acc.    ID        Bal. Acc.  ID
GloVe-BOW     55.3    219       34.4       293
BE-BOW        60.7    218       28.3       296
GloVe-DKRL    55.5    206       26.6       295
BE-DKRL       48.8    207       30.9       294
BLP-TransE    81.5    199       42.5       297
BLP-DistMult  78.5    200       41.0       298
BLP-ComplEx   78.1    201       38.1       300
BLP-SimplE    83.0    202       45.7       299

(Acc. = accuracy on WN18RR, Bal. Acc. = balanced accuracy on FB15k-237, both in %.)
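
As an illustration of how such embeddings can be probed (not necessarily the exact setup used by train.py), a linear classifier could be fit on them with scikit-learn; the labels below are random placeholders for real entity classes:

import numpy as np
import torch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Illustrative linear probe on the saved embeddings; placeholder labels only.
X = torch.load('output/ent_emb-199.pt').detach().cpu().numpy()
y = np.random.randint(0, 4, size=len(X))  # hypothetical class labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print('accuracy:', clf.score(X_test, y_test))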

Information retrieval

This task uses a pre-trained model saved from the link prediction task. For example, if you trained BLP with the TransE relational model and the weights were saved as model.pt, run the information retrieval task with the following command:

python retrieval.py with model=blp rel_model=transe \
checkpoint='output/model.pt'

Using your own data

If you have a knowledge graph where entities have textual descriptions, you can train a BLP model for inductive link prediction, and for entity classification if you also have entity labels.

To do this, add a new folder inside the data folder (let's call it my-kg). Inside it, store a file containing the triples of your KG: a text file with one tab-separated triple per line (let's call it all-triples.tsv), as in the sketch below.
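
For instance, such a file could be produced as follows (the triples here are made up):

# Write a triples file in the expected tab-separated format.
triples = [
    ('Amsterdam', 'capital_of', 'Netherlands'),
    ('Netherlands', 'part_of', 'Europe'),
]
with open('data/my-kg/all-triples.tsv', 'w') as f:
    for head, relation, tail in triples:
        f.write(f'{head}\t{relation}\t{tail}\n')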

To generate inductive splits, you can use data/utils.py. If you run

python utils.py drop_entities --file=my-kg/all-triples.tsv

this will generate ind-train.tsv, ind-dev.tsv, ind-test.tsv inside my-kg (see Appendix A in our paper for details on how these are generated). You can then train BLP-TransE with

python train.py with dataset='my-kg'


Owner

Daniel Daza
PhD student at VU Amsterdam and the University of Amsterdam, working on machine learning and knowledge graphs.