GraphNLI: A Graph-based Natural Language Inference Model for Polarity Prediction in Online Debates

Overview

Vibhor Agarwal, Sagar Joglekar, Anthony P. Young and Nishanth Sastry, "GraphNLI: A Graph-based Natural Language Inference Model for Polarity Prediction in Online Debates", The ACM Web Conference (TheWebConf), 2022.

Abstract

An online forum that allows participatory engagement between users very often becomes a stage for heated debates. These debates sometimes escalate into full-blown exchanges of hate and misinformation. As such, modeling these conversations through the lens of argumentation theory, as graphs of supports and attacks, has shown promise, especially in identifying which claims should be accepted. However, the argumentative relation of supports and attacks, also called the polarity, is difficult to infer from natural language exchanges, not least because the support or attack relationship in natural language is inherently contextual.

Various deep learning models have been used to classify polarity, where the inputs to the model are typically just the texts of the replying argument and the argument being replied to. We propose GraphNLI, a novel graph-based deep learning architecture for inferring argumentative relations, which uses graph walks to consider not only the immediate pair of arguments involved in a reply but also the surrounding arguments, hence capturing the context of the discussion. We demonstrate the performance of this model on a curated debate dataset from Kialo, an online debating platform. Our model outperforms the relevant baselines with an overall accuracy of 83%, which demonstrates that incorporating nearby arguments, in addition to the reply pair itself, helps in predicting argumentative relations in online debates.
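To make the idea concrete, below is a minimal sketch (not the repository's actual code) of how a root-seeking graph walk over a reply tree could gather the surrounding arguments and aggregate their Sentence-BERT embeddings into a single context vector for a reply pair. The function names, walk length, and discount factor are illustrative assumptions, not the paper's exact design or hyperparameters.

```python
# Illustrative sketch of GraphNLI-style context gathering (not the official code).
# Assumes: a reply tree stored as child -> parent links, Sentence-BERT encoding,
# and an exponentially discounted weighted average for aggregation.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any S-BERT model would do

def root_seeking_walk(node, parent_of, max_nodes=5):
    """Walk from the replying argument towards the root, collecting ancestors."""
    path = [node]
    while node in parent_of and len(path) < max_nodes:
        node = parent_of[node]
        path.append(node)
    return path

def context_embedding(texts, gamma=0.75):
    """Weighted average of argument embeddings, discounting distant arguments."""
    embs = encoder.encode(texts)                                # shape (n, d)
    weights = np.array([gamma ** i for i in range(len(texts))])
    weights /= weights.sum()
    return (weights[:, None] * embs).sum(axis=0)                # shape (d,)

# Toy reply tree: argument id -> parent id; argument 3 replies to argument 2, etc.
parent_of = {3: 2, 2: 1, 1: 0}
arguments = {
    0: "Online debates should be moderated.",
    1: "Moderation suppresses free expression.",
    2: "Unmoderated forums quickly fill with abuse.",
    3: "That depends on the community's norms.",
}

walk = root_seeking_walk(3, parent_of)                          # [3, 2, 1, 0]
ctx = context_embedding([arguments[i] for i in walk])
# `ctx`, together with the embeddings of the reply pair, would feed a downstream
# classifier that predicts the polarity label: support vs. attack.
print(ctx.shape)
```

The key design point the sketch illustrates is that the classifier sees more than the two texts of the reply pair: ancestor arguments along the walk contribute to the input, with weights that decay as the walk moves away from the reply.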

The paper PDF will be available soon!

Directory Structure

  • GraphNLI folder contains the implementation of the graph walks and the GraphNLI model.
  • Baselines folder contains the implementation of all four baselines in the paper.

Citation

If you find this paper useful in your research, please consider citing:

@inproceedings{agarwal2022graphnli,
  title={GraphNLI: A Graph-based Natural Language Inference Model for Polarity Prediction in Online Debates},
  author={Vibhor Agarwal and Sagar Joglekar and Anthony P. Young and Nishanth Sastry},
  booktitle={The ACM Web Conference (TheWebConf)},
  year={2022}
}