Meta Representation Transformation for Low-resource Cross-lingual Learning

Overview

MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning

This repo hosts the code for MetaXL, published at NAACL 2021.

[MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning](https://arxiv.org/pdf/2104.07908.pdf)

Mengzhou Xia, Guoqing Zheng, Subhabrata Mukherjee, Milad Shokouhi, Graham Neubig, Ahmed Hassan Awadallah

NAACL 2021

MetaXL is a meta-learning framework that jointly learns a main model and a relatively small network, called the representation transformation network (RTN), through a bi-level optimization procedure. The RTN transforms representations from auxiliary languages so that they benefit the target task the most.
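As a rough illustration of the RTN component, the sketch below implements a bottleneck perceptron with a residual connection applied to encoder hidden states, matching the --struct perceptron and --bottle_size options used in the commands further down. The class name, residual connection, and activation are assumptions for the sketch, not the exact implementation in this repo; in MetaXL the RTN's own parameters are updated in the outer loop of the bi-level procedure rather than by the main task loss.

import torch
import torch.nn as nn

class RepresentationTransformationNetwork(nn.Module):
    """Bottleneck perceptron that transforms source-language hidden states.

    The bottleneck width corresponds to the --bottle_size flag used below;
    the residual connection and ReLU activation are illustrative assumptions."""

    def __init__(self, hidden_size=768, bottle_size=192):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottle_size)  # project down to the bottleneck
        self.up = nn.Linear(bottle_size, hidden_size)    # project back to the encoder width
        self.act = nn.ReLU()

    def forward(self, hidden_states):
        # Only source-language (auxiliary) representations are transformed;
        # target-language examples would pass through the encoder unchanged.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Example: transform hidden states shaped like xlm-roberta-base outputs (hidden size 768).
rtn = RepresentationTransformationNetwork(hidden_size=768, bottle_size=192)
source_hidden = torch.randn(10, 200, 768)  # (batch, seq_len, hidden)
transformed = rtn(source_hidden)           # same shape as the input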

Data

Please download [WikiAnn](https://github.com/afshinrahimi/mmner), [MARC](https://registry.opendata.aws/amazon-reviews-ml/), [SentiPers](https://github.com/phosseini/sentipers) and [Sentiraama](https://ltrc.iiit.ac.in/showfile.php?filename=downloads/sentiraama/) from their corresponding sources. Please refer to data/data_index.txt for the data splits.

Scripts

The following script shows how to run MetaXL on the named entity recognition task for Quechua:

python3 mtrain.py \
      --data_dir data_dir \
      --bert_model xlm-roberta-base \
      --tgt_lang qa \
      --task_name panx \
      --train_max_seq_length 200 \
      --max_seq_length 512 \
      --epochs 20 \
      --batch_size 10 \
      --method metaxl \
      --output_dir output_dir \
      --warmup_proportion 0.1 \
      --main_lr 3e-05 \
      --meta_lr 1e-06 \
      --train_size 1000 \
      --target_train_size 100 \
      --source_languages en \
      --source_language_strategy specified \
      --layers 12 \
      --struct perceptron \
      --tied  \
      --transfer_component_add_weights \
      --tokenizer_dir None \
      --bert_model_type ori \
      --bottle_size 192 \
      --portion 2 \
      --data_seed 42  \
      --seed 11 \
      --do_train  \
      --do_eval 

The following script shows how to run MetaXL on the sentiment analysis task for Persian (fa):

python3 mtrain.py  \
		--data_dir data_dir \
		--task_name sent \
		--bert_model xlm-roberta-base \
		--tgt_lang fa \
		--train_max_seq_length 256 \
		--max_seq_length 256 \
		--epochs 20 \
		--batch_size 10 \
		--method metaxl \
		--output_dir ${output_dir} \
		--warmup_proportion 0.1 \
		--main_lr 3e-05 \
		--meta_lr 1e-6 \
		--train_size 1000 \
		--target_train_size 100 \
		--source_language_strategy specified  \
		--source_languages en \
		--layers 12 \
		--struct perceptron \
		--tied  \
		--transfer_component_add_weights \
		--tokenizer_dir None  \
		--bert_model_type ori  \
		--bottle_size 192  \
		--portion 2 	\
		--data_seed 42 \
		--seed 11  \
		--do_train  \
		--do_eval
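
In both commands, --main_lr is read as the learning rate for the main task model and --meta_lr as the learning rate for the RTN's meta-update, while --bottle_size sets the RTN bottleneck dimension and --layers selects the encoder layer on which the RTN operates. This reading follows from the flag names and the framework description above; mtrain.py is the authoritative reference for the exact semantics of each option.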

Citation

If you find MetaXL useful, please cite the following paper:

@inproceedings{xia2021metaxl,
  title={MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning},
  author={Xia, Mengzhou and Zheng, Guoqing and Mukherjee, Subhabrata and Shokouhi, Milad and Neubig, Graham and Awadallah, Ahmed Hassan},
  booktitle={NAACL},
  year={2021},
}

This repository is released under the MIT License (see LICENSE).
