Double Graph Guided Relation Extraction Enhanced with Contextual Pooling

This model builds on GAIN and ATLOP.
Explore the docs »

Table of Contents
  1. About The Project
  2. Getting Started
  3. Training

About The Project

As illustrated in the figure below, the model follows the overall structure of GAIN.

(Figure: TrainPic, overall model architecture)

  1. Inspired by the paper Document-Level Relation Extraction with Adaptive Thresholding and Localized Context Pooling (ATLOP), mention representations are taken from special tokens inserted around each mention (e.g., *Britain* 's Prince *Harry* is engaged to his US partner …, where '*' denotes a special token) instead of average pooling over the mention span.
  2. A Graph Attention Network is used to capture the structural features of the document while reducing noise.
  3. Mention representations are merged into entity representations through logsumexp pooling, which has been shown to outperform average pooling.
  4. Finally, the path representation in the entity graph is redesigned: following Zhou et al. (2021), contextual information is merged into the entity representations through an attention mechanism to construct paths between entity pairs, and inference is performed on the entity graph (see the sketch after this list).
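
As a rough illustration of points 1, 3, and 4, the sketch below shows how mention embeddings taken at the special-token positions can be pooled into entity embeddings with logsumexp, and how an ATLOP-style localized context vector can be built for an entity pair from the encoder's attention weights. All function names, tensor shapes, and arguments here are illustrative assumptions, not the exact interfaces of this repository.

```python
import torch

def entity_embeddings_logsumexp(seq_out, mention_pos, entity2mentions):
    """Pool mention embeddings into entity embeddings with logsumexp.

    seq_out:         (seq_len, hidden) encoder output for one document
    mention_pos:     token index of the '*' special token before each mention
    entity2mentions: entity2mentions[e] lists the mention ids belonging to entity e
    (names and shapes are illustrative assumptions)
    """
    mention_emb = seq_out[torch.tensor(mention_pos)]               # (num_mentions, hidden)
    entity_emb = torch.stack([
        torch.logsumexp(mention_emb[torch.tensor(m_ids)], dim=0)   # smooth max over mentions
        for m_ids in entity2mentions
    ])                                                             # (num_entities, hidden)
    return entity_emb


def localized_context(attn, head_mentions, tail_mentions, seq_out):
    """ATLOP-style localized context pooling for one entity pair.

    attn: (num_heads, seq_len, seq_len) self-attention weights of the last PLM layer
    head_mentions / tail_mentions: token positions of the pair's mentions
    """
    a_head = attn[:, head_mentions].mean(dim=0).mean(dim=0)   # (seq_len,) attention from head entity
    a_tail = attn[:, tail_mentions].mean(dim=0).mean(dim=0)   # (seq_len,) attention from tail entity
    q = a_head * a_tail                                       # tokens attended to by both entities
    q = q / (q.sum() + 1e-12)
    return q @ seq_out                                        # (hidden,) localized context vector
```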

(back to top)

Getting Started

Requirements

My code works with the following environment.

  • python=3.7
  • pytorch=1.6.0+cu102
  • dgl-cu102(0.4.3)
  • numpy
  • pandas
  • sklearn
  • einops

Dataset

  • Download the dataset from DocRED, shared by the DocRED authors
  • Put train_annotated.json, dev.json, test.json, word2id.json, ner2id.json, rel2id.json, vec.npy into the directory data/
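
For reference, the snippet below is a quick sanity check of the downloaded annotation files; field names follow the official DocRED release, and the file path is the one used above.

```python
import json

# Quick sanity check of the DocRED annotation format
# (field names as in the official DocRED release).
with open("data/train_annotated.json") as f:
    docs = json.load(f)

doc = docs[0]
print(doc["title"])           # document title
print(len(doc["sents"]))      # tokenized sentences
print(len(doc["vertexSet"]))  # entities; each is a list of mentions with sent_id, pos, type
print(doc["labels"][:2])      # relation facts: head entity h, tail entity t, relation r, evidence
```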

PLM

  • Download the pretrained language model through the link. Put the required files (pytorch_model.bin, config.json, vocab.txt, etc.) into the directory PLM/bert-????-uncased, e.g., PLM/bert-base-uncased.
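
Assuming the HuggingFace transformers library is used, loading the encoder from the directory prepared above would look roughly like the following; the actual loading code in this repository may differ.

```python
from transformers import BertModel, BertTokenizer

# Load the encoder from the local directory prepared above (path is an example).
plm_dir = "PLM/bert-base-uncased"
tokenizer = BertTokenizer.from_pretrained(plm_dir)
encoder = BertModel.from_pretrained(plm_dir)
```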

(back to top)

Training

Train

>> cd code
>> ./runXXX.sh gpu_id   # like ./run_GAIN_BERT.sh 2

Results are saved under code/logs.

(back to top)

License

This project is licensed under the MIT License - see the LICENSE file for details.
