> **Note:** This repository was archived by the owner on Oct 30, 2021. It is now read-only.

# Skip-Thought Generative Adversarial Networks

arXiv preprint: [Generating Text through Adversarial Training using Skip-Thought Vectors](https://arxiv.org/abs/1808.08703)

## Abstract

GANs have been shown to perform exceedingly well on tasks pertaining to image generation and style transfer. In the field of language modelling, word embeddings such as GloVe and word2vec are state-of-the-art methods for applying neural network models to textual data, and attempts have been made to use GANs with word embeddings for text generation. This work presents an approach to text generation that uses Skip-Thought sentence embeddings with GANs based on gradient penalty functions and f-measures. The proposed architecture aims to reproduce writing style in the generated text by modelling an author's manner of expression at the sentence level across all of their works. Extensive experiments in different embedding settings are carried out on a variety of tasks, including conditional text generation and language generation.

The model outperforms baseline text generation networks across several automated evaluation metrics. Human judgement scores further demonstrate wide applicability and effectiveness in real-life tasks.
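The README does not spell out the training objective, but "GANs based on gradient penalty functions" points at the WGAN-GP critic loss applied over sentence vectors. As a hedged illustration only (the critic here is a toy linear function, the embeddings are random stand-ins, and the 4800-dimensional size is the combine-skip setting from the original Skip-Thought work, not something this README states), the penalty term can be sketched with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4800  # combine-skip Skip-Thought vectors are 4800-d (assumption, from the Skip-Thought paper)

# Toy stand-ins for real sentence embeddings and generator outputs.
real = rng.normal(size=(8, DIM))
fake = rng.normal(size=(8, DIM))

# Toy linear critic f(x) = x @ w; its gradient w.r.t. the input is just w,
# which lets us compute the penalty without an autograd framework.
w = rng.normal(size=DIM)

def gradient_penalty(real, fake, w, lam=10.0):
    """WGAN-GP penalty: lam * E[(||grad_x f(x_hat)|| - 1)^2],
    where x_hat is a random interpolation between real and fake samples."""
    eps = rng.uniform(size=(real.shape[0], 1))
    x_hat = eps * real + (1.0 - eps) * fake  # random interpolates
    # For a linear critic the input gradient is w at every x_hat.
    grad_norms = np.full(x_hat.shape[0], np.linalg.norm(w))
    return lam * np.mean((grad_norms - 1.0) ** 2)

gp = gradient_penalty(real, fake, w)
```

In a real model the critic would be a neural network and the gradient at each interpolate would be obtained via automatic differentiation; the interpolation-and-penalize structure stays the same.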

## Citing

If you find this work helpful, please cite it as:

```bibtex
@article{DBLP:journals/corr/abs-1808-08703,
  author  = {Afroz Ahamad},
  title   = {Generating Text through Adversarial Training using Skip-Thought Vectors},
  journal = {CoRR},
  volume  = {abs/1808.08703},
  year    = {2018}
}
```