AJDERS/pretraining


Pretraining of language models

This repository implements pretraining of Hugging Face language models with masked language modeling (MLM). It was created as part of a project for Digital Revisor, who wanted to rework their ML pipeline to accommodate languages other than English, and it was used to train an ELECTRA model for Dutch.
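To make the MLM objective concrete: during pretraining, a fraction of input tokens is hidden and the model is trained to predict the originals. The following is a minimal, self-contained sketch of that masking step, not code from this repository (the function name, mask rate, and `[MASK]` token choice are illustrative; Hugging Face handles this via its data collators):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mlm_probability=0.15, seed=0):
    """Randomly replace a fraction of tokens with a mask token, MLM-style.

    Returns (masked, labels): labels hold the original token at masked
    positions and None elsewhere, so the loss is computed only on masks.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mlm_probability:
            masked.append(mask_token)   # hide this token from the model
            labels.append(tok)          # model must recover the original
        else:
            masked.append(tok)
            labels.append(None)         # position ignored by the loss
    return masked, labels

tokens = "the model learns to fill in masked words".split()
masked, labels = mask_tokens(tokens)
```

Note that ELECTRA itself replaces this objective with replaced-token detection, where a discriminator learns to spot tokens substituted by a small generator; the sketch above shows only the plain MLM masking the README refers to.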

All parameters for training, models and datasets are set in config/config.yaml.
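A config along these lines might be expected in `config/config.yaml`; the keys below are purely illustrative assumptions, not the repository's actual schema:

```yaml
# Hypothetical sketch of config/config.yaml -- key names are assumptions.
model:
  name: google/electra-small-discriminator   # Hugging Face model ID
dataset:
  name: oscar            # pretraining corpus
  language: nl           # Dutch, per the project's goal
training:
  mlm_probability: 0.15  # fraction of tokens masked
  batch_size: 32
  learning_rate: 5.0e-5
  num_epochs: 3
```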

Developers:
