
Unleashing the Power of Large Language Models: A Hands-On Tutorial

This repository contains the detailed information and resources for our tutorial at FIRE 2023, held at the University of Goa, India (December 2023).

Abstract

LLMs have opened up possibilities for advancing the state-of-the-art in natural language processing (NLP). In this tutorial, we present the audience with an introduction to LLMs and the associated challenges. The tutorial is structured in the following manner. First, we provide a brief preface that outlines the fundamental principles of NLP, following which we explore the area of distributional representation learning for NLP. Then, we delve into the essential components of transformer-based pretrained language models. We follow this up with the concept of prompt learning, or in-context learning (ICL), and discuss how it is emerging as a popular methodology replacing the conventional supervised learning workflow comprising pretraining and fine-tuning. We outline the research challenges in ICL, which usually involve finding the correct set of examples and contexts to guide the LLM decoder towards effective predictions. Afterwards, a hands-on coding and demonstration session will be carried out to impart practical knowledge about LLMs and ICL to the tutorial participants.
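The ICL workflow described above, where a small set of demonstration examples guides the LLM decoder, can be sketched as a simple prompt-assembly step. The sentiment task, template, and example reviews below are illustrative assumptions and are not taken from the tutorial materials:

```python
# Minimal sketch of in-context learning (ICL): a few-shot prompt is built by
# prepending labelled demonstration examples to the test input, so the model
# can infer the task from context alone, without any fine-tuning.
# NOTE: the task, template, and demos here are hypothetical illustrations.

def build_icl_prompt(examples, test_input,
                     instruction="Classify the sentiment as positive or negative."):
    """Assemble a few-shot prompt from (input, label) demonstration pairs."""
    parts = [instruction, ""]
    for text, label in examples:
        parts.append(f"Review: {text}\nSentiment: {label}\n")
    # The test instance ends with an empty slot the model is asked to fill in.
    parts.append(f"Review: {test_input}\nSentiment:")
    return "\n".join(parts)

demos = [
    ("The plot was gripping from start to finish.", "positive"),
    ("A dull film with wooden acting.", "negative"),
]
prompt = build_icl_prompt(demos, "I could not stop smiling the whole time.")
print(prompt)
```

The resulting string would be sent as-is to an LLM's text-completion endpoint; the research challenge noted above lies in choosing which demonstrations to include for a given test input.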

About the tutorial

Authors

Tutorial Outline

| Part | Topic | Presenter | Link to Slides |
|------|-------|-----------|----------------|
| 1 | Introduction to NLP | Dr. Sudip Kumar Naskar | Slides |
| 2 | Overview of Distributional Representation Learning for NLP | Dr. Partha Basuchowdhuri | Slides |
| 3 | Overview of Transformer-based Pretrained Language Models | Madhusudan Ghosh | Slides |
| 4 | Overview of Large Language Models | Payel Santra | Slides |
| 5 | Concept of In-Context Learning and Its Applications | Dr. Debasis Ganguly | Slides |
| 6 | Future Directions | Dr. Debasis Ganguly | |
| 7 | Hands-on Coding/Demo Session | Dr. Debasis Ganguly, Shrimon Mukherjee, Madhusudan Ghosh, Payel Santra | Jupyter Notebook |

Useful Links

Citation Policy

If you make use of any of these slides or notebooks, please cite our tutorial abstract:

Feedback
