
Transformer-based Pre-trained Models for Natural Language Processing

In this project, we provided a comprehensive review and evaluation of several Transformer-based pre-trained models. Specifically, we first introduced the background knowledge required for language representation learning and transfer learning, and discussed the structural differences and advantages of the different pre-trained models. We then performed experiments applying these pre-trained models to different downstream tasks (question answering, summarization, and language modelling) for a comprehensive evaluation. This project aims to serve future learners and the community with in-depth theoretical analysis and useful benchmark results.

See report.pdf for the project report.
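For illustration, the following is a minimal sketch (not taken from this repository's code) of how a pre-trained model can be applied to two of the downstream tasks mentioned above, using the Hugging Face `transformers` library. The checkpoint names are illustrative assumptions and are not necessarily the ones evaluated in the report.

```python
# Minimal sketch: applying pre-trained transformers to downstream tasks.
# Model checkpoints below are common public defaults, chosen for illustration only.
from transformers import pipeline

# Extractive question answering
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
answer = qa(
    question="What does the project evaluate?",
    context="The project evaluates Transformer-based pre-trained models on "
            "question answering, summarization, and language modelling tasks.",
)
print(answer["answer"])

# Abstractive summarization
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
summary = summarizer(
    "Pre-trained transformer models learn general language representations from "
    "large corpora and can then be fine-tuned for many downstream tasks.",
    max_length=30,
    min_length=5,
)
print(summary[0]["summary_text"])
```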
