Merge pull request #60 from FederatedAI/dev-2.0.0
Update 2.0.0
mgqa34 authored Mar 6, 2024
2 parents 4a5911b + fea0580 commit abee189
Showing 57 changed files with 3,721 additions and 3,833 deletions.
12 changes: 6 additions & 6 deletions README.md
Original file line number Diff line number Diff line change
@@ -17,14 +17,14 @@ FATE-LLM is a framework to support federated learning for large language models(

### Standalone deployment
Please refer to [FATE-Standalone deployment](https://github.com/FederatedAI/FATE#standalone-deployment).
Deploy FATE-Standalone version with 1.11.3 <= version < 2.0, then copy directory `python/fate_llm` to `{fate_install}/fate/python/fate_llm`
* To deploy FATE-LLM v2.0, deploy FATE-Standalone with version >= 2.1, then make a new directory `{fate_install}/fate_llm` and clone the code into it, install the python requirements, and add `{fate_install}/fate_llm/python` to `PYTHONPATH`
* To deploy FATE-LLM v1.x, deploy FATE-Standalone with 1.11.3 <= version < 2.0, then copy directory `python/fate_llm` to `{fate_install}/fate/python/fate_llm`

### Cluster deployment
Use [FATE-LLM deployment packages](https://github.com/FederatedAI/FATE/wiki/Download#llm%E9%83%A8%E7%BD%B2%E5%8C%85) to deploy, refer to [FATE-Cluster deployment](https://github.com/FederatedAI/FATE#cluster-deployment) for more deployment details.

## Quick Start
- [Offsite-tuning Tutorial: Model Definition and Job Submission](./doc/tutorial/offsite_tuning/Offsite_tuning_tutorial.ipynb)
- [FedIPR Tutorial: Add Watermarks to Your Model](./doc/tutorial/fed_ipr/FedIPR-tutorial.ipynb)
- [Federated ChatGLM-6B Training](./doc/tutorial/parameter_efficient_llm/ChatGLM-6B_ds.ipynb)
- [GPT-2 Training](./doc/tutorial/parameter_efficient_llm/GPT2-example.ipynb)
- [Builtin Models In PELLM](./doc/tutorial/builtin_models.md)
- [Federated ChatGLM3-6B Training](./doc/tutorial/parameter_efficient_llm/ChatGLM3-6B_ds.ipynb)
- [Builtin Models In PELLM](./doc/tutorial/builtin_pellm_models.md)
- [Offsite Tuning Tutorial](./doc/tutorial/offsite_tuning/Offsite_tuning_tutorial.ipynb)
- [FedKSeed](./doc/tutorial/fedkseed/fedkseed-example.ipynb)
8 changes: 8 additions & 0 deletions RELEASE.md
@@ -1,3 +1,11 @@
## Release 2.0.0
### Major Features and Improvements
* Adapt to fate-v2.0 framework:
* Migrate parameter-efficient fine-tuning training methods and models.
* Migrate Standard Offsite-Tuning and Extended Offsite-Tuning (Federated Offsite-Tuning+)
* New trainer, dataset, and data_processing function designs
* New FedKSeed Federated Tuning Algorithm: train large language models in a federated learning setting with extremely low communication cost
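The FedKSeed bullet above hinges on one trick: every update step is a zeroth-order step along a direction regenerated from a shared random seed, so a step can be communicated as a single (seed, scalar) pair instead of a full gradient. The sketch below illustrates only that core idea on a toy quadratic loss; the helper names, the K=16 seed pool, and all hyperparameters are illustrative assumptions, not FATE-LLM's API.

```python
import numpy as np

# Toy sketch of seed-based zeroth-order training (the communication trick
# behind FedKSeed-style algorithms). Each step transmits one (seed, scalar)
# pair; the full parameter update is reconstructed by replaying the seeds.
DIM = 5
TARGET = np.arange(DIM, dtype=float)

def loss(theta):
    # Illustrative quadratic stand-in for the real training loss.
    return float(np.sum((theta - TARGET) ** 2))

def zo_step(theta, seed, eps=1e-3, lr=0.05):
    # Perturbation direction is fully determined by the seed.
    z = np.random.default_rng(seed).standard_normal(theta.shape)
    # Two-sided finite-difference estimate of the directional derivative.
    g = (loss(theta + eps * z) - loss(theta - eps * z)) / (2 * eps)
    return theta - lr * g * z, g  # only (seed, g) needs to be sent

theta = np.zeros(DIM)
history = []                      # the entire "model update" as scalars
for step in range(200):
    seed = step % 16              # K = 16 candidate seeds, reused cyclically
    theta, g = zo_step(theta, seed)
    history.append((seed, g))

# Any party holding the same seed pool can rebuild the model from history.
replayed = np.zeros(DIM)
for seed, g in history:
    z = np.random.default_rng(seed).standard_normal(DIM)
    replayed -= 0.05 * g * z
```

Replaying `history` reproduces `theta` exactly, which is why communicating K scalar accumulators suffices in place of full model deltas.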

## Release 1.3.0
### Major Features and Improvements
* FTL-LLM (Federated Learning + Transfer Learning + LLM)
21 changes: 0 additions & 21 deletions doc/tutorial/builtin_models.md

This file was deleted.

21 changes: 21 additions & 0 deletions doc/tutorial/builtin_pellm_models.md
@@ -0,0 +1,21 @@
## Builtin PELLM Models
FATE-LLM provides several builtin PELLM models that users can directly use to efficiently train their language models.
To use these models, please read the tutorial in the [ChatGLM-6B Training Guide](./ChatGLM-6B_ds.ipynb).
After reading that training tutorial, it is easy to use any other model listed in the table below by changing the `module_name`, `class_name`, and `dataset` values accordingly.



| Model | ModuleName | ClassName | DataSetName |
| -------------- | ----------------- | --------------| --------------- |
| Qwen2 | pellm.qwen | Qwen | prompt_dataset |
| Bloom-7B1 | pellm.bloom | Bloom | prompt_dataset |
| LLaMA-2-7B | pellm.llama | LLaMa | prompt_dataset |
| LLaMA-7B | pellm.llama | LLaMa | prompt_dataset |
| ChatGLM3-6B | pellm.chatglm | ChatGLM | prompt_dataset |
| GPT-2 | pellm.gpt2 | GPT2 | seq_cls_dataset |
| ALBERT | pellm.albert | Albert | seq_cls_dataset |
| BART | pellm.bart | Bart | seq_cls_dataset |
| BERT | pellm.bert | Bert | seq_cls_dataset |
| DeBERTa | pellm.deberta | Deberta | seq_cls_dataset |
| DistilBERT | pellm.distilbert | DistilBert | seq_cls_dataset |
| RoBERTa | pellm.roberta | Roberta | seq_cls_dataset |
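As a minimal sketch of how the table's three columns travel together when switching models, the registry below hard-codes a few rows; the dict and the `job_config` helper are hypothetical illustrations of the idea, not FATE-LLM's actual configuration API.

```python
# Illustrative registry built from a few rows of the table above.
# The values are from the table; the dict/helper structure is a sketch,
# not FATE-LLM's real API.
PELLM_REGISTRY = {
    "Qwen2":       ("pellm.qwen",    "Qwen",    "prompt_dataset"),
    "Bloom-7B1":   ("pellm.bloom",   "Bloom",   "prompt_dataset"),
    "ChatGLM3-6B": ("pellm.chatglm", "ChatGLM", "prompt_dataset"),
    "GPT-2":       ("pellm.gpt2",    "GPT2",    "seq_cls_dataset"),
}

def job_config(model: str) -> dict:
    """Return the three values a job needs to swap for a given model."""
    module_name, class_name, dataset = PELLM_REGISTRY[model]
    return {"module_name": module_name,
            "class_name": class_name,
            "dataset": dataset}

# Switching models only swaps the three table values:
print(job_config("Bloom-7B1"))
# {'module_name': 'pellm.bloom', 'class_name': 'Bloom', 'dataset': 'prompt_dataset'}
```

Everything else in a training job (trainer settings, federation setup) stays the same across the listed models.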
