
Commit 25a2ac5

quic-swatia (Swati Allabadi) authored and committed
Adding steps about how to fine tune on any custom dataset.
Signed-off-by: Swati Allabadi <[email protected]>
1 parent 6262009 commit 25a2ac5

File tree: 2 files changed (+11, -7 lines)


QEfficient/finetune/utils/dataset_utils.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -51,7 +51,7 @@ def get_dataloader_kwargs(train_config, dataset, dataset_processer, split):
         )
     else:
         kwargs["sampler"] = torch.utils.data.DistributedSampler(
-            dataset, num_replicas=dist.get_world_size(), rank=dist.get_rank(), shuffle=True
+            dataset, num_replicas=dist.get_world_size(), rank=dist.get_rank(), shuffle=False
         )
         kwargs["batch_size"] = batch_size
         kwargs["drop_last"] = True
```

docs/source/finetune.md

Lines changed: 10 additions & 6 deletions
````diff
@@ -66,17 +66,21 @@ to visualise the data,
 tensorboard --logdir runs/<file> --bind_all
 ```
 
+## Some features/functionalities of the fine-tuning stack:
+1) Gradient accumulation: By default, gradients are accumulated over 4 steps. To change this value, pass the command line argument gradient_accumulation_steps (example: '--gradient_accumulation_steps 8').
+2) Gradient checkpointing: By default, gradient checkpointing is disabled. To enable it, pass the command line argument gradient_checkpointing.
+
 ## Fine-Tuning on custom dataset
 
 To run fine-tuning for any user-specific dataset, prepare the dataset using the following steps:
 
-1) Create a directory named 'dataset' inside efficient-transformers (i.e. at the root of the repo).
-2) Inside this directory, create a file named 'custom_dataset.py'. This is different from the custom_dataset.py present at efficient-transformers/QEfficient/finetune/dataset.
+1) Create a directory named 'dataset' inside efficient-transformers.
+2) Inside this directory, create a file named 'custom_dataset.py'.
 3) Inside the newly created efficient-transformers/dataset/custom_dataset.py, define a function named 'get_custom_dataset'.
-4) get_custom_dataset() should have the following 4 parameters: dataset_config, tokenizer, split, context_length. This function is called twice from QEfficient/cloud/finetune.py under the name get_preprocessed_dataset.
-5) Inside get_custom_dataset(), the dataset needs to be prepared for fine-tuning, so the user needs to apply the prompt and tokenize the dataset accordingly. Please refer to the template below on how to define get_custom_dataset().
-6) For examples, please refer to the Python files in efficient-transformers/QEfficient/finetune/dataset. In the case of the Samsum dataset, get_preprocessed_samsum() of efficient-transformers/QEfficient/finetune/dataset/samsum_dataset.py is called.
-7) In efficient-transformers/QEfficient/finetune/configs/dataset_config.py, for the custom_dataset class, pass the appropriate values for train_split and test_split according to the dataset keys corresponding to the train and test data points. Alternatively, these values can be passed as command line arguments with the finetune command, for example "--train_split train".
+4) get_custom_dataset() should have the following 4 parameters: dataset_config, tokenizer, split, context_length.
+5) Inside get_custom_dataset(), the user needs to apply the prompt and tokenize the dataset accordingly. Please refer to the template below on how to define get_custom_dataset().
+6) For examples, please refer to the Python files in [dataset](https://github.com/quic/efficient-transformers/tree/main/QEfficient/finetune/dataset). In the case of the Samsum dataset, get_preprocessed_samsum() of efficient-transformers/QEfficient/finetune/dataset/samsum_dataset.py is called.
+7) In [dataset_config.py](https://github.com/quic/efficient-transformers/blob/main/QEfficient/finetune/configs/dataset_config.py), for the custom_dataset class, pass the appropriate values for train_split and test_split. Alternatively, these values can be passed as command line arguments with the finetune command, for example "--train_split train".
 8) While running fine-tuning, pass the argument "--dataset custom_dataset" to fine-tune on the custom dataset.
 
 Template for get_custom_dataset() to be defined inside efficient-transformers/dataset/custom_dataset.py is as follows:
````
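The template promised on the last line of the hunk is cut off in this view. As a stand-in, here is a minimal sketch of such a function, assuming the Hugging Face `datasets` library and a Samsum-style dataset with 'dialogue' and 'summary' columns; only the function name and its four parameters are given by the steps above, while the dataset name, prompt, and column names are illustrative:

```python
# Hypothetical content of efficient-transformers/dataset/custom_dataset.py.
# The dataset, prompt, and column names below are assumptions for illustration.
import datasets


def get_custom_dataset(dataset_config, tokenizer, split, context_length=None):
    # `split` receives the --train_split/--test_split value (step 7).
    dataset = datasets.load_dataset("knkarthick/samsum", split=split)

    prompt = "Summarize this dialog:\n{dialog}\n---\nSummary:\n"

    def apply_prompt_template(sample):
        return {
            "prompt": prompt.format(dialog=sample["dialogue"]),
            "summary": sample["summary"],
        }

    def tokenize(sample):
        # Tokenize prompt and target separately so the prompt tokens can be
        # masked out of the loss with -100.
        prompt_ids = tokenizer.encode(
            tokenizer.bos_token + sample["prompt"], add_special_tokens=False
        )
        label_ids = tokenizer.encode(
            sample["summary"] + tokenizer.eos_token, add_special_tokens=False
        )
        input_ids = (prompt_ids + label_ids)[:context_length]
        labels = ([-100] * len(prompt_ids) + label_ids)[:context_length]
        return {
            "input_ids": input_ids,
            "attention_mask": [1] * len(input_ids),
            "labels": labels,
        }

    dataset = dataset.map(apply_prompt_template, remove_columns=list(dataset.features))
    dataset = dataset.map(tokenize, remove_columns=list(dataset.features))
    return dataset
```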
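For step 7, the custom_dataset entry in dataset_config.py carries the two split names. A hypothetical shape of that config (only the class name and the train_split/test_split fields come from the doc above; the other field and all defaults are illustrative assumptions):

```python
# Hypothetical sketch of the custom_dataset config class in
# QEfficient/finetune/configs/dataset_config.py.
from dataclasses import dataclass


@dataclass
class custom_dataset:
    dataset: str = "custom_dataset"  # assumed field; selects this config
    train_split: str = "train"       # key of the training split in your dataset
    test_split: str = "test"         # key of the evaluation split
```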
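Combining step 8 with the flags from the new features section, a full invocation might look as follows. The `python -m QEfficient.cloud.finetune` entry point is an assumption inferred from the QEfficient/cloud/finetune.py path mentioned in the removed step 4 text; the flags themselves come from this doc:

```bash
# Hypothetical invocation; entry point assumed from QEfficient/cloud/finetune.py.
python -m QEfficient.cloud.finetune \
    --dataset custom_dataset \
    --train_split train \
    --gradient_accumulation_steps 8 \
    --gradient_checkpointing
```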
