
Commit 62fdc34

chore: update submodules (#186)
Co-authored-by: ydcjeff <[email protected]>
1 parent 2aea3cf commit 62fdc34

File tree

2 files changed: +6 −6 lines changed


src/tutorials/beginner/02-transformers-text-classification.md

Lines changed: 5 additions & 5 deletions
````diff
@@ -37,7 +37,7 @@ manual_seed(42)
 
 ## Basic Setup
 
-Next we will follow the tutorial and load up our dataset and tokenizer to prepocess the data.
+Next we will follow the tutorial and load up our dataset and tokenizer to preprocess the data.
 
 ### Data Preprocessing
 
````
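For context, the loading and preprocessing this hunk refers to looks roughly like the sketch below. The dataset (`imdb`), checkpoint (`bert-base-cased`), and helper name `tokenize_function` are illustrative assumptions, not necessarily the tutorial's exact choices.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Illustrative dataset/checkpoint; the tutorial's own choices may differ.
raw_datasets = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize_function(examples):
    # Tokenize the raw text so the model receives fixed-length input IDs.
    return tokenizer(examples["text"], padding="max_length", truncation=True)

tokenized_datasets = raw_datasets.map(tokenize_function, batched=True)
```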

````diff
@@ -161,7 +161,7 @@ Therefore we will define a `process_function` (called `train_step` below) to do
 * Perform backward pass using loss to calculate gradients for the model parameters.
 * Optimize model parameters using gradients and optimizer.
 
-Finally, we choose to return the `loss` so we can utilize it for futher processing.
+Finally, we choose to return the `loss` so we can utilize it for further processing.
 
 You will also notice that we do not update the `lr_scheduler` and `progress_bar` in `train_step`. This is because Ignite automatically takes care of it as we will see later in this tutorial.
 
````
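A rough sketch of the `train_step` this hunk documents, assuming `model`, `optimizer`, and `device` are defined earlier in the tutorial (not the tutorial's verbatim code):

```python
def train_step(engine, batch):
    model.train()
    # Move the batch to the same device as the model.
    batch = {k: v.to(device) for k, v in batch.items()}
    outputs = model(**batch)
    loss = outputs.loss
    # Backward pass: compute gradients for the model parameters.
    loss.backward()
    # Optimize model parameters, then reset gradients for the next iteration.
    optimizer.step()
    optimizer.zero_grad()
    # Return the loss so it is available for further processing (engine.state.output).
    return loss.item()
```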

````diff
@@ -190,9 +190,9 @@ from ignite.engine import Engine
 trainer = Engine(train_step)
 ```
 
-The `lr_scheduler` we defined perviously was a handler.
+The `lr_scheduler` we defined previously was a handler.
 
-[Handlers](https://pytorch-ignite.ai/concepts/02-events-and-handlers/#handlers) can be any type of function (lambda functions, class methods, etc). On top of that, Ignite provides several built-in handlers to reduce redundant code. We attach these handlers to engine which is triggered at a specific [event](https://pytorch-ignite.ai/concepts/02-events-and-handlers/). These events can be anything like the start of an iteration or the end of an epoch. [Here](https://pytorch.org/ignite/generated/ignite.engine.events.Events.html#events) is a complete list of built-in events.
+[Handlers](https://pytorch-ignite.ai/concepts/02-events-and-handlers/#handlers) can be any type of function (lambda functions, class methods, etc.). On top of that, Ignite provides several built-in handlers to reduce redundant code. We attach these handlers to engine which is triggered at a specific [event](https://pytorch-ignite.ai/concepts/02-events-and-handlers/). These events can be anything like the start of an iteration or the end of an epoch. [Here](https://pytorch.org/ignite/generated/ignite.engine.events.Events.html#events) is a complete list of built-in events.
 
 Therefore, we will attach the `lr_scheduler` (handler) to the `trainer` (`engine`) via [`add_event_handler()`](https://pytorch.org/ignite/generated/ignite.engine.engine.Engine.html#ignite.engine.engine.Engine.add_event_handler) so it can be triggered at `Events.ITERATION_STARTED` (start of an iteration) automatically.
 
````
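The attachment described in the changed paragraph amounts to a one-liner along these lines, assuming the `trainer` and `lr_scheduler` objects from earlier and that `lr_scheduler` is usable as an Ignite handler:

```python
from ignite.engine import Events

# Step the learning-rate scheduler at the start of every training iteration.
trainer.add_event_handler(Events.ITERATION_STARTED, lr_scheduler)
```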

````diff
@@ -333,7 +333,7 @@ trainer.add_event_handler(Events.EPOCH_COMPLETED, log_validation_results)
 
 ## Early Stopping
 
-Now we'll setup a [`EarlyStopping`](https://pytorch.org/ignite/generated/ignite.handlers.early_stopping.EarlyStopping.html#earlystopping) handler for the training process. `EarlyStopping` requires a score_function that allows the user to define whatever criteria to stop trainig. In this case, if the loss of the validation set does not decrease in 2 epochs (`patience`), the training process will stop early.
+Now we'll setup a [`EarlyStopping`](https://pytorch.org/ignite/generated/ignite.handlers.early_stopping.EarlyStopping.html#earlystopping) handler for the training process. `EarlyStopping` requires a score_function that allows the user to define whatever criteria to stop training. In this case, if the loss of the validation set does not decrease in 2 epochs (`patience`), the training process will stop early.
 
 
 ```python
````
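The `EarlyStopping` setup the changed paragraph describes typically looks like the sketch below, assuming an `evaluator` engine whose `state.metrics` holds the validation loss under the key `"loss"` (the key name is an assumption):

```python
from ignite.engine import Events
from ignite.handlers import EarlyStopping

def score_function(engine):
    # EarlyStopping treats higher scores as better, so negate the validation loss.
    return -engine.state.metrics["loss"]

# Stop training if the score does not improve for 2 consecutive epochs.
handler = EarlyStopping(patience=2, score_function=score_function, trainer=trainer)
evaluator.add_event_handler(Events.COMPLETED, handler)
```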

0 commit comments
