Commit ecc9eb3 ("tweaks"), 1 parent: 918fc0b

File tree

1 file changed (+11 −4 lines)


docs/src/models/quickstart.md

+11 −4
````diff
@@ -74,14 +74,21 @@ Since then things have developed a little.
 
 Some things to notice in this example are:
 
-* The batch dimension of data is always the last one. Thus a `2×1000 Matrix` is a thousand observations, each a column of length 2.
-
-* Flux defaults to `Float32`, but most of Julia to `Float64`.
+* The batch dimension of data is always the last one. Thus a `2×1000 Matrix` is a thousand observations, each a column of length 2. Flux defaults to `Float32`, but most of Julia to `Float64`.
 
 * The `model` can be called like a function, `y = model(x)`. Each layer like [`Dense`](@ref Flux.Dense) is an ordinary `struct`, which encapsulates some arrays of parameters (and possibly other state, as for [`BatchNorm`](@ref Flux.BatchNorm)).
 
 * But the model does not contain the loss function, nor the optimisation rule. The [`Adam`](@ref Flux.Adam) object stores between iterations the momenta it needs. And [`Flux.crossentropy`](@ref Flux.Losses.crossentropy) is an ordinary function.
 
 * The `do` block creates an anonymous function, as the first argument of `gradient`. Anything executed within this is differentiated.
 
-Instead of calling [`gradient`](@ref Zygote.gradient) and [`update!`](@ref Flux.update!) separately, there is a convenience function [`train!`](@ref Flux.train!) which could replace the `for (x, y) in loader` loop. However, to do anything extra (like logging the loss) an explicit loop is usually clearest.
+Instead of calling [`gradient`](@ref Zygote.gradient) and [`update!`](@ref Flux.update!) separately, there is a convenience function [`train!`](@ref Flux.train!). If we didn't want anything extra (like logging the loss), we could replace the training loop with the following:
+
+```julia
+for epoch in 1:1_000
+    train!(pars, loader, opt) do x, y
+        y_hat = model(x)
+        Flux.crossentropy(y_hat, y)
+    end
+end
+```
````
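The old wording removed in this diff refers to an explicit `for (x, y) in loader` loop calling `gradient` and `update!` separately. For context, a rough sketch of that loop in Flux's implicit-parameter style (assuming the quickstart has already defined `model`, `loader`, `opt`, and `pars = Flux.params(model)`; names follow the diff but the exact loop body is an assumption, not the committed text):

```julia
using Flux

# Explicit training loop that `train!` condenses.
# Assumes `model`, `loader`, `opt`, and `pars = Flux.params(model)`
# are set up as in the quickstart.
for epoch in 1:1_000
    for (x, y) in loader
        # `gradient` differentiates the do-block with respect to `pars`.
        grads = gradient(pars) do
            y_hat = model(x)
            Flux.crossentropy(y_hat, y)
        end
        # Apply the optimiser's update rule to each parameter array.
        Flux.update!(opt, pars, grads)
    end
end
```

Writing the loop out like this is what the doc means by "to do anything extra (like logging the loss) an explicit loop is usually clearest": the loss value is available inside the loop to print or record.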
