Changes `Dropout.deterministic` and `BatchNorm.use_running_average` to be `None` by default; users now have to explicitly provide them by either:
1. Passing them to the constructor, e.g.:
   self.bn = nnx.BatchNorm(..., use_running_average=False)
2. Passing them to `__call__`:
   self.dropout(x, deterministic=False)
3. Using `nnx.view` to create a view of the model with specific values:
   train_model = nnx.view(model, deterministic=False, use_running_average=False)
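The precedence between options 1 and 2 can be sketched with a minimal stand-in class (hypothetical, not the real `flax.nnx.Dropout`; here the non-deterministic branch just rescales instead of randomly masking, to keep the example deterministic):

```python
class Dropout:
    """Stand-in sketching the explicit-flag pattern: `deterministic`
    defaults to None and must be supplied in __init__ or __call__."""

    def __init__(self, rate, deterministic=None):
        self.rate = rate
        self.deterministic = deterministic

    def __call__(self, x, deterministic=None):
        # A call-time argument overrides the constructor value.
        if deterministic is None:
            deterministic = self.deterministic
        if deterministic is None:
            raise ValueError(
                "`deterministic` must be set in the constructor or at call time"
            )
        if deterministic:
            return x  # eval mode: identity
        # train mode: rescale (real dropout would also randomly mask)
        return [v / (1 - self.rate) for v in x]

drop = Dropout(0.5, deterministic=True)
print(drop([1.0, 2.0]))                       # [1.0, 2.0]
print(drop([1.0, 2.0], deterministic=False))  # [2.0, 4.0]
```

Leaving the flag unset in both places raises an error rather than silently picking a mode, which is the behavior change this commit introduces.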
PiperOrigin-RevId: 877557940
@@ -90,7 +90,7 @@ Specifically, this will use the RngStream `rngs.params` for weight initialization

 The `nnx.Dropout` module also requires a random state, but it requires this state at *call* time rather than initialization. Once again, we can pass it random state using the `rngs` keyword argument.

 ```{code-cell} ipython3
-dropout = nnx.Dropout(0.5)
+dropout = nnx.Dropout(0.5, deterministic=False)
 ```

 ```{code-cell} ipython3
@@ -159,7 +159,7 @@ Say you want to train a model that uses dropout on a batch of data. You don't wa