Commit 5662265

Fix module name in docstrings from modules -> deq

1 parent 8fae5ae

File tree: 1 file changed (+4 −4 lines)

deep_implicit_attention/attention.py (+4 −4)
@@ -27,7 +27,7 @@ class DEQMLPMixerAttention(_DEQModule):
     residual connection in the explicit MLP-Mixer architecture.
 
     Note:
-        To use this module, wrap it in `modules.DEQFixedPoint`.
+        To use this module, wrap it in `deq.DEQFixedPoint`.
 
     Paper:
         https://arxiv.org/abs/2105.02723
@@ -83,7 +83,7 @@ class DEQVanillaSoftmaxAttention(_DEQModule):
     into the feed-forward self-correction term.
 
     Note:
-        To use this module, wrap it in `modules.DEQFixedPoint`.
+        To use this module, wrap it in `deq.DEQFixedPoint`.
 
     Paper:
         https://arxiv.org/abs/1706.03762
@@ -179,7 +179,7 @@ class DEQMeanFieldAttention(_DEQModule):
     correction term. This all looks a lot like a transformer.
 
     Note:
-        To use this module, wrap it in `modules.DEQFixedPoint`.
+        To use this module, wrap it in `deq.DEQFixedPoint`.
 
     Args:
         num_spins (int):
@@ -317,7 +317,7 @@ class DEQAdaTAPMeanFieldAttention(_DEQModule):
     first and second moments assuming a Gaussian cavity distribution.
 
    Note:
-        To use this module, wrap it in `modules.DEQFixedPoint`.
+        To use this module, wrap it in `deq.DEQFixedPoint`.
 
     Args:
         num_spins (int):
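For context, each corrected docstring describes the same usage pattern: the attention module is not called directly but wrapped in `deq.DEQFixedPoint`, which iterates the wrapped module to a fixed point. Below is a minimal sketch of that pattern. Only `num_spins` and the module/class names are confirmed by the diff above; the `anderson` solver import, the `DEQFixedPoint(module, solver)` constructor shape, and the `dim` keyword are assumptions for illustration.

import torch

from deep_implicit_attention.attention import DEQMeanFieldAttention
from deep_implicit_attention.deq import DEQFixedPoint  # module name fixed by this commit
from deep_implicit_attention.solvers import anderson  # assumed fixed-point solver helper

num_spins, dim = 16, 64  # hypothetical sizes: 16 tokens/spins, 64 features each

# Wrap the implicit attention module as the docstrings instruct. The exact
# DEQFixedPoint constructor arguments (solver choice, tolerances) are assumptions.
model = DEQFixedPoint(
    DEQMeanFieldAttention(num_spins=num_spins, dim=dim),
    anderson,
)

x = torch.randn(1, num_spins, dim)
out = model(x)  # drives the wrapped module to its fixed point
print(out.shape)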
