Cheatsheet #1600
Conversation
Co-authored-by: José Valim <[email protected]>
…ir-nx#1582) Co-authored-by: Paulo Valente <[email protected]>
Co-authored-by: Paulo Valente <[email protected]>
Co-authored-by: José Valim <[email protected]>
Cheatsheet remote -> origin
@@ -0,0 +1,153 @@
# Broadcasts
I think this is a stray change.
You should probably try to reset your main branch, as I think it has diverged from Nx's.
Nx.iota({5}, axis: 0) |> Nx.multiply(2) # [0, 2, 4, 6, 8]

# Linearly Spaced Values
Nx.iota({5}) |> Nx.divide(4) # [0.0, 0.25, 0.5, 0.75, 1.0]
I think that instead of showing comments like this, we should use the `iex> ...` notation with results on the line below, like it's done in docs/doctests.
And do the equivalent for Python.
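For reference, the doctest-style rendering suggested here could look roughly like the sketch below for the "Linearly Spaced Values" entry; the exact tensor type and inspect output shown are assumptions and may differ by Nx version or backend:

```elixir
# Hypothetical doctest-style version of the entry above;
# the f32[5] type in the printed output is an assumption.
iex> Nx.iota({5}) |> Nx.divide(4)
#Nx.Tensor<
  f32[5]
  [0.0, 0.25, 0.5, 0.75, 1.0]
>
```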
Looking great overall! Pretty exciting.
Adds a first cheatsheet focused on mapping NumPy to Nx.