Modern PyTorch Guide
Autograd & Differentiation
Autograd Basics
Automatic differentiation fundamentals in PyTorch
Covers requires_grad, backward(), and .grad: the building blocks of automatic differentiation in PyTorch.
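A minimal sketch of how these three pieces fit together: mark a tensor with requires_grad so autograd records operations on it, call backward() on a scalar result, and read the accumulated gradient from .grad.

```python
import torch

# Mark x so autograd records every operation applied to it
x = torch.tensor([2.0, 3.0], requires_grad=True)

# Build a small computation: y = sum(x**2), a scalar
y = (x ** 2).sum()

# Backpropagate: populates x.grad with dy/dx = 2*x
y.backward()

print(x.grad)  # tensor([4., 6.])
```

Note that backward() can only be called on a scalar without arguments, and gradients accumulate into .grad across calls, which is why training loops typically zero them between steps.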