Lesson - Introduction to PyTorch #6
Goal
To give participants a foundational understanding of PyTorch: its capabilities, and how it can be used to implement neural networks and process data, especially in the context of Retrogressive Thaw Slumps (RTS).
Breakdown
- Overview of Deep Learning Frameworks
- Brief mention of popular frameworks: TensorFlow, Keras, etc.
- Why PyTorch? Advantages and use cases
- PyTorch Basics
- Tensors: Understanding the basic data structure in PyTorch
- Operations with tensors: Reshaping, slicing, mathematical operations
- GPU vs. CPU: How PyTorch utilizes hardware acceleration
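The tensor topics above can be sketched in a few lines. This is a minimal illustration, not part of the lesson materials; the values are arbitrary:

```python
import torch

# Create a 1-D tensor of 12 floats: [0., 1., ..., 11.]
x = torch.arange(12, dtype=torch.float32)

# Reshaping: view the same data as a 3x4 matrix
m = x.reshape(3, 4)

# Slicing: second row, first two columns -> tensor([4., 5.])
row = m[1, :2]

# Mathematical operations are element-wise by default
doubled = m * 2
col_sums = m.sum(dim=0)  # sum down each column -> shape (4,)

# Hardware acceleration: move the tensor to a GPU only if one exists
device = "cuda" if torch.cuda.is_available() else "cpu"
m = m.to(device)
```

The same code runs unchanged on CPU or GPU; only the `device` string differs, which is the usual pattern for device-agnostic PyTorch code.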
- Data in PyTorch
- Dataset and DataLoader: Efficiently loading and batching data
- Transformations: Augmenting and preprocessing data
- Connecting the dots: How RTS data can be loaded and preprocessed in PyTorch
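To connect these dots concretely, here is a minimal sketch of a custom `Dataset` wrapped in a `DataLoader`. The `RTSPatchDataset` class name and the random stand-in tensors are hypothetical placeholders, not the actual RTS data pipeline:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class RTSPatchDataset(Dataset):
    """Hypothetical dataset of image patches with binary slump labels."""
    def __init__(self, images, labels, transform=None):
        self.images = images        # e.g. an (N, C, H, W) float tensor
        self.labels = labels        # e.g. an (N,) tensor of 0/1 labels
        self.transform = transform  # optional preprocessing callable

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        img = self.images[idx]
        if self.transform is not None:
            img = self.transform(img)  # augmentation / preprocessing hook
        return img, self.labels[idx]

# Stand-in data: 8 random 3-channel 16x16 patches (NOT real RTS imagery)
images = torch.randn(8, 3, 16, 16)
labels = torch.randint(0, 2, (8,))

dataset = RTSPatchDataset(images, labels)
loader = DataLoader(dataset, batch_size=4, shuffle=True)

# The loader yields shuffled mini-batches ready for a training loop
batch_imgs, batch_labels = next(iter(loader))
```

Real RTS data would only change `__init__` and `__getitem__` (e.g. reading GeoTIFF tiles from disk); the `DataLoader` batching and shuffling stay the same.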
- Model Building in PyTorch
- nn.Module: Creating custom neural network architectures
- Layers in PyTorch: Linear, Conv2D, RNN, etc.
- Activation functions: ReLU, Sigmoid, Tanh, etc.
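A minimal `nn.Module` sketch tying these pieces together. The architecture and sizes here are illustrative, not a model from the lesson:

```python
import torch
from torch import nn

class SmallNet(nn.Module):
    """Toy network: one hidden Linear layer with a ReLU activation."""
    def __init__(self, in_features=10, hidden=32, out_features=2):
        super().__init__()  # required before registering submodules
        self.fc1 = nn.Linear(in_features, hidden)
        self.act = nn.ReLU()
        self.fc2 = nn.Linear(hidden, out_features)

    def forward(self, x):
        # forward() defines the computation; parameters live in the layers
        return self.fc2(self.act(self.fc1(x)))

model = SmallNet()
out = model(torch.randn(4, 10))  # batch of 4 -> output shape (4, 2)
```

Swapping `nn.Linear` for `nn.Conv2d` or an RNN layer follows the same pattern: declare layers in `__init__`, compose them in `forward`.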
- Optimizers, Loss Functions, and Schedulers
- Loss functions: MSE, CrossEntropy, etc.
- Optimizers: Adam, SGD, etc.
- Learning rate schedulers: StepLR, ReduceLROnPlateau, etc.
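These three components plug together in a standard way; a sketch with assumed hyperparameters (learning rate, `step_size`, `gamma` are arbitrary examples):

```python
import torch
from torch import nn

model = nn.Linear(10, 2)  # stand-in model

criterion = nn.CrossEntropyLoss()                              # loss function
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)      # optimizer
scheduler = torch.optim.lr_scheduler.StepLR(
    optimizer, step_size=10, gamma=0.1)                        # lr schedule

# One optimization step on random stand-in data
inputs = torch.randn(4, 10)
targets = torch.randint(0, 2, (4,))

optimizer.zero_grad()                     # clear stale gradients
loss = criterion(model(inputs), targets)  # compute the loss
loss.backward()                           # backpropagate
optimizer.step()                          # update parameters
scheduler.step()                          # advance the lr schedule
```

With `StepLR(step_size=10, gamma=0.1)`, the learning rate is multiplied by 0.1 every 10 scheduler steps; `ReduceLROnPlateau` instead watches a validation metric and is stepped with that metric as an argument.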
- Training, Validation, and Testing Pipeline
- Forward and backward propagation in PyTorch
- Model evaluation: Accuracy, loss, and other metrics
- Overfitting: Early stopping, dropout, and other regularization techniques
- A simple example: Training, validating, and testing a small neural network on sample data
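A simple example in the spirit of the last bullet: a full train-then-evaluate loop on synthetic data. The task (classify 2-D points by the sign of the first coordinate) and all hyperparameters are made up for illustration:

```python
import torch
from torch import nn

torch.manual_seed(0)  # reproducibility for this toy run

# Toy data: label is 1 when the first coordinate is positive
X = torch.randn(64, 2)
y = (X[:, 0] > 0).long()

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.05)

model.train()
for epoch in range(50):
    optimizer.zero_grad()
    loss = criterion(model(X), y)  # forward propagation
    loss.backward()                # backward propagation
    optimizer.step()               # parameter update

model.eval()
with torch.no_grad():              # disable autograd for evaluation
    preds = model(X).argmax(dim=1)
    accuracy = (preds == y).float().mean().item()
```

A real pipeline would evaluate on held-out validation/test splits rather than the training data, and could add early stopping or dropout to control the overfitting discussed above.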