Data Science Portfolio

This repository contains a collection of Jupyter notebooks showcasing projects in data analytics, machine learning, and neural networks, most of them created during the Deep Learning School (MIPT), fall 2024.

Projects

1. Game of Thrones Survival Prediction

View Notebook

  • Developed a classification model to predict character survival in Game of Thrones
  • Performed extensive exploratory data analysis with visualization
  • Created and analyzed new features through feature engineering
  • Implemented and compared multiple models:
    • Random Forest
    • Linear Regression
    • AdaBoost Classifier
    • KNeighborsClassifier
  • Conducted hyperparameter optimization to improve model performance
  • Tools used: Python, scikit-learn, pandas, seaborn
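
The snippet below is a minimal sketch of the classifier comparison with hyperparameter search described above; it is not taken from the notebook, and synthetic data stands in for the character dataset.

```python
# A minimal sketch of comparing several classifiers with a grid search;
# synthetic data is used as a stand-in for the character dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Candidate models and small illustrative parameter grids.
candidates = {
    "random_forest": (RandomForestClassifier(random_state=42),
                      {"n_estimators": [100, 300], "max_depth": [None, 10]}),
    "adaboost": (AdaBoostClassifier(random_state=42),
                 {"n_estimators": [50, 200], "learning_rate": [0.1, 1.0]}),
    "knn": (KNeighborsClassifier(),
            {"n_neighbors": [3, 5, 11]}),
}

for name, (model, grid) in candidates.items():
    search = GridSearchCV(model, grid, cv=5, scoring="accuracy")
    search.fit(X_train, y_train)
    print(f"{name}: best={search.best_params_}, "
          f"test accuracy={search.score(X_test, y_test):.3f}")
```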

2. Kaggle Competition Project

View Notebook

  • Participated in a Kaggle competition focused on a prediction task
  • Performed comprehensive data preprocessing:
    • Handled missing values
    • Optimized data types
    • Engineered new features
  • Implemented gradient boosting models using CatBoost and XGBoost
  • Applied data filtering techniques to improve model accuracy
  • Tools used: Python, CatBoost, XGBoost, pandas, numpy
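
As an illustration of the boosting setup described above, the sketch below trains CatBoost and XGBoost side by side on synthetic data; the regression framing, hyperparameters, and data are placeholders, not the competition settings.

```python
# Illustrative side-by-side training of CatBoost and XGBoost with early
# stopping on a validation split; synthetic data replaces the competition data.
from catboost import CatBoostRegressor
from xgboost import XGBRegressor
from sklearn.datasets import make_regression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=30, noise=10.0, random_state=42)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

cat = CatBoostRegressor(iterations=1000, learning_rate=0.05, depth=6, verbose=0)
cat.fit(X_train, y_train, eval_set=(X_val, y_val), early_stopping_rounds=50)

xgb = XGBRegressor(n_estimators=1000, learning_rate=0.05, max_depth=6,
                   early_stopping_rounds=50)
xgb.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)

print("CatBoost MAE:", mean_absolute_error(y_val, cat.predict(X_val)))
print("XGBoost  MAE:", mean_absolute_error(y_val, xgb.predict(X_val)))
```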

3. MNIST Classification with CNN and LeNet

View Notebook

  • Implemented various neural network architectures from scratch
  • Explored different activation functions and their impacts
  • Built a fully-connected neural network
  • Developed convolution operations for image processing
  • Implemented LeNet architecture
  • Compared performance across different approaches
  • Tools used: Python, PyTorch, NumPy
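
For reference, a compact LeNet-style network for 28x28 MNIST digits might look like the sketch below; layer sizes follow the classic LeNet-5 design adapted to 28x28 inputs, and the notebook's exact implementation may differ.

```python
# LeNet-style CNN for 28x28 grayscale digits (illustrative layer sizes).
import torch
import torch.nn as nn

class LeNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, padding=2),  # 28x28 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                            # -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),            # -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                            # -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = LeNet()
print(model(torch.randn(1, 1, 28, 28)).shape)  # torch.Size([1, 10])
```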

4. Transfer Learning and CNN Optimization

View Notebook

  • Applied transfer learning techniques
  • Implemented data augmentation strategies
  • Developed and optimized convolutional neural networks
  • Experimented with layer freezing techniques
  • Addressed class imbalance problems
  • Tools used: Python, PyTorch, scikit-learn
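
A minimal sketch of the layer-freezing idea mentioned above, assuming a torchvision ResNet-18 backbone and a 5-class head purely as placeholders (the notebook's actual backbone, augmentations, and dataset may differ):

```python
# Transfer learning sketch: freeze a pretrained backbone, swap in a new head,
# and optionally unfreeze the last block for fine-tuning.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Augmentations of the kind mentioned above (illustrative choices).
train_tfms = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone so only the new head trains at first.
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a fresh head for our classes.
num_classes = 5  # placeholder
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Optionally unfreeze the last residual block for fine-tuning.
for param in model.layer4.parameters():
    param.requires_grad = True

# Only parameters that still require gradients are passed to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```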

5. Exploring Autoencoder Architectures

View Notebook

  • Implemented and compared different types of autoencoders:
    • Basic autoencoder for facial image reconstruction
    • Variational Autoencoder (VAE) with MNIST dataset
    • Conditional VAE (CVAE) with MNIST dataset
  • Analyzed and visualized latent space dimensions
  • Performed sampling from trained models
  • Compared VAE and CVAE performance and characteristics
  • Tools used: Python, PyTorch, torchvision
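
The sketch below shows a minimal VAE of the kind described above, with the reparameterization trick and the standard reconstruction-plus-KL loss; architecture sizes and the latent dimension are illustrative.

```python
# Minimal VAE for 28x28 images (sizes are illustrative, not the notebook's).
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, latent_dim: int = 2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, 784), nn.Sigmoid(),
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar

def vae_loss(x, recon, mu, logvar):
    # Reconstruction term plus KL divergence to the standard normal prior.
    bce = nn.functional.binary_cross_entropy(recon, x.flatten(1), reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

model = VAE()
x = torch.rand(16, 1, 28, 28)            # stand-in batch of images in [0, 1]
recon, mu, logvar = model(x)
print(vae_loss(x, recon, mu, logvar))
```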

Skills Demonstrated

  • Machine Learning
    • Classification algorithms
    • Feature engineering
    • Model optimization
    • Gradient boosting
  • Deep Learning
    • Neural network architecture design
    • Convolutional Neural Networks
    • Transfer learning
    • Data augmentation
    • Autoencoders
      • Basic autoencoder architecture
      • Variational Autoencoders (VAE)
      • Conditional VAE
      • Latent space analysis
  • Data Analysis
    • Exploratory data analysis
    • Data visualization
    • Feature correlation analysis
    • Data preprocessing

Setup Instructions

To run these notebooks:

  1. Clone this repository
  2. Install required packages: pip install -r requirements.txt
  3. Open the notebooks in Jupyter or Google Colab
