
prepare release #8

Merged

CeadeS merged 1 commit into main from dev on Mar 9, 2025

Conversation


@CeadeS CeadeS commented Mar 9, 2025

Bio Transformations v0.0.5 Release Notes

We're excited to announce the release of Bio Transformations v0.0.5, featuring substantial improvements to the core components of the library. This release introduces several new biologically inspired mechanisms and enhances existing ones to provide a more powerful and flexible toolkit for neural network modifications.

Major New Features

New Distribution Strategies for Learning Rates

  • Multiple Distribution Types: Added support for 10 different probability distributions for fuzzy learning rates:
    • BASELINE: No variability (all parameters = 1.0)
    • UNIFORM: Uniform distribution around 1.0
    • NORMAL: Normal distribution centered at 1.0
    • LOGNORMAL: Log-normal with mean 1.0 (skewed, all positive values)
    • GAMMA: Gamma distribution (positive, skewed)
    • BETA: Beta distribution scaled to [1-nu, 1+nu]
    • LAYER_ADAPTIVE: Layer-dependent variability (decreases with depth)
    • WEIGHT_ADAPTIVE: Weight-dependent scaling (smaller weights get more variability)
    • TEMPORAL: Evolves over time
    • ACTIVITY: Based on neuron activation patterns
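A minimal sketch of what a few of these strategies mean in practice — drawing per-parameter learning-rate multipliers around 1.0. This is a toy stand-in for illustration, not the library's actual implementation; the function name, `nu` parameter, and defaults are made up here.

```python
import random

def fuzzy_lr_multipliers(n, strategy="UNIFORM", nu=0.2, rng=None):
    """Toy sketch: sample n learning-rate multipliers around 1.0
    for a few of the distribution strategies listed above."""
    rng = rng or random.Random(0)
    if strategy == "BASELINE":
        return [1.0] * n  # no variability
    if strategy == "UNIFORM":
        return [rng.uniform(1 - nu, 1 + nu) for _ in range(n)]
    if strategy == "NORMAL":
        return [rng.gauss(1.0, nu) for _ in range(n)]
    if strategy == "LOGNORMAL":
        # skewed and strictly positive, centered near 1.0
        return [rng.lognormvariate(0.0, nu) for _ in range(n)]
    raise ValueError(f"unknown strategy: {strategy}")
```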

Activity-Dependent Learning

  • Added support for activity-dependent learning rates that adjust based on neuron activation patterns
  • Neurons that are more active become more stable (less variable learning rates)
  • Implemented activation tracking for both Linear and Conv2d layers
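The core idea — more active neurons get more stable (less variable) learning rates — can be sketched as a scaling rule over tracked activation counts. The scaling formula below is a hypothetical illustration, not the library's exact rule.

```python
def activity_scaled_nu(activation_counts, base_nu=0.2):
    """Toy sketch: map per-neuron activation counts to a variability
    coefficient. High activity -> low variability (more stable)."""
    max_count = max(activation_counts) or 1
    # normalize activity to [0, 1] and invert it
    return [base_nu * (1.0 - c / max_count) for c in activation_counts]
```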

Dynamic Learning Rate Evolution

  • Added update_fuzzy_learning_rates method to allow learning rates to evolve during training
  • Temporal distribution gradually adapts learning rates throughout training
  • Weight-adaptive distribution scales variability based on weight magnitudes
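The temporal behavior can be pictured as a periodic update that pulls the multipliers back toward 1.0, so variability shrinks as training progresses. The decay rule below is a hypothetical sketch that mirrors the `fuzzy_lr_update_freq` and `fuzzy_lr_decay` parameters by name only.

```python
def temporal_update(multipliers, step, update_freq=100, decay=0.99):
    """Toy sketch of temporal evolution: every `update_freq` steps,
    shrink each multiplier's distance from 1.0 by `decay`."""
    if step % update_freq != 0:
        return multipliers  # no update on off-schedule steps
    return [1.0 + (m - 1.0) * decay for m in multipliers]
```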

Improvements to Existing Features

Enhanced Weight Rejuvenation

  • Improved numerical stability and edge case handling in weight rejuvenation
  • Better handling of extreme values and NaN weights
  • Preserved original implementation as rejuvenate_weights_old for backward compatibility
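To make the edge-case handling concrete, a rejuvenation pass of this kind might re-initialize weights that are NaN, infinite, or vanishingly small while leaving healthy weights untouched. This is an illustrative toy, not the library's `rejuvenate_weights` implementation; the threshold and init scale are invented.

```python
import math
import random

def rejuvenate_toy(weights, threshold=1e-6, init_scale=0.01, rng=None):
    """Toy sketch: replace NaN, infinite, or near-zero weights with
    small random values; pass healthy weights through unchanged."""
    rng = rng or random.Random(0)
    out = []
    for w in weights:
        if math.isnan(w) or math.isinf(w) or abs(w) < threshold:
            out.append(rng.uniform(-init_scale, init_scale))
        else:
            out.append(w)
    return out
```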

Refined Weight Splitting

  • Better handling of output layers with automatic marking of the last layer
  • Renamed tokens for clarity (last_module_token instead of weight_splitting_skip)
  • Improved error messages for invalid configurations

Optimized Dale's Principle Implementation

  • Enhanced error handling for Dale's principle enforcement
  • Better compatibility with output layers

Code Quality and Documentation

  • Extensively documented code with detailed explanations of biological motivations
  • Added comprehensive docstrings for all methods and classes
  • Improved parameter validation and error messaging throughout
  • Enhanced code organization and naming conventions

Configuration Enhancements

  • Added bounds for fuzzy learning rates (fuzzy_lr_min and fuzzy_lr_max)
  • Added parameters for controlling temporal evolution (fuzzy_lr_update_freq, fuzzy_lr_decay)
  • New parameters for activity-dependent learning
  • Added layer indexing for layer-adaptive distributions
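The effect of the new bounds is simply to clip sampled multipliers into a safe range. A one-line sketch (the default bounds here are made up, not the library's defaults):

```python
def clamp_fuzzy_lr(multipliers, fuzzy_lr_min=0.5, fuzzy_lr_max=1.5):
    """Toy sketch: bound each sampled learning-rate multiplier
    to [fuzzy_lr_min, fuzzy_lr_max]."""
    return [min(max(m, fuzzy_lr_min), fuzzy_lr_max) for m in multipliers]
```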

API Changes

  • Added new exposed functions:
    • update_fuzzy_learning_rates: Updates learning rates during training
    • rejuvenate_weights_old: Legacy implementation preserved for backward compatibility
  • Enhanced BioConverter.from_dict for easier configuration from dictionaries
  • Improved update_config method for updating configuration parameters
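The kind of convenience `BioConverter.from_dict` provides can be sketched with a small config class built from a plain dictionary. The class name, field subset, and defaults below are hypothetical; only the parameter names come from these notes.

```python
from dataclasses import dataclass, fields

@dataclass
class FuzzyLRConfig:
    # hypothetical subset of the configuration keys named in these notes
    fuzzy_lr_min: float = 0.5
    fuzzy_lr_max: float = 1.5
    fuzzy_lr_update_freq: int = 100
    fuzzy_lr_decay: float = 0.99

    @classmethod
    def from_dict(cls, d):
        """Build a config from a plain dict, ignoring unknown keys."""
        known = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in d.items() if k in known})
```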

Documentation Improvements

  • Added advanced usage guide with comprehensive examples
  • Detailed tutorials for each distribution strategy
  • Performance optimization tips
  • Troubleshooting common issues

This release represents a significant advancement in bio-inspired neural network modifications, bringing more biological realism and flexibility to artificial neural networks. We encourage users to explore the new distribution strategies and dynamic learning rate capabilities.

For detailed usage instructions, please refer to our updated documentation including the tutorials and advanced usage guides.

@codecov-commenter

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 100.00%. Comparing base (7f4dd51) to head (2e7e6cb).
Report is 2 commits behind head on main.


Additional details and impacted files
@@            Coverage Diff             @@
##              main        #8    +/-   ##
==========================================
  Coverage   100.00%   100.00%            
==========================================
  Files            6         6            
  Lines          590      1221   +631     
==========================================
+ Hits           590      1221   +631     

☔ View full report in Codecov by Sentry.

@CeadeS CeadeS merged commit 12415cf into main Mar 9, 2025
2 checks passed