
Adding final version of entropy acquisition with test cases #14

Open
rishabh-mondal wants to merge 3 commits into sustainability-lab:main from rishabh-mondal:entropy_acquisition_al

Conversation


@rishabh-mondal rishabh-mondal commented Nov 2, 2023

Implemented the entropy acquisition and test cases in commit 2944d64

Files added:

  1. astra/torch/al/acquisitions/entropy.py: contains the implementation of the entropy acquisition
  2. astra/torch/al/strategies/deterministic.py: modified to support the test cases
  3. astra/torch/al/strategies/ensemble.py: modified to support the test cases
  4. astra/torch/al/strategies/mc.py: modified to support the test cases

Passes all test cases, including the pre-existing ones, as of commit 2944d64
Explanations:

class EntropyAcquisition(MCAcquisition, EnsembleAcquisition, DeterministicAcquisition):
This line defines a new Python class called EntropyAcquisition. This class inherits from three parent classes: MCAcquisition, EnsembleAcquisition, and DeterministicAcquisition.
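The multiple-inheritance pattern can be sketched as follows. The stub base classes below stand in for the strategy-specific acquisition interfaces in astra/torch/al (the names are taken from the PR; the empty bodies are an assumption for illustration):

```python
# Stub base classes standing in for astra.torch.al's strategy-specific
# acquisition interfaces (names from the PR; empty bodies are placeholders).
class MCAcquisition:
    pass

class EnsembleAcquisition:
    pass

class DeterministicAcquisition:
    pass

class EntropyAcquisition(MCAcquisition, EnsembleAcquisition, DeterministicAcquisition):
    """A single acquisition usable with MC, ensemble, and deterministic strategies."""
    pass

acq = EntropyAcquisition()
print(isinstance(acq, MCAcquisition))  # True: works wherever an MC acquisition is expected
```

Inheriting from all three strategy interfaces lets one EntropyAcquisition class be passed to any of the three strategies, which is why a single acquire_scores method handles both 3-D and 2-D logits below.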

def acquire_scores(self, logits: torch.Tensor):
    if logits.dim() == 3:
        log_probs = F.log_softmax(logits, dim=2)
If the logits tensor has three dimensions, this line computes the log probabilities (log-softmax) along the last dimension (dim=2) using PyTorch's F.log_softmax function and stores the result in the log_probs variable.

entropy = -torch.sum(torch.exp(log_probs) * log_probs, dim=2)
This line calculates the entropy of the logits.
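Written out, the quantity computed on each sample is the Shannon entropy of the softmax class distribution (here $p_{ic}$ denotes the softmax probability of class $c$ for pool example $i$):

```latex
H(p_i) = -\sum_{c=1}^{C} p_{ic} \log p_{ic},
\qquad p_{ic} = \operatorname{softmax}(z_i)_c ,
```

which matches the code since $\exp(\log p) \cdot \log p = p \log p$.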
entropy = torch.sum(entropy, dim=0)
This line sums the calculated entropies over the sample dimension (dim=0), collapsing the MC/ensemble samples into a single entropy score per pool example (a 1-D tensor).

    else:
        log_probs = F.log_softmax(logits, dim=1)
        entropy = -torch.sum(torch.exp(log_probs) * log_probs, dim=1)
    return entropy
In the deterministic case (2-D logits), the method simply returns the per-example predictive entropies.
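Putting the pieces above together, the method can be sketched as a standalone function (the function body follows the PR's description; the shape conventions, (n_samples, pool_size, n_classes) for 3-D and (pool_size, n_classes) for 2-D, are an assumption):

```python
import torch
import torch.nn.functional as F

def acquire_scores(logits: torch.Tensor) -> torch.Tensor:
    """Entropy acquisition scores, per the PR's walkthrough.

    3-D logits (assumed shape (n_samples, pool_size, n_classes)): per-sample
    entropies are summed over the sample dimension. 2-D logits
    (pool_size, n_classes): plain predictive entropy per pool example.
    """
    if logits.dim() == 3:
        log_probs = F.log_softmax(logits, dim=2)
        entropy = -torch.sum(torch.exp(log_probs) * log_probs, dim=2)
        entropy = torch.sum(entropy, dim=0)  # shape: (pool_size,)
    else:
        log_probs = F.log_softmax(logits, dim=1)
        entropy = -torch.sum(torch.exp(log_probs) * log_probs, dim=1)
    return entropy

# Usage: uniform logits give the maximal entropy, log(n_classes).
uniform = torch.zeros(5, 3)  # 5 pool examples, 3 classes, deterministic case
scores = acquire_scores(uniform)
print(torch.allclose(scores, torch.full((5,), torch.log(torch.tensor(3.0)))))  # True
```

In an active-learning loop, the pool examples with the highest scores (most uncertain predictions) would be selected for labeling.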

