Merged
3 changes: 3 additions & 0 deletions README.md
@@ -89,6 +89,9 @@ This index is used in the output to specify the selected position-momentum pairs
The following columns come in pairs for each excited state: the first column contains the excitation energy while the second column bears the magnitudes of the transition dipole moment. Note that the header is not mandatory in the input file.
We will use this input file in the following examples and refer to it as `input_file.dat`.

> [!NOTE]
> **Laser field polarization:** Both PDA and PDAW were derived assuming a linearly polarized laser field. Although **PROMDENS** does not take the polarization vector $`\vec{E}_0`$ as a parameter, it can still account for the laser field polarization implicitly through the user input: instead of the magnitude $`|\vec{\mu}_{0i}|`$ suggested above, provide the projection of the transition dipole moments onto the laser field polarization, $`|\vec{\mu}_{0i} \cdot \vec{E}_0|`$, in the input file. In this manner, the code accounts for a linearly polarized laser field. Note that if you provide $`|\vec{\mu}_{0i} \cdot \vec{E}_0|`$, the absorption spectrum calculated by **PROMDENS** corresponds to an effective absorption spectrum seen by the polarized laser pulse, not to the standard absorption spectrum measured in experiments.
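For instance, the projected magnitudes can be precomputed from the Cartesian TDM components before writing the input file. A minimal sketch (assuming NumPy; the arrays `tdm_xyz` and `e0` are hypothetical stand-ins, not part of **PROMDENS**):

```python
import numpy as np

# Hypothetical Cartesian transition dipole moments (a.u.), one row per excited state
tdm_xyz = np.array([[0.3, 0.1, 0.0],
                    [0.0, 0.5, 0.2]])

# Linearly polarized field along z; E_0 must be a unit vector
e0 = np.array([0.0, 0.0, 1.0])

# Projections |mu_0i . E_0| to put in the input file instead of |mu_0i|
mu_proj = np.abs(tdm_xyz @ e0)

# Plain magnitudes |mu_0i| for comparison (the default input suggested above)
mu_mag = np.linalg.norm(tdm_xyz, axis=1)
```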
**Collaborator:** Interesting. I guess in the future it would be nice to extend the code to handle this automatically?

**Collaborator (author):** Yes, I didn't do it now because it would require all the components of the TDM and setting up the field polarization. This is a bit simpler for now. But I agree it should be added in the future.


## Usage
The code requires information about the method (PDA or PDAW, `--method`), the number of excited states to consider (`--nstates`),
the number of initial conditions to be generated (`--npsamples`), and the characteristics of the laser pulse, such as the envelope type
29 changes: 17 additions & 12 deletions src/promdens/promdens.py
@@ -537,9 +537,8 @@ def windowing(self, output_fname='pdaw.dat'):
to an output file.
"""

print("* Generating weights and convolution for windowing.")
print("* Generating weights and convolution for PDAW.")

# determine and print convolution function
# determine and print convolution function
conv_eq = {
'gauss' : "I(t) = exp(-4*ln(2)*(t-t0)^2/fwhm^2)",
@@ -551,25 +550,31 @@ def windowing(self, output_fname='pdaw.dat'):
f" - Parameters: fwhm = {self.pulse.fwhm/self.fstoau:.3f} fs, "
f"t0 = {self.pulse.t0/self.fstoau:.3f} fs")

print(" - Calculating normalized weights:")
print(" - Calculating weights")
# creating a field for weights
self.weights = np.zeros((self.nstates, self.nsamples)) # sample index, weights in different states

# generating weights for all states and samples
for state in range(0, self.nstates):
self.weights[state] = self.tdm[state]**2*np.interp(self.de[state], self.field_ft_omega, self.field_ft)**2

# normalization of all the weights
print(" - Normalization of the weights (sum of all weights equals 1)")
**Collaborator:** This is perhaps not needed? Not sure

Suggested change (removing this line):

    print(" - Normalization of the weights (sum of all weights equals 1)")

**Collaborator (author):** I would rather keep it. The normalization does not come from the equations. However, the weights can be tiny numbers, and you would anyway need to normalize them when processing the trajectories. Hence, I do it directly in the code, but I believe it is safer to mention that.
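The tiny-weights point can be illustrated with a short sketch (assuming NumPy; the raw weight values are made up for illustration). It mirrors what the code does: normalize globally so all weights sum to 1, then report each state's contribution and how many initial conditions (ICs) cover 90% of that state's weight:

```python
import numpy as np

# Hypothetical raw PDAW weights: 2 states x 5 samples, tiny absolute values
weights = np.array([[1e-9, 4e-9, 2e-9, 1e-10, 5e-10],
                    [8e-9, 1e-9, 3e-10, 2e-9, 6e-10]])

# Global normalization: the sum over all states and samples equals 1
weights /= np.sum(weights)

for state in range(weights.shape[0]):
    state_sum = np.sum(weights[state])
    # Per-state weights sorted from largest to smallest
    sorted_w = np.sort(weights[state] / state_sum)[::-1]
    # Number of ICs needed to cover 90% of this state's weight
    n90 = np.sum(np.cumsum(sorted_w) < 0.9) + 1
    print(f"S{state + 1}: contribution {state_sum * 100:.1f} %, 90% covered by {n90} ICs")
```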

self.weights /= np.sum(self.weights)

# analysis of the weights
print(" - Analysis of the normalized weights for each state:")
for state in range(0, self.nstates):
# sorting from the largest weight to smallest
state_sum = np.sum(self.weights[state, :])
sorted = np.sort(self.weights[state, :]/state_sum)[::-1]
# analysis
sorted = np.sort(self.weights[state, :]/np.sum(self.weights[state, :]))[
::-1] # sorting from the largest weight to smallest
print(
f" > State {state + 1} - analysis of normalized weights (weights/sum of weights on state {state + 1}):\n"
f" > State {state + 1}:\n"
f" - State contribution: {state_sum * 100:.3f} %\n"
f" - Largest weight: {np.max(self.weights[state, :]):.3e}\n"
f" - Number of ICs making up 90% of S{state + 1} weights: {np.sum(np.cumsum(sorted) < 0.9) + 1:d}\n"
f" - Number of ICs with weights bigger than 0.001: {np.sum(self.weights[state, :] > 0.001):d}")

# normalization of weights at given state
self.weights /= np.sum(self.weights)
f" - Number of ICs with weights > 0.001: {np.sum(self.weights[state, :] > 0.001):d}\n"
f" - Number of ICs making up 90% of S{state + 1} weights: {np.sum(np.cumsum(sorted) < 0.9) + 1:d}")

# creating a variable for printing with first column being sample indexes
arr_print = np.zeros((self.nstates + 1, self.nsamples)) # index, weights in different states
@@ -584,7 +589,7 @@ def windowing(self, output_fname='pdaw.dat'):
f"index {weights_str}")
np.savetxt(output_fname, arr_print.T, fmt=['%8d'] + ['%16.5e']*self.nstates, header=header)

print(f" - Weights saved to file '{output_fname}'")
print(f" - Weights (normalized) saved to file '{output_fname}'")


def plot_spectrum(ics: InitialConditions) -> None:
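Given the `np.savetxt` call above, the saved file has one row per sample: the sample index in the first column and one normalized weight per state in the remaining columns. A sketch of reading it back (assuming NumPy; the inline `StringIO` and its column labels are a stand-in for a real `pdaw.dat` with two states):

```python
import numpy as np
from io import StringIO

# Stand-in for pdaw.dat: '#' header, index column, one weight column per state
pdaw = StringIO("# index  weight_S1  weight_S2\n"
                "       1  1.00000e-01  3.00000e-01\n"
                "       2  2.00000e-01  4.00000e-01\n")

data = np.loadtxt(pdaw)            # '#' comment lines are skipped by default
indices = data[:, 0].astype(int)   # sample indices
weights = data[:, 1:]              # normalized weights, one column per state
```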