Description
Reasoning: I am running Warp on tilt series collected dose-symmetrically on lamellae with a pretilt of -11º (due to the milling angle, -11º is the "flat" path through the sample). Because of the relatively large pretilt, I do all my processing in IMOD with a +11º tilt angle offset to get better tilt series alignments - the effect on output quality is surprisingly significant. However, when I import the aligned tilt series into Warp, the tilt angle offset is lost and the resulting tomogram quality is relatively poor. Beyond the quality issue, I also need a much thicker tomogram to fit the lamella, and that extra empty space seems to make denoising perform much worse. I expect this to be a consistent issue for users doing tomography on FIB-processed samples, so I suspect it's more than just a selfish feature request!
As far as I can tell, the math for this happens here:
`SortedAngle[i].TiltAngle += float.Parse(Parts[3], CultureInfo.InvariantCulture);`
Could that line instead read the tilt angle column (rather than deltilt) and assign it directly instead of adding it on? The tilt angle column appears to incorporate the tilt angle offset, but it is only given to the tenths place. Alternatively, perhaps the angle offset could be read from one of the numerous logs it is written to (just not taSolutions!).
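To make the proposal concrete, here is a rough sketch of both options. The column indices, field names, and the assumption that `Parts[2]` is the solved tilt angle while `Parts[3]` is deltilt are mine, based only on the line quoted above, and would need to be checked against Warp's actual parsing code:

```csharp
// Sketch only - not Warp's actual code.
// Assumptions (unverified):
//  - Parts[] holds the whitespace-split columns of one alignment-log row,
//    with Parts[2] = solved (absolute) tilt angle and Parts[3] = deltilt.
//  - entry.TiltAngle already holds the nominal stage tilt before this step.
using System.Globalization;

class TiltEntry { public float TiltAngle; }

static class TiltImportSketch
{
    // Option A: overwrite with the solved tilt angle column instead of adding deltilt.
    // This would carry the offset applied during IMOD alignment, but only at the
    // precision the log prints.
    public static void ApplySolvedTilt(TiltEntry entry, string[] parts)
    {
        entry.TiltAngle = float.Parse(parts[2], CultureInfo.InvariantCulture);
    }

    // Option B: keep the existing deltilt addition, but also add a constant angle
    // offset (e.g. the +11º used during alignment), read from a log or supplied
    // by the user.
    public static void ApplyDeltiltPlusOffset(TiltEntry entry, string[] parts, float angleOffset)
    {
        entry.TiltAngle += float.Parse(parts[3], CultureInfo.InvariantCulture) + angleOffset;
    }
}
```

Option A is the smaller change but loses precision; Option B keeps the current behaviour and only needs a reliable source for the offset value.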
I'd be happy to put together a pull request if you think that one of these strategies is appropriate.