(WIP) add --skip-existing/WHITENOISE_SKIP_EXISTING option to run faster#295
Closed
PetrDlouhy wants to merge 1 commit into evansd:master
Conversation
merwok reviewed on Dec 10, 2021
```python
def compress(self, path):
    skip_existing = getattr(settings, "WHITENOISE_SKIP_EXISTING", False)
    if (self.skip_existing or skip_existing) and os.path.isfile(path + ".br") and os.path.isfile(path + ".gz"):
```
Shouldn’t the file checks be dependent on self.use_brotli / self.use_gzip?
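A minimal sketch of the check merwok suggests, requiring only the compressed variants that are actually enabled. The function name `should_skip` and the keyword arguments are invented for illustration; they mirror the PR's `use_brotli`/`use_gzip` attributes but are not WhiteNoise's actual API.

```python
import os


def should_skip(path, use_gzip=True, use_brotli=True):
    """Return True when every *enabled* compressed variant already exists."""
    required = []
    if use_gzip:
        required.append(path + ".gz")
    if use_brotli:
        required.append(path + ".br")
    # With both formats disabled there is nothing to compress, so never skip.
    return bool(required) and all(os.path.isfile(p) for p in required)
```

With this shape, a user who disabled Brotli is no longer forced to recompress just because no `.br` file exists.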
Author
@merwok You are absolutely right; that would not work if the user has one of the formats disabled. Thank you for your feedback. Although the more I think about it, the more I think it would be better to store the file hashes and compress only when a file's hash changes. That way it could be the default behavior and all users would benefit from it.
Author
Closing this in favor of #296
I have tried to solve the slow compression of a large number of files (#279) with a slightly different approach.

I use Docker to build my project. I can make use of Docker layer caching by first running `collectstatic` without the project files (with a slightly modified settings file). Then I copy all the project files and run `collectstatic` once again. If `collectstatic` doesn't recompress already existing files in the second run, that run is quick, and the first run stays cached as long as I don't change the settings file.
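The two-phase build described above might look roughly like this in a Dockerfile. The settings-module name and paths are invented for illustration, and the skip-existing behavior in the second run is what this PR proposes:

```dockerfile
FROM python:3.10-slim
WORKDIR /app

# Phase 1: install dependencies and compress the rarely-changing assets.
# This layer stays cached as long as these few files do not change.
COPY requirements.txt manage.py settings_static_only.py ./
RUN pip install -r requirements.txt \
    && python manage.py collectstatic --noinput --settings=settings_static_only

# Phase 2: copy the whole project and run collectstatic again. With a
# skip-existing option, only the newly copied files get compressed.
COPY . .
RUN python manage.py collectstatic --noinput
```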
I think a similar approach could also be used in other cases, except there would need to be some mechanism to determine which files have changed.
It could be enough to store the hashes of the original files in a separate file.
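A minimal sketch of that mechanism, assuming a JSON manifest stored alongside the static files. The manifest name and both helper functions are hypothetical, not part of WhiteNoise:

```python
import hashlib
import json

MANIFEST = "compressed-hashes.json"  # hypothetical manifest name


def file_hash(path):
    """Content hash of a single file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def files_to_compress(paths, manifest_path=MANIFEST):
    """Return only the files whose content changed since the last run,
    updating the manifest so the next run can skip the rest."""
    try:
        with open(manifest_path) as f:
            seen = json.load(f)
    except FileNotFoundError:
        seen = {}
    hashes = {p: file_hash(p) for p in paths}
    changed = [p for p in paths if seen.get(p) != hashes[p]]
    seen.update(hashes)
    with open(manifest_path, "w") as f:
        json.dump(seen, f)
    return changed
```

On the first run everything is returned for compression; on a repeat run with unchanged files the list is empty, which is the behavior that would make skipping safe by default.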
I would be glad for any thoughts on this approach. If it turns out to be the right one, I can turn this into a proper PR with documentation and the rest.