Conversation
I unfortunately can't provide further reliable evidence to confirm the <1% claim. I did manage to measure the overhead (~8 s) that the Python code should add to a ~1000 s solve from the baars:20 paper, but that's on a different machine. I think that's too close to 1% to be deemed reliable, especially since I have no way of doing a full run. I'll change the claim to 'small' and adjust the wording.
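For what it's worth, the arithmetic behind that judgement call is simple. The numbers below are the approximate figures mentioned in the comment above, not authoritative measurements:

```python
# Back-of-the-envelope check of the overhead claim under discussion.
# Both numbers are approximate, taken from the conversation above.
overhead_s = 8.0     # measured Python-layer overhead, roughly
solve_s = 1000.0     # duration of a full solve, roughly (baars:20)
fraction = overhead_s / solve_s
print(f"{fraction:.1%}")  # 0.8% -- uncomfortably close to the 1% threshold
```

With measurement noise from running on a different machine, 0.8% vs. 1% is indeed too close to call.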
OK, thanks. If you can adjust the wording and also add a reference to the baars:20 paper next to it, that would be fine.
[force-pushed 8e0488e to 146c253]
@jatkinson1000 I pushed the additional changes. I also added some explanation of how we are actually using TransiFlow, which is mostly in ocean and climate modelling, hence the reference in the introduction. If this is OK, I'll merge the PR.
[force-pushed 146c253 to e6a6c88]
Hi @Sbte, I note that you have dropped commits and force-pushed to overwrite history without discussion. I maintain that the paragraph on climate modelling is not relevant here, as TransiFlow is not used in these papers. I am happy with the performance wording.
This is a common thing to do in projects I work on, since the history is still kept in the PR, and it makes the commits that were added easier to review. If necessary, any removed commits can always be force-pushed back before the PR is merged.
It is true that in this repository we provide only the mentioned backends, but because of the modular nature, one can still use the continuation with an external discretization and/or an external computational backend. The important thing is that we have a clearly defined interface that can be used in parallel computations, and that is the novelty that TransiFlow provides. The user can then easily implement their own backend and/or discretization. As mentioned in the documentation, the PETSc and Epetra backends are provided for this exact purpose: to give the user a place to start.

One of the climate use cases we have uses an external discretization with the continuation and HYMLS backend from TransiFlow; the other uses the continuation (and eigenvalue solver) from TransiFlow but has its own discretization and solver backend. The latter can be found here: https://github.com/nlesc-smcm/pop-iemic-coupling (it is imported as …).

The discretizations implemented in TransiFlow itself are mostly there for educational purposes. For any real-world simulations, I'd expect the user to implement their own, or at least amend the provided ones.
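To make the modular split concrete, here is a minimal sketch of a driver that only talks to a discretization and a computational backend through a small interface. All class and method names here are illustrative, not TransiFlow's actual API; the toy problem and dense NumPy backend are stand-ins for a real discretization and a PETSc/Epetra-style backend:

```python
import numpy as np

class DenseNumPyBackend:
    """Toy computational backend: dense solves via NumPy.
    A parallel backend would implement the same method differently."""

    def solve(self, jacobian, rhs):
        return np.linalg.solve(jacobian, rhs)


class BratuDiscretization:
    """Toy 1D finite-difference discretization of the Bratu problem
    u'' + lam * exp(u) = 0 with u(0) = u(1) = 0."""

    def __init__(self, n):
        self.n = n
        self.h = 1.0 / (n + 1)

    def rhs(self, u, lam):
        f = np.empty(self.n)
        for i in range(self.n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < self.n - 1 else 0.0
            f[i] = (left - 2.0 * u[i] + right) / self.h**2 + lam * np.exp(u[i])
        return f

    def jacobian(self, u, lam):
        J = np.zeros((self.n, self.n))
        for i in range(self.n):
            J[i, i] = -2.0 / self.h**2 + lam * np.exp(u[i])
            if i > 0:
                J[i, i - 1] = 1.0 / self.h**2
            if i < self.n - 1:
                J[i, i + 1] = 1.0 / self.h**2
        return J


def newton(disc, backend, u0, lam, tol=1e-10, max_iter=20):
    """Driver that is agnostic to how rhs/jacobian/solve are implemented."""
    u = u0.copy()
    for _ in range(max_iter):
        f = disc.rhs(u, lam)
        if np.linalg.norm(f) < tol:
            break
        u -= backend.solve(disc.jacobian(u, lam), f)
    return u


disc = BratuDiscretization(n=20)
u = newton(disc, DenseNumPyBackend(), np.zeros(20), lam=1.0)
```

Swapping in an external model then amounts to supplying another object with the same `rhs`/`jacobian` (or `solve`) methods, which is the kind of extension point being discussed here.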
Hi @Sbte, I'm afraid I am going to stand firm on this. The JOSS paper review is for the software as openly published and publicly available. Looking at the online documentation, I cannot see any description of, or guidance on, how the code can be used with arbitrary backends such as a third-party Earth system model, which is what you are implying here. Note that the current focus of our discussion has diverged from my original objection, which was that the reference to climate modelling was not in line with the rest of the paper, was superfluous to the points already made, and that TransiFlow was not used in the referenced works. If you are happy to accept my original suggestion, I see no further barriers to publication.
All of these extensions are public. The backend for the project I mentioned earlier is implemented here https://github.com/omuse-geoscience/omuse/blob/master/src/omuse/community/iemic/interface.py and is instantiated here https://github.com/nlesc-smcm/pop-iemic-coupling/blob/main/iemic.py#L191. The EMIC itself is available here https://github.com/nlesc-smcm/i-emic/. I'd add a reference to that backend in the documentation, but it's not very high-quality code, so I don't want anyone to get confused by it.
That's a good point. I now added that in #63.
The reason for the reference to climate modelling is that this was the original inspiration for developing this Python package, and it is where we've been using the package for the past 5 years. The package had predecessors in Fortran and C++, which have been in use for 30+ years, but we ran into serious issues with those: first after writing a hybrid Fortran/C++ package for thies:09, and later with a more C++-oriented approach for mulder:21. This made us decide to finally rewrite everything from scratch. I can remove the two citations if that's an issue, but I'd prefer to leave in the textual reference to climate modelling, since that's where all of this is coming from. I could also elaborate on that in the paper if that's preferred.
@jatkinson1000 Anything I can do to make this move forward?
My sincere apologies for the delay @Sbte, I have been on extended leave but am due to return to work next week. Prior to this I was discussing with the wider editorial team how to proceed regarding the additional functionality claims added in #63. My fellow editors had a range of opinions as to the most appropriate approach. To confirm: my understanding is that coupling to an arbitrary code to perform a stability analysis is a capability of TransiFlow as it was reviewed; it was just not detailed? Given that the functionality detailed in #63 is key to some of the details you want to leave in the paper (use of TransiFlow and its predecessors with ESMs), I think the best compromise is to check with the reviewers whether they feel the functionality was clear from their initial review and from reading #63. Please merge this PR and I will proceed with the remainder of the editorial checks when I return to work next week.
No problem!
Yes, it was actually detailed in the documentation of the classes that can be extended to allow arbitrary code to be used with TransiFlow, but it just wasn't easy to find for new users, as was apparent from your review. For this reason, I added a new page to the documentation.
Sounds good to me!
Will do!
Here are my editorial suggestions for the paper following JOSS review in openjournals/joss-reviews#8453.
To summarise:
I have a second commit that removes the section about climate modelling. The point about software maintainability is already made in the preceding paragraph, and I do not see how the additional examples are relevant to this software or paper -- as far as I can tell, TransiFlow cannot be used with these simulations.
Not implemented here but also needing to be addressed is the statement about "Python overheads being less than 1%".
JOSS requires any performance statements to be backed up, and it is not clear to me exactly what this refers to (if I were to guess, it is the overhead of running TransiFlow compared to running the numerical flow model in SciPy, Trilinos, etc.?). Please amend the text to be clear and either evidence the <1% or amend it to 'small'.
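For concreteness, one way such a claim could be evidenced is an instrumented run that separates the driver's total wall time from the time spent inside the solver kernel. The harness below is purely illustrative: it times a stand-in dense NumPy solve, not TransiFlow or its actual backends:

```python
import time
import numpy as np

def timed(fn, *args):
    """Run fn and return (result, elapsed seconds)."""
    t0 = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - t0

rng = np.random.default_rng(0)
A = rng.standard_normal((400, 400))
b = rng.standard_normal(400)

# Time one "driver" step as a whole, and the solver kernel within it.
total_t0 = time.perf_counter()
x, kernel_time = timed(np.linalg.solve, A, b)
residual = np.linalg.norm(A @ x - b)  # Python-side bookkeeping
total_time = time.perf_counter() - total_t0

# Everything outside the kernel counts as Python-layer overhead.
overhead_fraction = (total_time - kernel_time) / total_time
```

Reporting `overhead_fraction` from a full-scale run on a documented machine would be the kind of evidence that supports (or refutes) a "<1%" statement.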