Use the compile_commands implementation from the monolithic rule in the per_file rule #98
base: main
Conversation
Force-pushed from ff788bf to c1b559c
Force-pushed from c1b559c to fea15a0
Szelethus left a comment:
Can you delete all unnecessary functions, just to see how much simpler this solution would be?
Force-pushed from fea15a0 to 65034d3
This patch might deserve a bit more love.
Using the compile_commands.bzl aspect for file collection, there are some things to keep in mind. Mainly, despite the name…
Force-pushed from 7827b46 to cba2a86
Szelethus left a comment:
Just like that, and everything runs? Damn.
    if not check_valid_file_type(src):
        continue
When does this happen?
Mainly, despite the name transitive_source_files suggesting it holds only source files, it can actually contain header files as well (probably because it collects the files listed under the srcs = [] attribute of the cc_library/cc_binary rules).
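For context, here is a minimal sketch of why a check like the one quoted above is needed when iterating the collected files. The extension list and the bodies of check_valid_file_type and _translation_units are illustrative assumptions, not the repository's actual code:

```starlark
# Assumed extension list; File.extension has no leading dot.
_CC_SOURCE_EXTENSIONS = ["c", "cc", "cpp", "cxx", "C"]

def check_valid_file_type(src):
    # Keep only translation units; headers that leak in via srcs = [] are rejected.
    return src.extension in _CC_SOURCE_EXTENSIONS

def _translation_units(source_files):
    # source_files: the list produced from transitive_source_files.to_list(),
    # which may mix sources and headers.
    srcs = []
    for src in source_files:
        if not check_valid_file_type(src):
            continue
        srcs.append(src)
    return srcs
```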
Force-pushed from 3fb354f to 936ed44
Force-pushed from 936ed44 to 4ce7518
…he compile_commands solution transitively fixes the per_file rule
…e_commands.bzl fixes it transitively
Force-pushed from 4ce7518 to e4b2503
Szelethus left a comment:
We need to understand this a bit better before merging...
    for elem in headers_mid:
        if type(elem) == "depset":
            headers.extend(elem.to_list())
        else:
            headers.append(elem)
Do we have to do this for the monolithic version? If not, why not?
    trans_depsets = [compilation_context.headers]
    for dset in target[SourceFilesInfo].headers.to_list():
        trans_depsets.append(dset)
    headers = depset(direct = [src],
                     transitive = trans_depsets)
Let's turn this into a function.
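One way the suggested extraction could look, as a sketch; the helper name _header_depset_for_src is hypothetical, and the body simply wraps the lines quoted above:

```starlark
def _header_depset_for_src(src, compilation_context, source_files_info):
    """Collect the inputs needed to analyze one translation unit:
    the source file itself plus its own and transitive headers."""
    trans_depsets = [compilation_context.headers]
    for dset in source_files_info.headers.to_list():
        trans_depsets.append(dset)
    return depset(direct = [src], transitive = trans_depsets)

# The call site would then shrink to something like:
# headers = _header_depset_for_src(src, compilation_context, target[SourceFilesInfo])
```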
    # Get compile_commands.json file and source files
    compile_commands = None
    source_files = None
    for output in compile_commands_impl(ctx):
        if type(output) == "DefaultInfo":
            compile_commands = output.files.to_list()[0]
            source_files = output.default_runfiles.files.to_list()
    if not compile_commands:
        fail("Failed to generate compile_commands.json file!")
    if not source_files:
        fail("Failed to collect source files!")
    if compile_commands != ctx.outputs.compile_commands:
        fail("Seems compile_commands.json file is incorrect!")
I would prefer not touching this, for compatibility's sake.
    trans_depsets = [compilation_context.headers]
    for dset in target[SourceFilesInfo].headers.to_list():
        trans_depsets.append(dset)
    headers = depset(direct = [src],
Not sure I understand this line...
So, this is not just headers but src as well?
    # NOTE: we collect only headers, so CTU may not work!
    headers = depset([src], transitive = [compilation_context.headers])
    trans_depsets = [compilation_context.headers]
    for dset in target[SourceFilesInfo].headers.to_list():
I guess dset is actually a header :)
        ),
    ]

def get_compile_commands_json_and_srcs(ctx):
Maybe I'm mistaken, at least I do not see it yet, but I think adding this function does not give us much benefit:
As I mentioned, I would prefer the original implementation in src/codechecker.bzl.
And in src/per_file.bzl we do not even need this function.
This is literally copy-pasted from codechecker.bzl, so this is above and beyond NFC. Since the output of compile_commands_impl needs the same handling in both the per-file and monolithic implementations, I think this is a very sensible refactoring step.
In terms of merging order, sure, we can land the creation of this convenience function before or after replacing the compile commands generation parts.
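For reference, a sketch of what the extracted convenience function could look like, assuming it simply wraps the loop quoted earlier; the exact body in the patch may differ (for example, the ctx.outputs.compile_commands consistency check from the monolithic rule is left out here):

```starlark
def get_compile_commands_json_and_srcs(ctx):
    """Run the shared compile_commands implementation and return the
    generated compile_commands.json file plus the collected source files."""
    compile_commands = None
    source_files = None
    for output in compile_commands_impl(ctx):
        if type(output) == "DefaultInfo":
            compile_commands = output.files.to_list()[0]
            source_files = output.default_runfiles.files.to_list()
    if not compile_commands:
        fail("Failed to generate compile_commands.json file!")
    if not source_files:
        fail("Failed to collect source files!")
    return compile_commands, source_files
```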
nettle left a comment:
May I suggest changing only the src/per_file.bzl implementation first, which seems reasonable to me, and not touching src/codechecker.bzl and src/compile_commands.bzl, at least for now.
It is a more than reasonable ask to land parts of this patch independently, but I'm strongly against not touching those two files just to keep them stable, especially with NFC patches.
Szelethus left a comment:
Meant to send this inline comment yesterday.
        ),
    ]

def get_compile_commands_json_and_srcs(ctx):
This is literally copy-pasted from codechecker.bzl, so this is above and beyond NFC. Since the output of compile_commands_impl needs the same handling in both the per-file and monolithic implementations, I think this is a very sensible refactoring step.
In terms of merging order, sure, we can land the creation of this convenience function before or after replacing the compile commands generation parts.
    compile_commands_json, source_files = \
        get_compile_commands_json_and_srcs(ctx)
The following can be used to get compile_commands_json:
    for output in compile_commands_impl(ctx):
        if type(output) == "DefaultInfo":
            compile_commands_json = output.files.to_list()[0]
Why:
Currently, we use a different, inferior implementation for generating compile_commands.json in the per_file rule. We should use the same one as in the monolithic rule. This would prevent us from having to maintain both implementations simultaneously.
What:
Replaces the compile_commands.json-generating implementation in the per_file rule.
Fixes the unit test for compile_commands, since the per_file compile_commands.json location changed.
Addresses:
Fixes: #120
Fixes: #128