chore: fix clippy. #326
Conversation
Signed-off-by: Klaus Ma <klausm@nvidia.com>
Summary of Changes

Hello @k82cn, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed. This pull request primarily addresses code quality and consistency by resolving various Clippy linter warnings and standardizing error handling across the codebase. It refines error type conversions, suppresses linter warnings in generated code, and makes the Arrow IPC encoding more robust, leaving the codebase cleaner and more maintainable.
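For readers unfamiliar with the second point, one common way to suppress Clippy warnings in prost/tonic-generated code is to wrap the generated module in a blanket allow, as in the sketch below. The module name, the proto package name ("flame"), and the blanket clippy::all lint are illustrative assumptions, not necessarily what this PR actually does.

```rust
// Sketch only: silence Clippy for generated gRPC code by wrapping the
// include in a module with a blanket allow. Names are placeholders.
pub mod rpc {
    #![allow(clippy::all)]

    // Pulls in the code emitted by tonic-build / prost-build for the
    // (assumed) "flame" proto package.
    tonic::include_proto!("flame");
}
```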
Code Review
This pull request primarily focuses on addressing clippy warnings and improving error handling consistency across the codebase. The FlameError to Status conversion has been enhanced to map more specific error types to appropriate gRPC status codes, which is a good improvement for API clarity. Additionally, several #[allow(clippy::...)] attributes have been added to generated code to suppress linter warnings, which is a common and acceptable practice. The most significant change involves refactoring the Arrow Flight data serialization in object_cache/src/cache.rs to use a more direct IpcDataGenerator approach.
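As a point of reference, the kind of mapping described above usually looks like the sketch below. Only the Internal variant is visible in the diff in this PR, so the other FlameError variants and their gRPC codes are assumptions for illustration, not the project's actual definitions.

```rust
use tonic::Status;

// Hypothetical subset of the project's error type; only `Internal`
// appears in the diff below, the other variants are illustrative.
#[derive(Debug)]
pub enum FlameError {
    NotFound(String),
    InvalidConfig(String),
    Internal(String),
}

// Map each error variant to a specific gRPC status code instead of
// collapsing everything into `Status::internal`.
impl From<FlameError> for Status {
    fn from(e: FlameError) -> Self {
        match e {
            FlameError::NotFound(msg) => Status::not_found(msg),
            FlameError::InvalidConfig(msg) => Status::invalid_argument(msg),
            FlameError::Internal(msg) => Status::internal(msg),
        }
    }
}
```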
Relevant hunk from object_cache/src/cache.rs:

```diff
     let mut flight_data_vec = Vec::new();

-    // If empty, manually encode the batch
-    if flight_data_vec.is_empty() {
-        tracing::warn!("flight_data_from_arrow_batch returned empty, using manual encoding");
-
-        // First, encode and send schema
-        let mut data_gen = IpcDataGenerator::default();
-        let encoded_schema = data_gen.schema_to_bytes(batch.schema().as_ref(), &options);
-
-        let schema_flight_data = FlightData {
-            flight_descriptor: None,
-            app_metadata: vec![].into(),
-            data_header: encoded_schema.ipc_message.into(),
-            data_body: vec![].into(),
-        };
-        flight_data_vec.push(schema_flight_data);
-
-        // Then, send the batch data
-        let mut dictionary_tracker = arrow::ipc::writer::DictionaryTracker::new(false);
-
-        let (encoded_dictionaries, encoded_batch) = data_gen
-            .encoded_batch(batch, &mut dictionary_tracker, &options)
-            .map_err(|e| Status::internal(format!("Failed to encode batch: {}", e)))?;
-
-        // Add dictionary batches if any
-        for dict_batch in encoded_dictionaries {
-            flight_data_vec.push(dict_batch.into());
-        }
-
-        // Add the data batch
-        flight_data_vec.push(encoded_batch.into());
+    // Encode using IpcDataGenerator directly with DictionaryTracker
+    let data_gen = IpcDataGenerator::default();
+    let mut dict_tracker = DictionaryTracker::new(false);
+
+    // First, encode and send schema
+    let encoded_schema = data_gen.schema_to_bytes_with_dictionary_tracker(
+        batch.schema().as_ref(),
+        &mut dict_tracker,
+        &options,
+    );
+
+    let schema_flight_data = FlightData {
+        flight_descriptor: None,
+        app_metadata: vec![].into(),
+        data_header: encoded_schema.ipc_message.into(),
+        data_body: vec![].into(),
+    };
+    flight_data_vec.push(schema_flight_data);
+
+    // Then, send the batch data
+    let (encoded_dictionaries, encoded_batch) = data_gen
+        .encoded_batch(batch, &mut dict_tracker, &options)
+        .map_err(|e| FlameError::Internal(format!("Failed to encode batch: {}", e)))?;
+
+    // Add dictionary batches if any
+    for dict_batch in encoded_dictionaries {
+        flight_data_vec.push(dict_batch.into());
+    }
+
+    // Add the data batch
+    flight_data_vec.push(encoded_batch.into());
```
The batch_to_flight_data_vec function has been significantly refactored to remove the use of arrow_flight::utils::flight_data_from_arrow_batch and implement the encoding manually. While this might resolve specific issues or provide more control, it's a substantial change to the data serialization logic. Could you please provide more context on why this change was necessary? Was flight_data_from_arrow_batch causing problems, or is this a performance/control optimization?
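To make the new path concrete, here is a self-contained sketch that encodes a toy batch the same way the refactored code does (IpcDataGenerator plus DictionaryTracker) and then decodes it back to verify the round trip. The toy schema, the single id column, and the use of flight_data_to_arrow_batch for verification are illustrative and not taken from this repository.

```rust
use std::{collections::HashMap, sync::Arc};

use arrow::array::{ArrayRef, Int64Array};
use arrow::datatypes::{DataType, Field, Schema};
use arrow::ipc::writer::{DictionaryTracker, IpcDataGenerator, IpcWriteOptions};
use arrow::record_batch::RecordBatch;
use arrow_flight::utils::flight_data_to_arrow_batch;
use arrow_flight::FlightData;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // A toy batch with a single Int64 column.
    let schema = Arc::new(Schema::new(vec![Field::new("id", DataType::Int64, false)]));
    let batch = RecordBatch::try_new(
        schema.clone(),
        vec![Arc::new(Int64Array::from(vec![1, 2, 3])) as ArrayRef],
    )?;

    // Encode with IpcDataGenerator + DictionaryTracker, the same approach
    // the refactored function uses.
    let options = IpcWriteOptions::default();
    let data_gen = IpcDataGenerator::default();
    let mut dict_tracker = DictionaryTracker::new(false);

    let encoded_schema = data_gen.schema_to_bytes_with_dictionary_tracker(
        schema.as_ref(),
        &mut dict_tracker,
        &options,
    );
    let mut flight_data: Vec<FlightData> = vec![encoded_schema.into()];

    let (encoded_dictionaries, encoded_batch) =
        data_gen.encoded_batch(&batch, &mut dict_tracker, &options)?;
    flight_data.extend(encoded_dictionaries.into_iter().map(Into::into));
    flight_data.push(encoded_batch.into());

    // The last message carries the record batch; decode it and confirm
    // the round trip is lossless.
    let decoded = flight_data_to_arrow_batch(
        flight_data.last().expect("missing batch message"),
        schema,
        &HashMap::new(),
    )?;
    assert_eq!(decoded, batch);
    Ok(())
}
```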
Signed-off-by: Klaus Ma <klausm@nvidia.com>
No description provided.