Scala 2.11 and Scala 2.12 cross-compilation #12

Conversation
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <scala.maven.version>3.2.2</scala.maven.version>
    <scala.binary.version>2.11</scala.binary.version>
Removed because the cross-compilation plugin requires/recommends that these properties are not defaulted and are only set through profiles.
I was able to select a profile inside IntelliJ and the project compiled in my IDE.
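For reference, profile-based version selection in a Maven POM typically looks something like the sketch below. The profile ids, property names, and patch versions here are illustrative assumptions, not taken verbatim from this PR:

```xml
<!-- Hypothetical sketch: profile ids and patch versions are assumptions. -->
<profiles>
  <profile>
    <id>scala-2.11</id>
    <properties>
      <scala.binary.version>2.11</scala.binary.version>
      <scala.version>2.11.12</scala.version>
    </properties>
  </profile>
  <profile>
    <id>scala-2.12</id>
    <properties>
      <scala.binary.version>2.12</scala.binary.version>
      <scala.version>2.12.10</scala.version>
    </properties>
  </profile>
</profiles>
```

With this layout, a build selects a Scala line explicitly (e.g. `mvn package -Pscala-2.12`) rather than relying on a default, which is why the defaulted properties were removed.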
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.linkedin.sparktfrecord</groupId>
- <artifactId>spark-tfrecord_2.11</artifactId>
+ <artifactId>spark-tfrecord_${scala.binary.version}</artifactId>
This is now possible due to the cross-compilation plugin.
  <recompileMode>incremental</recompileMode>
  <useZincServer>true</useZincServer>
- <scalaVersion>${scala.compiler.version}</scalaVersion>
+ <scalaVersion>${scala.version}</scalaVersion>
The plugin appears to want this property name.
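For context, the `scalaVersion` configuration element belongs to the scala-maven-plugin (`net.alchim31.maven`). A minimal sketch of how that plugin is usually wired up, assuming the property names from the diff above (the plugin version is an illustrative assumption):

```xml
<!-- Sketch only: plugin version is an assumption, not from this PR. -->
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>4.4.0</version>
  <configuration>
    <scalaVersion>${scala.version}</scalaVersion>
    <recompileMode>incremental</recompileMode>
  </configuration>
</plugin>
```

Keeping the full version in `${scala.version}` and the binary version in `${scala.binary.version}` lets the compiler plugin and the artifact naming stay consistent across profiles.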
- trait BaseSuite extends WordSpecLike with Matchers with BeforeAndAfterAll
+ trait BaseSuite extends AnyWordSpecLike with Matchers with BeforeAndAfterAll
New ScalaTest version.
  import matchers.should._
  import org.scalatest.wordspec.AnyWordSpecLike

  class TFRecordDeserializerTest extends AnyWordSpecLike with Matchers {
New ScalaTest version.
I think my IDE reformatted things. Let me know if there are specific formatting options I should use and I can revert.
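For anyone following the migration: in ScalaTest 3.1+, the old top-level `WordSpecLike` and `Matchers` traits moved into the `org.scalatest.wordspec` and `org.scalatest.matchers.should` packages. A minimal sketch of the new style (the suite name and assertion here are illustrative, not from this PR):

```scala
// Sketch of the ScalaTest 3.1+ import style; suite name is hypothetical.
import org.scalatest.matchers.should.Matchers
import org.scalatest.wordspec.AnyWordSpecLike

class ExampleSuite extends AnyWordSpecLike with Matchers {
  "an example" should {
    "pass a simple assertion" in {
      (1 + 1) shouldBe 2
    }
  }
}
```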
Thanks @groodt!
TensorFlowInferSchema.scala is almost identical between the two, so it is likely due to the version difference. I will take a closer look later.
Might have just been my poor implementation.
Great!
Done. I'm also beginning to wonder how much value there is in cross-building for Spark 2.4.x with Scala 2.11 and 2.12, given that it doesn't seem like we'll be able to cross-build for Spark 3.x with Scala 2.11 and 2.12. Maybe there would be a need to cross-build for Scala 2.12 and 2.13 someday, though. Do you think Spark 3.x would probably need a different branch? And do you think there is still value in this PR cross-building Spark 2.4.x for Scala 2.11 and 2.12, even though I'm not sure how many people are using the Spark 2.4.x and Scala 2.12 combination? WDYT?
Spark 3.x will work with Scala 2.12 only, and Spark 2.4.x will likely be built with Scala 2.11. If these are the two configurations we want to support, do we still need the cross-building plug-in?
This relates to my point as well. Spark 2.4.x does support Scala 2.12 and is built and released on Maven for both 2.11 and 2.12, so we could still do that if there is demand. Not sure what the plans at LinkedIn might be. I'm also not sure whether AWS or EMR plans to support Scala 2.12 for Spark 2.4.x: they did release EMR 6.0.0 with Spark 2.4.4 and Scala 2.12, but I think they're going to go straight to Spark 3.0.0 for EMR 6.1.0. It's really up to you. My main interest personally would be a Spark 3.0.0 and Scala 2.12 release on Bintray and Maven Central.
At LinkedIn, we are not planning to do cross-building. Spark 2.3 is on Scala 2.11, and Spark 3.0 will have to be on Scala 2.12. But I think this PR is useful if there is demand for Spark 2.4 with Scala 2.12 in the future, and it also saves us from creating another branch. You listed a task.
Ok, I'm happy to continue with this then. We might still want or need a different branch in the future if we want to compile and release a version that targets Spark 3.0, I think.
I presume that in the past, you would run the following to upload your releases? The following should now work. Do you have a way to test those?
Yes, let me clone your branch and test it out locally. With your latest change in the README, your PR is complete, right?
Yes, I think so.
@junshi15 Any news on whether this approach works and would be acceptable? |
Sorry, I have not tested it yet due to some urgent tasks at work. The RB looks fine. I will do a local test tomorrow. If things check out fine, we should be able to merge it tomorrow.
Ok, thanks!
Adds cross-compilation for Scala 2.11 and Scala 2.12
TODO:
- Should we upgrade scalatest all the way to 3.2.2? We've upgraded most of the others. Staying on 3.0.8 for now.
- Fix formatting so the diff is smaller. Done.