
Conversation

@HeartSaVioR
Contributor

What changes were proposed in this pull request?

There are too many classes placed in the single package "org.apache.spark.sql.kafka010"; these classes can be grouped by purpose.

As part of the change in SPARK-21869 (#26845), we moved the producer-related classes out to "org.apache.spark.sql.kafka010.producer" and exposed only the necessary classes/methods outside of that package. This patch applies the same approach to the consumer-related classes.
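For readers unfamiliar with Scala's package-qualified access modifiers, here is a rough, hypothetical sketch of the kind of layout this refactoring aims for, mirroring the earlier producer change; the helper name ConsumerPoolInternals is illustrative only, not a class introduced by this PR.

```scala
package org.apache.spark.sql.kafka010.consumer

// Entry point kept reachable from the parent package org.apache.spark.sql.kafka010.
private[kafka010] class KafkaDataConsumer {
  // ...
}

// Hypothetical internal helper: visible only inside the consumer sub-package,
// so nothing in kafka010 (or elsewhere) can depend on it directly.
private[consumer] object ConsumerPoolInternals {
  // ...
}
```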

Why are the changes needed?

Described above.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Existing UTs.


import java.{util => ju}

import scala.collection.mutable.ArrayBuffer
Contributor Author


I've also cleaned up unused imports and removed an unnecessary line (there were two empty lines between the last import statement and the class definition).

@@ -218,4 +218,3 @@ private[kafka010] object InternalKafkaConsumerPool {
}
}
}

Contributor Author


There were two empty lines here where one is enough.

@SparkQA

SparkQA commented Dec 24, 2019

Test build #115666 has finished for PR 26991 at commit 20c82b7.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 24, 2019

Test build #115667 has finished for PR 26991 at commit 757c7da.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

Member

@srowen srowen left a comment


Looks OK as a cleanup, pending tests

@HeartSaVioR
Contributor Author

retest this, please

@SparkQA

SparkQA commented Dec 24, 2019

Test build #115751 has finished for PR 26991 at commit 757c7da.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 26, 2019

Test build #115793 has finished for PR 26991 at commit d8f5b5c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HeartSaVioR
Contributor Author

cc @srowen, gentle ping.

Member

@srowen srowen left a comment


If tests pass and it's just a refactoring, OK.
I guess you also made everything private to .consumer that you can?

@HeartSaVioR
Contributor Author

I guess you also made everything private to .consumer that you can?

Yes. KafkaDataConsumer itself, along with acquire and release, is exposed to the outer package (kafka010); everything else is private to consumer.
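To make the visibility split concrete, here is a minimal sketch (not the actual Spark source; the constructor, method names, and signatures are assumptions for illustration) of how acquire/release stay callable from kafka010 while other members stay inside the consumer sub-package:

```scala
package org.apache.spark.sql.kafka010.consumer

import org.apache.kafka.common.TopicPartition

private[kafka010] class KafkaDataConsumer private[consumer] (
    topicPartition: TopicPartition,
    kafkaParams: java.util.Map[String, Object]) {

  // Internal plumbing stays private to the consumer sub-package.
  private[consumer] def getOrRetrieveConsumer(): Unit = { /* ... */ }

  // Part of the surface exposed to kafka010.
  private[kafka010] def release(): Unit = { /* ... */ }
}

private[kafka010] object KafkaDataConsumer {
  // The acquisition entry point exposed to the rest of kafka010.
  def acquire(
      topicPartition: TopicPartition,
      kafkaParams: java.util.Map[String, Object]): KafkaDataConsumer = {
    new KafkaDataConsumer(topicPartition, kafkaParams)
  }
}
```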

@srowen
Member

srowen commented Dec 31, 2019

Merged to master

@srowen srowen closed this in 319ccd5 Dec 31, 2019
@HeartSaVioR
Contributor Author

Thanks for reviewing and merging! Happy new year!

@HeartSaVioR HeartSaVioR deleted the SPARK-30336 branch December 31, 2019 21:48