@gengliangwang (Member)

What changes were proposed in this pull request?

In the latest master branch, the document generation command:

./build/sbt -Phadoop-2.7 -Phive-2.3 -Pyarn -Phive -Pmesos -Pkinesis-asl -Pspark-ganglia-lgpl -Pkubernetes -Phadoop-cloud -Phive-thriftserver unidoc

fails with the following message:

[error] /Users/gengliang.wang/Downloads/spark/sql/hive-thriftserver/v2.3/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:248:  error: incompatible types: org.apache.hive.service.rpc.thrift.TSessionHandle cannot be converted to org.apache.hive.service.cli.thrift.TSessionHandle
[error]       resp.setSessionHandle(sessionHandle.toTSessionHandle());
[error]                                                           ^
[error] /Users/gengliang.wang/Downloads/spark/sql/hive-thriftserver/v2.3/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:259:  error: incompatible types: org.apache.hive.service.rpc.thrift.TStatus cannot be converted to org.apache.hive.service.cli.thrift.TStatus
[error]       resp.setStatus(HiveSQLException.toTStatus(e));
[error]                                                ^
[error] /Users/gengliang.wang/Downloads/spark/sql/hive-thriftserver/v2.3/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:346:  error: method getMinVersion in class ThriftCLIService cannot be applied to given types;
[error]     TProtocolVersion protocol = getMinVersion(CLIService.SERVER_VERSION,
[error]  

To fix it, we should change "sbt unidoc" to "sbt clean unidoc".
This PR also updates the related README.

Why are the changes needed?

Fix document generation in testing

Does this PR introduce any user-facing change?

No

How was this patch tested?

Manually tested locally.
Also, in #26788 the Jenkins tests pass with this fix.

@SparkQA

SparkQA commented Dec 8, 2019

Test build #114985 has finished for PR 26796 at commit f8be487.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

dev/run-tests.py Outdated
  # Enable all of the profiles for the build:
  build_profiles = extra_profiles + modules.root.build_profile_flags
- sbt_goals = ["unidoc"]
+ sbt_goals = ["clean", "unidoc"]
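The suggested change above swaps `["unidoc"]` for `["clean", "unidoc"]` in the goal list. As a minimal sketch of how such an sbt argument list comes together (using a hypothetical helper, `make_unidoc_command`, not the actual code in `dev/run-tests.py`):

```python
def make_unidoc_command(extra_profiles, root_profile_flags, clean_first=True):
    """Assemble an sbt argument list for generating unidoc.

    With clean_first=True, the 'clean' goal forces a full recompile,
    which avoids failures from stale compiled classes at the cost of
    extra build time.
    """
    build_profiles = extra_profiles + root_profile_flags
    sbt_goals = ["clean", "unidoc"] if clean_first else ["unidoc"]
    return ["./build/sbt"] + build_profiles + sbt_goals

# Example: two profile flags plus the cleaned unidoc goals.
cmd = make_unidoc_command(["-Phive"], ["-Phive-thriftserver"])
# → ['./build/sbt', '-Phive', '-Phive-thriftserver', 'clean', 'unidoc']
```

The trade-off discussed below is exactly this: prepending `clean` makes the build reliable on a stale tree, but forces a recompile that lengthens the job.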
Member

Hm.. I avoided this because it will slow down doc gen, since it re-uses compiled classes from the regular build. How much does it increase the time?

Member

Also, if I am not wrong, this won't verify the documentation in test code anymore. That's fine, as documentation in the test code won't be part of our official documentation. Cc @JoshRosen, I think I talked about this with you before somewhere.

docs/README.md Outdated
  ## API Docs (Scaladoc, Javadoc, Sphinx, roxygen2, MkDocs)

- You can build just the Spark scaladoc and javadoc by running `./build/sbt unidoc` from the `$SPARK_HOME` directory.
+ You can build just the Spark scaladoc and javadoc by running `./build/sbt clean unidoc` from the `$SPARK_HOME` directory.
Member

I can imagine adding this to the docs, as users might have stale compiled code locally or something. How often would it occur in run-tests? In PR builders it starts with a clean build and I'd guess users recompile before tests. I'm also worried about slowdown for the PR builders.

Member Author

@srowen Thanks for the suggestion. I will revert the changes here.

@gengliangwang (Member, Author)

@HyukjinKwon @srowen Yes, we should see whether there is another approach that avoids significantly increasing the PR builder execution time:

With the code change: 233s (https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/114990)
Without the code change: 94s (https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/114992)

@dongjoon-hyun (Member)

Currently, this blocks #26788.
Another option might be to teach Jenkins a new tag like [unidoc-clean].
What do you think about that?

@srowen (Member)

srowen commented Dec 9, 2019

Does this actually affect Jenkins? It starts from a clean build, I'd imagine. I thought this might simply affect end users.

@HyukjinKwon (Member)

HyukjinKwon commented Dec 9, 2019

Yea. Seems it affects the Jenkins job. I noticed the test failure too at #26788.

It eats (up to) 10 mins more in the PR builder IIRC .. FWIW, I faced such a case before in sbt checkstyle and had to reorder checkstyle (#21579).

@HyukjinKwon (Member)

I think it will re-compile again when we run the actual tests ...

========================================================================
Building Spark
========================================================================
[info] Building Spark using SBT with these arguments:  -Phadoop-2.7 -Phive-2.3 -Pyarn -Phive -Pkubernetes -Pmesos -Pspark-ganglia-lgpl -Phadoop-cloud -Phive-thriftserver -Pkinesis-asl test:package streaming-kinesis-asl-assembly/assembly
...
========================================================================
Building Unidoc API Documentation
========================================================================
[info] Building Spark unidoc using SBT with these arguments:  -Phadoop-2.7 -Phive-2.3 -Pyarn -Phive -Pkubernetes -Pmesos -Pspark-ganglia-lgpl -Phadoop-cloud -Phive-thriftserver -Pkinesis-asl clean unidoc
...
========================================================================
Running Spark unit tests
========================================================================
[info] Running Spark tests using SBT with these arguments:  -Phadoop-2.7 -Phive-2.3 -Pyarn -Phive -Pkubernetes -Pmesos -Pspark-ganglia-lgpl -Phadoop-cloud -Phive-thriftserver -Pkinesis-asl -Dtest.exclude.tags=org.apache.spark.tags.ExtendedYarnTest test

Another option might be to teach a new tag like [unidoc-clean] to Jenkins

Yeah, seems like it. We might have to reorder unidoc .. but I wonder if we have a better option. Let me take a cursory look today if this isn't urgent ..

@dongjoon-hyun (Member)

Thank you, @HyukjinKwon ~

@gengliangwang (Member, Author)

gengliangwang commented Dec 9, 2019

Yeah, seems like it. We might have to reorder unidoc .. but I wonder if we have a better option. Let me take a cursory look today if this isn't urgent ..

I would like to try that, too. But actually I don't know the reason for the build failure. The two actions "sbt assembly/package" and "sbt unidoc" use the same profiles...

Anyway, @HyukjinKwon I have tried the reorder in the other PR: 970ec48

@HyukjinKwon (Member)

Thanks. Actually, I am trying a different way at #26800 too .. let me see if the tests pass. The Hive domain isn't part of our official documentation anyway, so we could simply exclude it. (see also https://github.com/apache/spark/blob/master/docs/_plugins/copy_api_dirs.rb#L30)

@SparkQA

SparkQA commented Dec 9, 2019

Test build #114998 has finished for PR 26796 at commit 5acd9c0.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gengliangwang (Member, Author)

I am closing this one in favor of #26800.
