Selfcontained jar gets put at end of dependency tree causing errors in some jobs #2443

@drcrallen

Description

pull-deps won't exclude dependencies outside the stock exclude list. So if, for example, the aws credentials extension pulls in jars whose versions differ from the ones pulled in by, say, spark, then when the hadoop isolating classloader loads the hadoop dependencies, then the druid ones, then the extension ones, you end up with the wrong classes: either the wrong version, or the right class loaded in the wrong classloader. This leads to a nasty stacktrace like this:

java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:170) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.indexer.spark.SparkBatchIndexTask.run(SparkBatchIndexTask.scala:155) [druid-spark-batch_2.10-0.0.27.jar:0.0.27]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:338) [druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:318) [druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_72]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_72]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_72]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_72]
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_72]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_72]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_72]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_72]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:167) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    ... 7 more
Caused by: java.util.ServiceConfigurationError: io.druid.initialization.DruidModule: Provider io.druid.storage.s3.S3StorageDruidModule could not be instantiated
    at java.util.ServiceLoader.fail(ServiceLoader.java:232) ~[?:1.8.0_72]
    at java.util.ServiceLoader.access$100(ServiceLoader.java:185) ~[?:1.8.0_72]
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384) ~[?:1.8.0_72]
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404) ~[?:1.8.0_72]
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480) ~[?:1.8.0_72]
    at io.druid.initialization.Initialization.getFromExtensions(Initialization.java:132) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:318) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree1$1(SparkDruidIndexer.scala:421) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector$lzycompute(SparkDruidIndexer.scala:420) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector(SparkDruidIndexer.scala:419) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree2$1(SparkDruidIndexer.scala:449) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper$lzycompute(SparkDruidIndexer.scala:448) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper(SparkDruidIndexer.scala:447) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask$.runTask(SparkBatchIndexTask.scala:277) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask.runTask(SparkBatchIndexTask.scala) ~[?:?]
    at io.druid.indexer.spark.Runner.runTask(Runner.java:29) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_72]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_72]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_72]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_72]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:167) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    ... 7 more
Caused by: java.lang.VerifyError: Bad type on operand stack
Exception Details:
  Location:
    io/druid/storage/s3/S3StorageDruidModule.getRestS3Service(Lcom/amazonaws/auth/AWSCredentialsProvider;)Lorg/jets3t/service/impl/rest/httpclient/RestS3Service; @24: invokespecial
  Reason:
    Type 'io/druid/storage/s3/AWSSessionCredentialsAdapter' (current frame, stack[2]) is not assignable to 'org/jets3t/service/security/ProviderCredentials'
  Current Frame:
    bci: @24
    flags: { }
    locals: { 'io/druid/storage/s3/S3StorageDruidModule', 'com/amazonaws/auth/AWSCredentialsProvider' }
    stack: { uninitialized 12, uninitialized 12, 'io/druid/storage/s3/AWSSessionCredentialsAdapter' }
  Bytecode:
    0x0000000: 2bb9 0027 0100 c100 2899 0013 bb00 2959
    0x0000010: bb00 2a59 2bb7 002b b700 2cb0 bb00 2959
    0x0000020: bb00 2d59 2bb9 0027 0100 b900 2e01 002b
    0x0000030: b900 2701 00b9 002f 0100 b700 30b7 002c
    0x0000040: b0                                     
  Stackmap Table:
    same_frame(@28)

    at java.lang.Class.getDeclaredConstructors0(Native Method) ~[?:1.8.0_72]
    at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671) ~[?:1.8.0_72]
    at java.lang.Class.getConstructor0(Class.java:3075) ~[?:1.8.0_72]
    at java.lang.Class.newInstance(Class.java:412) ~[?:1.8.0_72]
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:380) ~[?:1.8.0_72]
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404) ~[?:1.8.0_72]
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480) ~[?:1.8.0_72]
    at io.druid.initialization.Initialization.getFromExtensions(Initialization.java:132) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:318) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree1$1(SparkDruidIndexer.scala:421) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector$lzycompute(SparkDruidIndexer.scala:420) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector(SparkDruidIndexer.scala:419) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree2$1(SparkDruidIndexer.scala:449) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper$lzycompute(SparkDruidIndexer.scala:448) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper(SparkDruidIndexer.scala:447) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask$.runTask(SparkBatchIndexTask.scala:277) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask.runTask(SparkBatchIndexTask.scala) ~[?:?]
    at io.druid.indexer.spark.Runner.runTask(Runner.java:29) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_72]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_72]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_72]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_72]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:167) ~[druid-selfcontained-0.9.0-rc1-mmx2.jar:0.9.0-rc1-mmx2]
    ... 7 more
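The delegation behavior behind this failure can be shown in isolation. The sketch below (my illustration, not Druid code; the jar URLs and class name are placeholders) demonstrates that a standard `URLClassLoader` consults its parent before its own URLs, so an extension loader silently hands back the parent's copy of a class even when its own jars contain a different version — which is exactly how a mismatched jets3t/aws jar ends up bound to the wrong classloader:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderDemo {
    public static void main(String[] args) throws Exception {
        // The "application" loader already knows about one version of a dependency.
        ClassLoader parent = LoaderDemo.class.getClassLoader();

        // An extension loader that delegates parent-first (the JDK default).
        // Even if extensionJars contained a different version of the same class,
        // loadClass() would return the parent's copy, because ClassLoader
        // consults the parent before searching its own URLs.
        URL[] extensionJars = new URL[0]; // would point at the extension's jars
        try (URLClassLoader extLoader = new URLClassLoader(extensionJars, parent)) {
            Class<?> c = extLoader.loadClass("java.lang.String");
            // java.lang.String is resolved by the bootstrap loader, not extLoader:
            System.out.println(c.getClassLoader()); // prints "null" (bootstrap)
        }
    }
}
```

An isolating loader avoids this by searching its own jars first, but that only works if pull-deps hasn't populated the extension directory with conflicting versions in the first place.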
