From 060348decec5868d58a32ab60f41531f44ea1cb1 Mon Sep 17 00:00:00 2001
From: Erik Dubbelboer
Date: Thu, 2 Jun 2016 18:15:33 +0200
Subject: [PATCH] Added info about Google Cloud Storage

---
 docs/content/development/extensions-core/hdfs.md | 16 ++++++++++++++++
 1 file changed, 16 insertions(+)

diff --git a/docs/content/development/extensions-core/hdfs.md b/docs/content/development/extensions-core/hdfs.md
index 177676554e54..a623a7f82156 100644
--- a/docs/content/development/extensions-core/hdfs.md
+++ b/docs/content/development/extensions-core/hdfs.md
@@ -16,3 +16,19 @@ Make sure to [include](../../operations/including-extensions.html) `druid-hdfs-s
 |`druid.storage.storageDirectory`||Directory for storing segments.|Must be set.|
 
 If you are using the Hadoop indexer, set your output directory to be a location on Hadoop and it will work
+
+## Google Cloud Storage
+
+The HDFS extension can also be used with Google Cloud Storage (GCS) as deep storage.
+
+### Configuration
+
+|Property|Possible Values|Description|Default|
+|--------|---------------|-----------|-------|
+|`druid.storage.type`|hdfs||Must be set.|
+|`druid.storage.storageDirectory`|`gs://bucket/example/directory`|Directory for storing segments.|Must be set.|
+
+All services that need to access GCS must have the [GCS connector jar](https://cloud.google.com/hadoop/google-cloud-storage-connector#manualinstallation) on their classpath. One option is to place this jar in /lib/ and /extensions/druid-hdfs-storage/.
+
+Tested with Druid 0.9.0, Hadoop 2.7.2 and gcs-connector jar 1.4.4-hadoop2.
+
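
The two properties documented in this patch can be sketched together as a runtime properties fragment. This is a minimal example, not part of the patch itself; the bucket name and path are placeholders:

```properties
# Use the druid-hdfs-storage extension for deep storage
druid.storage.type=hdfs

# Point the storage directory at a GCS location (placeholder bucket/path)
druid.storage.storageDirectory=gs://bucket/example/directory
```

This assumes the GCS connector jar is already on the classpath of every service that reads or writes segments, as the patch notes.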