Merged
@@ -21,6 +21,7 @@

import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.annotations.VisibleForTesting;

import javax.annotation.Nullable;
import java.net.InetAddress;
@@ -44,8 +45,9 @@ protected QueryException(Throwable cause, String errorCode, String errorClass, S
this.host = host;
}

@VisibleForTesting
@JsonCreator
protected QueryException(
public QueryException(
@JsonProperty("error") @Nullable String errorCode,
@JsonProperty("errorMessage") String errorMessage,
@JsonProperty("errorClass") @Nullable String errorClass,
39 changes: 20 additions & 19 deletions docs/querying/querying.md
@@ -106,6 +106,12 @@ curl -X DELETE "http://host:port/druid/v2/abc123"

## Query errors

### Authentication and authorization failures

For [secured](../design/auth.md) Druid clusters, query requests respond with an HTTP 401 response code in case of an authentication failure. For authorization failures, an HTTP 403 response code is returned.

### Query execution failures

If a query fails, Druid returns a response with an HTTP response code and a JSON object with the following structure:

```json
@@ -116,13 +122,6 @@ If a query fails, Druid returns a response with an HTTP response code and a JSON
"host" : "druid1.example.com:8083"
}
```
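For illustration, the `error` code can be pulled out of that response body. A real client should parse the JSON with a proper library such as Jackson; the regex sketch below handles only the simple, unescaped case and exists purely to make the field layout concrete (the class name is illustrative):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Toy extraction of the "error" field from the response body shown above.
// A real client should parse the JSON properly (e.g. with Jackson); this
// regex only handles unescaped string values and is for illustration.
public class QueryErrorCode
{
  private static final Pattern ERROR_FIELD =
      Pattern.compile("\"error\"\\s*:\\s*\"([^\"]*)\"");

  public static String extract(String responseBody)
  {
    Matcher m = ERROR_FIELD.matcher(responseBody);
    return m.find() ? m.group(1) : null;
  }
}
```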
The HTTP response code returned depends on the type of query failure. For timed out queries, an HTTP 504 response code is returned.

For [secured](../design/auth.md) Druid clusters, query requests respond with an HTTP 401 response code in case of an authentication failure. For authorization failures, an HTTP 403 response code is returned.

If a query request fails due to being limited by the [query scheduler laning configuration](../configuration/index.md#broker), an HTTP 429 response with the same JSON object schema as an error response, but with `errorMessage` of the form: "Total query capacity exceeded" or "Query capacity exceeded for lane 'low'".

For every other type of query failures, an HTTP 500 response code is returned.

The fields in the response are:

@@ -133,15 +132,17 @@ The fields in the response are:
|errorClass|The class of the exception that caused this error. May be null.|
|host|The host on which this error occurred. May be null.|

Possible codes for the *error* field include:

|code|description|
|----|-----------|
|`Query timeout`|The query timed out.|
|`Query interrupted`|The query was interrupted, possibly due to JVM shutdown.|
|`Query cancelled`|The query was cancelled through the query cancellation API.|
|`Resource limit exceeded`|The query exceeded a configured resource limit (e.g. groupBy maxResults).|
|`Unauthorized request.`|The query was denied due to security policy. Either the user was not recognized, or the user was recognized but does not have access to the requested resource.|
|`Unsupported operation`|The query attempted to perform an unsupported operation. This may occur when using undocumented features or when using an incompletely implemented extension.|
|`Truncated response context`|An intermediate response context for the query exceeded the built-in limit of 7KB.<br/><br/>The response context is an internal data structure that Druid servers use to share out-of-band information when sending query results to each other. It is serialized in an HTTP header with a maximum length of 7KB. This error occurs when an intermediate response context sent from a data server (like a Historical) to the Broker exceeds this limit.<br/><br/>The response context is used for a variety of purposes, but the one most likely to generate a large context is sharing details about segments that move during a query. That means this error can potentially indicate that a very large number of segments moved in between the time a Broker issued a query and the time it was processed on Historicals. This should rarely, if ever, occur during normal operation.|
|`Unknown exception`|Some other exception occurred. Check errorMessage and errorClass for details, although keep in mind that the contents of those fields are free-form and may change from release to release.|
Possible Druid error codes for the `error` field include:

|Error code|HTTP response code|Description|
|----|-----------|-----------|
|`SQL parse failed`|400|Only for SQL queries. The SQL query failed to parse.|
|`Plan validation failed`|400|Only for SQL queries. The SQL query failed to validate.|
|`Resource limit exceeded`|400|The query exceeded a configured resource limit (e.g. groupBy maxResults).|
|`Query capacity exceeded`|429|The query failed to execute because sufficient resources were not available when it was submitted. The resources could be any runtime resources, such as [query scheduler lane capacity](../configuration/index.md#query-prioritization-and-laning), merge buffers, and so on. The error message should have more details about the failure.|
|`Unsupported operation`|501|The query attempted to perform an unsupported operation. This may occur when using undocumented features or when using an incompletely implemented extension.|
|`Query timeout`|504|The query timed out.|
|`Query interrupted`|500|The query was interrupted, possibly due to JVM shutdown.|
|`Query cancelled`|500|The query was cancelled through the query cancellation API.|
|`Truncated response context`|500|An intermediate response context for the query exceeded the built-in limit of 7KB.<br/><br/>The response context is an internal data structure that Druid servers use to share out-of-band information when sending query results to each other. It is serialized in an HTTP header with a maximum length of 7KB. This error occurs when an intermediate response context sent from a data server (like a Historical) to the Broker exceeds this limit.<br/><br/>The response context is used for a variety of purposes, but the one most likely to generate a large context is sharing details about segments that move during a query. That means this error can potentially indicate that a very large number of segments moved in between the time a Broker issued a query and the time it was processed on Historicals. This should rarely, if ever, occur during normal operation.|
|`Unknown exception`|500|Some other exception occurred. Check errorMessage and errorClass for details, although keep in mind that the contents of those fields are free-form and may change from release to release.|
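A client's retry logic can key off the mapping above; a minimal sketch follows, where only the code/status pairs come from the table and the class, method names, and retry heuristic are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative client-side mapping of Druid error codes to the HTTP status
// codes listed in the table above. Class and method names are assumptions.
public class DruidErrorCodes
{
  private static final Map<String, Integer> HTTP_STATUS = new HashMap<>();

  static {
    HTTP_STATUS.put("SQL parse failed", 400);
    HTTP_STATUS.put("Plan validation failed", 400);
    HTTP_STATUS.put("Resource limit exceeded", 400);
    HTTP_STATUS.put("Query capacity exceeded", 429);
    HTTP_STATUS.put("Unsupported operation", 501);
    HTTP_STATUS.put("Query timeout", 504);
    HTTP_STATUS.put("Query interrupted", 500);
    HTTP_STATUS.put("Query cancelled", 500);
    HTTP_STATUS.put("Truncated response context", 500);
    HTTP_STATUS.put("Unknown exception", 500);
  }

  /** Returns the expected HTTP status for a Druid error code, or 500 if unknown. */
  public static int httpStatusFor(String errorCode)
  {
    return HTTP_STATUS.getOrDefault(errorCode, 500);
  }

  /** Capacity rejections (429) are the one table entry usually safe to retry later. */
  public static boolean isRetryable(String errorCode)
  {
    return httpStatusFor(errorCode) == 429;
  }
}
```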
@@ -35,9 +35,19 @@ public class ResourceLimitExceededException extends BadQueryException
{
public static final String ERROR_CODE = "Resource limit exceeded";

public ResourceLimitExceededException(String message, Object... arguments)
public static ResourceLimitExceededException withMessage(String message, Object... arguments)
{
this(ERROR_CODE, StringUtils.nonStrictFormat(message, arguments), ResourceLimitExceededException.class.getName());
return new ResourceLimitExceededException(StringUtils.nonStrictFormat(message, arguments));
}

public ResourceLimitExceededException(String errorCode, String message, String errorClass, String host)
{
super(errorCode, message, errorClass, host);
}

public ResourceLimitExceededException(String message)
{
this(ERROR_CODE, message, ResourceLimitExceededException.class.getName());
}

@JsonCreator
@@ -47,6 +57,6 @@ private ResourceLimitExceededException(
@JsonProperty("errorClass") String errorClass
)
{
super(errorCode, errorMessage, errorClass, resolveHostname());
this(errorCode, errorMessage, errorClass, resolveHostname());
}
}
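The `withMessage` factory above delegates to Druid's `StringUtils.nonStrictFormat`, whose point is that a mismatched format string must not mask the original error. A simplified stand-in for that behavior (this helper is an assumption modeled on the call sites, not Druid's actual implementation):

```java
import java.util.Arrays;
import java.util.IllegalFormatException;

// Simplified stand-in for a "non-strict" formatter: format if possible,
// otherwise fall back to the raw message plus its arguments instead of
// throwing. Modeled on how the withMessage factory above is used; not
// Druid's actual StringUtils.nonStrictFormat.
public class NonStrictFormat
{
  public static String format(String message, Object... arguments)
  {
    try {
      return String.format(message, arguments);
    }
    catch (IllegalFormatException e) {
      // Mismatched specifiers must not hide the original error text.
      return message + "; args=" + Arrays.toString(arguments);
    }
  }
}
```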
@@ -189,7 +189,7 @@ public QueryRunner<ScanResultValue> mergeRunners(

return nWayMergeAndLimit(groupedRunners, queryPlus, responseContext);
}
throw new ResourceLimitExceededException(
throw ResourceLimitExceededException.withMessage(
"Time ordering is not supported for a Scan query with %,d segments per time chunk and a row limit of %,d. "
+ "Try reducing your query limit below maxRowsQueuedForOrdering (currently %,d), or using compaction to "
+ "reduce the number of segments per time chunk, or raising maxSegmentPartitionsOrderedInMemory "
@@ -169,7 +169,7 @@ private void init()
throw timeoutQuery();
} else {
// TODO: NettyHttpClient should check the actual cause of the failure and set it in the future properly.
throw new ResourceLimitExceededException(
throw ResourceLimitExceededException.withMessage(
"Possibly max scatter-gather bytes limit reached while reading from url[%s].",
url
);
@@ -187,7 +187,10 @@ private void init()
);
}
}
catch (IOException | InterruptedException | ExecutionException | CancellationException e) {
catch (ExecutionException | CancellationException e) {
throw convertException(e.getCause() == null ? e : e.getCause());
}
catch (IOException | InterruptedException e) {
throw convertException(e);
}
catch (TimeoutException e) {
@@ -210,7 +213,7 @@ private QueryTimeoutException timeoutQuery()
* based on {@link QueryException#getErrorCode()}. During conversion, {@link QueryException#host} is overridden
* by {@link #host}.
*/
private QueryException convertException(Exception cause)
private QueryException convertException(Throwable cause)
{
LOG.warn(cause, "Query [%s] to host [%s] interrupted", queryId, host);
if (cause instanceof QueryException) {
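The revised catch blocks above unwrap `ExecutionException` and `CancellationException` so that `convertException` inspects the underlying cause (often a `QueryException`) rather than the future's wrapper. The unwrapping step in isolation looks roughly like this (a sketch, not the actual Druid code):

```java
import java.util.concurrent.CancellationException;
import java.util.concurrent.ExecutionException;

// Sketch of the cause-unwrapping step added in the diff above: an
// ExecutionException is just a wrapper, so exception conversion should
// look at its cause when one is present.
public class CauseUnwrapper
{
  public static Throwable unwrap(Exception e)
  {
    if (e instanceof ExecutionException || e instanceof CancellationException) {
      return e.getCause() == null ? e : e.getCause();
    }
    return e;
  }
}
```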
@@ -455,7 +455,10 @@ private static <T, QueryType extends Query<T>> InlineDataSource toInlineDataSour
final int limitToUse = limit < 0 ? Integer.MAX_VALUE : limit;

if (limitAccumulator.get() >= limitToUse) {
throw new ResourceLimitExceededException("Cannot issue subquery, maximum[%d] reached", limitToUse);
throw ResourceLimitExceededException.withMessage(
"Cannot issue subquery, maximum[%d] reached",
limitToUse
);
}

final RowSignature signature = toolChest.resultArraySignature(query);
@@ -466,7 +469,7 @@ private static <T, QueryType extends Query<T>> InlineDataSource toInlineDataSour
resultList,
(acc, in) -> {
if (limitAccumulator.getAndIncrement() >= limitToUse) {
throw new ResourceLimitExceededException(
throw ResourceLimitExceededException.withMessage(
"Subquery generated results beyond maximum[%d]",
limitToUse
);
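The subquery guard above hinges on a shared `AtomicInteger` accumulator: each materialized row bumps the counter, and crossing the limit aborts accumulation. The core pattern, stripped of Druid types (names are illustrative, and a plain `IllegalStateException` stands in for `ResourceLimitExceededException`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Core pattern of the subquery row-limit guard above: a shared counter is
// incremented per accumulated row, and crossing the limit aborts with an
// exception. Names are illustrative; Druid throws
// ResourceLimitExceededException at this point.
public class RowLimitAccumulator
{
  private final AtomicInteger count = new AtomicInteger();
  private final int limit;
  private final List<Object> rows = new ArrayList<>();

  public RowLimitAccumulator(int limit)
  {
    // A negative limit means "unlimited", matching limitToUse above.
    this.limit = limit < 0 ? Integer.MAX_VALUE : limit;
  }

  public void accumulate(Object row)
  {
    if (count.getAndIncrement() >= limit) {
      throw new IllegalStateException(
          String.format("Subquery generated results beyond maximum[%d]", limit));
    }
    rows.add(row);
  }

  public int size()
  {
    return rows.size();
  }
}
```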
@@ -0,0 +1,211 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

package org.apache.druid.client;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.collect.ImmutableList;
import com.google.common.util.concurrent.Futures;
import org.apache.druid.jackson.DefaultObjectMapper;
import org.apache.druid.query.QueryCapacityExceededException;
import org.apache.druid.query.QueryException;
import org.apache.druid.query.QueryInterruptedException;
import org.apache.druid.query.QueryTimeoutException;
import org.apache.druid.query.QueryUnsupportedException;
import org.apache.druid.query.ResourceLimitExceededException;
import org.junit.Rule;
import org.junit.Test;
import org.junit.experimental.runners.Enclosed;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import org.mockito.ArgumentMatchers;
import org.mockito.Mockito;

import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

@RunWith(Enclosed.class)
public class JsonParserIteratorTest
{
private static final JavaType JAVA_TYPE = Mockito.mock(JavaType.class);
private static final String URL = "url";
private static final String HOST = "host";
private static final ObjectMapper OBJECT_MAPPER = new DefaultObjectMapper();

@SuppressWarnings("ResultOfMethodCallIgnored")
public static class FutureExceptionTest
{
@Rule
public ExpectedException expectedException = ExpectedException.none();

@Test
public void testConvertFutureTimeoutToQueryTimeoutException()
{
JsonParserIterator<Object> iterator = new JsonParserIterator<>(
JAVA_TYPE,
Futures.immediateFailedFuture(
new QueryException(
QueryTimeoutException.ERROR_CODE,
"timeout exception conversion test",
null,
HOST
)
),
URL,
null,
HOST,
OBJECT_MAPPER
);
expectedException.expect(QueryTimeoutException.class);
expectedException.expectMessage("timeout exception conversion test");
iterator.hasNext();
}

@Test
public void testConvertFutureCancelationToQueryInterruptedException()
{
JsonParserIterator<Object> iterator = new JsonParserIterator<>(
JAVA_TYPE,
Futures.immediateCancelledFuture(),
URL,
null,
HOST,
OBJECT_MAPPER
);
expectedException.expect(QueryInterruptedException.class);
expectedException.expectMessage("Immediate cancelled future.");
iterator.hasNext();
}

@Test
public void testConvertFutureInterruptedToQueryInterruptedException()
{
JsonParserIterator<Object> iterator = new JsonParserIterator<>(
JAVA_TYPE,
Futures.immediateFailedFuture(new InterruptedException("interrupted future")),
URL,
null,
HOST,
OBJECT_MAPPER
);
expectedException.expect(QueryInterruptedException.class);
expectedException.expectMessage("interrupted future");
iterator.hasNext();
}

@Test
public void testConvertIOExceptionToQueryInterruptedException() throws IOException
{
InputStream exceptionThrowingStream = Mockito.mock(InputStream.class);
IOException ioException = new IOException("ioexception test");
Mockito.when(exceptionThrowingStream.read()).thenThrow(ioException);
Mockito.when(exceptionThrowingStream.read(ArgumentMatchers.any())).thenThrow(ioException);
Mockito.when(
exceptionThrowingStream.read(ArgumentMatchers.any(), ArgumentMatchers.anyInt(), ArgumentMatchers.anyInt())
).thenThrow(ioException);
JsonParserIterator<Object> iterator = new JsonParserIterator<>(
JAVA_TYPE,
Futures.immediateFuture(exceptionThrowingStream),
URL,
null,
HOST,
OBJECT_MAPPER
);
expectedException.expect(QueryInterruptedException.class);
expectedException.expectMessage("ioexception test");
iterator.hasNext();
}
}

@SuppressWarnings("ResultOfMethodCallIgnored")
@RunWith(Parameterized.class)
public static class NonQueryInterruptedExceptionRestoreTest
{
@Parameters(name = "{0}")
public static Iterable<Object[]> constructorFeeder()
{
return ImmutableList.of(
new Object[]{new QueryTimeoutException()},
new Object[]{
QueryCapacityExceededException.withErrorMessageAndResolvedHost("capacity exceeded exception test")
},
new Object[]{new QueryUnsupportedException("unsupported exception test")},
new Object[]{new ResourceLimitExceededException("resource limit exceeded exception test")}
);
}

@Rule
public ExpectedException expectedException = ExpectedException.none();

private final Exception exception;

public NonQueryInterruptedExceptionRestoreTest(Exception exception)
{
this.exception = exception;
}

@Test
public void testRestoreException() throws JsonProcessingException
{
JsonParserIterator<Object> iterator = new JsonParserIterator<>(
JAVA_TYPE,
Futures.immediateFuture(mockErrorResponse(exception)),
URL,
null,
HOST,
OBJECT_MAPPER
);
expectedException.expect(exception.getClass());
expectedException.expectMessage(exception.getMessage());
iterator.hasNext();
}
}

public static class QueryInterruptedExceptionConversionTest
{
@Rule
public ExpectedException expectedException = ExpectedException.none();

@Test
public void testConvertQueryExceptionToQueryInterruptedException() throws JsonProcessingException
{
JsonParserIterator<Object> iterator = new JsonParserIterator<>(
JAVA_TYPE,
Futures.immediateFuture(mockErrorResponse(new QueryException(null, "query exception test", null, null))),
URL,
null,
HOST,
OBJECT_MAPPER
);
expectedException.expect(QueryInterruptedException.class);
expectedException.expectMessage("query exception test");
iterator.hasNext();
}
}

private static InputStream mockErrorResponse(Exception e) throws JsonProcessingException
{
return new ByteArrayInputStream(OBJECT_MAPPER.writeValueAsBytes(e));
}
}