@@ -123,7 +123,7 @@ void doReindexTest(
     queryResponseTemplate = StringUtils.replace(
         queryResponseTemplate,
         "%%DATASOURCE%%",
-        fullBaseDatasourceName
+        fullReindexDatasourceName
     );

     queryHelper.testQueriesFromString(queryResponseTemplate, 2);
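For context, the change above switches which datasource name is substituted into the `%%DATASOURCE%%` placeholder of the query template before the queries are run, so the post-reindex queries are validated against the reindex datasource rather than the base one. A minimal sketch of that substitution mechanism (using plain `String.replace` instead of Commons Lang's `StringUtils.replace`; the template string and datasource name here are hypothetical stand-ins):

```java
public class TemplateSubstitutionSketch
{
  public static void main(String[] args)
  {
    // Hypothetical query template, mirroring the shape of the queries in
    // wikipedia_reindex_queries.json
    String queryResponseTemplate =
        "{\"queryType\":\"timeBoundary\",\"dataSource\":\"%%DATASOURCE%%\"}";

    // Hypothetical reindex datasource name; the PR changes the substitution
    // from the base datasource name to the reindex datasource name.
    String fullReindexDatasourceName = "wikipedia_reindex_test";

    // Fill the placeholder, as the test helper does before running queries
    String resolved =
        queryResponseTemplate.replace("%%DATASOURCE%%", fullReindexDatasourceName);

    System.out.println(resolved);
  }
}
```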
@@ -33,6 +33,7 @@ public class ITIndexerTest extends AbstractITBatchIndexTest
   private static String INDEX_DATASOURCE = "wikipedia_index_test";

   private static String REINDEX_TASK = "/indexer/wikipedia_reindex_task.json";
+  private static String REINDEX_QUERIES_RESOURCE = "/indexer/wikipedia_reindex_queries.json";
   private static String REINDEX_DATASOURCE = "wikipedia_reindex_test";

   @Test
@@ -52,7 +53,7 @@ public void testIndexData() throws Exception
       INDEX_DATASOURCE,
       REINDEX_DATASOURCE,
       REINDEX_TASK,
-      INDEX_QUERIES_RESOURCE
+      REINDEX_QUERIES_RESOURCE
     );
  }
}
@@ -0,0 +1,66 @@
[
Contributor:

I guess we don't need this new query? It's the same as wikipedia_index_queries.json, except that rows is missing in this query.

Contributor Author:

The differences are:

  • maxTime in the first query is different (because the second ingest pulls a subset of time)
  • not asking for the now-gone count metric in the second query (is that what you mean by rows?)

Are you proposing not running any queries at all after the reingest? It seems reasonable to validate that some data ended up in the new datasource.

Contributor:

Oh, I see. I missed the first change. It looks good to me. Thanks.

  {
    "description": "timeseries, 1 agg, all",
    "query": {
      "queryType": "timeBoundary",
      "dataSource": "%%DATASOURCE%%"
    },
    "expectedResults": [
      {
        "timestamp": "2013-08-31T01:02:33.000Z",
        "result": {
          "minTime": "2013-08-31T01:02:33.000Z",
          "maxTime": "2013-08-31T12:41:27.000Z"
        }
      }
    ]
  },

  {
    "description": "having spec on post aggregation",
    "query": {
      "queryType": "groupBy",
      "dataSource": "%%DATASOURCE%%",
      "granularity": "day",
      "dimensions": [
        "page"
      ],
      "filter": {
        "type": "selector",
        "dimension": "language",
        "value": "zh"
      },
      "aggregations": [
        {
          "type": "longSum",
          "fieldName": "added",
          "name": "added_count"
        }
      ],
      "postAggregations": [
        {
          "type": "arithmetic",
          "name": "added_count_times_ten",
          "fn": "*",
          "fields": [
            {"type": "fieldAccess", "name": "added_count", "fieldName": "added_count"},
            {"type": "constant", "name": "const", "value": 10}
          ]
        }
      ],
      "having": {"type": "greaterThan", "aggregation": "added_count_times_ten", "value": 9000},
      "intervals": [
        "2013-08-31T00:00/2013-09-01T00:00"
      ]
    },
    "expectedResults": [
      {
        "version": "v1",
        "timestamp": "2013-08-31T00:00:00.000Z",
        "event": {
          "added_count_times_ten": 9050.0,
          "page": "Crimson Typhoon",
          "added_count": 905
        }
      }
    ]
  }
]
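As a sanity check on the expected result above: the having clause keeps only rows whose added_count_times_ten exceeds 9000, and for the Crimson Typhoon row 905 * 10 = 9050, which passes. A small sketch of that post-aggregation-plus-having logic (the aggregation and post-aggregation names come from the query; the per-page input values are hypothetical stand-ins for the aggregated results, not real query output):

```java
import java.util.List;
import java.util.Map;

public class HavingSpecSketch
{
  public static void main(String[] args)
  {
    // Hypothetical per-page longSum(added) results after the "zh" filter
    List<Map.Entry<String, Long>> addedCounts = List.of(
        Map.entry("Crimson Typhoon", 905L),
        Map.entry("Some Other Page", 57L)
    );

    for (Map.Entry<String, Long> row : addedCounts) {
      // postAggregation: added_count_times_ten = added_count * 10
      double addedCountTimesTen = row.getValue() * 10.0;
      // having: greaterThan 9000 keeps only qualifying rows
      if (addedCountTimesTen > 9000) {
        System.out.println(row.getKey() + " -> " + addedCountTimesTen);
      }
    }
  }
}
```

Only the Crimson Typhoon row (9050.0) survives the having filter, matching the single expected event in the query above.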