Datastore error on great amount of data #1386

@mik115

Description

Hi,
I'm facing a really strange issue.
I have a Datastore save request that works fine when the entity is small,
but when we try to save a large amount of data we receive this error:

{ handle: 2,
  type: 'error',
  className: 'Error',
  constructorFunction: { ref: 8 },
  protoObject: { ref: 9 },
  prototypeObject: { ref: 1 },
  properties:
   [ { name: 'stack', attributes: 2, propertyType: 3, ref: 1 },
     { name: 'message', attributes: 2, propertyType: 0, ref: 10 },
     { name: 'code', propertyType: 0, ref: 11 },
     { name: 'metadata', propertyType: 0, ref: 12 } ],
  text: 'Error: Bad Request' }

It crashes the node process with this stack trace:

  < /Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/datastore/entity.js:300
  <   throw new Error('Unsupported field value, ' + value + ', was provided.');
  <   ^
  < Error: Unsupported field value, undefined, was provided.
  <     at Object.encodeValue (/Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/datastore/entity.js:300:9)
  <     at /Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/datastore/entity.js:652:24
  <     at Array.map (native)
  <     at Object.queryToQueryProto (/Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/datastore/entity.js:646:33)
  <     at makeRequest (/Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/datastore/request.js:495:21)
  <     at Object.makeRequest (/Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/common/util.js:579:21)
  <     at DestroyableTransform.<anonymous> (/Users/michelepini/crawler.analytics.igenius.net/node_modules/gcloud/lib/datastore/request.js:485:13)
  <     at DestroyableTransform.g (events.js:286:16)
  <     at emitOne (events.js:96:13)
  <     at DestroyableTransform.emit (events.js:188:7)

The error shows up when I try to save an entity with two unindexed string fields of just under 1 MB each,
e.g.

{
  data: [0.9 MB string, not indexed],
  data2: [0.9 MB string, not indexed],
  [other indexed strings of no more than 40 bits]
}
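A minimal sketch of how such an entity would be built with gcloud-node's array form of `data`, which lets each property be marked as unindexed (the kind, property names, and helper function here are assumptions for illustration, not taken from my actual code):

```javascript
// Build entity data in gcloud-node's array form; properties flagged with
// excludeFromIndexes: true are stored but not indexed, which is required
// for string values above the indexed-string size limit.
function buildEntityData(big1, big2, url) {
  return [
    { name: 'data', value: big1, excludeFromIndexes: true },
    { name: 'data2', value: big2, excludeFromIndexes: true },
    { name: 'url', value: url } // short string, indexed by default
  ];
}

// With a gcloud-node dataset this would then be saved roughly as:
// dataset.save({
//   key: dataset.key('CrawlResult'),
//   data: buildEntityData(chunk1, chunk2, pageUrl)
// }, callback);
```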

We chunk our data into strings of less than 1 MB to respect Datastore's maximum string-size limit.
I'm pretty sure the issue is about the total amount of data, because no error is raised when we save only one chunk.
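The chunking step can be sketched like this (the function name and the exact limit are assumptions; the point is to split on UTF-8 byte length, not character count):

```javascript
// Split a string into chunks whose UTF-8 byte length stays under `limit`,
// so each chunk fits below Datastore's ~1 MB string-size cap.
// Note: this simple per-character scan does not specially handle
// surrogate pairs; it is a sketch, not a hardened implementation.
function chunkString(str, limit) {
  var chunks = [];
  var start = 0;
  var bytes = 0;
  for (var i = 0; i < str.length; i++) {
    var charBytes = Buffer.byteLength(str[i], 'utf8');
    if (bytes + charBytes > limit) {
      chunks.push(str.slice(start, i));
      start = i;
      bytes = 0;
    }
    bytes += charBytes;
  }
  if (start < str.length) {
    chunks.push(str.slice(start));
  }
  return chunks;
}
```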

Is there an upper limit on the total entity size, or is this a bug?

Thanks in advance

Metadata

Labels

api: datastore (Issues related to the Datastore API.)
type: question (Request for information or clarification. Not an issue.)
