Increase AQL memory limit to 1GiB from 20MB #88
Conversation
This would likely not fix the issue in production if you truly do need the whole 1GB, as our prod dyno currently only has 512MB of RAM. I think increasing it from 20MB is a good idea, though, so if you think your problem can be solved with less than 512MB of RAM, I'd say let's adjust this to meet that requirement and proceed. If 512MB still isn't enough, we might have to consider upgrading our dyno type.
The memory limit isn't used to change the amount of memory the dyno uses; it controls how much memory can be used on the Arango server to generate the query results. The use case here is that the query takes around 1GB to run there but returns only 100 rows, which is far smaller than the 512MB of RAM on the dyno. I'm not sure we should be using
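For context, this limit is a per-query, server-side setting measured in bytes; the values below restate the old and new limits from the PR title. This is a minimal sketch of the arithmetic only, assuming the limit is passed as a `memoryLimit` query option (the exact name of the config key in this codebase isn't shown in the thread):

```python
# Per-query AQL memory limit, in bytes. These constants mirror the
# old (20 MB) and new (1 GiB) values discussed in this PR; the
# "memoryLimit" option name is an assumption for illustration.
OLD_LIMIT = 20 * 1024 * 1024   # 20 MB  = 20,971,520 bytes
NEW_LIMIT = 1 * 1024 ** 3      # 1 GiB  = 1,073,741,824 bytes

# Hypothetical query options dict as it might be passed to the server:
query_options = {"memoryLimit": NEW_LIMIT}
```

Note the limit applies to the memory the server may use while *computing* the result, not to the size of the result set shipped back to the client, which is why a 100-row result can still need ~1GB to produce.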
I see, my bad then. In that case, I think this is fine. If we need to constrain memory on the dyno itself, that can be addressed separately.
I think I know why the tests are failing, so I can fix that up and then approve this.
Force-pushed from e035c3e to 6cebc13
Rebased onto main to trigger testing again
Since the tests are still failing because of
Depends on #89
This fixes an issue I've been having on UPDB with applying filters to a query. When I would add the filters, I'd hit this memory limit.
Increasing the limit here fixes the problem. There are still issues on the main client with showing the number of nodes and the "node types". I think we can modify the queries that run on /nodes/?offset=0&limit=10 to reduce their memory use, since they shouldn't need to load every table to get the count.