[No QA] fix(ci/test): set node --max-old-space-size=8192 and jest --maxWorkers=2 (#75553)
Conversation
@aimane-chnaif Please copy/paste the Reviewer Checklist from here into a new comment on this PR and complete it.
@mountiny ready for merge

no QA for this PR

All tests are failing
aimane-chnaif
left a comment
Please add [No QA] prefix to the title.
Force-pushed from 390d5c5 to a4577fd
Rerunning the tests after setting the memory limit to 16 GB in the Jest step:

```yaml
- name: Jest tests
  run: |
    export NODE_OPTIONS="--experimental-vm-modules --max-old-space-size=16384"
    npm test -- --silent --shard=${{ fromJSON(matrix.chunk) }}/${{ strategy.job-total }} --maxWorkers=4 --coverage --coverageDirectory=coverage/shard-${{ matrix.chunk }}
```

Ok, looks like it fails. I will just go with this approach.
Force-pushed from a4577fd to a3d7a16
Weird, the test still fails. I will try to set the memory limit to 8 GB and see if it works.
Force-pushed from a3d7a16 to 83dff1e
It’s still failing. I think the problem comes from using sharding together with multiple workers, because running all of these in parallel consumes a lot of memory. I will try decreasing the number of workers to 2 and see if it works.
Force-pushed from 83dff1e to 7474a42
If using 2 workers per shard still fails, I’ll reduce it to 1 worker per shard.
Force-pushed from 7474a42 to 5139c64
Codecov Report: ✅ All modified and coverable lines are covered by tests.

@mountiny all tests are passing. I'll just amend my commit.
Force-pushed from 5139c64 to 2b33489
8 GB × 2 workers = 16 GB per shard (fully utilizing the maximum RAM of the ubuntu-latest runner).

If the heap allocation limit problem arises again in the future, we can increase the memory limit to 16 GB and decrease the workers per shard to 1 (I'm willing to open a follow-up PR): 16 GB × 1 worker = 16 GB per shard.

And the reason why I’m keeping 2 workers per shard instead of 1 is that it makes the workflow finish faster.
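The budget above works out as a quick back-of-the-envelope calculation (values taken from the discussion; treating the Node heap limit as the whole per-worker footprint is a simplification):

```javascript
// Per-worker V8 old-space limit from --max-old-space-size=8192, in GiB.
const heapPerWorkerGiB = 8192 / 1024;
// Workers per shard, from jest --maxWorkers=2.
const workersPerShard = 2;

// Worst-case heap budget on one shard's runner.
const budgetGiB = heapPerWorkerGiB * workersPerShard;
console.log(`${heapPerWorkerGiB} GB x ${workersPerShard} workers = ${budgetGiB} GB per shard`);
// → 8 GB x 2 workers = 16 GB per shard
```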
All tests are passing. Just waiting for the ESLint check to complete, then it’ll be ready to merge.
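For reference, here is a sketch of what the Jest step from earlier in the thread looks like with the final values from the PR title (`--max-old-space-size=8192`, `--maxWorkers=2`); assume the same matrix/shard wiring as before:

```yaml
- name: Jest tests
  run: |
    export NODE_OPTIONS="--experimental-vm-modules --max-old-space-size=8192"
    npm test -- --silent --shard=${{ fromJSON(matrix.chunk) }}/${{ strategy.job-total }} --maxWorkers=2 --coverage --coverageDirectory=coverage/shard-${{ matrix.chunk }}
```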
Reviewer Checklist

Screenshots/Videos
- Android: HybridApp
- Android: mWeb Chrome
- iOS: HybridApp
- iOS: mWeb Safari
- MacOS: Chrome / Safari
- MacOS: Desktop
Force-pushed from 6760805 to 2335bfb
Done rebasing my branch. Waiting for all the checks to finish.
roryabraham
left a comment
I think these changes are problematic for a few reasons:
- Removing the node options from `package.json` means that they won't be applied if someone runs `npm test` locally
- Reducing the number of parallel workers will slow down tests. Why not shard further and add another runner instead?
- Overall it seems like we're not making strong efforts to address the root cause of the tests' very high memory usage.
so the new changes would be:
I reduced the number of Jest workers because we need to increase the memory allocation to 8 GB, and we have to make sure we don’t exceed the RAM limits of the ubuntu-latest runner. Explanation:
I will try to implement this.
Does

Yes, and both, because node is the parent process that spawns the jest process.
This seems like a reasonable plan then:

But still, I'd love to see some investigation into why tests are being so memory-consumptive. That plan only really makes sense if tests across the board are similarly memory-hungry. If we have one or two "bad apple" tests that are consuming lots of memory, then we're just moving the goalposts from 4 GB to 8 GB rather than fixing the root problem.
Force-pushed from 8287b5e to 3379941
Maybe it's because the codebase is so large. The ESLint checks in this codebase also use 8 GB of memory, because 4 GB is not enough and will cause a heap allocation limit error if you set --max-old-space-size to 4 GB.
How about we shard further to 16 runners to make the test workflow even faster, if the test run is not fast enough?
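Sharding further would mostly be a matrix change; a hypothetical sketch (the repo's actual matrix definition isn't shown in this thread):

```yaml
strategy:
  matrix:
    chunk: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]
```

Since the Jest step passes `--shard=${{ fromJSON(matrix.chunk) }}/${{ strategy.job-total }}`, the shard count already scales with the matrix size, so only the chunk list would need to grow.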
I don't think "throw more resources at it" is the most elegant solution proposal when we don't really understand the root cause of the problem.
@roryabraham thanks for taking the time to review this PR, I really appreciate it.
All checks are passing, and this PR will resolve the recent Jest workflow test failures.
```diff
 const TEST_AUTH_TOKEN_2 = 'zxcvbnm';

-jest.setTimeout(60000);
+jest.setTimeout(120000);
```
What's this change for?
To fix the exceeded-timeout error in https://github.com/Expensify/App/actions/runs/19515137212/job/55864876727, I increased the timeout from 1 min to 2 min (1 min = 60,000 ms; 2 min = 120,000 ms).
Congrats, that's your 5th PR merged! 🎉 Do you know about the ContributorPlus role? It's an opportunity to earn more in the Expensify Open Source community. Keep up the great work - thanks!

✋ This PR was not deployed to staging yet because QA is ongoing. It will be automatically deployed to staging after the next production release.
🚀 Deployed to staging by https://github.com/roryabraham in version: 9.2.62-0 🚀

🚀 Deployed to production by https://github.com/marcaaron in version: 9.2.62-5 🚀
Explanation of Change
Fixed Issues
$ #75488
PROPOSAL: #75488 (comment)
Tests
Offline tests
QA Steps
// TODO: These must be filled out, or the issue title must include "[No QA]."
PR Author Checklist
- I linked the correct issue in the `### Fixed Issues` section above
- I wrote clear testing steps that cover the changes made in this PR (`Tests` section, `Offline steps` section, `QA steps` section)
- I included the `canBeMissing` param for `useOnyx` calls
- Event handlers are named like `toggleReport` and not `onIconClick`
- Any new copy/text is added to the `src/languages/*` files and accessed using the translation method
- The style guidelines (`STYLE.md`) were followed
- If a reusable component was modified (e.g. `Avatar`), I verified the components using `Avatar` are working as expected
- Styles are defined using style utilities (e.g. `StyleUtils.getBackgroundAndBorderStyle(theme.componentBG)`)
- New SVGs were compressed (`npm run compress-svg`)
- If `Avatar` is modified, I verified that `Avatar` is working as expected in all cases
- I added the `Design` label and/or tagged `@Expensify/design` so the design team can review the changes
- The page is wrapped in a `ScrollView` component to make it scrollable when more elements are added to the page
- If the `main` branch was merged into this PR after a review, I tested again and verified the outcome was still expected according to the `Tests` steps

Screenshots/Videos
Android: Native
Android: mWeb Chrome
iOS: Native
iOS: mWeb Safari
MacOS: Chrome / Safari
MacOS: Desktop