Basic performance testing #22

@jacobsa

Description

We should do some performance testing to identify completely obvious bottlenecks.

  • Run on GCE, on a mid-range machine.
  • Generate random data of various sizes (1 KiB, 1 MiB, 10 MiB, 100 MiB, 1 GiB, 10 GiB?)
  • Use time cp to measure wall time taken to copy from local disk to GCS.
  • Find some way to measure CPU time taken by the gcsfuse process during the operation, too.
  • Look at throughput in terms of bytes per wall second and bytes per CPU second.
  • As a baseline, compare both to gsutil cp. (Both can be measured here by just using time gsutil cp, I think.) A rough shell sketch of these steps follows.
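
Something like the sketch below might work. MOUNT_POINT, BUCKET, the size list, and the output file names are placeholders rather than anything from the repo; gcsfuse CPU time is read from /proc/<pid>/stat before and after each copy, and the gsutil baseline is timed for wall clock only.

```bash
#!/usr/bin/env bash
# Rough benchmarking sketch for the steps above. Run on the GCE test machine
# with a gcsfuse mount already set up at $MOUNT_POINT backed by $BUCKET.
set -eu

MOUNT_POINT=${MOUNT_POINT:-/mnt/gcsfuse}   # placeholder gcsfuse mount point
BUCKET=${BUCKET:-my-test-bucket}           # placeholder GCS bucket
SIZES_MIB="1 10 100 1024"                  # extend with 10240 for the 10 GiB case;
                                           # the 1 KiB case needs dd bs=1K count=1
CLK_TCK=$(getconf CLK_TCK)

# Cumulative CPU time (user + system) of the gcsfuse process, in clock ticks,
# from fields 14 and 15 of /proc/<pid>/stat.
gcsfuse_cpu_ticks() {
  local pid
  pid=$(pgrep -o gcsfuse)
  awk '{print $14 + $15}' "/proc/${pid}/stat"
}

for size in $SIZES_MIB; do
  src="/tmp/rand_${size}MiB.bin"
  # Generate random data of the requested size on local disk.
  dd if=/dev/urandom of="$src" bs=1M count="$size" status=none

  # Copy through the gcsfuse mount, recording wall time and the delta in
  # gcsfuse CPU time across the copy.
  t0=$(gcsfuse_cpu_ticks); w0=$(date +%s.%N)
  cp "$src" "${MOUNT_POINT}/rand_${size}MiB.bin"
  w1=$(date +%s.%N); t1=$(gcsfuse_cpu_ticks)

  wall=$(echo "$w1 - $w0" | bc -l)
  cpu=$(echo "($t1 - $t0) / $CLK_TCK" | bc -l)
  # Throughput per CPU second is only meaningful if the copy used measurable CPU.
  if [ "$(echo "$cpu > 0" | bc -l)" -eq 1 ]; then
    cpu_tp=$(echo "$size / $cpu" | bc -l)
  else
    cpu_tp="n/a"
  fi
  echo "gcsfuse ${size} MiB: wall=${wall}s cpu=${cpu}s" \
    "wall-throughput=$(echo "$size / $wall" | bc -l) MiB/s" \
    "cpu-throughput=${cpu_tp} MiB/s"

  # Baseline: the same file uploaded with gsutil cp (wall time only here).
  w0=$(date +%s.%N)
  gsutil -q cp "$src" "gs://${BUCKET}/rand_${size}MiB_gsutil.bin"
  w1=$(date +%s.%N)
  echo "gsutil  ${size} MiB: wall=$(echo "$w1 - $w0" | bc -l)s"
done
```

Reading /proc/<pid>/stat before and after each copy attributes CPU seconds to that copy without a profiler; for the gsutil baseline, plain time gsutil cp already reports user/sys time directly, since gsutil does the work in its own process.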
