Conversation

@vegaluisjose
Member

This PR adds support for testing MobileNet end-to-end (sw/hw) using the Verilator backend. It is built on top of this commit.
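For context, a minimal software-only sketch of the reference flow (the model file name, input name/shape, and target below are assumptions, not the exact code in this PR): import the TFLite MobileNet model into Relay, compile it, run an input through it, and take the top-1 class.

# Minimal software-only sketch; file name, input name/shape, and target are assumptions.
import numpy as np
import tflite
import tvm
from tvm import relay
from tvm.contrib import graph_executor

with open("mobilenet_v1_1.0_224.tflite", "rb") as f:
    model = tflite.Model.GetRootAsModel(f.read(), 0)

mod, params = relay.frontend.from_tflite(
    model,
    shape_dict={"input": (1, 224, 224, 3)},
    dtype_dict={"input": "float32"},
)

# The actual test additionally offloads operators (e.g. bias_add) to the
# Verilator backend before compilation; that step is omitted here.
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target="llvm", params=params)

rt = graph_executor.GraphModule(lib["default"](tvm.cpu()))
rt.set_input("input", np.zeros((1, 224, 224, 3), dtype="float32"))
rt.run()
res = rt.get_output(0).numpy()
print(int(np.argmax(np.squeeze(res))))  # with a real elephant image this should be 387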

@tmoreau89

@tqchen
Member

tqchen commented May 12, 2021

Thanks @vegaluisjose. Just want to make sure that the MobileNet simulation won't take a long time to finish; otherwise, it would still be good to have unit-test models that finish in a reasonable amount of time. We can move the MobileNet tests to a nightly category later once we support nightlies.

@tmoreau89
Contributor

Thank you @vegaluisjose, the changes LGTM. To alleviate @tqchen's concerns, can you comment on how long the simulations take on MobileNet? We're only simulating bias_add on a small RTL design, so I presume the runtime won't be as long as some of the other TSIM tests on full VTA designs that offload entire conv2ds, but let's make sure we keep tabs on runtime. Thanks!

@vegaluisjose
Member Author

It takes about the same time as running the pure software version.

@tmoreau89
Contributor

I ran the two tests on my local machine, test_mobilenet.py and test_verilator_ops.py; the total runtime is 22.06s. @tqchen I'm going to approve the PR, but let us know if you think that's too long. IMO that's a reasonable runtime.
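For reference, one way to reproduce that timing locally; the test file paths below are assumptions based on the file names above, not taken from this PR.

# Hypothetical way to reproduce the timing; the test paths are assumptions.
import pytest

pytest.main([
    "tests/python/contrib/test_verilator/test_mobilenet.py",
    "tests/python/contrib/test_verilator/test_verilator_ops.py",
    "--durations=0",  # report per-test runtimes
])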

@tmoreau89 (Contributor) left a comment

Thanks @vegaluisjose; I left a nit that would be best to address if you have the bandwidth.

predictions = np.squeeze(res)
prediction = np.argmax(predictions)
# 387 is the elephant
tvm.testing.assert_allclose(prediction, 387, rtol=1e-5, atol=1e-5)
Contributor

In this case can't we just assert that prediction is equal to 387?
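For instance, something along these lines would do (a sketch of the suggested simplification, not the exact change that landed):

# The prediction is an integer class index, so exact equality is enough.
prediction = int(np.argmax(np.squeeze(res)))
assert prediction == 387  # 387 is the elephant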

Member Author

Fixed

@tmoreau89
Contributor

Weird, it looks like we got a CI error, but it's probably unrelated to the changes. Could you re-trigger CI, @vegaluisjose?

tmoreau89 merged commit dbd076a into apache:main on May 18, 2021
@tmoreau89
Contributor

Thank you @vegaluisjose, the PR has been merged.

trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Jun 17, 2021
* update

* update vta submodule

* cpp fmt

* python fmt

* skip if tflite is not available

* fmt

* change assertion

* update comment

trevor-m pushed a commit to neo-ai/tvm that referenced this pull request Jun 17, 2021