Closed
Labels
minor bug — Bugs that aren't too bad, only concern documentation, or have easy work-arounds
Description
Currently the attention example is not very efficient, particularly on GPUs. For example, this for loop could be replaced by a single matrix multiplication (which need only be done once per sentence):
https://github.com/clab/dynet/blob/master/examples/python/attention.py#L75
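A minimal sketch of the suggested change, using NumPy (not DyNet's API) and hypothetical shapes, to illustrate why the per-word loop is wasteful: the per-word dot products can be replaced by stacking the encoder states into one matrix once per sentence and doing a single matrix-vector product.

```python
import numpy as np

# Hypothetical sizes: H = hidden dimension, T = source sentence length.
H, T = 4, 5
rng = np.random.default_rng(0)
encoder_states = [rng.standard_normal(H) for _ in range(T)]  # one vector per source word
w = rng.standard_normal(H)  # simplified attention scoring vector (assumption)

# Loop version, analogous to the example's for loop: one dot product per word.
scores_loop = np.array([w @ h for h in encoder_states])

# Batched version: build the (T, H) matrix once per sentence,
# then compute all scores with a single matrix-vector product.
E = np.stack(encoder_states)
scores_batched = E @ w

# Both compute the same scores; the batched form is one GPU-friendly op.
assert np.allclose(scores_loop, scores_batched)
```

The key point is that `E` depends only on the encoder output, so it can be constructed once per sentence and reused at every decoding step.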
Also, here the attention model samples its output at random during generation instead of greedily selecting the most probable word, which would be more in line with what we expect:
https://github.com/clab/dynet/blob/master/examples/python/attention.py#L105
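To make the distinction concrete, here is a small NumPy sketch (again not DyNet's API; the vocabulary and distribution are made up) contrasting random sampling with greedy argmax selection over a next-word distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["<EOS>", "a", "b", "c"]            # hypothetical vocabulary
probs = np.array([0.1, 0.6, 0.2, 0.1])      # hypothetical softmax output

# Current behavior in the example: sample the next word from the distribution.
sampled = rng.choice(len(vocab), p=probs)

# Suggested behavior: greedy decoding, i.e. always take the most probable word.
greedy = int(np.argmax(probs))

# Greedy decoding deterministically picks "a" here; sampling may not.
assert vocab[greedy] == "a"
```

Greedy decoding is deterministic and typically what one wants when showing the model's "best" translation, whereas sampling is useful for exploring the output distribution.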