Transformers v4.12.0 compatible #107
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
```diff
@@ -1,10 +1,9 @@
 #!/bin/bash
 source utils.sh
 if [[ $SKIP_BASELINE -eq 0 ]]; then
-  export BASELINE_REPO=$CACHE_DIR/transformers_v3.0.2
-  #https://github.com/huggingface/transformers.git \
+  export BASELINE_REPO=$CACHE_DIR/transformers_v4.12.0
   git_clone_if_not_in_cache \
-    https://github.com/JiushengChen/transformers.git \
```
**Contributor:** Some context about this forked repo: it exists to add the `no_repeat_ngram_size` parameter, see

**Contributor (Author):** To clarify, this is no longer needed since fastseq uses its own `run_eval_hf.py` for the baseline?
```diff
+    https://github.com/huggingface/transformers.git \
     $BASELINE_REPO \
-    v3.0.2-ngram
+    v4.12.0
 fi
```
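The diff above relies on a `git_clone_if_not_in_cache` helper sourced from `utils.sh`. Its implementation is not shown in this PR; the sketch below is a hypothetical reconstruction of what such a helper typically does (clone a repo at a given tag into a cache directory only if it is not already present) and may differ from the real one:

```shell
#!/bin/bash
# Hypothetical sketch of git_clone_if_not_in_cache (assumed, not from
# the PR): skip the clone entirely when the cache directory exists.
git_clone_if_not_in_cache() {
    local url=$1 dest=$2 tag=$3
    if [ ! -d "$dest" ]; then
        # Shallow clone of just the requested tag keeps the cache small.
        git clone --depth 1 --branch "$tag" "$url" "$dest"
    fi
}
```

Because the clone is skipped whenever the directory exists, re-running the benchmark setup is cheap and offline-friendly once the cache is warm.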
```diff
@@ -0,0 +1,41 @@
+#!/bin/bash
+# Run it at its parent folder, and check result at ../perf.
+# USAGE - ./benchmark.sh
+#     [fairseq|fairseq+fastseq|transformers|transformers+fastseq]
+#     <model>
+#     <task>
+#     <split>          # train/val/test (text) or train/valid/test (binary)
+#     <batch-sizes>
+source hf.sh
+
+# MODEL - ProphetNet from transformers
+# TASK - CNN/DM val full set
+./benchmark.sh \
+    transformers \
+    microsoft/prophetnet-large-uncased \
+    cnn_dm_bert/raw \
+    val \
+    128 \
+    --task summarization \
+    --no_repeat_ngram_size 3
+./benchmark.sh \
+    transformers+fastseq \
+    microsoft/prophetnet-large-uncased \
+    cnn_dm_bert/raw \
+    val \
+    128 \
+    --task summarization \
+    --no_repeat_ngram_size 3
+
+# Accuracy
+grep "microsoft/prophetnet-large-uncased cnn_dm_bert/raw val " perf \
+    | awk '{print $9}' \
+    | awk -F'|' '{if($1!="NA"){c+=1;s+=$1}}END{print s/c}' \
+    | ./range.sh 0.230 0.232
+# Speed on V100 16GB 250W. Note "\+": with grep -E, an unescaped "+"
+# means "one or more of the preceding character" and would not match
+# the literal "+" in the perf line.
+grep -E "transformers_v4.12.0 microsoft/prophetnet-large-uncased cnn_dm_bert/raw val 128 " perf \
+    | awk '{s+=$13}END{if(NR==0) print -1; else print s/NR}' \
+    | ./range.sh 3 4
+grep -E "transformers_v4.12.0\+fastseq_v.* microsoft/prophetnet-large-uncased cnn_dm_bert/raw val 128 " perf \
+    | awk '{s+=$13}END{if(NR==0) print -1; else print s/NR}' \
+    | ./range.sh 6 100
```
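The accuracy check above averages a pipe-separated score column while skipping `NA` rows, then pipes the mean into `range.sh`. The first function below isolates that exact awk averaging step so it can be tried on sample input; the second is a hypothetical sketch of `range.sh` (its implementation is not part of this PR), assumed to read one number on stdin and exit non-zero unless it falls in `[lo, hi]`:

```shell
#!/bin/bash
# Averaging step from the accuracy check: field 1 (pipe-separated)
# holds a score or "NA"; NA rows are skipped, the rest are averaged.
avg_scores() {
    awk -F'|' '{ if ($1 != "NA") { c += 1; s += $1 } } END { print s / c }'
}

# Hypothetical sketch of range.sh (assumed, not from the PR): succeed
# only when the value read from stdin is within [lo, hi].
range_check() {
    local lo=$1 hi=$2 val
    read -r val
    awk -v v="$val" -v lo="$lo" -v hi="$hi" \
        'BEGIN { exit !(v + 0 >= lo + 0 && v + 0 <= hi + 0) }'
}
```

With this shape, a benchmark run fails fast (non-zero exit) when the averaged ROUGE-like score or throughput drifts outside the expected band.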
This file was deleted.