Conversation

@ihsaan-ullah
Collaborator

@ihsaan-ullah ihsaan-ullah commented Apr 7, 2023

@ mention of reviewers

@Didayolo @bbearce

A brief description of the purpose of the changes contained in this PR.

In the leaderboard, scores are now rounded to `precision` decimal places. `precision` can be provided in the YAML file as shown below:

leaderboards:
  - index: 0 
    title: Results
    key: main
    submission_rule: "Force_Last"
    columns:
      - index: 0
        title: average auc
        key: auc
        sorting: desc
        precision: 3
      - index: 1
        title: average bac
        key: bac
        sorting: desc
        precision: 4
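The per-column rounding configured above can be sketched as follows. This is a minimal illustration with a hypothetical `format_score` helper and an assumed default precision of 2; it is not the actual Codabench rendering code.

```python
from decimal import Decimal, ROUND_HALF_UP

def format_score(score, precision=2):
    """Round a raw score to a column's configured precision.

    `precision` mirrors the per-column `precision` key in the
    leaderboard YAML; 2 is assumed as the default when the key
    is absent. Hypothetical helper, not the actual Codabench code.
    """
    quantum = Decimal(10) ** -precision  # e.g. Decimal('0.001') for precision=3
    return Decimal(str(score)).quantize(quantum, rounding=ROUND_HALF_UP)

print(format_score(0.8876543, 3))   # precision=3 -> 0.888
print(format_score(0.91234567, 4))  # precision=4 -> 0.9123
```

Going through `Decimal` avoids the binary-float surprises that plain `round()` on floats can produce at display time.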

Issues this PR resolves

Fixes the overly long decimal numbers shown for scores in the leaderboard.

Screenshots of the solved problem

Screenshot 2023-04-07 at 8 02 34 PM

Screenshot 2023-04-07 at 8 02 50 PM

A checklist for hand testing

  • run makemigrations, migrate, and collectstatic for testing

Checklist

  • Code review by me
  • Hand tested by me
  • I'm proud of my work
  • Code review by reviewer
  • Hand tested by reviewer
  • CircleCi tests are passing
  • Ready to merge

Ihsan Ullah added 2 commits April 7, 2023 19:58
….yaml file, for each column a new variable 'precision' (integer) can be added
@Didayolo
Member

@ihsaan-ullah

Do we also have a new field in the editor for the precision?

@ihsaan-ullah
Collaborator Author

New field in the editor added
Screenshot 2023-04-13 at 2 48 47 PM

@Didayolo
Member

Didayolo commented Apr 21, 2023

@ihsaan-ullah

  • For some reason I can't see the "Precision" column in the editor.
    I have run the migration and collectstatic.

  • In CodaLab Competitions (v1), this field is called numeric_format in the YAML files. I think the v1-unpacker should be edited to convert numeric_format into precision.

  • When I tried to set the precision in a v1 bundle, it did not work. The precision was set to the default value. This should be fixed by taking into account the previous point.
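The `numeric_format` → `precision` conversion suggested above could look roughly like this. A sketch only: the column-dict shape, string-valued `numeric_format`, and the default of 2 are assumptions, not the actual v1-unpacker code.

```python
def convert_v1_column(v1_column):
    """Map a CodaLab v1 leaderboard column to the v2 schema.

    In v1 bundles the rounding is expressed as `numeric_format`;
    v2 calls it `precision`. Sketch of the idea, not the actual
    unpacker implementation.
    """
    column = dict(v1_column)  # leave the caller's dict untouched
    if "numeric_format" in column:
        # v1 bundles may store numeric_format as a string, e.g. "4"
        column["precision"] = int(column.pop("numeric_format"))
    else:
        column.setdefault("precision", 2)  # assumed v2 default
    return column

print(convert_v1_column({"label": "AUC", "numeric_format": "4"}))
```

Without a mapping like this, a v1 bundle's `numeric_format` is silently dropped and the precision falls back to the default, which matches the behavior described in the third bullet above.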

@ihsaan-ullah
Collaborator Author

ihsaan-ullah commented Apr 22, 2023

I followed these steps and I can see precision:

  • pull this branch
  • docker down and up
  • collect static: docker-compose exec django ./manage.py collectstatic --noinput
  • hard refresh (macOS: Command + Shift + R)
  • uploaded a fresh competition which does not have this field in YAML

Screenshot 2023-04-22 at 5 54 39 PM

I am a bit confused about the problem with the V1 bundle.

@Didayolo
Member

Last point: updating the documentation https://github.com/codalab/codabench/wiki/Yaml-Structure

@ihsaan-ullah
Collaborator Author

ihsaan-ullah commented Apr 25, 2023

@Didayolo I think I have covered all the points

  • Documentation updated for precision
  • V1 unpacker updated to take into account numeric_format
  • V1 unpacker tested on the V1 Iris bundle

YAML from Iris bundle:

numeric_format = 2 and 4
Screenshot 2023-04-26 at 12 18 52 AM

Converted into precision as shown in the editor:
Screenshot 2023-04-26 at 12 15 01 AM

@ihsaan-ullah
Collaborator Author

@bbearce

The rounding to 2 decimal places we added fails in this situation because one of the V1 bundles has numeric_format = 4 rather than 2.

What should be done in this case?

@Didayolo Didayolo self-assigned this Apr 27, 2023
@Didayolo
Member

Didayolo commented May 3, 2023

@ihsaan-ullah Your last commit seems a bit strange.

@ihsaan-ullah
Collaborator Author

ihsaan-ullah commented May 3, 2023

I changed the function _run_submission_and_add_to_leaderboard to take one more parameter, precision, which defaults to 2. In all bundles it is 2, so the default value is used when rounding the leaderboard score.

Now, in the specific case of the v15 bundle, I passed precision=4 to _run_submission_and_add_to_leaderboard, because for this bundle the precision/numeric_format is 4.

I just checked the same file here: #828 and it looks like the difference is just the addition of test_v15_iris_result_submission_end_to_end and test_v15_iris_code_submission_end_to_end.

My changes to the test will not affect the new additions as these will use the default parameter precision=2.

To clarify a bit more: the rounding was already in the test, but the precision was hardcoded like this:

assert Decimal(self.find('leaderboards table tbody tr:nth-of-type(1) td:nth-of-type(3)').text) == round(Decimal(prediction_score), 2)

And now this is changed to

assert Decimal(self.find('leaderboards table tbody tr:nth-of-type(1) td:nth-of-type(3)').text) == round(Decimal(prediction_score), precision)
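To illustrate why the hardcoded 2 had to become a parameter: with numeric_format = 4 the leaderboard cell displays four decimal places, so comparing against a value rounded to 2 places fails. The scores below are made up for illustration; only the round(Decimal(...), precision) pattern comes from the test above.

```python
from decimal import Decimal

displayed = Decimal("0.7407")       # hypothetical cell text rendered with precision=4
raw_score = Decimal("0.74074074")   # hypothetical raw prediction score

# Passes once the precision is parameterized to match the bundle (4)...
assert displayed == round(raw_score, 4)
# ...whereas the old hardcoded 2 produces 0.74, which no longer matches.
assert displayed != round(raw_score, 2)
```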

@ihsaan-ullah ihsaan-ullah deleted the leaderboard_score_precision branch May 4, 2023 10:18