
UI to run fitting experiments #290

Merged
jonc125 merged 38 commits into master from 277-run-fitting-experiment
Oct 6, 2020
Conversation

@helenst (Contributor) commented Sep 22, 2020

Addresses #277

@helenst helenst linked an issue Sep 22, 2020 that may be closed by this pull request
@helenst helenst marked this pull request as ready for review September 24, 2020 07:56
@helenst helenst requested a review from jonc125 September 24, 2020 07:56
@jonc125 jonc125 changed the base branch from master to 203-fitting-result September 24, 2020 11:02
@jonc125 (Contributor) left a comment

This is looking good! I've played around with it live and set 4 fitting experiments running. A few ideas from that and looking at the code so far. I still need to read through the tests of forms & views.

Comment thread weblab/fitting/views.py
Comment thread weblab/fitting/views.py Outdated
Comment thread weblab/fitting/views.py Outdated
Comment thread weblab/fitting/views.py
Comment thread weblab/fitting/views.py Outdated
Comment thread weblab/fitting/views.py
Comment thread weblab/templates/entities/entity_version.html Outdated
@helenst (Contributor, Author) commented Sep 24, 2020

TODO:

  • add check_access_token test cases for fitting / dataset support
  • revisit the way logic works on the form (re. fields restricting one another)
  • make rerun work for fitting experiments

@jonc125 (Contributor) commented Sep 24, 2020

I've just noticed a slight oddity with the drop-downs: once you have selected one model (say), you then need to select the '----' entry before you can click again to see the full list of models!

@helenst (Contributor, Author) commented Sep 25, 2020

> I've just noticed a slight oddity with the drop-downs: once you have selected one model (say), you then need to select the '----' entry before you can click again to see the full list of models!

Yeah, this arises because of the restrictions that various other fields place on one another (mostly protocols, fitting specs and datasets) - I may need to revisit the way this logic works as it's not very user friendly.

Comment thread weblab/fitting/views.py Outdated
@helenst helenst force-pushed the 277-run-fitting-experiment branch from 6a5cf4a to e4cc33d Compare September 25, 2020 14:20
@helenst helenst force-pushed the 277-run-fitting-experiment branch from e4cc33d to 2003eee Compare September 29, 2020 11:19
Comment thread weblab/templates/fitting/fittingresultversion_detail.html Outdated
@jonc125 jonc125 changed the base branch from 203-fitting-result to master October 2, 2020 10:57
@helenst (Contributor, Author) commented Oct 5, 2020

OK, thinking some more about how the protocols, datasets and fitting specs restrict one another:

Say we select the protocol first, then that will restrict datasets and fitting specs to those connected to that protocol.
Currently it also restricts the protocol choice to just that protocol, and you have to deselect the protocol to get the full list back. We could have it keep all protocols available, but...

If you select the protocol as above and then select the dataset, the dataset selection will still restrict the protocol to just the selected protocol, because the two fields mutually control one another: both stay locked to each other until you fully deselect one of them. I guess that's a side effect of having the restrictions working in both directions. It's similar for any pairing of protocol, dataset and fitting spec.

I guess one answer might be to keep some sense of the order in which things should be selected - e.g. protocol -> datasets -> fitting specs - but that might be a bit restrictive for the user. Or maybe have a "clear selections" button to reset all the dropdowns (except those preselected on page load).

Or perhaps the dropdowns should keep the "invalid" options for those fields, indicating that they are so, but also allowing those options to be selected, which would then trigger a reset of values in the other dropdowns.
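The lock-in described above can be sketched roughly like this. This is a hypothetical model, assuming the current logic filters every menu by all selections, including the menu's own; the field names and compatibility pairs are illustrative, not Web Lab's actual data:

```python
# Illustrative compatibility table: which (protocol, dataset) pairs work together.
COMPATIBLE = {
    ("protocol-A", "dataset-1"),
    ("protocol-A", "dataset-2"),
    ("protocol-B", "dataset-3"),
}
ALL = {
    "protocol": ["protocol-A", "protocol-B"],
    "dataset": ["dataset-1", "dataset-2", "dataset-3"],
}

def menu_options(field, selections):
    """Hypothetical current behaviour: every selection filters every menu."""
    options = list(ALL[field])
    for f, value in selections.items():
        if value is None:
            continue
        if f == field:
            # The field's own selection also filters its menu, which is
            # what collapses it to a single entry.
            options = [o for o in options if o == value]
        elif field == "protocol":
            options = [o for o in options if (o, value) in COMPATIBLE]
        else:
            options = [o for o in options if (value, o) in COMPATIBLE]
    return options

# Once both are chosen, each menu collapses to a single entry: neither can
# be changed without first resetting one of them to '----'.
sel = {"protocol": "protocol-A", "dataset": "dataset-1"}
assert menu_options("protocol", sel) == ["protocol-A"]
assert menu_options("dataset", sel) == ["dataset-1"]
```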

@jonc125 (Contributor) commented Oct 5, 2020

I think possibly a combination of measures? I like the idea of a 'clear selections' button that resets everything as you suggest.

As for the restrictions, I think making a selection should restrict all the other drop-downs, but not the one you just selected.

So e.g. selecting a protocol would leave the other options in the menu, until a dataset was then selected, since that would restrict the protocol field. Similarly if they started with choosing a dataset, the protocol menu would be restricted but not the dataset menu (until a protocol was chosen).

Does that make sense?

In normal use you probably wouldn't see it happening. It only gets odd really in the case where you accidentally click on the wrong entry, so you click the menu to change it and discover your choices have disappeared!
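A rough sketch of that rule, under the same kind of illustrative assumptions (hypothetical field names and compatibility pairs, not Web Lab's code): each menu is filtered by the *other* fields' selections, never by its own, so the menu you just changed keeps its full list of still-compatible options.

```python
# Illustrative compatibility table: which (protocol, dataset) pairs work together.
COMPATIBLE = {
    ("protocol-A", "dataset-1"),
    ("protocol-A", "dataset-2"),
    ("protocol-B", "dataset-3"),
}
ALL = {
    "protocol": ["protocol-A", "protocol-B"],
    "dataset": ["dataset-1", "dataset-2", "dataset-3"],
}

def menu_options(field, selections):
    """Proposed behaviour: a field's own selection never filters its menu."""
    options = list(ALL[field])
    for other, value in selections.items():
        if other == field or value is None:
            continue  # skip the menu's own selection and empty fields
        if field == "protocol":
            options = [o for o in options if (o, value) in COMPATIBLE]
        else:
            options = [o for o in options if (value, o) in COMPATIBLE]
    return options

# Choosing a protocol restricts the dataset menu but not the protocol menu...
sel = {"protocol": "protocol-A", "dataset": None}
assert menu_options("protocol", sel) == ["protocol-A", "protocol-B"]
assert menu_options("dataset", sel) == ["dataset-1", "dataset-2"]
# ...until a dataset is chosen, which then restricts the protocol menu.
sel = {"protocol": "protocol-A", "dataset": "dataset-1"}
assert menu_options("protocol", sel) == ["protocol-A"]
```

This matches the accidental-click case above: after selecting only a protocol, re-opening the protocol menu still shows every option.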

@helenst helenst force-pushed the 277-run-fitting-experiment branch from 51805aa to 391a14b Compare October 5, 2020 13:55
@helenst (Contributor, Author) commented Oct 5, 2020

@jonc125 I'm done adding to this for now so feel free to re-review :)

@jonc125 (Contributor) left a comment

Looks good!

Comment thread weblab/fitting/tests/test_views.py Outdated

```python
@pytest.fixture
def fits_user(logged_in_user):
```
@jonc125 (Contributor):

I think this is in conftest.py now?

@jonc125 jonc125 merged commit 5e90b6e into master Oct 6, 2020
@jonc125 jonc125 deleted the 277-run-fitting-experiment branch October 6, 2020 18:55
MauriceHendrix pushed a commit that referenced this pull request Jul 22, 2021

Labels: none yet

Projects: none yet

Development: successfully merging this pull request may close the issue "UI to run a new fitting experiment"

2 participants