Conversation

@brandon-wada
Collaborator

Adding ask_confident and ask_fast as two simplified versions of submit_image_query.
In the process, I separate patience_time from wait time and add a confidence_threshold parameter to submit_image_query.
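
For orientation, here is a minimal usage sketch of what this adds, assuming the signatures shown in the diff excerpts below (ask_fast was later renamed ask_ml during review; the detector name and image path are made up for illustration):

from groundlight import Groundlight

gl = Groundlight()
detector = gl.get_or_create_detector(
    name="door-open", query="Is the door open?", confidence_threshold=0.9
)

# Block until the answer meets the detector's confidence threshold, or until
# `wait` seconds have passed.
iq = gl.ask_confident(detector, image="door.jpg", wait=30.0)

# Return the first available answer (in practice, the raw ML prediction).
iq = gl.ask_fast(detector, image="door.jpg")

# submit_image_query now takes the two time knobs separately: `wait` bounds how
# long this call blocks, while `patience_time` tells the service how long to keep
# working toward a confident answer. `confidence_threshold` overrides the
# detector's threshold for this one query.
iq = gl.submit_image_query(
    detector,
    image="door.jpg",
    wait=10.0,
    patience_time=120.0,
    confidence_threshold=0.75,
)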

assert False, "The detector performance has not improved after two minutes"


def test_ask_method_quality(gl: Groundlight):
Collaborator Author

This is a bit of a funny test. ask_confident and ask_fast can only do as well as our backend allows them to. I make some weak guarantees here that reflect what we want from each method.
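
For example, the kind of weak guarantee being described might look roughly like this (the detector, image path, and timing margins are invented for illustration; the real test differs):

import time
from groundlight import Groundlight

def test_ask_method_quality(gl: Groundlight):
    detector = gl.get_or_create_detector(name="dog-detector", query="Is there a dog?")
    image = "test/assets/dog.jpeg"  # hypothetical fixture path

    # ask_confident: either the answer clears the detector's threshold (or is a
    # human label, which carries no confidence), or we only gave up because the
    # wait ran out.
    start = time.time()
    iq = gl.ask_confident(detector, image, wait=30.0)
    confident = iq.result.confidence is None or iq.result.confidence >= detector.confidence_threshold
    assert confident or time.time() - start >= 30.0

    # ask_fast: we get *some* prediction back quickly, with no promise about
    # how confident it is.
    start = time.time()
    iq = gl.ask_fast(detector, image)
    assert iq.result is not None
    assert time.time() - start < 15.0  # generous bound for network latency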

Member

@robotrapta left a comment

A couple of things I think should work differently.

wait=wait,
)

def ask_fast(
Member

I think I still prefer ask_ml here.

Collaborator Author

The logic fundamentally looks for the first answer we get back, which is why I like ask_fast. 99.9% of the time it is de facto asking ML because the ML returns faster than anything else. I only worry that this could set up some confusion down the road if we have to explain to people why the first result is (almost) the same as the ML result.

I won't dig in too much on this, though; it's updated to ask_ml in the latest version now.

self,
detector: Union[Detector, str],
image: Union[str, bytes, Image.Image, BytesIO, BufferedReader, np.ndarray],
wait: Optional[float] = None,
Member

Semantically, what is "wait" supposed to do here? We're asking it for the fastest possible answer, so what is this doing?

Collaborator Author

Only for the edge case you mention in the other comment. Would changing "wait" to "max_wait" everywhere be clearer?

detector,
image,
wait=wait,
confidence_threshold=0.5,
Member

I don't think this is right, because this updates the permanent threshold for this IQ, which will cause it to never get escalated. I think the way to do this is to set wait=0, which causes submit_image_query to never go into polling mode.

The exception would be if the initial response doesn't have a real prediction yet (0.5) because the ML took too long; in that case, this method should poll until we get something back. I suppose we could have a wait timeout there.
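
For illustration, the suggested flow could look roughly like the standalone sketch below. It leans on the 0.5-placeholder convention discussed further down and uses only public SDK calls, so it is a sketch of the idea, not the PR's actual implementation:

import time
from groundlight import Groundlight

def ask_ml_sketch(gl: Groundlight, detector, image, wait: float = 30.0):
    # wait=0 keeps submit_image_query out of polling mode, so this returns
    # whatever the service has right now (normally the raw ML prediction).
    iq = gl.submit_image_query(detector, image, wait=0)
    deadline = time.time() + wait
    # Edge case: the first response is still the 0.5 placeholder because the ML
    # hasn't answered yet. Poll until a real prediction shows up, bounded by `wait`.
    while (iq.result is None or iq.result.confidence == 0.5) and time.time() < deadline:
        time.sleep(0.5)
        iq = gl.get_image_query(iq.id)
    return iq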

@mjvogelsong mentioned this pull request Oct 9, 2023
@brandon-wada marked this pull request as ready for review October 10, 2023 21:27
Member

@robotrapta left a comment

This looks good! My biggest requests are to leave wait_for_confident unhidden (changing it will break some internal apps, so we'd definitely need to bump to 0.12 for this), and, if it's not too hard, to merge wait_for_ml into it to reduce duplicated polling logic.

Any binary format must be JPEG-encoded already. Any pixel format will get
converted to JPEG at high quality before sending to service.
:param wait: How long to wait (in seconds) for a confident answer.
:param patience_time: How long Groundlight should work to generate a confident answer, even working beyond the specified wait time
Member

Should mention that this is optional, and that the detector’s default will be used if this is not specified.

converted to JPEG at high quality before sending to service.
:param wait: How long to wait (in seconds) for a confident answer.
:param patience_time: How long Groundlight should work to generate a confident answer, even working beyond the specified wait time
:param confidence_threshold: If set, override the detector's confidence threshold for this query.
Member

I like this wording - let’s use this for “patience_time” too. “If set, overrides…”
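
Taken together with the comment above about patience_time being optional, the revised param lines might read something like this (illustrative wording only, not the merged text):

:param wait: How long to wait (in seconds) for a confident answer.
:param patience_time: If set, overrides the detector's patience_time for this query:
    how long Groundlight should keep working toward a confident answer, even beyond
    the wait time. Optional; the detector's default is used if not specified.
:param confidence_threshold: If set, overrides the detector's confidence threshold for this query.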

return self._fixup_image_query(image_query)

def wait_for_confident_result(
def _wait_for_confident_result(
Member

Should we hide this method? I think it's quite reasonable that customers will want to use this directly. We have a bunch of direct uses ourselves, but a lot of that was because the ask_ methods weren't available. Think of, e.g., the ask_async example with two threads: one thread just submits with ask_async, and the other one uses wait_for_confident to analyze the results.
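
A rough sketch of that two-thread pattern, for concreteness; the detector id, image paths, and the exact wait_for_confident_result signature (mirroring _wait_for_ml_result below) are assumptions:

import threading
import queue
from groundlight import Groundlight

gl = Groundlight()
detector = gl.get_detector("det_xyz")  # hypothetical detector id
pending = queue.Queue()

def submitter(image_paths):
    for path in image_paths:
        iq = gl.ask_async(detector, path)  # returns immediately, no result yet
        pending.put(iq.id)
    pending.put(None)  # sentinel: nothing more to submit

def analyzer():
    while (iq_id := pending.get()) is not None:
        iq = gl.wait_for_confident_result(iq_id, timeout_sec=60.0)
        print(iq_id, iq.result.label, iq.result.confidence)

threading.Thread(target=submitter, args=(["frame1.jpg", "frame2.jpg"],)).start()
threading.Thread(target=analyzer).start()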

Collaborator Author

It's more of a gut instinct that made me want to hide it originally. I suspect that for any use case someone has for it, we'll want to build syntactic sugar around it, like we're doing for the ask methods. The ask_async method may be the exception, so I'll leave this unhidden for now.

image_query = self._fixup_image_query(image_query)
return image_query

def _wait_for_ml_result(self, image_query: Union[ImageQuery, str], timeout_sec: float = 30.0) -> ImageQuery:
Member

Can we merge this with “wait_for_confident”? I imagine there’s a lot of duplicated logic in terms of retry and backoff.

Collaborator Author

I merged them in an earlier draft but opted for duplication in favor of better logging. I'm going back to the previous version with them merged, but I would love any tips on how to keep the same level of specificity in the logging when the combined logic is indifferent to being called from wait_for_ml or from wait_for_confident.
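
One way to keep the log messages specific after merging, sketched here with invented names: a shared retry/backoff loop that takes a predicate plus a short description of what it is waiting for. This is only an illustration of the idea, not the PR's code.

import logging
import time

logger = logging.getLogger(__name__)

def _wait_for_result(gl, image_query, done, waiting_for, timeout_sec=30.0):
    """Shared polling loop; `waiting_for` keeps the log lines specific to the caller."""
    start = time.time()
    delay = 0.25
    iq = image_query
    while not done(iq):
        if time.time() - start > timeout_sec:
            logger.debug(f"Timed out after {timeout_sec}s waiting for {waiting_for} on {iq.id}")
            break
        logger.debug(f"Polling {iq.id}: still waiting for {waiting_for}")
        time.sleep(delay)
        delay = min(delay * 2, 5.0)  # capped exponential backoff
        iq = gl.get_image_query(iq.id)
    return iq

# wait_for_confident_result -> _wait_for_result(gl, iq, is_confident, "a confident answer")
# wait_for_ml_result        -> _wait_for_result(gl, iq, iq_is_answered, "the first ML result")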

"""Returns True if the image query has a ML or human label.
Placeholder and special labels (out of domain) have confidences exactly 0.5
"""
if iq.result.confidence is None:
Member

I really hate this. Next time you’re in the office, let’s grab a whiteboard with Sunil & Michael and figure out a way to fix this.

@brandon-wada changed the base branch from main to version_0.12.0 October 12, 2023 22:23
@robotrapta changed the title from "Add ask_confident and ask_fast" to "Add ask_confident and ask_ml" Oct 13, 2023
@brandon-wada merged commit 9eecd10 into version_0.12.0 Oct 17, 2023
brandon-wada added a commit that referenced this pull request Oct 17, 2023
* bump version

* Update UNSURE to be UNCLEAR, catch __USURE received from the service (#105)

* ask_async (#102)

* initial commit

* added ask_async to submit_image_query

* Automatically reformatting code

* added ask async method

* Automatically reformatting code

* added integration tests (requires BE merge first)

* Automatically reformatting code

* satisfying mypy

* Automatically reformatting code

* fix comments

* change what type of exception test is catching

* Automatically reformatting code

* fix imports organization issue

* fix implementation - wait must be 0 not None

* Automatically reformatting code

* forgot to make wait=0 in relevant test

* feedback from PR review

* Automatically reformatting code

* ensure want_async is a serializable bool

* add description

* updated sphinx reqs to render some of the dependencies

* updated docstring for ask_async and fixed small sphinx bugs in other folk's docstrings

* Tests aren't passing because I didn't update the autogenerated code to expect a new param

* Revert "Tests aren't passing because I didn't update the autogenerated code to expect a new param"

This reverts commit 2477fd5.

* fix generated

* Automatically reformatting code

* fix lint

* Automatically reformatting code

* Revert "Automatically reformatting code"

This reverts commit cb9359e.

* Revert "fix generated"

This reverts commit 935c036.

* Revert "Revert "Tests aren't passing because I didn't update the autogenerated code to expect a new param""

This reverts commit 07670e3.

* Revert "Tests aren't passing because I didn't update the autogenerated code to expect a new param"

This reverts commit 2477fd5.

* Revert "updated docstring for ask_async and fixed small sphinx bugs in other folk's docstrings"

This reverts commit 67e3edd.

* third time at generated docs is the charm

---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>

* Cli polishing (#109)

* Add basic catch if api token isn't specified when cli is called

* Pushes Groundlight class instantiation up until the function is actually called with arguments.
This means that the entire help args are available even if we can't instantiate the class (no api key)

* Fixed misunderstanding with metaprogramming, added tests

* Addressing comments

---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>

* Add ask_confident and ask_ml (#99)

* Adding ask_confident and ask_fast

* Fixing ask_ml behavior

* Unhide wait functions, merging logic, fixed iq_is_answered logic

* Rewriting doc strings in Sphinx style

* fixed sphinx docstring return types

* Making iq submission with inspection work with newly optional patience time


---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>

* Linting

* fix ask_async docstring

---------

Co-authored-by: brandon <brandon@groundlight.ai>
Co-authored-by: Brandon <132288221+brandon-groundlight@users.noreply.github.com>
Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>
sunildkumar added a commit that referenced this pull request Oct 18, 2023
* bump version

* Update UNSURE to be UNCLEAR, catch __USURE received from the service (#105)

* rough draft of demo

* ask_async (#102)

* initial commit

* added ask_async to submit_image_query

* Automatically reformatting code

* added ask async method

* Automatically reformatting code

* added integration tests (requires BE merge first)

* Automatically reformatting code

* satisfying mypy

* Automatically reformatting code

* fix comments

* change what type of exception test is catching

* Automatically reformatting code

* fix imports organization issue

* fix implementation - wait must be 0 not None

* Automatically reformatting code

* forgot to make wait=0 in relevant test

* feedback from PR review

* Automatically reformatting code

* ensure want_async is a serializable bool

* add description

* updated sphinx reqs to render some of the dependencies

* updated docstring for ask_async and fixed small sphinx bugs in other folk's docstrings

* Tests aren't passing because I didn't update the autogenerated code to expect a new param

* Revert "Tests aren't passing because I didn't update the autogenerated code to expect a new param"

This reverts commit 2477fd5.

* fix generated

* Automatically reformatting code

* fix lint

* Automatically reformatting code

* Revert "Automatically reformatting code"

This reverts commit cb9359e.

* Revert "fix generated"

This reverts commit 935c036.

* Revert "Revert "Tests aren't passing because I didn't update the autogenerated code to expect a new param""

This reverts commit 07670e3.

* Revert "Tests aren't passing because I didn't update the autogenerated code to expect a new param"

This reverts commit 2477fd5.

* Revert "updated docstring for ask_async and fixed small sphinx bugs in other folk's docstrings"

This reverts commit 67e3edd.

* third time at generated docs is the charm

* Automatically reformatting code

* finish making tests work

* Automatically reformatting code

---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>

* Cli polishing (#109)

* Add basic catch if api token isn't specified when cli is called

* Pushes Groundlight class instantiation up until the function is actually called with arguments.
This means that the entire help args are available even if we can't instantiate the class (no api key)

* Fixed misunderstanding with metaprogramming, added tests

* Addressing comments

---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>

* Add ask_confident and ask_ml (#99)

* Adding ask_confident and ask_fast

* Automatically reformatting code

* Fixing ask_ml behavior

* Adding to test

* Automatically reformatting code

* set default wait for ask_ml

* Unhide wait functions, merging logic, fixed iq_is_answered logic

* Automatically reformatting code

* Rewriting doc strings in Sphinx style

* ask_fast to ask_ml in the tests

* fixed sphinx docstring return types

* Cleaning the lint trap

* Last bits of lint

* Making iq submission with inspection work with newly optional patience time

* single char typo

* Reorder functions to trick Git's LCS alg to be correct

* Automatically reformatting code

---------

Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>

* Linting

* addressed self comments

* fix ask_async docstring

* small fixes

* cleaning up a bit

* Update docs/docs/building-applications/5-async-queries.md

Co-authored-by: blaise-muhirwa <135643310+blaise-muhirwa@users.noreply.github.com>

* Update docs/docs/building-applications/5-async-queries.md

Co-authored-by: robotrapta <79607467+robotrapta@users.noreply.github.com>

* small fixes based on PR feedback from Leo and Blaise

---------

Co-authored-by: brandon <brandon@groundlight.ai>
Co-authored-by: Brandon <132288221+brandon-groundlight@users.noreply.github.com>
Co-authored-by: Auto-format Bot <autoformatbot@groundlight.ai>
Co-authored-by: blaise-muhirwa <135643310+blaise-muhirwa@users.noreply.github.com>
Co-authored-by: robotrapta <79607467+robotrapta@users.noreply.github.com>