avoid many failures for ImageGPT #34071
Conversation
Thanks, cc @gante and @zucchini-nlp
@unittest.skip(
    reason="After #33533, this still passes, but many subsequent tests fail with `device-side assert triggered`"
)
Just a thought, but could it be because we need to `self.assertRaises(ValueError)` and not `RuntimeError`? Similar tests for video/image inputs are not causing massive failures.
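A hedged sketch of what the suggested assertion could look like. The test name, error message, and test body below are illustrative, not taken from the repository:

```python
import unittest


class MismatchedInputsTest(unittest.TestCase):
    # Hypothetical example: assert that a specific exception type is raised.
    # The idea in the discussion is that input validation done eagerly in
    # Python raises a ValueError, whereas the same mistake reaching a CUDA
    # kernel surfaces later as a RuntimeError (device-side assert).
    def test_mismatched_images_raise_value_error(self):
        with self.assertRaises(ValueError):
            # Stand-in for a model call with images removed from `inputs`.
            raise ValueError("number of images does not match number of image tokens")


if __name__ == "__main__":
    unittest.main()
```

The context-manager form of `assertRaises` fails the test if no exception, or an exception of a different type, is raised inside the `with` block.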
But this test itself is still passing, so the code raises the (expected) `RuntimeError`. Not sure if we can adjust the test to use `ValueError`. I can give it a try without any asserts to see whether other tests are affected.
I think the desired error (here `RuntimeError`) from such input (removing some images from `inputs`) leaves CUDA in a bad state (I tried it without the assert). Not sure what the best approach would be if we want such a test case. But until we have an idea, just skip it 😅
OK, thanks! Will keep it noted for such tests in the future.
@zucchini-nlp Running that test in a subprocess could avoid the issue, but the necessary change is not super trivial.
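A minimal sketch of the subprocess idea, not the actual change from the PR: execute the failing snippet in a fresh interpreter so that a fatal error, such as a CUDA device-side assert, cannot leave the parent process's context corrupted. The helper name `run_in_subprocess` is illustrative:

```python
import subprocess
import sys


def run_in_subprocess(code: str) -> bool:
    """Execute `code` in a child Python interpreter; return True if it exits cleanly.

    Any crash (including one that would poison a CUDA context) is confined to
    the child process, so subsequent tests in the parent are unaffected.
    """
    completed = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True,  # keep the child's traceback out of our own output
        text=True,
    )
    return completed.returncode == 0


# The crash is confined to the child; the parent keeps running.
print(run_in_subprocess("raise RuntimeError('device-side assert triggered')"))  # False
```

In practice a test runner would spawn the specific test (e.g. via a pytest node id) rather than an inline snippet, which is part of why the change is not trivial.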
* skip
* [run-slow] imagegpt
* skip
* [run-slow] imagegpt
* [run-slow] imagegpt,video_llava
* skip
* [run-slow] imagegpt,video_llava

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
What does this PR do?
See the comment in the change.