Competing Evidence: Fix issues from client QA #91
Conversation
Force-pushed from 30961f8 to db141b8.
Also includes a drive-by fix that makes integration tests for plot blocks more robust.
Force-pushed from db141b8 to e6276a4.
A small light grey copyright notice is displayed below the buttons of mentoring and step blocks. It is hidden in the Apros theme.
The scrollIntoView function could be called multiple times, which resulted in multiple (10+) scroll animations getting queued. After the first animation scrolled to the top of the step, the others kept running one by one. The extra animations weren't visible because the element was already at its final position, but if you tried to scroll down manually while they were still running, the window would be scrolled right back up.
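A minimal sketch of one way to avoid queuing duplicate animations, assuming a jQuery-based frontend; the function and variable names here are illustrative, not taken from the PR:

```js
// Hypothetical guard so only one scroll animation can be in flight at a time.
// `scrollToStep` and `scrollPending` are illustrative names.
var scrollPending = false;

function scrollToStep($step) {
    if (scrollPending) {
        return;  // an animation is already running/queued; skip this call
    }
    scrollPending = true;
    $('html, body').animate(
        { scrollTop: $step.offset().top },
        500,
        function() { scrollPending = false; }  // re-allow scrolling once done
    );
}
```

Alternatively, calling `.stop(true)` before `.animate()` clears any previously queued animations on the element, which also prevents the "snap back" effect when the user scrolls manually.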
The videos have to be resized after each step is displayed: they are sized to fit their container at load time, but since the container isn't visible at load time, they end up with the wrong dimensions. A video resize is now triggered after each step is displayed.
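A hedged sketch of such a resize trigger, again assuming jQuery; whether the embedded players actually listen for the window `resize` event is an assumption here, and `showStep` is an illustrative name:

```js
// Hypothetical helper: after revealing a step, fire a window resize event
// so embedded video players recompute their dimensions against the
// now-visible container.
function showStep($step) {
    $step.show();
    $(window).trigger('resize');  // prompt players to re-measure
}
```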
Force-pushed from d2bf95e to 836a3be.
@mtyaka It would be nice to have testing instructions and a course export that showcases the fixes and new features.
@e-kolpakov I updated the PR description with instructions on how to test.
@e-kolpakov Hmmm... it should look like this: [screenshot]. It should always show up as long as you don't have the Apros theme installed.
@mtyaka Looks like I just hadn't noticed it - I'll do another run. Update: Right, I had the Apros theme configured.
Force-pushed from d843781 to 2351a0a.
Force-pushed from 2351a0a to 6157e30.
@mtyaka 👍 conditional tests pass
@mtyaka It looks like the last commit on this has a build failure?
@antoviaque Yes, but the integration tests are quite flaky. Travis runs two builds per PR - a "merge" build and a "PR branch" build. In this case one of the builds was successful while the other was not, even though they were both running the same code (the PR branch was rebased against master). The tests also pass locally for me. We might want to investigate why the tests are so flaky; I got random Travis build failures about half of the time when trying to get to green (without changing any code).
@mtyaka In these cases there are different things that can be done. Happy to discuss what the approach/steps should be, but it would be good if we had a solution that didn't involve merging broken builds. Otherwise it will progressively undermine our confidence in the build, and it will start to rot -- it won't be long before people start merging builds with genuine failures ("it always fails anyway...").
@antoviaque You're right, the last failure was in the new test; I didn't notice that. It's not the only flaky test, though - I've had random failures in multiple tests, and one in particular I've (randomly) seen fail multiple times. I think it would be nice if we added a task to the backlog to make the tests more robust. It would also be great if we could figure out what all of the random 'broken pipe' errors mean and get rid of them.
@mtyaka Yup, if you add a ticket to the backlog, I'll try to put it toward the top. But it will be hard to fix once and for all imho, so I'd also like to make sure, at the same time, that we know how to handle these cases in the future. Maybe something to bring up during the meeting today?
@antoviaque @mtyaka I think we already have a ticket for this: OC-807. For the "Broken pipe" thing, this might help.
Thanks for finding that ticket @itsjeyd. @antoviaque Yeah, we can discuss it on the call; I agree we need to take this more seriously. I think all three suggestions make sense, depending on how much time the author and fireman have. Most often it isn't obvious what causes flakiness, so it's hard to estimate how long fixing it will take, especially considering that the integration tests take ~30 minutes to run. If at all possible, we should try to reduce that. I don't really have any ideas other than trying to replace the majority of the integration tests with JS tests.
@itsjeyd Thanks! My bad actually - I hadn't realized we had this ticket, forgotten in the backlog; pulling it back up. @mtyaka Yup, sounds good. And I agree that some of the integration tests could be JS tests - even without converting the existing ones, it's something to consider for future tasks: prefer implementing tests as JS tests, and keep integration tests for when they are really useful.

To test, make sure you have the problem-builder and imagemodal (https://github.com/Stanford-Online/xblock-image-modal) XBlocks installed. Import the latest export (2015.6PFnhS.tar.gz) from OC-1020 into a fresh course.