Conversation

@itsjeyd (Member) commented Nov 11, 2015

To test, make sure you have the problem-builder and imagemodal (https://github.com/Stanford-Online/xblock-image-modal) XBlocks installed. Import the latest export (2015.6PFnhS.tar.gz) from OC-1020 into a fresh course.

@itsjeyd force-pushed the competing-evidence-qa branch 3 times, most recently from 30961f8 to db141b8 on November 11, 2015 at 15:40.
Also includes a drive-by fix that makes integration tests for plot
blocks more robust.
@itsjeyd force-pushed the competing-evidence-qa branch from db141b8 to e6276a4 on November 11, 2015 at 17:16.
A small light grey copyright notice is displayed below the buttons
of mentoring and step blocks.

It is hidden in the apros theme.
The scrollIntoView function could be called multiple times,
which resulted in multiple (10+) scroll animations getting queued.

After the first animation successfully scrolled to the top of the step,
the others kept running one by one. The animations weren't visible
because the element was already scrolled to its final position, but if
you tried manually scrolling down while the animations were still running,
the window would be scrolled right back up.
The videos have to be resized after each step is displayed,
because they are sized to fit into their container at load time,
but since the container isn't visible at load time, the videos
end up with wrong dimensions.

Trigger video resize after displaying each step.
@mtyaka force-pushed the competing-evidence-qa branch from d2bf95e to 836a3be on November 12, 2015 at 10:10.
@e-kolpakov (Contributor) commented:

@mtyaka It would be nice to have testing instructions and a course export that showcases the fixes and new features.

@mtyaka (Member) commented Nov 12, 2015

@e-kolpakov I updated the PR description with instructions on how to test.

@e-kolpakov (Contributor) commented:

@mtyaka @itsjeyd I'm not sure what I'm doing wrong (or whether I'm doing anything wrong), but I can't see that copyright notice anywhere in the LMS. I don't have an LMS theme applied, though. So, under what conditions should that copyright notice be shown, and where?

@mtyaka (Member) commented Nov 12, 2015

@e-kolpakov Hmmm... it should look like this:

[screenshot: copyright notice below the block buttons, 2015-11-12 14:07:37]

It should always show up as long as you don't have the apros theme installed.

@e-kolpakov (Contributor) commented:

@mtyaka Looks like I just hadn't noticed it - I'll do another run.

UPD: Right, I had the Apros theme configured.

@e-kolpakov (Contributor) commented:

@mtyaka @itsjeyd Verified, everything works as advertised. However, there are some possible improvements, so please address my other notes (i.e. change the implementation or provide a rationale for the chosen approach).

@mtyaka force-pushed the competing-evidence-qa branch 3 times, most recently from d843781 to 2351a0a on November 13, 2015 at 09:12.
@mtyaka force-pushed the competing-evidence-qa branch from 2351a0a to 6157e30 on November 13, 2015 at 09:55.
@e-kolpakov (Contributor) commented:

@mtyaka 👍 conditional tests pass

@mtyaka added a commit referencing this pull request on Nov 13, 2015: "Competing Evidence: Fix issues from client QA"
@mtyaka merged commit cc0c234 into master on Nov 13, 2015.
@antoviaque (Member) commented:

@mtyaka It looks like the last commit on this has a build failure?

@mtyaka (Member) commented Nov 16, 2015

@antoviaque Yes, but the integration tests are quite flaky. Travis runs two builds per PR: a "merge" build and a "PR branch" build. In this case one of the builds succeeded while the other failed, even though both were running the same code (the PR branch was rebased against master). The tests also pass locally for me. We might want to investigate why the tests are so flaky; I got random Travis build failures about half the time while trying to get to green, without changing any code.

@antoviaque (Member) commented:

@mtyaka In these cases there are a few different things that can be done:

  • Fix the flaky test. Here, since the test I saw failing (test_scroll_into_view) was, I think, introduced by the current PR, that would have made sense imho.
  • Get the sprint fireman and/or the person who wrote the test to have a look at it (@itsjeyd, would you have time to help here?)
  • Disable the flaky test and add a bug about it to the backlog. Not ideal, but if all else fails, this is still better than a broken build.

Happy to discuss what the approach/steps should be in these cases, but it would be good if we had a solution that didn't involve merging broken builds. Otherwise, it will progressively undermine our confidence in the build, and it will start to rot -- it won't be long before people start merging builds with genuine failures ("it always fails anyway...")

@mtyaka (Member) commented Nov 16, 2015

@antoviaque You're right, the last failure was in the new test; I hadn't noticed that. It's not the only flaky test, though - I've had random failures in multiple tests. One that I've (randomly) seen multiple times was "MultipleObjectsReturned: get() returned more than one XBlockState" - no idea what that's about. There was also one about some buttons not showing up, completely unrelated to the changes in this PR.

I think it would be nice to add a task to the backlog to make the tests more robust. It would also be great if we could figure out what all the random "broken pipe" errors mean and get rid of them.

@antoviaque (Member) commented:

@mtyaka Yup, if you add a ticket to the backlog, I'll try to put it toward the top. But it will be hard to fix once and for all imho, so I'd also like to make sure, at the same time, that we know how to handle these cases in the future. Maybe something to bring up during the meeting today?

@itsjeyd (Member, Author) commented Nov 16, 2015

@antoviaque @mtyaka I think we already have a ticket for this: OC-807. For the "Broken pipe" thing, this might help.

@mtyaka (Member) commented Nov 16, 2015

Thanks for finding that ticket, @itsjeyd!

@antoviaque Yeah, we can discuss it on the call. I agree we need to take this more seriously. I think all three suggestions make sense, depending on how much time the author and fireman have. It often isn't obvious what causes the flakiness, so it's hard to estimate how long fixing it will take, especially since the integration tests take ~30 minutes to run. If at all possible, we should try to reduce that. I don't really have any ideas other than trying to replace the majority of the integration tests with JS tests.

@antoviaque (Member) commented:

@itsjeyd Thanks! My bad actually, I hadn't realized we had this ticket, forgotten in the backlog; pulling it back up.

@mtyaka And yup, sounds good. I agree that some of the integration tests could be JS tests. Even without converting existing ones, it's something to consider in future tasks: prefer implementing new tests as JS tests, and keep integration tests for the cases where they are really useful.

@xitij2000 deleted the competing-evidence-qa branch on December 13, 2019.