Resyncing with public #82

Merged: EliAndrewC merged 43 commits into master from public-resync-2017-05-09, May 9, 2017.

Conversation

@EliAndrewC (Contributor):

I'm pushing back what we've done on Sideboard at my day job over the past few months, which mostly has to do with websockets and our services API.

It looks like this won't merge cleanly, so I'll merge into this branch and resolve any conflicts and use this opportunity to pull back down the things Rob has been working on.

I'll go through and leave inline comments explaining all of the changes.

EliAndrewC and others added 30 commits January 29, 2017 20:51
Fixes #2373: Prevents inclusion of hybrid_property fields in to_dict_default_attrs
We are once again tracking code coverage! Yay!
Updates coveralls.io badge
added SIDEBOARD_CONFIG_ROOT environment variable

See merge request !48
Store HTTP header fields in threadlocal

See merge request !49
Fixes #69: Allows port to be configurable via test.ini
Updating unit tests for the new header fields

See merge request !50
refactored notify delay to take a boolean instead of a time

See merge request !47
Pull down changes from the public Sideboard repo

See merge request !51
forwarded subscriptions can now have their methods updated

See merge request !53
EliAndrewC added 3 commits May 3, 2017 13:26
passthru subscriptions now get cleaned up properly

See merge request !57
Comment thread docs/source/index.rst


- .. function:: notify(channels)
+ .. function:: notify(channels, delay=False)
EliAndrewC (author):

Here's what's going on with the new delay stuff.

People already familiar with Sideboard's pub/sub utilities will probably recognize the @subscribes and @notifies decorators. This here is the notify() function which allows us to manually broadcast to a channel, which we sometimes use.

Consider the following problem: we're in the middle of a database transaction and want to trigger a notify after the transaction is complete. We don't want to just call notify(), because the background thread that handles broadcasts might run before we've committed our changes to the database.

When we originally considered this use case, we "solved" it by adding an optional delay argument to the notify method. For example, plugins could say notify('my_channel', delay=1) to cause the notification to be triggered after 1 second, with the assumption that they'd pick an amount of time long enough to let the transaction finish.

This is kind of a terrible solution for several reasons:

  • What if you pick an amount of time that's not long enough? Maybe your transaction takes longer than you thought to complete.
  • If you do pick a larger amount of time to solve that problem, it makes your notifications less responsive.
  • In order to support this feature, our sideboard.lib.Caller class spins up an extra thread for every Caller instance, even though most callers will never need this feature. (Admittedly, having a few additional threads sitting around doing nothing isn't much of a drain on system resources, but even still it adds a bunch of threads to the running system which makes watching top more annoying.)

I checked every Sideboard plugin we've ever written and we've literally never used this feature anyway! However, we have recently started wanting to be able to do this ("notify at the end of the current RPC request"), so I figured this was a good time to bite the bullet and make this minor refactor.

Now we can say notify('my_channel', delay=True) which will cause the notify to occur after the current RPC method has completed. This is technically a backwards-incompatible change, but since literally no one has ever used this feature that doesn't matter. Also, this has been implemented in such a way that even if some third party was setting delay to an integer like in the old approach, their code would still work as they expected!

Comment thread requirements.txt
- pytest>=3.0.1
- mock>=1.0.1,<1.1
- Sphinx>=1.2.1
- coverage>=3.6
EliAndrewC (author), May 9, 2017:

This has been moved into a separate test_requirements.txt, which is a better practice. In particular, the reason for this change is to avoid pulling unnecessary test dependencies into production installs of Sideboard.

This might require a small tweak to the ubersystem-deploy repo in order to ensure that these dependencies are still installed. In other words, this is a backwards-incompatible change, in that running python setup.py develop will now install fewer dependencies, and if you want the test dependencies then you'll need to say something like pip install -r test_requirements.txt (which is what we've started doing in places where we need this).

EliAndrewC (author):

At a glance, it looks like this has caused Travis to start failing, since we run our unit tests with the command

coverage run --source sideboard -m py.test sideboard

and we're no longer installing coverage by default. I'm sure this is just a matter of adding another line to the .travis.yml so I'll try to get that working.

@RobRuana (Contributor), May 9, 2017:

For reference, we are doing this exact thing in ubersystem and all of the other plugins. The tests for each plugin ensure that necessary packages are installed, and do not rely on sideboard installing testing packages, so the plugin tests should be fine.

Also, I used the requirements_*.txt convention everywhere else. So we've got:

  • requirements.txt
  • requirements_docs.txt
  • requirements_test.txt

And those files reference each other so we don't have to list the same core requirements three times. I think the way we have it set up is pretty nice, so feel free to use that as inspiration.
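For illustration, the cross-referencing Rob describes looks something like this (hypothetical file contents; pip's `-r` directive lets one requirements file include another so the core dependencies are only listed once):

```text
# requirements.txt
cherrypy
six

# requirements_test.txt
-r requirements.txt
pytest>=3.0.1
mock>=1.0.1,<1.1
```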

EliAndrewC (author):

Sweet, I'll take a look. I've always seen test_requirements.txt so I've kind of assumed that was the preferred idiom, but I personally like requirements_test.txt better.

Bizarrely, https://travis-ci.org/magfest/sideboard/jobs/230546161#L241 says it's installing CherryPy, but then we get a bunch of ImportError: No module named 'cherrypy' errors in the build. I'll take a look at what you did for the plugins and see if adapting that works any better than what I have now.

Comment thread setup.py
requires = ['CherryPy==3.2.2' if 'cherrypy' in r.lower() else r for r in requires]

if __name__ == '__main__':
setup_requires = {'setup_requires': ['distribute']} if sys.version_info[0] == 2 else {}
EliAndrewC (author):

Fact: packaging is the worst thing about Python.

setuptools was obsoleted by distribute which got merged back into setuptools, which means that setuptools both obsoletes and is obsoleted by distribute.

Those of us on Python 3 (MAGFest, and new projects at my day job) can just ignore distribute, but older Python 2.7 projects still need it, so I changed this to only rely on distribute if you're on Python 2.7.

I don't think this should cause any problems for MAGFest, but it'll be something to keep an eye on.

Contributor:

Oh man, I thought the new setuptools was backported to 2.7.

EliAndrewC (author):

It was, but if you were hypothetically supporting versions of a Red Hat based OS well over a decade old, then you might not have easy access to it, which is why we still list distribute as a dependency :(


- if any(ssl_params.values()):
+ # check if any of the non-version SSL options have been set
+ if any(val for val in ssl_params.values() if not isinstance(val, int)):
EliAndrewC (author):

Minor bugfix to the check_connections action, which MAGFest doesn't use.

  checks = {}
  for name, jservice in services._jsonrpc.items():
-     jproxy = jservice._send.im_self  # ugly kludge to get the ServerProxy object
+     jproxy = jservice._send.im_self if six.PY2 else jservice._send.__self__  # ugly kludge to get the ServerProxy object
EliAndrewC (author):

Shockingly, our "ugly kludge" didn't work on both Python 2 and 3.
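The Python 2 vs 3 difference here is just the renaming of bound-method attributes: `im_self` became `__self__` (and `im_func` became `__func__`) in Python 3. A quick self-contained illustration (the `ServerProxy` class below is a stand-in, not the real xmlrpc one):

```python
# On Python 3, a bound method exposes the instance it is bound to via
# __self__; Python 2 called the same attribute im_self.
class ServerProxy:
    def _send(self):
        pass

proxy = ServerProxy()
bound = proxy._send

# Retrieving the instance back from the bound method:
assert bound.__self__ is proxy
```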

Comment thread sideboard/jsonrpc.py
  from cherrypy.lib.jsontools import json_decode

  from sideboard.lib import log, config, serializer
+ from sideboard.websockets import trigger_delayed_notifications
EliAndrewC (author):

This is the delay=True change on notify mentioned above. Here's the link:
https://github.com/magfest/sideboard/pull/82/files#r115605657

Comment thread sideboard/lib/_threads.py
  self.q.put([args, kwargs])

- def delayed(self, delay, *args, **kwargs):
-     self.q.put([args, kwargs], delay=delay)
EliAndrewC (author), May 9, 2017:

As mentioned above, this is technically a backwards-incompatible change in that we've removed the delayed method, but this method was never actually used by anyone. If a future plugin decides it wants this, it can subclass one of the classes in this module and easily add it back.
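As a sketch of that "subclass it yourself" escape hatch: a plugin that still wanted time-based delays could schedule the enqueue with a one-shot `threading.Timer` instead of a permanently running per-Caller thread. The `Caller` class below is a guess at the relevant interface, not the real `sideboard.lib.Caller`:

```python
import queue
import threading

class Caller:
    """Minimal stand-in for the queue-draining interface assumed to exist
    on sideboard.lib.Caller; not the real class."""
    def __init__(self):
        self.q = queue.Queue()

    def defer(self, *args, **kwargs):
        self.q.put([args, kwargs])

class DelayedCaller(Caller):
    def delayed(self, delay, *args, **kwargs):
        # Re-adds the removed feature: enqueue after `delay` seconds using a
        # one-shot timer thread rather than a dedicated extra thread.
        threading.Timer(delay, self.defer, args=args, kwargs=kwargs).start()
```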

- if not sub:
+ if sub:
+     sub.method = method
+ else:
EliAndrewC (author):

This is a minor bugfix to the "passthru subscriptions" feature of Sideboard's websocket RPC functionality. Previously we could update a forwarded subscription's parameters but not its method, so if you tried, only the parameters would change, which would result in an error! (As with other bugfixes in this changeset, the unit tests have been updated to cover the fix.)

  def plugin_dirs(self):
      module_path = '/fake/sideboard/plugins/test-plugin/test_plugin'
-     root_path = os.path.join(os.getcwd(), 'plugins', 'test-plugin')
+     root_path = os.path.join(config['plugins_dir'], 'test-plugin')
EliAndrewC (author):

This is the more correct way to do this. I think that the new way will work on the MAGFest Vagrant - I know that the old way wasn't working on the Vagrant environments at my day job. However, this is something we'll want to confirm; we'll also want to confirm that it works in Travis.

Contributor:

Taking a second look, this may actually be supposed to be os.getcwd(). The directory isn't actually supposed to exist, but it's the directory that will be returned by get_module_and_root_dirs() when the tests are running (I think).

EliAndrewC (author):

Hmm, I'll need to go back and look at the git history of this change then, because my vague recollection is

  • the tests weren't passing the old way
  • config['plugins_dir'] is supposed to be the canonical way to get the location of the plugins directory
  • the tests passed when I changed it to that
  • I ran the tests after the merge and they passed both locally and on Travis

So I think this is correct? But I didn't take the time today to look at how that fixture is actually being used when making this MR, so it's possible that the tests are currently passing for the wrong reason and should be fixed so that they fail, and then further fixed so that they pass correctly.

Contributor:

It's probably fine. root_path was not supposed to be a real directory. I think I was just trying to test that test-plugin and test_plugin both work as the plugin's directory name.

Comment thread sideboard/websockets.py
  will always be 'rpc').

+ header_fields: We copy header fields from the request that initiated the
+ websocket connection.
EliAndrewC (author):

MAGFest doesn't currently care about this, but in the future we might if we start doing more with exposed RPC services over websocket.

Comment thread sideboard/websockets.py
for client, callbacks in clients.copy().items():
if client != originating_client:
for callback in callbacks:
triggered.add((websocket, client, callback))
EliAndrewC (author):

This entire change is a minor performance tweak. When a websocket is closed, two things happen:

  • the client sends a client-terminated message to the server
  • the underlying socket connection is closed

In theory those two things should happen at about the same time. In practice we've noticed that sometimes there will be a considerable period of time between those two events, especially when you're having Apache reverse-proxy your websocket connections. Therefore, we now have Sideboard call unsubscribe_all() on any websocket connection which is marked as closed, since receiving client-terminated counts as closed even if the underlying socket connection is still open. This prevents us from triggering update checks on subscriptions which can't be sent out anyway.
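A toy sketch of that cleanup logic, with invented names standing in for Sideboard's real websocket class (illustrative only, not the actual implementation):

```python
# Treat a connection as closed as soon as either the underlying socket closes
# or a client-terminated message arrives, and drop its subscriptions so
# broadcasts stop triggering update checks for it.
class FakeWebSocket:
    def __init__(self):
        self.socket_closed = False
        self.client_terminated = False
        self.subscriptions = {'some_channel': ['callback-1']}

    @property
    def is_closed(self):
        return self.socket_closed or self.client_terminated

    def unsubscribe_all(self):
        self.subscriptions.clear()

    def send(self, message):
        if self.is_closed:
            # no point queueing update checks for a dead connection
            self.unsubscribe_all()
            return False
        return True
```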

MAGFest doesn't currently care about this, though we've been talking about doing more websocket pub/sub in the future so this kind of thing will be relevant as soon as we do.

Comment thread sideboard/websockets.py
"""
if self.is_closed:
log.debug('ignoring send on an already closed websocket: {}', message)
self.unsubscribe_all()
EliAndrewC (author):

See above; this is the same performance improvement I describe in my previous comment.

Comment thread sideboard/websockets.py
  clients.pop(self, {})

+ for passthru_client in list(self.passthru_subscriptions.keys()):
+     self.teardown_passthru(passthru_client)
EliAndrewC (author):

This fixes a bug in our "passthru subscriptions" functionality; previously we would only clean up passthru subscriptions which were present in self.subscriptions instead of cleaning up all passthru subscriptions regardless.

Comment thread sideboard/websockets.py
      the cached value, ensuring that an explicit RPC call to a service
      exposed via websocket always receives a response.
      """
+     self.cached_fingerprints[client].pop(callback, None)
EliAndrewC (author):

MAGFest probably doesn't care about this, so feel free to skip over this explanation, but just in case you want to know even more than what's described in the docstring about why we made this change, here's a fuller explanation:

Suppose a client sends the following websocket RPC call to the server:

{
    "method": "ecards.get",
    "params": {
        "before": "2017-01-01"
    },
    "client": "client-1"
}

Let's further suppose that what we get back is [], i.e. an empty list.

Then on the same websocket connection, we send the following:

{
    "method": "ecards.get",
    "params": {
        "before": "2016-01-01"
    },
    "client": "client-1"
}

What would you intuitively expect to get back? Presumably you'd expect to get back [] as well. However, what Sideboard would actually return (before this change) in such a case is nothing. Not an empty list, but you'd literally not get any RPC response!

Why is this? Because Sideboard has response caching such that it doesn't send a subscription response if it's the same as the last time it sent a response on that subscription. This is useful for efficiency, since it means that if one method has an @notifies which pushes out a check on all subscriptions, then we only actually push out responses if there's new data.

However, on further reflection, we probably want to invalidate this cache anytime the user explicitly makes an API call, since if you make a method call, you probably always want a response, even if the response isn't any different than the last time you made an API call on the same subscription.

That's what this change is all about. MAGFest will start caring about this at the point where we start using Javascript to make WebSocket RPC requests.
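A rough sketch of the fingerprint-cache behavior described above; the function names and hashing scheme here are invented for illustration, not Sideboard's actual code:

```python
import hashlib
import json

# (client, callback) -> fingerprint of the last response actually sent
cached_fingerprints = {}

def fingerprint(response):
    # Fingerprint the serialized response so identical data hashes the same.
    return hashlib.md5(
        json.dumps(response, sort_keys=True).encode()).hexdigest()

def maybe_send(client, callback, response, explicit_call=False):
    """Return the response if it should be pushed out, or None if it is
    suppressed as a duplicate of the last push on this subscription."""
    key = (client, callback)
    if explicit_call:
        # An explicit RPC call always deserves a reply, even if the data is
        # unchanged, so invalidate the cached fingerprint first.
        cached_fingerprints.pop(key, None)
    fp = fingerprint(response)
    if cached_fingerprints.get(key) == fp:
        return None  # same data as last push; suppress
    cached_fingerprints[key] = fp
    return response
```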

@EliAndrewC (author):

I've just finished leaving comments explaining what changed.

One more request: please do NOT use the "Squash and merge" action because that will mess with the commit history that we already have at my day job. In general that option is a great default for when everyone is pulling from the same repo, since it lets us collapse WIP branched changes into a single commit, but in this case it will make future merges a lot more work.

@RobRuana (Contributor) left a review:

I don't think any of these changes affects MAGFest, and I see no obvious errors, so +1 from me!

- from sideboard.config import get_config_files, get_config_overrides, \
-     get_config_root, get_module_and_root_dirs, parse_config, uniquify
  from sideboard.lib import config
+ from sideboard.config import get_config_files, get_config_overrides, get_config_root, get_module_and_root_dirs, parse_config, uniquify
Contributor:

If you're ever curious about which lines of code I wrote in one of our codebases, just check which lines adhere to a maximum of 79 characters 😄

Comment thread tox.ini
- deps= -rrequirements.txt
+ deps=
+     -rrequirements.txt
+     -rtest_requirements.txt
EliAndrewC (author), May 9, 2017:

Adding this line seemed to do the trick to get Travis to pass. As of https://travis-ci.org/magfest/sideboard/builds/230557700 all builds are green!

I like how things are being handled in the ubersystem plugin (https://github.com/magfest/ubersystem/blob/master/requirements_test.txt#L1) where requirements_test.txt directly references requirements.txt but that would require some additional changes to how we write our setup.py file which would further affect our local Jenkins jobs and such, so I'd prefer to not take that on as part of this merge.

@EliAndrewC EliAndrewC merged commit e7aaa2c into master May 9, 2017
@EliAndrewC EliAndrewC deleted the public-resync-2017-05-09 branch May 9, 2017 23:09
Comment thread .travis.yml
  - sudo apt-get install -y build-essential libcap-dev
  install:
  - pip install tox
+ - pip install -r test_requirements.txt
Contributor:

I prefer doing this in tox.ini so you can run tox locally:

deps= -rrequirements_test.txt

EliAndrewC (author):

I added it there as well. I think I could probably just remove this line entirely come to think of it, since I'm also specifying it in tox.

Contributor:

Agree. Also I would request either renaming test_requirements.txt to requirements_test.txt in sideboard, or renaming requirements_test.txt to test_requirements.txt in all of our other plugins.

EliAndrewC (author):

Of the two options, I would prefer to rename test_requirements.txt to requirements_test.txt in Sideboard, because I like that naming convention better. That's out of scope for this particular PR, but I'll try to take care of it this week or at least file a ticket for it. At the same time I'll probably remove the tests_require line from setup.py since we never use python setup.py test anyway.

Comment thread setup.py
scripts=[],
setup_requires=['distribute'],
install_requires=requires,
tests_require=tests_require,
Contributor:

This is probably not needed unless you want to support running tests via:

python setup.py test

Which is totally not a thing people do anymore.

Also, tox.ini specifies skipsdist=True which speeds up builds, but also presumes that you've already installed all the necessary test requirements, which is why I also specify deps= -rrequirements_test.txt in tox.ini

EliAndrewC (author):

Hmm, supporting python setup.py test actually was the reason we put it in, in spite of the fact that we ourselves don't actually run our tests that way! In retrospect I think we were cargo culting without realizing it.

Contributor:

I guess the other thing to consider is that you put the tests inside the sideboard module, sideboard.tests. Which means you are packaging and shipping code that depends on your test requirements. In this scenario, I would recommend one of the following:

  • Move your tests out of sideboard.tests into a top level tests package
  • Do not separate your test requirements out of requirements.txt

EliAndrewC (author):

I still like the idea we discussed last time this came up, which is:

  • sideboard.tests remains a module, but it only contains test utilities which are deliberately exposed to plugins as part of its API
  • all regular unit tests for Sideboard get moved into a tests/ directory

I'll write this up into an actual ticket this week and see if I can take a stab at actually doing it. I'd prefer to have that done sooner rather than later, since otherwise it'll create a bunch of merge conflicts if one branch has that change and another doesn't.

Contributor:

I have no strong opinions on this, so whatever fits your needs best. Just wanted to point out that by separating out your test requirements, but leaving your tests in place, you are now shipping code that will break if it's imported.

Comment thread sideboard/configspec.ini
# which store other data for logged in users can add those fields to this list.
ws.session_fields = string_list(default=list("username"))

# When a frontend server permforms authentication before proxying a request,
Member:

"permforms"

EliAndrewC (author):

Good catch, I'll fix that in the other open PR that follows up this one.

RobRuana added a commit to magfest/ubersystem that referenced this pull request May 11, 2017:
* Removes import from tests, which is broken as of magfest/sideboard/pull/82

* Adds missing test requirements
