diff --git a/.gitignore b/.gitignore index ac787139..f936ad2b 100644 --- a/.gitignore +++ b/.gitignore @@ -50,3 +50,6 @@ docs/source/pylint-badge.svg # PyBuilder target/ + +# Random play files +splat* diff --git a/README.rst b/README.rst index 23435fb6..b8493660 100644 --- a/README.rst +++ b/README.rst @@ -23,10 +23,7 @@ typical pyramid + sqlalchemy application. .. note:: - Most work has shifted to an upcoming new version. The ``2_2_master`` branch is - where the development is happening, although the resulting release is likely - to be named "3" rather than "2" due to the number of breaking changes. More - information soon. + The default branch of pyramid_jsonapi is now the 2.2 branch. Documentation ------------- diff --git a/docs/source/client.rst b/docs/source/client.rst index 7868804d..a6e0ede2 100644 --- a/docs/source/client.rst +++ b/docs/source/client.rst @@ -333,8 +333,11 @@ Filtering ~~~~~~~~~ The JSON API spec doesn't say much about filtering syntax, other than that it -should use the parameter key ``filter``. In this implementation, we use syntax -like the following: +should use the parameter key ``filter``. There are multiple filtering syntaxes +available in pyramid_jsonapi. + +The first is simple filtering and has been +available since the first release. It uses the following syntax: .. code:: @@ -353,9 +356,20 @@ This is simple and reasonably effective. It's a little awkward on readability th Search operators in sqlalchemy (called column comparators) must be registered before they are treated as valid for use in json-api filters. The procedure for registering them, and the list of those registered by default can be found in :ref:`search_filter_operators`. +To specify another search filter syntax use the syntax name with a ``*`` in +front in the square brackets after ``filter``, like +``filter[*rql]=some rql filter``. + +Filter languages available: + +* RQL defined here ``_ as implemented in + rqlalchemy ``_. 
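Since both syntaxes are plain query string parameters, the square brackets (and, for RQL, the ``*`` and parentheses) need percent-encoding when you build request URLs programmatically. A quick Python sketch; the host, port, and collection are illustrative only:

```python
from urllib.parse import urlencode

base = 'http://localhost:6543/api/people'  # illustrative endpoint

# Simple filter syntax: people whose name is 'alice'.
simple = urlencode({'filter[name:eq]': 'alice'})

# The same query using the RQL syntax (note the '*' prefix).
rql = urlencode({'filter[*rql]': 'eq(name,alice)'})

print(base + '?' + simple)
print(base + '?' + rql)
```

Tools like ``http`` and ``curl`` do this encoding for you; the sketch just makes explicit what goes on the wire.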
+ Filter Examples ^^^^^^^^^^^^^^^ +**Simple**: + Find all the people with name 'alice': .. code-block:: bash @@ -379,3 +393,17 @@ Find all the posts where the author has the name 'alice': .. code-block:: bash http GET http://localhost:6543/api/posts?filter[author.name:eq]=alice + +**RQL** + +Find all the people with name 'alice': + +.. code-block:: bash + + http :6543/api/people filter[*rql]='eq(name,alice)' + +Find all the posts where the author has the name 'alice': + +.. code-block:: bash + + http :6543/api/posts filter[*rql]='eq((author,name),alice)' diff --git a/docs/source/customisation.rst b/docs/source/customisation.rst index 631c4e3c..185371d5 100644 --- a/docs/source/customisation.rst +++ b/docs/source/customisation.rst @@ -63,7 +63,7 @@ Available column options: =============== ========== ================================================ Option Value Type Description =============== ========== ================================================ -visible Boolean Whether or not to display this colum in the API. +visible Boolean Whether or not to display this column in the API. =============== ========== ================================================ Model Relationship Options @@ -268,119 +268,260 @@ It's also possible to specify a value transformation function to change the para value_transform=lambda val: re.sub(r'\*', '%', val) ) -Callbacks ---------- - -At certain points during the processing of a request, ``pyramid_jsonapi`` will -invoke any callback functions which have been registered. Callback sequences are -currently implemented as ``collections.deque``: you add your callback functions -using ``.append()`` or ``.appendleft()``, remove them with ``.pop()`` or -``.popleft()`` and so on. The functions in each callback list will be called in -order at the appropriate point. 
- -Getting the Callback Deque --------------------------- +Stages and the Workflow +----------------------- -Every view class (subclass of CollectionViewBase) has its own dictionary of -callback deques (``view_class.callbacks``). That dictionary is keyed by callback -deque name. For example, if you have a view_class and you would like to append -your ``my_after_get`` function to the ``after_get`` deque: +``pyramid_jsonapi`` services requests in stages. These stages are sequences of +functions implemented as a :class:`collections.deque` for each stage on each +method of each view class. It is possible to add (or remove) functions to those +deques directly but it is recommended that you use the following utility +function instead: .. code-block:: python - view_class.callbacks['after_get'].append(my_after_get) + view_class.add_stage_handler( + ['get', 'collection_get'], ['alter_document'], hfunc, + add_after='end', # 'end' is the default + add_existing=False, # False is the default + ) + +will append ``hfunc`` to the deque for the ``alter_document`` stage of +``view_class``'s methods ``get`` and ``collection_get``. ``add_after`` can be +``'end'`` to append to the deque, ``'start'`` to appendleft, or an existing +handler in the deque to insert after it. ``add_existing`` is a boolean +determining whether the handler should be added to the deque even if it exists +there already. -If you don't currently have a view class, you can get one from a model class -(for example, ``models.Person``) with: +To register a handler for all of the view methods involved in servicing a +particular http method, use ``pj.endpoint_data.http_to_view_methods``: .. 
code-block:: python - person_view_class = pyramid_jsonapi.PyramidJSONAPI.view_classes[models.Person] + view_class.add_stage_handler( + api_instance.endpoint_data.http_to_view_methods['post'], + ['alter_request'], + hfunc, + add_after='end', # 'end' is the default + add_existing=False, # False is the default + ) -Available Callback Deques -------------------------- +The above would append ``hfunc`` to the stage ``alter_request`` for all of the +view methods associated with the http method ``post`` (``collection_post``, +``relationships_post``). -The following is a list of available callbacks. Note that each item in the list -has a name like ``pyramid_jsonapi.callbacks_doc.``. That's so -that sphinx will link to auto-built documentation from the module -``pyramid_jsonapi.callbacks_doc``. In practice you should use only the name -after the last '.' to get callback deques. +If you do want to get directly at a stage deque, you can get it with something +like: -* :func:`pyramid_jsonapi.callbacks_doc.after_serialise_object` +.. code-block:: python -* :func:`pyramid_jsonapi.callbacks_doc.after_serialise_identifier` + ar_stage = pj.view_classes[models.Person].collection_post.stages['alter_request'] -* :func:`pyramid_jsonapi.callbacks_doc.after_get` +The handler functions in each stage deque will be called in +order at the appropriate point and should have the following signature: -* :func:`pyramid_jsonapi.callbacks_doc.before_patch` +.. code-block:: python -* :func:`pyramid_jsonapi.callbacks_doc.before_delete` + def handler_function(argument, view_instance, stage, view_method): + # some function definition... + return same_type_of_thing_as_argument -* :func:`pyramid_jsonapi.callbacks_doc.after_collection_get` +``argument`` in the ``alter_request`` stage would be a request, for example, +while in ``alter_document`` it would be a document object. ``argument`` and +``view_instance`` are passed positionally while ``stage`` and ``view_method`` +are keyword arguments. 
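As a minimal (hypothetical) illustration of that signature, here is a handler that logs where it is running and passes its argument through unchanged. Any real handler must likewise return the same type of object it received:

```python
import logging

log = logging.getLogger(__name__)


def log_stage(argument, view_instance, stage, view_method):
    # A pass-through handler: 'argument' is a request in alter_request,
    # a document in alter_document, and so on. Whatever it is, it must
    # be returned (possibly modified; here, untouched).
    log.debug('running stage %s of view method %s', stage, view_method)
    return argument
```

Because it never modifies its argument, a handler like this is safe to add to any stage of any view method.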
-* :func:`pyramid_jsonapi.callbacks_doc.before_collection_post`

For example, let's say you would like to alter all POSTs to the people
collection so that a created_on_server attribute is populated automatically.

-* :func:`pyramid_jsonapi.callbacks_doc.after_related_get`

.. code-block:: python

-* :func:`pyramid_jsonapi.callbacks_doc.after_relationships_get`

+    import json
+    import socket

-* :func:`pyramid_jsonapi.callbacks_doc.before_relationships_post`

+    def sh_created_on_server(req, view, **kwargs):
+        obj_data = req.json_body['data']
+        obj_data['attributes']['created_on_server'] = socket.gethostname()
+        req.body = json.dumps({'data': obj_data}).encode()
+        return req

-* :func:`pyramid_jsonapi.callbacks_doc.before_relationships_patch`

+    pj.view_classes[models.Person].add_stage_handler(
+        ['collection_post'], ['alter_request'], sh_created_on_server,
+    )

-* :func:`pyramid_jsonapi.callbacks_doc.before_relationships_delete`

The stages are run in the following order:

* ``alter_request``. Functions in this stage alter the request. For example,
  possibly editing any POST or PATCH such that it contains a server defined
  calculated attribute.
* ``validate_request``. Functions in this stage validate the request. For
  example, ensuring that the correct headers are set and that any json validates
  against the schema.
* Any stages defined by a ``workflow`` function from a loadable workflow module.
* ``alter_document``. Functions in this stage alter the ``document``, which is
  to say the dictionary which will be JSON serialised and sent back in the
  response.
* ``validate_response``.

The Loop Workflow
-----------------

The default workflow is the ``loop`` workflow. It defines the following stages:

* ``alter_query``. Alter the :class:`sqlalchemy.orm.query.Query` which will be
  executed (using ``.all()`` or ``.one()``) to fetch the primary result(s).
* ``alter_related_query``. Alter the :class:`sqlalchemy.orm.query.Query` which
  will be executed to fetch related result(s).
+* ``alter_result``. Alter a :class:`workflow.ResultObject` object containing
+  a database result (a sqlalchemy orm object) from a query of the requested
+  collection. This might also involve rejecting the whole object (for example,
+  for authorisation purposes).
+* ``before_write_item``. Alter a sqlalchemy orm item before it is written
+  (flushed) to the database.

Authorisation at the Object Level
---------------------------------

Authorisation of access at the object level can be quite complicated in typical
JSONAPI APIs. The complexity arises from the connecting nature of relationships.
Every operation on an object with relationships implies other operations on any
related objects. The simplest example is ``GET``: ``get`` permission is required
on any object directly fetched and *also* on any related object fetched. More
complicated is any write-based operation. For example, to update the owner of a
blog, you need ``patch`` permission on ``blog_x.owner``, ``post`` permission on
``new_owner.blogs`` (to add ``blog_x`` to the reverse relationship) and
``delete`` permission on ``old_owner.blogs`` (to remove ``blog_x`` from the
reverse relationship).

There are stage handlers available which handle most of the logic of
authorisation. The remaining logic is provided by permission filters which you
provide. The job of a permission filter is to decide, for an individual object,
whether the specified operation is allowed on that object or not. The permission
handler built in to pyramid_jsonapi will call the permission filters at the
appropriate times (including for related objects) and then stitch the answers
back together into a coherent, authorised whole.

The default permission filters allow everything, which is the same as not having
any permission filters at all. Permission filters should be registered with
:func:`CollectionView.register_permission_filter`.
+

Note that you supply the lists of permissions and stages handled by the
permission filter function so you can either write functions that are quite
specific or more general ones. They will have the permission sought and the
current stage passed as arguments to aid in decision making.

Permission filters will be called from within the code like this:

.. code-block:: python

    your_filter(
        object_rep,
        view=view_instance,
        stage=stage_name,
        permission=permission_sought,
    )

-Canned Callbacks
-----------------

Where ``object_rep`` is some representation of the object to be authorised,
``view_instance`` is the current view instance, ``stage_name`` is the name of
the current stage, and ``permission_sought`` is one of ``get``, ``post``,
``patch``, or ``delete``. Different stages imply different representations. For
example, the ``alter_request`` stage will pass a dictionary representing an item
from ``data`` from the JSON contained in a pyramid request, and the
``alter_document`` stage will pass a similar dictionary representation of an
object taken from the ``document`` to be serialised. The ``alter_result`` stage
from the loop workflow, on the other hand, will pass a
:class:`workflow.ResultObject`, which is a wrapper around a SQLAlchemy ORM
object (which you can get to as ``object_rep.object``).

Note that you can get the current SQLAlchemy session from
``view.dbsession`` (which you might need to make the queries required
for authorisation) and the pyramid request from ``view.request`` which
should give you access to the usual things.

The simplest thing that a permission filter can do is return ``True``
(``permission_sought`` is granted for the whole object) or ``False``
(``permission_sought`` is denied for the whole object).
To control permissions +for attributes or relationships, you must use the fuller return representation: -Using the callbacks above, you could, in theory, do things like implement a -permissions system, generalised call-outs to other data sources, or many other -things. However, some of those would entail quite a lot of work as well as being -potentially generally useful. In the interests of reuse, pyramid_jsonapi -maintains sets of self consistent callbacks which cooperate towards one goal. +.. code-block:: python -So far there is only one such set: ``access_control_serialised_objects``. This -set of callbacks implements an access control system based on the inspection of -serialised (as dictionaries) objects before POST, PATCH and DELETE operations -and after serialisation and GET operations. + { + 'id': True|False, # Controls visibility of / action on the whole object. + 'attributes': {'att1', 'att2', ...}, # The set of allowed attribute names. + 'relationships': {'rel1', 'rel2', ...}, # The set of allowed rel names. + } -Registering Canned Callbacks ----------------------------- +Putting that together in some examples: -Given a callback set name, you can register callback sets on each view class: +Let's say you have banned the user 'baddy' and want to authorise GET requests so +that baddy can no longer fetch blogs. Both the ``alter_document`` and +``alter_result`` stages would make sense as places to influence what will +be returned by a GET. We will choose ``alter_result`` here so that we are +authorising results as soon as +they come from the database. You might have something like this in +``__init__.py``: .. 
code-block:: python

-    view_class.append_callback_set('access_control_serialised_objects')
+    pj = pyramid_jsonapi.PyramidJSONAPI(config, models)
+    pj.view_classes[models.Blogs].register_permission_filter(
+        ['get'],
+        ['alter_result'],
+        lambda obj, view, **kwargs: view.request.remote_user != 'baddy',
+    )

-or on all view classes:
+Next, you want to do authorisation on PATCH requests and allow only the author
+of a blog post to PATCH it. The ``alter_request`` stage is the most obvious
+place to do this (you want to alter the request before it is turned into a
+database update). You might do something like this in ``__init__.py``:

.. code-block:: python

-    pyramid_jsonapi.PyramidJSONAPI.append_callback_set_to_all_views(
-        'access_control_serialised_objects'
+    pj = pyramid_jsonapi.PyramidJSONAPI(config, models)
+    def patch_posts_filter(data, view, **kwargs):
+        post_obj = view.dbsession.get(models.Posts, data['id'])  # sqlalchemy 1.4+
+        # post_obj = view.dbsession.query(models.Posts).get(data['id'])  # sqlalchemy < 1.4
+        return view.request.remote_user == post_obj.author.name
+    pj.view_classes[models.Posts].register_permission_filter(
+        ['patch'],
+        ['alter_request'],
+        patch_posts_filter
     )

-Callback Sets
--------------
-
-``access_control_serialised_objects``
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Imagine that ``Person`` objects have an ``age`` attribute. Access to ``age`` is
+sensitive so only the person themselves and anyone in the (externally defined)
+``age_viewers`` group should be able to see that attribute. Other viewers should
+still be able to see the object so we can't just return ``False`` from the
+permission filter - we must use the fuller return format.

-These callbacks will allow, deny, or manipulate the results of actions
-dependent upon the return values of two methods of the calling view class:
-:func:`pyramid_jsonapi.CollectionViewBase.allowed_object` and
-:func:`pyramid_jsonapi.CollectionViewBase.allowed_fields`.

.. 
code-block:: python

-The default implementations allow everything. To do anything else, you need to
-replace those methods with your own implementations.
+    pj = pyramid_jsonapi.PyramidJSONAPI(config, models)
+
+    def get_person_filter(person, view, **kwargs):
+        # This could be done in one 'if' but we split it out here for clarity.
+        #
+        # A person should see the full object for themselves.
+        if view.request.remote_user == person.username:
+            return True
+        #
+        # Anyone in the age_viewers group should also see the full object.
+        # get_group_members() is an imagined function in this app which gets the
+        # members of a named group.
+        if view.request.remote_user in get_group_members('age_viewers'):
+            return True
+
+        # Everyone else isn't allowed to see age.
+        return {
+            'id': True,  # False would reject the whole object. Missing out the 'id'
+                         # key is the same as specifying True.
+            'attributes': set(view.all_attributes) - {'age'},
+            'relationships': True  # The same as allowing all relationships.
+        }

-* :func:`pyramid_jsonapi.CollectionViewBase.allowed_object` will be given two
-  arguments: an instance of a view class, and the serialised object (so far). It
-  should return ``True`` if the operation (available from view.request) is
-  allowed on the object, or ``False`` if not.
+    pj.view_classes[models.Person].register_permission_filter(
+        ['get'],
+        ['alter_result'],
+        get_person_filter
+    )

-* :func:`pyramid_jsonapi.CollectionViewBase.allowed_fields` will be given one
-  argument: an instance of a view class. It should return the set of fields
-  (attributes and relationships) on which the current operation is allowed.
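To make the dictionary return format concrete, here is a standalone sketch (not pyramid_jsonapi code, just an illustration of the semantics described above) of how such a return value acts as a mask over a serialised resource:

```python
def apply_mask(resource, mask):
    """Illustrate how a permission filter's return value restricts a resource.

    'resource' is a dict with 'attributes' and 'relationships' sub-dicts;
    'mask' is True, False, or the fuller dictionary form.
    """
    if mask is True:
        return resource  # everything allowed
    if mask is False or not mask.get('id', True):
        return None  # the whole object is rejected
    result = dict(resource)
    for section in ('attributes', 'relationships'):
        allowed = mask.get(section, True)
        if allowed is not True:
            # Keep only the names in the allowed set.
            result[section] = {
                name: value
                for name, value in resource.get(section, {}).items()
                if name in allowed
            }
    return result
```

Run against the ``age`` example: a mask of ``{'id': True, 'attributes': {'name'}, 'relationships': True}`` keeps the object and its relationships but strips every attribute except ``name``.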
+
What Happens With Authorisation Failures
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
diff --git a/notes.md b/notes.md
new file mode 100644
index 00000000..4867ea35
--- /dev/null
+++ b/notes.md
@@ -0,0 +1,305 @@
+# Notes
+
+## Permissions
+
+`{blogs/1}` has the following representation:
+
+```python
+{
+    "type": "blogs",
+    "id": "1",
+    "attributes": {
+        "title": "alice's blog",
+        "content": "Welcome to alice's blog.",
+        "secret_code": "secret"
+    },
+    "relationships": {
+        "owner": {
+            "data": {"type": "people", "id": "1"}  # alice
+        },
+        "posts": {
+            "data": [
+                {"type": "posts", "id": "1"},
+                {"type": "posts", "id": "2"}
+            ]
+        }
+    }
+}
+```
+
+### get
+
+#### example: `GET /blogs/1`
+View method: `get`
+
+Permissions required in general:
+
+ 1. `GET` for each attribute value (via return of `get_pfilter(obj, resource=True, ...)` being `True` or `{'attributes': {'the_attribute'}}`); `False` will result in no resource (`HTTPForbidden` or `HTTPNotFound`).
+ 1. `GET` for each relationship or it will not appear.
+ 1. `GET` for each related item (via return of `get_pfilter(identifier, resource=False, ...)`) or it will not be returned (`meta` _might_ have list of items removed from return).
+ 1. `GET` for each resource in `included`; these will, in turn, be subject to the above permission rules.
+
+Permissions required for `{blogs/1}`:
+
+ 1. `GET` permission on `{blogs/1}` to see that `{blogs/1}` exists.
+    - `blogs1_GET_perms = blogs_view_class.pfilter[GET](blogs_view_instance, {blogs/1})`
+    - `blogs1_GET_perms == True or isinstance(blogs1_GET_perms, dict)` for exists permission.
+ 1. `GET` permission on `{blogs/1}.title, content, secret_code` to see existence and value of `title`, `content`, `secret_code`.
+    - `blogs1_GET_perms == True` or `'title' in blogs1_GET_perms['attributes']` etc.
+ 1. `GET` permission on `{blogs/1}.owner` to see that `{blogs/1}.owner` exists.
+    - `blogs1_GET_perms == True` or `'owner' in blogs1_GET_perms['relationships']`.
+ 1. 
`GET` permission on `{people/1}` or else resource identifier will not be added. + 1. `people1_GET_perms = people_view_class.pfilter[GET](blogs_view_instance, {people/1})` + + +Sketch of procedure: + + 1. ask for `GET` permission to `{blogs/1}`: + 1. `True` gives access to the content of all attributes (`title`, `content`, and `secret_code`) and shows the _existence_ of all rels (`owner` and `posts`). + 1. `False` denies access to the whole resource. A config setting changes behaviour between forbidden error and pretending `{blogs/1}` doesn't exist. + 1. A dictionary in the form `{'attributes': {'title', 'content'}, 'relationships': {'posts'}}` gives the same level of access to selected attributes and relationships (access to the contents of listed attributes and existence of relationships). + 1. Now loop through all related items in _allowed_ relationships: + 1. Ask for `GET` permission to each related resource. + 1. `True` _or_ a dictionary means a resource identifier for that resource will be present in that relationship. + 1. `False` means a resource identifier will not be present. + 1. If a related resource's identifier is present in the relationship data then it may be included (if the request asked for it to be). + 1. Each included resource will be shown depending on `GET` permissions as above (but requested for the included resource). + +#### example: `GET /blogs` +View method: `collection_get` + +Permissions required: + +Same as `GET /blogs/1` example above, but applied to each resource in `data`. + +Sketch of procedure: + + 1. `GET` permission checked for every resource in `data`. + 1. If `True`, add whole resource. + 1. If `False`, remove resource. + 1. If dictionary, modify resource and add. + 1. Check related resources for each primary resource in `data` as for `GET /blogs/1` above. + 1. Build `included` as for `GET /blogs/1` example above. + +#### example: `GET /blogs/1/owner` (to_one rel) +View method: `related_get` + +Permissions required: + + 1. 
`GET` for `{blogs/1}` relationship `owner`.
    - Note: if no further permissions were enforced, this would result in returning the resource `{people/1}`.
 2. Same permissions required as outlined in the `GET /blogs/1` example above but for the `{people/1}` resource.

#### example: `GET /blogs/1/posts` (to_many rel)

Same as `GET /blogs/1/owner` above but loop over the `posts` results.

#### example: `GET /blogs/1/relationships/owner`
View method: `relationships_get`

Permissions required:

 1. `GET` for `{blogs/1}` relationship `owner`.
    - Note: if no further permissions were enforced, this would result in returning the resource identifier for `{people/1}`.
 2. `GET` for res id for `{people/1}` (`pfilter` result of `True` or any dictionary will do).
 3. `GET` as appropriate for any resources in `included`.

#### example: `GET /blogs/1/relationships/posts`
View method: `relationships_get`

Permissions required:

Pretty much the same as `GET /blogs/1/relationships/owner`.

#### example: `POST /blogs [blogs_resource]`
View method: `collection_post`

Permissions required:

 1. `POST` for the supplied resource. `pfilter` return of `True` will allow the whole resource through; dictionary will strip any blocked attributes and rels before attempting to create the resource (which might result in failure); `False` will entirely block creation (HTTPForbidden).
 2. `POST` for the relationship for any supplied relationships.
 3. `POST` (if the created resource will be added to a `to_many` relationship of the related resource) or `PATCH` (if the created resource will be the target of a `to_one` relationship of the related resource) for the reverse relationship of any resources in supplied relationships?
+
Illustrating creating a resource with supplied relationships:

```json
POST /blogs
{
    "data": {
        "type": "blogs",
        "attributes": {
            "title": "A new blog"
        },
        "relationships": {
            "owner": {
                "data": {"type": "people", "id": "1"}
            },
            "posts": {
                "data": [
                    {"type": "posts", "id": "1"},
                    {"type": "posts", "id": "2"}
                ]
            }
        }
    }
}
```

(3) would imply that we need `POST` permission to add the new blog to `/people/1/relationships/blogs`. Similarly we would need `PATCH` permission to set this as the blog for each `posts/{id}/relationships/blog`.

#### example: `POST /blogs/1/relationships/posts [resource identifiers]`
View method: `relationships_post`

Assume `{posts/10}.blog` is currently `None` and `{posts/20}.blog` is currently `{blogs/2}`.

```json
POST /blogs/1/relationships/posts
{
    "data": [
        {"type": "posts", "id": "10"},
        {"type": "posts", "id": "20"}
    ]
}
```

Permissions required:

 1. `POST` permission on `{blogs/1}.posts` to add `{posts/10}`.
    1. `PATCH` permission on `{posts/10}.blog` to set value to `{blogs/1}`.
 1. `POST` permission on `{blogs/1}.posts` to add `{posts/20}`.
    1. `PATCH` permission on `{posts/20}.blog` to set value to `{blogs/1}`.
    2. `DELETE` permission on `{blogs/2}.posts` to remove `{posts/20}`.


#### example: `PATCH /blogs/1`

Consider patching `{blogs/1}` as follows:

```json
PATCH /blogs/1
{
    "data": {
        "type": "blogs",
        "id": "1",
        "attributes": {
            "title": "A new title"
        },
        "relationships": {
            "owner": {
                "data": {"type": "people", "id": "2"}
            },
            "posts": {
                "data": [
                    {"type": "posts", "id": "2"},
                    {"type": "posts", "id": "3"}
                ]
            }
        }
    }
}
```

We assume that the `title` is changing, the `owner` is changing from `{people/1}` to `{people/2}`, and that the list of `posts` is changing from `[1,2]` to `[2,3]` (removing `{posts/1}` and adding `{posts/3}`).

Permissions required:

 1. `PATCH` permission to the `title` attribute.
 1. 
`PATCH` permission on `{blogs/1}.owner` to set value to `{people/2}`. + 1. `POST` permission on `{people/2}.blogs` to add `{blogs/1}`. + 1. `DELETE` permission on `{people/1}.blogs` to remove `{blogs/1}` + 1. `POST` permission on `{blogs/1}.posts` to add `{posts/3}`. + 1. `PATCH` permission on `{posts/3}.blog` to set value to `{blogs/1}`. + 1. `DELETE` permission on `{blogs/ID}.posts` to remove `{posts/3}` if `{posts/3}.blog` is currently `{blogs/ID}`. + 1. `DELETE` permission on `{blogs/1}.posts` to remove `{posts/1}`. + 1. `PATCH` permission on `{posts/1}.blog` to set value to `None/null`. + +#### example: `PATCH /blogs/1/relationships/owner` + +```json +PATCH /blogs/1/relationships/owner +{ + "data": {"type": "people", "id": "2"} +} +``` + +Permissions required: + + 1. `PATCH` permission on `{blogs/1}.owner` to set value to `{people/2}`. + 1. `POST` permission on `{people/2}.blogs` to add `{blogs/1}`. + 1. `DELETE` permission on `{people/1}.blogs` to remove `{blogs/1}`. + +#### example: `PATCH /blogs/1/relationships/posts` + +Assume: + - `{blogs/1}.posts` is currently `[{posts/1}, {posts/2}]`. + - `{posts/3}.blog` is currently `None`. + - `{posts/4}.blog` is currently `{blogs/2}`. + +```json +PATCH /blogs/1/relationships/posts +{ + "data": [ + {"type": "posts", "id": "2"}, + {"type": "posts", "id": "3"}, + {"type": "posts", "id": "4"} + ] +} +``` + +So this constitutes removing `{posts/1}` and adding `{posts/3}` and `{posts/4}`. + +Permissions required: + + 1. `DELETE` permission on `{blogs/1}.posts` to remove `{posts/1}`. + 1. `PATCH` permission on `{posts/1}.blog` to set value to `None`. + 1. `POST` permission on `{blogs/1}.posts` to add `{posts/3}`. + 1. `PATCH` permission on `{posts/3}.blog` to set value to `{blogs/1}`. + 1. `POST` permission on `{blogs/1}.posts` to add `{posts/4}`. + 1. `PATCH` permission on `{posts/4}.blog` to set value to `{blogs/1}`. + 1. `DELETE` permission on `{blogs/2}.posts` to remove `{posts/4}`. 
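The additions and removals above fall out of a set difference between the current and requested identifier lists. A sketch of that bookkeeping (a hypothetical helper, not library code):

```python
def to_many_patch_ops(current_ids, new_ids):
    # Replacing a to_many relationship decomposes into additions (each
    # needs POST permission on the relationship, plus PATCH on the added
    # item's reverse to_one rel) and removals (each needs DELETE
    # permission, plus PATCH to null the removed item's reverse to_one).
    current, new = set(current_ids), set(new_ids)
    return {'add': sorted(new - current), 'remove': sorted(current - new)}


# The PATCH /blogs/1/relationships/posts example above:
ops = to_many_patch_ops(['1', '2'], ['2', '3', '4'])
```

Here `ops` names `{posts/3}` and `{posts/4}` as additions and `{posts/1}` as the removal, matching the permission list above.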
+
#### example: `DELETE /blogs/1`

Assume `{blogs/1}` is as represented at the beginning of the permissions section.

Permissions required:

 1. `DELETE` permission on `{blogs/1}`.
 1. `DELETE` permission on `{people/1}.blogs` to remove `{blogs/1}`.
 1. `PATCH` permission on `{posts/1}.blog` to set value to `None`.
 1. `PATCH` permission on `{posts/2}.blog` to set value to `None`.

#### example: `DELETE /blogs/1/relationships/posts`

Permissions required:

1. `DELETE` permission on `{blogs/1}.posts` to remove `{posts/1}`.
    1. `PATCH` permission on `{posts/1}.blog` to set value to `None`.
1. `DELETE` permission on `{blogs/1}.posts` to remove `{posts/2}`.
    1. `PATCH` permission on `{posts/2}.blog` to set value to `None`.

### Permission Filters and Asking for Permission

Each model can have one permission filter per stage and possible permission. The possible permissions are the lower case versions of the HTTP verbs: `get`, `post`, `patch`, `delete`. They should have the signature:

`pfilter(target, mask, permission_sought, stage_name, view_instance)`

A workflow that is seeking permission for an action will call the registered `pfilter`. 
+ +mask = { + 'id': True, + 'attributes': {'att1': True, 'att2': False, ...}, + 'relationships': {'rel1': False, 'rel2': True, ...} +} + +mask = view.nothing_mask +mask = view.only_id_mask +mask = view.all_attributes_mask +mask = view.all_relationships_mask +mask = view.everything_mask + +mask = view.attributes_mask(attributes) +mask = view.relationships_mask(relationships) + +mask = view.mask_or(mask1, mask2) +mask = view.mask_and(mask1, mask2) diff --git a/pyramid_jsonapi/__init__.py b/pyramid_jsonapi/__init__.py index 7ed6af49..6d3ce628 100644 --- a/pyramid_jsonapi/__init__.py +++ b/pyramid_jsonapi/__init__.py @@ -37,16 +37,27 @@ import sqlalchemy from sqlalchemy.exc import DBAPIError from sqlalchemy.ext.associationproxy import ASSOCIATION_PROXY -from sqlalchemy.ext.declarative.api import DeclarativeMeta +# DeclarativeMeta moved between sqlalchemy 1.3 and 1.4 +try: + # <= 1.3 + from sqlalchemy.ext.declarative.api import DeclarativeMeta +except ImportError: + # 1.4+ + from sqlalchemy.orm import DeclarativeMeta from sqlalchemy.ext.hybrid import hybrid_property +from sqlalchemy.orm.interfaces import ( + MANYTOMANY, + MANYTOONE, + ONETOMANY, +) from sqlalchemy.orm.relationships import RelationshipProperty import pyramid_jsonapi.collection_view import pyramid_jsonapi.endpoints import pyramid_jsonapi.filters -import pyramid_jsonapi.jsonapi import pyramid_jsonapi.metadata import pyramid_jsonapi.version +import pyramid_jsonapi.workflow as wf __version__ = pyramid_jsonapi.version.get_version() @@ -70,6 +81,7 @@ class PyramidJSONAPI(): 'allow_client_ids': {'val': False, 'desc': 'Allow client to specify resource ids.'}, 'api_version': {'val': '', 'desc': 'API version for prefixing endpoints and metadata generation.'}, 'expose_foreign_keys': {'val': False, 'desc': 'Expose foreign key fields in JSON.'}, + 'inform_of_get_authz_failures': {'val': True, 'desc': 'True = return information in meta about authz failures; False = pretend items don\'t exist'}, 'metadata_endpoints': 
{'val': True, 'desc': 'Should /metadata endpoint be enabled?'}, 'metadata_modules': {'val': 'JSONSchema OpenAPI', 'desc': 'Modules to load to provide metadata endpoints (defaults are modules provided in the metadata package).'}, 'openapi_file': {'val': '', 'desc': 'File containing OpenAPI data (YAML or JSON)'}, @@ -85,8 +97,18 @@ class PyramidJSONAPI(): 'schema_validation': {'val': True, 'desc': 'jsonschema schema validation enabled?'}, 'debug_endpoints': {'val': False, 'desc': 'Whether or not to add debugging endpoints.'}, 'debug_test_data_module': {'val': 'test_data', 'desc': 'Module responsible for populating test data.'}, - 'debug_meta': {'val': False, 'desc': 'Whether or not to add debug information to the meta key in returned JSON.'}, 'debug_traceback': {'val': False, 'desc': 'Whether or not to add a stack traceback to errors.'}, + 'debug_meta': {'val': False, 'desc': 'Whether or not to add debug information to the meta key in returned JSON.'}, + 'workflow_get': {'val': 'pyramid_jsonapi.workflow.loop.get', 'desc': 'Module implementing the get workflow.'}, + 'workflow_patch': {'val': 'pyramid_jsonapi.workflow.loop.patch', 'desc': 'Module implementing the patch workflow.'}, + 'workflow_delete': {'val': 'pyramid_jsonapi.workflow.loop.delete', 'desc': 'Module implementing the delete workflow.'}, + 'workflow_collection_get': {'val': 'pyramid_jsonapi.workflow.loop.collection_get', 'desc': 'Module implementing the collection_get workflow.'}, + 'workflow_collection_post': {'val': 'pyramid_jsonapi.workflow.loop.collection_post', 'desc': 'Module implementing the collection_post workflow.'}, + 'workflow_related_get': {'val': 'pyramid_jsonapi.workflow.loop.related_get', 'desc': 'Module implementing the related_get workflow.'}, + 'workflow_relationships_get': {'val': 'pyramid_jsonapi.workflow.loop.relationships_get', 'desc': 'Module implementing the relationships_get workflow.'}, + 'workflow_relationships_post': {'val': 'pyramid_jsonapi.workflow.loop.relationships_post', 
'desc': 'Module implementing the relationships_post workflow.'}, + 'workflow_relationships_patch': {'val': 'pyramid_jsonapi.workflow.loop.relationships_patch', 'desc': 'Module implementing the relationships_patch workflow.'}, + 'workflow_relationships_delete': {'val': 'pyramid_jsonapi.workflow.loop.relationships_delete', 'desc': 'Module implementing the relationships_delete workflow.'}, } def __init__(self, config, models, get_dbsession=None): @@ -228,24 +250,24 @@ def create_resource(self, model, collection_name=None, expose_fields=None): ``collection_view_factory()`` """ - # Find the primary key column from the model and use as 'id_col_name' - try: - keycols = sqlalchemy.inspect(model).primary_key - except sqlalchemy.exc.NoInspectionAvailable: - # Trying to inspect the declarative_base() raises this exception. We - # don't want to add it to the API. - return - # Only deal with one primary key column. - if len(keycols) > 1: - raise Exception( - 'Model {} has more than one primary key.'.format( - model.__name__ - ) - ) - if not hasattr(model, '__pyramid_jsonapi__'): model.__pyramid_jsonapi__ = {} + if 'id_col_name' not in model.__pyramid_jsonapi__: + # Find the primary key column from the model and use as 'id_col_name' + try: + keycols = sqlalchemy.inspect(model).primary_key + except sqlalchemy.exc.NoInspectionAvailable: + # Trying to inspect the declarative_base() raises this exception. + # We don't want to add it to the API. + return + # Only deal with one primary key column. + if len(keycols) > 1: + raise Exception( + 'Model {} has more than one primary key.'.format( + model.__name__ + ) + ) model.__pyramid_jsonapi__['id_col_name'] = keycols[0].name # Create a view class for use in the various add_view() calls below. 
@@ -262,6 +284,17 @@ def create_resource(self, model, collection_name=None, expose_fields=None): view.default_limit = int(self.settings.paging_default_limit) view.max_limit = int(self.settings.paging_max_limit) + view.get = wf.make_method('get', self) + view.patch = wf.make_method('patch', self) + view.delete = wf.make_method('delete', self) + view.collection_get = wf.make_method('collection_get', self) + view.collection_post = wf.make_method('collection_post', self) + view.related_get = wf.make_method('related_get', self) + view.relationships_get = wf.make_method('relationships_get', self) + view.relationships_post = wf.make_method('relationships_post', self) + view.relationships_patch = wf.make_method('relationships_patch', self) + view.relationships_delete = wf.make_method('relationships_delete', self) + self.endpoint_data.add_routes_views(view) def collection_view_factory(self, model, collection_name=None, expose_fields=None): @@ -281,6 +314,17 @@ def collection_view_factory(self, model, collection_name=None, expose_fields=Non class_attrs['key_column'] = sqlalchemy.inspect(model).primary_key[0] class_attrs['collection_name'] = collection_name or model.__tablename__ class_attrs['exposed_fields'] = expose_fields + class_attrs['permission_filters'] = { + m.lower(): {} for m in + self.endpoint_data.endpoints['method_sets']['read'] + } + class_attrs['permission_filters'].update( + { + m.lower(): {} for m in + self.endpoint_data.endpoints['method_sets']['write'] + } + ) + # atts is ordinary attributes of the model. # hybrid_atts is any hybrid attributes defined. 
# fields is atts + hybrid_atts + relationships @@ -300,11 +344,16 @@ def collection_view_factory(self, model, collection_name=None, expose_fields=Non for key, item in sqlalchemy.inspect(model).all_orm_descriptors.items(): if isinstance(item, hybrid_property): if expose_fields is None or item.__name__ in expose_fields: - hybrid_atts[item.__name__] = item - fields[item.__name__] = item + if item.info.get('pyramid_jsonapi', {}).get('relationship', False): + rels[key] = item + else: + hybrid_atts[item.__name__] = item + fields[item.__name__] = item if item.extension_type is ASSOCIATION_PROXY: rels[key] = item class_attrs['hybrid_attributes'] = hybrid_atts + class_attrs['all_attributes'] = atts.copy() + class_attrs['all_attributes'].update(hybrid_atts) for key, rel in sqlalchemy.inspect(model).mapper.relationships.items(): if expose_fields is None or key in expose_fields: rels[key] = rel @@ -313,24 +362,6 @@ def collection_view_factory(self, model, collection_name=None, expose_fields=Non fields.update(rels) class_attrs['fields'] = fields - # All callbacks have the current view as the first argument. The comments - # below detail subsequent args. 
- class_attrs['callbacks'] = { - 'after_serialise_identifier': deque(), # args: identifier(dict) - 'after_serialise_object': deque(), # args: object(dict) - 'after_get': deque(), # args: document(dict) - 'before_patch': deque(), # args: partial_object(dict) - 'before_delete': deque(), # args: item(sqlalchemy) - 'after_collection_get': deque(), # args: document(dict) - 'before_collection_post': deque(), # args: object(dict) - 'after_related_get': deque(), # args: document(dict) - 'after_relationships_get': deque(), # args: document(dict) - 'before_relationships_post': deque(), # args: object(dict) - 'before_relationships_patch': deque(), # args: partial_object(dict) - 'before_relationships_delete': - deque(), # args: parent_item(sqlalchemy) - } - view_class = type( 'CollectionView<{}>'.format(collection_name), (pyramid_jsonapi.collection_view.CollectionViewBase, ), @@ -343,18 +374,44 @@ def collection_view_factory(self, model, collection_name=None, expose_fields=Non return view_class - def append_callback_set_to_all_views(self, set_name): # pylint:disable=invalid-name - """Append a named set of callbacks to all view classes. + def enable_permission_handlers(self, permissions, stage_names): + ''' + Add permission handlers to all views. - Args: - set_name (str): key in ``callback_sets``. - """ - for view_class in self.view_classes.values(): - view_class.append_callback_set(set_name) + Permission handlers are not added to views by default for performance + reasons. Call this function to add permission handlers to *all* views + for the view methods that the permissions imply and for the stage names + specified. + + Arguments: + permissions: an iterable of permissions to be enabled. Each permission + should be one of ``get``, ``post``, ``patch``, ``delete`` or + ``write`` (which expands to the same as ``post``, ``patch`` and + ``delete``). + stage_names: an iterable of stage names to enable. + + ''' + # Build a set of all the endpoints from permissions. 
+ ep_names = set() + for perm in permissions: + ep_names.update(self.endpoint_data.http_to_view_methods[perm.lower()]) + + # Add permission handlers for all view classes. + for model, view_class in self.view_classes.items(): + for ep_name in ep_names: + ep_func = getattr(view_class, ep_name) + ep_func.stages['alter_document'].append( + wf.sh_alter_document_add_denied + ) + for stage_name in stage_names: + view_class.add_stage_handler( + [ep_name], [stage_name], + view_class.permission_handler(ep_name, stage_name) + ) class StdRelationship: - """Standardise access to relationship informationself. + """Standardise access to relationship information. Attributes: obj: the actual object representing the relationship. @@ -368,21 +425,39 @@ def __init__(self, name, obj, view_class): if isinstance(obj, RelationshipProperty): self.direction = self.rel_direction self.tgt_class = self.rel_tgt_class + self.instrumented = getattr(self.src_class, self.name) + self.queryable = True + elif isinstance(obj, hybrid_property): + pj_info = obj.info['pyramid_jsonapi']['relationship'] + self.direction = pj_info.get('direction', ONETOMANY) + self.queryable = pj_info.get('queryable', False) + tgt_class = pj_info.get('tgt_class') + if isinstance(tgt_class, str): + for mapper in view_class.model.registry.mappers: + if mapper.class_.__name__ == tgt_class: + tgt_class = mapper.class_ + break + self.tgt_class = tgt_class elif obj.extension_type is ASSOCIATION_PROXY: self.direction = self.proxy_direction self.tgt_class = self.proxy_tgt_class + self.queryable = True @property def rel_direction(self): return self.obj.direction + @property + def to_many(self): + return self.direction in (ONETOMANY, MANYTOMANY) + @property def proxy_direction(self): ps = self.obj.for_class(self.src_class) if ps.scalar: - return sqlalchemy.orm.interfaces.MANYTOONE + return MANYTOONE else: - return sqlalchemy.orm.interfaces.MANYTOMANY + return MANYTOMANY @property def rel_tgt_class(self): @@ -393,6 +468,61 @@ def 
proxy_tgt_class(self): ps = self.obj.for_class(self.src_class) return getattr(ps.target_class, ps.value_attr).mapper.class_ + @property + def rel_mirror_relationship(self): + tgt_view = self.view_class.api.view_classes[self.tgt_class] + found = None + for rname, r in tgt_view.relationships.items(): + if not isinstance(r.obj, RelationshipProperty): + # Making the assumption that the mirror of any normal rel + # will be another normal rel. + continue + if self.direction is MANYTOMANY: + # For MANYTOMANY we need to look at the secondaryjoin. + if ( + self.obj.primaryjoin.left == r.obj.secondaryjoin.left and + self.obj.primaryjoin.right == r.obj.secondaryjoin.right and + self.obj.secondaryjoin.left == r.obj.primaryjoin.left and + self.obj.secondaryjoin.right == r.obj.primaryjoin.right + ): + return StdRelationship(rname, r.obj, tgt_view) + else: + if ( + self.obj.primaryjoin.left == r.obj.primaryjoin.left and + self.obj.primaryjoin.right == r.obj.primaryjoin.right + ): + # Done. + return StdRelationship(rname, r.obj, tgt_view) + return None + + @property + def proxy_mirror_relationship(self): + tgt_view = self.view_class.api.view_classes[self.tgt_class] + pi = self.obj.for_class(self.src_class) + for rname, r in tgt_view.relationships.items(): + if r.obj.extension_type is not ASSOCIATION_PROXY: + # Assume that the mirror of any association proxy rel + # will be another association proxy. 
+ continue + rpi = r.obj.for_class(r.src_class) + if ( + pi.local_attr.property.primaryjoin.left == rpi.remote_attr.property.primaryjoin.left and + pi.local_attr.property.primaryjoin.right == rpi.remote_attr.property.primaryjoin.right and + pi.remote_attr.property.primaryjoin.left == rpi.local_attr.property.primaryjoin.left and + pi.remote_attr.property.primaryjoin.right == rpi.local_attr.property.primaryjoin.right + ): + return StdRelationship(rname, r.obj, tgt_view) + return None + + @property + def mirror_relationship(self): + if isinstance(self.obj, RelationshipProperty): + return self.rel_mirror_relationship + elif self.obj.extension_type is ASSOCIATION_PROXY: + return self.proxy_mirror_relationship + else: + return None + class DebugView: """Pyramid view class defining a debug API. diff --git a/pyramid_jsonapi/collection_view.py b/pyramid_jsonapi/collection_view.py index fb8769dd..3d7957fe 100644 --- a/pyramid_jsonapi/collection_view.py +++ b/pyramid_jsonapi/collection_view.py @@ -2,10 +2,13 @@ # pylint: disable=too-many-lines; It's mostly docstrings import functools import itertools +import importlib import logging import re +import sqlalchemy +from collections import namedtuple from collections.abc import Sequence - +from functools import partial from pyramid.httpexceptions import ( HTTPNotFound, HTTPForbidden, @@ -19,10 +22,14 @@ HTTPMethodNotAllowed, status_map, ) -import pyramid_jsonapi.jsonapi -import sqlalchemy +from pyramid.settings import asbool +from rqlalchemy import RQLQueryMixIn from sqlalchemy.ext.associationproxy import AssociationProxy -from sqlalchemy.orm import load_only, aliased +from sqlalchemy.orm import ( + load_only, + aliased, + Query as BaseQuery, +) from sqlalchemy.orm.relationships import RelationshipProperty from sqlalchemy.orm.exc import NoResultFound @@ -30,6 +37,15 @@ MANYTOMANY = sqlalchemy.orm.interfaces.MANYTOMANY MANYTOONE = sqlalchemy.orm.interfaces.MANYTOONE +import pyramid_jsonapi.workflow as wf + + +Entity = 
namedtuple('Entity', 'type') + + +class RQLQuery(BaseQuery, RQLQueryMixIn): + pass + class CollectionViewBase: """Base class for all view classes. @@ -43,6 +59,7 @@ class CollectionViewBase: # Define class attributes # Callable attributes use lambda to keep pylint happy api = None + all_attributes = None attributes = None callbacks = None collection_name = None @@ -56,6 +73,7 @@ class CollectionViewBase: max_limit = None model = lambda: None obj_id = None + not_found_message = None request = None rel = None rel_class = None @@ -64,6 +82,7 @@ class CollectionViewBase: relname = None view_classes = None settings = None + permission_filters = None def __init__(self, request): self.request = request @@ -78,206 +97,36 @@ def id_col(item): """Return the column holding an item's id.""" return getattr(item, item.__pyramid_jsonapi__['id_col_name']) - def jsonapi_view(func): # pylint: disable=no-self-argument - """Decorator for view functions. Adds jsonapi boilerplate, - and tests response validity.""" - - def view_exceptions(func): - """Decorator to intercept all exceptions raised by wrapped view methods. - - If the exception is 'valid' according to the schema, raise it. - Else raise a generic 4xx or 5xx error and log the real one. 
- """ - @functools.wraps(func) - def new_func(self): # pylint: disable=missing-docstring - ep_dict = self.api.endpoint_data.endpoints - # Get route_name from route - _, _, endpoint = self.request.matched_route.name.split(':') - method = self.request.method - try: - responses = set( - ep_dict['responses'].keys() | - ep_dict['endpoints'][endpoint]['responses'].keys() | - ep_dict['endpoints'][endpoint]['http_methods'][method]['responses'].keys() - ) - except KeyError: - raise HTTPMethodNotAllowed( - 'Unsupported method "{}" for endpoint "{}"'.format(method, endpoint) - ) - try: - result = func(self) # pylint: disable=not-callable - response_class = status_map[self.request.response.status_code] - if response_class not in responses: - logging.error( - "Invalid response: %s for route_name: %s path: %s", - response_class, - self.request.matched_route.name, - self.request.current_route_path() - ) - return result - except Exception as exc: - if exc.__class__ not in responses: - logging.exception( - "Invalid exception raised: %s for route_name: %s path: %s", - exc.__class__, - self.request.matched_route.name, - self.request.current_route_path() - ) - if hasattr(exc, 'code'): - try: - # We have a code but it's probably a string. We need an integer. - int_code = int(exc.code) # pylint: disable=no-member - except Exception as ex2: - # If we can't turn it into an integer then we're really stuck. - # raise HTTPInternalServerError("Unexpected server error.") - int_code = 500 - if 400 <= int_code < 500: # pylint:disable=no-member - raise HTTPBadRequest("Unexpected client error: {}".format(exc)) - else: - raise HTTPInternalServerError("Unexpected server error.") - raise - return new_func - - @functools.lru_cache() - def get_jsonapi_accepts(request): - """Return a set of all 'application/vnd.api' parts of the accept - header. 
- """ - accepts = re.split( - r',\s*', - request.headers.get('accept', '') - ) - return { - a for a in accepts - if a.startswith('application/vnd.api') - } - - def check_request_headers(request, jsonapi_accepts): - """Check that request headers comply with spec. - - Raises: - HTTPUnsupportedMediaType - HTTPNotAcceptable - """ - # Spec says to reject (with 415) any request with media type - # params. - if len(request.headers.get('content-type', '').split(';')) > 1: - raise HTTPUnsupportedMediaType( - 'Media Type parameters not allowed by JSONAPI ' + - 'spec (http://jsonapi.org/format).' - ) - # Spec says throw 406 Not Acceptable if Accept header has no - # application/vnd.api+json entry without parameters. - if jsonapi_accepts and\ - 'application/vnd.api+json' not in jsonapi_accepts: - raise HTTPNotAcceptable( - 'application/vnd.api+json must appear with no ' + - 'parameters in Accepts header ' + - '(http://jsonapi.org/format).' - ) - - def check_request_valid_json(request): - """Check that the body of any request is valid JSON. - - Raises: - HTTPBadRequest - """ - if request.content_length: - try: - request.json_body - except ValueError: - raise HTTPBadRequest("Body is not valid JSON.") - - @view_exceptions - @functools.wraps(func) - def view_wrapper(self): - """jsonapi boilerplate function to wrap decorated functions.""" - check_request_headers(self.request, get_jsonapi_accepts(self.request)) - check_request_valid_json(self.request) - - if self.request.content_length and self.api.settings.schema_validation: - # Validate request JSON against the JSONAPI jsonschema - self.api.metadata.JSONSchema.validate(self.request.json_body, self.request.method) - - # Spec says throw BadRequest if any include paths reference non - # existent attributes or relationships. - if self.bad_include_paths: - raise HTTPBadRequest( - "Bad include paths {}".format( - self.bad_include_paths - ) - ) - - # Spec says set Content-Type to application/vnd.api+json. 
- self.request.response.content_type = 'application/vnd.api+json' - - # Extract id and relationship from route, if provided - self.obj_id = self.request.matchdict.get('id', None) - self.relname = self.request.matchdict.get('relationship', None) - - if self.obj_id: - # Try to get the object - try: - self.item = self.single_return( - self.single_item_query(), - 'No id {} in collection {}'.format( - self.obj_id, - self.collection_name - ) - ) - except (sqlalchemy.exc.DataError, sqlalchemy.exc.StatementError): - # DataError is caused by e.g. id (int) = cat - # StatementError is caused by e.g. id (uuid) = 1 - raise HTTPNotFound('Object {} not found in collection {}'.format( - self.obj_id, - self.collection_name - )) - - if self.relname: - # Gather relationship info - mapper = sqlalchemy.inspect(self.model).mapper - try: - self.rel = self.relationships[self.relname] - except KeyError: - raise HTTPNotFound('No relationship {} in collection {}'.format( - self.relname, - self.collection_name - )) - self.rel_class = self.rel.tgt_class - self.rel_view = self.view_instance(self.rel_class) - - # Update the dictionary with the results of the wrapped method. - ret = func(self) # pylint:disable=not-callable - if ret: - # Include a self link unless the method is PATCH. - if self.request.method != 'PATCH': - selfie = {'self': self.request.url} - if hasattr(ret, 'links'): - ret.links.update(selfie) - else: - ret.links = selfie - - # Potentially add some debug information. 
- if self.api.settings.debug_meta: - debug = { - 'accept_header': { - a: None for a in get_jsonapi_accepts(self.request) - }, - 'qinfo_page': - self.collection_query_info(self.request)['_page'], - 'atts': {k: None for k in self.attributes.keys()}, - 'includes': { - k: None for k in self.requested_include_names() - } - } - ret.meta.update({'debug': debug}) - return ret.as_dict() + def get_one(self, query, not_found_message=None): + try: + item = query.one() + except (NoResultFound, sqlalchemy.exc.DataError, sqlalchemy.exc.StatementError): + # NoResultFound is sqlalchemy's native exception if there is no + # such id in the collection. + # DataError is caused by e.g. id (int) = cat + # StatementError is caused by e.g. id (uuid) = 1 + if not_found_message: + raise HTTPNotFound(not_found_message) else: - return {} - return view_wrapper + return None + return item - @jsonapi_view - def get(self): + def get_item(self, _id=None): + """Return the item specified by _id. Will look up id from request if _id is None. + """ + if _id is None: + _id = self.obj_id + return self.get_one( + self.dbsession.query( + self.model + ).options( + load_only(self.key_column.name) + ).filter( + self.key_column == _id + ) + ) + + def get_old(self): """Handle GET request for a single item. Get a single item from the collection, referenced by id. @@ -311,15 +160,9 @@ def get(self): http GET http://localhost:6543/people/1 """ - # We already fetched any item referenced by id while checking - # for existence in the wrapper. We put it into self.item in - # case it was needed. - for callback in self.callbacks['after_get']: - self.item = callback(self, self.item) - return self.item - - @jsonapi_view - def patch(self): + pass + + def patch_old(self): """Handle PATCH request for a single item. Update an existing item from a partially defined representation. 
@@ -392,129 +235,9 @@ def patch(self): } }' Content-Type:application/vnd.api+json """ - try: - data = self.request.json_body['data'] - except KeyError: - raise HTTPBadRequest('data attribute required in PATCHes.') - data_id = data.get('id') - if self.collection_name != data.get('type'): - raise HTTPConflict( - 'JSON type ({}) does not match URL type ({}).'.format( - data.get('type'), self.collection_name - ) - ) - if data_id != self.obj_id: - raise HTTPConflict( - 'JSON id ({}) does not match URL id ({}).'.format( - data_id, self.obj_id - ) - ) - for callback in self.callbacks['before_patch']: - data = callback(self, data) - atts = {} - hybrid_atts = {} - for key, value in data.get('attributes', {}).items(): - if key in self.attributes: - atts[key] = value - elif key in self.hybrid_attributes: - hybrid_atts[key] = value - else: - raise HTTPNotFound( - 'Collection {} has no attribute {}'.format( - self.collection_name, key - ) - ) - atts[self.key_column.name] = self.obj_id - item = self.dbsession.merge(self.model(**atts)) - for att, value in hybrid_atts.items(): - try: - setattr(item, att, value) - except AttributeError: - raise HTTPConflict( - 'Attribute {} is read only.'.format( - att - ) - ) - - rels = data.get('relationships', {}) - for relname, reldict in rels.items(): - try: - rel = self.relationships[relname] - except KeyError: - raise HTTPNotFound( - 'Collection {} has no relationship {}'.format( - self.collection_name, relname - ) - ) - rel_view = self.view_instance(rel.tgt_class) - try: - reldata = reldict['data'] - except KeyError: - raise HTTPBadRequest( - "Relationship '{}' has no 'data' member.".format(relname) - ) - except TypeError: - raise HTTPBadRequest( - "Relationship '{}' is not a dictionary with a data member.".format(relname) - ) - if reldata is None: - setattr(item, relname, None) - elif isinstance(reldata, dict): - if reldata.get('type') != rel_view.collection_name: - raise HTTPConflict( - 'Type {} does not match relationship type 
{}'.format( - reldata.get('type', None), rel_view.collection_name - ) - ) - if reldata.get('id') is None: - raise HTTPBadRequest( - 'An id is required in a resource identifier.' - ) - rel_item = self.dbsession.query( - rel.tgt_class - ).options( - load_only(rel_view.key_column.name) - ).get(reldata['id']) - if not rel_item: - raise HTTPNotFound('{}/{} not found'.format( - rel_view.collection_name, reldata['id'] - )) - setattr(item, relname, rel_item) - elif isinstance(reldata, list): - rel_items = [] - for res_ident in reldata: - rel_item = self.dbsession.query( - rel.tgt_class - ).options( - load_only(rel_view.key_column.name) - ).get(res_ident['id']) - if not rel_item: - raise HTTPNotFound('{}/{} not found'.format( - rel_view.collection_name, res_ident['id'] - )) - rel_items.append(rel_item) - setattr(item, relname, rel_items) - try: - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - raise HTTPConflict(str(exc)) - doc = pyramid_jsonapi.jsonapi.Document() - doc.meta = { - 'updated': { - 'attributes': [ - att for att in itertools.chain(atts, hybrid_atts) - if att != self.key_column.name - ], - 'relationships': [r for r in rels] - } - } - # if an update is successful ... the server - # responds only with top-level meta data - doc.filter_keys = {'meta': {}} - return doc + pass - @jsonapi_view - def delete(self): + def delete_old(self): """Handle DELETE request for single item. Delete the referenced item from the collection. 
@@ -537,24 +260,9 @@ def delete(self): http DELETE http://localhost:6543/people/1 """ + pass - doc = pyramid_jsonapi.jsonapi.Document() - item = self.single_item_query(loadonly=[self.key_column.name]).one() - for callback in self.callbacks['before_delete']: - callback(self, item) - try: - self.dbsession.delete(item) - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - raise HTTPFailedDependency(str(exc)) - doc.update({ - 'data': self.serialise_resource_identifier( - self.obj_id - )}) - return doc - - @jsonapi_view - def collection_get(self): + def collection_get_old(self): """Handle GET requests for the collection. Get a set of items from the collection, possibly matching search/filter @@ -606,33 +314,9 @@ def collection_get(self): http GET http://localhost:6543/people?page[limit]=2&page[offset]=2&sort=-name&include=posts """ - # Set up the query - query = self.dbsession.query( - self.model - ).options( - load_only(*self.allowed_requested_query_columns.keys()) - ) - query = self.query_add_sorting(query) - query = self.query_add_filtering(query) - qinfo = self.collection_query_info(self.request) - try: - count = query.count() - except sqlalchemy.exc.ProgrammingError: - raise HTTPInternalServerError( - 'An error occurred querying the database. Server logs may have details.' - ) - query = query.offset(qinfo['page[offset]']) - query = query.limit(qinfo['page[limit]']) + pass - ret = self.collection_return(query, count=count) - - # Alter return dict with any callbacks. - for callback in self.callbacks['after_collection_get']: - ret = callback(self, ret) - return ret - - @jsonapi_view - def collection_post(self): + def collection_post_old(self): """Handle POST requests for the collection. Create a new object in collection. 
@@ -691,108 +375,9 @@ def collection_post(self): } }' Content-Type:application/vnd.api+json """ - try: - data = self.request.json_body['data'] - except KeyError: - raise HTTPBadRequest('data attribute required in POSTs.') - - if not isinstance(data, dict): - raise HTTPBadRequest('data attribute must contain a single resource object.') - - # Alter data with any callbacks. - for callback in self.callbacks['before_collection_post']: - data = callback(self, data) - - # Check to see if we're allowing client ids - if not self.api.settings.allow_client_ids and 'id' in data: - raise HTTPForbidden('Client generated ids are not supported.') - # Type should be correct or raise 409 Conflict - datatype = data.get('type') - if datatype != self.collection_name: - raise HTTPConflict("Unsupported type '{}'".format(datatype)) - try: - atts = data['attributes'] - except KeyError: - atts = {} - if 'id' in data: - atts[self.model.__pyramid_jsonapi__['id_col_name']] = data['id'] - item = self.model(**atts) - with self.dbsession.no_autoflush: - for relname, reldict in data.get('relationships', {}).items(): - try: - reldata = reldict['data'] - except KeyError: - raise HTTPBadRequest( - 'relationships within POST must have data member' - ) - try: - rel = self.relationships[relname] - except KeyError: - raise HTTPNotFound( - 'No relationship {} in collection {}'.format( - relname, - self.collection_name - ) - ) - rel_type = self.api.view_classes[rel.tgt_class].collection_name - if rel.direction is ONETOMANY or rel.direction is MANYTOMANY: - # reldata should be a list/array - if not isinstance(reldata, Sequence) or isinstance(reldata, str): - raise HTTPBadRequest( - 'Relationship data should be an array for TOMANY relationships.' 
- ) - rel_items = [] - for rel_identifier in reldata: - if rel_identifier.get('type') != rel_type: - raise HTTPConflict( - 'Relationship identifier has type {} and should be {}'.format( - rel_identifier.get('type'), rel_type - ) - ) - try: - rel_items.append(self.dbsession.query(rel.tgt_class).get(rel_identifier['id'])) - except KeyError: - raise HTTPBadRequest( - 'Relationship identifier must have an id member' - ) - setattr(item, relname, rel_items) - else: - if (not isinstance(reldata, dict)) and (reldata is not None): - raise HTTPBadRequest( - 'Relationship data should be a resource identifier object or null.' - ) - if reldata.get('type') != rel_type: - raise HTTPConflict( - 'Relationship identifier has type {} and should be {}'.format( - reldata.get('type'), rel_type - ) - ) - try: - setattr( - item, - relname, - self.dbsession.query(rel.tgt_class).get(reldata['id']) - ) - except KeyError: - raise HTTPBadRequest( - 'No id member in relationship data.' - ) - try: - self.dbsession.add(item) - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - raise HTTPConflict(exc.args[0]) - self.request.response.status_code = 201 - self.request.response.headers['Location'] = self.request.route_url( - self.api.endpoint_data.make_route_name(self.collection_name, suffix='item'), - **{'id': self.id_col(item)} - ) - doc = pyramid_jsonapi.jsonapi.Document() - doc.update({'data': self.serialise_db_item(item, {})}) - return doc + pass - @jsonapi_view - def related_get(self): + def related_get_old(self): """Handle GET requests for related URLs. Get object(s) related to a specified object. 
@@ -848,32 +433,9 @@ def related_get(self): http GET http://localhost:6543/posts/1/author """ - # Set up the query - query = self.related_query(self.obj_id, self.rel) - - if self.rel.direction is ONETOMANY or self.rel.direction is MANYTOMANY: - query = self.rel_view.query_add_sorting(query) - query = self.rel_view.query_add_filtering(query) - qinfo = self.rel_view.collection_query_info(self.request) - try: - count = query.count() - except sqlalchemy.exc.ProgrammingError: - raise HTTPInternalServerError( - 'An error occurred querying the database. Server logs may have details.' - ) - query = query.offset(qinfo['page[offset]']) - query = query.limit(qinfo['page[limit]']) - ret = self.rel_view.collection_return(query, count=count) - else: - ret = self.rel_view.single_return(query) - - # Alter return dict with any callbacks. - for callback in self.callbacks['after_related_get']: - ret = callback(self, ret) - return ret + pass - @jsonapi_view - def relationships_get(self): + def relationships_get_old(self): """Handle GET requests for relationships URLs. Get object identifiers for items referred to by a relationship. @@ -927,36 +489,9 @@ def relationships_get(self): http GET http://localhost:6543/posts/1/relationships/author """ - # Set up the query - query = self.related_query(self.obj_id, self.rel, full_object=False) + pass - if self.rel.direction is ONETOMANY or self.rel.direction is MANYTOMANY: - query = self.rel_view.query_add_sorting(query) - query = self.rel_view.query_add_filtering(query) - qinfo = self.rel_view.collection_query_info(self.request) - try: - count = query.count() - except sqlalchemy.exc.ProgrammingError: - raise HTTPInternalServerError( - 'An error occurred querying the database. Server logs may have details.' 
- ) - query = query.offset(qinfo['page[offset]']) - query = query.limit(qinfo['page[limit]']) - ret = self.rel_view.collection_return( - query, - count=count, - identifiers=True - ) - else: - ret = self.rel_view.single_return(query, identifier=True) - - # Alter return dict with any callbacks. - for callback in self.callbacks['after_relationships_get']: - ret = callback(self, ret) - return ret - - @jsonapi_view - def relationships_post(self): + def relationships_post_old(self): """Handle POST requests for TOMANY relationships. Add the specified member to the relationship. @@ -1010,51 +545,9 @@ def relationships_post(self): { "type": "comments", "id": "1" } ]' Content-Type:application/vnd.api+json """ - if self.rel.direction is MANYTOONE: - raise HTTPForbidden('Cannot POST to TOONE relationship link.') - - # Alter data with any callbacks - data = self.request.json_body['data'] - for callback in self.callbacks['before_relationships_post']: - data = callback(self, data) - - obj = self.dbsession.query(self.model).get(self.obj_id) - items = [] - for resid in data: - if resid['type'] != self.rel_view.collection_name: - raise HTTPConflict( - "Resource identifier type '{}' does not match relationship type '{}'.".format( - resid['type'], self.rel_view.collection_name - ) - ) - try: - newitem = self.dbsession.query(self.rel_class).get(resid['id']) - except sqlalchemy.exc.DataError as exc: - raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) - if newitem is None: - raise HTTPFailedDependency("One or more objects POSTed to this relationship do not exist.") - items.append(newitem) - getattr(obj, self.relname).extend(items) - try: - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - if 'duplicate key value violates unique constraint' in str(exc): - # This happens when using an association proxy if we attempt to - # add an object to the relationship that's already there. We - # want this to be a no-op. 
- pass - else: - raise HTTPFailedDependency(str(exc)) - except sqlalchemy.orm.exc.FlushError as exc: - if str(exc).startswith("Can't flush None value"): - raise HTTPFailedDependency("One or more objects POSTed to this relationship do not exist.") - else: - # Catch-all. Shouldn't reach here. - raise # pragma: no cover - return {} + pass - @jsonapi_view - def relationships_patch(self): + def relationships_patch_old(self): """Handle PATCH requests for relationships (TOMANY or TOONE). Completely replace the relationship membership. @@ -1117,70 +610,9 @@ def relationships_patch(self): { "type": "comments", "id": "2" } ]' Content-Type:application/vnd.api+json """ - # Alter data with any callbacks - data = self.request.json_body['data'] - for callback in self.callbacks['before_relationships_patch']: - data = callback(self, data) - - obj = self.dbsession.query(self.model).get(self.obj_id) - if self.rel.direction is MANYTOONE: - local_col, _ = self.rel.obj.local_remote_pairs[0] - resid = data - if resid is None: - setattr(obj, self.relname, None) - else: - if resid['type'] != self.rel_view.collection_name: - raise HTTPConflict( - "Resource identifier type '{}' does not match relationship type '{}'.".format( - resid['type'], - self.rel_view.collection_name - ) - ) - setattr( - obj, - local_col.name, - resid['id'] - ) - try: - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - raise HTTPFailedDependency( - 'Object {}/{} does not exist.'.format(resid['type'], resid['id']) - ) - except sqlalchemy.exc.DataError as exc: - raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) - return {} - items = [] - for resid in self.request.json_body['data']: - if resid['type'] != self.rel_view.collection_name: - raise HTTPConflict( - "Resource identifier type '{}' does not match relationship type '{}'.".format( - resid['type'], - self.rel_view.collection_name - ) - ) - try: - newitem = self.dbsession.query(self.rel_class).get(resid['id']) - except 
sqlalchemy.exc.DataError as exc: - raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) - if newitem is None: - raise HTTPFailedDependency("One or more objects POSTed to this relationship do not exist.") - items.append(newitem) - setattr(obj, self.relname, items) - try: - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - raise HTTPFailedDependency(str(exc)) - except sqlalchemy.orm.exc.FlushError as exc: - if str(exc).startswith("Can't flush None value"): - raise HTTPFailedDependency("One or more objects PATCHed to this relationship do not exist.") - else: - # Catch-all. Shouldn't reach here. - raise # pragma: no cover - return {} + pass - @jsonapi_view - def relationships_delete(self): + def relationships_delete_old(self): """Handle DELETE requests for TOMANY relationships. Delete the specified member from the relationship. @@ -1234,198 +666,38 @@ def relationships_delete(self): { "type": "comments", "id": "1" } ]' Content-Type:application/vnd.api+json """ - if self.rel.direction is MANYTOONE: - raise HTTPForbidden('Cannot DELETE to TOONE relationship link.') - obj = self.dbsession.query(self.model).get(self.obj_id) - - # Call callbacks - for callback in self.callbacks['before_relationships_delete']: - callback(self, obj) - - for resid in self.request.json_body['data']: - if resid['type'] != self.rel_view.collection_name: - raise HTTPConflict( - "Resource identifier type '{}' does not match relationship type '{}'.".format( - resid['type'], self.rel_view.collection_name - ) - ) - try: - item = self.dbsession.query(self.rel_class).get(resid['id']) - except sqlalchemy.exc.DataError as exc: - raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) - if item is None: - raise HTTPFailedDependency("One or more objects DELETEd from this relationship do not exist.") - try: - getattr(obj, self.relname).remove(item) - except ValueError as exc: - if exc.args[0].endswith('not in list'): - # The item we were asked to remove is not there. 
- pass - else: - raise - try: - self.dbsession.flush() - except sqlalchemy.exc.IntegrityError as exc: - raise HTTPFailedDependency(str(exc)) - return {} - - def single_item_query(self, loadonly=None): - """A query representing the single item referenced by the request. + pass - **URL (matchdict) Parameters** - - **id** (*str*): resource id - - Returns: - sqlalchemy.orm.query.Query: query which will fetch item with id - 'id'. - """ + def base_collection_query(self, loadonly=None): if not loadonly: loadonly = self.allowed_requested_query_columns.keys() - return self.dbsession.query( + query = self.dbsession.query( self.model ).options( load_only(*loadonly) - ).filter( - self.id_col(self.model) == self.obj_id ) + query._entities = [Entity(type=self.model)] + query.__class__ = RQLQuery + return query - def single_return(self, query, not_found_message=None, identifier=False): - """Populate return dictionary for a single item. - - Arguments: - query (sqlalchemy.orm.query.Query): query designed to return one item. - - Keyword Arguments: - not_found_message (str or None): if an item is not found either: - - * raise 404 with ``not_found_message`` if it is a str; - - * or return ``{"data": None}`` if ``not_found_message`` is None. - - identifier: return identifier if True, object if false. - - Returns: - jsonapi.Document: in the form: - - .. parsed-literal:: - - { - "data": { resource object } - - optionally... - "included": [ included objects ] - } - - or - - .. parsed-literal:: - - { resource identifier } - - Raises: - HTTPNotFound: if the item is not found. 
- """ - included = {} - doc = pyramid_jsonapi.jsonapi.Document() - try: - item = query.one() - except NoResultFound: - if not_found_message: - raise HTTPNotFound(not_found_message) - else: - return doc - if identifier: - doc.data = self.serialise_resource_identifier(self.id_col(item)) - else: - doc.data = self.serialise_db_item(item, included) - if self.requested_include_names(): - doc.included = [obj for obj in included.values()] - return doc - - def collection_return(self, query, count=None, identifiers=False): - """Populate return document for collections. - - Arguments: - query (sqlalchemy.orm.query.Query): query designed to return multiple - items. - - Keyword Arguments: - count(int): Number of items the query will return (if known). + def single_item_query(self, obj_id=None, loadonly=None): + """A query representing the single item referenced by id. - identifiers(bool): return identifiers if True, objects if false. + Keyword Args: + obj_id: id of object to be fetched. If None then use the id from + the URL. + loadonly: which attributes to load. If None then all requested + attributes from the URL. Returns: - jsonapi.Document: in the form: - - .. parsed-literal:: - - { - "data": [ resource objects ] - - optionally... - "included": [ included objects ] - } - - or - - .. parsed-literal:: - - [ resource identifiers ] - - Raises: - HTTPBadRequest: If a count was not supplied and an attempt to call - q.count() failed. + sqlalchemy.orm.query.Query: query which will fetch item with id + 'id'. """ - # Get info for query. - qinfo = self.collection_query_info(self.request) - - # Add information to the return dict - doc = pyramid_jsonapi.jsonapi.Document(collection=True) - results = {} - - try: - count = count or query.count() - except sqlalchemy.exc.ProgrammingError: - raise HTTPInternalServerError( - 'An error occurred querying the database. Server logs may have details.' 
- ) - - results['available'] = count - - # Pagination links - doc.links = self.pagination_links( - count=results['available'] + if obj_id is None: + obj_id = self.obj_id + return self.base_collection_query(loadonly=loadonly).filter( + self.id_col(self.model) == obj_id ) - results['limit'] = qinfo['page[limit]'] - results['offset'] = qinfo['page[offset]'] - - # Primary data - try: - if identifiers: - data = [ - self.serialise_resource_identifier(self.id_col(dbitem)) - for dbitem in query.all() - ] - else: - included = {} - data = [ - self.serialise_db_item(dbitem, included) - for dbitem in query.all() - ] - # Included objects - if self.requested_include_names(): - doc.included = [obj for obj in included.values()] - except sqlalchemy.exc.DataError as exc: - raise HTTPBadRequest(str(exc.orig)) - for item in data: - res = pyramid_jsonapi.jsonapi.Resource() - res.update(item) - doc.resources.append(res) - results['returned'] = len(doc.resources) - - doc.meta = {'results': results} - return doc def query_add_sorting(self, query): """Add sorting to query. @@ -1458,16 +730,14 @@ def query_add_sorting(self, query): if main_key == 'id': main_key = self.key_column.name order_att = getattr(self.model, main_key) - # order_att will be a sqlalchemy.orm.properties.ColumnProperty if - # sort_keys[0] is the name of an attribute or a - # sqlalchemy.orm.relationships.RelationshipProperty if sort_keys[0] - # is the name of a relationship. - if hasattr(order_att, 'property') and isinstance(order_att.property, RelationshipProperty): + if main_key in self.relationships: # If order_att is a relationship then we need to add a join to # the query and order_by the sort_keys[1] column of the # relationship's target. The default target column is 'id'. 
+ rel = self.relationships[main_key] + if rel.to_many: + raise HTTPBadRequest(f"Can't sort by TO_MANY relationship {main_key}.") query = query.join(order_att) - rel = order_att.property try: sub_key = sort_keys[1] except IndexError: @@ -1475,7 +745,7 @@ def query_add_sorting(self, query): sub_key = self.view_instance( rel.tgt_class ).key_column.name - order_att = getattr(rel.obj.mapper.entity, sub_key) + order_att = getattr(rel.tgt_class, sub_key) if key_info['ascending']: query = query.order_by(order_att) else: @@ -1579,6 +849,9 @@ def query_add_filtering(self, query): ) query = query.filter(comparator(val)) + for rql in qinfo['_rql_filters']: + query = query.rql(rql) + return query def related_limit(self, relationship): @@ -1596,7 +869,7 @@ def related_limit(self, relationship): Returns: int: paging limit for related resources. """ - limit_comps = ['limit', 'relationships', relationship.obj.key] + limit_comps = ['limit', 'relationships', relationship.name] limit = self.default_limit qinfo = self.collection_query_info(self.request) while limit_comps: @@ -1636,10 +909,14 @@ def association_proxy_query(self, obj_id, rel, full_object=True): proxy = rel.obj.for_class(rel.src_class) query = self. dbsession.query( rel.tgt_class - ).join(proxy.remote_attr).filter( - # I thought the simpler - # proxy.local_attr.contains() should work but it doesn't - proxy.local_attr.property.local_remote_pairs[0][1] == obj_id + ).select_from( + rel.src_class + ).join( + proxy.local_attr + ).join( + proxy.remote_attr + ).filter( + self.id_col(rel.src_class) == obj_id ) if full_object: query = query.options( @@ -1667,62 +944,24 @@ def standard_relationship_query(self, obj_id, relationship, full_object=True): sqlalchemy.orm.query.Query: query which will fetch related object(s). 
""" - rel = relationship.obj - rel_class = rel.mapper.class_ - rel_view = self.view_instance(rel_class) - local_col, rem_col = rel.local_remote_pairs[0] - query = self.dbsession.query(rel_class) - if full_object: - query = query.options( - load_only(*rel_view.allowed_requested_query_columns.keys()) - ) + rel_model = relationship.tgt_class + tables = [ + getattr(col, 'table', None) + for col in relationship.obj.local_remote_pairs[0] + ] + if tables[0] is tables[1]: + model = aliased(self.model) else: - query = query.options(load_only(rel_view.key_column.name)) - if rel.direction is ONETOMANY: - query = query.filter(obj_id == rem_col) - elif rel.direction is MANYTOMANY: - query = query.filter( - obj_id == rel.primaryjoin.right - ).filter( - self.id_col(rel_class) == rel.secondaryjoin.right - ) - elif rel.direction is MANYTOONE: - if rel.primaryjoin.left.table == rel.primaryjoin.right.table: - # This is a self-joined table with a child->parent rel. AKA - # adjacancy list. We need aliasing. - rel_class_alias = aliased(rel_class) - - # Assume a 'Node' model with 'id' and 'parent_id' attributes and - # a relationship 'parent' such that parent_id stores the id of - # this Node's parent. - # - # The parent_id column from the aliased class. - right_alias = getattr(rel_class_alias, rel.primaryjoin.right.key) - # The id column from the aliased class. 
- left_alias = getattr(rel_class_alias, rel.primaryjoin.left.key) - - query = query.join( - rel_class_alias, - # Node.id == Aliased.parent_id - rel.primaryjoin.left == right_alias - ).filter( - # Aliased.id == obj_id - left_alias == obj_id - ) - else: - query = query.filter( - rel.primaryjoin - ).filter( - self.id_col(self.model_from_table(local_col.table)) == obj_id - ) - else: - raise HTTPError('Unknown relationships direction, "{}".'.format( - rel.direction.name - )) - - return query + model = self.model + return self.dbsession.query(rel_model).select_from( + model + ).join( + getattr(model, relationship.name) + ).filter( + self.id_col(model) == obj_id + ) - def related_query(self, obj_id, relationship, full_object=True): + def related_query(self, obj, relationship, full_object=True): """Construct query for related objects. Parameters: @@ -1744,6 +983,10 @@ def related_query(self, obj_id, relationship, full_object=True): sqlalchemy.orm.query.Query: query which will fetch related object(s). """ + if obj is None: + obj_id = None + else: + obj_id = self.id_col(obj) if isinstance(relationship.obj, AssociationProxy): query = self.association_proxy_query( obj_id, relationship, full_object=full_object @@ -1764,14 +1007,13 @@ def object_exists(self, obj_id): bool: True if object exists, False if not. """ try: - item = self.dbsession.query( + return bool(self.dbsession.query( self.model ).options( load_only(self.key_column.name) - ).get(obj_id) + ).filter(self.key_column == obj_id).one_or_none()) except (sqlalchemy.exc.DataError, sqlalchemy.exc.StatementError): - item = False - return bool(item) + return False def mapped_info_from_name(self, name, model=None): """Get the pyramid_jsonapi info dictionary for a mapped object. 
@@ -1787,127 +1029,6 @@ def mapped_info_from_name(self, name, model=None): name ).info.get('pyramid_jsonapi', {}) - def serialise_resource_identifier(self, obj_id): - """Return a resource identifier dictionary for id "obj_id" - - """ - ret = { - 'type': self.collection_name, - 'id': str(obj_id) - } - - for callback in self.callbacks['after_serialise_identifier']: - ret = callback(self, ret) - - return ret - - def serialise_db_item( - self, item, - included, include_path=None, - ): - """Serialise an individual database item to JSON-API. - - Arguments: - item: item to serialise. - - Keyword Arguments: - included (dict): dictionary to be filled with included resource - objects. - include_path (list): list tracking current include path for - recursive calls. - - Returns: - jsonapi.Resource: - """ - - include_path = include_path or [] - - # Item's id and type are required at the top level of json-api - # objects. - # The item's id. - item_id = self.id_col(item) - # JSON API type. - item_url = self.request.route_url( - self.api.endpoint_data.make_route_name(self.collection_name, suffix='item'), - **{'id': item_id} - ) - - resource_json = pyramid_jsonapi.jsonapi.Resource(self) - resource_json.id = str(item_id) - resource_json.attributes = { - key: getattr(item, key) - for key in self.requested_attributes.keys() - if self.mapped_info_from_name(key).get('visible', True) - } - resource_json.links = {'self': item_url} - - rels = {} - for key, rel in self.relationships.items(): - is_included = False - if '.'.join(include_path + [key]) in self.requested_include_names(): - is_included = True - if key not in self.requested_relationships and not is_included: - continue - if not self.mapped_info_from_name(key).get('visible', True): - continue - - rel_dict = { - 'data': None, - 'links': { - 'self': '{}/relationships/{}'.format(item_url, key), - 'related': '{}/{}'.format(item_url, key) - }, - 'meta': { - 'direction': rel.direction.name, - 'results': {} - } - } - rel_view = 
self.view_instance(rel.tgt_class) - - query = self.related_query(item_id, rel, full_object=is_included) - - many = rel.direction is ONETOMANY or rel.direction is MANYTOMANY - if many: - limit = self.related_limit(rel) - rel_dict['meta']['results']['limit'] = limit - query = query.limit(limit) - - data = [] - ritems = query.all() - if not many and len(ritems) > 1: - raise HTTPInternalServerError("Multiple results for TOONE relationship.") - - for ritem in ritems: - data.append( - rel_view.serialise_resource_identifier( - self.id_col(ritem) - ) - ) - if is_included: - included[ - (rel_view.collection_name, self.id_col(ritem)) - ] = rel_view.serialise_db_item( - ritem, - included, include_path + [key] - ) - if many: - rel_dict['meta']['results']['available'] = query.count() - rel_dict['meta']['results']['returned'] = len(data) - rel_dict['data'] = data - else: - if data: - rel_dict['data'] = data[0] - - if key in self.requested_relationships: - rels[key] = rel_dict - - resource_json.relationships = rels - - for callback in self.callbacks['after_serialise_object']: - callback(self, resource_json) - - return resource_json.as_dict() - @classmethod @functools.lru_cache() def collection_query_info(cls, request): @@ -1953,7 +1074,11 @@ def collection_query_info(cls, request): cls.max_limit, int(request.params.get('page[limit]', cls.default_limit)) ) + if info['page[limit]'] < 0: + raise HTTPBadRequest('page[limit] must not be negative.') info['page[offset]'] = int(request.params.get('page[offset]', 0)) + if info['page[offset]'] < 0: + raise HTTPBadRequest('page[offset] must not be negative.') # Sorting. # Use param 'sort' as per spec. @@ -1981,6 +1106,7 @@ def collection_query_info(cls, request): # Find all parametrised parameters ( :) ) info['_filters'] = {} + info['_rql_filters'] = [] info['_page'] = {} for param in request.params.keys(): match = re.match(r'(.*?)\[(.*?)\]', param) @@ -2005,23 +1131,31 @@ def collection_query_info(cls, request): # # Find all the filters. 
if match.group(1) == 'filter': - colspec = match.group(2) - operator = 'eq' - try: - colspec, operator = colspec.split(':') - except ValueError: - pass - colspec = colspec.split('.') - info['_filters'][param] = { - 'colspec': colspec, - 'op': operator, - 'value': val - } + if match.group(2) == '*rql': + info['_rql_filters'].append(val) + else: + colspec = match.group(2) + operator = 'eq' + try: + colspec, operator = colspec.split(':') + except ValueError: + pass + colspec = colspec.split('.') + info['_filters'][param] = { + 'colspec': colspec, + 'op': operator, + 'value': val + } # Paging. elif match.group(1) == 'page': info['_page'][match.group(2)] = val + # Options. + info['pj_include_count'] = asbool( + request.params.get('pj_include_count', 'false') + ) + return info def pagination_links(self, count=0): @@ -2140,9 +1274,7 @@ def requested_attributes(self): } """ return { - k: v for k, v in itertools.chain( - self.attributes.items(), self.hybrid_attributes.items() - ) + k: v for k, v in self.all_attributes.items() if k in self.requested_field_names } @@ -2251,6 +1383,19 @@ def requested_include_names(self): inc.add('.'.join(curname)) return inc + # @functools.lru_cache() + def path_is_included(self, path): + """Test if path is in requested includes. + + Args: + path (list): list representation if include path to test. + + Returns: + bool: True if path is in requested includes. + + """ + return '.'.join(path) in self.requested_include_names() + @property def bad_include_paths(self): """Return a set of invalid 'include' parameters. @@ -2295,93 +1440,172 @@ def view_instance(self, model): Returns: class: subclass of CollectionViewBase providing view for ``model``. 
""" - return self.api.view_classes[model](self.request) + view_instance = self.api.view_classes[model](self.request) + try: + view_instance.pj_shared = self.pj_shared + except AttributeError: + pass + return view_instance @classmethod - def append_callback_set(cls, set_name): - """Append a named set of callbacks from ``callback_sets``. + def _add_stage_handler( + cls, view_method, stage_name, hfunc, + add_after='end', + add_existing=False, + ): + ''' + Add a stage handler to a stage of a view method. + ''' + vm_func = getattr(cls, view_method) + try: + stage = vm_func.stages[stage_name] + except KeyError: + raise KeyError( + f'Endpoint {view_method} has no stage {stage_name}.' + ) + try: + index = stage.index(hfunc) + except ValueError: + index = False + if index and not add_existing: + return + if add_after == 'start': + stage.appendleft(hfunc) + elif add_after == 'end': + stage.append(hfunc) + else: + stage.insert(stage.index(add_after) + 1, hfunc) - Args: - set_name (str): key in ``callback_sets``. - """ - for cb_name, callback in cls.callback_sets[set_name].items(): - cls.callbacks[cb_name].append(callback) + @classmethod + def add_stage_handler( + cls, methods, stages, hfunc, + add_after='end', + add_existing=False, + ): + ''' + Add a stage handler to stages of view methods. - def acso_after_serialise_object(view, obj): # pylint:disable=no-self-argument - """Standard callback altering object to take account of permissions. + Arguments: + methods: an iterable of view method names (``get``, + ``collection_get`` etc.). + stages: an iterable of stage names. + hfunc: the handler function. + add_existing: If True, add this handler even if it exists in the + deque. + add_after: 'start', 'end', or an existing function. + ''' + for vm_name in methods: + vm_func = getattr(cls, vm_name) + for stage_name in stages: + cls._add_stage_handler( + vm_name, stage_name, hfunc, add_after, add_existing, + ) - Args: - obj (dict): the object immediately after serialisation. 
+ @staticmethod + def true_filter(*args, **kwargs): + return True - Returns: - dict: the object, possibly with some fields removed, or meta - information indicating permission was denied to the whole object. - """ - if view.allowed_object(obj): - # Remove any forbidden fields that have been added by other - # callbacks. Those from the model won't have been added in the first - # place. - - # Keep track so we can tell the caller which ones were forbidden. - forbidden = set() - for attr in ('attributes', 'relationships'): - if hasattr(obj, attr): - new = {} - for name, val in getattr(obj, attr).items(): - if name in view.allowed_fields: - new[name] = val - else: - forbidden.add(name) - setattr(obj, attr, new) - # Now add all the forbidden fields from the model to the forbidden - # list. They don't need to be removed from the serialised object - # because they should not have been added in the first place. - for field in view.requested_field_names: - if field not in view.allowed_fields: - forbidden.add(field) - if not hasattr(obj, 'meta'): - obj.meta = {} - obj.meta['forbidden_fields'] = list(forbidden) - else: - obj.meta = { - 'errors': [ - { - 'code': 403, - 'title': 'Forbidden', - 'detail': 'No permission to view {}/{}.'.format( - obj.type, obj.id - ) - } - ] + def permission_to_dict(self, permission): + if permission is False: + return { + 'id': False, + 'attributes': set(), + 'relationships': set() } - return obj - - def acso_after_get(view, ret): # pylint:disable=unused-argument, no-self-argument, no-self-use - """Standard callback throwing 403 (Forbidden) based on information in meta. - - Args: - ret (jsonapi.Document): object which would have been returned from get(). + if permission is True: + allowed = { + 'id': True, + 'attributes': set(self.all_attributes), + 'relationships': set(self.relationships), + } + else: + # Make a shallow copy of permission so we don't alter it. 
+        allowed = dict()
+        allowed.update(permission)
+        # allowed should now be a dictionary.
+        if 'id' not in allowed:
+            allowed['id'] = True
+        if allowed['attributes'] is True:
+            allowed['attributes'] = set(self.all_attributes)
+        if allowed['attributes'] is False:
+            allowed['attributes'] = set()
+        if allowed['relationships'] is True:
+            allowed['relationships'] = set(self.relationships)
+        if allowed['relationships'] is False:
+            allowed['relationships'] = set()
+        return allowed
 
-        Returns:
-            jsonapi.Document: the same object if an error has not been raised.
+    @classmethod
+    def register_permission_filter(cls, permissions, stages, pfunc):
+        # Permission filters should have the signature:
+        # pfunc(object_rep, view, stage, permission)
+
+        # expand out 'read' and 'write' http method sets into individual
+        # permissions.
+        perms = []
+        for p in permissions:
+            if p == 'read' or p == 'write':
+                perms.extend(cls.api.endpoint_data.endpoints['method_sets'][p])
+            else:
+                perms.append(p)
+        cls.api.enable_permission_handlers(perms, stages)
+        for stage_name in stages:
+            for perm in perms:
+                perm = perm.lower()
+
+                # Register the filter function (bind stage_name and perm now).
+                def dictified_pfunc(view, object_rep, stage_name=stage_name, perm=perm):
+                    return view.permission_to_dict(
+                        pfunc(
+                            object_rep,
+                            view=view,
+                            stage=stage_name,
+                            permission=perm,
+                        )
+                    )
+                cls.permission_filters[perm][stage_name] = dictified_pfunc
 
-        Raises:
-            HTTPForbidden
+    def permission_filter(self, permission, stage_name, default=None):
         """
-        obj = ret
-        errors = []
+        Find the permission filter given a permission and stage name.
+        """
+        default = default or (lambda *a, **kw: True)
         try:
-            errors = obj.meta['errors']
-        except KeyError:
-            return ret
-        for error in errors:
-            if error['code'] == 403:
-                raise HTTPForbidden(error['detail'])
-        return ret
+            pfilter = self.permission_filters[permission][stage_name]
+        except KeyError:
+            pfilter = lambda view, object_rep: view.permission_to_dict(
+                default(
+                    object_rep,
+                    view_instance=view,
+                    stage_name=stage_name,
+                    permission_sought=permission,
+                )
+            )
+        return partial(pfilter, self)
 
-    callback_sets = {
-        'access_control_serialised_objects': {
-            'after_serialise_object': acso_after_serialise_object,
-            'after_get': acso_after_get
-        }
-    }
+    @classmethod
+    def permission_handler(cls, endpoint_name, stage_name):
+        # Look for the most specific permission handler first: see if one is
+        # defined by the workflow method module (wf_kind_endpoint).
+        wf_kind_endpoint = importlib.import_module(
+            getattr(cls.api.settings, 'workflow_{}'.format(endpoint_name))
+        )
+        try:
+            return wf_kind_endpoint.permission_handler(stage_name)
+        except (KeyError, AttributeError):
+            # Either no permission_handler (AttributeError) or it doesn't handle
+            # method_name or stage_name (KeyError). Either way look for a
+            # handler in the wf_kind package.
+            wf_kind = importlib.import_module(wf_kind_endpoint.__package__)
+            # Last part after the underscore of the endpoint name should be the
+            # HTTP method/verb.
+            try:
+                return wf_kind.permission_handler(endpoint_name, stage_name)
+            except (KeyError, AttributeError):
+                # Use generic workflow module if it handles this stage.
+                try:
+                    return wf.permission_handler(endpoint_name, stage_name)
+                except KeyError:
+                    # This method and stage is completely unhandled. Return a
+                    # handler that effectively does nothing.
+ return lambda view, arg, pdata: arg diff --git a/pyramid_jsonapi/endpoints.py b/pyramid_jsonapi/endpoints.py index 9abcacd5..8e3e2251 100644 --- a/pyramid_jsonapi/endpoints.py +++ b/pyramid_jsonapi/endpoints.py @@ -1,6 +1,6 @@ """Classes to store and manipulate endpoints and routes.""" -from functools import partial +from functools import partial, lru_cache from pyramid.httpexceptions import ( HTTPBadRequest, HTTPCreated, @@ -92,6 +92,10 @@ def __init__(self, api): # Mandatory 'http_method' keys: function # Optional 'http_method' keys: renderer self.endpoints = { + 'method_sets': { + 'read': {'GET'}, + 'write': {'POST', 'PATCH', 'DELETE'}, + }, 'query_parameters': { 'fields': '', 'filter': '', @@ -106,6 +110,7 @@ def __init__(self, api): 'If the server does not support sorting as specified in the query parameter sort, it MUST return 400 Bad Request.', 'If an endpoint does not support the include parameter, it MUST respond with 400 Bad Request to any requests that include it.', 'If the request content is malformed in some way.']}, + HTTPForbidden: {'reason': ['The authenticated user is not allowed to access the resource in this way.']}, }, 'endpoints': { 'collection': { @@ -310,3 +315,45 @@ def find_all_keys(self, name, ep_type, method): # method instance if name in self.endpoints['endpoints'][ep_type]['http_methods'][method.upper()]: yield self.endpoints['endpoints'][ep_type]['http_methods'][method.upper()][name] + + def get_function_name(self, view, http_method, route_pattern): + """Find the name of the function which handles the given route and method. 
+ """ + for endpoint, endpoint_opts in self.endpoints['endpoints'].items(): + ep_route_pattern = self.rp_constructor.api_pattern( + view.collection_name, + self.route_pattern_to_suffix( + endpoint_opts.get('route_pattern', {}) + ) + ) + if ep_route_pattern == route_pattern: + return endpoint_opts['http_methods'][http_method.upper()]['function'] + raise Exception( + 'No endpoint function found for {}, {}, {},'.format( + view.collection_name, + http_method, + route_pattern, + ) + ) + + @property + @lru_cache(maxsize=128) + def http_to_view_methods(self): + ep_map = {} + for ep_type in self.endpoints['endpoints'].values(): + for http_name, data in ep_type['http_methods'].items(): + http_name = http_name.lower() + try: + view_methods = ep_map[http_name] + except KeyError: + view_methods = ep_map[http_name] = set() + view_methods.add(data['function']) + ep_map['write'] = set() + for m in map(str.lower, self.endpoints['method_sets']['write']): + ep_map['write'] |= ep_map[m] + ep_map['read'] = set() + for m in map(str.lower, self.endpoints['method_sets']['read']): + ep_map['read'] |= ep_map[m] + ep_map['all'] = ep_map['read'] | ep_map['write'] + + return ep_map diff --git a/pyramid_jsonapi/jsonapi.py b/pyramid_jsonapi/jsonapi.py deleted file mode 100644 index d8c4ed06..00000000 --- a/pyramid_jsonapi/jsonapi.py +++ /dev/null @@ -1,164 +0,0 @@ -"""A series of classes which store jsonapi in an internal dictionary, -and provide access through class attributes and helper methods.""" - - -class Base(): - """Common class for magic attr <-> dict classes.""" - - def __init__(self): - """Create initial 'real' attributes. - - Attributes: - _jsonapi (dict): JSONAPI document object. - resources (list): List of associated Resource objects. 
- """ - # We override __setattr__ so must use the 'original' to create new attrs - # Value modification is allowed (update, pop etc) but not replacement - super().__setattr__('_jsonapi', {}) - super().__setattr__('resources', []) - - def __setattr__(self, attr, value): - """If attr is a key in _jsonapi, set it's value. - Otherwise, set a class attribute as usual. - """ - - if attr == 'data': - self.data_to_resources(value) - elif attr in self._jsonapi: - self._jsonapi[attr] = value - else: - super().__setattr__(attr, value) - - def __getattr__(self, attr): - """Return dict key as if an attribute.""" - if attr == 'data': - return self.data_from_resources() - try: - return self._jsonapi[attr] - # Convert KeyError to AttributeError for consistency - except KeyError: - raise AttributeError("object has no attribute '{}'".format(attr)) - - def as_dict(self): - """Generate a dictionary representing the entire JSONAPI object. - Update 'data' to contain a single resource item, or list of items. - - Returns: - dict: JSONAPI object. - """ - - jsonapi_dict = self._jsonapi.copy() - if hasattr(self, 'data_from_resources'): - resources = self.data_from_resources() - if resources: - jsonapi_dict.update(resources) - - # Only return keys which are in filter_keys - return {k: v for k, v in jsonapi_dict.items() if k in self.filter_keys} - - def update(self, doc): - """Update class from JSONAPI document. - 'data' keys will be converted to Resource objects and added to - 'resources' attribute. - - Args: - doc (dict): JSONAPI object. - """ - if doc: - for key, val in doc.items(): - # data contains a single resources, or list of resources - # Convert to Resource objects, then add to self.resources - if key == "data": - self.data_to_resources(val) - else: - self._jsonapi[key] = val - - -class Document(Base): - """JSONAPI 'root' document object.""" - - def __init__(self, collection=False): - """Extend _jsonapi to contain top-level keys. 
- - Args: - collection (bool): Is this document a collection, or a single item? - filter_keys (dict): Keys to be returned when as_dict() is called. - """ - super().__init__() - self.collection = collection - # filter_keys controls which keys are included in as_dict() output - # It's also used to build the internal dictionary - self.filter_keys = { - 'data': [], - 'included': [], - 'links': {}, - 'meta': {}, - } - self._jsonapi.update(self.filter_keys) - - def data_from_resources(self): - """Generate 'data' part of jsonapi document from resources list. - - Returns: - dict, list or None: JSONAPI 'data' resource element. - """ - data = [] - for resource in self.resources: - data.append(resource.as_dict()) - - if data and self.collection: - return {'data': data} - elif data: - return {'data': data[0]} - elif self.collection: - return {'data': []} - - return {'data': None} - - def data_to_resources(self, data): - """Convert 'data' part of jsonapi document to resource(s). - Add resources to the resources list. - - Args: - data (dict): JSONAPI 'data' resource element. - """ - reslist = [] - if isinstance(data, list): - reslist.extend(data) - else: - reslist.append(data) - for item in reslist: - res = Resource() - res.update(item) - self.resources.append(res) - - -class Resource(Base): - """JSONAPI Resource object.""" - - def __init__(self, view_class=None): - """Extend _jsonapi to contain resource keys. - If view_class provided, update attributes. - - Args: - view_class (jsonapi.view_class): Resource data extracted from the view_class - - Attributes: - filter_keys (dict): Keys to be returned when as_dict() is called. 
- """ - super().__init__() - self.filter_keys = { - 'id': "", - 'type': "", - 'attributes': {}, - 'links': {}, - 'related': {}, - 'relationships': {}, - 'meta': {}, - } - self._jsonapi.update(self.filter_keys) - - # Update class attributes with sourced data - if view_class: - self.type = view_class.collection_name - self.attributes = dict.fromkeys(view_class.attributes, None) diff --git a/pyramid_jsonapi/metadata/JSONSchema/__init__.py b/pyramid_jsonapi/metadata/JSONSchema/__init__.py index 1d059614..e82f7a53 100644 --- a/pyramid_jsonapi/metadata/JSONSchema/__init__.py +++ b/pyramid_jsonapi/metadata/JSONSchema/__init__.py @@ -289,6 +289,9 @@ def validate(self, json_body, method='get'): jsonschema.validate(json_body, schm) except (jsonschema.exceptions.ValidationError) as exc: raise HTTPBadRequest(str(exc)) + except Exception as exc: + from pyramid.httpexceptions import HTTPInternalServerError + raise HTTPInternalServerError(str(exc)) def build_definitions(self): """Build data and attribute references for all endpoints, diff --git a/pyramid_jsonapi/workflow/__init__.py b/pyramid_jsonapi/workflow/__init__.py new file mode 100644 index 00000000..551e427c --- /dev/null +++ b/pyramid_jsonapi/workflow/__init__.py @@ -0,0 +1,1256 @@ +import functools +import importlib +import json +import logging +import re +import sqlalchemy + +from collections import ( + deque, + abc, +) + +from functools import ( + partial, + partialmethod +) + +from sqlalchemy.orm import load_only +from sqlalchemy.orm.interfaces import ( + ONETOMANY, + MANYTOMANY, + MANYTOONE +) + +from pyramid.httpexceptions import ( + HTTPNotFound, + HTTPForbidden, + HTTPBadRequest, + HTTPConflict, + HTTPUnsupportedMediaType, + HTTPNotAcceptable, + HTTPError, + HTTPFailedDependency, + HTTPInternalServerError, + status_map, +) +from sqlalchemy.orm.relationships import RelationshipProperty + + +import pyramid_jsonapi + + +def make_method(name, api): + settings = api.settings + wf_module = importlib.import_module( + 
getattr(settings, 'workflow_{}'.format(name)) + ) + + # Set up the stages. + stages = { + '_view_method_name': name, + 'validate_request': deque(), + 'alter_request': deque(), + 'alter_document': deque(), + 'validate_response': deque() + } + # stage_order = ['alter_request', 'validate_request', ] + for stage_name in wf_module.stages: + stages[stage_name] = deque() + # stage_order.append(stage_name) + # stage_order.append('alter_results') + # stage_order.append('validate_response') + stages['validate_request'].append(sh_validate_request_headers) + stages['validate_request'].append(sh_validate_request_valid_json) + stages['validate_request'].append(sh_validate_request_common_validity) + stages['validate_request'].append(sh_validate_request_object_exists) + stages['alter_request'].append(sh_alter_request_add_info) + stages['alter_document'].append(sh_alter_document_self_link) + if name.endswith('get'): + stages['alter_document'].append(sh_alter_document_add_returned_count) + if api.settings.debug_meta: + stages['alter_document'].append(sh_alter_document_debug_info) + + # Stack the deques. + for stage_name, stage_deque in stages.items(): + try: + item = getattr(wf_module, 'stage_' + stage_name) + except AttributeError: + # If there isn't one, move on. + continue + if callable(item): + # If item is callable just append it. + stage_deque.append(item) + else: + # item should be an iterable of callables. Append them. + for handler in item: + stage_deque.append(handler) + + # Build a set of expected responses. 
+ ep_dict = api.endpoint_data.endpoints + parts = name.split('_') + if len(parts) == 1: + endpoint = 'item' + http_method = parts[0].upper() + else: + endpoint, http_method = parts + http_method = http_method.upper() + responses = set( + ep_dict['responses'].keys() | + ep_dict['endpoints'][endpoint]['responses'].keys() | + ep_dict['endpoints'][endpoint]['http_methods'][http_method]['responses'].keys() + ) + + def method(view): + view.pj_shared = SharedState(view) + try: + # execute_stage(view, stages, 'alter_request') + request = execute_stage( + view, stages, 'alter_request', view.request + ) + request = execute_stage( + view, stages, 'validate_request', request + ) + view.request = request + document = wf_module.workflow(view, stages) + document = execute_stage( + view, stages, 'alter_document', document + ) + ret = execute_stage( + view, stages, 'validate_response', document + ) + except Exception as exc: + if exc.__class__ not in responses: + logging.exception( + "Invalid exception raised: %s for route_name: %s path: %s", + exc.__class__, + view.request.matched_route.name, + view.request.current_route_path() + ) + if isinstance(exc, HTTPError): + if 400 <= int(exc.code) < 500: # pylint:disable=no-member + raise HTTPBadRequest("Unexpected client error: {}".format(exc)) + else: + raise HTTPInternalServerError("Unexpected server error.") + raise + + # Log any responses that were not expected. + response_class = status_map[view.request.response.status_code] + if response_class not in responses: + logging.error( + "Invalid response: %s for route_name: %s path: %s", + response_class, + view.request.matched_route.name, + view.request.current_route_path() + ) + return ret + + # Make stages available as an attribute of the method. + method.stages = stages + + return method + + +def wrapped_query_all(query): + """ + Wrap query.all() so that SQLAlchemy exceptions can be transformed to + http ones. 
+ """ + try: + for obj in query.all(): + yield obj + except sqlalchemy.exc.DataError as exc: + raise HTTPBadRequest(str(exc.orig)) + + +def follow_rel(view, rel_name, include_path=None): + """ + True if rel_name should be followed and added. + """ + include_path = include_path or [] + rel_include_path = include_path + [rel_name] + if rel_name not in view.requested_relationships and not view.path_is_included(rel_include_path): + return False + if not view.mapped_info_from_name(rel_name).get('visible', True): + return False + return True + + +def partition(items, predicate=bool): + trues, falses = [], [] + for item in items: + if predicate(item): + trues.append(item) + else: + falses.append(item) + return (trues, falses) + + +def execute_stage(view, stages, stage_name, arg): + for handler in stages[stage_name]: + arg = handler( + # view, arg, None + arg, view, + stage=stage_name, + view_method=stages['_view_method_name'], + ) + return arg + + +def partition_doc_data(doc_data, partitioner): + if partitioner is None: + return doc_data, [] + accepted, rejected = [], [] + for item in doc_data: + if partitioner(item, doc_data): + accepted.append(item) + else: + rejected.append(item) + return accepted, rejected + + +def shp_get_alter_document(doc, view, stage, view_method): + data = doc['data'] + # Make it so that the data part is always a list for later code DRYness. + # We'll put it back the way it was later. Honest ;-). + if isinstance(data, list): + many = True + else: + data = [data] + many = False + + # Find the top level filter function to run over data. + try: + data_filter = partial( + view.permission_filter('get', 'alter_document'), + permission_sought='get', + stage_name='alter_document', + view_instance=view, + ) + except KeyError: + data_filter = None + + # Remember what was rejected so it can be removed from included later. + rejected_set = set() + accepted, rejected = partition_doc_data(data, data_filter) + + # Filter any related items. 
+ for item in data: + for rel_name, rel_dict in item.get('relationships', {}).items(): + rel_data = rel_dict['data'] + if isinstance(rel_data, list): + rel_many = True + else: + rel_data = [rel_data] + rel_many = False + rel_view = view.view_instance(view.relationships[rel_name].tgt_class) + try: + rel_filter = partial( + rel_view.permission_filter('get', 'alter_document'), + permission_sought='get', + stage_name='alter_document', + view_instance=view, + ) + except KeyError: + rel_filter = None + rel_accepted, rel_rejected = partition_doc_data(rel_data, rel_filter) + rejected_set |= {(item['type'], item['id']) for item in rel_rejected} + if rel_many: + rel_dict['data'] = rel_accepted + else: + try: + rel_dict['data'] = rel_accepted[0] + except IndexError: + rel_dict['data'] = None + + # Time to do what we promised and put scalars back. + if many: + doc['data'] = accepted + else: + try: + doc['data'] = accepted[0] + except IndexError: + if rejected: + raise(HTTPNotFound('Object not found.')) + else: + doc['data'] = None + + # Remove any rejected items from included. + included = [ + item for item in doc.get('included', {}) + if (item['type'], item['id']) not in rejected_set + ] + doc['included'] = included + return doc + + +def shp_collection_post_alter_request(request, view, stage, view_method): + # Make sure there is a permission filter registered. + try: + pfilter = view.permission_filter('post', 'alter_request') + except KeyError: + return request + + obj_data = request.json_body['data'] + allowed = pfilter(obj_data) + if allowed.get('id', True) is False: + # Straight up forbidden to create object. + raise HTTPForbidden('No permission to POST object:\n\n{}'.format(request.json_body['data'])) + for att_name in list(obj_data.get('attributes', {}).keys()): + if att_name not in allowed['attributes']: + del(obj_data['attributes'][att_name]) + # TODO: alternatively raise HTTPForbidden? 
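The attribute-pruning pattern used in `shp_collection_post_alter_request` above can be shown standalone. This is an illustrative sketch with made-up `obj_data` and `allowed` values, not part of the library's API:

```python
# Illustrative data: a JSON:API resource object and the kind of 'allowed'
# dict a permission filter might return (hypothetical values).
obj_data = {
    'type': 'people',
    'attributes': {'name': 'alice', 'salary': 100000},
}
allowed = {'id': True, 'attributes': {'name'}, 'relationships': set()}

# Iterate over a snapshot of the keys so deleting entries mid-loop is safe.
for att_name in list(obj_data.get('attributes', {}).keys()):
    if att_name not in allowed['attributes']:
        del obj_data['attributes'][att_name]

# Only the permitted attribute survives.
```

Taking a `list()` of the keys first is what allows the loop to delete entries from the dict it is iterating over.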
+ rel_names = list(obj_data.get('relationships', {}).keys()) + for rel_name in rel_names: + if rel_name not in allowed['relationships']: + del(obj_data['relationships'][rel_name]) + # TODO: alternatively raise HTTPForbidden? + # Loop through a shallow copy of obj_data['relationships'] so that we + # can delete entries without causing problems. + rels_copy = {} + rels_copy.update(obj_data.get('relationships', {})) + for rel_name, rel_dict in rels_copy.items(): + # For each of these allowed rels, look to see if the *other end* + # of the relationship is to_one (in which case we need PATCH permission + # to that rel in order to set it to this object) or to_many (in which + # case we need POST permission in order to add this object to it). + rel = view.relationships[rel_name] + mirror_rel = rel.mirror_relationship + # if rel.direction in (ONETOMANY, MANYTOMANY): + # tgt_ris = rel_dict['data'] + # else: + # tgt_ris = [rel_dict['data']] + if mirror_rel: + mirror_view = mirror_rel.view_class(view.request) + if mirror_rel.direction in (ONETOMANY, MANYTOMANY): + # Need POST permission on tgt_ri.mirror_rel. + permission_sought = 'post' + else: + # Need PATCH permission on tgt_ri.mirror_rel. + permission_sought = 'patch' + try: + mfilter = mirror_view.permission_filter( + permission_sought, 'alter_request' + ) + except KeyError: + # No filter registered - treat as always True and skip this + # rel. + continue + if rel.direction in (ONETOMANY, MANYTOMANY): + allowed_ris = [] + for tgt_ri in rel_dict['data']: + mallowed = mfilter(tgt_ri) + if mirror_rel.name in mallowed['relationships']: + allowed_ris.append(tgt_ri) + # TODO: alternatively raise HTTPForbidden? + rel_dict['data'] = allowed_ris + else: + mallowed = mfilter(rel_dict['data']) + if mirror_rel.name not in mallowed['relationships']: + del(obj_data['relationships'][rel_name]) + # TODO: alternatively raise HTTPForbidden? 
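The to-one/to-many decision described in the comments above (PATCH permission for a to-one mirror, POST for a to-many one) reduces to a small mapping. A hypothetical standalone sketch, using plain strings in place of SQLAlchemy's `ONETOMANY`/`MANYTOMANY`/`MANYTOONE` direction symbols:

```python
def permission_needed(mirror_direction):
    """Permission required on the mirror end of a relationship.

    Adding this object to a to-many mirror is a POST on that
    relationship; replacing a to-one mirror is a PATCH.
    """
    if mirror_direction in ('ONETOMANY', 'MANYTOMANY'):
        return 'post'
    return 'patch'
```

Here `permission_needed('MANYTOONE')` gives `'patch'`, matching the `permission_sought` branch in the code above.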
+
+ request.body = json.dumps({'data': obj_data}).encode()
+ return request
+
+
+def shp_relationships_post_alter_request(request, view, stage, view_method):
+ # Make sure there is a permission filter registered.
+ try:
+ pfilter = view.permission_filter('post', 'alter_request')
+ except KeyError:
+ return request
+
+ # First we need permission to POST to obj.rel.
+ obj_perms = pfilter({'type': view.collection_name, 'id': view.obj_id})
+ if view.rel.name not in obj_perms['relationships']:
+ request.body = json.dumps({'data': []}).encode()
+ return request
+
+ # We may also need permission to alter the mirror (other) end of the
+ # relationship.
+ mirror_rel = view.rel.mirror_relationship
+ mirror_view = mirror_rel.view_class(request)
+ try:
+ del_filter = view.permission_filter('delete', 'alter_request')
+ except KeyError:
+ del_filter = False
+ try:
+ m_patch_filter = mirror_view.permission_filter('patch', 'alter_request')
+ except KeyError:
+ m_patch_filter = False
+ try:
+ m_post_filter = mirror_view.permission_filter('post', 'alter_request')
+ except KeyError:
+ m_post_filter = False
+ new_data = []
+ for ri in request.json_body['data']:
+ # adding = True
+ if mirror_rel and (del_filter or m_patch_filter):
+ # Check permission to alter other end of relationship.
+ related = view.dbsession.query(
+ view.rel.tgt_class
+ ).filter(mirror_view.key_column == ri['id']).one_or_none()
+ if related:
+ related_resource = ResultObject(mirror_view, related)
+ else:
+ # Trying to add a related id that doesn't exist. This
+ # is an error but we leave it to be caught later in the
+ # standard place. In the meantime there is no point in
+ # checking for permission.
+ new_data.append(ri)
+ continue
+ if view.rel.direction is ONETOMANY:
+ # Need DELETE on related.mirror.old_resource.rel
+ old_resource = ResultObject(
+ view,
+ getattr(related, mirror_rel.name)
+ )
+ old_dict = old_resource.to_dict()
+ if (
+ del_filter and old_dict and
+ view.rel.name not in del_filter(old_dict)['relationships']
+ ):
+ # Not allowed to alter this relationship.
+ # print(f'No DELETE on {old_dict["type"]}/{old_dict["id"]}.{view.rel.name}') + continue + # Need PATCH on related.mirror + perms = m_patch_filter(related_resource.to_dict()) + if mirror_rel.name not in perms['relationships']: + # print(f'No PATCH on {related_resource.to_dict()}') + continue + else: + # MANYTOMANY + # Need POST on related.mirror + perms = m_post_filter(related_resource.to_dict()) + if mirror_rel.name not in perms['relationships']: + continue + # if adding: + new_data.append(ri) + # TODO: option to select alternate behaviour + # if len(new_data) != len(json_body['data']): + # raise HTTPForbidden( + # 'No permission to POST {} to relationship {}.'.format( + # item, view.relname + # ) + # ) + request.body = json.dumps({'data': new_data}).encode() + return request + + +def patch_relationship_ar_helper(view, this_item, rel_name, rel_dict): + rel = view.relationships[rel_name] + mirror_rel = rel.mirror_relationship + if not mirror_rel: + # No mirror relationship: no need to check permissions on it. + return rel_dict + this_ro = ResultObject(view, this_item) + mirror_view = mirror_rel.view_class(view.request) + if rel.direction in (ONETOMANY, MANYTOMANY): + # rel data will be an array of ris. We'll need permission to post, + # delete, or patch each one on the mirror. + # Find the current related items. + current_related_ids = { + str(getattr(related_item, mirror_view.key_column.name)) + for related_item in getattr(this_item, rel.name) + } + new_related_ids = { + item['id'] for item in rel_dict['data'] + } + adding = new_related_ids - current_related_ids + removing = current_related_ids - new_related_ids + allowed_ids = set(current_related_ids) + # print(f'{rel.name}: cur {current_related_ids}, new {new_related_ids}') + # print(f' add {adding}, rem {removing}') + if rel.direction is ONETOMANY: + # mirror_rel is MANYTOONE and we need PATCH permission for any + # alterations. 
+ try: + m_patch_filter = mirror_view.permission_filter('patch', 'alter_request') + except KeyError: + return rel_dict + for _id in adding: + mallowed = m_patch_filter( + { + 'type': mirror_view.collection_name, + 'id': _id, + 'relationships': { + mirror_rel.name: { + 'data': this_ro.identifier() + } + } + } + ) + if mirror_rel.name in mallowed['relationships']: + allowed_ids.add(_id) + # TODO: alternatively raise HTTPForbidden? + for _id in removing: + mallowed = m_patch_filter( + { + 'type': mirror_view.collection_name, + 'id': _id, + 'relationships': { + mirror_rel.name: { + 'data': None + } + } + } + ) + if mirror_rel.name in mallowed['relationships']: + allowed_ids.remove(_id) + # TODO: alternatively raise HTTPForbidden? + else: + # mirror_rel is MANYTOMANY and we need POST permission to add or + # DELETE permission to remove. + try: + m_post_filter = mirror_view.permission_filter('post', 'alter_request') + except KeyError: + m_post_filter = False + try: + m_del_filter = mirror_view.permission_filter('delete', 'alter_request') + except KeyError: + m_del_filter = False + if m_post_filter: + for _id in adding: + mallowed = m_post_filter( + { + 'type': mirror_view.collection_name, + 'id': _id, + 'relationships': { + mirror_rel.name: { + 'data': this_ro.identifier() + } + } + } + ) + if mirror_rel.name in mallowed['relationships']: + allowed_ids.add(_id) + # TODO: alternatively raise HTTPForbidden? + if m_del_filter: + for _id in removing: + mallowed = m_del_filter( + { + 'type': mirror_view.collection_name, + 'id': _id, + 'relationships': { + mirror_rel.name: { + 'data': this_ro.identifier() + } + } + } + ) + if mirror_rel.name in mallowed['relationships']: + allowed_ids.remove(_id) + # TODO: alternatively raise HTTPForbidden? + else: + allowed_ids -= removing + + rel_dict['data'] = [ + {'type': mirror_view.collection_name, 'id': _id} + for _id in allowed_ids + ] + else: + # rel.direction is MANYTOONE. There should be just one ri, or None. 
+ cur_related = ResultObject(mirror_view, getattr(this_item, rel.name))
+ if cur_related.object is None and rel_dict['data'] is None:
+ # Nothing to do.
+ return rel_dict
+ if rel_dict['data'] and str(cur_related.obj_id) == rel_dict['data'].get('id'):
+ # Also nothing to do.
+ return rel_dict
+ if cur_related.object is not None:
+ # Need DELETE permission on cur_related.mirror_rel
+ try:
+ m_del_filter = mirror_view.permission_filter('delete', 'alter_request')
+ except KeyError:
+ return rel_dict
+ mallowed = m_del_filter(
+ {
+ 'type': mirror_view.collection_name,
+ 'id': cur_related.obj_id,
+ 'relationships': {
+ mirror_rel.name: {
+ 'data': this_ro.identifier()
+ }
+ }
+ }
+ )
+ if mirror_rel.name not in mallowed['relationships']:
+ return False
+ # del(obj_data['relationships'][rel_name])
+ if rel_dict['data'] is not None:
+ # Need POST permission on the new target's mirror_rel
+ try:
+ m_post_filter = mirror_view.permission_filter('post', 'alter_request')
+ except KeyError:
+ return rel_dict
+ mallowed = m_post_filter(
+ {
+ 'type': mirror_view.collection_name,
+ 'id': rel_dict['data'].get('id'),
+ 'relationships': {
+ mirror_rel.name: {
+ 'data': this_ro.identifier()
+ }
+ }
+ }
+ )
+ if mirror_rel.name not in mallowed['relationships']:
+ return False
+ # del(obj_data['relationships'][rel_name])
+ return rel_dict
+
+
+def shp_patch_alter_request(request, view, stage, view_method):
+ # Make sure there is a permission filter registered.
+ try:
+ pfilter = view.permission_filter('patch', 'alter_request')
+ except KeyError:
+ return request
+
+ obj_data = request.json_body['data']
+ allowed = pfilter(obj_data)
+ if allowed.get('id', True) is False:
+ # Straight up forbidden to PATCH object.
+ raise HTTPForbidden('No permission to PATCH object:\n\n{}'.format(request.json_body['data']))
+ for att_name in list(obj_data.get('attributes', {}).keys()):
+ if att_name not in allowed['attributes']:
+ del(obj_data['attributes'][att_name])
+ # TODO: alternatively raise HTTPForbidden?
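The add/remove bookkeeping that `patch_relationship_ar_helper` performs for to-many relationships is plain set arithmetic over resource ids; a minimal sketch with made-up ids:

```python
# Ids currently related versus ids submitted in the PATCH (illustrative).
current_related_ids = {'1', '2', '3'}
new_related_ids = {'2', '3', '4'}

# Ids only in the new data are additions; ids only in the current data
# are removals. Permission checks then decide which of these actually
# make it into the allowed set.
adding = new_related_ids - current_related_ids
removing = current_related_ids - new_related_ids
```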
+ rel_names = list(obj_data.get('relationships', {}).keys()) + for rel_name in rel_names: + if rel_name not in allowed['relationships']: + del(obj_data['relationships'][rel_name]) + # TODO: alternatively raise HTTPForbidden? + # Loop through a shallow copy of obj_data['relationships'] so that we + # can delete entries without causing problems. + rels_copy = {} + rels_copy.update(obj_data.get('relationships', {})) + this_item = view.get_item() + for rel_name, rel_dict in rels_copy.items(): + # For each of these allowed rels, look to see if the *other end* + # of the relationship is to_one (in which case we need PATCH permission + # to that rel in order to set it to this object) or to_many (in which + # case we need POST permission in order to add this object to it). + new_rel_dict = patch_relationship_ar_helper(view, this_item, rel_name, rel_dict) + if new_rel_dict: + obj_data['relationships'][rel_name] = new_rel_dict + request.body = json.dumps({'data': obj_data}).encode() + return request + + +def shp_relationships_patch_alter_request(request, view, stage, view_method): + # Make sure there is a permission filter registered. + try: + pfilter = view.permission_filter('patch', 'alter_request') + except KeyError: + return request + + allowed = pfilter({'type': view.collection_name, 'id': view.obj_id}) + if not allowed.get('id', True): + # Straight up forbidden to PATCH object. 
+ raise HTTPForbidden('No permission to PATCH object:\n\n{}'.format(request.json_body['data'])) + if view.rel.name not in allowed['relationships']: + raise HTTPForbidden(f'No permission to PATCH {view.collection_name}/{view.obj_id}.{view.rel.name}') + this_item = view.get_item() + new_rel_dict = patch_relationship_ar_helper( + view, this_item, view.rel.name, + {'data': view.request.json_body['data']} + ) + if new_rel_dict: + request.body = json.dumps(new_rel_dict).encode() + else: + raise HTTPForbidden(f'No permission to alter remote side of {view.collection_name}/{view.obj_id}.{view.rel.name}') + return request + + +def get_item(view, item_or_id=None): + """Wrapper around view.get_item() to allow passing an item or an id.""" + if item_or_id is None: + item_or_id = view.obj_id + if isinstance(item_or_id, view.model): + return item_or_id + else: + return view.get_item(item_or_id) + + +def ar_check_mirror_rel_perms(view, permission, rel, rel_dict, item_or_id=None): + # Find the item rel is relative to. + + # Check for direct permission to alter rel. + # filter = view.permission_filter(permission, 'alter_request', default=view.true_filter) + # if not filter(this_item): + # return {'forward': False} + mirror_rel = rel.mirror_relationship + if not mirror_rel: + # No mirror relationship: no need to check permissions on it. Return + # False as a predicate value. + return False + # mirror_view = mirror_rel.view_class(view.request) + report = { + 'post': {'allowed': set(), 'denied': set()}, + 'patch': {'allowed': set(), 'denied': set()}, + 'delete': {'allowed': set(), 'denied': set()} + } + # related_items = view.related_query(this_ro.obj_id, rel).all() + rel_data = rel_dict['data'] + if rel.direction is MANYTOONE: + # Always a list for DRYness. + rel_data = [rel_data] + if mirror_rel.direction is MANYTOONE: + # Need patch permission for any alterations. 
+ if permission == 'post': + adding = {'_id': 2} + return report + + +def shp_delete_alter_request(request, view, stage, view_method): + # Make sure there is a permission filter registered. + try: + pfilter = view.permission_filter('delete', 'alter_request') + except KeyError: + return request + + this_item = view.get_item() + this_ro = ResultObject(view, this_item) + this_data = this_ro.serialise() + allowed = pfilter(this_data, mask=view.everything_mask) + if not allowed['id']: + raise HTTPForbidden('No permission to delete {}/{}'.format( + view.collection_name, view.obj_id + )) + for att_name in this_data.get('attributes', {}): + if att_name not in allowed['attributes']: + # Need permission to *all* attributes for delete to work sensibly. + raise HTTPForbidden('No permission to delete {}/{}[{}]'.format( + view.collection_name, view.obj_id, att_name + )) + for rel_name in this_data.get('relationships', {}): + if rel_name not in allowed['relationships']: + # Need permission to *all* relationships for delete to work sensibly. + raise HTTPForbidden('No permission to delete {}/{}.{}'.format( + view.collection_name, view.obj_id, rel_name + )) + for rel_name, rel_dict in this_data.get('relationships', {}).items(): + rel = view.relationships[rel_name] + mirror_perms = ar_check_mirror_rel_perms(view, 'delete', rel, rel_dict) + if not mirror_perms: + # There is no mirror relationship for this rel. + continue + if mirror_perms['delete']['denied'] or mirror_perms['patch']['denied']: + # At least one required delete or patch on mirror is denied. 
+ raise HTTPForbidden('No permission to delete {}/{}.{} by mirror relationship.'.format(
+ view.collection_name, view.obj_id, rel_name
+ ))
+ return request
+
+
+def shp_relationships_delete_alter_request(request, view, stage, view_method):
+ try:
+ pfilter = partial(
+ view.permission_filter('delete', 'alter_request'),
+ permission_sought='delete',
+ stage_name='alter_request',
+ view_instance=view,
+ )
+ except KeyError:
+ return request
+ # TODO: option to select alternate behaviour
+ if True:
+ # Pretend that the request only contained the items which are allowed.
+ new_data = [
+ item for item in request.json_body['data']
+ if pfilter(item, request.json_body['data'])
+ ]
+ # json_body is parsed afresh from body, so mutating it does not
+ # persist: write the filtered data back to body instead.
+ request.body = json.dumps({'data': new_data}).encode()
+ else:
+ # Deny the whole request if we lack permission for any one item.
+ for item in request.json_body['data']:
+ if not pfilter(item, request.json_body['data']):
+ raise HTTPForbidden(
+ 'No permission to DELETE {} from relationship {}.'.format(
+ item, view.relname
+ )
+ )
+ return request
+
+
+permission_handlers = {
+ 'get': {
+ 'alter_document': shp_get_alter_document,
+ },
+ 'collection_get': {
+ 'alter_document': shp_get_alter_document,
+ },
+ 'related_get': {
+ 'alter_document': shp_get_alter_document,
+ },
+ 'relationships_get': {
+ 'alter_document': shp_get_alter_document,
+ },
+ 'collection_post': {
+ 'alter_request': shp_collection_post_alter_request,
+ },
+ 'relationships_post': {
+ 'alter_request': shp_relationships_post_alter_request,
+ },
+ 'patch': {
+ 'alter_request': shp_patch_alter_request,
+ },
+ 'relationships_patch': {
+ 'alter_request': shp_relationships_patch_alter_request,
+ },
+ 'delete': {
+ 'alter_request': shp_delete_alter_request,
+ },
+ 'relationships_delete': {
+ 'alter_request': shp_relationships_delete_alter_request,
+ }
+}
+
+
+def permission_handler(endpoint_name, stage_name):
+ return permission_handlers[endpoint_name][stage_name]
+
+
+@functools.lru_cache()
+def get_jsonapi_accepts(request):
+ """Return a set of
all 'application/vnd.api' parts of the accept + header. + """ + accepts = re.split( + r',\s*', + request.headers.get('accept', '') + ) + return { + a for a in accepts + if a.startswith('application/vnd.api') + } + + +def sh_validate_request_headers(request, view, stage, view_method): + """Check that request headers comply with spec. + + Raises: + HTTPUnsupportedMediaType + HTTPNotAcceptable + """ + # Spec says to reject (with 415) any request with media type + # params. + if len(request.headers.get('content-type', '').split(';')) > 1: + raise HTTPUnsupportedMediaType( + 'Media Type parameters not allowed by JSONAPI ' + + 'spec (http://jsonapi.org/format).' + ) + # Spec says throw 406 Not Acceptable if Accept header has no + # application/vnd.api+json entry without parameters. + jsonapi_accepts = get_jsonapi_accepts(request) + if jsonapi_accepts and\ + 'application/vnd.api+json' not in jsonapi_accepts: + raise HTTPNotAcceptable( + 'application/vnd.api+json must appear with no ' + + 'parameters in Accepts header ' + + '(http://jsonapi.org/format).' + ) + + return request + + +def sh_validate_request_valid_json(request, view, stage, view_method): + """Check that the body of any request is valid JSON. + + Raises: + HTTPBadRequest + """ + if request.content_length: + try: + request.json_body + except ValueError: + raise HTTPBadRequest("Body is not valid JSON.") + + return request + + +def sh_validate_request_common_validity(request, view, stage, view_method): + """Perform common request validity checks.""" + + if request.content_length and view.api.settings.schema_validation: + # Validate request JSON against the JSONAPI jsonschema + view.api.metadata.JSONSchema.validate(request.json_body, request.method) + + # Spec says throw BadRequest if any include paths reference non + # existent attributes or relationships. 
+ if view.bad_include_paths: + raise HTTPBadRequest( + "Bad include paths {}".format( + view.bad_include_paths + ) + ) + + # Spec says set Content-Type to application/vnd.api+json. + request.response.content_type = 'application/vnd.api+json' + + return request + + +def sh_validate_request_object_exists(request, view, stage, view_method): + """Make sure that id exists in collection for all urls specifying an id.""" + if view.obj_id is not None: + if not view.object_exists(view.obj_id): + raise HTTPNotFound('No item {} in {}'.format(view.obj_id, view.collection_name)) + return request + + +def sh_alter_document_self_link(doc, view, stage, view_method): + """Include a self link unless the method is PATCH.""" + if view.request.method != 'PATCH': + doc.update_child('links', {'self': view.request.url}) + return doc + + +def sh_alter_document_debug_info(doc, view, stage, view_method): + """Potentially add some debug information.""" + debug = { + 'accept_header': { + a: None for a in get_jsonapi_accepts(view.request) + }, + 'qinfo_page': + view.collection_query_info(view.request)['_page'], + 'atts': {k: None for k in view.attributes.keys()}, + 'includes': { + k: None for k in view.requested_include_names() + } + } + doc.update_child('meta', {'debug': debug}) + return doc + + +def sh_alter_document_add_returned_count(doc, view, stage, view_method): + """Add the returned count to meta.""" + # Don't add a returned count unless we're returning an array of objects. 
+ if not isinstance(doc['data'], abc.Sequence): + return doc + try: + meta = doc['meta'] + except KeyError: + meta = doc['meta'] = {} + try: + results = meta['results'] + except KeyError: + results = meta['results'] = {} + results['returned'] = len(doc['data']) + return doc + + +def sh_alter_document_add_denied(doc, view, stage, view_method): + try: + meta = doc['meta'] + except KeyError: + meta = doc['meta'] = {} + try: + results = meta['results'] + except KeyError: + results = meta['results'] = {} + rejected_dict = view.pj_shared.rejected.rejected_dict + results['denied'] = len(rejected_dict['objects']) + meta['rejected'] = rejected_dict + + # delete(results['available']) + return doc + + +def sh_alter_request_add_info(request, view, stage, view_method): + """Add information commonly used in view operations.""" + + # Extract id and relationship from route, if provided + view.obj_id = view.request.matchdict.get('id', None) + view.not_found_message = f'No item {view.obj_id} in {view.collection_name}' + view.relname = view.request.matchdict.get('relationship', None) + if view.relname: + # Gather relationship info + mapper = sqlalchemy.inspect(view.model).mapper + try: + view.rel = view.relationships[view.relname] + except KeyError: + raise HTTPNotFound('No relationship {} in collection {}'.format( + view.relname, + view.collection_name + )) + view.rel_class = view.rel.tgt_class + view.rel_view = view.view_instance(view.rel_class) + return request + + +class ResultObject: + def __init__(self, view, object, related=None): + self.view = view + self.object = object + self.related = related or {} + if object is None: + self.obj_id = None + else: + self.obj_id = self.view.id_col(self.object) + self.url = self.view.request.route_url( + self.view.api.endpoint_data.make_route_name( + self.view.collection_name, suffix='item' + ), + **{'id': self.obj_id} + ) + self.attribute_mask = set(self.view.requested_attributes) + self.rel_mask = set(self.view.requested_relationships) + + 
self._included_dict = None + + def serialise(self): + # An object of 'None' is a special case. + if self.object is None: + return None + atts = { + key: getattr(self.object, key) + for key in self.attribute_mask + if self.view.mapped_info_from_name(key).get('visible', True) + } + rels = { + rel_name: res.rel_dict( + rel=self.view.relationships[rel_name], + rel_name=rel_name, + parent_url=self.url + ) + for rel_name, res in self.related.items() + if rel_name in self.rel_mask + } + return { + 'type': self.view.collection_name, + 'id': str(self.obj_id), + 'attributes': atts, + 'links': {'self': self.url}, + 'relationships': rels, + } + + def to_dict(self): + if self.object is None: + return None + return { + 'type': self.view.collection_name, + 'id': str(self.obj_id), + 'attributes': { + key: getattr(self.object, key) for key in self.view.all_attributes + } + } + + def identifier(self): + # An object of 'None' is a special case. + if self.object is None: + return None + return { + 'type': self.view.collection_name, + 'id': str(self.obj_id) + } + + @property + def tuple_identifier(self): + if self.object is None: + return None + return ( + self.view.collection_name, + str(self.obj_id) + ) + + @property + def str_identifier(self): + if self.object is None: + return None + return f'{self.view.collection_name}-::-{self.obj_id}' + + @property + def included_dict(self): + incd = {} + for rel_name, res in self.related.items(): + if not res.is_included: + continue + incd.update(res.included_dict) + return incd + + +class Results: + def __init__(self, view, objects=None, many=True, count=None, limit=None, is_included=False, is_top=False, not_found_message='Object not found.'): + self.view = view + self.objects = objects or [] + self.rejected_objects = [] + self.many = many + self.count = count + self.limit = limit + self.is_included = is_included + self.is_top = is_top + self.not_found_message = not_found_message + + self._meta = None + self._included_dict = None + 
self._flag_filtered = False + + def serialise(self, identifiers=False): + doc = Doc() + if self.many: + # doc.collection = True + doc['links'] = self.view.pagination_links(count=self.count) + if identifiers: + doc['data'] = self.identifiers() + else: + doc['data'] = self.data() + doc['included'] = self.included() + doc['meta'] = self.meta + + return doc + + def serialise_object_with(self, method_name): + data = [getattr(o, method_name)() for o in self.objects] + if self.many: + return data + else: + try: + return data[0] + except IndexError: + return None + + @property + def meta(self): + if self._meta is None: + self._meta = self.compute_meta() + return self._meta + + def compute_meta(self): + meta = {} + if self.many: + meta.update( + { + 'results': { + 'available': self.count, + 'limit': self.limit, + # 'returned': len(self.objects) + } + } + ) + return meta + + def data(self): + return self.serialise_object_with('serialise') + + def identifiers(self): + return self.serialise_object_with('identifier') + + def rel_dict(self, rel, rel_name, parent_url): + rd = { + 'data': self.identifiers(), + 'links': { + 'self': '{}/relationships/{}'.format(parent_url, rel_name), + 'related': '{}/{}'.format(parent_url, rel_name) + }, + 'meta': { + 'direction': rel.direction.name, + } + } + if self.many: + rd['meta']['results'] = {} + rd['meta']['results']['available'] = self.count + rd['meta']['results']['limit'] = self.limit + rd['meta']['results']['returned'] = len(rd['data']) + return rd + + def included(self): + return [o.serialise() for o in self.included_dict.values()] + + @property + def included_dict(self): + included_dict = {} + for o in self.objects: + if not self.is_top: + included_dict[(self.view.collection_name, o.obj_id)] = o + included_dict.update(o.included_dict) + return included_dict + + def filter(self, predicate, reason='Permission denied', force_rerun=False): + # if self._flag_filtered and not force_rerun: + # return + accepted = [] + for obj in 
self.objects: + pred = self.view.permission_to_dict(predicate(obj)) + if pred['id']: + accepted.append(obj) + reject_atts = obj.attribute_mask - pred['attributes'] + obj.attribute_mask &= pred['attributes'] + # record rejected atts + self.view.pj_shared.rejected.reject_attributes( + obj.tuple_identifier, + reject_atts, + reason, + ) + reject_rels = obj.rel_mask - pred['relationships'] + obj.rel_mask &= pred['relationships'] + # record rejected rels + self.view.pj_shared.rejected.reject_relationships( + obj.tuple_identifier, + reject_rels, + reason, + ) + else: + self.rejected_objects.append(obj) + self.view.pj_shared.rejected.reject_object(obj.tuple_identifier, reason) + + self.objects = accepted + self._flag_filtered = True + + +class Doc(dict): + + def update_child(self, key, value): + try: + self[key].update(value) + except KeyError: + self[key] = value + + +class SharedState(): + + def __init__(self, view, request=None, results=None, document=None, rejected=None): + self.view = view + self.request = request + self.results = results + self.document = document + self.rejected = rejected or Rejected(view) + + +class Rejected(): + + def __init__(self, view, rejected=None): + self.view = view + self.rejected = rejected or { + 'objects': {}, + 'attributes': {}, + 'relationships': {}, + } + + def reject_object(self, identifier, reason): + self.rejected['objects'][identifier] = reason + + def _reject_multiple(self, identifier, things, reason, category): + if not things: + return + new = {t: reason for t in things} + try: + self.rejected[category][identifier].update(new) + except KeyError: + self.rejected[category][identifier] = new + + reject_attributes = partialmethod(_reject_multiple, category='attributes') + reject_relationships = partialmethod(_reject_multiple, category='relationships') + + def identifier_to_str(self, identifier): + return f'{identifier[0]}::{identifier[1]}' + + @property + def rejected_dict(self): + ret = {} + for part in ['objects', 'attributes', 
'relationships']: + ret[part] = { + self.identifier_to_str(k): v for k, v in self.rejected[part].items() + } + return ret diff --git a/pyramid_jsonapi/workflow/loop/__init__.py b/pyramid_jsonapi/workflow/loop/__init__.py new file mode 100644 index 00000000..5871c113 --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/__init__.py @@ -0,0 +1,154 @@ +import pyramid_jsonapi.workflow as wf +import sqlalchemy + +from functools import ( + partial +) +from itertools import ( + islice, +) +from pyramid.httpexceptions import ( + HTTPBadRequest, + HTTPForbidden, + HTTPNotFound, +) +from sqlalchemy.orm.interfaces import ( + ONETOMANY, + MANYTOMANY, + MANYTOONE +) + + +stages = ( + 'alter_query', + 'alter_related_query', + 'alter_result', + 'before_write_item', +) + + +def get_one_altered_result_object(view, stages, query): + res_obj = wf.execute_stage( + view, stages, 'alter_result', + wf.ResultObject(view, view.get_one(query, view.not_found_message)) + ) + if res_obj.tuple_identifier in view.pj_shared.rejected.rejected['objects']: + raise HTTPForbidden(view.not_found_message) + return res_obj + + +def altered_objects_iterator(view, stages, stage_name, objects_iterable): + """ + Return an iterator of objects from objects_iterable filtered and altered by + the stage_name stage. + """ + return filter( + lambda o: o.tuple_identifier not in view.pj_shared.rejected.rejected['objects'], + map( + partial(wf.execute_stage, view, stages, stage_name), + (wf.ResultObject(view, o) for o in objects_iterable) + ) + ) + + +def get_related(obj, rel_name, stages, include_path=None): + """ + Get the objects related to obj via the relationship rel_name. 
+ """ + view = obj.view + include_path = include_path or [] + rel_include_path = include_path + [rel_name] + rel = view.relationships[rel_name] + rel_view = view.view_instance(rel.tgt_class) + many = rel.direction is ONETOMANY or rel.direction is MANYTOMANY + is_included = view.path_is_included(rel_include_path) + if rel.queryable: + query = view.related_query(obj.object, rel, full_object=is_included) + query = wf.execute_stage( + view, stages, 'alter_related_query', query + ) + # print('*' * 80) + # print(rel_name) + # print(query.statement.compile(view.dbsession.bind)) + # print('*' * 80) + objects_iterable = wf.wrapped_query_all(query) + else: + objects_iterable = getattr(obj.object, rel_name) + if not many: + objects_iterable = [objects_iterable] + rel_objs = list( + islice( + altered_objects_iterator( + rel_view, stages, + 'alter_result', + objects_iterable, + ), + view.related_limit(rel) + ) + ) + rel_results = wf.Results( + rel_view, + objects=rel_objs, + many=many, + is_included=is_included + ) + if is_included: + for rel_obj in rel_results.objects: + for rel_rel_name in rel_obj.view.relationships: + if wf.follow_rel(rel_obj.view, rel_rel_name, include_path=rel_include_path): + rel_obj.related[rel_rel_name] = get_related( + rel_obj, + rel_rel_name, + stages, + include_path=rel_include_path + ) + if many: + rel_results.limit = view.related_limit(rel) + return rel_results + + +def fill_result_object_related(res_obj, stages): + view = res_obj.view + for rel_name in view.relationships: + if wf.follow_rel(view, rel_name): + res_obj.related[rel_name] = get_related( + res_obj, rel_name, stages + ) + + +def shp_get_alter_result(obj, view, stage, view_method): + reason = "Permission denied." 
+ predicate = view.permission_filter('get', stage) + # pred = view.permission_to_dict(predicate(obj)) + pred = predicate(obj) + if pred['id']: + reject_atts = obj.attribute_mask - pred['attributes'] + obj.attribute_mask &= pred['attributes'] + # record rejected atts + view.pj_shared.rejected.reject_attributes( + obj.tuple_identifier, + reject_atts, + reason, + ) + reject_rels = obj.rel_mask - pred['relationships'] + obj.rel_mask &= pred['relationships'] + # record rejected rels + view.pj_shared.rejected.reject_relationships( + obj.tuple_identifier, + reject_rels, + reason, + ) + else: + view.pj_shared.rejected.reject_object(obj.tuple_identifier, reason) + return obj + + +def permission_handler(endpoint_name, stage_name): + handlers = { + 'get': { + 'alter_result': shp_get_alter_result, + } + } + for ep in ('collection_get', 'related_get', 'relationships_get'): + handlers[ep] = handlers['get'] + return handlers[endpoint_name][stage_name] diff --git a/pyramid_jsonapi/workflow/loop/collection_get.py b/pyramid_jsonapi/workflow/loop/collection_get.py new file mode 100644 index 00000000..285a4a9b --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/collection_get.py @@ -0,0 +1,64 @@ +import pyramid_jsonapi.workflow as wf +import sqlalchemy + +from itertools import islice +from pyramid.httpexceptions import ( + HTTPBadRequest, + HTTPInternalServerError, +) +from . import stages + + +def workflow(view, stages): + query = view.base_collection_query() + query = view.query_add_sorting(query) + query = view.query_add_filtering(query) + qinfo = view.collection_query_info(view.request) + limit = qinfo['page[limit]'] + + # If there is any chance that the code might alter the number of results + # after they come from the database then we can't rely on LIMIT and COUNT + # at the database end, so these have been disabled. They might return + # if a suitable flag is introduced. 
+    # try:
+    #     count = query.count()
+    # except sqlalchemy.exc.ProgrammingError:
+    #     raise HTTPInternalServerError(
+    #         'An error occurred querying the database. Server logs may have details.'
+    #     )
+    # query = query.limit(limit)
+    # query = query.offset(qinfo['page[offset]'])
+
+    query = wf.execute_stage(
+        view, stages, 'alter_query', query
+    )
+
+    # Get the direct results from this collection (no related objects yet).
+    # Stage 'alter_result' will run on each object.
+    objects_iterator = wf.loop.altered_objects_iterator(
+        view, stages, 'alter_result', wf.wrapped_query_all(query)
+    )
+    # Only do paging the slow way if page[offset] is explicitly specified in the
+    # request.
+    offset_count = 0
+    if 'page[offset]' in view.request.params:
+        offset_count = sum(1 for _ in islice(objects_iterator, qinfo['page[offset]']))
+    objects = list(islice(objects_iterator, limit))
+    count = None
+    if qinfo['pj_include_count']:
+        count = offset_count + len(objects) + sum(1 for _ in objects_iterator)
+    results = wf.Results(
+        view,
+        objects=objects,
+        many=True,
+        is_top=True,
+        count=count,
+        limit=limit
+    )
+
+    # Fill the relationships with related objects.
+    # Stage 'alter_result' will run on each object.
+    for res_obj in results.objects:
+        wf.loop.fill_result_object_related(res_obj, stages)
+
+    return results.serialise()
diff --git a/pyramid_jsonapi/workflow/loop/collection_post.py b/pyramid_jsonapi/workflow/loop/collection_post.py
new file mode 100644
index 00000000..cc3731de
--- /dev/null
+++ b/pyramid_jsonapi/workflow/loop/collection_post.py
@@ -0,0 +1,126 @@
+import pyramid_jsonapi.workflow as wf
+import sqlalchemy
+
+from collections.abc import Sequence
+
+from .get import (
+    get_doc,
+)
+
+from sqlalchemy.orm.interfaces import (
+    ONETOMANY,
+    MANYTOMANY,
+    MANYTOONE
+)
+
+from pyramid.httpexceptions import (
+    HTTPBadRequest,
+    HTTPForbidden,
+    HTTPConflict,
+    HTTPNotFound,
+)
+from .
import stages + + +def workflow(view, stages): + try: + data = view.request.json_body['data'] + except KeyError: + raise HTTPBadRequest('data attribute required in POSTs.') + + if not isinstance(data, dict): + raise HTTPBadRequest('data attribute must contain a single resource object.') + + # Check to see if we're allowing client ids + if not view.api.settings.allow_client_ids and 'id' in data: + raise HTTPForbidden('Client generated ids are not supported.') + # Type should be correct or raise 409 Conflict + datatype = data.get('type') + if datatype != view.collection_name: + raise HTTPConflict("Unsupported type '{}'".format(datatype)) + try: + atts = data['attributes'] + except KeyError: + atts = {} + if 'id' in data: + atts[view.model.__pyramid_jsonapi__['id_col_name']] = data['id'] + item = view.model(**atts) + with view.dbsession.no_autoflush: + for relname, reldict in data.get('relationships', {}).items(): + try: + reldata = reldict['data'] + except KeyError: + raise HTTPBadRequest( + 'relationships within POST must have data member' + ) + try: + rel = view.relationships[relname] + except KeyError: + raise HTTPNotFound( + 'No relationship {} in collection {}'.format( + relname, + view.collection_name + ) + ) + rel_type = view.api.view_classes[rel.tgt_class].collection_name + if rel.direction is ONETOMANY or rel.direction is MANYTOMANY: + # reldata should be a list/array + if not isinstance(reldata, Sequence) or isinstance(reldata, str): + raise HTTPBadRequest( + 'Relationship data should be an array for TOMANY relationships.' 
+                    )
+                rel_items = []
+                for rel_identifier in reldata:
+                    if rel_identifier.get('type') != rel_type:
+                        raise HTTPConflict(
+                            'Relationship identifier has type {} and should be {}'.format(
+                                rel_identifier.get('type'), rel_type
+                            )
+                        )
+                    try:
+                        rel_items.append(view.dbsession.query(rel.tgt_class).get(rel_identifier['id']))
+                    except KeyError:
+                        raise HTTPBadRequest(
+                            'Relationship identifier must have an id member'
+                        )
+                setattr(item, relname, rel_items)
+            else:
+                if (not isinstance(reldata, dict)) and (reldata is not None):
+                    raise HTTPBadRequest(
+                        'Relationship data should be a resource identifier object or null.'
+                    )
+                if reldata is None:
+                    # Null means an empty TOONE relationship: clear it rather
+                    # than falling through to reldata.get() on None.
+                    setattr(item, relname, None)
+                    continue
+                if reldata.get('type') != rel_type:
+                    raise HTTPConflict(
+                        'Relationship identifier has type {} and should be {}'.format(
+                            reldata.get('type'), rel_type
+                        )
+                    )
+                try:
+                    setattr(
+                        item,
+                        relname,
+                        view.dbsession.query(rel.tgt_class).get(reldata['id'])
+                    )
+                except KeyError:
+                    raise HTTPBadRequest(
+                        'No id member in relationship data.'
+                    )
+    item = wf.execute_stage(
+        view, stages, 'before_write_item', item
+    )
+    try:
+        view.dbsession.add(item)
+        view.dbsession.flush()
+    except sqlalchemy.exc.IntegrityError as exc:
+        raise HTTPConflict(exc.args[0])
+    view.request.response.status_code = 201
+    item_id = view.id_col(item)
+    view.request.response.headers['Location'] = view.request.route_url(
+        view.api.endpoint_data.make_route_name(view.collection_name, suffix='item'),
+        **{'id': item_id}
+    )
+
+    # The rest of this is more or less a get.
+    return get_doc(
+        view, getattr(view, 'get').stages, view.single_item_query(item_id)
+    )
diff --git a/pyramid_jsonapi/workflow/loop/delete.py b/pyramid_jsonapi/workflow/loop/delete.py
new file mode 100644
index 00000000..8da2d68b
--- /dev/null
+++ b/pyramid_jsonapi/workflow/loop/delete.py
@@ -0,0 +1,27 @@
+import pyramid_jsonapi.workflow as wf
+import sqlalchemy
+
+from pyramid.httpexceptions import (
+    HTTPFailedDependency,
+)
+from .
import stages + + +def workflow(view, stages): + item = view.get_one( + view.single_item_query(loadonly=[view.key_column.name]), + not_found_message='No item {} in collection {}'.format( + view.obj_id, view.collection_name + ) + ) + item = wf.execute_stage( + view, stages, 'before_write_item', item + ) + try: + view.dbsession.delete(item) + view.dbsession.flush() + except sqlalchemy.exc.IntegrityError as exc: + raise HTTPFailedDependency(str(exc)) + doc = wf.Doc() + doc['data'] = wf.ResultObject(view, item).identifier() + return doc diff --git a/pyramid_jsonapi/workflow/loop/get.py b/pyramid_jsonapi/workflow/loop/get.py new file mode 100644 index 00000000..377ee51a --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/get.py @@ -0,0 +1,30 @@ +from pyramid.httpexceptions import ( + HTTPForbidden, + HTTPNotFound, +) +import pyramid_jsonapi.workflow as wf +from . import stages + + +def get_doc(view, stages, query): + query = wf.execute_stage( + view, stages, 'alter_query', query + ) + res_obj = wf.loop.get_one_altered_result_object(view, stages, query) + results = view.pj_shared.results = wf.Results( + view, + objects=[res_obj], + many=False, + is_top=True, + not_found_message=view.not_found_message, + ) + + # We have a result but we still need to fill the relationships. + # Stage 'alter_result' will run on each related object. 
+ wf.loop.fill_result_object_related(res_obj, stages) + + return results.serialise() + + +def workflow(view, stages): + return get_doc(view, stages, view.single_item_query()) diff --git a/pyramid_jsonapi/workflow/loop/patch.py b/pyramid_jsonapi/workflow/loop/patch.py new file mode 100644 index 00000000..1fe7c5ac --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/patch.py @@ -0,0 +1,155 @@ +import itertools +import pyramid_jsonapi.workflow as wf +import sqlalchemy + +from json.decoder import ( + JSONDecodeError, +) +from pyramid.httpexceptions import ( + HTTPBadRequest, + HTTPConflict, + HTTPNotFound, +) +from sqlalchemy.orm import ( + load_only, +) +from . import stages + + +def workflow(view, stages): + validate_patch_request(view) + data = view.request.json_body['data'] + atts = {} + hybrid_atts = {} + for key, value in data.get('attributes', {}).items(): + if key in view.attributes: + atts[key] = value + elif key in view.hybrid_attributes: + hybrid_atts[key] = value + else: + raise HTTPNotFound( + 'Collection {} has no attribute {}'.format( + view.collection_name, key + ) + ) + atts[view.key_column.name] = view.obj_id + item = view.dbsession.merge(view.model(**atts)) + for att, value in hybrid_atts.items(): + try: + setattr(item, att, value) + except AttributeError: + raise HTTPConflict( + 'Attribute {} is read only.'.format( + att + ) + ) + + rels = data.get('relationships', {}) + for relname, reldict in rels.items(): + try: + rel = view.relationships[relname] + except KeyError: + raise HTTPNotFound( + 'Collection {} has no relationship {}'.format( + view.collection_name, relname + ) + ) + rel_view = view.view_instance(rel.tgt_class) + try: + reldata = reldict['data'] + except KeyError: + raise HTTPBadRequest( + "Relationship '{}' has no 'data' member.".format(relname) + ) + except TypeError: + raise HTTPBadRequest( + "Relationship '{}' is not a dictionary with a data member.".format(relname) + ) + if reldata is None: + setattr(item, relname, None) + elif 
isinstance(reldata, dict): + if reldata.get('type') != rel_view.collection_name: + raise HTTPConflict( + 'Type {} does not match relationship type {}'.format( + reldata.get('type', None), rel_view.collection_name + ) + ) + if reldata.get('id') is None: + raise HTTPBadRequest( + 'An id is required in a resource identifier.' + ) + rel_item = view.dbsession.query( + rel.tgt_class + ).options( + load_only(rel_view.key_column.name) + ).get(reldata['id']) + if not rel_item: + raise HTTPNotFound('{}/{} not found'.format( + rel_view.collection_name, reldata['id'] + )) + setattr(item, relname, rel_item) + elif isinstance(reldata, list): + rel_items = [] + for res_ident in reldata: + rel_item = view.dbsession.query( + rel.tgt_class + ).options( + load_only(rel_view.key_column.name) + ).get(res_ident['id']) + if not rel_item: + raise HTTPNotFound('{}/{} not found'.format( + rel_view.collection_name, res_ident['id'] + )) + rel_items.append(rel_item) + setattr(item, relname, rel_items) + item = wf.execute_stage( + view, stages, 'before_write_item', item + ) + try: + view.dbsession.flush() + except sqlalchemy.exc.IntegrityError as exc: + raise HTTPConflict(str(exc)) + doc = wf.Doc() + doc['meta'] = { + 'updated': { + 'attributes': [ + att for att in itertools.chain(atts, hybrid_atts) + if att != view.key_column.name + ], + 'relationships': [r for r in rels] + } + } + # if an update is successful ... 
the server + # responds only with top-level meta data + return doc + + +def validate_patch_request(view): + request = view.request + try: + data = request.json_body['data'] + except KeyError: + raise HTTPBadRequest('data attribute required in PATCHes.') + except JSONDecodeError as exc: + raise HTTPBadRequest('Error decoding JSON body: {}.'.format(exc)) + data_id = data.get('id') + if view.collection_name != data.get('type'): + raise HTTPConflict( + 'JSON type ({}) does not match URL type ({}).'.format( + data.get('type'), view.collection_name + ) + ) + if data_id != view.obj_id: + raise HTTPConflict( + 'JSON id ({}) does not match URL id ({}).'.format( + data_id, view.obj_id + ) + ) + if not view.object_exists(view.obj_id): + raise HTTPNotFound( + 'No id {} in collection {}'.format( + view.obj_id, + view.collection_name + ) + ) + return request diff --git a/pyramid_jsonapi/workflow/loop/related_get.py b/pyramid_jsonapi/workflow/loop/related_get.py new file mode 100644 index 00000000..bc122d6e --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/related_get.py @@ -0,0 +1,88 @@ +import sqlalchemy +import pyramid_jsonapi.workflow as wf + +from itertools import ( + islice, +) +from pyramid.httpexceptions import ( + HTTPInternalServerError, + HTTPBadRequest, +) +from sqlalchemy.orm.interfaces import ( + ONETOMANY, + MANYTOMANY, + MANYTOONE, +) +from . import stages + + +def get_results(view, stages): + qinfo = view.rel_view.collection_query_info(view.request) + rel_stages = getattr(view.rel_view, 'related_get').stages + limit = qinfo['page[limit]'] + count = None + # We will need the original object with id view.obj_id. 
+    obj = wf.loop.get_one_altered_result_object(
+        view, stages, view.single_item_query()
+    )
+    if view.rel.queryable:
+        query = view.related_query(obj.object, view.rel)
+    else:
+        rel_objs = getattr(obj.object, view.rel.name)
+
+    if view.rel.direction is ONETOMANY or view.rel.direction is MANYTOMANY:
+        many = True
+        if view.rel.queryable:
+            query = view.rel_view.query_add_sorting(query)
+            query = view.rel_view.query_add_filtering(query)
+            query = query.offset(qinfo['page[offset]'])
+            query = query.limit(qinfo['page[limit]'])
+            query = wf.execute_stage(view.rel_view, rel_stages, 'alter_query', query)
+            rel_objs_iterable = wf.wrapped_query_all(query)
+        else:
+            rel_objs_iterable = rel_objs
+        objects_iterator = wf.loop.altered_objects_iterator(
+            view.rel_view, rel_stages, 'alter_result', rel_objs_iterable
+        )
+        offset_count = 0
+        if 'page[offset]' in view.request.params:
+            offset_count = sum(1 for _ in islice(objects_iterator, qinfo['page[offset]']))
+        res_objs = list(islice(objects_iterator, limit))
+        if qinfo['pj_include_count']:
+            count = offset_count + len(res_objs) + sum(1 for _ in objects_iterator)
+    else:
+        many = False
+        if view.rel.queryable:
+            query = wf.execute_stage(
+                view.rel_view, rel_stages, 'alter_query', query
+            )
+            res_objs = [
+                wf.loop.get_one_altered_result_object(
+                    view.rel_view, rel_stages, query
+                )
+            ]
+        else:
+            res_objs = [wf.ResultObject(view.rel_view, rel_objs)]
+        if qinfo['pj_include_count']:
+            count = 1
+
+    results = wf.Results(
+        view.rel_view,
+        objects=res_objs,
+        many=many,
+        is_top=True,
+        count=count,
+        limit=limit
+    )
+
+    # Fill the relationships with related objects.
+    # Stage 'alter_result' will run on each object.
+ for res_obj in results.objects: + wf.loop.fill_result_object_related(res_obj, rel_stages) + + return results + + +def workflow(view, stages): + return get_results(view, stages).serialise() diff --git a/pyramid_jsonapi/workflow/loop/relationships_delete.py b/pyramid_jsonapi/workflow/loop/relationships_delete.py new file mode 100644 index 00000000..19916ad2 --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/relationships_delete.py @@ -0,0 +1,52 @@ +import pyramid_jsonapi.workflow as wf +import sqlalchemy + +from pyramid.httpexceptions import ( + HTTPInternalServerError, + HTTPBadRequest, + HTTPForbidden, + HTTPConflict, + HTTPFailedDependency, +) +from sqlalchemy.orm.interfaces import ( + ONETOMANY, + MANYTOMANY, + MANYTOONE, +) +from . import stages + + +def workflow(view, stages): + if view.rel.direction is MANYTOONE: + raise HTTPForbidden('Cannot DELETE to TOONE relationship link.') + obj = view.dbsession.query(view.model).get(view.obj_id) + + for resid in view.request.json_body['data']: + if resid['type'] != view.rel_view.collection_name: + raise HTTPConflict( + "Resource identifier type '{}' does not match relationship type '{}'.".format( + resid['type'], view.rel_view.collection_name + ) + ) + try: + item = view.dbsession.query(view.rel_class).get(resid['id']) + except sqlalchemy.exc.DataError as exc: + raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) + if item is None: + raise HTTPFailedDependency("One or more objects DELETEd from this relationship do not exist.") + try: + getattr(obj, view.relname).remove(item) + except ValueError as exc: + if exc.args[0].endswith('not in list'): + # The item we were asked to remove is not there. 
+                pass
+            else:
+                raise
+    obj = wf.execute_stage(
+        view, stages, 'before_write_item', obj
+    )
+    try:
+        view.dbsession.flush()
+    except sqlalchemy.exc.IntegrityError as exc:
+        raise HTTPFailedDependency(str(exc))
+    return wf.Doc()
diff --git a/pyramid_jsonapi/workflow/loop/relationships_get.py b/pyramid_jsonapi/workflow/loop/relationships_get.py
new file mode 100644
index 00000000..a05ec710
--- /dev/null
+++ b/pyramid_jsonapi/workflow/loop/relationships_get.py
@@ -0,0 +1,9 @@
+from . import stages
+from .related_get import (
+    get_results,
+)
+
+
+# Do what related_get does but serialise as identifiers.
+def workflow(view, stages):
+    return get_results(view, stages).serialise(identifiers=True)
diff --git a/pyramid_jsonapi/workflow/loop/relationships_patch.py b/pyramid_jsonapi/workflow/loop/relationships_patch.py
new file mode 100644
index 00000000..f7626fb7
--- /dev/null
+++ b/pyramid_jsonapi/workflow/loop/relationships_patch.py
@@ -0,0 +1,78 @@
+import pyramid_jsonapi.workflow as wf
+import sqlalchemy
+
+from pyramid.httpexceptions import (
+    HTTPInternalServerError,
+    HTTPBadRequest,
+    HTTPForbidden,
+    HTTPConflict,
+    HTTPFailedDependency,
+)
+from sqlalchemy.orm.interfaces import (
+    ONETOMANY,
+    MANYTOMANY,
+    MANYTOONE,
+)
+from .
import stages + + +def workflow(view, stages): + obj = view.dbsession.query(view.model).get(view.obj_id) + if view.rel.direction is MANYTOONE: + local_col, _ = view.rel.obj.local_remote_pairs[0] + resid = view.request.json_body['data'] + if resid is None: + setattr(obj, view.relname, None) + else: + if resid['type'] != view.rel_view.collection_name: + raise HTTPConflict( + "Resource identifier type '{}' does not match relationship type '{}'.".format( + resid['type'], + view.rel_view.collection_name + ) + ) + setattr( + obj, + local_col.name, + resid['id'] + ) + try: + view.dbsession.flush() + except sqlalchemy.exc.IntegrityError as exc: + raise HTTPFailedDependency( + 'Object {}/{} does not exist.'.format(resid['type'], resid['id']) + ) + except sqlalchemy.exc.DataError as exc: + raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) + return wf.Doc() + items = [] + for resid in view.request.json_body['data']: + if resid['type'] != view.rel_view.collection_name: + raise HTTPConflict( + "Resource identifier type '{}' does not match relationship type '{}'.".format( + resid['type'], + view.rel_view.collection_name + ) + ) + try: + newitem = view.dbsession.query(view.rel_class).get(resid['id']) + except sqlalchemy.exc.DataError as exc: + raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) + if newitem is None: + raise HTTPFailedDependency("One or more objects POSTed to this relationship do not exist.") + items.append(newitem) + setattr(obj, view.relname, items) + obj = wf.execute_stage( + view, stages, 'before_write_item', obj + ) + try: + view.dbsession.flush() + except sqlalchemy.exc.IntegrityError as exc: + raise HTTPFailedDependency(str(exc)) + except sqlalchemy.orm.exc.FlushError as exc: + if str(exc).startswith("Can't flush None value"): + raise HTTPFailedDependency("One or more objects PATCHed to this relationship do not exist.") + else: + # Catch-all. Shouldn't reach here. 
+ raise # pragma: no cover + return wf.Doc() diff --git a/pyramid_jsonapi/workflow/loop/relationships_post.py b/pyramid_jsonapi/workflow/loop/relationships_post.py new file mode 100644 index 00000000..c2d1ff81 --- /dev/null +++ b/pyramid_jsonapi/workflow/loop/relationships_post.py @@ -0,0 +1,62 @@ +import pyramid_jsonapi.workflow as wf +import sqlalchemy + +from pyramid.httpexceptions import ( + HTTPInternalServerError, + HTTPBadRequest, + HTTPForbidden, + HTTPConflict, + HTTPFailedDependency, +) +from sqlalchemy.orm.interfaces import ( + ONETOMANY, + MANYTOMANY, + MANYTOONE, +) +from . import stages + + +def workflow(view, stages): + if view.rel.direction is MANYTOONE: + raise HTTPForbidden('Cannot POST to TOONE relationship link.') + + # Alter data with any callbacks + data = view.request.json_body['data'] + + obj = view.dbsession.query(view.model).get(view.obj_id) + items = [] + for resid in data: + if resid['type'] != view.rel_view.collection_name: + raise HTTPConflict( + "Resource identifier type '{}' does not match relationship type '{}'.".format( + resid['type'], view.rel_view.collection_name + ) + ) + try: + newitem = view.dbsession.query(view.rel_class).get(resid['id']) + except sqlalchemy.exc.DataError as exc: + raise HTTPBadRequest("invalid id '{}'".format(resid['id'])) + if newitem is None: + raise HTTPFailedDependency("One or more objects POSTed to this relationship do not exist.") + items.append(newitem) + getattr(obj, view.relname).extend(items) + obj = wf.execute_stage( + view, stages, 'before_write_item', obj + ) + try: + view.dbsession.flush() + except sqlalchemy.exc.IntegrityError as exc: + if 'duplicate key value violates unique constraint' in str(exc): + # This happens when using an association proxy if we attempt to + # add an object to the relationship that's already there. We + # want this to be a no-op. 
+ pass + else: + raise HTTPFailedDependency(str(exc)) + except sqlalchemy.orm.exc.FlushError as exc: + if str(exc).startswith("Can't flush None value"): + raise HTTPFailedDependency("One or more objects POSTed to this relationship do not exist.") + else: + # Catch-all. Shouldn't reach here. + raise # pragma: no cover + return wf.Doc() diff --git a/setup.py b/setup.py index efe04d78..f89ed93a 100644 --- a/setup.py +++ b/setup.py @@ -14,6 +14,7 @@ 'pyramid_mako', 'pyramid_settings_wrapper', 'pyyaml>=5.1', # openapi-spec-validator requires >= 5.1 + 'rqlalchemy', 'SQLAlchemy', ] diff --git a/test_project/development.ini b/test_project/development.ini index 652922dd..0b482636 100644 --- a/test_project/development.ini +++ b/test_project/development.ini @@ -27,6 +27,22 @@ pyramid_jsonapi.route_pattern_prefix = pyramid_jsonapi.paging_default_limit = 10 pyramid_jsonapi.paging_max_limit = 100 pyramid_jsonapi.allow_client_ids = true +pyramid_jsonapi.transaction_isolation_level = SERIALIZABLE +pyramid_jsonapi.load_strategy = loop +pyramid_jsonapi.save_strategy = loop +pj_wf_modules = pyramid_jsonapi.workflow +pj_wf_load_modules = %(pj_wf_modules)s.%(pyramid_jsonapi.load_strategy)s +pj_wf_save_modules = %(pj_wf_modules)s.%(pyramid_jsonapi.save_strategy)s +pyramid_jsonapi.workflow_get = %(pj_wf_load_modules)s.get +pyramid_jsonapi.workflow_patch = %(pj_wf_save_modules)s.patch +pyramid_jsonapi.workflow_delete = %(pj_wf_save_modules)s.delete +pyramid_jsonapi.workflow_collection_get = %(pj_wf_load_modules)s.collection_get +pyramid_jsonapi.workflow_collection_post = %(pj_wf_save_modules)s.collection_post +pyramid_jsonapi.workflow_related_get = %(pj_wf_load_modules)s.related_get +pyramid_jsonapi.workflow_relationships_get = %(pj_wf_load_modules)s.relationships_get +pyramid_jsonapi.workflow_relationships_post = %(pj_wf_save_modules)s.relationships_post +pyramid_jsonapi.workflow_relationships_patch = %(pj_wf_save_modules)s.relationships_patch 
+pyramid_jsonapi.workflow_relationships_delete = %(pj_wf_save_modules)s.relationships_delete # By default, the toolbar only appears for clients from IP addresses diff --git a/test_project/setup.py b/test_project/setup.py index d14c437f..92204099 100644 --- a/test_project/setup.py +++ b/test_project/setup.py @@ -1,4 +1,6 @@ +import glob import os +import re from setuptools import setup, find_packages @@ -6,7 +8,19 @@ README = open(os.path.join(here, 'README.txt')).read() CHANGES = open(os.path.join(here, 'CHANGES.txt')).read() + +def local_ltree_pkg(): + list_of_files = glob.glob('/home/chiggs1/git/ltree_models/dist/*.tar.gz') + return max(list_of_files, key=os.path.getctime) + +def ltree_version(path): + fname = os.path.basename(path) + match = re.search(r'ltree_models-(.*)\.tar\.gz', fname) + return match.group(1) + requires = [ + # f'ltree @ file://localhost{local_ltree_pkg()}', + 'ltree_models', 'openapi_spec_validator', 'psycopg2', 'pyramid', diff --git a/test_project/test_project/__init__.py b/test_project/test_project/__init__.py index 828e7ad7..4056bf63 100644 --- a/test_project/test_project/__init__.py +++ b/test_project/test_project/__init__.py @@ -5,6 +5,7 @@ # The jsonapi module. import pyramid_jsonapi +import pyramid_jsonapi.workflow as wf # Import models as a module: needed for create_jsonapi... from . 
import models @@ -17,7 +18,7 @@ test_settings = { 'models_iterable': { 'module': models, - 'list': [models.Person, models.Blog], + 'list': [models.Blog, models.Person, models.Post], 'composite_key': [models2.CompositeKey] } } @@ -27,41 +28,14 @@ def datetime_adapter(obj, request): return obj.isoformat() - -def person_callback_add_information(view, ret): - param = view.request.params.get( - 'fields[{}]'.format(view.collection_name) - ) - if param is None: - requested_fields = {'name_copy', 'age'} - else: - requested_fields = view.requested_field_names - if 'name_copy' in requested_fields: - ret.attributes['name_copy'] = ret.attributes['name'] - if 'age' in requested_fields: - ret.attributes['age'] = 42 - return ret - - -def person_allowed_fields(self): - if self.request.method == 'GET': - return set(self.fields) | {'name_copy'} - else: - return set(self.fields) - - -def person_allowed_object(self, obj): - if self.request.method == 'GET': - try: - name = obj.attributes['name'] - except KeyError: - return True - if name == 'secret_squirrel': - return False - else: - return True - else: - return True +# Make sure the schema generator understands some types from sqlalchemy_utils. +import sqlalchemy_utils +import alchemyjsonschema +alchemyjsonschema.default_column_to_schema.update( + { + sqlalchemy_utils.LtreeType: "string" + } +) def main(global_config, **settings): @@ -96,17 +70,32 @@ def main(global_config, **settings): # Create the routes and views automagically. 
pj.create_jsonapi_using_magic_and_pixie_dust() - person_view = pj.view_classes[ - models.Person - ] - person_view.callbacks['after_serialise_object'].appendleft( - person_callback_add_information - ) - person_view.allowed_fields = property(person_allowed_fields) - person_view.allowed_object = person_allowed_object - pj.append_callback_set_to_all_views( - 'access_control_serialised_objects' - ) + person_view = pj.view_classes[models.Person] + blogs_view = pj.view_classes[models.Blog] + def sh_add_some_info(doc, view, stage, view_method): + doc['meta']['added'] = 'some random info' + return doc + + # Add some random information via the alter_document stage. + person_view.get.stages['alter_document'].append(sh_add_some_info) + + # Apply GET permission handlers at the alter_direct_results and + # alter_related_results stages. + # pj.enable_permission_handlers('get', ['alter_direct_results', 'alter_related_results']) + + # Add permission filters to do the logic of accepting or rejecting items. + # person_view.register_permission_filter( + # ['get'], + # ['alter_direct_results', 'alter_related_results'], + # lambda obj, *args, **kwargs: obj.object.name != 'alice', + # ) + # blogs_view.register_permission_filter( + # ['get'], + # ['alter_direct_results', 'alter_related_results'], + # lambda obj, *args, **kwargs: obj.object.id != 3, + # ) # Back to the usual pyramid stuff. 
- return config.make_wsgi_app() + app = config.make_wsgi_app() + app.pj = pj + return app diff --git a/test_project/test_project/models.py b/test_project/test_project/models.py index 9896f112..e36c328c 100644 --- a/test_project/test_project/models.py +++ b/test_project/test_project/models.py @@ -1,3 +1,6 @@ +from ltree_models import ( + LtreeMixin, +) from sqlalchemy import ( Table, Column, @@ -9,21 +12,26 @@ ForeignKey, UniqueConstraint, CheckConstraint, - func + func, + select, ) from sqlalchemy.dialects.postgresql import JSONB - from sqlalchemy.ext.associationproxy import association_proxy from sqlalchemy.ext.declarative import declarative_base from sqlalchemy.ext.hybrid import hybrid_property, hybrid_method - from sqlalchemy.orm import ( scoped_session, sessionmaker, relationship, - backref + backref, + foreign, + remote, ) - +from sqlalchemy.orm.interfaces import ( + ONETOMANY, + MANYTOMANY, + MANYTOONE, +) from zope.sqlalchemy import register DBSession = scoped_session(sessionmaker()) @@ -50,12 +58,16 @@ class Person(Base): __tablename__ = 'people' id = IdColumn() name = Column(Text) + age = Column(Integer) invisible = Column(Text) + @hybrid_property + def invisible_hybrid(self): + return 'boo!' + blogs = relationship('Blog', backref='owner') posts = relationship('Post', backref='author') comments = relationship('Comment', backref='author') invisible_comments = relationship('Comment') - articles_by_assoc = relationship( "ArticleByAssoc", secondary=authors_articles_assoc, @@ -66,17 +78,22 @@ class Person(Base): cascade='all, delete-orphan', backref='author' ) - articles_by_proxy = association_proxy('article_associations', 'article') + # A relationship that doesn't join along the usual fk -> pk lines. + blogs_from_titles = relationship( + 'Blog', + primaryjoin="remote(Blog.title).like('%' + foreign(Person.name))", + viewonly=True, + uselist=True, + ) + - @hybrid_property - def invisible_hybrid(self): - return 'boo!' 
# make invisible columns invisible to API invisible.info.update({'pyramid_jsonapi': {'visible': False}}) invisible_hybrid.info.update({'pyramid_jsonapi': {'visible': False}}) invisible_comments.info.update({'pyramid_jsonapi': {'visible': False}}) + class Blog(Base): __tablename__ = 'blogs' __table_args__ = ( @@ -86,7 +103,6 @@ class Blog(Base): id = IdColumn() title = Column(Text) owner_id = IdRefColumn('people.id') - posts = relationship('Post', backref='blog') # A read only hybrid property @hybrid_property def owner_name(self): @@ -96,16 +112,33 @@ def owner_name(self): # No owner return None + posts = relationship('Post', backref='blog') + # Using a hybrid property as a ONETOMANY relationship. + @hybrid_property + def posts_authors(self): + # Return the authors of all of the posts (as objects, like a relationship) + authors = set() + for post in self.posts: + authors.add(post.author) + return list(authors) + posts_authors.info['pyramid_jsonapi'] = { + 'relationship': { + 'direction': ONETOMANY, + 'queryable': False, + 'tgt_class': 'Person', + } + } + + class Post(Base): __tablename__ = 'posts' id = IdColumn() title = Column(Text) content = Column(Text) - published_at = Column(DateTime, nullable=False) + published_at = Column(DateTime, nullable=False, server_default=func.now()) json_content = Column(JSONB) blog_id = IdRefColumn('blogs.id') author_id = IdRefColumn('people.id', nullable=False) - comments = relationship('Comment', backref = 'post') # A read-write hybrid property @hybrid_property def author_name(self): @@ -120,6 +153,21 @@ def author_name(self): def author_name(self, name): self.author.name = name + comments = relationship('Comment', backref = 'post') + # Using a hybrid property as a MANYTOONE relationship. 
+ @hybrid_property + def blog_owner(self): + # Return the owner of the blog this post is in (as an object, like a + # relationship) + return self.blog.owner + blog_owner.info['pyramid_jsonapi'] = { + 'relationship': { + 'direction': MANYTOONE, + 'queryable': False, + 'tgt_class': Person, + } + } + class Comment(Base): __tablename__ = 'comments' @@ -178,6 +226,7 @@ class ArticleByObj(Base): cascade='all, delete-orphan', backref='article' ) + authors_by_proxy = association_proxy('author_associations', 'author') class ArticleAuthorAssociation(Base): @@ -235,3 +284,19 @@ class TreeNode(Base): children = relationship("TreeNode", backref=backref('parent', remote_side=[id]) ) + + +class PersonView(Base): + __table__ = select(Person).subquery() + + posts = relationship('Post', backref='view_author') + + __pyramid_jsonapi__ = { + 'collection_name': 'view_people', + } + + +class LtreeNode(Base, LtreeMixin): + __tablename__ = 'ltree_nodes' + + id = IdColumn() diff --git a/test_project/test_project/query_tests.py b/test_project/test_project/query_tests.py new file mode 100644 index 00000000..4d4cafda --- /dev/null +++ b/test_project/test_project/query_tests.py @@ -0,0 +1,171 @@ +import ltree_models +import sqlalchemy +from sqlalchemy import ( + create_engine, +) +from sqlalchemy.orm import ( + aliased, +) +import testing.postgresql +from test_project import ( + test_data +) +from test_project.models import ( + DBSession, + ArticleAuthorAssociation, + ArticleByAssoc, + ArticleByObj, + Base, + Blog, + Person, + LtreeNode, + TreeNode, +) +import transaction +import unittest + +def setUpModule(): + '''Create a test DB and import data.''' + # Create a new database somewhere in /tmp + global postgresql + global engine + postgresql = testing.postgresql.Postgresql(port=7654) + engine = create_engine(postgresql.url()) + ltree_models.add_ltree_extension(engine) + DBSession.configure(bind=engine) + + +def tearDownModule(): + '''Throw away test DB.''' + global postgresql + 
DBSession.close() + postgresql.stop() + + +class DBTestBase(unittest.TestCase): + + def setUp(self): + Base.metadata.create_all(engine) + # Add some basic test data. + test_data.add_to_db(engine) + transaction.begin() + + def tearDown(self): + transaction.abort() + Base.metadata.drop_all(engine) + + +class IllustrateRelatedQueries(DBTestBase): + + def test_fk_one_to_many(self): + query = DBSession.query(Blog).select_from(Person).join( + Person.blogs + ).filter( + Person.id == '1' + ) + alice = DBSession.query(Person).get('1') + self.assertEqual(query.all(), alice.blogs) + + def test_fk_many_to_one(self): + query = DBSession.query(Person).select_from(Blog).join( + Blog.owner + ).filter( + Blog.id == '1' + ) + self.assertEqual(query.one(), DBSession.query(Person).get('1')) + + def test_fk_many_to_many_assoc_table(self): + query = DBSession.query(ArticleByAssoc).select_from(Person).join( + Person.articles_by_assoc + ).filter( + Person.id == '11' + ) + person11 = DBSession.query(Person).get('11') + self.assertEqual(query.all(), person11.articles_by_assoc) + query = DBSession.query(ArticleByAssoc).select_from(Person).join( + Person.articles_by_assoc + ).filter( + Person.id == '12' + ) + person12 = DBSession.query(Person).get('12') + self.assertEqual(query.all(), person12.articles_by_assoc) + + def test_fk_many_to_many_assoc_proxy(self): + rel = sqlalchemy.inspect(Person).all_orm_descriptors['articles_by_proxy'] + proxy = rel.for_class(Person) + # print(proxy.local_attr) + # print(proxy.remote_attr) + query = DBSession.query(ArticleByObj).select_from(Person).join( + # Person.article_associations + proxy.local_attr + ).join( + # ArticleAuthorAssociation.article + proxy.remote_attr + ).filter( + Person.id == '12' + ) + person12 = DBSession.query(Person).get('12') + self.assertEqual( + [aa.article for aa in person12.article_associations], + query.all() + ) + + def test_fk_self_one_to_many(self): + tn2 = aliased(TreeNode) + query = 
DBSession.query(TreeNode).select_from(tn2).join( + tn2.children + ).filter( + tn2.id == '1' + ) + root = DBSession.query(TreeNode).get('1') + self.assertEqual(query.all(), root.children) + + def test_fk_self_many_to_one(self): + tn2 = aliased(TreeNode) + query = DBSession.query(TreeNode).select_from(tn2).join( + tn2.parent + ).filter( + tn2.id == '2' + ) + child = DBSession.query(TreeNode).get('2') + self.assertEqual(query.one(), child.parent) + + def test_join_condition_one_to_many(self): + query = DBSession.query(Blog).select_from(Person).join( + Person.blogs_from_titles + ).filter( + Person.id == '1' + ) + alice = DBSession.query(Person).get('1') + self.assertEqual(query.all(), alice.blogs_from_titles) + + def test_ltree_node_children(self): + lt2 = aliased(LtreeNode) + query = DBSession.query(LtreeNode).select_from(lt2).join( + lt2.children + ).filter( + lt2.id == '1' + ) + root = DBSession.query(LtreeNode).get('1') + self.assertEqual(query.all(), root.children) + + def test_ltree_node_parent(self): + lt2 = aliased(LtreeNode) + query = DBSession.query(LtreeNode).select_from(lt2).join( + lt2.parent + ).filter( + lt2.id == '2' + ) + child = DBSession.query(LtreeNode).get('2') + self.assertEqual(query.one(), child.parent) + + def test_ltree_node_ancestors(self): + lt2 = aliased(LtreeNode) + query = DBSession.query(LtreeNode).select_from(lt2).join( + lt2.ancestors + ).filter( + lt2.node_name == 'r.1.2' + ) + node = DBSession.query(LtreeNode).filter(LtreeNode.node_name == 'r.1.2').one() + # self.assertEqual(query.all(), root.children) + print(query.all()) diff --git a/test_project/test_project/test_data.json b/test_project/test_project/test_data.json index 73fc55a7..c9324e43 100644 --- a/test_project/test_project/test_data.json +++ b/test_project/test_project/test_data.json @@ -4,11 +4,13 @@ [ { "id": "1", - "name": "alice" + "name": "alice", + "age": 21 }, { "id": "2", - "name": "bob" + "name": "bob", + "age": 42 }, { "id": "3", @@ -16,7 +18,8 @@ }, { "id": "4", - 
"name": "secret_squirrel" + "name": "secret_squirrel", + "age": 63 }, { "id": "10", diff --git a/test_project/test_project/test_data.py b/test_project/test_project/test_data.py index d708a249..a2f6e86c 100644 --- a/test_project/test_project/test_data.py +++ b/test_project/test_project/test_data.py @@ -1,5 +1,8 @@ import sqlalchemy -from sqlalchemy import func +from sqlalchemy import ( + delete, + func, +) import transaction from test_project import models from test_project.models import ( @@ -12,6 +15,8 @@ from pathlib import Path import re +import ltree + def add_to_db(engine): '''Add some basic test data.''' meta = sqlalchemy.MetaData() @@ -46,6 +51,11 @@ def add_to_db(engine): if not rows: DBSession.execute(table.insert(), assoc) + DBSession.execute(delete(models.LtreeNode.__table__)) + + lbuilder = ltree.LtreeBuilder(DBSession.bind, models.LtreeNode) + lbuilder.populate(2, 5) + def item_transform(item): '''Transform item prior to saving to database. @@ -77,8 +87,8 @@ def set_item(model, data, opts): keycol = keycols[0] item = DBSession.query(model).get(data[keycol.name]) if item: - DBSession.query(model)\ - .filter(keycol == data[keycol.name]).update(data) + for key, val in data.items(): + setattr(item, key, val) else: item = model(**data) DBSession.add(item) diff --git a/test_project/test_project/tests.py b/test_project/test_project/tests.py index 6e9fcd50..ffc3d139 100644 --- a/test_project/test_project/tests.py +++ b/test_project/test_project/tests.py @@ -18,13 +18,16 @@ import warnings import json from parameterized import parameterized -import pyramid_jsonapi.jsonapi import pyramid_jsonapi.metadata from openapi_spec_validator import validate_spec +import pprint +import ltree from test_project.models import ( DBSession, - Base + Base, + Person, + Blog, ) from test_project import test_data @@ -83,6 +86,19 @@ ) +class MyTestApp(webtest.TestApp): + + def _check_status(self, status, res): + try: + super()._check_status(status, res) + except webtest.AppError as 
e: + errors = res.json_body.get('errors', [{}]) + raise webtest.AppError( + '%s\n%s', + errors, res.json_body.get('traceback') + ) + + def setUpModule(): '''Create a test DB and import data.''' # Create a new database somewhere in /tmp @@ -90,6 +106,7 @@ def setUpModule(): global engine postgresql = testing.postgresql.Postgresql(port=7654) engine = create_engine(postgresql.url()) + ltree.add_ltree_extension(engine) DBSession.configure(bind=engine) @@ -99,11 +116,16 @@ def tearDownModule(): DBSession.close() postgresql.stop() + def rels_doc_func(func, i, param): src, tgt, comment = param[0] return '{}:{}/{} ({})'.format(func.__name__, src.collection, src.rel, comment) +def make_ri(_type, _id): + return { 'type': _type, 'id': _id } + + class DBTestBase(unittest.TestCase): _test_app = None @@ -123,7 +145,7 @@ def tearDown(self): Base.metadata.drop_all(engine) def test_app(self, options=None): - if (not options) and self._test_app: + if (options is None) and self._test_app: # If there are no options and we have a cached app, return it. return self._test_app return self.new_test_app(options) @@ -146,7 +168,9 @@ def new_test_app(options=None): "ignore", category=SAWarning ) - test_app = webtest.TestApp(get_app(config_path)) + app = get_app(config_path) + test_app = MyTestApp(app) + test_app._pj_app = app if options: os.remove(config_path) return test_app @@ -160,6 +184,749 @@ def evaluate_filter(self, att_val, op, test_val): class TestTmp(DBTestBase): '''To isolate tests so they can be run individually during development.''' + # def test_ep_map(self): + # test_app = self.test_app({}) + # pj = test_app._pj_app.pj + # print(pj.endpoint_data.http_to_view_methods) + + # @parameterized.expand(rel_infos[1:2], doc_func=rels_doc_func) + # def test_rels_related_get(self, src, tgt, comment): + # ''''related' link should fetch related resource(s). 
+ # + # If present, a related resource link MUST reference a valid URL, even if + # the relationship isn’t currently associated with any target resources. + # ''' + # # Fetch item 1 from the collection + # r = self.test_app().get('/{}/1'.format(src.collection)) + # item = r.json['data'] + # + # # Fetch the related url. + # url = item['relationships'][src.rel]['links']['related'] + # data = self.test_app().get(url).json['data'] + # + # # Check that the returned data is of the expected type. + # if tgt.many: + # self.assertIsInstance(data, list) + # for related_item in data: + # self.assertEqual(related_item['type'], tgt.collection) + # else: + # self.assertIsInstance(data, dict) + # self.assertEqual(data['type'], tgt.collection) + + +class TestPermissions(DBTestBase): + '''Test permission handling mechanisms. + ''' + + def test_get_alter_result_item(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to see alice (people/1) + pj.view_classes[test_project.models.Person].register_permission_filter( + ['read'], + ['alter_result'], + lambda obj, *args, **kwargs: obj.object.name != 'alice', + ) + # Shouldn't be allowed to see people/1 (alice) + test_app.get('/people/1', status=403) + # Should be able to see people/2 (bob) + test_app.get('/people/2') + + def test_get_alter_result_item_individual_attributes(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + def pfilter(obj, *args, **kwargs): + if obj.object.name == 'alice': + return {'attributes': {'name'}, 'relationships': True} + else: + return True + pj.view_classes[test_project.models.Person].register_permission_filter( + ['get'], + ['alter_result', ], + pfilter, + ) + # Alice should have attribute 'name' but not 'age'. 
+ alice = test_app.get('/people/1').json_body['data'] + self.assertIn('name', alice['attributes']) + self.assertNotIn('age', alice['attributes']) + + def test_get_alter_result_item_individual_rels(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + def pfilter(obj, *args, **kwargs): + if obj.object.name == 'alice': + return {'attributes': True, 'relationships': {'blogs'}} + else: + return True + pj.view_classes[test_project.models.Person].register_permission_filter( + ['get'], + ['alter_result', ], + pfilter, + ) + # Alice should have relationship 'blogs' but not 'posts'. + alice = test_app.get('/people/1').json_body['data'] + self.assertIn('blogs', alice['relationships']) + self.assertNotIn('posts', alice['relationships']) + + def test_get_alter_result_item_rel_ids(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to see blogs/1 (one of alice's 2 blogs) + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['get'], + ['alter_result', ], + lambda obj, *args, **kwargs: obj.object.id != 1, + ) + alice = test_app.get('/people/1').json_body['data'] + alice_blogs = alice['relationships']['blogs']['data'] + self.assertIn({'type': 'blogs', 'id': '2'}, alice_blogs) + self.assertNotIn({'type': 'blogs', 'id': '1'}, alice_blogs) + + def test_get_alter_result_item_included_items(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to see blogs/1 (one of alice's 2 blogs) + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['get'], + ['alter_result', ], + lambda obj, *args, **kwargs: obj.object.id != 1, + ) + included = test_app.get('/people/1?include=blogs').json_body['included'] + included_blogs = { + item['id'] for item in included if item['type'] == 'blogs' + } + self.assertNotIn('1', included_blogs) + self.assertIn('2', included_blogs) + + def test_get_alter_result_collection(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to 
see alice (people/1) + pj.view_classes[test_project.models.Person].register_permission_filter( + ['get'], + ['alter_result', ], + lambda obj, *args, **kwargs: obj.object.name != 'alice', + ) + # Make sure we get the lowest ids with a filter. + ret = test_app.get('/people?filter[id:lt]=3').json_body + people = ret['data'] + ppl_ids = { person['id'] for person in people } + self.assertNotIn('1', ppl_ids) + self.assertIn('2', ppl_ids) + + def test_get_alter_result_collection_meta_info(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to see alice (people/1) + pj.view_classes[test_project.models.Person].register_permission_filter( + ['get'], + ['alter_result', ], + lambda obj, *args, **kwargs: obj.object.name != 'alice', + ) + # Make sure we get the lowest ids with a filter. + res = test_app.get('/people?filter[id:lt]=3').json_body + meta = res['meta'] + self.assertIn('people::1', meta['rejected']['objects']) + + def test_related_get_alter_result(self): + ''' + 'related' link should fetch only allowed related resource(s). 
+ ''' + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to see blog with title 'main: alice' (aka blogs/1) + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['get'], + ['alter_result', ], + lambda obj, *args, **kwargs: obj.object.title != 'main: alice', + ) + r = test_app.get('/people/1/blogs').json_body + data = r['data'] + ids = {o['id'] for o in data} + self.assertIsInstance(data, list) + self.assertNotIn('1', ids) + + # Not allowed to see alice (people/1) + pj.view_classes[test_project.models.Person].register_permission_filter( + ['get'], + ['alter_result', ], + lambda obj, *args, **kwargs: obj.object.name != 'alice', + ) + r = test_app.get('/blogs/2/owner', status=403) + + def test_post_alterreq_collection(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + # Not allowed to post the name "forbidden" + def pfilter(obj, *args, **kwargs): + return obj['attributes'].get('name') != 'forbidden' + pj.view_classes[test_project.models.Person].register_permission_filter( + ['post'], + ['alter_request'], + pfilter, + ) + # Make sure we can't post the forbidden name. + test_app.post_json( + '/people', + { + 'data': { + 'type': 'people', + 'attributes': { + 'name': 'forbidden' + } + } + }, + headers={'Content-Type': 'application/vnd.api+json'}, + status=403 + ) + # Make sure we can post some other name. + test_app.post_json( + '/people', + { + 'data': { + 'type': 'people', + 'attributes': { + 'name': 'allowed' + } + } + }, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + + def test_post_alterreq_collection_with_rels(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + def blogs_pfilter(obj, *args, **kwargs): + return {'attributes': True, 'relationships': True} + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['post'], + ['alter_request'], + blogs_pfilter, + ) + # /people: allow POST to all atts and to 3 relationships. 
+ def people_pfilter(obj, *args, **kwargs): + return { + 'attributes': True, + 'relationships': { + 'comments', 'articles_by_proxy', 'articles_by_assoc' + } + } + pj.view_classes[test_project.models.Person].register_permission_filter( + ['post'], + ['alter_request'], + people_pfilter, + ) + # /comments: allow PATCH (required to set 'comments.author') on all + # but comments/4. + pj.view_classes[test_project.models.Comment].register_permission_filter( + ['patch'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'4'} + ) + # /articles_by_assoc: allow POST (required to add people/new to + # 'articles_by_assoc.authors') on all but articles_by_assoc/11. + pj.view_classes[test_project.models.ArticleByAssoc].register_permission_filter( + ['post'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'11'} + ) + pj.view_classes[test_project.models.ArticleByObj].register_permission_filter( + ['post'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'10'} + ) + person_in = { + 'data': { + 'type': 'people', + 'attributes': { + 'name': 'post perms test' + }, + 'relationships': { + 'posts': { + 'data': [ + {'type': 'posts', 'id': '20'}, + {'type': 'posts', 'id': '21'} + ] + }, + 'comments': { + 'data': [ + {'type': 'comments', 'id': '4'}, + {'type': 'comments', 'id': '5'}, + ] + }, + 'articles_by_assoc': { + 'data': [ + {'type': 'articles_by_assoc', 'id': '10'}, + {'type': 'articles_by_assoc', 'id': '11'}, + ] + }, + 'articles_by_proxy': { + 'data': [ + {'type': 'articles_by_obj', 'id': '10'}, + {'type': 'articles_by_obj', 'id': '11'}, + ] + } + } + } + } + person_out = test_app.post_json( + '/people', + person_in, + headers={'Content-Type': 'application/vnd.api+json'}, + ).json_body['data'] + rels = person_out['relationships'] + self.assertEqual(len(rels['posts']['data']),0) + self.assertIn({'type': 'comments', 'id': '5'}, rels['comments']['data']) + self.assertNotIn({'type': 'comments', 'id': '4'}, 
rels['comments']['data']) + self.assertIn({'type': 'articles_by_assoc', 'id': '10'}, rels['articles_by_assoc']['data']) + self.assertNotIn({'type': 'articles_by_assoc', 'id': '11'}, rels['articles_by_assoc']['data']) + self.assertIn({'type': 'articles_by_obj', 'id': '11'}, rels['articles_by_proxy']['data']) + self.assertNotIn({'type': 'articles_by_obj', 'id': '10'}, rels['articles_by_proxy']['data']) + + # Still need to test a to_one relationship. Posts has one of those. + # Switching to " for quoting so that the following can be copy/pasted as + # JSON in manual tests. + post_json = { + "data": { + "type": "posts", + "attributes": { + "title": "test" + }, + "relationships": { + "author": { + "data": {"type": "people", "id": "10"} + }, + "blog": { + "data": {"type": "blogs", "id": "10"} + } + } + } + } + # The Person permission filter defined above shouldn't allow us to POST + # post_json because we don't have permission to POST to Person.posts. + test_app.post_json( + '/posts', + post_json, + headers={'Content-Type': 'application/vnd.api+json'}, + status=409 # this should probably be a different status. + ) + # Replace the permission filter for Person - we need to be able to + # alter the Person.posts relationship. + pj.view_classes[test_project.models.Person].register_permission_filter( + ['post'], + ['alter_request'], + lambda *a, **kw: True, + ) + post_out = test_app.post_json( + '/posts', + post_json, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + + def test_post_alterreq_relationship(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + def blogs_pfilter(obj, *args, **kwargs): + if obj['id'] == '12': + return False + else: + return {'attributes': True, 'relationships': True} + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['patch'], + ['alter_request'], + blogs_pfilter, + ) + # /people: allow POST to all atts and to 3 relationships. 
+ def people_pfilter(obj, *args, **kwargs): + if kwargs['permission'] == 'delete' and obj['id'] == '20': + return False + if kwargs['permission'] == 'post' and obj['id'] == '12': + return False + return { + 'attributes': True, + 'relationships': { + 'blogs', 'articles_by_proxy', 'articles_by_assoc' + } + } + pj.view_classes[test_project.models.Person].register_permission_filter( + ['post', 'delete'], + ['alter_request'], + people_pfilter, + ) + # /articles_by_assoc: allow POST (required to add people/new to + # 'articles_by_assoc.authors') on all but articles_by_assoc/11. + pj.view_classes[test_project.models.ArticleByAssoc].register_permission_filter( + ['post'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'11'} + ) + pj.view_classes[test_project.models.ArticleByObj].register_permission_filter( + ['post'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'10'} + ) + # ONETOMANY relationship. + out = test_app.post_json( + '/people/1/relationships/blogs', + { + 'data': [ + {'type': 'blogs', 'id': '10'}, + {'type': 'blogs', 'id': '11'}, + {'type': 'blogs', 'id': '12'}, + {'type': 'blogs', 'id': '20'}, + ] + }, + headers={'Content-Type': 'application/vnd.api+json'}, + ).json_body + # pprint.pprint(out) + + # Now fetch people/1 and see if the new blogs are there. + p1 = test_app.get('/people/1').json_body['data'] + blogs = p1['relationships']['blogs']['data'] + # Should have left the original blogs in place. + self.assertIn({'type': 'blogs', 'id': '1'}, blogs) + # Should have added blogs/10 (previously no owner) + self.assertIn({'type': 'blogs', 'id': '10'}, blogs) + # Should have added blogs/11 (previously owned by 11) + self.assertIn({'type': 'blogs', 'id': '11'}, blogs) + # blogs/12 disallowed by blogs filter. + self.assertNotIn({'type': 'blogs', 'id': '12'}, blogs) + # blogs/20 disallowed by people filter on people/20. + self.assertNotIn({'type': 'blogs', 'id': '20'}, blogs) + + # MANYTOMANY relationship. 
+ out = test_app.post_json( + '/people/1/relationships/articles_by_assoc', + { + 'data': [ + {'type': 'articles_by_assoc', 'id': '10'}, + {'type': 'articles_by_assoc', 'id': '11'}, + {'type': 'articles_by_assoc', 'id': '12'}, + ] + }, + headers={'Content-Type': 'application/vnd.api+json'}, + ).json_body + p1 = test_app.get('/people/1').json_body['data'] + articles = p1['relationships']['articles_by_assoc']['data'] + # Should have added articles_by_assoc/10 + self.assertIn({'type': 'articles_by_assoc', 'id': '10'}, articles) + # articles_by_assoc/11 disallowed by articles_by_assoc filter. + self.assertNotIn({'type': 'articles_by_assoc', 'id': '11'}, articles) + # articles_by_assoc/12 disallowed by people filter. + # self.assertNotIn({'type': 'articles_by_assoc', 'id': '12'}, articles) + + def test_patch_alterreq_item_with_rels(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + def blogs_pfilter(obj, **kwargs): + return {'attributes': True, 'relationships': True} + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['post'], + ['alter_request'], + blogs_pfilter, + ) + # /people: allow PATCH to all atts and to 3 relationships. + def people_pfilter(obj, **kwargs): + return { + 'attributes': True, + 'relationships': { + 'comments', 'articles_by_proxy', 'articles_by_assoc' + } + } + pj.view_classes[test_project.models.Person].register_permission_filter( + ['patch'], + ['alter_request'], + people_pfilter, + ) + # /comments: allow PATCH (required to set 'comments.author') on all + # but comments/4. + def comments_pfilter(obj, **kwargs): + if obj['id'] == '4' and obj['relationships']['author']['data']['id'] == '1': + # We're not allowing people/1 to be the author of comments/4 for + # some reason. 
+ return False + return True + pj.view_classes[test_project.models.Comment].register_permission_filter( + ['patch'], + ['alter_request'], + comments_pfilter + ) + # /articles_by_assoc: allow POST (required to add people/new to + # 'articles_by_assoc.authors') on all but articles_by_assoc/11. + pj.view_classes[test_project.models.ArticleByAssoc].register_permission_filter( + ['post'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'11'} + ) + pj.view_classes[test_project.models.ArticleByObj].register_permission_filter( + ['post'], + ['alter_request'], + lambda obj, *args, **kwargs: obj['id'] not in {'11'} + ) + person_in = { + 'data': { + 'type': 'people', + 'id': '1', + 'attributes': { + 'name': 'post perms test' + }, + 'relationships': { + 'posts': { + 'data': [ + {'type': 'posts', 'id': '1'}, + {'type': 'posts', 'id': '2'}, + {'type': 'posts', 'id': '3'}, + {'type': 'posts', 'id': '20'}, + ] + }, + 'comments': { + 'data': [ + {'type': 'comments', 'id': '1'}, + {'type': 'comments', 'id': '4'}, + {'type': 'comments', 'id': '5'}, + ] + }, + 'articles_by_assoc': { + 'data': [ + {'type': 'articles_by_assoc', 'id': '10'}, + {'type': 'articles_by_assoc', 'id': '11'}, + ] + }, + 'articles_by_proxy': { + 'data': [ + {'type': 'articles_by_obj', 'id': '1'}, + {'type': 'articles_by_obj', 'id': '10'}, + {'type': 'articles_by_obj', 'id': '11'}, + ] + } + } + } + } + test_app.patch_json( + '/people/1', + person_in, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + person_out = test_app.get('/people/1').json_body['data'] + rels = person_out['relationships'] + # pprint.pprint(rels['posts']['data']) + # pprint.pprint(rels['comments']['data']) + # pprint.pprint(rels['articles_by_assoc']['data']) + # pprint.pprint(rels['articles_by_proxy']['data']) + + # Still need to test a to_one relationship. Blogs has one of those. + def blogs_pfilter(obj, **kwargs): + if obj['id'] == '13': + # Not allowed to change blogs/13 at all. 
+ return False + if obj['id'] == '10': + # Not allowed to set owner of blogs/10 to people/13 + if obj['relationships']['owner']['data'].get('id') == '13': + # print('people/13 not allowed as owner of 10') + return {'attributes': True, 'relationships': {'posts'}} + if obj['id'] == '11': + # Not allowed to set owner of blogs/11 to None. + if obj['relationships']['owner']['data'] is None: + return {'attributes': True, 'relationships': {'posts'}} + return True + pj.view_classes[test_project.models.Blog].register_permission_filter( + ['patch'], + ['alter_request'], + blogs_pfilter + ) + blog = { + 'data': { + 'type': 'blogs', 'id': None, + 'relationships': { + 'owner': { + 'data': None + } + } + } + } + blog_owner = blog['data']['relationships']['owner'] + # /blogs/10 is owned by no-one. Change owner to people/11. Should + # Have permission for this one. + ppl11 = make_ri('people', '11') + blog['data']['id'] = '10' + blog_owner['data'] = ppl11 + self.assertNotEqual( + test_app.get('/blogs/10').json_body['data']['relationships']['owner']['data'], + ppl11 + ) + test_app.patch_json( + '/blogs/10', + blog, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + self.assertEqual( + test_app.get('/blogs/10').json_body['data']['relationships']['owner']['data'], + ppl11 + ) + # Not allowed to set blogs/10.owner to people/13 though. 
+ ppl13 = make_ri('people', '13') + blog_owner['data'] = ppl13 + test_app.patch_json( + '/blogs/10', + blog, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + self.assertNotEqual( + test_app.get('/blogs/10').json_body['data']['relationships']['owner']['data'], + ppl13 + ) + + # Should be able to switch ownership of blogs/11 to people/12 + ppl12 = make_ri('people', '12') + blog['data']['id'] = '11' + blog_owner['data'] = ppl12 + test_app.patch_json( + '/blogs/11', + blog, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + self.assertEqual( + test_app.get('/blogs/11').json_body['data']['relationships']['owner']['data'], + ppl12 + ) + # but not to None + blog_owner['data'] = None + test_app.patch_json( + '/blogs/11', + blog, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + self.assertNotEqual( + test_app.get('/blogs/11').json_body['data']['relationships']['owner']['data'], + None + ) + + # Shouldn't be allowed to patch blogs/13 at all. + blog['data']['id'] = '13' + test_app.patch_json( + '/blogs/13', + blog, + headers={'Content-Type': 'application/vnd.api+json'}, + status=403 + ) + + def test_patch_alterreq_relationships(self): + test_app = self.test_app({}) + pj = test_app._pj_app.pj + def people_pfilter(obj, **kwargs): + if obj['id'] == '1': + return False + if obj['id'] == '2': + return {'attributes': True, 'relationships': False} + return True + pj.view_classes[test_project.models.Person].register_permission_filter( + ['patch'], + ['alter_request'], + people_pfilter + ) + def blogs_pfilter(obj, **kwargs): + if obj['id'] == '10': + # Not allowed to change blogs/10 at all. + return False + if obj['id'] == '11': + # Not allowed to set owner of blogs/11 to None. 
+ if obj['relationships']['owner']['data'] is None:
+ return {'attributes': True, 'relationships': {'posts'}}
+ if obj['id'] == '12':
+ # Not allowed to set owner of blogs/12 to people/11
+ if obj['relationships']['owner']['data'].get('id') == '11':
+ return {'attributes': True, 'relationships': {'posts'}}
+ return True
+ pj.view_classes[test_project.models.Blog].register_permission_filter(
+ ['patch'],
+ ['alter_request'],
+ blogs_pfilter
+ )
+
+ # ONETOMANY tests
+ # No permission to patch people/1 at all.
+ test_app.patch_json(
+ '/people/1/relationships/blogs',
+ {
+ 'data': [
+ {'type': 'blogs', 'id': '10'},
+ ]
+ },
+ headers={'Content-Type': 'application/vnd.api+json'},
+ status=403
+ )
+ # No permission to patch relationship of people/2.
+ test_app.patch_json(
+ '/people/2/relationships/blogs',
+ {
+ 'data': [
+ {'type': 'blogs', 'id': '10'},
+ ]
+ },
+ headers={'Content-Type': 'application/vnd.api+json'},
+ status=403
+ )
+
+ test_app.patch_json(
+ '/people/11/relationships/blogs',
+ {
+ 'data': [
+ {'type': 'blogs', 'id': '10'},
+ {'type': 'blogs', 'id': '12'},
+ {'type': 'blogs', 'id': '13'},
+ ]
+ },
+ headers={'Content-Type': 'application/vnd.api+json'},
+ )
+ blog_ids = [
+ b['id'] for b in
+ test_app.get('/people/11').json_body['data']['relationships']['blogs']['data']
+ ]
+ # No permission to patch blogs/10
+ self.assertNotIn('10', blog_ids)
+ # No permission to set blogs/12.owner = people/11
+ self.assertNotIn('12', blog_ids)
+ # No permission to set blogs/11.owner = None
+ self.assertIn('11', blog_ids)
+ # Allowed to add blogs/13 :)
+ self.assertIn('13', blog_ids)
+
+ # MANYTOMANY tests
+ def articles_by_assoc_pfilter(obj, **kwargs):
+ if obj['id'] == '10':
+ # Not allowed to change articles_by_assoc/10 at all.
+ return False + if obj['id'] == '12': + # Not allowed to alter author of articles_by_assoc/12 + return {'attributes': True, 'relationships': False} + return True + pj.view_classes[test_project.models.ArticleByAssoc].register_permission_filter( + ['post', 'delete'], + ['alter_request'], + articles_by_assoc_pfilter + ) + test_app.patch_json( + '/people/12/relationships/articles_by_assoc', + { + 'data': [ + {'type': 'articles_by_assoc', 'id': '10'}, + {'type': 'articles_by_assoc', 'id': '1'}, + ] + }, + headers={'Content-Type': 'application/vnd.api+json'}, + ) + article_ids = [ + b['id'] for b in + test_app.get('/people/12').json_body['data']['relationships']['articles_by_assoc']['data'] + ] + # No permission to add 10 + self.assertNotIn('10', article_ids) + # Permission to remove 13 + self.assertNotIn('13', article_ids) + # No permission to remove 12 + self.assertIn('12', article_ids) + # Permission to add 1 + self.assertIn('1', article_ids) class TestRelationships(DBTestBase): '''Test functioning of relationsips. @@ -1504,8 +2271,11 @@ def test_spec_compound_full_linkage(self): # Find all the resource identifiers. rids = set() for rel in person['data']['relationships'].values(): - for item in rel['data']: - rids.add((item['type'], item['id'])) + if isinstance(rel['data'], list): + for item in rel['data']: + rids.add((item['type'], item['id'])) + else: + rids.add((rel['data']['type'], rel['data']['id'])) # Every included item should have an identifier somewhere. for item in person['included']: @@ -1568,7 +2338,7 @@ def test_spec_links(self): Note: only URL string links are currently generated by jsonapi. 
''' - links = self.test_app().get('/people').json['links'] + links = self.test_app().get('/people?pj_include_count=1').json['links'] self.assertIsInstance(links['self'], str) self.assertIsInstance(links['first'], str) self.assertIsInstance(links['last'], str) @@ -1639,6 +2409,25 @@ def test_spec_single_sort(self): prev = item['attributes']['content'] + def test_spec_related_sort(self): + '''Should return collection sorted by related field. + + Note: It is recommended that dot-separated (U+002E FULL-STOP, “.”) sort + fields be used to request sorting based upon relationship attributes. + For example, a sort field of author.name could be used to request that + the primary data be sorted based upon the name attribute of the author + relationship. + ''' + res = self.test_app().get('/posts?sort=author.name') + data = res.json['data'] + prev = '' + for item in data: + # author_name is a hybrid attribute that just happens to have + # author.name in it. + self.assertGreaterEqual(item['attributes']['author_name'], prev) + prev = item['attributes']['author_name'] + + def test_spec_multiple_sort(self): '''Should return collection sorted by multiple fields, applied in order. @@ -1700,7 +2489,9 @@ def test_spec_pagination_links(self): * prev: the previous page of data * next: the next page of data ''' - json = self.test_app().get('/posts?page[limit]=2&page[offset]=2').json + json = self.test_app().get( + '/posts?pj_include_count=1&page[limit]=2&page[offset]=2' + ).json self.assertEqual(len(json['data']), 2) self.assertIn('first', json['links']) self.assertIn('last', json['links']) @@ -1713,10 +2504,10 @@ def test_spec_pagination_unavailable_links(self): Keys MUST either be omitted or have a null value to indicate that a particular link is unavailable. 
''' - r = self.test_app().get('/posts?page[limit]=1') + r = self.test_app().get('/posts?pj_include_count=1&page[limit]=1') available = r.json['meta']['results']['available'] json = self.test_app().get( - '/posts?page[limit]=2&page[offset]=' + str(available - 2) + '/posts?pj_include_count=1&page[limit]=2&page[offset]=' + str(available - 2) ).json self.assertEqual(len(json['data']), 2) self.assertNotIn('next', json['links']) @@ -2051,7 +2842,7 @@ def test_spec_post_with_id_disallowed(self): test_app = self.test_app( options={'pyramid_jsonapi.allow_client_ids': 'false'} ) - test_app.post_json( + res = test_app.post_json( '/people', { 'data': { @@ -2500,6 +3291,34 @@ def test_hybrid_writeable_patch(self): self.assertEqual(data['attributes']['name'], 'alice2') +class TestHybridRelationships(DBTestBase): + '''Test cases for @hybrid_property relationships.''' + + def test_hybrid_rel_to_one_get(self): + '''Post should have a relationship called blog_owner''' + data = self.test_app().get('/posts/1').json['data'] + # Should have a relationship called blog_owner. + self.assertIn('blog_owner', data['relationships']) + # But not an attribute + self.assertNotIn('blog_owner', data['attributes']) + self.assertEqual( + data['relationships']['blog_owner']['data'], + {'type': 'people', 'id': '1'} + ) + + def test_hybrid_rel_to_many_get(self): + '''Blog should have a relationship called posts_authors''' + data = self.test_app().get('/blogs/1').json['data'] + # Should have a relationship called posts_authors. 
+ self.assertIn('posts_authors', data['relationships']) + # But not an attribute + self.assertNotIn('posts_authors', data['attributes']) + self.assertEqual( + data['relationships']['posts_authors']['data'], + [{'type': 'people', 'id': '1'}] + ) + + class TestJoinedTableInheritance(DBTestBase): '''Test cases for sqlalchemy joined table inheritance pattern.''' @@ -2568,7 +3387,7 @@ def test_feature_construct_with_models_list(self): test_app = self.test_app( options={'pyramid_jsonapi_tests.models_iterable': 'list'} ) - test_app.get('/people/1') + test_app.get('/blogs/1') def test_feature_debug_endpoints(self): '''Should create a set of debug endpoints for manipulating the database.''' @@ -2646,7 +3465,7 @@ def test_19_last_negative_offset(self): ''' # Need an empty collection: use a filter that will not match. last = self.test_app().get( - '/posts?filter[title:eq]=frog' + '/posts?pj_include_count=1&filter[title:eq]=frog' ).json['links']['last'] offset = int( urllib.parse.parse_qs( @@ -2700,101 +3519,9 @@ def test_association_proxy(self): data = self.test_app().get('/people/1').json['data'] self.assertIn('articles_by_proxy', data['relationships']) - def test_175_unsupported_method(self): - '''Should produce 405 Method Not Allowed on unsupported method.''' - self.test_app().head('/people/1', status=405) - - -class TestJSONAPI(unittest.TestCase): - - def test_asdict(self): - """Test asdict method.""" - expected_dict = {'links': {}, 'data': None, 'meta': {}, 'included': []} - doc = pyramid_jsonapi.jsonapi.Document() - self.assertEqual(doc.as_dict(), expected_dict) - - def test_filter_keys(self): - """Test filter_keys to modify dict output.""" - doc = pyramid_jsonapi.jsonapi.Document() - # Filter out links from result - del(doc.filter_keys['links']) - self.assertTrue('links' not in doc.as_dict()) - - def test_set_jsonapi_attribute(self): - """Test setting jsonapi values via class attributes.""" - new_links = {"self": "http://example.com"} - doc = 
pyramid_jsonapi.jsonapi.Document() - doc.links = new_links - self.assertEqual(doc._jsonapi['links'], new_links) - - def test_set_invalid_jsonapi_attribute(self): - """Test setting non-jsonapi class attributes.""" - doc = pyramid_jsonapi.jsonapi.Document() - doc.not_jsonapi = "cat" - # doesn't end up in _jsonapi - self.assertTrue('not_jsonapi' not in doc._jsonapi) - # Is actually a real class attribute - self.assertTrue(hasattr(doc, 'not_jsonapi')) - - def test_data_from_resources_item(self): - """Test creating 'data' json from a resources object as an item.""" - rsc_links = {"self": "http://example.com"} - doc = pyramid_jsonapi.jsonapi.Document() - # Empty rscs - data is None - self.assertIsNone(doc.data_from_resources()['data']) - rsc = pyramid_jsonapi.jsonapi.Resource() - rsc.links = rsc_links - # 1 item - data is a dict - doc.resources.append(rsc) - self.assertIsInstance(doc.data_from_resources()['data'], dict) - self.assertEqual(doc.data_from_resources()['data']['links'], rsc_links) - - def test_data_from_resources_collection(self): - """Test creating 'data' json from a resources object as a collection.""" - rsc_links = {"self": "http://example.com"} - doc = pyramid_jsonapi.jsonapi.Document(collection=True) - # data is a list, even if empty. 
- self.assertIsInstance(doc.data_from_resources()['data'], list) - rsc = pyramid_jsonapi.jsonapi.Resource() - rsc.links = rsc_links - doc.resources.append(rsc) - self.assertEqual(doc.data_from_resources()['data'][0]['links'], rsc_links) - - def test_data_to_resources_item(self): - """Test adding a single data resource to a document.""" - data = {'id':1, 'type': 'person', 'attributes':{}} - doc = pyramid_jsonapi.jsonapi.Document() - doc.data_to_resources(data) - #Should have appended a Resource to doc.resources - rsc = doc.resources[0] - self.assertIsInstance(rsc, pyramid_jsonapi.jsonapi.Resource) - self.assertEqual(rsc.id, 1) - - def test_data_to_resources_list(self): - """Test adding a list of data resources to a document.""" - data = [{'id':1, 'type': 'person', 'attributes':{}}, - {'id':2, 'type': 'person', 'attributes':{}}] - doc = pyramid_jsonapi.jsonapi.Document() - # Should have appended each data item to doc.resources - doc.data_to_resources(data) - rsc = doc.resources[1] - self.assertTrue(len(doc.resources) == 2) - self.assertEqual(rsc.id, 2) - - def test_update(self): - """Test update method creates resources and updates _jsonapi.""" - links = {'self': 'http://example.com'} - doc_dict = { - 'data': [{'id':1, 'type': 'person', 'attributes':{}}, - {'id':2, 'type': 'person', 'attributes':{}}], - 'links': links, - 'meta': {} - } - doc = pyramid_jsonapi.jsonapi.Document() - doc.update(doc_dict) - # Appends data to doc.resources, and updates _jsonapi for other attibutes - self.assertTrue(len(doc.resources) == 2) - self.assertEqual(doc._jsonapi['links'], links) + def test_175_head_method(self): + '''Should produce OK for HEAD request.''' + self.test_app().head('/people/1') class TestEndpoints(DBTestBase): @@ -3030,9 +3757,10 @@ def test_openapi_specification_view(self): """Test that specification view returns valid json.""" self.test_app().get('/metadata/OpenAPI/specification', status=200).json - def test_openapi_specification_valid(self): - """Test that the 
openapi specification returned is valid."""
- validate_spec(self.test_app().get('/metadata/OpenAPI/specification', status=200).json)
+ # def test_openapi_specification_valid(self):
+ # """Test that the openapi specification returned is valid."""
+ # validate_spec(self.test_app().get('/metadata/OpenAPI/specification', status=200).json)
+ # print(json.dumps(self.test_app().get('/metadata/OpenAPI/specification', status=200).json, indent=4))

def test_openapi_file(self):
"""Test providing openapi spec updates in a file."""
diff --git a/tox.ini b/tox.ini
index bfa216ac..5c978352 100644
--- a/tox.ini
+++ b/tox.ini
@@ -1,20 +1,15 @@
[tox]
-envlist=py3
+envlist=py3, docs, report
skip_install=true
skipsdist=true

-[testenv]
+[testenv:py3]
# Deps needed for code testing (actual deps are in setup.py)
deps=
coverage
coveralls
pycodestyle
pylint
- sphinx
- sphinx-rtd-theme
- travis-sphinx
-# Pass in TRAVIS tokens and GH_TOKEN for travis-sphinx
-passenv=TRAVIS TRAVIS_* GH_TOKEN
commands=
## Comands fail 'fast' so later commands won't run if any earlier one fails
# Install 'local' pyramid_jsonapi
@@ -24,12 +19,29 @@ commands=
pylint --errors-only --rcfile=.pylintrc pyramid_jsonapi
pycodestyle --ignore=E402,E501,W503,W504,E731 pyramid_jsonapi
# Call unittest from coverage (add --buffer to 'discover' to hide output from tests that pass)
- coverage run --source=pyramid_jsonapi -m unittest discover --verbose test_project
+ coverage run --source=pyramid_jsonapi -m unittest --verbose -f {posargs:test_project.tests}
# Generate coverage report
- coverage report -m
+ #coverage report -m
# Try to push coverage data to coveralls (ignore exit code as will fail if not on travis)
- coveralls
+
+[testenv:report]
+deps=
+ coverage
+commands=
+ coverage report -m
+
+[testenv:docs]
+# Deps needed for building the docs
+deps=
+ sphinx
+ sphinx-rtd-theme
+ travis-sphinx
+# Pass in TRAVIS tokens and GH_TOKEN for travis-sphinx
+passenv=TRAVIS TRAVIS_* GH_TOKEN
+commands=
+ pip 
install -e . # Build the sphinx docs (will push to gh-pages if tox is run by travis) - /bin/bash docs/sphinx.sh + docs: /bin/bash docs/sphinx.sh whitelist_externals= /bin/bash
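The permission-filter tests in this patch all rely on one return convention: a filter registered via `register_permission_filter` may return `False` (deny the operation on that object outright), `True` (allow everything), or a dict naming the allowed `attributes` and `relationships`. A minimal, standalone sketch of how such a return value can be expanded into explicit allowed-field sets — `normalise_permission` is a hypothetical helper for illustration, not pyramid_jsonapi's actual internals:

```python
# Sketch of the permission-filter return convention exercised above.
# A filter returns False (deny), True (allow all), or a dict such as
# {'attributes': True, 'relationships': {'blogs'}}.
# normalise_permission() is a hypothetical helper, not pyramid_jsonapi API.

def normalise_permission(result, all_attributes, all_relationships):
    """Expand a filter's return value into explicit allowed-field sets."""
    if result is False:
        return None  # operation denied outright
    if result is True:
        result = {'attributes': True, 'relationships': True}
    atts = result.get('attributes', False)
    rels = result.get('relationships', False)
    return {
        'attributes': set(all_attributes) if atts is True else set(atts or ()),
        'relationships': set(all_relationships) if rels is True else set(rels or ()),
    }


# A filter in the style of people_pfilter from the tests above.
def people_pfilter(obj, **kwargs):
    if kwargs.get('permission') == 'delete' and obj['id'] == '20':
        return False
    return {'attributes': True,
            'relationships': {'blogs', 'articles_by_assoc'}}


perm = normalise_permission(
    people_pfilter({'id': '1'}, permission='post'),
    all_attributes={'name'},
    all_relationships={'blogs', 'posts', 'articles_by_assoc'},
)
# perm == {'attributes': {'name'},
#          'relationships': {'blogs', 'articles_by_assoc'}}
```

Under this reading, the assertions in the tests follow directly: an object whose filter yields `None` (denied) is left out of the altered request, while a relationship absent from the allowed `relationships` set is silently dropped.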