
SimpleCacheDecorator and providesPerItemTtl lead to undocumented behavior #171

Open
dol opened this issue May 17, 2018 · 4 comments

dol commented May 17, 2018

The SimpleCacheDecorator returns false and doesn't store the item if the underlying storage doesn't have the staticTtl capability.
For testing purposes I switched from an APCu storage to the memory storage. Our tests were failing because the memory storage has no per-item TTL support, in other words staticTtl = false.

$this->providesPerItemTtl = $capabilities->getStaticTtl() && (0 < $capabilities->getMinTtl());

In our code base we were not checking the return value of CacheInterface::set(), which in the case of the memory storage returns false.

<?php
use Zend\Cache\StorageFactory;
use Zend\Cache\Psr\SimpleCache\SimpleCacheDecorator;

$storage = StorageFactory::factory([
    'adapter' => [
        'name'    => 'memory',
        'options' => ['ttl' => 3600],
    ],
]);

$cache = new SimpleCacheDecorator($storage);

$value = 'someValue';

// The memory adapter has no per-item TTL support, so set() refuses to store
// the item and returns false.
if (true === $cache->set('someKey', $value, 30)) {
    echo 'success';
} else {
    echo 'fail'; // this branch is hit
}

The documentation has no information about this behavior:

When setting values, whether single values or multiple, you can also optionally provide a Time To Live (TTL) value. This proxies to the underlying storage instance's options, temporarily resetting its TTL value for the duration of the operation. TTL values may be expressed as integers (in which case they represent seconds) or DateInterval instances. As examples:
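For reference, a minimal illustration of the documented intent (my own sketch, not the verbatim examples from the docs):

// TTL given as an integer number of seconds:
$cache->set('someKey', 'someValue', 30);

// TTL given as a DateInterval instance:
$cache->set('someKey', 'someValue', new DateInterval('PT30S'));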

In our case we set a TTL at the adapter level, but we also set the TTL per item to make sure the item keeps the same TTL even if we replace the underlying PSR-16 implementation.

Our workaround for the moment is to pass null as the per-item TTL for any Zend cache adapter that doesn't support staticTtl, as sketched below. This adds some unwanted conditional if/else logic and knowledge about the special behavior. There is also the problem that if the internal detection behind providesPerItemTtl changes in newer releases, we have to adapt to those changes.
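A minimal sketch of that workaround, assuming the caller also has access to the underlying storage (the helper name is hypothetical):

<?php
use Zend\Cache\Psr\SimpleCache\SimpleCacheDecorator;
use Zend\Cache\Storage\StorageInterface;

// Hypothetical helper: only pass a per-item TTL when the adapter supports it;
// otherwise pass null so the adapter's default TTL applies.
function setWithSafeTtl(SimpleCacheDecorator $cache, StorageInterface $storage, $key, $value, $ttl)
{
    $capabilities = $storage->getCapabilities();
    $supportsPerItemTtl = $capabilities->getStaticTtl() && 0 < $capabilities->getMinTtl();

    return $cache->set($key, $value, $supportsPerItemTtl ? $ttl : null);
}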

Possible solutions (either one of them or a combination):

  • If the TTL of the item is the same as the TTL of the storage adapter, allow the per-item TTL.
  • If no storage adapter TTL is set, silently ignore the item TTL and fall back to the storage default => this is what https://www.php-fig.org/psr/psr-16/#13-cache suggests.
  • Add a configuration flag to the SimpleCacheDecorator to handle the failure state (a rough sketch follows this list):
    • Ignore the failure when the storage does not support a per-item TTL => this helps raise awareness of the issue and can be added to the docs.
  • Expose the storage and providesPerItemTtl to allow for workarounds.
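A rough sketch of what such a configuration flag could look like (the flag is purely hypothetical; SimpleCacheDecorator does not currently accept such an argument):

// Hypothetical constructor flag, not an existing API.
$cache = new SimpleCacheDecorator($storage, /* ignoreUnsupportedPerItemTtl */ true);

// With the flag enabled, set() would fall back to the adapter's default TTL
// instead of returning false when per-item TTL is unsupported.
$cache->set('someKey', 'someValue', 30);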

Zegnat commented Jun 4, 2018

I just ran into this and it took a surprising amount of head scratching before I realised the per item TTL was to blame. Commenting mostly so I can keep track of this issue.

michalbundyra (Member) commented

@dol @Zegnat It might be fixed by #184, released yesterday. Can you please check version 2.8.3 and tell me if you still observe the issue?

Thanks!

marc-mabe (Member) commented

PSR-16

... If the underlying implementation does not support TTL, the user-specified TTL MUST be silently ignored. ...

I have the feeling that what PSR-16 defines here could be very, very dangerous. Here is a simple example that would result in a security issue:

use Psr\SimpleCache\CacheInterface;

function verifyAccessToken(CacheInterface $cache, $accessToken) {
    $accessTokenValidKey = 'access_token_valid_' . md5($accessToken);
    if ($cache->get($accessTokenValidKey) !== '1') {
        // verify access token by querying the authentication server
        // if invalid -> return false
        // if valid -> the authentication server returns the expiration ($expiresIn)
        $cache->set($accessTokenValidKey, '1', $expiresIn);
    }
    return true;
}

The time-to-live should define the maximum time for which this item is considered valid. In caching it normally means that the item is guaranteed to be invalidated after that time, and this guarantee gets lost here.

weierophinney (Member) commented

This repository has been closed and moved to laminas/laminas-cache; a new issue has been opened at laminas/laminas-cache#5.
