
reversed "update" method in multi_cache to overwrite old values with new #489

Open
wants to merge 2 commits into master
Conversation

Trizalio

If the wrapped function can return a dict with more keys than were requested, there is a case in which a freshly computed value is replaced by a stale cached one. Example:

import asyncio
from typing import List, Dict

from aiocache import multi_cached


@multi_cached("ids", namespace="main")
async def foo(ids: List[int]) -> Dict[int, int]:
    # Deliberately returns more keys than requested:
    # every key from 0 up to the largest requested id.
    biggest_value = max(ids)
    result = {id_: biggest_value for id_ in range(biggest_value + 1)}
    print(f'foo result: {result}')
    return result


async def amain():
    print(await foo(ids=[1]))
    print(await foo(ids=[0, 2]))
    print(await foo(ids=[0]))

if __name__ == "__main__":
    asyncio.run(amain())

This prints:

foo result: {0: 1, 1: 1}
{0: 1, 1: 1}
foo result: {0: 2, 1: 2, 2: 2}
{0: 1, 1: 2, 2: 2}
{0: 1}

As we can see, on the second call foo computed a new value (2) for key 0, but it was overwritten by the stale cached value (1), and the third call still returned the stale {0: 1}.
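The merge-order problem can be reduced to a few lines (a minimal sketch with hypothetical names, not aiocache's actual code). On the second call above, the cache already held a stale partial hit for key 0, while foo was invoked only for the missing key 2:

```python
fresh = {0: 2, 1: 2, 2: 2}   # values just computed by foo(ids=[2])
cached = {0: 1}              # stale partial cache hit for the requested keys

# Buggy order: cached entries overwrite freshly computed ones.
buggy = dict(fresh)
buggy.update(cached)
assert buggy == {0: 1, 1: 2, 2: 2}   # matches the output above; key 0 is stale

# Reversed "update" (this PR): fresh entries overwrite cached ones.
fixed = dict(cached)
fixed.update(fresh)
assert fixed == {0: 2, 1: 2, 2: 2}   # freshly computed value for key 0 wins
```

Since `dict.update` always lets the argument's entries win, the fix is simply to merge in the cached values first and the fresh result last.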

In my case I must return aggregated values based on data from two distinct big-data storages. Based on the requested keys I can predict future requests, so I'd like to preload values for the predicted keys along with the keys passed to my function and save them all in aiocache.
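The reversed merge can be illustrated end to end with a toy decorator. This is a sketch only: `toy_multi_cached` and its in-memory store are hypothetical stand-ins for `aiocache.multi_cached`, not its real implementation.

```python
import asyncio
from functools import wraps
from typing import Dict, List


def toy_multi_cached(key_arg: str):
    """Toy stand-in for aiocache.multi_cached (a sketch, not the real API)."""
    store: Dict[int, int] = {}

    def decorator(func):
        @wraps(func)
        async def wrapper(**kwargs):
            keys: List[int] = kwargs[key_arg]
            cached = {k: store[k] for k in keys if k in store}
            missing = [k for k in keys if k not in cached]
            if not missing:
                return cached
            # Call the wrapped function only for the cache misses.
            fresh = await func(**{**kwargs, key_arg: missing})
            # The PR's fix: start from the cached values and update with
            # the fresh result, so newly computed values always win.
            merged = dict(cached)
            merged.update(fresh)
            store.update(fresh)
            return merged
        return wrapper
    return decorator


@toy_multi_cached("ids")
async def foo(ids: List[int]) -> Dict[int, int]:
    biggest = max(ids)
    return {i: biggest for i in range(biggest + 1)}


async def amain():
    print(await foo(ids=[1]))      # {0: 1, 1: 1}
    print(await foo(ids=[0, 2]))   # fresh value for key 0 wins: {0: 2, 1: 2, 2: 2}
    print(await foo(ids=[0]))      # {0: 2}

asyncio.run(amain())
```

With the reversed merge, the second call returns {0: 2, 1: 2, 2: 2} instead of the stale {0: 1, 1: 2, 2: 2}, and the extra, predicted keys (here key 1) are still stored for later calls.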

@argaen (Member) left a comment

Thanks for this @Trizalio!

We would need to add tests in order to merge this


codecov bot commented Oct 19, 2020

Codecov Report

Merging #489 (9981377) into master (a966f84) will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master     #489   +/-   ##
=======================================
  Coverage   99.13%   99.13%           
=======================================
  Files          13       13           
  Lines        1043     1044    +1     
  Branches      116      116           
=======================================
+ Hits         1034     1035    +1     
  Misses          9        9           
Impacted Files Coverage Δ
aiocache/decorators.py 100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@Dreamsorcerer (Member)

Looks reasonable, but as mentioned, please add a test (after merging master). The test should probably go in tests/ut/test_decorators.py.
