python: functools.lru_cache (beginner - intermediate) anthony explains #54



==========

I won't ask for subscriptions / likes / comments in videos but it really helps the channel. If you have any suggestions or things you'd like to see please comment below!
Comments

very clear explanation. I've been seeing this used in a lot of leetcode solutions and never understood what it meant. thanks!

stephan

Excellent. I first came across lru_cache in FastAPI, but this video helps me understand its wider usage.

python

This makes my memoization DP code A LOT cleaner! Thanks!😅

dera_ng
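A minimal sketch of the memoised-DP pattern this comment refers to, using the classic Fibonacci example (nothing here beyond the stdlib):

```python
from functools import lru_cache

# Memoised Fibonacci: without the decorator this recursion is
# exponential; with it, each value is computed exactly once.
@lru_cache(maxsize=None)
def fib(n: int) -> int:
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040
```

The decorator replaces the usual hand-written `memo = {}` dictionary and the `if n in memo` bookkeeping with a single line.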

Wow, using lru_cache as a closure namespace is a neat little trick!

VladimirTheAesthete

Thanks man. This has been very helpful.

juantorres

Thanks for the video. Could you make a video on type annotations? E.g. in the video you write def square(x: float) -> float. What's the usefulness of it, other than explicitly seeing what the variable types are (in this case, that the input and output are floats)?

smjure
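A short note on the question above: annotations are not enforced at runtime; they are metadata that static checkers such as mypy (and editors) use to catch type errors before the code runs. A quick sketch:

```python
def square(x: float) -> float:
    return x * x

# The annotations are stored on the function, not checked at call time.
print(square.__annotations__)  # {'x': <class 'float'>, 'return': <class 'float'>}
print(square(3))               # 9 -- passing an int works fine at runtime
```

So their usefulness is mostly tooling: type checkers flag mistakes like `square("hi")` before you ever run the program.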

You mentioned that the large cache is wasteful for a function that will only ever return one value, but is it? I assumed the cache is just like a dictionary, so it wouldn't reserve space for up to maxsize; if you only ever put one item in it, the size would be the same no matter what you set maxsize to.

sparkyb
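For what it's worth, CPython's lru_cache is indeed backed by a dict (plus linked-list bookkeeping when maxsize is bounded); maxsize is an upper bound on entries, not a preallocation, which `cache_info()` makes visible:

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def answer() -> int:
    return 42

answer()
answer()
# currsize tracks what is actually stored; maxsize is just the cap.
print(answer.cache_info())  # CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
```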

I'm guessing functools only works with functions :/
If you set some default values in a module, how do you stop them being reset if that module is imported a second time?

zig
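On the second question above: a module's top-level code runs only on first import; subsequent imports return the cached object from `sys.modules`, so module-level defaults are not reset. A quick sketch:

```python
import sys
import json  # first import executes json's top-level code

first = sys.modules["json"]
import json  # second import is a cache lookup, not a re-execution
assert sys.modules["json"] is first  # same module object, state intact
```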

please explain functools.partial as well.

patternsandconnections
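A minimal sketch of functools.partial, since it was requested above: it pre-binds arguments to a callable and returns a new callable.

```python
from functools import partial

def power(base: float, exp: float) -> float:
    return base ** exp

# partial freezes `exp`, leaving `base` to be supplied later.
square = partial(power, exp=2)
cube = partial(power, exp=3)
print(square(4))  # 16
print(cube(2))    # 8
```

It is handy wherever an API expects a callable of fewer arguments than the function you have, e.g. callbacks or `map`.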

Can a logger benefit from slamming lru_cache on every single* method of the Logger class?
I profiled the logger class and it turns out the number of function calls was reduced, but is it still safe to assume that slamming most of the methods with lru_cache is okay? Implementation-wise the output of the logger is correct, so I haven't had any issues there; I'm just curious whether it is a good implementation or if I am missing something big here.

akshaymestry
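One caution worth flagging on the question above: lru_cache skips the function body entirely on a cache hit, so any method with side effects (like actually emitting a log record) will silently stop doing its work for repeated arguments. A minimal illustration:

```python
from functools import lru_cache

emitted = []

@lru_cache(maxsize=None)
def log(msg: str) -> None:
    emitted.append(msg)  # stand-in for writing to a real handler

log("hello")
log("hello")  # cache hit: the body, and its side effect, do not run
print(len(emitted))  # 1 -- the second "hello" was never emitted
```

Caching pure helper methods (formatting, level lookups) is fine; caching the emit path would drop repeated messages. Also note that lru_cache on instance methods keeps `self` in the cache's keys, which can keep instances alive longer than expected.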

How long would the values be cached for with lru_cache?

wanhe
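Short answer to the above: entries live until they are evicted (when a bounded cache is full, the least recently used entry goes), until `cache_clear()` is called, or until the function object itself is garbage-collected; there is no time-based expiry. Sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=2)
def sq(n: int) -> int:
    return n * n

sq(1); sq(2); sq(3)  # with maxsize=2, the entry for sq(1) is evicted
print(sq.cache_info().currsize)  # 2
sq.cache_clear()                 # manual expiry: drops everything
print(sq.cache_info().currsize)  # 0
```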

I am spamming you with lots of comments, apologies for this... do you think this can be used within AWS Lambda functions? I have a specific use case to retrieve read-only data which is only updated nightly. Given the Lambda container doesn't hang around all that long, I am wondering if I could use this as a cache for JSON data instead of the additional overhead and complexity of adding ElastiCache. I guess what would be even better is if the cache TTL could be controlled?

Walruz

is this kinda like DFS with memoisation?

yangliu_