BIGGEST integer in Python!? #python #programming #coding #computerscience

Do you know what the largest integer in Python is? Hint: it's very, very big - probably bigger than you can imagine!

Subscribe & like for more coding, programming, and Python content!
Comments

Architecturally, since modern systems are only 32-/64-bit, even assuming unlimited memory the maximum amount of data stored is (2^30)^59, excluding any excess memory from the bigint or any memory used by the interpreter. This is also Python-implementation-agnostic, unlike the answer Martmists pointed out.

somehybrid
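
For anyone curious what that digit-based layout looks like in practice, here is a minimal sketch assuming a 64-bit CPython build (the exact fields and sizes are implementation details, not part of the language):

```python
import sys

# CPython stores an int as an array of fixed-size "digits"; on a typical
# 64-bit build each digit holds 30 bits and occupies 4 bytes.
print(sys.int_info)   # e.g. bits_per_digit=30, sizeof_digit=4, ...

# There is no numeric cap in the language itself; the practical ceiling
# is memory plus the size field that counts those digits.
print(sys.maxsize)    # largest Py_ssize_t value, e.g. 2**63 - 1
```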

Wow! Incredible video. Short, sweet, and informative! Absolutely love it

gamerpedia

This is a great video! It also explains why even trivial arithmetic computations can be arbitrarily slow in Python

stacksmasherninja
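
A quick, machine-dependent way to see that effect, sketched with the standard timeit module (the exact numbers will vary):

```python
import timeit

small = 12345
big = 10**10000   # an int with thousands of internal digits

# The same "+" costs more as the operand grows, because CPython has to
# walk the whole digit array; these timings are purely illustrative.
print(timeit.timeit("x + x", globals={"x": small}, number=100_000))
print(timeit.timeit("x + x", globals={"x": big}, number=100_000))
```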

Awesome for general-purpose scripting, though if you work with potentially large datasets (in ML, or even when querying large data for a web server), use bounded representations for decimals and integers instead: they not only compute faster, they also allow better memory allocation under the hood

WeirdDuck
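
As a rough sketch of the storage difference, using only the stdlib array module here (NumPy dtypes behave the same way for ML-scale data, and the exact byte counts are CPython-specific):

```python
import sys
from array import array

n = 100_000

# Boxed Python ints: every element is a full object (~28 bytes each on a
# typical 64-bit build) plus an 8-byte pointer slot in the list.
boxed = list(range(n))
boxed_bytes = sys.getsizeof(boxed) + sum(sys.getsizeof(i) for i in boxed)

# Fixed-width 64-bit integers packed contiguously: 8 bytes per element.
packed = array("q", range(n))
packed_bytes = len(packed) * packed.itemsize

print(f"boxed:  ~{boxed_bytes / 1e6:.1f} MB")
print(f"packed: ~{packed_bytes / 1e6:.1f} MB")
```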

I use Python to calculate something very precise

Garfield_Minecraft

Also interesting to note: the ob_size field (which keeps track of how large the digit array for an integer is) can be negative, which indicates the number itself is negative. This also means it's technically bounded, but to an array the size of the max value of a size_t (or ssize_t, I don't recall which), which would be either ~8 GB or ~18 EB depending on which it is. That equates to ~500 million digits or ~2 quintillion digits respectively

Martmists
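
You can watch that digit array grow from Python with sys.getsizeof; a small sketch, with the caveat that the exact sizes are 64-bit CPython implementation details and vary slightly across versions:

```python
import sys

# The int object grows in steps of one 4-byte digit as the value needs
# more 30-bit digits.
for value in (0, 1, 2**30, 2**60, 2**300, 2**3000):
    print(f"{value.bit_length():5d} bits -> {sys.getsizeof(value)} bytes")
```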

It's 4300 digits; after that Python cannot do mathematical operations

mayank

As a consequence, all arithmetic and logical operations on these big ints are much slower.

justaway_of_the_samurai

I remember telling a friend that Python data types (int, str, etc.) are global objects 😢

coderoyalty

Python is so slow anyway, it can afford it. Compiled languages generally don't default to BigInteger, since it has costs that a programmer should opt into deliberately.

Bolpat

C/C++ has the __int128 / __int128_t type (a GCC/Clang extension), so 2^63-1 is a limit only if you don't know there's more :)

NoName-zgte

Consider using a 32- or 64-bit float to represent your number first. You may lose precision, but it is way more efficient to store and compute with.

jack
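
The precision trade-off is easy to demonstrate; a minimal sketch assuming IEEE 754 doubles, which is what CPython floats are on common platforms:

```python
# A 64-bit float has a 53-bit mantissa, so integers above 2**53 can no
# longer all be represented exactly, while Python ints stay exact.
big = 2**53
print(float(big) == float(big + 1))   # True: the +1 is rounded away
print(big == big + 1)                 # False: exact integer arithmetic
```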

It's kind of like how Python lists can keep appending: it's not because they use a better data type, the language just reallocates to a larger array every time you get close to the current capacity. But you do pay some overhead for that.

TuberTugger
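
That growth strategy is visible from Python too; a small sketch, noting that the exact growth pattern is a CPython implementation detail:

```python
import sys

# The backing array is over-allocated, so getsizeof jumps in steps
# rather than on every append.
items = []
last = sys.getsizeof(items)
for i in range(64):
    items.append(i)
    size = sys.getsizeof(items)
    if size != last:
        print(f"len={len(items):2d}  size={size} bytes")
        last = size
```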

Tip for making YouTube Shorts: don't place text content where the Shorts UI will obscure it.

LeoStaley

There is also a limitation on the string representation of numbers, which is 4300 digits by default

voodooman
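
For reference, that limit (added in Python 3.11 and backported to some security releases) applies to int/str conversion rather than to the arithmetic itself, and it can be adjusted; a short sketch:

```python
import sys

big = 10**5000        # building the number is fine: it's just arithmetic

try:
    str(big)          # the conversion to decimal text hits the cap
except ValueError as exc:
    print(exc)        # mentions the 4300-digit default limit

sys.set_int_max_str_digits(6000)   # raise the cap if you really need to
print(len(str(big)))               # 5001
```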

There are certain aspects of Python where there is an integer limit. For example, the __index__ magic method is limited to returning a 64-bit value

veryblocky
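
Similar limits show up anywhere a value has to fit into a C Py_ssize_t; a minimal sketch (the Huge class is just a made-up example):

```python
import sys

class Huge:
    def __len__(self):
        # len() results must fit in a Py_ssize_t, i.e. at most sys.maxsize.
        return sys.maxsize + 1

try:
    len(Huge())
except OverflowError as exc:
    print(exc)   # cannot fit 'int' into an index-sized integer
```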

I know there's such a thing as an octuple-precision floating-point format; it's 256 bits (32 bytes) in size.

g_vost

Want to see a large number? Use factorial from the math module and do factorial(factorial(1000))

quitchiboo

I think it's still bounded. A friend of mine tried to print the number to a file, and the final number was about 21 MB worth of digits, after which Python couldn't handle it

DeathSugar

It's actually bounded to a bit under 2 gigs; you would need to remove that limit first if you want to store bigger numbers

maedafeelscoke