a very cool integral comparison problem

Comments

The n-dimensional integral does limit to 1:

Because we are integrating over a region of total mass one, we can think of the integral as an expectation. In particular, define Y = x_1 * x_2 * ... * x_n, where the x_i are iid uniform on [0, 1] and Y is a random variable; the integral is then E(Y^Y). As n goes to infinity, Y converges in probability to 0. Because f(x) = x^x is continuous and bounded on [0, 1] (taking f(0) = 1), this means E(Y^Y) converges to f(0) = 1.


To show Y converges in probability to 0, it's sufficient that Y is positive and E(Y) converges to 0: Markov's inequality then gives P(Y > ε) <= E(Y)/ε -> 0. And by independence, E(Y) = E(x_1) * E(x_2) * ... * E(x_n) = (1/2)^n, which goes to 0.
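
A quick numerical sketch of this argument (my own addition, not part of the comment): estimate E(Y^Y) by Monte Carlo for a few values of n and watch it approach 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def expectation_yy(n, samples=200_000):
    """Monte Carlo estimate of E(Y^Y), where Y = x_1*...*x_n with x_i iid Uniform(0, 1)."""
    x = rng.random((samples, n))
    y = x.prod(axis=1)
    return (y ** y).mean()

for n in (1, 2, 3, 5, 10, 20):
    print(f"n = {n:2d}   E(Y^Y) ≈ {expectation_yy(n):.4f}")
# n = 1 and n = 2 both give about 0.783; for large n the estimates approach 1
```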

isthissarcasm

After staring at the thumbnail, I decided the single integral was larger. The solution was a pleasant surprise.

charlesbrowne

Surprising! I assumed the xy one was smaller, since you're multiplying a number in (0, 1) by another number in (0, 1), which shrinks the product.

txikitofandango

Another proof that the limit is 1.
It's easier to work with the logarithm of the integrand, and since we expect its values to be close to 1 most of the time, a crude bound is enough. Let
f_n(x_1, ..., x_n) = x_1 x_2 ... x_n (log(x_1) + ... + log(x_n)),
so that exp(f_n) = (x_1 ... x_n)^(x_1 ... x_n). Since
exp(t) >= 1 + t,
\int exp(f_n) >= \int 1 + \int f_n = 1 - \int (-f_n),
where each \int is over the n-dimensional unit hypercube [0, 1]^n.
On the left side we have the integral from the video. On the right side we already have 1, and we subtract the integral of a positive function; let's show it is small.
\int (-f_n) = \int -x_1 x_2 ... x_n (log(x_1) + ... + log(x_n)) = \sum_{i=1}^n \int (-log(x_i)) x_1 ... x_n

Each integral in the sum is the same, so after renaming variables we can write it as one term:
... = n \int (-log(x_1)) x_1 x_2 ... x_n
Here -x_1 log(x_1) is positive and bounded by a constant (1/e, to be precise), and everything else is positive, so even without name-dropping Hölder's inequality:
... <= (n/e) \int x_2 ... x_n = (n/e) 2^{-(n-1)}
Now we apply this bound to the initial inequality, remembering that the integrand of the original integral is <= 1, so the integral is also <= 1:
1 >= \int exp(f_n) >= \int 1 + \int f_n = 1 - \int (-f_n) >= 1 - (n/e) 2^{-(n-1)} -> 1
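
For what it's worth, the resulting squeeze is easy to tabulate (my own sketch, not part of the comment):

```python
import math

# squeeze from the comment:  1 - n/(e * 2^(n-1))  <=  J_n  <=  1
for n in (1, 2, 5, 10, 20, 50):
    gap = n / (math.e * 2 ** (n - 1))
    print(f"n = {n:2d}   lower bound = {1 - gap:.12f}   gap = {gap:.3e}")
# the gap decays geometrically, so the n-dimensional integral is squeezed to 1
```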

bartekltg

No need for a double change of variables. Perform the single change u(x) = xy to obtain \int_{x=0}^{1} (1/x) \int_{u=0}^{x} u^u du dx, then integrate by parts with dv = dx/x.
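
A quick numerical sanity check of that reduction, and of the equality itself, using scipy (my own sketch, not part of the comment):

```python
from scipy.integrate import quad, dblquad

# single integral:  \int_0^1 x^x dx
single, _ = quad(lambda x: x ** x, 0, 1)

# double integral:  \int_0^1 \int_0^1 (xy)^(xy) dy dx
double, _ = dblquad(lambda y, x: (x * y) ** (x * y), 0, 1, 0, 1)

# reduced form from the comment:  \int_0^1 (1/x) \int_0^x u^u du dx
reduced, _ = quad(lambda x: quad(lambda u: u ** u, 0, x)[0] / x, 0, 1)

print(single, double, reduced)   # all three come out to about 0.78343
```

The agreement is no accident: the integration by parts leaves -\int_0^1 x^x ln(x) dx, and since \int_0^1 x^x (1 + ln(x)) dx = [x^x]_0^1 = 0, that is exactly \int_0^1 x^x dx.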

richardheiville

The t = u^u substitution is invalid, since u^u has derivative equal to 0 inside the interval (at u = 1/e), so it is not monotone there.

janihabetler

In general, I guess the n-fold integral equals \frac{1}{(n-1)!} \int_0^1 u^u (\ln(1/u))^{n-1} du, which goes to 1 as n goes to infinity.
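
That reduced form is easy to check against a brute-force estimate for a small n (my own sketch; n = 3 below is just an example):

```python
import math
import numpy as np
from scipy.integrate import quad

n = 3
rng = np.random.default_rng(1)

# reduced one-dimensional form:  (1/(n-1)!) \int_0^1 u^u (ln(1/u))^(n-1) du
reduced, _ = quad(
    lambda u: u ** u * math.log(1 / u) ** (n - 1) / math.factorial(n - 1), 0, 1
)

# brute-force Monte Carlo over the unit cube [0, 1]^n
x = rng.random((500_000, n))
y = x.prod(axis=1)
brute = (y ** y).mean()

print(reduced, brute)   # the two estimates should agree to a couple of decimal places
```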

alexeycanopus

I can't quite show directly that it would indeed be greater than the original integral for n > 2, but here are my steps:

Change variables to
x_1 = u_1/(u_2 * ... * u_n)
x_2 = u_2, ..., x_n = u_n;
the Jacobian is then 1/(u_2 ... u_n).

The integral changes in this way:
J_n = \int_{[0,1]^n} (x_1 * ... * x_n)^(x_1 * ... * x_n) d^n x = \int_S u_1^{u_1} * 1/(u_2 * ... * u_n) d^n u

A bit of thinking about the parametrization, so that u_1 is the outermost ("last") variable of integration, leads to these limits:

\int_S (...) = \int_0^1 du_1 u_1^{u_1} \int_{u_1}^1 du_2/u_2 \int_{u_1/u_2}^1 du_3/u_3 ... \int_{u_1/(u_2 ... u_{n-1})}^1 du_n/u_n

Changing variables once more to y_i = ln(u_i) (a pretty straightforward change; for u_1 it only shows up as y_1 = ln(u_1) in the limits):

\int_0^1 du_1 u_1^{u_1} \int_{y_1}^0 dy_2 \int_{y_1 - y_2}^0 dy_3 ... \int_{y_1 - y_2 - ... - y_{n-1}}^0 dy_n

One can see that the inner part is the volume of something (something like a simplex, but not quite; mostly a pyramid?). Let's change the sign of the variables and define
V_n(r) = \int_0^r dx_1 \int_0^{r - x_1} dx_2 ... \int_0^{r - x_1 - ... - x_{n-1}} dx_n.
And we see that the original integral is

J_n = \int_{[0,1]^n} (x_1 * ... * x_n)^(x_1 * ... * x_n) d^n x = \int_0^1 x^x V_{n-1}(-ln(x)) dx

It's easy to spot a recursive relation among the V's:

V_n(r) = \int_0^r du V_{n-1}(r - u) = \int_0^r du V_{n-1}(u);  V_1(r) = r,
from which it's pretty easy to see that V_n(r) = r^n/n!.

Then I don't really know how to proceed. These V_n(-ln x) are not increasing for fixed x, but they do look like approximations of a delta function (they are normalized to 1 and localize to a spike at x = 0). So in a sense J_n -> lim_{x->0+} x^x = 1 as n -> inf.

Why there's a feeling that the J_n are increasing: the weights "localize" on the rise of the graph of x^x near x = 0. That would be obvious for rectangular bumps converging to a delta function, but it may not be true for V_n(-ln x).

And I guess to actually calculate the limit explicitly one can try the method of steepest descent (after the change of variables u = -ln(x)) on
J_n = \int_0^{+inf} e^{-u(1 + e^{-u})} u^k/k! du,  with k = n - 1.
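
To check the "approximate delta" picture numerically (my own sketch, not part of the comment): with t = -ln(x), the weight V_{n-1}(-ln x) dx = (-ln x)^{n-1}/(n-1)! dx becomes the Gamma(n, 1) density, so its total mass is 1, and the mass it places on a small interval (0, a) near x = 0 is P(Gamma(n, 1) > -ln a), which tends to 1 as n grows.

```python
import numpy as np
from scipy.stats import gamma

a = 0.1   # "near zero" cutoff on the x-axis, chosen only for illustration
for n in (2, 3, 5, 10, 20):
    # mass of the weight (-ln x)^(n-1)/(n-1)! on x in (0, a)
    mass_near_zero = gamma.sf(-np.log(a), n)
    print(f"n = {n:2d}   weight on (0, {a}) = {mass_near_zero:.6f}")
# the total weight is always 1 and it piles up near x = 0 as n grows,
# which is the delta-like concentration that drives J_n -> lim_{x->0+} x^x = 1
```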

egoreremeev

Since they are in different dimensions, the question is poorly posed - e.g., is 18 feet larger or smaller than 15 square feet?

johnpaterson

My guess is that the x^x will be larger

evankalis


So this result, that the double integral equals the single integral, should be called the Junior's Dream?!

Teja

According to Desmos, they're both equal.

Neo_Bones

Haven't watched it yet, but... thinking a quarter to the quarter is greater than (a quarter to the quarter) all squared. But will it survive Michael's tests, analysis, and computational skills?
Start the video!

Alan-zftt

What about this question, dear?
The indefinite integral of x^x.
Someone gave it to me to solve and I could not solve it.
If you solve it, it would be your ...

sshkbf