Linear Transformations -- Abstract Linear Algebra 13

Comments

Will this series cover eigenvalues and eigenvectors? I'm really enjoying it!

zherox

I really hope this series talks a bit about module theory at some point. I feel like you would present it a little less densely and more digestibly than Borcherds. I love the series so far, and the second channel! Keep doing you and only make the videos you want to make :)

benbetts

Going by the title of the series, I was definitely expecting the rank-nullity theorem to be proved as an immediate corollary of the 1st isomorphism theorem for vector spaces. You have already defined semigroups, monoids, groups, and rings, and given the universal properties of the direct product and direct sum, so it wouldn't be a stretch to define quotient vector spaces, especially since there is no complication of needing "normal subspaces" to define them - i.e. quotients of vector spaces are akin to quotients of abelian groups by subgroups rather than quotients of non-abelian groups by normal subgroups.
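
For reference, a minimal sketch of that corollary in LaTeX, assuming the first isomorphism theorem for vector spaces and the quotient dimension formula dim(V/U) = dim V - dim U for finite-dimensional V:

\[
  V/\ker T \;\cong\; \operatorname{im} T
  \quad\Longrightarrow\quad
  \dim V - \dim\ker T = \dim\operatorname{im} T,
\]

i.e. \( \dim V = \dim\ker T + \dim\operatorname{im} T \) for a linear map \( T : V \to W \) with \( V \) finite-dimensional, which is exactly rank-nullity.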

P.S.: It is actually possible to define "vector spaces" in which addition is not commutative, but these have to be over a ring rather than a field - so already they are like modules, not vector spaces - and the ring has to consist entirely of zero-divisors (in particular it has to be a ring without 1). These things, let's call them "gyromodules", are quite rare, but examples do exist, and the notion of quotient gyromodules requires "normal sub-gyromodules" as opposed to just sub-gyromodules.

schweinmachtbree

I don't think we've actually explicitly proved three of the properties used here in the proof:
1) If W is a subspace of a finite-dimensional vector space V and dim V = dim W = n, then W = V.
2) dim{0} = 0.
3) If V is a vector space with dim V = 0, then V = {0}.
So I will add what I worked out here in case it helps; please add a correction if you see an error!

1) Suppose W is a subspace of a finite-dimensional vector space V, and dim V = dim W = n. We can certainly find a basis B for W, since W is a vector space, by the corollary to the first theorem in Video 12. Since dim W = n, the number of elements in B is card(B) = n = dim V. Since B is also linearly independent, the fourth theorem in Video 12 tells us B is also a basis for V.
Now take some v in V. It must be in the span of B (because B is a basis for V), which means we can write v as a linear combination of the basis vectors of W. But W is a vector space, so it is closed under addition and scalar multiplication, so any linear combination of vectors from W must be in W. That is, any v in V must also be in W. So V is a subset of W; since we started with W a subset of V, we have V = W.
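
In symbols, the whole argument chains together as follows (a sketch, assuming the Video 12 results cited above):

\[
  B \text{ a basis of } W, \ \operatorname{card}(B) = \dim V
  \;\Longrightarrow\;
  B \text{ a basis of } V
  \;\Longrightarrow\;
  V = \operatorname{span}(B) \subseteq W \subseteq V
  \;\Longrightarrow\;
  W = V.
\]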

2) The next is a bit harder. Suppose U and W are both subspaces of a vector space V over a field k, and the direct sum of U and W is V. Again we can find a basis for U, call it A, and a basis for W, call it C. Now consider the union A ∪ C.

You can show this set is linearly independent: take scalars from the field such that a linear combination of these vectors is zero, and separate it as (linear combination of vectors in A) = -(linear combination of vectors in C) = some t in V. By the same closure properties used above, the left-hand side shows t is in U and the right-hand side shows t is in W. But U + W being a direct sum means the only element in both subspaces is the zero vector, so we have (linear combination of vectors from A) = (linear combination of vectors from C) = 0. Since A and C are bases, they are both linearly independent, so this can only happen when all the scalars are zero. Thus the set A ∪ C is linearly independent.
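
Written out with explicit scalars (a sketch over finite subsets a_1, ..., a_m of A and c_1, ..., c_p of C):

\[
  \sum_{i=1}^{m} \alpha_i a_i + \sum_{j=1}^{p} \gamma_j c_j = 0
  \;\Longrightarrow\;
  t := \sum_{i=1}^{m} \alpha_i a_i = -\sum_{j=1}^{p} \gamma_j c_j \in U \cap W = \{0\},
\]

so both sums equal zero, and the linear independence of A and C forces every \( \alpha_i \) and \( \gamma_j \) to be zero.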

What about the span? Writing it out, span(A ∪ C) = span A + span C = U + W = V (the last equality is how we set things up at the beginning). So A ∪ C is a basis for V. Also, since U + W is a direct sum we have U ∩ W = {0}, and basis vectors are nonzero, so A and C are disjoint and card(A ∪ C) = card(A) + card(C).
Putting all that together, dim V = card(A ∪ C) = card(A) + card(C) = dim U + dim W.

Great: now take U = V itself and W = {0}, the subspace containing only the zero vector. Both are subspaces of V, and their direct sum is V. Substituting into the equation we just derived gives dim V = dim V + dim{0}, which proves that dim{0} = 0.

3) Finally, suppose V is a vector space with dim V = 0. We know {0} is a subspace of every vector space, so {0} is a subspace of V, and V is finite-dimensional. From what we just showed, dim{0} = 0 = dim V. Now property 1) above tells us that V = {0}.

StanleyDevastating

Very good elements for linear algebra!

lucachiesura

How long is this series going to last?

CookieGod