From last class
$M_1 = \begin{pmatrix}1&0\\0&0\end{pmatrix},\quad M_2 = \begin{pmatrix}0&1\\0&0\end{pmatrix},\quad M_3 = \begin{pmatrix}0&0\\1&0\end{pmatrix},\quad M_4 = \begin{pmatrix}0&0\\0&1\end{pmatrix}$

$N_1 = \begin{pmatrix}0&1\\1&1\end{pmatrix},\quad N_2 = \begin{pmatrix}1&0\\1&1\end{pmatrix},\quad N_3 = \begin{pmatrix}1&1\\0&1\end{pmatrix},\quad N_4 = \begin{pmatrix}1&1\\1&0\end{pmatrix}$

The $M_i$'s generate $M_{2\times 2}$.
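To see the generation claim concretely (a check not written out in the notes), an arbitrary $2\times 2$ matrix is the obvious combination of the $M_i$'s:

$\begin{pmatrix} a & b \\ c & d \end{pmatrix} = a\,M_1 + b\,M_2 + c\,M_3 + d\,M_4,$

so $\operatorname{span}\{M_1, M_2, M_3, M_4\} = M_{2\times 2}$.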
Fact: $T \subset \operatorname{span} S \Rightarrow \operatorname{span} T \subset \operatorname{span} S$.
$S \subset V$ is linearly independent $\Leftrightarrow$ whenever $u_i \in S$ are distinct,

$\sum a_i u_i = 0 \Rightarrow a_i = 0$ for all $i$.
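For example (a check added here, not in the notes), the $M_i$'s above are linearly independent: if

$a_1 M_1 + a_2 M_2 + a_3 M_3 + a_4 M_4 = \begin{pmatrix} a_1 & a_2 \\ a_3 & a_4 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},$

then comparing entries gives $a_i = 0$ for all $i$.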
Comments
Claim: if $S \subset V$ is linearly independent and $v \in V \setminus S$, then $S \cup \{v\}$ is linearly dependent if and only if $v \in \operatorname{span} S$.

Proof
1. $\Leftarrow$: start from the second assertion and deduce the first.

Assume $v \in \operatorname{span} S$, so $v = \sum a_i u_i$ where the $u_i \in S$ are distinct and $a_i \in F$.

Then $\sum a_i u_i - 1 \cdot v = 0$. This is a linear combination of distinct elements of $S \cup \{v\}$ (distinct because $v \notin S$) in which not all coefficients are $0$ and which adds to $0$. So $S \cup \{v\}$ is linearly dependent by definition.

2. $\Rightarrow$: Assume $S \cup \{v\}$ is linearly dependent. Then a linear combination can be found of the form

$(*) \qquad \sum a_i u_i + b v = 0$, where $u_i \in S$ and not all of the $a_i$ and $b$ are $0$.

If $b = 0$, then $\sum a_i u_i = 0$ where not all of the $a_i$ are $0$, so $S$ is linearly dependent; but the initial assumption was that $S$ is linearly independent, a contradiction. So $b \neq 0$.

Now divide by $b$: $(*)$ becomes $\sum \frac{a_i}{b} u_i + v = 0 \Rightarrow v = -\sum \frac{a_i}{b} u_i \Rightarrow v \in \operatorname{span} S$.
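As a concrete instance of direction 1 (an illustration added here, not in the original notes): take $V = \mathbb{R}^2$, $S = \{(1,0)\}$, and $v = (2,0) \in \operatorname{span} S$. Then

$2 \cdot (1,0) + (-1) \cdot (2,0) = (0,0),$

a combination with nonzero coefficients summing to $0$, so $S \cup \{v\}$ is linearly dependent.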
Definition
A basis of a vector space $V$ is a subset $\beta \subset V$ such that $\beta$ is linearly independent and $\operatorname{span} \beta = V$.
Examples
1. $\beta = \emptyset$ is a basis of $\{0\}$.

2. Let $V$ be $\mathbb{R}$ as a vector space over $\mathbb{R}$. Then $\beta = \{5\}$ and $\beta = \{1\}$ are bases.

3. Let $V$ be $\mathbb{C}$ as a vector space over $\mathbb{R}$: $\beta = \{1, i\}$.
4. $V = \mathbb{R}^n$: $\beta = \left\{ e_1 = \begin{pmatrix}1\\0\\\vdots\\0\end{pmatrix},\ e_2 = \begin{pmatrix}0\\1\\\vdots\\0\end{pmatrix},\ \ldots,\ e_n = \begin{pmatrix}0\\0\\\vdots\\1\end{pmatrix} \right\}$ is a basis.
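Spelling this out (not written in the notes), any $x \in \mathbb{R}^n$ is expressed in this basis by its own entries:

$x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = x_1 e_1 + x_2 e_2 + \cdots + x_n e_n.$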
5. In $V = P_3(\mathbb{R})$, $\beta = \{1, x, x^2, x^3\}$.

6. In $V = P_1(\mathbb{R}) = \{ax + b\}$, $\beta = \{1+x, 1-x\}$ is a basis.
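To verify example 6 (a computation not carried out in the notes), write $ax + b = c(1+x) + d(1-x) = (c+d) + (c-d)x$; comparing coefficients gives $c - d = a$ and $c + d = b$, so

$ax + b = \frac{a+b}{2}(1+x) + \frac{b-a}{2}(1-x),$

and since $c$ and $d$ are determined uniquely, $\{1+x, 1-x\}$ spans $P_1(\mathbb{R})$ and is linearly independent.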
Theorem
A subset $\beta$ of a vector space $V$ is a basis iff every $v \in V$ can be expressed as a linear combination of elements of $\beta$ in exactly one way.
The proof is a combination of things we already know.
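A sketch of the key step (filling in what "things we already know" means here): if $v = \sum a_i u_i = \sum b_i u_i$ with the $u_i \in \beta$ distinct, then

$\sum (a_i - b_i) u_i = 0,$

so linear independence of $\beta$ forces $a_i = b_i$ for all $i$, giving uniqueness; conversely, the existence of such an expression for every $v$ says exactly that $\beta$ spans $V$.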