\( \newcommand{\blah}{blah-blah-blah} \newcommand{\eqb}[1]{\begin{eqnarray*}#1\end{eqnarray*}} \newcommand{\eqbn}[1]{\begin{eqnarray}#1\end{eqnarray}} \newcommand{\bb}[1]{\mathbf{#1}} \newcommand{\mat}[1]{\begin{bmatrix}#1\end{bmatrix}} \newcommand{\nchoose}[2]{\left(\begin{array}{c} #1 \\ #2 \end{array}\right)} \newcommand{\defn}{\stackrel{\vartriangle}{=}} \newcommand{\rvectwo}[2]{\left(\begin{array}{c} #1 \\ #2 \end{array}\right)} \newcommand{\rvecthree}[3]{\left(\begin{array}{r} #1 \\ #2\\ #3\end{array}\right)} \newcommand{\rvecdots}[3]{\left(\begin{array}{r} #1 \\ #2\\ \vdots\\ #3\end{array}\right)} \newcommand{\vectwo}[2]{\left[\begin{array}{r} #1\\#2\end{array}\right]} \newcommand{\vecthree}[3]{\left[\begin{array}{r} #1 \\ #2\\ #3\end{array}\right]} \newcommand{\vecfour}[4]{\left[\begin{array}{r} #1 \\ #2\\ #3\\ #4\end{array}\right]} \newcommand{\vecdots}[3]{\left[\begin{array}{r} #1 \\ #2\\ \vdots\\ #3\end{array}\right]} \newcommand{\eql}{\;\; = \;\;} \definecolor{dkblue}{RGB}{0,0,120} \definecolor{dkred}{RGB}{120,0,0} \definecolor{dkgreen}{RGB}{0,120,0} \newcommand{\bigsp}{\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;} \newcommand{\plss}{\;\;+\;\;} \newcommand{\miss}{\;\;-\;\;} \newcommand{\implies}{\Rightarrow\;\;\;\;\;\;\;\;\;\;\;\;} \newcommand{\prob}[1]{\mbox{Pr}\left[ #1 \right]} \newcommand{\exval}[1]{\mbox{E}\left[ #1 \right]} \newcommand{\variance}[1]{\mbox{Var}\left[ #1 \right]} \)


How to prep for the exam


 

About the exam:

 

How to prep:

 


Sample questions

 

One set of sample questions consists of the non-programming module exercises, or slightly modified versions of them.
 

Here are some others:


  1. Consider the following assertions:
    I. It is possible to add vectors with different numbers of elements as long as the larger one has zeroes where the smaller one has no elements, as in \((1,1,1,0,0)\) and \((2,2,2)\).
    II. The dot product of two vectors is zero only if at least one of them is the zero vector.
    Choose one of the following as the best answer:
    1. I and II are both true.
    2. I and II are both false.
    3. I is true but II is false.
    4. II is true but I is false.
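A quick way to test intuitions about assertions like these is to compute a few dot products directly. A minimal sketch in plain Python (the vectors below are made up for illustration):

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    assert len(u) == len(v), "vectors must have the same number of elements"
    return sum(a * b for a, b in zip(u, v))

# Experiment with a few pairs, including nonzero vectors at right angles:
print(dot([1, 0], [0, 1]))   # 0
print(dot([1, 1], [2, 2]))   # 4
```

Trying several pairs of your own is a good way to see which claims about dot products hold in general.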

  2. Suppose that, in the equation \({\bf Ax}={\bf b}\), the matrix \({\bf A}\) is $$ {\bf A} \eql \mat{1 & 0 & 1\\ 1 & 1 & 2\\ 2 & 1 & 3} $$ and that \({\bf b}=(-1,0,-1)\). Consider the following assertions:
    I. Because the third column of \({\bf A}\) is the sum of the first two columns, there is no solution.
    II. Because one equation has a zero as the coefficient of \(x_2\), the value of \(x_2\) can be anything.
    Choose one of the following as the best answer:
    1. I and II are both true.
    2. I and II are both false.
    3. I is true but II is false.
    4. II is true but I is false.

  3. Consider an instance of \({\bf Ax}={\bf b}\) with 6 variables and 4 equations, and the following assertions:
    I. Because there are more variables than equations, there can be no solution.
    II. The RREF can have at most 4 pivots.
    Choose one of the following as the best answer:
    1. I and II are both true.
    2. I and II are both false.
    3. I is true but II is false.
    4. II is true but I is false.
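Assertions about pivots can be explored numerically. Below is a small Gaussian-elimination sketch in plain Python that counts pivots; the matrix is made up for experimentation and is not the one from any question:

```python
def pivot_count(A):
    """Count pivots after Gaussian elimination on a copy of A (list of rows)."""
    A = [row[:] for row in A]          # work on a copy
    m, n = len(A), len(A[0])
    pivots = 0
    col = 0
    for row in range(m):
        # find the next column with a nonzero entry at or below this row
        while col < n:
            piv = next((r for r in range(row, m) if abs(A[r][col]) > 1e-12), None)
            if piv is not None:
                break
            col += 1
        if col == n:
            break
        A[row], A[piv] = A[piv], A[row]
        for r in range(row + 1, m):
            factor = A[r][col] / A[row][col]
            A[r] = [a - factor * b for a, b in zip(A[r], A[row])]
        pivots += 1
        col += 1
    return pivots

# A made-up 4x6 matrix:
A = [[1, 2, 3, 4, 5, 6],
     [0, 1, 1, 0, 1, 0],
     [1, 3, 4, 4, 6, 6],    # equals row 1 + row 2, so it adds no pivot
     [2, 4, 6, 8, 10, 13]]
print(pivot_count(A))  # 3
```

Whatever \(4 \times 6\) matrix you try, each pivot occupies its own row, which bounds how many pivots are possible.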

  4. Consider the following assertions:
    I. The product of the number 0 times the vector \({\bf 0}\) is the number 0.
    II. The product of the number 0 times the vector \({\bf 0}\) is the vector \({\bf 0}\).
    III. The dot product of the vector \({\bf 0}\) with any vector \({\bf x}\) is the number 0.
    IV. The dot product of the vector \({\bf 0}\) with any vector \({\bf x}\) is the vector \({\bf 0}\).
    The only true assertions are:
    1. I and IV.
    2. I and III.
    3. II and III.
    4. II and IV.

  5. If the matrix \({\bf A}\) rotates a vector clockwise by \(45^\circ\), and the matrix \({\bf B}\) reflects about the y axis, then \({\bf BA}\) takes the vector \((1,0)\) to
    1. \((-1,1)\)
    2. \((-1,-1)\)
    3. \((-1/\sqrt{2} , 1/\sqrt{2})\)
    4. \((-1/\sqrt{2} , -1/\sqrt{2})\)
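Transformations like rotations and reflections can be checked by composing small functions. A sketch in plain Python, deliberately using a different angle than the question so you can still work that one out yourself:

```python
import math

def rotate(theta, v):
    """Rotate vector v counterclockwise by theta radians."""
    x, y = v
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

def reflect_y(v):
    """Reflect v about the y axis."""
    x, y = v
    return (-x, y)

# Example: rotate (1, 0) counterclockwise by 90 degrees, then reflect:
w = reflect_y(rotate(math.pi / 2, (1, 0)))
print(w)  # approximately (0, 1): the rotation sends (1,0) to (0,1), which the reflection fixes
```

Note the order: applying \({\bf BA}\) to a vector means rotating first, then reflecting.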

  6. Consider 3D space, the vectors \({\bf u}=(2,3,0), {\bf v}=(4,6,0), {\bf w}=(2,2,0)\) and these assertions:
    I. \({\bf u}\) and \({\bf v}\) are a basis for the xy-plane.
    II. \({\bf u}\) and \({\bf w}\) are a basis for the xy-plane.
    Choose one of the following as the best answer:
    1. I is true but II is false.
    2. II is true but I is false.
    3. I and II are both true.
    4. I and II are both false.

  7. Consider the following proof of the statement: if \({\bf A}^{-1}\) exists then \({\bf x}= {\bf A}^{-1}{\bf b}\) is the unique solution to \({\bf Ax}={\bf b}\).
    I. \({\bf A}^{-1}\) is unique for any invertible matrix \({\bf A}\).
    II. Since we are given \({\bf b}\), there is only one possible product \({\bf A}^{-1}{\bf b}\).
    III. Since \({\bf x}= {\bf A}^{-1}{\bf b}\), the value of \({\bf x}\) is completely determined, and so, there is only one possible solution \({\bf x}\) to \({\bf Ax}={\bf b}\).
    Choose the best possible answer from:
    1. The proof is correct.
    2. Not correct because I is false.
    3. Not correct because even though \({\bf A}^{-1}{\bf b}\) is a solution, there may be other solutions \({\bf y}\) where \({\bf Ay}={\bf b}\).
    4. Not correct because multiple solutions may exist if \({\bf A}\) is not square.

  8. Suppose \({\bf u}, {\bf v}\) and \({\bf w}\) are orthogonal vectors. Consider the assertions
    I. Each of them has unit length.
    II. \(({\bf u} \cdot {\bf v}) \cdot {\bf w} = {\bf u} \cdot ({\bf v} \cdot {\bf w})\)
    Choose one of the following as the best answer:
    1. I and II are both true.
    2. I and II are both false.
    3. I is true but II is false.
    4. II is true but I is false.

  9. Consider the following reasoning related to the least squares solution \( \hat{\bf x} = ({\bf A}^T {\bf A})^{-1} {\bf A}^T {\bf b} \):
    I. \( ({\bf A}^T {\bf A})^{-1} = {\bf A}^{-1} ({\bf A}^T)^{-1}\).
    II. Therefore, \( ({\bf A}^T {\bf A})^{-1} {\bf A}^T {\bf b} = {\bf A}^{-1} ({\bf A}^T)^{-1} {\bf A}^T {\bf b}\).
    III. Therefore \(\hat{\bf x} = {\bf A}^{-1} {\bf b}\).
    Choose the best answer:
    1. I is true for all matrices \({\bf A}\).
    2. I is true if \({\bf A}\) has an inverse.
    3. II follows from the properties of a matrix transpose.
    4. III demonstrates that the least squares solution is one way to prove that the inverse exists.
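The normal-equations formula can be exercised on a tiny made-up example. Here is a plain-Python sketch for a tall \(3 \times 2\) matrix, which has no inverse of its own even though \({\bf A}^T{\bf A}\) does:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2(M):
    """Inverse of a 2x2 matrix (assumes a nonzero determinant)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# A made-up overdetermined system: 3 equations, 2 unknowns.
A = [[1, 0], [0, 1], [1, 1]]
b = [[1], [2], [4]]
At = transpose(A)
x_hat = matmul(inv2(matmul(At, A)), matmul(At, b))
print(x_hat)   # approximately [[1.333...], [2.333...]]
```

Working through \({\bf A}^T{\bf A}\) by hand for this example is good practice for the reasoning steps above.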

  10. Consider any two non-collinear vectors \({\bf w}\) and \({\bf v}\) and let \({\bf y}=\mbox{proj}_{\bf v}({\bf w})\) be the projection of \({\bf w}\) on \({\bf v}\). Consider these assertions:
    I. The angle between \({\bf w}\) and \({\bf y}\) is \(90^\circ\).
    II. The length of \({\bf y}\) is always less than that of \({\bf v}\).
    Choose one of the following as the best answer:
    1. I and II are both true.
    2. I and II are both false.
    3. I is true but II is false.
    4. II is true but I is false.
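Projections are easy to experiment with numerically. A minimal sketch in plain Python (the vectors are made up for illustration):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(v, w):
    """Projection of w onto v: (w . v / v . v) v  (assumes v is nonzero)."""
    c = dot(w, v) / dot(v, v)
    return [c * vi for vi in v]

# Made-up vectors:
w = [3, 4]
v = [1, 0]
y = proj(v, w)
print(y)                                          # [3.0, 0.0]
print(dot([wi - yi for wi, yi in zip(w, y)], v))  # the residual w - y is orthogonal to v: 0.0
```

Sketching \({\bf w}\), \({\bf v}\), and \({\bf y}\) on paper alongside the numbers helps in judging angle and length claims.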

  11. Suppose \({\bf x}_1,{\bf x}_2,\ldots,{\bf x}_n\) is a collection of orthogonal vectors in \(\mathbb{R}^n\), and let \({\bf A}\) be the matrix whose columns are the \({\bf x}_i\)'s. Let \({\bf v}\) be a vector and \({\bf w}=(\alpha_1,\alpha_2,\ldots,\alpha_n)\) be the coordinates of \({\bf v}\) using the \({\bf x}_i\)'s as a basis. Then, which of the following (perhaps more than one) are true? Explain your reasoning.
    1. \(\alpha_i = ({\bf v} \cdot {\bf x}_i) / ({\bf x}_i \cdot {\bf x}_i)\)
    2. \({\bf w} = {\bf A}^T {\bf v}\)
    3. \({\bf w} = {\bf A}^{-1} {\bf v}\)
    4. \(\alpha_i = |{\bf v}| |{\bf x}_i| \cos \theta_i\), where \(\theta_i\) is the angle between \({\bf v}\) and \({\bf x}_i\)

  12. Suppose \({\bf A}_{m \times n}\) is a matrix and \({\bf x}\) is a nonzero vector such that \({\bf Ax}={\bf 0}\). Let \({\bf S}\) be the set of all vectors \({\bf y}\) such that \({\bf Ay}={\bf 0}\). Let \(r\) be the rank of \({\bf A}\). Then which of the following (possibly more than one) are true?
    1. \({\bf x}\) has m elements.
    2. \({\bf x}\) is in the nullspace.
    3. \(RREF({\bf A})\) has r pivots.
    4. A basis for \({\bf S}\) needs \(m-r\) vectors.

  13. Suppose \(T\) is a linear transformation. Consider the following reasoning steps:
    I. \(T({\bf 0}) = T(1 \times {\bf 0} + (-1) \times {\bf 0})\)
    II. Next, \(T(1 \times {\bf 0} + (-1) \times {\bf 0}) = 1\times T({\bf 0}) - 1\times T({\bf 0})\)
    III. \(1 \times T({\bf 0}) - 1 \times T({\bf 0}) = {\bf 0}\)
    IV. Therefore \(T({\bf 0}) = {\bf 0}\)
    Which of the following (possibly more than one) is true?
    1. I is true because we've applied the definition of a linear transformation.
    2. II is true because of the unique properties of \(T({\bf 0})\).
    3. III follows from the rules of integer arithmetic.
    4. IV concludes the result of reasoning in steps I-III.
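The conclusion \(T({\bf 0})={\bf 0}\) is easy to observe for any concrete linear transformation. A sketch in plain Python, using a made-up matrix to define the transformation:

```python
def T(v):
    """A made-up linear transformation: multiply by a fixed 2x2 matrix."""
    A = [[2, 1], [0, 3]]
    return [sum(a * x for a, x in zip(row, v)) for row in A]

zero = [0, 0]
print(T(zero))  # [0, 0] -- a linear transformation always maps the zero vector to itself
```

Observing this numerically is not a proof, of course; the point of the question is whether the symbolic steps above establish it in general.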

  14. Consider the following functions: \(f_1(t)=5, f_2(t)=2t^2, f_3(t)=1-t, f_4(t)=t^{-1}\). Let \(g(t) = 1 - t^2\) and \(h(t) = t^2 - t - 1\). Which of the following (possibly more than one) are true?
    1. \(g(t)\) can be written as a linear combination of \(f_1, f_2, f_3, f_4\).
    2. \(g(t)\) is a polynomial.
    3. \(h(t)\) can be written as a linear combination of \(f_1, f_2, f_3\).
    4. \(f_1, f_2\) form a basis for all polynomials of degree 2 or less.

  15. Suppose \({\bf x}\) is an eigenvector of a matrix \({\bf A}\). Then which of the following (possibly more than one) are true?
    1. \({\bf Ax} = {\bf x}\).
    2. \(-{\bf x}\) is an eigenvector of \({\bf A}\).
    3. \({\bf x}\) is orthogonal to every column of \({\bf A}\).
    4. \({\bf A}\) is a square matrix.
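Eigenvector assertions can be probed by multiplying and comparing. A plain-Python sketch with a made-up symmetric matrix:

```python
def matvec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Made-up example: (1, 1) happens to be an eigenvector of this matrix.
A = [[2, 1], [1, 2]]
x = [1, 1]
Ax = matvec(A, x)
lam = Ax[0] / x[0]    # candidate eigenvalue from the first component
print(Ax, lam)        # [3, 3] 3.0, i.e. Ax = 3x here
```

Comparing \({\bf Ax}\) against \(\lambda {\bf x}\) for a few matrices and vectors is a quick check of which of the four claims can hold.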

  16. Suppose \({\bf X}_{m\times n}\) is a data matrix (the data samples as columns) and \({\bf Y}\) is the same data in PCA coordinates. Consider these assertions:
    I. The purpose of PCA is to reduce variance in \({\bf Y}\) so that there's less noise in the transformed data.
    II. The matrix \({\bf S}^T\) that transforms \({\bf X}\) to \({\bf Y}\) is an \(n \times m\) matrix.
    Choose one of the following as the best answer:
    1. I and II are both true.
    2. I and II are both false.
    3. I is true but II is false.
    4. II is true but I is false.
Note:
  • The above sample questions are only a guideline. Do NOT read anything into the particular topics selected above.
  • Answers to the above questions will not be provided, nor will the TA do these for you. The idea is for you to develop confidence in answering the questions yourself.