Chapter 2

SOLUTIONS TO CHAPTER 2

Background

2.1 The DFT of a sequence x(n) of length N may be expressed in matrix form as follows

\[ \mathbf{X} = \mathbf{W} \mathbf{x} \]

where \mathbf{x} = [x(0), x(1), \ldots, x(N-1)]^T is a vector containing the signal values and \mathbf{X} is a vector containing the DFT coefficients X(k).

(a) Find the matrix \mathbf{W}.
(b) What properties does the matrix \mathbf{W} have?
(c) What is the inverse of \mathbf{W}?

Solution

(a) The DFT of a sequence x(n) of length N is

\[ X(k) = \sum_{n=0}^{N-1} x(n) e^{-j \frac{2\pi}{N} kn} \]

If we define the vector

\[ \mathbf{w}_k = \left[ 1,\ e^{j \frac{2\pi}{N} k},\ e^{j \frac{2\pi}{N} 2k},\ \ldots,\ e^{j \frac{2\pi}{N} (N-1)k} \right]^T \]

then X(k) may be written as the inner product

\[ X(k) = \mathbf{w}_k^H \mathbf{x} \]

Arranging the DFT coefficients in a vector, we have

\[ \mathbf{X} = \begin{bmatrix} X(0) \\ X(1) \\ \vdots \\ X(N-1) \end{bmatrix} = \mathbf{W} \mathbf{x} \]

where, with W_N = e^{-j 2\pi / N},

\[ \mathbf{W} = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ 1 & W_N & \cdots & W_N^{N-1} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & W_N^{N-1} & \cdots & W_N^{(N-1)(N-1)} \end{bmatrix} \]

i.e., the (k, n) entry of \mathbf{W} is W_N^{kn} = e^{-j \frac{2\pi}{N} kn}.

(b) The matrix \mathbf{W} is symmetric and nonsingular. In addition, due to the orthogonality of the complex exponentials,

\[ \mathbf{w}_k^H \mathbf{w}_l = \sum_{m=0}^{N-1} e^{-j \frac{2\pi}{N} (k-l)m} = \begin{cases} N & ; \ k = l \\ 0 & ; \ k \neq l \end{cases} \]

it follows that the rows of \mathbf{W} are orthogonal, i.e., \mathbf{W}\mathbf{W}^H = N\mathbf{I}.

(c) Due to this orthogonality, the inverse of \mathbf{W} is

\[ \mathbf{W}^{-1} = \frac{1}{N} \mathbf{W}^H \]

2.3 Find the minimum norm solution to the following set of underdetermined linear equations,

\[ \begin{bmatrix} 1 & 0 & 2 & -1 \\ -1 & 1 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

Solution

With

\[ A = \begin{bmatrix} 1 & 0 & 2 & -1 \\ -1 & 1 & 0 & 1 \end{bmatrix} \]

since the rows of A are linearly independent, the minimum norm solution is unique and given by

\[ x_0 = A^T (A A^T)^{-1} b \]

With

\[ A A^T = \begin{bmatrix} 6 & -2 \\ -2 & 3 \end{bmatrix} \]

and

\[ (A A^T)^{-1} = \frac{1}{14} \begin{bmatrix} 3 & 2 \\ 2 & 6 \end{bmatrix} \]

it follows that the minimum norm solution is

\[ x_0 = \frac{1}{14} \begin{bmatrix} 1 & -1 \\ 0 & 1 \\ 2 & 0 \\ -1 & 1 \end{bmatrix} \begin{bmatrix} 3 & 2 \\ 2 & 6 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \frac{1}{14} \begin{bmatrix} -3 \\ 8 \\ 10 \\ 3 \end{bmatrix} \]
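As a quick numerical check (not part of the original solutions), the NumPy sketch below verifies the orthogonality of the DFT matrix from Problem 2.1 and recomputes the minimum norm solution of Problem 2.3; the transform length N = 8 is an arbitrary choice for the test.

```python
import numpy as np

# Problem 2.1: the N x N DFT matrix W with entries W_N^{kn} = exp(-j 2 pi k n / N)
N = 8                                    # arbitrary transform length for the check
k = np.arange(N)
W = np.exp(-2j * np.pi * np.outer(k, k) / N)
print(np.allclose(W @ W.conj().T, N * np.eye(N)))     # True: W W^H = N I
print(np.allclose(np.linalg.inv(W), W.conj().T / N))  # True: W^{-1} = (1/N) W^H

# Problem 2.3: minimum norm solution x0 = A^T (A A^T)^{-1} b
A = np.array([[ 1., 0., 2., -1.],
              [-1., 1., 0.,  1.]])
b = np.array([1., 1.])
x0 = A.T @ np.linalg.inv(A @ A.T) @ b
print(x0 * 14)                                 # [-3.  8. 10.  3.], i.e. x0 = (1/14)[-3, 8, 10, 3]^T
print(np.allclose(A @ x0, b))                  # True: x0 satisfies the equations
print(np.allclose(x0, np.linalg.pinv(A) @ b))  # True: agrees with the pseudoinverse solution
```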
2.4 Consider the set of inconsistent linear equations Ax = b given by

\[ \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} \]

(a) Find the least squares solution to these equations.
(b) Find the projection matrix P_A.
(c) Find the best approximation \bar{b} = P_A b to b.
(d) Consider the matrix P_A^\perp = I - P_A. Find the vector b^\perp = P_A^\perp b and show that it is orthogonal to \bar{b}. What does the matrix P_A^\perp represent?

Solution

(a) Since the columns of A are linearly independent, the least squares solution is unique and given by

\[ x_0 = (A^T A)^{-1} A^T b \]

With

\[ A^T A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \]

it follows that

\[ (A^T A)^{-1} = \frac{1}{3} \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix} \]

and, therefore,

\[ x_0 = \frac{1}{3} \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix} \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} = \frac{1}{3} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

(b) The projection matrix is

\[ P_A = A (A^T A)^{-1} A^T = \frac{1}{3} \begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} \]

(c) The best approximation to b is

\[ \bar{b} = P_A b = \frac{1}{3} \begin{bmatrix} 2 & -1 & 1 \\ -1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} = \frac{1}{3} \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix} \]

(d) The matrix P_A^\perp is

\[ P_A^\perp = I - P_A = \frac{1}{3} \begin{bmatrix} 1 & 1 & -1 \\ 1 & 1 & -1 \\ -1 & -1 & 1 \end{bmatrix} \]

and

\[ b^\perp = P_A^\perp b = \frac{1}{3} \begin{bmatrix} 2 \\ 2 \\ -2 \end{bmatrix} \]

The inner product between \bar{b} and b^\perp is

\[ \langle \bar{b}, b^\perp \rangle = \frac{1}{9} \begin{bmatrix} 1 & 1 & 2 \end{bmatrix} \begin{bmatrix} 2 \\ 2 \\ -2 \end{bmatrix} = 0 \]

Therefore, \bar{b} is orthogonal to b^\perp. The matrix P_A^\perp is a projection matrix that projects a vector onto the space that is orthogonal to the space spanned by the columns of A.

2.5 Consider the problem of trying to model a sequence x(n) as the sum of a constant plus a complex exponential of frequency \omega_0,

\[ \hat{x}(n) = c + a e^{j n \omega_0} \ ; \quad n = 0, 1, \ldots, N-1 \]

where c and a are unknown. We may express the problem of finding the values for c and a as one of solving a set of overdetermined linear equations

\[ \begin{bmatrix} 1 & 1 \\ 1 & e^{j\omega_0} \\ \vdots & \vdots \\ 1 & e^{j(N-1)\omega_0} \end{bmatrix} \begin{bmatrix} c \\ a \end{bmatrix} = \begin{bmatrix} x(0) \\ x(1) \\ \vdots \\ x(N-1) \end{bmatrix} \]

(a) Find the least squares solution for c and a.
(b) If N is even and \omega_0 = 2\pi k / N for some integer k, find the least squares solution for c and a.

Solution

(a) Assuming that \omega_0 \neq 0, 2\pi, \ldots, the columns of the matrix

\[ A = \begin{bmatrix} 1 & 1 \\ 1 & e^{j\omega_0} \\ \vdots & \vdots \\ 1 & e^{j(N-1)\omega_0} \end{bmatrix} \]

are linearly independent, and the least squares solution for c and a is given by

\[ \begin{bmatrix} c \\ a \end{bmatrix} = (A^H A)^{-1} A^H \mathbf{x} \]

Since

\[ A^H A = \begin{bmatrix} N & \sum_{n=0}^{N-1} e^{jn\omega_0} \\ \sum_{n=0}^{N-1} e^{-jn\omega_0} & N \end{bmatrix} \]

and, summing the geometric series,

\[ \left| \sum_{n=0}^{N-1} e^{jn\omega_0} \right|^2 = \left| \frac{1 - e^{jN\omega_0}}{1 - e^{j\omega_0}} \right|^2 = \frac{1 - \cos N\omega_0}{1 - \cos \omega_0} \]

the inverse of A^H A is

\[ (A^H A)^{-1} = \frac{1}{N^2 - \dfrac{1 - \cos N\omega_0}{1 - \cos \omega_0}} \begin{bmatrix} N & -\sum_{n=0}^{N-1} e^{jn\omega_0} \\ -\sum_{n=0}^{N-1} e^{-jn\omega_0} & N \end{bmatrix} \]

With

\[ A^H \mathbf{x} = \begin{bmatrix} \sum_{n=0}^{N-1} x(n) \\ \sum_{n=0}^{N-1} x(n) e^{-jn\omega_0} \end{bmatrix} \]

the least squares solution may be expressed in terms of sums of the sequence as

\[ \begin{bmatrix} c \\ a \end{bmatrix} = \frac{1}{N^2 - \dfrac{1 - \cos N\omega_0}{1 - \cos \omega_0}} \begin{bmatrix} N \displaystyle\sum_{n=0}^{N-1} x(n) - \left( \displaystyle\sum_{n=0}^{N-1} e^{jn\omega_0} \right) \displaystyle\sum_{n=0}^{N-1} x(n) e^{-jn\omega_0} \\[2ex] N \displaystyle\sum_{n=0}^{N-1} x(n) e^{-jn\omega_0} - \left( \displaystyle\sum_{n=0}^{N-1} e^{-jn\omega_0} \right) \displaystyle\sum_{n=0}^{N-1} x(n) \end{bmatrix} \]

(b) If \omega_0 = 2\pi k / N and k \neq 0, then

\[ \sum_{n=0}^{N-1} e^{jn\omega_0} = \frac{1 - e^{jN\omega_0}}{1 - e^{j\omega_0}} = \frac{1 - e^{j2\pi k}}{1 - e^{j2\pi k / N}} = 0 \]

so that A^H A = N I. Therefore, we have

\[ \begin{bmatrix} c \\ a \end{bmatrix} = \frac{1}{N} \begin{bmatrix} \sum_{n=0}^{N-1} x(n) \\ \sum_{n=0}^{N-1} x(n) e^{-jn\omega_0} \end{bmatrix} \]

i.e., c is the average of x(n) and a is 1/N times the N-point DFT of x(n) evaluated at the frequency \omega_0 = 2\pi k / N.
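The reduction in part (b) of Problem 2.5 is easy to confirm numerically. The following sketch (not part of the original solution) fits c and a by ordinary least squares for an arbitrary test signal with \omega_0 = 2\pi k / N and compares the result with the sample mean and the scaled DFT coefficient; the signal, N, and k below are assumed only for illustration.

```python
import numpy as np

# Problem 2.5(b): with w0 = 2*pi*k/N the least squares fit reduces to
#   c = (1/N) sum x(n),   a = (1/N) sum x(n) e^{-j n w0}
N, k = 16, 3
w0 = 2 * np.pi * k / N
n = np.arange(N)

# an arbitrary test signal, assumed here only for the check
x = 2.0 + 0.5 * np.exp(1j * w0 * n) + 0.1 * np.random.randn(N)

# direct least squares solution of the overdetermined system A [c, a]^T = x
A = np.column_stack([np.ones(N), np.exp(1j * w0 * n)])
c, a = np.linalg.lstsq(A, x, rcond=None)[0]

# closed-form result from part (b)
c_closed = x.mean()
a_closed = (x * np.exp(-1j * w0 * n)).sum() / N

print(np.allclose(c, c_closed), np.allclose(a, a_closed))  # True True
```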
2.6 It is known that the sum of the squares of n from n = 0 to N - 1 has a closed form expression of the following form

\[ \sum_{n=0}^{N-1} n^2 = a_0 + a_1 N + a_2 N^2 + a_3 N^3 \]

Given that a third-order polynomial is uniquely determined in terms of the values of the polynomial at four distinct points, derive a closed form expression for this sum by setting up a set of linear equations and solving these equations for a_0, a_1, a_2, a_3. Compare your solution to that given in Table 2.3.

Solution

Assuming that

\[ \sum_{n=0}^{N-1} n^2 = a_0 + a_1 N + a_2 N^2 + a_3 N^3 \]

we may evaluate this sum for N = 1, 2, 3, 4 and write down the following set of four equations in four unknowns,

\[ \begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 2 & 4 & 8 \\ 1 & 3 & 9 & 27 \\ 1 & 4 & 16 & 64 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 5 \\ 14 \end{bmatrix} \]

Solving these equations for a_0, a_1, a_2, and a_3, we find

a_0 = 0, a_1 = 1/6, a_2 = -1/2, a_3 = 1/3

which gives the following closed-form expression for the sum,

\[ \sum_{n=0}^{N-1} n^2 = \frac{1}{6} N - \frac{1}{2} N^2 + \frac{1}{3} N^3 \]

2.7 Show that a projection matrix P_A has the following two properties,

1. It is idempotent, P_A^2 = P_A.
2. It is Hermitian.

Solution

Given a matrix A with linearly independent columns, the projection matrix P_A is

\[ P_A = A (A^H A)^{-1} A^H \]

Therefore,

\[ P_A^2 = A (A^H A)^{-1} A^H A (A^H A)^{-1} A^H = A (A^H A)^{-1} A^H = P_A \]

and it follows that P_A is idempotent. Also,

\[ P_A^H = \left[ A (A^H A)^{-1} A^H \right]^H = A \left[ (A^H A)^{-1} \right]^H A^H \]

Since A^H A is Hermitian, then so is its inverse,

\[ \left[ (A^H A)^{-1} \right]^H = (A^H A)^{-1} \]

Thus, P_A^H = A (A^H A)^{-1} A^H = P_A, and P_A is Hermitian.
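As a final numerical check (again, not part of the original solutions), the short script below solves the 4 x 4 system from Problem 2.6 and verifies the two projection matrix properties of Problem 2.7 for a randomly generated complex matrix A; the size of A and the random seed are arbitrary choices for the test.

```python
import numpy as np

# Problem 2.6: solve the 4x4 system for the polynomial coefficients
V = np.array([[1, 1, 1, 1],
              [1, 2, 4, 8],
              [1, 3, 9, 27],
              [1, 4, 16, 64]], dtype=float)
s = np.array([0., 1., 5., 14.])      # sum of n^2 for n = 0..N-1 evaluated at N = 1, 2, 3, 4
print(np.linalg.solve(V, s))         # [0, 1/6, -1/2, 1/3]

# Problem 2.7: P_A = A (A^H A)^{-1} A^H is idempotent and Hermitian
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2)) + 1j * rng.standard_normal((5, 2))
P = A @ np.linalg.inv(A.conj().T @ A) @ A.conj().T
print(np.allclose(P @ P, P))         # True: idempotent
print(np.allclose(P, P.conj().T))    # True: Hermitian
```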