# NISP Commit Details

Date: 2015-03-30 22:21:26 (3 years 8 months ago)
Author: Michael Baudin
Revision: 332 (parent: 331)
Message: Fixed typos.

## File differences

```diff
-% Copyright (C) 2013 - Michael Baudin
+% Copyright (C) 2013 - 2015 - Michael Baudin
 %
 % This file must be used under the terms of the
 % Creative Commons Attribution-ShareAlike 3.0 Unported License :
```

```diff
 \newpage
 $\textrm{ }$
 \vfill
-Copyright \copyright{} 2013 - Michael Baudin
+Copyright \copyright{} 2013 - 2015 - Michael Baudin

 This file must be used under the terms of the
 Creative Commons Attribution-ShareAlike 3.0 Unported License:
```
```diff
 \label{def-polymonic}
 We denote by $\PP_n$ the set of real polynomials with
 degree $n$, i.e. $p_n\in\PP_n$ if :
-$$
 \begin{eqnarray}
 \label{def-poly}
 p_n(x)=a_{n+1} x^n + a_n x^{n-1} + \ldots + a_1,
-$$
 \end{eqnarray}
 for any $x\in I$, where $a_{n+1}$, $a_n$,..., $a_1$ are
 real numbers.
 In this case, the degree of the polynomial $p_n$ is $n$.

 \begin{definition}
 (\emph{Orthogonal polynomials})
-The set of polynomials $\{P_n\}_{n\geq 0}$ are orthogonal polynomials if
-$P_n$ is a polynomial of degree $n$ and:
+The set of polynomials $\{p_n\}_{n\geq 0}$ are orthogonal polynomials if
+$p_n$ is a polynomial of degree $n$ and:
 \begin{eqnarray*}
-\lwdotprod{P_i}{P_j}=0
+\lwdotprod{p_i}{p_j}=0
 \end{eqnarray*}
 for $i\neq j$.
 \end{definition}


 \begin{definition}
 \label{def-orthopoly}
 (\emph{Orthonormal polynomials})
-The set of polynomials $\{P_n\}_{n\geq 0}$ are orthonormal polynomials if
-$P_n$ is a polynomial of degree $n$ and:
+The set of polynomials $\{p_n\}_{n\geq 0}$ are orthonormal polynomials if
+$p_n$ is a polynomial of degree $n$ and:
 \begin{eqnarray*}
-\lwdotprod{P_i}{P_j}=\delta_{ij}
+\lwdotprod{p_i}{p_j}=\delta_{ij}
 \end{eqnarray*}
 for $i\neq j$.
 \end{definition}
 \begin{proposition}
 (\emph{Integral of orthogonal polynomials})
 \label{prop-integpoly}
-Let $\{P_n\}_{n\geq 0}$ be orthogonal polynomials.
+Let $\{p_n\}_{n\geq 0}$ be orthogonal polynomials.
 We have
 \begin{eqnarray}
 \label{eq-integpoly}
 Moreover, for $n\geq 1$, we have
 \begin{eqnarray}
 \label{eq-integpoly2}
-\int_I P_n(x) w(x) dx
+\int_I p_n(x) w(x) dx
 &=& 0
 \end{eqnarray}
 \end{proposition}
 The equation \ref{eq-integpoly} is the straightforward consequence of \ref{eq-integpoly0}.
```
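The orthogonality relation in the hunk above ($\lwdotprod{p_i}{p_j}=0$ for $i\neq j$) can be checked numerically for the Legendre family, whose weight is $w(x)=1$ on $I=[-1,1]$. The following is a hypothetical Python/NumPy sketch for illustration only; it is not part of the NISP patch (the toolbox itself is Scilab):

```python
import numpy as np
from numpy.polynomial import legendre

# For the Legendre weight w(x) = 1 on I = [-1, 1], check numerically that
# <P_i, P_j> = \int_I P_i(x) P_j(x) w(x) dx = 0 whenever i != j.
x, wq = legendre.leggauss(50)  # 50-node Gauss-Legendre rule, exact to degree 99

def dot(i, j):
    """Weighted scalar product of the Legendre polynomials P_i and P_j."""
    Pi = legendre.Legendre.basis(i)(x)
    Pj = legendre.Legendre.basis(j)(x)
    return np.sum(wq * Pi * Pj)

print(dot(2, 5))   # ~0: orthogonality for i != j
print(dot(3, 3))   # ~2/7: the squared norm ||P_3||^2 = 2/(2n+1) with n = 3
```

The same check applied to `dot(n, n)` recovers the squared norms $\|P_n\|^2 = 2/(2n+1)$ listed later in the document's table.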
```diff

 Moreover, for any $n\geq 1$, we have
 \begin{eqnarray*}
-\int_I P_n(x) w(x) dx
-&=& \int_I P_0(x) P_n(x) w(x) dx \\
-&=& \lwdotprod{P_0(x)}{P_n(x)} \\
+\int_I p_n(x) w(x) dx
+&=& \int_I P_0(x) p_n(x) w(x) dx \\
+&=& \lwdotprod{P_0(x)}{p_n(x)} \\
 &=& 0,
 \end{eqnarray*}
 by the orthogonality property.
 \subsection{Orthogonal polynomials for probabilities}
 \label{sec-probaorth}

-In this section, we present the properties of $P_n(X)$,
+In this section, we present the properties of $p_n(X)$,
 when $X$ is a random variable associated with the orthogonal polynomials
-$\{P_n\}_{n\geq 0}$.
+$\{p_n\}_{n\geq 0}$.

 \index{Distribution function}
-\begin{proposition}
+\begin{proposition}
 (\emph{Expectation of orthogonal polynomials})
 \label{prop-expecpoly}
-Let $\{P_n\}_{n\geq 0}$ be orthogonal polynomials.
+Let $\{p_n\}_{n\geq 0}$ be orthogonal polynomials.
 Assume that $X$ is a random variable associated with the probability distribution
 function $f$, derived from the weight function $w$.
 We have
 Moreover, for $n\geq 1$, we have
 \begin{eqnarray}
 \label{eq-expecpoly2}
-E(P_n(X))&=& 0.
+E(p_n(X))&=& 0.
 \end{eqnarray}
 \end{proposition}

 since $f$ is a distribution function.
 Moreover, for any $n\geq 1$, we have
 \begin{eqnarray*}
-E(P_n(X))
-&=& \int_I P_n(x) f(x) dx \\
-&=& \frac{1}{\int_I w(x) dx} \int_I P_n(x) w(x) dx \\
+E(p_n(X))
+&=& \int_I p_n(x) f(x) dx \\
+&=& \frac{1}{\int_I w(x) dx} \int_I p_n(x) w(x) dx \\
 &=& 0,
 \end{eqnarray*}
 where the first equation derives from the equation \ref{eq-unifw},
 \begin{proposition}
 (\emph{Variance of orthogonal polynomials})
 \label{prop-varpoly}
-Let $\{P_n\}_{n\geq 0}$ be orthogonal polynomials.
+Let $\{p_n\}_{n\geq 0}$ be orthogonal polynomials.
 Assume that $x$ is a random variable associated with the probability distribution
 function $f$, derived from the weight function $w$.
```
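The expectation property above, $E(p_n(X))=0$ for $n\geq 1$, can be illustrated with the probabilists' Hermite polynomials $He_n$, which are associated with $X\sim\mathcal{N}(0,1)$ and the weight $w(x)=\exp(-x^2/2)$ (so $\int_I w(x)dx=\sqrt{2\pi}$). A hypothetical Python/NumPy sketch, not part of the patched Scilab code:

```python
import numpy as np
from numpy.polynomial import hermite_e

# X ~ N(0,1) is associated with the probabilists' Hermite polynomials He_n.
# Gauss-HermiteE quadrature integrates exp(-x^2/2) * f(x) over the real line.
x, wq = hermite_e.hermegauss(40)

def expectation(n):
    """E(He_n(X)) = (1 / \\int_I w(x) dx) * \\int_I He_n(x) w(x) dx."""
    He = hermite_e.HermiteE.basis(n)(x)
    return np.sum(wq * He) / np.sqrt(2.0 * np.pi)

print(expectation(0))  # ~1, since He_0 = 1 and f integrates to one
print(expectation(4))  # ~0, as the proposition states for n >= 1
```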
```diff

 We have
 Moreover, for $n\geq 1$, we have
 \begin{eqnarray}
 \label{eq-varpoly2}
-V(P_n(X))&=& \frac{\|P_n\|^2}{\int_I w(x) dx}.
+V(p_n(X))&=& \frac{\|p_n\|^2}{\int_I w(x) dx}.
 \end{eqnarray}
 \end{proposition}

 The equation \ref{eq-varpoly1} is implied by the fact that $P_0$ is a constant.
 Moreover, for $n\geq 1$, we have:
 \begin{eqnarray*}
-V(P_n(X))
-&=& E\left(\left(P_n(X)-E(P_n(X))\right)^2\right) \\
-&=& E\left(P_n(X)^2\right) \\
-&=& \int_I P_n(x)^2 f(x) dx \\
-&=& \frac{1}{\int_I w(x)dx} \int_I P_n(x)^2 w(x) dx,
+V(p_n(X))
+&=& E\left(\left(p_n(X)-E(p_n(X))\right)^2\right) \\
+&=& E\left(p_n(X)^2\right) \\
+&=& \int_I p_n(x)^2 f(x) dx \\
+&=& \frac{1}{\int_I w(x)dx} \int_I p_n(x)^2 w(x) dx,
 \end{eqnarray*}
 where the second equality is implied by the equation \ref{eq-expecpoly2}.
 \end{proof}

 \begin{proposition}
 \label{prop-exppipj}
-Let $\{P_n\}_{n\geq 0}$ be orthogonal polynomials.
+Let $\{p_n\}_{n\geq 0}$ be orthogonal polynomials.
 Assume that $x$ is a random variable associated with the probability distribution
 function $f$, derived from the weight function $w$.
 For two integers $i,j\geq 0$, we have
 \begin{eqnarray}
 \label{eq-exppipj1}
-E(P_i(X)P_j(X))=0
+E(p_i(X)p_j(X))=0
 \end{eqnarray}
 if $i\neq j$.
 Moreover, if $i\geq 1$, then
 \begin{eqnarray}
 \label{eq-exppipj2}
-E(P_i(X)^2) = V(P_i(X)).
+E(p_i(X)^2) = V(p_i(X)).
 \end{eqnarray}
 \end{proposition}

 \begin{proof}
 We have
 \begin{eqnarray*}
-E(P_i(X)P_j(X))
-&=& \int_I P_i(x) P_j(x) f(x) dx \\
-&=& \frac{1}{\int_I w(x)dx} \int_I P_i(x) P_j(x) w(x) dx \\
-&=& \frac{\lwdotprod{P_i}{P_j}}{\int_I w(x)dx}.
+E(p_i(X)p_j(X))
+&=& \int_I p_i(x) p_j(x) f(x) dx \\
+&=& \frac{1}{\int_I w(x)dx} \int_I p_i(x) p_j(x) w(x) dx \\
+&=& \frac{\lwdotprod{p_i}{p_j}}{\int_I w(x)dx}.
 \end{eqnarray*}
 If $i\neq j$, the orthogonality of the polynomials implies \ref{eq-exppipj1}.
```
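The variance formula above, $V(p_n(X))=\|p_n\|^2/\int_I w(x)dx$, predicts $V(P_n(X))=1/(2n+1)$ for the Legendre case $X\sim\mathcal{U}(-1,1)$, since $\|P_n\|^2=2/(2n+1)$ and $\int_I w(x)dx=2$. A hypothetical Python/NumPy check, for illustration only (not part of the NISP patch):

```python
import numpy as np
from numpy.polynomial import legendre

# X ~ U(-1,1): Legendre weight w(x) = 1 on [-1, 1], \int_I w(x) dx = 2.
# The proposition gives V(P_n(X)) = ||P_n||^2 / 2 = 1/(2n+1).
x, wq = legendre.leggauss(50)

def variance(n):
    Pn = legendre.Legendre.basis(n)(x)
    # E(P_n(X)) = 0 for n >= 1, hence V(P_n(X)) = E(P_n(X)^2).
    return np.sum(wq * Pn**2) / 2.0

print(variance(3), 1.0 / (2 * 3 + 1))  # both ~1/7
```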
```diff

 If, on the other hand, we have $i=j\geq 1$, then
 \begin{eqnarray*}
-E(P_i(X)^2)
-&=& \frac{\lwdotprod{P_i}{P_i}}{\int_I w(x)dx} \\
-&=& \frac{\|P_i\|^2}{\int_I w(x)dx}.
+E(p_i(X)^2)
+&=& \frac{\lwdotprod{p_i}{p_i}}{\int_I w(x)dx} \\
+&=& \frac{\|p_i\|^2}{\int_I w(x)dx}.
 \end{eqnarray*}
 We then use the equation \ref{eq-varpoly2}, which leads to
 the equation \ref{eq-exppipj2}.
 \begin{center}
 \begin{tabular}{lllllll}
 \hline
-Distrib. & Support & Poly. & $w(x)$ & $f(x)$ & $\|P_n\|^2$ & $V(P_n)$\\
+Distrib. & Support & Poly. & $w(x)$ & $f(x)$ & $\|p_n\|^2$ & $V(p_n)$\\
 \hline
 $\mathcal{N}(0,1)$ & $\RR$ & Hermite & $\exp\left(-\frac{x^2}{2}\right)$ & $\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right)$ & $\sqrt{2\pi} n!$ & $n!$\\
 $\mathcal{U}(-1,1)$ & $[-1,1]$ & Legendre & $1$ & $\frac{1}{2}$ & $\frac{2}{2n+1}$ & $\frac{1}{2n+1}$ \\
 (\emph{Degree of exactness})
 \label{def-degexact}
 The degree of exactness of a quadrature rule is $d$ if
-$d$ is the largest degree for which, for any polynomial
-$p_d\in\PP_d$, we have
+$d$ is the largest degree for which we have
 \begin{eqnarray}
 \label{eq-degexact}
-I(p_d)=I_n(p_d).
+I(p_d)=I_n(p_d),
 \end{eqnarray}
+for any polynomial $p_d\in\PP_d$.
 \end{definition}

 In other words, the degree of exactness of a quadrature rule
 \label{def-maximalquad}
 Let $m>0$ be and integer.
 The quadrature rule \ref{eq-quadrule} has degree of exactness
-$n+m$ is and only if
+$n+m$ if and only if
 \begin{enumerate}
 \item the formula \ref{eq-quadrule} is interpolatory,
 \item for any $p_{m-1} \in \PP_{m-1}$, we have
 \end{proposition}

 \begin{proof}
-We are going to prove that $m\leq n+1$, then the proposition \ref{def-maximalquad}
+We are going to prove that $m\leq n+1$.
+Then the proposition \ref{def-maximalquad}
 implies that the maximum degree of exactness is $n+m=n+n+1=2n+1$.

 Let us prove this by contradiction : suppose that $m\geq n+2$.
```
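The Hermite row of the table above claims $\|He_n\|^2=\sqrt{2\pi}\,n!$ and hence $V(He_n(X))=n!$. This can be checked numerically; a hypothetical Python/NumPy sketch (an illustration, not part of the patched LaTeX or Scilab sources):

```python
import math
import numpy as np
from numpy.polynomial import hermite_e

# Check the Hermite row of the table: ||He_n||^2 = sqrt(2*pi) * n!,
# where the weight is w(x) = exp(-x^2/2) over the real line.
x, wq = hermite_e.hermegauss(40)

def sqnorm(n):
    """Squared weighted norm \\int He_n(x)^2 exp(-x^2/2) dx."""
    He = hermite_e.HermiteE.basis(n)(x)
    return np.sum(wq * He**2)

n = 5
ratio = sqnorm(n) / (math.sqrt(2.0 * math.pi) * math.factorial(n))
print(ratio)  # ~1, confirming ||He_5||^2 = sqrt(2*pi) * 5!
```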
```diff

-Then the equality \ref{eq-maximalquad} is true for $m=n+2$, which
-implies that the polynomial $\omega_{n+1}\in\PP_{m-1}$ satisfies
-the equality :
+Then the equality \ref{eq-maximalquad} is true for $m=n+2$.
+Therefore $n+1=m-1$, which
+implies that the polynomial $\omega_{n+1}\in\PP_{n+1}=\PP_{m-1}$
+satisfies the equality :
 $$
 \int_I \omega_{n+1}^2(x)w(x)dx=0.
 $$
 Since the weight $w$ is by hypothesis continuous, nonnegative
-and with a nonnegative integral, this implies that $\omega_{n+1}=0$,
-which is impossible.
+and with a nonnegative integral, the previous equation
+implies that $\omega_{n+1}=0$, which is impossible.
 \end{proof}

 We have seen that the maximum possible value of $m$ is $n+1$.
 Gaussian quadrature.
 \end{definition}

-In the following, we denote by $\{\pi_k\}_{i=1,...,n}$ the
+In the following, we denote by $\{\pi_k\}_{k=1,...,n}$ the
 monic node polynomials associated with the equation~\ref{eq-nodepolycond2}.

 \begin{proposition}
 (\emph{Properties of node polynomials})
 \label{prop-propnodpoly}
 \begin{enumerate}
-\item The polynomials $\{\pi_k\}_{i=1,...,n}$ are orthogonals.
+\item The polynomials $\{\pi_k\}_{k=1,...,n}$ are orthogonals.
 \item They are linearly independent.
-\item The polynomials $\{\pi_k\}_{i=1,...,n}$ are a basis of $\PP_n$.
+\item The polynomials $\{\pi_k\}_{k=1,...,n}$ are a basis of $\PP_n$.
 \end{enumerate}
 \end{proposition}

 \begin{proof}
 \begin{enumerate}
-\item Let us prove that the polynomials $\{\pi_k\}_{i=1,...,n}$ are orthogonals.
+\item Let us prove that the polynomials $\{\pi_k\}_{k=1,...,n}$ are orthogonals.
 Consider two polynomials $\pi_i$ and $\pi_j$ satisfying the equation
 \ref{eq-nodepolycond2}, with $i\neq j$.
 Without loss of generality, we can assume that $i>j$ (otherwise, we
 $$
 which concludes the proof.
```
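The maximum degree of exactness $2n+1$ derived above is attained by Gaussian quadrature: an $(n+1)$-node Gauss-Legendre rule integrates every polynomial of degree up to $2(n+1)-1=2n+1$ exactly. A hypothetical Python/NumPy demonstration, not part of the NISP patch:

```python
import numpy as np
from numpy.polynomial import legendre, Polynomial

# An (n+1)-node Gauss-Legendre rule has degree of exactness 2n+1:
# here n + 1 = 4 nodes, so every degree-7 polynomial on [-1, 1] is exact.
x, wq = legendre.leggauss(4)

coefs = np.arange(1.0, 9.0)          # arbitrary degree-7 polynomial (power basis)
p = Polynomial(coefs)
exact = p.integ()(1.0) - p.integ()(-1.0)   # exact integral over [-1, 1]
quad = np.sum(wq * p(x))                   # 4-node quadrature value
print(abs(quad - exact))             # ~0: degree 7 integrated exactly by 4 nodes
```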
```diff


-\item The fact that the orthogonal polynomials \{\pi_k\}_{i=1,...,n}
+\item The fact that the orthogonal polynomials \{\pi_k\}_{k=1,...,n}
 are linearly independent is a result of linear algebra.
 Indeed, for any real numbers \alpha_1,..., \alpha_n,
 assume that
 since the other terms in the sum are zero, by orthogonality.
 Suppose (\pi_i,\pi_i)= 0. This would contradict the hypothesis
 that the weight w is a continuous, nonnegative function.
-This implies that, necessarily, (\pi_i,\pi_i)> 0.
+This implies that, necessarily, we have (\pi_i,\pi_i)> 0.
 We combine this inequality with the previous equality,
 and get \alpha_i=0, which shows that the orthogonal
 polynomials are linearily independent.

-\item Linear algebra shows that the polynomials \{\pi_k\}_{i=1,...,n}
+\item Linear algebra shows that the polynomials \{\pi_k\}_{k=1,...,n}
 are a basis of \PP_n.
 Indeed, consider the following set of polynomials :
 $$
-1, \quad x, x^2, ..., x^n.
+1, x, x^2, ..., x^n.
 $$
 The definition \ref{def-polymonic} shows that the previous polynomials
 are a basis of \PP_n, since any degree n polynomial is equal
 to a linear combination of these n polynomials.
 Therefore, the dimension of the vector space \PP_n is n.
```
```diff

-However, the n polynomials \{\pi_k\}_{i=1,...,n} are linearly
+However, the n polynomials \{\pi_k\}_{k=1,...,n} are linearly
 independent, which implies that these are a basis of \PP_n, and
 concludes the proof.
 \end{enumerate}
 \begin{proposition}
 (\emph{Three term recurrence of monic orthogonal polynomials})
 \label{prop-threeterm}
-Assume \{\pi_k\}_{i=-1,0,1,...,n} is a family of monic orthogonal
+Assume \{\pi_k\}_{k=-1,0,1,...,n} is a family of monic orthogonal
 polynomials, with
 $$
 \pi_{-1}=0, \qquad \pi_0=1.
 \end{proposition}

 In the previous proposition, let us make clear that the
-scalar product $(x\pi_k,\pi_k)$ in $\alpha_k$ implies the
+scalar product $(x\pi_k,\pi_k)$ in $\alpha_k$ involves the
 polynomial $x\pi_k(x)$, for any $x\in I$.

 Notice that the proposition does not state the value of
 lower or equal to $k$.
 Indeed, both $\pi_{k+1}$ and $\pi_k$ are monic, so that the leading term $x^{k+1}$
 cancels.
-Since the orthogonal polynomials are a basis of $\PP_k$, we have
+Since the orthogonal polynomials $\{\pi_0,\pi_1,...,\pi_k\}$
+are a basis of $\PP_k$, we have
 \begin{eqnarray}
 \label{eq-threeterm4}
 \pi_{k+1}-x\pi_k = -\alpha_k \pi_k - \beta_k \pi_{k-1}
 for $k=0,1,...,n$, where $\alpha_k$, $\beta_k$ and $\gamma_{kj}$,
 for $j=0,...,k-2$, are real numbers.

-First, the scalar product of the equation \ref{eq-threeterm4} with $\pi_k$
+\begin{enumerate}
+\item The scalar product of the equation \ref{eq-threeterm4} with $\pi_k$
 is :
 $$
 -(x\pi_k,\pi_k)=-\alpha_k(\pi_k,\pi_k),
 $$
-since the orthogonality of the polynomials implies that the other terms in the sum
-are zero.
+since the orthogonality of the polynomials implies that the other
+terms in the sum are zero.
```
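The three-term recurrence discussed above, $\pi_{k+1}=(x-\alpha_k)\pi_k-\beta_k\pi_{k-1}$ with $\pi_{-1}=0$ and $\pi_0=1$, can be exercised on the monic Legendre polynomials, for which $\alpha_k=0$ (symmetric weight) and $\beta_k=k^2/(4k^2-1)$. These coefficient formulas are standard values assumed for this illustration; the sketch is Python/NumPy, not part of the patched Scilab code:

```python
import math
import numpy as np
from numpy.polynomial import legendre

def monic_legendre(n, t):
    """Evaluate the monic Legendre polynomial pi_n at t via the
    three-term recurrence pi_{k+1} = (t - alpha_k) pi_k - beta_k pi_{k-1},
    with alpha_k = 0 and beta_k = k^2 / (4k^2 - 1) (assumed standard values)."""
    pm1, p = 0.0, 1.0                  # pi_{-1} = 0, pi_0 = 1
    for k in range(n):
        beta = k**2 / (4.0 * k**2 - 1.0) if k > 0 else 0.0
        pm1, p = p, t * p - beta * pm1
    return p

# Compare with numpy's Legendre P_n, rescaled to be monic: the leading
# coefficient of P_n is (2n)! / (2^n (n!)^2).
t, n = 0.3, 5
lead = math.factorial(2 * n) / (2**n * math.factorial(n)**2)
ref = legendre.Legendre.basis(n)(t) / lead
print(monic_legendre(n, t), ref)       # the two values should agree
```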
```diff

 The previous equation immediately leads to the equation \ref{eq-threeterm2}.

-Second, the scalar product of the equation \ref{eq-threeterm4} with $\pi_{k-1}$
+\item The scalar product of the equation \ref{eq-threeterm4} with $\pi_{k-1}$
 is :
 \begin{eqnarray}
 \label{eq-threeterm5}
 &=& (\pi_k,x\pi_{k-1}). \label{eq-threeterm6}
 \end{eqnarray}
 Moreover, the polynomial $x\pi_{k-1}$ is a monic degree $k$ polynomial.
-Hence, it can be decomposed as :
+Hence, it can be decomposed as~:
 $$
 x\pi_{k-1} = \pi_k + c_k\pi_{k-1} + ... + c_1\pi_0,
 $$
 $$
 (x\pi_k,\pi_{k-1}) = (\pi_k,\pi_k).
 $$
-We previous equation can be combined with \ref{eq-threeterm5}, which leads
+The previous equation can be combined with \ref{eq-threeterm5}, which leads
 to \ref{eq-threeterm3}.

-Thirdly, in order to prove the equation \ref{eq-threeterm1}, we are going
+\item In order to prove the equation \ref{eq-threeterm1}, we are going
 to use the equation \ref{eq-threeterm4} and prove that, for any $j=0,1,...,k-2$,
 we have $\gamma_{kj}=0$.
 Using orthogonality, the scalar product of the equation \ref{eq-threeterm4} with $\pi_j$ is :
 \gamma_{kj}(\pi_j,\pi_j)=0.
 $$
 However, we know that (\pi_j,\pi_j)>0, which concludes the proof.
```
```diff

+\end{enumerate}
 \end{proof}

 The three-term recurrence \ref{eq-threeterm1} is for monic orthogonal
 \begin{proposition}
 (\emph{Three term recurrence of orthogonal polynomials})
 \label{prop-threetermgen}
-Assume \{p_k\}_{i=-1,0,1,...,n} is a family of orthogonal
+Assume \{p_k\}_{k=-1,0,1,...,n} is a family of orthogonal
 polynomials, with
 $$
 p_{-1}=0, \qquad p_0=\frac{1}{\gamma_0},
 \end{proof}

 Finally, we can normalize the polynomials and get orthonormal
 polynomials.

 \begin{definition}
 (\emph{Orthonormal polynomials})
 The set of polynomials $\{p_k\}_{k\geq 0}$ are orthonormal polynomials if
 $p_k$ is a polynomial of degree $k$ and:
 \begin{eqnarray*}
 \lwdotprod{p_i}{p_j}=0
 \end{eqnarray*}
 for $i\neq j$ and
 \begin{eqnarray*}
 \lwdotprod{p_i}{p_i}=1,
 \end{eqnarray*}
 for any integer $i$.
 \end{definition}

 polynomials as presented in the definition \ref{def-orthopoly}.
 The following proposition is a straightforward consequence of
 the proposition \ref{prop-threeterm}.
 Notice that, as we normalize the polynomials, they are not monic
 \begin{proposition}
 (\emph{Three term recurrence of orthonormal polynomials})
 \label{prop-threetermnorm}
-Assume $\{p_k\}_{i=-1,0,1,...,n}$ is a family of orthonormal
+Assume $\{p_k\}_{k=-1,0,1,...,n}$ is a family of orthonormal
 polynomials, with
 \begin{eqnarray}
 p_{-1}=0, \qquad p_0=\frac{1}{\sqrt{\beta_0}},

 \begin{proof}
 We use the proposition \ref{prop-threetermgen} with
-$\gamma_k=\|pi_k\|$.
+$\gamma_k=\|p_k\|$.
 The equation \ref{eq-threetermgen4} implies

 \|p_k\| = \frac{\|\pi_k\|}{\|\pi_k\|} = 1,
 This is the equation used by the \scifun{chebyshev\_quadrature}
 function.
```
```diff


-The other method to compute uses the \scifun{chebyshev\_poly},
-presented in the section \ref{sec-chebyshev}, which
+The other method uses the \scifun{chebyshev\_poly} function
+(see the section \ref{sec-chebyshev}), which
 computes the Chebyshev polynomial.
 More precisely, Scilab uses a data structure based on its
 coefficients.
 It is then straightforward to use the \scifun{roots} function,
 which returns the roots of the polynomial, based on the
 eigenvalues of the companion matrix \cite{Edelman1994}.
 By definition, the companion matrix of the polynomial defined by
 the equation \ref{def-poly} is :
 \begin{eqnarray*}
 C(p)=
 \begin{pmatrix}
 0 & 0 & \ldots & 0 & -a_1/a_{n+1} \\
 1 & 0 & \ldots & 0 & -a_2/a_{n+1} \\
 0 & 1 & \ldots & 0 & -a_3/a_{n+1} \\
 \vdots & \vdots & \ddots & \vdots & \vdots \\
 0 & 0 & \ldots & 1 & -a_n/a_{n+1} \\
 \end{pmatrix}.
 \end{eqnarray*}

 In the following script, we compute the roots of the
 Chebyshev polynomials from degree 2 to 50 by the
```
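The companion-matrix definition in the hunk above can be turned into a short numerical check: build $C(p)$ with ones on the subdiagonal and $-a_i/a_{n+1}$ in the last column, and verify that its eigenvalues are the roots of $p$. The `companion` helper below is hypothetical, and the sketch is Python/NumPy rather than the Scilab script the text refers to:

```python
import numpy as np

def companion(a):
    """Companion matrix of p(x) = a_{n+1} x^n + ... + a_2 x + a_1,
    following the convention of the text: ones on the subdiagonal and
    -a_i/a_{n+1} in the last column. a = [a_1, ..., a_{n+1}], a[-1] != 0."""
    n = len(a) - 1
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)                 # subdiagonal of ones
    C[:, -1] = -np.asarray(a[:-1]) / a[-1]     # last column -a_i/a_{n+1}
    return C

# Chebyshev polynomial T_3(x) = 4x^3 - 3x, coefficients in increasing powers.
a = [0.0, -3.0, 0.0, 4.0]
roots = np.sort(np.linalg.eigvals(companion(a)).real)
print(roots)  # the roots of T_3: cos((2k-1)*pi/6), i.e. -sqrt(3)/2, 0, sqrt(3)/2
```

The eigenvalue route is exactly what the text attributes to the `roots` function: the roots of $T_3$ agree with the closed form $\cos((2k-1)\pi/(2n))$.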