read this if you’re new to the Putnam series

Due:

13/6/2018. Two problems.

Problems:

1996 Putnam | A6

Let \(c > 0\) be a constant. Give a complete description, with proof, of the set of all continuous functions \(f: \mathbb{R} \rightarrow \mathbb{R}\) such that \(f(x) = f(x^2 + c)\) for all \(x \in \mathbb{R}\).

Note: \(\mathbb{R}\) is the set of real numbers.

_I was super late in doing this. Like, a week late. Not that I wasn't trying, but I'm still working on finding the time._

I had still failed to solve it by the time I did the next problem. Lol

This problem is beautiful. If \(f(x) = f(x^2 + c)\), then let's look at the roots of \(x^2 - x + c\), i.e. the fixed points of \(x \mapsto x^2 + c\). The idea is so simple yet so beautiful that I'm dumbfounded such an approach exists and that I didn't think like that before.

That is, \(x^2 + c = x\) at such a point, so the fixed points are exactly the roots of \(x^2 - x + c = 0\).

The discriminant is \(D = 1 - 4c\), and there are two cases: \(c > \frac{1}{4}\) and \(c \leq \frac{1}{4}\).

For \(c > \frac{1}{4}\), the equation has no real roots; in fact \(x^2 + c > x\) for every real \(x\), since the minimum of \(x^2 - x + c\) is \(c - \frac{1}{4} > 0\). Let \(x_n = x_{n-1}^2 + c\); then the values of \(f\) on \([x_{n-1}, x_{n}]\) determine its values on \([x_{n}, x_{n+1}]\), and so on.
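To see that increasing march concretely, here is a tiny numeric sketch (the value \(c = 0.5\) and the starting point are arbitrary choices of mine):

```python
# For c > 1/4 the step x -> x^2 + c gains at least c - 1/4 each time,
# so the forward orbit x_0, x_1 = x_0^2 + c, ... increases without bound.
c = 0.5
x = 0.0
orbit = [x]
for _ in range(6):
    x = x * x + c
    orbit.append(x)

print(orbit)  # strictly increasing: 0.0, 0.5, 0.75, 1.0625, 1.6289..., ...
```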

In the absence of real roots, this is about as far as that observation goes; it is a generic statement. If there are real roots, i.e. when \(c \leq \frac{1}{4}\), let them be \(\alpha, \beta\) with \(\alpha \leq \beta\).

Looking at \(f(x) = f(x^2 + c)\), one sees that iterating \(x \mapsto x^2 + c\) moves rightward along the x-axis when \(x > \beta\). Suppose we start at a very large value and trace back through all the earlier points where the function value is the same; the sequence to define is then \(x_{n+1} = \sqrt{x_n - c}\). For \(x_0 > \beta\) this sequence is decreasing and bounded below by \(\beta\), so it converges, and the limit \(L\) satisfies \(L = \sqrt{L - c}\), i.e. \(L^2 - L + c = 0\), giving \(L = \beta\). By continuity, \(f(x_0) = f(x_n) \to f(\beta)\), so \(f\) is constant on \([\beta, \infty)\).
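A minimal numeric check of that convergence (the value \(c = 0.2\) and the large starting point are my choices for illustration):

```python
import math

c = 0.2                                  # any 0 < c <= 1/4 works
beta = (1 + math.sqrt(1 - 4 * c)) / 2    # larger root of x^2 - x + c = 0

x = 5.0                                  # arbitrary start above beta
for _ in range(60):
    x = math.sqrt(x - c)                 # backward iteration x_{n+1} = sqrt(x_n - c)

print(x, beta)  # both ~0.7236..., so by continuity f(5.0) = f(beta)
```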

1996 Putnam | B4

For any square matrix \(A\), we can define \(\sin A\) by the usual power series:

\[\sin A = \sum_{n=0}^\infty \frac{(-1)^n}{(2n+1)!} A^{2n+1}\]

Prove or disprove: there exists a \(2\times 2\) matrix \(A\) with real entries such that

\[\sin A = \begin{bmatrix} 1 & 1996 \\ 0 & 1 \end{bmatrix}\]
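As an aside, the power-series definition is easy to play with numerically; the helper `matrix_sin` below and its truncation depth are my own sketch, not part of the problem:

```python
import math
import numpy as np

def matrix_sin(A, terms=20):
    """Approximate sin(A) by truncating the power series above."""
    A = A.astype(float)
    result = np.zeros_like(A)
    power = A.copy()                      # holds A^(2n+1)
    A_sq = A @ A
    for n in range(terms):
        result += ((-1) ** n / math.factorial(2 * n + 1)) * power
        power = power @ A_sq              # step A^(2n+1) -> A^(2n+3)
    return result

# Sanity check: on a diagonal matrix, sin acts entrywise on the diagonal.
D = np.diag([0.3, 1.2])
print(matrix_sin(D))                      # ~ diag(sin 0.3, sin 1.2)
```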


I didn't get the answer, and since I was already two weeks late, I looked up the solution. I know. *sigh*

In my defense, I think I was on the right track but lacked the intuition for how to proceed; now that I've seen the solution, I'll be primed to think like this more often.

Suppose such an \(A \in \mathbb{R}^{2 \times 2}\) exists.

Further suppose that \(A\) is not defective, meaning \(A\) has \(n\,(=2)\) linearly independent eigenvectors; then \(A\) has an eigendecomposition and is thus diagonalizable (over \(\mathbb{C}\), in general).

\[A = Q^{-1} \Lambda Q\]
with \(\Lambda\) being the diagonal matrix made up of the eigenvalues of \(A\).

Rearranging for \(\Lambda\), we get

\[\Lambda = Q A Q^{-1}\] \[\implies \sin \Lambda = Q \,(\sin A)\, Q^{-1}\]

This holds because \((Q A Q^{-1})^k = Q A^k Q^{-1}\), so the power series conjugates term by term. Also, we note that \(\sin \Lambda\) is diagonal since \(\Lambda\) is a diagonal matrix (the series acts entrywise on the diagonal).
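A numeric check of that conjugation step (using SciPy's built-in matrix sine `scipy.linalg.sinm`; the particular \(Q\) and eigenvalues are arbitrary picks of mine):

```python
import numpy as np
from scipy.linalg import sinm

Q = np.array([[2.0, 1.0], [1.0, 1.0]])    # any invertible Q
eigs = np.array([0.4, 1.1])
A = np.linalg.inv(Q) @ np.diag(eigs) @ Q  # A = Q^{-1} Lambda Q, non-defective

lhs = np.diag(np.sin(eigs))               # sin(Lambda), a diagonal matrix
rhs = Q @ sinm(A) @ np.linalg.inv(Q)      # Q sin(A) Q^{-1}
print(np.allclose(lhs, rhs))              # True
```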

\(\sin \Lambda\) diagonal \(\implies \sin A = Q^{-1} (\sin \Lambda)\, Q\) is diagonalizable. But \(\begin{bmatrix} 1 & 1996 \\ 0 & 1 \end{bmatrix}\) has both eigenvalues equal to \(1\) while not being the identity, so it is not diagonalizable.

This is a contradiction, implying \(A\) is defective; and a defective \(2 \times 2\) matrix necessarily has a repeated eigenvalue.

Given any \(2 \times 2\) matrix, one can write its characteristic equation as follows:

\[\lambda^2 - \operatorname{tr}(A)\, \lambda + \det(A) = 0\]

If such a matrix has equal eigenvalues, the discriminant of this quadratic vanishes, which implies that

\[\operatorname{tr}(A)^2 = 4 \det(A)\]

Writing \(A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}\), the above equation becomes

\[(a-d)^2 + 4bc = 0\]
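A quick symbolic check of that expansion (SymPy purely as a verifier):

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
discriminant = sp.expand((a + d) ** 2 - 4 * (a * d - b * c))  # tr^2 - 4 det
print(discriminant)  # a**2 - 2*a*d + d**2 + 4*b*c, i.e. (a - d)^2 + 4bc
```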

Note that \((a-d)^2 + 4bc = 0\) by itself does not force \(a = d\). The cleaner route: since the repeated eigenvalue \(a = \operatorname{tr}(A)/2\) is real, \(A\) is orthogonally similar to an upper triangular matrix (its real Schur form), and \(\sin(P A P^{-1}) = P\,(\sin A)\,P^{-1}\). So, up to similarity, we may take

\[A = \begin{bmatrix} a & b \\ 0 & a \end{bmatrix}\]

Since \(A = aI + N\) with \(N = \begin{bmatrix} 0 & b \\ 0 & 0 \end{bmatrix}\) and \(N^2 = 0\), we have \(A^k = a^k I + k\, a^{k-1} N\), and summing the series gives

\[\sin A = \begin{bmatrix} \sin a & b \cos a \\ 0 & \sin a \end{bmatrix}\]
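Checking that closed form numerically (again with SciPy's `sinm`; the values of \(a\) and \(b\) are arbitrary):

```python
import numpy as np
from scipy.linalg import sinm

a, b = 0.7, 3.0
A = np.array([[a, b], [0.0, a]])
print(sinm(A))                             # matrix sine via SciPy
print(np.array([[np.sin(a), b * np.cos(a)],
                [0.0,       np.sin(a)]]))  # the closed form above: they match
```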

Comparing with the question: matching eigenvalues forces \(\sin a = 1\), but then \(\cos a = 0\), so the off-diagonal entry \(b \cos a\) is \(0\) and can never be \(1996\). (Equivalently, \(\sin a = 1\) makes \(\sin A = I\), and undoing the similarity still gives the identity, not the target matrix.) Hence, no such \(A\) can exist.

QED

My thoughts: I came to like making use of the eigenvalue decomposition, but that was the end of my train of thought. Me not happy with this. I should sit and think more about it.