While talking mathematics with a postdoc here at UCLA (March Boedihardjo) we came across the following matrix problem which we managed to solve, but the proof was cute and the process of discovering it was fun, so I thought I would present the problem here as a puzzle without revealing the solution for now.

The problem involves word maps on a matrix group, which for the sake of discussion we will take to be the special orthogonal group $SO(3)$ of real $3 \times 3$ matrices (one of the smallest matrix groups that contains a copy of the free group, which incidentally is the key observation powering the Banach-Tarski paradox).  Given any abstract word $w$ in two generators $x,y$ and their inverses (i.e., an element of the free group ${\bf F}_2$), one can define the word map $w: SO(3) \times SO(3) \to SO(3)$ simply by substituting a pair of matrices in $SO(3)$ for these generators.  For instance, if one has the word $w = x y x^{-2} y^2 x$, then the corresponding word map $w: SO(3) \times SO(3) \to SO(3)$ is given by

$\displaystyle w(A,B) := ABA^{-2} B^2 A$

for $A,B \in SO(3)$.  Because $SO(3)$ contains a copy of the free group, we see that the word map is non-trivial (not identically equal to the identity) if and only if the word $w$ itself is non-trivial.
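To make the definition concrete, here is a minimal numerical sketch (in Python with numpy, which is of course not part of the original discussion): it builds two rotations via Rodrigues' formula and evaluates the sample word map $w(A,B) = ABA^{-2}B^2A$, using the fact that $A^{-1} = A^T$ for $A \in SO(3)$.

```python
import numpy as np

def rotation(axis, theta):
    """Rotation in SO(3) about a unit axis by angle theta (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    # K is the skew-symmetric cross-product matrix of the axis
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def w(A, B):
    """Word map for w = x y x^{-2} y^2 x; inverses via transpose since A, B are orthogonal."""
    return A @ B @ A.T @ A.T @ B @ B @ A

# arbitrary sample pair of rotations
A = rotation([1, 0, 0], 0.7)
B = rotation([0, 1, 0], 1.3)
W = w(A, B)  # again an element of SO(3)
```

The output $W$ is again orthogonal with determinant $1$, as it must be, since $SO(3)$ is closed under products and inverses.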

Anyway, here is the problem:

Problem. Does there exist a sequence $w_1, w_2, \dots$ of non-trivial word maps $w_n: SO(3) \times SO(3) \to SO(3)$ that converge uniformly to the identity map?

To put it another way, given any $\varepsilon > 0$, does there exist a non-trivial word $w$ such that $\|w(A,B) - 1 \| \leq \varepsilon$ for all $A,B \in SO(3)$, where $\| \cdot \|$ denotes (say) the operator norm, and $1$ denotes the identity matrix in $SO(3)$?
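One can probe this quantity numerically for a given word, though a finite sample of course only gives a lower bound on the supremum and proves nothing. The sketch below (Python with numpy; an illustration, not part of the problem) samples random pairs from $SO(3)$ and records the largest observed value of $\|w(A,B) - 1\|$ in the operator norm, for the sample word $w = xyx^{-2}y^2x$ from earlier.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_SO3(rng):
    """Random rotation: QR of a Gaussian matrix, sign-fixed so det = +1."""
    Q, R = np.linalg.qr(rng.standard_normal((3, 3)))
    Q = Q @ np.diag(np.sign(np.diag(R)))  # remove QR sign ambiguity
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]  # flip one column to land in SO(3)
    return Q

def word_map(A, B):
    # the sample word w = x y x^{-2} y^2 x
    return A @ B @ A.T @ A.T @ B @ B @ A

# crude empirical lower bound on sup_{A,B} ||w(A,B) - 1|| (operator norm)
dist = max(
    np.linalg.norm(word_map(random_SO3(rng), random_SO3(rng)) - np.eye(3), ord=2)
    for _ in range(1000)
)
```

Note that $\|w(A,B) - 1\| \leq \|w(A,B)\| + \|1\| = 2$ always, so the observed value necessarily lies in $[0,2]$; the question is whether some non-trivial word can force the supremum below any given $\varepsilon$.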

As I said, I don’t want to spoil the fun of working out this problem, so I will leave it as a challenge. Readers are welcome to share their thoughts, partial solutions, or full solutions in the comments below.