In his seminal paper, Kenneth Iverson described a new mathematical notation, which soon became A Programming Language.

Recently, while reading Surely You’re Joking, Mr. Feynman!, I found that Feynman had invented his own notation when he was in school:

While I was doing all this trigonometry, I didn’t like the symbols for sine, cosine, tangent, and so on. To me, $\sin f$ looked like $s$ times $i$ times $n$ times $f$! So I invented another symbol, like a square root sign, that was $\sigma$ with a long arm sticking out of it, and I put the $f$ underneath. For the tangent it was $\tau$ with the top of the tau extended, and for the cosine I made a kind of $\gamma$, but it looked a little bit like the square root sign.

Now the inverse sine was the same sigma, but left-to-right reflected so that it started with the horizontal line with the value underneath, and then the sigma. That was the inverse sine, NOT $\sin^{-1} f$–that was crazy! They had that in books! To me, $\sin^{-1}$ meant $1/\sin$, the reciprocal. So my symbols were better.

I didn’t like $f(x)$–that looked to me like $f$ times $x$. I also didn’t like $dy/dx$–you have a tendency to cancel the d’s–so I made a different sign, something like an & sign. For logarithms it was a big L extended to the right, with the thing you take the log of inside, and so on.

I thought my symbols were just as good, if not better, than the regular symbols–it doesn’t make any difference what symbols you use–but I discovered later that it does make a difference. Once when I was explaining something to another kid in high school, without thinking I started to make these symbols, and he said, “What the hell are those?” I realized then that if I’m going to talk to anybody else, I’ll have to use the standard symbols, so I eventually gave up my own symbols.
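
For context, the ambiguity Feynman objects to comes from the standard convention that a superscript $-1$ on a function name denotes the inverse function rather than the reciprocal:

$$\sin^{-1} x = \arcsin x, \qquad \text{whereas} \qquad (\sin x)^{-1} = \frac{1}{\sin x} = \csc x.$$

His reflected sigma avoided that ambiguity, but, as the high-school exchange shows, only at the price of being unreadable to everyone else.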