Startups and companies that aim to provide cryptographically secure internet messaging platforms have started cropping up during the last six months. This is a very good development. Unfortunately, cryptography is ridiculously hard to get right, especially if you are really starting from scratch and rolling your own algorithm based only on first principles. Cryptocat already got its share of criticism. A few weeks ago, I was very excited to learn about Telegram, an application geared towards providing a secure alternative to WhatsApp (these are my words, not the actual mission statement of the software). An initial post on Hacker News resulted in very mixed reviews. People were skeptical about some of the claims and the protocol choices made by the developers. Geoffroy Couprie wrote a long and detailed critique of Telegram, which has since been updated to contain new information. Personally, I consider his wording a bit harsh at times, but he makes some valid points. Paul Miller raised some equally valid points in a blog article, namely (among others) that it is easy to criticize people for doing something completely new. He has even been providing a live status of Telegram vulnerabilities.

With all these emotions flying around, I want to take a more diplomatic stance. In the best tradition of the mathematical crackpot index, Scott Aaronson's list of signs a mathematical breakthrough is wrong, and Jeff Atwood's collection of code smells in computer science, I decided to think about some crypto smells, i.e. signs that may indicate that a given cryptographic algorithm is overly hyped. This is not to say that every protocol that "smells" is necessarily wrong. It might explain why Telegram experienced so much backlash, though.

Here are the crypto smells I have identified so far:

  1. Claiming credentials that are not related to the subject at hand. This is a classic variant of the argument from authority and is akin to using your professorship from Hogwarts to impress those muggles. Whether a cryptographic system is sound must depend only on its mathematics, not on the fact that it is created by John Doe, PhD.
  2. Bashing other cryptographic systems for their alleged complexity. While it is perfectly fine to claim that your product explicitly caters to non-experts, it is unnecessary to ridicule established algorithms such as PGP.
  3. Excessive use of buzzwords like military-strength encryption or adjectives such as unbreakable. Real science and scientific writing should be humble. Let your peers decide the worth of your application by showing them proof.
  4. Too many claims without proof. The description of your cryptosystem should be public. Do not hide behind empty phrases. This resonates with a previous article of mine on this subject.
  5. Not open-sourcing even the main algorithm. Given enough eyeballs, all bugs are shallow. Bugs in cryptographic applications have the highest potential for causing real damage. Any closed-source cryptographic application does not have any credibility at all.
  6. Using dated cryptographic functions (I am looking at you, MD5) in some parts of the protocol. Good protocols might survive insecure components, but this can still be indicative of a larger problem.
  7. Relying on central instances to perform auxiliary functions in the protocol. I am not talking about something like the PGP keyservers, which only serve to facilitate the existing protocol, but rather about something like a central server that brokers between clients. Any central system smells fishy because it requires users to trust a central instance.
  8. Arranging a "crypto challenge" and claiming that the protocol is secure just because nobody solved the challenge. This is not the way it works.
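To make smell no. 6 concrete, here is a minimal Python sketch (my illustration, not anything from Telegram's actual protocol) showing the kind of choices that avoid it: hashing with SHA-256 instead of the broken MD5, and comparing MACs in constant time rather than with `==`:

```python
import hashlib
import hmac

def digest_modern(data: bytes) -> str:
    """Hash with SHA-256; MD5 has long been broken for collision resistance."""
    return hashlib.sha256(data).hexdigest()

def macs_match(expected: bytes, received: bytes) -> bool:
    """Compare secrets in constant time; a plain `==` can leak timing information."""
    return hmac.compare_digest(expected, received)

print(digest_modern(b"hello"))
# → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

Using a modern primitive costs nothing extra here; the smell is not that MD5 appears somewhere, but that its presence suggests nobody reviewed the protocol's building blocks recently.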

That's all for now, I will expand the list if I identify any more smells. A very large and loud "Thank you" to all cryptographers out there. May your crypto always smell like roses.