Security with an in-built vulnerability
Intentional vulnerabilities in encryption algorithms seem tempting to secret services and law enforcement agencies alike – after all, they allow supposedly secure information to be read. Professor Gregor Leander and Dr. Christof Beierle from the Symmetric Cryptography group and Dr. David Rupprecht from the Chair of System Security discuss the sense and nonsense of such backdoors and describe a particularly long-lived example of such a loophole. Together with international colleagues, they showed that current smartphones still ship with the insecure mobile phone encryption GEA-1. It has been around since the 1990s and, according to the mobile phone standards, should have disappeared in 2013.
Professor Leander, Dr. Rupprecht, Dr. Beierle, you are looking for secret backdoors. What exactly is a backdoor?
David Rupprecht: A backdoor is a kind of in-built weak link in the encryption process. You can think of it like a master key that shouldn’t exist in the first place. In the case we are investigating, it is physically located in a chipset installed in mobile phones, i.e. on the hardware.
Gregor Leander: In our case, it’s symmetric cryptography. This means that all those legitimately involved in the communication – in this case mobile phones and cell towers – have the same key. The underlying algorithm is, so to speak, the recipe for producing these keys.
Rupprecht: In order to generate the key – which, by the way, is regenerated with every new contact between the mobile phone and the cell tower – an additional secret code stored on the SIM card of the mobile phone is needed. Based on this, both the mobile phone and the cell tower calculate the GEA key using an algorithm. The result means for both of them: we are friends, we can communicate.
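To make this idea concrete, here is a minimal Python sketch of such a shared-key handshake. The derivation function (HMAC-SHA256) and the parameter sizes are assumptions chosen purely for illustration; the real GPRS/GEA key derivation uses different, SIM- and operator-specific algorithms.

    # Minimal sketch of the idea described above: both sides derive the same
    # session key from a secret shared with the SIM card plus a fresh random
    # value exchanged on every contact. HMAC-SHA256 is an illustrative stand-in,
    # not the actual GPRS/GEA algorithm.
    import hmac
    import hashlib
    import os

    SIM_SECRET = os.urandom(16)  # stands in for the secret code stored on the SIM card

    def derive_session_key(sim_secret: bytes, challenge: bytes) -> bytes:
        """Both the mobile phone and the cell tower run the same calculation."""
        return hmac.new(sim_secret, challenge, hashlib.sha256).digest()[:8]

    challenge = os.urandom(16)             # new random value for every contact
    key_phone = derive_session_key(SIM_SECRET, challenge)
    key_tower = derive_session_key(SIM_SECRET, challenge)
    assert key_phone == key_tower          # "we are friends, we can communicate"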
Facebook still sees the metadata. And this is enough.
Which data is affected by the security vulnerability in GEA-1?
Leander: Basically, all of it. But that is not equally relevant for all data. When I use online banking, for example, the data is additionally encrypted by the bank end-to-end, so it is not decrypted at all in between.
Christof Beierle: In the 1990s, when GEA encryption was first introduced, this was not yet the case.
Leander: However, such backdoors are less about the actual contents of the information that is sent back and forth than about metadata, which is often underestimated. It’s about the information: who communicates with whom and when? This metadata is of tremendous value. This is evident from the fact that Meta Platforms introduced end-to-end encryption for WhatsApp without much pressure from users. That seems like a contradiction, because Facebook lives off data. The reason is simple: Facebook still sees the metadata. And this is enough.
The encryption had to be exportable under certain restrictions. That means: it had to be just weak enough to get through, but not too weak either.
Who pushes for backdoors to be built into these systems?
Leander: Such backdoors are of course in the interest of the secret services and law enforcement agencies. Backdoors for these purposes are being discussed time and again, even if they don’t make much sense. In the case of GEA-1, you have to remember that it was developed in the 1990s. At that time, cryptography was considered a weapon. Powerful cryptography was not allowed to be exported abroad; there were strict export restrictions. But of course people wanted to sell mobile phones to other countries, too. So they had to get around these export restrictions.
Beierle: We have a document from 1998 on the requirements for the cipher. One of them was: the encryption had to be exportable under certain restrictions. That means: it had to be just weak enough to get through, but not too weak either.
Who sets such standards as those governing encryption?
Rupprecht: In the case of GEA, it was the European Telecommunications Standards Institute ETSI, a kind of DIN institute at the European level. Its members include large manufacturers, companies such as Deutsche Telekom, and governmental organisations.
Leander: We can’t rule out the possibility that members of the secret services were also employed there at the time.
Do we know whether the backdoor in GEA-1 was exploited?
Leander: As far as GEA is concerned, we don’t know whether it was used or not. But in other cases, it has been proven that backdoors were exploited.
Rupprecht: The revelations published by Edward Snowden, for example, brought to light that Angela Merkel’s mobile phone was bugged. If you wonder how that could have been accomplished, you quickly come up with encryption methods that work not unlike GEA and are used for voice telephony. Here, too, a relatively weak algorithm was integrated.
Criminals don’t care that cryptography is forbidden.
Professor Leander, you’ve just indicated that you don’t consider deliberately built-in backdoors to be useful as far as the authorities are concerned.
Leander: There are 1,000 legitimate reasons for law enforcement and intelligence agencies to want such backdoors to exist. But they are the wrong way to go. A master key like that can also be found by someone who may have criminal intentions. And once the loophole is there, it is always there – after all, we can see that it hasn’t been possible to eliminate GEA-1 to this day, even though it should have been done years ago.
Rupprecht: There is another aspect: if everyone knows that only weak algorithms are allowed, criminals will hide from the authorities by using secure encryption. Criminals don’t care that cryptography is forbidden. They simply switch to their own system. In addition, of course, there are fundamental principles of democracy such as the protection of privacy. Mass surveillance is not compatible with democratic values.
How come GEA is still integrated in the latest devices, even though we know that the encryption has a backdoor?
Rupprecht: Well, the manufacturing industry is huge, so maybe it just slips under the radar because it’s not a priority at the moment.
Should we assume that encryption algorithms with backdoors are active in all our devices?
Leander: No. We are now keeping a close eye on things.
Rupprecht: Not in end devices. This is currently more of an issue in network products, for example the routers on which the internet is based. There are examples of more recent encryption methods with backdoors. A recent case is the manipulation of random number generators by the US secret service NSA. Randomness is often needed in encryption algorithms, and if you rig the generator so that its output is not truly random – for instance, so that it favours certain values – the resulting keys become much easier to guess. In the NSA case, the manipulated algorithm was so slow that no one wanted it, so companies were paid to build it in.
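To illustrate why a rigged random number generator is so damaging, here is a small, purely illustrative Python sketch. It does not model any real standard or the NSA-influenced generator mentioned above; the assumption is simply that the "random" seed is secretly drawn from a tiny space, so anyone who knows the trick can recover the key by enumeration.

    # Illustrative sketch: a key that looks like a 128-bit random value but in
    # reality depends on only 16 bits of hidden "randomness". Whoever knows
    # about the rigged generator can try all possible seeds in a moment.
    import hashlib
    import secrets

    SEED_SPACE = 2**16  # hypothetical, artificially small seed space

    def rigged_keygen() -> bytes:
        """Produce a 128-bit key derived from a secretly tiny seed."""
        seed = secrets.randbelow(SEED_SPACE)
        return hashlib.sha256(seed.to_bytes(2, "big")).digest()[:16]

    def attacker_recovers(key: bytes):
        """Enumerate the small seed space to find the one that produced the key."""
        for seed in range(SEED_SPACE):
            if hashlib.sha256(seed.to_bytes(2, "big")).digest()[:16] == key:
                return seed
        return None

    key = rigged_keygen()
    print("recovered seed:", attacker_recovers(key))  # succeeds after at most 65,536 tries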
Mathematical algorithms do exist that no-one in the world can break. Therefore, we can be hopeful.
Leander: On the other hand, cryptographic algorithms without a backdoor do exist.
Rupprecht: There’s been a shift since the 1990s: The weaknesses of cryptography at that time are now known, and the algorithms have become more public.
Beierle: It’s always suspicious when algorithms are not public. The GEA-1 standard, for example, was secret.
Leander: Today, the selection of encryption methods is public and transparent. Researchers submit proposals, which are evaluated in a multi-stage process. If there’s even a hint of ambiguity, the proposal is immediately rejected. So there are no more deliberate weaknesses in public encryption protocols. This is also one of the reasons why we at the Cluster of Excellence CASA believe that protection against secret services like the NSA is possible: mathematical algorithms do exist that no-one in the world can break. Therefore, we can be hopeful.
What are your plans for your future activities?
Leander: We will continue to look for backdoors. There are indications that they exist; the only problem is finding them. We are looking for them in a structured way: we look at large programmes, sift out the cryptographic components and analyse them – especially the ones that are new to us. Some of them are secret. In the case of GEA-1, a whistleblower tipped us off, and the same applies to another case we are currently investigating.
How come there is no public outcry when such discoveries come to light?
Leander: There’s no outcry from users, but the response in the press is considerable. The interest is there.
Beierle: Maybe there wasn’t such a big outcry over GEA-1 because the method is so old and no longer poses a danger.
Rupprecht: We have to make end users understand what is at stake. But the problem remains very abstract for many. The situation is different in the industry. Manufacturers are really willing to do something about it.
Leander: You really have to distinguish between users and decision-makers. End users don’t care about their data. Quite the contrary, considering how they use social media. The same applies to using amazing services on the internet for free – how does that work? Simply by having your data harvested. But people don’t care about that. The decision-makers have to care. It’s like driving a car: if seat belts weren’t compulsory, no-one would wear them.
The Cluster of Excellence “Cyber Security in the Age of Large-Scale Adversaries” (CASA) develops strategies against the most powerful attackers in cyberspace by bringing together technology-based research with research in the humanities. It has been funded as part of the Excellence Strategy of the German federal and state governments since 2019.
18 January 2023