Linked by Thom Holwerda on Mon 22nd May 2017 11:42 UTC
In the News

Like many other countries, The Netherlands uses a chip card for paying for and using public transport, and while there have been a number of issues regarding its security, privacy, and stability, it won't be going anywhere any time soon. Just today, the various companies announced a new initiative where Android users can use their smartphones instead of their chip cards to pay for and use public transport.

The new initiative, jointly developed by the various companies operating our public transport system and our carriers, is Android-only, because Apple "does not allow it to work, on a technical level", and even then, it's only available on two of our three major carriers for now.

This got me thinking about something we rarely talk about: the increasing reliance on external platforms for vital societal infrastructure. While this is a test for now, it's easy to see how the eventual phasing out of the chip cards - already labelled as "outdated" by the companies involved - will mean we have to rely on platforms beyond society's control for vital societal infrastructure. Chip cards for public transport or banks or whatever are a major expense, and there's a clear economic incentive to eliminate them and rely on e.g. smartphones instead.

As we increasingly outsource access to vital societal infrastructure to foreign, external corporations, we have to start asking ourselves what this actually means. Things like public transport, payments, taxes, and so on, are absolutely critical to the functioning of our society, and to me, it seems like a terrible idea to restrict access to them to platforms beyond our own control.

Can you imagine what happens if an update to an application required to access public transport gets denied by Apple? What if the tool for paying your taxes gets banned from the Play Store days before the tax deadline? What if a crucial payment application is removed from the App Store? Imagine the immense, irreparable damage this could do to a society in mere hours.

If these systems - for whatever reason - break down today, we can hold our politicians accountable, because they bear the responsibility for these systems. During the introduction of our current public transport chip card and its early growing pains, our parliament demanded swift action from the responsible minister (secretary in American parlance). Since the private companies responsible for the chip card system took part in a tender process with strict demands, guidelines, rules, and possible consequences for failure to deliver, said companies could and can be held accountable by the government. This covers the entire technological stack, from the cards themselves up to the control systems that run everything.

If we move to a world where applications for iOS and Android are the only way to access crucial government-provided services, this system of accountability breaks down. While the application itself would be part of the tender process, meaning its creator would be accountable, the platforms it runs on would not be - only part of the stack is covered. If Google or Apple decides to reject an update or remove an application, they are not accountable for the consequences in the way a party to a government tender would be. The system of accountability breaks down.

Of course, even today this system of accountability isn't perfect, but it is a vital path of recourse when private companies fail to deliver. I'm sure not all of you agree that the above is a problem at all - Americans especially tend to have a more positive view of corporate services than of government services (not entirely unreasonable, given the state of US government services today). In countries like The Netherlands, though, despite our constant whining about every one of these services, they actually rank among the very best in the world.

I am genuinely worried about the increasing reliance on - especially - technology companies without them actually being part of the system of accountability. The fact that we might, one day, be required to rely on black boxes like iOS devices, Microsoft computers, or Google Play Services-enabled Android phones to access vital government services is a threat to our society and the functioning of our democracy. With access to things like public transport, money, and all that come with those, locked to closed-source platforms, we, the people, will have zero control over the pillars of our own societies.

What can we do to address this? I believe we need to take aggressive steps - at the EU-level - to demand full public access to the source code that underpins the platforms that are vital to the functioning of our society. We, the people, have the right to know how these systems work, what they do, and how secure they really are. As computers and phones become the only way to access and use crucial government services, they must be fully 100% open source.

We as The Netherlands are irrelevant on our own and would never be able to make such demands stick, but the EU is one of the most powerful economic blocs in the world. If you want access to the wealthy 450 million consumers in the European Union (a figure that excludes the UK), your software must be open source so that we can ensure the security and stability of our infrastructure. If you do not comply, you will be denied access to this huge economic bloc. Most of you will probably balk at this suggestion, but I truly believe it is the only way to guarantee the security and stability of the vital government services we rely on every single day.

We should not rely on closed-source, foreign code for our government services. It's time the European Union starts thinking about how to address this threat.

Permalink for comment 644740
Alfman
Member since:
2011-01-28

quackalist,

"I'm no expert, and neither am I going to ask one, but I think it's kind of a no-brainer to claim no system can be 100% secure."

It really depends on how you want to look at it.

Systems built on discrete mathematics can be proven to be 100% correct. That's not hard to do in principle, and for small systems it's quite achievable: you just need to prove that every possible outcome is correct for every possible input. Given that computers are strictly finite computation machines, proving the correctness of arbitrarily large algorithms is theoretically possible. In practice, however, large algorithms quickly exceed our human ability to prove them, and even small and medium code bases can have edge cases that are very difficult to prove. And even if the software is proven 100% correct, the hardware and toolchains it depends on may not be.
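To make the "every possible outcome for every possible input" idea concrete, here is a minimal sketch in Python. It uses a made-up example function - an 8-bit saturating add - and checks it against a straightforward specification for all 65,536 possible input pairs, a tiny instance of proof by exhaustion over a finite input space:

```python
# Exhaustive verification of a small function over its entire input space.
# The function and its spec are hypothetical examples, chosen because the
# input space (256 * 256 pairs) is small enough to check completely.

def saturating_add_u8(a: int, b: int) -> int:
    """Add two unsigned 8-bit values, clamping the result to 255."""
    s = a + b
    return 255 if s > 255 else s

def spec(a: int, b: int) -> int:
    """Specification: the mathematically correct clamped sum."""
    return min(a + b, 255)

# Because the input space is finite, we can check every single case
# rather than merely testing a sample of them.
assert all(
    saturating_add_u8(a, b) == spec(a, b)
    for a in range(256)
    for b in range(256)
)
```

This is exactly why the approach stops scaling: a function taking two 64-bit inputs has 2^128 cases, so for real systems you need symbolic proofs rather than enumeration, and those are where human ability runs out.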


On top of that, physics itself is inherently probabilistic, so we can't rely on real machines to execute our code with 100% reliability - there will always be some possibility of error.

We try to mitigate hardware errors with ECC RAM and on-disk checksums, but those mechanisms are themselves probabilistic and will eventually miss an error. An attacker might even try to exploit this by irradiating the target's CPU or manipulating its power supply to derail correct code execution.
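The correction trick that ECC memory relies on can be sketched in a few lines. The following is an illustration of the principle using a Hamming(7,4) code - four data bits protected by three parity bits - not a model of real ECC hardware; the `encode` and `correct` functions are hypothetical names for this sketch:

```python
# Hamming(7,4): 4 data bits + 3 parity bits. Any single flipped bit in
# the 7-bit codeword can be located via the parity "syndrome" and fixed.

def encode(d):
    """Encode 4 data bits into a 7-bit codeword (parity at positions 1, 2, 4)."""
    p1 = d[0] ^ d[1] ^ d[3]  # covers codeword positions 1, 3, 5, 7
    p2 = d[0] ^ d[2] ^ d[3]  # covers codeword positions 2, 3, 6, 7
    p3 = d[1] ^ d[2] ^ d[3]  # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(c):
    """Recover the 4 data bits from a codeword, fixing at most one flipped bit."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit; 0 if clean
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1  # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

# The space is small enough to verify exhaustively: every 4-bit word,
# under every possible single-bit flip, is recovered correctly.
for n in range(16):
    data = [(n >> i) & 1 for i in range(4)]
    codeword = encode(data)
    assert correct(codeword) == data
    for i in range(7):
        damaged = codeword[:]
        damaged[i] ^= 1
        assert correct(damaged) == data
```

Note that a double bit flip defeats this code - the syndrome points at the wrong bit - which is the probabilistic caveat in action: real ECC schemes add extra parity to at least detect double errors, but some error rate always remains.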

So mathematically speaking, a system could be 100% secure, but when we allow for physical attacks, no machine can be "100% secure".

Reply Parent Score: 2