Jan 10, 2011

Vodafone’s infosec balls-up a symptom of wider problems

Vodafone’s apparent information security breach, if it’s being described accurately, certainly suggests a botched approach. But corporate Australia’s blasé attitude to our personal identity information is as much to blame.

Fairfax’s Natalie O’Brien broke the story yesterday that anyone with a valid Vodafone dealer login could access every customer’s complete file -- name, address, date of birth, driver’s licence number, credit card details, the PIN they use to operate their account and even the full history of their phone calls and text messages.

It’s wrong to say this data was "publicly available on the internet". You did need a valid login, after all. But the set-up seems deeply flawed, and valid logins are on the loose. "Vodafone retailers have said each store has a user name and password for the system," O’Brien wrote. "That access is shared by staff and every three months it is changed."

That, for a start, fails a fundamental principle of infosec auditing. With a single login shared by everyone within a store, it’s impossible to track who accessed which data or who leaked their login details to the bad guys. And if someone gets sacked, they could still be able to access the system for up to three months.

Being able to view a customer’s PIN is just plain wrong. Passwords -- and a PIN is just a password -- should never be stored in their unencrypted form. Standard security practice is to store an encrypted version of the password. When someone needs to supply their password, the same encryption process is applied and, if it matches the stored version, access is granted. But you can’t turn that back into the password itself without deploying spook-agency-grade computing resources. Any system that can reveal the password or PIN itself is broken. Any system that exposes customer credit card information to a shared login is also broken.
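That standard practice -- hash on the way in, re-hash and compare on the way back -- can be sketched in a few lines. This is a minimal illustration, not Vodafone’s system; real deployments would use a deliberately slow scheme such as PBKDF2 or bcrypt rather than a single SHA-256 round.

```python
import hashlib
import os

def store(password: str) -> tuple[bytes, bytes]:
    # Store only a random salt plus a one-way hash, never the password itself.
    salt = os.urandom(16)
    return salt, hashlib.sha256(salt + password.encode()).digest()

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    # Apply the same process to the supplied password and compare the results.
    return hashlib.sha256(salt + password.encode()).digest() == stored

salt, stored = store("s3cret")
```

Nothing in what gets stored can be turned back into the password; a system built this way simply has no PIN to display to a dealer.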
The Payment Card Industry Data Security Standard (PCI DSS) requires access to cardholder data to be restricted on a "business need-to-know" basis, a unique ID to be assigned to every person with computer access, and all access to network resources and cardholder data to be tracked and monitored. All this, if true, is sufficient reason to slap Vodafone, and slap them very hard indeed.

But another part of the problem is the insistence -- not just by Vodafone, but by so many companies -- on compiling databases of personal information that simply aren’t required to fulfil the business need.

The idea that companies need to photocopy or scan a driver’s licence before issuing a mobile phone SIM card, for example, is "absolute rot" and "outrageous", according to Paul Ducklin, head of technology for the Asia-Pacific region for global information security firm Sophos.

"I try to make a point that if people wish me to identify myself, for example checking into a hotel or dealing with a shop, and so I need photo ID, then I will hold up my driving licence for them to look at so that they can satisfy themselves that I’m who I say I am. They can write down my name if they wish," Ducklin told the Patch Monday podcast. "Frequently they then just reach out and expect to take that licence and do something with it, and my response is always, 'I’m sorry, you can look but you can’t touch'."

The system only needs to record the fact that valid ID was sighted, not the ID itself.

Similarly, at least one ticket agency asks for and stores each customer’s date of birth. Why? Some performances have age restrictions, either because of their content or because they’re taking place on licensed premises. But this misses the point on two grounds. One, you don’t need to record the date of birth, just the fact that the customer is of the right age. And two, as Ducklin points out, the age test is being conducted at the wrong point in the process.
"It doesn’t matter what you say when you’re doing the booking, whether you’re over 18 or not determines whether you’re allowed to actually go in through the gates of the concert," he said.

Vodafone is currently saying that they’re only aware of a single breach. One interesting question is whether that will still be the case once their internal investigations are done. Another, in the absence of US-style laws requiring companies to disclose data breaches, is whether we’ll ever know.
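The data-minimisation principle behind both examples -- record the result of each check, never the source data -- can be sketched as a record structure. The field names here are hypothetical, purely for illustration; no actual telco or ticketing schema is implied.

```python
from datetime import date

def record_signup(dob: date, id_sighted: bool, on_date: date) -> dict:
    """Derive the facts the business needs, then discard the source data."""
    # Age as of on_date, adjusting down if the birthday hasn't happened yet.
    age = on_date.year - dob.year - ((on_date.month, on_date.day) < (dob.month, dob.day))
    return {
        "id_verified": id_sighted,  # valid ID was sighted; no copy is kept
        "over_18": age >= 18,       # the test result, not the date of birth
    }

record = record_signup(date(1990, 5, 1), True, date(2011, 1, 10))
```

A database built from records like this holds nothing worth stealing for identity theft: no licence scan, no date of birth, just two booleans.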




23 thoughts on “Vodafone’s infosec balls-up a symptom of wider problems”

  1. ronin8317

    There is a mobile phone PIN which is used to reset the phone, so the operators need that for support calls. Account PINs are also accessible by the operator because you have to read the PIN out to them over the phone to confirm your identity.

    Because Vodafone don’t audit access, they are being entirely truthful by saying they’re only aware of one breach. I expect their IT department will get a lot of action in the next few weeks.

  2. Stilgherrian

    @ronin8317: My understanding is that the mobile phone’s PIN — actually the SIM’s PIN, I think — can be changed without the new version being known to the network operator, but that there’s a separate unlock code to break into that should the PIN be forgotten. Have I got that right?

    The account PIN does need to be read to the phone operator to confirm the customer’s identity, but the point is that the PIN itself does not need to be — and I say should not be — stored or visible. The customer should state their PIN, the operator should key it in, and then the system tells the operator whether the PIN was correct or not. Having the PIN visible means the operator can impersonate the customer at a later time.
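    The flow described above -- operator keys in what the customer says, system answers only yes or no -- might look something like this sketch (function names are hypothetical):

```python
import hashlib
import hmac
import os

def derive(pin: str, salt: bytes) -> bytes:
    # Slow, salted one-way derivation; the raw PIN is never written anywhere.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

def operator_pin_check(entered_pin: str, salt: bytes, stored_hash: bytes) -> bool:
    # Only True/False ever reaches the operator's screen, so the operator
    # learns nothing they could use to impersonate the customer later.
    return hmac.compare_digest(derive(entered_pin, salt), stored_hash)

# Enrolment stores only the salt and the hash, not the PIN.
salt = os.urandom(16)
stored_hash = derive("4821", salt)
```

    `hmac.compare_digest` does a constant-time comparison, a small extra precaution against timing attacks.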

    And yes indeed, Vodafone’s IT people will be extremely busy right now…

  3. Grover Jones

    Having been through a PCIDSS audit in the exact same industry as Vodafone (i.e. a now-defunct mobile provider) I am shocked, shocked I say, that Visa has authorised them to accept Visa cards, and Mastercard the same. The requirements for a business of that size are very strict, and, at least when we went through it about 6 years ago, Visa are not satisfied until everything is perfect.
    If the situation is as you have described it, then surely they are in major breach of PCIDSS and should have the ability to accept CC payments revoked until they can pass an audit.
    Remind me never to work for Vodafone (one of my IT hats is in security) if their IT staff are allowing this sort of crap. I’m appalled, and will continue to ignore Vodafone as a potential supplier.

  4. Douglas Gration

    Unfortunately Paul Ducklin’s comment that “The idea that companies need to photocopy or scan a driver’s licence before issuing a mobile phone SIM card, for example, is ‘absolute rot’ and ‘outrageous’” isn’t correct.

    Consumers might normally expect that National Privacy Principle number 8 – which states that “Wherever it is lawful and practicable, individuals must have the option of not identifying themselves when entering transactions with an organisation” – would apply. But it doesn’t in the case of mobile phone companies. The Telecommunications (Service Provider — Identity Checks for Pre-paid Public Mobile Telecommunications Services) Determination 2000 (and its post-paid equivalent) prohibits a mobile carrier from activating a mobile service until the customer has provided all sorts of identifying information – presumably for the benefit of law enforcement and other government agencies.

    So, it’s completely fair to have a go at Vodafone for its lax security practices, but the blame for collection of the customer data in the first place lies with the government, not Vodafone.

  5. Chris Ailwood

    I’m with Doug Gration on his view about Paul Ducklin’s comments. Ducklin is clearly in IT, not compliance.

    It is absolutely essential that, when an identity is verified by the production of a document such as a passport or driver’s licence, a copy be made of the document and retained. This not only protects the staff member undertaking the identity verification from accusations of collusion with the customer/potential fraudster but also provides the organisation with evidence that the identity verification was undertaken in the event that it has to defend against a negligence claim.

    There is no point in having a fraud protection requirement if you cannot assure compliance with the requirement.

  6. Gavin Moodie

    Thanx Stilgherrian for reporting this. I did not believe the radio report I heard this morning that Vodafone staff members shared logins and passwords. How common is this?

    Perhaps this is a good subject for a Parliamentary inquiry – a highly visible but low stakes and moderate cost way of promulgating good IT security practices.

  7. Stilgherrian

    Thanks for the pointers to the relevant rules about telcos being required to do identity checks. But there’s a difference between verifying an identity and keeping a database of the source documents that were used to verify that identity. If indeed the government is requiring telcos to create such a massive database of documents that could be used for identity theft, then they’re part of the problem.

    I do understand the point that keeping a copy of the document protects staff against allegations that they didn’t do the check. But what’s the risk assessment here? And are there other ways of verifying that they’re doing their job, such as spot checks?

    My question would be whether creating these vast databases of identity documents is more of a risk than the supposed horror of someone getting a mobile phone under a false name. And on a related note, how many staff and dealers could spot a fake driver’s licence or passport anyway?

    The snippet of Ducklin I quoted here may not show his comments in the best light – the “joys” of editing quickly. You can here the full interview and therefore the full context in the Patch Monday podcast.

  8. Stilgherrian

    Um, you can “hear” it. Sigh. Monday.

  9. Daniel Bond

    There was a story in the current Cosmopolitan magazine (please don’t ask how I know this) about spying on your boyfriend. It included a quote from someone who worked at a phone store, and used to browse her boyfriend’s call and SMS history — and then, if she saw a number she didn’t recognise, would browse the database to find out who it belonged to.

    That’s abhorrent behaviour from the Cosmo commenter, but also entirely predictable behaviour given this security. For what possible reason could front-line staff at a phone store require my call and SMS history?

  10. db

    The comment in the article on how passwords are typically checked is oversimplified. The computer that works out whether you should be allowed in or not doesn’t actually know your password either.

    What it does know is that if it applies a few mathematical operations with fairly large numbers to your password, the result should exactly match something the computer has stored – effectively your scrambled password (often called a password hash). Working backwards and unscrambling it is like trying to unscramble an egg. Even someone with full access to that stored password information would need a very long time to find out what it is, unless it can be obviously guessed by other means (e.g. the name of a pet). Of course an attacker can feed an entire dictionary of likely passwords in to see what matches, but if that doesn’t work it comes down to “brute force”, where every possible password is checked one at a time to see if it matches. With a lot of very fast computers and typical password encryption this may take years. That’s why the author mentioned “deploying spook-agency-grade computing resources”.
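    The dictionary attack described above takes only a few lines. This uses a single unsalted SHA-256 hash purely for illustration; the password and word list are invented:

```python
import hashlib

def h(pw: str) -> str:
    # Unsalted fast hash, for illustration only; real systems salt and stretch.
    return hashlib.sha256(pw.encode()).hexdigest()

stored_hash = h("rex")  # victim used their pet's name

# The attacker can't reverse the hash, but can hash each likely guess
# and compare the results against the stolen hash.
dictionary = ["password", "123456", "rex", "qwerty"]
cracked = next((guess for guess in dictionary if h(guess) == stored_hash), None)
```

    A guessable password falls in microseconds this way, which is why the “very long time” guarantee only holds for passwords that aren’t in anyone’s dictionary.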

    A computer does not need to know your password at all so storing the passwords in plain text for anything important is an unforgivable act of laziness. Even pacemakers based around a CPU originally designed in 1976 (Zilog Z80) have enough processing power to handle such secure passwords.
