Posts Tagged ‘privacy’

Take a moment, go view this video, specifically but not exclusively the supplementary questions by Jan Logie, (about 5:35) then come back to this post so you have the background you’ll need.

Done? Thanks. The Ministry of Social Development has made an outrageous decision to hold NGO funding hostage to big data collection. The level of data collection they want might not be a problem for certain NGOs, especially were they to compromise and allow collection on an opt-in basis, such as, say, ones offering immunisations. But prominent organisations like NZAC1 have said that handing over the requested data is a breach of confidentiality and, in their view, unethical, and Rape Crisis have announced they will boycott government funding if they’re made to disclose any additional data. For context, rape crisis centres constantly struggle to receive enough community funding to provide the services they need to, even with the government’s help, so a boycott threat should reinforce just how seriously they view this mandatory data-sharing: as an existential threat.

As someone who has done data analysis in their work (I did it even though it wasn’t in my job description, in fact), and who even does it in my spare time to settle matters of debate, I absolutely understand the value of this data to the government, especially in targeting their social assistance dollars more effectively. (Money should be prioritised to places where we can prove it’s effective, for sure.) I understand the challenges of incomplete data sets. I also understand the challenges of people misinterpreting what data they need to provide. I will even grant that the people in the government who initially requested this data want it for nothing other than researching which types of social spending are most effective, and that they fully intend to be ethical caretakers of people’s private information, and to store it securely.

But none of that makes it a good idea to hold NGO funding hostage to big data. Firstly, there is the obvious issue that services involving counselling or survivors of abusive behaviour absolutely, critically need to be able to operate under strict confidentiality, or even complete anonymity in some cases, in order to help the people in our society who need it the most.

Not only does requiring counsellors or volunteers to explain that clients’ data is confidential and private even though it will be provided to MSD tie their hands (in addition to being flat-out inaccurate: the strict interpretation of confidentiality held to by most such services requires that records be held in only one place, be secured using physical lock-and-key for paper records and encryption for digital ones, and not be shared with anyone outside of anonymised use in ethical review or the event that a client is a risk to their own or someone else’s safety), but people who have been raped will have difficulty filling out a form identifying themselves when they’re seeking help, because they may not be ready to admit what they experienced is real, and writing something down can “make it real.” Jan Logie’s example of men who won’t even give their names is not only accurate but typical of many people seeking support, and not just men. Anyone who accepts this contract is committing to turning away people unable to accept anything less than the strictest confidentiality, or worse, to entrapping themselves into breaching the contract in order to help people.

What adds insult upon insult to this issue (we’re well past the initial injury if these contracts aren’t amended for services that require confidentiality) is that the government has an expert panel2 helping guide its data strategy that has recommended against precisely this type of mandatory collection in their report3, calling for at least an easily accessible opt-out procedure for all big data collection by government agencies. So even the proponents of big data don’t want the government to take this approach. Literally the best defence that can be made is that the government is legally allowed to insist that its data collection is more important than anonymity.

If the government wants more data from these services, the most they should reasonably do is require that clients be allowed to opt in to data collection if they’re comfortable with the idea. This allows those receptive to provide a limited data set for analysis, or to answer the questions the government is so keen for its NGO partners to ask, and those not amenable to the idea to decline and still access life-saving services that most likely are incredibly effective uses of funding. If the government can concede that paid leave to deal with the aftermath of domestic violence is effective, they can certainly concede that funding services like Rape Crisis shouldn’t be contingent on mandatory data sharing.


For those of you who haven’t been following things (and you could be forgiven for having been busy on Boxing Day, much the way Valve could be forgiven), a technical hiccough has exposed the private information of some Steam customers.

This may not be 100% confirmed yet, but apparently Valve pushed an update to the caching on its store pages that didn’t work as intended, and exposed other people’s emails, their usernames, their Steam wallet balances (think prepaid cash balance, although it can also be the proceeds from selling digital goods such as Steam trading cards), and the last two digits of their credit cards. We don’t know the exact timeframe, but potentially everyone who accessed any Steam store pages during that window has had this information exposed to whoever else was served their cached page. Fortunately, as far as I know, nobody had the ability to charge anyone else’s account during that time.
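To see why a caching change can leak one person’s account page to another, here is a minimal sketch of the suspected failure mode: a cache keyed only on the URL will happily serve one logged-in user’s rendered page to the next visitor. All the names here (`render_account_page`, the user dictionaries) are illustrative, not Valve’s actual code.

```python
# A toy page cache demonstrating how caching personalised pages by URL
# alone exposes private data, and one way to key the cache correctly.

cache = {}

def render_account_page(user):
    # Personalised content: the sort of data reportedly exposed
    # (email, wallet balance, partial card number).
    return f"Account page for {user['email']} (wallet: {user['wallet']})"

def handle_request_buggy(url, user):
    # BUG: the cache key ignores *who* is asking, so whoever rendered
    # the page first has their details served to everyone after them.
    if url not in cache:
        cache[url] = render_account_page(user)
    return cache[url]

def handle_request_fixed(url, user):
    # FIX: include the requesting user's identity in the cache key
    # (or better still, never cache personalised responses at all).
    key = (url, user["id"])
    if key not in cache:
        cache[key] = render_account_page(user)
    return cache[key]

alice = {"id": 1, "email": "alice@example.com", "wallet": "$10"}
bob = {"id": 2, "email": "bob@example.com", "wallet": "$5"}
```

With the buggy handler, if Alice loads `/account` first, Bob’s request for the same URL returns Alice’s cached page, which matches the kind of exposure described above.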

This exposure occurred for roughly an hour, after which Valve managed to get someone on-site and shut down external access to the problematic pages until they could rectify the breach. (Store pages are accessible again with no adverse consequences as of my drafting this post.) This is a relatively impressive turnaround for a public holiday and is to be commended, not attacked. Only the most essential services should have staff working on public holidays, and Steam is not an essential service.

As a former employee of an organisation that has struggled with both public perceptions and privacy breaches, I can tell you that there are some basic steps that need to be taken as soon as Valve can get people back into work:

  1. Firstly, own up publicly to what information was exposed, apologise to all customers, even those unaffected, and offer to allow people to close their accounts and have their personal information deleted. The first part of this is the basic necessity. You HAVE to apologise if you’ve screwed up, full stop. It also helps if no excuses are made until after the unreserved apology is delivered. But allowing people to express their distrust in you by leaving your service, and deleting their information if they do so, shows you really mean your apology and are accepting the consequences of your mistake.
  2. If possible, generate a list of customers whose accounts were accessed during the timeframe the breach occurred, and warn all of them by email that their privacy may have been breached. Valve should also recommend that they be aware of potential phishing attempts, take any necessary steps to ensure their credit card remains secure, and change their Steam passwords, along with any other passwords that match their Steam passwords. While in the short term actively notifying people of the breach who haven’t learned of it might seem like bad PR, in the medium and long term it means customers know that Valve is willing to be accountable when mistakes are made, and that it will place its customers’ needs ahead of its own PR.
  3. Valve should put ALL employees through privacy training immediately, so they are aware of the consequences of for instance accidentally disclosing an email address or a partial credit card number. This is both a practical (Valve will be under extra scrutiny now, and human security breaches will be much more serious) and a PR requirement.
  4. Valve should take immediate policy steps to ensure this same breach cannot occur again. For instance, they may want to institute a policy that no software patches or website changes that could impact security or privacy are to be pushed near holidays.
  5. In the medium term, Valve needs to upgrade its privacy security policies and systems. Valve serves some of its private information directly over insecure protocols: this needs to stop. If Valve wants to offer Steam pages over the web, it should secure those pages if they offer private information, or it should only serve account information through its client in secure packets, or on separate, secured pages. (similar to how purchases are currently handled) The worst privacy breach that should be possible using secure software is that someone unintended views your account name. There are also some really basic information security steps that can be taken, such as:
    1. turning off auto-complete for any external addresses in all email clients,
    2. stocktaking access to private and/or confidential information and ensuring all access granted is either necessary or authorised, practical, and secure,
    3. disabling insecure methods of file-sharing, such as email attachments, without a second employee authorising them,
    4. implementing quality-checking on any existing and new safeguards, at least in the short term.
  6. In the longer term, ensure customer data is secure from external access and hackers, and properly anonymised to internal employees.
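On the point about separating cacheable public pages from pages carrying private information, one concrete mechanism is to attach conservative HTTP cache headers to any response containing account data, so that neither a CDN nor a shared proxy will ever store it. This is a hedged sketch under my own assumptions; the path prefixes and helper name are hypothetical, though the headers themselves are standard HTTP.

```python
# Sketch: choose cache headers by the sensitivity of the requested path,
# so personalised pages are never stored by shared caches.

PRIVATE_PATH_PREFIXES = ("/account", "/wallet", "/checkout")

def response_headers(path):
    """Return cache-related HTTP headers appropriate to the path."""
    if path.startswith(PRIVATE_PATH_PREFIXES):
        # 'no-store' forbids any cache from keeping the response;
        # 'private' excludes shared caches; 'must-revalidate' forces
        # browsers to check back before reusing anything they hold.
        return {
            "Cache-Control": "no-store, no-cache, must-revalidate, private",
            "Pragma": "no-cache",  # legacy HTTP/1.0 equivalent
        }
    # Public store pages can be cached freely for a short period.
    return {"Cache-Control": "public, max-age=300"}
```

The design point is that the decision is made by path policy, not left to whoever writes each page: a caching update like the one that caused the breach would then have to actively override the policy to expose account pages.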

Valve has a lot of work to do. A lot of this work is better done before any privacy issues occur, but they’re in for a lot of learning about why prevention is better than cure. I’m pretty sorry to all of the employees who weren’t responsible but are about to be affected.

A breach is not a leak

Posted: April 3, 2013 in media

Much of the recent discussion of EQC’s privacy issues contains a glaring semantic error which is colouring the story: most reports describe the two recent incidents as “leaks”. This is an incorrect usage of the term. When you accidentally e-mail a file to the wrong recipient, and it contains private or personal information, what you have is a privacy breach.

When you deliberately e-mail a file with the intent of either revealing the information to a concerned party without official authorisation, or with the intent of creating publicity around an issue, that is a leak.

Privacy breaches require preventative action and are usually not a cause for individual censure (unless there was good policy in place that should have prevented the breach); they are often a matter of systems or workplace culture enabling human error. Leaks are more about the motivations and intent of the individual who performed the leak, and may not even be a bad thing overall, if done with care and for an overriding ethical reason that is not adequately weighted in official policy. For example, what the recipient of the most recent breach is threatening to do by releasing the information to the media would actually be a leak.