August was a busy month as the already cracked veneer on the Irish state’s shambolic data acquisition and sharing projects shattered further. We were treated to a senior government minister denying she was splitting hairs while attempting a precision distinction between ‘mandatory’ and ‘compulsory’. Elsewhere it was (bad) business as usual: unthinking and downright unscrupulous app makers, people being microchipped, and facial recognition trials.
1. The Public Services Card ‘controversy’
I’ve used quotation marks around controversy as it isn’t really a controversy. If you really want to stretch and call it one, it’s a controversy entirely of the Government’s own making. There are unaddressed legal issues around what the Department of Social Protection is doing with the national identity register it is building. As the story developed and slid into September it became apparent that there was also a yawning technical knowledge gap. The Minister for Social Protection, fresh from arguing that mandatory wasn’t compulsory, went back into a radio studio and attempted a redefinition of the word ‘biometric’.
Notes for humans
This isn’t yet a debate about national identity cards. This is a debate about whether we should allow a biometric national identity register to be introduced without debate. In the UK they had a debate and decided they wouldn’t do this. It appears the civil service in Ireland took note of this reversal in the UK and opted not to have a debate, but instead to patch together a justification and a legal basis and issue the cards by decree.
We’ve written a lot about the affair so far and attempted to collect most of the media coverage here.
2. Gendered privacy issues
When you’re a woman whose personal and digital space is invaded with alarming regularity, you think carefully about how your digital life intersects with your real one—especially when the data you’re sharing is quite literally close to your front door.
Rosie Spinks writing in Quartz highlighted some worrying design aspects of the fitness app Strava. A setting called Enhanced Privacy meant nothing like what you’d reasonably expect it to mean. There were many hoops that had to be jumped through to prevent the app making a user’s name and their running location easily available to other users.
In Motherboard Jillian York pointed out related concerns about the popular secure messaging app Signal.
Fortunately, I can block a single harasser’s phone number, but what if someone decided to make my private number public? I’m not willing to take that risk.
I’m not so surprised that the mostly-male developers of these tools didn’t consider these risks—risks that largely affect women and other vulnerable groups.
Notes for humans
These things happen when technology products and services are designed predominantly by youngish white men driven by a commercial imperative to acquire as much personal information as possible. The mindset that sharing personal information must be a feature users desire is still common.
This will change as app and service makers respond to market forces and new laws. Recent surveys show that people remain uncomfortable with the amount of their personal data acquired by technology companies. In the Australian Community Attitudes To Privacy Survey 2017 carried out by the Australian Information Commissioner’s office only 17% of respondents reported “feeling comfortable with social networking companies keeping databases of information on their online actions.”
3. Sonos: accept the new privacy policy or your devices may stop working
Sonos told customers that if they did not accept its new privacy policy their devices might eventually stop working.
Notes for humans
Besides the bullying aspect of this decision by Sonos there is a significant broader problem with internet-connected devices which require a connection and third party software running on a third party’s servers to work. Circumstances change and companies stop updating software. This really shouldn’t result in a perfectly good device ceasing to work.
- Privacy policies are just that, policies, so they’re more of a promise or an aspiration than a watertight commitment
- For now it is up to you to keep an eye on changes and make decisions about whether you wish to continue using that product or service
4. A Very Well Known Weather App Is Selling Your Location Data To Advertisers
What’s more, even when found out, AccuWeather’s makers seem unrepentant and haven’t stopped.
Security researcher Will Strafach intercepted the traffic between an iPhone running the latest version of AccuWeather and the app’s servers, and found that even when the app didn’t have permission to access the device’s precise location, it would send the Wi-Fi router name and its unique MAC address to the servers of data monetization firm Reveal Mobile every few hours. That data can be correlated with public data to reveal an approximate location of a user’s device.
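The correlation Strafach describes can be sketched in a few lines. The snippet below is purely illustrative: the database contents, MAC addresses and function names are all invented, standing in for the kind of crowd-sourced wardriving surveys that map router MAC addresses (BSSIDs) to coordinates.

```python
# Toy stand-in for a public BSSID survey: crowd-sourced datasets map
# Wi-Fi routers' MAC addresses to the coordinates where they were seen.
# All entries here are invented for illustration.
PUBLIC_BSSID_SURVEY = {
    "3c:37:86:5d:75:d4": (53.3498, -6.2603),  # a router logged in Dublin
    "a4:2b:b0:12:9f:01": (51.5074, -0.1278),  # a router logged in London
}

def approximate_location(router_mac: str):
    """Return an approximate (lat, lon) for a router's MAC address,
    or None if the address isn't in the survey data."""
    return PUBLIC_BSSID_SURVEY.get(router_mac.lower())

# A third party receiving only your home router's MAC address can
# place your device, without ever asking for location permission:
print(approximate_location("3C:37:86:5D:75:D4"))  # (53.3498, -6.2603)
```

This is why the router identifier is valuable to a data monetization firm even when the app is denied GPS access: the MAC address acts as a proxy for location.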
Notes for humans
Selling data to third parties is how many app makers pay the bills, or make additional profits. They’re not particularly bothered whether you want them to do that with your data or not; they’re prepared to ignore your obvious preferences and use underhand workarounds to keep that data flowing. Be aware of how their money is made. Uninstall apps that don’t make it clear what they’re doing with your personal data. Immediately uninstall any apps from makers who are as dishonest in their data gathering as AccuWeather.
5. Facial Recognition And False Positives
The use of facial recognition systems was another topic that tumbled out of the large and growing pile of unanswered questions about the Public Services Card. During August police in Berlin announced a six-month trial of a facial recognition system in a major train station, and police in England trialled facial recognition at the Notting Hill Carnival.
Notes for humans
Your face is yours. It is a defining feature of your identity. But it’s also just another datapoint waiting to be collected. At a time when cameras are ubiquitous and individual data collection is baked into nearly every transaction a person can make, faces are increasingly up for grabs.
‘Who Owns Your Face’, The Atlantic
The Metropolitan police have denied that their facial recognition system at the Notting Hill Carnival prompted them to make a wrongful arrest. Nevertheless, these systems are intrusive and unreliable. As they become more reliable, their use must be far more carefully regulated. You have an absolute right to know who has an image of your face, what other personal information it is linked to, and what uses are being made of it.
+ ‘Germany’s facial recognition pilot program divides public’, Deutsche Welle
+ ‘How a Facial Recognition Mismatch Can Ruin Your Life’, The Intercept
- Facebook is another organisation that’s very interested in faces. It has its own biometric facial recognition system called DeepFace and holds the world’s largest store of images of faces. “When you tag a friend in a photo, that’s feeding a massive facial recognition dataset.” Maybe it’s time to rethink your personal policy on tagging?
- Workers in Wisconsin are having microchips embedded in their hands to, uh, improve their lives and not the bottom line of their employer, no doubt.
- Ahead of the first formal review in September, European Data Protection Supervisor Giovanni Buttarelli said the Privacy Shield data transfer agreement between the EU and the US was “an interim instrument for the short term. Something more robust needs to be conceived.”
- The HSE announced that CIO Richard Corbridge will be leaving his post later in the year. Corbridge was always enthusiastic in his communications and does seem to have energised the HSE in their attempts at digital transformation. What his departure will mean for the HSE’s enormous Individual Health Identifier register remains to be seen.
[Image credit: Scott Webb on Unsplash]