August Roundup

August was a busy month as the already cracked veneer of the Irish state’s shambolic data acquisition and sharing projects shattered further. We were treated to a senior government minister denying she was splitting hairs while attempting a precision distinction between ‘mandatory’ and ‘compulsory’. Elsewhere it was (bad) business as usual: unthinking and downright unscrupulous app makers, people being microchipped, and facial recognition trials.

1. The Public Services Card ‘controversy’

I’ve used quotation marks around controversy as it isn’t really a controversy. If you really want to stretch and call it one, it’s a controversy entirely of the Government’s own making. There are legal questions, as yet unaddressed by the Department of Social Protection, about what it is doing with the national identity register it is building. As the story developed and slid into September it became apparent that there was also a yawning technical knowledge gap. The Minister for Social Protection, fresh from arguing that mandatory wasn’t compulsory, went back into a radio studio and attempted a redefinition of the word biometric.

Notes for humans

This isn’t yet a debate about national identity cards. This is a debate about whether we should allow a biometric national identity register to be introduced without debate. In the UK they had that debate and decided against it. It appears the civil service in Ireland took note of this reversal in the UK and opted not to have a debate but to patch together a justification and a legal basis and issue the cards by decree.

We’ve written a lot about the affair so far and attempted to collect most of the media coverage here.

2. Gendered privacy issues

When you’re a woman whose personal and digital space is invaded with alarming regularity, you think carefully about how your digital life intersects with your real one—especially when the data you’re sharing is quite literally close to your front door.

Rosie Spinks, writing in Quartz, highlighted some worrying design aspects of the fitness app Strava. A setting called Enhanced Privacy meant nothing like what you’d reasonably expect it to mean, and there were many hoops to jump through to prevent the app from making a user’s name and running locations easily available to other users.

In Motherboard Jillian York pointed out related concerns about the popular secure messaging app Signal:

Fortunately, I can block a single harasser’s phone number, but what if someone decided to make my private number public? I’m not willing to take that risk.

I’m not so surprised that the mostly-male developers of these tools didn’t consider these risks—risks that largely affect women and other vulnerable groups.

Notes for humans

These things happen when technology products and services are designed predominantly by youngish white men driven by a commercial imperative to acquire as much personal information as possible. The assumption that sharing personal information is a feature users must want remains common.

This will change as app and service makers respond to market forces and new laws. Recent surveys show that people remain uncomfortable with the amount of their personal data acquired by technology companies. In the Australian Community Attitudes To Privacy Survey 2017, carried out by the Australian Information Commissioner’s office, only 17% of respondents reported “feeling comfortable with social networking companies keeping databases of information on their online actions.”

The upcoming European General Data Protection Regulation embraces the principles of Privacy By Design as developed by Ann Cavoukian and others in the 1990s. Simply put, these principles place the onus on the folks making products and services to consider user privacy at every stage of the design process and not as an afterthought which can be covered by inserting another page of impenetrable legalese into the privacy policy that barely anybody reads anyway.
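
To make the principle a little more concrete, here is a minimal sketch in Python of what data minimisation, one strand of Privacy by Design, can look like in practice. Every function, field and value in it is invented for illustration; it isn’t anybody’s real analytics code.

```python
import hashlib
from datetime import datetime, timezone

# Illustrative only: the event shape and field names are invented to show
# data minimisation, one of the Privacy by Design principles.

def log_listen_greedy(email, ip_address, track_id):
    """The 'collect everything, decide what it's for later' approach."""
    return {
        "email": email,            # direct identifier the stated purpose doesn't need
        "ip_address": ip_address,  # ditto
        "track_id": track_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def log_listen_minimal(email, ip_address, track_id):
    """Collect only what playback statistics actually require."""
    # A salted hash pseudonymises the listener: distinct listeners can still
    # be counted, but the raw email address and IP address are never stored.
    listener = hashlib.sha256(("per-service-salt:" + email).encode()).hexdigest()[:12]
    return {
        "listener": listener,
        "track_id": track_id,
        # Coarsen the timestamp: the hour is plenty for usage statistics.
        "hour": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:00Z"),
    }
```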

+ ‘Using a fitness app taught me the scary truth about why privacy settings are a feminist issue’, Quartz

+ ‘How to Use Signal Without Giving Out Your Phone Number: A Gendered Security Issue’, Motherboard

3. Privacy policy as ransom

Speaking of privacy policies, consider this not at all hypothetical situation. You’ve splashed out on a fancy speaker from a big-name brand. The speaker turns around and demands you accept new terms and conditions or it will simply stop working. Sonos, maker of swanky speakers, announced that owners who had already handed the company a chunk of money would have to accept a new privacy policy or accept that their speakers would stop working over time. To any reasonable observer the data which Sonos described as “functional data” appears to go far beyond the functional. Sonos doesn’t need to know your email address and IP address to ensure your speaker keeps functioning properly.

Notes for humans

Besides the bullying aspect of this decision by Sonos, there is a significant broader problem with internet-connected devices which require an internet connection and third-party software running on somebody else’s servers in order to work. Circumstances change and companies stop updating software. That really shouldn’t result in a perfectly good device ceasing to work.

A privacy policy is something you should read carefully. Of course, you don’t. A few things you should know about privacy policies, even if you never read one from beginning to end:

  • They’re a policy so they’re more of a promise or an aspiration than a watertight commitment
  • They’re liable to change at any time. If there are significant changes to a privacy policy, whoever wrote it and is holding your personal data should give you notice of the impending changes and what they mean for you. They usually don’t make as much of an effort as they could to do so.
  • For now it is up to you to keep an eye on changes and make decisions about whether you wish to continue using that product or service

+ ‘Sonos says users must accept new privacy policy or devices may “cease to function”’, ZDNet

+ ‘How to Stop Your Sonos From Collecting (As Much) Personal Data’, Lifehacker


4. A Very Well Known Weather App Is Selling Your Location Data To Advertisers

What’s more, even when found out, the app’s maker seems unrepentant and hasn’t stopped.

Security researcher Will Strafach intercepted the traffic between an iPhone running the latest version of AccuWeather and the app’s servers, and found that even when the app didn’t have permission to access the device’s precise location it would send the name of the connected Wi-Fi router and its unique MAC address to the servers of data monetization firm Reveal Mobile every few hours. That data can be correlated with public data to reveal the approximate location of a user’s device.
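
To give a sense of why a single router’s MAC address is so revealing, here is a minimal sketch of the sort of lookup anyone can do against a public Wi-Fi geolocation service. It uses Google’s Geolocation API purely as an example of such a service (data brokers maintain their own, much larger databases); the API key and MAC addresses below are placeholders, and this particular API generally wants at least two access points per query.

```python
import json
import urllib.request

# Illustrative sketch only: the API key and MAC addresses are placeholders.
# Google's Geolocation API is one example of a public service that maps
# Wi-Fi access points (identified by MAC address) to coordinates.

API_KEY = "YOUR_API_KEY"
URL = f"https://www.googleapis.com/geolocation/v1/geolocate?key={API_KEY}"

payload = {
    "considerIp": False,  # rely on the access points alone, not the caller's IP
    "wifiAccessPoints": [
        {"macAddress": "01:23:45:67:89:AB", "signalStrength": -65},
        {"macAddress": "01:23:45:67:89:AC", "signalStrength": -70},
    ],
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.load(response)

# A successful response contains an estimated latitude/longitude and an
# accuracy radius in metres, often narrow enough to identify the building.
print(result["location"], result["accuracy"])
```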

Notes for humans

Selling data to third parties is how many app makers pay the bills, or make additional profits. They’re not particularly bothered about whether you want them to do that with your data, and they’re prepared to ignore your clearly expressed preferences and find a workaround so they can keep that data flowing. Be aware of how their money is made. Uninstall apps that don’t make it clear what they’re doing with your personal data. Immediately uninstall any apps from makers who are as dishonest in their data gathering as AccuWeather.

+ ‘AccuWeather caught sending user location data, even when location sharing is off’, ZDNet

+ ‘Despite privacy outrage, AccuWeather still shares precise location data with ad firms’, ZDNet

+ ‘AccuWeather app caught “red-handed” tracking location of users against their wishes’, BoingBoing

5. Facial Recognition And False Positives

The use of facial recognition systems was another topic that tumbled out of the large and growing pile of unanswered questions about the Public Services Card. During August, police in Berlin announced a six-month trial of a facial recognition system at a major train station, and the Metropolitan Police trialled facial recognition at the Notting Hill Carnival in London.

Notes for humans

Your face is yours. It is a defining feature of your identity. But it’s also just another datapoint waiting to be collected. At a time when cameras are ubiquitous and individual data collection is baked into nearly every transaction a person can make, faces are increasingly up for grabs.

‘Who Owns Your Face’, The Atlantic

The Metropolitan Police have denied that their facial recognition system at the Notting Hill Carnival led them to make a wrongful arrest. Nevertheless, these systems are intrusive and unreliable, and as they become more reliable their use must be far more carefully regulated. You have an absolute right to know who holds an image of your face, what other personal information it is linked to, and what uses are being made of it.
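
To see why false positives matter at the scale of something like the Carnival, here is a rough back-of-the-envelope calculation. Every figure in it is invented for illustration; the point is the shape of the arithmetic, not the specific numbers.

```python
# Entirely illustrative figures -- not the Met's actual numbers.
faces_scanned = 1_000_000    # assumed crowd size over the event
people_on_watchlist = 50     # assumed genuine matches present in that crowd
false_positive_rate = 0.001  # 0.1%: one innocent face in a thousand flagged by mistake
true_positive_rate = 0.90    # the system spots 90% of the people it is looking for

false_alarms = (faces_scanned - people_on_watchlist) * false_positive_rate
genuine_hits = people_on_watchlist * true_positive_rate

print(f"False alarms: {false_alarms:.0f}")   # ~1,000 innocent people flagged
print(f"Genuine hits: {genuine_hits:.0f}")   # ~45
print(f"Chance a flagged person is a real match: "
      f"{genuine_hits / (genuine_hits + false_alarms):.1%}")  # roughly 4%
```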

+ ‘German police test facial recognition cameras at Berlin stations’, Reuters

+ ‘Germany’s facial recognition pilot program divides public’, Deutsche Welle

+ ‘Police deny Notting Hill Carnival face recog tech led to wrongful arrest’, The Register

+ ‘The decision to use facial recognition software at Notting Hill Carnival is another example of racial profiling by the police’, Independent

+ ‘How a Facial Recognition Mismatch Can Ruin Your Life’, The Intercept


Honourable mentions

[Image credit: Scott Webb on Unsplash]
