Tag: privacy

1. Proposed Regulations under the California Consumer Privacy Act: A Step in the Right Direction but Far from the Destination
2. Pushing for Change: Congress Pushes for Privacy Legislation ahead of CCPA
3. Human error accounts for 34% of Notifiable Data Breaches – 3 key take outs from the latest OAIC report
4. Uniformity of Law: NSW Government opens consultation to consider making Data Breach Reporting mandatory in respect of State Government Agencies
5. You can be anonymised, but you can’t hide
6. Who have you been giving your name and number to? A cautionary tale
7. Facing up to privacy risks
8. The OAIC engages in more in-depth investigations and stronger exercise of its power
9. Privacy Awareness Week (Personal Data): technology suspicion – consumer concerns surrounding voice and digital assistants
10. Encryption bill to give unprecedented power

Proposed Regulations under the California Consumer Privacy Act: A Step in the Right Direction but Far from the Destination

By Cameron Abbott and Max Evans

We recently blogged about the intention of Californian lawmakers to enact stringent privacy regulations through the California Consumer Privacy Act (CCPA). In particular, we noted the useful guidance provided by our colleagues over at The Privacist on the impact of potential contingencies for organisations.

Read More

Pushing for Change: Congress Pushes for Privacy Legislation ahead of CCPA

By Cameron Abbott and Max Evans

With the California Consumer Privacy Act (CCPA) looming, Californian lawmakers have affirmed their intention to enact stringent privacy protections: the legislature adjourned without making any major changes to the state’s landmark privacy laws.

Read More

Human error accounts for 34% of Notifiable Data Breaches – 3 key take outs from the latest OAIC report

By Cameron Abbott and Karla Hodgson

The Office of the Australian Information Commissioner has released its Q2 statistics on notifications received under the Notifiable Data Breach (NDB) scheme. The 245 breach notifications in Q2 are on par with every other quarter since the scheme was introduced in February 2018. While the majority of NDBs (62%) are attributed to malicious or criminal attacks, we noted with interest that a staggering 34% are due to human error – that is, mostly avoidable errors made by staff. A consistent theme of our blogs is reinforcing the message that employees are the front line of defence for organisations.

There are 3 key statistics we took away from these human error NDBs.

Read More

Uniformity of Law: NSW Government opens consultation to consider making Data Breach Reporting mandatory in respect of State Government Agencies

By Cameron Abbott, Warwick Anderson and Max Evans

We have blogged numerous times on the notifiable data breach scheme provided for in Part IIIC of the Privacy Act 1988 (Cth), including more recently in relation to its success in assisting the preparedness of the health sector to report and respond to data breaches.

Whilst the NSW Information Privacy Commissioner recommends that public sector agencies notify it and affected individuals where a data breach creates a risk of serious harm, neither NSW privacy laws nor the Commonwealth notifiable data breach scheme requires public sector agencies in NSW to provide such notification. There are many reasons for state government agencies to report data breaches on a mandatory basis. Informing citizens when privacy breaches occur gives individuals an opportunity to protect themselves against potentially adverse consequences, and mandatory reporting would address the current under-reporting of data breaches in NSW, which, according to the consultation, may be the norm.

Read More

You can be anonymised, but you can’t hide

By Cameron Abbott, Michelle Aggromito and Karla Hodgson

If you think there is safety in numbers when it comes to the privacy of your personal information, think again. A recent study in Nature Communications found that, given a large enough dataset, anonymised personal information is only an algorithm away from being re-identified.

Anonymised data refers to data that has been stripped of any identifiable information, such as a name or email address. Under many privacy laws, anonymising data allows organisations and public bodies to use and share information without infringing an individual’s privacy, or having to obtain necessary authorisations or consents to do so.

But what happens when that anonymised data is combined with other data sets?
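
To make the risk concrete, here is a minimal, hypothetical sketch in Python of a so-called linkage attack (the field names and records are invented for illustration and are not the study’s methodology): an “anonymised” dataset that retains quasi-identifiers such as postcode, birth year and gender can simply be joined against another dataset that includes names.

```python
# Hypothetical illustration of a linkage (re-identification) attack.
# All field names and records below are invented for the example.

anonymised_health_records = [
    {"postcode": "3000", "birth_year": 1985, "gender": "F", "diagnosis": "asthma"},
    {"postcode": "2010", "birth_year": 1972, "gender": "M", "diagnosis": "diabetes"},
]

# A separate, publicly available dataset (for example, a marketing list or a
# scraped profile dump) carrying the same quasi-identifiers plus a name.
public_records = [
    {"name": "Jane Citizen", "postcode": "3000", "birth_year": 1985, "gender": "F"},
    {"name": "John Smith", "postcode": "2010", "birth_year": 1972, "gender": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "gender")

def link(anon_rows, public_rows):
    """Join the two datasets on the shared quasi-identifiers."""
    index = {
        tuple(row[k] for k in QUASI_IDENTIFIERS): row["name"]
        for row in public_rows
    }
    matches = []
    for row in anon_rows:
        key = tuple(row[k] for k in QUASI_IDENTIFIERS)
        if key in index:
            matches.append({"name": index[key], **row})
    return matches

for match in link(anonymised_health_records, public_records):
    print(match)  # the "anonymised" record now has a name attached
```

The study’s point is that, once enough attributes are retained, almost every individual’s combination of quasi-identifiers becomes unique, so joins like this one succeed far more often than intuition suggests.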

Read More

Who have you been giving your name and number to? A cautionary tale

By Cameron Abbott and Allison Wallace

Have you inadvertently given the owners of global, searchable databases of phone numbers and associated names access to your entire contact list?

We suspect that you cannot confidently answer “no”.

In yet another tale of why you should read the terms of use and service of apps and other online products you download or sign up to use, we’ve recently experienced the shock of having our name appear on a complete stranger’s phone after they were given our number (but not our name) to call us. We asked how this could happen – and found the answer to be quite alarming.

The Samsung Smart Call function, which is powered by Hiya, boasts that it allows you to “deal with spam the easy way”, by letting you know who is calling you, even if their number is not saved in your contact list. In theory, this is a handy tool, and in the context of robocalls or other unsolicited marketing calls, doesn’t create any privacy issues. But when the database which powers the function contains the names and numbers of (we suspect) millions of private citizens, this becomes quite concerning.

So, how do private numbers (and the names of their associated users) come to be listed in databases like Hiya’s? Well, for one, anyone who downloads the Hiya app is given the option to share their contacts. If they do, and your number is saved to their phone, your details will become part of the database. We have no doubt that many who download and use the Hiya app didn’t realise what they were signing up for (or what they were signing up their entire contact list for) – because they didn’t read the terms of use. This also raises the question: are companies like Hiya properly satisfying their privacy obligations merely by asking users to “opt in” to share their contacts?
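
The mechanics are easy to sketch. The snippet below is a hypothetical Python illustration (not Hiya’s actual code; the function names and data are invented) of how a crowd-sourced caller-ID index can be built: each user who opts in uploads their whole address book, and every number in it becomes searchable by name, whether or not the people behind those numbers ever installed the app or consented themselves.

```python
# Hypothetical sketch of a crowd-sourced caller-ID index.
# This is not any vendor's real implementation; names and data are invented.
from __future__ import annotations
from collections import defaultdict

# Maps a phone number to the set of names other people have saved it under.
reverse_lookup: dict[str, set[str]] = defaultdict(set)

def upload_contacts(uploader_id: str, contacts: dict[str, str]) -> None:
    """Called when a user opts in to "share contacts".

    Every entry in *their* address book is added to the global index,
    including people who never used the app themselves.
    """
    for name, number in contacts.items():
        reverse_lookup[number].add(name)

def identify_caller(number: str) -> str | None:
    """What the app displays when an unknown number calls."""
    names = reverse_lookup.get(number)
    return sorted(names)[0] if names else None

# One user shares their address book...
upload_contacts("user-123", {"Alex Example": "+61 400 000 001"})

# ...and now a complete stranger's phone can put a name to that number.
print(identify_caller("+61 400 000 001"))  # -> "Alex Example"
```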

Hiya is of course not the only “caller ID” app on the market – a quick search of the Apple App store reveals numerous other options for download – including Truecaller, Caller-ID, Sync.ME and CallHelp. In 2018, Hiya reached 50 million active users worldwide, while Truecaller’s website says it has over 130 million daily active users. Those figures of course would barely scrape the surface of the number of names and phone numbers held in their collective databases.

In case you’re wondering how much damage could really be done by a third party having access to your name and phone number, think about all of the things your number is linked to: your Facebook, your Gmail, maybe even your bank account and credit cards. Information is power – and this is the kind of information that could easily allow hackers to wreak havoc. So before you sign up to a new app, take the time to read the terms of service, because your use could be exposing not only your personal information, but also that of your entire contact list.

Facing up to privacy risks

By Cameron Abbott and Karla Hodgson

Images of dramatically aged friends and family members have been flooding social media feeds over the last week, courtesy of FaceApp, an app that uses AI to digitally age a user’s photo. While many have been asking themselves “why would I make myself look older?” others have been discussing the risks of allowing an app to access and store personal data.

The app’s privacy policy allows FaceApp to retrieve information such as IP addresses and location data from users, in addition to the photo the user has selected for editing. When users agree to FaceApp’s terms of service, they grant FaceApp a perpetual and irrevocable licence to use this data, including their name and likeness, for any purpose, including commercial purposes.

Read More

The OAIC engages in more in-depth investigations and stronger exercise of its power

By Cameron Abbott, Rob Pulham and Jacqueline Patishman

Following two key incidents concerning how the Commonwealth Bank of Australia (CBA) handled data, the OAIC has successfully taken court action binding the banking heavyweight to “substantially improve its privacy practices”.

As a quick summary: the first incident involved the loss of magnetic storage tapes used to print account statements, which contained historical customer data, including statements, for up to 20 million bank customers. In 2016, the CBA was unable to confirm that the two magnetic tapes had been securely disposed of after their scheduled destruction by a supplier.

Read More

Privacy Awareness Week (Personal Data): technology suspicion – consumer concerns surrounding voice and digital assistants

By Cameron Abbott, Rob Pulham, Michelle Aggromito, Max Evans and Rebecca Gill

Protecting personal data is a fundamental aspect of any privacy regime. As we become more technologically advanced, organisations are finding innovative ways to interact with consumers through more intuitive communication channels, such as voice recognition via digital assistants. But not everyone trusts such technology, as Microsoft’s April 2019 report on voice assistants and conversational artificial intelligence has found.

The report found that 41% of voice assistant users were concerned about trust, privacy and passive listening. Other interesting findings of the report include:

Read More

Encryption bill to give unprecedented power

By Cameron Abbott and Wendy Mansell

The Coalition government is attempting to pass large-scale decryption reforms which will give sweeping powers to law enforcement agencies for overt and covert computer access.

The reforms have caused significant controversy, as they may force tech companies and communications providers to modify their services, creating “systemic weaknesses” for intelligence agencies to exploit. However, many point out that these same vulnerabilities may also be exploited by criminals.

Further, these reforms may undermine consumers’ privacy, safety and trust by enabling unprecedented access to private communications. They could also have anti-competitive effects, as the reputations of Australian software developers and hardware manufacturers may suffer in international markets.

At the same time, the harsh reality that terrorists and organised crime groups increasingly use these technologies to evade surveillance highlights a very clear problem for law enforcement authorities.

We won’t seek to suggest where the balance between these interests should lie, but the debate rages on. Stay tuned.

Copyright © 2024, K&L Gates LLP. All Rights Reserved.