
Data Protection Breaches - Recent Cases

In a recent case, Plymouth Hospital NHS Trust was ordered to pay compensation to a patient after one of its employees unlawfully gained access to the man’s medical records. The nurse who accessed the data was the man’s partner at the time. The patient claimed that the breach of the Data Protection Act 1998 (DPA), and the way his subsequent complaint about the matter was handled, had exacerbated a pre-existing paranoid personality disorder and prevented him from working. He was awarded damages of £12,500 for the exacerbation of his pre-existing medical condition and £4,800 for loss of earnings.

In a second case, a former health worker at the Royal Liverpool University Hospital pleaded guilty to unlawfully obtaining patient information by accessing the medical records of five members of her ex-husband’s family so that she could obtain their new telephone numbers. The matter came to light when a man contacted the hospital after receiving nuisance calls which he suspected had been made by his former daughter-in-law. He had previously changed his phone number following unwanted calls from her and was immediately concerned that there had been a breach of patient confidentiality. Checks by the hospital revealed that none of the patients whose details had been compromised were at any time under the woman’s care, and she had no work-related reason to access their records. She had accessed the information for her own purposes without the consent of her employer and was fined £500 for breach of the DPA, and was also ordered to pay £1,000 towards prosecution costs and a £15 victim surcharge.

Meanwhile, the European Commission has announced proposals for significant reform of data protection legislation. The Information Commissioner’s initial response to the proposals can be found on the website of the Information Commissioner's Office.


The UK Supreme Court handed down its much-anticipated decision in Lloyd v Google LLC [2021] UKSC 50 on 10 November 2021, restricting claimants’ ability to bring data privacy class actions in the UK under the (now repealed) Data Protection Act 1998 (DPA 1998). This decision will be persuasive (though not binding) with respect to similar class actions brought under the (in-force) UK General Data Protection Regulation and the Data Protection Act 2018 (collectively, the UK GDPR). This decision will not directly impact litigation brought under the EU General Data Protection Regulation in EU member states.

Key takeaways

The Supreme Court determined that compensation under section 13 of the DPA 1998 may be awarded to affected individuals only where it is established that an individual has suffered damage (interpreted by the Supreme Court to mean material damage such as financial loss or mental distress) caused by a contravention of the DPA 1998 by a data controller. Importantly, a statutory infringement would not, in and of itself, constitute material damage for the purposes of awarding compensation. Requiring claimants to prove that an infringement of the DPA 1998, no matter how severe, has caused the claimant’s damage makes it more difficult for claimants to succeed in such claims.

The Supreme Court suggested that it may be appropriate for claimants to bring bifurcated proceedings in similar cases in the future, i.e., to first bring a representative action to establish the defendant’s liability, and to then pursue individual claims for compensation. This type of two-stage approach will make such claims less appealing for both prospective claimants (given the increased cost to pursue individual, low value claims through to the second stage) and litigation funders (who may only obtain an award for damages following the second stage of litigation and now will have to prove the damage caused to each particular individual).

Importantly, the Supreme Court’s decision was made under the now-repealed DPA 1998, and not the in-force UK GDPR. While the DPA 1998 is similar to the UK GDPR, there are potentially material differences in the statutory regimes. Therefore, while the Supreme Court’s decision will be persuasive with respect to similar class actions brought under the UK General Data Protection Regulation and the Data Protection Act 2018, it is unclear whether it would be formally binding with respect to such class actions.

Overall, this decision will have wide-ranging consequences for the future of class actions in the UK in general, and in the data protection field in particular. In turn, this decision is likely to be welcomed by many businesses that may be potential defendants in data privacy litigation in the UK. Nonetheless, businesses will need to be wary with respect to potential UK class actions under the UK GDPR.

Our in-depth analysis of the wider impact of the Supreme Court’s decision on the class action landscape in the UK will be published shortly.


GDPR: Key cases so far

  • 7 February 2019
  • Data Protection & GDPR

Loretta Maxfield


Google fined by national French data protection regulator

On 21 January 2019, Google LLC was fined €50 million by the Commission Nationale de l’Informatique et des Libertés (CNIL) for various failings under the GDPR.

The main failing the CNIL found was that individuals using Google’s services were not furnished with the requisite “fair processing information” (the information usually provided in privacy notices): Google had seemingly omitted to inform individuals about why it processed their personal data and how long their data was kept. The ruling also criticised the accessibility of the information, saying that although most of the information was there, it was scattered around Google's site via various different links.

The second key failing was not meeting the GDPR standard of “consent” when providing personalised advert content. Under the GDPR, consent must be sufficiently informed, specific, unambiguous and granular, and must be gained through a form of active acceptance. In the first instance, the CNIL did not consider the consent to be sufficiently informed, as it ruled that users were not given enough information about what giving their consent would mean in terms of the ad personalisation services Google would then push. The fine was also imposed in light of Google not ensuring that consent met the GDPR threshold, through using pre-ticked boxes and not separating out consents for advert personalisation from other processing by Google.

The takeaways for your organisation are to ensure it is easy for your customers or service users to understand what you do with their data. Privacy notices should be clearly signposted, and should be as accurate as possible about what data is collected and why it is used. The ruling also reminds us of the strict threshold consents must reach before they are valid. Businesses are certainly becoming more savvy when it comes to making sure individuals can give consent for different purposes, but it’s not uncommon to still come across the pre-ticked box! If your organisation relies on consent and would like Thorntons to review how you use it, please get in touch and we can give advice on whether you are meeting the GDPR standard.

Marriott International suffers unprecedented data breach

On 19 November last year, Marriott International announced that the personal data of 500 million of its customers had been compromised. The group, which operates hotel chains under the brands W Hotels, Sheraton and Le Méridien among many others, said that it had reason to believe that certain of its computer systems had been hacked in 2014, which has now led to this breach. The number of people affected, whose data relates to customer bookings from 2014 onwards, has since been revised; whilst Marriott still cannot state the exact number, it believes the number of affected customer records now totals around 383 million. This remains an extremely large number of affected customers, and the hackers were able to access personal details, passport numbers and, in some cases, payment information.

Although a breach of this scale is rare, there are various pointers that all organisations can take from this case. Firstly, it’s a reminder to continuously monitor the technical and organisational security measures protecting personal data. Testing and monitoring of your organisation’s security should be subject to regular review. Secondly, it’s a reminder to have in place a practical guide for how to respond to a data breach. As well as setting out a clear process for how to report and assess breaches internally, your guide should be clear on what kinds of breach should be reported to the ICO, and perhaps include statements to release to the media. Lastly, this case is a reminder to conduct regular audits of the data held, so that your organisation is always aware of how much data it actually holds. Marriott’s reduced estimate of the number of data subjects affected is based on the fact that it has now discovered that many of the compromised accounts actually relate to the same individual. If Marriott had held an up-to-date list of active customers, it could potentially have responded more quickly.

The ICO takes action against organisations for failing to pay the new data protection fee

At the end of September, the ICO announced that it had begun formal enforcement action against organisations for failing to pay the new data protection fee. Since 25 May, when the GDPR came into force, organisations which are classified as data controllers have been required by the Data Protection (Charges and Information) Regulations 2018 to register with the ICO and pay the applicable fee. Whilst the specific organisations have not been named, the ICO has confirmed that it has issued 900 notices of intent to fine organisations spanning “the public and private sector including the NHS, recruitment, finance, government and accounting”. Of those 900, to date 100 penalty notices have been issued, ranging from £400 to £4,000, although the ICO has confirmed that the maximum could be £4,350 depending on aggravating factors. If you are unsure whether your organisation is required to pay a fee, please get in touch and we can advise accordingly.

The ICO issues its first Enforcement Notice for a breach of GDPR

The ICO has issued its first formal notice under the GDPR to AggregateIQ Data Services Ltd (“AIQ”). AIQ, a Canadian company, was involved in targeting political advertising on social media to individuals whose information was supplied to them by various political parties and campaigns (such as Vote Leave, BeLeave, Veterans for Britain, and DUP Vote to Leave).

After an investigation by the ICO, AIQ was found not to have adequately complied with its obligations as a controller under the GDPR by: (1) not processing personal data in a way that the data subjects were aware of; (2) not processing personal data for purposes which the data subjects expected; (3) not having a lawful basis for processing; (4) not processing the personal data in a way which was compatible with the purposes for which it was originally collected; and (5) not issuing the appropriate fair processing information to those individuals (commonly communicated through a privacy notice).

As well as those practical failings, the ICO also considered that the processing of the information passed to AIQ and used for targeted advertising was likely to cause the individuals concerned damage or distress, as they had not been given the opportunity to understand how their personal information would be used.

The most interesting point about this case is that, although the company is based in Canada, the ICO has still exercised its authority over organisations which process the data of those in the UK, and has ordered that AIQ must now erase all the personal data it holds on individuals in the UK. For a company which mainly deals in data and analytics, this could have a detrimental impact on its business operations in the UK. Although AIQ was passed the personal data by other organisations, this enforcement action demonstrates that it was still AIQ’s responsibility to ensure that its use of the data was not incompatible with any of the purposes for which it was originally collected, and still incumbent on it to ensure individuals were aware of what it was doing with their data. In addition, whilst there has been and continues to be a lot of emphasis in the media on the risk of large fines under the GDPR, it is notable that no monetary penalty has been issued by the ICO, although the ICO has reserved its ability to do so should AIQ not comply with the notice.

Morrisons held liable for the wrongful acts of its rogue employee by the Court of Appeal (England)

The circumstances of this interesting case centre on an employee whose rogue actions were still considered by the court to be attributable to the employer as a breach of the Data Protection Act 1998. The employee was employed by Morrisons Supermarkets as an internal IT auditor and, in 2014, knowingly decided to copy the personal data of around 100,000 of Morrisons’ employees onto a USB stick. At home, the employee then posted the personal data, which included names, addresses and bank details, onto the internet under the name of another Morrisons employee in an attempt to cover his tracks.

In finding that Morrisons was vicariously liable for the actions of the rogue employee, the Court concluded that there was a sufficiently close link between the employee’s job role, and the wrongful action. That the wrongful event occurred outside the workplace was irrelevant, as the Court found that the employee in question was acting “within the field of activities assigned”. Because the employee had access to the compromised personal data in the course of carrying out his role in facilitating payroll, he was specifically entrusted with that kind of information in order to do his job, so the Court decided that there was a sufficient link between the job role and the wrongful disclosure.

The key, striking message from this case is that it is possible for employers to be held liable for rogue actions taken by their employees. Although this particular employee was obviously not acting within the expected confines of his job role, it is interesting that the Court still determined that employers may be liable for acts they would normally reasonably consider to be out of their control. Although this incident occurred in 2014 and was therefore decided under the Data Protection Act 1998, the case demonstrates how vital it is that organisations put in place appropriate technical and organisational security measures adequate for the type of data being held, taking into account the risk of disgruntled employees and what they may do with their access to the information. The case also acts as a reminder to ensure your staff are trained and aware of data protection and the role they personally can play in protecting data, rather than focusing only on the technical computer security to which many organisations pay more attention. As remarked in the judgment, it also serves as a reminder to have adequate insurance in place in the event of a major data breach.

The ICO receives notification of thousands of breaches

Although organisations could report data breaches to the ICO under the Data Protection Act 1998, you will be aware that under the GDPR there is mandatory reporting of breaches to the ICO in cases where there is a “risk to the rights and freedoms of individuals”. The ICO has now reported that it has received notification of more than 8,000 breaches in the six months since the GDPR came into force. Last summer the ICO observed that many of the breaches being reported did not necessarily meet the threshold of risk; however, it does welcome the honesty and transparency coming from organisations under legislation which is designed to strengthen rights for individuals.

With breaches required to be reported to the ICO within 72 hours of an organisation becoming aware of them, it is vital that mechanisms are in place internally for employees to understand how to report a breach and complete a risk assessment within the appropriate time-frame to assess whether it is reportable. If you would like any help compiling a data breach policy or risk assessment framework tailored to your organisation, please get in touch.


Employees fined for unlawfully obtaining data

25 March 2019

It is not just businesses that need to worry about the long arm of data protection, the Information Commissioner’s Office (“ICO”) has warned, after two employees were convicted of unlawfully accessing personal data and fined.

Both cases were prosecuted under section 55 of the Data Protection Act 1998 (now repealed), which states that a person must not knowingly or recklessly, without the permission of the data controller, access or disclose personal data. (A similar provision can now be found in section 170 of the Data Protection Act 2018.)

In the first case, an NHS employee with access rights to personal records viewed the data of several family members and children known to her without a professional need to do so. She admitted to offences under section 55 and was fined £1,000 (with a £50 victim surcharge), as well as being ordered to pay towards prosecution costs.

The second case concerned an employee who, before resigning from her role, forwarded several emails containing personal data of customers and other employees from her work account to her personal email account. Having also admitted offences under section 55, she was fined £200 (with a £30 victim surcharge) and ordered to contribute towards prosecution costs.

Implications

The second prosecution will be of particular interest to employers who face issues with employees taking customer or client information with them when they leave to go to a competitor. While carefully drafted restrictive covenants and ongoing confidentiality obligations in the contract of employment are the first line of defence against such conduct, the enforcement of such terms can be expensive, fraught and uncertain.

The data protection offences committed in these two recent cases, and the ICO’s interest in prosecuting them, operate as an additional deterrent to those thinking of taking customer or client information with them when they leave. This is especially the case for individuals in regulated sectors such as law and finance, for whom any conviction could have a significant impact on their career. Employers should therefore consider warning employees explicitly about the criminal consequences of unlawfully obtaining personal data - and also that any such behaviour will be reported to the regulator with a view to prosecution.

It remains to be seen how many more cases like this will arise. Mike Shaw, who heads up the ICO’s criminal investigations team, has however emphasised that this will be an area of ongoing concern for the regulator:

“People expect that their personal information will be treated with respect and privacy. Unfortunately, there are those who abuse their position of trust and the ICO will take action against them for breaking data protection laws.”

For further information about workplace data privacy matters, please contact members of our data protection team.


Morrisons vindicated: A landmark judgment in data protection and vicarious liability

DWF acted for Wm Morrison Supermarkets in their successful defence of a group action for vicarious liability arising out of a mass employee data theft perpetrated by a rogue employee. It is the first mass data breach claim of its kind before the Courts.

The claim for direct fault-based liability was successfully defended at the original trial. Morrisons was found to have met all the relevant statutory data protection standards and did not foresee, nor could it reasonably have foreseen, the covert criminal enterprise its rogue employee had embarked upon. However, Morrisons was found liable as employer on a no-fault vicarious liability basis. In Morrisons' successful appeal, the Supreme Court clarified how the law of vicarious liability should be applied and in so doing reversed the High Court and Court of Appeal decisions against Morrisons.


Summary of the case

In November 2013, Morrisons gave one of its senior internal auditors, Andrew Skelton, access to its payroll data for around 126,000 individuals so he could provide it securely to Morrisons' external auditors during the statutory audit process. In March 2014, Morrisons became aware that the payroll data of 100,000 of its current and former employees from that database had been put online and sent to three newspapers under the guise of an anonymous concerned person. Morrisons promptly had the data removed from the websites on which it appeared, informed the ICO, the police and other agencies, and launched its own enquiries. Morrisons wrote to all 126,000 individuals, and to everyone who had been employed since, to inform them whether their personal data was affected and of the ID protection which Morrisons arranged, at huge cost, to make available to them.

Following a police investigation which identified Skelton as the culprit, Skelton was charged with a number of offences and at his criminal trial was convicted and sentenced to a lengthy prison term of 8 years. The trial established that Skelton, who was skilled in IT, had devised his criminal plan out of a desire to harm Morrisons, against whom he bore an irrational grudge following a minor and unrelated disciplinary incident some months earlier which had resulted in him receiving a verbal warning. Significantly, having taken an unauthorised copy of the payroll data, Skelton sought to conceal his identity and distance himself from his employer: (i) he effected the online disclosure at home; (ii) he used a 'burner' phone; (iii) he set up an email account with credentials which pointed to a colleague against whom he also bore a grudge for the colleague's role in the earlier disciplinary matter; (iv) he used 'The Onion Router' web browser to conceal the identity of his computer; and (v) he wrote anonymously to the newspapers posing as a concerned individual who had found the data online.

A Group Action was launched against Morrisons for direct liability under the Data Protection Act 1998, the tort of misuse of private information and the equitable remedy of breach of confidence. In the alternative, it was claimed that Morrisons was vicariously liable for the unlawful acts of Skelton. Whilst over 9,000 affected individuals joined the group action, the greater significance lay in the potential for any and all of the 100,000 affected employees, whether they joined the group or not, to rely on the Court's finding against Morrisons and seek damages.

The claim, if successful, would have been hugely costly, and for many smaller companies, bodies, charities and local authorities which employ large numbers of people, such a claim would be potentially ruinous. Hence the claim was closely watched by industry and insurers alike.

The claims of direct fault-based liability were dismissed at the trial. The ICO, to whom Morrisons had promptly reported the incident, made no adverse finding against Morrisons following its investigations.

Takeaway points from the Supreme Court

  • The mere fact that an employee's employment provides the opportunity to commit the wrongful act is not sufficient to warrant the imposition of vicarious liability; and
  • regard must be had to whether the employee was engaged, however misguidedly, in furthering his employer’s business, or whether the employee was engaged solely in pursuing his own interests: in the time-honoured phrase, on a ‘frolic of his own’.

The Supreme Court explored these points.  Here is a summary of how the conclusions were reached.

1. The field of activities

The Supreme Court held that, contrary to the lower courts, the disclosure did not fall within the field of Skelton's employed activities. It was true that he had been asked to disclose the data to the statutory auditors, but his criminal act of copying the data and disclosing it deliberately to harm Morrisons was not within, or sufficiently closely connected to, his authorised duties.  

2. Close connection test

The lower courts relied heavily on Lord Toulson's judgment in one of the leading cases, Mohamud v WM Morrison Supermarkets [2016] (note the irony there). They emphasised the seamless chain of events starting with entrusting the data to Skelton, leading ultimately to his criminal disclosure of it. Lord Toulson had set out how an employer may be vicariously liable for an employee's actions if there is a "seamless and continuous sequence of events … an unbroken chain" between the employee's conduct and their employment. The Supreme Court found that the lower courts had misunderstood and misapplied Lord Toulson's judgment: "[A]lthough there was a close temporal link and an unbroken chain of causation linking the provision of the data to Skelton for the purpose of transmitting it to [the external auditors] and his disclosing it on the Internet, a temporal or causal connection does not in itself satisfy the close connection test." The Court distinguished between cases where the employment merely offered the opportunity to commit the wrong, and cases where the wrong itself was perpetrated in the context of the employee doing their job.

The lower courts had rejected Morrisons' argument that it would be perverse for the Courts to visit liability on Morrisons, the intended victim of the crime, so that it could compensate the other collateral victims of Skelton (none of whom had claimed to have suffered actual financial loss). The trial judge confessed to being troubled at the prospect of the Court furthering the criminal wrongdoer's goal. However, Lord Toulson had commented in Mohamud that "motive was irrelevant". The Supreme Court found that Lord Toulson's judgment had been misapplied: the reason why Skelton "acted wrongfully was not irrelevant: on the contrary, whether he was acting on his employer’s business or for purely personal reasons was highly material", and it was "abundantly clear that Skelton was not engaged in furthering his employer’s business when he committed the wrongdoing in question. On the contrary, he was pursuing a personal vendetta, seeking vengeance for the disciplinary proceedings some months earlier. In those circumstances […] Skelton’s wrongful conduct was not so closely connected with acts which he was authorised to do that, for the purposes of Morrisons’ liability to third parties, it can fairly and properly be regarded as done by him while acting in the ordinary course of his employment." The Supreme Court clarified the principles of vicarious liability within its existing boundaries and found, entirely contrary to the lower courts, that to impose vicarious liability on Morrisons would constitute "a major change in the law". No doubt welcome relief for data controllers everywhere.

Vicarious Liability within the data protection world

Whilst these issues apply to all vicarious liability cases, it is worth noting that this case revolves around data protection, the duties of a data controller, the rights of data subjects and the liability of an employer data controller for the activities of another data controller who happens to be its employee.   In obiter remarks, the Supreme Court held that the principle of vicarious liability can be applied to claims under the Data Protection Act 1998 (and by extension the GDPR and Data Protection Act 2018).  

The DWF team was Andrew Harris (consultant), Michelle Maher (senior associate), Nicole Burton (director) and Elinor Webster (solicitor).  


Qualitative Research and the Data Protection Act 1998

Beck, J. (2002), "Qualitative Research and the Data Protection Act 1998", Qualitative Market Research, Vol. 5 No. 1. https://doi.org/10.1108/qmr.2002.21605aaf.001

ISSN: 1352-2752. Article publication date: 1 March 2002. Emerald Group Publishing Limited. Copyright © 2002, MCB UP Limited.

Introduction

The Data Protection Act 1998 is the UK's response to an EU Data Protection Directive designed to protect individual rights in the collection, processing and transferring of personal data. Similar responses are being produced all over Europe and they vary in severity from the relatively relaxed regimes proposed in Ireland and Sweden to the tough stance being taken by Italy and Greece.

The UK Act (the DPA) takes a minimalist approach and came fully into force on 23 October 2001. Like the Human Rights Act and market research codes of conduct, the DPA is principles-based and therefore open to interpretation. That interpretation is ultimately the responsibility of the Information Commissioner but has been influenced by direct and detailed discussions between the Commissioner's office and a market research industry taskforce in which the MRS (Market Research Society) and AQR (Association of Qualitative Research) were represented.

What we now have is an Act that underpins and gives weight to our codes of conduct and, as the Information Commissioner, Elizabeth France, said at the Market Research Society's Research 2001 Conference, "The Data Protection Act does not stop you sharing processed information, it doesn't stop you disclosing it. It doesn't stop you using it outside the exemption for research but it makes you do it in a way that respects individuals … ".

What the Act contains

The 1998 Act covers all data collection or processing that is in any sense organised – by electronic or other means – and this includes personal data recorded on audio or video tape, so UK data protection now extends to all research methodologies – quantitative and qualitative. Most qualitative researchers and recruiters will need to notify (i.e. register) with the Information Commissioner, as they will be "controlling" personal data.

There are eight data protection principles enshrined in the Act that can be broadly summarised as follows:

1. Personal data must be processed fairly and lawfully.
2. Personal data can only be used for the specified and lawful purposes for which they were collected. (The specified purpose principle means that data cannot be processed after collection for any other purpose than that for which informed consent was given. So, for example, qualitative tapes cannot be passed to clients unless specific consent was obtained for that at the time of the research.)
3. Personal data shall be adequate, relevant and not excessive.
4. Personal data shall be accurate and kept up to date.
5. Personal data must not be kept beyond fulfilling the purpose for which they were collected. (This should mean an end to hanging on to old tapes and recruitment questionnaires!)
6. Personal data shall be processed in accordance with the rights of data subjects.
7. Personal data must be kept secure.
8. Personal data shall not be transferred outside the EEA unless adequate protections are in place. (This is something that international researchers, or researchers working on behalf of international clients, need to be aware of. The EEA is assumed to have adequate protection but many territories do not.)

For market researchers, the guiding construct underpinning the 1998 Act is that of informed consent, and this has two key components:

  • Transparency. Ensuring individuals have a very clear and unambiguous understanding of the purpose(s) of collecting the data and how it will be used.
  • Consent. At the time that the data is collected, individuals must give their consent to their data being collected and, also at that time, have the opportunity to opt out of any subsequent uses of the data.

What the Act means for qualitative research

The Act does carry new implications for every stage of qualitative research – from recruitment to handling primary data. It establishes respondents' rights as paramount and will require some changes to the way we do things. These will be reflected in revised MRS qualitative guidelines, covering the following.

Ensuring emotional wellbeing

At recruitment, respondents must be told (either verbally or through an invitation):

  • the subject of the discussion;
  • that it is for market research purposes;
  • how long the session will last;
  • if it is to take place in viewing facilities;
  • if it is to be audio- or video-recorded.

They should also be told:

  • if the session is likely to be viewed;
  • that they have a right to withdraw and withhold.

When the topic is judged to be sensitive, the subject must be explicitly communicated and the content of the discussion should be disclosed.

During interviewing, researchers must obtain:

  • permission to record the session;
  • explicit permission to release the data to a third party, with the purpose and other details clearly stated.

Primary data

The DPA is based on the right of respondents to know how their personal data will be used. Researchers have a responsibility to inform respondents accordingly and to ensure that the data will only be used in the way that respondents have been told it will be used. This has significant implications for the way qualitative researchers (and their clients) handle primary data.

Primary qualitative data include recruitment questionnaires, audio tapes, video tapes, transcripts (where individuals may be recognised by their turn of phrase and the universe might be small), hand-written notes containing personal data, projective material, attendance lists/signature lists, etc.

Respondents must give their informed consent in writing at some time during the research for any primary data to be handed to a third party. They must be told:

  • explicitly to whom the data will be passed;
  • who will see them;
  • what they will be used for.

Respondents must also give consent where data are to leave the country and must be told where they will be going. Researchers must ensure that any country to which personal data are transferred has adequate data protection measures in place. This is particularly important outside the EEA, where countries may have weaker data protection regimes. For example, data (including tapes and recruitment questionnaires) cannot be transferred to the USA unless:

  • data protection is safeguarded in the contract between client and researcher;
  • the US organisation involved has signed up to a Safe Harbour agreement; or
  • all respondents have given explicit written consent for the transfer to take place.

Getting full consent, during the research, is important because it is difficult (and potentially unlawful) to get permissions changed after the research has been completed.

Primary data collected in a market research project can only ever be used for market research purposes.

Primary data must be labelled with appropriate restrictions when handed over to a third party.

Researchers must ensure that the recipients, viewers, readers and listeners of the primary data are aware of the requirements of the DPA.

In order to comply with the new Act, qualitative researchers need to take some practical steps, including:

  • identifying someone responsible for data protection policy;
  • notifying the Information Commissioner and making sure that notification is as comprehensive as possible;
  • reviewing all information supplied to respondents on paper or verbally – including invitations, introductions, consent forms, data release forms, etc. – to make sure that they meet all the requirements for transparency and consent;
  • reviewing contracts and terms of business to make sure that researcher and client roles, responsibilities and access to primary data are specified;
  • ensuring that everyone involved – employees, suppliers, clients, etc. – knows what their data protection responsibilities are;
  • checking and improving data security.

A consultative draft containing these and more detailed guidance on issues like recruitment, observation of qualitative research, client anonymity and observational research is available from the Market Research Society or on the Code page of www.mrs.org.uk. There you will also find downloadable copies of The Data Protection Act 1998 and Market Research: Guidance for MRS Members, which also contains advice on notification.

This is a further step towards improving the professionalism of the UK qualitative research industry and the increasing emphasis on research by informed consent can only benefit the quality and authority of qualitative work.

Jennie Beck, Chair of the MRS Professional Standards Committee, Beck Consultancy, Muswell Hill, London. [email protected]


Top 10 Privacy and Data Protection Cases of 2018: a selection


  • Cliff Richard v. The British Broadcasting Corporation [2018] EWHC 1837 (Ch)

This was Sir Cliff Richard’s privacy claim against the BBC and was the highest profile privacy case of the year. The claimant was awarded damages of £210,000. We had a case preview and case reports on each day of the trial, and posts from a number of commentators including Paul Wragg, Thomas Bennett (first and second) and Jelena Gligorijević. The BBC subsequently announced that it would not seek permission to appeal.

  • ABC v Telegraph Media Group Ltd [2018] EWCA Civ 2329

This was perhaps the second most discussed privacy case of the year. The Court of Appeal allowed the claimants’ appeal and granted an interim injunction to prevent the publication of confidential information about alleged “discreditable conduct” by a high profile executive. Lord Hain subsequently named the executive as Sir Philip Green. We had a case comment from Persephone Bridgman Baker, and comments criticising Lord Hain’s conduct from Paul Wragg, Robert Craig and Tom Double.

  • Ali v Channel 5 Broadcast [2018] EWHC 298 (Ch)

The claimants had featured in a “reality TV” programme about bailiffs, “Can’t Pay? We’ll Take It Away!”. Their claim for misuse of private information was successful and damages of £20,000 were awarded. We had a case comment from Zoe McCallum. An appeal and cross-appeal were heard on 4 December 2018 and judgment is awaited.

  • NT1 and NT2 v Google Inc [2018] 3 WLR 1165.

This was the first “right to be forgotten” claim in the English Courts – with claims in both data protection and privacy. Both claimants had spent convictions – one was successful and the other was not. We had a case preview from Aidan Wills and a comment on the case from Iain Wilson.

  • Lloyd v Google LLC [2018] EWHC 2599 (QB)

This was an attempt to bring a “representative action” in data protection on behalf of all iPhone users in respect of the “Safari Workaround”. The representative claimant was refused permission to serve Google out of the jurisdiction. We had a case comment from Rosalind English, and there was a Panopticon Blog post on the case. The claimant has been given permission to appeal and it is likely that the appeal will be heard in late 2019.

  • TLU v Secretary of State for the Home Department [2018] EWCA Civ 2217

The Court of Appeal dismissed an appeal in a “data leak” case on the issue of liability to individuals affected by a data leak but not specifically named in the leaked document. We had a case comment from Lorna Skinner and further comment from Iain Wilson. There was also a Panopticon Blog post.

  • Stunt v Associated Newspapers [2018] EWCA Civ 170

The Court of Appeal referred the question of whether the “journalistic exemption” in section 32(4) of the Data Protection Act 1998 is compatible with the Data Protection Directive and the EU Charter of Fundamental Rights to the CJEU.  There was a Panopticon Blog post on the case.

  • Various Claimants v W M Morrison Supermarkets plc [2018] EWCA Civ 2339

The Court of Appeal upheld the decision of Langstaff J that Morrisons was vicariously liable for a mass data breach caused by the criminal act of a rogue employee. We had a case comment from Alex Cochrane. There was also a Panopticon Blog post on the case.

  • Big Brother Watch v. Secretary of State [2018] ECHR 722

An important case in which the European Court of Human Rights held that secret surveillance regimes including the bulk interception of external communications violated Articles 8 and 10 of the Convention. We had a post by Graham Smith as to the implications of this decision for the present regime.

  • ML and WW v Germany [2018] ECHR 554

This was the first case in the European Court of Human Rights on the “right to be forgotten”: an application under Article 8 in respect of the historic publication by the media of information concerning a murder conviction. The application was dismissed. We had a case comment from Hugh Tomlinson and Aidan Wills, and there was also a Panopticon blog post on the case.

Share this:

Caselaw , Data Protection , Privacy

2018 Top 10 Privacy and Data Protection Cases

' src=

January 29, 2019 at 6:25 am

Reblogged this on | truthaholics and commented: “In this post we round up some of the most legally and factually interesting privacy and data protection cases from England and Europe from the past year.”

' src=

January 29, 2019 at 9:38 am

Reblogged this on tummum's Blog .

' src=

February 2, 2019 at 12:27 am

Very Nice and informative data…keep the good work going on

3 Pingbacks

  • Top 10 Privacy and Data Protection Cases of 2020: a selection – Suneet Sharma – Inforrm's Blog
  • Top 10 Privacy and Data Protection Cases of 2021: A selection – Suneet Sharma – Inforrm's Blog
  • Top 10 Privacy and Data Protection Cases 2022, a selection – Suneet Sharma – Inforrm's Blog

Leave a Reply Cancel reply

data protection act 1998 case study

Contact the Inforrm Blog

Inforrm  can be contacted by email [email protected]

Email Subscription

Enter your email address to subscribe to this blog and receive notifications of new posts by email.

Email Address:

Sign me up!

Media Law Employment Opportunities

Schillings Senior Associate

Schillings Associate

Good Law Practice, Defamation Lawyer

Brett Wilson, NQ – 4 years’ PQE solicitor

Mishcon de Reya, Associate Reputation Protection, 1-4 PQE

Slateford, NQ – 2 years’ PQE solicitor

  • Top 10 Defamation Cases 2022: a selection - Suneet Sharma
  • Top 10 Privacy and Data Protection Cases of 2021: A selection - Suneet Sharma
  • Top 10 Defamation Cases of 2023: a selection - Suneet Sharma
  • Law and Media Round Up - 8 April 2024
  • Global Freedom of Expression, Columbia University: Newsletter, 13 April 2024

Recent Judgments

  • Artificial Intelligence
  • Broadcasting
  • Cybersecurity
  • Data Protection
  • Freedom of expression
  • Freedom of Information
  • Government and Policy
  • Human Rights
  • Intellectual Property
  • Leveson Inquiry
  • Media Regulation
  • New Zealand
  • Northern Ireland
  • Open Justice
  • Philippines
  • Phone Hacking
  • Social Media
  • South Africa
  • Surveillance
  • Uncategorized
  • United States

Search Inforrm’s Blog

  • Alternative Leveson 2 Project
  • Blog Law Online
  • Brett Wilson Media Law Blog
  • Canadian Advertising and Marketing Law
  • Carter-Ruck's News and Insights
  • Cearta.ie – The Irish for Rights
  • Centre for Internet and Society – Stanford (US)
  • Clean up the Internet
  • Cyberlaw Clinic Blog
  • Cyberleagle
  • Czech Defamation Law
  • David Banks Media Consultancy
  • Defamation Update
  • Defamation Watch Blog (Aus)
  • Droit et Technologies d'Information (France)
  • Fei Chang Dao – Free Speech in China
  • Guardian Media Law Page
  • Hacked Off Blog
  • Information Law and Policy Centre Blog
  • Internet & Jurisdiction
  • Internet Cases (US)
  • Internet Policy Review
  • Journlaw (Aus)
  • LSE Media Policy Project
  • Media Reform Coalition Blog
  • Media Report (Dutch)
  • Michael Geist – Internet and e-commerce law (Can)
  • Musings on Media (South Africa)
  • Paul Bernal's Blog
  • Press Gazette Media Law
  • Scandalous! Field Fisher Defamation Law Blog
  • Simon Dawes: Media Theory, History and Regulation
  • Social Media Law Bulletin (Norton Rose Fulbright)
  • Strasbourg Observers
  • Transparency Project
  • UK Constitutional Law Association Blog
  • Zelo Street

Blogs about Privacy and Data Protection

  • Canadian Privacy Law Blog
  • Data Matters
  • Data protection and privacy global insights – pwc
  • DLA Piper Privacy Matters
  • Données personnelles (French)
  • Europe Data Protection Digest
  • Mass Privatel
  • Norton Rose Fulbright Data Protection Report
  • Panopticon Blog
  • Privacy and Data Security Law – Dentons
  • Privacy and Information Security Law Blog – Hunton Andrews Kurth
  • Privacy Europe Blog
  • Privacy International Blog
  • Privacy Lives
  • Privacy News – Pogo was right
  • RPC Privacy Blog
  • The Privacy Perspective

Blogs about the Media

  • British Journalism Review
  • Jon Slattery – Freelance Journalist
  • Martin Moore's Blog
  • Photo Archive News

Blogs and Websites: General Legal issues

  • Carter-Ruck Legal Analysis Blog
  • Human Rights in Ireland
  • Human Rights Info
  • ICLR Case Commentary
  • Joshua Rozenberg Facebook
  • Law and Other Things (India)
  • Letters Blogatory
  • Mills and Reeve Technology Law Blog
  • Open Rights Group Blog
  • RPC's IP Hub
  • RPC's Tech Hub
  • SCOTUS Blog
  • The Court (Canadian SC)
  • The Justice Gap
  • UK Human Rights Blog
  • UK Supreme Court Blog

Court, Government, Regulator and Other Resource Sites

  • Australian High Court
  • Canadian Supreme Court
  • Commonwealth Legal Information Institute
  • Cour De Cassation France
  • European Data Protection Board
  • Full Fact.org
  • German Federal Constitutional Court
  • IMPRESS Project
  • Irish Supreme Court
  • New Zealand Supreme Court
  • NSW Case Law
  • Press Complaints Commission
  • Press Council (Australia)
  • Press Council (South Africa)
  • South African Constitutional Court
  • UK Judiciary
  • UK Supreme Court
  • US Supreme Court

Data Protection Authorities

  • Agencia Española de Protección de Datos (in Spanish)
  • BfDI (Federal Commissioner for Data Protection)(in German)
  • CNIL (France)
  • Danish Data Protection Agency
  • Data Protection Authority (Belgium)
  • Data Protection Commission (Ireland)
  • Dutch Data Protection Authority
  • Information Commissioner's Office
  • Italian Data Protection Authority
  • Scottish Information Commissioner
  • Swedish Data Protection Authority

Freedom of Expression Blogs and Sites

  • Backlash – freedom of sexual expression
  • Council of Europe – Freedom of Expression
  • EDRi – Protecting Digital Freedom
  • Free Word Centre
  • Freedom House Freedom of Expression
  • Freedom of Expression Institute (South Africa)
  • Guardian Freedom of Speech Page
  • Index on Censorship

Freedom of Information Blogs and Sites

  • All About Information (Can)
  • Campaign for Freedom of Information
  • David Higgerson
  • FreedomInfo.org
  • Open and Shut (Aus)
  • Open Knowledge Foundation Blog
  • The Art of Access (US)
  • The FOIA Blog (US)
  • The Information Tribunal
  • UCL Constitution Unit – FOI Resources
  • US Immigration, Freedom of Information Act and Privacy Act Facts
  • Veritas – Zimbabwe
  • Whatdotheyknow.com

Inactive and Less Active Blogs and Sites

  • #pressreform
  • Aaronovitch Watch
  • Atomic Spin
  • Bad Science
  • Banksy's Blog
  • Brown Moses Blog – The Hackgate Files
  • California Defamation Law Blog (US)
  • CYB3RCRIM3 – Observations on technology, law and lawlessness.
  • Data Privacy Alert
  • Defamation Lawyer – Dozier Internet Law
  • DemocracyFail
  • Entertainment & Media Law Signal (Canada)
  • Forty Shades of Grey
  • Greenslade Blog (Guardian)
  • Head of Legal
  • Heather Brooke
  • IBA Media Law and Freedom of Expression Blog
  • Information and Access (Aus)
  • Informationoverlord
  • ISP Liability
  • IT Law in Ireland
  • Journalism.co.uk
  • Korean Media Law
  • Legal Research Plus
  • Lex Ferenda
  • Media Law Journal (NZ)
  • Media Pal@LSE
  • Media Power and Plurality Blog
  • Media Standards Trust
  • Nied Law Blog
  • No Sleep 'til Brooklands
  • Press Not Sorry
  • Primly Stable
  • Responsabilidad En Internet (Spanish)
  • Socially Aware
  • Story Curve
  • Straight Statistics
  • Tabloid Watch
  • The IT Lawyer
  • The Louse and The Flea
  • The Media Blog
  • The Public Privacy
  • The Sun – Tabloid Lies
  • The Unruly of Law
  • UK FOIA Requests – Spy Blog
  • UK Freedom of Information Blog

Journalism and Media Websites

  • Campaign for Press and Broadcasting Freedom
  • Centre for Law, Justice and Journalism
  • Committee to Protect Journalists
  • Council of Europe – Platform to promote the protection of journalism and safety of journalists
  • ECREA Communication Law and Policy
  • Electronic Privacy Information Centre
  • Ethical Journalism Network
  • European Journalism Centre
  • European Journalism Observatory
  • Frontline Club
  • Hold the Front Page
  • International Federation of Journalists
  • Journalism in the Americas
  • Media Wise Trust
  • New Model Journalism – reporting the media funding revolution
  • Reporters Committee for Freedom of the Press
  • Reuters Institute for the Study of Journalism
  • Society of Editors
  • Sports Journalists Association
  • Spy Report – Media News (Australia)
  • The Hoot – the Media in the Sub-Continent

Law and Media Tweets

  • 1stamendment
  • DanielSolove
  • David Rolph
  • FirstAmendmentCenter
  • Guardian Media
  • Heather Brooke (newsbrooke)
  • humanrightslaw
  • Internetlaw
  • jonslattery
  • Kyu Ho Youm's Media Law Tweets
  • Leanne O'Donnell
  • Media Law Blog Twitter
  • Media Law Podcast
  • Siobhain Butterworth

Media Law Blogs and Websites

  • 5RB Media Case Reports
  • Ad IDEM – Canadian Media Lawyers Association
  • Entertainment and Sports Law Journal (ESLJ)
  • Gazette of Law and Journalism (Australia)
  • International Media Lawyers Association
  • Legalis.Net – Jurisprudence actualite, droit internet
  • Office of Special Rapporteur on Freedom of Expression – Inter American Commission on Human Rights
  • One Brick Court Cases
  • Out-law.com
  • EthicNet – collection of codes of journalism ethics in Europe
  • Handbook of Reuters Journalism
  • House of Commons Select Committee for Culture Media and Sport memoranda on press standards, privacy and libel

US Law Blogs and Websites

  • Above the Law
  • ACLU – Blog of Rights
  • Blog Law Blog (US)
  • Chilling Effects Weather Reports (US)
  • Citizen Media Law Project
  • Courthousenews
  • Entertainment and Law (US)
  • Entertainment Litigation Blog
  • First Amendment Center
  • First Amendment Coalition (US)
  • Free Expression Network (US)
  • Internet Cases – a blog about law and technology
  • Jurist – Legal News and Research
  • Legal As She Is Spoke
  • Media Law Prof Blog
  • Media Legal Defence Initiative
  • Newsroom Law Blog
  • Shear on Social Media Law
  • Student Press Law Center
  • Technology and Marketing Law Blog
  • The Hollywood Reporter
  • The Public Participation Project (Anti-SLAPP)
  • The Thomas Jefferson Centre for the Protection of Free Expression
  • The Volokh Conspiracy

US Media Blogs and Websites

  • ABA Media and Communications
  • Accuracy in Media Blog
  • Columbia Journalism Review
  • County Fair – a blog from Media Matters (US)
  • Fact Check.org
  • Media Gazer
  • Media Law – a blog about freedom of the press
  • Media Matters for America
  • Media Nation
  • Nieman Journalism Lab
  • Pew Research Center's Project for Excellence in Journalism
  • Regret the Error
  • Reynolds Journalism Institute Blog
  • Stinky Journalism.org


The following is a list of case studies, by year, as featured in Annual Reports published by this Office. These case studies provide an insight into some of the issues that this Office investigates on a day to day basis. For ease of reference, some of the case studies have been indexed by categories below.

  • Prosecution of Guerin Media Limited
  • Prosecution of AA Ireland Limited
  • The Dublin Mint Office Limited
  • Access Request made to NAMA
  • Disclosure of CCTV footage from a direct provision centre
  • The importance of data controllers having appropriate mechanisms in place to respond to access requests and document compliance

1)   Case Study 1: Prosecution of Guerin Media Limited

The DPC received unrelated complaints from three individuals about unsolicited marketing emails that they had received from Guerin Media Limited. In all cases, the complainants received the marketing emails to their work email addresses. None of the complainants had any previous business relationship with Guerin Media Limited. The marketing emails did not provide the recipients with an unsubscribe function or any other means to opt out of receiving such communications. Some of the complainants replied to the sender requesting that their email address be removed from the company’s marketing list. However, these requests were not actioned and the company continued to send the individuals further marketing emails. In one case, nine marketing emails were sent to an individual’s work email address after he had sent an email request to Guerin Media Limited to remove his email address from its mailing list.

The DPC’s investigation into these complaints established that Guerin Media Limited did not have the consent of any of the complainants to send them unsolicited marketing emails and that it had failed in all cases to include an opt-out mechanism in its marketing emails.

The DPC had previously received four similar complaints against Guerin Media Limited during 2013 and 2014 in which the company had also sent unsolicited marketing emails without having the consent of the recipients to receive such communications and where the emails in question did not contain an opt-out mechanism. On foot of the DPC’s investigations at that time, the DPC warned Guerin Media Limited that it would likely face prosecution by the DPC if there was a recurrence of such breaches of the E-Privacy Regulations. Taking account of the previous warning and the DPC’s findings in its current investigation, the DPC decided to prosecute Guerin Media Limited for 42 separate breaches of the E-Privacy Regulations.

The prosecutions came before Naas District Court on 5 February 2018 and the company pleaded guilty to four sample charges out of the total of 42 charges. Three of the sample charges related to breaches of Regulation 13(1) of the E-Privacy Regulations for sending unsolicited marketing emails to individuals without their consent. The fourth sample charge related to a breach of Regulation 13(12)(c) of the E-Privacy Regulations for failure to include an opt-out mechanism in the marketing emails. The Court convicted Guerin Media Limited on all four charges and imposed four fines each of €1,000, i.e. a total of €4,000. The company was given a period of six months in which to pay the fine. It also agreed to make a contribution towards the prosecution costs incurred by the DPC.

Marketing to work email addresses

There is a common misconception that the sending of email communications to individuals at a work email address is a form of business-to-business communication for which the consent of the individual is not required. The E-Privacy Regulations provide a carve-out from the default rule (i.e. that the sending organisation must have the consent of the receiving individual) which allows such communications to be sent to an email address that reasonably appears to be one used by a person in the context of their commercial or official activity. However, in order to rely on this exception to the general rule requiring consent, the sender must be able to show that the email sent related solely to the recipient’s commercial or official activity, in other words, that it was a genuine business-to-business communication. In effect, this means that marketing material that is directly relevant to the role of the recipient in the context of their commercial or official activity (i.e. within their workplace) may be sent by an organisation without the prior consent of the recipient. That was not the case here: the marketing communications sent by Guerin Media Limited related to attempts by that company to sell advertisement space in various publications and to sell stands at exhibitions, and none of the individual complainants who received those communications had any marketing-related role within their own workplaces.

While not directly applicable here, as the complainants were all individuals, organisations should also take note of a further rule in the E-Privacy Regulations concerning situations where the recipient of an unsolicited direct marketing communication is not an individual (e.g. the email address used is a solely company/corporate one and does not relate to the email account of an individual, whether at work or otherwise). In such a case where the company/ corporate recipient notifies the sender that it does not consent to receiving such emails, it is unlawful for the sender to subsequently send such emails.

This case is an important demonstration that any organisation engaging in electronic direct marketing should carefully establish, and be able to demonstrate, the basis on which it considers that the primary default rule requiring a sending organisation to have the consent of the recipient does not apply to it in any given case. The case also illustrates the importance of including an opt-out mechanism in each and every electronic direct marketing communication, as failure to do so constitutes a separate offence (in addition to any offence arising from a failure to obtain consent) in respect of each such email or message.
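
For illustration only, the consent logic described above can be expressed as a short decision sketch. The Python below is a hypothetical simplification of how an organisation might pre-check a proposed marketing email against the rules discussed in this case study (the default consent requirement, the business-to-business carve-out, the corporate-recipient objection rule and the mandatory opt-out mechanism). The field and function names are invented for the example, and the sketch is no substitute for the text of the E-Privacy Regulations or for legal advice.

    # Hypothetical sketch only: simplified decision logic for the rules discussed above.
    # Field names are invented for illustration and do not reproduce the wording of the
    # E-Privacy Regulations.
    from dataclasses import dataclass

    @dataclass
    class ProposedMarketingEmail:
        recipient_is_individual: bool            # sent to a person's mailbox (work or personal)
        recipient_has_consented: bool            # prior consent obtained from that person
        relates_solely_to_recipients_role: bool  # genuine business-to-business content
        corporate_recipient_has_objected: bool   # a purely corporate mailbox has opted out
        contains_opt_out_mechanism: bool         # unsubscribe option included in the message

    def may_send(email: ProposedMarketingEmail) -> bool:
        """Return True only if, on this simplified reading, sending appears permissible."""
        # Every direct marketing message must carry an opt-out mechanism;
        # omitting it is a separate offence for each message sent.
        if not email.contains_opt_out_mechanism:
            return False
        if email.recipient_is_individual:
            # Default rule: consent is required, unless the message relates solely to the
            # recipient's commercial or official activity (the business-to-business carve-out).
            return email.recipient_has_consented or email.relates_solely_to_recipients_role
        # Purely corporate recipient: sending becomes unlawful once it has objected.
        return not email.corporate_recipient_has_objected

    # Example: an unsolicited email to an individual's work address, with no relevant
    # business-to-business content and no opt-out mechanism, fails the check.
    assert not may_send(ProposedMarketingEmail(True, False, False, False, False))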

2)   Case Study 2: Prosecution of AA Ireland Limited

In December 2017 the DPC received a complaint from an individual who had received unsolicited marketing text messages from AA Ireland Limited. He informed the DPC that he had recently received his motor insurance renewal quotation from his current insurance provider and had decided to shop around for a more competitive quotation. One of the companies he telephoned for a quotation was AA Ireland Limited. The complainant informed the DPC that he had expressly stated to the agent who answered his call that he wanted an assurance that his details would not be used for marketing purposes, and that the agent had given him that assurance. The phone call continued with the agent providing a quotation. The quotation was higher than the renewal quotation from his current insurance provider and the complainant indicated to the agent that he would not be proceeding with the quotation offered by AA Ireland Limited. The complainant informed the DPC that at this point in the call he had reiterated to the agent that he should not receive marketing material and he was once again assured by the agent that this would not happen.

The essence of the complaint, however, was that the day after the phone call in question he had received a marketing text message from AA Ireland Limited offering him €50 off the quote provided. A further similar text message was sent to his mobile phone one day later. The complainant stated in his complaint that he felt this action was a blatant breach of his very clear and precise instructions that he did not wish to receive any marketing communications.

During the course of our investigation, AA Ireland Limited confirmed that it had sent both text messages to the complainant and admitted that it had not obtained consent to send these messages. The company acknowledged that the complainant had requested not to receive marketing messages, that his request should have been actioned and that his details should not have been used for marketing purposes. The company claimed that the incident arose as a result of human error: the correct process had not been followed by the agent, so the complainant’s details had been recorded with an opt-in to receive marketing messages, resulting in marketing text messages being sent to him.

As the DPC had previously issued a warning in separate circumstances to AA Ireland Limited in relation to unsolicited marketing communications, in this instance the DPC decided to initiate prosecution proceedings. At Dublin Metropolitan District Court on 14 May 2018 AA Ireland Limited entered a guilty plea to one offence. It also agreed to cover the prosecution costs incurred by the DPC. In lieu of a conviction and fine, the Court applied Section 1(1) of the Probation of Offenders Act.

3) Case Study 3:   The Dublin Mint Office Limited

The DPC received a complaint on 13 October 2017 from an individual who had received two marketing telephone calls that same day, one targeted at him and one at his son, from The Dublin Mint Office Limited. The caller in each case had attempted to sell commemorative coins. In his complaint, the complainant explained that he had registered online a few months earlier with the company for an online offer on his own behalf and on behalf of his son, providing the same telephone contact number for both of them during the online registration process. The complainant stated that he ticked the marketing opt-out box during that online registration process.

During the course of the DPC’s investigation, The Dublin Mint Office Limited admitted that it had made the marketing telephone calls. It explained that when the complainant supplied his telephone number during the online application process in May 2017, the order form had offered only an opt-in option for marketing mails and emails. The company confirmed that the complainant had not selected the opt-in option and that he was therefore marked as opted out of marketing mails and emails only. The company explained that, owing to a gap in the system in place at the time, this preference captured only marketing mails and emails and did not operate as an opt-out from telesales calls. As a result, the complainant’s details were included in a list for a follow-up telesales call. The company informed the DPC that it had written to the complainant to apologise for the inconvenience caused to him and to his son by its inadvertent mistake.

The DPC had previously issued a warning to The Dublin Mint Office Limited in September 2017 concerning other complaints which had been made to the DPC concerning unsolicited marketing communications by the company. The DPC therefore decided to prosecute The Dublin Mint Office Limited. At Dublin Metropolitan District Court on 14 May 2018 the company pleaded guilty to two charges in relation to both marketing telephone calls. It also agreed to cover the DPC’s prosecution costs. In lieu of a conviction and fine, the Court applied Section 1(1) of the Probation of Offenders Act.

4)   Case Study 4: Access Request made to NAMA.

In February 2018, the DPC issued a decision on a complaint which had been made to it by two individuals against the National Asset Management Agency (NAMA). The complaint concerned allegations of non-compliance with a joint access request which had been made to NAMA in September 2014 by the complainants who were the directors and/or shareholders of a number of companies whose loans had transferred to NAMA. Certain personal loans of those individuals had also transferred to NAMA. The joint access request which had been made to NAMA expressly referenced personal data held by NAMA in connection with both the personal loans and the company loans.

NAMA responded to the complainants in October 2014, asking them to identify which of a number of categories of personal data (which NAMA itself had identified) that they wished to receive. The complainants replied, objecting to the manner in which NAMA’s response had sought to limit the scope of the request. NAMA subsequently provided the complainants with a copy of the personal data which it considered the complainants were entitled to but noted that it was not required to provide personal data which was subject to legal privilege, which comprised confidential expressions of opinion or which would prejudice the interests of NAMA in respect of a claim or which would prejudice the ability of NAMA to recover monies owed to the State. However, NAMA did not identify the personal data in respect of which it considered such exemptions from the right of access applied. While the personal data provided by NAMA to the complainants related to the personal loans of the complainants which had previously transferred to NAMA, it did not include personal data relating to the complainants as directors and/or shareholders in the companies whose loans had transferred to NAMA.

Complaint to the DPC

The data subjects subsequently made a complaint to the DPC which alleged:

  • that NAMA had failed to provide all of the complainants’ personal data to them;
  • that NAMA had incorrectly applied exemptions under the Data Protection Acts 1988 and 2003;
  • that even if NAMA was entitled to rely on one or more exemptions, it was obliged to provide the complainants with a description of the personal data so that they had a reasonable and fair opportunity to consider whether it did fall under an exemption; and
  • that NAMA had failed to conduct searches for personal data relating to ten additional categories of information identified by the complainants.

NAMA’s position on the complaint

NAMA stated that it had fully complied with the access request. Following an exchange of correspondence with the DPC, NAMA contended:

  • that “corporate data”, i.e. information relating to the loans of the companies linked to the complainants did not fall within the definition of “personal data”;
  • that it was released from its obligations to provide access to personal data contained within the totality of the records held in relation to both the personal loans and the company loans, on the basis that conducting such searches would require ‘disproportionate effort’ on the part of NAMA to do so; and
  • that it was appropriate for NAMA to rely on statutory exemptions to the right of access, as provided under Sections 5(1)(a), 5(1)(f) and 5(1)(g) of the Data Protection Acts 1988 and 2003.

DPC Investigation

In a submission to the DPC, NAMA provided estimates of the number of relevant records it held, and the potential financial cost of completing a comprehensive search for all personal data requested. NAMA confirmed that it had not conducted searches for the complainants’ personal data held in relation to company loans.

In order to substantiate its position, NAMA agreed to conduct sample searches for personal data in respect of a particular two-month period. Authorised officers on behalf of the DPC conducted three on-site investigations at NAMA premises to corroborate NAMA’s position on issues relating to its searches. Following a review of the sample searches carried out, DPC officers were not satisfied that a comprehensive search would involve a disproportionate effort on the part of NAMA, or that information held by NAMA relating to the complainants’ company loans did not also contain personal data of the complainants.

Following engagement between the DPC and NAMA, additional personal data was released to the complainants. However, efforts to resolve the matter informally were to no avail. The DPC subsequently issued a lengthy statutory decision, running to some 67 pages, in relation to the complaint. This decision addressed the three core issues referred to above. The DPC’s findings on each of these issues were as follows.

(1) The Corporate Data Issue

While NAMA acknowledged that the complainants’ names appeared in records relating to the company loans, reflecting that they were directors and/or shareholders of the companies in question, and while NAMA accepted that the complainants’ names were their personal data, it contended that this did not make the other information in those records their personal data. The complainants’ position, meanwhile, was that there was no doubt that information relating to a person in their capacity as a company director could constitute personal data. They also pointed to the fact that information referencing an assessment of their performance or conduct, or the evaluation of their assets, constituted personal data even if it was concerned with company loans or the business of those companies. The complainants also contended that while records in relation to the company loans and their personal loans were held separately, the reality was that all of NAMA’s dealings with them were interconnected.

The DPC in her decision noted that the mere fact of one of the complainants’ names appearing in records relating to the company loans (for example if they had simply signed a commercial agreement in their capacity as director of a company) was not sufficient in and of itself for other information in that agreement to constitute personal data. However, the records which had been identified through the sample searches bore out the complainants’ contentions that those records could not be assumed to contain no personal data at all. The DPC noted by way of example that it was clear from a document, the title of which referred to a NAMA board meeting, that while the board meeting had discussed and considered a business plan referable to one of the companies, there was information in that document relating to the financial assets of the complainants in their personal capacities. The DPC accepted the complainants’ position that the records held by NAMA regarding the company loans contained at least some personal data relating to them. The DPC therefore considered that NAMA must, at the very least, identify the records or types of records in which the complainants were identified by name or otherwise but which NAMA considered did not constitute personal data, and provide sufficient information for the complainants to understand why it was said that those records or types of records did not constitute or contain personal data.

(2) The Disproportionate Effort Issue

The DPC then considered whether the time and money costs involved in NAMA conducting searches of the records held in relation to the company loans would be disproportionate relative to the amount of personal data that might be found and disclosed to the complainants. The DPC noted that while there is no express obligation on a data controller to search for personal data in order to comply with a properly made access request, she accepted that there was an implied obligation on a data controller to undertake searches so as to identify what personal data it might hold on a requester. The question for consideration concerned the nature and extent of this implied duty. The DPC noted that the disproportionate effort qualification found in Section 4(9)(a) of the Data Protection Acts 1988 and 2003, on the face of that provision, applied only to limit the obligation to provide the data constituting the personal data in permanent form; it did not limit the earlier steps in the process, such as the obligation to search for the data. While the DPC referred to jurisprudence from the UK courts which has established that the implied obligation to search for personal data is limited to a reasonable and proportionate search, she noted that she was not aware of any judicial authority in Ireland dealing with the nature or extent of a controller’s obligations to conduct searches in order to comply with Section 4 of the Data Protection Acts 1988 and 2003. While accepting that there was no obligation on her to recognise the principles established by the UK authorities, the DPC noted that one particularly pertinent decision to this effect (Deer v. University of Oxford) had previously been referenced by the Irish High Court (in the judgment of Coffey J. delivered on 26 February 2018 in the case of Nowak v. Data Protection Commissioner). The DPC considered that decision to be helpful in interpreting Sections 4(1) and 4(9) of the Data Protection Acts 1988 and 2003, particularly given its analysis of case law from the CJEU. On that basis the DPC accepted NAMA’s contention that the obligation to search for personal data was not without limits; rather, NAMA was required to undertake reasonable and proportionate searches to identify the personal data of the complainants which it held.

The DPC then went on to consider whether NAMA had discharged this obligation, by carrying out the type of balancing exercise contemplated in the UK case law between upholding the data subject’s right of access and the burden which it would impose on the data controller. In doing so, the DPC considered a number of factors bearing upon this balancing exercise, including the intrinsic significance of the personal data and its relative importance to the requesters. In this regard, the DPC noted that the personal data in question related to the business and financial interests of the complainants both personally and in respect of the companies of which they were directors and/or shareholders. It was also considered relevant (as was evident from the correspondence seen by the DPC’s officers) that the complainants were trying to bring about a situation in which the company loans would be dealt with by NAMA in a way that would ensure the survival of the companies and preserve the complainants’ ability to retain some level of ownership or control in those companies. Consequently, the DPC considered the personal data held by NAMA to be of significant importance to the complainants.

The DPC then considered the countervailing points made by NAMA, including specific estimates (calculated on the basis of the results from the sample searches) provided to the DPC relating to the estimated number of hits that searches would produce (approximately 62,000), the estimated number of relevant records which would be identified following a review of those hits (approximately 12,600) and the estimated time it would take to review, assemble and redact the material for release to the complainants (over 2,700 hours). The DPC also noted that, while NAMA had referred to the potential for technical solutions to reduce the manual input required, NAMA stated that it had not assessed such solutions and that, in its view, should such solutions exist, they would incur a disproportionate cost of implementation.

The DPC found NAMA’s estimates of the time and effort involved in carrying out searches for the full period to be speculative in nature and lacking in specific detail, and that it had failed to discharge the burden of proof on it in this regard. This was particularly so in light of the fact that NAMA’s previous position (prior to the sample searches having been conducted), namely that there was no personal data of the complainants held in the records relating to the company loans, had not been borne out by the results of those sample searches. NAMA had, it was noted, originally agreed to conduct searches for the whole period during which it held the company loans if the sample searches demonstrated that there was personal data of the complainants held in the records relating to the company loans. However, some 14 months later NAMA had changed its position and decided not to undertake any further searches at all, despite the sample searches having shown the presence of personal data in the company loans records. The DPC also considered that NAMA’s claim (made in the absence of any assessment to this effect) that a technical solution would not be feasible, and its unparticularised claim that the disproportionate effort involved in carrying out the searches and providing the personal data identified would divert its resources away from its statutory remit, did not discharge the burden of proof to which it was subject in respect of its claims of disproportionate effort.

The DPC found that, in refusing to conduct the searches, NAMA had not sought to balance the burden on it against the complainants’ rights but had set those rights at nought. NAMA had not discharged its obligation to conduct reasonable and proportionate searches to find relevant personal data and supply it. The DPC was not satisfied, on the basis of the arguments and evidence put forward by NAMA, that conducting the searches would constitute disproportionate effort on its part.

(3) The Statutory Exemptions Issue

The sample searches which had been carried out by NAMA led to the identification of 14 hard copy documents containing the personal data of the complainants, drawn from NAMA’s records relating to both the company loans and the personal loans. However, NAMA withheld or redacted three of these documents on the basis of certain exemptions to the right of access, namely those under Section 5(1)(g), Section 5(1)(f) and Section 5(1)(a) of the Data Protection Acts 1988 and 2003. As a preliminary matter, the DPC found that NAMA must prove convincingly, by evidence meeting the civil standard of proof, that each of the exemptions on which it sought to rely did in fact apply in this case and operated to trump the complainants’ right of access.

In the case of the legal privilege exemption, which NAMA claimed applied to an internal email passing between solicitors employed at NAMA, the DPC noted that this document was on its face labelled as attracting litigation privilege. However, given that no litigation was in being between the complainants and NAMA at the time of its creation (the only litigation now in being having come into existence some two to three years later), the DPC was not satisfied that NAMA had discharged the burden of proof on it to show that litigation privilege applied to the personal data in question. The DPC then went on to consider whether legal advice privilege applied and concluded that it did, because the content of the email in question set out the basis on which certain issues relating to the personal loans might be considered and addressed. The DPC was therefore satisfied that the email in question was privileged and exempt from release under Section 5(1)(g) of the Data Protection Acts 1988 and 2003.

With regard to two further documents, NAMA claimed that the exemption in Section 5(1)(a) applied. This provides that the right of access does not apply to personal data kept for the purposes of preventing, detecting or investigating offences, apprehending or prosecuting offenders, or assessing or collecting any tax, duty or other moneys owed or payable to the State, a local authority or health board in any case in which granting access to the personal data would prejudice any such matters. The DPC applied the test for application of this exemption which had been set out in the UK judgment of Guriev & another v. Community Safety Development (UK) Limited [2016] EWHC 643. That case had concerned the equivalent exemption under the UK Data Protection Act 1998. The DPC found that NAMA had simply asserted that in the case of the two records in question, providing access to the personal data would have the effect of disclosing its strategy in dealing with liabilities. However NAMA had made no effort to explain the nature and effect of the prejudice that would be suffered if the personal data in question was released, how the release of it would lead to the prejudice, nor how applying the exemption was a necessary and proportionate interference with the complainants’ rights having regard to the gravity of the threat to the public interest. In light of this lack of evidence, the DPC decided that it was not open to NAMA to rely on this exemption.

The final exemption relied on by NAMA and considered by the DPC was Section 5(1)(f) which provides that the right of access does not apply to personal data consisting of an estimate or kept for the purposes of estimating the amount of liability of a data controller on foot of a claim in respect of damages or compensation where granting access would be likely to prejudice the interests of the data controller in relation to the claim. The DPC found that no evidence had been put forward by NAMA as to the factual basis for relying on the exemption. For example, NAMA had not identified the prejudice which it would suffer if it provided the personal data, or how or in what context the prejudice would arise. As NAMA had failed to discharge the burden of proof on it in relation to its claim to this exemption, the DPC found that it was not open to NAMA to rely on it.

Arising from these findings, the DPC concluded that NAMA was in breach of its obligations under Section 4(1)(a) and Section 4(9) of the Data Protection Acts 1988 and 2003.

5) Case Study 5:   Disclosure of CCTV footage from a direct provision centre.

We received a complaint from solicitors for a resident of a direct provision accommodation centre in relation to an alleged disclosure of CCTV footage capturing the complainant’s images. The accommodation centre in question is owned by the State (with responsibility for it resting with the Reception and Integration Agency (RIA), which sits within the Department of Justice and Equality). The centre is managed on a day-to-day basis by Aramark Ireland (Aramark). The alleged disclosure of the complainant’s personal data came to her attention during her participation in a radio programme concerning a matter that had arisen between residents of the accommodation centre and its staff. During the course of the programme, the radio host claimed that he had a copy of CCTV footage, apparently taken from a room in the accommodation centre, which allegedly showed an altercation between the complainant and another resident of the direct provision centre.

The complainant subsequently made complaints to RIA, to Aramark and to the radio station which had aired the radio programme in question. An access request for a description of all recipients to whom the complainant’s personal data had been disclosed was also made on behalf of the complainant under Section 4 of the Data Protection Acts 1988 and 2003 to RIA. However, the complaint noted that RIA had not responded to that access request.

We commenced an investigation into the complaint, seeking information from both Aramark and the RIA. The RIA informed us that it was liaising with Aramark and had requested a report from it. During the investigation, we established that Aramark was a data processor processing personal data on behalf of the RIA. Aramark submitted that CCTV is used for security purposes and to monitor health and safety within the accommodation centre. Aramark informed us that it processes personal data in line with the RIA’s instructions and that access to the storage medium within the accommodation centre was limited to specific authorised personnel, subject to username and password requirements.

In relation to the specific allegation of disclosure of the CCTV footage, Aramark told us that CCTV footage of an altercation involving the complainant had been downloaded by authorised Aramark personnel and transmitted to the RIA. The reason for the download and transmission was that the captured events related to security and health and safety issues. According to Aramark, due to the size of the file in question, the employee had saved the footage to a Google link for onward transmission to the RIA.

Aramark informed us about a detailed forensic IT enquiry that had been conducted across its IT systems in relation to the complaint, to identify whether any other disclosure of the CCTV footage had taken place. It maintained, on the basis of its own investigations, that the link had not been sent from any Aramark email account to any outside party other than the RIA. Amongst other things, as part of the forensic enquiry, Aramark said that it had checked internet logs on the Aramark computer used at the accommodation centre, searched the mailboxes of Aramark staff who worked at the accommodation centre and searched for inbound and outbound email correspondence relating to the incident. A data recovery program had also been installed on the computer in question to review all deleted content on the computer. No activity indicating disclosure of the CCTV footage to any third party had been identified. Aramark further informed us that the Google link no longer existed and was therefore not accessible.

Aramark also maintained that the authorised personnel who had downloaded the footage had confirmed that the footage had not been disclosed to any third party and that it had been deleted following transmission to the RIA.

Separately the RIA confirmed to us during our investigation that the Google link to the CCTV footage which it had received, referenced the complainant and another resident. It stated that a copy of the footage had not been retained by the RIA.

In relation to the management of the CCTV system in the accommodation centre, the RIA furnished us with certain documentation including Aramark’s data protection and CCTV policies and a confidentiality agreement in place with Aramark. However, the RIA acknowledged during our investigation that there were no policies or practice documents in place for the management of CCTV operating in accommodation centres.

Ultimately, neither Aramark nor the RIA was able to confirm definitively that the CCTV footage in question had not been disclosed to the radio station. In relation to its non-compliance with the access request, the RIA’s position was that it was waiting on a detailed report from Aramark and that it could not respond to the access request until it had received that report.

In her decision, the DPC found that the RIA did not respond to the request by the complainant for a description under Section 4 of the Data Protection Acts 1988 and 2003 of all recipients to whom the personal data was disclosed, within the prescribed timeframe of 40 days. This was in direct contravention of RIA’s obligation under that provision.

In relation to the oversight of the processing carried out by Aramark as a processor for RIA, based on the submissions made by both the RIA and Aramark in the course of the DPC’s investigation, there was no evidence of a written contract in place which delineated the respective obligations applicable to the RIA and Aramark in relation to the processing of personal data by Aramark on the RIA’s behalf. This constituted a contravention by the RIA, as the data controller, of Section 2C(3) of the Data Protection Acts 1988 and 2003.

Although the DPC was unable to establish how the CCTV footage in question came to be in the possession of a radio station, the DPC found that ultimately the complainant’s rights were infringed. In this regard both the RIA and Aramark failed in their duty of care to the complainant by not ensuring that appropriate security measures were taken against unauthorised disclosure, as required by Section 2(1)(d). The DPC also decided that a contravention of Section 2C(2) of the Data Protection Acts 1988 and 2003 had occurred. This provision requires a controller to take reasonable measures to ensure that its employees and other persons at the place of work are aware of and comply with security measures. The lack of agreed procedures and in-depth policies between the RIA and Aramark relating to the transfer of personal data over a network led to this finding.

This case illustrates the unintended and unforeseen consequences which can result from an absence of clear, documented policies and procedures governing the transmission of personal data over a network. In this case, that failure was compounded by the RIA’s further failure to have in place a written agreement clearly setting out the parameters of Aramark’s instructions to process personal data on its behalf. As this case demonstrates, such failures by a controller to comply with its data protection obligations are not just administrative or regulatory breaches but can result in grave, and otherwise avoidable, incursions into an individual’s Charter-protected right to the protection of their personal data.

6)   Case Study 6: The importance of data controllers having appropriate mechanisms in place to respond to access requests and document compliance.

We received a complaint from a data subject concerning the alleged failure of eir to comply in full with an access request. The complainant advised us that in response to his access request he had received from eir what he described as “a bundle of random pages of information without any explanation of content”.

In the course of our investigation we established that eir was in fact seeking to rely on certain statutory exemptions to the right of access. However, in its response to the requester’s access request, it had not referred at all to the fact that it had withheld certain personal data. It was only in communications during the course of our investigation, some five months after eir’s receipt of the access request, that eir indicated that it was withholding personal data based on exemptions and outlined the details of the exemptions relied on by reference to an attached list.

In the course of our investigation it also became apparent that eir was unable to determine what personal data had actually been provided to the complainant, as it had not retained a copy of the personal data which had been provided. As a consequence of the lack of records kept on the personal data which had been released, eir was also unable to identify what personal data had been withheld or not provided, whether in reliance on an exemption under the Data Protection Acts 1988 and 2003 or otherwise.

We pointed out to eir that it was difficult to see how it would be in a position to provide clarification to us as to its purported application of any statutory exemption to this particular access request, given that it was not clear what personal data had been provided to the complainant in the first place. We accordingly directed eir to re-commence the process of responding to the access request afresh. We specified that in doing so, eir should:

  • Examine its systems, both manual and electronic, and carry out a review of all the personal data held by it relating to the complainant in manual and electronic form;
  • Write to the complainant within a period of not more than fourteen days of the date of our request, responding to the substance of his access request in accordance with the provisions of Section 4 of the Acts. In so doing, we required that eir provide access to all personal data held or controlled by it, while also explaining to the requester the reason for the re-issue to him of personal data which had already been provided, i.e. that eir was unable to determine what personal data had already issued to him. We also directed that in this response eir provide the requester with a statement of the reasons for any refusal to provide access to personal data, identifying any statutory exemption relied on and the basis on which eir contended that such exemption(s) applied in this case. Finally, we required that eir’s letter to the requester be copied to us.

While ultimately the complainant in this case withdrew his complaint against eir, the issues identified during the course of our investigation underline the critical importance of data controllers having adequate organisational and operational mechanisms to allow them to comply with their statutory obligations with regard to access requests. It is equally important that a data controller is able to demonstrate, after the fact (where required by the DPC, such as in the context of a complaint), compliance with its obligations. A data controller must be able to justify decisions it has taken to deny access to personal data in reliance on one or more statutory exemptions. As a basic starting point, data controllers should have appropriate record-keeping systems and processes in place so that they can justify the position taken in relation to a request by a data subject to exercise a right. These mechanisms should allow them to track and produce copies of any correspondence exchanged with a data subject in relation to an access request or a request to exercise any other data protection right.

This case study also underscores the fact that the right of an individual to access personal data held about them is not just about being provided with access to the data itself. Importantly it is also concerned with sufficient, meaningful information being given to the data subject so that they can understand, amongst other things, what personal data is processed about them, in what circumstances and for what purposes. In this case the provision of a bundle of unexplained documents in response to the access request failed to satisfy the minimum requirements applicable to eir as a data controller under Section 4 of the Data Protection Acts 1988 and 2003, ultimately causing confusion for the data subject and prompting a complaint to the DPC.

  • Right to be Forgotten
  • Prosecution of Eamon O’Mordha & Company Limited and one of Its Directors
  • Loss of sensitive personal data contained in an evidence file kept by An Garda Síochána
  • Use of CCTV footage in a disciplinary process.
  • Disclosure of sensitive personal data by a hospital to a third party
  • Publication of personal information - journalistic exemption
  • Compliance with a Subject Access Request & Disclosure of personal data / capture of images using CCTV
  • Failure to respond fully to an access request
  • Personal data of a third party withheld from an access request made by the parent of a minor
  • Disclosure of Personal Data via a Social Media App
  • Failure by the Department of Justice and Equality to impose the correct access restrictions on access to medical data of an employee
  • Virgin Media Ireland Limited
  • Sheldon Investments Limited (trading as River Medical)
  • Tumsteed Unlimited Company (trading as EZ Living Furniture)
  • Cunniffe Electric Limited
  • Argos Distributors (Ireland) Limited
  • Expert Ireland Retail Limited

1)   Right to be Forgotten

We received a complaint from a Lithuanian national concerning articles about that individual which had been published by a number of Lithuanian news sources ten years earlier. Links to these articles were returned in search results when a search against the individual’s name was carried out using a particular search engine. The articles in question detailed the termination of the individual’s employment as an official in a municipal government department in connection with the individual’s involvement in potentially fraudulent activities. The articles also detailed criminal charges which had been brought against the individual for allegedly accepting bribes in the context of their employment.

During the course of our investigation into this complaint, the search engine operator contended that the information detailed in the articles in question related to serious professional wrongdoing committed by an individual involved in public administration. It maintained that where such wrongdoing resulted in criminal sanctions, this was sufficiently serious for the information to be considered to be in the public interest and that any interference with the data subject’s rights was therefore justified.

However in the course of our investigation the complainant provided us with official court documents which showed that they had been found not guilty of all the charges which had been referred to in the articles. The complainant also provided us with documents which showed that the termination of their employment with the municipal government department had been on a voluntary basis with the complainant having resigned due to personal reasons. We considered that this documentary information demonstrated that the complainant’s personal data, which was being processed by way of the search engine returning search results to the articles in question, was inaccurate, incomplete and out of date and on that basis we requested that the search engine operator delist the links to the webpages in question from search results which were returned from searches conducted against the complainant’s name. The search engine operator complied with our request and delisted the links in question.

This case illustrates that the onus is on a search engine operator, as the data controller, to satisfy itself to the appropriate level that the personal data to which its search results provide links fully accords with the laws on data protection. In this case, it appeared that the search engine operator did not properly examine the complaint but simply assumed that, because the complainant had previously been employed in a public official role, the information in question was automatically in the public interest, regardless of whether it was in fact accurate, complete and up to date. The search engine operator had assumed, without apparently even checking the factual background, that the complainant had been convicted of the criminal charges.

2)   Prosecution of Eamon O’Mordha & Company Limited and one of Its Directors

The investigation of this case arose in the context of a wide-ranging investigation of the Private Investigator sector that commenced in 2016. As part of that investigation, the Special Investigations Unit obtained and examined copies of several private investigator reports written in 2014 and 2015 by Eamon O’Mordha & Company Limited (the company) for its clients in the insurance sector. The Special Investigations Unit became suspicious of the origin of some of the personal data in those reports and it immediately commenced an investigation involving the Department of Social Protection and An Garda Síochána.

The investigation subsequently uncovered access by the company to social welfare records held on databases in the Department of Social Protection. An official in that Department was interviewed by Authorised Officers of the Data Protection Commissioner. During the course of that interview, the official revealed that the two directors of the company were friends of hers and she admitted that one of the company directors met with her regularly and asked her to check information on the Department’s database. The official admitted that she carried out those checks and provided personal information to the company director.

Separately, the investigation uncovered access by the company to records held on the PULSE database of An Garda Síochána. Two serving members of An Garda Síochána (who are brothers and nephews of one of the directors of the company) were interviewed by Authorised Officers of the Data Protection Commissioner. During the course of those interviews, both Gardaí confirmed that they had been contacted by their aunt to obtain information from them in relation to individuals and vehicle registration numbers. They both admitted that they had accessed the Garda PULSE database and that they had subsequently passed on personal information to their aunt, the company director.

Eamon O’Mordha & Company Limited was charged with 37 counts of breaches of the Data Protection Acts 1988 and 2003 (the Acts). All charges related to breaches of section 22 of the Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept and disclosing the data to another person. The personal data was kept by the Department of Social Protection and An Garda Síochána and was disclosed to entities in the insurance sector. Two directors of the company, Eamonn O’Mordha and his wife Ann O’Mordha, were separately charged with 37 counts of breaches of section 29 of the Acts for their part in the offences committed by the company. This section of the Acts provides for the prosecution of company directors where an offence by a company is proved to have been committed with the consent or connivance of, or to be attributable to any neglect on the part of, the company directors or other officers.

On 8 May, 2017 at Dublin Metropolitan District Court, guilty pleas on behalf of the company were entered to twelve charges for offences under section 22 of the Acts. The Court convicted the company on ten charges and it took the further two charges into account. It imposed ten fines of €1,000 on the company (totalling €10,000). All remaining charges were struck out. Company director Ms. Ann O’Mordha pleaded guilty to twelve charges for offences under section 29 of the Acts. The Court convicted Ms. O’Mordha on ten charges and it took the further two charges into account. It imposed ten fines of €1,000 on Ms. O’Mordha (totalling €10,000). All remaining charges were struck out. The charges against her husband, the other company director, were not proceeded with.

3)   Loss of sensitive personal data contained in an evidence file kept by An Garda Síochána

We received a complaint from a couple against An Garda Síochána (AGS), concerning the loss of an evidence file that held, among other things, the couple’s sensitive personal data relating to details of medical treatment. We established that the couple had previously made a criminal complaint to AGS and had subsequently made an access request. However, in response to the access request, they were informed that the evidence file in relation to their complaint, which contained their original statements, a DVD and postal documents containing their sensitive personal data, had been misplaced while in the possession of AGS. The complainants requested that we conduct a formal investigation into the matter.

AGS informed us that upon identifying that the evidence file in question was missing, a comprehensive search had taken place of all files retained at local level in the District Office, and other relevant sections of AGS, in order to try to locate the file. Ultimately, however, the file had not been located.

During the course of our investigation, we studied the chain of custody supplied to us by AGS and established that the last known whereabouts of the file was in the investigating officer’s possession. That officer had been instructed by a superior to update the couple about the criminal complaint and to then return the file to the District Office for filing. However, the officer had failed to return the file to the District Office for filing. AGS informed us that the failure by the officer to return the file to the relevant location in the District Office was in contravention of its policy and procedures at the time and that consequently both an AGS internal investigation and a Garda Síochána Ombudsman Commission investigation had been conducted. Following the latter investigation, the officer in question had been disciplined and sanctioned for the contravention.

One of the central requirements of data protection law is that data controllers have an obligation to have appropriate security measures in place to ensure that personal data in their possession is kept safe and secure. This requires the controller to consider both technical and organisational measures and importantly, to take all reasonable steps to ensure that its employees, amongst others, are aware of and comply with the security measures. In her decision, the Commissioner found that AGS, as data controller, had infringed Section 2(1)(d) of the Data Protection Acts 1988 and 2003, as it failed to take appropriate security measures to ensure the safe storage of the complainants’ sensitive personal data which was contained on the evidence file in question.

This case demonstrates that the obligation on a data controller to maintain appropriate security measures goes beyond simply putting in place procedures regarding the storage and handling of personal data. Such procedures are only effective as a security control if they are consistently adhered to, so data controllers must monitor staff compliance with these measures and take meaningful steps (for example training, auditing and potentially disciplinary measures where non-compliance is identified) to ensure that staff systematically observe such procedures.

4)   Use of CCTV footage in a disciplinary process.

We received a complaint from an individual regarding the use of CCTV footage by their employer in a disciplinary process against them. The complainant informed us that while employed as a security officer, their employer had used their personal data, in the form of CCTV footage, to discipline and ultimately dismiss them. The complainant stated that they had not been given prior notification that CCTV footage could be used in disciplinary proceedings.

In the course of our investigation, the employer informed us that the complainant had worked as a night officer assigned to client premises, and had been required to monitor the CCTV system for the premises from a control room. The employer’s position was that, upon being assigned to the client premises in question, the complainant had been asked to read a set of “Standing Operating Procedures” which indicated that CCTV footage could be used in an investigative process concerning an employee. The complainant had also been asked to sign a certificate of understanding to confirm that they had read and understood their responsibilities. The employer maintained that the CCTV system in place at the client premises was not used for supervision of staff as there was a supervisor at the premises during office hours between Monday and Friday.

The employer informed our investigators that it was the complainant’s responsibility, as the sole night security officer on duty at the client premises, to monitor the CCTV system for the premises from the control room. The requirement to have a night security officer on duty in that control room for that purpose was a term of the employer’s contract with its client. The employer was also contractually obliged under its contract with its client to carry out routine audits of employee access cards (which were swiped by the holder to gain access to various locations in the client premises). The employer told us that during such an audit, it had discovered irregularities in data derived from the complainant’s access card which could not be the result of a technical glitch, as those irregularities were not replicated in the access card data of the complainant’s fellow night officers. These irregularities suggested that the complainant had been absent from their assigned post in the control room for prolonged periods of time on a number of separate occasions. On the basis of the access card data irregularities and the apparent absence of the employee from the control room during prolonged periods, the employer had commenced an investigation into the employee’s conduct. During the course of this investigation, the complainant disputed the accuracy of the access card data and sought that the employer provide further evidence of their alleged prolonged absences from the control room. The employer had therefore obtained CCTV stills at times when the access card data suggested the complainant was away from their post in order to verify the location of the complainant. The employer maintained that because the CCTV system was independent of the access card data system, it was the only independent way to verify the access card data. The employer also provided us with minutes of a disciplinary meeting with the complainant at which they had admitted to being away from the control room for long periods. The employer also informed us that the complainant had later admitted in an email, also provided to us, that the reason for these absences was that they had gone into another room to lie down on a hard surface in order to get relief from back pain arising from a back injury.

We queried with the employer what the legal basis was for processing the complainant’s personal data from the CCTV footage. The employer’s position was that as a result of its contractual obligations to its client (whose premises were being monitored), if an adverse incident occurred during a period of absence of the assigned security officer (the employee) from the control room, that would potentially expose the employer to a breach of contract action by its client which could lead to significant financial and reputational consequences for the employer. On this basis the employer contended that it had a legitimate interest in processing CCTV footage of the employee for the purpose of the disciplinary process. Under Section 2A(1)(d) a data controller may process an individual’s personal data, notwithstanding that the controller does not have the consent of the data subject, where the processing is necessary for the purposes of the legitimate interests pursued by the data controller. However, in order to rely on legitimate interests as a legal basis for processing, certain criteria have to be met as follows:

  • there must be a legitimate interest justifying the processing;
  • the processing of personal data must be necessary for the realisation of the legitimate interest; and
  • the legitimate interest must prevail over the rights and interests of the data subject.

Having considered the three-step test above, the Commissioner was satisfied that the employer had a legitimate interest in investigating and verifying whether there was misconduct on the part of the employee (or whether there was a fault in the access card security system). Furthermore, the Commissioner considered that the use of the CCTV footage was necessary and proportionate to the objective pursued, in light of the seriousness of the allegation, because it was the only independent method of verifying the accuracy of the access card data. The Commissioner noted that the CCTV footage was used in a limited manner to verify other information and that the principle of data minimisation had been respected. Finally, given the potential risk of damage to the employer’s reputation and the need to ensure the security of its client’s premises, the Commissioner was satisfied that the use of CCTV footage for the purpose of investigating potential employee misconduct, which raised potential security issues at a client premises, in these circumstances took precedence over the complainant’s rights and freedoms as a data subject. On the issue of whether the controller had provided the complainant with notice of the fact that their personal data might be processed through the use of CCTV footage, the Commissioner was satisfied that adequate notice had been given by way of the SOP document, which the complainant had acknowledged by signing the certificate of understanding.

The Commissioner therefore formed the view that the employer had a legal basis for processing the complainant’s personal data contained in the CCTV footage under Section 2A(1)(d) of the Data Protection Acts 1988 and 2003.

This case demonstrates that the legal basis of legitimate interests will only be available to justify the processing of personal data where, in balancing the legitimate interests of the controller against the rights and freedoms of the data subject, the particular circumstances of the case are clearly weighted in favour of prioritising the legitimate interests of the controller. In order to justify reliance on this legal basis, it is essential that the processing in question is proportionate and necessary to the pursuit of the legitimate interests of the controller.

5)   Disclosure of sensitive personal data by a hospital to a third party.

We received a complaint concerning the alleged unauthorised disclosure of a patient’s sensitive personal data by a hospital to a third party. The complainant had attended the hospital for medical procedures and informed us that the medical reports for these procedures had been delivered to their home address in an envelope that had no postage stamp. The envelope had a hand-written address on it which included the name of a General Practitioner (GP) and also included the home address of the complainant’s neighbour. A hand-written amendment had been made to the address, stating that it was the wrong address. The complainant informed us that they had made enquiries with their neighbour in relation to the correspondence and the neighbour had stated that they had received the correspondence a number of days earlier but that it had not been delivered by a postman. The neighbour further advised the complainant that they had opened the envelope and viewed the contents in an effort to locate the correct recipient/address.

Following the initial complaint, the complainant provided us with correspondence which they subsequently received from the hospital apologising that correspondence containing the complainant’s medical results had been inadvertently sent to the wrong address. The hospital indicated that this appeared to have been due to a clerical error confusing part of the GP’s address and part of the complainant’s address. We commenced an investigation to establish how the error had happened, what procedures the hospital had in place at the time and what the hospital had since done to avoid repetition of this incident.

The hospital informed us that its normal procedure is to issue medical reports in batches to the relevant GP so that multiple sets of medical reports for different patients are placed in a windowed envelope, which shows the relevant GP’s address in the window. In this case, however, the medical report was put in a non-windowed envelope and the address was hand-written on the front. In doing so, the staff member who addressed the envelope manually erroneously intermixed the GP’s name, part of the GP’s address and part of the complainant’s address on the envelope. The hospital also informed us that the envelopes containing results to be dispatched to GPs are franked by the hospital post room. However, because the envelope containing the complainant’s medical information was not franked, the hospital concluded that it was unlikely to have been sent out directly from its post room and indicated that it could have been sent on via the relevant GP, although it acknowledged that it could not be certain about this. We were unable to establish during the course of the investigation the precise manner in which the envelope containing the complainant’s medical reports came to be delivered to the complainant’s neighbour’s house. The hospital informed us that administrative staff had since been briefed on the correct procedure for issuing medical reports and that non-windowed envelopes would no longer be used for this purpose.

The complainant rejected the apology from the hospital made by way of an offer of amicable resolution and instead requested a formal decision from the Commissioner. In her decision, the Commissioner found that the hospital had contravened Section 2(1)(b) (requirement to keep personal data accurate, complete and up to date), Section 2(1)(d) (requirement to take appropriate security measures) and Section 2B(1) (requirement for a legal basis for processing sensitive personal data) of the Data Protection Acts 1988 and 2003 when it processed the complainant’s sensitive personal data by way of disclosing their personal data inadvertently to a third party.

This case illustrates how a seemingly innocuous deviation by a single staff member from a standard procedure for issuing correspondence can have significant consequences for the data subject concerned. In this case, highly personal medical information was accessed by a third party in circumstances which were entirely avoidable. If the hospital had had in place appropriate quality control and oversight mechanisms to ensure that all staff members rigidly adhered to its standard procedures, it is unlikely that this unauthorised disclosure of sensitive personal data would have occurred.

6)   Publication of personal information - journalistic exemption.

We received a complaint concerning an article published in the Sunday World (in both newspaper and online news forms) which named the complainant and published their photograph. The focus of the article was official complaints made by Irish prisoners under the Prisons Act 2007 concerning their treatment in prison (known as “Category A” complaints) and it included details of the number of “Category A” complaints which had been made by the complainant. It was alleged by the complainant that the Sunday World had gained unauthorised access to their personal data from the Irish Prison Service.

The complainant provided us with a letter which they had written to the editor of the Sunday World asserting that the information contained in the article was inaccurate and violated their right to privacy and requesting that the link to the online article be removed. We were also provided with a previous decision of the Press Ombudsman which dealt with various alleged breaches of the Code of Practice of the Press Council of Ireland (the Code) by the Sunday World, including allegations of breaches arising from the article in question. The Press Ombudsman had decided that there had been a breach of Principle 5 of the Code concerning privacy and that the article could have been written without publishing the complainant’s name or photograph. The position taken by the Press Ombudsman was that as “Category A” complaints are not part of the public record, the complainant’s reasonable expectation of privacy had been breached by the publication of their name and photograph.

In the course of our investigation we queried with the Sunday World why it had not removed the online version of the article from its website in light of the Press Ombudsman’s decision and in light of the complainant’s written request to do so. We also queried how the Sunday World had obtained the complainant’s personal data. In its response, the Sunday World stated its position that the publication was in the public interest as it related to the regimes of care and management of inmates as well as staff of prisons. It also contended that the article had highlighted how the [complaint] system was being deliberately over-used and abused. The Sunday World informed us that the online version of the article had been removed upon receiving the formal request from the complainant. However, the Sunday World relied on the journalistic exemption provision under Section 22A of the Data Protection Acts 1988 & 2003 (the Acts) in relation to the obtaining of the information in relation to the “Category A” complaints and the complainant’s personal data.

The Commissioner issued a formal decision in relation to the complaint and specifically in relation to the application of the Section 22A exemption. The rationale behind the exemption in Section 22A is to reconcile the protection of privacy and freedom of expression. Following the entry into force of the Lisbon Treaty, data protection acquired the status of a fundamental right. The right to freedom of expression is also a fundamental right. Both rights are recognised in the European Convention on Human Rights and are referred to in the EU’s Data Protection Directive 95/46/EC, which is given effect in Irish law through the Acts.

Section 22A of the Acts specifies that personal data that is processed only for journalistic purposes shall be exempt from compliance with certain provisions of that legislation (including the requirement to have a legal basis for processing the personal data) provided that three cumulative criteria are met. Under Section 22A(1)(b), one of these three criteria is that the data controller, in this instance the Sunday World, must reasonably believe that, having regard in particular to the special importance of the public interest in freedom of expression, such processing (in this case by way of publication in the newspaper) would be in the public interest. The Sunday World claimed that the purpose of the article in question was essentially to highlight what it perceived to be an abuse of process within the Irish Prison Service. In her decision, the Commissioner found that it was not reasonable for the data controller to believe that the processing of the complainant’s personal data by publishing their name and photograph would be in the public interest in achieving the stated objective of the Sunday World. It was the view of the Commissioner that the special importance of the public interest in freedom of expression could have been satisfied had the journalist in question used other means to reach the desired objective, for example by using statistics in relation to the number of ‘Category A’ complaints made by prisoners, and that the public interest had been neither enhanced nor diminished by identifying the complainant by means of their name and photograph. As one of the three cumulative criteria for the application of the journalistic exemption under Section 22A of the Acts had not been satisfied, the Commissioner found that it was not necessary to consider the remaining two criteria.

As the data controller was unable to rely on Section 22A of the Acts as an exemption from the requirement to have a legal basis for processing by publishing the complainant’s personal data, the Commissioner in her decision then went on to consider whether there was in fact such a basis for the processing. While the Commissioner considered that the Sunday World had a legitimate interest in obtaining and processing statistical information in relation to ‘Category A’ complaints for the purpose of research for the article in question, she considered that the Sunday World had contravened Section 2(1)(c)(iii) by further processing the complainant’s personal data through publishing it. This contravention arose as the processing of the data by publication was excessive and unnecessary for the purpose of the point being made by the Sunday World in the article, i.e. that the system was being abused.

This case illustrates that the journalistic exemption under Section 22A of the Acts is not a blanket exemption that can be routinely relied on by publishers or journalists seeking to justify publishing unnecessary personal data. The mere existence of a published article is not sufficient to come within the scope of this exemption and instead a data controller must be able to demonstrate that they satisfy all three cumulative criteria in this section, as follows:

(i) the processing is undertaken solely with a view to the publication of journalistic, literary or artistic material;

(ii) the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, such publication would be in the public interest; and

(iii) the data controller reasonably believes that, in all the circumstances, compliance with the relevant requirement of the Acts would be incompatible with journalistic, artistic or literary purposes.

7)   Compliance with a Subject Access Request & Disclosure of personal data / capture of images using CCTV

We received a complaint from an individual employed as a service engineer by a company, which was contracted to provide certain services to a company which was the operator of a toll plaza (the Toll Company). The complainant alleged, amongst other things, that the Toll Company had disclosed the complainant’s personal data (consisting of an audio recording and CCTV footage of a conversation between the complainant and an individual operating a tollbooth at the toll plaza) to the complainant’s employer without the complainant’s knowledge or consent.

During our investigation we established that an incident had occurred involving the complainant, resulting in a request being made by the Toll Company to the complainant’s employer that the complainant was not to attend the toll plaza premises again in his capacity as a service engineer. We established that the incident in question involved a dispute at a tollbooth between the complainant and an individual operating the tollbooth over the price of the toll which the complainant was charged. The Toll Company alleged that during the incident in question (which had been captured on CCTV and by audio recording) the complainant had threatened to “bring down” the toll plaza system. The complainant’s employer had confirmed that it would comply with the Toll Company’s request that the complainant not attend the toll plaza premises again and the Toll Company confirmed to us that at that point it had considered the matter to be concluded. However, approximately two months after the incident had occurred, the complainant’s employer had requested the CCTV footage and audio recording of the alleged incident, which the Toll Company then provided to the employer. It was contended by the Toll Company that it was in its legitimate interests to process the complainant’s personal data as a threat to it had been made by the complainant and one of its employees had reported the threat to the Gardaí, who had been called to the toll plaza by the complainant at the time of the incident. The Toll Company also claimed that Sections 8(b) and 8(d) of the Data Protection Acts 1988 and 2003 (the Acts) allowed for this processing of the complainant’s personal data as the processing was necessary to prevent damage to the Toll Company’s property. The Toll Company stated that the personal data of the complainant (the CCTV footage and audio recording) had been sent to the complainant’s employer two months after the incident as it had not been requested by the employer prior to that.

As part of our investigation, we noted that signs at the tollbooth notified the public that there was CCTV in operation. We also examined the Toll Company’s data protection policy, which was available on its website and which stated that all vehicles using the toll plaza in question are photographed/video-recorded and that images are retained for enforcement purposes and to address and resolve any disputes that may arise in relation to a vehicle or account.

In her decision, the Commissioner considered the Toll Company’s purported reliance on the pursuit of its legitimate interests as the legal basis under Section 2A(1)(d) of the Acts for the processing. Taking account of the two-month period which had elapsed between the incident in question and the request for the CCTV footage and audio recording being made by the employer, and also having regard to the confirmation of the Toll Company that (prior to receiving the employer’s request for the CCTV footage and audio recording) it had considered the incident to be concluded, the Commissioner decided that this legal basis could not be relied upon for the processing of the personal data. Consequently, a contravention of Section 2A(1) occurred as there had been no other legal basis (e.g. the consent of the complainant) for the processing of his personal data by disclosing it to his employer. The Commissioner also found that adequate notice of the processing of the personal data had not been given to the complainant, as it was not apparent from the data protection privacy policy, or indeed the public signs at the tollbooth, what the extent of the processing was or that audio recording was in operation, nor was it stated who the data controller was. Consequently, the Toll Company had contravened Section 2D(1) arising from this lack of transparency. Finally, the Commissioner also found that Section 2(1)(c)(ii) of the Acts had been contravened because further processing of the complainant’s personal data had occurred for a purpose (sharing it with the complainant’s employer) which was incompatible with the original purpose for its collection (enforcement purposes and the resolution of disputes arising in relation to a vehicle or account).

This case is indicative of a common trend amongst data controllers to seek to rely on legitimate interests as the legal basis for processing personal data as something of a catch-all to cover situations where personal data has been processed reactively and without proper consideration having been given in advance as to whether it is legitimate to carry out the processing. However, as this case illustrates, a data controller must be able to provide evidence to support their assertion as to the legitimate interest relied on. Here, the passage of time since the incident and the fact that the data controller, by its own admission, considered that the matter had been concluded contradicted the purported reliance on this legal basis. This case also underscores the principle of the foreseeability of processing of personal data as an important element of the overarching principle of fair processing in data protection. At its core this means that a data subject should not be taken by surprise at the nature, extent or manner of the processing of their personal data.

8)   Failure to respond fully to an access request.

We received a complaint that an educational organisation had not fully complied with an access request submitted to it by the complainant, who was an employee of that organisation. The complainant informed us that in the access request they had specifically sought CCTV footage from the educational organisation’s premises for a 4-hour period during which the complainant had allegedly been assaulted by another employee. The complainant informed us that although there were 8 cameras on the premises, in response to their access request they only received an 11-second clip from the CCTV footage for the premises, which ended just as the alleged assault came into view. The complainant told us that they had queried the limited amount of CCTV footage and reminded the educational organisation that the access request had been in respect of all footage within that 4-hour period. However, the educational organisation’s response had been that this query would be treated as a new access request. The complainant considered that the CCTV footage had been intentionally withheld and that this approach had been adopted as a delaying tactic so that the CCTV footage would ultimately not have to be released on the grounds that it had been lost or was no longer retained.

In the course of our investigation, we established that the complainant had made a subject access request to the educational organisation which had accepted it as a valid request. The educational organisation’s position was that it understood the complainant’s request to relate to footage of the incident in question only. However, the educational organisation acknowledged that the complainant would have been captured by other CCTV cameras for which the CCTV footage had not been provided. On this basis, we established that, as of the date of the complainant’s access request, additional personal data existed in the form of further CCTV footage which had not been provided to the data subject. The educational organisation informed us that as the CCTV was only retained for 28 days, by the time that the complainant had come back to query the limited amount of CCTV footage received in response to the access request, the additional CCTV footage had been subsequently overwritten without being retained for release to the complainant.

In her decision the Commissioner noted that it was clear that in the complainant’s access request the complainant was specifically seeking access to CCTV footage over a four-hour period and that having received the initial request, the educational organisation should have preserved the footage for that date and sought to clarify with the complainant what CCTV footage exactly they were seeking rather than unilaterally determining that issue itself. The educational organisation therefore contravened Section 4 of the Data Protection Acts 1988 and 2003 in failing to provide the complainant with all of their personal data within the statutory 40-day period.

This case clearly illustrates the position of the DPC which is that upon receipt of an access request relating to CCTV footage from a specific day, a data controller is obliged to preserve any such footage from that day pending resolution of the access request. This obligation applies irrespective of whether any such footage may be ordinarily subject to deletion (whether automated or not) after certain timeframes under the provisions of the data controller’s retention policy. Where a data controller considers that further clarification should be sought from the data subject as to the scope of the personal data requested, that requirement for clarification should not be interpreted as if the access request had not yet been made, as to do so could undermine the data subject’s right to access their personal data or enable a data controller to circumvent its obligations in respect of the access request.
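By way of illustration only, the following minimal sketch in Python (with hypothetical names such as FootageStore, place_hold and purge_expired, and an assumed 28-day retention period of the kind described in this case) shows one way an automated deletion routine can be made to respect a preservation hold recorded as soon as an access request naming a particular date is received. It is not drawn from the decision itself.

```python
from datetime import date, timedelta

RETENTION_PERIOD = timedelta(days=28)  # assumed retention period, as described in this case

class FootageStore:
    """Hypothetical store of CCTV clips keyed by recording date."""

    def __init__(self):
        self.clips = {}                    # recording date -> footage
        self.access_request_holds = set()  # dates preserved pending an access request

    def place_hold(self, recorded_on: date) -> None:
        # Called as soon as an access request naming this date is received.
        self.access_request_holds.add(recorded_on)

    def purge_expired(self, today: date) -> None:
        # Automated deletion job: skips any date that is under a hold.
        for recorded_on in list(self.clips):
            expired = (today - recorded_on) > RETENTION_PERIOD
            if expired and recorded_on not in self.access_request_holds:
                del self.clips[recorded_on]
```

The essential design point is that the hold is recorded at the moment the access request arrives, so that routine deletion cannot overwrite footage while the request remains unresolved.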

9)   Personal data of a third party withheld from an access request made by the parent of a minor

We received a complaint from an individual who had submitted an access request to a sports club for the personal data of their minor child, of whom the complainant was a joint legal guardian. Following intervention from this office, the complainant had received personal data relating to their child from the sports club, which was contained in an application for membership of the sports club that had been submitted on behalf of the child. However, certain information had been redacted from that application form, namely the names of the persons who were submitted to the sports club as emergency contacts for the child, the signature of the person who consented to images of the child being used on digital media by the sports club, and the address of the minor. The complainant asserted that the third-party details and the address were all the personal data of their child and that the complainant, as joint legal guardian, was therefore entitled to access them. The sports club’s position was that there is no express provision within Section 4 of the Data Protection Acts 1988 and 2003 (the Acts), which relates to the right of access, allowing a person access to another party’s personal data without that party’s consent. The sports club had also checked with the third parties whose personal data was the subject of the redactions on the application form as to whether they consented to the release of the data to the complainant, but they had refused to give their consent.

Section 4(4) of the Acts, which precludes the release of third-party data without that party’s consent, was brought to the attention of the complainant. However, the complainant put forward the argument that, because the information requested pertained to matters concerning the minor’s welfare and because the third party was the legal representative of that minor, the data was the child’s personal data. We outlined the definition of personal data to the complainant and highlighted case law which has established that an individual’s name represents the personal data of that individual. The complainant was also advised that the address of their child could not be provided without also providing the personal data of a third party and therefore the complainant had no right of access to it.

The complainant sought a decision on their complaint from the Commissioner. In her decision, the Commissioner pointed out that, taking account of Section 8(h) of the Acts (which lifts restrictions on the processing of personal data where the processing is made with the consent of the data subject or a person acting on their behalf), her office’s position is that a parent or legal guardian of a young child has an entitlement to exercise the right of access on that child’s behalf. However, in this case, as the child in question could not be identified from the names of the third parties listed as emergency contacts with the sports club, the information to which the complainant sought access was not the personal data of the complainant’s child. The Commissioner in her decision pointed out that if the complainant’s logic were to be followed and the details of an emergency contact were deemed to be the personal data of the person for whom they act as a contact, an adult who has listed another adult as an emergency contact would have the right of access over that third party’s name, telephone number, address, etc. The Commissioner found that no contravention of the Acts had occurred in relation to the redactions made to the documents which had been released by the sports club on foot of the access request.

This case illustrates that irrespective of the relationship, dependency or connection between two parties, the name of a third party cannot be deemed to be the personal data of a data subject. As highlighted in the Commissioner’s decision, to do so would deprive that third party of control over their own personal data and allow another individual to exercise data subject rights, including the right of access, over the personal data of the third party. Such an outcome would run contrary to the core principle of data protection which is that each data subject has the right to determine the use of their own personal data. However, it is important to distinguish this principle from the limited circumstances in which the rights of a data subject may be lawfully exercised by another person who is permitted to do so on their behalf. Even where data subject rights may be exercised by a third party (such as the parent of a young minor child) this does not render the personal data of the data subject to be the personal data of the third party who is authorised to exercise the data subject’s rights on their behalf.

10)   Disclosure of Personal Data via a Social Media App.

We received complaints from two individuals who each claimed that their personal data had been unlawfully disclosed when it was broadcast on “Snapchat”, an instant messaging and multimedia mobile application.

The complainants, who were friends, informed us that they had each submitted their CV with a cover letter to a particular retailer, in person, by way of application for employment with that retailer. The applications had been made by the complainants on the same day and had been received by the same employee of the retailer. Later on the same day the complainants had learned from a third party that a photograph showing both cover letters was appearing on “Snapchat” with a message drawing attention to similarities in the cover letters. It was the complainants’ common understanding that the employee of the retailer to whom they had submitted their CVs had taken this photograph and posted it to “Snapchat”.

During the course of our investigation of these complaints, we established that the employee of the retailer to whom the complainants had handed their CVs and cover letters had been recently notified by the retailer of the termination of their employment. Contrary to the retailer’s policy and the terms of their contract of employment, the employee had a mobile phone on their person during work hours and had used it to take a photograph of both the cover letters and to post it to “Snapchat”. The retailer informed our investigators that the employee was aware that this action was contrary to their contract of employment and the actions of the employee were done in circumstances where the employee was about to leave their employment. The retailer insisted that, in this instance, there was nothing further it could have done to prevent this incident from occurring.

In her decision the Commissioner found that the retailer, as the data controller for the complainants’ personal data, had contravened Section 2A(1) of the Data Protection Acts 1988 and 2003 as the processing of the complainants’ personal data, by way of the taking and posting of the photograph by the retailer’s employee, was incompatible with the purposes for which it had been provided to the retailer by the complainants.

The case should serve as a cautionary reminder to data controllers that, as a general principle under data protection law, they are responsible for the actions of their employees in connection with the processing of personal data for which they are the data controller. The motive of an employee, or the deliberate or accidental nature of the actions which they have undertaken in relation to personal data, does not absolve data controllers of such responsibility. Data controllers have an obligation to ensure that their employees comply with data protection law in relation to the personal data which they hold, irrespective of whether it is the employee’s first or last day of employment with the data controller. Indeed, this obligation will continue even after an employee leaves a data controller’s employment if that employee can still access the personal data controlled by their former employer.

11)   Failure by the Department of Justice and Equality to impose the correct restrictions on access to medical data of an employee

We received a complaint from an individual concerning an alleged disclosure of their sensitive personal data by the Department of Justice & Equality (the Department). It was claimed by the complainant, who was an employee of the Department, that a report containing information on the complainant’s health had been uploaded to a general departmental open document management database in 2012 and that the report had remained on that database for up to three years where it could be accessed by approximately 80 employees. The complainant informed us that they had been notified of the accessibility of the report on the database by a colleague. The complainant told us that they had requested an explanation from the Department as to why the report had been placed on an open database but had not received official confirmation that the report had since been removed.

We commenced an investigation into the complaint. The Department confirmed that notes relating to a discussion which had taken place between the complainant and their line manager in 2012 (which included a note concerning the complainant’s health) had been stored on the database in question and marked private. However, the line manager had inadvertently omitted to restrict access to the document, with the result that it could be accessed by approximately 80 staff members from the Department. The Department informed us that the document had been removed from the database in question some 3 years after having been saved to it. As the line manager in question had since left the Department, it had been unable to establish exactly why the document had been saved there in the first place but claimed that it was due to human error. The Department was also unable to establish how many staff had actually accessed the document during the 3-year period in which it was accessible, as the Department’s IT section had been unable to restore the historic data in question.

The Department made an offer, by way of amicable resolution, to write to the complainant confirming that the document in question had been removed from the database and apologising for any distress caused. The complainant chose not to accept this offer and instead sought a formal decision of the Commissioner. In her decision, the Commissioner concluded that the Department had contravened Section 2A(1) and 2B(1) of the Data Protection Acts 1988 & 2003 by processing the complainant’s sensitive personal data without the required consent or another valid legal basis for doing so and by disclosing the complainant’s sensitive personal data to at least one third party. These contraventions had occurred by way of the placing of a confidential document containing details of the complainant’s health on an open database where it appeared to have remained accessible for 3 years and had been accessed by at least one co-worker.

This case is a stark illustration of the consequences for a data subject, and the general distress which can be caused, where a data controller fails to ensure that its staff have adhered to, and continue to adhere to, proper document management protocols for documents containing personal data and, moreover, sensitive personal data. While the controller in question was unable to identify how many times and by how many different staff members the document in question had been accessed during the 3-year period when it was accessible to approximately 80 staff members, the potential for further and continuing interference with the data subject’s fundamental rights and freedoms remained throughout this period. Had the controller in this case had adequate regular audit and review measures in place for evaluating the appropriateness of documents stored on open-access databases, the presence of this confidential document would have been detected much sooner than it actually was. Further, had the Department had an adequate system in place for training staff managers in, and ensuring their awareness of, basic data protection rules, this issue might not have arisen in the first instance.

12)   Virgin Media Ireland Limited.

We received a complaint in May 2016 from an individual who had received unsolicited marketing telephone calls from Virgin Media Ireland Limited in March and in May 2016 after she had previously asked the company not to call her again. The complainant is a customer of Virgin Media Ireland Limited and she informed us that the calls promoted Virgin Media products. She advised us that when the company first called her in January 2016 she had asked that her details be placed on the “Do Not Call” list as she did not wish to receive any further marketing calls. She stated that when the company called her again in March 2016 she repeated that she wanted her details to be placed on the “Do Not Call” list but despite her two requests she had received a further unsolicited marketing telephone call to her mobile phone on 27 May 2016.

During our investigation of this complaint, Virgin Media Ireland Limited informed us that due to human error the complainant’s account was not updated correctly to record the “Do Not Call” requests. The company advised us that a review had been conducted on all “Do Not Call” requests handled by the team in question for the period from January 2016 to July 2016 to ensure that all opt-out requests had been completed correctly. It confirmed that the complainant’s details had been removed from the marketing database and it apologised for any inconvenience caused to her.

Prior to September 2015 Virgin Media Ireland Limited traded under the name UPC Communications Ireland Limited. That company had previously been prosecuted, convicted and fined in March 2011 and in April 2010 for twenty similar marketing offences involving telephone calls to subscribers who had not consented to the receipt of such marketing calls. The Data Protection Commissioner therefore decided to prosecute Virgin Media Ireland Limited in respect of the offences identified following the investigation of the latest complaint.

At Dublin Metropolitan District Court on 3 July 2017, Virgin Media Ireland Limited pleaded guilty to two charges of making unsolicited marketing telephone calls to its customer after she notified the company that she did not wish to receive such calls. The Court convicted the company on both charges and it imposed fines of €1,500 and €1,000 respectively on the charges. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

13)   Sheldon Investments Limited (trading as River Medical)

In September 2015 we received a complaint against Sheldon Investments Limited, trading as River Medical, from an individual who had received unsolicited marketing emails to which he had not consented and which were subsequent to his attempts to opt out of such emails. In making his complaint, the individual explained that he had previously had a consultation with River Medical during which he was obliged to complete a form. He stated that when completing the form he expressly stated that he did not wish to receive any marketing emails from them. He subsequently received a marketing email from River Medical in April 2015 and he replied to the email with a request that his address be removed from their marketing list immediately. He received confirmation two days later that his contact details were removed. Despite this, he received a further unsolicited marketing email from River Medical in September 2015 which prompted him to submit a complaint to the Data Protection Commissioner.

During our investigation of this complaint, River Medical told us that the failure to respect the complainant’s opt-out request was due to human error. It explained that it had made his file ‘inactive’ on receipt of his opt-out request, but it did not realise that it needed to manually delete his file in order to prohibit the sending of further marketing material to him. It assured us that on foot of our investigation of the complaint, the individual’s email address had been deleted from its systems. We concluded the investigation of that complaint in December 2015 with a warning to the company that it would likely be prosecuted if it committed any further offences under the marketing regulations.

One year later, in December 2016, the individual submitted a new complaint after he received a further unsolicited marketing email from River Medical. We investigated this complaint and we were informed once again that the latest infringement had been caused by human error in the selection of an incorrect mailing list on Newsweaver, the system used by the company to issue emails. The company apologised for the incident.

As we had previously issued a warning to the company, the Data Protection Commissioner decided to prosecute it in respect of the two unsolicited marketing emails issued in December 2016 and in September 2015. At Dublin Metropolitan District Court on 3 July 2017, Sheldon Investments Ireland Limited pleaded guilty to two charges of sending unsolicited marketing emails without consent. The Court sought the payment of €800 in the form of a charitable donation to Focus Ireland and it adjourned the matter. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

14)   Tumsteed Unlimited Company (trading as EZ Living Furniture)

In June 2016 we received a complaint from an individual who received unsolicited marketing text messages from EZ Living Furniture despite having, on three previous occasions, requested them to stop. The complainant informed us that she had made a purchase from the company in the past.

As part of our investigation of this complaint, we asked EZ Living Furniture to show us evidence of the consent of the complainant to receive marketing text messages in the first instance. We also sought an explanation as to why her requests to opt out had not been actioned.

In response to our investigation, EZ Living Furniture stated that, in respect of marketing consent, customers sign up to the company’s terms and conditions printed on the back of receipts. It drew our attention to one of the terms and conditions to the effect that customer information will be retained by the EZ Living marketing department and will be added to its database to be used for mailing lists and text messages. In relation to the complainant’s opt-out requests not being complied with, EZ Living Furniture explained that there had been a changeover of service providers and the new service provider had a different method for opting out. It claimed that it was totally unaware that the opt-out facility was not working until it received our investigation letter. It assured us that the opt-out issue had now been resolved and it said that it had sent an apology to the complainant. In our response to EZ Living Furniture, we advised it, in relation to customer consent, that while it was relying on terms and conditions of sale, it was in fact obliged by law to provide its customers with an opportunity to opt out of receiving marketing communications at the point of collection of their personal data. We pointed out that, in practice, this means that customers must be provided with an opt-out box for them to tick in order to opt out of marketing, if that is their wish. In a subsequent reply, the company informed us that it had examined the matter further and that it had decided to introduce a stamp that would be placed on the sales docket to provide a checkbox allowing customers to opt out of receiving marketing emails and text messages.

The Data Protection Commissioner had previously issued a warning to EZ Living Furniture in April 2015 following the investigation of a complaint from a different individual in relation to sending her unsolicited marketing text messages without consent. Consequently, the Data Protection Commissioner decided to prosecute the company in respect of the offences which came to light arising from the latest complaint.

At Galway District Court on 4 July 2017, Tumsteed Unlimited Company, trading as EZ Living Furniture, pleaded guilty to two charges of sending unsolicited marketing text messages without consent. The Court convicted the company and it imposed fines of €500 on each of the two charges. The company agreed to make a contribution towards the prosecution costs of the Data Protection Commissioner.

15)   Cunniffe Electric Limited.

In December 2016 an individual complained to us that he had recently received unsolicited marketing text messages from Cunniffe Electric Limited of Galway Shopping Centre despite the fact that he had been advised previously on foot of an earlier complaint to us that his mobile phone number had been removed from its marketing database. In early 2015 we had received the complainant’s first complaint in which he informed us that he had given his mobile phone number some years ago to Cunniffe Electric Limited to facilitate the delivery of an electrical appliance which he had purchased from the company. He stated that he did not give the company consent to use his mobile phone number for marketing purposes.

Following our investigation of the first complaint, we received confirmation from Cunniffe Electric Limited that the complainant’s mobile phone number had been removed from its marketing database. We concluded that complaint by issuing a warning to the company that it would likely face prosecution if it breached the marketing regulations again.

On receipt of the complainant’s second complaint, we commenced a new investigation in which we sought from Cunniffe Electric Limited an explanation for the sending of the latest marketing text messages in circumstances where we were previously informed that the complainant’s mobile phone number had been removed from its marketing database. In response, the company admitted that it did not have the consent of the complainant to send him marketing text messages. It said that his mobile number was not on its database but it appeared that there was an error on the part of the service provider that it was using to send marketing text messages and that this error arose from transition issues when the service provider was acquired by another company. It apologised for the inconvenience caused to the complainant.

As the company had previously received a warning, the Data Protection Commissioner decided to prosecute it in relation to the most recent offences. At Galway District Court on 4 July 2017, Cunniffe Electric Limited entered a guilty plea for the sending of an unsolicited marketing text message without consent. In lieu of a conviction and fine, the Court asked the company to make a contribution of €500 to the Court Poor Box and it then struck out the charges. The company agreed to make a contribution towards the prosecution costs of the Data Protection Commissioner.

16)   Argos Distributors (Ireland) Limited

Five individuals lodged complaints with us between December 2016 and February 2017 arising from difficulties they were experiencing in opting out of email marketing communications from Argos Distributors (Ireland) Limited. The complainants had supplied their email addresses in the context of making online purchases and they had not opted out of marketing communications at that point. However, when they subsequently attempted to opt out on receipt of marketing emails, the ‘unsubscribe’ system failed. Some complainants subsequently followed up by email to the company seeking to have their email addresses removed from the marketing database and they received responses by email to inform them that their requests had been actioned. However, they continued to receive further email marketing from Argos Distributors (Ireland) Limited.

In response to our investigation, the company acknowledged that its ‘unsubscribe’ system was not working properly for a period of time. It also discovered an issue in processing ‘unsubscribe’ requests for customers based in Ireland. It found that requests from Irish customers were being added to the ‘unsubscribe’ list for UK marketing. In all cases, it confirmed that the opt-out requests of the individuals concerned were now properly processed.
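As an illustration of the kind of safeguard that can prevent such mis-routing (and not a description of the company’s actual systems), the sketch below uses hypothetical per-market suppression lists in Python and checks the correct market’s list before any marketing send; the market codes and function names are assumptions made for the example.

```python
# Hypothetical per-market suppression lists; the market codes and function
# names are illustrative and not taken from the investigation.
suppression_lists = {
    "IE": set(),  # customers in Ireland who have unsubscribed
    "UK": set(),  # customers in the UK who have unsubscribed
}

def record_unsubscribe(email: str, market: str) -> None:
    # The request must be recorded against the customer's own market.
    suppression_lists[market].add(email.lower())

def may_send_marketing(email: str, market: str) -> bool:
    # Checked before every marketing send for that market.
    return email.lower() not in suppression_lists[market]

# The failure mode described above: an Irish customer's request recorded
# against the UK list leaves the check for Irish marketing sends unaffected.
record_unsubscribe("customer@example.com", "UK")         # mis-routed request
print(may_send_marketing("customer@example.com", "IE"))  # True: emails would continue
```

Routing each opt-out request to the suppression list for the customer’s own market, and checking that same list before every send, is what prevents the type of failure reported here.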

As the company had been warned previously in 2013 following the investigation of a similar complaint of a breach of the marketing regulations, the Data Protection Commissioner decided to prosecute it in relation to these offences. At Navan District Court on 14 July 2017, Argos Distributors (Ireland) Limited pleaded guilty to five charges of sending unsolicited marketing emails to five individuals without consent. In lieu of a conviction and fine, the Court ordered the defendant to contribute €5,000 to a charity of the Court’s choosing. The defendant agreed to pay the prosecution costs incurred by the Data Protection Commissioner.

17)   Expert Ireland Retail Limited

In October 2016 an individual complained to us about regular marketing text messages which she received from Expert Ireland Retail Limited. She informed us that in August 2014 she purchased a tumble dryer at the Expert Naas store and she stated that she gave her mobile phone number at the point of sale for the sole purpose of arranging the delivery of the appliance. She stated that she was not asked if she wished to receive marketing text messages and she did not request or agree to same. She informed us that she began receiving regular marketing text messages from December 2015 onwards and despite replying by text message on numerous occasions with the opt-out keyword, further text messages continued to arrive on her phone. She advised us that early in October 2016 her husband called to the Expert store in Naas and he asked the staff there to remove her number from their marketing database. Despite this request the complainant received a further marketing text message about two weeks later, prompting her to lodge a complaint with the Data Protection Commissioner.

In response to our investigation, the company claimed that the complainant would have been asked during the course of the sale if they would like to be contacted by text message for marketing purposes. However, it was unable to provide any evidence that the complainant was given an opportunity to opt out of marketing at the point of sale. Furthermore, it admitted that the sending of the first marketing message after a period of over twelve months had expired was an oversight. The company was unable to explain why no action was taken to remove the complainant’s mobile phone number from the marketing database after her husband called to the Naas store.

As the company had previously been issued with a warning in May 2010 on foot of a similar complaint which we received about unsolicited marketing text messages sent to a different former customer of the Expert store in Naas without her consent, the Data Protection Commissioner decided to prosecute this latest complaint. At Mullingar District Court on 13 October 2017, Expert Ireland Retail Limited pleaded guilty to one charge of sending an unsolicited marketing text message to the complainant without her consent. The Court convicted the company and it imposed a fine of €500. The defendant company agreed to pay the legal costs incurred by the Data Protection Commissioner in respect of this prosecution.

  • Prosecution of James Cowley Private Investigator
  • Disclosure of Personal Data to a Third Party in Response to a Subject Access Request
  • Data Breach at Retail and Online Service Provider
  • Prosecution of Yourtel for Marketing Offences
  • Prosecution of Glen Collection Investments Limited and One of its Directors
  • Prosecution of Shop Direct Ireland Limited T/A Littlewoods Ireland for Marketing Offences
  • Further Processing of an Individual's Personal Data in an Incompatible Manner
  • Disclosure of Personal Information to a Third Party by a Data Processor
  • The Necessity to Give Clear Notice When Collecting Biometric Data at a Point of Entry
  • Residential Care Home's Legitimate Use of Audio Recording and Photograph of Data Subject Concerning Allegations of Misconduct
  • Disclosure of Personal Information to a Third Party
  • Failure of a Data Controller to Keep Individual's Personal Information Accurate and Up to Date Which Resulted in the Disclosure of Personal Data to a Third Party
  • Failure by BOI to Properly Verify the Identity of Individual on the Phone Which Resulted in the Disclosure of Personal Information to a Third Party
  •  Data Controller Obliged to Demonstrate Effort Made to Locate Data Within the Statutory 40 Day Period
  • Personal Data Withheld from an Access Request by Airbnb on the Basis of an Opinion Given in Confidence
  •  Crypto Ransomware Attack on a Primary School
  • Data Breach at an Online Retailer
  •  Incorrect Association of an Individual's Personal Details with Another File
  • Prosecution of The Irish Times Limited for Marketing Offences
  • Prosecution of Coopers Marquees Limited for Marketing Offences
  • Prosecution of Robert Lynch T/A The Energy Centre for Marketing Offences
  • Prosecution of Paddy Power Betfair Public Limited Company for Marketing Offences
  •  Prosecution of Trailfinders Ireland Limited for Marketing Offences
  • Prosecution of Topaz (Local Fuels) Limited for Marketing Offences
  • Prosecution of Dermaface Limited for Marketing Offences 

1)   Prosecution of James Cowley Private Investigator

James Cowley was charged with sixty-one counts of breaches of the Data Protection Acts 1988 & 2003. All charges related to breaches of Section 22 of the Data Protection Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept and disclosing the data to another person. The personal data in question was kept by the Department of Social Protection and was disclosed to entities in the insurance sector – the State Claims Agency, Zurich Plc and Allianz Plc.

On 13 June 2016, at Dublin Metropolitan District Court, James Cowley pleaded guilty to thirteen sample charges. He was convicted on the first four charges and the Court imposed a fine of €1,000 in respect of each of these four charges. The remaining nine charges were taken into consideration in the sentence imposed.

The investigation in this case uncovered access by the defendant to social welfare records held on databases in the Department of Social Protection. To access these records, the defendant used a staff contact who was known to him. Mr. Cowley then used the information he obtained for the purposes of compiling private investigator reports for his clients. These activities continued for a number of years up to September 2015 when our investigation team first made contact with him about its concerns in relation to his processing of personal data.

2) Disclosure of Personal Data to a Third Party in Response to a Subject Access Request

An ex-employee of Stobart Air made a complaint to us in August 2015 regarding the unlawful disclosure of their redundancy details to another member of staff following an access request made by that person to the company. The complainant also informed us that they had themselves received third-party personal information in response to a subject access request which they had made to the company in May 2015.

Stobart Air, on commencement of our investigation, confirmed to us that a breach of the complainant’s data had occurred in November 2014. It stated that it had not initially notified the complainant of the breach when it first learned of it as it was unaware of the data protection guidelines that advise reporting disclosures to the data subjects involved where the disclosure involves a high risk to the individual’s rights, and requesting the third party in receipt of the information to destroy or return the data involved.

The complainant in this case declined an offer of amicable resolution and requested a formal decision of the Commissioner. In her decision the Commissioner found that Stobart Air, in including the complainant’s personal data in a letter to ex-employees, had carried out unauthorised processing and disclosure of the complainant’s personal data. This contravened Section 2A(1) of the Data Protection Acts 1988 and 2003, as the complainant’s personal information had been processed without the complainant’s consent or another legal basis under the Acts.

Stobart Air itself identified that it had inadequate training and safeguards around data protection in place, which it has since sought to rectify.

In a separate complaint received by the DPC in September 2015, we were notified that Stobart Air had disclosed financial data of a third party to the complainant in response to a subject access request. We proceeded to remind Stobart Air of its obligations as a data controller and Stobart Air identified a number of individuals who had been affected by these issues. Stobart Air subsequently notified all affected third parties of the breach of their personal data. However, in doing so, Stobart Air disclosed the complainant’s data: the letter notifying the individuals whose data had originally been disclosed divulged the fact that the complainant was the recipient of that data.

Stobart Air had no legal basis to disclose the complainant’s personal data to the third parties involved, nor did it have the consent of the individual affected. The disclosure of the complainant’s identity to the individuals affected by the original breach was unnecessary in the circumstances and in contravention of Section 2A(1) of the Data Protection Acts 1988 and 2003.

3) Data Breach at Retail and Online Service Provider

In July 2016, we received a breach report from an organisation providing retail and online services.

The organisation was the victim of a “brute force” attack, whereby over a two-week period the attackers tried various username/password combinations, some of which were successfully used to gain access to user accounts. Once accounts were accessed, the attackers attempted to withdraw user balances; these withdrawals were enabled by the attackers’ ability to add new payment methods. It was also possible for the attackers to access the personal data associated with each account.

On assessing the breach, we identified that the organisation had deficiencies in the measures it had taken to secure users’ personal data including:

  • Insufficient measures on password policy and user authentication;
  • Insufficient control measures to validate changes to a user’s account; and
  • Insufficient control measures on the retention of dormant user accounts.

We considered that the organisation contravened Section 2(1)(d) of the Data Protection Acts 1988 and 2003 by failing to take appropriate security measures against unauthorised access to, or unauthorised alteration, disclosure or destruction of, its users’ personal data.

Recommendations were issued to the organisation that it take steps to mitigate the deficiencies identified or face enforcement action. The organisation subsequently informed us that it had taken the following steps based on our recommendations:

  • Implementation of authentication requiring more than one factor; and
  • Implementation of a comprehensive data retention policy.

This case highlights the need for organisations to ensure that they have appropriate technical, organisational and security measures in place to prevent loss of data through “brute force” or password-reuse attacks. In this scenario, the use of appropriate access and authentication controls, such as multifactor authentication, network rate limiting and logon alerts, could have mitigated the risks, as illustrated in the sketch below. Further, poor retention policies provide an “attack vector” for attackers, such as the dormant accounts used as a means of entry in this breach.
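
By way of illustration only (this sketch does not form part of the breach report, and the threshold and lockout window are assumed values), the following minimal Python example shows one common rate-limiting control: temporarily locking an account after repeated failed logins and raising an alert when the threshold is reached.

```python
import time
from collections import defaultdict

MAX_FAILURES = 5           # assumed threshold, not taken from the case study
LOCKOUT_SECONDS = 15 * 60  # assumed lockout window

failed_attempts = defaultdict(list)  # username -> timestamps of recent failed logins

def login_allowed(username: str) -> bool:
    """Return False while the account is temporarily locked after repeated failures."""
    now = time.time()
    recent = [t for t in failed_attempts[username] if now - t < LOCKOUT_SECONDS]
    failed_attempts[username] = recent
    return len(recent) < MAX_FAILURES

def record_failed_login(username: str) -> None:
    """Record a failed password check and alert when the lockout threshold is reached."""
    failed_attempts[username].append(time.time())
    if len(failed_attempts[username]) == MAX_FAILURES:
        print(f"ALERT: possible brute-force attempt against account {username}")
```

Controls of this kind complement, rather than replace, multifactor authentication and the timely removal of dormant accounts.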

4) Prosecution of Yourtel for Marketing Offences

We received a complaint in December 2014 from an individual who received marketing telephone calls from Yourtel Limited, a telephone service provider which entered the Irish market in 2013, after he had instructed the company during a previous call not to call him again. The complainant informed us that the calls related to an offer to switch telephone service providers.

In February 2015 a separate complaint was received on behalf of another individual who received marketing telephone calls from Yourtel Limited after the company had been instructed during a similar marketing call on Christmas Eve 2014 not to call his number again. The marketing calls to this individual also concerned switching telephone service provider.

During our investigation of these complaints Yourtel Limited acknowledged the making of the marketing telephone calls. It claimed that it blocked the telephone numbers from receiving further marketing calls on the occasion of the last call in each case when it was informed by the individuals concerned that they did not wish to be contacted again for marketing purposes. It did not accept in either case that it continued to call the individuals after they had instructed Yourtel Limited not to call them again.

The Data Protection Commissioner decided to prosecute the offences as Yourtel Limited had come to our attention previously in 2014 on foot of a complaint about the making of a marketing telephone call to a telephone number which stood recorded on the National Directory Database (NDD) Opt Out Register. Following the investigation of that complaint, we warned the company that it would likely face prosecution if it committed further offences under Regulation 13 of SI 336 of 2011 (known as the ePrivacy Regulations) at any future time.

At Dublin Metropolitan District Court on 21 January 2016 Yourtel Limited pleaded guilty to two charges of making unsolicited marketing telephone calls after the two individuals it called had notified the company that they did not consent to the receipt of such calls. The Court convicted the company on both charges and it imposed two fines of €2,500 each. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

5) Prosecution of Glen Collection Investments Limited and One of its Directors

The investigation in this case established that the defendant company obtained access to records held on computer databases in the Department of Social Protection over a lengthy period of time and that a company director used a family relative employed in the Department of Social Protection to access the records. The defendant company had been hired by a Dublin-based firm of solicitors to trace the current addresses of bank customers that the respective banks were interested in pursuing in relation to outstanding debts. Having obtained current address information or confirmed existing addresses of the bank customers concerned from the records held by the Department of Social Protection, the defendant company submitted trace reports containing this information to the firm of solicitors which acted for the banks. The case came to light on foot of a complaint which we received in February 2015 from a customer of AIB bank who alleged that an address associated with him, which was known only to the Department of Social Protection, had been disclosed by that department to an agent working on behalf of the bank.

The Data Protection Commissioner decided to prosecute both the company and the director in question, Mr Michael Ryan. Glen Collection Investments Limited was charged with seventy-six counts of breaches of the Data Protection Acts, 1988 & 2003. Sixty-one charges related to breaches of Section 19(4) of the Data Protection Acts for processing personal data as a data processor while there was no entry recorded for the company in the public register which is maintained by the Data Protection Commissioner under Section 16(2) of the Data Protection Acts. Fifteen charges related to breaches of Section 22 of the Data Protection Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept and disclosing the data to another person.

Mr. Michael Ryan, a director of Glen Collection Investments Limited, was separately charged with seventy-six counts of breaches of Section 29 of the Data Protection Acts, 1988 & 2003 for his part in the offences committed by the company. This Section provides for the prosecution of company directors where an offence by a company is proved to have been committed with the consent or connivance of, or to be attributable to any neglect on the part of the company directors or other officers.

The cases against Glen Collection Investments Limited and its director were called in Tuam District Court in January, May and July of 2016 before the defendants eventually entered guilty pleas on 10 October 2016. While the defendant company was legally represented in court on all occasions, the Court issued a bench warrant for the arrest of the company director, Mr Ryan, on 10 May 2016 after he had twice failed to appear. The bench warrant was executed at Tuam District Court on 10 October, 2016 prior to the commencement of that day’s proceedings.

At Tuam District Court on 10 October 2016 Glen Collection Investments Limited pleaded guilty to twenty-five sample charges – thirteen in relation to offences under Section 22 and twelve in relation to offences under Section 19(4). The company was convicted on the first five counts with the remainder taken into consideration. The court imposed five fines of €500 each. Mr. Ryan pleaded guilty to ten sample charges under Section 29. He was convicted on all ten charges and the court imposed ten fines of €500 each. In summary, the total amount of fines imposed in relation to this prosecution was €7,500.

6) Prosecution of Shop Direct Ireland Limited T/A Littlewoods Ireland for Marketing Offences

In January 2015 we received a complaint against Shop Direct Ireland Limited T/A Littlewoods Ireland from an individual who received an unsolicited marketing email after she opted out of marketing from the company. The individual, who was a customer of Littlewoods Ireland, complained further a few weeks later when she received a marketing email promoting offers for Mother’s Day from Littlewoods Ireland. We had previously issued a warning to Littlewoods Ireland in December 2014 following the investigation of a complaint received from the same complainant with regard to unsolicited marketing emails which she had received after she opted out of receiving marketing. That previous complaint led to an investigation which found that the customer had not been given the opportunity to opt out of marketing from Littlewoods when she opened her account. (She had been given the opportunity to opt out from third party marketing only – an option which she availed of). Arising from our investigation of that complaint, Littlewoods Ireland informed us that the customer’s email address was opted out of direct marketing from 7 March, 2014.

The Data Protection Commissioner decided to prosecute the company. At Dublin Metropolitan District Court on 4 April 2016 Shop Direct Ireland Limited T/A Littlewoods Ireland pleaded guilty to one charge of sending an unsolicited marketing email without consent. The Court ordered the payment of €5,000 in the form of a charitable donation to Pieta House and it adjourned the matter for seven weeks. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charge.

7) Further Processing of an Individual's Personal Data in an Incompatible Manner

An individual submitted a complaint regarding the unfair processing of their personal data. The individual stated that they had received letters from Thornton’s Recycling and Oxigen Environmental respectively explaining that there would be a change-over of refuse collection services from Oxigen Environmental to Thornton’s Recycling within a week of the issuing of the letters. The complainant advised that they had not authorised the transfer of their personal details and had not been previously informed of this transfer of ownership.

We raised the matter with Oxigen Environmental, requesting an explanation as to the reason for processing personal data in this manner in light of the data protection requirements of fair obtaining and fair processing of personal data. Oxigen Environmental confirmed that the customer details that were transferred to Thorntons consisted of a name, an address and any balance that remained on the customer’s pre-paid account. It advised that no banking details were passed over at any stage. It also alleged that a letter had been sent out to all customers advising them of the transfer and that this letter had been issued before any customer data had been transferred, but it was not able to confirm the date on which this allegedly occurred.

Oxigen Environmental indicated that the first and only notification that customers received regarding the transfer of services from Oxigen Environmental to Thorntons Recycling was made by way of two letters, one each from Oxigen Environmental and Thorntons Recycling, contained in the same envelope delivered to customers. The interval between this notification and the transfer of services spanned less than four working days. We considered that this was an insufficient timeframe for customers to consider the change-over and to make alternative arrangements to prevent the further processing of personal data. Whilst the issue of takeovers/mergers is often covered by a company’s contractual terms with its customers, we established that Oxigen Environmental’s terms and conditions and Customer Charter did not cover such issues.

Taking into account the short timeframe that had elapsed between the notification of the transfer of services and the date from which the transfer became effective, our view was that the fair processing requirements under the Acts were not fulfilled. Whilst a proposal for amicable resolution was put forward, we were unable to conclude an amicable resolution of the complaint and a formal decision of the Commissioner issued in July 2016. The Commissioner found Oxigen Environmental to be in contravention of Section 2(1)(a) of the Data Protection Acts 1988 and 2003 in that it unfairly processed personal data without sufficient notice to its customers.

The requirement to provide proper notice of processing to data subjects in accordance with Section 2(1)(a) and Section 2D of the Data Protection Acts 1988 and 2003 is an essential pre-requisite to the lawful processing of personal data. A data subject has the right to be properly informed with adequate notice of a change in the ownership of a business holding his or her personal data, in order to be able to withdraw from the services being provided and prevent the further processing of their personal data (including preventing the transfer to a new owner) and to make alternative arrangements. The issue of what constitutes adequate notice will vary from case to case but in any event it must be at minimum a sufficient period that will allow a data subject to have a meaningful opportunity to consider the changes contemplated and to take steps to exercise their preferences in relation to the proposed changes.

8) Disclosure of Personal Information to a Third Party by a Data Processor

We received a complaint concerning the alleged unauthorised disclosure of the complainant’s personal information by An Post to a third party. The complainant, who had recently been bereaved, informed us that An Post had erroneously issued, to a solicitor acting on behalf of their late partner’s son, a valuation statement in respect of a joint savings deposit account that the complainant had previously held with their late partner. The statement contained the complainant’s personal financial data in relation to their joint State Savings account held with the National Treasury Management Agency (NTMA). Prior to making the complaint to this Office, the complainant had received an apology from An Post, on behalf of the NTMA, which acknowledged that the complainant’s personal information had been disclosed in error. However, because the complainant had received very little information as to how the disclosure had occurred, they requested that we investigate the matter.

Although the complainant submitted a complaint against An Post, we established in our preliminary examination that An Post offers products and services on behalf of State Savings, which is the brand name used by the NTMA to describe the range of savings products offered by the NTMA to personal savers. An Post is therefore a "data processor" as defined under the Data Protection Acts 1988 and 2003, as it processes customers’ personal data on behalf of the NTMA. The NTMA is the "data controller" as defined under the Acts, as it controls the content and use of its customers’ personal data for the purposes of managing their State Savings accounts.

We commenced an investigation by writing to the NTMA, which did not contest the fact that the complainant’s personal information had been disclosed. The NTMA stated that, having received a full report from its data processor, An Post, it had confirmed that, contrary to State Savings standard operating procedures, a valuation statement, which included details of an account held jointly by the complainant and their deceased partner, was sent to a solicitor acting on behalf of a third party. The NTMA acknowledged that the information should not have been sent to the third party and that correct procedures were not followed in this instance by the data processor.

The complainant chose not to accept an apology and goodwill gesture from the NTMA as an amicable resolution of their data protection complaint, opting instead to seek a formal decision of the Data Protection Commissioner.

A decision of the Data Protection Commissioner issued in July 2016. In her decision, the Commissioner formed the opinion that the NTMA contravened Section 2A(1) of the Data Protection Acts 1988 and 2003 by processing the complainant’s personal information without their consent by way of the disclosure, by An Post as an agent of the NTMA, of the complainant’s personal information to a third party.

This case illustrates that it is vital for data controllers to ensure that their policies and procedures for the protection of personal data are properly and routinely adhered to by all staff. Staff awareness is key to this issue, but employers should also ensure that regular reviews of how those policies and procedures are applied in practice are carried out, so as to identify potential issues and enable the taking of appropriate remedial actions or changes to the practices, policies and procedures.

9) The Necessity to Give Clear Notice When Collecting Biometric Data at a Point of Entry

In October 2015, we received a complaint from a contractor in relation to the alleged unfair obtaining and processing of their personal data. The complainant stated that, in the course of attending a data centre for work-related purposes, the data controller operating the data centre had collected their biometric data without their consent and had also retained their passport until they had completed the training course. While the complainant had been advised in advance by the data controller to bring identification on the day of attendance at the data centre for security purposes, they had not been informed at that time that the data controller would be collecting their biometric data upon arrival at the data centre.

In the course of our investigation, we established that the data controller had collected the complainant’s biometric data upon their arrival at the data centre by way of a fingerprint scan. However, no information about this process had been provided to the complainant at that time; they were simply told that they could not go through security without this biometric fingerprinting. The data controller confirmed to us that the fingerprint scan data had not been retained; rather, it had been used to generate a numerical template which was then stored in encrypted form, and that numerical information was associated with a temporary access badge provided to the complainant for the duration of their attendance at the data centre. The data controller confirmed that it had deleted this information from its system and back-up files at the data subject’s request upon the data subject’s departure from the data centre. The data controller further confirmed that, while it had retained the complainant’s passport for the duration of the complainant’s attendance at the data centre pursuant to a policy to ensure the return of temporary access badges, it had not taken or retained a copy of the complainant’s passport.

The complainant in this case did not wish to accept the offer of amicable resolution made by the data controller and instead requested that the Commissioner make a formal decision on their complaint.

The decision by the Data Protection Commissioner in October 2016 found that the data controller contravened Section 2(1)(a) and Section 2D(1) of the Data Protection Acts 1988 and 2003, as the data controller should have informed the complainant of the purposes of the collection and processing of the biometric data, the period for which it would be held and the manner in which it would be retained, used and, if applicable, disclosed to third parties. This could have been done by the data controller either when it was in contact with the complainant to advise them of the requirement to bring identification to gain entry to the data centre or, at the latest, at the time the complainant arrived at the data centre.

However, in relation to the obtaining and processing of the complainant’s biometric data, having reviewed the information provided by the data controller in the course of the investigation by this Office, the Data Protection Commissioner found that the data controller had a legitimate interest under Section 2A(1)(d) of the Acts in implementing appropriate security procedures for the purposes of safeguarding the security of the data centre, in particular for the purposes of regulating and controlling access by third parties to the data centre. Given that the biometric data was used solely for the purposes of access at the data centre, was not transferred to any other party and was deleted in its entirety at the data subject’s request upon departing the data centre, the Data Protection Commissioner’s view was that this did not amount to potential prejudice which outweighed the legitimate interests of the data controller in protecting the integrity of the data centre and preventing unauthorised access to it. Accordingly, the Data Protection Commissioner concluded that the data controller had a legal basis for processing the complainant’s biometric data.

In relation to the retention of the complainant’s passport for the duration of their visit at the data centre, the Commissioner found that this did not give rise to any contravention of the Data Protection Acts 1988 and 2003, as the data controller had a legitimate interest in doing so and the limited processing of the complainant’s passport information (i.e. the retention of the passport itself) did not give rise to any disproportionate interference with the complainant’s fundamental rights.

Transparency is a key principle under data protection law and the giving of notice of processing of personal data to a data subject is a major element of demonstrating compliance with this principle. In particular, the central tenet that individuals whose data is collected and processed should not generally be “surprised” at the collection and processing or its scale or scope, should inform all aspects of a data controller’s data processing operations.

10) Residential Care Home's Legitimate Use of Audio Recording and Photograph of Data Subject Concerning Allegations of Misconduct

We received a complaint from a former employee of a residential care home who claimed that photographic evidence and an audio recording of them were used in a disciplinary case against them by their employer resulting in their dismissal.

During our investigation, the complainant’s former employer (the operators of the residential care home) advised us that a formal, externally led investigation had been conducted into allegations that the complainant had been found by a supervisor to be asleep during a night shift on two separate occasions. On the nights in question, the complainant had been the sole staff member on duty responsible for the care of a number of highly vulnerable and dependent adults who had complex medical and care needs and who needed to be checked regularly. Having discovered the complainant asleep on the first occasion, the supervisor had warned the complainant that if it happened again it would be reported in line with the employer’s grievance and disciplinary procedure. On the second occasion, when the supervisor discovered the complainant to be asleep, fully covered by a duvet on a recliner with the lights in the room dimmed and the television off, the supervisor had used their personal phone to take photographs of the complainant sleeping and make a sound recording of the complainant snoring. The allegations had been upheld by the investigation team and a report prepared. This was followed by a disciplinary hearing convened by the employer. The employer had informed the complainant at that hearing that it accepted the verbal and written account given by the supervisor. The employer had found that the act of sleeping on duty constituted gross misconduct in light of the vulnerabilities and dependencies of the clients in the complainant’s care and the complainant had been dismissed.

Having regard to the information supplied to us by the operators of the residential care home and, in particular, the vulnerability of the clients involved and the nature of the complainant’s duties, we formed the view that no breach of the Data Protection Acts 1988 and 2003 had occurred. In this case, we considered that the processing of the complainant’s data, by way of the photograph and audio recording made by the supervisor, and the subsequent disclosure of these to the employer was necessary for the purposes of the legitimate interests pursued by the data controller, the employer, under Section 2A(1)(d) of the Data Protection Acts 1988 and 2003. This legal basis for processing requires the balancing of the data controller’s (or a third party’s or parties’) legitimate interests against the fundamental rights and freedoms or legitimate interests of the data subject, including an evaluation of any prejudice caused to those rights of the data subject.

We considered that the processing of personal data here was limited in nature and scope as it consisted of a one-off taking of a photograph and the making of an audio recording by the supervisor, who acted of their own volition and not in response to any direction or request from the employer. There had been limited further disclosure of the personal data concerned afterwards, i.e. to the employer, while the original photograph and recording were deleted from the supervisor’s phone. A copy of the material had also been provided to the complainant in advance of the complainant meeting the investigation team. We therefore considered that, in the circumstances, the processing was proportionate and that the legitimate interests of the data controller (and indeed the legitimate interests of third parties, being the clients of the residential care home) outweighed the complainant’s right to protection of their personal data.

While the right to protection of one’s personal data attracts statutory protection within the national legal system and, moreover, is a fundamental right under EU law, such rights are not absolute. Accordingly, they must be interpreted to allow a fair balance to be struck between the various rights guaranteed by the EU legal order. In particular, as this case demonstrates, data protection rights should not be used to ‘trump’ the rights of particularly vulnerable members of society or the legitimate interests pursued by those organisations responsible for safeguarding the health and life of such persons in discharging their duties of care and protection.

11) Disclosure of Personal Information to a Third Party

We received two complaints from public servants (a husband and wife) whose personal data was disclosed by PeoplePoint, the human resources and pension shared services for public service employees. The initial complainant, in November 2015, stated that after applying for annual leave, he subsequently made an application to change this request to sick leave. The officer in PeoplePoint responsible for this section emailed the complainant’s line manager at the government department in which the complainant worked. However, on receiving an ‘out-of-office’ reply, the officer then emailed a non-supervisory peer of the complainant. PeoplePoint had notified us of the breach in June 2015. However, on commencing an investigation and receiving a copy of the email at the centre of the breach, we established that the personal data of the complainant’s spouse, who was also a public servant in a different department, was also contained in the email and that the email had been sent to three third parties. It became apparent that the official in PeoplePoint, when considering the initial complainant’s annual leave, had also accessed his spouse’s personal information without the authorisation of her employer or her consent.

Upon further investigation into this matter, it became apparent that the PeoplePoint official had disclosed information relating to the complainant to the complainant’s spouse and their colleagues when there was no legal basis for doing so and no authority from the data controller of their personal data, i.e. the employer.

PeoplePoint was subject to an audit by this Office. In relation to this complaint, it informed us that upon being made aware of the breach, it acted to retrieve the data and confirmed that the data had been deleted by all parties involved. It also stated that corrective action had been taken to improve the relevant official’s awareness of data privacy. Whilst PeoplePoint put forward a proposal for amicable resolution, the complainants declined it and requested a formal decision of the Commissioner.

The Commissioner formed the opinion that Section 21(1) of the Data Protection Acts 1988 and 2003 had been contravened. PeoplePoint is a data processor engaged by the data controller (the relevant government department, which is the employer) and as such owes a duty of care to the data subjects whose personal data it processes. Under Section 21, a data processor must not disclose personal data without the prior authority of the data controller on behalf of whom the data are processed.

This case is a stark reminder to data processors of the importance of processing personal data only with the prior authority of the data controller or the consent of the data subject. Actions in relation to personal data which may appear innocuous to ill-informed staff can have serious ramifications for data subjects. It is not acceptable for data processors and data controllers to rely on an excuse that an employee did not realise that what they were doing was a breach of data protection law. It is the responsibility of such employers to ensure that all staff are appropriately trained and supervised in relation to the processing of personal data, in order to minimise, to the greatest degree possible, the risks to the fundamental rights and freedoms of the data subjects whose personal data they process.

12) Failure of a Data Controller to Keep Individual's Personal Information Accurate and Up to Date Which Resulted in the Disclosure of Personal Data to a Third Party

We received a complaint in February 2015 concerning the alleged unauthorised disclosure by Permanent TSB (PTSB) of the data subject’s personal information to a third party. In this complaint the data subject stated that she had lived at a property with her ex-husband, that the mortgage for this property was a joint account in both her and her ex-husband’s names and that she was subsequently removed from this mortgage as part of a divorce settlement. The data subject informed this Office that she subsequently took out a separate mortgage with PTSB, solely in her own name, for a different property. However, PTSB had sent a letter of demand addressed to her both at her new property and at a third-party property with which she had never been associated. The complainant’s ex-husband had been raised at this property; his stepmother was still living there and she had opened the PTSB letter of demand and notified her stepson (the data subject’s ex-husband), who in turn had notified the data subject. We commenced an investigation and PTSB accepted that the data subject’s personal data had been disclosed to a third party. PTSB informed us that this had occurred because the third-party address (which the data subject had provided to PTSB as a correspondence address when applying for the previous loan which she held with her ex-husband) was incorrectly linked to the entirely separate subsequent mortgage loan in the data subject’s sole name.

We sought an amicable resolution of this complaint, but the proposal which PTSB made to the data subject was declined and she instead sought a formal decision of the Commissioner.

The Commissioner found that PTSB had contravened both Section 2A(1) of the Data Protection Acts 1988 and 2003 by processing the data subject’s personal data without her consent or another legitimate basis for doing so and also Section 2(1)(b) by failing to keep her personal data accurate, complete and up to date.

The circumstances of this complaint are a case in point as to the rationale behind the principle that personal data must be kept accurate, complete and up to date. Failure to adhere to this principle, particularly in the context of contact information, perpetuates the risk that further data protection failures (such as unauthorised disclosure to third parties) will flow from such non-compliance.

13) Failure by BOI to Properly Verify the Identity of Individual on the Phone Which Resulted in the Disclosure of Personal Information to a Third Party

We received a complaint that Bank of Ireland (BOI) had disclosed the complainant’s personal information to a third party. BOI had notified the complainant of this disclosure, which occurred when, in an attempt to contact him regarding his account, a member of BOI staff called his mobile and did not get an answer. BOI stated that, as the staff member could not contact him on his mobile, they then attempted to contact him via the landline number listed on his account. According to BOI’s notification, the complainant’s mother had answered the phone and the BOI advisor requested to speak with the complainant, who shares his name with his father, and explained to the complainant’s mother that they could not discuss the account with her as she was not listed on the account. Because the advisor referred to the complainant by his surname as Mr X, his mother mistakenly thought the call was in relation to the account she held with her husband, who is also called Mr X. BOI’s position was that the complainant’s mother was adamant that she was listed on the account and that the advisor should therefore speak to her about it. Certain information was then provided to the complainant’s mother regarding his account.

We commenced the investigation of this complaint by writing to BOI asking it to confirm if it had already reported this breach to us, as is considered good practice under our “Personal Data Security Code of Practice”. BOI did not contest the fact that the complainant’s personal data had been disclosed and it confirmed that the breach had been previously reported to us. BOI indicated that some confusion had arisen due to the complainant’s father having the same name as him and having a banking relationship with the same bank branch, and that as a result of this confusion BOI had failed to properly identify the person with whom it was dealing and had disclosed the complainant’s personal information to a third party. BOI claimed that it was only made aware of the disclosure of his personal information when the complainant’s mother phoned the advisor later that day to inform BOI that the complainant was her son and that the information was in relation to his loan accounts. BOI also advised us that a letter of apology had been issued to the complainant.

The complainant in this case declined the offer of amicable resolution which was made by BOI and requested a formal decision of the Commissioner.

The Commissioner concluded in her June 2016 decision that BOI contravened Section 2A(1) of the Data Protection Acts 1988 and 2003 when it processed the complainant’s personal information without his consent by disclosing it to a third party.

This case is a further demonstration of how a simple failure by a staff member to rigorously adhere to the requirement to verify a data subject’s identity before disclosing their personal data can result in unauthorised disclosure of personal data. While the circumstances of this case involved the verbal unauthorised disclosure of personal data to a family member of the data subject concerned, this in no way makes it any less serious than if it had been a written disclosure to an unrelated third party.

14) Data Controller Obliged to Demonstrate Effort Made to Locate Data Within the Statutory 40 Day Period

We received a complaint from an individual concerning an access request which they had submitted to Meteor seeking a copy of their personal data and, in particular, the recordings of calls which they had made to Meteor Customer Care during a particular period. Meteor responded initially to the request by stating that only 10% of calls to its Customer Care line are recorded, that recordings are retained for 30 days and that there was no guarantee that the complainant’s calls from the previous 30 days had been recorded. Meteor subsequently replied to the complainant’s access request stating definitively that there were no calls recorded and available in relation to the complainant.

We commenced an investigation of the complaint, requesting information from Meteor in relation to the efforts it had undertaken to retrieve the call recordings which were the subject of the access request, as well as information on the locations and/or business units to which enquiries were made in relation to the requester’s access request. Meteor supplied us with a printout showing the searches undertaken and it responded that it did not hold any calls in relation to the complainant.

In this case, compliance with the statutory 40-day period for responding to an access request under the Data Protection Acts 1988 and 2003 was at issue. The complainant had made a valid access request to Meteor by email dated 24 August 2015. Meteor finally responded to the requester by email on 29 October 2015 with a substantive answer. This substantive response to the access request fell nearly four weeks outside the 40-day statutory period for responding. Furthermore, Meteor did not provide us with any evidence that it had commenced the search for the call recordings which the complainant had sought within that 40-day period, but instead chose to rely on its policy that only 10% of Customer Care line calls are recorded and simply assumed that the complainant’s calls had not been recorded.
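
As a simple sketch of the timeline (assuming the 40-day period is counted in calendar days from the date of the request, which is consistent with the figures reported in this case), the dates can be checked as follows:

```python
from datetime import date, timedelta

request_date = date(2015, 8, 24)     # valid access request made by email
response_date = date(2015, 10, 29)   # date of Meteor's substantive response

statutory_deadline = request_date + timedelta(days=40)
print(statutory_deadline)                           # 2015-10-03
print((response_date - statutory_deadline).days)    # 26 days late, i.e. nearly four weeks
```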

Despite attempting to resolve this complaint amicably, we were unable to do so and the data subject requested a formal decision from the Data Protection Commissioner. In her decision the Data Protection Commissioner concluded that Meteor had contravened the Data Protection Acts 1988 and 2003 by not responding to the complainant’s access request within the 40-day period provided for under Section 4(1)(a).

This case demonstrates that a data controller must not approach a valid data access request on a simple assumption that it does not hold the personal data which is sought. Irrespective of the circumstances of the request, any policies employed or assumptions held by a data controller, it must take all steps necessary to establish in fact whether the requested data is, or is not, held by the data controller and to respond substantively to the access request within the 40 day statutory period. The right of access of a data subject is one of the cornerstones to the protection of an individual's personal data and this right must not be stymied by the actions of data controllers, whether unintentional or otherwise.

15) Personal Data Withheld from an Access Request by Airbnb on the Basis of an Opinion Given in Confidence

We received a complaint in July 2016 from an individual (an Airbnb guest) concerning an access request which he had submitted to Airbnb. The essence of the complaint was that Airbnb had not provided the guest with a particular email about him which had been sent to Airbnb by the host of Airbnb accommodation which the guest had rented. That email related to a complaint by the host about the guest. In responding to the guest’s access request, Airbnb had withheld this email on the basis that it consisted of an expression of opinion given in confidence by the host.

Of relevance here was Section 4(4A)(a) of the Data Protection Acts 1988 and 2003 which allows for personal data which consists of an expression of opinion about the data subject by another person to be disclosed by the data controller to the data subject in response to an access request without the need to obtain the consent of the person who gave the opinion. Equally relevant was Section 4(4A)(b)(ii) of the Data Protection Acts 1988 and 2003 which provides for an exemption from the right of access to personal data where the personal data consists of the expression of an opinion about the data subject by another person which has been given in confidence or on the understanding that it could be treated as confidential.

We commenced an investigation which examined, in particular, whether the email in question from the host to the data controller, Airbnb, consisted of the expression of a confidential opinion by the host about the guest. We found that the content of the email in question was predominantly factual in nature. While one element of the email comprised an expression of opinion, there was no reference or indication in the email to an expectation on the part of the host that its contents would be kept confidential or not disclosed by Airbnb to the guest. In fact, we noted that in another email sent directly from the host to the guest, the host had indicated to the guest that they had contacted Airbnb about the guest.

While Airbnb was clearly trying to fairly balance the rights of the guest against the rights of the host in this case, it was our view based on our examination of the issues and communications involved that there was no evidence at all of an expectation or understanding by the host that their email about the guest would not be released to him. In those circumstances no exemption from the right of access applied under Section 4(4A)(b)(ii). Airbnb accepted our position and accordingly released the email in question to the guest. This allowed the complaint to be amicably resolved.

As this case demonstrates, before withholding personal data on the basis that it consists of the expression of an opinion given in confidence or on the understanding that it could be treated as confidential, a data controller must ensure that there is a solid basis for such an assertion. It is not enough for a data controller to simply assume that this was the case in the absence of any indication to this effect from the person who expressed the opinion.

Furthermore, the inclusion of an opinion which attracts this exemption does not mean that all other personal data which is contained within the same document is similarly exempt from the right of access. Rather, in the context of a full document of personal data, the data subject is entitled to access the personal data within it which is not an opinion given in confidence and the data controller may only redact the part or parts to which the exemption validly applies. Opinions about individuals in respect of which no expectation of confidentiality can be shown to apply, or indeed information which is simply confidential, are not exempt from an access request.

As outlined in our published guidance, an opinion given in confidence on the understanding that it will be kept confidential must satisfy a high threshold of confidentiality. Simply placing the word "confidential" at the top of the page, for example, will not automatically render the data confidential. In considering the purported application of this exemption to a right of access, we will examine the data and its context and will need to be satisfied that the data would not otherwise have been given but for this understanding of confidentiality.

16) Crypto Ransomware Attack on a Primary School

In October 2016, we received a breach report from a primary school that had been the victim of a “Crypto Ransomware” attack, whereby parts of the school’s information systems had been encrypted by a third party thereby rendering the school’s files inaccessible. These files contained personal details including names, dates of birth and Personal Public Service Numbers (PPSNs). A ransom was demanded from the school to release the encrypted files.

Our assessment of the attack identified that the school had deficiencies in the measures it had taken to secure pupils’ personal data including:

  • No policies or procedures were in place to maintain adequate backups;
  • No procedures or policy documents existed focusing on system attacks such as ransomware or viruses;
  • No contracts were in place with data processors (the ICT service providers) setting out their obligations, as required under Section 2C(3) of the Data Protection Acts 1988 and 2003, and, as a result, the actions taken by the ICT suppliers in response to the attack were inadequate; and
  • There was a lack of staff training and awareness of the risks associated with opening unknown email attachments or files.

We considered that the school had contravened the provisions of Section 2(1)(d) of the Acts, having failed to ensure that adequate security measures were in place to protect against the unauthorised processing and disclosure of personal data.

Recommendations were issued to the school that it take steps to mitigate the risks identified. The school subsequently informed us that it had taken the following steps based on the recommendations issued:

  • Implementation of a staff training and awareness programme on the risks associated with email and the use of personal USB keys;
  • Implementation of a contract review process to ensure appropriate contracts are in place with its ICT suppliers; and
  • Ensuring that any ICT support the school engages, whether sourced locally or as recommended by the Board, is performed by competent data processors.

This case demonstrates that schools, like any other organisation (commercial, public sector or private) operating electronic data storage systems and interacting online, must ensure that they have appropriate technical security and organisational measures in place to prevent loss of personal data and to ensure that they can restore data in the event of “Crypto Ransomware” attacks.

17) Data Breach at an Online Retailer

In July 2016, we received a breach report from an organisation operating retail and online sales. The organisation had been notified by a customer that their credit card had been used in a fraudulent transaction without their knowledge, which the customer believed arose from having provided payment details online to the organisation.

The organisation engaged an expert third party to conduct an analysis of its website. It was determined that the payments system on the website had been compromised by malware for the previous 6-8 weeks. The malware copied data entered by customers during the online payment stage to an external destination.

Our assessment of the breach identified deficiencies in the measures which the organisation had taken to secure users’ personal data, including the following:

  • No contract or service level agreement existed between the data controller and the data processor;
  • No steps were taken to ensure that the data processor was compliant with appropriate technical security and organisational measures;
  • No steps were taken to ensure that the server and website platform were maintained and that the software versions were up to date;
  • No steps were taken to ensure that appropriate user authentication and access control measures were in place;
  • No steps were taken to ensure that appropriate technical security was in place, such as secure configuration of the website platform, measures to detect malware, measures to monitor suspicious activity and measures to ensure regular backups were taken; and
  • No steps were taken to ensure that governance processes were in place, such as periodic reviews of the data processor and its technical security and organisational measures.

In light of the above, we considered that the organisation had contravened Section 2(1)(d) of the Data Protection Acts 1988 and 2003 by failing to take appropriate security measures against unauthorised access to, or unauthorised alteration, disclosure or destruction of, its users’ personal data.

Recommendations were issued to the organisation that it take steps to mitigate the risks identified. The organisation subsequently informed us that it had taken the following steps to address the recommendations:

  • Contracts are now in place to ensure that the appropriate technical security and organisational measures are in operation;
  • The organisation conducts regular reviews of the server and website platforms to ensure they are maintained and that the software versions are up to date;
  • The organisation conducts annual reviews by a third party expert to ensure compliance and to independently validate that the appropriate technical security and organisational measures are in place.

This case highlights the need for organisations to ensure that they have appropriate technical security and organisational measures for ICT security in place, particularly when engaging a data processor. Organisations should be cognisant of the measures outlined under Section 2C of the Acts to understand their obligations, in particular:

  • To ensure that appropriate security measures are in place;
  • To take reasonable steps to ensure that employees of the data controller, and any other persons associated with the processing (for example, employees of the data processor), are aware of their obligations;
  • To ensure that proper contractual agreements are in place governing the processing; and
  • To take reasonable steps to ensure compliance with those measures.

18) Incorrect Association of an Individual's Personal Details with Another File

We received a complaint concerning an alleged breach of an individual’s data protection rights by an insurance company.

During our investigation, the insurer (Insurer X) advised us that the complainant had in the past requested a quotation for household insurance from another insurance company (Insurer Y), the undertakings of which had been transferred to Insurer X. Insurer Y had failed to delete the quotation (the complainant had never proceeded to take out a policy) in line with its own data retention policy. In addition, Insurer Y had mistakenly linked the complainant’s personal details on the quotation to an insurance claim file in respect of a claim it had received from a person with an identical name.

When the transfer of Insurer Y's undertakings to Insurer X was being completed, the insurance claim file which mistakenly included the complainant as the claimant (rather than another individual who had the same name) was transferred to Insurer X. The claim, when later assessed, turned out to be fraudulent, and Insurer X had its solicitors write to the complainant advising that their claim had been found to be fraudulent and indicating the follow-up action which Insurer X intended to pursue to protect its interests.

At its centre, this case concerned sloppy handling of personal data. Many people in Ireland have the same name and there was no reason why the complainant’s personal details collected when the complainant obtained a quotation should have been added to an insurance claim file. Sufficient checks and balances should have existed in Insurer Y's data handling processes. However, the more significant issue that arose for this complainant is that they were unable to ascertain, prior to our involvement, how their details came to be in the possession of Insurer X and how the issue that arose had come about.

A number of contraventions therefore occurred in this case – a breach of the requirement of a reasonable retention period due to holding onto the quotation data longer than necessary and longer than was set out in the company’s own retention policy; unlawful further processing of the personal data by associating it with a claim file; failure to respond in a clear and timely manner to the complainant to explain how their data had been sourced and how it came to be processed in the way that it was. The complainant in this case suffered particularly serious consequences as they incurred significant legal costs in defending the accusation of making a fraudulent claim and the threat by Insurer X of instigating Circuit Court proceedings against them.

19) Prosecution of The Irish Times Limited for Marketing Offences

On 28 April 2015 we received a complaint from an individual who received an unsolicited marketing email earlier that day from The Irish Times Limited in the form of a “Get Swimming” newsletter. He explained that he signed up for the “Get Swimming” newsletter some months previously and he told us that he opted out after the receipt of the third or fourth issue by using the unsubscribe instruction at the bottom of the newsletter. However, he claimed that The Irish Times Limited continued to send him the “Get Swimming” newsletter each week thereafter and he continued to unsubscribe using the unsubscribe instruction. He informed us that he also emailed Customer Care in The Irish Times Limited on 21 April 2015 asking to be removed from the newsletter and warning that if not, he would report the matter to the Data Protection Commissioner. Customer Care responded on the same day stating that they would remove him from the newsletter immediately. However, he received a further newsletter one week later.

In response to our investigation, The Irish Times Limited stated that this was a once-off issue that arose from a human error in configuring the unsubscribe process, which had subsequently been fixed. It confirmed that sixty-four other users were affected. It informed us that a procedure had been put in place to prevent a recurrence.

The Data Protection Commissioner had previously issued a warning to The Irish Times Limited in November 2012 following the investigation of a complaint from a different individual in relation to marketing emails which he continued to receive after he had opted out of the receipt of such emails.

The Data Protection Commissioner decided to prosecute the company. At Dublin Metropolitan District Court on 4 April 2016, The Irish Times Limited pleaded guilty to one charge of sending an unsolicited marketing email without consent. The Court ordered the payment of €3,000 in the form of a charitable donation to Pieta House and it adjourned the matter for seven weeks. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charge.

20) Prosecution of Coopers Marquees Limited for Marketing Offences

In September 2015 we received a complaint from an individual about a marketing email which she received a few weeks earlier from Coopers Marquees Limited. The same individual had previously complained to us in January 2014 after she received a marketing email from that company which, she stated, she had not consented to receiving. During the course of our investigation of the first complaint, the company undertook to remove the individual’s email address from its marketing database. We concluded that complaint by issuing a warning to the company that the Data Protection Commissioner would likely prosecute it if it re-offended.

In response to our investigation of the second complaint, we were informed that a new marketing executive at the company had used an old version of the marketing database for a marketing campaign. This resulted in the offending marketing email being sent to the email address of the individual, even though her details had been removed more than a year earlier. The company accepted that it did not have consent to contact the individual concerned by email and it attributed the sending of the email to human error on the part of the new staff member. The Data Protection Commissioner decided to prosecute the company.

At Virginia District Court on 7 June 2016, Coopers Marquees Limited pleaded guilty to one charge of sending an unsolicited email without consent. The Court ordered a contribution in the amount of €300 as a charitable donation to Mullagh Scout Troop and it indicated that it would apply the Probation of Offenders Act in lieu of a conviction. The defendant company agreed to make a contribution towards the prosecution costs of the Data Protection Commissioner.

21) Prosecution of Robert Lynch T/A The Energy Centre for Marketing Offences

In January 2015 two individuals complained to us about unsolicited marketing calls which they received from The Energy Centre on their landline telephones. In the case of both complainants, their telephone numbers stood recorded on the National Directory Database (NDD) Opt-Out Register. In the case of the first complainant, he informed us that he received an unsolicited marketing call on 5 January 2015 during which the caller offered to arrange to conduct a survey of his home for the purpose of recommending energy saving initiatives that The Energy Centre could sell him. The complainant said that he told the caller not to call him again and he pointed out that his number was on the NDD Opt-Out Register. Three days later, the complainant received a further unsolicited marketing call from The Energy Centre. In the case of the second complainant, he received an unsolicited marketing phone call on 23 January 2015 from a caller from The Energy Centre who told him that there were sales agents in his area and that she wished to book an appointment for one of them to visit his home. The same complainant had previously complained to us in November 2013 having received an unsolicited marketing phone call from the same entity at that time. His first complaint was amicably resolved when he received a letter of apology, a goodwill gesture and an assurance that steps had been taken to ensure that he would not receive any further marketing calls.

During the course of our investigation of the two complaints received in January 2015, The Energy Centre explained that its IT expert had examined the matter and concluded that human error had occurred when someone transferred some telephone numbers from a non-contact list back into the system to be contacted.

The Data Protection Commissioner had previously issued a warning to The Energy Centre following the investigation of a complaint from a different individual in relation to unsolicited marketing calls which he received on his landline telephone while his number was recorded on the NDD Opt-Out Register.

The Data Protection Commissioner decided to prosecute. At Drogheda District Court on 21 June 2016, Robert Lynch T/A The Energy Centre pleaded guilty to three charges of making unsolicited marketing telephone calls to the telephone numbers of two individuals whose numbers were recorded on the NDD Opt-Out Register. In relation to the first case, where the complainant’s number was called on two occasions three days apart, the Court convicted the defendant in respect of the charge for the second telephone call, imposed a fine of €100 and took the other charge, relating to the first telephone call, into account. In relation to the second case, the Court applied the Probation of Offenders Act in respect of that charge. The defendant agreed to pay the prosecution costs incurred by the Data Protection Commissioner.

22) Prosecution of Paddy Power Betfair Public Limited Company for Marketing Offences

In June 2016 an individual complained to us about marketing text messages he was receiving from Paddy Power Betfair Plc and he also alleged that the ‘stop’ command at the end of the text messages was not working. He stated that he had never placed a bet with Paddy Power Betfair Plc but he recalled having used its Wi-Fi once.

During our investigation of this case, the company, in relation to the allegation that the ‘stop’ command was not working, admitted that there were technical issues with the opt-out service of its text provider and stated that it had acted immediately to rectify this once it became aware of it. On the matter of marketing consent, the company informed our investigation that the complainant had logged onto the Wi-Fi at its Lower Baggot Street, Dublin outlet in April 2016. It described how a user must enter their mobile phone number on the sign-in page, following which a PIN is sent to their phone which enables the user to proceed. After entering the PIN correctly, the customer is presented with a tick box to accept the terms of service, which include a privacy policy. Having examined the matter, we advised Paddy Power Betfair Plc that we did not see any evidence that the user was given an opportunity to opt out of marketing as is required by S.I. 336 of 2011 (the ePrivacy Regulations). We formed the view that the company was unable to demonstrate that the complainant had unambiguously consented to the receipt of marketing communications. The company understood our position and it undertook to work with its Wi-Fi providers to add the required marketing consent tick box on its registration page. It also immediately excluded all mobile phone numbers acquired through the Wi-Fi portals from further marketing communications.
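
By way of illustration only (this is a minimal hypothetical sketch and not a description of Paddy Power Betfair’s actual portal), a sign-up flow of the kind described above can capture an explicit, unticked-by-default marketing opt-in separately from acceptance of the terms of service, so that the controller can later demonstrate unambiguous consent. All names and fields below are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class WifiRegistration:
    # Hypothetical record of a Wi-Fi portal sign-up; not any company's real schema.
    mobile_number: str
    accepted_terms: bool                       # required in order to use the Wi-Fi
    marketing_opt_in: bool = False             # defaults to False: the opt-in box starts unticked
    consent_recorded_at: Optional[str] = None  # timestamp kept as evidence of the opt-in

def register_user(mobile_number: str, accepted_terms: bool, marketing_opt_in: bool) -> WifiRegistration:
    """Create a registration, recording marketing consent only where the user actively opted in."""
    if not accepted_terms:
        raise ValueError("The terms of service must be accepted to use the Wi-Fi.")
    registration = WifiRegistration(mobile_number=mobile_number, accepted_terms=True)
    if marketing_opt_in:
        registration.marketing_opt_in = True
        registration.consent_recorded_at = datetime.now(timezone.utc).isoformat()
    return registration

def may_send_marketing(registration: WifiRegistration) -> bool:
    """Marketing messages may be sent only where an explicit opt-in was recorded."""
    return registration.marketing_opt_in

# A user who only accepted the terms of service has not consented to marketing.
user = register_user("+353870000000", accepted_terms=True, marketing_opt_in=False)
assert may_send_marketing(user) is False
```

The design point the sketch is intended to show is that the marketing flag is kept separate from acceptance of the terms of service and defaults to off, which is the element the investigation found to be missing from the registration page.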

The Data Protection Commissioner decided to prosecute the company. A warning had previously been issued to the company in 2015 following the investigation of a complaint from a different individual who continued to receive marketing text messages after opting out.

At Dublin Metropolitan District Court on 28 November 2016, Paddy Power Betfair Plc pleaded guilty to one charge of sending an unsolicited marketing text message without consent and one charge of not providing the recipient with a valid means of opting out of the receipt of further marketing messages. In lieu of a conviction and fine, the Court ordered the defendant to contribute €500 to the Simon Community by 12 December 2016 and it adjourned the matter for two weeks. The company agreed to discharge the prosecution costs incurred by the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

23) Prosecution of Trailfinders Ireland Limited for Marketing Offences

A complaint was lodged with us in June 2016 by an individual who had received unsolicited marketing emails from Trailfinders Ireland Limited despite having previously been informed that her email address had been removed from the company’s marketing database in August 2015. In its response to our investigation, the company acknowledged that the offending emails were sent in error. It explained that it had received a written communication about a customer care issue from the complainant a few days prior to the sending of the marketing emails and that its Customer Care team had updated her case concerning that particular issue. This update triggered an automated process which inserted the complainant’s email address into its marketing database. Trailfinders Ireland Limited apologised for the system error and acknowledged that it should not have happened in any circumstances.
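
To illustrate the underlying design lesson (a hypothetical sketch only, not Trailfinders’ actual system), any automated process that adds addresses to a marketing database should first check a suppression list of addresses that have already opted out or been removed:

```python
# Hypothetical suppression-list check; all names and addresses are illustrative only.
suppression_list = {"former.customer@example.com"}  # addresses previously removed or opted out
marketing_database: set[str] = set()

def add_to_marketing_database(email: str) -> bool:
    """Add an address for marketing only if it is not on the suppression list."""
    address = email.strip().lower()
    if address in suppression_list:
        return False  # removed or opted-out addresses must never be re-added automatically
    marketing_database.add(address)
    return True

# An automated update (such as a customer-care case note) should not bypass the check.
assert add_to_marketing_database("new.customer@example.com") is True
assert add_to_marketing_database("Former.Customer@example.com") is False
```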

On foot of a previous complaint in 2015 against Trailfinders Ireland Limited from the same complainant concerning unsolicited marketing emails to which she had not consented, the Data Protection Commissioner had issued a warning to the company in January 2016. Following our investigation of the second complaint, the Data Protection Commissioner decided to prosecute the company.

At Dublin Metropolitan District Court on 28 November 2016, Trailfinders Ireland Limited pleaded guilty to two charges of sending unsolicited marketing emails without consent. In lieu of a conviction and fine, the Court ordered the defendant to contribute €500 to the Simon Community by 12 December 2016 and it adjourned the matter for two weeks. The company agreed to discharge the prosecution costs incurred by the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

24) Prosecution of Topaz (Local Fuels) Limited for Marketing Offences

In July 2016 an individual complained to us about an unsolicited marketing telephone call which he received on his mobile telephone from Topaz (Local Fuels) Limited. He had previously complained to us in November 2015 about marketing text messages which the company had sent him without his consent, and he informed us that, despite attempting to opt out by replying ‘Stop’, he continued to receive more text messages. In its response to our first investigation, the company said that the inclusion of the complainant’s mobile telephone number in its promotional campaign was the result of human error and it acknowledged the failure of its system to register his opt-out attempts. It informed us in February 2016 that it had removed the mobile phone number concerned from its marketing database. We concluded that complaint at the time with a warning to Topaz (Local Fuels) Limited. The Data Protection Commissioner subsequently decided to prosecute the company.

At Dublin Metropolitan District Court on 28 November 2016, Topaz (Local Fuels) Limited pleaded guilty to one charge of sending an unsolicited marketing text message without consent and one charge of not providing the recipient with a valid means of opting out of the receipt of further marketing messages. In lieu of a conviction and fine, the Court ordered the defendant to contribute €500 to Our Lady’s Children’s Hospital, Crumlin by 12 December 2016 and it adjourned the matter for two weeks. The company agreed to discharge the prosecution costs incurred by the Data Protection Commissioner. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Court struck out the charges.

25) Prosecution of Dermaface Limited for Marketing Offences

In August 2016 we received a complaint from a former customer of Dermaface Limited after she received an unsolicited marketing email. On foot of a previous complaint about unsolicited marketing emails, the complainant had been informed in 2014 that Dermaface Limited had removed her details from its marketing list. Our investigation sought an explanation from Dermaface Limited. It informed us that the marketing email which was the subject of the latest complaint was sent through the clinic’s software system, which it had purchased. It claimed that the new system contacted patients and former patients who had previously been opted out of receiving marketing communications from it. It admitted that the complainant was one of those patients/former patients who had been sent a marketing email. It sent an apology to the complainant.

Following an investigation in 2011 of a complaint from a different individual who received numerous marketing text messages from Dermaface Limited, the Data Protection Commissioner had issued a warning to the company. The Commissioner decided, therefore, to prosecute the company in respect of the latest offence.

At Dublin Metropolitan District Court on 28 November 2016 Dermaface Limited pleaded guilty to one charge of sending an unsolicited marketing email without consent. In lieu of a conviction and fine, the Court ordered the defendant to contribute €300 to Our Lady’s Children’s Hospital, Crumlin by 12 December 2016. The Court also indicated that it expected the company to discharge the prosecution costs incurred by the Data Protection Commissioner and it adjourned the matter for two weeks. At the adjourned hearing the defendant produced proof of payment of the charitable donation and the Data Protection Commissioner’s costs. The Court struck out the charge.

  • Marketing offences by MTS Property Management Limited – prosecution
  • Marketing offences by Greyhound Household – prosecution
  • Marketing offences by Imagine Telecommunications Business Limited – prosecution
  • Marketing offences by Eircom Limited – prosecution
  • Defence Forces Ireland – failure to keep data safe and secure
  • Further processing of personal data by a state body
  • Supermarket’s excessive use of CCTV to monitor member of staff
  • Disclosure of personal information to a third party by the Department of Social Protection
  • Covert CCTV installed without management knowledge
  • Danske Bank erroneously shares account information with third parties
  • Failure to update customer’s address compromises the confidentiality of personal data
  • Unfair use of CCTV Data

Case Study 1: Marketing offences by MTS Property Management Limited – prosecution

We received a complaint in February 2013 from an individual who received marketing SMS messages from MTS Property Management Limited advertising the company’s property-management services. The complainant informed us that she had dealt with the company on one occasion over five years previously but she did not consent to her mobile phone number being used for marketing purposes. She also pointed out that the SMS messages that she received did not provide her with a means of opting out.

Our investigation of this complaint became protracted as the company denied knowledge of the mobile number to which the SMS messages were sent and it denied knowledge of the account holder of the sending phone number. However, our investigation established sufficient evidence to satisfy us that MTS Property Management Limited was responsible for the sending of the marketing SMS messages to the complainant. We decided to prosecute the offences.

MTS Property Management Limited had come to our attention previously in the summer of 2010 when two individuals complained about unsolicited marketing SMS messages sent to them without consent and without the inclusion of an opt-out mechanism. Following the investigation of those complaints, we warned the company that it would likely face prosecution if it committed further offences under Regulation 13 of S.I. 336 of 2011 at any future time.

At Dublin Metropolitan District Court on 23 February 2015, MTS Property Management Limited pleaded guilty to one charge of sending an unsolicited marketing SMS without consent and it pleaded guilty to one charge of failing to include an opt-out mechanism in the marketing SMS. The Court convicted the company on both charges and it imposed two fines of €1,000 each. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

Case Study 2: Marketing offences by Greyhound Household – prosecution

In May 2014, we received a complaint against Greyhound Household from an individual who had received an unsolicited marketing phone call on his mobile telephone from the company’s sales department. The same individual had previously complained to us in December 2013 because he was receiving marketing SMS messages from Greyhound Household without his consent. He informed us that he had ceased being a customer of the company in May 2013. Arising from the investigation of the previous complaint, Greyhound Household had undertaken to delete the former customer’s details and it apologised in writing to him. On that basis, we concluded the matter with a formal warning to the effect that any future offences would likely be prosecuted.

On receipt of the latest complaint, we commenced a further investigation. Greyhound Household admitted that a telephone call was made to the complainant’s mobile phone number without consent but it was unable to explain why his details had not been deleted in line with the company’s previous undertaking. We decided to prosecute the offence.

At Dublin Metropolitan District Court on 23 February 2015, Greyhound Household pleaded guilty to one charge of making an unsolicited marketing phone call to a mobile phone number without consent. The Court applied Section 1(1) of the Probation of Offenders Act subject to the defendant making a charitable donation of €1,000 to Pieta House. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

Case Study 3: Marketing offences by Imagine Telecommunications Business Limited – prosecution

In March 2015, we received a complaint against Imagine Telecommunications Business Limited from a company that had received unsolicited marketing telephone calls. The same company had previously complained to us in 2014 about repeated cold calling to its offices. Despite having submitted an opt-out request to Imagine Telecommunications Business Limited, it continued to receive marketing phone calls. Following our investigation of the first complaint, and having been assured that the phone number of the complainant company had been removed from the marketing database, we issued a formal warning to Imagine Telecommunications Business Limited that any future offences would likely be prosecuted.

On investigating the current complaint, we were informed by Imagine Telecommunications Business Limited that it had failed to mark the telephone number concerned as ‘do not contact’ on the second of two lists on which it had appeared. This led to the number being called again in March and June 2015. It stated that the number was called after the previous warning solely because of this error and it said that it took full responsibility for it.

We prosecuted the offences at Dublin Metropolitan District Court on 2 November 2015. Imagine Telecommunications Business Limited pleaded guilty to one charge of making an unsolicited marketing telephone call without consent. The Court applied Section 1(1) of the Probation of Offenders Act conditional upon a charitable donation of €2,500 being made to the Merchant’s Quay Project. Prosecution costs were recovered from the defendant.

Case Study 4: Marketing offences by Eircom Limited – prosecution

We received complaints from two individuals in February and April 2015 concerning marketing telephone calls that they had received on their landline telephones from Eircom Limited. In both cases, and prior to lodging their complaints, the individuals had submitted emails to Eircom Limited requesting that they not be called again. Eircom’s Customer Care Administration Team replied to each request and informed the individuals that their telephone numbers had been removed from Eircom’s marketing database. Despite this, each individual subsequently received a further marketing telephone call in the following months, thus prompting their complaints to this Office.

Eircom informed our investigations that the agents in its Customer Care Administration Team who handled the opt-out requests had not updated the system to record the new marketing preference after sending out the replying email to the individuals concerned. It undertook to provide the necessary refresher training to the agents concerned.

Separately, a former customer of Eircom complained in May 2013 that he continued to regularly receive unsolicited marketing phone calls from Eircom on his landline telephone despite clearly stating to each caller that he did not wish to receive further calls. He stated that the calls were numerous and that they represented an unwarranted intrusion into his privacy. Eircom made a further ten marketing telephone calls to the individual after the commencement of our investigation of this complaint. Our investigation subsequently established that this former customer had received over 50 marketing contacts from Eircom since 2009, when he ceased to be an Eircom customer. Eircom explained that the continued calls arose from a misunderstanding as to which of its systems the former customer’s telephone number should have been opted out of.

In October 2014, an Eircom customer complained that he had received a marketing SMS from Eircom that did not provide him with a means to opt out of receiving further marketing SMS messages. Eircom informed our investigation of this complaint that the inclusion of an opt-out is the norm in all of its electronic-marketing campaigns but, in this instance, and due to human error, the link to the necessary opt-out had not been set properly. Our investigation established that this error affected over 11,600 marketing messages that were sent in the campaign concerned.

We proceeded to prosecute the offences identified on foot of the complaints received in the aforementioned cases. At Dublin Metropolitan District Court on 2 November 2015, Eircom Limited pleaded guilty to six charges of making unsolicited marketing calls without consent and it pleaded guilty to one charge of sending a marketing SMS without a valid address to which the recipient may send an opt-out request. The Court applied Section 1(1) of the Probation of Offenders Act conditional on the defendant making donations amounting to €35,000 as follows: €15,000 to Pieta House, €10,000 to LauraLynn (Children’s Hospice) and €10,000 to Our Lady’s Children’s Hospital, Crumlin. The company agreed to pay the prosecution costs incurred by this Office.

Case Study 5: Defence Forces Ireland – failure to keep data safe and secure

A member of the Defence Forces made a complaint to this Office that certain personal data relating to him was not kept safe and secure by the Defence Forces.

The circumstances of the individual’s complaint to our Office arose when a Military Investigating Officer (MIO) was appointed to review an internal complaint made by him as a member of the Defence Forces. Subsequently, the Defence Forces Ombudsman was appointed to review the process by which the complaint had been handled and, during the course of that review, it was ascertained that the MIO could not supply the notes of an interview he had conducted with the complainant. The MIO had stored the notes at an unsecure location and they were damaged or lost following flooding and a burglary at that location while he was on an overseas mission. The unsecure location was in fact the MIO’s private house.

We raised the matter with the Defence Forces, who confirmed the complainant’s allegation that the notes had been stored at an unsecure location and had been damaged or lost as stated.

The Defence Forces informed us of the measures taken to keep data safe and secure, and referred us to its Administration Instruction, which provides for the prohibition of removal of records.

The Defence Forces further stated that the removal of records from their place of custody to a private residence would breach this instruction and that a breach of this provision may constitute an offence under S.168 of the Defence Act 1954. It advised that, as the MIO was no longer a serving member of the Defence Forces, he was not subject to military law.

The Defence Forces unequivocally acknowledged that the loss of the data in this case should not have occurred and was fully regretted. It informed us that it had recently undertaken a full review of practices and procedures in respect of both the processing and disclosure of data to mitigate the possibility of any future unauthorised or accidental disclosure of personal data.

The Commissioner’s decision on this complaint issued in June 2015, and it found that the Defence Forces contravened Section 2(1)(d) of the Data Protection Acts by failing to take appropriate security measures against unauthorised access to, or unauthorised alteration, disclosure or destruction of, the complainant’s personal data when it allowed it to be stored at an unsecure location, namely a private house.

This Office acknowledges that the Defence Forces has procedures in place in relation to the protection of personal data, as set out in its Administration Instruction. However, those procedures were not followed in this case: an official record was removed from its place of custody and, because the appropriate security measures were not applied, the complainant’s personal data was lost or stolen.

There are many workplace scenarios where staff, and managers in particular, may need to take files, including personal data, home with them. Extreme caution should always be exercised in such cases to ensure that there is no risk to the security of personal data, either while the files are in transit or while they are in the employee’s home. Data controllers must ensure that employees act in a responsible manner with regard to the safe custody and handling of workplace files. This demands a proper system for recording the removal and return of files and adherence to prescribed procedures for the safekeeping of personal data while the files concerned are absent from the workplace. Likewise, it is critical that employees are prohibited from emailing official files from their workplace email account to their personal email account, whether for after-hours work or for any other reason. In such situations, data controllers lose control of personal data that they are obliged by law to protect.

Case Study 6: Further processing of personal data by a state body

In February 2015, we received a complaint from an employee of a state body in relation to the alleged unfair processing of his personal data. The complainant stated that, in the course of a meeting, he had been advised that his manager had requested access to data from his security swipe card in order to compare it with his manually completed time sheets. The complainant explained that this had been carried out without any prior consultation with him or his line manager. By way of background, the complainant informed us that the security swipe cards used by the employees are for accessing the building and secured areas only, and are not used as a time management/attendance system.

We sought an explanation from the body concerned as to how it considered that it had complied with its obligations under the Data Protection Acts in the processing of the complainant’s personal information obtained from his swipe-card data. We also advised it that we had sight of the relevant section of its staff handbook and we noted that there was no reference to the swipe card being used for the purpose of checking attendance.

We received a response explaining that the swipe-card data relating to the complainant was handed over to the complainant’s manager in good faith on the basis that it was corporate rather than personal data. The organisation also confirmed that it checked the staff handbook and any other information that may have been circulated to staff regarding the purposes of the swipe card and that there was no mention of the use of swipe cards in relation to recording time or attendance. It advised that the focus of the information circulated with regard to swipe cards was on security and access only.

After consideration of the response received, along with the content of the complaint, we informed the organisation concerned that we considered that the Data Protection Acts were breached when the employee’s swipe-card details were provided to his manager to verify his working hours. We referred to the provisions of Section 2(1)(c)(ii) of the Data Protection Acts, which state that data shall not be further processed in a manner incompatible with the purpose for which it was obtained. Given that we considered the information concerned had been processed in contravention of the Data Protection Acts 1988 and 2003, we required an assurance that all email records created in relation to the further processing of the swipe-card details concerned be deleted from its systems; this assurance was duly provided.

The complainant in this case agreed, as an amicable resolution to his complaint, that he would accept a written apology from his employer. This apology acknowledged that the complainant’s data protection rights had been breached and it confirmed that the organisation had taken steps to ensure that this type of error did not recur in the future.

This case highlights the temptation organisations face to use personal data that is at their disposal for a purpose other than that for which it was originally obtained and processed. The scenario outlined above is, unfortunately, not uncommon. Time and attendance monitoring may occasionally prove difficult for managers, and contentious issues arise from time to time. The resolution of those issues should not, however, involve an infringement of employees’ data protection rights, whether in circumstances similar to this case or otherwise.

Case Study 7: Supermarket’s excessive use of CCTV to monitor member of staff

A former staff member of a supermarket submitted a complaint to this Office regarding her employer’s use of CCTV.

The complainant informed us that she had been dismissed by her employer for placing a paper bag over a CCTV camera in the staff canteen. She informed us that the reason for her covering the CCTV camera was that, while she was on an official break in the staff canteen, a colleague styled her hair. The complainant also stated that the camera was placed in the corner of the staff canteen and there was no signage to inform staff that surveillance was taking place. She informed us that she was never officially advised of the existence of the camera, nor had her employer ever informed her of the purpose of the CCTV in the canteen.

In its response to our investigation, the supermarket informed us that the complainant was dismissed for gross misconduct, which occurred when she placed a plastic bag over the camera in the canteen to prevent her actions being recorded and thereby breaching the store’s honesty policy as outlined in the company handbook. The supermarket owner informed us that the operation of CCTV cameras within the retail environment was to prevent shrinkage, which can arise from customer theft, waste and staff theft. He stated that it was also used for health and safety, to counter bullying and harassment and for the overall hygiene of the canteen. In relation to the incident concerning the complainant, the owner informed us that, on the day in question, the store manager noticed some customers acting suspiciously around the off-licence area and that on the following day CCTV footage was reviewed. It was during the viewing of the footage in relation to suspicious activity in the off-licence area that he noticed the complainant putting a bag over the camera.

Following an inspection by one of our Authorised Officers, we informed the supermarket owner that, in our view, there was no justification from a security perspective for having a camera installed in the canteen area.

The complainant in this case declined an offer of an amicable resolution and she requested a formal decision of the Commissioner.

The decision by the Commissioner in January 2015 found that the supermarket contravened Section 2(1)(c)(iii) of the Data Protection Acts, 1988 and 2003, by the excessive processing of the complainant’s personal data by means of a CCTV camera in a staff canteen.

Data controllers are tempted to use personal information captured on CCTV systems for a whole range of purposes. Many businesses have justifiable reasons, usually related to security, for the deployment of CCTV systems on their premises, but any further use of personal data captured in this way is unlawful under the Data Protection Acts unless the data controller has at least made it known, at the time of recording, that images captured may be used for those additional purposes. Even then, data controllers must balance such uses against the fundamental right of employees to privacy at work in certain situations, such as staff canteens and changing rooms.

Case Study 8: Disclosure of personal information to a third party by the Department of Social Protection

This Office received a complaint in July 2014 concerning an alleged unauthorised disclosure of the complainant’s personal information by the Department of Social Protection to a third party. The complainant informed us that, in the course of an Employment Appeals Tribunal hearing, her employer produced to the hearing an illness-benefit statement relating to her. The statement contained information such as her name, address, PPSN, date of birth, bank details and number of child dependants. She stated that her employer was asked how he had obtained this illness-benefit statement. He stated that he had phoned the Department of Social Protection and the statement had subsequently been sent to him by email. Prior to making the complaint to this Office, the complainant had, via her solicitors, received an apology from the Department, who acknowledged that her information had been disclosed in error and that proper procedures had not been followed. However, she informed us that she had very little information as to how the disclosure had occurred and that the matter had caused her considerable distress.

We commenced an investigation by writing to the Department of Social Protection. In response, it stated that it accepted that a statement of illness benefit was disclosed to the complainant’s employer in error, on foot of a telephone call from the employer. The Department acknowledged that the information should not have been sent out to the employer and that the correct procedures were not followed on this occasion. It stated that the staff member who supplied the information was new to the Department. It explained that it was not normal practice to issue a screenshot to the employer; the correct procedure was to issue a statement to the employee along with a note informing the employee that the information had been requested by their employer.

The data subject chose not to accept an apology from the Department as an amicable resolution of her data protection complaint, opting instead to seek a formal decision of the Data Protection Commissioner.

A decision of the Data Protection Commissioner issued in October 2015. In her decision, the Commissioner formed the opinion that the Department of Social Protection contravened Section 2(1)(c)(ii) of the Data Protection Acts 1988 and 2003 by the further processing of the complainant’s personal data in a manner incompatible with the purpose for which it had been obtained. The contravention occurred when the Department of Social Protection disclosed the complainant’s personal data to an unauthorised third party.

This case serves as a reminder to data controllers of the importance of ensuring that new staff are fully trained and closely supervised in all tasks, particularly in those tasks that involve the processing of personal data. Errors by staff present a high risk of data breaches on an ongoing basis and it is critically important that efforts are made to mitigate those risks by driving data protection awareness throughout the organisation, with particular focus on new or re-assigned staff.

Case Study 9: Covert CCTV installed without management knowledge

This Office received a complaint from staff of Letterkenny General Hospital in relation to the operation of covert CCTV surveillance by management within the Maintenance Department of Letterkenny General Hospital.

We also received a ‘Data-Breach Incident Report’ from the Health Service Executive (HSE) about this matter. This breach report recorded the incident as ‘Unauthorised CCTV Surveillance of Office Area’ and stated that a covert CCTV camera was installed by two maintenance foremen in their two-man office due to concerns they had in relation to the security of their office.

We commenced an investigation of the complaint by writing to the Health Service Executive (HSE), outlining the details of the complaint. We sought information from it in relation to the reporting arrangements between the maintenance staff in Letterkenny General Hospital and the maintenance foremen who installed the covert CCTV; the whereabouts of footage captured by the covert CCTV; the outcome of the internal investigation; how the covert CCTV was installed without notice to the management of Letterkenny General Hospital; and details of any instruction or notification issued to staff on foot of the internal investigation.

In response, the HSE stated that the foremen who had installed the camera were direct supervisors of the maintenance department staff and that the footage recorded was stored on a DVD and secured in a locked safe. It further stated that an internal investigation concluded that two staff had installed the covert CCTV without the authority, consent or knowledge of the management of Letterkenny General Hospital, due to concerns regarding unauthorised access/security in their office. We established that the camera in question was previously installed in a now disused area of the hospital, had been decommissioned and was re-installed in the office in question.

As well as confirming that the footage captured by the covert camera was of normal daily comings and goings to the maintenance office, the HSE stated that this was an unauthorised action by staff in the maintenance section and that it was keenly aware of its duty to all staff to provide a workplace free from unauthorised surveillance. The HSE confirmed that it would initiate steps to ensure that there would be no repetition of this action.

The HSE subsequently issued a written apology to the complainants in which it also confirmed that the recordings had been destroyed.

A decision of the Data Protection Commissioner issued in April 2015. In her decision, the Commissioner formed the opinion that the HSE contravened Section 2(1)(a) of the Data Protection Acts 1988 and 2003 by failing to obtain and process fairly the personal data of individuals whose images were captured and recorded by a covert CCTV camera installed without its knowledge or consent.

Covert surveillance is normally only permitted on a case-by-case basis, where the data is kept for the purpose of preventing, detecting or investigating offences, or apprehending or prosecuting offenders. This implies that a written specific policy must be put in place detailing the purpose, justification, procedures, measures and safeguards that will be implemented in respect of the covert surveillance, with the final objective being an active involvement of An Garda Síochána or other prosecutorial authority. Clearly, any decision by a data controller to install covert cameras should be taken as a last resort after the full exhaustion of all other available investigative steps.

Case Study 10: Danske Bank erroneously shares account information with third parties

We received a complaint against Danske Bank alleging that it had disclosed personal data and account information in relation to a mortgage on a property owned by the complainant to third parties. We commenced an investigation of the matter by writing to Danske Bank, outlining the details of the complaint. We received a prompt response from Danske Bank, which stated that the complainant and the individual who received his personal data were joint borrowers on certain loan facilities and that it was during the course of email communications with the other individual in respect of that individual’s loan arrears that the personal data relating to the complainant was disclosed to two third parties. Danske Bank admitted that this was an error on its part and stated that it was unfortunate that it had occurred. It went on to explain that, in dealing with the queries raised by the other individual in respect of his arrears and entire exposure to Danske Bank, the relationship manager also included information on all arrears in respect of that individual’s connections, which included the complainant. The staff member concerned expressed his regret at the incident and Danske Bank confirmed that the staff member was reminded of its procedures with regard to data protection and the need to be vigilant when dealing with the personal data of customers. Danske Bank apologised for the incident and offered reassurance that it would endeavour to prevent a future reoccurrence.

Danske Bank went on to state that it had robust controls in place to ensure that such incidents did not occur; however, it admitted that, despite such controls, this was a case of human error and it did not believe that the disclosure was in any way intentional.

The complainant requested that the Data Protection Commissioner issue a formal decision on his complaint. A decision of the Commissioner issued in January 2015, and it stated that, following the investigation of the complaint, she was of the opinion that Danske Bank contravened Section 2(1)(d) of the Data Protection Acts 1988 and 2003 by disclosing the complainant’s personal data to a number of third parties without his knowledge or consent.

This case is illustrative of the need for financial institutions to be vigilant when dealing with the personal data of individuals who have common banking relationships with others, and to ensure that appropriate safeguards are in place to prevent accidental or erroneous sharing of personal data.

Case Study 11: Failure to update customer’s address compromises the confidentiality of personal data

This Office received a complaint that Allied Irish Banks (AIB) failed to keep the complainant’s personal data up-to-date over a prolonged period, despite repeated requests by the individual to do so, and that it failed to maintain the security of the individual’s personal information. The complainant informed us that he had repeatedly asked AIB to update his address details but that it had failed to do so. As a result, his correspondence from AIB continued to be sent to a previous address. The complainant alleged that, arising from the failure of AIB to update his address, his correspondence containing his personal data, which was sent to his previous address by AIB, was disclosed to unknown third parties at this previous address.

We commenced an investigation of the matter by writing to AIB, outlining the details of the complaint. AIB confirmed to us that, due to a breakdown in internal processes, the complainant’s correspondence address had not been updated on all its systems in a timely manner, resulting in automated arrears letters continuing to issue to an old address.

In circumstances where AIB had been advised that the complainant had changed address, our investigation was satisfied that, by continuing to send correspondence intended for the complainant to the previous address, whether by post or by hand delivery, AIB failed to secure the complainant’s personal data against unauthorised access by parties who had access to the letterbox at that address.

Efforts to resolve the complaint by means of an amicable resolution were unsuccessful and the complainant sought a formal decision. In her decision, the Commissioner formed the opinion that AIB contravened Section 2(1)(b) of the Data Protection Acts 1988 and 2003 by failing to keep the complainant’s personal data up to date. This contravention occurred when AIB failed to remove the complainant’s previous address from his account despite notification from him to do so. The Commissioner also formed the opinion that AIB contravened Section 2(1)(d) by failing to take appropriate security measures against unauthorised access to the complainant’s personal data by sending correspondence by post and by hand delivery to an address at which he no longer resided, while knowing that this was no longer his residential address.

This case demonstrates the need for all data controllers to ensure that personal data is kept accurate and up-to-date at all times. Failure to do so may result in the disclosure of personal data to unauthorised persons as well as unnecessary distress and worry for data subjects who have updated the data controller with the most accurate information, only to find that the necessary safeguards were not in place to prevent their personal data being compromised by use, as in this case, of a previous address.

Case Study 12: Unfair use of CCTV data

The subject matter of this complaint was the use by the data controller of CCTV footage in a disciplinary process involving one of its drivers. The data controller, Aircoach, advised that it was reviewing CCTV footage from one of its coaches as part of dealing with an unrelated customer-complaint issue when it happened to observe a driver using her mobile phone while driving a coach.

As is often the case with such complaints, the complainant objected to the use of the CCTV footage as evidence in a disciplinary process that was taken by Aircoach against her, the basis of the objection being that it was unfairly obtained.

Aircoach informed us that it had introduced CCTV across its fleet in order to further enhance safety and security for both staff and customers. It further advised that all staff are informed that CCTV is installed and of the reasons behind its use, but admitted that it was not until the middle of 2014 that significant efforts were made to fully inform both staff and customers as to the presence of CCTV on its coaches. Aircoach provided us with a copy of its new CCTV policy and it also provided us with photos showing the CCTV signage on the coach entrance doors, adding that the process of putting appropriate signage in place on its coaches commenced in January 2014 and was concluded by October 2014.

The law governing the processing of personal data, including CCTV images, is provided for under Section 2 of the Data Protection Acts 1988 and 2003. Processing includes, among other things, the obtaining and use of personal data by a data controller and it must be legitimate by reference to one of the conditions outlined under Section 2A(1) of the Acts. In addition, a data controller must also satisfy the fair-processing requirements set out under Section 2D(1) of the Acts, which requires that certain essential information is supplied to a data subject before any personal data is recorded.

The investigation in this case established that, at the time of the relevant incident on 19 February 2014, the roll-out of CCTV signage by Aircoach had commenced; however, the company failed to properly or fully inform staff that CCTV footage might be used in disciplinary proceedings. Any monitoring of employee behaviour through the use of CCTV cameras should take place in exceptional cases rather than as a norm and must be a proportionate response by an employer to the risk faced, taking into account the legitimate privacy and other interests of workers. In this case, when processing the complainant’s image, Aircoach was not aware of any particular risk presented and, by its own admission, was investigating an unrelated matter. While it subsequently transpired that the incident in question was indeed a very serious matter, involving alleged use by a driver of a mobile phone while driving, there was no indication at the time of the actual processing that this was the case and the processing therefore lacked justification. In addition, the fair-processing requirements set out in Section 2D were not fully met and fair notice of the processing for the specific purpose of disciplinary proceedings was not given to drivers whose images might be captured and used against them. In those circumstances, the processing could not be said to have been done in compliance with the Acts and the Commissioner found that Section 2(1)(a) had been contravened.

It is important to note that the processing of CCTV images in disciplinary proceedings against an employee is very much circumstance-dependent. Thus, while on this occasion the employer was found to have been in contravention of the Acts because the images were processed without justifiable cause or fair notice to the employee in question, in other circumstances the processing might be regarded as being proportionate and fair, especially if the processing is done in response to an urgent situation and the employer has the correct procedures in place. Employers should therefore be careful to ensure that a comprehensive CCTV policy is in place and followed if they wish to stay within their legal obligations.

  • Prosecutions: Private Investigators
  • Prosecutions: Marketing Offences
  • Excessive Data Collection by An Post
  • Disclosure of Employee Salary Details by the HSE
  • Excessive Data Collection by a Letting Agency
  • Disclosure of Financial Information by a Credit Union
  • Complaint of Disclosure by Permanent TSB Not Upheld
  • Patient Denied Right of Access by SouthDoc
  • Excessive Data Collection by the Department of Agriculture
  • Personal Data Disclosed by County Council
  • Eircom Fails to Meet Statutory Timeframe for Processing Access Request
  • Third-Level Student Data Appeared on Third-Party Website
  • Data Controller Discloses Personal Data to Business Partner
  • Employee of Financial Institution Resigns Taking Customer Personal Data
  • Theft of Unencrypted Laptop
  • Compromise of Adobe Network

Case Study 1: Prosecutions: Private Investigators

This Office initiated prosecutions in the private investigator/tracing-agent sector for the first time in 2014. These prosecutions arose from a detailed investigation that commenced in the summer of 2013. Arising from audits carried out in a number of credit unions at that time, the Office became concerned about the methods employed by some private investigators hired by credit unions to trace the current addresses of members who had defaulted on their loans. The Office launched a major investigation to identify the sources from which the private investigators had obtained the current address data. This investigation involved a wide range of public bodies and private companies. As a result of our findings, the Office established that personal data on databases kept by the Department of Social Protection, the Primary Care Reimbursement Service of the Health Service Executive, An Garda Síochána and the Electricity Supply Board had been accessed unlawfully and the information was disclosed thereafter to credit unions. Details of the prosecutions that ensued are as follows:

M.C.K. Rentals Limited and its Directors

M.C.K. Rentals Limited (trading as M.C.K. Investigations) was charged with 23 counts of breaches of Section 22 of the Data Protection Acts 1988 and 2003 for obtaining access to personal data without the prior authority of the data controller by whom the data is kept, and disclosing the data to another person. The personal data was kept by the Department of Social Protection (7 cases) and by the Primary Care Reimbursement Service of the Health Service Executive (16 cases). In all cases, the personal data was disclosed to various credit unions in the state.

The two directors of M.C.K. Rentals Limited, Ms Margaret Stuart and Ms Wendy Martin, were separately charged with 23 counts of breaches of Section 29 of the Data Protection Acts 1988 and 2003 for their part in the offences committed by the company. This Section provides for the prosecution of company directors where an offence by a company is proved to have been committed with the consent or connivance of, or to be attributable to any neglect on the part of, the company directors or other officers.

At Bray District Court on 6 October 2014, M.C.K. Rentals Limited pleaded guilty to five sample charges for offences under Section 22 of the Data Protection Acts 1988 and 2003. The Court convicted the company in respect of each of the five charges and it imposed a fine of €1,500 per offence. Company Secretary and Director Ms Margaret Stuart pleaded guilty to one sample charge for an offence under Section 29 of the Data Protection Acts 1988 and 2003. The Court convicted Ms Stuart in respect of that offence and imposed a fine of €1,500. Company Director Ms Wendy Martin pleaded guilty to one sample charge for an offence under Section 29 of the Data Protection Acts 1988 and 2003. The Court convicted Ms Martin in respect of that offence and it imposed a fine of €1,500.

This was the first occasion on which company directors were prosecuted by the Data Protection Commissioner for their part in the commission of data-protection offences by their company, and the proceedings in this case send out a strong warning to directors and other officers of bodies corporate that they may be proceeded against and punished in a court of law for criminal offences committed by the body corporate.

The investigation of this company uncovered wholesale and widespread “blagging” techniques used by the offenders, and this was the first prosecution by the Data Protection Commissioner of offenders engaged in such practices. The findings of the investigation carried out in this case expose the constant threat to the security of personal data that is in the hands of large data controllers and the vigilance that is required by front-line staff at all times to prevent unlawful soliciting of personal data, in particular by means of telephone contact, by unscrupulous agents. Data controllers across the state should regularly review their data-protection procedures to maximise the effectiveness of their security protocols in order to counter such criminal activity. They must ensure that all staff, and particularly those at the front line who handle telephone calls, are fully trained in the security protocols in order to be able to recognise and deal with the threat of information blagging or pretext calling if it arises.

Michael J. Gaynor

Michael J. Gaynor (trading as MJG Investigations) was charged with 72 counts of breaches of the Data Protection Acts 1988 and 2003. Twelve charges related to breaches of Section 22 of the Data Protection Acts for obtaining access to personal data without the prior authority of the data controller by whom the data is kept, and disclosing the data to another person. The personal data was kept by the Electricity Supply Board (9 cases) and by An Garda Síochána (3 cases). In all cases, the personal data was disclosed to various credit unions in the state. A further 60 charges related to breaches of Section 16(2) of the Data Protection Acts in respect of the processing of personal data of a number of individuals in circumstances where no entry in respect of the accused appeared in the public register maintained by the Data Protection Commissioner. Mr Gaynor is a former member of An Garda Síochána.

On 25 November 2014, at Dublin Metropolitan District Court, Michael J. Gaynor was convicted on two charges for offences under Section 22 of the Data Protection Acts 1988 and 2003. The Court imposed a fine of €2,500 in respect of each of these two charges. Separately, the defendant pleaded guilty to 69 charges (60 of which related to breaches of Section 16(2)) and these were taken into consideration in the sentence imposed.

This was the first prosecution to be completed by the Data Protection Commissioner of a data processor for processing personal data without having registered as a data processor on the public register of the Office of the Data Protection Commissioner. The investigation in this case uncovered access by the defendant to customer data held on databases of the Electricity Supply Board. To access the personal data, the defendant used a contact on the staff of the Electricity Supply Board, a relationship he had established during his previous Garda career.

These prosecutions send a strong message to private investigators and tracing agents to comply fully with data-protection legislation in the conduct of their business, and that if they fail to do so they will be pursued and prosecuted for offending behaviour. They also serve to remind all companies and businesses who hire private investigators or tracing agents that they have onerous responsibilities under the Data Protection Acts to ensure that all tracing or other work carried out on their behalf by private investigators or tracing agents is done lawfully. Specifically, in this regard, those operating in the credit union, banking, financial services, legal and insurance sectors should review their engagement of private investigators and tracing agents to ensure they have fully safeguarded all personal data against unlawful forms of data processing.

These investigations uncovered serious issues in relation to the hiring of private investigators or tracing agents by credit unions, particularly in respect of a lack of awareness on their part of how the private investigators were tracing members and, in some cases, in relation to the disclosure of PPS numbers by credit unions to private investigators. This Office has pursued all of these issues with the credit unions concerned and with their representative bodies in recent months. In addition, we have undertaken a range of follow-up work with the Department of Social Protection, the Health Service Executive, An Garda Síochána and the Electricity Supply Board on the implications of the data-security breaches that occurred in their organisations and on the measures required to deal with those breaches and to prevent a recurrence. This Office welcomes the fact that the Private Security Authority has proposed the introduction of regulation of private investigators.

Case Study 3: Prosecutions: Marketing Offences

Pure Telecom Limited

We received a complaint in March 2013 from an individual who received two marketing phone calls from Pure Telecom Limited on his landline telephone. The individual’s telephone number was listed on the National Directory Database opt-out register. It is an offence to make a marketing call to a telephone number listed on that register.

Pure Telecom Limited informed our investigators that it had used the services of a third-party agent to make the marketing calls and explained that the agent had sourced the individual’s number itself rather than using marketing data provided by Pure Telecom Limited. The company admitted that the third-party agent did not have consent to contact the complainant for marketing purposes.

At Dublin District Court on 3 February 2014, Pure Telecom Limited pleaded guilty to two charges concerning breaches of Regulation 13(5)(b) of S.I. 336 of 2011 relating to two marketing phone calls to a phone number listed on the opt-out register. The Court imposed a conviction in respect of both charges and a fine of €500. It further ordered payment of the prosecution costs of the Data Protection Commissioner. The hearing was informed that the defendant had a previous conviction from 2010 for a similar offence.

Next Retail Limited

In February 2013, this Office received a complaint from an individual who received a number of unsolicited marketing emails from Next Retail Limited after she requested the company not to send her any more such emails. The complainant claimed to have unsubscribed firstly by using the unsubscribe link that was provided in a marketing email sent by the company and, following this, in four separate emails to the company requesting not to be contacted with marketing emails again.

Next Retail Limited informed our investigators that, as it no longer used the services of the company it had engaged to process unsubscribe requests, it was unable to explain what had happened to the first unsubscribe request. With regard to the emails containing unsubscribe requests, the company confirmed that they had reached its complaints inbox but it was unable to trace where the emails went afterwards.

At Dublin District Court on 3 February 2014, Next Retail Limited pleaded guilty to two charges concerning breaches of Regulation 13(1) of S.I. 336 of 2011 relating to the sending of two unsolicited marketing emails without consent. The Court imposed a conviction in respect of one charge, with the second charge taken into consideration. A fine of €100 was imposed. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

Next Retail Limited subsequently appealed the severity of the sentence. On 19 March 2014, the Circuit Court affirmed the conviction and penalty previously imposed by the District Court and it noted the appellant’s intention to discharge the Data Protection Commissioner’s reasonable costs for the appeal.

Airtricity Limited

In May 2013, this Office received a complaint against Airtricity Limited from a person who received an unsolicited marketing phone call on his landline telephone, which was listed on the National Directory Database opt-out register. The complainant informed us that the purpose of the marketing call was to encourage him to switch energy supplier to Airtricity.

In response to our investigation, Airtricity admitted that the phone call had been made by a third-party contractor acting on its behalf. It explained that the error occurred when an old PC, on which the 2009 phone book was installed, was re-commissioned by the contractor. A spreadsheet containing the complainant’s phone number was still on the old PC and this led to the number being dialled in error.

At Dublin District Court on 3 February 2014, Airtricity Limited pleaded guilty to one charge concerning a breach of Regulation 13(5)(b) of S.I. 336 of 2011 relating to one marketing phone call to a phone number listed on the opt-out register. The Court imposed a conviction in respect of the charge and a fine of €75. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner.

The Carphone Warehouse Limited

In March 2013, we received a complaint from a customer of The Carphone Warehouse Limited after he received marketing text messages from the company despite having ticked the marketing opt-out box when he had previously made a purchase in one of its stores. The company informed our investigators that a systems error resulted in the customer being incorrectly included in its marketing list.

In April 2013, we received a complaint from another customer of The Carphone Warehouse Limited who received regular offers by text message from the company even though he had called the company on at least three occasions, asking that it stop. The company told our investigators that its system temporarily did not recognise the customer’s preference not to receive marketing due to an internal issue within the electronic filter process and this resulted in the customer’s phone number being accidentally selected for marketing campaigns.

At Dublin District Court on 3 March 2014, The Carphone Warehouse Limited entered a guilty plea in respect of five charges concerning breaches of Regulations 13(1) and 13(4) of S.I. 336 of 2011. The court imposed convictions in respect of four charges, with the fifth charge taken into consideration. It imposed fines of €1,500 in respect of each conviction. The defendant agreed to cover the prosecution costs of the Data Protection Commissioner. The hearing was informed that the defendant had two previous convictions from 2012 in relation to the sending of unsolicited marketing emails.

Valterous Limited (trading as Therapie Clinic and/or Therapie)

A former customer of Valterous Limited (trading as Therapie Clinic and/or Therapie) complained to this Office in June 2013 after receiving an unsolicited marketing text message despite having opted out of receiving such communications over three months earlier. Therapie explained to our investigators that the complainant’s contact details were on systems in two branches and that when the opt-out request was made the company removed their details from one database and did not realise they were also on another one, thus leading to a further unsolicited text message being sent to the same contact number.

In July 2013, we received a complaint from another former customer of Therapie who had received marketing text messages on several occasions. The complainant informed us that she sent a text message to opt out but the company continued to send her further marketing text messages. Our investigation found no evidence that Therapie had obtained consent at any time for the sending of marketing text messages to this individual. In relation to the sending of text messages after the former customer had opted out, Therapie explained that the individual should have texted the word “STOP” rather than the word “OPTOUT” at the time of attempting to opt out of the marketing database. We did not accept this as a valid excuse as the opt-out instruction on the marketing text message sent to the individual read “OptOut:086.......”.

At Dublin District Court on 3 March 2014, Valterous Limited (trading as Therapie Clinic and/or Therapie) pleaded guilty in relation to three charges concerning breaches of Regulation 13(1) of S.I. 336 of 2011 concerning the sending of unsolicited marketing text messages without consent. The Court imposed convictions in respect of two charges, with the third charge taken into consideration. It imposed fines of €1,500 in respect of each conviction. The defendant agreed to pay the prosecution costs of the Data Protection Commissioner. The Court was told that in 2012 Therapie Laser Clinics Limited (trading as Therapie Clinic and/or Therapie) was convicted for two offences in relation to the sending of unsolicited marketing text messages.

Case Study 4: Excessive Data Collection by An Post

This Office received two complaints from members of the public concerning new requirements that were introduced in November 2013 by An Post in relation to direct-debit applications for payment of TV licence fees. A mandatory requirement was introduced to provide a recent bank statement with the direct-debit application and mandate form. An Post’s TV licence website explained that a copy of a bank statement was required to verify the bank-account details provided by the licensee for payment of their TV licence fee. It went on to state that the bank statement must show the BIC, IBAN and the full name and address of the bank-account holder. The complainants argued that requesting a copy of confidential financial information that appears on bank statements was excessive.

We investigated these complaints with An Post. By way of background, An Post explained that the new SEPA regulations impose significant new obligations on direct-debit originators such as An Post, which operates the TV Licence Direct Debit Scheme. It said that the commercial risk attached to accepting direct debits is now the sole responsibility of An Post and therefore An Post has to verify the direct-debit details supplied by the customer. It stated that An Post does not have proof that the bank-account details exist or are accurate, or that the account is owned by the person named on the mandate. Accordingly, it developed its new bank-detail verification process to check the mandate details supplied, and in that new process it seeks extra documentation to verify that the bank-account details supplied by the applicant are accurate, complete and up to date. It also pointed out that it cannot process a direct-debit application without having valid BIC and IBAN numbers in respect of the account on which the direct debit is drawn. An Post indicated that, further to our correspondence, it had decided that customers who choose direct-debit payment are no longer required to submit details of their bank balances.

We considered the matter further and we advised An Post that applicants should either be allowed to submit a copy of only the portion of the bank statement containing the name, address, BIC and IBAN numbers or they should be allowed to blacken out all of the transaction information on any copies supplied. An Post agreed to implement our advice. It amended its TV licence direct-debit application form to include the following text: “You should ensure that financial transactions on your bank statement are fully masked or removed before you attach it to your application. All bank statements are destroyed once the first successful payment has gone through.” An Post also amended its website to reflect this change and to clarify that it does not require the balance on the bank statement to be shown. We were satisfied with the changes implemented by An Post and with the manner in which it dealt with the matter expeditiously once we had drawn it to its attention.

Organisations that seek copies of bank statements for purposes such as proof of current address, verification of identity or other similar reasons should bear in mind that such documents contain a range of financial information that is private to the individual to whom they relate. As a general rule, individuals must be permitted to blacken out or otherwise mask those financial details and transactions, as they are irrelevant for the purposes of address verification, etc. This case study should serve as a reminder to organisations to consider all the implications, and the potential to collect an excessive amount of personal data, in circumstances where they seek copies of bank statements from customers or clients.

Case Study 5: Disclosure of Employee Salary Details by the HSE

An employee of the Health Service Executive (HSE) complained in March 2014 concerning the alleged disclosure on two occasions of his salary details to his ex-wife. He informed us in his complaint that the matter came to his attention when his ex-wife went to court in the summer of 2013 in relation to maintenance issues, and in court she provided exact details from his payslips. In December of the same year, his ex-wife went back to court for a review of maintenance and on that occasion she produced a copy of his P60 along with his salary details for the previous four months.

We commenced an investigation of the matter by writing to the HSE. In response, the HSE accepted that on two separate occasions, in May 2013 and in November 2013, personal data relating to its employee was disclosed to a third party without his consent. It acknowledged that there was no legal basis for the disclosure of the personal data. It stated that it established who, within the HSE, made the first disclosure but it was not possible to establish who made the second disclosure. It explained that its payroll department had received a number of court orders directing the HSE to make maintenance payments to its employee’s ex-wife. It stated that numerous queries were raised by a firm of accountants and tax professionals called Accountax on behalf of its employee’s ex-wife. Those queries sought clarification with regard to the payments made. It went on to state that, in relation to the first breach, a specific request was made seeking a copy of its employee’s most recent payslip showing the maintenance deductions from January 2013 to date. The HSE admitted that the requests for constant updates regarding maintenance payments ultimately resulted in the unauthorised disclosure of its employee’s personal data. The HSE accepted that in hindsight the only data that should have been released by its payroll department to its employee’s ex-wife (or to a person acting on her behalf) was a summary of payments made that related to the court orders.

We informed the HSE that we considered that the Data Protection Acts were breached when the personal data of its employee was disclosed to a third party without his consent. The HSE indicated that it wished to pursue an amicable resolution to the complaint and, to this end, it enclosed a letter of apology for the complainant. The data subject considered the letter of apology and he decided that he did not wish to accept it, opting instead to seek a formal decision of the Data Protection Commissioner on his complaint.

A decision of the Data Protection Commissioner was issued in August 2014. In his decision, the Commissioner formed the opinion that the HSE contravened Section 2(1)(c)(ii) of the Data Protection Acts 1988 and 2003 on two occasions by the further processing of the complainant’s personal data in a manner incompatible with the purpose for which it had been obtained. These contraventions occurred in May 2013 and in November 2013 when the HSE disclosed his personal information to a third party. Section 2(1)(c)(ii) of the Data Protection Acts 1988 and 2003 provides that data shall not be further processed in a manner incompatible with the purpose for which it was obtained. In this case, the HSE acknowledged that on two separate occasions the personal data was disclosed to a third party without the consent or knowledge of the data subject. Such disclosures constitute further processing of personal data.

Case Study 6: Excessive Data Collection by a Letting Agency

In July 2014, a prospective tenant complained about the collection of bank details, PPS numbers and copies of utility bills by a letting agency when applying to rent a property. The complainant stated that this information was in addition to the usual material, such as previous landlord’s reference, which one would expect to submit at application stage. She stated that she believed that if she did not supply all of the sought data up-front, her application would not be seriously considered by the letting agency. The complainant said that the practice of collecting such a broad range of personal data forces prospective tenants who are desperate to rent a property to submit this personal information at application stage even though they do not know if their application will be successful. She pointed out that the majority of applications are unsuccessful given the high demand for a limited supply of available rental properties in the Dublin area.

We commenced an investigation of the matter with the letting agency concerned, seeking an explanation for the collection of such a broad range of personal data at application stage. In response, the letting agency said that it requested PPS numbers from applicants because this verifies that they are entitled to work in the state, and that bank details are required to show that a tenant has a bank account because they would be ineligible if they were not able to pay rent through a bank account. We told the letting agency that we could not see any basis for collecting bank details, PPS numbers or copies of utility bills at application or property-viewing stage and we urged it to cease the practice immediately. We questioned the letting agency further about using the PPS number to verify the applicant’s work status. It replied to the effect that the main reason it requests PPS numbers is that it is required for the Private Residential Tenancies Board (PRTB) registration form and it said that it cannot register a tenant without it. It went on to say that it is only an added assurance that the applicant is working and it stated that it does not verify the PPS number.

We accepted that personal data concerning bank details, PPS numbers and utility bills could be requested once the applicant had been accepted as a tenant. In October 2014, the letting agency confirmed, following our investigation, that it had ceased the requesting of this personal data prior to the property being let and it undertook that it would only request this information once the tenant had been accepted. The complainant informed us that she was very satisfied with the outcome of her complaint.

This case study is a classic example of the temptation of some data controllers to collect a whole range of personal data in case they might need it in the future. In this case, the letting agency collected a significant amount of personal data from every applicant who expressed an interest in renting a property even though, at the end of the process, only one applicant could be accepted as the new tenant and it was only in the case of that successful applicant that the full range of personal data was required. Section 2(1)(c)(iii) places an obligation on data controllers to ensure that personal data which they process is adequate, relevant and not excessive in relation to the purpose or purposes for which it is collected or further processed. Data controllers must be mindful of this requirement and abide by it despite the temptation, for convenience or other reasons, to embark on an unnecessarily broad data-collection exercise.

Case Study 7: Disclosure of Financial Information by a Credit Union

A member of a credit union complained in 2013 in relation to the alleged disclosure of his loan and savings information by the credit union to his daughter. By way of background, the complainant explained that he was a guarantor on a credit union loan to his daughter. He received a letter from the credit union to inform him of difficulties that his daughter was experiencing with her loan. The purpose of the letter was to call on him, as the loan guarantor, to pay the balance of monthly repayments. He outlined that the letter was addressed to him and that it contained his membership number along with his savings and loan details, including balance outstanding. Soon afterwards, his daughter called to his house with a copy of the same letter as the credit union had also sent it to her. The complainant said that he considered this disclosure of his financial information to be a gross violation of his privacy.

We investigated the matter with the credit union concerned. It explained that the error that led to the disclosure occurred when the letter to the guarantor was issued under the guarantor’s membership number and not under the membership number of his daughter, whose loan it referred to. It explained that the computer system automatically brings across the account details of the membership number keyed in. The credit union admitted that a member of its credit-control staff inadvertently typed the letter under the guarantor’s membership number and, as a result, his account details were printed on the letter.

The credit union proposed that, as a means of trying to reach an amicable resolution of the complaint, it would issue a letter of apology to the guarantor. It also carried out staff training in regard to issuing letters to members, in particular letters to guarantors, and it re-circulated its data-protection policy to all staff. The complainant considered the offer and rejected it. He sought a formal decision of the Data Protection Commissioner on his complaint.

In April 2014, a decision issued to the complainant. In his decision, the Commissioner formed the opinion, following the investigation of the complaint, that the credit union contravened Section 2(1)(d) of the Data Protection Acts by providing details of the complainant’s membership account to a third party by means of a letter that was copied to the third party. Section 2(1)(d) obliges data controllers, among other things, to take appropriate security measures against unauthorised disclosure of personal data.

This case highlights the serious consequences for the complainant concerned arising from what appeared to be an innocuous error on the part of the staff member typing a letter for the complainant on his own account rather than on the account of his daughter, to whom the subject matter of the letter related. It serves as a reminder to data controllers generally to keep data-protection awareness to the forefront, with regular staff training for those whose work involves any form of data processing.

Case Study 8: Complaint of Disclosure by Permanent TSB Not Upheld

A complaint from a customer of Permanent TSB alleged that the bank had violated the Data Protection Acts by discussing their accounts and personal details with a third party, the complainant’s tenant, thereby causing financial loss and stress.

We investigated the allegation with Permanent TSB. In response, the bank informed us that it had made no contact with residents in the properties concerned to discuss the mortgage account details of the complainant concerned. It further stated that all telephone calls received from the tenant concerned had been listened to and at no time did any staff member discuss the details of the mortgage account with her. As part of our investigation we sought a copy of the recordings of phone calls that took place between Permanent TSB and the tenant. We listened to the call recordings and we were satisfied that no personal data relating to the complainant was passed to the tenant during the phone calls with Permanent TSB. Instead, the tenant was repeatedly told that Permanent TSB could not discuss anything with her without the written authority of the account holder. In one instance, the tenant offered to give her contact number to Permanent TSB but she was informed that it was not required as Permanent TSB would not be contacting her. This Office’s investigation found no evidence that Permanent TSB disclosed any personal data relating to the complainant to the third party concerned.

In a separate aspect to the same complaint, it was alleged by the complainant that Permanent TSB had sent correspondence to a previous residential address after it had been notified of a change of address. The complainant supplied us with a copy of a letter sent by them in August 2011 notifying the bank of the new address for correspondence and we were also supplied with copies of letters sent by Permanent TSB to the previous address after that date. In response to our investigation of this matter, Permanent TSB confirmed that it had received the August 2011 letter, which notified it of the new address, but it could offer no explanation as to why its systems had not been updated at that time to reflect this. It informed us that it was not until it received a further letter in January 2012 that the system was updated. To assist with trying to resolve the complaint, the bank offered a goodwill gesture as an acknowledgement of the delay encountered and of any stress the delay may have caused, but this was rejected by the complainant.

The complainant sought a formal decision on the complaint. With regard to the failure to update the contact address, having been requested to do so in August 2011, the Commissioner formed the opinion that Permanent TSB contravened Section 2(1)(b) of the Data Protection Acts. This section obliges data controllers to keep personal data accurate and up to date.

With regard to the allegation of disclosure of the complainant’s personal data to a tenant, the Commissioner was unable to form the opinion that a contravention of the Data Protection Acts occurred in this instance.

Case Study 9: Patient Denied Right of Access by SouthDoc

We received a complaint in June 2014 from a firm of solicitors whose client had made an access request in May 2014 to the Practice Manager at South West Doctors-On-Call Limited (trading as SouthDoc) seeking a copy of his medical notes. In response to the access request, SouthDoc replied to the solicitors, advising them to contact the patient’s own GP, who held a complete record for the patient. The solicitors wrote back to SouthDoc, pointing out that the access request had been made to SouthDoc and was separate from any request their client might make to his own GP. The solicitors pointed out that SouthDoc was obliged to comply with the request. In submitting the complaint to this Office, the solicitors informed us that SouthDoc had not replied to their latest letter but had returned it to them unanswered.

We began an investigation by writing to SouthDoc. It responded by return post, indicating that the request for medical records had now been dealt with. Soon afterwards, the solicitors for the complainant supplied us with a copy of a letter they had received from SouthDoc stating that, further to the access request, the patient’s records had been forwarded to his own GP. The solicitors pointed out that SouthDoc had not complied with the access request as it was their client who requested the records, and it was not sufficient for SouthDoc to give them to his GP. We wrote to SouthDoc again, seeking an explanation. A few days later we received from SouthDoc a copy of a letter that it had issued to the patient’s solicitors, enclosing a copy of the patient’s medical records. We then concluded our investigation.

There are a number of after-hours or on-call service providers such as SouthDoc in operation in Ireland, all of which provide an essential medical service for the general public. In doing so, these service providers collect and process both personal data and sensitive personal data (data relating to the physical or mental health of the attending patient). For the purposes of data protection, it is important that patients and service providers understand that when a patient attends one of those services, they provide their personal data to an organisation (data controller) that is entirely separate to their usual GP practice. Accordingly, the records created by the service provider in respect of the patient’s attendance and treatment are new records in respect of which the service provider is the data controller. For that reason, the patient has a right to access those records directly from the service provider by making an access request for a copy of them. This right of access to the records of the service provider exists whether or not the service provider passes on details of the patient’s attendance and treatment to the patient’s GP. Furthermore, the service provider is obliged to supply a copy of the personal data directly to the requesting patient (or to the solicitor acting on his behalf, as in the above case) rather than to the patient’s own GP. (Access to medical records is subject to the provisions of S.I. 82 of 1989, which prohibits the supply of data to a patient in response to an access request if that would cause harm to his or her physical or mental health.)

Case Study 10: Excessive Data Collection by the Department of Agriculture

An individual complained to this Office about new requirements introduced by the Department of Agriculture to produce bank-account details in relation to registering premises to comply with the Diseases of Animals Act 1966–2001. He explained that horse owners are required to register the premises in which horses are kept with the Register of Horse Premises and he said he had no difficulty with that requirement. However, he objected to being asked to supply his bank-account details and he pointed out that there was no possibility of this information being needed by the Department as there were no schemes or grants that entitle horse owners to payment. He told us that he and his wife each own a horse and that both horses are kept purely for pleasure purposes. He said that he had expressed his concerns directly to the Department initially but the Department continued to insist that he submit bank details.

We sought an explanation from the Department of Agriculture. In its response, the Department referred to the government’s drive towards e-commerce and the fact that government departments can no longer issue payable orders. It said that payments due by the Department can only be made by way of electronic fund transfer to a bank account. Accordingly, all clients of the Department in receipt of payments are asked to supply bank details as a prerequisite for entry onto the Department’s Corporate Customer System. It said that as most of the Department’s clients are in receipt of payments or could potentially receive payments, it was decided that all new clients (applicants), including those who exceptionally might not currently qualify for payments, would be asked for their bank-account details.

We referred the Department to the provisions of Section 2(1)(c)(iii) of the Data Protection Acts, which places a requirement on data controllers to ensure that personal data shall be adequate, relevant and not excessive in relation to the purpose for which it is collected. We pointed out that the principle established by this provision required that personal data should be collected when required and not on the basis that it might be required at some future point. We received confirmation from the Department in February 2014 that the practice of seeking bank details in anticipation of possible future payments had ceased. We were informed that an information notice had been issued to staff, stating that customer bank details are required only where a customer will be in receipt of payments from the Department.

The complainant in this case raised a very valid complaint with this Office, having failed to resolve the matter directly with the Department himself. Insufficient thought appears to have been given at the outset to the concept of requiring bank details from every customer or potential customer of the Department – whether that information was needed or not. More disappointing, however, was the fact that the Department did not review the situation and fix it after this individual drew the Department’s attention to his circumstances and the circumstances of others who keep horses for pleasure purposes – pointing out that the Department would never need to use his bank-account details as he was not an applicant for a scheme or grant. In the end, it took the intervention of this Office to persuade the Department to cease seeking excessive personal data and to comply with the principle that data collection shall be adequate, relevant and not excessive.

Case Study 11: Personal Data Disclosed by County Council

In April 2014, we received a complaint from an individual who alleged that her private email address was disclosed to third parties without her permission by Dun Laoghaire Rathdown County Council. The complainant had made a submission to the county council in respect of a local area plan. She found out about the disclosure when one of the parties to whom her email address had been disclosed made an unsolicited contact with her using her email address. She indicated that she was worried as she did not know how many people were in possession of her private email address as a result of the disclosure.

We commenced an investigation by writing to Dun Laoghaire Rathdown County Council. In response, the county council explained by way of background that it supplies notices, agendas and minutes of its meetings to parliamentary representatives in accordance with the Local Government Act 2001 (Section 237A) Regulations 2003.

It went on to state: “It has been the practice of this Authority heretofore to supply copies of all reports that issue with these agenda, as this is how the agenda issues to our councillors. In accordance with the Planning and Development Act 2000 [as amended], Section 20(3)(c)(ii), a Manager's Report for a Local Area Plan must list the persons who made submissions or observations. In all cases a list of submitters is prepared, for internal use and file, which includes necessary contact details, home address and email address. It is our standard practice, however, to remove the email addresses before circulation to councillors. The home addresses are left on as councillors wish to see who in their constituency made a submission. In this case we inadvertently included the email and home addresses with the list of submitters. This was an error on our part, and not standard practice. What has been placed on our website, however, is the list without the contact details. In order to prevent a recurrence of this, we have reminded all staff not to include the contact details of submitters in reports which are circulated to councillors or placed on the website. Additionally, although as mentioned above the list that went to councillors usually contained the submitter's address for the councillors’ information, we will not include either home address or email address in any reports issuing to councillors. In addition to the above, and to further prevent the inadvertent release of personal information, the Council will cease the practice of issuing reports with the agenda which are supplied to parliamentary representatives.”

The county council stated that it had issued a revised report, with all of the personal contact details removed, to all of the recipients and it asked that they delete the original version. The county council concluded by saying that in this case the information was disclosed accidentally and it said that it would endeavour to ensure that there will be no repeat of this incident by adhering to its standard procedure and by reminding all staff concerned of those procedures.

The complainant sought a formal decision on her complaint.

Section 2(1)(c)(ii) of the Data Protection Acts provides that personal data shall not be further processed in a manner incompatible with the purpose for which it was obtained. The data controller in this case, Dun Laoghaire Rathdown County Council, explained to our investigation that in accordance with the Planning and Development Act 2000, a County Manager's Report for a Local Area Plan must list the persons who made submissions or observations. The data controller further stated that in all cases a list of submitters is prepared for internal use, which includes contact details, home address and email address, and that it is its standard practice to remove the email addresses from this list before circulation to councillors. However, it was clear that in this particular instance the email addresses of the submitters were not removed from the circulation list. In making his decision, the Commissioner formed the opinion that Dun Laoghaire Rathdown County Council contravened Section 2(1)(c)(ii) of the Data Protection Acts. This contravention occurred by the further processing of the complainant’s personal data in a manner incompatible with the purpose for which it had been obtained when her email address was disclosed by Dun Laoghaire Rathdown County Council via the circulation of a report to county councillors, TDs and senators in relation to a local area plan.

Case Study 12: Eircom Fails to Meet Statutory Timeframe for Processing Access Request

A staff member of Eircom submitted a complaint to this Office in relation to the alleged failure of Eircom to comply with an access request submitted by him to the company in September 2013. In his access request, he specifically requested a copy of a particular letter that was sent on a date in February 2013 to Eircom's Chief Medical Officer.

We commenced the investigation of the complaint and we asked Eircom to respond to the access request without further delay. We were informed by Eircom that it had already provided the data subject with a copy of the letter that was the subject of his access request, and it subsequently provided us with a copy of its response to an access request. However, on further inspection of Eircom's response to that access request, it was unclear to us whether the response related to the particular access request that was the subject of the current complaint, as the response had issued to the data subject prior to the date of his access request. We asked Eircom to review the matter. Eventually, on 2 May 2014, we received an email from Eircom enclosing a copy of its response of that date to the data subject’s access request of 22 September 2013, supplying a copy of the document to which the data subject had sought access.

The complainant asked for a formal decision of the Data Protection Commissioner on his complaint. In making his decision, the Commissioner formed the opinion that Eircom Limited contravened Section 4(1)(a) of the Data Protection Acts by failing to supply the data subject with a copy of his personal data in response to his access request submitted on 22 September 2013 within the statutory period of 40 days. This contravention occurred when Eircom Limited released a copy of the data subject’s personal data to him on 2 May 2014 – which was outside the statutory period of 40 days.

As outlined elsewhere in this annual report, over half of the complaints received by this Office in 2014 were made by data subjects who experienced difficulties in accessing their personal data. One common theme that emerges in many of these complaints is lateness on the part of the data controller in processing the access request. The Acts lay down a period of 40 days for compliance with an access request and if this is not met, as in the case outlined above, the data controller contravenes the Data Protection Acts. The Office of the Data Protection Commissioner is very concerned about the prevalence of this particular contravention. In some instances, the data controller fails to even acknowledge receipt of the access request within the 40-day period. This means that the requester has no idea whether their access request is being dealt with or ignored. There have been many instances where the data controller has taken no action whatsoever in terms of processing the access request until this Office commences an investigation on foot of receiving a complaint from the data subject. Clearly, that is an undesirable situation. Data subjects have a statutory right to access their personal data held by a data controller by the simple means of submitting an access request, and the data controller has a statutory obligation to comply with that request within 40 days. A data subject should not have to resort to the extra step of lodging a complaint with the Office of the Data Protection Commissioner in order to have their statutory right of access enforced. Unfortunately, as the complaint statistics reveal, far too many data subjects are experiencing barriers and access-denying tactics on the part of data controllers.

In the above case, the data subject’s right of access was severely delayed. There is no justification for such a lengthy delay in any circumstances. Such a delay is particularly unacceptable in a situation where the requester simply sought a copy of personal data contained in one relatively recently created letter and where the data controller is a large telecommunications company that is well aware of the Data Protection Acts and receives and processes subject access requests on a regular basis. Eircom is the subject of several data-protection complaints every year across a range of issues, many of which relate to access requests. The Office of the Data Protection Commissioner expects to see a marked improvement in that company’s data-protection performance in the near future, particularly in the context of processing subject access requests in a timely manner.

Case Study 13: Third-Level Student Data Appeared on Third-Party Website

The Office received a notification from a data controller, in accordance with the Personal Data Security Breach Code of Practice. The notification alerted the Office to the fact that data relating to a large number of students had been discovered on a website that was unrelated to the data controller. The data related to the 2010 academic year.

The Office began an investigation of the matter. The data controller advised the investigation team that the information disclosed on the website included the name, email address and password of the student. The investigation team confirmed that there was no financial or sensitive data involved.

The data controller engaged an external security company to carry out its own investigation into the security breach.

Due to the passage of time, there were no server logs showing when or by whom the data had been uploaded to the website. However, the data controller was able to identify that the data published matched a file created for testing purposes in mid-2011. This file was then sent to a third-party service provider who was engaged in developing a management system for the data controller. The file was sent via unsecured email.

The third-party service provider informed the data controller that, while there was a relationship between its staff and the website on which the data was published, it had conducted a very thorough review of the matter and could find no evidence to show that the file had been posted onto the website through any act or omission on its part.

Our evaluation of the information showed that the data controller used a generic password when creating student accounts: the password assigned was the student's date of birth. While students could change their passwords, they were never advised to do so.

While it could not be determined exactly how the data appeared on the website, it was evident that there had been a breach of the Data Protection Acts, in that appropriate security measures were not in place to prevent the unauthorised disclosure of personal data.

Our investigation also found that the use of live data for testing purposes was not in accordance with data-protection best practices. Where live data is being used by an organisation for testing purposes, there would have to be a strong justification for such use and we were not aware of any justification applicable in this particular case. The Office recommended that the data controller cease the use of live personal data for testing and either anonymise the data or create a fictitious data set for testing purposes.
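
As an illustration of that recommendation – and only as a sketch, since the notification does not describe the structure of the student file – the following Python fragment shows one way of producing a pseudonymised copy of a record set for testing. The field names used here are assumptions for the example, not details from the case.

```python
# Sketch only: "name", "email" and "dob" are assumed field names for the example.
import csv
import hashlib
import secrets

def pseudonymise_for_testing(in_path: str, out_path: str) -> None:
    """Write a copy of a student record set with direct identifiers removed,
    suitable for use in a test environment instead of live personal data."""
    salt = secrets.token_hex(16)  # random per run; discard once the test file exists
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=["student_ref", "email", "dob"])
        writer.writeheader()
        for row in reader:
            # Stable pseudonym derived from the name; without the salt the
            # original name cannot be recovered by hashing guessed names.
            ref = hashlib.sha256((salt + row["name"]).encode()).hexdigest()[:12]
            writer.writerow({
                "student_ref": ref,
                "email": f"student_{ref}@example.invalid",  # fictitious address
                "dob": "1970-01-01",                         # real date of birth dropped
            })
```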

The transmission of such student data via an unsecured channel is also inconsistent with the Data Protection Acts. It was found that, during the development of the management system, personal data, including passwords, was exchanged between the data controller and the service provider using an unsecured channel. The data controller advised this Office that it now transmits such data via a secure mechanism. The Office recommended that this mechanism be brought to the attention of all staff.
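
The report does not identify the secure mechanism the data controller adopted, but the following sketch illustrates one common approach: encrypting a file with a key derived from a passphrase (shared through a separate channel) before it is attached to an email. It assumes the third-party Python `cryptography` package and is not a description of the mechanism actually used in this case.

```python
# Sketch of one possible approach, not the mechanism actually adopted in this case.
# Requires the third-party "cryptography" package.
import base64
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_for_transfer(path: str, passphrase: str) -> str:
    """Encrypt a file before sending; the passphrase is shared out of band."""
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    out_path = path + ".enc"
    with open(out_path, "wb") as f:
        f.write(salt + ciphertext)  # the salt is not secret and travels with the payload
    return out_path
```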

Another issue discovered during our investigation that caused great concern was the use of a generic password. The fact that the date of birth of the student was assigned as their password meant that any individual who had access to the date of birth of another student could access the user account of that student. The Office recommended that the data controller communicate with students, advising that they change their password and that the new password be a minimum of 12 characters and include upper- and lower-case characters, numerals and special characters, such as a symbol or punctuation mark.
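
A minimal sketch of that complexity rule is set out below; the function name and test values are illustrative only.

```python
import string

def meets_recommended_policy(password: str) -> bool:
    """Check the rule recommended above: at least 12 characters, with
    upper- and lower-case letters, a numeral and a special character."""
    return (
        len(password) >= 12
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

assert not meets_recommended_policy("01011995")        # a date of birth fails
assert meets_recommended_policy("Tralee-Harbour-42!")  # illustrative passing value
```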

Case Study 14: Data Controller Discloses Personal Data to Business Partner

The Office received notification from a data controller advising that an email had been issued to a business partner which included personal data that should not have been disclosed.

The data controller advised the Office that it had entered into a business agreement with a third-party company to provide anonymised data to allow for a feasibility assessment of a proposed business venture. An email was issued to the third-party company which included the names of individuals in addition to the agreed anonymised data. This allowed the third-party company to identify the individuals involved.

The data controller, in notifying this Office, stated that the third-party company had provided assurances that the data had been deleted.

The Office commenced an investigation of a data-security breach, under Section 10 of the Data Protection Acts.

Given the nature of the data involved and the additional information received by the third party, this Office decided to visit the premises of the third-party business partner to satisfy ourselves that the data had been deleted and not further processed.

An investigation team, using our powers under Section 24 of the Data Protection Acts, arrived unannounced at the premises of the business partner. The team obtained documents in relation to the business agreement; these showed that only anonymised data had been sought. The team also obtained reports that had been created on foot of the receipt of the personal data. It was evident from these reports that, while personal data was available to the third party, it had not been used in the preparation of the reports and had no impact on the reports.

The team then examined the computer systems of the company and discovered several instances of the email it had received which contained the personal data.

The Commissioner felt it appropriate to issue an Enforcement Notice to the third-party company, requiring it to engage an external IT security company to delete any and all copies of the personal data it had received. The IT security company was to provide this Office with a report on the completion of the work. This report was duly received and this Office was satisfied that all copies of the personal data had been securely deleted.

The investigation found that personal data had been disclosed without consent or a legal basis. The investigation also noted that non-business email accounts had been used by members of staff of the data controller in the conduct of business matters. The data controller was advised to prevent the use of non-business email accounts, as it cannot control data transmitted through such accounts.

Case Study 15: Employee of Financial Institution Resigns Taking Customer Personal Data

The Office received a notification from a data controller, in accordance with the Personal Data Security Breach Code of Practice. The notification stated that an employee had tendered their resignation and the data controller then discovered that the employee had emailed a spreadsheet to their personal email account prior to their resignation. The spreadsheet contained details of customers, including their employment details, salaries, contact details and medical consultant.

The data controller provided the name and home address of the employee.

The Office was also contacted by the umbrella organisation of the data controller seeking assistance on how to advise their member.

The Office verified, through the Companies Registration Office, that a business was operating from the home address of the employee. We then contacted the employee on the basis that they were now operating as a data controller in their own right. We sought clarification from the employee as to the consent they had to process any personal data they obtained from their previous employment.

The employee advised the Office that, as part of their employment, they were asked to use their own laptop and personal phone for all business dealings. The employee also advised that they had not yet started canvassing for clients. The employee also confirmed that they had deleted all the personal data they held in relation to their previous employment.

We also engaged with the data controller who had made the notification in relation to the security procedures that were in place to protect customer data in its possession. The Office noted that the employment contract contained appropriate data-protection clauses. However, of concern was the fact that employees were using their own equipment for business purposes. In such circumstances, the data controller has little or no control over the data held on personal equipment.

The data controller introduced further procedures and policies on foot of the issue to prevent a repeat of this type of incident, including the introduction of software to password protect any data records being emailed. Furthermore, all employees must sign an undertaking on termination of employment that all data has been returned and will not be further processed.

Case Study 16: Theft of Unencrypted Laptop

The Office received a data-security breach notification during the year from a medical professional relating to a stolen laptop.

The notification advised that the laptop was password protected, but not encrypted. The notification also advised that the data stored on the laptop related to a medical study that was undertaken in 2009 and included audio files of interviews carried out with the study subjects which contained limited information. It was determined that a file listing the subjects of the study contained an ID number rather than the name of the individual. However, a further file that correlated the ID number with the subject name was also stored on the laptop. This file was also password protected.

It was noted that, before the study began, approval was obtained from the relevant Ethics Committee that covered the storage of data.

This Office advised the data controller of our guidance in relation to the notification of the affected individuals. In this particular case, the data controller advised the Office that it was of the view that notifying the affected individuals would cause them more distress than benefit. This view was offered by the relevant medical professional overseeing the project. This Office must give due weight to the opinion of a medical professional who has a professional relationship with the affected individuals. We assume this decision was taken after weighing the potential effects of an unauthorised disclosure of this data against the potential distress of the individuals being notified of the security breach.

The Office noted, however, that laptops are now being encrypted. This case highlights the fact that data-protection considerations need to be constantly monitored. What may have been an acceptable standard five years previously may not be acceptable now, and security arrangements must be periodically reviewed.

Case Study 17: Compromise of Adobe Network

Adobe Systems Software Ireland Ltd notified this Office in October 2013, in accordance with the Personal Data Security Breach Code of Practice, of a data-security breach regarding an unauthorised access to their systems. Personal data was compromised and the attacker also took Adobe software source-code elements.

Two data controllers were affected: Adobe US and Adobe Systems Software Ireland Ltd (Adobe Irl). We engaged in a coordinated investigation with the Office of the Privacy Commissioner of Canada and were joined in the investigation by the Office of the Australian Information Commissioner.

Nature of Data Compromised

Adobe Irl created three classifications of individuals affected:

  • Payment-card users, i.e. those whose encrypted payment-card numbers were accessed during the breach. The data involved was encrypted payment-card data – approximately 3.65 million payment cards (1 million controlled by Adobe Irl) relating to approximately 3.1 million individuals.
  • Active users, i.e. those who had logged in to Adobe systems at least once in the two years prior to the discovery of the breach. The data involved was: email address and current encrypted password – 41 million (reduces to 33 million, as 8 million email notifications were undeliverable) (20.5 million controlled by Adobe Irl).
  • Non-active users, i.e. those who had not logged in to Adobe in the two years prior to the discovery of the breach. The data involved was: email address and current encrypted password – 71 million (reduces to 46.5 million due to 25 million email notifications undeliverable) (28.5 million controlled by Adobe Irl).

How the Breach Occurred

The attack was a sophisticated and sustained intrusion of Adobe’s computer systems. Attackers identified and removed data from a backup server that stored the compromised data described above. Adobe states it has no evidence to show that unencrypted card details were taken. Forensic consultants engaged by Adobe supported this conclusion.

When Adobe learned of the security breach, they began an investigation of the cause of the issue and also initiated a series of measures including the following:

  • Disconnected the impacted database server from the network
  • Blacklisted IP addresses from which the attacker accessed their systems
  • Reset passwords for all potentially affected users (including active, non-active)
  • Changed passwords for relevant administrator accounts
  • Notified the banks processing customer payments for Adobe, so they could work to protect customers’ accounts
  • Reported the breach to law-enforcement authorities
  • Employed a third-party company to conduct an investigation of the cause of the security breach of its systems and to identify what data may have been compromised
  • Took actions to reduce the risks related to the theft of certain source-code elements
  • Issued notifications to affected individuals, beginning on 3 October 2013, which alerted customers to the security breach

At risk: the attacker posted some of the exfiltrated data on a website, including the email addresses and encrypted passwords of certain Adobe users. A number of research articles have demonstrated that some passwords have been deciphered by reference to password hints and repeated passwords (i.e., the same password used by more than one user). One article highlighted an organisation that had checked the compromised usernames and deciphered passwords against its own platform and found that a significant number of these credentials would have worked on its own platform. The organisation contacted some of its affected users, alerting them to the issue, and also confirmed the scenario to this Office. At issue here is that, while Adobe enforced a password change on its own site and advised users to change their passwords elsewhere, it is evident that not all users followed such advice.

Hints: Parts of the data exfiltrated by the attacker were the password hints of a small percentage of users. These hints were stored in clear text and associated with the username (email address). This information, along with an analysis of the encrypted passwords, will allow for the identification of certain simple passwords. However, as previously noted, Adobe reset the passwords for all impacted users.

Storage: The Office queried why passwords were stored in one system in an encrypted manner rather than hashed and salted. Encrypted passwords can be decrypted, which would allow a data controller – or attackers, if they gained access – to see the passwords of users. Adobe stated that it had in fact been hashing and salting passwords within a new system for a number of years prior to the discovery of the security breach, but had decided to also keep the database in the old system as a backup measure in case of issues with the new system. Passwords in the old system’s database had been encrypted.
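
To illustrate the distinction at issue here – and only as a generic sketch, not a description of Adobe's systems – the fragment below shows salted, one-way hashing using the Python standard library. Unlike reversible encryption, neither the data controller nor an attacker who copies the stored values can recover the original passwords from them.

```python
# Generic sketch of salted, one-way password hashing (scrypt parameters illustrative).
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest  # store both; the plaintext cannot be recovered from them

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```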

Retention of Card Data with Customer Records

Customers who used payment cards to purchase Adobe products or services had their card details stored, in encrypted form, with the customer account within one particular system. Card numbers have now been replaced with a token system. This process began prior to the discovery of the security breach and was completed shortly thereafter. The token, which is encrypted, represents the payment-card number within the customer record, and Adobe's systems transmit the encrypted token to a third-party service provider, whose systems are located outside Adobe's network, for payment processing.
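As a rough illustration of the tokenisation approach described above (a hypothetical sketch only; it does not reflect Adobe's or its payment processor's actual implementation), the card number is replaced in the customer record by a random token, and only the party that processes payments can map the token back to the card:

```python
import secrets

class TokenVault:
    """Toy token vault: swaps a card number (PAN) for a random, meaningless token.

    In practice the vault sits with the payment processor, outside the merchant's
    network, so customer records never need to hold the PAN itself.
    """

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        token = secrets.token_urlsafe(16)  # random; carries no card information
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_pan[token]  # only the processor performs this step

vault = TokenVault()
customer_record = {"email": "alice@example.com",
                   "card_token": vault.tokenize("4111111111111111")}
print(customer_record["card_token"])  # the merchant stores and transmits only this
```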

Notifications to Affected Individuals

Adobe provided the Office with a list of when they notified each class of affected individuals and the relevant notification. In addition, Adobe publicly announced the 2013 breach in posts on its website, which included discussion of the theft of source code. The various notifications did advise individuals to monitor their credit-card statements and change their password if it was used on another site.

When we queried why notifications were not issued to individuals whose compromised data consisted only of contact details, without any password or payment-card data, Adobe replied that it believed notice in this scenario would lead to over-notification and notification fatigue, and that a compromise of this type of data element does not carry a significant risk of harm. The Code of Practice recommends that affected users be notified, so that each affected individual can consider the consequences for themselves and take appropriate measures.

This Office would expect that if a similar incident were to occur in the future, Adobe, or any other data controller, would automatically include all individuals for whom personal data had been compromised in its notification process.

Conclusion and Findings

Adobe fully cooperated with our investigation of the security breach reported to us on 2 October 2013. Adobe took appropriate action on discovery of the attack to prevent further access to their systems as required under Section 2(1)(d) of the Data Protection Acts 1988 and 2003. It also enforced a password change for its users to protect against unauthorised access to account data. Adobe’s quick reaction on learning of the security breach prevented the attacker from exfiltrating unencrypted payment-card details.

Adobe's transition from encrypted passwords in the old system to hashed and salted passwords in the new system could have been achieved more effectively and expeditiously than was the case. Of particular concern to users who provided password hints, Adobe stored these hints in plain text rather than in an encrypted format, and some of them have been compromised.

This Office is cognisant of the fact that data controllers such as Adobe will always be a target for attackers and new attack methods are constantly being devised.

This Office found that Adobe was in breach of Section 2(1)(d) of the Acts by failing to have in place appropriate security measures to protect the data under its control, despite its documented security programme. It was also recommended that Adobe engage a third party to carry out an independent review of its systems.

Adobe has since put in place substantial improvements in its security protocols, practices and procedures, and this Office is satisfied that it now has appropriate procedures in place to minimise the possibility of a similar security breach in the future.

Data protection legislation: interpretation and barriers to research

  • Judith Strobl (strobl{at}liv.ac.uk), research fellow a
  • Emma Cave, research fellow b
  • Tom Walley, professor, clinical pharmacology a
  • a Prescribing Research Group, Department of Pharmacology and Therapeutics, University of Liverpool, Liverpool L69 3GE
  • b Centre For Professional Ethics, University of Central Lancashire, Preston PR1 2HE
  • Correspondence to: Judith Strobl
  • Accepted 26 June 2000

Research has been described as “a powerful means of achieving” the objectives of the Department of Health, namely “to improve the health and well-being of the population and to secure high quality care.” 1 There is, however, a need to find a balance between facilitating important research and protecting the confidentiality of patients. As the capabilities of information technology grow, legal frameworks and professional guidance need to be created or refined to safeguard the rights of patients.

Some areas of the common law duty of confidentiality and the new Data Protection Act 1998 (box, p 891), which constitutes the United Kingdom's implementation of the relevant European Union directive, 2 are causing difficulties of interpretation within the NHS. With few exceptions, broad debate about the implications of the new act is lacking, particularly in the context of epidemiological research that uses patients' records. 6-8 Questions of consent, anonymisation of data for research, and access to medical notes for research purposes (rather than audit) have been addressed in a range of literature. 9-13 Some of these documents are being updated; this may indicate that there are uncertainties about the legal issues involved in implementing the act. Local variations in interpretation may cause particular difficulties for researchers conducting multicentre epidemiological studies, as the case study that will be described in this article shows.

In the meantime, those who must make decisions about confidentiality are still confused. This confusion exists for several reasons. Firstly, there is the interpretation of the act (and to an extent the common law duty of confidentiality). The interpretation is subject to debate, and no case law exists which might clarify the interpretation. Secondly, there is a dearth of up to date and clear policy guidance. Thirdly, the new system of “Caldicott guardians” (box) is untried, and guardians as well as others are only beginning to learn to exercise their new responsibilities. Fourthly, clarification is needed about the role that research ethics committees should have in data protection and confidentiality. Guidance recently issued by the NHS should help clarify some of these areas. 14 We highlight issues for future discussion that have arisen in a case study of a multicentre epidemiological project that sought to use patients' records.

Summary points

The interpretation of the Data Protection Act 1998 and how it affects the NHS, healthcare, and epidemiological research is riddled with uncertainties

Clarification is needed to determine how the common law duty of confidentiality affects the health sector in terms of using patients' data for research

Different interpretations of the act and the duty of confidentiality may adversely affect the ability of researchers to conduct multicentre studies

Regional NHS sources funded our department in collaboration with clinicians from five NHS trusts to undertake a retrospective pragmatic study of the effectiveness and cost effectiveness of a new drug treatment. In the initial phase it was expected that a registered nurse employed by the university would extract data on treatment and on the utilisation of health services from the routine records of patients seen in collaborating trusts.

The relevant multicentre research ethics committee approved the study but advised the researchers that the question of whether explicit consent was needed from patients to allow the researchers to have access to the medical records needed to be clarified with data protection officers at the five hospital trusts. The responses to this request are shown in the box (p 891). The trusts' decisions varied considerably and usually involved complex internal discussions and consultation; consequently, this led to delays.

Does the diversity in the outcomes mean that some trusts made erroneous judgments or that the law is ambiguous, or can the situations in individual trusts be sufficiently different for them to reach contrary decisions? Although the latter case seems unlikely, there are individual circumstances under which trusts may arrive at a different decision about the same project. One such condition may involve cases in which trusts have in place routine mechanisms to obtain consent from patients for the use of their personal data for future research, a procedure which would be subject to the approval of a research ethics committee.

Explanation of terms

Data Protection Act 1998 —This brings into UK law European Directive 95/46/EC on the processing of personal data. It came into effect on 1 March 2000, and in comparison with the 1984 act (which it replaces) it is concerned with both records on paper and records held on computers. The act is based on eight principles, the first of which stipulates that “personal data shall be processed fairly and lawfully.” Interpretation of the phrase “fairly and lawfully” may give rise to different opinions about implementation.

Common Law Duty of Confidentiality —This legal duty applies to information entrusted to someone in confidence. The duty of confidentiality applies independently of the Data Protection Act. The Department of Health acknowledges that there are conflicting legal views on applying this duty and is trying to interpret it for the health sector. 3 In particular, the issue of consent and the conditions under which consent can be implied or waived need to be clarified.

Caldicott guardian —In 1997 the Caldicott Committee reported on its review of information that identifies NHS patients. 4 In keeping with the report's main recommendations each health authority, trust, and primary care group in the United Kingdom appointed a “Caldicott guardian.” One key responsibility of the guardians is to agree and review internal protocols for the protection and use of identifiable information obtained from patients. 5

Trusts' decisions on whether patients needed to give explicit consent

Trust 1 —This trust decided that the researcher could have access to patients' records without explicit consent from patients as long as no identifiable information was removed from the hospital (for example, the researcher could extract information from records and retain it in coded form but the key for decoding would be kept at the hospital). (Time to decision: <3 weeks.)

Trust 2 —The Caldicott guardian decided that consent from patients was required. This decision was later revised after the trust sought legal advice, and the researcher was then permitted to have access to patients' records because the Data Protection Act 1998 only came into force after the start of the study (1 March 2000). (Time to decision: 4-5 months.)

Trust 3 —The data protection officer and the Caldicott guardian advised the researcher to obtain explicit consent from patients because the researcher was not a staff member of the trust and no explicit consent exists from patients to permit the use of their data for research (for example, no agreements are signed by patients when they are first seen). (Time to decision: 6 weeks.)

Trust 4 —The data protection officer immediately decided that the proposed study required explicit consent from patients since only staff with a duty of care to the patient are permitted to have access to that patient's medical records, and, unlike audit, research is not seen as part of the healthcare process. (Time to decision: immediate.)

Trust 5 —The data protection officer made a formal decision only about records held on the computer. The outsider status of the researcher was problematic. The case of deceased patients (which is not covered by the Data Protection Act) would have to be decided by the research ethics committee. (Time to decision: no formal decision at 7 weeks.)

As a result of the trusts' decisions there seemed to be three options available to the researchers: abandon the project entirely, seek explicit consent from patients who have been treated in the trusts that demand explicit consent, or alter the design of the study so that only anonymised data are used.

Issues of consent, anonymisation, and access to patients' records for research need to be more widely discussed and evaluated in terms of the 1998 act and the Common Law Duty of Confidentiality. Well meaning clinicians may be passing anonymised or non-anonymised data to researchers without realising the legal implications.

It is not easy to answer questions about data protection requirements for particular research projects, and many individuals within trusts who are responsible for tackling these questions face difficulties in answering them. Because of the current uncertainty, insurmountable problems may arise in cases in which researchers hope to conduct their studies at a variety of centres, especially since they may have to comply with conflicting interpretations of the existing law and conflicting guidance from various bodies. This situation has created inconsistencies in the access to routine NHS data allowed to researchers. Additionally, the appropriate interactions between the new Caldicott guardians, the data protection officers in each trust, clinicians, and research ethics committees have not yet been fully clarified; however, a revision of the guidance for local research ethics committees is expected to be published later in the year and may partially address this problem. 15 Also, anxieties about the requirements for consent have increased as a result of the exposure of cases in which organs were retained for research and medical research procedures were performed on children. 16 17

One of the options for resolving the issue of consent in our case study was to use anonymous data. A High Court decision in May 1999 increased uncertainty in the healthcare and medical research communities about the legality of processing even fully anonymised data without consent 18 : in this case the judge held that confidentiality can be breached even when anonymisation is used if the patient has not consented and the research is not in the public interest (in this case, data were being sold by pharmacists indirectly to the pharmaceutical industry). The Court of Appeal overturned the judge's decision in December 1999, ruling that as a reasonable pharmacist's conscience would not be troubled by the proposed use of the information any claim for breach of confidentiality was unlikely to be successful. 19 Unfortunately this aspect of the law remains unresolved because leave may be given to appeal to the House of Lords.

The view of the data protection commissioner is that any personal data which has been encoded remains personal data in the sense of the Data Protection Act 1998 provided that the key for decoding it remains in existence. Thus, coded data falls within the scope of the Data Protection Act even if the key for decoding it is not accessible to the researcher. The new NHS number being assigned to patients is an example of such a code, and chronic disease registers and reporting systems or postmarketing surveillance systems of new drug treatments might use codes that can be linked to individuals. Much epidemiological research and research into health economics would simply be impossible to conduct if completely anonymous data had to be used because updating, linking, or validating data is impossible without using codes.

The processing of coded personal data (sometimes called “pseudonymised” data to distinguish it from fully anonymised data) for research does not necessarily contravene the act. However, in considering whether data processing is “fair and lawful” routine mechanisms to merely inform patients in advance about the potential use of their personal data for future research (for example, through form letters or notices posted in waiting rooms) may not be seen as constituting sufficient consent. It is also unclear whether patients who do not register their refusal can be said to have consented. Neither the Data Protection Act 1998 nor the confidentiality law give sufficient guidance as to what constitutes explicit and implied consent and when each ought to be used.

Strict, clear criteria are urgently needed to determine under which limited situations such consent requirements for research using patient data might be waived; these must take into account the degree of anonymisation. The Department of Health's proposal to set up a national confidentiality and security advisory body, which was announced on 15 March, is welcome. 20 This new body should have the potential to provide the necessary clear guidance for research, similar to the guidance in the United States on disclosure of individually identifiable health data for research under specified conditions. 21

Researchers performing epidemiological studies in the United Kingdom need clear guidance in several areas. Firstly, the definition of explicit consent and the situations in which it is required need further explanation. Secondly, there is an unacceptable amount of uncertainty over when consent can be considered to have been implied or when it may be waived on grounds of public interest. Research ethics committees may be asked to advise on whether processing identifiable data without consent is in the public interest. This is an onerous responsibility, especially in light of the uncertainties described in this paper. The Department of Health's ongoing review of guidelines for local research ethics committees will help illuminate this situation. The legal responsibility lies ultimately with the trust, and any decision regarding disclosure must be able to be justified as being in the public interest. Thirdly, anonymisation and its effects need to be clarified especially taking into account the court case described earlier. Fourthly, issues of access to confidential data must be resolved. The effect of a contract between the NHS and outside research staff also needs to be clarified (for example, in cases in which research staff are funded by the NHS itself or when they have an NHS contract with some, but not all, of the trusts involved in a multicentre research study). Ultimately, the legality of any guidance or decision can only be determined by the courts.

[Illustration omitted; credit: Sue Sharples]

In the meantime, a workable solution that respects patients' rights may be to ensure that data are fully anonymised whenever possible. In this case, the data is not personal and does not fall within the scope of the Data Protection Act. If full anonymisation is not possible or the design of the study does not permit it, the use of pseudonymous data (created using codes and carefully restricting access to them) should be considered, bearing in mind that it is still seen by the data protection registrar as personal data. To facilitate future research, trusts need to ensure that sufficient mechanisms are in place to inform patients about any potential use of their data for research and to obtain consent when necessary. Finally, researchers should agree their project design with those responsible for data protection well in advance.
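To make the pseudonymisation step concrete, here is a minimal, hypothetical sketch (not taken from the study): direct identifiers are replaced with random study codes, the coded extract goes to the researcher, and the linkage key stays with the trust; destroying or withholding the key is what moves the extract towards full anonymisation.

```python
import secrets

def pseudonymise(records, id_field="nhs_number"):
    """Replace a direct identifier with a random study code.

    Returns (coded_records, key). The coded records can be released to the
    researcher; the key (study code -> identifier) remains with the trust, so
    the extract is pseudonymous rather than fully anonymous.
    """
    key = {}
    coded = []
    for record in records:
        code = secrets.token_hex(8)
        key[code] = record[id_field]
        coded_record = {k: v for k, v in record.items() if k != id_field}
        coded_record["study_code"] = code
        coded.append(coded_record)
    return coded, key

# Illustrative data only:
patients = [{"nhs_number": "943 476 5919", "treatment": "drug A", "outcome": "improved"}]
extract_for_researcher, linkage_key_kept_by_trust = pseudonymise(patients)
print(extract_for_researcher)
```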

Competing interests None declared.

Funding JS is funded by the NHS Executive North West. At the time of writing this paper, EC was funded by the Liverpool, Manchester, Preston Ethics Training Project (LiMPET), a collaborative venture between the Universities of Central Lancashire, Manchester, and Liverpool, funded by the NHS Executive North West.

References

  • Department of Health
  • Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data
  • NHS Executive
  • Caldicott Committee
  • Tobias JS, Warnock M, Donnelly P, Shellens T, Romano-Critchley G, Sommerville A
  • General Medical Council
  • Medical Research Council Working Group on Personal Information in Medical Research. Research based on archived information and samples
  • 18. R v Department of Health, ex parte Source Informatics Ltd and (1) Association of the British Pharmaceutical Industry (2) General Medical Council (3) Medical Research Council (4) National Pharmaceutical Association Ltd (Interveners) (1999) Times Law Report, 14 June
  • 19. R v Department of Health, ex parte Source Informatics Ltd and (1) Association of the British Pharmaceutical Industry (2) General Medical Council (3) Medical Research Council (4) National Pharmaceutical Association Ltd (Interveners) (2000) Times Law Report, 18 January
  • 20. Department of Health. New body to advise on patient confidentiality: applications invited for board members. 15 March 2000 (press release). pipe.ccta.gov.uk/coi/coipress.nsf/70e1fa6684c1d3f380256735005750fb/b726155c1d2ba69d802568a30044aab0?OpenDocument (accessed 15 March 2000)
  • 21. Secretary of Health and Human Services

CaseQuiz.com

Data Protection Act 1998

Harvard Business Case Studies Solutions – Assignment Help

In most courses studied at Harvard Business School, students are provided with a case study. Major HBR cases concern a whole industry, a whole organization or some part of an organization, whether profitable or non-profitable. The student's role is to analyze the case, diagnose the situation, identify the problem and then give appropriate recommendations and steps to be taken.

To make a detailed case analysis, the student should follow these steps:

STEP 1: Reading Up Harvard Case Study Method Guide:

A case study method guide is provided to students; it determines the aspects of the problem that need to be considered while analyzing a case study. It is very important to read and understand the guidelines thoroughly, as poor reading of the guide will lead to misunderstanding of the case and failure of the analysis. It is recommended to read the guidelines before and after reading the case to understand what is asked and how the questions are to be answered. In-depth understanding of the case guidelines is therefore very important.

STEP 2: Reading The Data Protection Act 1998 Harvard Case Study:

To gain a complete understanding of the case, one should focus on careful reading. The case should be read twice. The first, fast reading, without taking notes or underlining, is intended to give a rough idea of what information is provided for the analysis. The second reading should be done very carefully, highlighting the important points and marking the necessary information provided in the case. The quantitative data in the case, and its relation to other quantitative or qualitative variables, should be given particular importance; manipulating different data and combining them with other available information can give new insight. Bear in mind, however, that not all of the information provided is reliable and relevant.

During the fast reading, the following points should be noted:

  • Nature of the organization
  • Nature of the industry in which the organization operates
  • External environment that is affecting the organization
  • Problems being faced by management
  • Identification of communication strategies
  • Any relevant strategy that can be added
  • Control and out-of-control situations

When reading the case for the second time, the following points should be considered:

  • Decisions that need to be made and the person responsible for making them
  • Objectives of the organization and the key players in the case
  • The compatibility of the objectives and, if they conflict, their reconciliation and necessary redefinition
  • Resources available to the organization and constraints preventing it from meeting its objectives

After reading the case and guidelines thoroughly, the reader should proceed to the analysis of the case.

STEP 3: Doing The Case Analysis Of Data Protection Act 1998:

To make an appropriate case analysis, the reader should first mark the important problems occurring in the organization; any organization may face multiple problems. Secondly, after identifying the problems in the company, identify the most pressing and important problem on which to focus.

First, the introduction is written. Once we have a clear idea of what is defined in the case, we convey it to the reader. It is better to open the introduction with relevant historical or social context. The diagnosis of the challenge facing Data Protection Act 1998 and the management of information needs to be provided. The introduction, however, should not be longer than six or seven lines in a paragraph, as its most important objective is to convey the key message to the reader.

After the introduction, the problem statement is defined. In the problem statement, the company's most important problem, and the constraints on solving it, should be defined clearly and concisely in no more than a paragraph. After defining the problems and constraints, the analysis of the case study begins.

STEP 4: SWOT Analysis of the Data Protection Act 1998 HBR Case Solution:

SWOT analysis helps the business to identify its strengths and weaknesses, as well as the opportunities that can be exploited and the threats the company is facing. SWOT for Data Protection Act 1998 is a powerful analytical tool because it uncovers opportunities that can be used to increase and enhance the company's operations. It also identifies the weaknesses of the organization that should be eliminated and the threats that should be managed and brought to the attention of management.

This analysis helps the company craft a strategy that differentiates it from competitors, so that the organization can compete successfully in the industry. The strengths and weaknesses come from inside the organization, whereas the opportunities and threats generally relate to its external environment; for this reason the method is also called Internal-External Analysis.

Under strengths, management should identify whether the following points exist in the organization:

  • Advantages of the organization
  • Activities the company performs better than its competitors
  • Unique and low-cost resources the company has
  • Activities and resources the market sees as the company's strengths
  • Unique selling proposition of the company

WEAKNESSES:

  • Improvements that could be made
  • Activities that should be avoided for Data Protection Act 1998
  • Activities the market regards as your weakness
  • Factors that can reduce sales
  • Competitors' activities that can be seen as your weakness

OPPORTUNITIES:

  • Good opportunities that can be spotted
  • Interesting industry trends
  • Changes in technology and market strategies
  • Government policy changes related to the company's field
  • Changes in social patterns and lifestyles
  • Local events

The following points can be identified as threats to the company:

  • Obstacles the company is facing
  • Activities of competitors
  • Product and service quality standards
  • Threats from changing technologies
  • Financial/cash-flow problems
  • Weaknesses that threaten the business

The following points should be considered when applying SWOT to the analysis:

  • Precise and verifiable phrases should be used
  • Prioritize the points under each heading, so that management can identify which step has to be taken first
  • Apply the analysis at the proposed level; be clear first about the basis on which the SWOT matrix is to be applied
  • Make sure the points identified carry through into the strategy-formulation process
  • Use specific terms (such as USP or core-competency analysis) to get a comprehensive picture of the analysis

STEP 5: PESTEL/ PEST Analysis of Data Protection Act 1998 Case Solution:

PEST analysis is a widely used tool to analyze the political, economic, socio-cultural, technological, environmental and legal situations that can provide great new opportunities to the company, and that can also threaten the company in future.

PEST analysis is very important and informative. It is used to identify business opportunities and to give advance warning of threats. It also indicates the extent to which change is useful for the company and guides the direction of that change, and it helps the company avoid activities, actions, projects and strategies that would be harmful in the future.

To analyze the business objective and its opportunities and threats, the following steps should be followed:

  • Brainstorm and list the changes that should be made to the organization, answering the necessary questions related to its specific needs
  • Analyze the opportunities that would arise from the change
  • Analyze the threats and issues that would be caused by the change
  • Perform a cost-benefit analysis and take the appropriate action

POLITICAL:

  • Upcoming political elections and the changes that will happen in the country as a result
  • Strong and powerful political figures, their views on business policies and their effect on the organization
  • Strength of property rights and the rule of law, their relationship to corruption and organized crime, and the effects of changes in these areas
  • Changes in legislation and taxation and their effects on the company
  • Trends of regulation and deregulation, and the effects of changes in business regulation
  • Timescale of legislative change
  • Other political factors likely to change for Data Protection Act 1998

ECONOMIC:

  • Position and current trend of the economy, i.e. growing, stagnant or declining
  • Exchange-rate fluctuations and their relevance to the company
  • Changes in the level of customers' disposable income and their effect
  • Fluctuations in the unemployment rate and their effect on the hiring of skilled employees
  • Access to credit and loans, and its effects on the company
  • Effect of globalization on the economic environment
  • Other economic factors to consider

SOCIO-CULTURAL:

  • Changes in population growth rate and age profile, and their impact on the organization
  • Effects on the organization of changes in attitudes and generational shifts
  • Standards of health, education and social mobility, and the effects of changes in them on the company
  • Employment patterns, job-market trends and attitudes towards work among different age groups

  • Social attitudes and social trends, changes in socio-culture and their effects
  • Religious beliefs and lifestyles and their effects on the organization
  • Other socio-cultural factors and their impacts

TECHNOLOGICAL:

  • Any new technology the company is using
  • Any new technology in the market that could affect the work, the organization or the industry
  • Competitors' access to new technologies and its impact on their product development and service quality
  • Research areas of government and educational institutes in which the company can invest effort
  • Changes in infrastructure and their effects on workflow
  • Existing technology that can assist the company
  • Other technological factors and their impacts on the company and the industry

These headings and analyses help the company consider these factors and form a “big picture” of its characteristics. This will help the manager make decisions and draw conclusions about the forces that would have a major impact on the company and its resources.

STEP 6: Porter’s Five Forces/ Strategic Analysis Of The Data Protection Act 1998 Case Study:

To analyze the structure of a company and its corporate strategy, Porter’s five forces model is used. In this model, five forces have been identified which play an important part in shaping the market and industry. These forces are used to measure competition intensity and profitability of an industry and market.

These forces relate to the micro-environment and the company's ability to serve its customers and make a profit. They comprise three forces from horizontal competition and two from vertical competition. The five forces are discussed below:

  • THREAT OF NEW ENTRANTS:
  • When an industry has high profits, many new entrants will try to enter the market. New entrants will eventually cause overall industry profits to fall, so it is necessary to deter new entrants. The following factors describe the level of threat from new entrants:
  • Barriers to entry, including copyrights and patents
  • High capital requirements
  • Restrictive government policies
  • Switching costs
  • Access to suppliers and distribution channels
  • Customer loyalty to established brands
  • THREAT OF SUBSTITUTES:
  • This describes the threat to the company when, if its goods and services are not up to standard, consumers can switch to substitutes and alternatives that require no extra effort and make no major difference; for example, using Aquafina in place of tap water, or Pepsi as an alternative to Coca-Cola. The potential factors that make customers shift to substitutes are as follows:
  • Price-performance of the substitute
  • Switching costs for the buyer
  • Substitute products available in the market
  • Reduction in quality
  • Availability of close substitutes
  • DEGREE OF INDUSTRY RIVALRY:
  • The less money and the fewer resources required to enter an industry, the more new competitors there will be, and an effective new competitor will weaken the company's position. The following are potential factors that influence the competition the company faces:
  • Competitive advantage
  • Continuous innovation
  • Sustainable competitive advantage
  • Level of advertising
  • Competitive strategy
  • BARGAINING POWER OF BUYERS:
  • This deals with the ability of customers to drive down prices. It mainly concerns the importance of a customer and the cost to the customer of switching from one product to another. Buyer power is high if there are many alternatives available, and low if there are fewer options for alternatives and switching. The following factors influence the buying power of customers:
  • Bargaining leverage
  • Switching costs for the buyer
  • Buyer price sensitivity
  • Competitive advantage of the company's product
  • BARGAINING POWER OF SUPPLIERS:
  • This refers to suppliers' ability to raise or lower prices. If few alternative suppliers are available, the company is threatened and would have to purchase its raw materials on the supplier's terms. However, if many alternative suppliers exist, suppliers have low bargaining power and the company does not face high switching costs. The potential factors that affect the bargaining power of suppliers are the following:
  • Input differentiation
  • Impact of cost on differentiation
  • Strength of distribution centers
  • Availability of input substitutes

STEP 7: VRIO Analysis of Data Protection Act 1998:

VRIO analysis for the Data Protection Act 1998 case study identifies four main attributes that help an organization gain a competitive advantage. The author of this framework suggests that a firm's resources must be valuable, rare, imperfectly imitable and non-substitutable; there must therefore be resources and capabilities in the organization that can deliver a competitive advantage. The four components of the VRIO analysis are described below.

VALUABLE: the company must have resources or strategies that can exploit opportunities and defend the company from major threats. If the company holds some value, the answer is yes. Resources are also valuable if they provide customer satisfaction and increase customer value; this value may be created by increasing the differentiation of an existing product or by decreasing its price. If these conditions are not met, the company may suffer a competitive disadvantage. It is therefore necessary to continually review the value of the Data Protection Act 1998 company's activities and resources.

RARE: resources of the Data Protection Act 1998 company that are not used by any other company are known as rare. Rare and valuable resources grant a significant competitive advantage to the firm. When more than a few companies use the same resources, those resources provide only competitive parity. Competitive parity is not the desired position, but the company should not discard its valuable resources even if they are common.

COSTLY TO IMITATE: resources are costly to imitate if other organizations cannot copy them. Imitation occurs in two ways: duplication (direct imitation) and substitution (indirect imitation). A firm that has valuable, rare resources that are also costly to imitate has achieved a competitive advantage. The reasons resource imitation is costly include historical conditions, causal ambiguity and social complexity.

ORGANIZED TO CAPTURE VALUE: resources by themselves cannot provide an advantage to the organization until they are organized and exploited to do so. A firm (such as Data Protection Act 1998) must organize its management systems, processes, policies and strategies to fully utilize the potential of its resources to be valuable, rare and costly to imitate.

STEP 8: Generating Alternatives For Data Protection Act 1998 Case Solution:

After completing the analysis of the company and its opportunities and threats, it is important to generate a solution to the problem and the alternatives the company can apply in order to solve it. When generating alternatives, the following things must be kept in mind:

  • Realistic solutions should be identified that can operate within the company, given all its constraints and opportunities
  • The alternatives should be mutually exclusive, as the problem and its solution cannot occur at the same time
  • It is not possible for a company to take no action at all, so the alternative of doing nothing is not viable
  • The student should provide more than one decent solution; offering two undesirable alternatives to make a third look attractive is not acceptable

Once the alternatives have been generated, the student should evaluate the options and select the most appropriate and viable solution for the company.

STEP 9: Selection Of Alternatives For Data Protection Act 1998 Case Solution:

It is very important to shortlist the alternatives and then evaluate the best one, as the company has limited choices and operates under constraints. Several factors therefore need to be kept in mind when selecting the best alternative. The criteria against which business decisions should be selected are as follows:

  • Improved profitability
  • Increased sales, market share and return on investment
  • Customer satisfaction
  • Brand image
  • Corporate mission, vision and strategy
  • Resources and capabilities

The alternatives should be measured against these criteria to establish which one will perform better than the others and for what valid reasons. In addition, the alternatives should relate directly to the problem statement and the issues described in the case study.
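
One common way to make this comparison explicit is a weighted scoring matrix: each alternative is scored against the criteria above and the weighted totals are compared. The sketch below is illustrative only; the weights, alternative names and scores are assumptions, not values taken from the case.

```python
# Illustrative weighted scoring of alternatives against the decision criteria.

criteria_weights = {
    "profitability": 0.30,
    "sales, market share and ROI": 0.25,
    "customer satisfaction": 0.15,
    "brand image": 0.10,
    "fit with mission, vision and strategy": 0.10,
    "fit with resources and capabilities": 0.10,
}

# Scores from 1 (poor) to 5 (excellent) for each hypothetical alternative,
# in the same order as the criteria above.
alternatives = {
    "Alternative A": [4, 3, 5, 4, 3, 4],
    "Alternative B": [3, 4, 3, 3, 5, 3],
}

def weighted_score(scores):
    return sum(w * s for w, s in zip(criteria_weights.values(), scores))

for name, scores in alternatives.items():
    print(f"{name}: {weighted_score(scores):.2f}")

best = max(alternatives, key=lambda name: weighted_score(alternatives[name]))
print("Preferred alternative:", best)
```

The alternative with the highest weighted total is not automatically the recommendation, but the exercise forces the reasons for preferring one option over another to be stated explicitly.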

STEP 10: Evaluation Of Alternatives For Data Protection Act 1998 Case Solution:

If the selected alternative meets the above criteria, the decision can be taken straightforwardly: the alternative chosen should be the one that performs best when evaluated against the decision criteria. Another method of evaluating the alternatives is to list the pros and cons of each one and to choose the alternative that has more pros than cons and remains workable under the organizational constraints, as sketched below.
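
A minimal sketch of that pros-and-cons check, assuming purely hypothetical entries, might look like this:

```python
# Illustrative pros-and-cons tally: an alternative passes only if it is workable
# under the organizational constraints and its pros outnumber its cons.

def passes_pros_cons_test(pros, cons, workable_under_constraints):
    return workable_under_constraints and len(pros) > len(cons)

print(passes_pros_cons_test(
    pros=["improves compliance posture", "low implementation cost"],
    cons=["requires staff retraining"],
    workable_under_constraints=True,
))  # prints True
```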

STEP 11: Recommendations For Data Protection Act 1998 Case Study (Solution):

There should be a single recommendation for enhancing the company's operations and growth or for solving its problems. The decision taken should be justified and demonstrably viable as a solution to those problems.

IMAGES

  1. The 8 Principles of Data Protection Act 1998

  2. DPA: Data Protection Act of 1998

  3. PPT

  4. The Data Protection Act 1998 Freedom of Information

  5. data protection act 1998

  6. Understanding The UK's Data Protection Act of 1998

VIDEO

  1. Digital Personal Data Protection Act Lecture and presentation in a critique format

  2. DATA PROTECTION ACT OF 2019 TRAINING SEMINAR

  3. Data Protection Act 2023: What is Data Protection Act

  4. New Digital Personal Data Protection Act 2023 by CA. Narasimhan E. on 02.09.23 at ISACA Mumbai

  5. Knowledge Session: Demystifying Digital Personal Data Protection Act

  6. Data Protection Act and Data Privacy

COMMENTS

  1. Top 10 Privacy and Data Protection Cases of 2021: A selection

    Inforrm covered a wide range of data protection and privacy cases in 2021. Following my posts in 2018, 2019 and 2020, here is my selection of the most notable privacy and data protection cases across 2021: Lloyd v Google LLC [2021] UKSC 50. In the most significant privacy law judgment of the year the UK Supreme Court considered whether a class action for breach of s4(4) Data Protection Act 1998 ...

  2. Data Protection Breaches

  3. Lloyd v Google UK GDPR: Data Privacy Class Action

  4. GDPR: Key cases so far

    Although this incident occurred in 2014 and was therefore decided under the Data Protection Act 1998, this case demonstrates how vital it is that organisations put in place appropriate technical and organisational security measures adequate for the type of data being held, also taking into account the risk of disgruntled employees and ...

  5. Lewis Silkin

    It is not just businesses that need to worry about the long arm of data protection, the Information Commissioner's Office ("ICO") has warned, after two employees were convicted of unlawfully accessing personal data and fined. Both cases were prosecuted under section 55 of the Data Protection Act 1998 (now repealed), which states that a ...

  6. PDF Lloyd (Respondent) v Google LLC (Appellant)

    A. INTRODUCTION. Mr Richard Lloyd - with financial backing from Therium Litigation Funding IC, a commercial litigation funder - has issued a claim against Google LLC, alleging breach of its duties as a data controller under section 4(4) of the Data Protection Act 1998 ("the DPA 1998").

  7. The Data Protection Act 1998

    The Data Protection Act 1998 creates a series of rights for people in relation to data which is held about them, and also a mechanism (the Information Commissioner) to enforce those rights. It sets out a series of data protection principles which have now stood the test of time. The eight data protection principles are set out in schedule 1 of ...

  8. 2

    Huntley [2005], better known as the Soham murders, where a police force wrongly claimed that data on the defendant could not be shared under the Data Protection Act 1998 and there was a lack of a clear retention policy for documents. This case was followed by an in-depth study of the way that the Huntley information was handled, and resulted in ...

  9. Morrisons & A landmark Judgment in Data Protection

    Morrisons vindicated: A landmark judgment in data protection and vicarious liability. 03 April 2020. First hand insights from the team who worked on the ground-breaking case before the Supreme Court. DWF acted for Wm Morrison Supermarkets in their successful defence of a group action for vicarious liability arising out of a mass employee data ...

  10. Qualitative Research and the Data Protection Act 1998

    The Data Protection Act 1998 is the UK's response to an EU Data Protection Directive designed to protect individual rights in the collection, processing and transferring of personal data. Similar responses are being produced all over Europe and they vary in severity from the relatively relaxed regimes proposed in Ireland and Sweden to the tough ...

  11. Data Protection Act 1998

    Unstructured personal data held by public authorities. 10. Right to prevent processing likely to cause damage or distress. 11. Right to prevent processing for purposes of direct marketing. 12. Rights in relation to automated decision-taking. 12A. Rights of data subjects in relation to exempt manual data.

  12. PDF Data Protection Act 1998

    Data Protection Act 1998, c. 29, Part VI, section 53(1): "An individual who is an actual or prospective party to any proceedings under section 7(9), 10(4), 12(8) or 14 or by virtue of section 13 which relate to personal data processed for the special purposes may ..." (marginal note: "Assistance by Commissioner in cases involving ...").

  13. Case Study: Data Protection & The GDPR

    The Data Protection Act 1998 (DPA) already imposes significant penalties for data protection compliance failures, non-compliance, and non-disclosure. The new EU General Data Protection Regulation (GDPR) penalties are even higher! Of course, most of us are not legal experts so navigating the path of compliance seems far from straightforward.

  14. Top 10 Privacy and Data Protection Cases of 2018: a selection

    The Court of Appeal referred the question of whether the "journalistic exemption" in section 32(4) of the Data Protection Act 1998 is compatible with the Data Protection Directive and the EU Charter of Fundamental Rights to the CJEU. There was a Panopticon Blog post on the case.

  15. Facebook fined £500k for UK data protection law breaches

    The £500,000 fine is the maximum the ICO could impose for the breaches under the UK's previous data protection regime, the Data Protection Act 1998, which was applicable during the time that the breaches occurred. Had the same breaches occurred after the General Data Protection Regulation (GDPR) began to take effect, the fine imposed on ...

  16. Case studies and examples

    Our data sharing code provides real-world examples and case studies of different approaches to data sharing, including where organisations found innovative ways to share data while protecting people's information. Here are some case studies additional to those in the code. Data sharing to improve outcomes for disadvantaged children and families.

  17. The Data Protection Act (1998): implications for health researchers

    The Data Protection Act (1998): implications for health researchers. ... In Study 1, the difficulties encountered when the Multicentre Research Ethics Committee refused permission for researchers to recruit patients directly to a multicentre randomized controlled trial are discussed. In Study 2, the method used to compile a sampling frame for a ...

  18. The UK Data Protection Act of 1998: Summary & Principles

    The UK Data Protection Act of 1998 worked to help make sure that the private information of UK citizens was protected.

  19. Case Studies

    The DPC applied the test for application of this exemption which had been set out in the UK judgment of Guriev & another v. Community Safety Development (UK) Limited [2016] EWHC 643. That case had concerned the equivalent exemption under the UK Data Protection Act 1998.

  20. Data protection legislation: interpretation and barriers to research

    Some areas of the common law duty of confidentiality and the new Data Protection Act 1998 (box, p 891), which constitutes the United Kingdom's implementation of the relevant European Union directive,2 are causing difficulties of interpretation within the NHS. ... as the case study that will be described in this article shows. In the meantime ...

  21. Data protection legislation: interpretation and barriers to research

    Data Protection Act 1998 —This brings into UK law European Directive 95/46/EC on the processing of personal data. It came into effect on 1 March 2000, and in comparison with the 1984 act (which it replaces) it is concerned with both records on paper and records held on computers. The act is based on eight principles the first of which ...

  22. The Role of the Data Protection Act 1998 Case Study

    This work, "The Role of the Data Protection Act 1998", takes a closer look at the Data Protection Act 1998 and its key aspects. The author outlines its main elements and positions, and provides a simplified version of all the provisions in the Act with regard to the acquisition, management, and propagation of information. …

  23. Data Protection Act 1998 Case Study Solution and Analysis of Harvard

    STEP 2: Reading The Data Protection Act 1998 Harvard Case Study: To have a complete understanding of the case, one should focus on case reading. It is said that case should be read two times. Initially, fast reading without taking notes and underlines should be done.