It all started with an American investigation into a drug-trafficking case. Data on this criminal network was reportedly located on a user’s Outlook account in Microsoft’s servers in Ireland. The U.S. Government issued a warrant requiring Microsoft to disclose data in its possession but the Redmond firm refused to comply on the grounds that the data was located outside the United States. Microsoft faced backlash over its refusal, some even questioning its patriotism.
While the case was being decided by the Supreme Court, the U.S. Congress tackled the issue by enacting on March 23, 2018, a rider tacked onto an omnibus budget bill, called the “CLOUD Act” (standing for Clarifying Lawful Overseas Use of Data Act).
CLOUD ACT: WHAT DOES IT SAY?
The CLOUD Act amends the Stored Communications Act of 1986, under which obtaining the disclosure of any data hosted outside American territory involved a tedious process: requests for international legal assistance based on bilateral treaties.
Now, a simple warrant is sufficient to enjoin any U.S. company to provide such information, regardless of the data’s physical location.
The CLOUD Act applies to any “United States person”, defined very broadly (for legal persons) as a corporation that is incorporated in the United States, including a foreign subsidiary.
Not surprisingly, the proceedings against Microsoft Ireland were dropped and reopened under the CLOUD Act, Microsoft having already publicly announced that the data would be transmitted in accordance with this new framework.
CLOUD ACT: THE EUROPEAN RESPONSE
Beyond preparing its own piece of legislation, the European Union expressed, via its European Digital Commissioner, its serious concerns following the hasty passing of the CLOUD Act.
Already in 2001, when the Patriot Act, giving the U.S. Government access to certain data in cases relating to national defence, was signed into law, Europeans feared data “leaks” to the United States. Those fears were subsequently confirmed by the Snowden revelations and the PRISM and Echelon programmes. Now, with the CLOUD Act, the transmission of data to the American justice system can be systematised for ordinary criminal cases.
However, a processor or controller who responded too quickly to a U.S. court order would necessarily incur liability: Article 48 of the European General Data Protection Regulation (GDPR) clearly provides that any judgement of a court or tribunal, and any decision of an administrative authority, of a third country requiring a controller or processor to transfer or disclose personal data may only be recognised or enforceable if based on an international agreement. The problem is that no such international agreement exists (yet).
The protection of European citizens’ data would mean not entrusting their data to a company governed by American law.
Under the strong influence of the GDPR and of data sovereignty concerns, a number of EU cloud service provider (CSP) companies now offer cloud platforms that are under no obligation to hand over EU data subjects’ data under the CLOUD Act.
What are EU government bodies saying?
German Economics Minister Peter Altmaier plans to build up a German cloud service to allow European companies to store data independent of Asian or U.S. rivals such as Amazon.com Inc.
Germany’s central bank has also recently warned the region’s banking sector that shifting data to the cloud will make the industry harder to monitor.
The question of who can access bank data in the cloud, and under what circumstances, must be set out clearly and restrictively. As a means of fighting crime, the current US administration signed into law an Act that in certain cases permits access to the data held by a CSP without a court order. This can even apply to cases in which the data are stored outside the US.
Cloud service providers that build their platforms with local law and the GDPR in mind are a good place for EU businesses looking to move to the cloud to start their search.
At Relentless Privacy and Compliance we build GDPR compliance programs from the ground up, looking at risk from every angle. As you can see below, Article 48 of the GDPR clearly states that any judgement of a US court or tribunal under the CLOUD Act can only be complied with if there is a bilateral international agreement in place.
Therefore, in order to provide GDPR guidance, we cannot say that a company moving to a US CSP is secure or complies with Article 48 of the GDPR. EU companies launching cloud migration projects should carry out a full DPIA, and data sovereignty should play an integral part in that DPIA.
Art. 48 GDPR Transfers or disclosures not authorised by Union law
Any judgement of a court or tribunal and any decision of an administrative authority of a third country requiring a controller or processor to transfer or disclose personal data may only be recognised or enforceable in any manner if based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the Union or a Member State, without prejudice to other grounds for transfer pursuant to this Chapter.
Relentless Privacy and Compliance Provides full GDPR Services for all sizes of organisations from startups to PLC’s
The US (CLOUD) Act came into force in 2018, amending the US Stored Communications Act 1986 (SCA 1986) primarily for the purpose of allowing US law enforcement to demand (via warrant or subpoena) personal data from US electronic communications and cloud service providers (together, ‘CSPs’), to assist in investigations relating to serious crime and terrorism, even when that data is held in a third country.
Where it all began
The CLOUD Act represents the culmination of a series of attempts to amend SCA 1986, which came into force before cloud technology existed. The Bill originated following the case of United States v Microsoft Corp 584 US (2018), in which the Federal Bureau of Investigation (FBI) was granted a warrant directing Microsoft to disclose to the US Government the contents of, and all other records associated with, a specified email account within its control. Microsoft determined that the requested data was all stored in its data centre in Dublin, Ireland and refused to comply with the request.
The Congressional Findings in the CLOUD Act recognise that CSPs may face conflicts in providing the requested disclosure, due to the data protection laws of the country their servers are based in. The CLOUD Act therefore provides a procedure by which the CSP can apply to the court to have the request either quashed or modified. However, such a request can only be made if the data subject is not a US citizen or resident and if disclosure ‘would create a material risk’ of violating the laws of the third country. The threshold for a ‘material risk’ is not clear. In any event, after considering a list of specified factors balancing foreign and US Government interests, the court can still order that the data be provided, even if this would violate the law of the country in which the data is stored.
Let’s clear up the confusion over the name “CLOUD Act”
The term CLOUD Act mistakenly suggests that it is only relevant to cloud services. In fact, the aim of the “Clarifying Lawful Overseas Use of Data Act” – the full title – is to remove all boundaries, so that it is irrelevant where the data is processed or stored: in the cloud, in a data centre outside the cloud, in the US or abroad. All that matters is that it belongs to a US company, which must support the US authorities in any aspect of their jobs, including criminal investigations.
Which companies are directly affected?
Those directly affected by the CLOUD Act include internet providers, IT service providers, and cloud providers based in the US, as well as their customers, i.e. European companies whose data is processed via an American service provider, possibly via the cloud.
While companies could previously argue that a court order for the release of data is only effective in the United States, they must now inevitably also transfer data stored abroad to the requesting US authorities. In addition, there is a danger that US authorities will not limit their data search to companies based in the US, such as Microsoft, Google, and Amazon (among others), but will also extend their request for information to all companies as soon as they have found a connection to the US.
How far-reaching is it?
The history of the CLOUD Act suggests that the data in question is exclusively personal data – which in itself is worrying enough, given the particular importance of data protection in Europe. But the CLOUD Act allows US authorities access not only to data of US citizens stored in the EU, but also to all other data that a US company processes or has processed abroad.
This means that the personal data of EU citizens worth protecting is just as insecure as operational data or company data – from business details to trade secrets and intellectual property. The CLOUD Act thus collides with laws in Germany, such as the Unfair Competition Act, and with the European Union, above all the General Data Protection Regulation (GDPR).
The EU and US interpretations of data protection are far apart
The fact that the US does not have the same ideas about data protection as Germany and the European Union should not come as a surprise. There is a reason why the United States of America is regarded by the EU as an “insecure third country”. The latest developments confirm this once again: the CLOUD Act creates an immense contradiction to the GDPR that applies within Europe.
Where does the idea of data protection come from?
EU: Data protection is based on the fundamental right to informational self-determination.
US: Data protection is anchored as part of consumer protection and is thus part of commercial law.
Is there a universal legal basis?
EU: Yes, the General Data Protection Regulation (GDPR).
US: No, but there are industry-specific solutions (e.g. the SCA and the CLOUD Act).
Duties of companies
EU: The rights and obligations of companies that process data, and of those that commission such processing, are comprehensively regulated by the GDPR.
US: Companies that process data, and those that commission such processing, should ensure the security of such data.
Rights of companies
US: Companies can define their own level of data protection and set up self-obligatory regulations (compliance).
Consequences of infringement
EU: Violations of the GDPR may result in hefty fines and prohibition orders.
US: Violations of compliance rules are considered deceptive or unfair actions and are punished with consequences under competition law.
Supervision
EU: Data protection authorities, in accordance with Art. 51 GDPR, check compliance with the GDPR; companies must cooperate with the supervisory authorities.
US: Data protection supervision is carried out by the Federal Trade Commission, which is responsible for monitoring companies under competition law and consumer protection law.
Encryption
EU: Art. 32 GDPR recommends encryption or pseudonymisation of data.
US: Nothing in the CLOUD Act prevents data storage or processing companies from being required to support the decryption of data.
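In practice, the encryption-or-pseudonymisation recommended by Art. 32 GDPR can start as simply as replacing direct identifiers with keyed hashes. The sketch below is illustrative only; the function name and key handling are our own assumptions, not taken from any regulation or compliance product:

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed hash.

    HMAC-SHA256 makes the mapping reproducible only for whoever holds the key,
    so the key must be stored separately from the pseudonymised records."""
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()
```

Note that pseudonymised data is still personal data under the GDPR; only irreversible anonymisation takes data out of the Regulation’s scope.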
Shortly after the US enacted the CLOUD Act, the GDPR came into force in Europe. The regulation of the European Union regulates the processing of personal data by private companies and public authorities. The aim of the GDPR is to safeguard the fundamental rights and freedoms of natural persons, to protect personal data and at the same time to ensure the free movement of data within the EU.
Workarounds: are there any, and do they work?
What chance do companies and their customers, who are subject to both laws, have of avoiding this dilemma? On the part of relevant US providers who store or process data, numerous solutions have already been identified or tried out to solve this conundrum. So far, none of the attempts to circumvent the problem have been satisfactory.
If there is no possible way to avoid data access by US authorities… if the discrepancy between the European and American understanding of data protection cannot be eliminated… if the contradiction between the GDPR and the CLOUD Act cannot be resolved by negotiation… what does this mean for companies in Germany and the EU? The following five typical scenarios illustrate the impact of the CLOUD Act in Germany and the corresponding recommended course of action.
1: Subsidiary of a US corporation
The simplest case is a company operating in Germany or the EU that is part of a US company’s group structure. In this case, the CLOUD Act also applies without there having to be a data transfer with the USA. The parent company is subject to US law, as are all of its subsidiaries. An objection is not possible; protective measures (such as technical encryption or a data trustee) are ineffective.
2: EU company with a subsidiary in the USA
For an EU-based company that has a subsidiary in the USA, and thus a data transfer with the USA, the GDPR could initially be invoked as an objection in the event of a request for data by a US authority. In this scenario the corporate structure is relevant. For example, it is advisable to define a data separation in the company (if possible), which can reduce the relationship with the US. Whether this really helps in individual cases is unclear. The local companies must also expect that the US authorities could threaten the US subsidiary with reprisals in order to increase the pressure on the parent company in the EU to grant data access after all. In the case of personal data, a European company granting such access would be in violation of the GDPR, and the disclosure would have to be reported to the supervisory authorities.
3: EU company with US service providers in the broader sense
The CLOUD Act does not only oblige companies to disclose their own data, but any data in their possession, custody or control. Consequently, scenarios 1 or 2 apply to any service provider (unless it is simply a US provider) that is contracted to store and process data. For example, for a German or EU company that has its data processed by a hosting provider or cloud service provider with a “connection” to the USA, the CLOUD Act applies. Any obligations and measures on the part of the service provider that are set out in a contract for the processing of personal data pursuant to GDPR Article 28, and which serve to protect personal data, cannot invalidate the CLOUD Act. Other economic data is likewise not secure in a US-related cloud. In the event of a request by US authorities, the service provider must grant it, but must inform its customer of access by third parties in accordance with the processing contract.
4: Other uses of American cloud services
Even if a processing contract cannot release a cloud service provider from its obligation to provide data under the CLOUD Act, it is a signal that, in the era of the CLOUD Act, companies should take a closer look at their providers. But what about data services for which there is no processing contract? Anyone who believes that something like this does not happen in their company, and that all data, even remotely personal or otherwise sensitive, is safe, should check carefully which tools and programs they use:
Is there a social media account with a relevant US provider in which new employees are introduced?
Do teams use free sharing solutions from US providers to work together on projects?
Does the company send marketing emails via US servers?
Does the company use popular analytics programs from US providers for website visitors?
Any US service provider whose tool or platform companies use falls within the scope of the CLOUD Act. The question that users of cloud services must ask themselves is: how sensitive, mission-critical, or worth protecting is the data that organisations put in the cloud using such services?
5: Cloud solutions from the EU for the EU
As clear as the situation is for subsidiaries of a US group, it is equally clear for EU companies: choose a cloud provider based in the EU that does not store or process data anywhere other than in European data centres. Providers that are subject to German or EU law must act in accordance with the GDPR. If they are also free from any influence or “association” with the USA or US service providers, there is no danger of being obliged to disclose personal data on the basis of the CLOUD Act. If a European cloud service provider is acquired by a US company, it falls directly within the scope of the CLOUD Act. In this case, the cloud provider would have to inform its customers at an early stage and offer them the opportunity to export and delete data.
What about non-personal data?
The CLOUD Act also applies to non-personal data. It must therefore be clear that IoT measurement and telemetry data, raw data for big data analysis, data in merchandise management systems and ERP software – and even data representing protected intellectual property – can be viewed by US authorities. European cloud servers are therefore also the recommended storage location for other corporate data, in order to protect it from access by US authorities.
Conclusion and Roundup
Unfortunately, the fundamental incompatibility of the GDPR and the CLOUD Act leaves companies with only limited security. The mood remains dark, and for local companies it is unclear what will really happen if the worst comes to the worst. For cloud users and cloud service providers, many questions remain:
Should we trust self-obligatory data protection rules, for example from Microsoft, Google, etc.?
Are we prepared to submit to a data query by the US authorities?
What would an obligation to disclose our data mean for us and our customers from an economic point of view?
Ultimately, each company must think carefully about which provider it wants to entrust with what data. Cloud providers and IT service providers from the EU currently offer maximum security and are GDPR-compliant. Especially since one can never know when the next threatening storm will brew in the USA.
Is there any compatibility with the CLOUD Act?
As far as facilitating prosecutions on both sides of the Atlantic is concerned, it remains to be seen where the road leads. After all, the European Commission is also endeavouring to regulate the release of data for criminal prosecution by law. In addition to an E-Evidence Regulation, which advocates requesting electronic evidence (including user and content data) directly from data processing service providers in order to speed up investigations, there is also a paper setting out the arguments in favour of an agreement with the USA on the CLOUD Act.
Better to play it safe
Those who want to take their data quickly out of danger should rely on an experienced GDPR-compliant service provider from Germany or the EU, one that processes their data according to the current highest data protection and data security standards and that will continue to support this in the future.
Is the whole of Europe’s data safe from EU agencies’ prying eyes?
Not entirely: three EU member countries have recently passed surveillance laws.
In the Netherlands, the lower house approved a bill that allows the police to hack the devices of suspects in a criminal case. It’s called Cyber-crime III and, in its original form, gave the police the power to make use of software vulnerabilities that the developers were unaware of (zero-day vulnerabilities). This divided lawmakers.
The French government imitates the PRISM project with its own expansive electronic surveillance networks, reports Le Monde. The newspaper found that French intelligence collected massive amounts of data and stored it on its servers. The data included telephone records — the identifiers of participants, place, date, duration and the size of the message — as well as email metadata and all internet activity passing through Google, Microsoft, Apple and Yahoo.
In the UK, the law to watch out for is the Investigatory Powers Act. It allows for the government to access and store data of everyone in the country. That data includes browser history, phone records and messages. The government issued a restriction that justifies intrusions only in the case of “serious crime.” However, they defined “serious crime” itself as any offence punishable by six months in prison and any crime that involves sending a communication.
Our recommendation at this moment in time is to find a CSP based in Germany for maximum protection.
Relentless Privacy and Compliance Services provides expert advisory services for global data protection. Are you an EU business? Do you have questions regarding your data positioning? Contact us to find out how we can advise on the best CSP solution for your organisation’s data.
Rolling out new regulations is only the first step in dealing with Europe’s massive cyber-security and data protection problems. Almost half of UK businesses that identify issues discover one attack or security breach per month, according to the University of Portsmouth’s Cyber Security Breaches Survey (CSBS). Since 2018, the General Data Protection Regulation (GDPR) has been the primary law protecting data and privacy, establishing a framework for fining organisations that are lax in protecting consumers. More than a year after the GDPR’s enactment, breaches still occur at unprecedented rates, and most corporations have yet to see fines for non-compliance. In 2019, all that is changing. Here are five reasons GDPR compliance is on the rise.
Officials realise that enforcing GDPR is essential for consumer protection. Forty-one companies have received fines from Germany for GDPR-related offences; the highest, €80,000, penalised an organisation for failing to protect health information from public disclosure. In July, British Airways announced via the London Stock Exchange that, following an extensive investigation, the ICO had issued a notice of its intention to fine the airline £183.39M for infringements of the General Data Protection Regulation (GDPR). The UK Data Protection Authority also says it is fining Marriott £99 million for a data breach exposing private information of 383 million guests, including 30 million European Union residents.
Cyber-crime prevention is one aspect of GDPR. Regulations also restrict data sharing and protect consumer privacy. France slapped Google with a €50 million GDPR fine for failing to disclose its process for gathering and using personal information. It’s the first massive fine under GDPR for a global technology company. Google also failed to obtain each user’s consent to personalise ads. The technology giant is not alone in its need to improve privacy and data protection.
Facebook faces a $5 billion US Federal Trade Commission (FTC) fine, after settling with the FTC over the user data scandal involving Cambridge Analytica, a third-party consulting firm. During the 2016 US presidential campaign, Cambridge Analytica acquired private data for tens of millions of Facebook’s users to create psychological profiles to sell to political campaigns. Facebook’s fine is the most significant civil penalty in FTC history for a technology company. Although not GDPR-related, the fine is a wake-up call for businesses to develop or enhance data protection policies.
Reputational damage will be a core consequence of any GDPR-related fine or penalty, similar to the aftermath of a privacy or cyber-related security incident. Associated financial costs may be difficult to discern immediately, because reputational damage is less a stand-alone loss and more an impetus for several potential consequences, namely lost consumers (in both the B2B and B2C contexts), stock price decline, and subsequent difficulty for innovation and growth due to higher borrowing costs.
GDPR-related reputational damage is an elusive risk because the size and scope is contingent upon many factors, such as revenue size and industry, the nature of the alleged noncompliance, the duration of the investigatory process, and timing.
Large revenue companies may face greater regulatory scrutiny and therefore have more reputational exposure based on the sheer size and scope of their data collection and processing efforts (in addition to their wider brand recognition). This is particularly likely for industries already in the EU regulatory crosshairs, such as the U.S. technology sector.
Between 2013 and 2014, almost three billion Yahoo user accounts were affected in a hacking attack, making it the largest data breach in history, and yet it took over two years for Yahoo to report it. Not only did the breach harm Yahoo’s reputation, it cost real money: Yahoo faced a $35 million fine from the SEC, and the incident threatened Yahoo’s acquisition by Verizon, which cut the deal price by $350 million.
Highly publicised data breaches are fuelling the desire for enhanced security protection measures. A recent cyber attack at SingHealth in Singapore compromised data for 1.5 million patients. When news outlets publicise information about high-profile attacks, they raise awareness about the need for secure information technology infrastructure. According to the CSBS, nearly 60 per cent of businesses give senior management updates on cybersecurity.
In fact, Forbes predicts that global cybersecurity spending will surpass $124 billion. Technology platforms are driving business growth and increasing competitiveness. As a result, security drivers, such as industry changes, security risks, and business needs, are critical concerns for organisations seeking to enhance online business interactions. New technologies are offering consumers convenience with online banking, service delivery, remote working and cloud computing. These operational changes are moving businesses toward greater cybersecurity to ensure seamless and secure internet experiences for users.
Most businesses and charities which handle sensitive information are aware of GDPR and its implications. GDPR is impacting the shift toward improving cybersecurity schemes because companies know they can receive a fine for non-compliance. CSBS respondents report that more than a third of entities are making changes in cybersecurity policies as a direct result of GDPR’s enactment and enforcement. These changes include staff training, updating systems, and improving processes.
GDPR is sparking greater engagement between corporate board members and internal data security professionals. Some organisations are reporting a greater consistency in maintaining encryption for sensitive files. Staff training and better communication about cybersecurity are ways in which organisations are protecting consumer data overall. These steps toward greater privacy and data protection are proportionate to a business’ ability to meet the growing need for security specialists. While experts project GDPR will have long-term positive effects on Europe’s cybersecurity landscape, the regulations are a starting point.
Skill shortages prevent some businesses from tackling security challenges and implementing processes which ensure consumer protection. The skill gap forces workers to take on the role of protecting digital assets without formal training. Thirty per cent of CSBS respondents send staff to training, and nearly half of the businesses outsource cybersecurity to enhance online protection. Security professionals assist in implementing vital protective measures, affecting data classification and a wide variety of document management processes for businesses seeking to mitigate risks.
Take a look at the wide range of services and the fast-growing GDPR 24/7 platform that Relentless Privacy and Compliance Services has to offer.
GDPR recently made the news in the Netherlands when the Dutch Data Protection Authority, the Autoriteit Persoonsgegevens (AP), issued a €460,000 fine to a hospital in The Hague. The Haga Hospital did not adequately protect the medical records of a TV celebrity who entered the hospital in 2018 after a suicide attempt. No fewer than 197 staff at the hospital were found to have “snooped” on the well-known patient’s records.
The fact that a sizeable GDPR fine has been levied at a Dutch hospital comes as no surprise, as the AP had previously announced its focus on the public and health sectors. On top of the initial fine, the Haga Hospital will be liable for a further €100,000 every fortnight after 2 October 2019 if it has not improved the data security of patients by that date (up to a maximum of €300,000).
Why Target the Health Sector?
You can always argue that hefty fines against hospitals are immoral. Maybe that amount of money would be better spent elsewhere, but part of patient care should always be a respect for privacy. In fact, this is the first GDPR fine which the Dutch Data Protection Authority has issued, so it hasn’t been undertaken lightly.
The Haga Hospital breached Article 32 of the GDPR. This demands a level of security which corresponds to the level of risk posed by the data in question. Health-related data is considered high risk because it has a more significant potential to harm individuals if leaked. For instance, it might affect the subject’s job prospects or their reputation. Like all data, it needs managing and mapping, so that the hospital or organisation knows where it is and who has access to it.
The risks associated with processing medical data, and all aspects of its handling, (e.g. who has access, IT security, the security of premises, anonymisation), must be assessed to avoid the type of GDPR breach which occurred at the Dutch hospital. Small to medium-sized companies or bodies can use GDPR compliance software to assist them in these steps.
Two-Factor Authentication and Control of Logging
The Dutch Data Protection Authority found that the Haga Hospital did not use two-factor authentication and failed in control of logging. This is a breach of GDPR Article 32, but what does it mean?
Two-factor authentication is as it sounds: the user must identify themselves using two different pieces of evidence. It’s a double-check on who is accessing the data, based on something they know, something they have, or something they are. This type of multi-factor authentication is often used in banking, where you must know the answers to security questions as well as your password, for instance. At an ATM you need your bank card as well as your PIN.
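The “something you have” factor is commonly implemented as a time-based one-time password (TOTP), the mechanism behind authenticator apps. A minimal sketch of the standard RFC 6238 computation follows; the function names are our own illustrative choices:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestamp=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password from a base32 secret."""
    key = base64.b32decode(secret_b32)
    if timestamp is None:
        timestamp = time.time()
    counter = int(timestamp // step)                 # number of 30-second steps elapsed
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_second_factor(secret_b32, submitted, timestamp=None):
    """Constant-time comparison of the submitted code against the expected one."""
    return hmac.compare_digest(totp(secret_b32, timestamp), submitted)
```

Real deployments would also tolerate clock drift by checking adjacent time steps and rate-limit verification attempts.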
Control of logging refers to being in command of who consults files and the ability to check this at all times, thus quickly identifying unauthorised access. In the case of the Dutch hospital, only six security checks on random patients were carried out each year. This does not correspond to the sheer volume of data a hospital processes. To meet GDPR standards, systematic, risk-oriented control is necessary.
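Systematic, risk-oriented control can be automated rather than limited to a handful of annual spot checks. The sketch below is a hypothetical illustration; the event fields and the idea of a per-patient “treatment team” list are assumptions, not taken from any real hospital system:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class AccessEvent:
    """One entry in the access log: who looked at whose record, and when."""
    user_id: str
    patient_id: str
    timestamp: datetime

def flag_unauthorised_access(log, treatment_team):
    """Return every access by a user not on the patient's treatment team.

    treatment_team maps patient_id -> set of user_ids with a need to know."""
    return [e for e in log if e.user_id not in treatment_team.get(e.patient_id, set())]
```

Running such a filter over the full log on a schedule, instead of checking six random patients a year, is the kind of control proportionate to the volume of data a hospital processes.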
Similar Case in Portugal
There have been several high-profile GDPR infractions recently. Hospital data breaches tend to shock because of the sensitivity of the data. In Portugal, the Centro Hospitalar Barreiro Montijo was fined €400,000 last year under similar circumstances. Too many people had easy access to patient records. This poses a particular risk to high-profile patients but is disconcerting to all. The GDPR threat at both hospitals came from a lack of security within.
Lessons to Learn
In a healthcare setting, patients have an absolute right to expect confidentiality, regardless of who they are. And that can only happen if data is accessible on a need-to-know basis; a key factor in keeping it secure. Raising staff awareness is paramount in protecting data.
Similar principles apply to all companies. GDPR offers a chance to show customers that you respect their data and process it responsibly. Using the right software can help; why not begin the compliance process today?
Having good security measures in place is great, but you can never be 100% safe from a data breach. A data breach could lead to an investigation by the Data Protection Authority (DPA), resulting in potential enforcement action against your organisation and damage to your reputation and brand. Being prepared with a solid breach plan is essential.
You need to know how to respond to a breach.
While it’s possible to work all of this out once a breach occurs, the relatively short deadline for informing the Data Protection Authority (DPA) makes it very hard to take the right actions if you haven’t prepared your breach procedure in advance.
You also need to keep records of breaches and take action to reduce the risk of them happening again.
The GDPR also requires you to have appropriate security measures in place. Demonstrating that you’ve done this will not only help to avoid breaches, but will show that you’ve not been negligent in your approach.
Recognising a breach
The next vital aspect is detection. How do you detect a data breach? At the operational level, if your employees realise that they’ve exposed sensitive personal data through an error, that is a breach detection. At the other end of the scale, security monitoring systems should highlight personal data breaches; that too counts as breach detection. Doing this successfully requires having the correct level of controls in place, an essential requirement of the GDPR. A nightmare scenario is failing to detect the breach due to a lack of controls, only for a third party to discover it and go public with it. This could lead to a maximum fine of 2% of global annual turnover or €10 million, whichever is greater.
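As a toy illustration of the monitoring end of the scale, a detection rule might scan log output for patterns resembling personal data; the regex and log lines below are simplified assumptions, and real monitoring systems use far more robust techniques.

```python
import re

# Naive pattern for one kind of personal data: email addresses.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def scan_for_exposure(log_lines):
    """Flag log lines that appear to leak personal data."""
    return [line for line in log_lines if EMAIL.search(line)]

alerts = scan_for_exposure([
    "GET /public/index.html 200",
    "DEBUG dumped record: jane.doe@example.com",
])
# only the DEBUG line is flagged for investigation
```

The point is that detection is an automated control you put in place beforehand, not something you improvise after a third party calls.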
Protecting yourself from cyber incidents
The National Cyber Security Centre (NCSC) provides lots of useful and practical information on protecting your organisation from cyber threats. Some useful resources include:
Weekly Threat Reports
Information for small and medium sized organisations
Small Charity Guide
Cyber Essentials certification
Some of these incidents may happen through human error and honest mistakes. They could also occur through carelessness and a lack of procedure or guidance. It is therefore essential that your organisation has a suitable data protection policy in place, and that all of your staff, including any volunteers, have completed GDPR and data protection awareness training.
Even when a crime has been committed against you, it is your responsibility to follow the necessary procedures, as the breach involves personal data under your control. All staff must know how to recognise a breach and that they have a duty to make the organisation aware. Inform employees that they should report a suspected breach to an identified member of staff (possibly a Data Protection Officer) who handles the rest of the procedure.
When a breach occurs, the organisation should first establish:
the facts of what happened
what personal data was involved
the number of people likely to be affected
the likelihood and severity of impact on the people affected
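The initial triage above could be captured in a simple structure; the field names and the crude risk test are illustrative assumptions, not an official template.

```python
from dataclasses import dataclass

@dataclass
class BreachAssessment:
    facts: str                  # what happened
    data_involved: list         # categories of personal data involved
    people_affected: int        # number of people likely to be affected
    likelihood_of_impact: str   # e.g. "unlikely" / "likely" / "highly likely"
    severity_of_impact: str     # e.g. "minimal" / "significant" / "severe"

    def likely_risk(self) -> bool:
        """Crude first-pass test for a 'risk to rights and freedoms';
        a real decision needs human judgement on the full context."""
        return (self.likelihood_of_impact != "unlikely"
                and self.severity_of_impact != "minimal")

incident = BreachAssessment(
    facts="laptop containing patient list stolen from office",
    data_involved=["names", "home addresses"],
    people_affected=250,
    likelihood_of_impact="likely",
    severity_of_impact="significant",
)
```

Writing the assessment down in a fixed shape like this also gives you the record you will later need to justify whether or not the breach was reported.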
Reporting a breach
After a breach has been escalated within your organisation, you must decide if you need to report it to the Information Commissioner’s Office (ICO). Failing to notify a reportable breach can result in a significant fine. When should a breach be reported? Not all breaches need to be reported to the ICO, but if the breach is likely to involve a ‘risk to people’s rights and freedoms’, it must be (Article 33). Such a risk is one where the people involved could suffer adverse effects as a result of the breach. This depends on what was in the data and how it might be used to damage them, as well as the scale of the breach. The inappropriate disclosure of sensitive or confidential information could be reportable if it would have a negative impact on someone’s privacy. Identity theft, fraud, financial loss and damage to reputation are other risks to rights and freedoms that could result.
The context, scale and level of sensitivity are more important than the nature of the breach. The same type of breach could be reportable or not, depending on the likely effect on individuals.
For example, accidentally sending a bulk email inviting a small number of people to a community event using the ‘to’ rather than the ‘bcc’ field is unlikely to be a reportable breach. But sending a similar email to a group of people who are receiving mental health counselling from you would be, as the context would reveal health information about those people. If you are satisfied that there is no risk to anyone’s rights or freedoms, then the breach does not need to be reported. In coming to this conclusion, you should record the reasons for your decision.
How is a breach reported?
A breach must be reported to the ICO without undue delay and, where feasible, within 72 hours of when you became aware that a breach had occurred. This 3-day limit applies even if the incident happens over a weekend or holiday. You need to report to the local DPA and give details of the incident. Even if you haven’t established all of the facts, you should still report within 72 hours. Don’t delay, as you will have the opportunity to provide follow-up information. The helpline staff can advise on what to do next, whether you need to inform the individuals affected, and how to take measures to prevent a recurrence. As most DPA helplines are only available from 9:00 am to 4:30 pm, Monday to Friday, you should report through their online facility if you need to do so at other times.
What happens next?
The local DPA decides what happens next. Breaches are not routinely made public by the local DPA. In some cases they will simply record the incident; in others they will investigate the circumstances that led to the breach. The outcome can range from no further action through to a monetary penalty in the rarer case of a serious breach involving negligent or deliberate action.
Informing individuals
There is also a requirement in the GDPR to inform the individuals affected as soon as possible (Article 34).
This will allow them to take precautions and protect themselves against any negative effects, such as identity fraud. The threshold for informing individuals is slightly higher than the one for reporting to the ICO: compared with a likely ‘risk to individuals’ rights and freedoms’, you need to inform people if there is a ‘high risk’. This difference can be hard to judge. It’s best to take the view that if you need to report to the local DPA, you probably also need to tell the individuals. The local DPA can tell you whether you need to inform individuals, or require you to do so.
You need to clearly communicate to the people involved:
what personal information was involved
what risks are likely or possible
measures you’ve taken or are proposing to address the breach
your contact details where they can get more information
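As a small aid to the 72-hour rule described above, the reporting deadline can be computed mechanically from the moment of awareness; the dates below are purely illustrative.

```python
from datetime import datetime, timedelta

def reporting_deadline(became_aware):
    """The 72-hour window (Article 33) runs in calendar time from the
    moment of awareness, including weekends and holidays."""
    return became_aware + timedelta(hours=72)

aware = datetime(2019, 6, 14, 17, 30)   # a Friday evening
deadline = reporting_deadline(aware)     # the following Monday at 17:30
```

A breach discovered on a Friday evening must therefore be reported by Monday evening, which is exactly why the procedure has to exist before the incident does.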
Whether or not you need to report a breach to the ICO, you should keep a clear record of every breach incident. The GDPR requires controllers to “document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken” (Article 33(5)).
The GDPR also requires organisations to be accountable and transparent. Under the security of processing, controllers and processors must put in place appropriate measures “to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services” (Article 32).
Keeping a clear record of breaches will help you to meet accountability requirements and is an appropriate measure to ensure the security of processing.
These records also allow the local DPA to verify that relevant breaches are being reported. You will also need to act on any breach to reduce the risk of recurrence. Identifying patterns or gaps in your practice is important, and keeping records shows that you’re taking responsibility for what happened. You can choose how you keep this record: it could be a long-form written document, or a spreadsheet. It is advisable to record:
the date that the breach happened
when it was identified and by whom
if and when the Local DPA were notified (include a case number if given one)
the nature and circumstances of the breach
what types of personal information were involved
how many people were affected
likely effects of the breach, especially if there is evidence of effects
if a breach was not reported to the ICO, the reasons for this decision
remedial action taken to address the breach and prevent further occurrences
any other information you think relevant
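The advisable fields listed above might be kept as one structured entry per incident, for example a row in a spreadsheet or a line of JSON; every value below is invented for illustration.

```python
import json

entry = {
    "breach_date": "2019-06-14",
    "identified_on": "2019-06-15",
    "identified_by": "IT service desk",
    "dpa_notified": True,
    "dpa_case_number": None,          # add once the DPA assigns one
    "nature": "misdirected bulk email",
    "data_types": ["names", "email addresses"],
    "people_affected": 42,
    "likely_effects": "low: no sensitive data exposed",
    "reason_not_reported": None,      # required when dpa_notified is False
    "remedial_action": "staff retrained; bcc enforced by mail gateway",
}
record_line = json.dumps(entry)       # one line per incident in the register
```

A fixed schema like this makes it easy to spot patterns across incidents and to show a supervisory authority the reasoning behind each report-or-not decision.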
The Relentless GDPR 24/7 compliance platform includes all the necessary tools to manage your data breach processes.
The GDPR can be seen as a complex and far-reaching piece of legislation. One area where data privacy professionals may have a better understanding is Article 32 (Security of Processing).
The GDPR does not downplay security at all; rather, the language of Article 32 takes a broad, flexible and risk-based approach. In other words, it is reasonable and pragmatic.
Security measures must ensure the ‘confidentiality’, ‘integrity’ and ‘availability’ of your systems and services and of the personal data you process within them.
Every good Article 32 strategy has these three pillars at its heart and, much like the crooked Amsterdam houses in the main feature image above, they rely on each other to stay secure and upright; remove one and it fails.
Some of the requirements noted include business continuity, testing, encryption and prevention of unauthorized access. Security professionals will find it all very familiar, reasonable and most likely included in any reasonably complete information security programs.
However, it would be unwise and incorrect to perceive Article 32 as the only place where security considerations are required under the GDPR. A well-rounded GDPR compliance program should include security measures that are raised in other GDPR mandates. Here are just a few other GDPR mandates where security measures must be considered and addressed:
Breach Notification and Response:
Articles 33 and 34 cover long-standing issues tightly aligned with a security program: incident response and breach notification. Notice to both supervisory authorities and data subjects is required in certain instances, and knowing how the GDPR is similar to or different from your existing response and notification requirements is of prime importance. Two key issues to keep in mind:
a 72-hour reporting timeline is required for certain breaches; and
in addition to legal requirements, there are usually more restrictive contractual obligations that controllers may impose on their processors.
Records of Processing Activities:
Article 30 requires that technical and organizational security measures implemented for processing activities be included in the documentation that organizations create.
International Data Transfers:
Articles 44 through 47 provide for various requirements that must be in place before transfers to a third country outside of the EU can take place. Each type of transfer mechanism has security obligations embedded that are important to understand and incorporate.
Adequacy decisions in place prior to GDPR are still in effect and the applicable countries have achieved that designation as a result of the data protection laws and enforcement procedures they impose, including security mandates. For each relevant transfer based on adequacy, security obligations specific to the applicable country should be understood and addressed.
Appropriate safeguard transfers such as model clauses likewise include security obligations and since the model clauses cannot be edited, security measures are effectively a contractual obligation even before GDPR is enforced.
Finally, binding corporate rules (BCRs) take a holistic approach to data protection including policies, accountability and training around appropriate security assessments and protections.
Data Protection Impact Assessments (DPIA):
Articles 35-36 describe the obligation to implement DPIAs. Both in the language of the GDPR (Art. 35(7)) and the guidance released by the Art. 29 Working Party, security measures are a key factor in conducting an effective DPIA. For example, to have true business impact and comply with the GDPR, DPIAs must assess risk, including risks to security; use input from security experts; implement adequate security measures; and identify residual risk, including residual risks to security.
While these are just a few examples, the key point of the GDPR is to take a holistic approach, identify and manage security risk across your entire business cycle, and to see the foundation of the regulation as seeded in consumer protection and transparency. Just like a well-rounded security program relies on a comprehensive, risk-based approach, the GDPR requires us to apply broad, meaningful security protections.
Relentless GDPR services provide full coverage of all the compliance components needed for your organisation’s privacy strategy.
Organisations of all sizes can be weighed down by the volume of records that they create or gather both in paper and electronic formats. How does your company deal with this mountain of paper and electronic records?
How long should your company retain and archive such records when considering the countless number of complex national and international record retention requirements and other government agency standards?
A blanket policy of indefinitely retaining and storing all of your company’s paper and electronic records is impractical, costly and could still fall foul of the data minimisation requirements of data privacy laws. It is not the answer.
In contrast, an effective record management and retention policy will help to answer the above practical questions because such a detailed policy will define a company’s legal and compliance recordkeeping requirements. In addition, the policy should outline a system by which a litigation hold can override certain record retention requirements if the litigation hold requires a longer retention period, as well as when a company’s records may be destroyed following expiration of the applicable retention periods.
Scope and Application of a Company’s Record Management and Retention Policy
The scope of a company’s record management and retention policy should apply to all records of the company, regardless of the format in which such records are created or stored. Each business unit and all of the company’s employees and officers should be required to adhere to the policy. Data awareness programs play an important part in an organisation’s data privacy strategy. The terms of the policy should be followed consistently and re-evaluated by management on a periodic basis, at intervals identified in the policy.
Retention Schedule in a Company’s Record Management and Retention Policy
Given the global spread of organisations’ operations, there is no single law or regulation establishing a uniform record retention period with which a company must comply. Instead, the number of laws and regulations requiring a company to retain certain documents is increasing, along with the penalties a company may face for failing to follow best practices in record retention management.
Therefore, a company’s policy should include a well-planned record retention schedule that addresses each type or category of data created by the company in the course of its business and indicates how long those records must be retained.
Key components of a Record Management and Retention Policy
The policy should provide, at a minimum, the following:
Types of records covered by the policy
Specified procedures related to maintenance of each category of records created or obtained
Record retention instructions, retention time periods and storage procedures
Timeframe for when the policy should be reviewed and evaluated
Steps that a company will take to ensure compliance with the policy and specified consequences for violations
Organisations should also have a system in place in which they identify the types or categories of records that are subject to a specific retention period. This identification system will provide guidance to the company as to when these records may be destroyed once the requisite retention period has passed. The policy should also provide clear record disposal and destruction guidelines that the company and/or its third-party contractors will follow.
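A minimal sketch of such an identification system follows, with a litigation hold overriding the schedule as described earlier; the categories and retention periods are invented for illustration, as real periods depend on the laws applicable to your business.

```python
from datetime import date

# Illustrative schedule: category names and periods are assumptions,
# not drawn from any specific statute or regulation.
RETENTION_YEARS = {
    "invoices": 7,
    "payroll": 6,
    "marketing_consent": 2,
}

def eligible_for_destruction(category, created, today, on_hold=False):
    """A record may be destroyed once its retention period has expired,
    unless a litigation hold overrides the schedule."""
    if on_hold:
        return False
    expiry = created.replace(year=created.year + RETENTION_YEARS[category])
    return today >= expiry
```

Tagging every record with a category at creation time is what makes this check possible later; without it, disposal decisions must be made file by file.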
The Importance of Having a Record Management and Retention Policy and Next Steps for Your Company
The most significant takeaway here for organisations is that they have a written record management and retention policy, and that their employees, officers and applicable third parties are following this policy consistently and effectively.
If your company does not have such a policy in place, a shrewd decision would be to engage proficient advisors to assist in creating a written record management and retention policy and putting appropriate protocols in place to ensure compliance with national and global requirements.
Relentless Privacy and Compliance Services is uniquely situated to provide policy advice and services in this area as its Data Security & Privacy Team has vast experience in assisting companies of all sizes with creating and updating their record management and retention policies, as well as creating frameworks by which companies can manage their types of records based on the applicable retention periods.
At Relentless we are always looking to provide our readership with value-add content.
The Relentless Global Comprehensive Data Privacy Assessment includes looking at data security by design and by default. This great article looks at the details and consequences behind data breaches.
Today we are pleased to share with you our first partner content, from civic.com, authored by Chris Smith.
The Titanic taught us a fundamental lesson about icebergs: only a small part is visible above the surface. When it comes to data breaches, 2018 was the year we discovered the part of the iceberg floating below the surface.
Despite all this information, people, not to mention companies, still are not taking significant measures to protect their online accounts. Some statistics show that people are actually less worried about privacy and security, and they trust companies more than they did a few years ago. We are starting to see reports of people taking data privacy measures, like deleting the Facebook App, but there is still an emphasis on convenience over security.
When the news covers these data breaches, the focus is on the bigger picture: the fines, the number of records that have been compromised, the combined cost to consumers, or undue influence on elections. There is less focus on the individual, yet proving identity is a fundamental part of our day to day lives.
We want to ensure that the individual impacts of data breaches and security failures are not overlooked. So we put together an infographic that shows the daily touch points that make everyone more vulnerable, as people continually distribute their identity information on the Internet.
As hacks become more widespread and the consequences become more severe, it’s critical to consider these interactions and consider how companies and people can make changes to protect their identity information without sacrificing convenience.