Privacy regulations regarding data protection are both a challenge and an opportunity. A key indicator of GDPR maturity is the data protection mindset within companies and the level of data protection provided: that is, what, where and how business-critical workloads operate on cloud infrastructures.
For many companies, the GDPR is still seen as a complex project. The legal, technical and organisational challenges it brings have proved stressful, resource-sapping and difficult to sustain. Particularly in large migration projects in the cloud computing environment, in the IoT environment or in big data scenarios, day-to-day business leaves little time to give data regulations the attention they require.
However, in addition to numerous implementation challenges, the GDPR also offers the opportunity to excel by redefining and implementing new data protection and IT security strategies, especially in the context of cloud computing.
As a result, cloud computing raises many questions in the context of the GDPR. In legal terms, using a cloud service is a form of commissioned data processing, so the cloud user should be fully aware of how their data is processed at all times. Cloud providers and resource providers act in a supporting role and are bound by the legal requirements placed on the controller. In other words, both cloud providers and businesses must meet the minimum legal requirements for each cloud service under the GDPR.
Special attention should be paid to your cloud solution’s ability to meet Article 28 and Article 48. In light of the US CLOUD Act, US cloud service providers cannot guarantee compliance with Article 48:
“Any judgement of a court or tribunal and any decision of an administrative authority of a third country requiring a controller or processor to transfer or disclose personal data may only be recognised or enforceable in any manner if based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the Union or a Member State, without prejudice to other grounds for transfer pursuant to this Chapter.”
Seize the moment
How can companies take advantage of the opportunities behind the GDPR? There are two main questions. On the one hand, companies need to know which cloud providers they can trust to fully comply with the above articles.
On the other hand, companies need to know which technical and organisational measures they must take in order to be “GDPR-compliant”.
Opportunities and obligations for CISOs – the right cloud partner
The right cloud partner can be a valuable sparring partner in light of the GDPR: with their expertise in compliance and security, they can assist your company on its journey towards GDPR compliance.
If you apply a multi-cloud strategy, you need to assess the data protection policies of each cloud provider.
Hybrid and multi-cloud approaches are much more complex to coordinate and therefore may present a higher data protection risk. The multitude of different cloud providers, especially in the public cloud environment, makes it difficult for CISOs to ensure GDPR compliance. GDPR compliance is only as strong as its weakest link: a breach or non-compliance by a single cloud provider within a multi-cloud deployment can undermine all efforts towards GDPR compliance.
Six Key Criteria
Here is a quick set of criteria you can use to evaluate your potential or existing cloud partners in terms of GDPR compliance:
Security and Privacy
A first necessary step is to assess to what extent the provider is able to comply with your IT security requirements. One easy way cloud providers can demonstrate compliance with security and “Privacy by Design” is by being ISO 27001 or ISO 27018 certified.
Companies that work with a wide range of critical data must obtain sufficient guarantees: Article 28 of the GDPR requires that the data controller use “only processors providing sufficient guarantees to implement appropriate technical and organisational measures in such a manner that processing will meet the requirements of this Regulation and ensure the protection of the rights of the data subject.” Hence, you need to make sure that your cloud provider conducts regular audits for the review, assessment and evaluation of its technical and organisational measures to guarantee the security of processing. In addition, you need to make sure that the cloud partner grants its customers the right to audit. For EU companies, appointing a cloud service provider owned by an EU member state helps ensure that your data is in safe hands and will not be handed over to a third-country court, for example, as such a disclosure would expose both the cloud service provider and the company to large fines under the GDPR.
Knowing where your data is stored and processed is important. Yet not all cloud partners provide you with the necessary transparency about cloud locations. Note that the cloud provider’s headquarters is not necessarily where your data is hosted. In addition, your data may be moved between different cloud locations in the background without your knowledge; this may be permitted by the cloud partner’s Terms of Service. Finally, cloud service providers may store data in multiple locations, some of which may be outside the EEA. As a Data Controller, you need to define a multi-country cloud strategy that adheres to adequacy requirements as well as data localisation laws.
The GDPR requires a slew of data protection safeguards, from encryption at rest and in transit to access controls, pseudonymisation and anonymisation. The easiest way to achieve this is to choose a cloud partner that offers enough security features to choose from, such as backup, encryption and access control policies. If your cloud partner does not offer such features, you need to implement them yourself.
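As an illustrative sketch only (pseudonymisation alone is not full GDPR compliance), one common technique is a keyed hash: the pseudonym cannot be linked back to the person without the key, which is stored separately from the data. The function name and key value below are our own assumptions, not terms from the Regulation:

```python
import hmac
import hashlib

# Placeholder key for illustration; in practice it must be generated securely
# and stored apart from the pseudonymised records.
SECRET_KEY = b"store-this-key-separately"

def pseudonymise(identifier: str, key: bytes = SECRET_KEY) -> str:
    """Return a stable pseudonym (HMAC-SHA256 hex digest) for an identifier."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always yields the same pseudonym, so records stay linkable
# for analytics, while different people get unrelated tokens.
assert pseudonymise("alice@example.com") == pseudonymise("alice@example.com")
assert pseudonymise("alice@example.com") != pseudonymise("bob@example.com")
```

Because the mapping depends on the key, rotating or destroying the key effectively severs the link between the pseudonyms and the original identifiers.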
As a customer of a cloud provider, you are the Data Controller, which means you must maintain control and ownership of your own data. This can be achieved by signing a Data Processing Agreement (DPA) with your cloud partner to guarantee that the partner adheres to the data privacy protection requirements of the GDPR. You can either draft your own or check whether your cloud partner offers a DPA as a standard part of its Terms of Service. The advantage of using your own is that you can specify the types of personal data and “special” data collected. Whether you use your partner’s DPA or your own, make sure the terms state clearly that the Data Controller (i.e. you) owns the data and that the Data Processor (i.e. the cloud partner) will not share the data with third parties.
You need to make sure that once your contract with the cloud partner has ended, you can download/erase the data and that the cloud partner will delete the data once you’ve terminated the service. Some cloud providers, especially when they are ISO-certified, have defined a standardised policy for deleting data after contract expiration. Try to find out how long it takes for the cloud provider to delete your data.
A data governance framework such as that found in Relentless GDPR247 provides a detailed pathway to compliance that is easily maintained and has built-in continuous improvement.
Organisations are realising that failure to protect customer data is creating long-term business problems. One of the biggest is the fear of being unable to manage the fallout of a data breach involving a third-party processor.
Consumer reaction to data breaches
In a recent survey, 69 percent of 7,500 consumers surveyed from France, Germany, Italy, the UK and the US said they have boycotted or would “boycott an organisation that showed a lack of integrity for protecting customer data”. The concerns are real.
Furthermore, 62 percent of consumers were inclined to blame the company (the controller), not a third-party processor, if their personal data were lost.
Placing your data in the cloud does not mean you wash your hands of all responsibility. With the introduction of the GDPR, third-party risk became even more heightened. If the data handler or data processor suffers a breach, you, the data controller, would almost certainly be held accountable. However, if you work with third parties and have done your due diligence, the regulators will look on that very differently.
Low-cost airline group Lion Air recently found 30 million records posted online, including passport details, names, addresses and contact details. It appears that an AWS storage bucket was left unsecured and open to the public.
With the Asian region still playing catch-up on privacy laws, the fines imposed and the obligations to notify regulators and, more importantly, the affected data subjects are sketchy to say the least. It is not yet certain whether the Lion Air group or any of the third parties involved was subject to the GDPR. If that were the case, the fine and the damage to the brand could put a large dent in the business and could threaten its operations.
Quite often, security is an afterthought. Data centre hosting can involve a myriad of complex contracts: a data centre may be owned by one company, operated by another and contracted to yet another, with everyone pointing fingers at each other when something goes wrong.
From a legal standpoint, there can still be issues with cloud service providers.
Most controllers concentrate on two requirements of their processors:
That the processor will follow the processing instructions, and
That the processor will keep the data secure.
But third party due diligence needs to go further and deeper.
A full third-party due diligence audit should take place, and this option should be clearly stated in data processing addendums/SCCs (Standard Contractual Clauses).
Under the GDPR, serious breaches must be reported within 72 hours, not almost a year as in Uber’s case. If a data breach carries a “high risk of adversely affecting individuals’ rights and freedoms”, the regulation is stricter still, stating that the breach must be reported without “undue delay”.
The only exception is where the data controller judges that the breach is “unlikely to result in a risk to the rights and freedoms of natural persons”; even then, the breach must be thoroughly documented internally, along with the reason for not informing a DPA, and a DPA can ask to see that documentation at any time.
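The 72-hour clock runs from the moment the controller becomes aware of the breach, not from when the breach occurred. A trivial sketch of that arithmetic (function and variable names are our own, purely for illustration):

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority within 72 hours of
# becoming *aware* of the breach, where feasible.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware_at: datetime) -> datetime:
    """Latest time by which the supervisory authority should be notified."""
    return became_aware_at + NOTIFICATION_WINDOW

aware = datetime(2019, 6, 3, 9, 30, tzinfo=timezone.utc)
print(notification_deadline(aware).isoformat())  # 2019-06-06T09:30:00+00:00
```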
A large percentage of reported data breaches were found not to have met the reporting criteria, because companies possibly rushed the decision process for fear of missing the 72-hour window.
There are already signs that organisations are comparing authorities to find the most lenient: a multinational, for example, may choose to report a breach to an authority with weaker enforcement powers.
Third parties are very often the weak link in data security. According to some reports, third-party failure plays a part in 63 percent of all data breaches.
However, the headlines about breaches always centre upon the controller and rarely mention the third-party processors that may have played a part in the breach.
Third party due diligence frameworks
The process approach
Life cycle phase 1: Planning—Management develops plans to manage relationships with third parties.
Life cycle phase 2: Due diligence and third-party selection—The enterprise conducts due diligence on all potential third parties before selecting and entering into contracts or relationships.
Life cycle phase 3: Contract negotiation— Management reviews or has legal counsel review contracts before execution.
Life cycle phase 4: Ongoing monitoring—Management periodically reviews third-party relationships.
Life cycle phase 5: Termination and contingency planning—Management has adequate contingency plans that address steps to be taken in the event of contract default or termination.
Relentless Privacy and Compliance Services’ outsourced DPO service manages all third-party contracts and due diligence.
In September 2018, California became the first state to pass a law addressing the security of connected devices. The law goes into effect in 2020 and requires that manufacturers of any internet-connected device equip it with “reasonable” security features. It is a good first step toward addressing the risks inherent in the world’s increasing connectivity.
The legislation predates any federal law securing IoT devices; it is not the first time that California has led the way on data privacy and security policy, and the new law may serve as a template for future legislation. It has faced both praise and criticism, but as with any policy addressing new technology, it raises many new, and sometimes difficult to answer, questions, such as the following:
What is IoT security and what are the potential consequences of insufficiently secured internet-of-things devices?
IoT security refers to steps taken to secure or enhance the safety of internet-connected devices: everything from Amazon Echo, Google Home and Ring doorbells to connected appliances like stoves, refrigerators and thermostats. It can mean anything from requiring a unique password on devices to ensuring that devices use only password-protected internet connections.
There are many consequences to insufficient or nonexistent IoT device security, chief among them being that the devices can be taken over by cyber criminals and used against their owners. For example, internet-connected devices that have cameras or microphones could be used to record or listen to their owners without permission. Additionally, internet-connected devices like webcams, digital video recorders and home routers can be strung together and used in botnets for distributed denial-of-service attacks launched by cyber criminals.
What is the government doing about this?
While several IoT security bills have been submitted in Congress, none has made it to a vote. However, some states like California are implementing bills that include security requirements for IoT devices.
The main provision of the California IoT security law is that “a manufacturer of a connected device shall equip the device with a reasonable security feature or features.” What does “reasonable” security features mean?
California’s IoT law leaves “reasonable security features” intentionally vague, as what “reasonable” looks like will vary by device. Generally speaking, “reasonable” security measures include the ability to change the default username and set a unique password for the device. For some devices, it could mean the ability to allow only certain voices or faces to give commands.
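To make the password provision concrete, here is a hypothetical sketch of the check a manufacturer might apply: a device passes if it ships with a password unique to that device, or if it forces the user to set new credentials before first access. The function name and the list of common defaults are our own illustrative assumptions, not text from the statute:

```python
from typing import Optional

# Widely shared factory defaults that would fail the "unique password" test.
COMMON_DEFAULTS = {"admin", "password", "12345", "default"}

def meets_password_provision(preset_password: Optional[str],
                             forces_new_credentials: bool) -> bool:
    """Rough check against the law's two acceptable options."""
    if forces_new_credentials:
        # Option 1: the user must generate new credentials before first use.
        return True
    if preset_password is None:
        return False
    # Option 2: the preprogrammed password is unique to the device.
    return preset_password.lower() not in COMMON_DEFAULTS

# A shared factory default fails; a per-device password or forced setup passes.
assert meets_password_provision("admin", forces_new_credentials=False) is False
assert meets_password_provision("x7Kq-93fA", forces_new_credentials=False) is True
assert meets_password_provision(None, forces_new_credentials=True) is True
```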
Will this law make the IoT secure?
It is difficult to say whether this law, or any law, will make the internet of things secure, because each device has different security vulnerabilities. That said, the bill’s vagueness, especially in its password requirements, fails to address authentication methods such as PINs or facial recognition, which are not considered passwords.
What are the benefits and consequences of California passing legislation ahead of the federal government?
Because California’s IoT bill requires that manufacturers include specific features when producing these devices, it will likely set off a trend that is followed nationwide. It will be less expensive for manufacturers to build all of their devices to meet California’s requirements, regardless of where they will be distributed, than to produce products exclusively for California. Should this happen, it could negate the need for federal legislation. However, other states or federal lawmakers may enact laws that go further than the California bill; stronger requirements for passwords and security would require manufacturers to pivot again and would make the California law obsolete.
What next steps should state and federal legislators take when it comes to data security and privacy?
Lawmakers should continue looking for gaps in security practices and data protections and create legislation that protects users from these built-in vulnerabilities. However, it is important for users and tech companies not to wait for legislation that mandates security measures, but rather begin implementing data protections and security measures proactively.
Relentless CCPA and Data Privacy Services has You Covered
On December 28, 2018, Provisional Measure No. 869/2018 was published, which amended certain LGPD provisions and created the National Data Protection Authority (ANPD). Among other modifications, the LGPD as amended will go into full force in August 2020, rather than February 2020 as originally required when the LGPD was first published.
Data Collection and Processing
Under the LGPD, collection and processing are referred to as data treatment, defined as any operation carried out with personal data.
The treatment of personal data may only be carried out based on one of the following legal bases, which largely align to the GDPR:
With data subject consent
To comply with a legal or regulatory obligation by the controller
By the public administration, for the processing and shared use of data which are necessary for the execution of public policies provided in laws or regulations or contracts, agreements or similar instruments
For carrying out studies by research entities, ensuring, whenever possible, the anonymization of personal data
For the execution of a contract or preliminary procedures related to a contract of which the data subject is a party
For the regular exercise of rights in judicial, administrative or arbitration procedures
As necessary for the protection of life or physical safety of the data subject or a third party
For the protection of health, in a procedure carried out by health professionals or by health entities
To fulfil the legitimate interests of the controller or a third party, and
For the protection of credit
Notwithstanding the above, personal data processing shall be done in good faith and based on the following principles:
Quality of the data
As for the processing of sensitive personal data, treatment may only occur when the data subject or his or her legal representative gives specific and distinct consent for specific purposes; or, without consent, in the following situations:
As necessary for the controller’s compliance with a legal or regulatory obligation
Shared data processed as necessary for the execution of public policies provided in laws or regulations
For studies carried out by a research entity
For the regular exercise of rights, including in a contract or in a judicial, administrative and arbitration procedure
Where necessary for the protection of life or physical safety of the data subject or a third party
The protection of health, carried out by health professionals or by health entities, or
Ensuring the prevention of fraud and the safety of the data subject
The controller and operator must keep records of the data treatment operations they carry out, mainly when the processing is based on a legitimate interest.
In this sense, the ANPD may determine that the controller must prepare an Impact Report on Protection of Personal Data, including sensitive data, referring to its data processing operations, pursuant to regulations, subject to commercial and industrial secrecy. The report must contain at least a description of the types of data collected, the methodology used for collection and for ensuring the security of the information, and the analysis of the controller regarding the adopted measures, safeguards and mechanisms of risk mitigation.
Relentless Privacy and Compliance Services provides a wide range of LGPD and GDPR services.
In part one of Malaysia Personal Data Protection Act (PDPA): Your Guide, we discussed the structure of the PDPA. Here in part two, we explain the operational mechanics of the PDPA.
Collection and Processing
Under the PDPA, subject to certain exceptions, data users are generally required to obtain a data subject’s consent for the processing (which includes collection and disclosure) of his or her personal data. Where consent is required from a data subject under the age of eighteen, the data user must obtain consent from the parent, guardian or person who has parental responsibility for the data subject. The consent obtained from a data subject must be in a form that such consent can be recorded and maintained properly by the data user.
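As an illustrative sketch only, one way a data user could keep consent "in a form that can be recorded and maintained" is a structured consent record that also captures the under-18 guardian requirement. Field names below are our own assumptions, not terms from the Act:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    data_subject_id: str
    purpose: str
    obtained_at: datetime
    data_subject_age: int
    consented_by_guardian: bool = False  # needed when the subject is under 18
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        if self.withdrawn_at is not None:
            return False
        # Under-18 subjects require consent from a parent, guardian or person
        # with parental responsibility.
        if self.data_subject_age < 18 and not self.consented_by_guardian:
            return False
        return True

rec = ConsentRecord("DS-001", "direct marketing",
                    datetime.now(timezone.utc), data_subject_age=17)
assert rec.is_valid() is False   # minor without guardian consent
rec.consented_by_guardian = True
assert rec.is_valid() is True
```

Keeping such records in a queryable store also simplifies producing them on inspection, which the Commissioner may demand (see below).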
Malaysian law contains additional data protection obligations, including, for example, a requirement to notify data subjects regarding the purpose for which their personal data are collected and a requirement to maintain a list of any personal data disclosures to third parties.
On December 23, 2015, the Commissioner published the Personal Data Protection Standard 2015 (“Standards”), which set out the Commission’s minimum requirements for processing personal data. The Standards include the following:
Security Standard For Personal Data Processed Electronically
Security Standard For Personal Data Processed Non-Electronically
Retention Standard For Personal Data Processed Electronically And Non-Electronically
Data Integrity Standard For Personal Data Processed Electronically And Non-Electronically
Under the PDPA, a data user may not transfer personal data to jurisdictions outside of Malaysia unless that jurisdiction has been specified by the Minister. However, there are exceptions to this restriction, including the following:
The data subject has given his or her consent to the transfer.
The transfer is necessary for the performance of a contract between the data subject and the data user.
The data user has taken all reasonable steps and exercised all due diligence to ensure that the personal data will not be processed in a manner that would contravene the PDPA.
The transfer is necessary to protect the data subject’s vital interests.
In 2017, the Commissioner published a draft Personal Data Protection (Transfer of Personal Data to Places Outside Malaysia) Order 2017 to obtain public feedback on the proposed jurisdictions to which personal data from Malaysia may be transferred. As of December 26, 2018, the Minister has yet to approve the safe harbour jurisdictions. Once approved, a data user may transfer personal data to these safe harbour jurisdictions without having to rely on the data subject’s consent or other prescribed exceptions under the PDPA.
Under the PDPA, data users have an obligation to take ‘practical’ steps to protect personal data, and in doing so, must develop and implement a security policy. The Commissioner may also, from time to time, set out security standards with which the data user must comply, and the data user is required to ensure that its data processors comply with these security standards.
In addition, the Standards provide separate security standards for personal data processed electronically and for personal data processed non-electronically (among others) and require data users to have regard to the Standards in taking practical steps to protect the personal data from any loss, misuse, modification, unauthorised or accidental access or disclosure, alteration or destruction.
Data Breach Notification
There is no requirement under the PDPA for data users to notify authorities regarding data breaches in Malaysia. However, news reports dated October 5, 2018 suggest that Malaysia’s laws could be updated, as early as the middle of 2019, to include data breach notification requirements modelled after those under the European Union’s General Data Protection Regulation (GDPR), including requiring providing notice to government authorities.
Under the PDPA, the Commissioner is empowered to implement and enforce the personal data protection laws and to monitor and supervise compliance with the provisions of the PDPA. Under the Personal Data Protection Regulations 2013, the Commissioner has the power to inspect the systems used in personal data processing and the data user is required, at all reasonable times, to make the systems available for inspection by the Commissioner or any inspection officer. The Commissioner or the inspection officers may require the production of the following during inspection:
The record of the consent from a data subject maintained in respect of the processing of that data subject’s personal data by the data user
The record of required written notices issued by the data user to the data subject
The list of personal data disclosures to third parties
The security policy developed and implemented by the data user
The record of compliance with data retention requirements
The record of compliance with data integrity requirements, and
Such other related information which the Commissioner or any inspection officer deems necessary
Violations of the PDPA and certain provisions of the Personal Data Protection Regulations 2013 are punishable with criminal liability. The prescribed penalties include fines, imprisonment or both. Directors, CEOs, managers or other similar officers will have joint and several liability for non-compliance by the body corporate, subject to a due diligence defence.
However, there is no express right under the PDPA allowing aggrieved data subjects to pursue a civil claim against data users for breaches of the PDPA.
The PDPA applies to electronic marketing activities that involve the processing of personal data for the purposes of commercial transactions. There are no specific provisions in the PDPA that deal with electronic marketing. However, the PDPA provides that a data subject may, at any time by notice in writing to a data user, require the data user at the end of such period as is reasonable in the circumstances to cease or not to begin processing his or her personal data for direct marketing purposes. ‘Direct marketing’ means the communication by whatever means of any advertising or marketing material that is directed to individuals.
There are no provisions in the PDPA that specifically address the issue of online privacy (including cookies and location data). However, any electronic processing of personal data in Malaysia will be subject to the PDPA and the Commissioner may issue further guidance on this issue in the future.
Malaysia’s first comprehensive personal data protection legislation, the Personal Data Protection Act 2010 (PDPA), was passed by the Malaysian Parliament on June 2, 2010 and came into force on November 15, 2013.
Definition of personal data
‘Personal data’ means any information in respect of commercial transactions that is:
Being processed wholly or partly by means of equipment operating automatically in response to instructions given for that purpose
Recorded with the intention that it should wholly or partly be processed by means of such equipment, or
Recorded as part of a relevant filing system or with the intention that it should form part of a relevant filing system, and, in each case
…that relates directly or indirectly to a data subject, who is identified or identifiable from that information or from that and other information in the possession of a data user.
Personal data includes any sensitive personal data or expression of opinion about the data subject. Personal data does not include any information that is processed for the purpose of a credit reporting business carried on by a credit reporting agency under the Credit Reporting Agencies Act 2010.
Definition of sensitive personal data
‘Sensitive personal data’ means any personal data consisting of information as to the physical or mental health or condition of a data subject, his or her political opinions, his or her religious beliefs or other beliefs of a similar nature, the commission or alleged commission by him or her of any offence or any other personal data as the Minister of Communications and Multimedia (Minister) may determine by published order. Other than the categories of sensitive personal data listed above, the Minister has not published any other types of personal data to be sensitive personal data as of December 26, 2018.
Pursuant to the PDPA, a Personal Data Protection Commissioner (Commissioner) has been appointed to implement the PDPA’s provisions. The Commissioner is advised by a Personal Data Protection Advisory Committee, which is appointed by the Minister and consists of one Chairman, three members from the public sector, and at least seven but no more than eleven other members. Appointments to the Personal Data Protection Advisory Committee will not exceed a term of three years; however, members can be appointed for two successive terms.
The Commissioner’s decisions can be appealed to the Personal Data Protection Appeal Tribunal. The following are examples of such appeals:
Decisions relating to the registration of data users under Part II Division 2 of the PDPA
The refusal of the Commissioner to register a code of practice under Section 23(5) of the PDPA
The service of an enforcement notice under Section 108 of the PDPA
The refusal of the Commissioner to vary or cancel an enforcement notice under Section 109 of the PDPA, or
The refusal of the Commissioner to conduct or continue an investigation that is based on a complaint under Part VIII of the PDPA.
If a data user is not satisfied with a decision of the Personal Data Protection Advisory Committee, the data user may proceed to file a judicial review of the decision in the Malaysian High Courts.
Which Organisations are Required to Register
Currently, the PDPA requires the following classes of data users to register under the PDPA:
A licensee under the Communications and Multimedia Act 1998
A licensee under the Postal Services Act 2012
Banking and financial institutions
A licensed bank and licensed investment bank under the Financial Services Act 2013
A licensed Islamic bank and licensed international Islamic bank under the Islamic Financial Services Act 2013
A development financial institution under the Development Financial Institution Act 2002
A licensed insurer under the Financial Services Act 2013
A licensed takaful operator under the Islamic Financial Services Act 2013
A licensed international takaful operator under the Islamic Financial Services Act 2013
A licensee under the Private Healthcare Facilities and Services Act 1998
A holder of the certificate of registration of a private medical clinic or a private dental clinic under the Private Healthcare Facilities and Services Act 1998
A body corporate registered under the Registration of Pharmacists Act 1951
Tourism and hospitality
A licensed person who carries on or operates a tourism training institution, licensed tour operator, licensed travel agent or licensed tourist guide under the Tourism Industry Act 1992
A person who carries on or operates a registered tourist accommodation premises under the Tourism Industry Act 1992
Certain named transportation services providers
A private higher educational institution registered under the Private Higher Educational Institutions Act 1996
A private school or private educational institution registered under the Education Act 1996
A licensee under the Direct Sales and Anti-Pyramid Scheme Act 1993
A company registered under the Companies Act 1965 or a person who entered into partnership under the Partnership Act 1961 carrying on business as follows:
A company registered under the Companies Act 1965 or a person who entered into partnership under the Partnership Act 1961, who conducts retail dealing and wholesale dealing as defined under the Control Supplies Act 1961
A company registered under the Companies Act 1965 or a person who entered into partnership under the Partnership Act 1961, who carries on the business of a private employment agency under the Private Employment Agencies Act 1981
A licensed housing developer under the Housing Development (Control and Licensing) Act 1966
A licensed housing developer under the Housing Development (Control and Licensing) Enactment 1978, Sabah
A licensed housing developer under the Housing Developers (Control and Licensing) Ordinance 1993, Sarawak
Certain named utilities services providers
A licensee under the Pawnbrokers Act 1972
A licensee under the Moneylenders Act 1951
Certificates of registration are valid for at least one year, after which data users must renew their registration; without a valid certificate, they may not continue to process personal data.
Data users are also required to display their certificate of registration at a conspicuous place at their principal place of business, and a copy of the certificate at each branch, where applicable.
The Commissioner may designate a body as a data user forum for a class of data users. Data user forums can prepare codes of practice to govern compliance with the PDPA, which can be registered with the Commissioner. Once registered, all data users must comply with the provisions of the code, and non-compliance violates the PDPA. As of December 26, 2018, the Commissioner has published several codes of practice, including for the banking and financial sector, the aviation sector, the utilities sector and the insurance and takaful industry in Malaysia.
Do I Need to Appoint a Data Protection Officer?
Currently, Malaysian law does not require that data users appoint a data protection officer.
On May 25th 2018, the EU General Data Protection Regulation (GDPR) came into effect, requiring companies operating in the European Union, as well as those outside it that handle the personal data of individuals in the EU, to comply with updated rules on how they handle that data.
Other countries have taken a similar approach to data protection, with Brazil adopting a law governing how organisations collect, use and share customer data. The LGPD (Lei Geral de Proteção de Dados) will go into effect in August 2020, leaving companies with less than a year to ensure they are compliant with its strict requirements for processing and managing personal data.
Lawfulness of Processing
Article 7 LGPD on the lawfulness of data processing contains the ten legal bases that allow organisations to process personal data in Brazil, at least one of which should apply to any data processing operation. The legal bases are:
consent given by the data subject
compliance with a legal or regulatory obligation
execution of public policies (only for the public administration)
studies by research entities (with anonymisation whenever possible)
execution of a contract (or the pre-contractual phase)
exercising of rights in judicial, administrative, or arbitration procedures
protection of life or physical safety
protection of health
protection of legitimate interests
protection of credit, according to the pertinent legislation
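As a rough illustration (the enum names and helper function below are my own shorthand, not taken from the law's text or any official library), the requirement that at least one legal basis must apply to any processing operation can be sketched as:

```python
from enum import Enum, auto

class LegalBasis(Enum):
    # Illustrative labels for the Article 7 LGPD bases (names are shorthand)
    CONSENT = auto()
    LEGAL_OBLIGATION = auto()
    PUBLIC_POLICY = auto()
    RESEARCH = auto()
    CONTRACT = auto()
    JUDICIAL_PROCEEDINGS = auto()
    PROTECTION_OF_LIFE = auto()
    PROTECTION_OF_HEALTH = auto()
    LEGITIMATE_INTEREST = auto()
    PROTECTION_OF_CREDIT = auto()

def processing_is_lawful(bases: set) -> bool:
    # A processing operation is only permissible with at least one applicable basis
    return len(bases) >= 1

assert processing_is_lawful({LegalBasis.CONSENT})
assert not processing_is_lawful(set())
```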
Many of these legal bases are similar to what can be found in laws like the GDPR, although some of the formulations are slightly different, and some additional criteria have been set. Additionally, the LGPD contains specific criteria on how to deal with sensitive data, stipulating that such data cannot be processed without a specific legal basis, such as the individual’s consent.
When looking at consent, the LGPD seems a little less strict than the GDPR, stating in Article 8 that consent needs to be provided in writing or by another means that demonstrates “the manifestation of the will of the data subject.” The burden of proving that consent was validly obtained falls on the data controller, and consent should be clearly distinguished from other items, such as contractual clauses. Consent can be revoked at any time and at no cost. Also, consent needs to refer to “particular purposes.” In other words, it needs to be specific.
Legitimate interest under the LGPD is further explained in Article 10 of the law. It is rather similar to legitimate interest as we know it from the European Union. It includes the need for data controllers to identify the specific activities for which they process data, as well as the way the rights of the data subject are protected. Also, the data controller needs to make sure the data subject will not be surprised by the fact that his/her data will be processed, i.e. that there is a reasonable expectation of data processing. Some level of transparency is expected from the data controller. The supervisory authority can request that a privacy impact assessment (PIA) be performed when a data controller wants to rely upon legitimate interest.
The LGPD is yet another data protection law that is built on the accountability principle, which means that organisations are required to adopt measures that help to demonstrate compliance. One of those measures is the obligation to maintain a register of processing activities, similar to the one that is required under the GDPR. However, the LGPD does not spell out which elements need to be documented as part of the register. The same is true for the obligation to complete privacy impact assessments. The obligation is part of the law, but it needs to be further specified by the supervisory authority, including in which situations PIAs are mandatory and how they need to be completed. Based on the law, it seems private sector organisations may only need to complete impact assessments when processing personal data on the basis of legitimate interest, and a broader obligation would be imposed on the Brazilian public sector.
Data Subject Rights
Chapter III LGPD is devoted to data subject rights. Brazil will extend several rights to individuals, including a right of confirmation that an individual’s data are being processed, as well as the more traditional rights of access, correction, blocking, and deletion. Under the LGPD, an individual can also request the anonymisation of their data. Requests can be filed at any time, and organisations are bound to respond within 15 days (for the right of access, at least). Data subject rights can be exercised at no cost to the data subject.
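Assuming the 15-day response period is counted in calendar days (a simplification; the law's exact counting rules may differ), the latest response date for an access request can be sketched as:

```python
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 15  # LGPD response period for access requests

def response_deadline(received: date) -> date:
    # Latest date to respond, counting plain calendar days from receipt
    return received + timedelta(days=RESPONSE_WINDOW_DAYS)

# A request received on 1 August 2020 must be answered by 16 August 2020
assert response_deadline(date(2020, 8, 1)) == date(2020, 8, 16)
```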
The LGPD guarantees a number of rights to holders of personal data, which may be exercised upon request by the holder or their legal representative:
Right to information
Right of access
Right of rectification
Right of data deletion
Right of opposition
Right of portability
Contraventions under the LGPD can be sanctioned with a fine of up to 2% of an organisation’s annual turnover, capped at 50 million reais (approximately US$12.85 million). A warning can also be issued, and an enforcement notice can furthermore order the blocking or deletion of the data to which the infraction refers. Other sanctions that were part of the draft law, including the possible suspension of processing, were vetoed by Brazilian President Temer when the LGPD was presented to him for his signature. Similarly, the provisions on the creation of an independent supervisory authority were vetoed. It is therefore not yet clear how the LGPD will be enforced. The Brazilian government had previously announced that the supervisory authority would be established by a separate act, although a bill to this end has not yet been published.
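The fine ceiling described above amounts to taking the lesser of 2% of turnover and the BRL 50 million cap; a minimal arithmetic sketch (the function name is illustrative, and this is not legal advice):

```python
def lgpd_fine_cap(annual_turnover_brl: float) -> float:
    # Upper bound on an LGPD fine: 2% of annual turnover,
    # subject to an absolute cap of BRL 50 million
    return min(0.02 * annual_turnover_brl, 50_000_000.0)

# BRL 1 billion turnover: 2% = BRL 20 million, below the cap
assert lgpd_fine_cap(1_000_000_000) == 20_000_000.0
# BRL 10 billion turnover: 2% would be BRL 200 million, so the cap applies
assert lgpd_fine_cap(10_000_000_000) == 50_000_000.0
```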
How Relentless Helps
Like the GDPR, the LGPD is an omnibus law. This means that it covers many principles of data protection law, unlike, for example, the California Consumer Privacy Act which focuses on data subject rights. Relentless has identified 24 provisions of the LGPD that contain accountability requirements for which some form of evidence would be required. These 24 provisions map to 43 privacy management activities from the Relentless Privacy Management Maturity Framework. For comparison, the GDPR contains 55 mandatory privacy management activities.
New laws are taking effect across the globe to regulate the collection, use, retention, disclosure and disposal of personal information. At the same time, the rate of cyber attacks, data breaches and unauthorised use of personal data is growing exponentially.
In the current environment, it is more important than ever, particularly for those organisations handling financial data, health information and other personally identifiable information, to understand the rights and obligations of individuals and organisations with respect to personal information.
Our latest article provides an overview of some of the new data privacy laws, rules and regulations that are, or soon will be, in effect; outlines cyber security and data protection best practices and compliance programmes to help organisations meet the evolving requirements; and touches on the role of new technologies in mitigating risks and supporting compliance.
The rapidly evolving data privacy regulatory landscape
Enforcement of the European Union’s General Data Protection Regulation (GDPR) commenced on 25 May 2018, bringing sweeping changes to the privacy and data security policies of the vast majority of companies operating not only in the EU, but across the globe.
The provisions of the GDPR that all companies should take note of include the requirement for explicit and informed consent to the collection of personal data and mechanisms to withdraw such consent, breach notification obligations, the right to access all data that a company has collected, and the right to be forgotten through the erasure and cessation of the dissemination of data. Penalties for breaching the GDPR are steep: up to 4 percent of annual global turnover or €20m, whichever is greater.
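In contrast with the LGPD's cap, the GDPR ceiling is the greater of the two figures; a minimal arithmetic sketch (the function name is illustrative, and this is not legal advice):

```python
def gdpr_max_fine_eur(annual_global_turnover_eur: float) -> float:
    # GDPR upper-tier fine ceiling: the greater of 4% of annual
    # global turnover or a flat EUR 20 million
    return max(0.04 * annual_global_turnover_eur, 20_000_000.0)

# EUR 100 million turnover: 4% = EUR 4 million, so the EUR 20m figure applies
assert gdpr_max_fine_eur(100_000_000) == 20_000_000.0
# EUR 1 billion turnover: 4% = EUR 40 million, which exceeds EUR 20m
assert gdpr_max_fine_eur(1_000_000_000) == 40_000_000.0
```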
The regulatory environment in the US comprises a somewhat convoluted, patchwork system of federal and state laws governing privacy and data security concerns that is continuing to evolve to try to address the rash of data breaches and unauthorised use of personal data that are occurring with ever-increasing frequency.
All 50 states, as well as the District of Columbia, Puerto Rico and the US Virgin Islands, have enacted laws requiring notification of security breaches involving personal information. Companies can face both civil and criminal penalties for a data breach of sensitive information, and some state and federal laws provide the right for individual citizens to file class action lawsuits for privacy violations. Massive class action lawsuits, like the 2013 Target data breach litigation and the currently pending 2017 Equifax data breach litigation, highlight the significant risks that companies face in the wake of a cyber security attack or as a consequence of either not having best practices and compliance programmes in place or simply not following them.
Importance of cyber security and data protection best practices
The stakes have never been greater than they are right now with respect to the collection, use, retention, disclosure and disposal of personal information. With the present regulatory framework and knowledge of where it is heading, companies can expect to continue to face rising costs and escalating risks associated with their privacy and data security practices.
A number of resources are available that can provide guidance and assistance with addressing privacy and data security practices, as well as to ensure that the practices and programmes implemented are compliant with relevant laws and regulations. The EU and some US Federal agencies, including the Federal Trade Commission (FTC) and the National Institute of Standards and Technology (NIST), have been promulgating updated guidelines and recommendations for privacy and data security best practices in a variety of industries, including some of the newer Internet of Things and peer platform (sharing economy) marketplaces. Additionally, several industry groups have adopted self-regulatory programmes and rules, including certification programmes, to which a company can voluntarily abide.
For companies with a public-facing website, website privacy policies are a must. Additionally, a written incident response plan is critical for establishing protocols for initiating a response team, assessing data breach activity, containing the data breach, and providing guidelines for including other parties, such as law enforcement and officials that require notification under data breach laws. Further, a company must continue to audit and maintain certification as necessary to ensure that their policies and procedures are enforced and remain current. A variety of enterprise privacy management software and compliance solutions may be used internally to help companies audit their systems.
Privacy and data security must form part of the conversation when utilising new technology
While it may be easier said than done to implement new policies and best practices, companies face the additional challenge of evaluating and deploying new technologies that may simultaneously both hinder and help compliance with the new privacy and data security regulations. For example, blockchain technology offers significant advantages for a wide variety of applications from a data security perspective, offering the ability to record transactions in a decentralised and immutable fashion. However, these same technological principles may raise complex issues for compliance with new privacy regulations. For example, in connection with the “right to be forgotten” under the GDPR, how is a subject’s personal information to be erased from an immutable and fully-distributed blockchain? A variety of solutions have been proposed to provide greater control and management of information on a blockchain, including anonymous transactions and voting systems, secret contracts and blind auctions, but they will have to be evaluated in view of the evolving regulatory framework.
Artificial intelligence (AI), and specifically machine learning (ML) techniques, are now widely employed to enable computers to learn and adapt to new input. Such AI technology can be used in cyber security systems to provide automated processes for the identification of new threats and the implementation of technology controls and protection. On the flip-side, hackers have also started to weaponise AI, creating programmes that can study systems, evaluate vulnerabilities or even craft persuasive phishing schemes based on behaviour observed on social networks. AI applications may also raise privacy issues, especially given the large volume of data required to build a model and the often ‘black box’ lack of transparency behind the logic used by AI agents to arrive at a decision about a person.
New outward-facing tools and platforms have also been developed to allow users to control how their data is being used. For example, Facebook recently released a set of privacy tools, including a unified privacy dashboard, and has announced the launch of a new clear history tool. Such tools cannot be overlooked, as they may be essential for compliance with new privacy requirements such as data portability, the right to be forgotten, and withdrawal of consent to the collection of personal data.
Recognition of the new and evolving international privacy and security regulations is a requirement, especially in view of the threat of increasing liability and risk with statutory penalties and class action lawsuits. Implementing a compliance programme with a set of best practices for privacy and data security will surely help mitigate these risks, but it is a continuing process, especially as companies face new hurdles when rolling out new systems and technologies.
This is particularly true where newer technologies, such as blockchain and AI, are incorporated into systems in a manner that simultaneously offers important contributions to security and privacy while exposing new vulnerabilities and concerns. Thus, companies may be well-served by a privacy by design approach that promotes privacy and data security compliance from the start in order to mitigate risk down the road.