Ensuring Coherence between Personal Data Protection and Algorithmic Accountability
In brief, this paper examines the overlaps between privacy law and algorithmic accountability principles, and recommends clear boundary-setting between the two kinds of laws going forward, especially where broad, shared principles are involved.
The right to privacy exists in a legal and technological ecosystem where other rights are also implicated. Various examples indicate that privacy law must necessarily rely on additional laws or rights to address harms in the digital world. For example, WhatsApp users may consent to its terms of service only because of a lack of adequate substitutes in the messenger market; free consent may thus be difficult to secure in monopolised markets. An interconnected deployment of laws may effectively address harms in a way that no individual law might. However, overlapping laws in any single sector may complicate matters unless the connected laws are carefully defined and limited to their domains.
In this paper, the Centre for Applied Law and Technology Research (ALTR) at Vidhi proposes to analyse the scope of the fairness, transparency and accountability principles. These principles, recognised as part of data protection, have been chosen because their scope currently lacks clarity. Further, these three principles commonly overlap with contexts that call for non-privacy-related regulation of algorithms. The analysis is carried out for India’s proposed data protection law as well as for the GDPR, given the latter’s status as a reference law worldwide and the availability of relevant cases decided under it.
This work provides a foundation for thinking about an algorithmic accountability law for India, one that works in consonance with India’s forthcoming data protection law.
The principle of fairness in privacy and non-privacy contexts:
The European Union’s General Data Protection Regulation, 2016 (GDPR) includes a principle of fairness that calls for fair data processing. This means data controllers must process data in a manner that data subjects can reasonably expect, and not in a manner that has an adverse effect on them. Further, data controllers must gather personal data from data subjects fairly and without deception. In India, the Srikrishna Committee Report recommends the imposition of a fiduciary obligation on data controllers, bringing with it the additional obligation to act fairly. This understanding has since been reflected in the Personal Data Protection Bill, 2019 (PDP Bill 2019).
Outside privacy, a more expansive idea of fairness prevails. Fairness in platform, business and consumer interactions can be seen through the lens of competition and consumer laws, and the rise of gig work through the platform economy makes fairness in this nexus all the more pressing. In demands for algorithmic fairness, fairness is also sometimes understood as equality, a broader notion than “fair processing”. Such an understanding is also present under Article 14 of the Indian Constitution.
The principle of accountability in privacy and non-privacy contexts:
The GDPR frames the accountability principle as the requirement that data controllers be responsible for, and demonstrate compliance with, all other data protection principles set out by the law. It also includes the obligation to implement technical and organisational measures to ensure compliance with the GDPR. In India, a similar understanding of the accountability principle is found in the AP Shah Report (2012). It is reflected in provisions of the PDP Bill 2019, with the introduction of the Data Protection Authority of India along with various compliance-based obligations.
Outside privacy contexts, the accountability principle is commonly noted in the financial sector, which, in both domestic and global markets, has set out various internal and external compliance requirements on the use of algorithms in trading and advisory roles.
Since accountability measures can be multipurpose, privacy and algorithmic accountability laws can be harmonised so that regulated entities use the same accountability architecture for both, reducing their compliance burden.
The principle of transparency in privacy and non-privacy contexts:
The transparency principle under the GDPR focuses on providing information and access to data subjects in a manner that encourages informed decision-making and trust. The GDPR requires data controllers to inform data subjects of various aspects of processing, whether their personal data is obtained from them directly or indirectly. In India, the need for transparency was noted by the Supreme Court in KS Puttaswamy v. Union of India (2017) and elaborated in the Srikrishna Committee Report as a ‘link’ principle that allows data principals to exercise their other rights. The PDP Bill 2019 further codifies the transparency principle through notice, disclosure and privacy-by-design obligations.
The need for transparency is also felt outside privacy contexts. Transparency regarding the algorithms used in various sectors plays a vital role in ensuring accountability in gig work and algorithmic liability, and in improving decision-making and consumer protection. Again, transparency mechanisms can be multi-purpose and should be designed as such to avoid duplication of legal provisions and high compliance costs.
Harmonising privacy with the regulation of algorithms
The principles of fairness, accountability and transparency are applicable in both privacy and non-privacy contexts, especially in the use of algorithms. Deploying algorithm-focused regulation requires harmonising these principles with their current understanding under data protection laws. The paper points out inconsistencies between the GDPR and the EU’s proposed Artificial Intelligence Act. Overlapping regulations that take different interpretations of similar principles may result in regulatory inefficiency and loopholes. On the other hand, regulatory harmonisation reduces compliance burdens and strengthens both kinds of rights.
Conclusions
The paper reaches four major conclusions:
- Technology-specific legislation must not reinvent principles that are already present in technology-neutral legislation;
- If privacy law is principle-based, a closely related law like an algorithmic accountability law must be more specific;
- If an algorithmic accountability law is to have principles, these principles as well as principles in privacy law need to be well-defined to clarify boundaries between the two; and
- Principles under privacy law must be interpreted as solely deriving from the right to privacy, while other laws must protect other rights.