ChatGPT

Lord Justice Birss of the Court of Appeal of England and Wales, it is reported,1 used ChatGPT to generate a paragraph summarising an area of law and then used the summary in his judgment because he found it “jolly useful”. However, this use was not acknowledged in the judgment itself; it was disclosed only later, at a conference held by the Law Society.

We must give credit to Anoop Chitkara, J. of the Punjab & Haryana High Court, who used ChatGPT while deciding (read rejecting) a bail application, for disclosing that he had consulted ChatGPT, even if only to discard its response. The way the order is worded, it appears that in Jaswinder Singh v. State of Punjab2, the Punjab & Haryana High Court had, in eight paragraphs, decided the bail application of an applicant with two previous cases of attempt to murder against him, rejecting it on the ground that the offences alleged involved the infliction of cruelty, including the amputation of a finger. Interestingly, para 7 of the judgment3 relies on Matthew Lippman, Contemporary Criminal Law: Concepts, Cases, and Controversies4, and on Kemmler, In re5 to explain what “cruelty” implies. Lippman (2016) appears as a footnote to this sentence: “Kemmler, In re6 implies there is something inhuman and barbarous, something more than the mere extinguishment of life”. Interestingly, Kemmler, In re7, decided in 1890, dealt with the method of capital punishment and had no connection with bail. Subject to the record of the case, it is unlikely that a case from 1890 dealing with the method of execution of a capital sentence in the United States was cited by the State or the accused.

However, the “post-reasoning”, which begins at para 9, discloses that after having decided the bail application, the Court used ChatGPT with a view to ascertaining “the jurisprudence on bail when assailants assaulted with cruelty”. The response of ChatGPT, as set out in the order8, is as below:

10. … The jurisprudence on bail for cases where the assailants have assaulted with cruelty will depend on the specific circumstances of the case and the laws and regulations of the jurisdiction where the case is being tried. However, in general, if the assailants have been charged with a violent crime that involves cruelty, such as murder, aggravated assault, or torture, they may be considered a danger to the community and a flight risk.

In such cases, the Judge may be less inclined to grant bail or may set the bail amount very high to ensure that the defendant appears in court and does not pose a risk to public safety. The severity of the assault, the defendant’s criminal history, and the strength of the evidence against them are all factors that a Judge will consider when deciding whether to grant bail.

It is important to note that the presumption of innocence is a fundamental principle of the justice system, and all defendants are entitled to bail unless there are compelling reasons to deny it. Therefore, even in cases where the assailants have assaulted with cruelty, they may still be granted bail if the Judge determines that they do not pose a risk to the community or a flight risk.

The Court further clarified that:

11. Any reference to ChatGPT and any observation made hereinabove is neither an expression of opinion on the merits of the case nor shall the trial court advert to these comments. This reference is only intended to present a broader picture on bail jurisprudence, where cruelty is a factor.9

This is one of the first known uses of ChatGPT in the Indian legal system, and to me, as a student of criminal law and the judicial process, it raises several questions. On my reading of the bail rejection order, if the decision to reject bail because of the alleged cruelty in the commission of the alleged crime had already been made, why put a question to ChatGPT at all? There is no reason to confirm the decision with a summary generated by ChatGPT, a large language model fraught with its own set of challenges.

In fact, as the judgment itself discloses, the response of ChatGPT is quite categorical that the presumption of innocence is a fundamental principle and that, even where the assault involves cruelty, bail may be declined only for compelling reasons such as flight risk or danger to the community. Thus, clearly, the response given by ChatGPT neither confirmed the reasoning adopted nor persuaded the Court to take another view. This may be contrasted with the approach taken by the Colombian Judge Juan Manuel Padilla who, in addition to discussing precedents, asked ChatGPT how the law applied to the medical funding of an autistic boy, and justified the use of ChatGPT only as a “Secretary” to generate coherent, structured reasoning.10

The question then is: what is the advantage or benefit of using ChatGPT as far as judicial processes are concerned? Is there a constitutional and legal right to have one’s case determined and reasoned by a qualified and duly appointed human being? These questions arise not only in the context of reliance on artificial intelligence (AI) but also in the context of the extent of assistance made available to a Judge (recognising his or her humanness and the limits of human efficiency). The Colombian Judge defended the use of technology, specifically noting that the judicial function of thinking and deciding was not being relinquished in favour of technology and that only the AI’s assistance was being taken. This underlines how thin the distinction between thinking and reasoning can be. Any assistance to a Judge, whether from a human or from AI, is not free from its own biases and predicaments. Assistance generated by technology and algorithms may seem impartial, but that may not be the case.

For example, consider the role of a stenographer in India, whose primary responsibility is to transcribe dictation verbatim. At the other end are highly skilled law clerks who provide comprehensive assistance to a Judge, which may include reading, interpreting, and summarising the pertinent facts and applicable law as required. These roles represent two ends of the spectrum of assistance provided to the judicial process. If the assertion is that the essential function of lawmaking cannot be delegated, the analogous question is what, among a multitude of tasks, constitutes the essential judicial function. It further prompts us to consider which of these tasks should not be assisted by technology or automated systems. Could it be said that, barring the actual decision-making and the provision of (oral) reasoning for the decision, everything else could be provided as assistance to a Judge by another human being, whose human limitations prevent him or her from doing all those chores alone, given the volume of judicial pendency?

In terms of the judicial process, the presentation of legal research by another person to a Judge could raise the ethical issue of legal researchers becoming “tools” for offering ex post facto explanations for decisions already made. For instance, rather than considering the facts of the case and the evolution of the law in the journey of decision-making, one could ask ChatGPT (subject to the limitation that ChatGPT can generate fake citations11) or a researcher to find case law that supports a particular view or decision already taken.

The European Union’s General Data Protection Regulation (GDPR)12 gives individuals the right not to be subject to a decision based “solely” on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them, subject to certain exceptions. However, even where this right is waived by explicit consent, the data subject retains “at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision”.

The process of hearing a case, understanding the nuances, empathising with the litigants, applying legal principles, and then articulating a judgment requires a human touch. Especially in cases that involve significant moral, social, or personal implications, the human element of understanding, discretion, and empathy becomes crucial. The act of providing oral reasoning, in particular, is a reflection of the Judge’s thought process, and it ensures transparency and accountability in the decision-making process. These factors, coupled with the possibility of machine-learnt biases creeping into the decision-making process and the unexplainability of AI’s decision-making, will perhaps keep the ChatGPTfication of the judicial process at bay, at least for a few years.

While the Indian judiciary is marching ahead in embracing technology,13 one needs to pause for a moment to explore and acknowledge its limitations, foremost among which is the lack of transparency. As black boxes of decision-making,14 machines do not give reasons for why they do what they do. In the process of self-learning, machines learn and replicate biases. While the judicial system stands to benefit from technological advancements like AI, it is crucial to approach the integration of AI with caution to avoid pitfalls similar to those encountered with Aadhaar. Issues such as inscrutability, bias, and due process need to be meticulously addressed.15 The legal system should therefore aim to adapt these technologies to its specific needs rather than adopting them wholesale. This nuanced approach will help ensure that the use of AI enhances the judicial process while preserving essential values like fairness, justice, and accountability. The Indian judiciary needs to “adapt” to, and not simply “adopt”, technology.


†Advocate-on-Record, Supreme Court of India.

1. Hibaq Farah, “Court of Appeal Judge Praises ‘Jolly Useful’ ChatGPT After Asking it for Legal Summary”, (The Guardian, 15-9-2023).

2. Jaswinder Singh v. State of Punjab, CRM-M-22496-2022, order dated 27-3-2023.

3. Jaswinder Singh v. State of Punjab, CRM-M-22496-2022, order dated 27-3-2023.

4. Matthew Lippman, Contemporary Criminal Law: Concepts, Cases, and Controversies, 51 (Fourth Edn., SAGE Publications, 2016).

5. 1890 SCC OnLine US SC 200 : 34 L Ed 519 : 136 US 436 (1890).

6. 1890 SCC OnLine US SC 200 : 34 L Ed 519 : 136 US 436 (1890).

7. 1890 SCC OnLine US SC 200 : 34 L Ed 519 : 136 US 436 (1890).

8. Jaswinder Singh v. State of Punjab, CRM-M-22496-2022, order dated 27-3-2023.

9. Jaswinder Singh v. State of Punjab, CRM-M-22496-2022, order dated 27-3-2023.

10. Luke Taylor, “Colombian Judge Says He Used ChatGPT in Ruling” (The Guardian, 3-2-2023).

11. Dan Mangan, “Judge Sanctions Lawyers for Brief Written by AI with Fake Citations” (cnbc.com, 22-6-2023).

12. European Union’s General Data Protection Regulation (GDPR), Art. 22, https://gdpr-info.eu/art-22-gdpr/

13. Apoorva Anand, “Supreme Court, IIT Madras Sign MoU to Enable Digital Transformation of Judiciary” (indiatoday.in, 12-10-2023).

14. J. Burrell, “How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms” (2016) Big Data & Society, 3, 1-12.

15. Filippo Raso, Hannah Hilligoss, Vivek Krishnamurthy, Christopher Bavitz and Levin Kim, “Artificial Intelligence & Human Rights: Opportunities & Risks”, Berkman Klein Center for Internet & Society Research Publication (2018).
