
Joint Committee on Justice debate -
Tuesday, 13 Feb 2024

General Scheme of the Garda Síochána (Recording Devices) (Amendment) Bill: Discussion

I remind members and witnesses to please switch their mobile phones to silent or flight mode.

The purpose of today's meeting is to consider the general scheme of the Garda Síochána (recording devices) (amendment) Bill. We will have engagement with several stakeholders as part of our scrutiny of the general scheme. The scrutiny has been expedited as the matter was referred to the committee before Christmas. We are keen to do due diligence, so we are holding two sessions today. The first will take place now with those present and the second will convene after 6 p.m. We will be tight on time for this session, which will finish at 6 p.m., if not sooner, because we need a break before holding the second session.

Before we begin, I advise members that one of the witnesses has had to withdraw from today's meeting due to ill health, namely, the representative from Safe Ireland. I wish that representative a speedy recovery. Safe Ireland has participated in our hearings many times in the past, and its contributions are always useful. In any event, we have noted the submission Safe Ireland has provided. It will be taken into account when we produce our report.

I welcome the following witnesses: Mr. David Murphy, deputy commissioner of the Data Protection Commission, DPC; Mr. Andrew Carroll, assistant commissioner, DPC; Mr. Simon McGarr, solicitor with Digital Rights Ireland; Ms Olga Cronin, senior policy officer at the Irish Council for Civil Liberties, ICCL; Dr. Cliona Saidléar, executive director of the Rape Crisis Network Ireland, RCNI; Ms Donna Parau, legal director, RCNI; Mr. Mark Garrett, director general of the Law Society of Ireland; Ms Aimée McCumiskey, member of the Law Society's criminal law committee; Mr. Drew Harris, Garda Commissioner; and Mr. Andrew O'Sullivan, chief information officer with An Garda Síochána. From the Department of Justice, we are joined by Ms Rosaleen Killian, principal officer, and Mr. Frank McNamara, legal researcher. We have a full house today on both sides of the room, which is always conducive to a good session and a useful hearing. I also welcome those in the Public Gallery and the people who are following proceedings online.

The usual procedure applies regarding parliamentary privilege. Witnesses and members are reminded of the long-standing parliamentary practice that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of the person or entity. If their statements are potentially defamatory in respect of an identifiable person or entity, they may be directed to discontinue their remarks and it is imperative they comply with any such direction if given.

As regular attendees will be aware, the format of our meeting is that we will have a short opening round of three minutes per organisation. We find that works better for a wider discussion, as we then have time for further discussion with questions from members and answers from witnesses over the course of the meeting. I will invite each group to make its opening statement in a three-minute block. We will then have a six-minute window per member. Members will have the floor for six minutes to put their questions and have them answered. It is up to each member to choose how to use that time, whether to get six minutes of answers or use the six minutes to make submissions.

I will start with the opening statements. Five organisations will make opening statements, beginning with Mr. Murphy from the DPC. I note for all witnesses that there are clocks in the corner to assist them with the time. If they go egregiously over the allocated time, I will call them in. There is a small amount of latitude, but it is small because we have to get through the business.

The witnesses are all welcome. I am delighted to have such a full house today and such accomplished witnesses around the table. It will be an interesting session. I call upon Mr. Murphy to deliver his opening remarks.

Mr. David Murphy

I thank the committee for the invitation to contribute to its deliberations on the general scheme of the Garda Síochána (recording devices) (amendment) Bill. I am one of the deputy commissioners of the DPC and have responsibility for our function of supervision of the public sector. As the Cathaoirleach noted, I am accompanied by Mr. Carroll, who is an assistant commissioner from the DPC's supervision team and who has responsibility for matters pertaining to law enforcement.

This proposed Bill will provide the legal basis for the processing of personal data by An Garda Síochána for the purpose of biometric identification by way of facial recognition. The DPC acknowledges the potential for facial recognition technology to benefit the work of An Garda Síochána. However, as the use of this technology presents serious risks to an individual's right to data protection, the legislation must implement the necessary restrictions, limitations and safeguards to ensure that any deployment of facial recognition technology by An Garda Síochána is strictly necessary and proportionate and respects the requirements of data protection law and the fundamental rights of individuals.

EU Directive 2016/680, the law enforcement directive, sets out that legislation should be clear and precise, its application should be foreseeable to those subject to it and it should specify the objectives of data processing, the personal data to be processed and the purposes of the processing. The statutory code of practice envisaged in the general scheme will be an essential element in meeting these requirements of EU law by setting out the specific details of how biometric identification and facial recognition technology may be used by An Garda Síochána.

Head 4 specifically links compliance with the code of practice to the strict necessity and proportionality of the use of biometric identification, highlighting its importance in this context. Head 4 further provides that An Garda Síochána can use any image or video to which it lawfully has access. This currently offers little clarity as to what is intended. A concern is that large existing public databases of facial images could be brought within the scope of biometric identification without specific safeguards to prevent this. The inclusion of such databases would represent a serious and disproportionate intrusion on the rights and freedoms of affected persons. Facial recognition technology does not provide definitive results but relies on probability through comparison of facial images, with an inherent underlying margin of error and risk of in-built bias. These factors can significantly impact on the reliability and accuracy of the technology and indicate a high level of risk for affected data subjects. Consequently, it will be necessary for a data protection impact assessment to be carried out prior to the introduction of the technology. The DPC recommends publication of such an impact assessment in the interests of transparency.

Biometric identification also constitutes a form of automated individual decision-making permitted under the directive only subject to the right to obtain human intervention. The general scheme provides that the results of biometric identification must be verified by a member of Garda personnel. The efficacy of this safeguard will depend on the expertise of relevant Garda personnel in the operation of the system and their ability to effectively challenge its results. The code of practice should provide detail on this key safeguard, as well as on all oversight mechanisms governing the use of biometric identification.

I hope these comments will be of assistance to the committee and I am happy to answer any questions members may have and speak about any of the issues raised in my written submission.

I call Mr. McGarr, solicitor with Digital Rights Ireland. He is also speaking on behalf of the ICCL in his opening remarks. Is that right?

Mr. Simon McGarr

Yes. Both ICCL and Digital Rights Ireland thank the committee for the opportunity to discuss this proposed legislative scheme.

The use of facial recognition technology, FRT, by police engages many fundamental human rights, including but not limited to the rights to human dignity, privacy, protection of personal data, non-discrimination, protest and freedom of expression, all of which are enshrined in the EU Charter of Fundamental Rights. We call on the committee to urge the Government to reconsider the proposal, as currently presented, to introduce FRT into Irish policing as we believe the risks to these fundamental rights are too high in the legislation before us at the moment. We make this call for several reasons.

We endorse the statement by the Data Protection Commission. The results of FRT are unreliable. It is not a silver bullet solution. FRT involves comparing a biometric template created from a face detected in an image or video against a reference database of biometric templates in an attempt to identify a person. However, even when there are optimal conditions with respect to image quality, facial recognition technology is not designed to give police a singular positive identification or 100% match for a person. Instead, at best it gives a person running an FRT search a guess list of who the person could be. It provides a list of potential candidates accompanied by similarity scores. A threshold value is fixed to determine when the software will indicate that a probable match has been found. Should this value be fixed too low or too high, it can create a high false-positive rate or high false-negative rate. There is no single threshold setting which eliminates all errors.
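To make the threshold trade-off just described concrete, the following is a minimal sketch of threshold-based candidate retrieval; the identities, similarity scores and threshold values are hypothetical and do not reflect any real FRT system.

```python
# Minimal sketch of how an FRT search turns similarity scores into a
# candidate list using a fixed threshold (all values hypothetical).

def search(scores: dict[str, float], threshold: float) -> list[tuple[str, float]]:
    """Return identities whose similarity score meets the threshold, best first."""
    hits = [(name, s) for name, s in scores.items() if s >= threshold]
    return sorted(hits, key=lambda hit: hit[1], reverse=True)

# Hypothetical scores for one probe image against a reference database.
scores = {"person_a": 0.91, "person_b": 0.87, "person_c": 0.55, "person_d": 0.42}

# A lower threshold admits more candidates (risking false positives);
# a higher threshold excludes candidates (risking false negatives).
print(search(scores, threshold=0.80))  # [('person_a', 0.91), ('person_b', 0.87)]
print(search(scores, threshold=0.95))  # [] - even a true match would be missed
```

As the opening statement notes, no single threshold eliminates both kinds of error; moving it only trades one error rate against the other.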

Facial recognition technology is also inherently discriminatory. As the technology stands at the moment, the discriminatory effects of FRT are well documented. While error rates vary depending on the multiple factors that can affect the performance of an FRT system, these errors do not affect all individuals equally. Studies have clearly demonstrated deeply inherent racial and gender biases in FRT, meaning women and people of colour would be more likely to be misidentified and brought to the attention of An Garda Síochána than white men. FRT can enable powerful, mass, indiscriminate and pervasive surveillance. The implications for police use of this highly invasive technology can vary depending on the purpose and scope of its use.

The use of FRT by gardaí, as proposed in this scheme - to use any images or recorded footage that An Garda Síochána legally retains, or can legally access, to locate, identify and track people in respect of certain crimes - would result in a seismic shift in the surveillance capabilities of Irish policing. This brings us to our fourth point.

The general scheme is unlawful under EU law. The committee must consider: the EU law enforcement directive, as transposed in our Data Protection Act 2018; the forthcoming artificial intelligence Act; case law from the Court of Justice of the European Union; and recent guidelines published by the European Data Protection Board, EDPB, on FRT. With those in mind, it is our position that this general scheme is not in step with those frameworks. For example, the use of FRT, as provided for, is not clear, precise or foreseeable. It creates a model of indiscriminate surveillance of people in Ireland. It fails to limit the use of facial data to when it is strictly necessary. It fails to ensure that any FRT use would be targeted in terms of the individuals to be identified. It fails to ensure that anyone whose biometric data is processed by FRT is directly linked to a specific crime, as required under the EU law principles of necessity and proportionality.

In conclusion, we urge the committee to urge the Government to reconsider introducing the FRT proposals before us at the moment. We warn that to do so on foot of ill-defined methods and purposes is not only to invite breaches of innocent people's rights but also to put otherwise secure convictions at risk on subsequent appeal if the underlying legislation is found to be unlawful.

Ms Donna Parau

I thank the Chairman and members of the committee for the invitation to this meeting. From a crime investigation perspective, we support the appropriate use of this technology for the detection of perpetrators and for the collection of evidence in support of charges against them. We have two specific points which we would like to address further.

The first is how data will be collected, processed and protected, specifically, data that contains information on or about survivors of sexual violence. This legislation and the code of practice in the Act must state what procedures and protections will be put in place to ensure that survivors are not subjected to humiliating, degrading and dangerous breaches of their privacy. The protection of privacy for survivors must be prioritised and severe criminal penalties applied to those who access or distribute any information outside of the strict parameters allowed for the processing of this data and in any way that could adversely affect survivors.

The second point is on the schedule of offences. As an investigative tool, we strongly support the use of biometrics in all sexual offences and do not agree with any limitation or privileging of some offences over others. The ability to use this technology, not only in the detection of perpetrators in public spaces but also in the online space, is invaluable. Technology which is already being used to combat online harms such as child sexual abuse, trafficking and other forms of exploitation must continue to be available to An Garda Síochána so its members can do their job effectively.

As stated, we support the use of this technology in collecting evidence, but having this evidence stand up in court is of concern. Ensuring that the use of this technology is not left open to challenge on either its efficacy or its accuracy is highly important. We have not addressed the merits of the technology from a technical standpoint. Our submission proceeds on the trust that any proposal to use such technology carries with it assurances that extensive and robust research and testing into its use and application, for this and any other purpose, has been done.

I again thank the committee for the opportunity to contribute to this discussion.

Mr. Mark Garrett

I thank the Chair and members of the committee for the invitation to speak to them this afternoon. I am joined by my colleague Ms Aimée McCumiskey, a member of the Law Society criminal law committee and a partner at MacGuill and Company Solicitors.

The Law Society is keenly aware of strong feelings in support of, and in opposition to, the Bill. Working on the front line of the justice system, solicitors know the needs of law enforcement to have at its disposal the necessary tools and technology to deter and detect crime. We are also very much attuned to the need to respect, protect and enforce the rights and civil liberties of individuals.

As explained in the Law Society submission, we have identified a number of possible weaknesses in the published general scheme of the Bill. These weaknesses give rise to concern that this legislation will be challenged on several grounds including through the lens of privacy rights, data protection, the right to non-discrimination and the right to a fair trial. For these reasons, the Law Society believes that the necessity and proportionality required for the introduction of biometric identification in the Irish context merit further examination. Furthermore, we believe the Bill could provide more safeguards and oversight relating to when biometric identification may be used by An Garda Síochána and the external monitoring of its use.

While time does not permit me to go through all of the specific issues we have raised about the Bill, I would like to raise four of them briefly. Much has been made of the intention of this Bill to allow for the use of FRT by the Garda. The general scheme says that biometric data, as it is phrased, will not extend to such physical features as a person’s height or build - presumably because reliance on such information would not lead to a definite identification of a suspect. If it is the intention that the Bill will only apply to facial images and not other physical characteristics of individuals, such as their height, it would be preferable that this is set out more explicitly in the legislation.

The scheme permits the Garda to process and store “images which have been legally provided by other national or international organisations”. The draft does not specify what national or international organisations it refers to and we believe it should do so.

Further, the Bill should place an onus on the Garda to use only images which have been legally obtained by outside organisations. For example, the Department of Social Protection has been implicated in the past in the illegal processing of biometric data. If such data were subsequently transferred to An Garda Síochána, the whole chain could be tainted and attempts to secure convictions could fail.

Finally, for now, it is not explained how, or against what criteria, a chief superintendent is to assess whether the use of biometric identification is both necessary and proportionate. As we say in our submission, it is assumed that such technology will only be required for complex investigations and where there is a threat to public security, concern for a person's safety or the need to protect life. The objectives of the proposed Bill could be set out in clearer detail so that the test as to what is necessary and proportionate can be better assessed and reviewed. We also suggest that judicial oversight, namely an application to a District Court judge, might be more appropriate here.

In conclusion, the Law Society is mindful that this is complex legislation that deserves close scrutiny. I am very happy to assist the committee with any further elements of its work.

Mr. Drew Harris

I thank the committee for inviting me and my colleague the Garda chief information officer, Andrew O’Sullivan, to present today.

Every major criminal investigation now involves processing digital evidence. This evidence can take the form of images or footage obtained through warrant from seized devices or CCTV. Two separate judgments from the Court of Appeal recently confirm that the Garda has a duty to process available footage to identify or exclude suspects.

Digitalisation in society has led to an explosion in the volume of digital footage as evidence. For instance, the footage from the 23 November riots runs to 22,000 hours or a total of 916 days of footage. Individual murder investigations have had upwards of 50,000 hours of footage. Seized devices can have over 1 million images of child sexual abuse material.

The key to these cases may lie in just a few frames out of millions. A child's school uniform crest can help to identify the victim. The importance of brief footage in detecting a serious crime such as murder or arson cannot be overemphasised.

Digital evidence that the Garda has a duty to process is now at big data scale in terms of its massive volume, complexity of formats and the rate at which it is generated. Digital crime and evidence can only be investigated with digital tools. Manual processing by Garda personnel sitting at screens is unfeasible and ineffective. In the case of child sexual abuse material, which is the rape of children and every form of sexual depravity that can be visited on a defenceless victim, there is the traumatic impact on Garda members who view the material.

To be effective at fulfilling our mandate to protect victims, investigate crime and vindicate the human rights of citizens in a digital society, An Garda Síochána must have access to modern digital image analysis and recognition tools.

There is understandable public concern, and perhaps some confusion, about AI technology and how An Garda Síochána intends to use it as an investigative tool. I wish to clarify that digitalisation in An Garda Síochána means that electronic tools act only in support of decisions taken by gardaí. There is never a question of autonomous machine decision-making. All decisions that can impact on a person are only taken by identifiable and accountable personnel. This decision support approach is already used. People make the final decision in many areas within An Garda Síochána, including for driver penalty notices that are initiated by Go Safe vans, the 600,000 vetting applications that we process annually, the use of technology to flag uninsured vehicles and the existing use of biometric processing in online abuse cases. Most of these cases involve searching or sifting massive amounts of evidence for the material relevant to the decision-maker’s decision.

An Garda Síochána has invested significantly in digital policing, including the in-house expert professionals required to build and manage the underlying technology and data. This has directly contributed to our effectiveness in major investigations. The reliability of biometric decision support tools is demonstrated by the success of the Garda National Cyber Crime Bureau in detecting and prosecuting online abuse cases, often as part of transnational investigations. The accuracy of more modern biometric identification systems is clearly demonstrated by the biannual review by the US National Institute of Standards and Technology, NIST. We intend to follow the practice of European law enforcement partners in using these ratings to select the best available technology. There must be safeguards but these should be proportionate to the risks involved in the specific use cases.

In summary, extending the already accurate, reliable and safe usage of image analysis and biometric identification technology beyond abuse cases to other serious criminal investigations is essential for An Garda Síochána in our mission to keep people safe in a rapidly changing digital society, to counter emerging threats and to meet our obligations to work with European law enforcement partners to counter transnational organised crime.

I thank all of the witnesses for their opening remarks. I invite members to put their questions and have their engagement. As I said, there are six and a half minutes per member and interactions will flow within that. I call Senator Ruane.

I thank the witnesses for their presentations. I apologise in advance for pressing for concise answers but the time constraint is not helpful in one sense, although it means I have to look for yes-no answers. The first question is to the Garda Commissioner. I am looking for a simple answer. Does he oppose judicial overview of FRT?

Mr. Drew Harris

In effect, there is already judicial overview in terms of seized devices. We would seek a warrant to enter a seized mobile phone or seized laptop and to search that laptop or mobile phone for evidence, so there is already judicial oversight in respect of us obtaining evidence through a warrant.

I am asking with specific reference to facial recognition technology, which will not only be on seized devices. This is in terms of having to get judicial approval for access.

Mr. Drew Harris

The issue with judicial approval is to ensure the approval is actually proportionate to the intrusion that there is going to be. It is not our intention to run images against a database. We have no such database.

Therefore, the Commissioner does not oppose specifically referencing judicial approval in regard to this legislation.

Mr. Drew Harris

We have to be exactly clear at what point we require further judicial authorisation.

It is a completely new technology so surely, with everything else, the Commissioner would prefer to have judicial approval of FRT so it stands in regard to the Bill being legal in the first place.

Mr. Drew Harris

I might refer to our submission and the various stages around how we would intend to use-----

I have read the submission. It is a simple question. Does the Commissioner oppose us drafting judicial approval for FRT into this specific legislation?

Mr. Drew Harris

That is a choice for the legislators. I am the Commissioner. It is not my role to say what should be passed into law. We have set out the stages that form part of a digital examination.

The witnesses can express a view but decisions will be exclusively made by the committee and the Oireachtas.

The Commissioner cited the biometric system as 99% accurate. Where did he obtain that figure and can he explain how that 99% is calculated?

Mr. Andrew O'Sullivan

I will explain it. The figure comes from the National Institute of Standards and Technology in the US, which is the accepted standard for measuring the accuracy of FRT. Twice a year, it measures more than 500 available algorithms for their accuracy. It involves a number of different tests, including measuring their false matching ratio and their positive matching ratio, that is, how many times they get it right and how many times they get it wrong.

Is the 99% based on how many times it does not match?

Mr. Andrew O'Sullivan

That they get it right.

No, not right-----

Mr. Andrew O'Sullivan

They get it right 99%-----

What does that mean? I am short of time. Let us say there are ten people and it has been tested against ten faces. Two people have come up wrong and eight people have come up as having not been matched at all.

Mr. Andrew O'Sullivan

It would not be anything like that if it is 99%.

The 99% is based not on a positive number; it is actually based on how many it did not wrongly match.

Mr. Andrew O'Sullivan

Yes. The number of times it got things wrong varies and it does vary by, for example, the ethnicity of the individual. For example, in the case of eastern European males, it gets it right 999 times out of 1,000, whereas in the case of west African females, where it is less accurate, it gets it right 993 times out of 1,000.

The 99% is not based on how accurate it is. It is actually based on how many times that face did not match.

Mr. Andrew O'Sullivan

It is based on its inaccuracy rate. It has an inaccuracy rate in both cases, whether it is positive matching or negative matching, of less than 1%. It gets it wrong in less than 1% of the cases.

If it wrongly matched two people and did not match at all against eight people, it would claim that it has 80% accuracy because it did not match against eight people. That is not really 99% accuracy.

Mr. Andrew O'Sullivan

That is what NIST publishes and that has been-----

The Metropolitan Police and the South Wales Police say the matches from 2016 to 2023 have been more than 86% inaccurate, if we look at the inaccuracy rating.

Mr. Andrew O'Sullivan

I can explain that.

We will allow the witnesses to answer. I will allow a little latitude on time as I know the Senator is racing against the clock, but she does not need to.

I do not think how it is measured is being explained correctly for people to understand.

Let us get the witnesses to explain that.

Mr. Andrew O'Sullivan

It is important that this is measured on a twice-yearly basis. The figures I am quoting are from February 2023. It is very important to understand that this technology is moving at an extremely rapid rate. When looking at a study from 2016 or 2020, it does not necessarily relate to the new standards. If we want to talk about the accuracy of facial recognition, it is important to talk about which algorithm and which version of that algorithm, and particularly which date we are talking about. If the South Wales Police or the Metropolitan Police are talking about historical figures, it is important to understand when that was done and which algorithm it was rated against, and then we can look at the corresponding NIST figures to be able to get the corresponding figures.

It is how the 99% accuracy is calculated that is the issue, rather than how many times they check. It is how it is calculated.

Mr. Andrew O'Sullivan

I will come back on that. It is very important to understand that the National Institute of Standards and Technology is accepted as the world expert on how to calculate this. While we do not have time to go through its statistical methodology and how this is calculated, that is published in enormous detail on its website.

It is important.

Mr. Andrew O'Sullivan

It is important that it is accurate and reliable. Our statement stands, and that is certainly what the director of NIST will say. It is accepted by other governments and other police forces that this is the correct accuracy of those figures.

It is also important to understand that, as the Commissioner said, we do use biometric identification. We do not use it for the purposes of comparing against a reference database but we do use it to sift and cluster large numbers of images. There has never been a single case, among the hundreds of cases that have gone through the courts, of us getting that wrong based on the use of biometric technology.

The statement that it is only 70% or 80% accurate is based on older algorithms from three or four years ago. The current and latest technology that we intend to procure has a much higher accuracy rate. The other point to be clear about is that we do not have any intention of doing autonomous machine decision-making and have explicitly said so. Every single decision is made by a human. The technology is just there to provide a set of options, to sift and to break down the images into something manageable.
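The apparent conflict in this exchange - a quoted error rate below 1% alongside UK figures suggesting most matches were wrong - reflects two different measurements, and both can be true at once. The following sketch illustrates the base-rate arithmetic with hypothetical numbers; they are not NIST, Garda or Metropolitan Police figures.

```python
# Sketch of how a sub-1% false match rate per comparison can still mean
# that most alerts are wrong when the database is large and almost all
# faces compared are innocent (all numbers hypothetical).

false_match_rate = 0.001   # 1 in 1,000 innocent comparisons wrongly flagged
true_match_rate = 0.99     # genuine matches correctly flagged 99% of the time
database_size = 10_000     # reference templates compared against one probe
genuine_entries = 1        # the probe's true identity appears once

expected_false_alerts = false_match_rate * (database_size - genuine_entries)
expected_true_alerts = true_match_rate * genuine_entries

precision = expected_true_alerts / (expected_true_alerts + expected_false_alerts)
print(f"Expected false alerts per probe: {expected_false_alerts:.1f}")  # ~10.0
print(f"Share of alerts that are correct: {precision:.0%}")             # ~9%
```

On these assumptions, a system that is right in over 99% of individual comparisons still produces roughly ten wrong alerts for every correct one, which is the distinction the Senator was pressing.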

With regard to the riots, reference has been made to 22,000 hours of footage, but surely it is not being suggested that those 22,000 hours of footage would be scanned and every single face processed. Scenario No. 3 of the European Data Protection Board guidelines on FRT and law enforcement deals with the example of a riot or protest. The guidelines clearly state that a generic national legal basis will not meet EU law requirements, but even if there were a legal basis, it would not meet EU law's necessity and proportionality requirements. How would violent disorder be policed using facial recognition technology without breaching EU law?

Mr. Drew Harris

During the riots of 23 November multiple offences were committed. For instance, 211 individuals entered Foot Locker on O'Connell Street and in the space of four minutes, stripped the whole place out. They then made their way to other shops and committed serious offences of looting, burglary, destruction of property as well as physical assaults on individuals. We use the generic expression "riot" but in itself, rioting is a serious offence. We are using footage that was shot at night and that is of varying quality and in varying formats. What we need to do is sift through it for identifiable characteristics as much as identifiable faces-----

Is An Garda Síochána seeking to process all of those faces in over 22,000 hours of footage?

Mr. Drew Harris

Only where we have a suspect for a specific offence. It is not a question of a blanket identification of everybody on the street that day; it is only those individuals that we can identify for specific incidents, where we suspect they have committed a serious crime. Numerous crimes were committed, as we all know, including the destruction of property, wholesale theft, as well as assaults on individuals and on members of An Garda Síochána. Where we have individuals who we suspect for those crimes, we are obliged to investigate and do our best to bring them to justice. That involves sifting through 22,000 hours of CCTV footage, which is why we need digital tools-----

That includes the processing of-----

Please let the witness answer the question, Senator.

That would include the processing of people who were not perpetrators.

Mr. Drew Harris

No, it includes a focus on those individuals who are suspects. An identifiable suspect-----

Other people would have to be processed in order for An Garda Síochána to be able to whittle it down to-----

Mr. Drew Harris

No, in practical terms what we do is focus in on individuals who have identifiable characteristics. Then we seek to find them in the 22,000 hours so that we have a full gamut of all of their movements and we get to a point where we can hope to see their faces to make an identification. Those individuals who engaged in serious criminality that night had no expectation of privacy as they all covered their faces but at some point those facial coverings are removed and that is the point at which we want to be able to, in effect, get an image of them and see if we can identify them through normal policing means. We would try to determine whether anybody local knows them, or whether anybody in the Dublin metropolitan region or the wider organisation knows them before we would consider other steps. We have to be proportionate in how we use technology but we do need technology to sift through that amount of imagery.

I thank the Commissioner and the Senator. I gave a little bit of latitude there because there was a robust engagement and it was helpful to set the scene, which may inform questions and comments from other members. I will move on now because I want to be fair and make sure everybody gets a chance to ask questions. Deputy Ó Ríordáin is next.

I thank everyone for their presentations. I will make some general comments and then invite the witnesses to respond. We all want to arm gardaí with as much weaponry as possible to enable them to target and effectively prosecute those engaged in very serious crimes. We are all on the same page in that regard. I am reminded of a television miniseries that I watched recently about postmasters in the UK who were told that new technology would solve lots of problems but many of them were criminalised over a period of years. I make that point because sometimes technology can be presented to us as something that will solve a lot of problems but due diligence is not done. I have learned a bit more than I knew before I sat down today, particularly from Mr. Carroll, in terms of how this technology has advanced over recent years.

The context in which we are deliberating is that this technology was effectively banned in many parts of America, including San Francisco, Somerville, Massachusetts, Oakland, California, and Boston, Massachusetts in 2019 and 2020. There is also a list of other large population centres with their own police jurisdictions that have had difficulties with this technology and have banned it. Senator Ruane referenced the 2019 study conducted by Peter Fussey and Daragh Murray into the London Metropolitan Police, which found that 81% of suspects flagged by mass FRT were innocent. That study put the technology in the dock, so to speak. Only last June the European Parliament voted in favour of a total ban on live facial recognition in public spaces.

I understand what people have said about the traumatic experience of individual gardaí having to sift through a huge volume of very distressing material, the eating up of hours upon hours of manpower and the fact that this technology would be beneficial in that regard. However, I am also taken by accusations of gender and racial bias. I ask Mr. McGarr to expand on his contention that the law, as drafted, contravenes existing EU legislation. I ask Ms Parau to respond to the suggestion or accusation that this technology has gender and racial bias within it and to outline how her organisation, in particular, would deal with that. I ask the Garda Commissioner or Mr. Carroll to give us some comfort as to why so many police authorities in so many places in the USA and in the UK have had difficulty with this. Is it because it is an old way of doing things? Are they wrong? Are we right? How is it that they have made that determination and yet we are expected to make a different determination in this jurisdiction?

Mr. Simon McGarr

On the question of incompatibility with European law, it is thus as currently drafted, but that is not to say that changes cannot be made that would bring it in line with European law. Indeed, the European Data Protection Board has produced a guidance note - which the Data Protection Commission, as one of the members of that board, could speak to better than I could - on what to look out for to ensure that facial recognition technology, when it is being brought in for the purpose of law enforcement, will not overstep the boundaries of European law. The primary problem with the scheme of the Bill that we have in front of us is that it lacks the necessary foreseeability, precision and strict necessity and proportionality built into usage. Without those, we run into difficulties both in terms of the Charter of Fundamental Rights and the court's decisions in Digital Rights Ireland and subsequent decisions on the application of that charter, even in law enforcement areas.

It is important that we make sure the difference between European law and Irish law does not open up an opportunity for challenges in the future. I appeared before this committee a couple of years ago and discussed data retention and the dangers of having challenges to what would otherwise be, and should be, very strong prosecutions. All of the witnesses present will agree it is important that people who have committed crimes should face the consequences of those crimes and that gardaí should be given the necessary and appropriate powers to ensure that those prosecutions are brought about. However, if we lay an elephant trap by relying on national legislation that can then be overturned at a European level - and foreseeably so - we would run into difficulties with the JC decision, which requires that if any evidence was gathered or used in a prosecution, it could only be relied upon if the fact of its illegality was not foreseeable. We would also run into difficulties in identifying which cases, and which pieces of evidence, the Garda should rely upon when it comes to prosecutions.

Strong prosecutions resulting in reliable outcomes for the courts are what everybody wants to see.

Mr. McGarr has eaten up all my time.

We can take another answer. The Deputy had directed a question to someone else.

I had asked a question of Ms Parau and Mr. O'Sullivan but I do not know if there is enough time for them to reply.

We will take brief responses. I want to get around the table and get everyone in.

I had asked about gender and racial biases and other jurisdictions that have outlawed the use of technology.

Ms Donna Parau

May I suggest the Deputy goes to Mr. O'Sullivan? Making a comment on the technicalities of gender bias is outside our remit.

That is fair enough.

Mr. Andrew O'Sullivan

The Deputy asked about the other jurisdictions that have banned the use of FRT. It is important to make the distinction between real-time facial recognition, where a decision is being made autonomously by a machine, such as a camera, and retrospective analysis, where a decision is being made by a human on the basis of the recommendation or suggestion of the software. We have absolutely no intention of doing real-time facial recognition. We also have no time at all for autonomous machine decision-making. Every decision will be made by a person and he or she will be responsible for that.

The Metropolitan Police, the Met, study is five years old at this stage so it relates to older technology. It also relates to the use of real-time autonomous machine decision-making as opposed to that retrospective basis. In terms of the approach that is followed and the banning by other jurisdictions or other police forces, it is absolutely not the case that the US has banned the use of facial recognition in retrospective cases such as those relating to child sexual abuse materials. All of our child sexual abuse material investigations are transnational and many of them are with the US, where this material is either sourced or published. I can say definitively that the same technology we have outlined in our use cases in our submission is absolutely in use in every developed country and in no cases has it been banned. It is absolutely impossible to process child sexual abuse material cases without the use of this type of technology.

I will make the case again that what we are primarily talking about in use cases Nos. 1 to 8 in our submission is the use of the technology to filter, cluster or sift evidence and to boil it down to a series of suggested cases at which the examiner would look. It does not make definitive identifications. We do not have a reference database, as is sometimes misunderstood, that we can compare against. We are not attempting to make identifications. What we are doing is boiling down considerable amounts of images to a manageable set. The decision will always be made by a member of Garda personnel. That is an identifiable and accountable member of Garda personnel. There have already been hundreds of prosecutions and court cases where this information has been available for the courts to challenge. There has never been much of a challenge and there has certainly never been a successful challenge to that process. The digitalisation process is a combination of the electronic tools and boiling the cases down to something manageable for a person to make a decision. That is exactly what we are talking about. There is absolutely no question of autonomous machine decision-making, which has been rightly banned in other jurisdictions. We have never asked for that.
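Mr. O'Sullivan's description of sifting and clustering, rather than identifying, can be illustrated with a small sketch; the embeddings, distance measure and threshold below are hypothetical stand-ins, not the Garda's actual tooling.

```python
# Minimal sketch of similarity-based sifting: reduce a large image set to
# clusters of visually similar images so a human examiner reviews groups
# rather than every file individually (all values hypothetical).

def cluster(embeddings: dict[str, tuple[float, ...]], threshold: float) -> list[set[str]]:
    """Greedy clustering: images whose embedding distance falls under the
    threshold are grouped; every final decision stays with the human reviewer."""
    clusters: list[set[str]] = []
    for name in embeddings:
        for group in clusters:
            rep = next(iter(group))  # compare against one representative
            dist = sum((a - b) ** 2 for a, b in zip(embeddings[name], embeddings[rep])) ** 0.5
            if dist < threshold:
                group.add(name)
                break
        else:
            clusters.append({name})
    return clusters

# Hypothetical 2-D embeddings standing in for high-dimensional face templates.
images = {"img_001": (0.1, 0.2), "img_002": (0.12, 0.19), "img_003": (0.9, 0.8)}
print(cluster(images, threshold=0.1))  # two clusters: {img_001, img_002} and {img_003}
```

The output of such a step is a shortlist for human review, not an identification, which is the distinction drawn in the testimony above.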

Before we move on, I wish to drill into that answer because it is pertinent to the entire discussion. Mr. O'Sullivan said that in the use of this technology, the Garda is not attempting to make identifications. The gist of what he said was that the intention is to reduce information to a smaller subset. I do not quite get that. Is the purpose not to retrospectively scan a crowd scene, as it were, in the right situation and to identify suspects within that group? Perhaps I did not understand the point he made.

Mr. Drew Harris

We have recent examples of where we have examined CCTV. Our starting point is the investigative mindset. The use of our powers for the purpose of investigation is not just about us seeing who we recognise in a crowd and working back from there. We look at what offences were committed and start from a normal policing perspective. If, for example, 211 people go into a Foot Locker, we ask which of them we are in a position to identify and follow their movements over the course of the night. It is not a question of asking who we can identify and what we can charge them with. We look to find the offences and the individuals we can connect to them. The investigative process is purposeful. We prioritise in terms of who we want to bring to justice and that is a range dependent on the seriousness of the offences. There was, in that case, substantial damage and serious public disorder. Those matters are an obvious priority and we would focus on those individuals in the first place.

In the Foot Locker example, as opposed to scanning a crowd scene, retrospectively or otherwise, if an individual is suspected of perpetrating a crime in the footage, the Garda would almost tag the individual and look to see where he or she appears in other frames or in other video footage. Is that correct? It is a matter of tracing the movements of a particular individual through a set of scenes. Is that right? Is that how it works?

Mr. Drew Harris

That is correct. The example of the destruction of public transport vehicles might be better. The individuals who engaged in that may, in fact, have engaged in earlier disorder and disruption at the scene and in other subsequent attacks. There is a hierarchy of offences committed and that is how we order our investigation. There is no question of asking who we recognise in the footage and what we can get them for. We are purposively pursuing the more serious offences and working our way through the full culpability of those individuals by, as the Cathaoirleach said, tagging them, seeing where they turn up next and what else they do. We hope then to arrive at a point where we can identify them.

That is very helpful. I thank Mr. Harris.

I wish our witnesses a good evening and thank them for their opening statements and for answering questions. I will be running into the Dáil shortly to make a contribution on another matter but I will return.

I have a couple of questions that arise from questions that have been asked. I will start with the representatives of the Data Protection Commission, DPC. Do they have concerns about how the Bill is drafted in the context of the sharing of data across jurisdictions, if it comes to that? For instance, if an individual is identified using this technology, would they have a concern about the sharing of that data? Digital Rights Ireland might also have a view on that.

Mr. David Murphy

I would not say that we have an overarching concern on that point. The Bill makes reference to the obtaining of images by the Garda. The sharing of personal data between jurisdictions for law enforcement purposes is already subject to substantial oversight and governance mechanisms, and we would expect that any information derived from this source would fall under the same governance and oversight rule set.

In our submission, we make reference to the code of practice that will need to be developed to provide substantial detail on how the system will work. We expect the code of practice will speak to that and the safeguards that may be in place.

There is no doubt that the code of practice will be key. Without a completed Bill or Act before us, it is fairly easy to see which is the more important of the documents.

I will turn to Commissioner Harris. Thanks to the Cathaoirleach, this point has been filled out a little, but, as a matter of interest, for the 22,000 hours of footage the Garda is assessing with respect to the Dublin riot, how many officers are assigned to that work and how many hours have they spent assessing that information?

Mr. Drew Harris

There is presently a viewing team of eight people and they have been working solidly on this since 24 November. There has been a considerable effort involving dozens of Garda members to secure CCTV evidence, mobile phone footage, dashcam footage, etc. There have been several public appeals. Those Garda personnel analyse the CCTV and produce packages of evidence. It is for others then to go out and do the interview, search and arrest portion of the work.

How would the implementation of this Bill impact that specific example? How would it reduce the man hours required?

Mr. Drew Harris

We are in something of a legislative lacuna here, in that there is no Bill providing clarity around what exactly we can do. At the moment, we are seeking legal advice about what we can do with all this footage within the present legal framework. There are individuals whose faces we cannot see but who are wearing distinctive clothing or have distinctive characteristics. We are seeking advice about using digital methods to pursue them, tag them, in effect, and follow their movements until we get to a position where we have an image that identifies them. We are doing that manually but if we were able to apply software to look for specific characteristics of a person's clothing or whatever it might be, it would speed up the process around certain of these issues.

A lot of this is a manual process because it is about reporting for an evidential purpose. Once someone has-----

I take the Commissioner's point on board. I will go back to Mr. McGarr. He mentioned the use of the parameters set out in the heads of Bill and the schemes used in other jurisdictions. Given it is individual officers who will assess the information in front of them, one of the points is that if a scenario arises where an individual before the courts disputes the identification of himself or herself in footage used by An Garda Síochána - perhaps Mr. Garrett will have a view on this as well - it is ultimately a matter for the Judiciary to make a decision. Under the Bill, or Act if it goes into law and is then used by An Garda Síochána, it is not technically AI that will determine whether this is the individual. It will be determined by an individual garda based on a profile, including height and build. Height and weight were mentioned as regards profiling of individuals. From an evidentiary perspective, is that a problem? Does Mr. McGarr envisage it being a problem?

Mr. Simon McGarr

From the point of view of evidence, as Commissioner Harris repeatedly said, we are not discussing the creation of machine-generated outcomes under the Bill. These will all have a human element as regards the decision-making. The issue we are raising is our concern about the quality of the prosecutions and the danger a prosecution might fall on the basis of how the evidence was gathered, which is a ground upon which there have been challenges in the past. We do not need to name the particular cases, but there have been challenges to quite strong prosecutions for very serious crimes on the basis of how the evidence was gathered as opposed to a challenge to the identification outcome.

Ms Aimée McCumiskey

We reiterate that the specifics around the decision-making process need to be clarified, probably within the code of practice. The lack of clarity around how the process will operate in practice is certainly concerning.

I have a final question on racial profiling or general profiling. This is for the Data Protection Commission representatives or anybody else who has a view on it. The heads of Bill do not go into it specifically, but there is already an effective ban on such profiling in the Data Protection Act. As members of an authority on such matters, do they believe that is sufficiently strong legislation to prohibit racial profiling by any authority that would have access to this technology, if the Bill were enacted?

I ask the witnesses to be relatively brief because I want to move on to the next member. I will allow the question to be answered as it is important.

Mr. David Murphy

We referenced the risk of in-built bias in our opening statement. The best place to address that is through the data protection impact assessment that should precede the procurement and roll-out of any technology. That is the appropriate way to identify risk arising from the processing of personal data and to build in any mitigating safeguards. Clearly, we would have a significant difficulty with anything that presents such a high risk that it already infringes the law. We can advise that we have been informed by An Garda Síochána that it is very happy to engage with us in that data protection impact assessment process. We welcome that.

I thank Mr. Murphy. Were there any other views?

Mr. Andrew O'Sullivan

On identification, I again make the point that this is not an identification process. Looking at the eight use cases in our submission, it is about sifting and reducing huge numbers of images for decision by a Garda member. That is the key point. Not only is it not autonomous machine decision-making, all it is doing is organising and reducing the footage to a manageable amount.

Does DRI have a view on that? It is important. It was raised, to be fair.

Who was the Deputy directing the question to?

To DRI, as it is about the robustness of the Data Protection Act regarding racial profiling in the context of the Bill, and whether it is sufficiently robust. Somebody raised that matter.

Ms Olga Cronin

A data protection impact assessment would be necessary by law. However, it is important that we are not just concerned about data protection rights. There are privacy rights, and rights to non-discrimination, freedom to protest, freedom of assembly, and to be anonymous and be a face in the crowd. I understand that we have been told the purpose of the Bill is not for identification but, as it stands, the Bill provides for identification. Serious clarity needs to be introduced.

I will indicate the next three members who are due to speak so they can prepare. Deputy Costello will be followed by Deputy Daly and Senator Gallagher.

I apologise to our guests and fellow committee members for running in and out. I am trying to balance two committees meeting at the same time. It is interesting that the children's committee is dealing with the impact of AI on children at the same time we are discussing the impact of AI, in a way, at this committee.

In the interests of time, I will streamline my questions a little. My first questions are for the ICCL and DRI, whose representatives made the assertion that the Bill as proposed is unlawful under EU law. Will they give us more information and detail on that? The Bill as it stands allows for authorisation by a chief superintendent instead of, for example, a judge and judicial warrant. I welcome their views, and those of the Law Society, on that.

The Commissioner referenced the National Institute of Standards and Technology, NIST. It would be very useful if the representatives addressed that point. I also note that the Commissioner made the point that "extending the already accurate, reliable and safe usage of image analysis and biometric identification technology beyond abuse cases to other serious ... investigations is essential for An Garda Síochána to keep people safe..." The ICCL has written in the past about the experience of referrals from the US National Center for Missing and Exploited Children, NCMEC, which used some of these tools. Will its representatives talk about them and their accuracy and reliability?

Ms Olga Cronin

I will speak to the point about NIST and the most recent point. Mr. McGarr will talk to the law points.

It is important to note that the figures from NIST are the go-to when we talk about how accurate this technology is. NIST figures have been mentioned in the Dáil in that respect, but it is very important that when we are told there is an accuracy figure from NIST, and that we should use it based on a figure of, let us say, 98%, we have to ask exactly how that testing was carried out. For example, we heard NIST figures being used in the Dáil to support bringing in facial recognition technology, but those figures related to a very clear mugshot compared with another very clear mugshot. The figures did not involve the use of images taken from CCTV, otherwise known as images taken in the wild. Image quality obviously has a massive role to play in how accurate this will be. We have seen situations in America where some police forces have superimposed a cheek, chin, frown or open eyes just because those features were missing from an image taken from CCTV. Those NIST figures should be interrogated accordingly. I urge the committee to do so. It is also important to note that NIST stated that the accuracy figure could worsen by in excess of 20% when it relates to images taken from CCTV.

It is also important to note with regard to NIST testing that if at least one of the results returned from a facial recognition technology, FRT, search is a match for the probe image, the search is considered successful and counted as a true positive match. For everyone in the room, it is very important to know that when an FRT search is carried out in the manner the Bill seeks to provide for, there will be a list of candidates.

That is a list of people with a percentage score next to it, like a similarity score. Let us say there is someone who looks exactly like me or very similar to me and we are in the same database. Let us say she commits a crime and is caught on CCTV. A picture of her is taken and it is run through an FRT search. It is not guaranteed she will be first on the list and I will be No. 6 or 7, so when we talk about accuracy figures there is a lot to unpack there. The committee should take time to really get to grips with that.
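The candidate-list behaviour Ms Cronin describes can be shown in a minimal sketch; the names, scores and list size here are hypothetical.

```python
# Sketch of rank-based "success" counting in FRT evaluation (hypothetical data).
# A search returns a ranked candidate list; under the testing convention
# described above, the search counts as a true positive if the correct
# identity appears anywhere on the list, even when a lookalike outranks it.

candidates = [             # (identity, similarity score), best match first
    ("lookalike", 0.93),   # an innocent person who closely resembles the probe
    ("suspect", 0.91),     # the actual perpetrator, ranked second
    ("stranger", 0.61),
]

truth = "suspect"
rank = next(i for i, (name, _) in enumerate(candidates, start=1) if name == truth)
hit = any(name == truth for name, _ in candidates)  # counted as a success

print(f"Correct identity at rank {rank}; search counted as a true positive: {hit}")
```

Accuracy counted this way says nothing about whether the correct identity is ranked first, which is the point at issue when an innocent top-ranked candidate is the one brought to Garda attention.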

The other point was on-----

It was on NCMEC referrals and their accuracy and reliability.

Ms Olga Cronin

This relates to big tech companies voluntarily scanning messages and emails for-----

I am sorry, but will the Deputy or Ms Cronin define NCMEC?

It is the National Center for Missing and Exploited Children in America. I may have got the acronym wrong.

I just wanted the gist of it so everyone is on the same page.

It is involved with protecting children online and trying to identify children in online child sexual abuse material.

I thank the Deputy.

Ms Olga Cronin

The National Center for Missing and Exploited Children in the US is something of a clearing house for big tech companies. At the moment they voluntarily scan messages for child sexual abuse material and flag it to the NCMEC, which then flags it to the relevant law enforcement authority wherever it is in the world. We get referrals from NCMEC. There is an issue with how this scanning takes place. It has a significant false positive rate. Unfortunately, what we have seen in Ireland - we got these figures from the Garda - is that a significant percentage of the referrals that were sent to the Garda were false positives: images of children in the bath, innocent pictures of people's kids on the beach or consensual images between adults. It was disconcerting for us, and we raised this publicly last year or the year before, that An Garda Síochána was retaining personal data pertaining to those innocent files, if you like, even after gardaí found there was no criminal case to answer.

There have been moves at a European level to make that kind of scanning mandatory, but they have stalled because the measure is so problematic in terms of proportionality, given that it does not do what it claims to do reliably. When it comes to encryption, there was a ruling in a European Court of Human Rights case just this morning to the effect that if you were to weaken encryption for one, you would weaken encryption for all. Again, there is a lot to unpack there.

The concerns around false positives and unreliability apply to facial recognition technology as well.

Ms Olga Cronin

The scanning that takes place is not necessarily facial recognition technology. It is more a matter of looking for skin, so it is quite different from facial recognition technology, but again it goes back to the issues with tech solutions.

Deputy Costello's time has concluded.

Mr. Andrew O'Sullivan

References were made to information provided by An Garda Síochána, so perhaps I could clarify that.

This is related to the retention of images after the fact that-----

Mr. Andrew O'Sullivan

It is on the figures around children on the beach and so on.

That caught my attention as well. The suggestion was that some of those can be retained after they are no longer required.

Mr. Andrew O'Sullivan

The first issue was around accuracy. To be clear, every single image received from the NCMEC is reviewed. There is quite a small number of images because what is sent are sample images from various Internet addresses, so it is between two and 50 in each case. Each of those is reviewed by a trained examiner to see whether it is child sexual abuse material, CSAM, or what is termed non-CSAM. The potentially innocent cases were just one example that was given. The vast majority of cases relate to material where there is abuse but it is not possible to determine the age of the victim, or to other forms of abuse-type material. It is a complete edge case to say this relates to images of family and so on. Those images need to be retained because in some cases there may be other evidence in the future, such as being able to identify the age of the victim, or other cases may come to light. There is a legitimate reason for retaining those images.

On the accuracy point, as I said earlier, the methodologies NIST uses are fully published on its website. I encourage members to read those in detail because we do not really have time to go through them here. As I said, we have already taken hundreds of cases, on our existing basis of biometric identification, through the courts. Where this has been challenged we have had defence solicitors come in to the Garda National Cyber Crime Bureau to examine our methodologies and there has never been a single case where the accuracy of this technology has been disputed. That is just another comment on the accuracy figure.

I thank everyone for their submissions and opening statements. I thank especially the secretariat for providing a very good overview as usual. I also thank Mr. O'Sullivan for the presentation he gave me and another member of my party at the digital innovation centre, as I think it is called, in the Phoenix Park a few months ago. It was of great assistance prior to this meeting.

It is a good thing this Bill was not rushed, as was intended last summer, when it was to be tacked on to the end of another Bill. There are clearly discrepancies in what some people are saying about the data. On the initial list mentioned last summer were child abductions and very important and serious cases, and then we have an expanded list provided today with the heads of the Bill. It seems strange that some of the cases are in there. It is obvious the riot-type cases have been included subsequently, but this seems to be dealt with on an ad hoc basis. Murder and manslaughter are in there but attempted murder does not seem to be there. Sexual assault is not included and section 3 assault, which was so serious the sentence was increased last year to five years' imprisonment, is not on the list whereas section 4 assault is. As such, it is an unusual list. There does not seem to be any great consistency in it. It seems to be evolving, like the legislation. The code of practice also seems a bit strange. It would be much better to have it in the Bill. Leaving it to the Minister is not a good idea. It should include record-keeping, how to conduct identifications and review processes, etc. It is also important there be a pilot scheme to see how well this technology works given that it is, as I saw in the Phoenix Park, evolving all the time.

There are officials from the Department of Justice present. A look-back review and risk assessment should be built into this. As the Commissioner said, there must be safeguards against discriminatory effects to ensure proportionality, especially where human interaction comes into digital technology. We had the Declan Tynan case, which was registered as a miscarriage of justice. From my practice I can think of three cases off the top of my head where a brother of the person who actually committed a crime was wrongly identified by the gardaí involved in the cases. These were presented to me, as the solicitor going to the Garda station, as cases where the individuals were bang to rights, but I could tell the gardaí in two of the cases that they had the wrong man. By the way, one of the people who was wrongly identified went on to admit to the crime, even though it was clearly not him, because he was not prepared to say it was his brother who was there. That is just an example of human error creating a problem which is then aggravated.

On the heads of the Bill, it is obvious every contact leaves a trace. In gangland crime there are very few confessions anymore, and it is forensics and digital footprints that lead to many of the cases being solved. We do not want a situation, such as the one referenced by the Commissioner in his opening statement, where an individual garda rolls up his sleeves for 12 months and basically has to go through a truckload of CCTV evidence. I ask the Law Society for its opinion on head 43(c), for example. What do its representatives think of that given the Dwyer case and the old section 29 of the Offences Against the State Act? It seems obvious there should be an application made, perhaps by a relatively senior garda who is involved in a case, to a District Court judge.

It cannot be the case, unless it is extremely urgent and the security of the State is at stake or someone is about to be killed, that there would not be time to find a District Court judge who would have oversight of it. That would give some kind of clarity to the process. The garda who knows the case can make the application, probably on a sworn affidavit, and a District Court judge can make a decision. We looked at the community safety legislation last week. It originally provided that applications had to go to the chief superintendent. Following Seanad amendments, that was reduced to allowing an inspector to deal with them. Because of the way the area is evolving, it is far better - subject to what the Law Society and others may think - that we have judicial oversight in the form of a District Court judge. The Garda Commissioner should have a record of all applications and authorisations over a 12-month period so that we would have a look-back to see how it is going. That is really the one point on which I wanted to ask a question.

I want to note that, as is his prerogative, the Deputy has used a considerable amount of time on his submissions. We will have to reduce the time for responses accordingly.

Ms Aimée McCumiskey

We strongly advocate that the application be made to a District Court judge. That would be consistent with other surveillance-type legislation such as the Criminal Justice (Surveillance) Act 2009 and the equivalent of search warrants. Other warrants that impact the fundamental rights of individuals need to be issued by a District Court judge. We agree that the same should happen here.

What is the witness's opinion on the new section 43D(1)(c), which refers to the use of biometric identification being "connected" to the investigation of an offence? In my opinion, that is a bit vague. It is quite weak. Perhaps a better wording would refer to such use being "likely" to assist in the investigation - in questioning, in revealing suspects or in discounting persons - rather than being "connected" to an investigation. Does the witness think it is strong enough in the heads of the Bill or should it be strengthened somewhat?

Ms Aimée McCumiskey

I think it should be strengthened. The application for authorisation should be detailed and connected to the investigation itself.

Maybe by way of a sworn affidavit.

Ms Aimée McCumiskey

By way of sworn evidence.

Okay, sworn evidence. In a case of extreme urgency does Ms McCumiskey think that should be set out? If that is being bypassed in an extreme case, should the reasons for the extreme urgency be set out, so that they are available if there is subsequently a trial?

Ms Aimée McCumiskey

Yes, in exceptional circumstances, it should be set out.

Thank you. We are going to move on because we have spent a lot of time on the opening statement. Senator Gallagher is up next, followed by Senator Ward and Deputy Pringle.

I thank the Chair and I thank the witnesses for coming here this afternoon. It has been very useful, albeit somewhat confusing, from my perspective. I am assuming that the starting point for us all here is that we would like to give gardaí every assistance possible in the investigation of crime. I think we are all on the same page in that regard.

That said, we want to do it in a way that is lawful and completely above board. It is somewhat concerning that we have got to where we are today with legislation that the data protection people have concerns with. Mr. McGarr and Ms Cronin have raised issues here that go as far as to say that it is not compatible with EU law. That is very concerning. Following on from the previous speaker in talking about the merits of pre-legislative scrutiny, this afternoon proves that there is always a strong need for it. When I have finished, will someone from the Department make a comment? From a lay person's perspective, I would take it as a given that whatever legislation is drawn up would have the imprimatur from the data protection people at stage 1, so to speak. It concerns me that this is not the case.

There seems to be a bit of confusion, or perhaps a difference of opinion, regarding whatever data is currently out there and the merits, or otherwise, of it. I do not know who has the overriding voice in relation to it and who determines who is right and who is wrong regarding the information we have been given this afternoon. From our perspective, we are trying to get a complete handle on it from the Garda Commissioner, Drew Harris, and from Mr. O'Sullivan's perspective. He talked about the 22,000 hours of footage that would equate to 916 days for individuals to go through. If the legislation we are discussing were invoked, could we get a very rough estimate as to how many person-hours would be involved in going through that 22,000 hours of footage? How beneficial would it be from a manpower perspective?

I would like the point about the range of offences which Ms Parau raised to be expanded upon. I certainly agree with that. What are the Commissioner's thoughts on the matter? I will pose those questions for starters.

Mr. David Murphy

I will address the matter of the legislation passing without reference to us. I should state that there is statutory consultation under section 84 of the Data Protection Act 2018. The Department is obliged to consult us on the legislation and I can confirm that this has commenced. We are in dialogue with the Department. Following this process, we will continue to engage with the Department on the progress of the legislation.

Regarding concerns about the lawfulness of the measure and its compatibility with EU law, recital 33 of the law enforcement directive does not necessarily require that the clarity, precision and foreseeability be provided for in a primary Act passed by the Parliament. This is where we see the code of practice, as a statutorily mandated code, stepping in to provide - if that is the way the legislation progresses - substantial detail on how this is implemented in practice. This will cover the assessments of necessity and proportionality being made by An Garda Síochána on a case-by-case basis for each deployment of the technology. It will also cover the oversight mechanism and the expertise of Garda personnel to effectively challenge results through the verification process.

As a suite of measures, the Bill and the code of practice will need to meet those thresholds of clarity, precision and foreseeability for persons affected. That is why we see the code of practice, in essence, doing a lot of the heavy lifting in that regard.

Is Mr. Murphy confident that by the end of the discussion with the Department, that point will be arrived at?

Mr. David Murphy

As written into the Bill, there is a requirement for consultation with our office on the code of practice. We await developments on that. The making of the code of practice will also rely on a positive resolution of the committee and the Houses of the Oireachtas as an additional safeguard on that. We would see this as an ongoing, iterative process of establishing the substantive detail of how this will actually work in practice.

Mr. Drew Harris

Rather than use the example of the disorder on 23 November, the example we can talk about more authoritatively is the examination of child abuse material. In a very recent example I looked at last week, in a two-month period, one phone gathered some 650,000 images. In effect, this individual grooms children and then, through threat and intimidation, exploits them, but also encourages and demands that they produce other victims as well.

From that, we detected 54 children as victims, of whom 51 were actually identified. All of them were beyond this jurisdiction, so it required considerable international co-operation. One can imagine how impossible it would be to try to do that manually with a phone containing 650,000 images. It is in this context that we talk about sifting through images quickly to look for groups or identifiable characteristics. That goes beyond the individual. It can be the room, the language on the spines of books in the background, school identification and simple things like light switches, electrical sockets and furniture. These are the details we look for. The specialist software for this purpose speeds up that process immeasurably. What is the work of a number of weeks in identifying 54 individuals would literally take months if we were to try to work our way through 650,000 images manually. We would not be able to identify 54 separate victims. In the meantime, the months would be rolling on and the abuse would be continuing. The software makes a considerable difference. That is well accepted.

Public disorder is a fast-moving and raw situation and there will need to be a great deal of manual examination, but we would look for individuals with clear characteristics as to their clothing. We would search for those electronically. Even if we just got the clues, the space-time continuum means they must travel down a certain route to get from A to B. We would look for the other footage manually to show us what was happening in between, which premises they were in or what actions they engaged in, hopefully leading to their identification.

I will first say that I appear in court on behalf of the Garda Commissioner and I am a member of the ICCL. I do not believe that either circumstance puts me in conflict, but I wanted to declare it at the outset.

The questions I wish to ask relate to false positives and the concerns in that regard. I have listened to what the Garda has said about this essentially being a data-processing tool that works in the same way as Ctrl+F in a PDF does. It finds a particular data set, but a human is ultimately responsible for deciding whether that data set is what he or she is looking for. Comment has been made about the number of hours of footage that might have to be looked through and what might be sought out in same. Depending on the quality of the algorithm, it can make mistakes based on the flaws of the person who designed it. In Ireland, for example, that is more likely to be a white person than someone of a different ethnicity. Is there any data or study to support the idea that an individual garda who sits down and watches eight hours of footage, fast forwards through it and finds what he or she believes is Barry Ward is more or less accurate than technology doing the same job?

Mr. Andrew O'Sullivan

It is accepted best practice that the best results are obtained where there is a true digital approach to the processing, with a trained facial examiner working in conjunction with the technology. If someone is sitting in front of a screen for eight hours, there is a chance that he or she will get tired or develop tunnel vision and start to make mistakes. A machine does not get tired. It is a question of the two working in conjunction.

This is complicated from a technical perspective, so I can give another example of where we apply digital techniques. It is exactly the same approach, but it is perhaps an easier one to explain. The Senator may have seen from press coverage last September or thereabouts that we now have access to a list of uninsured vehicles from the Motor Insurers Bureau of Ireland. That list is known to contain inaccuracies. It is highly accurate for private vehicles, but it is not particularly accurate yet for fleet vehicles. It will be once fleet managers provide information on which vehicles are covered by their policies. We had deployed the technology to just 150 gardaí, but we deployed it to all 700 members of our roads policing units last week. In the next month or so, we will deploy it to all gardaí through their mobile phone apps. In practice, this means that, if gardaí get a flag from the app or the in-car automatic number plate recognition, ANPR, system about a vehicle being uninsured, they will stop it. That will be at a checkpoint or, in some cases, on the open road, but they make no assumptions. All they have is an indication that an offence may have been committed under the Road Traffic Acts. They will not take the car off the road or even accuse the person of having no insurance. It is an opportunity to have a discussion and use other information at their disposal, including discussing it with the motorist, who may be able to produce an insurance certificate. Of course, that certificate may not be correct, so the gardaí can ring the insurance company. There are other steps they can take to verify it.

The exact same digitalisation approach applies when we process facial images. The machine will give an indication that it perhaps has found a match, but it is up to the facial examiner to prove and stand over that in court. There were 174 detections last year for child pornography. All of those used biometric techniques. In the seven years we have been doing this, using the same techniques with progressively more reliable technology, we have never had a single instance of it being successfully challenged in court. The only thing that the court is interested in is the identification that was carried out by the Garda member. That is the best that I can say about accuracy.

Let us say that garda A is sitting down and working with the technology and the technology generates a match. Is there a concern, or what protocols have been put in place to address the concern, that the technology corroborates the garda’s view to the point where he or she might have a doubt but dismisses it on the basis that he or she is relying on the technology rather than his or her own view?

Mr. Andrew O'Sullivan

It is the same approach as the one taken to insurance data. Some people might say that we should trust the machine because the computer is always right and the data from the Motor Insurers Bureau of Ireland is reliable. The exact same approach is taken to vetting decisions. The only way we can process 600,000 vetting decisions per year is by relying heavily on the use of decision support technology. People are highly trained to ensure they are making their own determinations, and they are absolutely responsible and accountable for those decisions. There is no responsibility on the machine. They are trained, and know through experience, to question the decision.

I accept that, but in circumstances where a garda is investigating a serious offence and is under considerable pressure to identify someone, what do we do to avoid the danger that the garda, however well trained, might naturally feel pressure to find someone to prosecute, investigate or whatever the case may be? What is there specifically to disincentivise following the chain in a way that is convenient rather than accurate?

Mr. Drew Harris

These investigations are supervised by a senior investigating officer. Given the range of offences involved, the investigations will all be led by SIOs. In effect, they are producing an evidence package, with level 3 or level 4 interviewers looking at that. They are all sceptical readers and they want to be sure of the accuracy of what they are being told so that it is sufficient to convince the DPP of what we are doing, the quality of the investigation and, therefore, the necessity to charge or subsequently to report. There is actually more pressure to be accurate and to get it right. It is easy to make arrests and so on, but our fundamental approach is to get the evidence to bring individuals to justice as needs be.

One of the points that has come out of these discussions is the importance of the protocols from the Department and within An Garda Síochána. Is there any proposed protocol or measure that will penalise someone or is there any consequence for a garda who gets it wrong when he or she should objectively have got it right?

Mr. Drew Harris

We are all subject to human frailty and we can get things wrong just by mistake. That has to be recognised, but I would point to the rigour of the work being done, the supervision of same and, subsequently, the rigour of the reporting to the Director of Public Prosecutions. There is then rigorous examination of the evidence that we put forward. Regarding the child abuse material, the reporting of that is a human activity, in that individual members sit down, go through the stills and describe what is happening for the prosecution files. There is a rigour in that in terms of its supervision, but also the welfare of the members engaged in the work, as the Senator will imagine.

I thank the witnesses.

I will move on quickly because we are running out of time. There is to be a second round. Some of those witnesses are in different time zones, but I hope everyone will be able to contribute.

I thank the witnesses for their evidence. My question has been evolving as the proceedings have moved on. It is on the code of practice and consultation the Minister would have with the DPC. Is there a requirement on him to report whether the Minister takes on board his concerns, or is there a requirement on him to report what has happened in the consultation that takes place?

Mr. David Murphy

My understanding is that the Garda Commissioner must develop the code of practice, which must then be approved by the Minister. The Garda Commissioner is obliged to consult with us. That consultation is a closed loop.

Is there no requirement on the DPC to report?

Mr. David Murphy

Not that I am aware of. I do not think there is any requirement to report. With the development of any measures like this, we would encourage transparency, and that as much information as possible be put into the public domain about the development of these matters.

The Oireachtas will also have to approve the code of practice. Historically, that has been a 50-minute debate in the Dáil, and that is it - end of story. The committee should recommend that the code come before it for full and rigorous examination as well before it is adopted.

In his submission Mr. Garrett stated: "The General Scheme says that biometric data, as it is phrased, will not extend to such physical features as a person’s height or build – presumably because reliance on such information would not lead to a definite identification of a suspect." I direct this question to the Garda Commissioner. When he gave evidence previously, he stated he would look at the Footlocker incident where somebody of recognisable height or build or wearing recognisable clothes could be in there but could not be identified. They could be traced through all the procedures and cameras and so on. However, that is the very thing it has been stated this will not be used for. One cannot identify somebody through their coat, jacket or trousers and then follow on through. I know it is unlikely, but it is possible that somebody else wearing the same coat, who might be completely innocent, could be picked up by the camera somewhere else further along the line and be identified at a later stage by the Garda as the person involved.

Mr. Drew Harris

That requires a quality investigation - the expression used earlier - to tag an individual and be certain of their movements. The Deputy should bear in mind that when we report matters, the DPP looks at them closely. It is a high test - beyond all reasonable doubt. We need to be clear about the evidence we have gained and the quality of work we have done in terms of attempting to identify an individual. As part of our ongoing work, we see individual habitual serious criminals who go out in the public domain and are often masked. They have no expectation of privacy in the public domain. We then have to use other physical characteristics as a starting point and a follow through in the investigation.

The problem is that the use of that process is not provided for in law in the first place.

Mr. Drew Harris

I do not believe it is biometric data we are processing. I am not sure that particular characteristics of an individual's description are biometric data. We are looking at an individual on a public street. He is in public view and we suspect him of serious criminality. We are using the evidence we have gained - mostly footage, but also witness evidence - to try to trace that individual's movements and ultimately identify him. In effect, that is what we are obliged to do.

Yes, but they are being identified through what they are wearing or their height or build. What is said here could be wrong, and Mr. Garrett can explain whether it is accurate.

Mr. Mark Garrett

Our understanding is that that is not biometric data covered by this legislation, but that is a separate matter.

That is separate, so it does not matter and does not make a difference.

Mr. Mark Garrett

It is a separate issue. It is not covered by this legislation because it is probably covered in other ways.

Okay, I think that is clear.

I thank all of our witnesses. It has been an interesting debate. It is important that the Data Protection Commission is truly on board and feels that this legislation is airtight. My concern is that if we continued with things as they are, there would be huge problems when it comes to prosecution if data protection were to be breached. There is a lot of work to do at the Department's end to make this more airtight. Ms Cronin spoke earlier about privacy and how people's privacy is breached. My honest opinion, which comes from being chair of the committee when we did the Online Safety and Media Regulation Bill, is that the horse has bolted regarding people's data and their images. If I walk down O'Connell Street, there are people taking video footage of the street and the buildings. Kids are using their phones all of the time. To a certain extent that is impossible to roll back. I feel incredibly sorry for An Garda Síochána, which has a real job trying to fight crime with two hands tied behind its back while the world and its mother can use whatever data they like. Yet so many Garda resources are encumbered and burdened in work where technology could help and assist in the fight against crime. That is the most important thing. Everybody around this table is in favour of trying to assist that fight against crime. One example is identifying those who were involved in what we saw in the riots in Dublin.

I will go back to the initial point about the use of technology. We can take the concerns of the DPC and everybody around the table and do more work teasing them out with the Department. We could then be prepared and in the best possible position to give An Garda Síochána the strength to do that, because the current position is unhelpful. The other side is that the criminal world is using AI to its advantage. Then there are the law-abiding citizens and those who are there to protect the State. It must feel like the most futile job in the world trying to do that against those who are using AI for negative purposes and who have a free hand in it. That is more of an observation than a question.

Having heard all of the observations and concerns, does the Department feel that it can do more work to strengthen the legislation to answer most of the concerns around the table, not break any EU law, have the support of the DPC and give An Garda Síochána the strength, support and use of AI it needs to do its job? We also have a shortage of gardaí, which feeds into all of that.

Ms Rosaleen Killian

We thank the Chair, the other witnesses and the committee for the time taken over this pre-legislative scrutiny, which we have been glad to listen to and observe. We appreciate all of the effort that has gone in, and the discussion has been informative.

The general scheme requires more work; it is a general scheme. It has been informative to hear what has been said, and we will take a lot of it on board when we proceed to drafting. We will consult with the AG in due course and await the report of the committee.

Is there anybody else to whom Deputy Smyth wishes to direct her questions?

Does anybody want to comment on the views I have put forward?

I do not know if the Commissioner wants to respond to the comment about criminals using AI while law enforcement has a hand tied behind its back.

Mr. Drew Harris

There is a general comment to be made about children and young people on social media. There is a definite modus operandi among those who wish to abuse children in the most egregious form. They are well schooled in appearing as children on social media to gain the trust of a child. Children often have the same passwords and share passwords to get into all of their accounts. The abuser then progressively grooms the child into providing images.

That becomes intimidation and blackmail and then, in effect, the abuse descends. A survey in the news last week stated that 24% of six-year-olds have a smartphone. I would say to all parents that they need to be really careful about what is happening on that smartphone. In effect, the modus operandi is to take over that account and then to intimidate the child. That is what we are dealing with.

Deputies have spoken about the extent to which the Internet and social media are part and parcel of our lives. There are real risks with them as well, and there are real predators out there. There is an international effort in law enforcement to try to counter this, but the volumes are huge. It is an extraordinary, extensive crime. We want to play our full part as a nation in combatting this by working successfully with other law enforcement agencies, both in Europe and the US, in countering this crime.

Senator McDowell is next and then our final member to come in on this strand will be Deputy Ward. I mention for the information of all members and witnesses that this session must conclude sharply at 6 p.m. because we have a second session commencing at 6.15 p.m. with some international witnesses in various time zones. It is a bit of an effort to get them all lined up online so I do not want to jeopardise things by running over.

I want to echo what has been said here about the value of this session. It is very important that the Oireachtas has the opportunity to have sessions of this kind. I want to make a few points. It seems, as has been pointed out by some of the contributors, that the sourcing of data material is not adequately dealt with in this legislation. Is a public service card available? Are passport photographs available? Is there a duty on the people who have those databases to make them available and in what circumstances? I would like to see that developed when the Bill comes before us again.

The second issue is the code of practice. I appreciate that the Minister is going to generate that but we should have some kind of regular review built into the statute so that the code of practice does not fossilise.

Third, with regard to the offences that are covered in the Schedule, I notice that, apart from offences against the security of the State, they are all personal injury-type offences. Maybe it is a bit topical, but arson is one offence which, at the moment, carries a life sentence. Organised crime is something at which we should also look. It seems to me that some of the offences covered could be minor compared with some of those.

There is one last point I wish to put out, which is that if people are going to be compelled to produce data from their own resources or from national databases and the like, the circumstances in which they are going to face such compulsion are very important. We hear so often now of people looking for dashcam footage. I have often wondered whether I am entitled to go around my neighbourhood using a dashcam as a local vigilante. In any event, however, it is very frequently sought. We should be conscious in all of this that nobody is going to be convicted on AI. In the end, a jury, judge or whoever is going to have to get the testimony of a garda and look at the material themselves to see whether they accept all this. I accept there is a danger that people will be arrested and unfairly made a suspect if the technology is not good. However, the chance of anybody being wrongfully convicted because of an algorithm is zilch.

I thank the Senator very much. On that, we heard testimony at the outset and I believe Mr. O'Sullivan gave some statistics. I am aware of some civil cases for breach of privacy and so forth, but is there a record of a false conviction in any jurisdiction, that anyone is aware of, under this technology? Any of the witnesses may wish to come in briefly on that. I think it was referred to at the start in one of the opening statements. Is anyone aware of any false convictions on that basis?

Ms Olga Cronin

There have definitely been cases where people have been wrongly identified or misidentified and brought in for questioning and detained. There have been several cases in the US. Six cases all involved black people - five black men and one black woman. There have also been cases in Argentina. In one case, I do not think the man was convicted, but he was detained for several days even though he lived hundreds of miles away from where the crime was committed.

Does Ms Cronin know if he received any kind of compensation or any kind of redress afterwards?

Ms Olga Cronin

I think they are still in the courts.

There is an ongoing dispute.

Ms Olga Cronin

Yes.

I thank Ms Cronin.

Mr. Andrew O'Sullivan

To answer that: certainly not in Ireland, as far as we are aware, and from extensive consultation with EU law enforcement agencies, we are not aware of any cases of wrongful conviction.

I thank Mr. O'Sullivan. I will move swiftly on because I have a window to do so unless the Senator wants to press any particular witness for an answer on that.

I just threw out my ideas into the conversation.

That is very good. It is always appreciated; they are always insightful. I will go to Deputy Ward who has not gotten to speak yet. He will be our final member to contribute.

I thank the Chair. I apologise to all the witnesses; I stayed for the opening statements but I had to leave because I was speaking in the Dáil Chamber. I may, therefore, ask a question that was already asked but if I do, I would like to hear the answer. In fairness, however, what I have gotten out of this session today is the absolute value and need for pre-legislative scrutiny when we come to really important issues like this. I am hearing many conflicting opinions across the board. Everybody's opinion really needs to be heard on this. One of the conflicting opinions I heard in the opening statements was, for example, when Mr. Garrett mentioned concerns that height could be used as a metric to identify a person. Commissioner Harris used a really good example of the use of technology to identify, say, a school crest on a young person who is a victim of sexual abuse. There are conflicting opinions and concerns on that.

My understanding is that we are discussing FRT and this debate today is specifically on that. This is kind of a simple question and I do not know whether it has been asked. Can FRT or this legislation be used to identify a person by height, clothes or other distinguishing identifiers apart from his or her face? A representative from the Law Society of Ireland might answer that question.

Mr. David Murphy

I will come in because it might be helpful to address a question that was also raised by Deputy Pringle. Under the EU law enforcement directive, which is the overarching legislation, biometric data is any data derived from a person's physical characteristics that can uniquely identify him or her. Therefore, yes, a person's height, body shape, face or fingerprint is considered biometric data because it is data derived from his or her unique physical characteristics. In that case, therefore, we would deem that tracking somebody by his or her height or body shape would be processing biometric data.

What about, as I said, a school crest or hoodie or certain clothes?

Mr. David Murphy

Those would not be biometric because they would not be derived from the unique physical characteristics of the person. However, they would be personal data for the purposes of data protection law, if an individual can be identified from that, and that would need to be processed accordingly. However, they would not meet the threshold of biometric data, which is a special category of data and requires additional safeguarding.

Mr. Mark Garrett

What the Deputy is raising is the issue of clarity and of providing clarity around what is meant in different sections. I raised the point under head 2, which refers to how biometric data has the same meaning attached to it as in section 69 of the Data Protection Act 2018, but does not include "DNA, fingerprints or any other data 'except for facial images.'" The confusion probably arises at this stage of the development of the legislation, and those sorts of points will have to be teased out as it is drafted. That is the assumption we were making in our submission. As the Deputy can see from that particular quote from the heads, there is a definition of biometric data except for certain data, and we are talking about facial images, or are we? That is some of the clarity we require: the procedures should be clearer, the oversight should be clearer, and exactly what people have talked about in terms of procedures and guidelines should be clearer.

Mr. Simon McGarr

On that point, from a practitioner's point of view, the set definition of biometric data is a European one. It is defined in European law. For us to attempt to redefine it for particular circumstances could frequently lead to problems later, which would be better avoided. If we mean to say facial data, we should say facial data. If we mean to use biometric data, we should remember that it has been defined at a superior level of law, the EU level.

That is a drafting question. These things are frequently changed and improved as we go along, but it is worth bringing it up now. Clarity is one of the necessities for this, and already you can see there is a kind of built-in lack of clarity. We already have a definition of biometric data but we are trying to redefine it for these circumstances.

I have a couple of minutes left, but I am going to give way to Deputy Daly.

The Deputy can transfer the two minutes to Deputy Daly.

I thank Deputy Ward. I have one question for the Commissioner arising out of what was mentioned with regard to the events of 23 November, retention of data and that whole issue. There was a report by Gript, which said on 29 November that it had been misled by a source on the identity of the attacker. It had accessed details of asylum history belonging to a particular person, and he was identified as a result of an article that appeared on the site. Is there an inquiry going on as to who leaked that information and how?

Sorry, I do not think I can allow that question because, as I understand it, there is ongoing litigation with regard to the publication in question. There is also an entity being referenced that is not represented in the room, which would fly in the face of our normal rules. Unfortunately, I do not think that question is permissible.

If the Commissioner wants-----

The Commissioner might want to make a general comment.

Mr. Drew Harris

There are several strands to the investigation. One is around the communications that were entered into. If there was an unlawful release of material, that is also obviously an item that would be of concern to the investigation.

It is an ongoing investigation.

Mr. Drew Harris

Yes.

We have just about concluded this session on time. I remind members that we are back in 15 minutes for part two. It was a very valuable hearing and I thank all the witnesses across the spectrum who contributed for their time, contributions and inputs.

Is it agreed that we publish the opening statements to the committee website? Agreed. We will suspend until 6.15 p.m. I am sure we will speak to our witnesses again in due course on other matters.

Sitting suspended at 6.02 p.m. and resumed at 6.16 p.m.

We are resuming our engagement on the general scheme of the Garda Síochána (recording devices) (amendment) Bill. We invite those witnesses joining us online to log in. They may take a minute to do whatever they need to do. If the witnesses are online, we are trying to get them connected. They might be able to hear us but not see us. We are working on it. I welcome our online witnesses. I think I can see everybody. I can see six faces and hopefully they can all see us. Will they raise their hands if they can hear me? Super. I think everybody is good.

I thank everyone for joining us. We are about to commence the second part of a session we are holding on the Garda Síochána (recording devices) (amendment) Bill. I thank this round of witnesses, who are appearing remotely. I will introduce them all now. I formally welcome Dr. Abeba Birhane. I cannot quite see the names on the board, so I ask the witnesses to put their hands up when I call them so I know who is who. Dr. Birhane is a senior adviser in AI accountability at the Mozilla Foundation and an adjunct assistant professor at the school of computer science and statistics at Trinity College. She is very welcome. I welcome Dr. Daragh Murray, senior lecturer and IHSS fellow at the school of law at Queen Mary University of London and UKRI future leaders fellow. I welcome Dr. Nessa Lynch, who I think I can see in the top left corner. She is the Matheson lecturer in law, technology and innovation at University College Cork and a research fellow at the faculty of law at Victoria University of Wellington, New Zealand. I welcome Professor David Kaye from UC Irvine school of law and Ms Hinako Sugiyama, digital rights fellow and lecturer in UC Irvine school of law. I see them in the top right and bottom middle on my screen. I welcome Dr. Ciara Bracken-Roche, who is on the bottom left of my screen, if I am correct. She is assistant professor and director of internationalisation at the school of law and criminology at Maynooth University.

From the Department of Justice, we are joined by Ms Rosaleen Killian, principal officer, and Mr. Frank McNamara, legal researcher.

When witnesses are speaking, they should ensure their microphones are not on mute. They should ensure that they are on mute when not speaking in order to avoid interference. Everybody is used to this now, but we still trip ourselves up every so often.

Members have already been served with a warning on parliamentary privilege, but witnesses joining the session should note the long-standing parliamentary practice to the effect that they should not criticise or make charges against any person or entity by name or in such a way as to make him, her or it identifiable, or otherwise engage in speech that might be regarded as damaging to the good name of any person or entity. If witnesses are directed to discontinue their remarks on the grounds that they are potentially defamatory or otherwise in breach of the Houses' rules, it is imperative that they comply with any such direction. Therefore, if statements are potentially defamatory in relation to an identifiable person or entity, witnesses will be directed to discontinue their remarks, and any such direction must be complied with.

For witnesses attending remotely outside the Leinster House campus, there are some limitations to parliamentary privilege. They may not benefit from the same level of immunity as a witness physically present does. That is the age we are in, and we are used to it at this stage; it is just a question of mentioning the housekeeping note at the outset. Witnesses participating in this committee session from a jurisdiction outside the State are advised that they should also be mindful that domestic law could be applied to the evidence they give.

I invite the representatives to give a short opening statement on behalf of their organisations. Three minutes is the norm for such statements. After three minutes, I may have to interrupt and move on to the next speaker. Once the opening statements are completed, we will have a rota system whereby members will each have a six-minute slot in which to engage with the witnesses and ask questions. I will move from member to member in the order in which they have indicated.

We have a relatively short window for the meeting tonight because there are several meetings today. Therefore, I ask everybody to remain focused. I will move things on if I need to in order to keep to the appropriate times.

I am delighted to have such professional and esteemed witnesses with such expertise in this area. We are very lucky to have this expertise available to us in our deliberations. I call on Dr. Birhane, who has three minutes in which to make her opening remarks.

Dr. Abeba Birhane

I am very grateful for this opportunity and really appreciate being able to appear before the Joint Committee on Justice.

In seeking policing FRT legislation a second time, Ireland runs the risk of inadvertently imposing a technology that is ineffective, inherently flawed, opaque and proven to be discriminatory. A robust body of scientific evidence built up over the past years has demonstrated that flaws and inaccuracies arise time and again. Although often presented as a cost- and resource-effective aid to policing, FRT has proven to be ineffective and intrusive. For example, a recent survey by Big Brother Watch in the UK reviewed police use of FRT in Wales that scanned 508,542 faces and found that over 3,000 people were wrongly identified. An inaccuracy rate of 88% was recorded in the period 2016 to 2023. In all this time, only three arrests were made. I point to this research to show the ineffectiveness of the technology.

In an audit carried out by Cambridge researchers only last year, which evaluated the ethics and legality of police use of FRT in the UK, the researchers compiled ethical and legal standards for governing facial recognition based on extensive literature and feedback from academia, government, civil society and policing organisations. They then applied the resulting audit tool to evaluate three facial recognition deployments by police forces in England and Wales and found that all three failed to meet the stated criteria or standards.

Although computer vision, the basic technology behind FRT, has come a long way since the misidentification of black people as gorillas in 2015 by the app Google Photos and the huge error rates in identifying black-skinned faces confirmed in the 2018 study by Buolamwini and Gebru, which found error rates of 34.7% for black women by comparison with 0.8% for white men, the technology remains deeply flawed. In a recent study carried out by me and my colleagues, which evaluated the latest state-of-the-art computer vision models on classification tasks, we found that black people are still labelled and misclassified as "criminal" and "suspicious person" at a much higher rate. So far, in the US alone, we know of six people who have been wrongfully arrested due to errors in FRT, five of whom are black men and one of whom is a black woman. Again, the discrimination tends to be against minoritised identities.

I cannot emphasise enough the importance of independence when it comes to evaluations and audits of the technology. Although the technology is deployed to track, monitor and surveil the public, it operates in the dark, without clear oversight, transparency and accountability. This is partly due to the proprietary rights that come with the technology. For example, there is no access to training data, model architecture and other critical information necessary for evaluating model performance. For the most part, all of this information remains hidden from the public and independent auditors, essentially making independent audits and evaluations impossible.

If Ireland goes ahead with this technology, it is just a matter of time before it becomes another cautionary international headline. I thank the committee.

Dr. Daragh Murray

I thank the committee for the invitation to participate in this session.

In my opening remarks, I would like to address first, in general terms, the surveillance capability made possible by facial recognition and highlight the chilling effects this surveillance may give rise to. I will then flag three concerns regarding the human rights law compliance of the general scheme.

Facial recognition represents a step change in police surveillance capability. It is an oversimplification to think of facial recognition as a technology that merely allows police to examine an image and identify those present. The use of retrospective facial recognition allows police to look back in time and determine where an individual was, who they were with and what they were doing. Both live and retrospective facial recognition make it possible to monitor, track and profile large segments of the population, with significant private-life implications. Linked to this surveillance capability is the possibility that chilling effects will emerge. Chilling effects arise when individuals modify their behaviour because they are afraid of the consequences that might follow if that behaviour is observed. These chilling effects are most likely to be felt by those outside the mainstream or in opposition to the status quo. In concrete terms, they can undermine the right to protest and the ability to mobilise or organise for political change.

Chilling effects are most likely to be felt when police are granted broad powers, as is the case regarding the draft Bill, which allows police to use facial recognition for a wide variety of offences and a wide variety of purposes on the basis of subjective interpretation. The European Court of Human Rights has classified facial recognition as "highly intrusive", requiring a "high level" of justification to be considered "necessary in a democratic society". It is difficult to see how new section 43 satisfies this criterion.

Two additional issues, linked to the authorisation and oversight process, can be highlighted that again raise concerns regarding human rights compliance. The authorisation process runs the risk of being reduced to a tick-box exercise by failing to take into account the context of each use of facial recognition, thereby undermining the ability to evaluate necessity. Equally, a tool this powerful should appropriately be subject to independent oversight.

To summarise, the surveillance capability made possible by facial recognition is unprecedented. We should be cautious about authorising the use of this tool and should first fully understand both the benefit to policing and the potential harm, including to human rights.

Discussion surrounding police uses of facial recognition is characterised by unsubstantiated claims and this is why I have suggested a moratorium for the time being.

Dr. Nessa Lynch

Good morning from New Zealand, where it is already Wednesday. I thank the committee for the opportunity to speak. It was great to listen to the expertise of others appearing today. I would like to highlight three overarching points from my written submission. These draw on my expertise in biometrics and biometrics regulation from an academic perspective and from various regulatory and ethics roles in the public sector and in police leadership. Regarding human rights compliance, the risks of biometric technologies in policing have rightly been highlighted in the scholarly and advocacy literature and by my expert friends here today. Facial recognition technology involves the collection, analysis and retention of sensitive personal data. As we heard, it poses significant risks to collective and individual rights such as privacy and freedom from discrimination. I agree strongly with this analysis and have made those points in my own work. I also invite the committee to reflect on the wider human rights framework.

The State has a duty to protect the human rights of all. I argue that assessing the use of FRT can be very context-specific. There are situations in which it may be possible to use the technology in a human rights-compliant way or to deter or prevent significant human rights abuses. As an example, I come from a disciplinary background in children's human rights. I have found in my work that children and young people from ethnic and social minorities may be disproportionately impacted by the use of biometric technologies due to their use of public space and existing inequities in the impact of policing. I also know that in the growing area of online child exploitation, these types of tools can be used to identify alleged victims and alleged perpetrators, thus fulfilling Articles 19 and 36 of the Convention on the Rights of the Child. These two things can be true at the same time. It is vital to look at those contexts. While perceived social licence, political expediency or public opinion cannot in any way override fundamental rights, we have to reflect on situations in which technology such as this can be used in a human rights-compliant way where there is a legitimate law enforcement objective.

Moving to the particular use case contemplated in the draft legislation, it is retrospective, which means that analysis will occur on already-collected imagery. Much will depend on the system and vendor contracted, were these proposals to go ahead. There is a spectrum within retrospective use, ranging from speed and scale improvements, where a single image is compared against footage, to use cases that are very close to live automated use, where there is only a small time lag and the reference database is large. That spectrum needs to be considered as well. Previous work I have done on FRT in the policing context has categorised retrospective FRT as medium risk. This is dependent on the existence of a significant system of controls. I encourage consideration, which will in any event be compulsory, of the European Union AI regulation, which I think aligns with that scheme. As we heard in previous submissions, accuracy and bias remain significant concerns. It is important to reflect that humans are not a gold standard for visual identification. It is important not to fall into that trap when making comparisons.

Legislation is a broad power which both empowers and restricts, but any attempt to implement safe, ethical and human rights-compliant technology requires a comprehensive system of controls. Some of these are within the power of the committee as legislators, and members will have the opportunity to signal their importance for operational matters within the Garda. These would include the legislation itself and regulatory controls such as independent oversight of patterns of use, a requirement for internal and external controls, technology assurance regarding the vendor with which the State might enter into an agreement, assurance of technology use, audit of use, technical matches, rules on matching and-----

I am sorry to interrupt-----

Dr. Nessa Lynch

I am just wrapping up.

I did not mean to come to such an abrupt conclusion but I was going to ask Dr. Lynch to wrap up. She is over time. I apologise. She will get in again afterwards.

Dr. Nessa Lynch

A very important control is the human rights values and ethics that need to be embedded in the policing organisation. I thank the committee for the opportunity to contribute my expertise on this issue. I look forward to the questions and hearing the contributions of others.

I call Professor Kaye. If I need to, I will remind witnesses of the time limit.

Professor David Kaye

I thank the committee for the opportunity to speak today. I applaud the committee’s effort to place law enforcement use of facial identification technology under the rule of law. However, generally speaking, I am concerned that the law as drafted suffers from fundamental defects that would render it inconsistent with Ireland’s obligations under international and European human rights law. In this statement, I want to address a few key points concerning fundamental rights that bear scrutiny, alongside our submitted comments, which offer an overview of international human rights law relevant to facial identification. Paragraph 4 and footnote 8 of the written comments identify some of my work on digital surveillance, including my role as the United Nations special rapporteur on freedom of opinion and expression. I also wish to reintroduce my colleague, Ms Hinako Sugiyama, a co-author of our submission.

Facial identification technologies deeply affect people’s lives, as the committee has already heard and already knows, especially when used in public places like streets, parks, train stations and malls. They can create anxiety about the loss of anonymity or about being wrongly flagged, imposing a burden on freedom of movement and the right to assembly. Notably, this applies whether such technologies are used in real time or on past footage. Retrospective use may have an even longer-lasting impact, since it can analyse data from any time in the past. Due to this chilling effect and the state obligation to promote and protect fundamental rights, international human rights law imposes, at a minimum, the strictest constraints on, if not a prohibition of, the use of facial identification data obtained in a publicly accessible place. The rationale behind such constraint is that the rights at stake are not only rights held by individuals but are essential to democratic societies, as repeatedly confirmed by European and international human rights law mechanisms and jurisprudence.

If, notwithstanding the grave risks to fundamental rights, legislators decide to allow law enforcement use of the technology on data of publicly accessible places, any such framework must, at a minimum, ensure that such use always meets international human rights standards, namely, the requirements of legality, necessity, proportionality and legitimacy. To be more specific, at the very least, any such law would require the following: minimisation of the chilling effect by establishing a specific timeframe between data capture and use of facial identification; mandating strict checks and balances, crucially by requiring prior judicial approval and robust public disclosure; specification of which data sources are used for identification, allowing the public to anticipate police use of the technology and thus self-regulate their behaviour; and notification to the public and to individuals subject to facial identification in order to guarantee the right to effective remedies as required by international human rights law. I hope my suggestions help ensure that this law fully respects international human rights standards. I thank the committee for including me with the other experts on this panel.

I thank the Professor. As with all the witnesses, it is great to have such a depth of expertise in the room. Dr. Bracken-Roche has the floor for the next few minutes.

Dr. Ciara Bracken-Roche

I thank the Chair. I hope everyone can hear me.

It is all good.

Dr. Ciara Bracken-Roche

I thank the committee for having me here today to contribute to this discussion. I am an assistant professor in the school of law and criminology at Maynooth University and an adjunct professor in criminology at the University of Ottawa, Canada.

I am a specialist in surveillance technologies and privacy including national and international frameworks and policies. I focus mostly on the development, regulation and use of new technologies for policing, public safety and security purposes.

To echo what Professor Kaye said, I welcome the production of this draft Bill and amendment to bring facial recognition technology under the rule of law to help safeguard the Irish public. However, I am concerned that the adoption of FRT could do more harm than good in this context and that it could be counterproductive for An Garda Síochána’s operational goals. The development and consistent review of Garda practices, resources and technologies should occur in tandem with the adoption of new technologies, which need to be assessed on their own merit, but also in the context of broader organisational operations.

In this statement, I will highlight three key points for the committee relating to the effectiveness and the proprietary nature of FRT, but I am happy to discuss other things later on. First, with regard to public order policing or surveillance of public spaces, everyday garments can often render FRT useless because simple physical barriers stop FRT from seeing the face clearly. Thinking of the riots, baseball caps, face coverings, face masks, glasses, hoods and umbrellas can all potentially obstruct FRT. Notwithstanding this issue, the chilling effects at play should prohibit the use of FRT in public spaces in the first instance.

Second, FRT relies on vast databases of images to operate, using algorithms to find details about one face to assess its similarity to others. Instead of positively identifying an unknown person, some systems will calculate a probability match score between the unknown person and specific face match templates in a database. This means that, instead of a single match, the system offers up several matches ranked by probability score. This puts privacy at risk as individuals who have nothing to do with an event might still be brought into the investigation if their probability score is high enough.

Lastly, the proprietary nature of FRT means the algorithms and processes inside the system are black-boxed. They are often unexplainable to government institutions and the public alike. The practices employed by FRT companies are questionable at best, with one of the largest international providers, Clearview AI, having been fined and sanctioned for the inappropriate collection, use and disclosure of personal information, for creating risks of significant harm to individuals who have never been implicated in a crime, and for collecting images in an unreasonable manner.
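To make the ranked-match behaviour in Dr. Bracken-Roche's second point concrete, the following minimal sketch shows how a probe face embedding might be scored against a database of face templates. It is purely illustrative and does not represent any vendor's system; the function name, embeddings and threshold are all hypothetical.

```python
# Illustrative sketch only: ranks gallery templates by a similarity score
# against a probe face embedding, as Dr. Bracken-Roche describes.
# The embeddings, threshold and names are hypothetical.
import numpy as np

def rank_candidates(probe, gallery, threshold=0.6, top_k=5):
    """Return up to top_k (name, score) pairs whose cosine similarity
    to the probe embedding exceeds the threshold."""
    scores = []
    for name, template in gallery.items():
        sim = np.dot(probe, template) / (
            np.linalg.norm(probe) * np.linalg.norm(template))
        if sim >= threshold:
            scores.append((name, float(sim)))
    # Several candidates come back, ranked by score, rather than one
    # definitive identification.
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:top_k]
```

The point to note is that every candidate whose score clears the threshold is returned, ranked by probability, which is how people unconnected to an event can be drawn into an investigation.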

We are over time. Dr. Bracken-Roche was finishing anyway. I thank her for the contribution. I thank all our distinguished guests. I will move to the members. All members present are offering. I will call on Senator Ruane first, followed by Deputy Daly and Deputy Pringle. Deputy Alan Farrell and some others may wish to come in at that point.

I thank everyone for their presentations. I do not know if anybody watched the last session. It is probably unfair to the witnesses that some of the last session's witnesses' contributions are still ringing in my head. Today's witnesses may be able to clarify some of my concerns from that last session by virtue of their expertise. Facial recognition technology is about facial recognition. As Dr. Bracken-Roche said, that facial image can be obscured in many different ways. In speaking on the heads of the Bill, An Garda Síochána has pointed out that there are measures already in place. People can be identified by clothing and then followed from point to point until they are identified. Its representatives seem to be completely avoiding speaking to the fact that this is facial recognition. It is in the name. It has to involve recognition against something. In its contribution, An Garda Síochána has not acknowledged that a database has to be used. Faces are identified and then run through a system to measure them against lots of other faces, but An Garda Síochána seems to be saying that no reference point is going to be used for facial recognition technology. That does not make very much sense. If anybody watched the last session, will they use their expertise to clarify whether there is some sort of misunderstanding in those contributions as regards what the technology is built for and intended to do under the heads of the Bill?

I have another question, which is about Garda verification of those matches. Having a human involved seems to be put forward as the answer and solution but, as was highlighted by the ACLU in the US, evidence has shown people being misidentified and that misidentification being upheld by the police overseeing the process. The police officers involved were apprehensive about questioning the machine. Will the witnesses speak to the importance of judicial oversight in the granting of access, the importance of independent oversight, and the human who makes a decision based on the data that comes through?

I will ask a group of questions and people can indicate as we go back through them. On the codes of practice, to be transparent, I do not feel that it is enough to regulate FRT through regulations or guidelines, as the technology is ever-growing and ever-expanding. Legislation should obviously be sufficiently flexible, but guidelines may not be enough to fulfil obligations in respect of human rights, privacy and civil liberties.

How does a person who is misidentified by FRT get access to justice under this Bill? In other legislation, such as the Freedom of Information Act and the GDPR, An Garda Síochána already has rights carved out, allowing it to access data. Does anybody have recourse to justice in light of existing carve-outs?

We know that the method by which that 99% accuracy rate is calculated is problematic. Who is calculating those rates? Is it the vendors, who have decided what accuracy looks like? Obviously, it is their intention to sell a product so it is in their interests to create an accuracy rate based on the number of people a system did not misidentify rather than the number it did. We should speak to some of those vendors. The procurement process in Ireland is an issue. How do we know who those vendors, who are setting the accuracy rates, are? Has a reliable vendor been found in this area?

I see a few hands raised on the screen. I will call on Dr. Bracken-Roche first, Dr. Birhane second and Dr. Lynch third.

Dr. Ciara Bracken-Roche

I thank the Senator for those great questions. I will give a general response because I know some of my colleagues will have more technical expertise. Many of the questions the Senator has raised are concerns we had in reading this draft amendment Bill. There is a lot of information missing as to the technology, how it will operate and what the protocols around it will be. Trying to incorporate this as an amendment to another Act when the technology is changing so quickly and when there is such a lack of clarity as to how it will operate is of concern. Where has the biometric data come from? There is discussion of biometric data but we are not sure where An Garda Síochána will access data from. On verification, head 7 of the Bill talks about the result being verified by an investigation team, but at no point in the draft amendment does it say who this investigation team will be or what kind of training they will have. Who is the facial examiner? How would that operate? All of those specifics need to be brought into the conversation in order for this amendment to proceed. Legislation on this topic may need to be bespoke and specific, because the technology develops so quickly and because we need to be really specific about how it operates. I will leave my comments there and pass on to Dr. Birhane.

Dr. Abeba Birhane

I thank the Senator for the questions.

As Dr. Bracken-Roche said, these are questions on which there is no clarity from An Garda Síochána. I wish I could answer the question on how the 98% or 99% accuracy was calculated. I listened to some of the previous session and I searched for the particular paper. I would really appreciate a link to it because it seems to be an outlier in the sea of research and the large pile of evidence that shows otherwise. The evidence shows recurring discriminatory classification and error rates, particularly against darker-skinned individuals and marginalised people. The general scientific literature indicates that these patterns recur. I have not heard of the one paper mentioned in the previous session and I would appreciate a link to it. I also question whether it was carried out by independent investigators. As I said in my opening statement, independence in evaluating and auditing systems is key. The independence of the evaluator determines the validity and credibility of the result itself. I am afraid I am not answering the question but rather asking one myself. The claim seems to be an outlier against the scientific evidence.

I thank Dr. Birhane. Because this is the first question and it has got people talking I will continue with the answers before I move on to the next member.

Dr. Nessa Lynch

I thank Senator Ruane for her very insightful questions. I will home in on the verification points and then hand the floor over to others. This is a very important control. There are international standards for training in forensic face examination; this would be a forensic standard. It is also important to recognise that the initial verification match, in the context of the Garda, is an intelligence match. Any action to be taken on it would have to fulfil the normal statutory or common law requirements for making an arrest or taking further action. For this type of match to go through to a situation where it would be tested in court and form the basis of a formal identification on which a finding or some sort of criminal sanction would eventuate, it would still need to go through the usual rules of evidence. Definitely, as my expert friends have said, there is still a lot of concern about accuracy. There are some controls that can be put in place to mitigate this.

Dr. Daragh Murray

I thank Senator Ruane for the questions and I will try to answer the first couple of them. With regard to the existence of a reference database, for facial recognition to be effective there has to be a reference database. If a person in a crowd is identified, the benefit of facial recognition is being able to match that face against a database to know who you are looking for. To say none is present seems difficult to believe. In the UK, the reference database is the police national database, which is a database of custody images. There are approximately 19 million images. In the UK we have a problem whereby, in my opinion, there is not an adequate legal basis for this. Other databases have also been brought in, somewhat under the carpet. For example, there have been reports that the passport database is being used. At a minimum, it is very important to define in the draft Bill what the reference database would be and how it will be used.

With regard to the human operator, a human safeguard is often presented as a fail-safe: if an automated algorithm produces an inaccurate result, the human is assumed to be an appropriate safeguard. There is a big body of literature that speaks about automation bias and the risk of deference to an algorithm. In our experience, when we reviewed the London Metropolitan Police's use of live facial recognition, we found a presumption to intervene. Because of the nature of deployment, the pressure of the environment and how a camera is set up, there is often no scope for an effective adjudication of the image. It often happens very quickly. There are very few cases where the operator overturned the match and where the fail-safe was, in effect, a fail-safe.

With regard to the point on accuracy, I am not sure where the 99% comes from either. In the UK, when we have reviewed deployments, it has all been about the metrics that are chosen. If you scan 1,000 people over the course of an operation, an alert is generated for five of them and one of those is a false positive, that is one false positive in 1,000 scans, so you could say the system is 99.9% accurate; but equally it is one false positive in five alerts, so it is potentially 20% inaccurate. It all depends on how it is evaluated. What matters most is not the people who are scanned and against whom no action is taken, but the people who are engaged with by the police and who may face consequences.
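As a purely illustrative aside, the two framings Dr. Murray describes can be reproduced with his own figures; the numbers below are from his hypothetical example, not statistics from any real deployment.

```python
# Dr. Murray's illustrative figures: 1,000 people scanned, 5 alerts, 1 false alert.
scanned = 1000
alerts = 5
false_alerts = 1

# Framing 1: errors as a share of everyone scanned.
per_scan_accuracy = 1 - false_alerts / scanned   # 0.999, i.e. "99.9% accurate"

# Framing 2: errors as a share of the alerts police actually act on.
alert_error_rate = false_alerts / alerts         # 0.2, i.e. "20% inaccurate"

print(f"Per scan: {per_scan_accuracy:.1%} accurate; "
      f"per alert: {alert_error_rate:.0%} wrong")
```

Both numbers describe the same operation; the second framing is the one experienced by the people whom the police actually stop.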

I thank Dr. Murray.

Ms Hinako Sugiyama

I will specifically address the point on the code of practice. The code of practice will be prepared by the police and approved by Parliament. That is good, but at the same time there should be mechanisms to ensure that the code of practice itself substantively complies with human rights law. That is a general remark. Specifically on the right to remedy, it is important to guarantee a right to remedy for people who are wrongfully arrested and detained. The human rights violation occurs at the point in time when people's faces are scanned. It is therefore also important to notify people whose faces were scanned. I would like to emphasise this point.

I thank the witnesses for their very useful observations. I let this round run a little longer than usual because a number of contributors were making responses for the first time.

I thank the witnesses for their presentations. Some of the questions I was going to ask have been covered. With regard to the international experience, is there prior judicial approval? If so, do any of the witnesses know exactly what the detail of this is? Does it go to a local district judge or a judge at a lower level? How does it work in the US? A few people have mentioned oversight. Is this from an independent body following the introduction of facial recognition technology? Is it pre-approval oversight? With regard to international experience, has the code of practice usually been in the Bills that have been introduced abroad? I feel it should be set out clearly so that we know what we are approving. The technology is evolving practically week by week. What risk assessments are carried out? Are there international bodies that carry out pilot schemes or oversight of a pilot scheme prior to the introduction of any such legislation? I have asked a lot of questions but mostly they are about what the international experience is and what other countries introduced prior to codifying the legislation.

Dr. Nessa Lynch

I thank Deputy Daly. This may not be particularly helpful, but it is quite an unusual thing to begin to legislate for facial recognition technology in this way. There is probably not a large amount of experience internationally to draw on. Certainly, in New Zealand, where there is some use of retrospective facial recognition technology, the police do have an independent panel on emerging technology, with community members, which looks at cases before they begin, but there is no legislative oversight. The Irish experience is probably quite novel. Other jurisdictions are relying on a combination of common law and statute to empower it. It is more unusual to begin by legislating. I apologise, as this is not particularly helpful, but it does show the international context.

Dr. Daragh Murray

I echo Dr. Lynch's comments. Ireland has an opportunity to establish best practice in this area.

In the UK, the answer is quite simple. There is no explicit legal basis for the use of facial recognition so the legal framework is based on a combination of the common law policing powers and then police forces' own policy documentation. Linked to that is the fact that there is no judicial oversight of the use of facial recognition. That is the way it is at the moment. That would be liable to legal challenge in the future and I cannot see it holding but as it stands, there is no explicit legal basis or judicial oversight.

Professor David Kaye

I was going to say that Dr. Lynch has said it better than I will. There is very limited state practice on this. I echo the point that this is an opportunity to place this practice under the rule of law. This is a subject of very serious controversy, particularly in the US, because issues like access to facial recognition technology will be dealt with state by state, or even municipality by municipality, for some time; some jurisdictions are banning the availability of it altogether. It is worth keeping in mind that this is a very dynamic and novel situation, and a subject of very significant controversy and significant abuse, because of the lack of rule-of-law guard rails in many jurisdictions around the world.

What databases do they source them from in the US or UK?

Professor David Kaye

I would defer to Dr. Murray on this one.

Dr. Daragh Murray

In the UK, there is a combination of databases. The main reference database is the police national database, which is a database of custody images. These are images recorded essentially when an individual comes into contact with the police irrespective of whether he or she is prosecuted successfully or convicted. It is any initial custody image. That contains about 19 million to 21 million images.

Are those images held indefinitely in the UK?

Dr. Daragh Murray

In the UK, they are. It is a complicated legal situation, in that they are not necessarily supposed to be and a person has the right to request their deletion, but, as it stands, they are held indefinitely. Other reference images can be used. For example, if you are carrying out a search using retrospective facial recognition, possibly the image you are looking for is from social media or is an image a police officer has taken. The really big reference database, however, is the police national database.

One of the witnesses said that this is an opportunity for Ireland to be an exemplar of best practice by putting in place a robust framework for the use of the technology. To play devil's advocate, would I be right in saying that it could have been introduced without legislating at all? Perhaps this could have been rolled out as a technique within law enforcement without any such legislation being passed, taking one's chances in the courts to see whether it was challenged or whether issues would arise. I am not necessarily advocating for that approach. When this debate started some time ago, it struck me that another approach might have been to just start doing this, because I am not sure it is illegal, even though it is not necessarily legally supported and we do not have a framework. Perhaps the witnesses could offer some views on that.

Dr. Nessa Lynch

In my submission, I raised some of the risks around the use of what is known as shadow IT. In the absence of a framework, it is known internationally that individual police officers will use available technology, whether on their personal phones or otherwise, so that is a danger. I will not speculate about what individual members of An Garda Síochána are doing, but that is a big risk. Taking this opportunity to put forward a robust legal framework may be a better option than leaving that space open for unregulated use. That is certainly a danger in that regulation gap.

Professor David Kaye

The question is a good one to surface. I share Dr. Lynch's response and would add that facial recognition technology raises a particular question about remedy, in the sense that those who suffer harm do not necessarily know that they have suffered harm in this context. Although in a normal law enforcement situation someone might have the opportunity to bring a claim against what he or she considers to be an illicit or unlawful use, in this context it is unclear that those kinds of situations would always arise in the array of situations that we can imagine. That is why an advance legal framework is so vital with regard to this type of technology.

Dr. Ciara Bracken-Roche

I appreciate that question because, in the Canadian case, this did happen. Clearview AI had provided its service on a free trial basis to the RCMP, and this was found to be in violation of the Canadian Privacy Act. The Office of the Privacy Commissioner of Canada conducted a joint investigation in 2022 with provincial privacy commissioners and found that the purposes of the company were inappropriate because they were not related to the purposes for which the images were originally posted, they were to the detriment of the individuals whose images were captured, and they created a risk of significant harm to those individuals, many of whom never have been and never will be implicated in a crime. That is really important for us to note.

Was that a ruling of a court or is it an ongoing matter?

Dr. Ciara Bracken-Roche

That was an investigation and outcome from the Office of the Privacy Commissioner of Canada so not quite-----

Was that in Canada?

Dr. Ciara Bracken-Roche

It was in Canada. That is in my original submission so it is referenced there.

I thank our witnesses. It is very interesting. We probably should have had this session before the previous session because it is far more informative and leads up to questions we could have been asking. When the Garda Commissioner was giving his evidence, he said that the 99%-plus accuracy of modern biometric identification systems is clearly demonstrated by the biannual review by the US National Institute of Standards and Technology. Are the witnesses familiar with that review? It was quoted extensively by An Garda Síochána as to how robust this new system is going to be, even though it was pointed out that a number of police forces in the US have refused FRT.

Dr. Birhane said that it was a matter of time before it becomes another cautionary international headline. I would agree with that. Regarding the databases, the database we will use will be the custody records of An Garda Síochána, so there is an inherent bias there, in that it contains people who have been arrested and, more than likely, charged. Does that mean that facial recognition technology will be inherently biased anyway?

I think one witness referenced the passport database. It is shocking that some people will be coming in at the airport, their photographs will be scanned and, on that basis, the database will be accessed. What procedures can we put in place to ensure that those other databases are not used? I believe that, unfortunately, they will be used and that it will have serious implications.

I was just wondering if I could have an answer to those couple of questions, please.

I ask Dr. Birhane to take that question, please.

Dr. Abeba Birhane

In reference to the finding mentioned in the presentation, again, I am not clear as to the particular evaluation referred to. The National Institute of Standards and Technology, NIST, is a huge organisation which produces plenty of reports. For example, in December 2019 NIST evaluated the effects of race, age and sex on facial recognition software. It asked how accurate facial recognition software is in identifying people of varying sex, age and racial background. According to that NIST study, the answer depends on the algorithm at the heart of the system, the application that uses it and the data it is fed, but the majority of facial recognition algorithms exhibit demographic differentials. A differential means that an algorithm's ability to match two images of the same person varies from one demographic group to another. I am presenting this to show that the NIST reports are vast, and the specific report claiming 99% accuracy is not one I am familiar with.

On racism, sexism and other discriminatory attributes in datasets or databases, I cannot speak to the reference database which An Garda Síochána will be using, but I can speak generally about computer vision systems, which are the basis of facial recognition technology. These systems are built by training an algorithm for various tasks. In the case of a facial recognition algorithm, one needs millions if not billions of images to train the algorithm so that it recognises, identifies, classifies and does all sorts of tasks on a given image. Auditing that data is my usual expertise, and I audit those types of datasets when they are available or open source. According to almost all audits carried out, these data have a tendency to label images of, again, minority identities with problematic slurs and connotations. As I mentioned previously, in one of my own research projects we found that the state-of-the-art and latest models tend to label images of black people, more particularly images of black men, as suspicious persons and entities. I am talking about the data which are used for training. One has training data, which are the core and the basis for these algorithms. Then there are fine-tuning data and the reference database. Again, I am not very familiar with that in the context of Ireland. I hope that clarifies some of the questions which have been put forward.

I thank Dr. Birhane. I invite Dr. Murray to speak next, please.

Dr. Daragh Murray

I thank an Cathaoirleach and I thank the Deputy for his questions. I will try to answer the Deputy’s second and third questions. The second question was about the custody image database. I believe the Deputy hit the nail on the head with regard to the risks of that. The European Court of Human Rights, in a different case, referred to the idea of the risk of stigmatisation. Essentially, the problem is that one has a database of people who have been brought into contact with the police for whatever reason, who have not necessarily been convicted of a crime and who are then treated as suspicious, in and of themselves, and are differentiated from the rest of the population. That is a very big problem for the rule of law and for an equal society.

The third question the Deputy raised, about the passport database, highlights the problem of the lack of effective supervisory legislation, in the sense that there are no strict limits on what databases can be used. A way of resolving that in this legislation is to be very explicit not only about the databases which can be used but also about the types of searches which can be done on those databases and on what basis, or to what purposes, facial recognition can be put. Being explicit is very much the answer.

I thank Dr. Murray and I invite Dr. Lynch to speak now, please.

Dr. Nessa Lynch

I endorse everything Dr. Murray has said on the information of non-convicted persons, and I would add the need to ensure that this operates on a principled basis, in line with the rules for the collection and retention of other biometrics, such as DNA and fingerprints, as the same ideas apply. When someone has been convicted, there is obviously an understanding in international human rights law that this places some restrictions on the person, but that does not apply to suspects, victims of crime, voluntary sources and open-source intelligence.

I know from various reviews I have done with police, and from the creakiness of police information technology systems, that I would not be overly worried at this stage about the ability to do a broad-based search, but technology will improve. It is important, therefore, to lay down those rules now.

I agree with the points made about the passport and driving licence databases, because people have given over that data for a particular purpose. I contemplate, as might the European regulation, that there could be some sort of exception for very extreme circumstances, such as a significant threat of terrorism or to national security, but, certainly, broad-based searches of other national databases should be explicitly ruled out.

Ms Hinako Sugiyama

I just want to quickly build on the points made by Dr. Murray and Dr. Lynch on the database, because international human rights law, in respect of the legality test, clearly requires not only that the use is prescribed in law but that it is transparent, in that the public is given prior notice of the sort of consequences the technology could cause for people.

The current Bill does not specify what sort of databases will be used; as others have highlighted, that could include the driver’s licence or passport databases. I also emphasise the timeframe between when photos or footage are recorded and when facial recognition technology is used against them. This key information is critical if the public is to anticipate the restrictions on human rights that this technology would cause. I wish to emphasise this point, please.

I thank Ms Sugiyama. Did any other witnesses wish to return on that point before we go back to the floor? No. Our next member offering is Deputy Costello, who is joining us online and he has six minutes for his questions.

I thank the Chair. I am happy to be in the enviable position of coming to the end-----

I am sorry Deputy Costello but we are losing-----

I wanted to make some general points by way of a response to the earlier points, partly for the benefit of-----

My apologies to Deputy Costello but I would ask that he speak into the microphone as when he turns his head away, we are losing the sound. I know the Deputy has a valuable contribution to make so I want him to get the full opportunity to make it.

I am not sure I have many questions but I have a couple of points which I want to make so that I can put them on the record for the report. I ask that our witnesses respond to them. I am conscious of the European Court of Human Rights dimension, which was briefly mentioned in the previous session, particularly around judicial warrants or approval. I believe that question may already have been asked, but information on the international experience would be useful.

I found the chilling effect points made by several of the witnesses very interesting, in that this was not necessarily mentioned in the earlier session.

On the reference databases, we absolutely need clear legislation as to what they can and cannot be used for, particularly in the international context, which brings me to a question for our friends from the Department of Justice. Does this legislation require an EU Technical Regulation Information System, TRIS, referral and, if so, has such a referral been made, particularly if we are going to be trying to access international and European databases?

A further point made by many of the witnesses, which was not made in the previous session, was around the proprietary and black box nature of many of these software systems. The point was made that transparency is essential, but how can an accused person readily mount a challenge in court without that level of transparency?

We have had codes of practice for other technology. They need to be dealt with as much as possible in the primary legislation. We talk a great deal about policing by consent, so surely any code of practice should involve an extensive review and extensive public consultation. That public consultation should be baked into the legislation.

Again, I do not necessarily have any questions. The submissions have all been very clear. I just wanted to underline those points. For the remainder of the time, I invite the witnesses to react to those points, and in particular the question for the Department.

I thank the Deputy. We might start with the Department because there was a specific question for it. Then we will take any reactions the witnesses want to give on the points raised.

Ms Rosaleen Killian

In relation to TRIS and whether a referral is required or has been made, it is too early in the process to decide on that definitively. If one is required, it will be made in good time. We are still at the pre-drafting stage, so there is plenty of time.

I thank Ms Killian for that response. Some of the other witnesses might like to come in. I see a few hands going up. That is very good. All the witnesses are very welcome to have their say. We will start with Dr. Birhane.

Dr. Abeba Birhane

I thank Deputy Costello for those remarks. I reiterate the points I made earlier. The fact is that crucial information about the base models, such as the training data, is off limits to independent investigation. I do not know what An Garda Síochána will be contracting. I do not know who the vendors are; that has not been made clear. However, I very much doubt that the vendors will allow transparency in terms of giving independent investigators access to information on what data are used for training and fine-tuning and what kind of model architecture they are building from. As far as I know, this kind of information has never been open to the public. It is almost always hidden by proprietary rights and is inscrutable. That is the remark I want to add.

I thank Dr. Birhane. That is a very interesting point about the contractors and the unlikelihood that the algorithms would be open source or anything like that. In any event, it is an interesting point. Dr. Lynch is next.

Dr. Nessa Lynch

I thank Deputy Costello for the point about oversight. Experts such as ourselves are all very well, but it is important to ensure that community members, in particular those who would probably be disproportionately affected by this, are properly consulted and are in partnership on some of the oversight mechanisms.

One point that has not come up today and that I am particularly interested in is how the proposed legislation would deal with children and young people. Obviously, children and young people have a particular human rights framework, which has a higher level of consideration around privacy, stigmatisation, best interests, etc., so I think that is an important thing for the committee and the framers of the legislation to reflect on, including whether children and young people would be entirely excluded or whether there would be a very limited power. The UN Convention on the Rights of the Child does contemplate some situations where public safety may be the primary concern, but there does need to be a different lens on the data of children and young people. I think that is a very important point.

I thank Dr. Lynch. I think Dr. Murray's hand was up.

Dr. Daragh Murray

I will make just two points very quickly. On the black box, in the UK there is a public sector equality duty, and I am not sure if there is an equivalent piece of legislation in Ireland. Essentially, in the only case that went to court about police use of facial recognition, the court held that pointing to the black box was not sufficient to discharge the police's duty. Even though there might have been a proprietary interest in the algorithm, the police still had an obligation under the equality Act to ensure that there was no discrimination; the fact of a black box was not sufficient of itself. That raises a whole other set of issues, but in terms of the police's obligations to prevent discrimination, it may be a relevant point.

The second point is about the chilling effect. I think chilling effects are a significant concern because of their potential to undermine democratic processes and to have a significant impact on human rights. The difficulty with the chilling effect is that we do not know what the precise impact of facial recognition is going to be, and that should be a cause for concern. The issue is that we might see very negligible effects in the short term but a profound impact in the long term. That is why we do not want to sleepwalk into a very dystopian future. The risk of chilling effects is compounded by the broadness of this proposed legislation. As it stands, it is an enabling mechanism that allows for facial recognition to be used in a wide variety of contexts. As a simple example, the range of activities that can be considered important or relevant to the investigation of a crime is huge, and reasonable people can have very different interpretations of it. What it comes down to is that facial recognition, as the Bill stands, could be used for nearly any activity related to one of the offences, which are themselves very broad. As it stands, the open-ended nature of the authority given to the Garda is a problem.

Before I go to Ms Sugiyama, who I think has her hand up, I want to make an observation on the point Dr. Murray made. He mentioned the black box. I think he means the magic computer that has the technology within it. It is a black box in the sense that we cannot access the algorithms within. I am reminded that in this jurisdiction there is a plethora of challenges to the technology used to detect motoring offences. It could be a speed van or a Breathalyser for drink-driving offences, and so forth. A cottage industry has sprung up around criminal defence in terms of challenging those devices. I am interested in the views of the expert witnesses. In those types of prosecutions, certainly within the area of drink-driving or speed capture, it is submitted that the machine is determinative of the offence, in the sense that the machine said the person was going past the speed limit or was over the drink-driving limit. As I understand what has been put forward today by the Commissioner and others who are advocating for the technology, it is not determinative; rather, it is an aid to evidence-gathering, and another process, be it the court or an officer reviewing the outputs, would take the next step. In other words, it is of assistance but not determinative. Does that make any difference in the opinion of the witnesses?

Ms Sugiyama is up next to speak. She will say something anyway but I might just come back to the witnesses on the question I have asked. I would like to hear what they have to say on that. I know Ms Sugiyama has a point to make anyway, and she can answer my question if she likes. Then I will hear from witnesses across the board on the question I have asked, if that is in order.

Ms Hinako Sugiyama

I am more interested in addressing the original question but I will also address the black box issue.

In terms of prior notification, it is pretty important to make disclosure to the public in a way that a layperson can understand: how the technology works and how it impacts people's lives. Professor Kaye also mentioned that in the United States there are attempts at state and municipal level to restrain the use of facial identification and other technologies. Some of that legislation, in San Francisco, for example, or New York City, specifies the items to be disclosed to the public and the manner in which this information should be disclosed. The American Civil Liberties Union, ACLU, has published model legislation to guarantee these points, so that source might be informative in this context.

I thank Ms Sugiyama. That is very good. I will go back around the table to get responses to the point I raised about the black box.

Dr. Ciara Bracken-Roche

When we are talking about FRT, it can be applied to almost any imagery, video and photographic, so in terms of scale and scope it is on an entirely different level. When we are comparing it to other technologies, I think we need to keep that in mind. Thinking about things like the chilling effect, function creep, net widening and, as Dr. Murray said, the stigmatisation that can impact already marginalised communities is key. We need to know what the reference database is, and a lot of particulars from the vendors would need to be under consideration as well.

Dr. Nessa Lynch

On the point about the difference between an intelligence match and evidence in court, it is probably worth comparing this again to other types of biometrics such as DNA and fingerprints. A motoring offence is maybe a little different because the machine reading is probably intrinsic to the proof of the offence. However, for a serious criminal offence, DNA and other biometrics obviously go to prove that the person was there at the particular time, or go to identification. Certainly, like DNA evidence, it is not going to help with proving mens rea or other elements of the criminal offence. The idea is that the normal rules of evidence for a serious criminal offence, and the standard of proof, will still have to be met. However, I definitely agree with the points made about accuracy.

Dr. Abeba Birhane

I just want to clarify what we mean by black box technology or black box algorithms. It generally refers to the fact that, for a computer vision system or a face recognition system, for example, you set up your parameters, you gather up thousands, even millions or billions, of images of faces, and you train your algorithm to differentiate what is a face from what is not a face. After sufficient training, the algorithm has learned what a face is like; however, you do not know how. That is what is generally referred to as a black box algorithm. This is at the heart of machine learning. It is a problem not only with face recognition systems or computer vision but with AI in general.

However, it is still possible to have enough information about a given algorithm to be able to assess error rates, performance and so on. For example, training data is a critical element that determines how a certain algorithm performs. Things such as the training data and the model architecture are what I refer to when I talk about access for independent auditors - not even the public, as vendors do not have to give access to the public. This is information that exists and to which independent auditors could be given access in order to carry out independent evaluations of given models. This is the kind of information that is hidden by proprietary rights, to which nobody outside the vendors, and probably An Garda Síochána - I am not sure about the arrangements - may have access. I just wanted to clarify what we mean by a black box algorithm.
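As a purely illustrative aside on what such an independent evaluation might look like: even treating the model as a black box, an auditor with access to it and to a labelled test set can measure error rates per demographic group. All names in this sketch are hypothetical assumptions; it is not any real audit tool.

```python
# A minimal sketch of an external black-box audit: measure the error rate
# of a face-matching model per demographic group on a labelled test set.
# "model" and the test-set format are hypothetical assumptions.
from collections import defaultdict

def audit_error_rates(model, test_pairs):
    """test_pairs: iterable of (img_a, img_b, same_person, group) tuples."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for img_a, img_b, same_person, group in test_pairs:
        predicted_same = model.match(img_a, img_b)  # opaque black-box call
        totals[group] += 1
        if predicted_same != same_person:
            errors[group] += 1
    # A large spread between groups is the "demographic differential"
    # described in the NIST study mentioned earlier.
    return {group: errors[group] / totals[group] for group in totals}
```

Measuring differentials in this way requires only query access to the model; assessing why they arise, which is Dr. Birhane's point, additionally requires the training data and model architecture that vendors typically withhold.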

I thank Dr. Birhane for those further remarks, and all of the speakers who have contributed. That concludes our engagement with the members this evening. We have had a long couple of sessions where there has been quite a degree of detail and engagement. This was a very worthwhile panel and I also note the submissions that have been made, which are before the committee and will be considered when we go to prepare our report in due course. That has been very helpful.

We are privileged to have such an array of expertise before us, and it is one of the benefits of these Oireachtas committees that we can draw upon both national and international expertise, as we have done this evening, to better inform our work. I hope that it will, as has been suggested, make the legislation that is finally produced here an exemplar of international best practice. That would be a worthy goal, and we will certainly strive for it on this committee.

I thank the witnesses for all of their assistance. I propose that we publish all of the opening statements on the committee's website. Is that agreed? Agreed.

That concludes our business and I will adjourn until the next sitting. I thank all of the participants, who should feel free to follow this process. I am sure they will anyway, but I ask that they feel free to remain in touch with us, officially or unofficially. I am sure they will be watching this with interest, as they have a significant interest in these matters.

The meeting of the joint committee stands adjourned until Tuesday, 20 February at 4 p.m., which is this day next week, if I am not mistaken. We will then consider a number of reports, including this one.

The joint committee adjourned at 7.35 p.m. until 4 p.m. on Tuesday, 20 February 2024.