
Dáil Éireann debate -
Wednesday, 13 Dec 2023

Vol. 1047 No. 5

Digital Services Bill 2023: Second Stage

I move: "That the Bill be now read a Second Time."

I am very pleased to bring this Bill before the House today with my colleague the Minister of State at the Department of Enterprise, Trade and Employment, Deputy Calleary. The purpose of the Bill is to fully implement in Ireland Regulation 2022/2065 of the European Parliament and of the Council on a Single Market for Digital Services, commonly referred to as the EU Digital Services Act. The EU regulation has direct legal effect in all member states and, consequently, the obligations it places on regulated entities are directly applicable and require no further implementation in national law. The Digital Services Bill 2023 is a technical Bill, necessary to give full effect to the supervision and enforcement provisions of the regulation. The Bill does not add to, nor alter, the obligations placed on the regulated entities by the regulation.

The services available online have impacted on practically every facet of the lives of European citizens. These services have brought enormous benefits, socially, culturally and economically. They have provided unprecedented access to information and facilitated an entirely new level of connection and communication between geographically remote citizens. They are transforming for the better our work-life balance, the delivery of healthcare and education, and access to Government services. They have also opened up new enterprise opportunities for commerce and trade for so many people. Online services have, however, also created an entirely new source of risk for citizens and for society at large. These online safety risks include the dissemination of illegal and harmful online content and disinformation. Recent terrible events at home and abroad have underscored how acute these dangers are, and the need for governments to act to protect citizens and society at large. It was with this objective that the EU adopted, in November 2022, the EU digital services regulation. The regulation establishes a pioneering regulatory framework to protect EU users of digital services and their fundamental rights online. It aims to rebalance the responsibilities of users, online platforms and public authorities, placing citizens at the centre.

The regulation marks a sea-change in the EU’s ability to protect society from illegal and harmful online content and disinformation. The regulation is designed to improve online safety by placing obligations directly on providers of online intermediary services, with a focus on platforms such as social media and marketplace sites, as well as on search engines. These obligations are designed to expedite the identification and removal of illegal and harmful online content. The regulation also places obligations on service providers to improve the transparency of their services and to give users more control over their online experience. The European Commission has designated 19 entities as "very large online platforms" and "very large search engines" under the regulation. The European Commission has primary responsibility for regulating these entities but will do so in concert with national authorities. As 13 of these very large entities are established in Ireland, we have a unique, critically important and high-profile role in the overall EU regulatory framework for digital services. In other words, under the country of origin principle, where a company is headquartered in the European Union, that country - in this case Ireland for 13 of the 19 entities - has the obligation to regulate not only at home but right across the Single Market in its entirety. For this reason, it is imperative for our national reputation that Ireland enacts the Digital Services Bill before the EU deadline of 17 February next, when the EU regulation comes into full effect.

The regulation requires Ireland to designate a lead competent authority, to be known as the digital services co-ordinator. The digital services co-ordinator will be a single point of contact, with lead responsibility for all matters to do with the EU regulation in Ireland, including co-ordination across the EU, handling complaints, policy development, communications, supervision, investigation and enforcement. The Bill designates Coimisiún na Meán as the digital services co-ordinator for Ireland. As Deputies will be aware, Coimisiún na Meán was established by the Minister for Tourism, Culture, Arts, Gaeltacht, Sport and Media, Deputy Catherine Martin, on 15 March 2023. The Bill adds the functions of the digital services co-ordinator to those the coimisiún already has and adapts the coimisiún's existing powers, such as powers of investigation and the power to impose financial sanctions, for the specific cases where it will be implementing provisions of the EU regulation. The co-location within Coimisiún na Meán of the supervisory and enforcement responsibilities for the digital services regulation, the EU terrorist content online regulation, the Online Safety and Media Regulation Act 2022, and the regulation of broadcasting and video-on-demand services will enable efficient and cohesive implementation of the regulatory framework, for the benefit of citizens and providers. The synergies will also provide significant cost savings for the Exchequer.

The Bill designates the Competition and Consumer Protection Commission, CCPC, as a second competent authority with specific responsibility for the elements of the EU regulation relating to online marketplaces. The Bill provides both competent authorities with the necessary powers to carry out investigations and take enforcement actions, including the imposition of significant financial penalties for non-compliance. The Bill is prescriptive on how the competent authorities should exercise their powers to ensure that the principles of natural justice, fair procedures and proportionality are fully respected.

Having set out the background, context and purpose of the Bill, I will now outline its main provisions. The Bill is structured in four Parts with 78 sections. It has been drafted to ensure maximum alignment between the provisions for Coimisiún na Meán and those for the CCPC, while simultaneously maintaining consistency with their respective principal Acts, namely the Broadcasting Act 2009 and the Competition and Consumer Protection Act 2014. Part 1 of the Bill deals with preliminary and general matters common to legislation; namely, commencement, definitions, service of documents, etc. Part 2 of the Bill contains a series of amendments of the Broadcasting Act 2009 to designate and empower Coimisiún na Meán as the digital services co-ordinator and a competent authority for the EU regulation.

Sections 9 and 10 provide for an coimisiún to disclose, in certain circumstances, personal data and confidential information to the CCPC for the purpose of performing its functions under the EU digital services regulation.

Section 11 places an obligation on authorised officers and staff of an coimisiún to maintain professional secrecy.

Section 14 amends the definition of "contravention" to include infringements of the regulation. For investigations, the broader term "inquiry subject" is substituted for "provider" to include other persons as defined in the EU regulation.

Sections 15 and 16 specify the steps required to commence an investigation relating to an infringement of the regulation, including the requirements to notify the digital services co-ordinators in other member states and the European Commission.

Section 16 provides for joint investigations with competent authorities in other member states.

Section 17 contains the necessary technical amendments to ensure that the powers of an authorised officer can be exercised during an investigation into an infringement of the digital services regulation.

Section 18 provides that authorised officers may use their investigatory powers in a joint investigation led by another member state or in an investigation conducted by the European Commission.

Section 19 provides that a daily penalty payment of up to 5% of daily income may be imposed on a person for the purpose of enforcing an obligation imposed by an authorised officer during an investigation. This section also empowers an coimisiún to require intermediary service providers to take interim measures in specific circumstances where a contravention is ongoing. It also provides that any proceedings, including summary proceedings, shall be taken by, or with the consent of, the Director of Public Prosecutions.

Section 20 provides that in joint investigations, the report of an authorised officer must take into consideration information or views provided by digital services co-ordinators or competent authorities in other member states. Sections 21 and 24 make provision for technical matters relating to the specific obligations regarding the conduct of joint investigations. Section 23 provides that a person may be liable for administrative financial sanctions if they do not comply with a request for information or if they supply false information. Section 25 provides that once an coimisiún has come to a decision relating to an infringement of the regulation, it must notify the European Commission and the digital services co-ordinators in other member states.

Section 28 sets out the maximum administrative financial sanctions that may be imposed by Coimisiún na Meán for these contraventions. These sanctions are significant: 6% of the annual turnover of the provider for infringements of certain articles of the regulation, for non-compliance with a commitment agreement entered into by the provider with an coimisiún, or for failure to take interim measures following an order to do so by an coimisiún; or 1% of the annual turnover or income of a provider or person for various infringements such as obstructing or impeding an investigation or knowingly providing false information. These percentages may sound small but given the turnover of some of these global giants, it is a lot of money, which gives the regulators significant teeth in the context of investigations.

Section 33 provides that, further to issuing a notice to end a contravention, Coimisiún na Meán may impose a daily penalty payment of up to 5% of daily turnover on the provider for the purpose of enforcing the notice. Section 34 provides that under certain circumstances where a provider does not comply with a notice to end a contravention, an coimisiún may apply to the High Court for an order requiring the provider to block access in the State to the service which is the source of the continuing contravention.

Section 36 empowers Coimisiún na Meán to issue compliance notices and to enter into a commitment agreement with a provider. Section 37 introduces provisions into the Broadcasting Act 2009 relating to vetted researchers, trusted flaggers and out-of-court settlement bodies. It provides for reviews, investigations, revocations and appeals relating to all three. It also creates offences for knowingly providing false or misleading information when applying for certification and sets out how Coimisiún na Meán shall handle complaints related to failure by providers to comply with the digital services regulation.

Part 3 designates the Competition and Consumer Protection Commission, CCPC, as a competent authority for three specific articles of the regulation relating to online marketplaces. The provisions for the CCPC have been set out in a self-contained Part of the Bill. This Part provides that the CCPC can use its powers of investigation, as provided for under the Competition and Consumer Protection Act 2014, in the context of the EU regulation. Section 41 provides for the CCPC to enter into a co-operation agreement with Coimisiún na Meán to ensure effective and consistent implementation of their respective responsibilities.

Section 43 provides for the CCPC to disclose, in certain circumstances, personal data to an coimisiún for the purpose of performing its functions under the EU regulation. It also sets out how the CCPC shall handle complaints. Sections 45 to 54 set out how the CCPC shall conduct an investigation and enforce its investigative powers with administrative financial sanctions. Sections 55 to 58 set out the processes and steps the CCPC must follow in coming to a decision following an investigation and the receipt of a report. They also contain provisions regarding the notice and publication of decisions.

Sections 59 to 65 provide for the procedures and conditions regarding the application of administrative financial sanctions. These sanctions have the same limits as those for Coimisiún na Meán. Sections 67 to 69 contain provisions for the CCPC to issue a notice to end a contravention which is continuing. They also allow for a daily penalty for failing to comply with said notices, including provisions for appealing the penalty.

Sections 70 and 71 provide for further measures the CCPC may take when a contravention continues, including an application to the High Court for an order requiring a provider to block access in the State to the service which is the source of the continuing contravention.

Sections 72 and 73 provide the CCPC with additional enforcement powers of compliance notices and powers to enter into a commitment agreement with a provider. These are the same powers granted to an coimisiún in section 36. Section 74 defines the potential fines and imprisonment terms on summary conviction or conviction on indictment for offences under Part 3 of the Bill. Section 75 provides that the CCPC may take a summary prosecution under Part 3. Part 4 contains miscellaneous provisions, including an obligation on authorised officers and staff of the CCPC to maintain professional secrecy.

It is imperative the State has a comprehensive and robust legal basis for the full and effective implementation of the EU digital services regulation. I am confident the Bill achieves this objective in a balanced and proportionate manner.

I thank my colleague, Deputy Calleary, and all our team in the Department, who did a huge amount of work to bring it to this point. I ask for help and co-operation across the House on getting the Bill done and ensuring it is law by 17 February. We will only get through some of Second Stage today but in the new year we will have to work quickly on this. We will listen in a helpful and constructive way to any concerns and to amendments the Opposition wants to bring forward. We have to get this done in time, which means co-operation where necessary. It would be a serious problem for Ireland if we cannot get this legislation enacted in plenty of time for 17 February. I hope Deputies will work with us on that and thank them in advance for their co-operation.

Go raibh maith agat, a Aire. We note that 17 February deadline.

I thank the Ministers and, especially, the team in the Department for engaging with us for the briefing and for the helpful information exchanged. Sinn Féin will support the Bill on Second Stage. We will bring forward amendments on Committee Stage. That will not surprise the Ministers but they will be brought forward in a constructive and not obstructive spirit. We intend to work with them. We appreciate the deadline and will do our best to ensure it is adhered to in good time for 17 February, and not at a minute to midnight on the night before. Any amendments we bring forward will be done with the intention of strengthening the legislation and being constructive.

This is, as the Minister said, complex legislation, both technically and politically. It impacts national and European law. Last November, the Digital Services Act came into force and it applies directly across the European Union.

The purpose of the EU Digital Services Act is to impose a harmonised set of obligations on intermediary service providers, which include services such as Internet service providers, cloud services, messaging services, marketplaces or social networks, which have an intermediary role in connecting customers with goods, services and content online. The European Commission has primary responsibility for regulating these entities but will do so in concert with national authorities. As 13 of the very large online platforms are established in Ireland, we have a critical and high-profile role in supporting the EU Commission to regulate them.

The stated purpose of the Digital Services Act is to: ensure a safe, predictable and trusted online environment; address the dissemination of illegal content online; address the societal risks that the dissemination of disinformation or other content may generate, and within which fundamental rights are effectively protected; and ensure innovation is facilitated. It seeks to make the services used by people in the EU more transparent. People will have more information about recommender systems and how ads are targeted at them, which is a good thing. There are protections against profiling of minors to make sure content is not inappropriately targeted at them. There will be greater access to the data held by these companies, so there will be further knowledge about their algorithms and how they operate. This data will help design future regulations and inform policy.

An internal complaints mechanism will also be needed. Companies will have to respond faster and be transparent about who people are actually dealing with. The EU Digital Services Act also sets out the framework for supervision and enforcement of the obligations on the providers of intermediary services. The provisions of the EU Digital Services Act that impose obligations on the providers of online intermediary services will have direct impacts here. The Digital Services Bill does not adapt or add to these obligations. That matter is now settled European law. However, I will raise some political concerns regarding the application of the EU Digital Services Act by the EU Commission later.

The Digital Services Bill will give effect to the elements of the EU Digital Services Act that require national implementation measures to designate and empower the relevant competent authority within the regulatory framework to supervise and implement the EU Digital Services Act. In this regard, the legislation designates Coimisiún na Meán as the Irish digital services co-ordinator. It also provides for other miscellaneous matters, including the liability regime for providers of online intermediary services, the harmonisation of court orders to take down illegal content from online services, along with the procedures for awarding trusted flagger status, certifying entities as out-of-court dispute settlement bodies and procedures for dealing with complaints by users or bodies mandated to act on their behalf.

It is important to note that the Digital Services Bill only implements the regulatory framework. It does not deal with the substance of obligations on the technology companies, so it does not deal with the substance of online safety. The Bill assigns added functions to Coimisiún na Meán, which already has a role in the area of online safety, and ensures that the regulator can impose fines and will put in place procedures to implement the EU Digital Services Act.

The Bill designates Coimisiún na Meán as Ireland's lead competent authority for the EU Digital Services Act, to be known as the digital services co-ordinator for Ireland. An Coimisiún has been in place since March 2023, further to the provisions of the Online Safety and Media Regulation Act 2022. The provisions of the Digital Services Bill will provide for new functions related to online content. Coimisiún na Meán is under the remit of the Minister with responsibility for media. There are many areas where the Digital Services Act overlaps with an coimisiún's existing remit.

The Bill also provides for the powers that Coimisiún na Meán will have in that role. However, Coimisiún na Meán already has many of these powers, such as the power to investigate and impose fines. Accordingly, the Bill mainly adapts those powers to the requirements of the EU Digital Services Act. To comply with obligations for supervision and enforcement of the relevant provisions of the EU Digital Services Act, the Bill empowers Coimisiún na Meán to carry out a number of functions, such as: handling complaints about alleged infringements of the regulation; entering into binding commitment agreements with providers under which the provider agrees to take measures that appear to address an issue regarding compliance by the provider; and issuing compliance notices to providers for infringement of articles of the Digital Services Act, among others. By appointing Coimisiún na Meán as the digital services co-ordinator, efforts can be made to ensure that the different legal instruments are used in a coherent and effective fashion to address issues such as content that is harmful to minors and vulnerable adults, hate speech and misogyny, threats of violence and non-consensual sharing of images or videos.

Sinn Féin is supportive of this Bill and the EU Digital Services Act insofar as it seeks to regulate VLOPs and provide a more equitable online environment. It addresses illegal and harmful content, reins in the powers of big tech, and gives Internet users more control over their digital lives. The Minister said in recent weeks that he very much welcomes the publication of the Digital Services Bill. The events of recent weeks in Ireland and abroad have demonstrated the risks posed by illegal and harmful online content and the spread of disinformation. They have underscored the need for a comprehensive and effective regulatory framework to protect individuals as well as society at large. The Bill, once enacted, will be an indispensable component of that EU-wide framework.

The Minister is correct. Curbing the risk posed by illegal and harmful content, as well as the spread of disinformation, is an objective that we should all be aiming for. However, there are also recent examples where the EU Digital Services Act has been applied in a manner that could be seen as a threat to free speech and indicative of the risks that exist with regard to how certain definitions are interpreted. Matters regarding the shape and nature of the EU Digital Services Act have been settled but there are differences of opinion on the interpretation of the Act and the definitions that underpin it. In 2018, disinformation was defined in an EU code of practice on disinformation. The definition states that disinformation is:

"verifiably false or misleading information" which, cumulatively,

(a) "Is created, presented and disseminated for economic gain or to intentionally deceive the public"; and

(b) "May cause public harm", intended as "threats to democratic political and policymaking processes as well as public goods such as the protection of EU citizens' health, the environment or security".

In recent weeks, since the outbreak of the Israel-Hamas war, there has been growing apprehension about the unintended repercussions of the EU Digital Services Act on the digital rights of Palestinians and other vulnerable communities. Online information has been taken down amid accusations that it constitutes harmful content or disinformation, only for the poster to open a complaint and prove the content is legitimate and valid. This highlights how the EU Digital Services Act could potentially be applied with bias, as the interpretation of disinformation can be politically applied. For instance, Commissioner Thierry Breton recently emphasised the global impact of the Digital Services Act by warning X, Meta and TikTok of their obligations under the text with regard to the war in Palestine. The primary issue here lies in the Commissioner's and the EU's framing of the situation, which aligns closely with the mainstream narrative that neglects the Palestinian perspective and ongoing human rights violations. It must be stated that Ireland stands outside this narrative, and the people, media and politicians are generally even-handed and fair in their analysis of the current conflict.

My point here more broadly refers to the narrative that pervades at EU Commission level and in several other EU countries. This narrow focus perpetuates a one-sided narrative that overlooks the complexities and nuances essential in understanding the situation's full scope. This is of concern in Commissioner Breton's portrayal of the Commission's role in enforcing the EU Digital Services Act, which, in my opinion, is exacerbated, or could potentially be exacerbated, by comments regarding the conflict made by the President of the EU Commission, Ursula von der Leyen.

Much of the apprehension primarily revolves around the prioritisation of speed over due diligence in content removal. This approach has resulted in the unjust removal of legitimate content, breaching EU Digital Services Act provisions in a context where nuanced, contextual understanding is crucial. From memory, this happened with a clip regarding the current war in Palestine in an episode of the "Free State" podcast, where a social media post was removed following an accusation of disinformation or harmful content, only for Mr. Joe Brolly and Mr. Dion Fanning to contest this and prove it was factual and fair.

The zeal to combat illegal content and disinformation at any cost has exerted pressure on very large online platforms, such as X, Facebook and TikTok, to act swiftly and decisively, even if it means relying on imperfect and opaque algorithmic tools to avoid liability and public scrutiny. Unfortunately, this has led to the unjust and disproportionate removal of lawful content produced by Palestinian and foreign journalists, as well as human rights advocates documenting the on-the-ground reality. Consequently, this practice distorts vital information necessary for global understanding and monitoring of human rights abuses. Instead of adhering to the principle that any restriction on freedom of expression must be necessary and proportionate, online platforms have removed Palestinian-related content and suspended accounts. This raises the question of how Coimisiún na Meán will operate following the passage of the Digital Services Bill with regard to interpreting disinformation.

Concerns have also been raised recently regarding the EU Digital Services Act in terms of how SMEs and microbusinesses can financially afford to comply with the EU law. As stated previously, there is a legitimate fear that large platforms will have a tendency to block too much, removing content according to the fast-paced, blunt determinations of an algorithm, while appeals for the wrongfully silenced will go through a review process that, like the algorithm, will be opaque and arbitrary. The review will also be slow. Speech will be removed in an instant but only reinstated after days, weeks, or potentially even months and years. At least the largest platforms will be able to comply with the EU Digital Services Act while it remains to be seen if SMEs and microbusinesses will be able to operate in Europe if they cannot raise the money necessary to pay for legal representatives and filtering tools.

Thus, it is argued that the EU Digital Services Act sets up rules that allow a few tech giants to control huge swathes of European online speech because they are the only ones with the means to do so. Within these tech companies, algorithms will monitor speech and delete it without warning and without regard to whether the speakers are bullies engaged in harassment or survivors of bullying describing how they were harassed. We hope Coimisiún na Meán will shed more light on such matters and will work with the EU Commission to address these concerns.

The designation of vetted researchers is catered for in the Bill. Approved researchers will get access to data from large online platforms and search engines. In order to become a vetted researcher, an individual must apply to the EU Commission to be recognised as such, with the application process including an assessment by a member state's digital services co-ordinator. The Commission can request more information before making a decision. It will then approve or reject the application. The Commission can also revoke a vetted researcher's data access if they no longer meet the required conditions. This decision can be based on the Commission's findings or on information from third parties. This may not seem an important political point but, as already stated, the EU Commission may seek to only designate vetted researchers it feels align with its views. Thus, the approval of a designated researcher may potentially be applied in a political manner. This remains a concern and is an area that should be closely monitored.

The enterprise, trade and employment committee recommended in its pre-legislative report that provision be made in the legislation to enable public interest research based on data provided by regulated platforms to facilitate research access, conduct data analysis and manage collaborations. However, this has not been included in the Bill. On a similar point, the appointment and revocation of the status of trusted flagger are extremely important. Section 191 describes how to remove this status after complaints but this could potentially be very slow. In the meantime, a platform loses indemnity for certain types of illegal content if it does not act on complaints from a trusted flagger. What will be done to ensure politically partisan groups do not gain the position of trusted flagger as a means of controlling or influencing content? Will the Minister elaborate on what will be done to ensure that bias is not applied?

It will be necessary to reference who puts in take-down complaints and their impact when seeking to revoke a status. Article 19 of the Digital Services Act describes a complaints mechanism for trusted flaggers which is only open to online platforms. Article 19.6 contains the phrase "on the basis of information received by third parties". Perhaps the legislation could be amended to explicitly state that third parties can complain about the actions of a trusted flagger, not just online platforms, and that the database of trusted flaggers for submission to the Commission be reported each year with aggregate details of complaints, including unsuccessful complaints, for appraisal.

The EU Digital Services Act perceives the problem on the Internet as one of illegal content and the role of regulation as being to assign responsibility to different bodies to remove this content. This view of the Internet and the current harm reduction regime was cultivated by large tech companies as they seek to discuss their difficult burden of adjudicating between the right to free speech and expression and the right not to be constantly presented with gambling advertisements or pro-self-harm videos on your phone. An alternative view is that this technology is deployed as a product with inherent features in the feed or presentation of content which should be subject to product safety regulation to minimise the very real harms occurring. Current court cases in the US allege that digital platform software products were created for their addictive properties and that this is causing harm to children, which needs to be addressed as a product safety issue. The Digital Services Act does not take this position. The role of the national co-ordinator is limited to focusing on content and take-down notices in Ireland, with the Commission doing the heavy lifting on large systems and algorithms.

There are still several important points. The ability of Coimisiún na Meán to work with the CCPC on investigations and the apparatus for same is important. The Internet is not just media; it impacts every aspect of life, and that must be borne in mind as we develop regulation. Knowledge within the Government of the impact of certain online practices will be found outside of the media commission. It is welcome to see this joined-up approach. A mechanism for civil society bodies to engage with the reporting and take-down of illegal content is welcome and the trusted flagger process seems a sensible mechanism. As previously mentioned, it is open to abuse in a polarised and highly political media landscape. It is important to have as open and accountable a process as possible. In addition to the requirement to report the appointed trusted flaggers to the commission, it is important that flagged content is reported to see who edits the Internet and for what purpose. Coimisiún na Meán should publish a list of complaints, actions and flaggers each year, as well as submitting it to the EU Commission. Furthermore, complaints about trusted flaggers should be open to other third parties, not just platforms. Responsibility for auditing the algorithms of VLOPs at European Commission level is welcome but the appointment of vetted researchers by the Irish commission is also an important tool. Areas such as targeted advertisements are specific to each territory and it is important that research is undertaken on the Irish media environment. The legislation seems unsure about who will initiate and recruit researchers. If a researcher believes they have a research project which could be covered by a reasoned request under Article 40.4 of the EU Digital Services Act, the Commission should support this once it complies with the public service requirements of a researcher being certified as a vetted researcher.

The enterprise, trade and employment committee also recommended in its pre-legislative report that Coimisiún na Meán be satisfactorily resourced with the level of staffing and legal expertise required to allow optimal operational capacity and enforcement. It further recommended that highly precise detail be given about the roles, individual functions and responsibilities of Coimisiún na Meán. The Department of Enterprise, Trade and Employment maintains that Coimisiún na Meán has a budget of €6 million for 2024. However, the coimisiún submitted a business case outlining in detail the additional roles and skills required for operation in 2024, as it moves from the setting-up phase to official designation as a digital services co-ordinator. It is essential it gets the necessary support and funding to ensure it has the resources, human and financial, to deliver on its remit as the digital services co-ordinator for this State.

It is essential that the CCPC receives the necessary funding to meet its new responsibilities under the Bill. The Minister indicated that an industry levy will be used to ensure the necessary funding for Coimisiún na Meán and the CCPC is there to meet their obligations under the Bill and the Digital Services Act. We still await clarity on this.

I look forward to working with the Government quickly. I do not mean working with it in an elongated manner. That is not a metaphor for thousands of amendments intended to obstruct. I hope I made clear in my contribution that the intention is to be constructive. We can get this right and work with the Government to ensure it is done properly. I apologise but I cannot stay. I have stayed for as much of the debate as I can. I will have to leave.

I welcome the ambitious Digital Services Act regulation and recognise that European law in this area continues to be the global leader. I also welcome this Bill. The Labour Party will support its transition through the stages to ensure it is in force by 17 February. In much the same way that the general data protection regulation, GDPR, led the way in privacy rights, the Digital Services Act leads the way in social and market responsibility in online spaces. ComReg called the Digital Services Act a paradigm shift in the regulation of digital platforms in Europe. The Digital Services Act recognises the wide-ranging influence of the systems that providers of very large online platforms and very large online search engines operate and places an important wider social responsibility on them - rightly so.

Recital 3 of the Digital Services Act states:

Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trustworthy online environment and for allowing Union citizens and other persons to exercise their fundamental rights ...

We in the Labour Party could not agree more. Far too often, as Members will know, providers fall below the standards right-thinking people across this country and the EU would expect. They also fall well short in far too many instances in the enforcement of their own guidelines and the fulfilment of their commitments. We commend the intention of the measures of the Digital Services Act, particularly with respect to the most influential online operators – the "gatekeepers", as the Digital Services Act terms them. These currently include Google, YouTube, Facebook, Instagram, TikTok, X and more.

The Digital Services Act importantly identifies four systemic risks: illegal content including child sex abuse material and illegal hate speech; impacts on the Charter of Fundamental Rights, including discrimination and privacy; foreseeable impacts on the democratic process and public security; and manipulation or co-ordination of disinformation campaigns especially in public health, mental well-being and gender-based violence.

It is quite clear from Recital 12 that what is to be determined as illegal content would include hate speech and unlawful discriminatory content. It is right that after many years we are updating our hate speech laws to take account of the ways in which we communicate in a mass fashion. These could not have been envisaged even 20 or 30 years ago. Our laws in that area need to be updated. Given the events in Dublin on 23 November and the many appalling events across the country in recent times - blockades of buildings intended to house applicants for international protection, protests at libraries, violent protests outside these Houses and at many other locations across the country - any legislation that strengthens the hand of regulators of the online space is to be welcomed. I could fill the Dáil record, as could many others, with instances where large online platforms played pivotal roles in stoking fear among communities, platforming misinformation, and where platforms themselves have manifestly failed to properly address those issues. In some cases they have point blank refused to act. This legislation is clearly much needed.

The existence of the Digital Services Act and the facilitation of its operation in Ireland by this Bill will not instantly put a stop to illegal content, misinformation, or human rights abuses proliferating online spaces. However, even apart from the new regulatory framework, what we hope to create is a different culture in those firms. The director of the EU Agency for Fundamental Rights, FRA, Michael O'Flaherty, recently stated:

The sheer volume of hate we identified on social media clearly shows that the EU, its member states, and online platforms can step up their efforts to create a safer online space for all.

The FRA has recently conducted extensive research, which reveals that women are the primary victims of online hate, facing abusive language, harassment, and incitement to sexual violence on a regular basis. This will of course come as no surprise to public representatives who are women. The number of hateful posts targeting women was almost three times that of those targeting people on a racial basis. Given this study and what we have witnessed here and around the world, it is clear that enforcement will be a mammoth task. A year ago, it was widely reported that the CEO of X had disbanded that platform's trust and safety council. This was the advisory group of approximately 100 independent civil, human rights and other organisations the company formed in 2016 to address hate speech, child exploitation, suicide, self-harm and other problems relating to the platform. Only last week, citing legal matters, the company refused to come before a committee of this House to answer questions. All of this serves to demonstrate the uphill task at hand for us as legislators, and the regulators we seek to empower.

I question the assignment of two existing bodies to undertake this mammoth but meaningful and important task, namely, the media regulator and the CCPC. The Bill provides these agencies with extensive and necessary powers to enforce the Digital Services Act and the legislation underpinning it. However, powers under statute can only be executed where the agency has the appropriate capacity to do so. The media regulator, as the primary agency with oversight for the Digital Services Act, is still newly established. Its four commissioners were only appointed in March and it is still openly recruiting to fill formative roles.

The significant financial sanctions available under the Digital Services Act require that the media commission be empowered to issue fines of up to 6% of annual turnover of the service provider. Considering that the DPC can impose fines of up to 4% of annual turnover for infringements of the GDPR, 6% is significant. If the annual turnover of a very large online platform like Amazon is taken into account, there could, in theory, be in excess of €28 billion in fines issued were it to fail to comply with the relevant provisions of this legislation. The necessary processes, procedures and expertise to exercise such powers often proved difficult for the Data Protection Commission to develop, and still do to some extent, with investigations taking considerable time to complete. That is notwithstanding the objective fact that the Data Protection Commissioner in Ireland does a good job.

When the next very large online platform emerges and is designated as a gatekeeper by the EU, the platform will have four months to comply with obligations under the Digital Services Act, as will agencies in member states. Assisting the media commission with the enforcement of the provisions will be no easy task. This recently formed body is being given significant powers and responsibility to regulate all non-gatekeeper intermediaries operating in Ireland and assist in international regulation. Without robust establishment, the media commission may struggle to meet its brief in the manner of the DPC following the GDPR becoming law. The additional duties envisaged for the media commission will require significant staff resources. To further complicate this task, this Bill apportions partial responsibility for Digital Services Act oversight to the CCPC when it comes to online markets. This will give it similar powers and responsibilities with respect to Articles 30, 31 and 32 of the regulation, which relate to platforms where distance contracts for goods or services are facilitated – essentially online marketplaces. It is of concern to me that, in responding to the Department's open call for opinions, the CCPC made a detailed submission with respect to the Digital Markets Act and was silent on the Digital Services Act, perhaps indicating that the CCPC did not foresee a role for itself at the time. That the decision was made to split the enforcement responsibility for such a monumental piece of legislation quite frankly needs further interrogation. It requires more detailed explanation. I understand the explanation given by the Minister in his earlier contribution. I only hope that this Government and those that succeed it commit to properly staffing both of these agencies in undertaking this important work.

Given how desperately necessary the Digital Services Act and this Bill are in our increasingly digital world, I sincerely hope that splitting the competent authority across two pre-existing agencies will not become problematic. The stakes could not be any higher. We have to get this right. While I understand the urgency to pass this legislation, given that enforcement capabilities must be in place by February 2024, that timeline is not the fault of this House, the committee or the assigned agencies. We have a choice, which is to do this right.

I will work with the Minister. We may introduce amendments on Committee Stage, but all of us in opposition are anxious to get this right. We will work with the Government and we understand the importance of it. However, in our view, Government follow-through on resourcing the media commission and the CCPC must be a priority.

We know the reason for the EU Digital Services Act. We know the issues there have been, in particular with big tech, with regard to disinformation, bullying and content on which there was no checking in any way, shape or form. Beyond that, from the likes of Frances Haugen and others, we got into the ins and outs of the algorithms of companies like Facebook. You had an online tool that was fabulous from a communications point of view, but the difficulty was that it could be weaponised by state and non-state actors in organised and disorganised fashion. We all realise the journey had to be made into some form of online regulation, in particular where big tech is concerned. We all know the companies we are talking about. This morning on my local radio station, LMFM, somebody sent in a particular comment about migration. I think people can guess the general tone. I will paraphrase the comment from Michael Reade, which was something like that he did not understand why people present these things as absolute facts just because they saw them on some form of social media. We are in absolute support of making sure we get this right. Of course there will be questions, and amendments will be required. We have to do our absolute best to make sure we have something that is fit for purpose. We know the wider issues. We talk about Facebook, and we know the wider issue with the Rohingya. That is when you see something at its worst, where an online tool can be weaponised to carry out the worst of actions.

Coimisiún na Meán is designated as the digital services co-ordinator for Ireland. The functions and powers of the digital services co-ordinator will include the ability to impose fines and to co-operate across borders and with other digital services co-ordinators. The European Commission will issue specific certifications, and there will be an updating of the liability regime from the e-commerce directive of 2000 and provision for the content and handling of court orders relating to take-downs. We all know there are a huge number of question marks about this.

Deputy O'Reilly spoke about the difficulties there may be for certain SMEs and micro businesses that may not necessarily apply to big tech. That is something we must examine. It is all well and good having someone who is a vetted researcher but, at this point in time, for example, we are dealing with a huge level of bias across the European Union in regard to the disgraceful actions of Israel. On some level, social media companies are probably just following the line that is there from the West. Whatever we do as regards getting this regulation as fit for purpose as possible, unless we are willing to staff and fund Coimisiún na Meán and the CCPC in respect of their operation, none of that will really matter. There will have to be a huge deep dive to ensure we can provide what needs to be provided.

I welcome the opportunity to speak on the Digital Services Bill 2023. It is important legislation that will provide the full opportunity for Ireland to implement the EU regulation on a single market for digital services. That EU digital services regulation, more commonly referred to as the EU Digital Services Act, provides a regulatory framework to protect the fundamental rights of users throughout the Union in their engagement with digital services. It is important this legislation is juxtaposed and considered in reference to the Online Safety and Media Regulation Act 2022 and that the new functions and responsibilities of Coimisiún na Meán under the EU regulation are closely integrated with the commission's other responsibilities. That is necessary to ensure a coherent framework in Ireland for the regulation of online platforms.

This important legislation impacts on many lives. If we think back to the late 1990s, when the Internet first appeared in people's homes, we remember having to wait five or six minutes for the crackling dialling sound before, suddenly, we were online and connected to the world. We have moved on a lot since then. It is not a criticism of Ireland but of the world that we have waited two or three decades to introduce regulations to enforce requirements around what happens online. Unfortunately, people conduct themselves quite differently when they are hiding behind a screen, whether a laptop, desktop or mobile telephone device. Social media is a fabulous tool that has enabled us all to become far more connected as a society. It is lovely to connect with former schoolmates and college friends online. Previously, people would have to wait for their class reunion every ten years. Now we can see what people are doing and even what they are having for breakfast, which is somewhat absurd.

Along with all the benefits of that social connectivity, we have platforms like X, formerly Twitter, where it is a free-for-all and open shooting season. I would not dare to put up a picture of my family or post about anything nice happening in my personal life on that platform because the storm of abuse that comes these days is absolutely appalling. If the owner of X will not take responsibility, which certainly seems to be the case, it behoves governments like ours and those in other countries to have robust laws in place. We all in this House have been very much at the receiving end of abuse. Many young people who are considering a life in politics should scroll down through some of the social media channels to see what is posted there. It is often necessary to mute or block the whole lot.

As I said, this is important legislation that certainly will impact on people's lives. It is about playing our part, in step with other EU countries. Much more can be done globally to rein in some of the types of Wild West scenarios that still exist on social media.

The Social Democrats support this legislation. Before getting into the detail of it, I want to wind back the clock to April 2023 when the general scheme of the Bill was before the Oireachtas Joint Committee on Enterprise, Trade and Employment. It sets the tone for where I want to take my contribution. I am not a member of that committee and, as such, I did not have the benefit of assessing the merits of the Bill in pre-legislative scrutiny. The introduction of the pre-legislative stage is one of the better reforms we have had. It gives an opportunity to go into the detail of legislative provisions at an early stage.

A couple of issues are raising red flags for me. The first is that a waiver was sought in respect of the Bill. The other is that we are tight on time to get it over the line. As the Minister said, mid-February is the mandatory EU deadline. We absolutely should comply with that deadline. I say this in the context of the designations made by the European Commission in regard to some of the online platforms and search engines and how they relate to us here in Ireland. The EU has designated 19 of those entities as very large online search engines or very large online platforms. Thirteen of them are established in Ireland, including Google, Google Search, X, Facebook and so on. At issue is the scale of some of these platforms, the volume of information posted on them and the pace at which they can gather momentum. Things move now at social media speed and that can amplify harm because not addressing issues quickly can add to the problems.

Going back to the pre-legislative scrutiny of the Bill, I read the transcript of the proceedings of the joint committee and its resulting report. What jumps out at me is the dissatisfaction of the membership at being tasked with carrying out the scrutiny. I tend to agree with their points. The committee, in effect, was asked to review a Bill to provide for the appointment of a competent authority, namely, the media commission, that was under the remit of another Department, that is, the Department of Tourism, Culture, Arts, Gaeltacht, Sport and Media. No doubt the Minister will fall back on the same lines the officials gave during the course of the committee meeting, which is that it was his Department that negotiated the EU Digital Services Act originally. There is a real difficulty in this regard because we want Oireachtas committees to have a degree of expertise. It is only occasionally that a mismatch occurs but, when it happens, the result is that the committee does not work as effectively as it could. The net point is that the enterprise committee rightly stated it was not fully knowledgeable on the workings of the media commission and that the expertise in this matter lies with another committee.

Coimisiún na Meán will implement and enforce the EU Digital Services Act in Ireland. Can the Minister tell us today what additional resources have been provided to it for that purpose? I understand his Department made an allocation of €2.7 million to the commission in respect of its functions relating directly to the Digital Services Act and its role as the digital services co-ordinator. Does he believe that allocation will be sufficient? I am a member of the Committee of Public Accounts and one of the things we repeatedly see is that where there is a failure of regulation or whatever, when we start drilling into it, it is very often a resourcing issue rather than a legislative issue. Often, we are compliant in theory but not in practice. This is an area in which we must comply in practice.

The funding of €2.7 million was to provide for an additional commissioner and more staff for Coimisiún na Meán. Will the Minister update us in that regard? What is the headcount and the skill set as of today? We are two months out from the requirement to have the digital services co-ordinator role operational. Is that nailed down at this stage or will it happen after we enact this legislation? What degree of workforce planning has been done by the media commission in advance of the passing of the Bill? How will key performance indicators, KPIs, be set for the commission in the context of its work as digital services co-ordinator? As with any other regulatory body, we may not really get an insight into how it works until it is up and running, but we can predict some of the issues in advance by thinking the whole thing out and resourcing it properly.

As expected, there will be a requirement to have adequate in-house expertise within Coimisiún na Meán and the operational capacity to work through its caseload. That was highlighted by the joint committee in its pre-legislative scrutiny of the Bill. As I understand it, each member state will have a direct channel to report issues emerging on its territory and to request assistance from the competent digital services co-ordinator in the member state in which the online platform is established. We have a particularly large number of those platforms located here. Will Ireland be doing the heavy lifting when it comes to regulation? I assume that will be the case.

The Department’s record when it comes to defining the role of new agencies and resourcing them is not ideal. I give two examples, one of which frustrated me hugely. This was when we had engagement on the Corporate Enforcement Authority. I recall there was real difficulty in getting the allocation of gardaí to the agency. We were given guarantees as to when it would be up and running. The critical point was getting the allocation of resources and which Department would pay for it. It is often not a legislative failure but other things that become a problem.

We must also look at the capacity of the Competition and Consumer Protection Commission. Has it had the kind of resource allocation it will require? The Minister of State might go into some detail about when that will happen.

Getting on to the substance of the Bill, the media commission will be tasked with handling complaints about alleged infringements of the regulation. It will, quoting from the briefing note provided to us, "enter into binding commitment agreements with providers under which the provider agrees to take measures that appear to address an issue related to compliance by the provider with Articles of the DSA". Will the Minister of State define what a binding commitment agreement is? The commission will also "take measures that appear to address an issue related to compliance". In the context of the Act, that appears to be quite subjective or very loose. The media commission may issue compliance notices to providers that infringe on the articles of the Digital Services Act. Is there a timeframe for compliance to be met once an instruction has been issued? We do not want to see a situation where we are asking questions about something taking an inordinate length of time and a backlog forming. That goes back to the resourcing issue I mentioned.

The media commission may seek a High Court order to have an Internet service provider block access to an infringing provider's service. I am not clear on how that will work. A third party is drawn into the enforcement action, and it can be anticipated that such an order would be contested. I know the Law Society made a similar observation in its submission on the proposed Bill.

The media commission will also be responsible for "certification of independent out-of-court dispute settlement bodies to resolve issues between users and platforms". Is this just another layer of bureaucracy? If a settlement is arrived at with a legal representative present, surely that is binding. Does it need to go to court in every case? Is this something the media commission needs to concern itself with, and what advice will it have on handling it?

I want to touch on the point of algorithms. There is already an EU regulation requiring very large platforms to carry out risk assessments and to mitigate any risks identified where there is an issue of concern. These assessments are audited annually. However, I am not clear on who carries out the audits. Is it done at European level, where the lead rests when it comes to enforcement, or will it fall to the media commission? I presume it will be the responsibility of the media commission when it comes to the platforms located here. Please correct me if I am wrong on this.

The media commission will issue the status of trusted flagger to certain bodies, NGOs and others it sees as appropriate. Has the media commission nailed down the scheme for this at this stage or does it still need to be designed? Has any NGO approached the media commission declaring an interest in gaining this status? What measures are in place to ensure that a platform like Meta, for instance, does not drag its heels when a trusted flagger contacts it about content and that the flagger actually receives priority?

I take the point Deputy O'Reilly made regarding potential bias. She cited the example of Gaza, and it is a point very well made and a very good example. How will that be assessed? The regulator itself could be problematic, and there may well be different interpretations depending on the location and the particular country. We all have our biases. Having said that, regulating an organisation like X needs the might of a bloc like the EU to ensure a power balance, and I completely understand that.

As I said earlier, we all move at social media speed nowadays, and that can amplify damage if something is not addressed early. It used to be that somebody would send a letter which would take several days to arrive. People had time to read it, to construct a response and then the reply went back into the snail mail. People had time to think. This does not happen with social media. The speed of social media is a real issue in making sure this legislation will be effective in getting offensive material removed very quickly. This comes back to the platforms themselves. Obviously, the regulation will be very important, but the platforms should not have to be compelled to remove such material. Unfortunately, the very fact we are talking about this legislation means they do.

We are supporting this legislation. As it states, it is about ensuring a safe, predictable and trusted online environment for people who use social media and the Internet in general. We all recognise the huge advantages that have come from the advent of online platforms and media and how much information people now have at their fingertips. My mother, who is 91, often picks up the phone and looks on Google to find out about something she has seen on television. We are all quite amused that someone of her age can still use this technology to enhance her life. Many people do so. I recently spoke to a man who told me he regularly phones his son in Australia on WhatsApp. This man is also in his 90s. It has made connection around the world tremendously positive, and that needs to be said. However, there is also a very dark side to it, and that is why we need such advanced regulation.

The major issue is disinformation and the spreading of all kinds of untruths, lies and far-fetched stories. At the same time, if the stories are told often enough and enough people share them, they acquire a veneer of validity and people start to believe them and spread them. That is the issue we have. There needs to be a certain accountability on the part of the platforms, such as X, formerly Twitter, and Facebook, most of which are headquartered in Ireland and have a role to play here. Therefore, as legislators in this jurisdiction, there is an added onus on us to ensure we get this right.

We also have to look at the people spreading the disinformation. It is not just the platforms on which it is spread. Somebody had to sit at a keyboard and write down the disinformation for it to be spread. They also need to be held to account. Very often they can hide behind anonymous accounts. This legislation also needs to look at that, try to get behind that and try to hold to account the individuals who spread such often dangerous disinformation that can cause not just difficulty for people but can incite violence, hatred and fear. We are all aware of the short distance from fear to hatred. This legislation goes some way towards dealing with that.

We also need to look at the issue of fraud and the fact that so many people can be so easily duped out of their money while using these online platforms. We all use them to buy items online and yet we do not have the same assurance. We do not pick up the item and carry it out of the shop with us as we do if we go shopping somewhere else. We have to wait for it to be delivered, and when it does not show up, all of a sudden we discover we have been defrauded out of our money. That happens far too often. There is also the issue of access being gained to people's accounts. We need to have very trusted platforms to do all of this. Sometimes they can be mimicked and used in a very aggressive way, and many people can be caught out by them.

The legislation is valid but it needs a lot more work to strengthen it, make it better and ensure it delivers for people.

This is obviously important legislation for many reasons given the huge importance and influence of digital services and social media, and just how much our lives now are caught up with the digital world, information technology, online economic activity and all of that stuff. Particularly important in our thinking on this are the recent horrible events where a small group of malign far-right actors sought to exploit a terrible incident in the most cynical way for their own political purposes, and the consequences were pretty terrible, to put it mildly. That has highlighted in the public mind the importance of looking at these things.

Obviously, there is a balance that we have to get right because we want freedom of expression, we want open and robust political debate, and we want the right to disagree with each other, to put forward controversial views and to put forward dissenting and minority views that sometimes may not be terribly popular. We need to defend that too. At the same time, we need to prevent people inciting hatred and violence or peddling dangerous misinformation. That is a tough balance to get right, and the potential for abuse of the significant powers being given to regulatory authorities is something we need to think about.

For example, I would not trust Vladimir Putin with regulating digital online activity. I would not trust the Israeli Government with regulating online activity. In fact, it is very busy trying to suppress what I and many people might consider very valid and legitimate criticism and outrage at what Israel is doing to the Palestinians in Gaza at the moment. It has moved very significantly to prevent digital communication in Gaza and to prevent journalists and people in Gaza actually conveying the true horror of the situation. It is trying to suggest that anyone who in any way questions the massacre, as I would see it, that Israel is conducting in Gaza, is somehow an apologist for terrorism, for example. Would one trust the Israeli government with regulating online communication? I certainly would not. I would be very concerned that it would actually use it simply to censor all views that did not suit its narrative or did not legitimise its horrors.

On the one hand we want to prevent online incitement, dangerous or blatant misinformation or fraud that may occur online but on the other hand, we cannot always trust authorities to use the power they might have to regulate those things in a genuinely fair and objective way. That is a difficult balance to get right.

I want to say one thing that is related to the Bill. Given that so much of the power in these areas is now held by a relatively small number of very wealthy private entities who control the online world - whether it is Mr. Musk or a handful of other super-wealthy individuals who now own many of these online and social media platforms - one way of trying to balance things, which is long overdue, is to impose digital taxes on these companies that are making absolutely extraordinary profits. Why is that important? How does that help in what we are trying to do here?

This relates to the big debate we had about RTÉ. Whatever difficulties we may rightly have with the governance issues that emerged in RTÉ, the big difference with a public service broadcaster is that at some level it is accountable to us. Theoretically at least, it is accountable to us and we can influence how it gets the balance right about information. Considering its funding problems, if we did not try to assist in maintaining the public service broadcaster - I think it is imperative that we do - then the information sources available would essentially be reduced to these social media companies and a few media barons, the Rupert Murdochs and the Elon Musks of this world. I would not trust them as far as I could throw them to give fair and balanced information or to regulate online communication in a fair way. Mr. Musk has nailed his colours to the mast and they are certainly not views that I, or I suspect many people, would share.

The idea that these people control the flow of information for millions or possibly billions of people, or decide what is dangerous, what is incitement or what is true, is a real problem, whereas with public service broadcasting at least there is some level of accountability. The financial problems RTÉ has are to a fair degree related to the growth of these private social media corporations, digital information, digital news and so on. The two things, RTÉ's financial problems and the growth of this area, are connected.

One way to balance it as a matter of urgency is to impose digital taxes in order to fund public service broadcasting, albeit a public service broadcasting that may need reform. However, as an aside, I do not see why that reform should take the form of slashing the volume or quality of content, as may well happen, or slashing jobs, because then the consumer of public service broadcasting loses and the workers who did nothing wrong lose. There must be real fears that that is what is going on when we hear about different areas of public service broadcasting being significantly reduced. It seems the public and the workforce are being punished for the wrongs of people at the top. Ironically enough, many of those wrongs related to the fact that they were trying to behave like the private digital services sector.

Part of what needs to happen to balance out the dangers of what can happen on social media is to ensure that we continue to have the resources available to ensure that there are reliable sources of public service information online, on television, on radio, in print and so on.

It would be a tragedy if our only sources of information, digital or otherwise, were to become monopolised by a small group of wealthy individuals who time and time again show they have a very definite agenda. That is usually quite a self-serving agenda. Very often - in fact invariably - this is a partisan agenda. They have certain ideas of what is legitimate information, but that may not tally at all with any reasonable idea of what is actually objective and balanced, or with views that are shared by the majority of people, or with views that are simply scientifically and empirically valid. Let us be honest; we are almost getting into the area of philosophy in terms of how we assess what is empirically true or genuinely balanced. It is therefore a tricky area.

The regulatory bodies that are being discussed in this Bill must be genuinely accessible to the public. They must have the resources necessary to be genuinely responsive to the public. I am not by any means an expert, but I have learned about content moderators. I did not even know this group of people existed until some of them contacted me. It was interesting to hear their accounts of their job and the way they are treated. They moderate content for some of these fantastically wealthy corporations, looking at some of the most shocking and awful material in order to filter out the really sick, dangerous and horrible content so that it does not reach the rest of us. They have to engage with it and make decisions about it according to all sorts of criteria governing whether particular content should be seen. Although this work is done for very profitable companies, and we often imagine all their employees are paid very well, a lot of it is outsourced to people who suffer awful conditions, are treated very badly and are not particularly well paid. Some of them even report that they have post-traumatic stress from it because it is so awful. This is an area we need to think about. The workers who have to deal with this content and moderate it need our support so they can deal with it. They must be treated properly.

That is a slight aside, but critically, I refer to the statutory bodies or agencies. I understand this will be the first or second line in the Bill. They need to be properly resourced so they are genuinely responsive to public concerns and genuinely fair and balanced. We need a constant, ongoing review of how they adjudicate these things so that on the one hand they are doing their job, but on the other hand they are not infringing on important rights, such as freedom of speech, the right to dissent and the right to have different views. It is a matter of getting that balance right. That is always going to be a controversial area.

Sometimes it is the case that the contrarians who are in the minority turn out to be very important people. There can be a majority consensus where everybody is so certain that they are right about certain things. Then, it turns out down the line that the majority consensus was a load of nonsense and misinformation. Let us think about some of the economic orthodoxies that dominated the advice that was given to people in the run-up to the Celtic tiger collapse. It turned out that much of it was total misinformation about what was happening in the financial world. This was being peddled by very respectable sources. I am saying this is a sensitive area and there is always the danger of abuse. There must be considerable oversight and constant review of the bodies that are there to regulate, so that they are fair to those who are trying to post legitimate content while weeding out the true poison, the genuine incitement to violence, and so on.

I will raise another important issue. I do not know how one deals with this, but we need to think about it. I am referring to the ability of people to hide behind online personas. We cannot track these people down when they are inciting or doing horrible things online or are engaged in vile or dangerous activity. We must be able to track those people down so we know who they are. We must look at that area, because it seems as though all sorts of people can hide behind multiple fictional identities to spread propaganda, lies, misleading information or content that is vile, inciting, racist and so on.

It is a sensitive area. It is an important area and we have to get to the bottom of it. We need to make sure that the regulatory bodies operate in a transparent way, that they are genuinely open and accessible, that they are subject to review and oversight and that we get reports that detail what they are doing and what sort of issues are coming up. These must be properly scrutinised by Oireachtas committees and open to the public to review and question.

I am not sure if this Bill is the place for this, but I will go back to the point I made at the start. There is the question of how these companies are making staggering profits. They are incredibly powerful now. They are far too powerful as far as I am concerned. In my socialist world, I would nationalise the lot of them, because I think they are too powerful and the money they are making is extreme. In the absence of that, let us at least impose a digital tax on their profits and put the receipts into public service broadcasting - I am referring to a proper, reformed, decent public service broadcaster - and into other areas of culture and the arts which are massively underfunded. I do not want to be critical of people who, as we all do, sit and look at their phones. There is an addiction to the phone. You are looking at this, that and the other. Would we not all like it if more people were to go to plays, concerts and other forms of social interaction outside of social media? We should have the resources to make those things available and accessible to people so they do not have to rely on their phones for entertainment and information. We also need to think about that. Let us be honest: many of us think this is a plague on our lives. It is consuming our lives. You cannot do anything about that, but you can resource more healthy forms of human interaction and sources of information. You can make those things more accessible and put the funding resources into them. A lot of money could be made available for that if we were to impose a digital services tax on these big streaming companies and digital and IT companies.

This is an important Bill. I am not certain it will fulfil the needs of the day, or even those of future or past days. The fact of the matter remains that digital communication and all that goes with it is a huge boon to society, to business, to social cohesion and to bringing news to the multitudes.

It can also be turned into a lethal weapon. Unfortunately, that has shown its face already in many ways internationally, for example in elections. Various situations have arisen whereby a person can feed misinformation, false information or sheer pure hatred into the system before walking away from it anonymously and waiting for the result. That has already happened, as we know. It has happened in this country and it has happened throughout the world. It has been facilitated by governments for malicious and seditious purposes. It still goes on.

I remember speaking in this House when digitalisation became a reality. That is not an indication of how long I have been in this House but of the fact that it came about when a number of Members were in this House. I remember saying how fortunate it was for businesses and governments to have such a ready means of communication at their hands. For business and industry, instead of having to wait two or three weeks sometimes for a reply, they had a reply within two or three minutes or even less than that. Then I thought about what happened in Rwanda, where a particular merchant of hatred seized a radio station and used the position he was in to pump out hatred against selected people. It helped fuel a genocide in which hundreds of thousands of people were slaughtered. It was an appalling testimony to the consequences of making hatred readily available and to the vehicles which created the opportunity for hatred to be transmitted.

I remember speaking when Salman Rushdie published his famous book, The Satanic Verses. I remember being critical of it. I was told by many people that I should not do so because the author was speaking freely and there is a public right to expression. Of course that was not the case. The book was targeted at a religion. Some people are very sensitive about religion. We should know that in this country, given that we fought wars here for hundreds of years on the basis of religion. The fact of the matter is that it did bring a backlash, and he subsequently apologised for it. He said at the time that he only made £5 million on the basis of it. I have my doubts about that.

Deputy Boyd Barrett mentioned the wealthy people who operate these digital platforms. They do not have to be wealthy all of the time; others have been able to do it too. I agree that we have to find some ways and means of ensuring that misinformation, disinformation and very aggressive hatred are not pumped out into the system unimpeded. It is not good for society. It will have its downside very quickly. How do we do that? I think the technology is advanced enough to be able to say that what you propose to put up on the system is repugnant to the constitution which governs digital communications. I believe the legislation is there for that purpose. Otherwise we will remain in the situation we are already in, where virtually anything goes. The more outrageous it is, the more likely it is to get coverage, be transmitted further on, be added to, and be enjoyed in a perverted way by some. The quality and level of the content have got to be regulated in some way by some responsible international or national authority. There needs to be some control over what one can say. One cannot walk into a room full of people, insult everybody and threaten them, impinge upon their lifestyle and thinking, influence them to an extent that is above and beyond anything we have ever heard before, and then walk away from that with impunity. That brings society to its knees.

We had revolutions in the past. The French will always give great credit to their own revolution and the Russians likewise. The French will claim that their revolution was different. I had occasion once to tell a Frenchman that there were certain similarities. They murdered the royal family in both cases and they then set off on a tangent of legalised murder and destruction, which was no better or worse than what went before. Where do we find ourselves going in the heel of the hunt?

Back again we come to it. To my mind, it comes back to exercising some kind of control. I was listening to a programme on national radio recently. I spoke about it in the House at the time. A person expressed a wish to hate people and to tell people how much they hated them. What a crazy situation. What good does that do to society? It may well do something good for the people or the person who is promoting that idea but it does not do anything for society, for peaceful coexistence or for respect. We live in a world, unfortunately, where there is an increasing lack of respect for anybody else. We have to balance this with ensuring every person has the right to express their views, provided it does not bring about retaliation in such a way that the whole thing gets out of control. Maybe we need to do that sometimes in this House also, a Cheann Comhairle.

I am sorry for going on as I have. I assure the Ceann Comhairle that I could go on for a great deal longer on this subject.

I sometimes agree with Deputy Boyd Barrett. In this case, I remind him that it is not just the wealthy conglomerates which carry out these acts of sedition, of treason, and of undermining the political system by interfering in elections. This has already been done to a great extent. I am sorry that I cannot express myself adequately in the time available, but I will do so some other time.

Deputy Durkan is enjoying it.

The next slot is a three-minute Sinn Féin slot, but its Deputies are not present. Deputy McGuinness was late for his seven-minute slot. If Members are amenable, I will give him the three-minute Sinn Féin slot and I will proceed from there. Can we do that?

Yes, a Cheann Comhairle.

I thank the Ceann Comhairle and Members very much for that. I want to agree with Deputy Boyd Barrett. I believe these bigger companies should be taxed and the money should then be reinvested into the arts of one kind or another. Some way should be found to do that because what we are experiencing online, with the input of individuals or of bigger companies, is nothing but trash. One is in the sewer the minute one goes into some of those online platforms. We need to have regulation.

I see a very significant gap between those who are familiar with online business and those who are not properly educated, for want of a better term, on what can happen online. As a result of that gap, scams of all sorts can happen. Therefore, I would like to see included in this Bill some form of recognition that when a person's name has been used in a scam, he or she can report it to the online platforms and that those platforms are obliged to work to prevent it from happening again. I am thinking of older people at home who are contacted by a supposed service provider and coaxed into giving their financial details.

A lady in Kilkenny was codded out of €3,500. It happened in a split second and the money was gone. Her efforts to deal with the banks and with the company came to naught because they ignored her. We have to reach out to those people and ensure those who are involved in that type of transaction can be held to account. In fact, one can see from online purchases how easily one can be codded and lose one's money.

As such, there is a constant battle that must be fought with those bigger companies, but we must ensure there is not a digital divide that leaves behind those, older or maybe younger, who are simply not literate in all these activities. Some provision must be made to protect those vulnerable people, perhaps along the lines of investing in public broadcasting. The main source of information and news for many people is their local radio, and I would like to see whatever money is spent being spent through local radio or through public broadcasting. It should not favour RTÉ or some of the bigger broadcasters but should bring on those that make a direct input into people's lives through local radio and local papers, so as to ensure the message from this Digital Services Bill gets across to them and that they are given, through the Bill, easy access to some form of redress for the problems they have had. I encourage the Minister of State to look at that and think of the older people and those who are not literate in this area.

The Digital Services Bill supports Ireland's responsibility to comply with the EU Digital Services Act. That Act has a very high and lofty principle, which is that what is illegal offline should also be illegal online. It seems to be a complex enough Bill and there are a number of significant provisions within it, namely, the appointment of Coimisiún na Meán as the digital services co-ordinator and lead competent authority for the EU regulation, provision for supervisory and enforcement powers and, as a backup, the designation of the Competition and Consumer Protection Commission as a second competent authority with specific responsibility for online marketplaces under the EU regulation. Beyond that, there are a number of other provisions to do with intermediaries, service provision, trusted flagger status and procedures for dealing with complaints and dispute settlement bodies.

The European Commission states the Digital Services Act is intended to "create a safer digital space in which the fundamental rights of all users of digital services are protected". Those are high principles, but this is going to be very difficult. We on the enterprise committee did some pre-legislative scrutiny on this, and one of the first things we saw was the scope of the remit of this Bill. Ireland is going to have to put in place very significant regulatory oversight because some of the largest platforms are here. They have entities incorporated in Ireland and they come under Irish as well as EU law. We are seen as the first moderating authority to try to ensure they are acting as they are supposed to.

When we talk about the digital world, we are talking about a number of different streams of digital activity. There is online information. There is the digital marketplace, which we understand as all types of commerce, exchange and transactions. There is ostensible news and media presentation, though some of it is newsworthy and some of it is not. Then we have social media messaging. This is massive. We know from the number of users worldwide how many people are interacting online every minute of the day. This is the space in which we are now trying to provide regulation, and we must try to do our part in having oversight, led at European level. That is what we are signed up to, but it is going to be very difficult.

When it comes to managing the message, which is ultimately what we are talking about doing here in large part, that is being done at the moment by moderators engaged by the platforms. We have had moderators before the enterprise committee and they are not actually employees of the platforms but are generally subcontractors employed at a remove by the operating platforms. Their job, generally speaking, is to try to take down content that is seen to be malicious, subversive or just too traumatic to view. They are being injured as a result of this. This is one of the reasons they came in to talk to the enterprise committee. Some of them are suffering post-traumatic stress, if you can believe it, from looking constantly at unbelievably harrowing imagery and having to decide whether it should remain up. That is only growing because the amount of content being generated worldwide is also growing.

The ability of social media messaging and the Internet to be used as a communicative and a propaganda tool has been highlighted here already. We have seen in Dublin in recent weeks how people can stoke up a narrative and put graphics to content to make people believe something taking place is nefarious, which it may or may not be, and allow people to take part in it. This is not new as we have known about propaganda since the world wars, but we are now sending it interstellar and this is going to be very difficult to control. I predict, a Cheann Comhairle, that in future you will not be able to read anything on your mobile phone and trust it. You will end up having to go back to the likes of a Reuters or other established news media sites and discounting anything else you see because you cannot trust it. We have seen just recently how the new digital deepfakes have upended elections in other European states. They are coming here as well and last year I think we saw one or two of our significant politicians having their images used in that way. There is, therefore, a really difficult body of work here.

Another area worth highlighting is the issue of negativity bias. It is a trait human beings have. We are generally more inclined, biologically, to notice negative things in our environment than positive ones. It is probably part of our fight-or-flight or survival instinct. This has been known by the media companies for quite a long time, and it is the reason that when a person puts up negative content and positive content, the negative content is generally shared ten times more often than the positive. We all know that in life, but the problem is that it also means that if the operators of a platform wish to keep a large number of viewers and subscribers, their algorithms end up favouring negative posts, because that is what circulates. We have people complaining about their mental health and everything else, but I suggest most people turn off their bloody phones and stop reading what is on them. They might feel a lot better. They could go out for a walk for their own sake rather than listening to a lot of this content. We are seeing a lot of it and it is going to become far more commonplace. We can see it even with what is happening in Gaza and Ukraine at the moment. The situations are terrible and it is very important those images are circulated so people know, but there is much disinformation going on there as well.

It is important Ireland does its part in trying to get on top of this really difficult subject. The question here is going to be one of resourcing on the one hand, but the second aspect is going to be regulating all these online platforms. I agree with what Deputies Boyd Barrett and McGuinness said about coming up with a digital tax, because the costs of what we are trying to implement here are going to be horrendous. We must do what the EU is rightly doing in coming up with very significant fines for breaches of data law, and that must be implemented in this country, but we will still not be able to get at those who are based overseas and promulgating information from there. It is something we will have to look at. I will be supporting the Bill as it moves through the House.

Words of wisdom, Deputy Shanahan. Deputy Tóibín is next.

I thank the Ceann Comhairle. Words matter; they matter a lot. Words can cast light, they can educate, they can build up, they can get to the truth, they can hold people to account and they can inspire. Words can obviously do the opposite in many ways. They can be the vehicles of hate and they can be used for good and for bad. There is no doubt about it.

Aontú is a republican political party. We believe in a republic where every citizen is equal, where the colour of a person's skin is of no more significance than the colour of a person's eyes. In that republican society we want a place where everybody can live their lives without fear of incitement to violence or of discrimination. In a citizens' republic everybody also has equal rights to their views and equal rights to articulate their views. In a civilised society we need to do this respectfully. We need to raise our children with the ability to question and challenge in a manner that does not cause needless hurt. In truth, if we want to fight misinformation, we need to do that with information. If we want to fight hate, we need to tackle that with truth and decency.

If we want to remedy disinformation, we need to provide accuracy. The key issue that has been missed in this debate so far is that the whole idea of a liberal democracy is built on the principle that ideas can challenge each other fairly and that, in a respectful competition of ideas, the better ideas will percolate to the top and become the policy of a given society. The truth is that the history of censorship has never ended well. It has been used throughout history as a tool of dictators who have stained the earth with the blood of millions of people. Censorship is authoritarian. It deletes the liberty of citizens to have that competition of ideas and reduces the ability of people to challenge and test the prevailing ideology. Censorship not only leads to the erasing of citizens' rights, it also guarantees that the best solutions for societies will not be achieved. It means that societies can radically and speedily swing in whatever direction the latest ideological fashion is blowing at a certain time.

It is also important to say that censorship is an ingredient of conspiracy because the more censorship that happens within a society, the less society believes the authorities of the State, whether the Government or other sources of information. There is a direct correlation between more censorship and more conspiracy.

Censorship happens in different ways. It can happen in a legal fashion. It can have a chilling effect by stopping people talking about certain things. It can also happen when media organisations curate information on the basis of what they think adults in society can handle. We have seen examples in recent times where newspapers have not given certain information because they think people in this Republic are not able to handle, decipher and use that information properly.

In the past, censorship has been a tool of the political right. It has been used to delete democratic rights. In the past, the left has been the sector of society that has fought censorship and has been the voice of ordinary men and women across the country. However, in many ways, that is no longer the case. What we are seeing, unfortunately, is a move towards a more authoritarian-type instinct among left-leaning political parties. Ireland is now, unfortunately, becoming known internationally as a country that seeks to delete the right of freedom of speech and other democratic rights. The hate speech Bill has become notorious internationally because it seeks to delete people's rights to interact on issues that are important in a democracy. Two of its major elements, the definitions of "hate" and "gender ideology", are very weak. They are self-referential. If we were to ask a kid, "What is a banana?" and the kid says, "A banana is a banana", we would probably accept it. However, when we ask the Minister for Justice, who wants to jail people for what they say, "What is hate?" and she says, "Well, hate is hate", we have a problem. That is why the world is taking note of the difficulty that is arising now in Ireland in that respect.

The Government recently introduced legislation to stop peaceful protest outside hospitals and doctors' surgeries where abortions are being carried out. It was interesting that it was the week before the violent riots in Dublin that this Government banned peaceful protest outside hospitals. This is where the Government has, in many ways, become so distracted by the culture wars that it has forgotten about the bread-and-butter crime and antisocial behaviour that are wreaking havoc on our society. I would caution the Government as to how it regulates large international companies, social media organisations or digital information. I do not believe that people have the right to incite violence, and that absolutely should be a criminal offence. We do, however, need the ability to communicate.

I have a political interest in issues such as those relating to section 31. It is interesting that Sinn Féin voted for the hate speech Bill in this Chamber but voted against it in the Seanad. That was another example of the Sinn Féin flip-flop shop that has recently become prominent in politics. Dissent is a key and important element of democracy. It is a bulwark against intellectual fashions that often take hold in democracies. We need to ensure people have the right to dissent.

Large tech companies have enormous power. They absolutely have too much power. However, history is also littered with governments that have had too much power in terms of information. The best way to tackle large international tech companies is to address their dominance within society. This is an issue on which the European Union and this Government have historically been reticent. They have not tackled the dominance of these organisations. No organisation, including media and tech companies, should have market dominance. There needs to be perfect competition within all these markets so there is a diversity of views and no particular view holds sway.

I want to speak to another aspect of the Bill, which relates to the issue of video-sharing platforms. This is a key issue on which I have real problems with the Government's approach. There is a situation in this country whereby young children aged eight, nine and ten are consuming hard-core and violent pornography on a daily basis. Young boys are growing up on a diet of violent hard-core pornography. We then ask ourselves why there has been a doubling of sexual assaults and rapes, and a tripling of domestic violence, in the past ten years. The fact is that the ingredients we are giving to young boys, in particular, for them to understand relationships and sexuality are leading directly to the level of sexual violence that happens in our society at the moment. The Government is wise enough to seek to ban the advertising of junk food to children because it knows that when children see adverts for junk food, they eat it. The Government is, however, refusing to do anything about stopping young children consuming hard-core pornography in this country. A mother told me recently that she picked up a laptop and looked at the two most recent searches made by her ten-year-old child. One was a search for Santa Claus and the other search was for oral sex. That latter search yielded an explicit video. The juxtaposition of those two elements within a young child's brain is incredible, but it is a reflection of the lack of action by this Government. I would like the Minister of State to take particular heed of these points, which I have made over and over again. The Government is still shying away from seeing the link between these matters and the fact that we are living in a more violent society.

Other countries have taken such steps. France has introduced legislation to ban children from consuming hard-core pornography. It told tech companies that if they want this material available on their websites, they need to guarantee they know the age of the person consuming it. It has even provided a state digital certificate to ensure it can be done. Companies that refuse to comply have their websites taken down. We in Aontú produced a Bill which looks to do the same thing. The mad thing is that this country will do it for money. A film studio can demand that the courts take down copyright material from the Internet in this country because it obviously damages the studio's income flow. Judges do that on a regular basis. They go to the Internet service providers, the Eirs and the Vodafones, and demand that those websites be taken down. Nobody is demanding that we take down the damaging material that young children in Ireland are consuming on a daily basis. I raised this matter with the Minister, who told me to wait and said the Government would deal with the issue in the Digital Services Bill. However, the Bill only looks at video-sharing platforms that are based in Ireland, which is a tiny element of the Internet. The Government's complete plan to try to control harmful material is to deal with approximately 5% or 6% of it.

We hear talk in this Chamber on a regular basis when young women are murdered in horrendous circumstances. I refer, for example, to the case of Ana Kriégel, who was viciously sexually assaulted and murdered by two young boys. All of the talk and expressions of anguish and anger that happen in such cases do not lead to one effort from the Government to solve the crisis. Why can the Government not join the dots? How long is it going to take for the Government to join the dots between the material that young boys, in particular, are consuming and the level of sexual violence that is happening in this country? It is an absolute disgrace that this Bill is being brought through without any real effort to tackle that issue. The CARI Foundation deals with children at risk in Ireland. It has stated there has been a 44% increase in the level of sexual assaults by children on children.

Not only are we making victims of children, we are also making some of them perpetrators of sexual violence against other children. The Government states that it is up to parents to deal with this. It is up to parents to stop children drinking vodka at the age of ten but we also have a law to say that off-licences should not sell vodka to children of that age. Why are Ministers sitting on their hands when it comes to dealing with this issue? It is so frustrating that over and over again the Government is refusing to take any opportunities that come its way to deal with this issue.

The last time I spoke to the Minister with responsibility for media in respect of this matter, she said that the European Union would not allow us to do as I have suggested. I gave her the example of the French Government doing it. The link is clear. The British ombudsman for children has done extensive research in this area and I would urge the Minister of State to read its report which shows a direct link between the material that the Irish Government allows to be consumed by ten-, 11- and 12-year-old boys and the level of sexual violence in society. In previous decades, we saw sexual assaults and sexual abuse in institutions. The sad thing is that many people knew about it at the time. It is happening today, not necessarily in institutions but in families and locations around our country. Everybody knows about it but the Government is still doing nothing. I implore the Minister of State to take this issue seriously, to do the research on it and to make sure the Government comes up with an appropriate response.

I am pleased to have the opportunity to speak on this Bill. This is one of the regulatory challenges of our generation. We are starting on a crucial but difficult journey of coming to terms with the impact of the digital world on the way we live our lives. This really is different from anything we are used to as legislators. Platforms have gathered a massive range of data on every one of us; they know how to amplify messages and how to find the hooks that bring influence to bear on us. Of course it is true, as the Digital Services Bill states, that this has transformed the marketplace - that is, if one looks at it purely as a marketplace. The potential for undermining competition rules and consumer protection, as well as the potential for legitimising illegal products and distorting the information that people are presented with, can completely upturn the way in which markets work. In that context, this legislation is really crucial. However, it comes very much from a market regulation perspective and, while that is really important, it is not complete. We live in a political environment which is much broader than the marketplace, and the tools that come from thinking about the digital world as a marketplace may be inadequate to really confront the scale of the challenge presented. Not only have huge amounts of information about us been collected, there is a complete asymmetry between the pace at which initial false information can spread and the opportunity to put the genie back into the bottle once it has spread so widely. There is no comparison between the two.

Equally, there is a complete mismatch between the capacity of regulators, however well intentioned, and the inventiveness of the companies that are creating these algorithms which drive the choices we make. Now we have moved on to a situation where this is recursive. It is not humans who are designing the algorithms but artificial intelligence, which is remodelling the algorithms so often that no one actually knows the principles being applied, because they have evolved beyond what was intended by the individuals who designed them. This creates a real challenge. I take my hat off to the European Commission, the European Parliament and the European institutions for coming up with a lot of innovation in this area. There is a lot there to be welcomed, including the obligation on particularly large platforms to conduct risk assessments and, where they identify risks, to put a mitigation strategy in place. There is also an obligation to conduct an audit, and that matters because a lot of these particularly big platforms have huge international reputations to protect, so putting that obligation on them can be important. The idea of having trusted flaggers is really worthwhile. It means there are groups that will be recognised and that will have standing in challenging material that goes up. The penalties are substantial, and at 6% of turnover are quite eye-watering when it comes to some of these companies. The legislation is innovative as well in looking for transparency about the origin of what is put up. Bringing in a crisis management mechanism and banning the profiling of minors are also really progressive changes.

As politicians, when we look at social media we need to look beyond the marketplace. The hope originally was that social media would create a fantastic, democratic forum where everyone would be listening to one another, there would be a great exchange of views and wisdom would emerge from it. Instead, the debate has lapsed into bubbles that are often just listening to their own points of view and which become increasingly intolerant of those who hold different views. We need to protect the values of democracy that are so important to our institutions, including freedom of expression but also an understanding of the balance between the rights and the duties that go with it. There is no doubt that when we look at the autocratic despots with whom democracy, as we know it, competes, we see that they have no compunction and no sense of obligation to protect the balance between freedom of expression and privacy, among other things. They see this not only as a tool for the control of their own people but also for the manipulation of those in democratic states like our own who seek to maintain the open and free societies we have built.

I really welcome this Bill but believe we have a good deal further to go in understanding how the explosion of digital communication and the growth of artificial intelligence will change the way we work. The danger, as always, is that regulators will arrive breathless and late, as they say. We need to take stock of how these platforms are impacting on our society. Recently we saw that many of the political protections we would like to have, such as the registration of political parties and the control of their expenditure, are completely blown asunder when there is influence coming from outside which is regulated nowhere and which can use the media to have a huge impact on the opinions of people who are open to being manipulated.

I congratulate the Minister on the work done so far. There is a lot that is good here but we will have to substantially strengthen our capacity to understand and legislate in this area in the coming decades.

This is an important question, particularly for Coimisiún na Meán, which is only now properly coming into existence, and we are now making sure an additional burden is placed on it. I believe Coimisiún na Meán understood this was coming, as it has been under discussion for years. The commission was ready for most of it. The commission is already working in this sector, but we are giving it additional rules regarding doing this kind of work in relation to other European countries and saying it must assist with any investigation from any other member state. Those investigations are vital. I listened to some of the Deputies before me speaking about matters relating to hatred, violence, sexual content directed at children, or the abuse of children, and it is clear that some of the services are not operating as they should in those cases.

That has been clear to us here for some time from the way some of the services deal with the false facts and false messages being put up, some of them from the right wing, inciting people to hatred, violence, racism and the like.

I looked at the penalties laid down here. I do not see anywhere that the European Union has the right or the power, in this case, to shut down any of these services entirely. There is a penalty whereby a fine of up to 6% of their total turnover can be imposed. That penalty could be applied to them. We have to be realistic that this cannot really be done properly. These things operate worldwide at this stage. If one of these platforms is inciting people, or if those behind it are not willing to move to protect the public against that kind of incitement, we should be able to look at how we can put a complete stop to them where the platform and those in charge of it are not willing to protect democracy, or protect one community as against another.

There are many other matters relating to the question of these digital platforms and how we can provide protection. Members spoke earlier about protecting the copyright of actors, writers and the like. We can come back to that question in the future when we are discussing this matter at a further level.

While we can all support some of the principal objectives of the Bill as regards compliance around online safety, I cannot support the legislation at this time. It is a legislative pillow designed to suffocate free speech until it stops moving. There is a profound democratic deficit at the heart of this Bill that is deeply alarming. Indeed, I am genuinely concerned that we are embedding a new regulatory regime that will simply replace one series of evils with another. In fact, there is now an overwhelming sense among all but this Government and its cheerleaders in the anti-free speech space that Ireland is adopting borderline authoritarian measures to address what are undoubtedly very real problems.

I will highlight one concern about section 37 and the designation of trusted flagger status. Apparently, trusted flaggers are those organisations or persons who have demonstrated expertise and competence in recognising and reporting harmful content. No one will be surprised that this amounts in practice to organisations that are in perfect lockstep with the Government's preferred narrative on a range of contentious issues being given significant power to close down debate. Of course, the other problem is that much like the absurd and dangerous hate speech Bill, there is no agreed consensus or definition of what constitutes harm, beyond the obvious things, such as child pornography, etc. We are entering dangerous territory and everyone can see it except the Government.

Social media can be used for good and for evil. I see social media being used, for example, to campaign for the World Rally Championship, WRC, to come to Limerick. We can put that on social media, it reaches a couple of hundred thousand people and they can comment on different things. In that sense social media is a good thing, because we are using it to try to get the WRC here. Consider then the time I brought the truck to the Dáil when we were looking at fuel prices, where people had transport networks-----

You should not be reminding us about that.

I did it according to the law of the House. There were no markings on the truck and it was my only means of transport on the day, so I followed the rules of the House. It did not break any rules, but it showed some of the TDs in Government parties, who did not realise it, just how much tax was being taken on fuel. Deputies were voting on things in the House that they did not understand. Social media taught them about that. The other thing social media can do is hold the Government to account. Members of the Government go on radio and television and say one thing, but they then come into the House and vote the exact opposite. They vote against their counties. Social media can get word out to the people and tell them what the Government is doing. That is what it can do. It can hold the Government to account.

To look at the other side of it, it can be seen that people are using social media for the wrong reasons. I will use a couple of examples. If somebody has a road accident and is fatally injured, and an individual takes a picture of that and puts it up online without the family even knowing about it, that person should be held 100% accountable, as should people who put pornography online. People who caused the issues in Dublin, including the rioting that went on, the criminal damage caused to property and to businesses and the harm that was done to children, should 100% be held to account. Those who use social media for violence should 100% be held to account.

However, there is a lot of good out there that social media does. If it is looked at from the point of view of Independents getting onto the airwaves and on television, we do not get that. Part of the budget pays for RTÉ so favouritism comes to the likes of Government TDs when it comes to getting on air. In addition, the Government funds other social media outlets. When it is looked at from the point of view of people getting their voices out there, social media holds the Government to account because it means we can get out the message and show people in this country what is really happening.

Hate speech legislation is now coming through the House. The Government is trying to stop anyone who wants a democratic debate with it from being on a social media network. This is some of the stuff it is trying to stop. That is not democratic and is not fair on anyone who wants to hear the real stories that are going on. There is an awful lot of fraudulent stuff going on, and a lot that goes on social media that is not even accurate. It should be 100% cut out, but people in this country should be entitled to see the good stuff, the promotional stuff and content about what happens. Sometimes, you could make a film in here because some of the stuff the Government comes up with could be laughable, if it were put out on social media.

We want proper regulations to be put in place for people who misuse media for hatred and violence, who use it against children, and who directly target people out of a hatred that causes violence. That is what I am fully intent on. Those people should be held to account. People who are sharing such content on social media should be held to account. However, there are also many people and young children who do not understand the content of something they share, and do not realise the bullying that goes on among them, until it hits home and somebody suffers because of it. Again, that comes back to a policy whereby media access can be limited for people who need to be educated about it first. Some children have phones at the age of five or six and can enter all sorts of different social media networks. That is wrong. We need to make sure that people are educated and that, for those who do something and make a mistake, the mistake can be rectified.

That has to be looked at too. From the point of view of media, we need to look after the people who have good stuff and we must look after the people who have proper democratic debates and highlight the things that are wrong. I am 100% against anyone outside of that who has come along and used it for the wrong reasons.

Consider, for example, the people who were outside the US ambassador's house last night and the posts they put up on social media about what they were doing out there. It is okay to protest once it is a peaceful protest and does not harm anything, rather than what we saw in Dublin last week, which was horrible. People's lives and properties were put at risk. It was all instigated by a small minority of people. Through social media they were able to gather very fast. We need to target the likes of that. If we break down that process, when we see something like it, we must target it and make sure it is dealt with. Otherwise, we are not only putting people at risk, we are also putting gardaí and front-line workers at risk. Why not concentrate on something like that and get that right first regarding those people who create harm and hate? We must concentrate on that first and make sure we cover that.

The Government has it wrong on the Criminal Justice (Incitement to Violence or Hatred and Hate Offences) Bill. It needs to make sure there is freedom of speech up to a certain amount, but when it is about organising crime, then we need to make sure a provision is built in properly for the protection of children and people going forward.

There are parts of the Digital Services Bill I would support but I will not be supporting the main part of it. It has serious consequences for many people. We have talked about different areas, from hate speech to freedom of debate in this Dáil and throughout the country. It is hugely important. There is a huge difference between freedom of debate and people inciting hatred. Sadly, in this country, quite a lot of people have got away with inciting hatred and this has led to serious consequences. That has happened through media outlets, no matter where they came from, but it can be tackled.

During the rioting in Dublin two weeks ago, sadly, gardaí were treated scandalously by certain individuals, who kicked them and held cameras up to their faces as if to say, "We can do what we like to you but you cannot touch us." That is a sad society and a road we should never have travelled. Video images came from a phone. They were put up on some social media platform, and surely to God there is already a law of the land whereby we can clamp down on that nonsensical carry-on and punish the people who do that. It all comes down to the sentencing of these criminals and the people who carry out these acts. The sentences are too light. If these people were caught and dealt with properly, I can assure the Minister of State that we would have a very different country.

When talking of social media and freedom of speech, politicians have a lot to answer for. It does not matter if a person is accused of being far right or far left, politicians are politicians regardless. They are public servants and, as we saw last night, some of them are deciding that, if somebody goes to meet an ambassador, it is a bad thing to do, even though the same ambassador is representing a country that provides tens of thousands of jobs in this country. I know myself from the case of Apple and several others in Cork that this ambassador helps to create tens of thousands of jobs through tourism when people from that country come to visit our country. Then a TD decides, in his wisdom, that he can name and take pictures of people going in and out and put them up on his social media feed. It is scandalous that he does not put up any pictures of what he was doing to a Minister a few years ago. Has he forgotten that? Ireland has not forgotten that. I have not forgotten it. People sent me images of it today just as a reminder. I am not going to waste my time on that kind of nonsense and carry-on. It just tells us something. After the violence we have had, this is absolutely scandalous. There is also a Labour Party TD who actually has incitement to hatred on his social media feed, stating it was the Rural Independent Group Members who were the cause of the burning down of Dublin city that day. It was an astonishing statement for any TD to make on his or her social media feed. The Garda should intervene there. We should not be intervening. It is the Garda that should intervene in issues like this when immature politicians carry out all sorts of crazy things.

We all use social media. It is our only means of getting our message out there, but it might be a general attack about a certain topic on agriculture, for example. We do not personalise it. It is not personal finger-pointing and it is not putting up something on social media to incite hatred. This, however, is what we have. These politicians will come in and support this Bill, of course, but perhaps when they go outside the door they should look at themselves and at the scandalous way they carry on. This is not being far left or far right. It is just irresponsible, immature politicians who do not know how to carry out their job. That is basically what we are talking about here.

I hope that in the Digital Services Bill we would look at areas like local radio stations, for example. They do give people an excellent opportunity to speak. Radio stations such as C103 FM, Red FM, and 96 FM in Cork, with Patricia Messinger, JohnPaul McNamara, John Greene, and Neil Prendeville. We are lucky to have them because they give a balanced debate. You do not always get what you want to hear but there is a balanced debate and an opportunity for people to speak. We need to continue that. My worry is that if we insert the words "hate speech" and if we bring forward legislation in the House on hate speech, we are certainly going down a very dangerous road. People's rights to freedom of speech for genuine reasons will be denied to them.

I am aware there are a lot of issues with young people using mobile phones. We all have concerns about that. I urge parents to get those young people's heads out of the phones and get them walking the streets in some safety. That is not there at present.

I always accept the bona fides of the Minister of State, Deputy Calleary, and I have worked with him over the years. I do not accept the bona fides of the Government that is producing this kind of legislation, although it is necessary to control harmful content, especially for our daoine óga. I have ten wonderful grandchildren, all under the age of nine. The Minister of State has young children himself. The young people need to be protected and ordinary people need to be protected as well. Things are running amok on social media. We see now the hard left supporting legislation like this. There are none so blind as those who will not see at all.

Last night we were invited to the residence of the ambassador of the United States of America. I duly went along on behalf of my constituents because we have 8,000 direct investment jobs in Clonmel alone, and 5,000 or 6,000 more in Dungarvan trasna an bhóthair in Port Láirge. Elected Members of this House were prancing around there with communist flags on them, roaring and shouting abuse at us going in there, and preventing people from getting access to the ambassador's residence. It was a sad day when the American ambassador's residence was blocked off and people could not get access to it last night. It was a sad day for democracy. These people claim to be democrats themselves. They do not know what the word means. They are nothing but bullies, thugs and vagabonds of the worst type. They are in this House every day catcalling and name-calling me, my colleagues and others here. It is bullying. They try to bully and intimidate us. They try to dictate what goes on in this House.

There are cabals of NGOs running in front of the Government, so to speak, and I am sure there are plenty running around in front of the Minister of State too. Deputy Nolan found out there are 30,000 NGOs costing nearly €6 billion a year. They are writing legislation and churning it out, and the Government is saying, "Yes, write some more, we will do that for you now, and we will do the hate speech next", just to shut up all these people. We are going back to an autocratic State. Our democracy was hard fought for by the men of 1916 and the men of 1922 and 1923. It was hard fought for. We have come a long way now with legislation like this that is supposed to be doing one thing but is really sneakily and cleverly crafted at the behest of the NGOs out there. We are sick of seeing them morning, noon and night on television and every place else attacking the church. In this context of NGOs, we must remember there are sisters, brothers and lay people who go all over the world looking after people in famines and so on. Now we have to involve these NGOs with CEOs and deputy directors and all these spokespersons for NGOs. We also have brilliant doctors and nurses who go out and do such work. These NGOs, however, are just like a bloody cancer with regard to the money, and they are taking the money away from the people who need it. There were 3,000 people queuing for food from 4 a.m. today at the Capuchin centre in this city.

Three thousand people were queuing on a cold night like last night, and all that bothers the people in here and the Government is how to silence the good Christian people of this country and destroy their religion, culture, heritage and dúchas. They will not. I have that message. There are silent people out there who want to practise as Christians and to practise their culture, faith, language, rince agus gach rud mar sin. They will continue to do that in spite of all this legislation.

The hate speech Bill is so dangerous and it has been found out. I salute Senator McDowell in the other House for picking so many holes in it. It was rushed through here with indecent haste. Sinn Féin and the left are voting for this legislation. Sinn Féin was for years banned from radio and television under the Offences Against the State Act. Now it is going along with this. It has gone so far in with the establishment parties, the Government and the globalists, serving the EU masters, that it has forgotten its values and how it was denied access to radio. Now we want to put a muzzle on Members and on organisations.

Legislation was brought in some weeks ago to stop people praying outside a health centre. If the pro-life rally I gladly attend every year walked down O'Connell Street, it could fall foul of that legislation because we would be within 100 m of some health facilities. The twisted thinking and logic behind that will destroy people's lives. We have this so-called neoliberal great society, and yet we had 3,000 people waiting for food last night from the Capuchins. We have 13,000 people homeless, including 3,500 children.

Government Members should hang their heads in shame because they keep bringing in legislation and obeying the NGOs. Citizens’ assembly after citizens' assembly is formed to bring in legislation to thwart the democratic right of the people who elected us to this House to carry out legislation. I will not support any part of this legislation because it is not doing what it says on the tin. It is a cover for the Government to silence people.

I call Deputy McNamara. He has 20 minutes.

I certainly will not utilise that. If Deputy Wynne needs additional time, I am happy to share.

I have reservations about some aspects of the Bill. How Covid played out in the media was an enlightening and salutary experience for me. It was a strange thing to see what happened to medical doctors and people who head research centres in universities. I am thinking in particular of Professor Carl Heneghan, a British academic. Some of what he said was being flagged on Facebook as unorthodox, untrusted and not to be accepted because somebody somewhere determined it was not in accordance with the prevailing orthodoxy. It may well be that what he had to say did not accord with the orthodoxy but that tendency is frightening. I wonder how much expertise the person who determined his posts were somehow dangerous had in evidence-based medicine, which was Professor Heneghan’s field.

This labelling of stuff made me uncomfortable. Surely everything you hear or read is to be taken with a grain of salt. Even among medical professionals, there is a reason people sometimes get a second opinion. It is not to say the person who gave the first opinion lacked bona fides but opinions differ. Is it not the nature of life and discourse that opinions differ? We seem to be obsessed with the idea that there is one truth we all have to share and that, if we do not share it, society will fall apart, so we need to banish those who do not share our views and have improper thoughts. It goes back to a darker time in Irish and world history. It seems to me that is where we are going.

There were fact-checkers. One outlet in Ireland was obsessed with fact-checking and setting the fact-checkers on people. It was a media outlet supported almost exclusively by Government funding at the time. A Government official could contact it. If somebody disputed anything a Government official said, the outlet could be contacted and would do its fact-checking. It became a real witch-hunt. Witch-hunts are fine. Everyone is free to become a witch if they want to, and we all have our views on hunting, but when it is State-sponsored and carried out by an outlet paid by the State to enforce a message the State wants people to accept without question, then it becomes a source of worry, to me at least.

That is a worry I have about this Bill. There is a vetting of researchers provision which seems a strange delegation of powers. There is an accredited fact-checkers provision, also in section 37. Are we now to have vigilante groups policing the Internet and accorded a special badge by Coimisiún na Meán? That is a dystopian way of looking at it but I fear that is the road we are going down. It started during Covid with flagging content, including that of eminent professors who did not fully accept what other eminent professors or, more important, Ministers for health were saying. I am not talking about an Irish context; this is beyond Ireland and in a much bigger context. Those people were suddenly pushed to the margins and distrusted. That is worrying.

Equally, at the start of the Ukraine war there was one narrative only and you had to buy into it. It now looks, from what the Government is doing, as though we could not afford that narrative. We will differentiate between those who fled Ukraine and gained temporary protection at one time and those who come now and have temporary protection. It seems rather odd, philosophically, but I am sure the Government will explain it in due course. Close to the start of the Ukraine war, the European Commission, led by a former German Minister for defence, no less, announced certain outlets were to be banned. RT, formerly Russia Today, was banned. It did not affect my life. I have never gone to Russia Today to look for the truth on anything like that. I may have flicked through television channels in a hotel and seen Russia Today in that context but I would not go there to discern the truth. I am not sure I would go to any particular media outlet to discern the truth. You inform yourself from a variety of sources and make up your mind, do you not? Russia Today being banned by a former German defence Minister made me uncomfortable. A society in which a German defence Minister decides what I can and cannot view on my television is not one in which I want to live. I am not saying I want to look at Russia Today. I could not care less whether it is there or gone, but the fact that she is telling me what I can and cannot look at is worrying. It is moving us in a particular direction, which is further from a plurality of views and opinions and towards the idea of one universal truth we must all accept, marching together towards this common nirvana that awaits us all.

That is worrying, so I would have a lot of reservations about this Bill. I accept that it is based on EU legislation, but the fact that it is based on EU legislation does not give me much consolation, given the make-up of the current Commission, its political impetus, and the complete lack of transparency in some of the decision-making processes there. Then there is the fact that its President was so willing to essentially hide communications with various companies from people who wanted to investigate them, when there was a legitimate issue to determine.

Finally, and it is a lot closer to home, there was this schemozzle between the Department of Justice and X. Which of them do we believe? The Minister came out and announced something and X said that it was untrue. Somebody somewhere made a mistake. It may be down to human error but is the Minister for Justice to be flagged as somebody whose pronouncements are not to be trusted on social media? I do not think she should be, for what it is worth. Nevertheless, she has not withdrawn the accusation that she made. There is a suggestion somewhere that some people have preferred routes to contacting social media companies. A former local politician in Ireland announced on television that she had got on to someone she knew in Twitter and they had dealt with something for her. Again, are we to have a preferred grouping of people who police what is said on behalf of the rest of us? These are all potentially dark avenues we can go down.

We need to be very careful about this Bill. It is, fundamentally, about censorship, and of course censorship is fine when all we are saying is that you cannot say something that is untrue. However, saying that the Earth orbited the Sun was a very dangerous thing to say at one point, and we should not ever go back to a time when people are penalised or distrusted for saying things like that, or anything, arguably.

In case this is the final time for me to contribute this year, I wish all colleagues in the House, the Ceann Comhairle, or the Cathaoirleach Gníomhach in his absence, and anyone watching at home a very happy and safe Christmas, and best wishes for the coming year.

I welcome the establishment of Coimisiún na Meán, and I note this Bill, which proposes to designate the commission as Ireland’s digital services regulator. This House is probably suffering fatigue from talking endlessly about our national broadcaster in recent months, and rightly so. However, I would like to take the opportunity this afternoon to talk about the fantastic radio offerings in my constituency of Clare and the challenges facing them.

There are three radio stations in Clare: our independent radio station, Clare FM, and two community radio stations, Raidió Corca Baiscinn and Scariff Bay Community Radio, one serving east Clare and the other serving west Clare. These exemplary stations provide a fantastic public service and a familiar background noise to homes, farms, businesses, and car journeys the length and breadth of County Clare.

I recently spoke with Ronan McManamy, the chief executive of Clare FM, and we discussed the various benefits of Clare FM, which was recently crowned the IMRO local radio station of the year, and I hope the Minister of State will join me in warmly congratulating it. Clare FM is listened to by 50% of the people of Clare each week, and thanks to a ring-fenced round of sound and vision scheme funding that colleagues and I across the House advocated for, it will be featuring new programming in the new year on a weekly basis around Clare women in business, minority sports in the county, and the thriving arts scene in the Banner County. It is money well spent, if you ask me.

In our conversation, Mr. McManamy conveyed to me that independent broadcasters feel their commercial model has been severely disadvantaged by the free-for-all enjoyed by digital operators, from both a content and an advertising perspective. Digital now commands over half of the advertising spend in Ireland, and this is greatly assisted by those operators working well outside of the tight regulation under which FM stations operate. The bailout provided to RTÉ and the reduction in VAT for newspapers have unbalanced the scales. While I welcome the funding recently announced in this House for coverage of local and European elections, Clare FM and IBI are concerned that the schemes will not be delivered in time for the elections, and this looming crisis must be sorted as a matter of priority.

I also recently met with Mairéad O'Higgins Finnegan, the manager of Raidió Corca Baiscinn and treasurer of the community radio association of Ireland. This station, which is the only fully licensed community radio station in County Clare, employs five full-time CSP staff, one part-time CE, one part-time Tús and one part-time adult learner. It is partly funded for the CSP by Pobal, but this does not cover all of the revenue portion. It has a large base of volunteers who, thanks to this station, can have their voices heard on air. It relies heavily on fundraising to keep the doors open. In the interest of transparency, I will declare that my parliamentary assistant, Mike Taylor, sits on its board of directors.

Stations like RCB and Scariff Bay Community Radio, which has a 100-day licence, are reliant on Pobal funding despite Pobal now saying that community radio is non-priority and may be more appropriately supported by other Departments and State agencies. Who will that be? In the last open round of sound and vision scheme funding, slightly over 0.17% of the total fund was approved for community radio. Only one community radio station was successful in its application. Without proper support and investment, ring-fenced funding and a liveable wage, these assets in our rural communities will cease to provide this fantastic community service. We need a grant for operational funding under the CSP scheme to allow stations like RCB to not just tread water but keep their heads above water, their lights on, and the radio waves well represented in west Clare.

A great way of tackling misinformation is by ensuring and safeguarding trust in State services, public bodies and, in particular, policy. That is a huge issue these days, and if the Government wants to demonstrate that it is truly serious about addressing the harms of misinformation, I would suggest that is a good place to start. It must be understood that we have been facing a housing crisis without an adequate response, a cost-of-living crisis, and climate action and change without a just transition, to name but a few. I always say that this House is not the real world, so the Government must consider the possibility that it is making the mistake of seeing these points as simply a bash from the Opposition. There are real people who feel they have been forgotten about and who have been on the receiving end of considerable harm. If they do not see actions that will benefit them and bring about better situations for them, then it directly affects trust.

I thank the Acting Chairman and every Deputy who took part in this debate, particularly the Deputies who read the Bill and who spoke, above anything else, about what is in the Bill.

I re-emphasise that the purpose of the Digital Services Bill is to provide for full implementation in Ireland of the EU regulation on a single market for digital services. This Bill will be an indispensable component of a pioneering regulatory framework to protect EU users of digital services and their fundamental rights online. The framework will rebalance the responsibilities of users, online platforms and public authorities, and will place citizens at the centre.

The regulatory framework represents a sea change in the EU's ability to protect society from illegal and harmful content and disinformation. Many Deputies have spoken today about the events in Dublin some weeks back. It is important to say that, even as we implement this legislation, some of the provisions of the EU Digital Services Act have already been used for the first time, with Coimisiún na Meán able to contact the European Commission to seek its assistance in getting content taken down under the regulation to which this Bill will give full effect. Some Deputies who were calling for action now oppose the Bill that would empower such action. It is important to point that out.

Many Deputies have raised issues around the resourcing of that. Coimisiún na Meán and the CCPC have been allocated extra money in next year's budget to do that. We have allocated €2.7 million for 2023 to set up the DSC function, that is, the digital services co-ordinator function, in Coimisiún na Meán. This funding has increased to €6 million for 2024, when it will be fully operational. The CCPC's Exchequer funding has significantly increased in recent years, in light of its additional functions.

This will enable the continuing recruitment of people with the necessary technical, legal and regulatory skills for an coimisiún to carry out its function as the Irish DSC and for the CCPC to do so as the competent authority.

I emphasise again that the Digital Services Bill 2023 is a technical Bill. It is necessary to give effect to the supervision and enforcement provisions of the EU regulation. It neither adds to nor alters the obligations of intermediary service providers under the EU regulation. The Bill ensures that rights and protections provided for in the EU digital services regulation will be rigorously asserted in Ireland for the benefit and protection of Irish users of digital services. It is mandatory under the EU regulation for member states to give effect to these national provisions by 17 February next.

I thank all Deputies for their co-operation this evening. I look forward to engaging with them on the amendments. I thank the Office of the Attorney General, the Office of the Parliamentary Counsel and, in particular, officials from the Department, who are in the Chamber and the Gallery this evening, for their extraordinary commitment to getting this Bill over the line in this timeframe. I wish to acknowledge that commitment.

To look at some of the issues raised, I thank Deputy O'Reilly and Sinn Féin for their support of the Bill. I look forward to working with them on the amendments they wish to put forward. I note Deputy O'Reilly's comments on the application of the DSA in the context of freedom of speech. That was the theme from several Deputies regarding the removal of content. The DSA puts freedom of speech at its centre and provides users with a right to complain if their content is removed, with the right to access out-of-court dispute settlement bodies if the matter is not resolved and with an appeals process. The requirement for very large online platforms to undertake risk assessments yearly and to take mitigating measures associated with identified risks is a key tool of the DSA. They must be audited independently and the European Board for Digital Services, which will be made up of the digital services co-ordinators of all 27 member states, will report each year on prominent and recurring risks and best practice to mitigate those risks. Regarding vetted researchers, as raised by Deputy O'Reilly and a number of Deputies, the DSC in the country of the very large online platform's headquarters verifies vetted researcher status. This will be done by Coimisiún na Meán for researchers who wish to access data from VLOPs established here. Public research bodies will also be able to access data.

I thank Deputy Nash and the Labour Party for their welcome of the Bill. I will engage with them. He mentioned several companies. I do not want to cite any particular company but companies designated as VLOPs will have to comply with the DSA or face the wrath of fines or the rigorous processes that go with this measure. That will have an effect on their shareholders and board to the extent of the fines. He also raised the resourcing of Coimisiún na Meán and the CCPC, which I have already dealt with. On the connections between Coimisiún na Meán and the CCPC, they already have a strong working relationship. Both are part of the digital regulation group. They have initiated a workstream to prepare a co-operation agreement which will facilitate effective collaboration in this context. We will watch this matter closely.

Deputy Ó Murchú raised issues around the staffing and resourcing of Coimisiún na Meán. We established Coimisiún na Meán and gave it its functions early in this process. We are one of only two member states that have designated a DSC. Now that it is designated, resourced and employing staff, it will hit the ground running from 17 February, once this legislation is passed.

I thank Deputy Cathal Crowe for his remarks. The DSA is a key step in addressing the online world and regulating it. I think he used the phrase "the wild west of technology". This legislation will put much more onerous roles on providers that are larger entities, which are resourced to do it, and the services they use. It will rein in some of the platforms.

Deputy Catherine Murphy raised resourcing, about which I have already spoken. We prioritised funding to prepare and to allow Coimisiún na Meán to do its work effectively.

Concerning algorithms, which several Deputies raised, a new European Centre for Algorithmic Transparency, ECAT, was established in April 2023. This provides the European Commission with scientific and technical expertise that supports the enforcement of the DSA. It is envisaged that ECAT will review the risk assessments and audits carried out by VLOPs and VLOSEs. It will carry out research into the impact of the algorithmic systems used by online platforms and search engines. Deputy Boyd Barrett spoke about the power balance. The essence of this Bill and the Digital Services Act is to balance this relationship. The designation of VLOPs is based on the number of service users as a proportion of the EU population. The strongest obligations are on the largest companies. That will be monitored all the time. As new companies come on the scene, as they invariably will, they too will fall within the remit, if they meet the numbers. No VLOP that is designated as such will be exempt from the DSA, regardless of what it is.

Several Deputies raised the online environment and the negative impacts it has on citizens and consumers. The DSA specifically focuses on limiting the level of illegal content, but also on illegal goods and services, which have been somewhat forgotten in the debate. This will bring regulation to the marketplace. This Bill puts in place the necessary measures in our national legislation, which will allow Ireland to take up its role in regulating providers to ensure that illegal content, goods and services are controlled once and for all. On online offences, the Garda will also have additional resources and powers.

Deputy McGuinness raised the real issue of fraud and digital technologies. The DSA is specifically focused on providers of online intermediary services rather than the wider gamut of digital technologies and services and related enterprises. It is an important issue. We will focus on it but it is the role of An Garda Síochána to prosecute.

Deputy Shanahan referred to the important role Ireland will have in the European space in this area. That is why we have put such a level of resources into Coimisiún na Meán, to ensure it has the power to do the work necessary.

Deputy Tóibín raised concerns around freedom of speech and censorship, which were also the concerns of some members of the Rural Independent Group. The DSA is putting the protection of freedom of expression at its core. This includes protection from what people may see as Government interference in their freedom of expression and information. The existing horizontal rules against illegal content have been carefully calibrated. They are accompanied by robust safeguards for freedom of expression and an effective right of redress to avoid both under-removal and over-removal of content on the grounds of illegality. The DSA will give users the option to contest decisions by an online platform to remove their content, including when these decisions are based on platforms' terms and conditions. Users can complain directly to the platform, choose an out-of-court settlement with a dispute settlement body or seek redress before the courts. The DSA proposes rules on transparency of content moderation decisions. For VLOPs and VLOSEs, this will provide users and consumers with a better understanding of how these platforms impact our societies. VLOPs and VLOSEs will be obliged to mitigate those risks, including to freedom of expression. They will be held accountable through independent auditing reports and public scrutiny. If people are afraid of independent auditing and public scrutiny, they are entitled to oppose the provisions of this Bill. If they want the online world to continue as it is today, as has been spoken about by many Deputies, without any control, they will oppose this Bill. Those who want to bring control into this space can support this Bill and work at committee level to strengthen it.

Deputy Tóibín and several Deputies also raised the issue of minors.

The Digital Services Act introduces a range of obligations, including requirements for most online services to complete risk assessments on the exposure risk of all of their users to illegal online content, and the exposure of children and young people to age-inappropriate content. These services must then address this risk of exposure through a range of mitigation measures. In order to facilitate age verification, the European Commission is committed to working with member states to promote EU standardisation in order to strengthen effective age verification methods. The Commission has, in particular, committed to facilitating a comprehensive code of conduct on age-appropriate design by 2024, and has indicated that the code could provide for age verification for accessing certain online content. The suite of regulations and initiatives will go a long way to reduce the exposure of our children to harmful online content, and when fully implemented in the coming period will keep them safer online. Again, that is happening because of the Digital Services Act.

I thank Deputy Bruton for his contribution. As always it was real and focused, and acknowledged that the Digital Services Act is the first significant step in addressing the issues, while also being clear about the limitations of the Digital Services Act and how much more we have to do.

Many Deputies raised issues about trusted flaggers. The job and role of a trusted flagger will be to flag illegal content - not objectionable content. The trusted flaggers will have to publish information every year on what they have done in terms of reported notices. They will have to publish information on the type of illegal content, the provider, the response and the action taken by the provider. Coimisiún na Meán can investigate a trusted flagger on its own initiative, on information received from a third party, or if a provider reports them. The list of trusted flaggers will be published by the European Commission.

To those Deputies who spoke about everything else that had nothing to do with the Bill, I will once again be clear that this is a technical Bill and it relates to the Digital Services Act, which is an EU regulation. It is implementing the measures required in national law to give effect to the Digital Services Act and to increase and enhance the protections available to our citizens, including our children, from what is happening online.

I look forward to Committee and Report Stages. The reality is that as a country we used the Digital Services Act in response to the events in Dublin some weeks back. We will use it again to protect our citizens, not just online but from the consequences in the offline world of online activity. It is a fantastic power that we have. It is not censorship. It is not about hate speech. It is a technical Bill that will result in the protection of our citizens and our businesses. I propose the Bill to the House.

Question put and agreed to.