Irish Tech General Counsel Event

William Fry was proud to host an event on behalf of the Irish Tech General Counsel (ITGC).

The event was led by ITGC Head Sarah Irwin and William Fry’s resident AI expert Barry Scannell. Barry spoke about the future of AI and the importance of businesses preparing for an explosion in AI use, and in the legislation regulating that use. As emerged in the discussion, while AI holds huge potential for businesses, it will be crucial to ensure that any use or deployment of AI complies with the relevant legislation; otherwise, a business risks leaving itself open to legal action.

The following key “takeaways” from the event are a good starting point to get ahead of the curve:

  1. Companies should complete an inventory of their AI systems to identify where AI is being used and which of their systems actually rely on it. On the back of that inventory, companies should carry out an AI Impact Assessment to gauge the risk level associated with each identified AI system and to take ameliorative measures. This includes assessing the fundamental rights that may be affected by the algorithms and AI your business uses, e.g., could the algorithm discriminate against certain groups of people in its assessments?
  2. Going forward, compliance with the forthcoming AI Act, the revised Product Liability Directive, the AI Liability Directive, and existing obligations such as those under the GDPR will be vital, and businesses should be considering these pieces of legislation now.
  3. Companies that use “high-risk” AI systems are a key focus of the new AI Act, so it is very important that businesses understand the scope of their liability in relation to their software use and the interplay between that software and physical products that also incorporate AI. Recruitment and performance-evaluation tools that use AI are considered high risk, as poor automated decision-making can have real effects on individuals’ lives.
  4. Be mindful of IP issues surrounding the use of large language models (LLMs) such as ChatGPT, as there are open questions around copyright ownership of AI-generated work, as well as potential copyright issues in training data.
  5. Be cognisant of AI clauses, indemnities, and warranties when negotiating, reviewing, and drafting contracts.
  6. Update internal policies. Policies should cover all AI systems the business uses and how they should be used, as well as other AI tools that employees might be using, e.g., ChatGPT.
  7. ChatGPT should not be used for legal advice, analysing legislation, or drafting precedents. LLMs like ChatGPT generate statistically likely sequences of text in response to input prompts; they do not retrieve verified facts. In an industry where contract clauses and the like need to be 100% correct, not merely “statistically correct”, in-house lawyers should always ensure that any LLM output is checked and verified.

While the challenge of monitoring and complying with AI law may seem daunting, any business that engages with these compliance obligations now may gain an edge on its competitors. For more information on upcoming regulation, including the AI Act, please contact Barry Scannell or another member of the William Fry Technology team.