The pharmaceutical industry is undergoing a significant transformation, driven by the integration of artificial intelligence (AI) and, more recently, Generative AI (GenAI) into drug discovery, clinical trials, and patient care. While these advancements promise substantial benefits, from accelerating drug development to enabling more personalized medical treatments, the GenAI revolution also raises ethical considerations around data protection, privacy, and the responsible use of technology.

Balancing AI benefits and risks in the pharmaceutical industry
AI brings many benefits to life sciences companies. Both traditional AI and GenAI enhance efficiency and can automate repetitive tasks. In the medical sphere, GenAI already looks promising compared with traditional diagnostic models at detecting illnesses like cancer, and earlier detection can make the difference in stopping a disease before it spreads too far. Though it is still early, these remarkable potential benefits of GenAI models are on the way.

Traditional AI is already widely used by organizations well beyond the pharmaceutical industry. For example, it can analyze data and discover patterns in customer behavior to narrow down the right customer base and, ultimately, increase sales. With Next-best Action, AI surfaces critical or actionable insights to your sales reps so that they understand and can react to previous customer interactions.

What’s important to remember is that GenAI, particularly Large Language Models (LLMs) like ChatGPT, is still in its infancy. With GenAI, the possibilities expand further than with traditional AI. Chatbots can answer many of the routine questions that people have for customer service, saving them hours waiting for a live representative while also saving the company money. Microsoft Copilot lets users gather information spread across their Microsoft applications (Word, Teams, Excel, Outlook) in seconds, giving them what some call the world’s fastest personal assistant. Copilot will also anticipate your needs as you work in the Microsoft suite and make suggestions for you, speeding up your work and making you more efficient.

While traditional AI typically operates within a closed network, GenAI involves LLMs that draw from vast amounts of public data, a distinction that raises critical questions about data privacy and security.

Ethical questions about AI and private patient data 
One of the primary ethical concerns in the pharmaceutical industry is the protection of patient data. Traditional AI typically runs within the confines of a company’s private network, but GenAI is different. GenAI models require access to extensive public data to operate, which can potentially expose sensitive patient information. This dilemma raises concerns about data breaches and unauthorized access.

Clearly there is a need for strict boundaries and guardrails to protect private information and intellectual property (IP) when using GenAI. There are some solutions in play, such as creating private LLMs or collaborating with AI providers that commit not to store client data. For instance, Tableau’s new TableauGPT claims that it “allows administrators to enable trusted, ethical, and open GPT-powered experiences that fit the business needs without compromising data security and privacy.” Despite claims like these, it is still early in the GenAI revolution, so ethical data strategies are crucial for AI in the pharmaceutical industry moving forward.
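
To make the call for guardrails a little more concrete, here is a minimal Python sketch of one simple control: masking obvious patient identifiers before a prompt ever leaves the private network. The patterns and the redact_phi helper are illustrative assumptions, not a complete de-identification solution, and any real deployment would sit behind a private LLM or a vendor agreement of the kind described above.

```python
import re

# Illustrative patterns only -- real de-identification needs far broader
# coverage (names, dates of birth, addresses, free-text identifiers, etc.).
PHI_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "mrn":   re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_phi(text: str) -> str:
    """Replace likely patient identifiers with placeholder tokens."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text

def safe_prompt(user_text: str) -> str:
    """Guardrail applied before a prompt is sent to any external GenAI service."""
    return redact_phi(user_text)

if __name__ == "__main__":
    note = "Patient MRN: 00482913, phone 555-201-4477, reports improvement on arm B."
    print(safe_prompt(note))
    # Prints: Patient [MRN REDACTED], phone [PHONE REDACTED], reports improvement on arm B.
```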

Potential ethical AI strategies
What should any organization consider when navigating the ethical landscape of AI? Here are a few starting points to consider.

  1. Put IA before AI. This excellent advice came from the company Teradata six months before GPT-4’s release. Prioritize Information Architecture (IA) to ensure data quality and cleanliness before implementing AI solutions.
     
  2. Consider the role of data governance. You need to ensure that data collection, data cleaning, and data modeling are done robustly and in the right way before your data ever reaches a model; a minimal sketch of this kind of pre-model quality check follows this list.
     
  3. Educate executives about what AI requires. Executives rarely see the bulk of the behind-the-scenes work that goes into drug production. The same is true of GenAI, so they might not immediately understand why you need a new data architect or other resources to support your AI work. You must consistently advocate to executives for the importance of data architecture and AI so that you gain their support and they understand what your data team needs.
     
  4. Make the Compliance and Legal teams your allies. Collaborate closely with compliance and legal teams to ensure ethical AI practices. Rules and regulations can vary between teams within a company, between companies, and between industries, so the best thing to do is to keep asking questions and communicating with legal and compliance: What are the rules? What guardrails are in place? What are the restrictions on AI? Am I allowed to use this? Only then can you put forward a solution with confidence.
     
  5. Encourage users to do self-service analytics. Once guardrails are in place, empower experienced users to perform self-service analytics. This way, your data team will not be bogged down with small requests and can conserve its time and energy for bigger initiatives and more critical innovations.
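
To make the data governance point (item 2 above) more concrete, here is a minimal Python sketch of the kind of automated quality check that might run before any dataset is handed to a model. The hypothetical trial-site table, the specific checks, and the basic_quality_report helper are illustrative assumptions rather than a production governance framework.

```python
import pandas as pd

def basic_quality_report(df: pd.DataFrame, required_columns: list[str]) -> dict:
    """Run a few routine checks before a dataset is released to a model."""
    report = {
        "missing_columns": [c for c in required_columns if c not in df.columns],
        "null_counts": df.isna().sum().to_dict(),
        "duplicate_rows": int(df.duplicated().sum()),
        "row_count": len(df),
    }
    report["passes"] = (
        not report["missing_columns"]
        and report["duplicate_rows"] == 0
        and all(v == 0 for v in report["null_counts"].values())
    )
    return report

if __name__ == "__main__":
    # Hypothetical clinical-trial site data, used only to illustrate the checks.
    trial_sites = pd.DataFrame({
        "site_id": ["S001", "S002", "S002"],
        "enrolled": [42, 35, 35],
        "country": ["US", "DE", None],
    })
    print(basic_quality_report(trial_sites, required_columns=["site_id", "enrolled", "country"]))
```

In practice, a report like this would feed into the governance workflow that compliance and legal help define, so that only data passing the agreed checks ever reaches a model.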

Complements, not replacements
In many cases, AI and GenAI serve as complements that enhance employee efficiency and drive innovation, not only within the pharmaceutical industry but within yours, too. It will not look the same across every industry. For some, AI might replace resources. For others, it could free up resources, and for still others, it might automate repetitive tasks that used to take two hours every day. But in nearly every industry, AI will complement what people are doing and make workers more efficient. Before you can fully experience the benefits of AI in pharma or any other industry, you must have your bases covered around privacy, ethics, quality, and governance.
 

Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views or positions of his employer.