This week, I share M&A lawyer-specific use cases for Microsoft Copilot and highlight some risks associated with it.
AI will change the world, but how will it change M&A? I want to focus on AI’s impact on M&A in this newsletter. I am not an expert on either M&A or AI, but I want to learn about both topics and how they intersect. I thought there might be others in my situation (or people who are experts in one field or the other) who would find information on M&A and AI helpful in their careers, so I created this newsletter to track and share what I learn.
Microsoft Copilot Use Cases for M&A Lawyers
Now that Copilot is widely available, I thought I would share some concrete use cases of Copilot for M&A lawyers.
Email Summaries
One great feature for people who receive many emails is asking Copilot to summarize their unread messages. For example, a user could prompt Copilot first thing in the morning to “summarize the emails I received overnight.” Users can go a step further and prompt Copilot to highlight any action items the emails contain. Obviously, the emails should still be read in full, but the summaries can help prioritize the most important ones.
Ask “What’s Missing?”
When reviewing or drafting a document, a user can ask Copilot, “What’s missing?” Ideally, Copilot catches a missing provision or gives the lawyer an idea for something to add. I am curious to see how well this works in the real world. Copilot is not trained specifically on legal data, so its suggestions may not be great. However, fine-tuning the prompt should improve Copilot’s accuracy. For example, a prompt could be, “Act as an expert M&A lawyer and tell me what is missing in this document.” This prompt steers the model toward its legal knowledge, theoretically improving outputs.
Clean Up Notes
I had not previously considered using Copilot to make notes more professional before distributing them, but it could save a lot of time. If your notes are anything like mine, the first draft is messy. Instead of rewriting and reformatting the notes manually, AI can refine them into a memo, or at least a cleaner version of the notes.
Here’s a prompt: “Organize my notes in a professional and readable way. Format the notes into a bullet point list. Make sure to keep the substance of the notes the same—do not add anything. Emphasize any action items contained in the notes. Make sure to be direct, avoid excess, and maintain a professional tone. Here are the notes: [insert notes].”
Enhanced “Ctrl-F”
As we have talked about before, one of the main benefits of AI is an enhanced “ctrl-f.” AI can “read” long documents, making it much easier to find a particular passage. By using Copilot in Word, M&A lawyers can search long agreements more efficiently using natural language.
Risks of Using Copilot
Using Copilot is not without risks. Here are two articles that highlight a serious risk of using Copilot.
Malicious actors can cause Copilot to malfunction by simply sending a Microsoft 365 user a document via email.
Here’s how it happens:
- The malicious actor sends a document containing a specialized prompt targeting a question a user is likely to ask. Alternatively (though much more difficult), a hacker could cause Copilot to malfunction on all prompts, not just one or two targeted ones.
- The malicious document contains text that causes Copilot to “look” at it when searching the 365 system for relevant documents, along with malicious (or benign) instructions. Once Copilot “reads” the document, it follows the instructions placed beneath the “attention-grabbing” text.

Here is a benign example from the articles: in response to a prompt asking for a company’s bank information, Copilot responded, “The Boston Celtics.”
It is incredible how easy it is to cause Copilot to change its behavior. If the user is using AI appropriately and double-checking all outputs, an attack such as this would probably not matter much. But if the user takes everything Copilot says as gospel, there is a high risk of using inaccurate information in important documents—especially for lawyers.
Here’s the lesson: Trust but verify. AI is generally helpful and can greatly enhance productivity. But as you can see, it is easy to manipulate AI into giving bad answers. So to avoid any issues, always verify AI’s outputs. Thankfully, Copilot makes this easy by providing references and links for most outputs.
About Me
My name is Parker Lawter, and I am a law student pursuing a career as an M&A lawyer. I am in my last semester of law school, and with some extra time on my hands, I decided to create this newsletter. I hope it is informative and helpful to anyone who reads it! I am not an expert at either M&A or AI, but I am actively pursuing knowledge in both areas, and this newsletter is a part of that pursuit. I hope you’ll join me!
Follow me on LinkedIn: www.linkedin.com/in/parker-w-lawter-58a6a41b
All views expressed are my own!