People used to ask me why they should hire an estate planning attorney instead of LegalZoom. Now they ask about ChatGPT.
I’m not concerned.
Let’s say you ask ChatGPT to draft your will. It does. It tells you how to sign in the presence of two witnesses who are also in the presence of each other. It creates a self-proving affidavit. You just saved some money, right?
Well. Maybe.
The problem is that AI doesn’t know what questions to ask, and you don’t know what you haven’t been asked. That gap is where plans fail.
Florida Homestead
You tell the chatbot to leave the house to your kids. It drafts that. Clean.
What it didn’t ask: Are you married. Do you have a minor child. Is the house your homestead.
If it’s homestead and you have a surviving spouse or minor child, you can’t devise it to the kids outright. The Florida Constitution won’t let you. Art. X, § 4(c). What actually happens under Fla. Stat. § 732.401: your spouse gets a life estate and your children get a vested remainder, or the spouse elects a one-half tenancy in common. Neither of those is what you told the chatbot you wanted. Neither is what your family wants. Now the spouse can’t sell without cooperation from the remaindermen, can’t refinance easily, and every decision about the house becomes a negotiation. Sometimes a lawsuit.
A human attorney asks three questions and this problem never exists. The chatbot doesn’t know those questions exist.
The Disabled Beneficiary
You have three kids. You tell the AI to split your estate equally among them. It drafts that.
What it didn’t ask: Does any of your kids have a disability. Is any of them on SSI or Medicaid. Does any of them rely on means-tested benefits.
If the answer is yes, an outright bequest destroys those benefits the month the check clears. The fix is a third-party supplemental needs trust. It’s not a hard document to draft. It’s an impossible document to draft if nobody asks the question.
I have seen this in probate more than once. The family didn’t know. The drafter didn’t know. Everyone finds out when the benefits stop.
The Son-in-Law
Here’s the one nobody talks about.
When a client tells me they want everything to go to their daughter, I listen to how they say it. If they pause, or shift in their chair, or their jaw moves a certain way when they mention her husband, I ask a follow-up. Sometimes two. Sometimes the answer is “he’s fine.” Sometimes it’s “he’s a deadbeat and if she dies first I don’t want a nickel of this going to him.”
That second answer changes the whole document. Now we’re doing a lifetime trust for the daughter, with a trustee who isn’t her husband, spendthrift protection, a remainder that goes to her kids and not to him, and a provision that keeps the trust principal out of the marital estate if she divorces. None of that ends up in an AI draft because AI can’t read a pause.
The chatbot gives her the money outright. She gets hit by a bus a year later. The son-in-law inherits everything you spent forty years building. He remarries. Your grandkids see none of it.
Everything Else You Didn’t Mention
Second marriage with kids from the first. A business interest with a buy-sell agreement that controls on death. Out-of-state real estate that needs ancillary probate. A child with a gambling problem whose inheritance will be gone in six months. A beneficiary with active creditors. A child you loaned $200,000 to and want to equalize around at death. A retirement account where the beneficiary designation overrides the will and blows up the whole plan.
None of that comes up unless someone asks. The chatbot doesn’t ask. You don’t know to volunteer it.
The Cost
People use AI to save on legal fees. Understandable. A decent estate plan from a competent Florida attorney runs a few thousand dollars for a basic setup and more for anything complicated.
What the AI draft costs is paid by your estate, not by you. You won’t be around for that part. The homestead fight is five figures minimum. The lost SSI benefits are permanent. The son-in-law is forever. Fixing a bad plan posthumously costs more than doing it right would have, sometimes by an order of magnitude, and the person paying for the fix is usually the person the plan was supposed to protect.
The will isn’t the hard part. The intake is the hard part. AI skips the hard part and hands you a document that looks like an answer.
