Can AI run your personal finances?

Generative AI is a revelation, but the jury is still out on using ChatGPT, Copilot or other such tools for financial guidance. Here’s a look behind the curtain at how these chatbots work, where responses come from, and why human advice still reigns supreme.

11 min read

There seems to be nothing people won’t try to use generative AI (GenAI) for nowadays – from where to eat and what to buy, to staying healthy, work chores, learning about the world and, indeed, finding one’s place in it. A striking observation from the Harvard Business Review in 2025 was that the most common use of large language models (LLMs) – ChatGPT, Claude, Copilot and so on – was therapy and counselling. Remarkable when you consider how young this technology is, yet how trusted it seems to be.

If people are using models for emotional release, then it should be no surprise that they’re also being used for questions about personal finances. After all, responses seem so slick and wide-ranging.

But before you jump on the bandwagon, imagine you’re sitting in a restaurant where any dish can be served, to your heart’s desire, within a matter of seconds. Wonderful, of course, for a while. But after dining there a few times, wouldn’t you eventually begin to wonder what’s going on in that kitchen? Some things are too good to be true, so what corners are being cut to maintain that service?

This is a useful way to think about AI. Because before handing over details as personal as your salary, pension pot and investment portfolio, it’s worth knowing more about it.

Where has generative AI come from?

In Amazon’s Cambridge offices, the tech giant reportedly develops its ‘Olympus’ LLM. 

There, legions of data associates sit in front of workflows of the rawest GenAI responses. They “label” good responses from bad, attempt to jailbreak safety features and course-correct the model. It’s data entry-style work and not especially glamorous, but their job is important to stress test the technology being built by engineers in Seattle and elsewhere. Efforts are focused on trying to improve the model’s performance, either as it relates to response accuracy – weeding out hallucinations and outright falsehoods – or as it relates to safety, deflecting harmful prompts and safeguarding when appropriate. 

Why does all this matter? 

Because, for many of us, it’s worth being reminded that there’s a workforce behind this technology. It’s not an oracle of nature. It’s human-made and therefore bound to be filled with some amount of human judgment and error. Indeed, as the work of human hands, many people observe that different LLMs seem to reflect their makers – “careful” Claude, “commercial” OpenAI and “edgy” Grok, from Dario and Daniela Amodei (who defected from OpenAI), Sam Altman, and Elon Musk, respectively.

Having underlined that GenAI is a human creation, steered by strategic direction, we can delve deeper into why it may be ill-advised to trust it with your financial life.

GenAI wasn’t purpose-built for financial advice

To understand the risk, it helps to know what these models are designed to do.

In those Cambridge offices, human reviewers are fed examples of interactions, and the model behind Alexa is trained to prefer answers people will find more helpful. Amazon Alexa owners who’ve opted into the ‘beta’ consent to having their voice prompts relayed to places like the Cambridge production floor, where they’re played through headphones to at least a dozen or so Amazonians, whose job is to review and label them.

But with this process, companies also teach their models to offer help to a user wherever they can, make things as upbeat as possible, and do their best to keep the conversation alive. For someone trying to get their head around an unfamiliar topic, that can be nice. There’s no human on the other side to judge any “stupid questions”. But it might also explain some of the strange behaviour we often encounter. Here's an example:

If someone asks whether they will be able to afford early retirement with an aggressive investment strategy, GenAI could produce a response saying, “tough, but… it could be done… if… And would you like some more advice?” That might sound reasonable. But a human adviser is unlikely to respond this way. 

At Charles Stanley, our ‘client first’ values mean having honest, healthy conversations. Sometimes, quite plainly, the numbers don’t add up, the risk is too high, and the plan needs rethinking. If this is the case, it’s our responsibility to help you find a realistic financial plan for your goals. And the basis for that is our advisers’ experience in talking to you at eye level about trade-offs, risks and what the options are. 

Generative AI finds patterns in things, without a human-like understanding of those things

LLMs work by generating responses based on patterns in enormous datasets of text, drawn from online sources that are either publicly available or licensed to the tech companies. Fittingly, LLMs get their name from how they find relationships in language. They see patterns of letters and words (‘tokens’) that make up questions and queries. Then they match those to other patterns of letters and words which, statistically, are probably the right response.

For example, if someone asks, “why do only a small group of the population invest their money?”, GenAI finds sources online that claim to answer that question exactly as it’s worded, and it echoes those sources. It might flesh out its response by finding related patterns of words to include, or ideas to keep the conversation going, but it never independently thinks through economics, behavioural finance or investment in the way a professional would.
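For readers curious about what “statistically probable responses” means in practice, here’s a deliberately simplistic sketch in Python – nothing like a real LLM, which uses neural networks trained over billions of tokens, but it captures the core idea: count which word most often follows each word in some text, then always “predict” the most frequent follower.

```python
from collections import Counter, defaultdict

# Toy illustration only: a tiny made-up corpus.
corpus = (
    "investing involves risk . "
    "investing involves patience . "
    "investing involves risk management ."
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word):
    """Return the word that most frequently follows `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(next_word("investing"))  # "involves" – it follows every occurrence
print(next_word("involves"))   # "risk" – the most common follower (2 of 3)
```

Notice that the model never “understands” risk or patience; it only replays the patterns it was fed. Real LLMs are vastly more sophisticated, but the same limitation applies: output reflects the statistics of the training data, not independent reasoning.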

Beyond what’s out there in the mainstream for an AI model to regurgitate, financial planners and investment managers can go a step further to give you tailored advice that they're accountable for. The proliferation of GenAI content only makes it more important to meet real professionals with true expertise.

That said, GenAI could be useful in the right role – say, when it comes to understanding ISA rules, pension basics, tax thresholds or basic investment concepts. Its conversational nature means users can ask follow-up questions until their understanding clicks into place.

The key point is that there is a difference between using AI to build understanding and using it to make the final call on important financial decisions.

Learn more about Charles Stanley’s planning support here: Financial Planning services | Charles Stanley

AI’s source bed of information is unreliable

Imagine you go to the library to read about some distant period in history – say, Napoleon’s life or the American Civil War. You’ll inevitably find that different books use different dates for key events, attribute quotes to the wrong people and interpret things in completely opposite ways. That’s unfortunately because humans misrecord things, misremember, and then mistakes get copied.

We’re now living in an information age where truth feels up for grabs as various voices and narratives compete. AI deepfakes look like the real thing. And facts, purported facts, and marketing all blur together in ways that are likely to cause even bigger headaches for historians in the future. As GenAI is trained on human-produced material, it faces the same challenges.

As experts in financial planning and investment management, we can attest that the guidance GenAI gives on retirement planning, tax planning, saving and investments can often be very wide of the mark. We recently came across an example in the office of Copilot claiming that spouses can use each other’s pension allowances. It was a confident claim – it got that from somewhere – but unfortunately not true.

Rather than saying, “I don’t know,” AI fills the gap with whatever sounds right, because fundamentally, it doesn’t “know” anything beyond its programming. In a casual conversation, hallucinations and inaccuracies may not matter that much. If you’re suspicious of something, you can always verify the answers elsewhere. But in financial planning, you don’t want to have to factcheck what you’re being told. You want your expert to reliably know their stuff, or they could easily send you down the wrong path.

Why your personal data could be at risk

The UK Information Commissioner’s Office has repeatedly warned that organisations using generative AI need to take data protection risks more seriously. Some tech companies say their models don’t use your conversations to improve the technology, at least not by default, but many others openly do. 

For the time being, we think it’s wise to apply caution when keying in your personal information to ChatGPT, Copilot, Grok, or any other LLM unless you fully understand the risks involved.

While GenAI is also meant to be trained to disengage from abusive or harmful prompts, financial advice is not always flagged as out of bounds when it should be. GenAI sometimes hands over unqualified advice that doesn’t conform to FCA regulations at all. For example, there’s not always a disclaimer that investment involves risk, or that past returns are not a guide to future returns. To make matters worse, some models are trained to keep these (as with all) conversations going. And even where an answer is technically accurate, that still doesn’t make it suitable for your circumstances.

The truth is: AI might be pretty good for education, but that’s not the same as being ready to guide life-changing financial decisions. For those, it should be telling you to speak to an adviser.

AI could dull your financial skills 

Charles Stanley’s Head of Responsible Investment, Paris Jordan, says that, in the context of the workplace, “AI could have harmful effects on the brain”. She went on to say:

“Although we’re still in the early adoption phase, it has already been discovered that an over-reliance on AI diminishes self-regulation, interpersonal skills and emotional intelligence… This is a phenomenon called ‘cognitive offloading’, where users outsource their individual thinking processes (to AI), rather than developing them themselves.”

In the context of your personal finances, being in tune with your spending and conscious of how to save is important for budgeting, and for being able to do the mental maths on what financial plan will work. In the investment sense, temperament matters hugely, and temperament has to be practised. It’s one thing to delegate to professionals who are qualified in these areas; it’s quite another to outsource your financial thinking to a machine.

So, let us return to that restaurant 

If the dishes keep arriving, beautifully plated and tailored to your taste, you might well be impressed. But now that you know how the kitchen works – at speed from a messy stockroom of man-made ingredients, following instructions on packets rather than truly understanding what it’s serving – will you think twice before trusting it with something delicate?

People are at the heart of Charles Stanley. While AI can be useful for exploring ideas and building understanding, our financial advisers still have the edge when it comes to building real financial plans around what matters to you. Find out more about our financial planning service and book a free fifteen-minute consultation to kick things off.

Nothing on this website should be construed as personal advice based on your circumstances. No news or research item is a personal recommendation to deal.

