General practitioners are turning to AI to help with their patient workload

14 January 2025

Dr Deepali Misra-Sharp uses AI to help take notes

This is the fifth feature in a six-part series looking at how AI is changing medical research and treatment.

Difficulty getting an appointment with a GP is a well-known problem in the UK.

Even when an appointment is secured, the increasing burden faced by physicians means those appointments may be shorter than the doctor or patient would like.

But Dr Deepali Misra-Sharp, a GP partner in Birmingham, has found that AI has relieved some of the administrative burden of her work, meaning she can focus more on patients.

Dr Misra-Sharp started using Heidi Health, a free AI-assisted medical transcription tool that listens to and transcribes patient consultations, about four months ago and says it has made a big difference.

“Usually when I’m with a patient, I write things down and it takes me away from the consultation,” she says. “This now means I can spend all my time locking eyes with the patient and actively listening. This makes for a better quality consultation.”

She says the technology cuts down on her workload, saving her “two to three minutes per consultation, if not more.” She cites other benefits: “It reduces the risk of errors and omissions in my medical records.”

With a shrinking workforce and patient numbers continuing to rise, GPs face immense pressure.

A single full-time physician is now responsible for 2,273 patients, up 17% since September 2015, according to the British Medical Association (BMA).

Could AI be the solution to helping GPs cut back on administrative tasks and ease burnout?

Some studies suggest that it can. A 2019 report prepared by Health Education England estimated that a minimum saving of one minute per patient from new technologies such as AI would equate to 5.7 million hours of GP time.

Meanwhile, research from the University of Oxford in 2020 found that 44% of all administrative work in general practice can now be largely or fully automated, freeing up time to spend with patients.


Lars Maaloe (left) and Andreas Cleve, co-founders of Danish medical AI firm Corti

One company working on this is Denmark’s Corti, which has developed AI that can listen to healthcare consultations, whether over the phone or in person, and suggest follow-up questions, instructions and treatment options, as well as automating note-taking.

Corti says its technology processes around 150,000 patient interactions a day in hospitals, GP surgeries and healthcare facilities across Europe and the US, amounting to around 100 million encounters a year.

“The idea is that the doctor can spend more time with a patient,” says Lars Maaløe, co-founder and chief technology officer at Corti. He says the technology can suggest questions based on previous conversations it has heard in other health care situations.

“The AI has access to related conversations and then it can think, well, in 10,000 similar conversations, most of the questions asked X and that hasn’t been done,” says Mr Maaløe.

“I imagine GPs have one consultation after another and so have little time to consult with colleagues. It’s like getting that colleague’s advice.”

He also says the system can look at a patient’s historical records. “It might ask, for example: did you remember to ask if the patient is still suffering from pain in the right knee?”

But do patients want technology to listen and record their conversations?

Mr Maaløe says “the data is not leaving the system”. However, he says it is good practice to inform the patient.

“If the patient objects, the doctor cannot record. We see few examples of this, as patients can see the benefit of better documentation.”

Dr Misra-Sharp says she lets patients know she is using a listening tool to help her take notes. “I haven’t had a problem with it yet, but if there was, I wouldn’t do it.”


C the Signs software is used to analyze patients’ medical records

Meanwhile, 1,400 GP practices across England are currently using C the Signs, a platform which uses AI to analyze patients’ medical records and check for signs, symptoms and risk factors of cancer, and recommends what action should be taken.

“It can capture symptoms, such as cough, cold, bloating, and basically in a minute it can see if there’s any relevant information from their medical history,” says C the Signs chief executive and co-founder Dr Bea Bakshi, who is also a general practitioner.

The AI is trained on published medical research papers.

“For example, it might say the patient is at risk of pancreatic cancer and would benefit from a pancreatic scan, and then the doctor will decide whether to refer them along that pathway,” says Dr Bakshi. “It won’t diagnose, but it can help.”

She says the tool has performed more than 400,000 cancer risk assessments in a real-world setting, identifying more than 30,000 patients with cancer across more than 50 different cancer types.

A report on AI published by the BMA this year found that “artificial intelligence should be expected to transform, rather than replace, healthcare jobs by automating routine tasks and improving efficiency”.

In a statement, Dr Katie Bramall-Stainer, chair of the UK General Practice Committee at the BMA, said: “We recognize that AI has the potential to completely transform NHS care – but if it is not implemented safely, it can also cause significant harm. AI is subject to bias and error, can potentially compromise patient privacy, and is still a work in progress.

“While AI can be used to enhance and complement what a GP can offer as another tool in their arsenal, it is not a silver bullet. We cannot wait on the promise of AI tomorrow to deliver the productivity, stability and security improvements needed today.”


Alison Dennis, partner and co-head of the international life sciences team at law firm Taylor Wessing, warns that GPs should tread carefully when using AI.

“There is a very high risk that generative AI tools will not provide full, complete or accurate diagnoses or treatment pathways, and may even provide incorrect diagnoses or treatment pathways, e.g. produce hallucinations or base results on clinically inaccurate training data,” says Ms Dennis.

“AI tools that have been trained on reliable data sets and then fully validated for clinical use — which will almost certainly be a specific clinical use — are more appropriate in clinical practice.”

She says specialty medical products should be regulated and receive some form of official accreditation.

“The NHS would also like to ensure that all data entered into the tool is kept secure within the NHS system infrastructure, and not absorbed for further use by the tool provider as training data without proper GDPR [General Data Protection Regulation] protective measures in place.”

For now, for GPs like Dr Misra-Sharp, AI has transformed their work. “It’s made me come back to enjoying my consultations instead of feeling the time pressure.”
