Another view: Neil Paul

  • 29 May 2012

I recently saw a press release from a phone company that had developed a service: if a person collapses, their phone rings for help. I guess it works by using an accelerometer: a sudden bang and the program activates.

Combined with a GPS it sounds like a good way of alerting people. I can see it being useful for patients with strokes and epilepsy and, perhaps, for some people who are just frail and elderly and who want peace of mind.
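As a rough illustration of the sort of logic involved (a hypothetical sketch of my own, not the company’s actual method), fall detection might boil down to watching for a big spike in acceleration followed by stillness, then sending the GPS position to a carer. The thresholds and the helper functions below are assumptions, purely for illustration.

```python
import math

# Hypothetical sketch of accelerometer-based fall detection.
# The thresholds are illustrative; `get_location` and `send_alert`
# stand in for whatever a real phone platform provides.

IMPACT_THRESHOLD_G = 2.5      # a sudden jolt well above normal movement
STILLNESS_TOLERANCE_G = 0.1   # close to 1 g, i.e. lying still
STILLNESS_SAMPLES = 50        # roughly a few seconds of no movement

def magnitude(x, y, z):
    """Overall acceleration in g, independent of phone orientation."""
    return math.sqrt(x * x + y * y + z * z)

def monitor(samples, get_location, send_alert):
    """Watch a stream of (x, y, z) readings; alert on impact followed by stillness."""
    impact_seen = False
    still_count = 0
    for x, y, z in samples:
        g = magnitude(x, y, z)
        if g > IMPACT_THRESHOLD_G:
            impact_seen = True
            still_count = 0
        elif impact_seen and abs(g - 1.0) < STILLNESS_TOLERANCE_G:
            still_count += 1
            if still_count >= STILLNESS_SAMPLES:
                send_alert(get_location())   # e.g. text a carer with the GPS position
                return
        else:
            still_count = 0
```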

I thought it was a clever idea and wondered what else could be done. At this point, I have to declare an interest, because I have thought about this before.

With a couple of colleagues I helped develop an iPhone app called iTennisElbow that we give away free. It’s meant to help you do your exercises for tennis elbow.

It uses the compass to know which way up it is, and as long as you hold it in a standard way it works well. Although we never upgraded the app, later versions of the phone have a full gyroscope, so it would be possible for it to tell you if the movement you were doing was too fast or too slow.
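If a future version of the app did use the gyroscope, the pacing check could be as simple as comparing the rotation speed of each repetition against upper and lower limits. A minimal sketch; the limits here are made up for illustration, not clinically validated values.

```python
# Hypothetical sketch: judging whether a tennis elbow exercise is being
# done too fast or too slow from gyroscope angular-velocity readings.
# The limits are illustrative only.

TOO_FAST_RAD_PER_S = 3.0
TOO_SLOW_RAD_PER_S = 0.5

def pace_feedback(angular_speeds):
    """Take the peak rotation speed over one repetition and return a prompt."""
    peak = max(angular_speeds)
    if peak > TOO_FAST_RAD_PER_S:
        return "Slow down: move through the exercise more gently."
    if peak < TOO_SLOW_RAD_PER_S:
        return "A little quicker: keep the movement smooth and continuous."
    return "Good pace: keep going."
```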

Collecting vital signs

Having had some fairly clueless third year medical students in the practice recently, I was wondering how much of a basic examination could be done on a phone.

You can already get apps that check visual acuity and colour blindness, but what about checking visual fields using the gyroscope?

With calibrated earphones could you do an audiogram? Could it be used to measure respiratory rate, by counting the rise and fall of your chest as it sat in your shirt pocket?

I’ve seen an app that checks mental state on a regular basis, using validated depression scores; and all sorts of symptom diaries could be recorded.

It’s already possible to check memory and verbal reasoning using mini mental state apps. Tracing letters and shapes on screen might indicate cognitive decline or early Parkinson’s.

Indeed, a motion sensor or gyroscope might help to interpret tremors – resting and intention – or detect past-pointing or hemiballism. Dysdiadochokinesis would be an obvious candidate for an app.

Then, there’s the microphone

Could a sensitive microphone detect your heartbeat? A lot can be told from auscultation. I’ve seen expensive electronic stethoscopes with clever displays, but surely a smartphone could do better than a tiny screen?

In a teaching environment, all the students could join a local network and hear the same as their instructor or record things for later review in a tutorial group or by an expert.

Could a phone somehow detect an ECG from a distance? To move a smartphone over someone’s chest and see an ECG would be amazing.

If it could do this, and give a load of other readings as well, suddenly I’d be in Star Trek. We could save a lot of time and investigations if every patient with palpitations had their own 24-hour ECG to reassure them that no abnormal rhythms had been detected.

Having played an ocarina on my phone, I wonder what lung function tests could be done using the microphone, and what more could be done if a simple flow rate sensor were added. If the camera was upgraded with an infrared imaging sensor, could it be used as a thermometer?

What about environmental detection? Could the microphone warn of unacceptable levels of noise that could damage hearing or monitor prolonged exposure? Could a sensor detect noxious drugs? Or high levels of tobacco in the vicinity? Or just air quality?
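The noise idea, at least, rests on well-established arithmetic, even if a phone microphone would need careful calibration: convert the samples to a sound level in decibels and accumulate a daily dose against an exposure limit. A rough sketch, assuming calibrated pressure readings and the usual 85 dB action level with a 3 dB exchange rate.

```python
import math

# Hypothetical sketch of noise-dose monitoring from calibrated microphone samples.
# Assumes `samples` are sound pressures in pascals; a real phone would need
# per-device calibration to get anywhere near this.

REFERENCE_PRESSURE_PA = 20e-6   # standard reference pressure for dB SPL
DAILY_LIMIT_DB = 85.0           # typical occupational action level
DAILY_LIMIT_SECONDS = 8 * 3600  # eight-hour reference period

def sound_level_db(samples):
    """Root-mean-square pressure converted to dB SPL."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / REFERENCE_PRESSURE_PA)

def update_dose(dose_seconds, level_db, window_seconds):
    """Accumulate exposure; every 3 dB above the limit halves the allowed time."""
    if level_db < DAILY_LIMIT_DB:
        return dose_seconds
    weight = 2 ** ((level_db - DAILY_LIMIT_DB) / 3.0)
    return dose_seconds + window_seconds * weight

def over_limit(dose_seconds):
    """True once the accumulated dose exceeds the daily allowance."""
    return dose_seconds > DAILY_LIMIT_SECONDS
```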

Linking up devices

Perhaps, more realistically, the main role for a smartphone will be to act as a processor and controller for numerous implanted or stuck-on sensors, and to talk to external devices.

Imagine a urinal that tests the urine of everyone using it and sends the results to their smartphone. Or, even better, one that finds out their identity from their phone and sends the results to their doctor. Or a toilet that constantly checks for blood.

Maybe it’s treatment that will be the future. As a medical student I worked on a project with one of the professors that looked at whether neural networks or traditional pharmacokinetic modelling was better at predicting the blood sugar of type 1 diabetic children, given information on diet, activity and previous insulin doses.

A smartphone could take its own readings of blood sugar. There is already a machine that takes constant readings from the skin, via osmosis, and adds these to the other variables.

We might be able to achieve perfect control with an insulin pump regulating the amount of insulin on a second-by-second basis. Could blood pressure be done in the same way?
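The feedback loop itself is conceptually simple, even if doing it safely is anything but. A toy sketch of the idea, with invented constants and hypothetical names for the glucose reading and basal rate; this illustrates closed-loop control in general, not a real dosing algorithm.

```python
# Toy sketch of a closed-loop idea: read the glucose level, nudge the basal
# insulin rate towards a target. All constants are invented for illustration;
# nothing here is a real dosing algorithm.

TARGET_MMOL_PER_L = 6.0
GAIN_UNITS_PER_HOUR_PER_MMOL = 0.1
MAX_BASAL_UNITS_PER_HOUR = 2.0

def next_basal_rate(glucose_mmol_per_l, current_rate):
    """Simple proportional adjustment, clamped to a plausible range."""
    error = glucose_mmol_per_l - TARGET_MMOL_PER_L
    proposed = current_rate + GAIN_UNITS_PER_HOUR_PER_MMOL * error
    return min(max(proposed, 0.0), MAX_BASAL_UNITS_PER_HOUR)
```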

Of course, talking to peripherals is already happening; we have seen blood glucose meters, weighing machines and blood pressure machines.

Do-it-yourself ultrasound probes

The ultimate attachment for the gadget family might be an ultrasound probe that talks to your smartphone.

I don’t think it’s as daft as it sounds; bladder scanners are already automated. You home in on the circle that is the bladder and it will tell you pre- and post-void residual volumes. For a full scan, all you have to do is hold the probe roughly in the right place and follow the on-screen instructions.

All it would take is a satnav-like interface to guide you around your abdomen, viewing all the major structures; the same for the heart. You might get an instant automated report, or even a crowd-sourced radiologist’s report; perhaps they might even come online and talk you through it?

The future is likely to have a lot more data. We need good methods of viewing it, sorting it, analysing it, workflowing it and getting meaningful information from it.

About the author: Dr Neil Paul is a full-time partner at Sandbach GPs, a large (21,000 patient) practice in semi-rural Cheshire. Until recently, he was on the PEC of Central and East Cheshire Primary Care Trust, with responsibility for Urgent Care and IT.

He is now on a journey into the unknown. He is on the board of his local consortium, one of many on a pilot leadership programme, and looking at provider opportunities. He recently set up a successful primary care clinical trials unit and is involved in several exciting IT projects.

