
Doctor in your pocket

Tanya Blake

Can apps for the ubiquitous smartphone provide a more affordable, user-friendly alternative to standard diagnostic tests, rehabilitation aids and health monitoring?

Smartphones today are packed with technology. Within the devices that most of us carry around all day you can find high-resolution cameras, wi-fi and Bluetooth, as well as a range of sensors that can include gyroscopes and accelerometers to measure movement; magnetometers and GPS chips to plot your position; and environmental sensors to measure variables such as temperature, barometric pressure and light. 

All this technology offers valuable potential for healthcare and medicine, and the sector is increasingly recognising it. Medical start-ups and academics are using this affordable and portable suite of technology to create apps that monitor long-term illness, improve a patient’s rehabilitation, or even change the way in which medical diagnostic tests are carried out.

One such project is being carried out by researchers at the University of California, Los Angeles (UCLA), who have developed a phone-based diagnostic system – or a mobile medical lab.  

Traditional enzyme-linked immunosorbent assay (ELISA) is a diagnostic tool that identifies antigens such as viruses and bacteria in blood samples. ELISA can detect several diseases, including HIV, West Nile virus and hepatitis B, and is widely used in hospitals. It can also be used to identify potential allergens in food.

Traditional ELISA testing is performed with small, transparent plates that resemble honeycombs, typically with 96 tiny wells. Samples are placed in the wells first, followed by small amounts of fluid containing specific antibodies that bind to antigens in the samples. These antibodies are linked to enzymes, so, when a substance containing the enzyme’s substrate – the molecule the enzyme acts on – is added, the resulting reactions cause a change in colour. This colour change is then analysed to detect and quantify any antigens that may be present.
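As a loose illustration of that final quantification step, the short Python sketch below turns per-well colour-intensity (optical density) readings into positive or negative calls against a cutoff derived from negative-control wells. The cutoff rule and the numbers are assumptions for illustration only, not part of any specific ELISA protocol.

```python
# Minimal sketch: calling wells positive or negative from their
# optical-density (colour-intensity) readings. The cutoff rule used here
# (mean of negative controls plus three standard deviations) is one common
# convention, assumed purely for illustration.

from statistics import mean, stdev

def call_wells(sample_od, negative_control_od):
    """Return a positive/negative call for each sample well."""
    cutoff = mean(negative_control_od) + 3 * stdev(negative_control_od)
    return {well: ("positive" if od > cutoff else "negative")
            for well, od in sample_od.items()}

# Example: three negative-control wells and three patient-sample wells.
negatives = [0.08, 0.10, 0.09]
samples = {"A1": 0.95, "A2": 0.11, "A3": 0.42}
print(call_wells(samples, negatives))   # A1 and A3 read as positive
```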

 

Building-in mobility

Aydogan Ozcan, associate director of the California NanoSystems Institute at UCLA, says: “Conventional 96-well plate readers are bulky and costly, limiting their use to well-resourced settings.”

To overcome those drawbacks, he and a team of researchers at the institute have developed a smartphone-based device that can read ELISA plates in the field, with the same level of accuracy as the large machines normally found in clinical laboratories.

The device, which is created with a 3D printer and attaches to a smartphone, illuminates the ELISA plate with an array of light-emitting diodes. The light projects through each well and is collected by 96 individual plastic optical fibres in the attachment. The smartphone transmits the resulting images to UCLA servers through a custom-designed app. The images are then analysed by a machine-learning algorithm tailored for this purpose, and the diagnostic results are sent back to the phone within about a minute for the entire 96-well plate. The app also creates a visualisation of the results for the user.
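As a rough, hypothetical sketch of that server-side step (not the UCLA team's actual algorithm), the Python below labels each of a plate's 96 wells from a single transmitted-light value using a simple trained classifier. The training data, feature and model choice are all invented for illustration.

```python
# Hedged sketch of the server-side analysis described above: one light
# intensity per well arrives from the phone attachment, and a trained
# classifier returns a call for each of the 96 wells.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in training data: wells known to be negative transmit more light
# (little colour change), positives transmit less. Values are synthetic.
X_train = np.concatenate([rng.normal(0.9, 0.05, 200),   # negative wells
                          rng.normal(0.4, 0.10, 200)])  # positive wells
y_train = np.array([0] * 200 + [1] * 200)

model = LogisticRegression().fit(X_train.reshape(-1, 1), y_train)

# A new 96-well plate: one intensity per well, as read through the fibres.
plate = rng.uniform(0.3, 1.0, 96)
calls = model.predict(plate.reshape(-1, 1))   # 0 = negative, 1 = positive
print(f"{calls.sum()} of 96 wells read as positive")
```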

 “Our ELISA plate reader is much more compact and lighter, and is hand-held. It does not involve any mechanical scanners or complex, expensive optics, making it significantly more cost-effective and also mobile,” says Ozcan.

All the clinical testing and evaluations were compared against “gold-standard Food and Drug Administration-approved technologies” used in the UCLA Health System, he says. The researchers successfully tested the mobile platform in the UCLA clinical microbiology laboratory, running FDA-approved virus ELISA tests on 567 patient samples for training and 571 for blind testing. The platform achieved accuracies of 99.6%, 98.6%, 99.4% and 99.4% for mumps, measles and two variants of herpes tests respectively.

 

Out of hospital

Ozcan envisions that small point-of-care offices could run these portable ELISA tests, using the phone-based readers. “This will bring some of the functions of an advanced hospital to small nurse or point-of-care offices for biomedical testing and diagnostics,” he says.

A patent application has been filed through the UCLA Office of Intellectual Property, Ozcan has licensed the technology via his start-up company Cellmic, and he will soon introduce the product to the healthcare market. 

 

Growing possibilities

This kind of smartphone application looks set to grow, with product development company 42 Technology and the University of Cambridge among those in the UK working on smartphone-enabled portable diagnostic devices.

Meanwhile, researchers at New York University (NYU) Tandon School of Engineering have been working to improve a different facet of healthcare: patient rehabilitation. 

Stroke patients must typically go through repetitive and arduous physical rehabilitation to relearn basic skills and motor functions, such as regaining strength in a weakened hand. To do so, they perform lifting and grabbing tasks with the good hand, form ‘mental models’ of the grasping and lifting forces, and then try to replicate these with the affected hand.

A lot of measurements must be taken during these kinds of rehab tasks, often using force sensors placed on an object (sensors that can cost up to $10,000) and expensive software to analyse the results, says NYU Tandon professor of mechanical and aerospace engineering Vikram Kapila, who led a team of students on the project.

“Something like that is useful to develop and validate a research protocol in a research setting. But it is not so useful for deployment in a clinical setting, or giving the device to the patient to use at his or her home,” he says.

Patients recovering from strokes are often reluctant to keep returning to a clinical or sterile hospital environment to carry out their rehabilitation, preferring to exercise at home. However, even at home, a patient’s engagement with rehabilitation may decline.

To overcome these problems with traditional methods, Kapila and his research team developed wearable mechatronic devices equipped with off-the-shelf sensors: a jacket to measure arm placement, a glove to measure wrist and finger placement and finger joint angles, and a finger trainer built of “hand-friendly, compliant material”. All are connected inexpensively by a smartphone. 

 

Virtual reality

When a patient performs an exercise assigned by a physician or physical therapist, microcontrollers quantify the action – measuring grip strength on a bottle, for example – and display that information via the smartphone to both the patient and medical provider. Rather than mindlessly repeating the exercise, patients engage in various virtual-reality games that allow them to observe the performance of the unaffected side of the body and mimic it with the affected side. For example, when a patient lifts an object, a ball on screen changes in diameter. Patients must keep the ball’s size consistent and between two vertical lines, and the ball switches between red and green to tell the patient whether they are using the correct amount of grip force.
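A minimal sketch of that feedback loop, with assumed target-force and scaling values rather than NYU Tandon's actual parameters, might look like this:

```python
# Illustrative sketch of the on-screen feedback described above: map a
# measured grip force to a ball diameter and a red/green colour cue.
# The target band and scaling constants are assumptions for illustration.

TARGET_FORCE = 10.0      # newtons needed to hold the object steadily (assumed)
TOLERANCE = 1.5          # acceptable deviation before feedback turns red
PIXELS_PER_NEWTON = 8    # how strongly the ball diameter tracks grip force

def feedback(grip_force_newtons: float) -> tuple[int, str]:
    """Return (ball diameter in pixels, colour) for the current grip force."""
    diameter = int(grip_force_newtons * PIXELS_PER_NEWTON)
    within_band = abs(grip_force_newtons - TARGET_FORCE) <= TOLERANCE
    colour = "green" if within_band else "red"
    return diameter, colour

# Example readings streamed from the glove's force sensor.
for force in (6.0, 9.5, 10.2, 13.0):
    print(force, "N ->", feedback(force))
```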

Because the microcontrollers are attached to easy-to-wear garments, exercising can easily be integrated into a patient’s day-to-day activities rather than treated as a separate, unwelcome task, says Kapila. 

The group testing the smartphone-enabled exercises found them more intuitive and natural than using games on computer screens or having to grasp a device fitted with sensors but no user interface, says Ashwin Raj Kumar, a mechanical engineering student at NYU Tandon who helped to develop working prototypes of the devices.

The mechatronic devices would sell for $1,000, providing comparable measurement results to existing research-standard devices selling for eight times that amount. The team is seeking funding to build and test further iterations, and hopes to eventually commercialise the technology for home use. 

Kapila has high hopes for the technology. He expects that, just as smartphones are changing the commercial fitness market, with products such as Fitbit and exercise apps bringing exercise back into the home, smartphone-enabled technology will “transform the rehabilitation community”.

Another firm working on a healthcare app, this time for monitoring diabetes, is Outcomes Based Healthcare (OBH). The London company has partnered with Texas software developer SoftServe to create a healthcare app, Sense360, that combines ‘passive data’ from phone sensors with ‘active data’ from patient feedback. The first aim for the app is to correlate behavioural patterns with the health of patients with diabetes. 

Alex Amelin, senior vice-president of client success at SoftServe, says: “Everyone has a cellphone, and it knows a lot about you. When we are sick, we begin to use our telephone function in slightly different ways. For example, you might have more missed calls, or change how often you use social network apps, how you are using wi-fi or Bluetooth, or even temperature or lighting changes. This is all gathered using sensors that already exist in a modern cellphone.”

The app gathers, stores and analyses this data using machine-learning algorithms. Users are also asked to answer simple questions each day through a user-friendly interface. When the patient first uses the app, these questions are personalised towards their own healthcare outcome goals. This system contrasts with traditional methods, where patients are asked a predefined set of questions on a form to measure treatment outcomes. The problem with the latter is that patients can give unreliable data, or not engage with the process because it feels impersonal.
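As a hypothetical sketch (not Sense360’s actual schema or model), the passive phone signals and the patient's daily answers might be combined into a single feature vector for a learning algorithm like this:

```python
# Hedged sketch: combining passive phone signals with an active daily answer
# into one feature vector. The feature names and values are invented for
# illustration; they are not Sense360's actual data model.

from dataclasses import dataclass, asdict

@dataclass
class DailyRecord:
    # Passive data, read from the phone's own logs and sensors.
    missed_calls: int
    social_app_minutes: float
    bluetooth_connections: int
    ambient_light_lux: float
    # Active data: the patient's answer to a personalised daily question,
    # e.g. "How close did you feel to 'good quality of life' today?" (0-10).
    outcome_score: int

def to_feature_vector(record: DailyRecord) -> list[float]:
    """Flatten a day's record into the numeric vector a model would consume."""
    return [float(v) for v in asdict(record).values()]

day = DailyRecord(missed_calls=4, social_app_minutes=12.5,
                  bluetooth_connections=1, ambient_light_lux=80.0,
                  outcome_score=5)
print(to_feature_vector(day))
```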

OBH has conducted an alpha test of its app using a handful of people with diabetes, and plans to undertake more extensive testing later in the year. Rupert Dunbar-Rees, chief executive at the firm, says that in general terms it found that patients using Sense360 fitted the anticipated behaviour of “when we are sick or unhappy, we use our phones differently”. 

He adds: “The much harder question is: which combination of sensors gives us most insight into what is happening, or what is about to happen? We strongly believe that artificial intelligence and machine learning have the answer to this question. It is only a matter of time before our phones can detect a wide range of health problems before they have even become apparent. We see this app as an essential step in that direction.”

While it’s still early days, the first regulator-approved apps are beginning to appear in app marketplaces. Dr Christopher Huckvale of Imperial College London, who has researched the scope for smartphone and other smart-device apps to support the care of long-term conditions, says that current offerings do provide a modest improvement in self-care for people with conditions such as diabetes, but that the benefits seem less clear for other conditions, such as asthma. But he says the use of such diary-based self-management apps is unlikely to be worse than any other method of supporting people, so if patients prefer it, it should not be discouraged.

However, Huckvale adds: “We don’t yet know if an app can be cost-effective compared with paper tools, nor if your average GP or hospital – rather than a centre of digital excellence – can easily put them into practice.”

Questions of data security must also be considered if the use of health apps grows, although Huckvale believes the risks associated with the theft of private health information are low.

Cheaper, smartphone-enabled medical diagnostic tools, and more engaging rehabilitation methods, do appear to be a logical step forward, particularly for use in poorer countries with limited access to healthcare. However, at this early stage, the benefits of health monitoring apps, especially if they fall into the hands of insurance firms, seem less obvious.

Did you know? Mobile possibilities

In 2015, more than three-quarters (76%) of adults in the UK owned a smartphone, according to research carried out by Ipsos Mori for consulting firm Deloitte.

What matters most: Diabetes patients can use the OBH Sense360 app to pick treatment outcomes that mean the most to them, such as ‘good quality of life’ or ‘free from symptoms’. OBH hopes this will improve patient engagement when monitoring their illness.
