Follow-Up: Garmin Vivoactive HR & Polar H10: Which measures heart rate more accurately?

Heart Rate Measurement Using Garmin & Polar Wearables

This study compares heart rate measurements from two devices: a wrist-based sensor (Garmin Vivoactive HR) and a chest strap (Polar H10). The units are shown in Figure 1 below.

Figure 1: Polar H10 chest strap and Garmin Vivoactive HR smartwatch used in the comparison.

This follow-up covers 20 minutes of water rowing with both units worn simultaneously, in an effort to assess the consistency and reliability of their heart rate measurements. Both the watch and the chest strap were properly attached, with no movement between the devices and the skin. Data were downloaded from each device and processed in a Microsoft Excel spreadsheet, then time-synchronized so that corresponding data points from each device were associated in time. A summary of the analysis follows.

Time-Based Plots of Heart Rate

Overlay scatter plots of heart rate measurements versus time are shown in Figure 2.

Figure 2: Heart rate measured while water rowing for approximately 20 minutes. Shown are overlays of Garmin Vivoactive HR and Polar H10 heart rate versus time.

A general observation is that the heart rate measurements from the two devices overlap reasonably well to the naked eye. There are, however, notable dropouts, particularly in the wrist-based sensor, that appear as deviations between the two signals. This can be seen more readily in the correlation plot of Figure 3; a correlation coefficient of 0.91 was computed between the two sets of measurements. It should be noted that the wrist-based sensor was snug, with no movement on the wrist, and the ambient temperature was approximately 80 °F.

As I showed in a previous post, the wrist-based sensor suffered from a serious issue of data dropouts, with significant time lags between measurements, and the same behavior was observed here. For comparison, I show histograms of the time intervals between measurements for the wrist-based sensor (Figure 4) and the chest strap (Figure 5). The wrist-based sensor experienced a significant number of events in which the time between measurements was greater than one second: only 83 intervals between successive measurements were one second or less, and one gap was as long as 40 seconds. The overall number of measurements was thus reduced to approximately 430 during the workout. The chest strap, on the other hand, measured consistently at one-second intervals, for a total of approximately 1,320 measurements.

Figure 3: Scatter plot of heart rate as measured by the wrist-based Garmin device versus the chest-strap Polar H10. A correlation coefficient of 0.91 was determined between the measurements. Perfect correlation is shown by the diagonal line.
Figure 4: Histogram of time between measurements for the Garmin wrist-based sensor. Note the significant number of measurements in which the interval is greater than 1 second (the advertised measurement interval): there were 20 instances in which the interval was 6 seconds, and one instance in which it was 40 seconds. Only 83 measurements fell within the one-second bin.
Figure 5: Histogram of time between measurements for the Polar H10 chest-strap sensor. All measurements (of which there were more than 1,300) were reliably at one-second intervals.
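
The interval analysis behind Figures 4 and 5 is easy to reproduce. Below is a minimal Python sketch; the CSV file names and the time_s column (elapsed seconds of each sample) are placeholders of my choosing, not the Garmin or Polar export format:

import pandas as pd
import matplotlib.pyplot as plt

for name, path in [("Garmin Vivoactive HR", "garmin_rowing.csv"),
                   ("Polar H10", "polar_rowing.csv")]:
    t = pd.read_csv(path)["time_s"]    # elapsed seconds of each sample
    gaps = t.diff().dropna()           # time between successive measurements
    print(f"{name}: {len(gaps)} intervals, "
          f"{(gaps <= 1).sum()} at one second or less, "
          f"max gap {gaps.max():.0f} s")
    plt.hist(gaps, bins=range(0, 45), alpha=0.5, label=name)

plt.xlabel("Time between measurements (s)")
plt.ylabel("Count")
plt.legend()
plt.show()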

Conclusions

Chest straps are much more reliable for heart rate measurement than wrist-based sensors. Users of wrist-based heart rate sensors should be aware that measurements can be in question, as the results here illustrate. This is not to say that chest straps are the gold standard: ECG measurements such as those obtained during stress testing are clearly of diagnostic quality. Yet for rate measurement, chest straps are quite adequate and seemingly reliable.

Garmin Vivoactive HR & Polar H10: Which measures heart rate more accurately?

Figure 1: Polar H10 chest strap and Garmin Vivoactive HR smartwatch used in the comparison.

Heart Rate Measurement Using Garmin & Polar Wearables

This study compares heart rate measurements from the Garmin Vivoactive HR and the Polar H10 chest strap. Three different types of tests were conducted while the author wore both devices; the units are shown in Figure 1. The Garmin unit supports a number of sports, including rowing, and measures heart rate, stroke rate, distance per stroke, and split times, while also providing location tracking during the workout. Data can be uploaded to http://connect.garmin.com/ and are available for download in TCX (an XML format), with splits available in CSV format. The Polar H10 is strapped around the chest just below the level of the breastbone. This unit, too, uploads data to the http://flow.polar.com/ web site, where data can likewise be downloaded in TCX format; a short parsing sketch follows the activity list below. To provide some variety, I considered three different activities:

  • General workout, involving weight lifting, sit-ups, squats;
  • Walking for 1 mile; and,
  • Indoor rowing for 15 minutes.
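
Because both sites export TCX, extracting the time and heart-rate series for analysis is straightforward. The sketch below uses Python's standard library; the file name workout.tcx is a placeholder, and the element names follow the Garmin TrainingCenterDatabase v2 schema that TCX files use:

import xml.etree.ElementTree as ET

NS = {"tcx": "http://www.garmin.com/xmlschemas/TrainingCenterDatabase/v2"}

root = ET.parse("workout.tcx").getroot()
for tp in root.iter("{%s}Trackpoint" % NS["tcx"]):
    time = tp.findtext("tcx:Time", namespaces=NS)                   # Zulu time stamp
    hr = tp.findtext("tcx:HeartRateBpm/tcx:Value", namespaces=NS)   # beats/minute
    if time and hr:
        print(time, hr)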

In all cases, both the Vivoactive HR and the H10 were worn, with the Vivoactive HR snugly affixed to the left wrist. Both the watch and the chest strap were properly attached, with no movement between the devices and the skin. Data were collected, downloaded, and processed in a Microsoft Excel spreadsheet. The data were time-synchronized so that corresponding data points from each device were associated in time, and plots of the measurements were made.

Time-Based Plots of Heart Rate

Overlay scatter plots of heart rate versus time were made for all three activities, shown in Figure 2 through Figure 4. Data were downloaded from the Garmin and Polar cloud sites and loaded into MS Excel, then time-synchronized using Visual Basic to align the measurements.

Figure 2: Heart rate measured while walking 1 mile. Scatter overlay of Garmin Vivoactive HR and Polar H10 heart rate versus time.
Figure 3: Heart rate measured during general exercise activity. Scatter overlay of Garmin Vivoactive HR and Polar H10 heart rate versus time.
Figure 4: Heart rate measured while rowing indoors on a Concept 2 ergometer. Scatter overlay of Garmin Vivoactive HR and Polar H10 heart rate versus time.
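
I used Visual Basic inside the workbook for the time synchronization; the same alignment can be sketched in a few lines of Python. The file and column names below are placeholders (each file is assumed to hold one sample per row, with elapsed seconds and heart rate):

import pandas as pd

garmin = pd.read_csv("garmin.csv")[["time_s", "hr"]].astype(float)
polar = pd.read_csv("polar.csv")[["time_s", "hr"]].astype(float)

# For each Garmin sample, take the nearest Polar sample within one second.
paired = pd.merge_asof(
    garmin.sort_values("time_s"), polar.sort_values("time_s"),
    on="time_s", suffixes=("_garmin", "_polar"),
    direction="nearest", tolerance=1.0).dropna()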

Heart Rate Comparison: Walking

Measurements of heart rate were taken during a one-mile walk, the two series were plotted against one another, and the correlation coefficient was computed between them. In the comparison shown in Figure 5, the correlation was rather poor: the correlation coefficient was determined to be -0.54. Perfect correlation is given by the diagonal line in the figure. It is interesting to note that the data points taken from the Vivoactive HR exhibit substantial variance in measurement timing: in some instances the time between measurements was as high as 47 seconds, with 62 measurements in the 12-14 second interval range, whereas all Polar H10 measurements arrived at one-second intervals. The Polar H10 measurements were thus far denser than those of the Vivoactive HR.

Figure 5: Scatter plot of heart rate measured while walking one mile using the Polar H10 versus Garmin Vivoactive HR. A correlation coefficient of -0.54 was determined between the two sets of measurements. Perfect correlation is shown by the diagonal line.
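
For completeness, the correlation coefficients quoted in this and the following sections can be computed directly from the paired series. This continues the hypothetical paired frame from the synchronization sketch above:

import numpy as np

r = np.corrcoef(paired["hr_garmin"], paired["hr_polar"])[0, 1]
print(f"correlation coefficient: {r:.2f}")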

Heart Rate Comparison: General Activity

In the case of general activity, which included some weight lifting, sit-ups, leg raises, and standing exercises, the heart rate comparison is shown in Figure 6. The correlation coefficient is a bit higher here, at 0.60. The variation in measurement collection time for the Vivoactive HR was even greater, with one measurement interval as long as 88 seconds!

Figure 6: Scatter plot of heart rate measured while performing weight lifting, sit-ups, and general standing exercises using the Polar H10 versus Garmin Vivoactive HR. A correlation coefficient of 0.60 was determined between the two sets of measurements. Perfect correlation is shown by the diagonal line.

I have hypothesized that the wide variation in data collection time may be due to arm motion that is not experienced to the same degree in walking. I have also hypothesized that the improved correlation may be due to the higher heart rate, which is more easily detected by the Vivoactive HR. Some supporting evidence of this appears in the final section on indoor rowing.

Heart Rate Comparison: Indoor Rowing

Rowing on a Concept 2 with the PM5 monitor while wearing both the Vivoactive HR and the Polar H10 produced the results illustrated in Figure 7. The correlation between the two devices is much higher here, with a correlation coefficient of 0.95. Several items of note: the variation in measurement timing for the Vivoactive HR is much lower, with only two measurements 19 seconds apart and most intervals at 1-2 seconds, conforming much more closely to the 1-second intervals of the Polar H10. Furthermore, heart rates were much higher here, with some measurements as high as 165 beats/minute (during sprints). In general, agreement between the two units improves as heart rate increases, which could be due to more accurate peripheral measurement at higher rates.

Figure 7: Scatter plot of heart rate measured during indoor rowing using the Polar H10 versus Garmin Vivoactive HR. A correlation coefficient of 0.95 was determined between the two sets of measurements. Perfect correlation is shown by the diagonal line.

Conclusions

Based on the limited sampling and workouts thus far, the general conclusion regarding heart rate measurement “trust” is that the Polar H10 is more reliable, based on two observations: (1) the data collection interval remains consistent at 1 second; and (2) data density remains high, with no dropouts in any of the workouts. This is no surprise in general, as the conventional wisdom is that chest straps are much more reliable; my aim here was to quantify that reliability using objective measures. It should be noted that while heart rate remains somewhat questionable on the Vivoactive HR, I have found its stroke rate measurement to be dead-on accurate in comparison with the Concept 2 PM5 (at least based on the data I have observed).

Rowing Data Analytics: Reducing and Studying the Rowing Workout

In my last post (“Rowing Data…”) I discussed the steps associated with downloading the Garmin Vivoactive HR data from Garmin Connect to an Excel spreadsheet. In this post, I’m going to take the reader through the analysis of the data as a tutorial and guide for assessing certain elements of these data.

Raw data in Excel format are shown in Figure 1. I am going to focus on distance (column M), speed (column N), and heart rate (column O).

Figure 1: Downloaded rowing worksheet from Garmin Connect

I normally like to study discrete, time-based data by translating the time component from Zulu time (column L) into a relative time from the start of the workout, expressed in seconds as the base unit.

To do so, we can take advantage of Excel's built-in formula capabilities. For example, the start time listed in column L begins with the entry:

2017-07-08T14:09:31.000Z

The next entry is:

2017-07-08T14:09:34.000Z

These are “Zulu,” or absolute, time references. We wish all subsequent times to be keyed to the first time. To do so, we need to translate each entry into a time in seconds by parsing its elements. These entries occupy cells L2 and L3 of the original sheet, respectively.

Each element is translated into seconds by parsing the hours, minutes and seconds using the following formula:

=MID(A2,12,2)*60*60+MID(A2,15,2)*60+MID(A2,18,2)

The first component extracts the hours and converts them to seconds; the second extracts the minutes and converts them to seconds; the third extracts the seconds element by itself. The total time is the sum of the three components. Written with relative references, as above, the formula can be filled down the column to convert every time stamp.

My normal practice, then, is to copy the contents of the initial spreadsheet into a new sheet adjacent to the original and work on the copy. I am presently developing an application that will perform this function automatically, but here I am “walking the track” of the analysis by hand in order to chronicle the mathematics surrounding the process.

The hour, minute and second can be extracted as separate columns. Let us copy the contents of column L in the original spreadsheet into a new sheet within the existing workbook and place the time in column A of that new sheet. Thus, the entries in this sheet would appear as follows:

ns1:Time Absolute Time (seconds) Relative Time (seconds)
2017-07-08T14:09:31.000Z 50971 0
2017-07-08T14:09:34.000Z 50974 3
2017-07-08T14:09:35.000Z 50975 4

The absolute time in the middle column is the time in seconds represented by the left-hand column, relative to midnight Zulu time. The right-hand column is the time relative to the first entry in the middle column: zero corresponds to 50971 - 50971, the entry of three seconds corresponds to the difference between 50974 (second entry) and 50971 (first entry), and so on.
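
The same parsing can be expressed outside of Excel. Here is a small Python check of the arithmetic, using the three time stamps shown above:

timestamps = [
    "2017-07-08T14:09:31.000Z",
    "2017-07-08T14:09:34.000Z",
    "2017-07-08T14:09:35.000Z",
]

def seconds_since_midnight(ts):
    # Characters 11:13 hold the hours, 14:16 the minutes, 17:19 the seconds,
    # mirroring MID(...,12,2), MID(...,15,2), and MID(...,18,2) in the Excel formula.
    return int(ts[11:13]) * 3600 + int(ts[14:16]) * 60 + int(ts[17:19])

absolute = [seconds_since_midnight(t) for t in timestamps]
relative = [a - absolute[0] for a in absolute]
print(absolute)  # [50971, 50974, 50975]
print(relative)  # [0, 3, 4]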

I also created some columns to validate the reported parameters. For instance, the reported total distance and speed (in meters and meters per second, respectively) appear in columns M and N, and heart rate in column O. In the new sheet I added a derived estimate of total distance, computed as the integral of speed over time: the incremental distance dS is the speed v at that time multiplied by the time differential dt between the current and previous time stamps, and the total distance is the running sum of these increments. This appears as column G in the new worksheet, shown in Figure 2.
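
As a sketch of that running integral, the derived-distance column can be computed as follows (hypothetical file and column names, with time in seconds and speed in meters per second):

import pandas as pd

df = pd.read_csv("rowing.csv")       # columns: time_s, speed
dt = df["time_s"].diff().fillna(0)   # time differential between samples
df["derived_distance"] = (df["speed"] * dt).cumsum()   # running sum of speed * dt, in meters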

Figure 2: Modified rowing spreadsheet with derived time, range and longitude-latitude calculations.

What follows now are plots of the raw and derived data. First, the heart rate measurement over time is shown in Figure 3. Note that the resting rate is shown at first. Once the workout intensifies, heart rate increases and remains relatively high throughout the duration of the workout.

Figure 3: Workout heart rate (pulse) versus time.

The total distance covered over time is shown in Figure 4. The linear behavior over the 8,700+ meters implies a relatively constant speed during the workout.

Figure 4: Workout range versus time. Note linear behavior, indicating relatively constant speed.

The reported speed, as measured via GPS, shows variability but is typically centered about 1.85 meters per second. The speed over time is shown in Figure 5.

Figure 5: Workout measured speed versus time. Average is 1.85 meters per second.

The GPS coordinates are also available through the Excel data. I have subtracted out the starting location in order to provide a relative longitude-latitude plot of the workout, shown in Figure 6.

Figure 6: Relative longitude and latitude of the workout.

In my next post I will focus on the athletic aspects of the workout related to training.

Arterial Blood Pressure Signal Tracking

Filtering of Arterial Blood Pressure Signal Artifact using the Extended Kalman Filter

Arterial blood pressure signal (from MIMIC II Database) with measurements and tracking signal overlaid.

The figure above depicts several seconds of raw arterial blood pressure (ABP) data obtained from a patient within the MIMIC II physiologic waveform database [1,2].

This figure shows the raw signal with a tracking signal, based on the extended Kalman filter (EKF), overlaid. In this case the signal error and process noise are very small (signal noise 0.1 mmHg, process noise 0.5 mmHg). With these settings the filter tracks the actual signal very closely, making it appear as if there were no difference between measurement and track.
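
To illustrate the tracking mechanics without reproducing the full EKF (which is developed in the linked PDF), here is a minimal scalar Kalman filter with an assumed random-walk state model; the noise arguments correspond to the settings quoted above:

def track(measurements, meas_std=0.1, proc_std=0.5):
    # Scalar Kalman filter: state x is the estimated ABP, p its variance.
    x, p = measurements[0], 1.0
    r, q = meas_std ** 2, proc_std ** 2   # measurement and process variances
    estimates = []
    for z in measurements:
        p = p + q              # predict: random walk inflates variance
        k = p / (p + r)        # Kalman gain
        x = x + k * (z - x)    # update toward the new ABP sample
        p = (1 - k) * p
        estimates.append(x)
    return estimates

With the process noise large relative to the measurement noise, the gain k stays near one and the track hugs the measurements, which is exactly the behavior seen in the figure.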

The full analysis is available at the following link in PDF form:

ABP Tracking via EKF

[1] M. Saeed, M. Villarroel, A.T. Reisner, G. Clifford, L. Lehman, G.B. Moody, T. Heldt, T.H. Kyaw, B.E. Moody, R.G. Mark. Multiparameter intelligent monitoring in intensive care II (MIMIC-II): A public-access ICU database. Critical Care Medicine 39(5):952-960 (2011 May); doi: 10.1097/CCM.0b013e31820a92c6.

[2] Goldberger AL, Amaral LAN, Glass L, Hausdorff JM, Ivanov PCh, Mark RG, Mietus JE, Moody GB, Peng C-K, Stanley HE. PhysioBank, PhysioToolkit, and PhysioNet: Components of a New Research Resource for Complex Physiologic Signals. Circulation 101(23):e215-e220 [Circulation Electronic Pages; http://circ.ahajournals.org/cgi/content/full/101/23/e215]; 2000 (June 13).


Alarm Fatigue? What a Nuisance!

Alarm Fatigue

“Hospital staff are exposed to an average of 350 alarms per bed per day, based on a sample from an intensive care unit at the Johns Hopkins Hospital in Baltimore.”[1]

From the same survey, almost 9 in 10 hospitals indicated they would increase their use of patient monitoring, particularly capnography and pulse oximetry, if false alarms could be reduced [2].

“Of those hospitals surveyed that monitor some or all patients with pulse oximetry or Capnography, more than 65 percent have experienced positive results in terms of either a reduction in overall adverse events or in reduction of costs.”[3]

Attenuating Alarm Signals

The challenge in attenuating alarm data is achieving a balance: communicating the essential, patient-safety-specific information that gives clinical staff proper notification, while minimizing the excess, spurious, and non-emergent events that do not indicate a threat to patient safety. In the absence of contextual information, the usual option is to err on the side of excess, because missing an emergent alarm or notification carries the potential for very high cost (e.g., patient harm or death).
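
As one concrete (and deliberately simple) example of attenuation, an alarm can be required to persist for several consecutive samples before it is annunciated, suppressing single-sample spikes at the cost of a short notification delay. The limit and window below are illustrative placeholders, not clinical recommendations:

def persistent_alarm(samples, limit, persistence=5):
    # Yield True only once the value has exceeded `limit` for `persistence`
    # consecutive samples; spurious one-sample excursions never annunciate.
    run = 0
    for value in samples:
        run = run + 1 if value > limit else 0
        yield run >= persistence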

Analysis

The purpose of this study is to survey some of the mathematical techniques for mitigating alarm fatigue: the techniques and options available for evaluating real-time data. The objective is to open a dialog for further research into the use of such techniques as appropriate. Clearly, patient safety, regulatory, staff fatigue, and other factors must be taken into account in aligning on a best approach or practice (if one can even be identified). These aspects of alarm fatigue are intentionally omitted from the discussion at this point (to be taken up at another time) so that a pure study of the physics of the parameter data, and techniques for analyzing them, can be explored.

References

[1] Ilene MacDonald, “Hospitals rank alarm fatigue as top patient safety concern”, Fierce Healthcare. January 22, 2014.

[2] Wong, Michael; Mabuyi, Anuj; Gonzalez, Beverly; “First National Survey of Patient-Controlled Analgesia Practices.” March-April 2013, A Promise to Amanda Foundation and the Physician-Patient Alliance for Health & Safety.

[3] Ibid.


Opioid-induced respiratory depression and monitoring patients diagnosed with obstructive sleep apnea

HIMSS Future Care posting:

Managing patients on the general care floor (GCF) who are either at risk for or diagnosed with obstructive sleep apnea (OSA), particularly those who meet the STOP-BANG criteria for OSA, can be quite challenging. The ECRI Institute, a federally certified patient safety and research organization, placed “Undetected Opioid-Induced Respiratory Depression” at Number 4 on its 2017 list of Top 10 Health Technology Hazards [1]. The use of opioids for treatment of acute postoperative pain is commonplace, and patients at risk for OSA, if left unattended, can experience anoxic brain injury or death.

Nanotechnology: A Key to Future Diagnosis and Treatment?

Will the future of medicine rely on nanotechnology for treatment of disease?

The average size of the avian influenza virus is on the order of 100 nanometers, or 0.1 microns, placing it at the scale of nanotechnology. That a virus so small can wreak such havoc on the human body is a testament to the complex mechanisms associated with these infections; the ability to ward off such infections is likewise a testament to the awesome nature of the human immune system. By comparison, the width of a typical human hair is on the order of 100,000 nanometers (estimates range from 50,000 to 150,000, depending on the specific type of hair).

Now consider the field of nanotechnology, which focuses on the manufacture and fielding of mechanical and electronic devices of microscopic size, typically on the order of 100 nanometers or smaller. The National Cancer Institute (NCI) provides a fairly detailed overview of the use of nanotechnology in cancer treatment, and the NCI Alliance for Nanotechnology in Cancer provides a focal point for public and private investigation of nanotechnology applied to cancer treatment. Researchers and companies have been investigating the manufacture of devices at this scale and smaller for the treatment of disease, and a major focus, not surprisingly, is cancer. Specific methods and modes of delivery vary; examples include outfitting tiny “robots” with markers that burrow into and attach themselves to cancerous cells, enabling the treatment and destruction of malignant cells. A major benefit of this approach over traditional radiation and chemotherapy is that malignancies can be targeted directly, without attacking or otherwise harming healthy cells. This is a major advancement, since many current therapies attack cells indiscriminately, killing healthy and malignant cell material alike. When battling this terrible disease, the last thing needed is to destroy the healthy cells on which the individual depends for sustenance and survival. Nanotechnology thus provides a mechanism for delivering targeted, customized, tailored therapy.

What about nanotechnology for diagnosis?

While we are only at the cutting edge of applying these technologies, the vision is real and extremely promising. Treatment is just one aspect of nanotechnology use; diagnosis is another, in which nanoparticles can assist in imaging potential malignancies. While almost a cliché, the aging of the baby-boomer population will drive many of these new technologies, applications, and initiatives. It is almost a tautology that early diagnosis of disease translates into a higher likelihood of survival. Technologies that support early diagnosis are therefore of great value and will enable better, more efficient, and more accurate treatment of disease going forward. As a member of this generation (albeit at the tail end), I am very encouraged by and supportive of this research. When my mother passed away from breast cancer some 17 years ago, the use of exotic technologies such as nanotechnology was barely an inkling. Indeed, the three oft-used mechanisms for treating cancer have remained surgery, irradiation, and poisoning (chemotherapy); only within the past 10 years or so have alternative therapies been devised that are not simply variants of these three. Research into the targeted treatment of cancer, destroying the genetic material within malignant cells so that they cannot reproduce or receive nourishment, is an astonishing advancement and offers great future promise, a testament to human ingenuity, talent, innovation, and creativity. As in vitro and in vivo medicine evolve, such forward-looking technologies will be essential for early diagnosis and intervention.

Can medical device integration facilitate diagnosis and treatment?

One cannot control what one cannot measure. In vivo measurements are necessary to determine whether any treatment paradigm is working: comparing pre- and post-treatment states to establish the correlation between a treatment modality and its intended effect. In later posts I discuss the use of data taken from medical devices at the point of care to facilitate clinical decision making. These data, whether obtained from the patient externally or internally, form the basis for identifying the state of the patient and trends toward improvement or decompensation.

Healthcare Information Technology and The Future of Medicine

Medicine & Healthcare Information Technology In the Future

When it comes to medicine, healthcare information technology has become an integral part of care. The U.S. Census Bureau expects the population of Americans aged 65 and older to more than double between 2010 and 2050 [1]. This is not good news for the state of medicine in the U.S., particularly when it comes to providing quality care for these patients.

Estimates made prior to the financial crisis that began in the Fall of 2008 projected healthcare expenditures rising to nearly 20% of GDP between 2007 and 2017 [2]. Further compounding this increasing demand, and the concomitant increase in costs, is the declining availability of allied healthcare professionals. Some studies [3] identify likely decreases in the number of physicians entering key specialty areas, including cardiology (20% decrease by 2020), geriatrics (35% of current demand met today), rheumatology (38-day average wait for a new appointment), and primary care (on the verge of collapse). Those of us who are baby boomers are on the leading edge of this demand; in order to mitigate and minimize the cost impacts, it is our challenge and responsibility to innovate and meet these challenges without passing along unnecessary burdens to our children and grandchildren.

Aging & Future Medicine

For most of us, aging means more frequent and severe afflictions, from chronic ailments to more acute care needs and longer hospital stays. Medicine and medical care become an extremely important part of life as we age. Taking care of our health by improving diet, exercising, and maintaining an otherwise active lifestyle is essential to a high-quality life. Yet even with increased vigilance, chronic ailments can affect us later in life, brought on both by our genetics and by the lifestyles we led in our youth. Ailments such as dementia, coronary artery disease, Alzheimer's disease, myocardial infarction, congestive heart failure, macular degeneration, osteoporosis, hypertension, chronic obstructive pulmonary disease, and diabetes take their toll. Managing chronic disease is costly from a logistical perspective in terms of time and money; even more to the point, effective and quality oversight of patients with chronic ailments requires regular review, screening, and monitoring. This is further complicated by the need to serve patients who lack the means, or are physically incapable, of leaving their homes for extended periods. Telehealth and remote monitoring are a means by which a case manager, an individual assigned to oversee the care of chronically ill patients within a home-health setting, can review patient information on a regular basis (for example, daily) and support both the patient and the primary care provider.

Furthermore, intensive care units and emergency departments are becoming more crowded, and healthcare information technology is helping to ameliorate these challenges through improved workflow and richer, more reliable information. Individuals with insurance are going to EDs because they cannot get prompt scheduling with their gatekeepers (family practitioners). The number of individuals with chronic ailments (stroke, CHF, diabetes, COPD, etc.) is on the rise, in part because people are living longer. At the same time, the Medicare and Social Security systems will not be able to sustain the growth in the population over age 65, which means working individuals will increasingly bear the financial burden for us “boomers.” As a result of increased longevity and these fiscal challenges, the retirement age will increase.

Where does healthcare & medicine go? To the home!

So, what do we do? Several things. First, technology in the form of remote data collection and reporting devices and software will become more prevalent: glucometers, BP cuffs, spirometers, and associated software will be more readily available for direct communication with personalized electronic health records. If the purpose of a typical visit is to take a BP reading and a diabetic assessment, this can mostly be handled by collecting data at the point of care (the home) and transmitting them to the physician's office for assessment. The same applies to nursing and assisted-living facilities.

Next, the technical infrastructure required to transmit and store these data will be needed, and paying for it could draw on a number of sources. One possibility: nearly everyone nowadays has access to cable television, and cable companies could offer devices that integrate with existing modems to collect and transmit data to the family practitioner, together with complementary emails to next of kin (e.g., “Your mother's BP as of 8:10 this morning was 145/89”). Other technologies can be used to evaluate and monitor chronic ailments such as macular degeneration, further reducing costs: video cameras at the point of care would let ophthalmologists review retinal changes without requiring an elderly individual to be transported, at expense and time, to a hospital or office. In addition, remote consults via VoIP and video can be supported over the same network, empowering the remote provider to interact with the patient.

All of these technologies are in use in remote pockets around the world today, but they will become more prevalent. These implementations will reduce costs and provide more personalized care in comfortable settings (homes). Of course, nothing takes the place of tactile, hands-on care, but for routine visits the above will be invaluable. In terms of software, personalized medicine will (eventually) become the norm, and telehealth will be key. Support for automated workflow in the acute care environment will also need to be augmented, which means fully integrating all data into the enterprise HIS.

The U.S. Department of Health and Human Services, through its Office of the National Coordinator for Health Information Technology, published operational scenarios focused on providing key information to assist in harmonizing standards on the implementation, certification, and policy implications of robust remote patient monitoring [4]. Included in this assessment are requirements for interacting with personalized health records and enterprise health information systems. The approaches to advancing remote monitoring include seamless communication from medical devices at the point of care (i.e., in a patient's home setting) to a case manager and primary care provider, both through electronic transfer, storage, and display of health information and through remote video and audio interaction with patients in the same home-health setting.

Healthcare information technology is not a silver bullet, but the technologies described above are key enablers for remote health monitoring. Of course, the use of technology implies that sufficient underlying infrastructure exists, which is not always the case in remote areas of the country. Satellite, cable, and fiber-optic coverage is fairly extensive within the continental United States, but pockets and regions remain where it is not. Therefore, a combined effort to extend the communications infrastructure must continue, together with a unified effort to standardize, train, and “in-service” individual care providers on these technologies. One of the best mechanisms for enabling this is through local hospitals and their satellite clinics.

So, how long do we have? The sooner the better. Successful telehealth and remote monitoring programs exist throughout the United States and worldwide today. We should ensure that our elected representatives direct healthcare expenditures toward several specific areas that promote growth and alignment with the objectives of remote monitoring: continuing alignment on electronic personalized health records, expansion of our underlying communications infrastructure, and promotion of common communication standards among these records so that, regardless of location, a patient can communicate his or her information to any physician or allied health professional within the country. In summary: healthcare information technology employing common storage, homogeneous communication, and standardized formats is necessary to support the future of medicine.

References
[1] Population Division, U.S. Census Bureau, August 14, 2008; Table 12: “Projections of the Population by Age and Sex for the United States: 2010 to 2050” (NP2008-T12).

[2] Cinda Becker, “Slow: Budget Danger Ahead,” Modern Healthcare, March 3, 2008.

A suggested method to control heart rate pacing and stroke volume in left-ventricular assist devices and for patients undergoing heart transplantation

Controller Design Concept

Update: this weblog article has been updated recently and a PDF of the document along with the new article is available here: Autonomic Heart Rate Controller Device Concept

I present a concept for autonomic cardiac pacing as a method to augment existing physiological pacing, for both ventricular assist devices (VADs) and heart transplantations. The development that follows represents a vision in an area that has yet to be fully exploited, and the analysis is meant as a starting point for further study. An automatic control system methodology is suggested for both heart rate and contractile force (stroke volume) in patients having either an artificial left ventricular assist device (LVAD) or degenerative performance of the sinoatrial node. The methodology is described both as a device and as an associated operational framework, and is based on measuring the naturally occurring hormones epinephrine, norepinephrine, and dopamine in the return blood flow through the superior vena cava. The quantities of these hormones measured in the bloodstream are used to derive a proportional response in terms of contractile force and pacing of the sinoatrial node. The method of control draws on techniques normally described using cyclic voltammetry, expert systems, and feedback for pacing an artificial assist device.
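
As a purely conceptual sketch of that proportional response, the loop might map measured hormone concentrations to a commanded pacing rate as below. Every constant, the weighting, and the linear form are hypothetical placeholders for illustration; nothing here is derived from physiological data:

def pacing_rate(epi, norepi, dopa, base_rate=60.0, gain=0.5, max_rate=150.0):
    # Hormone concentrations (arbitrary units, e.g., nmol/L from the
    # cyclic-voltammetry sensor) combine into a surrogate for sympathetic
    # drive; the pacing rate rises proportionally, capped at a safe maximum.
    demand = epi + 0.5 * norepi + 0.25 * dopa
    return min(base_rate + gain * demand, max_rate)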

Medical Device Data and Modeling for Clinical Decision Making

Medical Device Data, Modeling & Simulation

This cutting-edge volume is the first book to provide practical guidance on the use of medical device data for bioinformatics modeling. Professionals learn how to develop original methods for communicating with medical devices within healthcare enterprises and for assisting with bedside clinical decision making. The book guides readers in the implementation and use of clinical decision support methods within the context of electronic health records in the hospital environment. Supported with over 100 illustrations, this all-in-one resource discusses key concepts in detail and then presents clear implementation examples to give professionals a complete understanding of how to use this knowledge in the field.

“Medical Device Data and Modeling for Clinical Decision Making” Content Overview:

  • Introduction to Physiological Modeling in Medicine: A Survey of Existing Methods, Approaches and Trends.
  • Simulation and Modeling Techniques.
  • Introduction to Automatic Control Systems Theory and Applications.
  • Physical System Modeling and State Representation.
  • Medical Device Data Measurement, Interoperability, Interfacing and Analysis.
  • Systems Modeling Example Applications.
  • Modeling Benefits, Cautions, and Future Work.