Rowing Data Analytics: Reducing and Studying the Rowing Workout

In my last post (“Rowing Data…”) I discussed the steps associated with downloading the Garmin Vivoactive HR data from Garmin Connect to an Excel spreadsheet. In this post, I’m going to take the reader through the analysis of the data as a tutorial and guide for assessing certain elements of these data.

Raw data in Excel format are shown in Figure 1. I am going to focus on distance (column M), speed (column N), and heart rate (column O).

Figure 1: Downloaded rowing worksheet from Garmin Connect

I normally like to study discrete, time-based data by translating the time component from the Zulu time (column L) into a relative time from the start of the workout. Furthermore, I like to translate these into units of seconds as the base unit.

To do so, we can take advantage of some powerful capabilities contained within formulas inside of Microsoft Excel. For example, the start time listed in column L begins with the entry:

2017-07-08T14:09:31.000Z

The next entry is:

2017-07-08T14:09:34.000Z

These are “Zulu” (UTC) absolute time references. We want all subsequent times keyed to the first one, which requires translating each entry into a time in seconds by parsing its elements. These entries sit in cells L2 and L3, respectively.

Each entry is translated into seconds by parsing the hours, minutes and seconds with a formula of the following form (assuming the timestamp sits in cell A2 of the new sheet):

=MID(A2,12,2)*3600 + MID(A2,15,2)*60 + MID(A2,18,2)

The first component extracts the hours and converts them to seconds. The second component extracts the minutes and converts them to seconds. The third component extracts the seconds element by itself. The total time is the superposition of all three individual components.

Thus, what I normally do is to copy the contents of the initial spreadsheet into a new sheet adjacent to the original and then begin working on the data. Presently, I am in the process of developing an application that will perform this function automatically. Yet, here I am “walking the track” associated with analyzing the data in order to chronicle the mathematics surrounding the process.

The hour, minute and second can be extracted as separate columns. Let us copy the contents of column L in the original spreadsheet into a new sheet within the existing workbook and place the time in column A of that new sheet. Thus, the entries in this sheet would appear as follows:

ns1:Time                     Absolute Time (seconds)   Relative Time (seconds)
2017-07-08T14:09:31.000Z     50971                     0
2017-07-08T14:09:34.000Z     50974                     3
2017-07-08T14:09:35.000Z     50975                     4

The Absolute time in the middle column is the time in seconds represented by the left-hand column relative to Midnight Zulu time. The right-hand column is the time relative to the first cell entry in the middle column. Thus, zero corresponds to 50971-50971. The entry for three seconds corresponds to the difference between 50974 (second entry) and 50971 (first entry), and so on.
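The same translation can be sketched outside Excel. The following Python snippet (a minimal sketch using only the standard library) parses the Zulu timestamps above into seconds since midnight UTC, then into seconds relative to the first sample:

```python
from datetime import datetime

def to_seconds_since_midnight(stamp: str) -> int:
    """Parse an ISO-8601 'Zulu' timestamp into seconds since midnight UTC."""
    t = datetime.strptime(stamp, "%Y-%m-%dT%H:%M:%S.%fZ")
    return t.hour * 3600 + t.minute * 60 + t.second

stamps = [
    "2017-07-08T14:09:31.000Z",
    "2017-07-08T14:09:34.000Z",
    "2017-07-08T14:09:35.000Z",
]
absolute = [to_seconds_since_midnight(s) for s in stamps]  # seconds since midnight
relative = [t - absolute[0] for t in absolute]             # seconds since workout start
```

Running this on the three entries above reproduces the 50971/50974/50975 absolute values and the 0/3/4 relative values in the table.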

I also created some columns to validate parameter entries. For instance, the reported total distance and speed (in units of meters and meters per second, respectively) appear in columns M and N, and the heart rate in column O. I created a new column in the new sheet to provide a derived estimate of total distance, which I computed as the integral of speed over time. The incremental distance, dS, equals the speed at that instant, V, multiplied by the time differential between the current and previous time stamps, dt: dS = V·dt. The total distance is then the integral, or the running sum of this incremental distance and all prior distances. I reflect this as column G in the new worksheet, shown in Figure 2.
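The running integration can be sketched in a few lines of Python. This is not the spreadsheet itself, just an illustration of the dS = V·dt accumulation; the sample times and speeds below are hypothetical:

```python
def derived_distance(times, speeds):
    """Accumulate dS = V * dt over the workout to estimate total distance."""
    total, out = 0.0, [0.0]
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]   # seconds since the previous sample
        total += speeds[i] * dt        # incremental distance dS
        out.append(total)
    return out

# Hypothetical samples: times in seconds, speeds in meters per second
distances = derived_distance([0, 3, 4], [1.80, 1.85, 1.90])
```

The last element of the returned list is the derived total distance, which can be compared against the reported distance in column M as a sanity check.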

Figure 2: Modified rowing spreadsheet with derived time, range and longitude-latitude calculations.

What follows now are plots of the raw and derived data. First, the heart rate measurement over time is shown in Figure 3. Note that the resting rate is shown at first. Once the workout intensifies, heart rate increases and remains relatively high throughout the duration of the workout.

Figure 3: Workout heart rate (pulse) versus time.

The total distance covered over time is shown in Figure 4. The nearly linear behavior over the 8,700+ meters implies a relatively constant speed during the workout.

Figure 4: Workout range versus time. Note linear behavior, indicating relatively constant speed.

The reported speed, as measured via GPS, shows variability but is typically centered about 1.85 meters per second. The speed over time is shown in Figure 5.

Figure 5: Workout measured speed versus time. Average is 1.85 meters per second.
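That average translates directly into the split rowers usually quote: time per 500 meters. A quick check, using nothing beyond the 1.85 m/s figure:

```python
avg_speed = 1.85                   # meters per second, from Figure 5
split_seconds = 500.0 / avg_speed  # standard rowing split: time per 500 m
minutes, seconds = divmod(split_seconds, 60)
print(f"Split: {int(minutes)}:{seconds:04.1f} per 500 m")  # Split: 4:30.3 per 500 m
```

So 1.85 m/s corresponds to roughly a 4:30 split, a comfortable steady-state pace.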

The GPS coordinates are also available through the Excel data. I have subtracted out the starting location in order to provide a relative longitude-latitude plot of the workout, shown in Figure 6.

Figure 6: Relative longitude and latitude of the workout.
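Producing that relative track is just a subtraction of the starting coordinate from each point. The sketch below also scales the offsets into meters using the small-area approximation (roughly 111,320 m per degree of latitude, with longitude shrunk by cos(latitude)); the metric scaling is my own addition, not something in the original worksheet, and the coordinates are hypothetical:

```python
import math

def to_relative_meters(lats, lons):
    """Shift lat/lon so the track starts at the origin, then scale degrees to meters
    using the small-area approximation (valid over a few kilometers)."""
    lat0, lon0 = lats[0], lons[0]
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat0))
    ys = [(la - lat0) * m_per_deg_lat for la in lats]
    xs = [(lo - lon0) * m_per_deg_lon for lo in lons]
    return xs, ys

# Hypothetical GPS samples
lats = [39.000, 39.001]
lons = [-76.400, -76.399]
xs, ys = to_relative_meters(lats, lons)
```

Plotting xs against ys gives the relative track of Figure 6, with the launch point at the origin.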

In my next post I will focus on the athletic aspects of the workout related to training.

Rowing Data: Accessing Heart Rate, Distance and GPS Location from Garmin Vivoactive HR

An Introduction to Garmin Connect

For those who use the Garmin Connect Dashboard to synchronize their Garmin fitness devices, there is a fairly straightforward method for downloading higher-frequency workout data (heart rate, distance and GPS location) that can be directly imported into Microsoft Excel for further analysis.

Getting Started with the Download

From inside of Garmin Connect (Figure 1), select a specific activity. In this example, I am picking my latest rowing workout, shown by the red arrow.

Garmin Connect: Accessing workout data within your Garmin Vivoactive HR.

Once you have selected the specific workout, click on it and this will take you to the details of that workout. Once there, navigate over to the gear on the right-hand side, as shown by the red arrow.

Detailed view of workout within Garmin Connect.

The drop-down box from the arrow shows a number of export options. Select “Export to TCX”, shown by red arrow in Figure 3.

Accessing export function within Garmin Connect.

Upon selection, the file will be downloaded, as shown in Figure 4. On a Windows platform, this will be downloaded by default to the user’s downloads folder.

File download in .TCX format from Garmin Connect.

Once the file is downloaded, go to the file directory and locate the file you just downloaded, as shown in Figure 5.

Locating .TCX file in downloads directory on Windows platform

Then, change the suffix from .TCX to .XML, as shown in Figure 6. Accept the change when prompted.

Renaming .TCX file to .XML.
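On any platform with Python available, the same rename can be scripted instead of done by hand. A small sketch; the example filename is hypothetical, so substitute whatever Garmin Connect actually downloaded:

```python
from pathlib import Path

def tcx_to_xml(path: Path) -> Path:
    """Rename a .tcx export to .xml so Excel's XML importer will accept it."""
    target = path.with_suffix(".xml")
    path.rename(target)
    return target

# Example (hypothetical filename):
# tcx_to_xml(Path.home() / "Downloads" / "activity.tcx")
```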

Now, open Microsoft Excel. Select the Data tab, as shown in Figure 7.

Microsoft Excel: selecting Data tab in preparation for .XML file import.

On the left-hand side, select Get External Data, choose “From Other Sources”, and scroll down to “From XML Data Import”, as shown in Figure 8.

Importing the .XML file into Microsoft Excel using the import XML file option.

A dialog box will open. Navigate to your newly-created XML file. Select it, and click the series of “OK” buttons in the dialogs that come up, including the one placing the location of the start in cell $A$1. Once completed, the contents of the file will be imported into your spreadsheet. Heart rate data will be contained in column O, as shown in Figure 9. Distance & speed are contained in columns M & N, respectively. GPS latitude & longitude are contained in columns P & Q, respectively. Average speed is contained in column R.
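As an alternative to the Excel import, the same trackpoint fields can be pulled straight from the renamed file with Python's standard-library XML parser. The element names below follow the Garmin TCX v2 schema (Trackpoint, Time, DistanceMeters, HeartRateBpm/Value); treat them as an assumption and check them against your own file:

```python
import xml.etree.ElementTree as ET

TCX_NS = {"tcx": "http://www.garmin.com/xmlschemas/TrainingCenterDatabase/v2"}

def read_trackpoints(xml_text: str):
    """Return (time, distance_m, heart_rate_bpm) for each Trackpoint in a TCX document."""
    root = ET.fromstring(xml_text)
    rows = []
    for tp in root.iter("{%s}Trackpoint" % TCX_NS["tcx"]):
        time = tp.findtext("tcx:Time", namespaces=TCX_NS)
        dist = tp.findtext("tcx:DistanceMeters", namespaces=TCX_NS)
        hr = tp.findtext("tcx:HeartRateBpm/tcx:Value", namespaces=TCX_NS)
        rows.append((time,
                     float(dist) if dist is not None else None,
                     int(hr) if hr is not None else None))
    return rows
```

Each returned tuple corresponds to one row of the Excel import described above.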

Viewing available columns of data within Microsoft Excel.

In my next post on the subject I will describe how to manipulate these data for further analysis.

Garmin Vivoactive HR for Rowing & Sculling

Vivoactive HR

Sculling and Rowing

I am a rower and sculler. I first cut my teeth in the sport over 30 years ago while at college, rowing on the Charles River. For the longest time I had been looking for a device that could track my heart rate and stroke rate and also support GPS mapping of my workout while on the water. There are professional devices that track stroke rate and the like, such as the Speed Coach GPS, Stroke Coach and Coxmate GPS. These are all excellent pieces of equipment, by the way. But I am not in varsity rowing any more, and I was looking for a piece of equipment that could support my rowing “habit” both for indoor and outdoor rowing (aside: I also possess a Concept 2 ergometer, which I love) while also serving the utilitarian purpose of being a good watch that can track heart rate full time.

When I row, however, I am really interested in being able to map the analytics to the motion. The Vivoactive HR enables me to do this as well as to post-process the data. I am into data. As a Chief Analytics Officer in the healthcare field for a medical device and real-time patient surveillance company, it is important to me to be able to access and understand the information collected during an activity. The connectivity and access to data provided by the Vivoactive HR are phenomenal.

Data view from Garmin Connect web site.





The figure above details an example analytics screen, which shows the map of the workout, heart rate, stroke rate, and distance traveled at each measurement point, and allows tracking the entire workout with a dynamic, interactive cross-hair on the web screen. The unit supports many other types of workouts, including running, biking, pool swimming, golf, walking, indoor rowing on an ergometer, stand-up paddling (SUP), XC skiing, indoor walking, indoor biking, and indoor running, and it also tracks sleep. The unit can be submerged in water, and the battery life is amazing: I normally live with the unit on my wrist, and after 3 days of continuous use the battery is down to perhaps 80%. I take it off for an hour or so to charge, and it is good to go. I highly recommend this unit for the avid professional or veteran rower (like myself).

Update June 29th, 2017: Comparison among NK, Coxmate, Minimax

Robin Caroe of RowPerfect kindly left a comment on this post last evening and pointed me to an updated article comparing the NK, Coxmate GPS and Catapult Minimax, which contains quite valuable performance data on these products. I have provided the hyperlink to the article above. Technological differences in sampling rate (e.g., 5 Hz for the NK versus 10 Hz for the Coxmate) are important for accuracy. I must say that I was very close to purchasing the Coxmate GPS before investigating the Garmin. Upon reading the brochure for the Minimax S4, I am intrigued: the Minimax offers a GPS update rate that provides real precision in terms of location. The Rowperfect article identifies the key measures of performance as (1) heart rate and heart rate variability; (2) force and length of stroke; and (3) GPS update rate, all important for the elite athlete. In the case of the Minimax, GPS updates on the order of 100 times per second (every 10 milliseconds) can reveal boat pitch, roll and yaw. Highly impressive. I would agree that this level of accuracy and precision is important for the competitive athlete. Yet in my case (a non-competitive, casual athlete), I still love my Garmin. I am able to see and track my position very accurately and monitor my stroke and heart rate; as for heart rate variability, I can write an algorithm in R or Matlab to compute that measure fairly directly.
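As an example of what such an algorithm might look like, here is a minimal Python sketch of RMSSD, one common time-domain heart rate variability measure, computed from a list of R-R intervals. The sample intervals are hypothetical:

```python
import math

def rmssd(rr_ms):
    """Root mean square of successive differences between R-R intervals (milliseconds),
    a standard time-domain heart rate variability measure."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical R-R intervals in milliseconds
value = rmssd([812, 800, 820, 795, 805])
```

The same few lines port directly to R or Matlab; the only real work is extracting beat-to-beat intervals from the recorded heart rate data.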

Arterial Blood Pressure Signal Tracking

Arterial blood pressure signal (from MIMIC II Database) with measurements and tracking signal overlaid.

Filtering of Arterial Blood Pressure Signal Artifact using the Extended Kalman Filter

In an earlier post, I had discussed some mathematical techniques for mitigating alarm fatigue.

Expanding on the mathematical techniques employed, another reason for filtering of data includes the smoothing of artifact or spikes that are due to signal errors or other issues associated with signal acquisition.

Figure 1 depicts several seconds of raw arterial blood pressure (ABP) data obtained from a patient within the MIMIC II physiologic waveform database. [1,2]

This figure shows the raw signal with a tracking signal, based on the extended Kalman filter (EKF), overlaid. In this case, the measurement error and the process noise are both very small (signal noise 0.1 mmHg, process noise 0.5 mmHg). With these settings, the filter tracks the actual signal very closely, making it appear as if there is no difference between the measured signal and the track.
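For readers who want to experiment, a scalar Kalman filter, the linear special case of the EKF, can be sketched in a few lines. The state model here is a simple random walk, and the variances mirror the 0.1 mmHg measurement noise and 0.5 mmHg process noise quoted above; this is an illustration under those assumptions, not the filter from the linked analysis:

```python
def kalman_track(zs, r=0.1**2, q=0.5**2):
    """Scalar Kalman filter with a random-walk state model.
    zs: measurements (e.g. mmHg); r: measurement variance; q: process variance."""
    x, p = zs[0], 1.0          # initialize state at the first measurement
    track = [x]
    for z in zs[1:]:
        p = p + q              # predict: state unchanged, uncertainty grows
        k = p / (p + r)        # Kalman gain
        x = x + k * (z - x)    # update toward the new measurement
        p = (1 - k) * p
        track.append(x)
    return track
```

With process noise large relative to measurement noise, as here, the gain stays near one and the track hugs the measurements, reproducing the near-overlap seen in the figure; shrinking q would smooth the track and suppress artifact spikes instead.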

The full analysis is available at the following link in PDF form:

ABP Tracking via EKF

[1] M. Saeed, M. Villarroel, A.T. Reisner, G. Clifford, L. Lehman, G.B. Moody, T. Heldt, T.H. Kyaw, B.E. Moody, R.G. Mark. Multiparameter intelligent monitoring in intensive care II (MIMIC-II): A public-access ICU database. Critical Care Medicine 39(5):952-960 (2011 May); doi: 10.1097/CCM.0b013e31820a92c6.

[2] Goldberger AL, Amaral LAN, Glass L, Hausdorff JM, Ivanov PCh, Mark RG, Mietus JE, Moody GB, Peng C-K, Stanley HE. PhysioBank, PhysioToolkit, and PhysioNet: Components of a New Research Resource for Complex Physiologic Signals. Circulation 101(23):e215-e220 [Circulation Electronic Pages]; 2000 (June 13).


Alarm Fatigue? What a Nuisance!

Alarm Fatigue

“Hospital staff are exposed to an average of 350 alarms per bed per day, based on a sample from an intensive care unit at the Johns Hopkins Hospital in Baltimore.”[1]

From the same survey, almost 9 in 10 hospitals indicated they would increase their use of patient monitoring, particularly capnography and pulse oximetry, if false alarms could be reduced. [2]

“Of those hospitals surveyed that monitor some or all patients with pulse oximetry or Capnography, more than 65 percent have experienced positive results in terms of either a reduction in overall adverse events or in reduction of costs.”[3]

Attenuating Alarm Signals

The problem with attenuating alarm data is achieving the balance between communicating the essential, patient-safety specific information that will provide proper notification to clinical staff while minimizing the excess, spurious and non-emergent events that are not indicative of a threat to patient safety. In the absence of contextual information, the option is usually to err on the side of excess because the risk of missing an emergent alarm or notification carries with it the potential for high cost (e.g.: patient harm or death).


The purpose of this study is to look at some of the mathematical techniques for mitigating alarm fatigue: the techniques and options available for evaluating real-time data. The objective is to suggest a dialog for further research and investigation into the use of such techniques as appropriate. Clearly, patient safety, regulatory, staff fatigue and other factors must be taken into account in aligning on a best approach or practice (if one can even be identified). These aspects of alarm fatigue are intentionally omitted from the discussion at this point (to be taken up at another time) so that a pure study of the physics of the parameter data and the techniques for analyzing them can be explored.


[1] Ilene MacDonald, “Hospitals rank alarm fatigue as top patient safety concern”, Fierce Healthcare. January 22, 2014.

[2] Michael Wong, Anuj Mabuyi, Beverly Gonzalez, “First National Survey of Patient-Controlled Analgesia Practices,” A Promise to Amanda Foundation and the Physician-Patient Alliance for Health & Safety, March-April 2013.

[3] Ibid.


Howmar Boats Finest 15′ Sloop: Designers Choice (“DC”)

Designers Choice: A Small Sloop

Originally designed by the naval architects Sparkman & Stephens, and built in Edison, New Jersey, from the late ’70s through the mid-’80s, the Designers Choice (“DC”) is a fiberglass-hulled sailboat with a length overall (LOA) of 14′ 10.5″, a length at the waterline of 12′ 9″, and a beam of 6′ 1″. She weighs in at 315 lbs.

The draft of the DC varies from 5″ (centerboard up) to 3′ 0″ (centerboard down). Aft freeboard is 1′ 2″.

The mast is tall and the sail area of the mainsail is 82 sq ft; that of the jib is 28 sq ft. Crew capacity is 900 lbs. In my experience, 3 adults and 3 children can be comfortable on board.

DC Standard Features

She features:

  • Black anodized aluminum spars.
  • Grooved mast with loose footed mainsail fitted with luff slugs.
  • Stainless steel chain plates, headstay & shrouds.
  • Deluxe heavy-duty fittings.
  • Four-part mainsheet with quick release cam cleat on centerboard trunk.
  • All hardware mounted with through-bolts or drilled and tapped into aluminum backing plates.
  • Controllable outhaul, boom vang and Cunningham.
  • Kick-up rudder with foam-filled floating black anodized aluminum tiller and universal hiking stick.
  • 1.25″ vinyl rub-rail.
  • Non-leaking centerboard pin above the waterline and cockpit sole for easy access.
  • Hand laid-up heavy-duty mat and roving hull construction.
  • White gelcoat finish.
  • Molded-in skid-resistant side seats and cockpit sole.
  • Large covered stowage locker under afterdeck.
  • Durable dacron mainsail and jib.
  • Jib window and jiffy reefing are standard.


I have owned my DC since 2003. I have had 3 sailboats in my life, and this is a decent little craft. She was built in 1979, making her 38 years old. Several photographs are included below. A copy of the original Howmar Designers Choice brochure is provided for download, as well.

1979 Designers Choice with original sails
1979 Designers Choice on the beach
Tanaka 3 HP outboard kicker
1979 Designers Choice on Trailer. Sails are original. She is launched from the yard onto the Chesapeake Bay. She is kept covered when not in use.


Opioid-induced respiratory depression and monitoring patients diagnosed with obstructive sleep apnea

HIMSS Future Care posting:

Managing patients on the general care floor (GCF) who are either at risk for or diagnosed with obstructive sleep apnea (OSA), particularly those who meet the STOP-BANG criteria for OSA, can be quite challenging. The ECRI Institute, a federally-certified patient safety and research organization, identified “Undetected Opioid-Induced Respiratory Depression” as Number 4 in its 2017 list of Top 10 Health Technology Hazards [1]. The use of opioids for treatment of acute postoperative pain is rather commonplace, and patients at risk for OSA, if left unattended, can experience anoxic brain injury or death.

Medical Device Plug and Play

The discussion surrounding plug-and-play medical devices focused on the ability to have true interoperability from a semantic and physical perspective. This post was originally written in 2009, surrounding the need for better medical device plug-and-play interoperability and integration, in much the same way a USB-enabled accessory purchased for a standard computer is recognized by its drivers once plugged into the computer.

Nanotechnology: A Key to Future Diagnosis and Treatment?

Will the future of medicine rely on nanotechnology for treatment of disease?

The average size of the avian influenza virus is on the order of 100 nanometers, or 0.1 microns, which is precisely the scale at which nanotechnology operates. That a virus so small can wreak such havoc on the human body is a testament to the complex mechanisms associated with these infections. The ability to ward off such infections is equally a testament to the awesome nature of the human immune system. By comparison, the width of a typical human hair is on the order of 100,000 nanometers (estimates put the range at 50,000-150,000, depending on the specific type of hair).

Now, consider the field of nanotechnology, which focuses on the manufacture and fielding of mechanical and electronic devices of microscopic size, typically on the order of 100 nanometers or smaller. The National Cancer Institute (NCI) provides a fairly detailed overview of the use of nanotechnology in cancer treatment, and the NCI Alliance for Nanotechnology in Cancer is an initiative that provides a focal point for public and private investigation into the application of nanotechnology to the treatment of cancer. Researchers and companies have been investigating the manufacture of devices of this order of magnitude and smaller for application in the treatment of disease.

A major focus for nanotechnology in healthcare is, not surprisingly, the treatment of cancer. Specific methods and modes of delivery vary. Examples include outfitting little “robots” with markers that will burrow into and attach themselves to cancerous cells for the purpose of enabling treatment and destruction of malignant cells. A major benefit of this approach versus traditional radiation and chemotherapy is that the malignancies can be targeted directly without attacking or otherwise molesting healthy cells. This is a major advancement, since many current therapies attack cells indiscriminately, killing healthy as well as malignant cell material. When battling this terrible disease, the last thing needed is to destroy the healthy cells upon which the individual depends for sustenance and survival. Thus, nanotechnology provides a mechanism for delivering targeted, customized, tailored therapy.

What about nanotechnology for diagnosis?

While we are on the cutting edge of the application of these technologies, the vision is real, and it is extremely promising. Treatment is only one aspect of nanotechnology use. Diagnosis is another area, in which nanoparticles can be used to assist in imaging of potential malignancies. While almost a cliché, the aging of the baby-boomer population will drive a number of these new technologies, applications, and initiatives. It is almost a tautology that early diagnosis of disease translates into a higher likelihood of survival. Technologies that support early diagnosis are, therefore, of great value and will enable better, more efficient, and more accurate treatment of disease going forward.

As a member of this generation (albeit at the tail end), I am very encouraged by and supportive of this research. I recall, some 17 years ago when my mother passed away from breast cancer, that the use of exotic technologies such as nanotechnology was barely an inkling. Indeed, the three oft-used mechanisms for treating cancer have remained surgery, irradiation, and poisoning (chemotherapy). It has only been within the past 10 years or so that alternative therapies have been devised and discovered that are not simply variants of these three. Research into the targeted treatment of cancer by destroying the genetic material within malignant cells, so that they cannot reproduce or receive nourishment, is an astonishing advancement and offers great future promise: a testament to human ingenuity, talent, innovation, and creativity. As in vitro and in vivo medicine evolve, such future-looking technologies will be essential for early diagnosis and intervention.

Can medical device integration facilitate diagnosis and treatment?

One cannot control what one cannot measure. In vivo measurements are necessary to determine whether any treatment paradigm is working: comparison pre- and post-treatment to determine the correlation and association of a treatment modality to establish intended effect. In later posts, I discuss the use of data taken from medical devices at the point of care to facilitate clinical decision making. These data, whether obtained from the patient externally or internally form the basis for identifying the state of the patient and trends towards improvement or decompensation.

Healthcare Information Technology and The Future of Medicine

Medicine & Healthcare Information Technology In the Future

When it comes to medicine, healthcare information technology has become an integral part of care. Estimates by the U.S. Census Bureau expect the population of Americans aged 65 and older to increase by more than a factor of two between 2010 and 2050 [1]. This is not good news for the state of medicine in the U.S., particularly when it comes to providing quality care for these patients.

Estimates of healthcare expenditure increases between 2007 and 2017 show an increase to nearly 20% of GDP in this period [2]. These estimates were made prior to the recent financial crisis that began during the Fall of 2008. Further compounding this increasing demand and the concomitant increase in costs is the availability of allied healthcare professionals. Some studies [3] identify the likely decrease in the number of physicians entering any number of key specialty areas, including cardiology (20% decrease by 2020), geriatrics (35% of current demand met today), rheumatology (38 day average wait for a new appointment), and primary care (on the verge of collapse). Those of us who are baby boomers are on the leading edge of this demand and, in order to mitigate and minimize the cost impacts on our children, it is our challenge and responsibility to innovate and meet these challenges without passing along unnecessary burdens to our children and grandchildren.

Aging & Future Medicine

For most of us, aging means more frequent and severe afflictions, from chronic ailments to more acute care needs and longer hospital stays. Medicine and medical care become an extremely important part of life as we age. Taking care of our health by improving diet, exercising, and maintaining an otherwise active lifestyle is essential to ensure a high-quality life. Even with increased vigilance, chronic ailments can affect us later in life, brought on both by our genetics and by the lifestyles we’ve led in our youths. Ailments such as dementia, coronary artery disease, Alzheimer’s, myocardial infarction, congestive heart failure, macular degeneration, osteoporosis, hypertension, chronic obstructive pulmonary disease, diabetes, and others take their toll.

Managing chronic diseases is costly from a logistical perspective in terms of time and money. Even more to the point, effective and quality oversight of patients with chronic ailments requires regular review, screening, and monitoring, further complicated by the need to serve patients who lack the means or are physically incapable of leaving their homes for extended periods. Telehealth and remote monitoring are means by which a case manager—an individual assigned to oversee the care of chronically ill patients within a home-health setting—can review patient information on a regular basis (for example, daily) and support both the patient and the primary care provider.

Furthermore, intensive care units and emergency departments are becoming more crowded; healthcare information technology is helping to ameliorate these challenges through improved workflow and more reliable & dense information. Individuals with insurance are going to EDs because they cannot get prompt scheduling with their gatekeepers (family practitioners). The number of individuals with chronic ailments (stroke, CHF, diabetes, COPD, etc.) is on the rise, in part because people are living longer. At the same time, the Medicare and Social Security systems will not be able to sustain the growth in the population over age 65, which means that working individuals will increasingly bear the financial burden for us “boomers.” As a result of increased longevity and these fiscal challenges, the retirement age will increase.

Where does healthcare & medicine go? To the home!

So, what do we do? Well, several things. First, technology in the form of remote data collection and reporting devices and software will become more prevalent: glucometers, BP cuffs, spirometers and associated software will be more readily available for direct communication with personalized electronic health records. If the purpose of a typical visit is to take blood pressure and diabetic assessments, this can be handled largely by collecting data at the point of care (the home) and transmitting them to the physician’s office for assessment. The same applies to nursing and assisted-living facilities.

Next, the technical infrastructure required to transmit and store these data will be required. Paying for this infrastructure could come from a number of sources. One possibility: almost everyone nowadays has access to cable television, and cable companies could offer devices that integrate with existing modems to collect and transmit data to the FP, together with complementary emails to next of kin (e.g., “Your mother’s BP as of 8:10 this morning was 145/89”). Other technologies can be used to evaluate and monitor chronic ailments such as macular degeneration, further reducing costs by providing video cameras at the point of care whereby ophthalmologists can review retinal changes without requiring an elderly individual to be transported, at expense and time, to a hospital or office. In addition, remote consults via VoIP and video can be supported over the same network, empowering the remote provider to interact with the patient. All of these technologies are in use in remote pockets around the world today, but they will become more prevalent. These implementations will reduce costs and provide for more personalized care in comfortable settings (homes). Of course, nothing takes the place of the tactile, hands-on visit; but for routine visits the above will be invaluable.

In terms of software technologies, personalized medicine will become the norm (eventually). Telehealth will be key. But support for automated workflow in the acute care environment will also need to be augmented. This means fully integrating all data into the enterprise HIS.

The U.S. Department of Health and Human Services through its Office of the National Coordinator for Health Information Technology, published operational scenarios focused on providing key information to assist in harmonizing standards on the implementation, certification, and policy implications for robust remote patient monitoring [4]. Included in this assessment are requirements on interacting with personalized health records and enterprise health information systems. The approaches to advancing remote monitoring include both seamless communication from medical devices at the point of care (i.e., in a patient’s home setting) and with a case manager and primary care provider both through electronic transfer, storage, and display of health information and remote video and audio interaction with patients in the same home health setting.

Healthcare Information Technology is not the silver bullet, but those described above are key enablers for remote health monitoring. Of course, the use of technology carries with it the implication that sufficient underlying infrastructure exists. This is not always the case in remote areas of the country. Satellite, cable, and fiber optic technologies are fairly extensive within the continental United States, but pockets and regions exist in which this is not the case. Therefore, a combined effort to extend the communications infrastructure must continue together with a unified effort to standardize and train and “in-service” individual care providers on these technologies must occur. One of the best mechanisms for enabling this is through the local hospitals and their satellite clinics.

So, how long do we have? Well, the sooner the better. Successful telehealth and remote monitoring programs exist throughout the United States and worldwide today. We should ensure that our elected representatives direct healthcare expenditures towards several specific areas to promote growth and alignment with the objectives of remote monitoring. These include continuing alignment on electronic personalized health records, expansion of our underlying communications infrastructure, and promotion of common standards of communication among these records so that, regardless of location, a patient can communicate his or her information to any physician and allied health professional within the country. In summary: healthcare information technology employing common storage, homogeneous communication, and standardized formats is necessary to provide the support needed for the future of medicine.

[1] Source: Population Division, U.S. Census Bureau, August 14th, 2008; Table 12: “Projections of the population by Age and Sex for the United States: 2010 to 2050 (NP2008-T12)”

[2] Cinda Becker, “Slow: Budget Danger Ahead,” Modern Healthcare, March 3rd 2008.