Thursday, August 7, 2008

Weeks 6 & 7: Great experience

I've decided to post the uncut manuscript of my immersion presentation, as I will not be able to present it in Ithaca.

Well, here it is:

[This photo was taken at the Fourth of July fireworks at South Street Seaport, at the southern tip of Manhattan.]

My Immersion assignment was with my mentor, Dr. Jonathan Weinsaft, in the Cardiology Department. After consulting with Jonathan, we decided that the primary focus of my immersion experience would be the noninvasive aspects of the cardiovascular system.

Here is what I learned from the first few weeks of my immersion program:

When patients visit the Cardiology department complaining of chest pain, one of the first scans done is an echocardiogram. This is a standard procedure that uses ultrasound, in which the technologist acquires a set of 2D ultrasonic images, ranging from the tricuspid valve view to the apical 4-chamber and 2-chamber views. I was told that 70 or more echos are done every day at NYP.

Of particular note was the transesophageal echo (TEE), which produced much nicer ultrasound images than the standard echo. The photo above shows the transesophageal probe, which is about a meter long and has to be swallowed by the patient, who stays sedated for a couple of hours.

I also got to see several cath labs, in which I observed the insertion of catheters for x-ray angiography. There are about 20 of these procedures each day.

Nuclear stress tests are associated with SPECT imaging and are used to examine myocardial perfusion. Most of what I saw were the three different kinds of stress tests: treadmill, adenosine, and dobutamine.

I followed Jonathan in the clinical ICUs and learned a lot about the importance of spending time examining each patient's case. As this is the intensive care unit, the patients are among those with the most critical cardiac conditions. While I got to ask the fellows, residents, and medical students many questions about the various instruments, terminologies, and tasks in the ICU, it was also tough to see some patients pass away during the week that I followed Jonathan on rounds.

Here is a slide that I pulled from Google; it shows a CT angiogram, in which we can clearly see the full extent of the coronary artery in this specific case. In addition to watching Jonathan examine these on the workstation, I got to learn from one of the fellows how to read CTs. This was a very interesting experience, as CT reading seems to be more of an art than a systematic task that can be automated by a computer.

Finally, I got to do a bit of magnetic resonance imaging. This is a photo from a scanner on 70th Street. Shawn, a fellow Immersion student, is in the scanner and was my first human volunteer for MRI scanning. At one point during Shawn's brain scan, as I was getting used to the different scanning parameters on the computer control screen, I completely forgot to press the scan button for quite a while. Shawn was unknowingly in the scanner for over 20 minutes with nothing happening, and experienced a long and tiring scan due to the ineffective performance of a novice scan technician. (Sorry, Shawn.) However, we did get some cool images.

I'd now like to describe my project, which revolves around cinematic (CINE) imaging of the left ventricle.

First, the left ventricle is perhaps the most important of the four cardiac chambers, as it is the primary chamber that pumps blood to the body. From what I have seen, most coronary angiography, stress perfusion/myocardial performance studies, and diagnostic imaging focus on the examination of this chamber.

CINE refers to cinematic imaging: cine-CMR (steady-state free precession, SSFP) provides high-spatial-resolution imaging and is widely accepted as a diagnostic standard for the assessment of left ventricular systolic function and chamber volumes.

To make this an effective tool, the workstations are equipped with software called ReportCARD, whose manual tracing feature is widely applied for quantification of cine-CMR. This software is used to segment the left ventricular chamber and myocardium at systole and diastole. However, manual tracing has its limitations.

The major limitations of manual tracing are:
- It is time consuming.
- Its reproducibility is variable.
- It throws away data: because of time constraints, only end-systolic and end-diastolic volumes are quantified, so all other cardiac phases are eliminated.

I watched Jonathan perform time trials of these tracings for an upcoming paper, and he took about 5 to 7 minutes on average, and sometimes 10 minutes, for each case.

The big question we asked was: can we do better? And the answer is yes.

The LV METRIC segmenter is a program developed by Mr. Noel Codella of WCMC. It is an automated system that can quickly segment the CINE images saved as SA FIESTA on the workstations and acquire volumetric data in well under 5 minutes per case. Citing the performance results from Mr. Codella's paper, we know that this tool is robust at obtaining accurate volumetric data for the chamber. Our project will take advantage of the segmenter's ability to perform full volumetric assessment. This opens up new possibilities of examining not only LV chamber contraction (i.e. systole, ejection fraction) but also the patterns of LV chamber relaxation (i.e. diastole).
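To make the idea of full volumetric assessment a bit more concrete, here is a minimal MATLAB sketch of how the basic contraction indices fall out of a per-phase volume curve. The variable names and example numbers are mine, purely for illustration; this is not output from LV METRIC or ReportCARD.

    % Minimal sketch: contraction indices from a per-phase LV volume curve.
    % 'vol' holds one LV cavity volume (mL) per cardiac phase; values are made up.
    vol = [120 112 100 85 70 60 53 50 55 70 88 98 104 110 115 120];

    edv = max(vol);               % end-diastolic volume = largest cavity volume
    esv = min(vol);               % end-systolic volume  = smallest cavity volume
    sv  = edv - esv;              % stroke volume
    ef  = 100 * sv / edv;         % ejection fraction (%)

    fprintf('EDV = %d mL, ESV = %d mL, EF = %.1f%%\n', edv, esv, ef);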

Now let us talk about diastole. Is diastolic function important? Yes, for
- prognosis
- treatment
- etiology of heart failure

Q. How do we typically assess diastolic function?
A. MUGA, which stands for multigated acquisition, is a nuclear study that measures the rate of change of ventricular volume (dV/dt). We can also use echo and look at mitral inflow patterns. MRI has been used for the assessment of diastolic function as well, for example with tagging.


The problem with tagging is that it requires additional dedicated imaging (adding exam time and breath-holds, and making it impractical to analyze large datasets), and the computational analysis of changes in myocardial thickness from tagging is nontrivial, as it must thoroughly account for spatial and temporal geometry.

Let me explain a little more about left ventricular diastole. With a full volume curve, we can make a plot like the one above. We can then identify the diastolic region, extending to the peak of the filling curve. One parameter we are interested in is the volume change over time, i.e. the derivative. Now let's zoom in on the derivative of the diastolic region.


In the derivative of the diastolic region, we observe a curve analogous to the mitral inflow pattern obtained from echo. In a healthy case (above), we can observe the E-wave being larger than the A-wave.


In the following diseased case, we see some abnormality: the E-wave and A-wave profiles look clearly different. Note that our full-volume assessment is able to generate the same kinds of curves as the mitral inflow patterns.
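For those curious how such a filling curve could be computed, here is a hedged MATLAB sketch with entirely made-up numbers and a deliberately crude split between early and atrial filling; it is not the analysis code used in our study.

    % Sketch: diastolic filling-rate curve from a per-phase LV volume curve,
    % with a crude E-wave / A-wave readout. All values are made up.
    t   = (0:19) * 0.045;         % phase times (s), roughly 45 ms apart
    vol = [120 112 100 85 70 60 53 50 55 70 88 98 102 103 104 105 108 113 118 120];

    dVdt = gradient(vol, t);      % volume change over time (mL/s); positive = filling

    % Diastole runs from end-systole (minimum volume) to the end of the cycle.
    [esv, esIdx] = min(vol);
    fillRate = dVdt(esIdx:end);

    % Crude split: early filling (E) in the first half of diastole,
    % atrial contraction (A) in the second half.
    half  = floor(numel(fillRate) / 2);
    eWave = max(fillRate(1:half));
    aWave = max(fillRate(half+1:end));
    fprintf('E = %.0f mL/s, A = %.0f mL/s, E/A = %.2f\n', eWave, aWave, eWave / aWave);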

Here's an illustrative example of why our study is important. Let us consider the following cases:

Consider two cases. TPFR is the “time to peak filling rate”; it is measured from the end of systole to the time of peak filling, in other words, to the moment with the largest slope on the volume curve.

Notice that the TPFR is quite different between the two cases, yet the traditional ejection fraction method would identify both cases as healthy.
The same holds for the PFR, the peak filling rate, taken as the maximum value of the derivative curve: we notice a substantial difference between the two volumetric curves. These cases would likely be missed had we used the ejection fraction alone to assess the cardiac condition.
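As a rough illustration (again with made-up numbers and my own variable names, not the study's code), both indices can be read directly off the derivative of the volume curve:

    % Sketch: peak filling rate (PFR) and time to peak filling rate (TPFR)
    % from the same kind of per-phase LV volume curve. All values are made up.
    t   = (0:19) * 0.045;         % phase times (s)
    vol = [120 112 100 85 70 60 53 50 55 70 88 98 102 103 104 105 108 113 118 120];

    dVdt = gradient(vol, t);      % filling/ejection rate (mL/s)

    [esv, esIdx]  = min(vol);                 % end of systole = minimum volume
    [pfr, relIdx] = max(dVdt(esIdx:end));     % peak filling rate during diastole
    tpfr = t(esIdx + relIdx - 1) - t(esIdx);  % time from end-systole to peak filling

    fprintf('PFR = %.0f mL/s, TPFR = %.0f ms\n', pfr, 1000 * tpfr);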

In order to analyze these data, my project was to develop software that efficiently sorts the cases, filters out any private information, and allows easy analysis of all the cases to be examined for an upcoming study. In the MATLAB environment, I developed a graphical user interface called LV Analyzer.
Here is what the block panel of the LV Analyzer GUI looks like. It runs on MATLAB, reads in the raw data from the workstation, processes and sorts the data accordingly, displays the features needed for the study, and saves an output file that can be opened by a spreadsheet program such as Excel.
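To give a flavor of the export step, here is a simplified MATLAB sketch; the field names, file name, and values are purely illustrative and are not the actual LV-Analyzer code.

    % Illustrative sketch of the export step only (not the actual LV-Analyzer code):
    % identifying information is replaced by coded case IDs, and only the derived
    % measurements are written to a CSV file that Excel can open.
    cases(1) = struct('id', 'CASE001', 'edv', 142, 'esv', 58, 'pfr', 410, 'tpfr', 0.16);
    cases(2) = struct('id', 'CASE002', 'edv', 130, 'esv', 51, 'pfr', 365, 'tpfr', 0.21);

    fid = fopen('lv_analysis.csv', 'w');
    fprintf(fid, 'CaseID,EDV_mL,ESV_mL,EF_pct,PFR_mLs,TPFR_s\n');
    for k = 1:numel(cases)
        c  = cases(k);
        ef = 100 * (c.edv - c.esv) / c.edv;   % ejection fraction (%)
        fprintf(fid, '%s,%d,%d,%.1f,%d,%.2f\n', c.id, c.edv, c.esv, ef, c.pfr, c.tpfr);
    end
    fclose(fid);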

I will continue to develop the LV-Analyzer after the Summer Immersion program, and plan on using the software to analyze data for an upcoming study.

I'd like to finish my presentation by thanking the following people, who have made my immersion experience truly a great one.

