Monday, November 30, 2009

Content:

Dr. Liang's lecture on Image Processing was a basic overview of how we can enhance an image and obtain important information from it. I am also enrolled in the Digital Image Processing (DIP) course, where we study different techniques to process an image; filtering, noise reduction, edge detection, and segmentation are a few of them. The interesting thing about image processing, as mentioned by Dr. Kahol, is that it is an applied domain of mathematics. We can apply different mathematical functions and see the results in the image. When we consider an image as a matrix of pixel values, we can visualize different mathematical models like masks, filters, and transforms acting on the image. In the medical field, we can enhance CT, MRI, and/or X-ray images using various techniques that help a doctor diagnose disease(s). In fact, the way CT, MRI, and X-ray images are generated is itself an application of image processing.
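Since the lecture treats an image as a matrix of pixel values that masks and filters operate on, here is a tiny NumPy sketch I put together myself (not code from the lecture; the mask and image are just toy values) of sliding a 3x3 averaging mask over an image:

```python
# A minimal sketch (my own toy example) of treating an image as a matrix
# and applying a 3x3 averaging mask -- the basic idea behind smoothing filters.
import numpy as np

def apply_mask(image, mask):
    """Convolve a 2D image with a small mask, leaving the border pixels unchanged."""
    h, w = image.shape
    mh, mw = mask.shape
    pad_h, pad_w = mh // 2, mw // 2
    out = image.astype(float).copy()
    for i in range(pad_h, h - pad_h):
        for j in range(pad_w, w - pad_w):
            region = image[i - pad_h:i + pad_h + 1, j - pad_w:j + pad_w + 1]
            out[i, j] = np.sum(region * mask)   # weighted sum of the neighborhood
    return out

if __name__ == "__main__":
    image = np.random.randint(0, 256, size=(8, 8))   # toy grayscale image
    averaging_mask = np.ones((3, 3)) / 9.0           # simple smoothing mask
    print(apply_mask(image, averaging_mask))
```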


Posted by
Prabal

Sunday, November 29, 2009

Digital Image Processing

Content:

This week Dr. Liang gave us an overview of the basic knowledge of digital image processing. I am taking Jimmy's course, Intro to Digital Image Processing, as my elective. Digital image processing is widely used in research and in our daily life. The digital camera is a result of digital image processing: there is a chip in the camera that converts light intensity into voltage and stores the information in a matrix. Digital image processing deals with data in matrix form, so besides the devices for obtaining the image information, it is concerned with mathematics. It also needs programming knowledge, because it is the computer that does the mathematics for image processing. Usually digital processing is used to find the parts of an image we are interested in, such as an abnormal darkness in a CT image. The project we are doing for Jimmy's course is to detect contours in an image using an active contour model. It is interesting, although we work on it for a whole semester. The algorithm sounds amazing to me; the people who come up with these algorithms must be intelligent, with high IQs.
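For anyone curious what the active contour (snake) model looks like in code, here is a rough sketch assuming scikit-image is installed; the sample image, circle placement, and parameters are my own placeholders, not our actual class project:

```python
# Rough sketch of contour detection with an active contour (snake) model,
# assuming scikit-image is available; details differ from our semester project.
import numpy as np
from skimage import data
from skimage.filters import gaussian
from skimage.segmentation import active_contour

img = data.astronaut()[:, :, 0]          # any grayscale image works here

# Initialize the snake as a circle around the region of interest.
s = np.linspace(0, 2 * np.pi, 400)
rows = 100 + 100 * np.sin(s)
cols = 220 + 100 * np.cos(s)
init = np.array([rows, cols]).T          # (row, col) points; older versions expect (x, y)

# The snake iteratively deforms toward image edges, trading off smoothness
# (alpha, beta) against the image energy.
snake = active_contour(gaussian(img, 3), init, alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)                       # final contour coordinates
```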


Anyway, happy holiday to everybody~


Posted by
Xiaoxiao

Saturday, November 28, 2009

digital image processing

Content:  Image processing is, I think, really cool.  I also have great respect for the people in this field, because it is very hard: not only do you have to know a bit about computer programming, you also have to have a great knowledge of math and probability.  During the classes this week we talked about point processing, filters, and edge detection techniques.  Point processing is applying some kind of operation to each point in the image, while filters are algorithms applied over a neighborhood defined by a mask.  Edge detection techniques that we covered include Sobel's method and Otsu's method.
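Here is a small illustration I wrote (assuming scikit-image is available; the camera test image is just a stand-in) of the two techniques mentioned above, Sobel edge detection and Otsu thresholding:

```python
# A small sketch (my own example, not class code) of Sobel edges and Otsu thresholding.
from skimage import data, filters

img = data.camera()                       # sample grayscale image

edges = filters.sobel(img)                # gradient magnitude -> edge map
otsu_level = filters.threshold_otsu(img)  # threshold picked automatically from the histogram
binary = img > otsu_level                 # segment foreground from background

print("Otsu threshold:", otsu_level)
print("Edge map range:", edges.min(), edges.max())
```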


Posted by P. Ortiz

Image Processing

Content:
Image Processing is very interesting, and it brings to the surface the math performed by imaging programs that we take for granted. I find that the lectures that were given were also very helpful in that there were plenty of examples as well as equations that help to illustrate what the processing is doing. In addition, I find that image processing is very similar to signal processing with the filters, manipulations, etc.

Here's a link with basic information on images and how images work in 2D and 3D that I ran across during my research for my undergraduate project.

http://www.ncsu.edu/scivis/lessons/understandingimages/images1.html


Posted by Eric

Good Overview of Digital Image Processing

Content:
Being totally unfamiliar with imaging, I found that Dr. Liang's lectures gave me a very good idea of what exactly digital image processing deals with. The intimate interplay of imaging with numbers and the various numerical operations used to modulate images is really exciting to study. There is a lot of math underlying this field, and a profound mathematical understanding would really help in excelling and gaining a good grip on image processing. Dr. Liang gave a good basic introduction to handling images in terms of image manipulation, image reconstruction, and image segmentation. Significant research efforts in this field would surely create wonders in Biomedicine.

Posted by
Harsha Undapalli

Friday, November 27, 2009

Image Processing

This lecture was a continuation of past lectures in which we learned about the different types of resolution.  Here, we looked at some specific mathematical and statistical properties of image data.  In image processing, an image can be represented as a matrix of values.  Because of this, it is easy to pinpoint certain data values.  Various subsets of image processing were discussed, such as data interpolation, image restoration, and image segmentation.  The types of digital images include binary, grayscale, true color, and indexed color.  Also, basic statistical and manipulation methods were introduced that could help describe and change imaging properties.  Histogram equalization was one statistical method.  Histograms are a way of displaying the distribution of image data.  Also, various functions were described that could change the appearance of the image.
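As a toy illustration of the histogram equalization idea (my own NumPy sketch, not code from the lecture), the gray levels are remapped through the cumulative histogram so a low-contrast image ends up using more of the intensity range:

```python
# A minimal sketch of histogram equalization for an 8-bit grayscale image.
import numpy as np

def equalize_histogram(image):
    """Remap gray levels so the cumulative distribution becomes roughly uniform."""
    hist, _ = np.histogram(image.flatten(), bins=256, range=(0, 256))
    cdf = hist.cumsum()
    cdf_masked = np.ma.masked_equal(cdf, 0)            # ignore empty bins
    cdf_scaled = (cdf_masked - cdf_masked.min()) * 255 / (cdf_masked.max() - cdf_masked.min())
    lookup = np.ma.filled(cdf_scaled, 0).astype(np.uint8)
    return lookup[image]                                # apply the mapping per pixel

if __name__ == "__main__":
    low_contrast = np.random.randint(100, 140, size=(64, 64)).astype(np.uint8)
    equalized = equalize_histogram(low_contrast)
    print(low_contrast.min(), low_contrast.max(), "->", equalized.min(), equalized.max())
```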

Listed below is a very interesting, informative, yet easy to understand series of lectures on digital image processing.  Images were manipulated using MATLAB for class assignments.  It made me want to take pictures and do assignments! :)

http://eeweb.poly.edu/~onur/lectures/lectures.html

Posted by Annie

Imaging

Content: Hi Guys. Hope everyone had a nice Thanksgiving. I know that this is a new and confusing holiday for some, but I am sure that you have unique holidays that I would not understand. Jimmy really knows his stuff, but this is complex. For images, you acquire, preprocess, enhance, restore, segment, represent descriptively, and recognize and interpret. He has focused on image enhancement, restoration, and segmentation. Enhancement makes the image more suitable for applications and improves sharpness and contrast. The components here are point processing and neighborhood processing. Point processing utilizes arithmetic operations. Averaging and non-linear median filters can be used for the specific function needed. Geometric functions can also be used for enhancement, and rotation is one example. Image restoration involves the removal or reduction of degradation. Noise types include Gaussian, salt-and-pepper, speckle, and periodic, and there are specific ways to address each. The median filter is advantageous for removal of salt-and-pepper noise. Gaussian noise approaches utilize a composite image from a sequence of images. Image segmentation is focused on the separation of the image into component parts. Thresholding is used here. Edge detection is another aspect of the segmentation process. Specific filters including Sobel, Canny edge, and Roberts may be used for this. Edges may be sharp or ramped, requiring different approaches. I will continue to evaluate the mathematical functions that are involved here. So much of brain imaging depends on these approaches.
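As a quick sketch of why the median filter handles salt-and-pepper noise so well (my own example using SciPy, not from the lecture), the extreme outlier values never survive the median of a neighborhood:

```python
# Salt-and-pepper pixels are isolated extremes, so a 3x3 median wipes them out.
import numpy as np
from scipy.ndimage import median_filter

img = np.full((10, 10), 128, dtype=np.uint8)   # flat gray image
img[2, 3] = 255                                # "salt" pixel
img[7, 6] = 0                                  # "pepper" pixel

cleaned = median_filter(img, size=3)           # 3x3 neighborhood median
print(cleaned[2, 3], cleaned[7, 6])            # both back to 128
```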


Posted by Stuart

Image Processing

Content: The lecture was quite intense, going into the methods used in digital image processing. Surprisingly, it is simple matrix algebra that can do such amazing manipulations. Without the slides it's difficult to write a lot, but what I remember were point and neighborhood processing and the use of various filters to correct images. With my very limited interest in this field I can only contribute a tutorial site for all those interested. I think this is a good site that introduces the basics of the field.
http://www.library.cornell.edu/preservation/tutorial/toc.html
Well that was my 2 cents as I don't have and don't think I ever will have a very good understanding of this field.


Posted by

Sheetal Shetty

Digital Imaging Processing

Happy Thanksgiving!
Didn't Debbie say it all?  For those of us not enrolled in the Imaging Class, the information presented was challenging. I think Jimmy gave a good introduction, stressing that it was only introductory, but the concepts were difficult.  I have used Jasc and Photoshop, so I now have a better understanding that a mask is a "piece" of the entire image, and that we manipulate imagery using consecutive masks. I won't pretend to interpret more. I'm waiting for the slides!
Also, as with the review, Kanav has a way of mandating learning, so his summary of Monday's lecture was appreciated.
Lee

Week of 11/23/09

Content:  This week Dr. Liang gave us lectures on imaging.  Also, Dr. Kahol gave us a lecture that included image filtering.  Filtering is commonly applied using masks on an image.  One of the topics included in the imaging lectures was image interpolation.  Interpolation can be used for functions such as scaling or rotating an image.  Another topic covered was image restoration.  Image restoration can be performed by applying filters that adjust image intensity histograms or generally lighten or darken an image.  Image restoration can also be applied by using filters that sharpen or smooth an image.  Additionally, segmentation was a topic that was covered.  Segmentation can be performed using filters such as edge detection filters.
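As a small illustration of the interpolation idea (my own SciPy-based sketch, not code from the lectures), scaling or rotating an image resamples it onto a new grid, so the values between the original grid points have to be interpolated:

```python
# Scaling and rotation both rely on interpolation between existing pixel values.
import numpy as np
from scipy import ndimage

img = np.arange(64, dtype=float).reshape(8, 8)

scaled = ndimage.zoom(img, 2.0, order=1)                    # 2x upscaling, bilinear interpolation
rotated = ndimage.rotate(img, 30, order=3, reshape=False)   # 30-degree rotation, cubic interpolation

print(scaled.shape, rotated.shape)                          # (16, 16) (8, 8)
```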

Dr. Liang did some nice work organizing and summarizing the topics that he covered.  I also thought Dr. Kahol did a nice job explaining some filtering topics.  I liked getting to learn about the imaging topics that were covered this week.  The filters covered are so widely used that it is interesting to know more about how they work.  It was interesting to learn about the segmentation filters.  I can imagine how those filters are practical for a large number of purposes like object detection.

Here is a link to an open-source program that is like Photoshop.  If anyone doesn't have Photoshop and wants to try image filters, they could use the program at the link below.
http://www.gimp.org/
It seems like GIMP has a lot of functionality that is similar to Photoshop.

Posted by:  Nate

Image Processing

Content: The image processing lectures by Jimmy were very interesting to me because, first of all, they involve math (rather than boring slides with only text!) and, second of all, they show the math implementations in image form. As Dr. Kahol says, image processing is the best implementation of math, as the result of the math can be observed in the image. We learnt the implementation of different filters on images to make them sharper or blurrier. Image processing is a very promising field in biomedical informatics. In fact, being a signal processing guy, the first motivation I had for switching to BMI was to do research on image/video processing.


Posted by Gazi

Research Validation

Content: Dr. Petitti's lecture was about validating any research method. The outcomes of a research study can come in different shapes and terms; they can be measured in terms of a mean, a risk/odds ratio, or a hazard ratio. A research study is valid as long as it finds the truth. There are 2 types of validity - internal and external. Internal validity concerns the design of the study and the measurement analysis, while external validity concerns whether the results apply from person to person or place to place. The lecture also discussed different types of errors in research methods, like insufficient or not properly randomized samples.
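As a tiny worked example of those outcome measures (my own made-up numbers, not from the lecture), both the risk ratio and the odds ratio come straight from a 2x2 table of exposure versus disease:

```python
# Risk ratio and odds ratio from a hypothetical 2x2 table.
def risk_ratio(a, b, c, d):
    """a,b = diseased/healthy among exposed; c,d = diseased/healthy among unexposed."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    return risk_exposed / risk_unexposed

def odds_ratio(a, b, c, d):
    return (a / b) / (c / d)

# Hypothetical study: 30 of 100 exposed got the disease vs. 10 of 100 unexposed.
print(risk_ratio(30, 70, 10, 90))   # 3.0
print(odds_ratio(30, 70, 10, 90))   # ~3.86
```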


Posted by Gazi

Surgical Simulators

Content: The lecture on surgical simulators was the most interesting lecture so far, as it relates to the research work we do in the lab. We are developing an “active world,” which is a whole new animated parallel world where different patients have been created with different scenarios. Residents and doctors can hone their cognitive skills by going through these scenarios in the animated world. There are at-home simulators like the drilling simulator, FLS, or the virtual peg transfer game. Surgeons can practice offline, even from home, to improve their surgical skills. The peg transfer simulator or Wii games like Marble Mania are actually being used at the patient’s bedside for warming up before surgery. We are also developing a simulator that can measure the effect of fatigue at different times of a shift.


Posted by Gazi

Thursday, November 26, 2009

Imaging Manipulation by Dr. Jianming Liang

The image manipulation presented in both lectures this week was interesting but rather hard to follow, especially the functions and equations for the various types of filters and masks.  They all seemed to blend together.  Although I love math, trying to follow the math examples for each type of manipulation was near impossible.  I understand the masking, average, minimum, and maximum filters, etc., but fail to see when these would be used in clinical practice.  Obviously the filters and masks are needed to help clarify an image, but I don't understand how you determine which one to use and when.  I appreciated the simple approach of using a photograph familiar to everyone instead of using actual medical imaging pictures. We could clearly see the results of each type of manipulation, which was helpful, but I don't understand when you would use a 3x3 versus a 5x5 mask, etc.

In addition, the slides are not posted for review.  Will these be posted soon?


Posted by Debbie Carter

Sunday, November 22, 2009

Dr. Petitti's lecture on designing for validity was another interesting topic on research design methods. She first described the outcomes of a research design and then explained the measures that affect the outcome(s). The main focus of the lecture was on understanding different types of study design and the factors that affect the study. As an example, she mentioned the study on breast cancer self-examination (BSE), which was pretty interesting. The other examples she mentioned were 'cellphones as the cause of brain cancer' and 'drinking as the cause of coronary heart disease'. The first example can't be done unless the participants actually do BSE, and some kind of enforcement would be required. In all these cases randomization is required. But we should also consider the risk of technology changing in the future; that would definitely affect the study.

Prabal

Research Validity

Content:
Dr. Petitti gave a great lecture on research validity where she focused on things that are major threats to research validity. These threats are as follows:
  • bias in design
  • measurement error
  • type 2 statistical error
  • poor study conduct
These can all lead to studies that are not valid or credible. Also, it was mentioned that typically study sizes are very important yet we see studies being published with very low numbers that have questionable conclusions.

On a side note, a recent hack attack on climate change specialists' emails has shown that results may not be what they seemed when they were presented. As well, I believe one of the articles covering the story mentioned that there was some plotting by these groups to prevent the release of data/information as required by the Freedom of Information Act. I found these articles interesting since they were published just days after our lecture on research validity. Also, the summary above may not be accurate since the documents have not been verified as authentic, but here are the articles:

Posted by Eric

Saturday, November 21, 2009

Research Validity

Dr. Petitti brings some real world experiences into the classroom.  We had the opportunity to discuss the news interview on the latest mammogram recommendations coming from the task force where she is the vice chair.  It's very interesting to hear the different perspectives, how people hear things differently, and how the media tries to twist the truth or make you stumble.  Nonetheless, she did a good job in the news interview I saw.  Then, the next day, she shows up in our class to talk about research validity.  Of all people to question whether research is valid or not, I think Dr. Petitti has a lot of experience with evidence-based medicine and has certainly shared her knowledge on the validity of research.  This lecture was good, but I felt it needed a lot more explanation, especially when talking about the odds ratio and the risk ratio.  It looks very similar to the sensitivity and specificity tables but not quite the same.  I guess in order to better understand, we should take her biostatistics course.  The confusion for me was determining the sample size and coming up with the 4 factors to plug into a power analysis to recommend a valid sample size.  I'm not sure if the 4 factors are made up or if these are coming from some other type of calculation.  I'd like to see a little more in regards to the points she reviewed, as the information shared felt a little rushed and we were unable to practice with examples.
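After poking around, here is my own rough sketch of one standard two-sample formula for sample size; it is not from Dr. Petitti's slides, and her 4 factors may be framed differently, but the usual inputs are the significance level (alpha), the desired power, the expected effect size, and the variability:

```python
# Approximate sample size per group for comparing two means (my own sketch).
from scipy.stats import norm

def n_per_group(alpha, power, effect, sd):
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided critical value
    z_beta = norm.ppf(power)            # quantile for the desired power
    return 2 * ((z_alpha + z_beta) * sd / effect) ** 2

# Example: detect a difference of 5 units with SD 10, alpha 0.05, power 0.80.
print(round(n_per_group(0.05, 0.80, 5, 10)))   # about 63 subjects per group
```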


Posted by Debbie Carter

Study Design

Content:
Dr. Petitti's class on Research Methods was another very useful and insightful lecture. This time she discussed experimental studies and also introduced the measurements that are used for experimental validity, like the odds ratio, hazard ratio, and risk ratio. She made a very good point that the basic step toward designing a study is to understand the types of study designs and to know about the factors and biases that can influence the study. She talked about a study of the occurrence of breast cancer among people taught about BSE, how sample size is a very important factor in a study, and how a small sample size can lead to a wrong conclusion. In another study, of coronary heart disease with drinking or non-drinking, the experimental validity measurements like the risk ratio were considered, and it was shown how the risk ratio of one event can be the inverse of the risk ratio of the other event. In the study of the occurrence of cancer in people using cellphones, she talked about the factors that could change the final outcome of the results, like the time taken to develop cancer and the technological advancements during that time, which could make the whole study unimportant. I would like to thank Nate and Sheetal for providing links to the articles. They are really good.

Posted by
Ashutosh Singraur

AMIA 09

Content: I was very fortunate to also be able to attend AMIA in San Francisco.  There were many presentations and talks about a wide range of subjects in Biomedical Informatics.  I was most struck by a presentation that Dr. Shortliffe gave assessing the past, present, and future of BMI.  He basically gave the same presentation that he gave to us at the beginning of the semester in 501.  I thought it was incredible that we were able to hear this lecture before in a more private setting.

I tried to attend as many panel sessions as possible.  Topics in these panel sessions included global health, BMI in industry, entrepreneurship, and standards. One of the keynote speakers was Mark D. Smith.  He is the CEO and president of the California Healthcare Foundation.  He gave a really nice presentation about informatics in his company.  He re-emphasized that he was struck by the high level of today's technology, yet the healthcare industry has not taken full advantage of it.  An example he gave was the fact that when a nurse takes a patient's temperature with a digital thermometer, the nurse still has to write down the temperature either on paper or in the computer.  Instead, the thermometer should be connected to the computer and the reading should be recorded automatically.

Some panels were very interesting and some needed a bit more structure.  The panel about global health was amazing; they talked about how biomedical informatics is being implemented in poverty-stricken countries.  The panel about informatics roles in industry was lacking a bit, or I just had different expectations of it.  The panel was composed of people from industry and they just talked about their companies and what the companies were doing.  The description for this panel mentioned that it was going to be about how to get into industry.  This is the first year AMIA had a panel like this one; maybe in the upcoming years they will improve.  I also attended a panel on standards, yes, standards.  The panel included C. Chute from the Mayo Clinic.  I was told by various of my peers, especially Mithra, that he gave a lecture about standards in past years and made it very exciting. I was a bit disappointed in this panel too, because Chute only talked about the BRIDG infrastructure and not really about standards.  It was good exposure to BRIDG though, so I can't complain too much.  I also went to a panel on entrepreneurship in BMI; again, I think I had different expectations about this panel, though it was nice to get a few names of upcoming companies in BMI.

I also attended the student paper competition, which I thought was the best part of the whole conference; I really learned a lot from these presentations. I also attended most of the paper presentations from our department, and I would like to congratulate Mithra on a great presentation!

Dr. Shortliffe mentioned this article in one of his presentations; I thought it might be interesting to read.

http://www.time.com/time/health/article/0,8599,1883002,00.html


Posted by p. ortiz

Friday, November 20, 2009

Valid Research

Dr. Petitti gave us more insight into conducting meaningful research.  This lecture focused more on the areas that could prove problematic while conducting research and interpreting results.  One area that could pose problems is the initial choice of the right study design.  This is probably the most difficult and time-consuming part of the research process.  Once the right design is chosen, there are still many areas that could threaten the validity of the design, such as incorrect measurement of outcomes and lack of applicability to a wider variety of situations.  If the study is only "perfect" within its own design, it serves no value for the rest of the research community.  With these issues in mind, you are more likely to conduct valid and valuable research.

Posted by Annie

Design for Validity

Content:

In Dr. Petitti’s last lecture on study design, she introduced descriptive studies, observational studies, and experimental studies. This week Dr. Petitti talked about validity in study design. One example was about the occurrence of breast cancer with or without being taught BSE. It helped us refresh our knowledge of research design, and since Dr. Petitti mentioned the sample size of this research and how the researchers conducted it, I think it is also a good example for understanding validity. Internal validity is about the study itself, and external validity is about applying the findings of the study in the future. Another example, on coronary heart disease, was used to explain the risk ratio and odds ratio, which are measurements of the outcome. The RR and OR work for binary data, and from the perspective of time there is the hazard ratio. Dr. Petitti spent much time on type 2 statistical error. It is a common threat to the validity of a study, because we do not know how big a sample would be suitable for the study. Carefully choosing the subjects and performing a power analysis can help avoid type 2 error. What’s more, randomization is essential in an epidemiology study.
Posted by
Xiaoxiao

AMIA Conference & San Francisco

Content:
Amazing week and great city! Although I missed last week's two lectures, I absorbed new knowledge and was exposed to a variety of creative ideas in the BMI field at the 2009 AMIA conference. In this annual meeting, the main topics and discussions were around clinical decision support systems, from technology, policy, implementation, and social perspectives. Translational medical informatics was also a hot topic. How to merge the traditional EMR system with genotype and phenotype information was introduced from a technology perspective, and the security issues of how to store and communicate personal genomic information, as well as how to efficiently use this information to better understand and improve medical research and clinical practice, attracted a lot of concern and discussion. As a BMI student, I focused more on the trend of technology development in this field. But at this conference, I got the strong feeling that in BMI, technology is not the top priority, although it is still a key part. From a pure technology perspective, the technologies used and presented at this conference are not that fancy to a CS technology guru. But how to appropriately apply these existing technologies to medical practice to fulfill the requirements of physicians and patients is more challenging than pure technology. Finding the points in the medical field where a new idea can put these technologies to use is what a BMI student should do.

Posted by
Di Pan

AMIA conference

Content:
It was really lucky for me to have the chance to attend the AMIA conference in the first year of my graduate study. As I was entirely new to this area, I was confused during the last three months. I had no idea what the direction of my future development was, or what I could do for this area.

At this conference, I listened to a series of scientific sessions and communicated with people in this area. Finally I found out what an amazing area it is. Biomedical Informatics is an area requiring creativity and imagination, and it can produce real effects for real people in real time. I was so excited to be exposed to the brainstorming of researchers in biomedical informatics. Though I could not understand most of the technical details, a lot of ideas were inspired.


Posted by Jing

Dr. Petitti's lecture on Research Methods

Content
The lecture on "Research Methods" by Dr. Petitti left me with a very different experience. Her explanation using specific biomedical examples to elaborate the importance of statistical measures was really interesting. The lectures mainly focused on the estimation and interpretation of the risk ratio and odds ratio, given a medical condition. The concept of experimental validity was also discussed. It is really difficult to validate a study because technology keeps changing, tracking research subjects might become challenging, and various other factors come into play.

The example on the occurrence of coronary heart disease with drinking or non-drinking clearly explained that the risk ratio of occurrence with drinking is the inverse of the risk ratio of occurrence with non-drinking. Another example discussed was the implication of brain cancer with cellphone exposure, using odds ratio estimates. The number of sample members included in the study (the sample size) plays a key role in the efficacy and in estimating the statistical significance of a study.
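As a quick numeric check of that inverse relationship (hypothetical numbers of my own, not from the lecture), swapping which group is treated as the reference simply inverts the risk ratio:

```python
# Swapping the reference group inverts the risk ratio (toy numbers).
risk_drinkers = 8 / 100       # hypothetical CHD risk among drinkers
risk_nondrinkers = 16 / 100   # hypothetical CHD risk among non-drinkers

rr_drink_vs_nondrink = risk_drinkers / risk_nondrinkers   # 0.5
rr_nondrink_vs_drink = risk_nondrinkers / risk_drinkers   # 2.0
print(rr_drink_vs_nondrink, rr_nondrink_vs_drink, 1 / rr_drink_vs_nondrink)
```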

Posted by
Harsha Undapalli

Epidemiology Research Design

Content: Thanks, Lee, for giving such a concise overview of the recommendations of the task force. This entire week I have been hearing debates about the U.S. Preventive Services Task Force (USPSTF) recommendations that stirred up a hornet’s nest. Every debate I have been hearing makes me understand the political implications of the recommendations and the brave effort by the task force to state the findings genuinely. Obviously they do not recommend that women over 40 should not be getting mammograms, only that women make a more informed choice about this rather than being herded to the X-ray room once they hit 40. It is really interesting to hear the other side trying to demonize the task force by stating that it has sounded the death knell for mammograms as a tool for effective screening, which is definitely not the case. It is surprising to see how people can take statements out of context and stir up such a racket.
Dr. Petitti’s lecture was a very good overview of epidemiological methods. Understanding design concerns is the first step towards designing a study. I think that having a firm understanding of the various designs (case-control, cohort, ecological, randomized controlled trial) and the biases and confounding factors underlying each of these is the first step towards your dissertation/thesis. Dr. Petitti talked about Type I and Type II errors and the importance of avoiding Type II errors in research. I would like to add that Type I error or alpha (the significance level of the study) is also important to consider. We traditionally consider 0.05 the magical number for setting this error, and I learnt all through my Masters that this is a big mistake. Every study needs to consider its scope, the population being impacted, and the outcome measures before setting the alpha level at 0.05. Thus alpha = 0.05 is not universal, and researchers need to consider other aspects before setting their alpha. The same applies to Type II error or beta (the power of the study is 1 - beta), and there is nothing magical about 80% (0.8).
Another aspect of every research design is defining the measurement errors, biases and confounding factors beforehand. This is very critical to your study and there are textbook chapters on these. These biases obviously affect the internal and external validity of the study. I highly recommend going through a simple epi text to understand these as these concepts are not limited to epi research but encompass any kind of research study.
I found a very simple and easy online text and hope everyone likes it.
http://www.bmj.com/epidem/epid.html

Posted by Sheetal Shetty

BMI502Post

We have had another useful and excellent lecture from Dr. Petitti. This time the topic was validity and the use of risk ratios, odds ratios, and hazard ratios. Truth is the key, and validity must be both internal and external. Randomization is a feature of this, and external applicability relates to being able to generalize. Type II error avoidance is a part of this quest for validity. I will want to try some of the power analysis and use these principles even more intensively in studies that I design. This analysis by Dr. Petitti provides a template for further evaluating other studies in the literature.

Stuart

Week of 11/16/09

Content:  This week Dr. Petitti gave a presentation on methods used for biomedical research.  Some of the topics in the presentation were measurements used in experiments and experimental validity.  Two of those measurements that were covered were risk ratios and odds ratios.  A couple of the experimental validity topics that were covered were sample sizes and technological advancements.  Increases in sample sizes can increase an experiment's power.  Technological advancements can cause a technology that is being studied to become rarely used.

The example that Dr. Petitti gave about technological advancements and studying the possibility that cell phones cause cancer was interesting.  The time that it could take to develop a cancer is another interestingly complicated factor in that type of study.  I found a study that describes how studies on the possibility of cell phones producing cancer have been inconclusive.  The article recommends that "well-designed, preferably prospective, studies" be done.  However, as mentioned in class, technological advancements may make it difficult to accurately trace cell phone use over time.  The study is available here:
http://www.ncbi.nlm.nih.gov/pubmed/19282560?itool=EntrezSystem2.PEntrez.Pubmed.Pubmed_ResultsPanel.Pubmed_RVDocSum&ordinalpos=4

Posted by:

Nate

Thursday, November 19, 2009

Dr. Petitti/Research Design II

This was another good lecture by Dr. Petitti.  She skillfully provides examples when explaining research methods. Of particular note was her discussion of the United States Preventive Services Task Force and the new recommendations for breast cancer screening. Upon review, I wish we had been provided with even greater detail. It seems 2 studies were done, totaling 500,000 Chinese women. My question about the studies would be whether the study of a single population group would properly reflect and correlate with American women. Diets, in particular, are not the same, and couldn't this skew the study?
 FYI-
Summary of Recommendation
•The U.S. Preventive Services Task Force (USPSTF) recommends screening mammography, with or without clinical breast examination (CBE), every 1-2 years for women aged 40 and older.

Rating: B recommendation.
Rationale: The USPSTF found fair evidence that mammography screening every 12-33 months significantly reduces mortality from breast cancer. Evidence is strongest for women aged 50-69, the age group generally included in screening trials. For women aged 40-49, the evidence that screening mammography reduces mortality from breast cancer is weaker, and the absolute benefit of mammography is smaller, than it is for older women. Most, but not all, studies indicate a mortality benefit for women undergoing mammography at ages 40-49, but the delay in observed benefit in women younger than 50 makes it difficult to determine the incremental benefit of beginning screening at age 40 rather than at age 50.

The absolute benefit is smaller because the incidence of breast cancer is lower among women in their 40s than it is among older women. The USPSTF concluded that the evidence is also generalizable to women aged 70 and older (who face a higher absolute risk for breast cancer) if their life expectancy is not compromised by comorbid disease. The absolute probability of benefits of regular mammography increase along a continuum with age, whereas the likelihood of harms from screening (false-positive results and unnecessary anxiety, biopsies, and cost) diminish from ages 40-70. The balance of benefits and potential harms, therefore, grows more favorable as women age. The precise age at which the potential benefits of mammography justify the possible harms is a subjective choice. The USPSTF did not find sufficient evidence to specify the optimal screening interval for women aged 40-49 (go to Clinical Considerations).

•The USPSTF concludes that the evidence is insufficient to recommend for or against routine CBE alone to screen for breast cancer.

Rating: I recommendation.

Rationale: No screening trial has examined the benefits of CBE alone (without accompanying mammography) compared to no screening, and design characteristics limit the generalizability of studies that have examined CBE. The USPSTF could not determine the benefits of CBE alone or the incremental benefit of adding CBE to mammography. The USPSTF therefore could not determine whether potential benefits of routine CBE outweigh the potential harms.

•The USPSTF concludes that the evidence is insufficient to recommend for or against teaching or performing routine breast self-examination (BSE).

Rating: I recommendation.

Rationale: The USPSTF found poor evidence to determine whether BSE reduces breast cancer mortality. The USPSTF found fair evidence that BSE is associated with an increased risk for false-positive results and biopsies. Due to design limitations of published and ongoing studies of BSE, the USPSTF could not determine the balance of benefits and potential harms of BSE.
This information came directly from the following website, which provides much more detail, if you're interested.
http://www.ahrq.gov/clinic/3rduspstf/breastcancer/brcanrr.htm

Further, and on another note, she confirmed my suspicions that much medical research is done for the sake of research. For many years I have pondered the purpose of a study with 10-50 participants. Much medical literature is based on these small study sizes. However, now armed with Power Calculations to aid in our assessments of test validity, we can make informed decisions about existing studies, as well as plan for our own valid research.
Lee

Saturday, November 14, 2009

Virtual Reality in medicine

Content:
The lecture gave us a comprehensive view of virtual reality systems and the importance of various simulators for tailoring medical education. Virtual reality systems have tremendous power to revolutionize the field of medicine by improving the skill and efficiency of surgeons and greatly reducing medical errors. The use of haptics in anatomical examinations and surgery would greatly help surgeons understand the underlying anatomic structures, and thereby significantly develop their skill before they are involved in actual treatment or surgery. Medical education using various kinds of simulators could be greatly useful for budding surgeons to improve and assess their surgical performance, which could significantly improve treatment efficiency and reduce fatal medical errors.

This article on virtual reality and its application to neurology is really interesting.
http://emedicine.medscape.com/article/1136605-overview

Posted by
Harsha Undapalli

Simulations as Warm Ups

Content:
During this week we had the opportunity of having Mithra present to us the state of medical simulation as well as training through simulations. The concept of using everyday gaming systems to provide cheaper alternatives is great and will likely lower costs. As well, with warm-ups available in nearly every other field, this provides the medical field with its own warm-up. I feel that doing so will put patients at ease and make them more trusting of the person holding the knife.

One question I have to ask with regard to using gaming consoles for training is: is the PC considered part of this mix? There are lots and lots of games out there for the PC, both games installed locally and games on the internet (Flash). With the correct game, any skill can be developed.

On a side note, I recently saw this Cisco commercial about some of their products but they are referring to it as the human network. This linked clip shows how they envision TelePresence in Medicine. What does everyone think?

http://videolounge.cisco.com/video/the-doctor-is-in/


Watch this first (~43 mins) if you don't want your watching experience to be spoiled; otherwise, just skip to the paragraph below.
http://abc.go.com/watch/greys-anatomy/93515/239731/give-peace-a-chance



Back to simulations: Grey's Anatomy had an episode with an inoperable cancer along the spinal cord. Dr. Shepherd goes ahead with the surgery anyway since the patient insists that he do it. He spends, I think, 8 hours initially just staring at the spinal column of the patient because he didn't know where to start; one tiny mistake and the patient could die. He then spends 24 hours actually taking it out. Just imagine the benefits to the patient, the doctor, and the hospital if he had had simulations ahead of time. As well, this shows a moment in the OR where the surgeon just tells everyone to be quiet so he can think. (The noise is a bit extreme, but it does emphasize the importance of introducing variables such as noise into training programs and simulations.)



Posted by Eric

virtual reality

Content:

It is really a fancy idea to train surgeons in a virtual environment before they go to real operating rooms. Just like athletes in races, they have to start, accelerate, sprint, fight fatigue, and even control the way they breathe. Also, warming up can prepare them well for the competition. There is no doubt that just a simple warm-up process can make such a great improvement in a surgeon. In Mithra’s lecture, simulator training had a close connection with cognitive skills.


Recently I also read news about virtual health: a hospital built in Second Life, a 3-D virtual world that exists entirely on the Web (http://secondlife.com). It said that it is an amazing virtual world springing to life for health care professionals, patients, and even kids to explore and contemplate.


Posted by
Xiaoxiao

Friday, November 13, 2009

Mithra's Lecture

Content:
This week we had only one lecture, from Mithra, and it was indeed very insightful: the use of simulation techniques in surgical training. It's really commendable that people are recognizing the vast applications of simulation in medical practice, which is very valuable. Before medicine, I had only heard of this technique being used in aviation, where they train pilots in a virtual reality world. I also attended the seminar given by Dr. Kahol on his work on this topic and how they have constructed simulation instruments and technology using cheap and readily available components that actually match the precision and accuracy of what is available for thousands of dollars. This is very important because not all health care organisations can afford to provide these high-quality but costly techniques to train their surgeons, especially if I talk about India. There are very few trained surgeons for such a big population. If this technology were available cheaply, there would be a considerable decrease in the number of deaths caused by surgical errors. Being from India and having witnessed the fragile condition of the health care system, I believe this could really bring a remarkable improvement. The other thing that I liked in Mithra's lecture was the different studies that were conducted to examine various aspects, be it psychomotor and cognitive skills, fatigue, or environmental conditions like noise used to create a real-world situation. It also mentioned how surgical simulation training could make residents perform surgery more quickly and also reduce the number of intraoperative errors. I found a published experimental study on how virtual reality training can improve surgical results. It can be found in PubMed under the heading "Virtual Reality Training Improves Operating Room Performance: Results of a Randomized, Double-Blinded Study."


Posted by
Ashutosh

Week

Content:  Thank you, Mithra, for the lecture.  It was very interesting to see how just a simple warm-up can produce such great improvement in a surgeon.  It makes so much sense that surgeons should warm up before conducting surgery, yet they don't.  But I wonder: if a surgeon warms up, will he/she get fatigued earlier? Since it was also shown that fatigue plays a role in the performance of the surgeon, it would be interesting to find out.

Virtual reality is a powerful tool, and I think we are just seeing the beginning of this technology.  As for medical education, virtual reality is being used in many places other than surgery.  For example, it is used to train physicians' communication skills.  It is also being used to treat patients, especially veterans, with post-traumatic stress disorder. A news article on this can be seen at the following link: http://archives.chicagotribune.com/2009/feb/05/local/chi-virtual-reality_zone_nwfeb05

Posted by P. Ortiz

Simulation in Medical Training

It is a wonder that for some of the most intricate and life-saving work, there is no training that truly simulates the real life situation. This has occurred in the surgical field until now. Presently, there are some great inventions in the field that can truly have an impact on surgeons and medical students in the near future. The smart move is targeting the neurocognitive aspects of the surgeon because understanding decision making processes is the first step in evaluating mistakes and reasoning of methods. Once this is understood, steps can be taken to "correct" the thought process, providing a better service to the patient. Ultimately, I think the goal will be to understand the cognitive processes of the surgeons so well, that simulation can be tailored to every user for every situation, thereby making this training tool so useful for the entire population and maximizing the benefits for all types of patients.

(Thanks Mithra for a great lecture!)

Posted by Annie
Content:
Mithra’s lecture on the use of simulators for surgical training was insightful. Using a more “real” environment for simulator training makes sense, as real operating room environments are very different from a quiet, lonely simulator training room. The other important aspect stressed in the lecture was the feedback module, which needs to include psychomotor and cognitive skills scoring. Especially interesting was the effect of fatigue on psychomotor skills. Medicine has always fallen in the realm of “practice,” and experience plays a very important part for obvious reasons. I therefore found the concept of take-home simulators particularly interesting, as now we could have a freshly graduated surgeon performing surgeries with greater dexterity than expected. Also, the use of gaming technology to produce these cost-effective simulators can go a long way toward improving the skill sets of surgeons. I found the article by PJ Fager, ‘The use of haptics in medicine,’ very useful. You can find this article on PubMed. I recommend the article as it gives a very good overview of the subject and is simple to read.



Posted by Sheetal Shetty
Content: We are grateful for Mithra's presentation this week. As before, I will return to the past. When men were men and women were women, we would be on call often for 24 days/24 hours over a 30 day period. It was a badge of courage to be able to take good care of patients and sleep minimally. Then the pendulum swung in the other direction, and resident physicians were limited to 40-60 hours a week. We were shocked because we felt that physicians needed to be trained with this intensive sleepless experience. Perhaps the reality is somewhere in between. Fatigue in quantitative trials by Kahol and his group adversely affects cognitive and psychomotor skills in trauma residents and attendings, but less so in the latter. However, the fine motor skills become grosser with chunking in the trauma attendings. Many performance components decline with fatigue, including gesture proficiency, tool movement smoothness, and hand movement smoothness, in addition to cognitive errors. I would assume that there are better outcomes with less fatigue as a consequence of this. However, too much rest and not being prepared also contributes. This is why the studies that Kahol and his group have shown using warm-up to improve performance are logical and useful. It is therefore also not surprising that practice improves performance in the laboratory and at home with Wii systems. The problem is that reaching full "embodiment" requires simulation of the circumstances of surgery, with noise and stress. If we can add more features of this, then training will become even more useful. Up until now, we have not incorporated these approaches into resident training or established surgical training and performance, much less into the training of medical students. The old adage has been see one, do one, and teach one. This is how I learned to do a lumbar puncture on patients. I became pretty good at it, but what if I had had the approach and training provided by Kahol? As a further note, an integrated group of monitoring evaluations includes EEG and skill analysis systems that use Bayesian classifiers and Hidden Markov models. In Neurology, we consider the EEG to be a crude tool that may have some relevance in capturing vigilance in these particular circumstances. This work would profit from the incorporation of more precise indicators of physiological function as part of the studies. As such, it would be wonderful to use techniques like functional MRI or PET scans in these studies, not only for localization of function with training but also for change with training. These techniques are not necessarily simple or feasible for this, but it would be useful to know of other measures to use in addition to EEG.


Posted by  Stuart

Medical Education

Content:  I too want to thank Mithra for presenting the lecture to us; she did a fabulous job.  This area of research is fascinating, and it is hard to believe that it is relatively new.  Until recently, once a doctor passed his boards there was no further assessment of his skills or performance.  I think it is vital that we assess them and that we learn how to make them better, whether it be having them warm up or acknowledge when they are too overwhelmed to perform.  It is astonishing to think that a musician warms up, and yet a surgeon doesn't!!  I never thought about it until Mithra pointed it out.  It seems like an obvious way to prepare for surgery.
I also got to see Kanav's lab and play with some of the simulations there.  They are extremely interesting!  I especially liked the pointer that can "feel" objects.  I was looking at the end of the pointer on the screen next to a ball, and when I physically moved the pointer around I could feel the resistance as it "touched" the ball.

I also think that these simulation tools and tools such as the Banner simulation lab are vital in reducing errors that are a result of the learning curve.  Doctors need to practice and I would much prefer them to practice on a simulator than on me or anyone else.

Posted by Laura Wojtulewicz

Week of 11/09/09


Content:  This week's lecture was on Surgical Simulators. Like the other students, I appreciate that Mithra gave the presentation when Dr. Kahol was not available. In the lecture, a randomized double-blinded study of surgical simulators was presented. In that study, surgical simulations caused residents to perform procedures 30% faster and make six times fewer intraoperative errors during surgery. Some types of human functions that can be measured with surgical simulations are hand, eye, and mental activities. Skill assessments can be made based on those activities. Those assessments can be used to measure the performance of users during conditions like fatigue. Additionally, the assessments can be used to measure performance differences in users who have and have not warmed up using virtual reality (VR) simulations.

I liked how multitasking environments were included in the lecture. Including those environments in simulations can help to create a lot more realistic simulations. I can imagine how an environmental distraction such as background noise during a laparoscopic procedure could be a significant factor in the concentration of the surgeon. Also, I liked the graphs of performance that the presentation included because they were an effective way to show the results from studies with the simulators.

Performance of laparoscopic procedures with and without VR training was measured in a study at:
http://www.ncbi.nlm.nih.gov/pubmed/19838652?itool=EntrezSystem2.PEntrez.Pubmed.Pubmed_ResultsPanel.Pubmed_RVDocSum&ordinalpos=3
The study showed that surgeons who were novices had the largest measured benefit from VR training.

Posted by:

Nate

Simulation in medical education

Content:

Thanks, Mithra, for presenting well in Dr. Kahol's absence. This class was an amazing introduction to the simulation technology developed in the medical education field. I have to say the technology is very fancy to me, and it would be hard to imagine if I did not really have the chance to touch it. My first impression of Dr. Kahol's research in simulated surgical training began at the new BMI graduate orientation this fall semester. Fortunately, I got the chance to visit Dr. Kahol's lab and try the equipment that virtually mimics the different feelings when your hand touches objects with different textures. I was also surprised when I saw the Wii console in his lab. The first question that came to my mind was: is it for playing games during rest time when the students feel bored with research? Now, after this class, I finally understand that the virtual technologies used in the Wii have many similarities with simulation in medical education, and that modifications based on the existing Wii console can be applied to different training practices and simulations in medical practice. For example, some students in my previous Biomedical Engineering department have associated Nintendo's Wii with heart-pumping action, using the Wii system's motion-sensing remote to teach cardiopulmonary resuscitation. From the technological perspective, it is not a big challenge, but the meaning of this application in medical practice is really fantastic. I think there should be more applications of simulation technologies in medical training that have not been explored, for example a virtual simulation system for nurse venipuncture training. I do not know whether other people have tried it. If not, I think this may be developed into a class group project in the future in BMI.

Posted by  Di Pan
Lecture 19

I missed this class (and all of you, of course), but I'm pleased that I did attend Dr. Kahol's seminar on his work, earlier in the semester. I am in awe of the developments in this area, and have great appreciation for the computer scientists and bioengineers in this program.

Retrospectively, I think Resusci Anne takes us near the beginning of simulation innovation. She was the mannequin created to teach CPR to millions. Modeled after a woman who drowned in the Seine River in the 1880's, this early simulator was created in the 1960's.  She is commonly recognized by any trained medical person in this country, and perhaps world-wide. As I remember Anne and her contributions, and look at the leaps and bounds taken by such facilities as the Banner Sim Center, I am reminded that great contributions can be made with basic tools. Dr. Kahol proved that with his adaptation of the Wii controller to teach and enhance laparoscopic surgery.

Lee

Thursday, November 12, 2009

Simulations and Box Plots

First, I want to thank Mithra for presenting in Dr. Kahol's absence. The study and innovations of simulations in the periop (surgical) areas are very interesting and obviously showing a marked improvement in confidence and ability. But more interesting was the effect of stress or cognitive noise on the performance and concentration for the task being carried out. It has been known for years that fatigue, stress, and outside distractions impact performance in the periop environment, but I have never seen a means to research and study it. The EEG cap sounds intriguing, but having found physicians willing to wear it is even more intriguing.

As you heard, Banner Health has a brand new Simulation Center. The Sim Center is not a virtual simulation center. There is over 50,000 square feet of physical simulated hospital space including a periop (surgery) room equipped with the bed, anesthesia machine, lighting, equipment and a patient (mannequin); an acute area that can simulate ED, ICU, Med Surg and a Labor and Delivery area. The attitude and expectations of the evaluators in the center are to have participants dress and act the part. The hi-tech mannequins are not to be called mannequins. They are to be called patients. The clinicians going through the simulations are required to dress in their appropriate scrubs, wear their stethoscopes, etc. It should look and feel like you are in the real environment. This Sim Center is a great tool for clinicians either entering into the clinical world for the first time (new graduates) or for experienced clinicians wanting to change their clinical focus and learn a new area of medicine, but the Sim Center by itself can't gather the information presented. What's missing in the Sim Center lies in the research being done at ASU, but what's missing in the research is the integration of workflow, which is what the Sim Center will hopefully excel in. Although workflow doesn't sound exciting, it has been shown to be the key to adopting any changes in healthcare. Understanding how to perform a procedure is just one piece of a clinician's day. Knowing how to survive and be the most effective in getting through your day is workflow. Just as a "take home" model is being explored for laparoscopy simulation, it would be great to expand simulation into a workflow dialogue and help clinicians play this over and over to give them confidence to not only do a procedure but also document and interact with the patients in a timely and effective manner. There is significant stress on clinicians in their first few days following orientation when faced with a fully digital facility. Their workflows change overnight. In the absence of an EMR, clinicians spend a great deal of time tracking down information. In a digital environment, information is readily available, and the effort of finding the paper chart and organizing their day around others using the paper charts disappears. What clinicians may have been doing for years completely changes in as little as one day when they enter a fully digital environment.

Mithra introduced a new type of chart (box plots) which I had not seen before and took the time to explain the charts. I have since searched and read about box plots. Another name for them is Box and Whisker Plots. One website I found helpful is http://cnx.org/content/m10215/latest/ because it shows the raw data and explains how the 25th, 50th, and 75th percentiles are displayed, as well as the boundaries of the whiskers and their meaning.
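To make the box plot idea concrete, here is a small sketch I wrote (my own example, not from the lecture or the website) of the numbers behind a box-and-whisker plot:

```python
# The 25th/50th/75th percentiles form the box; the whiskers usually extend to
# the most extreme points within 1.5 * IQR of the box, and anything beyond
# that is drawn as an outlier.
import numpy as np

data = np.array([3, 5, 7, 8, 9, 10, 11, 12, 14, 15, 30])   # 30 is an outlier

q1, median, q3 = np.percentile(data, [25, 50, 75])
iqr = q3 - q1
lower_whisker = data[data >= q1 - 1.5 * iqr].min()
upper_whisker = data[data <= q3 + 1.5 * iqr].max()

print(q1, median, q3)                 # the box
print(lower_whisker, upper_whisker)   # the whiskers; 30 falls outside as an outlier
```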
Posted by Debbie Carter

Sunday, November 8, 2009

Research Methods in Healthcare IT

Content: Due to an unavoidable reason, I missed this week's lecture, so I had to depend entirely on the slides and other students' blogs to understand the context of the lecture. The blogs seem extremely useful, as they represent each individual's understanding of the lecture from their own point of view.
The lecture "Research Methods in Healthcare IT" describes the possible fields where health informatics could be applied. It started with one example of healthcare IT, Intermountain Healthcare (IHC), an integrated health network based in Salt Lake City, Utah. The lecture included the success story of the system and also explained the architecture of IHC. The system receives data from different sources and then organizes the data using a "central data repository" and an "enterprise data warehouse" to create useful healthcare knowledge. Finally, the lecture concluded with the system's very successful outcomes.


Posted by Gazi

Saturday, November 7, 2009

Nov. 2, 2009

Content:
As mentioned previously, Dr. Parker presented us with an application example that seems to take bits and pieces of other lectures we've had and make them a success. He explained how the choice among different methods of storing data depends on how the data is entered and retrieved. One other point I found interesting was that, although the systems put into place by Intermountain Healthcare are standards compliant, it was ensured that they are not standards dependent. This essentially means that if standards change, the systems can adapt to those changes without major reworking. This makes a lot of sense: although standards are designed to create a uniform method and set of requirements, the people defining those standards are not standard, and standards change as opinions change.
Posted by Eric

Data warehousing

Content:
As I understood from the lecture on health IT systems, data warehousing is a key issue acting as a hindrance to the implementation of successful EMRs. Two types of data repositories were discussed. One was the clinical data repository (CDR), which is patient-specific information consolidated into a single database from various sources like clinical notes, laboratory, radiology, etc.; the other was the enterprise data warehouse (EDW), which includes data from financial and business sources in addition to clinical data.
The key issues with the CDR include data normalization, use of standards across the systems used to input data, storage capacity, and the interfaces between the CDR and the various source systems that feed data into it. The storage issue further gives rise to the question of using a relational database structure or an object repository (I think this relates to the EAV model of database modeling).
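To make the EAV idea concrete, here is a minimal sketch in Python using an in-memory SQLite database; the table, attribute codes, and values are invented for illustration and are not IHC's actual design.

    # Minimal entity-attribute-value (EAV) sketch using an in-memory SQLite
    # database. Table, attribute codes, and values are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE observation (
            patient_id  INTEGER,
            attribute   TEXT,   -- a coded concept, e.g. from a data dictionary
            value       TEXT,
            observed_at TEXT
        )
    """)
    conn.executemany("INSERT INTO observation VALUES (?, ?, ?, ?)", [
        (1, "systolic_bp",   "128", "2009-11-02"),
        (1, "diastolic_bp",  "82",  "2009-11-02"),
        (1, "serum_glucose", "5.6", "2009-11-02"),
    ])

    # A new kind of result needs no schema change, just a new attribute code.
    for attribute, value in conn.execute(
            "SELECT attribute, value FROM observation WHERE patient_id = ?", (1,)):
        print(attribute, value)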
Enterprise data warehouses are used in various other industries, including finance and banking. Their use in medicine is particularly challenging because of the lack of standards across platforms, so standardization of the semantics is one of the key issues. I think an effective EDW could actually solve the problem of electronic health records. That's easier said than done, due to the obvious lack of standards and the role of multiple stakeholders in building an EDW. The merit of having one is indisputable, as the data can be used for research and public health reports. In fact, EDWs are used extensively in clinical research, which has data inputs from many sources that do not necessarily follow the same standards; the SAS software package has its own EDW system for clinical trials. But, as pointed out in the lecture, clinical data is complex and this solution may be more complicated than it appears.



Posted by
Sheetal Shetty

Case study of HIT technology

Content:

BMI 502 came back to computer applications in medicine again this week. Health information technology is a core field in biomedical informatics and is the application of technology most closely related to health care. Dr. Parker gave us a case study of HIT in a Utah health system and led us to a practical understanding of how HIT will impact the practice of medicine and what problems and issues come with implementing HIT in a hospital. First, from a technology perspective, the main obstacle in the HIT field is still standards. Although there is the HL7 standard for integrating health information systems, wide acceptance of the standard is still a big issue, because currently there are many big hospital systems, such as Banner, UnitedHealth, and Mayo Clinic, which all have their own health information systems with different standards. Integrating these different systems under a uniform standard definitely involves upgrading systems, which means more cost in implementation and in training. So how to push these hospital systems to give up their own standards and accept a new one is a big issue. Second, from a personal perspective, resolving the security of patient information will also be a challenge. Once health information systems are widely implemented nationwide, individual health information will become an important private domain requiring strong security technology, as well as policy, to ensure the safety of patient health information. The security issue was not discussed in Dr. Parker's lecture; I hope to hear it discussed in future classes about HIT.

Posted by

Di Pan

Friday, November 6, 2009

IHC

Content:
This week Dr. Parker gave us a lecture on Intermountain Healthcare, an integrated health network based in Salt Lake City, Utah. He introduced us to IHC's process for using clinical data and the architecture that supports that process.

The lecture covered using data to determine areas of opportunity, constructing "clinical programs" around those areas, designing an evidence-based, multidisciplinary approach, using informatics to assist in achieving the clinical goals, and measuring outcomes. The most important thing I learned is the data architecture they use to integrate information; Dr. Parker used a graph to give us an intuitive understanding. What impressed me most is their combination of a relational database and an object repository. In a relational database, data are arranged in tables, which makes indexing easy, while in an object repository, complex data are stored in a single field, which makes retrieval easy. Dual storage of clinical information, with relational indexes pointing to object data, enables both fast indexing and fast retrieval. Another important point is the difference between the Clinical Data Repository (CDR) and the Enterprise Data Warehouse (EDW): the former is patient oriented and optimized for clinical care, and the latter has various orientations.
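Here is a toy Python illustration of that dual-storage idea: a few narrow, indexed fields pointing at full documents stored separately. The field names and documents are made up and only meant to show the pattern.

    # Toy "dual storage": narrow indexed fields that point at full documents
    # stored separately. Field names and documents are made up.
    documents = {
        "doc-001": "<note>BP 128/82, follow up in 2 weeks</note>",
        "doc-002": "<panel><glucose>5.6</glucose></panel>",
    }

    # Relational-style index rows: (patient_id, document_type, document_id)
    index = [
        (1, "clinic_note", "doc-001"),
        (1, "lab_panel",   "doc-002"),
    ]

    # Fast lookup through the index, then retrieval of the whole object.
    wanted = [doc_id for pid, doc_type, doc_id in index
              if pid == 1 and doc_type == "lab_panel"]
    for doc_id in wanted:
        print(documents[doc_id])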


Posted by
Jing Lu

Healthcare IT (IHC as case study)

Content:

The lecture on "Research Methods in Healthcare IT" was very interesting as it was able to give clear idea of how health informatics is being implemented. The lecture was based on the case study of Intermountain Healthcare (IHC). The case study showed how the clinical data are being used in decision making. The speaker also explained the architecture of healthcare IT; integration engine, EMR, Central Data Repository, Enterprise Data Warehouse being the main components. Data standards, data dictionary, and data normalization was also covered in the lecture, which are very important parts during the implementation of a data warehouse. The different kinds of clinical data repositories, relational databases (MySQL, MSSQL, Oracle) and object repositories(XML), were also explained. We also studied when to use central data repository and enterprise data warehouse. Generally, data warehouse is used for large comparative queries that leads to the creation of "knowledge". In case of normal data repositories, they are optimized for specific domain and are very useful for real-time queries. The outcome of the implementation showed the impact of Healthcare IT, and we can definitely enhance its uses to improve healthcare.

Posted by
Prabal

HIT Methods

Clinical data is not just information; it is a tool that can prove very beneficial to society. The data itself is "living": it can significantly impact the well-being of individuals. The methods employed by IHC showed a well laid out model for storing, processing, and analyzing clinical data. "Studying" clinical data allows for pinpointing clinical areas of interest that have the potential to positively impact society. It is also important to clearly define goals and use facts to appropriately measure outcomes. Data is integrated from many areas, standardized as best it can be, and distributed to the appropriate holding sites. Also, probably the most important aspect of clinical data is the role of teamwork: keeping the lines of communication open between all departments and disciplines allows for better flow and maintenance of data from beginning to end.


Posted by Annie

IHC Case Study

Content:

This week we had only one lecture, on research methods in healthcare IT, with Intermountain Healthcare (IHC) as a case study. IHC is an integrated health network in which 21 hospitals and more than 140 clinics take part. Physicians are employed by IHC to participate in its health plans. A major task of IHC is handling clinical data. IHC's process for using clinical data has five steps:


1. Determine areas of opportunity through clinical data

2. Construct clinical programs around those areas of opportunity

3. Design evidence-based, multidisciplinary approaches

4. Use informatics to define the solution

5. Measure the outcomes

Much emphasis was put on the fourth step. Data integration and data normalization are the core of the architecture, and they are realized by the integration engine and the data dictionary. The integration engine is a server that coordinates the flow of clinical and administrative data within an enterprise; it determines the destination of the data it receives. The data dictionary supports the integration engine with concepts representing clinical data so that the data can be translated into a common format. Having a single representation of the same concept not only decreases development time but also reduces the opportunities for errors. The data dictionary seems to be used as a standard, and it actually acts at the terminology level. Clinical data standards are critical for the clinical data model: as we have learned, standards enable the communication of data between different systems and also decrease costs. Nevertheless, standards are not perfect all the time. On this point, IHC is effective because it has its own data models, which are 'standard-aware' but not 'standard-dependent'; hence it will not have to change all of its data when standards change. The clinical data repository combines a relational database, which is quick at finding specific information such as a blood pressure, with an object repository, which is fast at retrieving whole records. Besides the clinical data repository, the architecture includes an Enterprise Data Warehouse for information other than clinical data. Anyway, both the repository and the warehouse serve the purpose of better application of health IT.
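As a rough sketch of what acting "at the terminology level" might look like, here is a tiny Python mapping from local source-system codes to a single shared concept; all of the system names and codes here are invented for illustration.

    # Tiny data-dictionary sketch: map each source system's local code to one
    # shared concept. System names and codes are invented for illustration.
    DATA_DICTIONARY = {
        ("lab_system_a", "GLU"):    "serum_glucose",
        ("lab_system_b", "2345-7"): "serum_glucose",
        ("ehr_system",   "BP_SYS"): "systolic_bp",
    }

    def normalize(source_system, local_code, value):
        concept = DATA_DICTIONARY.get((source_system, local_code))
        if concept is None:
            raise KeyError(f"unmapped code {local_code!r} from {source_system!r}")
        return {"concept": concept, "value": value}

    print(normalize("lab_system_a", "GLU", 5.6))
    print(normalize("lab_system_b", "2345-7", 6.1))  # same concept, other source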


Posted by  Xiaoxiao

A Research Innovation in Health Care IT

Content:

Dr. Parker's lecture gave a comprehensive view of the Intermountain Healthcare system. The study clearly conveys that effective and appropriate use of clinical data has the ability to transform the entire health care industry in an unimaginable way. Intermountain Healthcare is widely recognized for its superior quality and for offering health care at lower cost. The system uses an evidence-based approach and integrates data using a central data repository and an enterprise data warehouse. An integration engine handles clinical, financial, and other administrative data by receiving it from the source, transforming it according to need, and finally sending it to the destination. Thus, the lecture focused on a tremendous research innovation in health IT.
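Here is a very simplified Python sketch of that receive-transform-route flow; the message fields, the transformation, and the destinations are all made up, just to show the shape of what an integration engine does.

    # Very simplified integration-engine flow: receive, transform, route.
    # Message fields, the transformation, and the destinations are made up.
    def transform(message):
        # Rename fields and coerce types into a common internal format.
        return {"patient_id": message["pid"],
                "concept": message["code"],
                "value": float(message["val"])}

    def route(message):
        # Clinical results go to the CDR feed; everything else to billing.
        return "cdr_feed" if message["concept"] != "charge" else "billing_feed"

    queues = {"cdr_feed": [], "billing_feed": []}

    incoming = {"pid": 1, "code": "serum_glucose", "val": "5.6"}
    normalized = transform(incoming)
    queues[route(normalized)].append(normalized)
    print(queues)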

Intermountain Healthcare offers valuable resources supporting advanced clinical research.
http://intermountainhealthcare.org/about/quality/Pages/home.aspx


Posted by
Harsha Undapalli

Dr. Parker's Lecture

Content:
I believe Dr. Parker's lecture on research methods in healthcare IT made a lot of sense to me, as I saw it from an application perspective of what I have learned so far in BMI, starting from the concepts of HL7 and standards, to relational databases and warehousing, to evidence-based approaches. The case study of the Intermountain Healthcare system lays a foundation for tackling the challenges of integrating information technology into health care to improve the quality of care delivery, achieve cost savings, and develop clinical decision support systems. For me the most interesting part was learning about the architecture of data integration: clinical data come into a central data repository, where the data are normalized, standardized, and integrated using the integration engine and data dictionary. Here HL7 and the RIM play the key role. The other important factor is quick access to and retrieval of clinical information.
All in all, Intermountain Healthcare and Banner provide an example of better integration of IT in health care, a challenge every health care provider will have to go through in the coming years.


Posted by
Ashutosh

Week of 11/02/09


Content:  The lecture for this week was on research methods in healthcare IT, using Intermountain Healthcare (IHC) as a case study. IHC utilizes its clinical data for quality improvements, cost reductions, and research. That clinical data is stored in a clinical data repository and an enterprise data warehouse. IHC adheres to evidence-based medicine by using published evidence external to the organization and internal evidence collected at the organization; the clinical data collected at IHC serves as the internal medical evidence. Some of the things informatics is used for at IHC are integrating data from multiple sources, normalizing and standardizing data, storing data, and making the data accessible.

Dr. Parker helped me to understand more about what work as an informaticist at a healthcare organization can be like. I also liked that he presented information about Intermountain Healthcare because it clearly exhibits a successful use of informatics services. Being able to manage data so effectively allows the organization to use that data in a large number of ways. I agree with Lee and others who have posted that IHC sets an example that many other healthcare organizations will be interested in following in their informatics strategies. IHC's web page on quality and research includes a quote about their use of information technology:
http://intermountainhealthcare.org/about/quality/Pages/home.aspx

Posted by:

Nate

11/2

Content:

I really enjoyed Dr. Parker's lecture. He presented a successful case study of Intermountain Healthcare. When you see a success story like this one, you should really study it and learn from it. This particular lecture tied together everything that we have learned, from data dictionaries and standards to evidence-based medicine. It is always nice to see an applied example of everything we are learning; it helps me understand the concepts more. I really hope we have more lectures like this one.


Posted by P. Ortiz

Wednesday, November 4, 2009

Content: I first would like to comment on Dr. Parker's lecture. It integrates with Dr. Fridsma's lecture on electronic medical records and Mr. Warden's previous lecture. Intermountain Healthcare is a good example of how information technology, as part of an integrated approach applied to areas of high variability, including clinical issues, can impact care, cost, and efficiency. The fact that data can be generated for research is critical. The use of the evidence-based approach is also important as part of this unified scheme. The fact that data come into an integration engine that is linked to a data dictionary provides a real-world use of the HL7 RIM for the integration engine and of terminology organizers such as SNOMED. Another very effective feature is a relational database integrated with an object repository that can retrieve rapidly when triggered from the database. Each is part of a central data repository that can communicate directly with users and clinical departments. This CDR feeds an Enterprise Data Warehouse that handles and integrates clinical, financial, and business data. I consider this and the Banner system to be the standards that we should be looking at nationally for the evolving role of BMI in clinical medicine.

At this time, I also want to discuss the two lectures by Dr. Dinu on methods in bioinformatics. As a molecular biologist, this is an area that has clearly evolved, with enormous current and potential application. In 1981, using recombinant DNA tools that were just being developed, we were able to construct a cDNA library from rat liver. Subsequently, the cDNAs plated onto special large sheets of filter paper could be screened by sequential colony hybridization with cDNA probes from different hormonal treatments. This yielded different abundances for specific cDNAs, reflective of specific mRNAs that were regulated by hormonal treatment. Abundance determination was usually visual, following autoradiography. Using digitization and quantitation schemes developed by NASA and Johns Hopkins, abundance determinations could be made for large numbers of colonies, compared, and then represented by differing color intensity. Today, the use of microarrays with large libraries of DNA, in combination with PCR amplification, is an offshoot of these earlier, more labor-intensive colony hybridizations for selecting relevant molecules to look for mechanisms of transcriptional regulation. One of the major advances has been the use of optically active molecules incorporated into probes rather than radioisotopes, which have decay and health risks. The technology has also taken us much further in terms of identifying and quantitating changes in mRNA abundance through machine learning techniques with semi-supervised and supervised learning. This is particularly amazing to me, and I look forward to applying these techniques.

Another aspect of extreme significance is the ability to use microarrays for SNP analysis to look for disease associations and potential gene abnormalities. These types of analyses provide a basis for developing molecular techniques for disease identification and screening, and potentially for evaluating disease severity or recurrence (particularly in the case of cancer). I expect that this will evolve significantly, in no small part due to new techniques in proteomics. In the exercise for the class, I was very pleased to see the power of the gene software and BLAST, which in the past has been principally manual and very clunky.

In my next blog, later this week, I will try to make some sense of natural language processing and text retrieval, which has great potential for mining data on gene expression that is not immediately found in simple PubMed searches.


Posted by Stuart

InterMountain HealthCare Case Study

As far as I'm concerned, the information shared by Dr. Parker is the world I live in all the time and the guts of clinical informatics at present, if you were to apply for a job with a healthcare organization. What Intermountain has accomplished reflects the challenges and opportunities every healthcare organization will face in meeting HIT demands. Many healthcare organizations will need to tackle the data dictionary concepts and the data storage decisions. Intermountain has utilized the best-of-breed approach, which requires a central data repository to pull all the information together. Once that information is in a central location, it can be fed to the data warehouse for reporting or to the EMR for clinician access. Decision support is based upon the data flowing into the CDR, so it is important to have a good data dictionary, data stewards, etc., in place to oversee the quality of the data being entered. Overall, I loved the presentation, as it was true to the reality of clinical informatics today and helped me apply the knowledge presented.

Posted by :  Debbie Carter

Dr. Parker 11-2-09

Content:  I agree with you, Lee: Dr. Parker's lecture and topic were a breath of fresh air that grounded me back in the basics of BMI. I loved the real example of integration. It was good to see another example, outside of Banner, that was successful at connecting a large network of hospitals and other clinical settings. It seems so simple to take what they do and apply it nationally. Too bad there are so many people out there who are more concerned with the money than with the great outcomes this type of communication in a health care system can bring.



Posted by Laura Wojtulewicz

Research Methods in Health IT

The lecture by Dr. Parker brings sensibility back to BMI for me. While I understand the value of NLP, genomics, and other concepts that are not inherently logical to me, it's good to be re-grounded by a lecture that I can more easily relate to.

Intermountain Healthcare is a model for all to learn from, if not execute. The integration engine in HL7 format, the data dictionary that seems to utilize ontologies in a logical format, and the enterprise data warehouse seem to solve the problems with EHRs (and perhaps EMRs) that we have discussed so often. They're all common-sense pieces to the unsolved puzzle of universality and connectivity. However, as Dr. Parker indicated, solutions become complicated when politics and market share are involved. My experience has shown that the medical field is a magnet for open palms hoping to "cash in" on a captive audience that must pay to comply. Until now, I had not really viewed this as a large obstacle. Overall, I found the lecture and the Intermountain Healthcare model logical and relevant.

Lee

Sunday, November 1, 2009

Interesting lectures on Natural Language Processing

Content:
Dr. Graciela Gonzalez presented very interesting lectures on natural language processing and its usefulness in dealing with biomedical literature. Many people tend to use the terms 'text mining' and 'data mining' synonymously; in the lecture, Dr. Gonzalez gave a clear picture of how data mining differs from text mining. Ambiguity is one of the key problems when handling any data, especially biomedical data. The primary step in text mining is tokenization, which involves identifying the tokens (words) in a given piece of text. It is very challenging to tokenize data, as we encounter a lot of variants like abbreviations, hyphens, and apostrophes. The key to effective biomedical text mining lies in properly handling all these variants and the ambiguity in the data.
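To give a feel for those tokenization headaches, here is a rough regular-expression tokenizer in Python. It is only a sketch: the pattern and the sample sentence are my own, and real biomedical tokenizers handle far more cases than this.

    # Rough tokenizer: keep hyphenated terms, abbreviations with internal
    # periods, decimals, and apostrophes together as single tokens.
    import re

    TOKEN = re.compile(r"""
        [A-Za-z]+(?:\.[A-Za-z]+)+\.?   # abbreviations such as e.g. or i.v.
      | \d+(?:\.\d+)?                  # numbers, including decimals
      | \w+(?:[-'/]\w+)*               # words, hyphenated terms, apostrophes
    """, re.VERBOSE)

    text = "The patient's IL-2 levels (e.g. 1.5 mg/dL) were re-checked."
    print(TOKEN.findall(text))
    # "patient's", "IL-2", "e.g.", "1.5", "mg/dL", and "re-checked" each
    # come out as a single token instead of being split apart.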

In the next lecture, we discussed some details of using regular expressions in text processing. This is a good link for understanding various regular expressions:
http://www.zytrax.com/tech/web/regex.htm#intro
We also dealt with finite state automata (FSA) in the lecture. I referred to this page for a better understanding of FSA:
http://www.eti.pg.gda.pl/katedry/kiw/pracownicy/Jan.Daciuk/personal/thesis/node12.html
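For anyone who wants to see an FSA in code, here is a toy deterministic automaton written as a Python transition table; the language it accepts (strings of a's and b's ending in "ab") is just an example I chose, not one from the lecture.

    # Toy deterministic finite state automaton: accepts strings over {a, b}
    # that end in "ab" (the pattern is chosen only for illustration).
    TRANSITIONS = {
        ("start",  "a"): "seen_a",
        ("start",  "b"): "start",
        ("seen_a", "a"): "seen_a",
        ("seen_a", "b"): "accept",
        ("accept", "a"): "seen_a",
        ("accept", "b"): "start",
    }
    ACCEPTING = {"accept"}

    def accepts(string):
        state = "start"
        for symbol in string:
            state = TRANSITIONS.get((state, symbol))
            if state is None:      # symbol outside the alphabet
                return False
        return state in ACCEPTING

    print(accepts("aab"))   # True
    print(accepts("abba"))  # False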

Posted by
Harsha Undapalli