Just in case you have been hiding under a rock for a couple of months, I thought I would bring this topic back to the spotlight. I have gotten a little annoyed at news shows and the like because they keep repeating the same thing; you get to the point where you don't want to hear it anymore. It's like a broken record, and I don't want to sound like that all the time.
There is a crucial point to make, though, so I will say it again: health care, as I define it, is something that helps your body function properly. If you are sick, medicine might help you get better. If you twist something in your back, a chiropractor could help you get better. If you have something serious like appendicitis, surgery may be needed to help your body get better. Notice a trend? If you are pregnant, an abortion does not help your body get better; it carries many health risks that are not usually associated with the pregnancy itself. As a result, abortion does not fit my definition, and here is an article that explains the current events pretty well if anyone is interested.