I haven't done any hands-on nursing for over 2 years now, but when I did finally leave it was after 2 years as an RN. In the mid 1980's when I graduated I was so happy, ecstatic even, with my chosen career, and I certainly felt appreciated by my patients.

By the time I hit 15 years in, my view started to change. You see, health care systems started to be viewed by government as business systems, so patients became "clients" and "health care consumers". Avoiding lawsuits became top of mind for the higher-ups, leading to things like ... if someone showed up at the emergency room external exit crying for help, we were not allowed to step outside and help them in; we had to call 911 and simply stand there and watch them bleed to death until paramedics arrived (the bleeding to death is a little dramatic, but you get my point here). At one new job site I was asked to remove my outdoor shoes (it was late Spring and the ground was dry) at the front entrance. Excuse me, but walking in stocking feet in a hospital/community health setting is not particularly safe, nor sanitary, in my view.

It was circumstances such as these that began to drive a wedge between me and the profession I had wanted since I was a child. Years went by and I could no longer work in a profession that seemed to have lost its very reason for existing in the first place: caring for other human beings. "Caring" seemed to take a far back seat to profits $$. I suffered emotionally and certainly financially when I finally made the decision to step out and away.

Today I am hearing that many new grads leave nursing after just a year in practice. Their reasons? Overwork, forced overtime, bullying, and violence, to name but a few. The nursing profession is at the heart of, if not THE HEART OF, every health care system around the globe. Common practice now is hallway medicine, people dying in emergency rooms while waiting for care, and far too many other horror stories. Nurses must speak out, LOUDER, in order to protect the human rights of others.