I don't even know who to talk to about this because I literally haven't talked to anyone about it. Several years ago, a doctor told me that I had an STI. He said it was not contagious as long as there wasn't a breakout. So over the last couple of years, every time I had a boyfriend and could feel an outbreak coming, I would break up with him because I was so ashamed and didn't know how to tell him. For the last 2 years I just stopped seeing anyone because I was so embarrassed.

My doctor recently retired and I went to a new physician. She said that it is NOT an STI, just a skin infection that can easily be cleared up in a week with some antibiotics and a cream. I have been sabotaging my relationships for years over this, and now I'm being told that it's not contagious, it's not permanent, and it's not something to be embarrassed about. I feel like I've wasted the last 5-6 years of my life being depressed, hiding, and ruining any and all chances I've had at love.