When I lived in Denver I had an amazing pediatrician. The first time I saw him after Ava was born, he weighed her to see if she'd gained weight since birth and said to me, "Well, we know momma's factory is working good!" I knew right then that I loved him.
Eight months later, when we moved west, I had to start all over again. At first I thought the group of doctors I'd found was okay, but the more often I went and the more invasive they became, the less I liked them. They started asking so many questions that I felt were none of their business, and I found myself feeling like I needed to lie to them (like I was afraid they'd call child services on me if I answered wrong). Okay, not that bad, I knew they wouldn't call child services, but I didn't like it. My dislike of them was confirmed when my sister-in-law told me about one of the doctors dropping the pants on her 5-year-old son without so much as a "hey, is it okay if we do this now?" or anything. Just yanked 'em down. Poor kid.
So here's my dilemma. Do you lie to your doctor too? Is it just me? Does anyone think there's a legitimate reason they ask the questions they do, and that I should be totally honest about the fact that no, I don't put sunscreen on my child every time we walk out the door? (Sorry, but the only time my kids wear sunscreen is if we'll be outside for an extended period of time.) By the way, last time I was there I did give them a totally honest answer about the sunscreen, and they were not happy with me. This is why I lie: to avoid the unnecessary discussion about why it's important, blah blah blah.
I'm tempted to just say, "Sorry, but I don't think that's any of your business," and see what happens. Would you?