I can’t believe there’s a ‘new’ field emerging in medicine: ‘Environmental Medicine’.
Isn’t all medicine supposed to take into account factors in your environment?
Maybe I’m just missing something, but ‘Environmental Medicine’ sounds pretty much like quackery to me.
It sounds like a way to engage sick people with things they can do to try to make themselves feel better (things like ‘get Windex out of your life’, and then use what, bars of soap made from the cow I’m milking in my backyard?).
I’m sure these physicians mean well, and I’m sure cleaning up your environment and getting poisons out of your life is a good thing.
But as its own category of practice?
Sounds like just another siren call preying on people who are sick and miserable, or a way for doctors to get rid of patients with complicated cases they don’t really know how to treat (or don’t want to).
Eating a fruit-and-veggie diet, exercising, and avoiding chemicals in the house will help people feel better. It doesn’t take a rocket scientist to figure that one out.
But for people with real, crippling medical problems, people like me, I only wish it were as simple as eating a few more fruits and veggies. Instead, I wouldn’t be alive today without good old-fashioned Western medicine. It’s not because I’m depressed, it’s not because I’m bored with my life, it’s not because I think doctors’ offices are cool places to be.
I need Western medicine because there is a piece missing from my immune system, and God knows I would do anything to fix it and go back to my old life, which is starting to feel more and more like a forgotten dream.
I’m not deluded, delusional, or crazy. I’m not a fragile, emotional person looking for an authority figure to hold my hand.
I’m fucking sick, and throwing out all of my household cleaners isn’t going to make me better. If anything, it’ll probably make me worse, since it’ll leave me more prone to picking up some funky bacteria lying around.
Am I missing something?