What Doctors Won’t Tell You

Most people reading this are probably, at least to some extent, cynical about the Western medical profession. In the West, medicine is ruled by money: above all the enormous profits made from drugs, but there is also a cancer industry, and vast sums are tied up in hospital supplies and the like. A few years ago …