Is it paranoid to think that doctors get paid by drug reps for prescribing their brand of drugs? Maybe "paid" isn't the right word; perhaps they receive perks such as free vacations. Or is it naive to think that they don't get incentives from drug reps at all?
If doctors do get incentives, how do we know they are really prescribing what is best for us? Why not just prescribe the drug that gives them the best incentive?