Governments discourage or outright forbid the use of certain drugs because they are bad for your health. But if a drug has been shown to have positive health effects, should the consumption of that drug be mandatory?
It could make for some interesting pickup lines:
"Can I buy you a drink?"
"It's the law, toots."
Does anyone else get the feeling that scientists keep finding medical benefits for politically correct drugs?