The government needs to get out of the medical field. Its involvement has only made medicine dangerous, poor quality, and corrupt; it has taken away informed consent and caused permanent injury and death, while the worst humans on this earth profit from deals with governments to mandate medical decisions. It has also been used to sell out our sovereignty to the WHO, which is medical militarization. Doctors' quality of care has gone by the wayside for 15 years. There is no credibility and definitely no trust in them. Vaccinations should not be compulsory.