CDC

The Centers for Disease Control and Prevention (CDC) is recognized as the lead federal agency for protecting the health and safety of people at home and abroad, providing credible information to enhance health decisions, and promoting health through strong partnerships. CDC serves as the national focal point for developing and applying disease prevention and control, environmental health, and health promotion and education activities designed to improve the health of the people of the United States.
