The foundation of American government rests on a simple but powerful principle: states are not mere departments of the federal government. They are sovereign entities with both the right and the responsibility to protect the health, safety, and well-being of their residents.
"We're still a federal state, and that means that there are powers that are given to the federal government in D.C. and powers that are given to states and localities," Ross Burkhart told me.