Is car insurance required in Florida? Absolutely. In fact, carrying some level of car insurance is the law in every state except two (New Hampshire and Virginia). In Florida, you must carry proof of insurance with you whenever you drive, and that proof must be current.