It depends on how you define empire, I suppose. In a simple view, an empire is one nation holding control over several others; Rome is the typical example. The US has military bases all around the world, while no other country maintains a base on US soil. The US has a strong presence throughout the world, and is unique in that respect, but the nature of that presence is not an invading conqueror's; it is more like a raiding special police unit's. The US doesn't seek to militarily subdue other nations, but to ensure that they behave themselves.

I don't think the US is a military-based empire. Times have changed since 400 AD. Is war the only means of conquest? Does an empire have to spread by the sword alone? There are other ways to control nations: economic and cultural. The US is a cultural empire, and it is this that fosters the greatest resentment throughout the world. Invade Panama and depose Noriega, and the world scarcely blinks; but bring a Big Mac to Saudi Arabia, and the world cries oppression. For better or worse, US culture is homogenizing the world.