Any topic

Some historians view wars as catalysts for profound social, political, and economic change. Others believe that wars entrench the status quo. Looking at the major wars fought by the United States (the Indian Wars may be considered collectively as a single war), with which side (if either) do you most agree, and why? [If you agree with neither, what conclusions would you draw about the impact of wars, and why?]