I'm not going to mention recent events, but anyone who watches the news has to concede that the United States of America is in steep social decline.
Anyone who harbours ill will towards the USA need not waste energy and resources attacking such a brutishly powerful nation. Rather, they need only wait as it destroys itself from within.
Right-wing Christian fundamentalism, combined with a colonial attitude towards the world and a toxic obsession with a misinterpreted constitutional right to bear arms, is destroying a briefly great nation.
The only real tragedy is that many far superior Western nations have irreversibly entwined their fates with that of the USA.