"The End"

The death of romance in America

American society is changing in ways we prefer not to see, especially in the relations between women and men, which are becoming dysfunctional as romance dies.