Women in many cultures used to be topless all the time, especially before marriage.
Men walked around wearing only enough to cover their man parts...if that.
What's fascinating to me is how sexualized WE'VE become: nakedness is automatically sex, sin, depravity, evil, etc. That Puritan instinct hasn't gone away in America.
Violence/guns/drugs, etc. are cool, but naked boobies leave us blushing and nervous.
Why are we OK with violence but not sex...or even anything that hints at sex?