I thought screen tearing occurs when the fps is higher than the refresh rate? I'm confused because wouldn't tearing be kind of impossible with a lower fps, especially if people are playing on 60hz displays? I'd think a lower fps than 60 would introduce stuttering, but that depends on how often it happens, like during a big explosion or something. Wouldn't motion blur make it look smoother even if it's not syncing with the refresh rate?
well if it's "unlocked" the framerate can go over 60 as well. but I've definitely experienced tearing at below my refresh rate
60fps on a 60hz display is one frame per cycle. 30fps on a 60hz display is one frame every two cycles. anything in between is in between, and needs to be buffered (wait for the next cycle) or you get tearing (the new frame swaps in partway through a cycle). these things are lessened on a 120hz display, since there are more cycles to dump frames into. for instance 40fps becomes viable on a 120hz display, at one frame every 3 cycles
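the arithmetic above can be sketched in a few lines. this is just an illustration (the function names are made up, not any real graphics API): for a given framerate and refresh rate, how many display cycles each frame spans, and whether it divides evenly.

```python
def cycles_per_frame(fps: float, refresh_hz: float) -> float:
    """How many display refresh cycles each rendered frame spans."""
    return refresh_hz / fps

def paces_evenly(fps: float, refresh_hz: float) -> bool:
    """True when every frame covers a whole number of cycles,
    i.e. no frame has to straddle a refresh (buffer or tear)."""
    return refresh_hz % fps == 0

# the cases from the comment above
for fps, hz in [(60, 60), (30, 60), (45, 60), (40, 120)]:
    print(f"{fps}fps @ {hz}hz: {cycles_per_frame(fps, hz):.2f} cycles/frame, "
          f"even pacing: {paces_evenly(fps, hz)}")
```

45fps on 60hz comes out at 1.33 cycles per frame, so some frames must be held for one cycle and others for two, which is exactly the in-between case that stutters or tears.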
you can lessen how much the player notices these things in games with a fixed camera. but in an FPS, where you can look at something rather complicated and then whip around to look at the sky, it's hard to avoid
these are long-standing issues on the PC, where most people are forced into some compromise (usually choosing screen tearing over any input lag). that's the reason g-sync is such a big deal. on the consoles v-sync is pretty standard, mostly because controllers aren't precise enough for the input lag to be noticeable anyway. but you're still going to see noticeable stutter as the framerate jumps around