Exactly! And for me it's also usually a farther reach. I'm often using my Surface Pro 3 at my desk with my hands resting on the surface as I work and read on the screen. I had gotten into the habit of keeping one hand close to the screen; to go back, I could just lift a finger and flick. Now I have to raise my whole arm to reach that back button.
The thing that really kills me is that writing a software routine to respond to a gesture by executing the back-button action takes just a few lines of code. So either Microsoft doesn't have a way to detect/process the gesture and call that routine, or Microsoft is purposely withholding the feature. And I find the first horn of that dilemma really hard to believe: Microsoft has written plenty of gesture-detection code, if not yet in Edge then most definitely in other products, as proven by the fact that those products handle gesture input.
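Just to show what I mean by "a few lines of code", here's a rough sketch in plain JavaScript of what a back-swipe handler could look like. The thresholds (80 px, 500 ms) and the function name `isBackSwipe` are my own illustrative assumptions, not anything from Edge:

```javascript
// Hypothetical sketch: decide whether a touch was a quick rightward
// "go back" flick. Thresholds are made-up values for illustration.
function isBackSwipe(startX, endX, elapsedMs) {
  const MIN_DISTANCE_PX = 80;   // finger must travel far enough rightward
  const MAX_DURATION_MS = 500;  // must be a quick flick, not a slow drag
  return endX - startX >= MIN_DISTANCE_PX && elapsedMs <= MAX_DURATION_MS;
}

// Wiring it up in a page context (also a sketch, not Edge internals):
// let start;
// window.addEventListener("touchstart", e => {
//   start = { x: e.touches[0].clientX, t: Date.now() };
// });
// window.addEventListener("touchend", e => {
//   const end = e.changedTouches[0].clientX;
//   if (start && isBackSwipe(start.x, end, Date.now() - start.t)) {
//     history.back();  // same effect as pressing the back button
//   }
// });
```

Obviously the real browser would hook this at a lower level than page-script touch events, but the point stands: detecting the flick and triggering the back action is a tiny amount of logic.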
So now my question is: am I just missing some option, or can you tell me why MS is withholding the back gesture?