The light field (LF) captures both the spatial and angular configuration of a scene, enabling a wide range of imaging applications. In this work, we propose an LF view extrapolation algorithm that renders high-quality novel views well beyond the given angular baseline. We adopt a stratified synthesis strategy that projects scene content according to stratified disparity layers and at multiple spatial granularities. This stratified methodology helps preserve scene structure over large angular shifts and provides informative cues for inferring the content of occluded regions. A generative adversarial network is further adopted for parallax correction and occlusion completion, conditioned on surface-consistent features. Experiments show that, at large baseline extension ratios, our model delivers more reliable novel view quality than state-of-the-art LF synthesis algorithms.
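The core idea of layer-wise projection can be illustrated with a minimal sketch. The code below is not the paper's method; it is a simplified, hypothetical illustration assuming a single reference view, horizontal baseline shifts only, nearest-neighbor warping, and one representative disparity per stratum (here, the mean disparity within the layer). Disoccluded pixels are left as holes, which in the full pipeline would be completed by the generative network.

```python
import numpy as np

def extrapolate_view(ref, disparity, shift, num_layers=4):
    """Render a novel view at horizontal baseline `shift` by splitting
    `ref` into stratified disparity layers and shifting each layer by
    its representative disparity times the baseline shift.
    (Illustrative sketch only, not the paper's actual algorithm.)"""
    h, w = ref.shape[:2]
    # Quantize the continuous disparity map into discrete strata.
    edges = np.linspace(disparity.min(), disparity.max() + 1e-6,
                        num_layers + 1)
    out = np.zeros_like(ref)
    filled = np.zeros((h, w), dtype=bool)
    # Composite strata front-to-back (largest disparity = nearest first),
    # so closer content correctly occludes farther content.
    for k in reversed(range(num_layers)):
        mask = (disparity >= edges[k]) & (disparity < edges[k + 1])
        if not mask.any():
            continue
        # Representative disparity of this layer -> integer pixel shift.
        dx = int(round(disparity[mask].mean() * shift))
        src_x = np.arange(w) - dx            # backward warp along x
        valid = (src_x >= 0) & (src_x < w)
        warped = np.zeros_like(ref)
        warped_mask = np.zeros((h, w), dtype=bool)
        warped[:, valid] = ref[:, src_x[valid]]
        warped_mask[:, valid] = mask[:, src_x[valid]]
        write = warped_mask & ~filled        # keep already-written (nearer) pixels
        out[write] = warped[write]
        filled |= write
    return out, filled  # ~filled marks disoccluded holes for completion

# Toy scene: flat background (disparity 0) with a nearer square (disparity 2).
ref = np.ones((8, 8))
ref[2:5, 2:5] = 9.0
disp = np.zeros((8, 8))
disp[2:5, 2:5] = 2.0
view, filled = extrapolate_view(ref, disp, shift=1, num_layers=2)
```

In the toy example the foreground square shifts by two pixels while the background stays put, exposing a disoccluded strip (`~filled`) where the square used to be; this is exactly the kind of region the occlusion-completion network would fill.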