Android live wallpaper in OpenGL ES 2.0 with a simplified DOF effect
The idea to create the app was in turn inspired by the design of Deus Ex: Human Revolution, namely its visual elements with shards of glass. Over a couple of days the final vision of the scene took shape, and in the end the first version of the finished application was created over a weekend.
Constructing the scene
The scene is extremely simple. It consists of smoothly rotating shards of glass hanging in the air. The shards are divided into two groups, near and far. The distant objects are blurred to give the scene extra depth, and the background is a blur of an arbitrary color. The camera moves left and right following the movement of your finger across the screen, which makes the shards move across the view.
Technically, it is implemented as follows: the shards of glass are placed along a cylinder around the camera. Shards beyond a certain distance are rendered, together with the background, into a separate texture and blurred. The blurred distant objects are drawn first, and the shards closer to the camera are drawn on top of them.
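As an illustration, here is a minimal sketch of how such a placement on a cylinder might be computed. The class and parameter names are hypothetical, not from the actual app: each shard gets a random angle around the camera, a fixed radius, and a random height.

    import java.util.Random;

    // Hypothetical sketch: place shard instances on a cylinder around the camera.
    // The radius and the height range are illustrative assumptions.
    class ShardPlacer {
        static float[][] placeOnCylinder(int count, float radius, float minY, float maxY) {
            Random rnd = new Random();
            float[][] positions = new float[count][3];
            for (int i = 0; i < count; i++) {
                double angle = rnd.nextDouble() * 2.0 * Math.PI;          // random angle around the camera
                positions[i][0] = (float) (radius * Math.cos(angle));     // x on the cylinder
                positions[i][1] = minY + rnd.nextFloat() * (maxY - minY); // random height
                positions[i][2] = (float) (radius * Math.sin(angle));     // z on the cylinder
            }
            return positions;
        }
    }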
In this simple way we implemented a simplified depth-of-field (DOF) effect, which is quite sufficient for this scene. A full DOF implementation is more complex and more expensive: you would render the scene's depth map into a separate texture, then the finished scene, and then the same scene again but blurred. Then you would draw the blurred and the sharp scene to the screen simultaneously, mixing them according to the depth map and the focus parameters of the camera.
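For comparison, here is a minimal sketch of the mixing step of such a full DOF pass. Note that the wallpaper does not implement this; the uniform names and the linear focus falloff are illustrative assumptions only.

    // Hypothetical composite shader for the full DOF approach described above.
    class FullDofShader {
        static final String FRAGMENT_SHADER =
              "precision mediump float;\n"
            + "varying vec2 vTexCoord;\n"
            + "uniform sampler2D uSharp;\n"    // the sharp scene
            + "uniform sampler2D uBlurred;\n"  // the blurred scene
            + "uniform sampler2D uDepth;\n"    // the scene depth map
            + "uniform float uFocusDepth;\n"   // depth that is in perfect focus
            + "uniform float uFocusRange;\n"   // how quickly sharpness falls off
            + "void main() {\n"
            + "  float depth = texture2D(uDepth, vTexCoord).r;\n"
            + "  float blur = clamp(abs(depth - uFocusDepth) / uFocusRange, 0.0, 1.0);\n"
            + "  gl_FragColor = mix(texture2D(uSharp, vTexCoord),\n"
            + "                     texture2D(uBlurred, vTexCoord), blur);\n"
            + "}\n";
    }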
Implementation
Since all the objects in the scene are transparent, all rendering is done with blending enabled, with different blending modes for different objects. Writing to the depth buffer is disabled so that the transparent glass does not cut off the objects behind it; fortunately, the glass does not cause too much pixel overdraw anyway. Highlights and reflections on the shards are created by drawing them with a 128x128 cubemap.
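A minimal sketch of the render state this implies is shown below. The specific blend function is an assumption, since the article only says that the blending mode differs per object:

    import android.opengl.GLES20;

    class GlassRenderState {
        // Sketch of the state described above; the blend function shown here
        // is a common choice for transparency, not necessarily the app's.
        static void apply(int cubemapTextureId) {
            GLES20.glEnable(GLES20.GL_BLEND);
            GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
            GLES20.glDepthMask(false); // no depth writes: glass must not occlude objects behind it
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            // bind the 128x128 cubemap used for highlights and reflections on the shards
            GLES20.glBindTexture(GLES20.GL_TEXTURE_CUBE_MAP, cubemapTextureId);
        }
    }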
Drawing order for the background (a code sketch follows the list):
1. Fill the FBO with the glClearColor color.
2. A mask is drawn on top, over the whole size of the framebuffer. This way we get a decorative colored blur for the background instead of a solid color.
3. The glass for the background is then drawn. At a resolution of 256x256 the image is quite pixelated.
4. The background is blurred. Thanks to the blur, the low resolution of the background is almost unnoticeable.
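A hypothetical sketch of steps 1-3 is given below; fboId and the draw* methods stand in for the app's real resources and draw calls, and the clear color is arbitrary:

    import android.opengl.GLES20;

    class BackgroundPass {
        static void render(int fboId) {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
            GLES20.glViewport(0, 0, 256, 256);           // low-resolution background target
            GLES20.glClearColor(0.1f, 0.3f, 0.5f, 1.0f); // step 1: fill with the base color
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
            drawFullscreenMask();                        // step 2: decorative mask over the whole buffer
            drawDistantGlass();                          // step 3: glass shards of the far plane
            // step 4, the blur, is a separate ping-pong pass, shown further below
        }
        static void drawFullscreenMask() { /* app-specific draw call */ }
        static void drawDistantGlass()   { /* app-specific draw call */ }
    }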
Drawing the main scene and compositing the two planes:
1. Clear the screen.
2. Draw the background.
3. Draw the foreground glass.
All of this is drawn without writing to the depth buffer, since all the objects are transparent; a code sketch of these steps follows.
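The sketch below mirrors the three steps above; the draw* methods are placeholders for the app's real draw calls, and drawBackgroundQuad would draw the blurred 256x256 texture as a fullscreen quad:

    import android.opengl.GLES20;

    class FinalComposition {
        static void render(int screenWidth, int screenHeight) {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); // back to the screen
            GLES20.glViewport(0, 0, screenWidth, screenHeight);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT); // 1. clear the screen
            GLES20.glDepthMask(false);                  // transparent objects: no depth writes
            drawBackgroundQuad();                       // 2. the blurred background texture
            drawFrontGlass();                           // 3. the near shards on top
        }
        static void drawBackgroundQuad() { /* app-specific draw call */ }
        static void drawFrontGlass()     { /* app-specific draw call */ }
    }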
Blurring the background
The blur is implemented by repeatedly rendering the image between two framebuffers with a special shader. The shader can do either a horizontal or a vertical blur, selected by a parameter. This blur method is called ping-pong rendering. Its essence is that the texture from framebuffer A is first rendered into framebuffer B with a horizontal blur, and then back from B into A with a vertical blur. This procedure can be repeated for as many iterations as needed to achieve the required blur quality. The implementation of this post-processing effect was taken long ago from some bloom example; unfortunately, I can no longer find the link.
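Below is a sketch of the ping-pong loop just described. All names are illustrative rather than the app's actual code: fboA/fboB are the two 256x256 framebuffers, texA/texB their color textures, and setDirection() would set the shader uniform that switches between horizontal and vertical blur:

    import android.opengl.GLES20;

    class PingPongBlur {
        static void blur(int iterations, int fboA, int texA, int fboB, int texB) {
            for (int i = 0; i < iterations; i++) {
                // horizontal pass: read texture A, write into framebuffer B
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboB);
                GLES20.glViewport(0, 0, 256, 256);
                setDirection(true);
                drawFullscreenQuad(texA);
                // vertical pass: read texture B, write back into framebuffer A
                GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboA);
                GLES20.glViewport(0, 0, 256, 256);
                setDirection(false);
                drawFullscreenQuad(texB);
            }
            // after the loop, texture A holds the blurred result
        }
        static void setDirection(boolean horizontal) { /* set the blur-direction uniform */ }
        static void drawFullscreenQuad(int textureId) { /* app-specific draw call */ }
    }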
It is noteworthy that modern phones and tablets (and even quite old devices) manage to perform not just one but several blur iterations quickly. In practice, the Nexus 10 delivers a stable 40-50 fps even with 6-8 blur passes over 256x256 textures, where one pass is a full horizontal + vertical blur.
After weighing the trade-offs between texture resolution, number of passes, and blur quality, we settled on three iterations at a resolution of 256x256.
Mali
In a previous article I took a dig at nVidia. It is not that I dislike nVidia in particular: I equally dislike any hardware manufacturer that ships buggy drivers for its GPUs. For example, while developing the live wallpaper described here, we ran into a problem on the Nexus 10. The problem is incorrect rendering to a texture, and it manifests itself only when the device orientation changes. How the orientation of a tablet can affect rendering to a texture remains a mystery to us, but it is a fact.
First, to make sure I had not simply missed some nuance of context initialization, I posted a question on Stack Overflow: stackoverflow.com/questions/17403197/nexus-10-render-to-external-rendertarget-works-only-in-landscape Here I must praise ARM's staff for their technical support. After a couple of days I received a letter from an ARM engineer suggesting that I file a report about this bug on the Mali Developer Center forum. I prepared a simple test application and described the steps to reproduce the bug: forums.arm.com/index.php?/topic/16894-nexus-10-render-to-external-rendertarget-works-only-in-landscape/page__gopid__41612. In only 4(!) days I received an answer confirming that there is indeed a bug in the current version of the video drivers for the Nexus 10. Most interestingly, ARM suggested a workaround for my problem, and it miraculously helped: you just have to call glViewport() after glBindFramebuffer(). Support like this deserves a monument: an ARM employee took the trouble to find my e-mail (which is not even listed on Stack Overflow), and ARM's support engineers found and solved the problem faster than I expected.
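In code, the workaround amounts to something like the following; fboId and the target dimensions are whatever your render target uses:

    import android.opengl.GLES20;

    class FboBinder {
        // The workaround suggested by ARM: call glViewport() after every
        // glBindFramebuffer() instead of setting the viewport only once.
        static void bind(int fboId, int width, int height) {
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
            GLES20.glViewport(0, 0, width, height);
        }
    }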
Anyone who cares about Android on the Nexus 10, please vote for the corresponding bug in the Google issue tracker: code.google.com/p/android/issues/detail?id=57391
Result
You can download the app from Google Play here: play.google.com/store/apps/details?id=org.androidworks.livewallpaperglass
The described simplified DOF technique can be applied not only to scenes with objects like those in our app, but in any other case where the main scene can be separated from the background.