Guide to Porting an OpenGL* ES 2.0 Application from iOS* to Windows* 8

Downloads

Download Guide to Porting an OpenGL* ES 2.0 Application from iOS* to Windows* 8 [PDF 697KB]
Download ios-to-windows-8-sample-app-release.zip [ZIP 131KB]

Introduction

iOS continues to be a popular platform for application developers, and many iOS applications use OpenGL ES to handle their 3D graphics chores. OpenGL ES, or OpenGL for Embedded Systems, is a subset of the OpenGL 3D graphics API designed for embedded devices such as mobile phones. OpenGL is also available on Windows 8. But just how easy is it to move an OpenGL ES application, written in Objective-C* for the iOS platform, to the ever-popular Windows 8, where the dominant language for native applications is C#?

This document walks through a simple OpenGL ES 2.0 application and discusses the ins and outs of porting an app running on iOS to the Windows 8 desktop.

The Demonstration Application

To show the basic structure and components a typical OpenGL ES application uses, we provide a simple application that draws a three-dimensional cube with a texture image mapped onto each surface. The demo also incorporates a simple single-source lighting model for the cube.


Figure 1. iOS* version of simple OpenGL* ES application

Users can manipulate the cube using common gestures:

  • Pinch two fingers together to zoom out of the cube
  • Spread (stretch) two fingers to zoom in on the cube
  • Use a single finger (or mouse) to manipulate the cube using a virtual track ball.[1]

We will use this application to highlight the differences between working with OpenGL ES 2.0 on iOS vs. Windows 8.

Demonstrated OpenGL ES concepts

The application demonstrates the following OpenGL features:

  • WPF and OpenGL interoperability (Windows 8 version)
    • Creating an OpenGL context inside a WPF application
    • Rendering to the WPF-provided window surface
  • How to manage the OpenGL viewport and projection matrix inside WPF
  • Geometry definition and vertex specification
    • How to prepare vertex data for rendering, including vertices, surface normals, and texture coordinates
    • Setting up vertex parameters for rendering
  • Working with the programmable pipeline
    • Compiling and linking shader source into a shader program
    • Setting up shader input attributes and uniforms for rendering
    • Working with textures
  • Basic ambient and diffuse components from the ADS (ambient, diffuse, specular) lighting model
  • Supporting touch manipulation
    • Pinch object scaling
    • Manipulating the object’s rotation using an arc ball implementation

Development Environments

The iOS application described in this document was developed using the standard iOS development environment from Apple, Xcode*. The application was written entirely in Objective-C and uses the native iOS OpenGL implementation and supporting frameworks included with the iOS SDK.

Our Windows 8 development was done using Visual Studio* Express 2012 for Windows Desktop. The application was written in C# using Windows Presentation Foundation (WPF) and the OpenTK library (http://www.opentk.com/). OpenTK wraps the OpenGL, OpenCL[2]*, and OpenAL APIs for C#, providing a convenient way to use them from .NET applications, including those built with WPF.

OpenGL ES

Our OpenGL ES application consists of the following basic steps:

  1. Context and window initialization
  2. Viewport setup
  3. Setting up vertex and fragment shaders
  4. Creating geometry buffers
  5. Draw call

Initializing the OpenGL Window and Context

iOS

The portion responsible for window handling in iOS is provided by the GLKit framework. The view class responsible for window presentation is GLKView, and its backing logic is implemented by subclassing GLKViewController.

Following the typical iOS Model-View-Controller pattern, an instance of GLKView is defined on a single window inside a storyboard and backed by the ViewController class, which extends GLKViewController. Our controller, therefore, must implement the following:

ViewController.m:

#import "ViewController.h"
#import "Cube.h"


@interface ViewController () { 
...
}
@property (strong, nonatomic) EAGLContext *context;


- (void)setupGL;
...
@end


@implementation ViewController


- (void)viewDidLoad
{
   [super viewDidLoad];
   
   self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
   scalePreviousIteration = 0.0f;
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
   UITouch *touch = [touches anyObject];
   // Map the touch location onto the virtual track ball
   // (the sphere-mapping helper that produces touchPointMapped is elided in this excerpt)
   GLKQuaternion rotationQuaternion = [self calculateRotationQuaternionWithOrigin:currentPoint andDestination:touchPointMapped];
   
   arcCurrentRotation = GLKQuaternionMultiply(rotationQuaternion, arcCurrentRotation);
   currentPoint = touchPointMapped;
}

The touchesBegan function is called first and initializes all touch-related data. The second function, touchesMoved, is called on every finger movement, and we use it to perform the rotation calculations. It updates a quaternion that describes the current object rotation; that quaternion is later used in the update function to construct the rotation matrix.
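The arc-ball math behind these handlers can be sketched in plain Python (a platform-neutral illustration of the technique, not the app's actual code; the function names are ours): map each touch point onto a virtual unit hemisphere centered on the screen, then build the quaternion that rotates the previous point onto the current one and accumulate it.

```python
import math

def map_to_sphere(x, y, width, height):
    """Map a 2-D screen point onto a virtual unit hemisphere
    centered in the middle of the screen."""
    # Normalize to [-1, 1] with y pointing up
    px = (2.0 * x - width) / width
    py = (height - 2.0 * y) / height
    d2 = px * px + py * py
    if d2 <= 1.0:
        return (px, py, math.sqrt(1.0 - d2))   # point lies on the hemisphere
    n = math.sqrt(d2)
    return (px / n, py / n, 0.0)               # clamp to the hemisphere's rim

def rotation_quaternion(p0, p1):
    """Quaternion (w, x, y, z) built from hemisphere points p0 and p1.
    Per the classic arc ball, it rotates by twice the angle between them."""
    cross = (p0[1] * p1[2] - p0[2] * p1[1],
             p0[2] * p1[0] - p0[0] * p1[2],
             p0[0] * p1[1] - p0[1] * p1[0])
    dot = p0[0] * p1[0] + p0[1] * p1[1] + p0[2] * p1[2]
    return (dot, cross[0], cross[1], cross[2])

def quat_multiply(a, b):
    """Hamilton product: apply rotation b, then rotation a."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)
```

On each move event the current rotation is accumulated as `current = quat_multiply(rotation_quaternion(prev_point, new_point), current)`, mirroring the GLKQuaternionMultiply call above.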

Windows 8

For touch input we simply use the built-in WPF touch events, TouchDown, TouchMove, and TouchUp.

//Touch down event handler
       void MainWindow_TouchDown(object sender, TouchEventArgs e)
       {
           //Get the touch point relative to the main window
           TouchPoint p = e.GetTouchPoint(this);
            //Set the arc ball start location
           m_arcBall.startPoint(p.Position.X, p.Position.Y);
           //increment the touch point
           ++pointCount;
           //Check the used touch point count
           if (pointCount == 1)
           {
               //Set the 1st touch device id
               p1InputId = p.TouchDevice.Id;
               //Clear the pinch manipulation distance
               pointDistance = -1.0;
               p1 = new Point(-1.0, -1.0);
               p1Valid = false;
           }
           else if (pointCount == 2)
           {
               //Set the 2nd touch device id
               p2InputId = p.TouchDevice.Id;
               //Clear the pinch manipulation distance
               pointDistance = -1.0;
               p2 = new Point(-1.0, -1.0);
               p2Valid = false;
           }
       }

        //Touch move event handler
       void MainWindow_TouchMove(object sender, TouchEventArgs e)
       {
           //Get the touch point relative to the main window
           TouchPoint p = e.GetTouchPoint(this);

           //Check the used touch point count
           if (pointCount == 1)
           {
               //Move the arc ball to the new position
               m_arcBall.movePoint(p.Position.X, p.Position.Y);
               //Check if we have a scene object to work with
               //Update the object's rotation using the arc ball provided rotation matrix
               if (m_object != null) m_object.SetRotation(m_arcBall.RotationMatrix);
           }
           else
           {
               //Check the device id and update the correct pinch touch point
               if (p1InputId == p.TouchDevice.Id)
               {
                   p1 = p.Position;
                   p1Valid = true;
               }
               else if (p2InputId == p.TouchDevice.Id)
               {
                   p2 = p.Position;
                   p2Valid = true;
               }
               if (p1Valid && p2Valid)
               {
                   //update the pinch distance
                   double newDistance = PointDistance(p1, p2);
                   //check if we have a pinch distance set
                   if (pointDistance != -1.0)
                   {
                       //Calculate the manipulation scale change
                       double scale = newDistance / pointDistance;
                       //Check if we have a scene object to work with
                       //set the object's scale
                       if (m_object != null) m_object.Scale((float)scale);
                   }
                   //Store the updated pinch distance
                   pointDistance = newDistance;
               }
           }
       }

        //Touch up event handler
       void MainWindow_TouchUp(object sender, TouchEventArgs e)
       {
           //Get the touch point relative to the main window
           TouchPoint p = e.GetTouchPoint(this);
           //decrement the used touch point counter
           --pointCount;
           //Check the touch device id
           if (p1InputId == p.TouchDevice.Id)
           {
               //Clear the 1st touch point id
               p1InputId = -1;
               //Clear the last pinch distance
               pointDistance = -1.0;
               p1 = new Point(-1.0, -1.0);
               p1Valid = false;
           }
           else if (p2InputId == p.TouchDevice.Id)
           {
               //Clear the 2nd touch point id
               p2InputId = -1;
               //Clear the last pinch distance
               pointDistance = -1.0;
               p2 = new Point(-1.0, -1.0);
               p2Valid = false;
           }
       }
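The pinch logic in the handlers above reduces to a small amount of geometry; a minimal Python sketch of the same idea (illustrative only, with our own function names):

```python
import math

def point_distance(p1, p2):
    """Euclidean distance between two touch points given as (x, y)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pinch_scale(prev_distance, p1, p2):
    """Return (scale_factor, new_distance) for a two-finger move.
    A factor > 1 means the fingers spread apart (zoom in); a sentinel
    prev_distance <= 0 marks the first sample, which yields no change."""
    new_distance = point_distance(p1, p2)
    if prev_distance <= 0.0:
        return 1.0, new_distance
    return new_distance / prev_distance, new_distance
```

As in the C# handler, the distance is reset to the sentinel whenever a finger goes down or up, so a new pinch never inherits a stale baseline.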

Vertex, Fragment Shaders, and Scene Lighting

The shader code itself is written entirely in GLSL, which is fully portable between platforms running OpenGL. For completeness and the interested reader, we briefly describe our pipeline implementation; for detailed information, refer to reference volumes such as “OpenGL Programming Guide, 4th edition” (aka the Red Book).

The OpenGL ES 2.0 programmable pipeline contains two types of shader programs: vertex and fragment shaders.

  • Vertex shaders operate on a single vertex and handle tasks like transforming the vertex position in space
  • Fragment shaders are invoked after the rasterizer stage, which follows the vertex shader stage. A fragment shader produces the output color of a single pixel, usually accounting for texturing, lighting, and so forth.

Vertex Shader:

attribute vec4 position;
attribute vec3 normal;
attribute vec2 texCoord;

uniform mat4 modelMatrix;
uniform mat4 worldMatrix;
uniform mat4 viewMatrix;
uniform mat4 projMatrix;

varying vec3 vs_normal;
varying vec2 vs_texCoord;

void main()
{
   vs_normal = normalize((worldMatrix * modelMatrix * vec4(normal, 0.0)).xyz);
   vs_texCoord = texCoord;
   
   gl_Position = projMatrix * viewMatrix * worldMatrix * modelMatrix * position;
}

The vertex shader takes as input the vertex position, normal vector, and texture coordinates. It transforms the normal into world space and stores the value as output to the fragment shader. Texture coordinates are simply handed through. Finally, the model, world, and view matrices are combined with the current projection matrix to produce the transformed vertex position.
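A subtle point when transforming normals: a direction vector should be extended with a w component of 0, not 1, so that the translation part of the model and world matrices does not affect it (a point, by contrast, uses w = 1 and is meant to be translated). A small Python sketch of homogeneous 4x4 math (our own illustration) makes the difference visible:

```python
def mat_vec4(m, v):
    """Multiply a 4x4 row-major matrix by a 4-component vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

# A pure translation by (5, 0, 0): positions move, directions should not.
translate = [[1, 0, 0, 5],
             [0, 1, 0, 0],
             [0, 0, 1, 0],
             [0, 0, 0, 1]]

position = (0.0, 1.0, 0.0, 1.0)  # w = 1: a point, affected by translation
normal   = (0.0, 1.0, 0.0, 0.0)  # w = 0: a direction, unaffected

print(mat_vec4(translate, position))  # (5.0, 1.0, 0.0, 1.0)
print(mat_vec4(translate, normal))    # (0.0, 1.0, 0.0, 0.0)
```

(For non-uniform scaling, normals strictly need the inverse-transpose “normal matrix,” but for the rigid rotations and translations used here, w = 0 suffices.)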

Fragment shader:

varying highp vec3 vs_normal;
varying highp vec2 vs_texCoord;

//material color properties
uniform highp vec4 ambientColor;
uniform highp vec4 diffuseColor;
uniform sampler2D texture;
uniform highp vec3 lightPosition;
uniform highp vec3 lightColor;

void main()
{
   highp vec4 surfaceColor = texture2D(texture, vs_texCoord.xy);
   highp vec4 ambientColorCoefficient = vec4(lightColor, 1.0) * ambientColor;
   highp vec4 diffuseColorCoefficient = vec4(0,0,0,0);

   // For simplicity, we assume the reversed light vector points from the origin toward the light position
   highp vec3 lightDirection = normalize(lightPosition);
   highp float diffuseFactor = clamp(dot(vs_normal, lightDirection),0.0,1.0);
   if(diffuseFactor > 0.0) {
       diffuseColorCoefficient = vec4(lightColor,1.0) * diffuseColor * diffuseFactor;
   }
   gl_FragColor = surfaceColor * (ambientColorCoefficient + diffuseColorCoefficient);
}

The fragment shader fetches the pixel color from the texture and computes the ambient and diffuse lighting terms. It uses parameters passed from the vertex shader, now interpolated across the triangle’s three vertices.
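The per-fragment arithmetic can be checked with a plain Python sketch of the same ambient-plus-diffuse model (our own illustration; the component-wise vec4 math is written out by hand):

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(surface_color, ambient, diffuse, light_color, normal, light_pos):
    """Ambient + diffuse shading, mirroring the fragment shader above:
    color = surface * (light * ambient + light * diffuse * max(0, n . l))."""
    light_dir = normalize(light_pos)   # reversed light vector, as in the shader
    diffuse_factor = max(0.0, min(1.0, dot(normal, light_dir)))
    out = []
    for i in range(4):
        lc = light_color[i] if i < 3 else 1.0   # vec4(lightColor, 1.0)
        term = lc * ambient[i] + lc * diffuse[i] * diffuse_factor
        out.append(surface_color[i] * term)
    return tuple(out)
```

A surface facing the light receives the full diffuse term; a surface facing away falls back to the ambient term alone, which is exactly the `diffuseFactor > 0.0` branch in the GLSL.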

Closing

OpenGL ES 2.0 is a powerful tool for rendering 2- and 3-dimensional objects on a variety of computing platforms, and it is widely used on the iOS platform. Fortunately, this common graphics platform forms a bridge for developers looking to move OpenGL-based applications from iOS to Windows 8.

While the application used in this white paper is fairly simple, it illustrates the key concepts in working with OpenGL ES 2.0 and how those concepts are implemented on iOS and Windows 8. As demonstrated, the concepts are consistent between iOS and Windows 8, and moving from one to the other can be done in a straightforward manner.

[1] A virtual hemisphere, centered in the middle of the screen. See http://www.opengl.org/wiki/Trackball

[2] OpenCL and the OpenCL logo are trademarks of Apple Inc. used by permission by Khronos.

Intel, the Intel logo, Atom, and Core are trademarks of Intel Corporation in the U.S. and/or other countries.
Copyright © 2013 Intel Corporation. All rights reserved.
*Other names and brands may be claimed as the property of others.


