The Android development ecosystem has changed tremendously since I posted my first Android article almost 5 years ago. A number of tools and frameworks have been released that allow Android programmers to develop, test, and release applications rapidly. The next couple of posts will be overviews of some of the most cutting-edge tools and approaches at your disposal, with simple but real code examples from consumer-facing applications I have developed.

Android provides a number of approaches for communicating between the major Android abstractions (Service, Activity, Application, and Fragment). These include Intents, Binder interfaces, Fragment arguments, and Broadcast Receivers. Each of these approaches has its own set of gotchas and maintenance problems. Developers end up juggling references to Fragments, Managers, Service Binders, and Broadcast registrations, as well as maintaining a pile of IntentFilter XML elements in AndroidManifest.xml.

Many recent frameworks attempt to alleviate these communication problems with an approach already in use by many modern server-side architectures. Event or message buses decouple services through a common communication “pipeline”: a queue of messages that can be sent, received, and processed by any service in the “network” without the services having to know about each other’s interfaces or even whether they exist. This makes systems more maintainable and scalable.

EventBus

EventBus provides, well, an event bus for Android components to communicate in a decoupled way. Components simply register with the event bus, provide an onEvent* method for a certain type of event, and will receive any events of that type posted by any other component (with some exceptions).

Simple register and unregister in an Activity. No interface implementation needed!


@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // register
    EventBus.getDefault().register(this);
    ...
}

@Override
protected void onDestroy() {
    super.onDestroy();
    // unregister
    EventBus.getDefault().unregister(this);
}

Posting an event. UpdateUIEvent is just a plain object (no subclassing or interfaces required) and can have any number of member variables.


bus.post(new UpdateUIEvent());

Subscribing to UpdateUIEvents, to be received on the main UI thread.


public void onEventMainThread(UpdateUIEvent ev) {
    ... // perform operations on ev and update the UI
}

The onEvent* method naming convention lets developers specify which thread they would like to receive events on. These names are not enforced by interfaces, so there is no extra “implements” boilerplate needed. The variants are: onEvent (the thread the event was posted on, the default), onEventMainThread (the main UI thread), onEventBackgroundThread (EventBus’s background thread), and onEventAsync (a newly spawned thread; be careful with these).
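To illustrate, a single subscriber can mix these variants freely. Here is a sketch; the event classes and their fields (LogEvent, ProgressEvent, CacheWriteEvent, DownloadEvent) are hypothetical, not part of EventBus:

public class FeedFragment extends Fragment {

    // posting thread (the default): delivered synchronously on whatever
    // thread called post(), so keep it quick
    public void onEvent(LogEvent ev) {
        Log.d("bus", ev.message);
    }

    // main UI thread: safe to touch views from here
    public void onEventMainThread(ProgressEvent ev) {
        Log.d("bus", "progress " + ev.percent);
    }

    // EventBus's single background thread: fine for quick disk work
    public void onEventBackgroundThread(CacheWriteEvent ev) {
        Log.d("bus", "cache written");
    }

    // a newly spawned thread per event: use sparingly
    public void onEventAsync(DownloadEvent ev) {
        Log.d("bus", "downloaded " + ev.url);
    }
}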

Examples
The following are some real implementation examples, each describing a pain point and its EventBus solution.

Example 1: Activities and Fragments

The filter editing screen in my app 100filters is a combination of an Activity with a content view, an embedded fragment, and a subfragment managed by the embedded fragment. Before EventBus I was managing references to each through standard Java member variables as well as the ever-fickle Fragment.getActivity() and FragmentManager.findFragmentBy* methods. Code like this is difficult to maintain, and things like device orientation changes can wreak havoc on these kinds of implementations.



The main problem I had: when a user wants to apply a filter, they press the “check” button, which lives inside a subfragment (FilterSelector) that is dynamically added and removed by a root fragment (GroupFilterSelector). The “check” button uses the following code to call the save() method, which lives inside the root FilterActivity.


public void onClick(View v) {
    if (((FilterActivity) FilterSelector.this.getActivity()).save()) {
        mAdd.setImageResource(R.drawable.ic_action_add_check_disabled);
        mSub.setImageResource(R.drawable.ic_action_undo_enabled);
    }
}

The save() method then kicks off an AsyncTask that applies the filter to the image and saves it. When the AsyncTask succeeded I wanted it to run the mAdd.setImageResource/mSub.setImageResource code above inside the FilterSelector fragment. The problem is that FilterSelector is a subfragment of another Fragment, and keeping track of which FilterSelector was currently active inside FilterActivity (through the FragmentManager or member references) introduced bugs that were so difficult to debug that I ended up shipping the code above, which simply assumes the save is successful.

Another problem is that FilterActivity can also call save() on its own through a menu option, and there was no way to run the code above inside the subfragment without the buggy referencing code, which introduced an inconsistency into the interface that confused users.

The EventBus solution is simple. After registering for events inside of FilterActivity and FilterSelector I simply use the following code in FilterSelector:


public void onClick(View v) {
    bus.post(new SaveEvent());
}

public void onEventMainThread(UpdateUIEvent ev) {
    if (ev.isSuccessful()) {
        mAdd.setImageResource(R.drawable.ic_action_add_check_disabled);
        mSub.setImageResource(R.drawable.ic_action_undo_enabled);
    }
}

And the following inside of FilterActivity:


// receive the SaveEvent from FilterSelector
public void onEventMainThread(SaveEvent ev) {
    performSaveTask();
}

// inside of SaveTask, post UpdateUIEvent to the current FilterSelector
@Override
protected void onPostExecute(Boolean result) {
    EventBus.getDefault().post(new UpdateUIEvent(result));
}
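For reference, the two event classes are trivial. The post doesn’t show them, but based on how they are used above they would look something like this:

// fired by FilterSelector to ask FilterActivity to save
public class SaveEvent {
}

// fired by FilterActivity's SaveTask to report the result
public class UpdateUIEvent {
    private final boolean successful;

    public UpdateUIEvent(boolean successful) {
        this.successful = successful;
    }

    public boolean isSuccessful() {
        return successful;
    }
}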

This solution provides a simple, decoupled approach to performing tasks and updating the user interface (whether it lives in an Activity or a Fragment) with a minimal amount of code.

Example 2: IntentService to Activity

This example involves updating the UI of the currently active Activity from an IntentService performing an operation in the background. This code is from an unreleased video app that would trigger an IntentService to upload a video to a server and then update the UI with a share link to the video on the web. Before EventBus, the developer would have to implement a BroadcastReceiver and IntentFilter inside the Activity and then create and send a broadcast Intent from inside the Service.

The code in the Activity would look something like this:


@Override
protected void onResume() {
    super.onResume();
    mBroadcastReceiver = new BroadcastReceiver() {
        @Override
        public void onReceive(Context context, Intent intent) {
            String url = intent.getStringExtra("url");
            mShareUrl.setText(url);
        }
    };
    IntentFilter filter = new IntentFilter("com.awc.UPDATEUI");
    this.registerReceiver(mBroadcastReceiver, filter);
}

@Override
protected void onPause() {
    super.onPause();
    this.unregisterReceiver(mBroadcastReceiver);
}

And sending the notification from the Service would look something like this:


    Intent updateUi = new Intent("com.awc.UPDATEUI");
    updateUi.putExtra("url", url);
    sendBroadcast(updateUi);

With EventBus the Activity code is simplified to:


public void onEventMainThread(UpdateUIEvent e) {
    mShareUrl.setText(e.url);
}

and the Service is simplified to:


    bus.post(new UpdateUIEvent(url));

This example is really basic, but getting rid of intent filter string constants, broadcast receiver registration, and Intents stuffed with extras contributes immensely to maintainability (posting through EventBus is also faster than going through the Intent system).

Example 3: Sticky Events and Activity Results

By now you are familiar with how components in Android can start Activities and receive data results from those Activities. In my application HoneyGram the user can use their Instagram credentials to receive an OAuth access token that can then be used to interact with the Instagram API. In HoneyGram this flow involves starting an Activity that opens the Instagram OAuth authorization endpoint in a fullscreen WebView. The WebView then parses the access token and passes it back to the Main Activity, which then opens a number of Fragment tabs with content using the token (if the login was successful, of course).

The boilerplate code for starting and receiving data from the Login Activity looks a little something like this:


// start login flow
Intent intent = new Intent(Main.this, LoginActivity.class);
Main.this.startActivityForResult(intent, 111);

// on result
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == 111 && resultCode == Activity.RESULT_OK) {
        String token = data.getStringExtra("token");
        ... // save token and add new tabs
    }
}

The code which returns the data in LoginActivity to the Main Activity looks like:


Intent data = new Intent();
data.putExtra("token", token);
LoginActivity.this.setResult(Activity.RESULT_OK, data);
LoginActivity.this.finish();

EventBus allows you to create sticky events, which persist in the message queue until a component removes them. We can take advantage of these events to simplify the process of receiving data from an Activity without having to deal with request codes, result codes, and Intent data.

The code from LoginActivity that uses sticky events simply becomes:


    bus.postSticky(new LoggedInEvent(token));
    finish();

And the code to pull the event from the sticky queue becomes (note that this code could also live in onResume):


@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    LoggedInEvent stickyEvent = (LoggedInEvent) bus.getStickyEvent(LoggedInEvent.class);
    if (stickyEvent != null) {
        bus.removeStickyEvent(stickyEvent);
        String token = stickyEvent.token;
        ... // do stuff
    }
}

These are just a few simple real-world examples of using EventBus in your Android application to streamline code and make your app more performant. If you have any examples of heavy-duty uses of EventBus, feel free to post them in the comments.

Check out the GitHub page for EventBus for source/jars as well as more documentation and some performance measurements against other Android event bus frameworks.

Posted on February 26th, 2013 | Filed under android, java, mobile

A really quick, fun app for adding a bunch of weird filters to pictures. My reaction to the numerous photo editing and filtering apps.



Update:
What started as a joke ended up with ~65,000 downloads in about a month, so I cleaned up the interface, added on-screen color, size, and text controls, and added in-app purchases.

Posted on January 15th, 2013 | Filed under Uncategorized


A quick, fun app I made because I wanted to refamiliarize myself with the Android camera APIs, do some real-time image processing, and because iOS users shouldn’t have all the fun. The application lets you create images that are mirrored on a chosen axis. Users can then share the images through various apps using the Android share intent system, which is backported using ActionBarSherlock. You can also select existing images from your gallery to apply the mirror filters to.

The main challenge was finding a combination of camera preview size and image processing techniques that would provide a suitable framerate for the camera preview while keeping memory usage down. Oddly enough, doing things in portrait mode with the Android camera is still fairly unruly, so I ended up flipping and processing the raw camera preview images myself. I haven’t been able to get this working on any Android version under 3.0, but it is something I am still investigating since Android 3.0+ usage is still very low.


Posted on October 30th, 2012 | Filed under android, java, mobile, photography

Lendry is my entry into the MintChip Challenge. The goal of the challenge is to create an application that uses MintChip, a kind of e-wallet/virtual-currency hybrid project created by the Royal Canadian Mint. MintChip is a chip with a secure element that stores currency and can create and claim value messages, which are signed hash messages that include a value and the ID of the chip that will receive the value. For more information on MintChip technology see the developer documentation.

Goals:
I had a couple of different goals I wanted to accomplish for this challenge. The main one was to create an end-to-end mobile payment application that allowed for painless peer-to-peer transfer of currency. Another goal was to deeply integrate with existing social networks, as well as take advantage of the latest technology available on Android devices and the Android SDK. A secondary goal was to create an interesting and attractive user experience to surround the “simple” task of transferring money between users.

There wasn’t much time to develop the application, so in order to meet my goals I decided to use the Parse SDK. The Parse SDK provides a hosted back-end service and native client code that lets developers quickly build applications without any back-end development.
Some points on Parse:

  • Nicely abstracted SDK with detailed tutorials
  • Takes care of user login, creation, and password management
  • Simple Facebook and Twitter integration
  • Flexible key/value data store; simple image and file store
  • Integrated push notifications (I have wasted a lot of time implementing push servers in the past)
  • Cross-platform, in case I want to create an iOS or web-based port

The main challenge was integrating the MintChip technology in an interesting and non-obvious way. The MintChip SDK came with a demo application that allowed two MintChips to transfer funds between each other; the user had to type in the MintChip ID of the other chip as well as manually transfer the Value Message hash to it. I decided right away that users should not be typing in a MintChip ID at any point in the process, so MintChips are saved with their user data. Also, MintChip swapping should be as simple as changing the SD card (physical MintChip) or adding a new certificate to the SD card (hosted MintChip). Value Messages are also synced through the cloud so that users can be near each other or far apart. I wanted rich metadata for each transaction: a user avatar, a memo text field, and a high-resolution image of whatever the “item” was that was purchased. I used the Facebook and Twitter integration to let users easily find their friends as well as share (or not share) their transactions on those networks.

I also wanted to take advantage of the latest technology available on Android devices. That obviously meant NFC, which goes hand in hand with mobile payments. I took advantage of Android Beam to let users initiate transfers by tapping their phones together. I also added functionality for users to program NFC tags that other users can scan in order to initiate a transfer. This would be useful at something like a garage sale or a t-shirt stand, where users can simply scan tags next to products to buy them.
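I won’t paste the whole Lendry implementation, but the Android Beam piece boils down to handing the system an NdefMessage to push when two devices touch. A minimal sketch, assuming an Activity on API 14+; the MIME type, payload format, and mUserId field are all hypothetical:

NfcAdapter nfc = NfcAdapter.getDefaultAdapter(this);
if (nfc != null) {
    nfc.setNdefPushMessageCallback(new NfcAdapter.CreateNdefMessageCallback() {
        @Override
        public NdefMessage createNdefMessage(NfcEvent event) {
            // hypothetical payload identifying which user should receive the payment
            byte[] payload = ("lendry://pay/" + mUserId).getBytes();
            NdefRecord record = new NdefRecord(NdefRecord.TNF_MIME_MEDIA,
                    "application/vnd.awalkingcity.lendry".getBytes(), new byte[0], payload);
            return new NdefMessage(new NdefRecord[] { record });
        }
    }, this);
}

Writing an equivalent record to a physical tag (for the garage sale case) would go through Ndef.writeNdefMessage() on the scanned tag instead.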

Jelly Bean came out right in the middle of the contest, so I implemented rich image notifications even though I am pretty sure none of the judges were using JB-enabled devices.

In order to test my app on older devices (the only device I had with an SD card slot that worked with the physical MintChip was my Nexus One), I had to backport a lot of my code using the Android Support Library as well as ActionBarSherlock.

HD Demo Video: Lendry On Vimeo

Here is the description from the contest page which you can see here: Official Lendry Entry

Seamless Personal Payments

Lendry is the easiest way to use MintChip on the Android platform.

  • Send MintChip payments to friends for anything: bar tabs, moving help, bicycle repair, or even good advice
  • Easy integration with both remote and SD MintChip accounts. Insert a MintChip-enabled SD card or place a .p12 file on the device filesystem; Lendry handles the rest
  • Deep Facebook and Twitter integration: log in using Facebook or Twitter accounts, find friends from both services using Lendry, and post payments to your timelines. Share as little or as much as you want
  • NFC support: use Android Beam to initiate payments or configure NFC tags to provide an easy way to make a preconfigured payment
  • High-resolution images help you remember your payment history
  • Simple interface makes keeping track of and sending funds easy
  • Hybrid P2P/cloud solution allows you to send funds to someone right next to you or across the planet

Check out the video to see it in action!

What’s next?
The future of MintChip is uncertain, but I am looking into possibly using an alternate payment system if there is any interest in the app.

Posted on September 12th, 2012 | Filed under android, identity, java, mobile, NFC, payments

Direct Link To HoneyGram In The Android Market

I have been doing incremental updates to HoneyGram over the last couple of months and recently got my hands on a Galaxy Nexus as well as the GoogleTV 3.1 update, so I spent a few hours tweaking the app to support those platforms. Upgrading was fairly painless: most tweaks went into layout and resource XML files, and I simply needed to include layout-xlarge/ and values-xlarge/ directories with tablet-specific resources, then tweak the standard layout/ and values/ directories to support both phone and television devices. Honeycomb apps show up automatically for ICS in the Android Market, and GoogleTV apps only need a few tweaks to be visible on those devices.

A few issues I ran in to:

  1. A WebView inside of a DialogFragment gets fullscreened in ICS, so when redirecting back to the app it launched a new Main Activity even though I had launchMode="singleTop". I fixed this by switching to launchMode="singleTask" (see the manifest sketch after this list).
  2. The text color for tab navigation labels is a little tricky to change. I needed two sets of values/styles.xml, since ICS displays tabs on the line under the ActionBar and the color used is the background of the main activity instead of the action bar. I ended up with white text on a white background on phones and white on brown on tablets, but this was easy to fix by specifying a separate values-xlarge/ resource directory with tweaked colors and text sizes.
  3. Still having trouble getting the StackView widget to display and scroll properly on GoogleTV. StackViews inside of normal Activities don’t seem to work with the GoogleTV keyboard.
  4. In order to have your app show up on the GoogleTV market you need to set the following in your manifest (also shown in the sketch after this list): <uses-feature android:name="android.hardware.touchscreen" android:required="false"/>

    You also can’t use GPS permissions, since GoogleTVs do not have GPS chips (that I know of), so remove android.permission.ACCESS_FINE_LOCATION and any GPS_PROVIDER code you have.

  5. Make sure you use the OnItemSelectedListener/OnItemClickListener callbacks for ListViews/GridViews, or the arrow and enter buttons on GoogleTV will not work properly.
  6. GridViews with ImageViews in them are difficult to size and lay out properly. I am ending up with whitespace in between my columns that makes the interface look funny; still working on fixing this.
  7. Enable the system-level cache for HTTP requests using the information on this page. It seems to slow things down a little on ICS, but it may just be my connection conditions.
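Putting points 1 and 4 together, the relevant manifest pieces look roughly like this (the package and activity names are placeholders, not HoneyGram’s actual manifest):

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.honeygram">

    <!-- GoogleTV devices have no touchscreen, so don't require one (point 4) -->
    <uses-feature android:name="android.hardware.touchscreen"
        android:required="false" />

    <application>
        <!-- singleTask keeps the fullscreened WebView redirect from
             launching a second copy of Main (point 1) -->
        <activity
            android:name=".Main"
            android:launchMode="singleTask" />
    </application>
</manifest>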


Posted on December 12th, 2011 | Filed under Uncategorized

Direct Link To HoneyGram In The Android Market

I’ve been meaning to learn all of the new Android 3.0 (Honeycomb) APIs, but due to the emulator being excruciatingly slow (understandable, as it is emulating an actual system image, unlike the iPad simulator, which is basically just running an OS X app in an iPad-shaped window) and not wanting to shell out money for a data plan, I had to wait to get my hands on a Xoom Wifi.

For my first Honeycomb application I wanted to create an app that took advantage of the form factor, processing power, and faster network connection of the device. I also wanted something that could exercise all of the new APIs and UI widgets that are included in Honeycomb. I had been dabbling with the Instagram API and figured an image browser would take advantage of all of these features.


Popular images

The application allows everyone to view the popular feed on Instagram, as well as query images near your location and search through tagged images. If you have an Instagram account the fun really begins: authenticated users can view their friend feed and their own pictures, favorite images, comment on images, and follow/unfollow other users.

The Instagram API uses OAuth 2.0, which was a snap to implement compared to OAuth 1.x! It also has a rate limit on how many queries can be made per key (users that are not logged in all share the same key, which can cause problems if the app gets popular). I created a simple App Engine app that uses the memcache API to cache query results for a period of time.
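The caching layer only takes a few lines with the App Engine memcache API. Here is a minimal sketch of the idea; the servlet, key scheme, fetchFromInstagram() helper, and five-minute lifetime are all assumptions, not the actual HoneyGram proxy:

import java.io.IOException;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.google.appengine.api.memcache.Expiration;
import com.google.appengine.api.memcache.MemcacheService;
import com.google.appengine.api.memcache.MemcacheServiceFactory;

public class InstagramProxyServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        String feed = req.getParameter("feed"); // e.g. "popular"
        String key = "instagram:" + feed;

        MemcacheService cache = MemcacheServiceFactory.getMemcacheService();
        String json = (String) cache.get(key);
        if (json == null) {
            // cache miss: hit the Instagram API with the shared key,
            // then keep the result around for five minutes
            json = fetchFromInstagram(feed);
            cache.put(key, json, Expiration.byDeltaSeconds(300));
        }

        resp.setContentType("application/json");
        resp.getWriter().write(json);
    }

    // hypothetical helper that performs the actual Instagram request
    private String fetchFromInstagram(String feed) throws IOException {
        throw new UnsupportedOperationException("not shown");
    }
}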

Nearby Images

As for building the app itself, Honeycomb provides a nice default theme and widget set that makes your app quick to build and good-looking right out of the box. Setting up a tabbed application with menu actions and subsections is a snap using the ActionBar. I decided to use fragments as much as possible, as they seemed like the most difficult new API to grasp. After some trouble with Activity/Fragment lifecycles I got the hang of things and found them to be an incredibly powerful new set of tools. I found Fragments to be a mixture of activities, views, and services: activities because of the various lifecycle methods and considerations; views because much of the fragment lifecycle involves view creation and modification; and services because you have to go through the FragmentManager to manipulate fragments. You should never try to interact with a fragment through something like an instance/class variable.

Image Detail

One of my favorite features is the image AppWidget. It uses the new StackView widget, which provides a really nice-looking, animated “stack” of views that scrolls automatically and also lets the user interact with it by touching and dragging. HoneyGram lets you add an image widget for the popular, friend, self, and tag feeds. I also found that you can use the StackView in normal application layouts, although I plan on developing some other ways to visualize images.
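Binding a StackView inside an AppWidget mostly means pointing it at a RemoteViewsService. A rough sketch of the provider side, where R.layout.widget_stack, R.id.stack_view, and ImageStackService are placeholder names rather than HoneyGram’s actual code:

// inside AppWidgetProvider.onUpdate(); widget_stack is a layout containing a <StackView>
RemoteViews rv = new RemoteViews(context.getPackageName(), R.layout.widget_stack);
// ImageStackService extends RemoteViewsService and supplies one view per image
Intent intent = new Intent(context, ImageStackService.class);
rv.setRemoteAdapter(appWidgetId, R.id.stack_view, intent);
appWidgetManager.updateAppWidget(appWidgetId, rv);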

AppWidget Dashboard Overkill!

Another feature I am proud of is the “interactive” surface view. This view uses OpenGL ES to present any set of images in a randomly generated “pile.” The user can then use multitouch to move, rotate, bring to top, and hide any of the images. Users can then save their work in the Gallery as well as share the image through any app on Android that allows image sharing.

Interactive Surface

Instructions

Uptake?

Overall I had a really positive experience developing this application, but there are some issues with the Android Market that are preventing this app from reaching its target audience (people who own Android tablets).

The biggest issue is that there is no way to filter applications designed specifically for Honeycomb from within the Android Market application. The only tablet-specific applications are in the “Featured Tablet Applications” list, and that only contains 30 or so applications (most of which are just games that have been tweaked to work on the larger screen). When you view other application categories you get a list of all the available Android applications in that category. Anyone who has released an application on the Android Market knows that the “Just In” section of any category cycles so quickly it is practically impossible for anyone to see your application in that list.

Now I can understand that there simply aren’t enough Honeycomb-specific applications to warrant a “Recent Tablet Applications” section, but it is still difficult to find any outside of the featured apps. Tablet applications will also never make it into the Top Paid/Free Applications sections because they will never get as many sales/downloads as their phone counterparts.

Another big issue is that users cannot directly rate or review applications through the Honeycomb version of the Android Market. I admit I spent a good amount of time mindlessly tapping each part of my application description trying to find the right widget to rate and review. It turns out you can leave a review and rating (but not just a rating) through the web interface for the Android Market.


I am still really excited about designing and developing applications for Android tablets, but I am going to hold off on releasing any new applications until a) there are more devices in circulation and b) the Android Market improves its tablet sections.


Posted on April 10th, 2011 | Filed under android, java, mobile


I have released a simple casual game on the Android Market. I made a quick game while between jobs to see how the market has changed since I released my last game about a year ago, dusting off a 2D OpenGL ES engine I wrote about a year and a half before. It uses custom Box2D JNI bindings and the Android NDK for physics. I have also integrated the new Android LVL licensing system, OpenFeint leaderboards and achievements, and obfuscation using Proguard. It is designed specifically for hdpi devices and I am excited to see how it works on some of the upcoming Android tablets.

One snag I hit was obfuscating with Proguard while having classes with JNI methods: these classes need to be “kept” by the Proguard config file so that the method names and package still match the JNI headers and native source compiled by the NDK:

-keep class com.awalkingcity.casual2d.box2d.Box2DContext { *; }

Another thing I am attempting to do is limit which devices can see the app in the Market. The app was designed for hdpi screens as well as devices with faster processors than some of the first-generation Android devices. Limiting by screen size does not work because the phones I am trying to target report both Medium and Large. So what I tried was adding a uses-feature element with android:glEsVersion="0x00020000". My reasoning is that faster hdpi devices support OpenGL ES 2.0 while slower devices do not. I am not sure whether some of the custom ROMs out there hack OpenGL ES 2.0 support into the older devices or at least spoof support for it.
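For reference, the element goes in AndroidManifest.xml like this:

<uses-feature android:glEsVersion="0x00020000" />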

I am going to update this post with a couple of more tips and tricks I encountered while developing this game.

Here is the market description and some screen shots:

Monkey Fill is a simple and addicting physics game!

The goal is to fill the screen with monkeys while avoiding the bouncing green balls.

Hold down and drag your finger to grow monkeys. They will stop growing if they touch another monkey, or pop if they hit a ball.

After one day of being on sale it became very obvious that a free version was needed to drive sales of the full version, so I created a limited free version; we will see how well it does in the sea of free apps in the Market.

Here is the OpenFeint listing: Monkey Fill





Posted on September 21st, 2010 | Filed under Uncategorized

As I mentioned in my previous post, I have been working through the OpenGL ES 2.0 book and porting many of the examples to Android. I am going to go through some of the progress I have made so far.

Besides getting the various Android-side things set up, the biggest annoyance I found was that all of the shaders in the examples are hard-coded. They all look something like this, inlined in the code:


GLbyte vShaderStr[] =
    "attribute vec4 vPosition;    \n"
    "void main()                  \n"
    "{                            \n"
    "   gl_Position = vPosition;  \n"
    "}                            \n";

GLbyte fShaderStr[] =
    "precision mediump float;                     \n"
    "void main()                                  \n"
    "{                                            \n"
    "  gl_FragColor = vec4 ( 1.0, 0.0, 0.0, 1.0 );\n"
    "}                                            \n";

Obviously this makes it fairly difficult to edit and test shaders in an external program, so the first thing I did was create a mechanism to load a shader from the res/raw/ resource directory, drop it into native code, and compile/link it into a shader program. These code snippets should get you started:

In Android/Dalvik land load a shader into a string:


String vShader = readFileAsString(GL2JNIView.this.getContext().getResources()
.openRawResource(R.raw.vshader));
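readFileAsString is not an SDK method; a minimal version of the helper (using java.io.BufferedReader/InputStreamReader) might look like this:

private static String readFileAsString(InputStream in) throws IOException {
    BufferedReader reader = new BufferedReader(new InputStreamReader(in));
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = reader.readLine()) != null) {
        sb.append(line).append('\n'); // keep line breaks, GLSL cares about them
    }
    reader.close();
    return sb.toString();
}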

Send the string to native land, load it into a character array, compile/link the shader program, and free the pointer:


const char *vertexShader = env->GetStringUTFChars(vShader, 0);

//load shader program
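// (a sketch of the elided step, following the usual GL ES 2.0 pattern;
// error checking via glGetShaderiv/glGetProgramiv is omitted for brevity)
GLuint shader = glCreateShader(GL_VERTEX_SHADER);
glShaderSource(shader, 1, &vertexShader, NULL);
glCompileShader(shader);
GLuint program = glCreateProgram();
glAttachShader(program, shader);
// ...attach the compiled fragment shader the same way, then:
glLinkProgram(program);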

env->ReleaseStringUTFChars(vShader, vertexShader);

Here is a quick overview of some of the examples I have been working on; the screenshots are straight from my Nexus One:

The mighty spinning cube demo, with a simple vertex shader and color fragment shader. One major difference between ES 1.x and 2.0 is that all of the transformation functions have been done away with; instead you must manually manipulate a matrix, pass it into the vertex shader, and multiply all the vertices by it. Fortunately the ES 2.0 book’s sample code comes with functions that map the old calls to the new approach.
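The book does this in native code with its esUtil helpers; if you are working from the Java GLES20 bindings instead, the equivalent bookkeeping looks roughly like this (program, angle, and ratio are assumed to be set up by your renderer, and the "uMvp" uniform name is hypothetical):

// what used to be glFrustum/gluLookAt/glRotatef calls in ES 1.x...
float[] proj = new float[16];
float[] view = new float[16];
float[] mvp = new float[16];
Matrix.frustumM(proj, 0, -ratio, ratio, -1, 1, 1, 10);
Matrix.setLookAtM(view, 0, 0f, 0f, 3f, 0f, 0f, 0f, 0f, 1f, 0f);
Matrix.rotateM(view, 0, angle, 0f, 1f, 0f);
Matrix.multiplyMM(mvp, 0, proj, 0, view, 0);

// ...becomes a single uniform handed to the vertex shader
int loc = GLES20.glGetUniformLocation(program, "uMvp");
GLES20.glUniformMatrix4fv(loc, 1, false, mvp, 0);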

Here is the multitexturing example from Chapter 10 of the ES 2.0 book; multitexturing becomes a fairly simple formula performed inside the fragment shader.

The cube map texturing example from Chapter 9 of the book. ES 2.0 provides a straightforward method for rendering things like reflections on objects by using a cube texture map.

This is a shot of the particle system demo from Chapter 13 of the book. It is difficult to get a good screenshot of all of the particle movement. All of the particle parameters are passed into the shaders and the calculations are performed by the GPU.

One of the first things I wanted to do after the basics was to load a textured .obj model in Dalvik/Android, pass it into native code, and render it using vertex/fragment shaders. This is a sample model from this very cool OpenGL ES Android tutorial/project.

The ES 2.0 book assumes that you already know how to do graphics programming with OpenGL as well as how to program shaders, so I have been using many other resources for learning GLSL and graphics programming in general. I have found that the best resources are usually just OpenGL books from 5 years ago. This is an example of a sepia filter from this great book: Shaders For Game Programmers And Artists. The flag also waves using a simple sin() on the z coordinate in the vertex shader.

This is a simple 4-sample blur shader from the above book.

This is my attempt at creating a motion blur shader. It renders the cube to a Frame Buffer Object and then interpolates with a previous frame. I haven’t gotten it to perform the proper blending of the two frames, but I am pretty close to getting it to work.

Per-Pixel Lighting

I have been trying to get the per-pixel lighting example from the ES 2.0 book to work, but unfortunately they only provide the shaders and a RenderMonkey workspace, which leaves out some fairly crucial elements for actually getting it to run in code.

Next up I want to get familiar with all the various lighting effects you can do with shaders. Shaders provide a much more flexible lighting model than the fixed-function lighting options in ES 1.x, but that also makes lighting less straightforward than it was. After that I would like to put together some scenes and maybe get a simple game engine going with flexible shader options.

Here is a fur shader I made a while ago. I am having a hard time remembering the details of the implementation, but it is based off of this post.

Posted on February 26th, 2010 | Filed under android, java, mobile

* Update: r3 of the Android NDK has been released with OpenGL ES 2.0 support; get it here *

The Motorola Droid and the Nexus One are the first Android phones with graphics hardware that supports OpenGL ES 2.0. Developers have not yet had access to the advanced graphics API in the latest Android SDK/NDK, but it looks like official support for ES 2.0 in the Android NDK is imminent, as the latest NDK code in the Android Open Source Project includes the libraries needed to access the OpenGL ES 2.0 API. If you are impatient like I am and would like to dip your toes into the strange world of vertex and fragment shaders, here is a quick way of getting OpenGL ES 2.0 support up and running with the current release of the Android SDK. (This has been tested on OS X and a Nexus One; I would imagine it would work with Linux and the Droid too, no idea about Windows.)

The first step is to download the latest Android source code from AOSP. The project we are interested in is development/ndk. Copy the ndk directory wherever you want. This NDK is missing the platform-specific ARM toolchain binaries, so we are going to grab those from the 1.6 release of the NDK, which you can find here. Copy the build/prebuilt/ directory of the 1.6 NDK into the build/ directory of the 2.0 NDK.

The next step is to run the host-setup.sh script in build/

sh build/host-setup.sh

After the setup script runs we are ready to build the hello-gl2 sample that conveniently comes with the new NDK. This project contains all the code you need to set up the application and native parts to support ES 2.0.

make APP=hello-gl2

This will compile all of the native code and copy the binary to the libs/ directory of the Android project.

After this is finished we can build and install the application found in apps/hello-gl2/project/ onto our Android 2.0+ device and see ES 2.0 in action!

Pretty awesome, right? I will save actual OpenGL ES 2.0 walkthroughs for later as I go through the ES 2.0 book, but the source included in the hello-gl2 project should be enough to get you started. Obviously this is all unsupported and unofficial, and you should more than likely not release apps using these APIs onto the Market until the official SDK is updated.

You certainly wouldn’t want to release a Live Wallpaper using this code because it would probably get 1 star as most users are unimpressed that you got ES 2.0 working ;)

Posted on February 5th, 2010 | Filed under android, java, mobile