Monday, 8 March 2010

Java Developer Diaries: Developing Android Applications

On September 6th, 2007, I received a message on LinkedIn from Chloe Arrowsmith of Google, asking if I was interested in a position on the Mobile Applications software engineering team they were building in London. As you may guess, this team was set up to build Android, and they didn't hire me in the end :).

Since then I have been trying to watch Android more closely. Frankly, there are things I like about Android and there are things I don't.

Android is a mature platform, thanks to Google's strategy of releasing the SDK to the public well before any device was available. When I started working with Android, I realized I was comparing every structure with the iPhone and Objective-C, which is actually odd because I am more of a Java guy than an Objective-C one. Like it or not, the iPhone and Objective-C make a well-organized and elegant platform to work on, and the deeper I dive into the Android SDK, the more I find things that are not quite as easy or as well organized.

Honestly, I have never been very interested in Java ME, since I find it a bit limited, but the newer devices and SDKs (such as Android and iPhone) are much more sophisticated and capable. Android offers much more than the iPhone SDK can even imagine. You can code Activities, which are the building blocks of user interfaces. You can use content providers to access and share data between applications. You can use Intents to get notified about various events, from hardware state changes to incoming data to application events; and not only can you respond to Intents, you can also create your own to launch other activities or to be told when specific situations arise, which you can't do on an iPhone. There is one other key feature the iPhone lacks: Services, which keep running even when not attached to an Activity, for example so your instant messaging client can keep listening for messages after you close its UI.
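To make the Intent and Service ideas above a bit more concrete, here is a minimal sketch. The implicit Intent and the startActivity/startService calls are standard SDK API; MessageListenerService is a hypothetical service class you would write yourself.

import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;

public class IntentDemoActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Implicit Intent: ask the platform to display a URL; whatever
        // application can handle ACTION_VIEW for this data will respond.
        startActivity(new Intent(Intent.ACTION_VIEW, Uri.parse("http://www.android.com")));
        // Explicit Intent: start a Service that keeps running even after
        // this activity is gone (MessageListenerService is hypothetical).
        startService(new Intent(this, MessageListenerService.class));
    }
}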

These are the best and most promising parts, since most of them are only available on Android devices. However, I find the UI and controller integration a bit troublesome. Also, even though the UI controller structures seem to be taken from Objective-C, they don't feel as neat.

Enough talking, let's see things in action.

If you have read my previous post on developing an iPhone application, why don't we build the same thing on Android, using GPS, the accelerometer and UI actions? (If not, you can go back and check the iPhone post, since it makes for a better comparison, though that is not necessary.) First of all, I assume you have downloaded and set up the Android SDK and the ADT Eclipse plugin. The installation is pretty straightforward and ran smoothly on Mac and Ubuntu (it should be fine on Windows too). I also assume you have basic knowledge of Eclipse and Java development in Eclipse.

Fire up Eclipse and create a new Android project, just as you would create a Java project.



Next, type a project, application and package name. We also need to specify a target runtime, for which I chose 2.1. Since we are not using any new APIs, it should be fine to select as low as 1.5. Click Finish to create your project.



You may have noticed some errors in your project, which is not actually a real problem.



If you clean and rebuild your project, the errors will be gone. This seems like a small issue in the ADT plugin, caused either by not running the incremental builders at project creation time or by not clearing the error markers after the first build; whichever it is, never mind.



The MainActivity class is created with an overridden onCreate method, which is fired when the activity is first created. Android has a different way of declaring the UI. The main.xml file under the layout folder is used for creating the UI layout, and it comes with a visual designer. strings.xml under the values folder is used to hold static values, and we can also access those from our code. To be honest, I really like this approach: it is quite easy, well organized, and the visual designer is a great plus. Also, if you are coming from an XML-declarative platform such as Flex (MXML) or Silverlight (XAML), you won't find it much different (even though it actually is).
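For reference, the generated strings.xml is tiny; it looks roughly like the sketch below (the exact values depend on the names you typed into the wizard, so treat app_name here as a placeholder).

<?xml version="1.0" encoding="utf-8"?>
<resources>
    <!-- static values, referenced as @string/... in layouts and R.string.... in code -->
    <string name="hello">Hello World, MainActivity!</string>
    <string name="app_name">GPSDemo</string>
</resources>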



Even at this stage you might have realized that an Android application has far fewer files than an iPhone application (you can check the previous post). Although this makes it appear simpler and easier to start with, the deeper you go the more you realize that Objective-C is a bit more organized. Open the MainActivity class, which acts as the view controller. The basic usage of the APIs is not very different from the iPhone: we are going to implement a listener interface, register our class as the listener and code the implementation methods.

Let's start by building our interface. The default view already has a TextView printing hello; we can modify this one and add another to display our latitude and longitude.


<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    >
    <TextView
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"
        android:id="@+id/latitude"
        />
    <TextView
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"
        android:id="@+id/longitude"
        />
</LinearLayout>


You may also have noticed that 'id' attributes are added in order to access these components from the code. As I mentioned before, this approach is much like other XML-declarative platforms. By the way, if you feel comfortable (or brave enough), you may also try tweaking the UI using the layout view.



Since this should be enough for the view, let's go back to our activity class. As mentioned above, MainActivity comes with an overridden onCreate method that is fired when the activity is first created.

First, let's declare two TextView references to control our UI components and get the LocationManager to access the GPS.


// imports needed so far
import android.app.Activity;
import android.content.Context;
import android.location.LocationManager;
import android.os.Bundle;
import android.widget.TextView;

public class MainActivity extends Activity {

    private LocationManager locationManager;
    private TextView latitude;
    private TextView longitude;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        // bind the UI components to code references
        latitude = (TextView) findViewById(R.id.latitude);
        longitude = (TextView) findViewById(R.id.longitude);
        // get the LocationManager system service
        locationManager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
    }
    //...


So far the most surprising thing for me was the need to bind the UI components to code references manually via the findViewById method. Honestly, this seems a bit unnecessary and weird. I was expecting either the compiler to wire up the UI components automatically (as in other XML-declarative languages) or visual tools to do the job (as Interface Builder does on the Mac). This seems fine for now, but more complicated views might become troublesome.

Since we are mostly done with the UI code, we can start using the GPS. We have already obtained the location manager.

This is enough to get the location manager, but to receive location updates we need to register a listener. To keep things simple and closer to what I did in the iPhone tutorial, I am going to implement the needed listener interfaces on our activity class. You may prefer to create anonymous listener classes instead; there is a small sketch of that variant a bit further down.


public class MainActivity extends Activity implements LocationListener {


Next, we create two methods to register and unregister our class as a listener for location updates via the locationManager.requestLocationUpdates method. This method takes a provider constant, a minimum time interval, a minimum distance interval and finally the listener.


private void startListening() {
    // register our class as the listener: provider, minimum time between
    // updates (ms), minimum distance between updates (meters), listener
    locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, this);
}

private void stopListening() {
    // unregister so we no longer receive location updates
    locationManager.removeUpdates(this);
}


Now that the methods are ready, we can call startListening in the onCreate method to register our class as a listener for location updates.


...
locationManager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
startListening();
...


Implementing the LocationListener interface requires four new methods. To avoid compile errors, let's create all of them and actually implement the onLocationChanged method.


public void onLocationChanged(Location location) {
    // show the new coordinates, then stop listening since one fix is enough
    latitude.setText(String.valueOf(location.getLatitude()));
    longitude.setText(String.valueOf(location.getLongitude()));
    stopListening();
}

public void onProviderDisabled(String provider) {
    // not needed for this example
}

public void onProviderEnabled(String provider) {
    // not needed for this example
}

public void onStatusChanged(String provider, int status, Bundle extras) {
    // not needed for this example
}


Any new location update will trigger the onLocationChanged method. However, we don't need further updates after we get the location, so as soon as an update arrives we assign the values to the UI text fields and call stopListening to unregister our listener.
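By the way, if you would rather not have the activity implement LocationListener itself (the anonymous-listener variant I mentioned earlier), the same thing can be sketched roughly like this; here the listener unregisters itself as soon as the first fix arrives.

LocationListener gpsListener = new LocationListener() {
    public void onLocationChanged(Location location) {
        latitude.setText(String.valueOf(location.getLatitude()));
        longitude.setText(String.valueOf(location.getLongitude()));
        // one fix is enough, unregister this listener
        locationManager.removeUpdates(this);
    }
    public void onProviderDisabled(String provider) { }
    public void onProviderEnabled(String provider) { }
    public void onStatusChanged(String provider, int status, Bundle extras) { }
};
locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 0, 0, gpsListener);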

As you may remember from the iPhone post, we want our application to update the location info when we shake the handset. To achieve this we need to ask the sensor service to inform us about accelerometer events. This approach is a bit primitive compared to the shake API introduced in iPhone OS 3, but it is pretty much the same as what we did on the older iPhone OS. First, let's get the sensor manager.


private SensorManager sensorManager;


And in the onCreate method, initialize the sensorManager and register a listener.


...
sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
// register for accelerometer events; the last argument is a hint for how
// often we would like sensor updates
sensorManager.registerListener(this,
        sensorManager.getSensorList(Sensor.TYPE_ACCELEROMETER).get(0), 100);
...


We also need to implement a new interface to let our class become a listener for SensorEvents.


public class MainActivity extends Activity implements LocationListener, SensorEventListener {


Adding the interface will cause errors because of unimplemented methods. Add the required methods and implement onSensorChanged to detect a shake and re-register for GPS updates.


// The threshold value to detect a shake
private float kThreshold = 1.2f;

public void onAccuracyChanged(Sensor sensor, int accuracy) {
    // not needed for this example
}

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        // acceleration along each axis, in m/s^2 (gravity included)
        float[] values = event.values;
        if (values[SensorManager.DATA_X] > kThreshold
                || values[SensorManager.DATA_Y] > kThreshold
                || values[SensorManager.DATA_Z] > kThreshold) {
            // looks like a shake, request a fresh location fix
            startListening();
        }
    }
}


Finally, we want our UI to respond to orientation changes and rotate accordingly. Unlike its iPhone counterpart, Android does not need any code change for this; just tweak the AndroidManifest.xml and it is done!
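For reference, the relevant manifest entry looks roughly like the sketch below; the android:screenOrientation attribute on the activity element is what controls this, and "sensor" lets the orientation follow the device's orientation sensor. The activity name and label here are just the project defaults.

<activity android:name=".MainActivity"
          android:label="@string/app_name"
          android:screenOrientation="sensor">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>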





Although ADT offers visual editors for this XML file, it is not possible to change the screenOrientation parameter visually.



Now we are ready to run our application. The Android SDK comes with an emulator; to set up new virtual devices, use the AVD Manager.



You can add virtual devices running versions from 1.1 to 2.1. The SDK comes with version 2.1, but other platform packages can also be downloaded from the Available Packages section.
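If you prefer the command line, an AVD can also be created with the android tool that ships with the SDK. Something along these lines should work; the AVD name is just an example, and you can check the exact target identifier with android list targets (android-7 corresponds to 2.1).

$ANDROID_SDK_HOME/tools/android list targets
$ANDROID_SDK_HOME/tools/android create avd -n demo21 -t android-7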



Be careful not to choose an AVD version older than the project build target you selected when creating the project. Now that our AVD is ready, we can run the application: right-click the project and select Run As > Android Application.



This should fire up the emulator and deploy your project, which will fail with an error like the one shown below.



Unlike the iPhone, Android offers many more services and much more integration to developers, which brings up another problem: security. The Android programming model expects you to declare the resources you want to use, so instead of asking for permission each time the application starts, Android tells the user what the application can do at installation time. While this approach is much simpler and less painful for the user, it also grants permanent access to those resources once the user confirms.

To add those permissions, open AndroidManifest.xml and go to the Permissions tab.



Click Add.



And add the ACCESS_COARSE_LOCATION and ACCESS_FINE_LOCATION permissions from the dropdown.
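If you prefer to edit the XML directly, the same change just adds these two elements to AndroidManifest.xml:

<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />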



Finally, our application is really ready to run. I had some difficulties getting location data on the emulator, and shaking the emulator window didn't cause the accelerometer to detect anything. However, when I deployed the application on a real device everything ran smoothly. Later I realized I needed to install SensorSimulatorSettings.apk to get mock locations and trigger shakes on the emulator. To install it, use the following command:

$ANDROID_SDK_HOME/tools/adb install SensorSimulatorSettings.apk

You can also refer to this post for more info about SensorSimulatorSettings.

The running application should look similar to the screenshot below, responding to shake and orientation changes. (By the way, thanks to Google for giving out the developer phone at I/O last year.)



Android is a very promising platform, and although it is quite young, it is already quite mature. I will write another post comparing iPhone and Android development later, but to tell the truth, staying in Java and having a more open and flexible platform feels more comfortable. However, compared to the iPhone, the documentation, samples and even books are still not quite as good.