Android Rest API Over C - Check! Finally!

09 April 2020

First, a little demonstration:

No, I’m not simply sending back a hand-crafted string claiming to be a REST server :) The frontend is still in the fake-wireframe phase though.

At this point, if I even have one reader - well, sorry for the long delay. This semester started out easy and quickly became really tedious, as I got plenty of homework assigned over a short period of time. But let’s see the results so far!

Plans with the client side

At its core this application is basically a C library that should be able to carry out the cryptographic procedures needed by GnuPG’s SCDaemon. It is expected to be used with some frontend, and it is going to have attachable interfaces for platform-specific implementations, such as the means of using the available secure data storage. It just so happens that early Android devices are the first hardware targeted as a backend, simply because we have a lot of them.
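To make the idea of an attachable interface concrete, here is a sketch of what such a platform hook could look like as a function-pointer table. All the names below are my assumptions for illustration - none of this is the library’s real API:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical attachable platform interface: a platform port (Android,
 * desktop, ...) fills in the function pointers with its own secure
 * storage implementation. */
typedef struct {
    int (*store_secret)(const char *key, const void *data, size_t len);
    int (*load_secret)(const char *key, char *out, size_t maxlen);
} kc_storage_backend;

/* Trivial in-memory mock standing in for, e.g., Android's secure storage. */
static char mock_value[64];

static int mock_store(const char *key, const void *data, size_t len) {
    (void)key;                      /* single-slot mock ignores the key */
    if (len >= sizeof mock_value)
        return -1;
    memcpy(mock_value, data, len);
    mock_value[len] = '\0';
    return 0;
}

static int mock_load(const char *key, char *out, size_t maxlen) {
    (void)key;
    if (maxlen == 0)
        return -1;
    strncpy(out, mock_value, maxlen - 1);
    out[maxlen - 1] = '\0';
    return 0;
}
```

The core library would then only ever call through the table, so swapping backing hardware means swapping the struct, not the crypto code.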

SCDaemon is bundled with the source of GPG, and it serves as the mediator between the smart card and the user program. It is meant to be used over IPC, and it provides an abstraction over specific smart cards and over the OpenPGP applet. Its interface looks like this:

$ ./scdaemon --server
OK GNU Privacy Guard's Smartcard server ready

(...content skipped...)

# SERIALNO [--demand=<serialno>] [<apptype>]
# LEARN [--force] [--keypairinfo]
# READCERT <hexified_certid>|<keyid>
# READKEY [--advanced] <keyid>
# SETDATA [--append] <hexstring>
# PKSIGN [--hash=[rmd160|sha{1,224,256,384,512}|md5]] <hexified_id>
# PKAUTH <hexified_id>
# PKDECRYPT <hexified_id>

(...content skipped...)

We will get back to that later when the client of this solution is implemented.
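The transcript above is an Assuan-style line protocol: the first token of each response line tells the client how to treat it. A future kc client will need something along these lines to sort the lines it reads back - the enum and helper below are my own sketch, not part of GnuPG (a real client also has to handle INQUIRE lines, among other things):

```c
#include <assert.h>
#include <string.h>

/* Minimal classifier for Assuan-style response lines, as seen in the
 * scdaemon transcript above. Sketch only, not GnuPG code. */
typedef enum {
    RESP_OK,       /* command succeeded */
    RESP_ERR,      /* command failed, error code follows */
    RESP_STATUS,   /* "S " status line */
    RESP_DATA,     /* "D " data line */
    RESP_COMMENT,  /* "# " comment, like the help text above */
    RESP_OTHER
} resp_kind;

resp_kind classify_response(const char *line) {
    if (strncmp(line, "OK", 2) == 0)  return RESP_OK;
    if (strncmp(line, "ERR", 3) == 0) return RESP_ERR;
    if (strncmp(line, "S ", 2) == 0)  return RESP_STATUS;
    if (strncmp(line, "D ", 2) == 0)  return RESP_DATA;
    if (line[0] == '#')               return RESP_COMMENT;
    return RESP_OTHER;
}
```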

The progress so far is that we have a C library - called Ulfius - that makes it quite easy to create a TLS-secured REST API, and it works both on my computer and on Android 1.5 (at least in the emulator).

Preparing the REST server for the emulator

Writing code with Ulfius for my hello-rest API in C is quite friendly and straightforward. After setting up the network parameters - such as which IP address and port to bind to - basically all I do is define the endpoints with ulfius_add_endpoint_by_val().

void kc_start_accepting_requests() {
    bindaddr.sin_family = AF_INET;
    bindaddr.sin_port = htons(PORT);
    inet_aton("0.0.0.0", &bindaddr.sin_addr); /* bind on all interfaces */
    if (ulfius_init_instance(&ulfius_instance, PORT, &bindaddr, NULL) != U_OK) {
        printf("Error initializing ulfius!\n");
        return;
    }

    u_map_put(ulfius_instance.default_headers, "Access-Control-Allow-Origin", "*");
    ulfius_instance.max_post_body_size = 32768;
    ulfius_add_endpoint_by_val(&ulfius_instance, "GET", "/hello", NULL, 0, &callback_get_test, NULL);
    ulfius_start_framework(&ulfius_instance); /* start listening */
}

The handler for the simple /hello endpoint looks like this:

int callback_get_test (const struct _u_request * request, struct _u_response * response, void * user_data) {
    json_t *root = json_object();
    json_object_set_new(root, "response", json_string("Hello dude!"));
    ulfius_set_json_body_response(response, 200, root);
    json_decref(root);
    return U_CALLBACK_CONTINUE;
}

Neat! After compiling the source on the computer and running the executable, I can read in my browser the response that is being constructed here. The first obstacle is solved.

Running on Android

As the early Android devices rely on the ARM CPU architecture, natively compiled code that runs on my machine is not executable on the phone. We have to compile our software - the library and all of its dependencies - using the compilers supplied with the Android NDK.

Library trouble

It quickly turned out that my best bet is to link the libraries statically, as it was a little unpredictable when shared libraries would work. In short, shared (dynamically linked) libraries are separate .so files - similar to the .dll files on Windows - that are required to be at a well-known place at runtime.

Their advantages include that:

  • In my experience they are widely used: some libraries do not even provide a statically linkable archive.
  • They seem to be easier to link to, as unresolved dependencies are reported both at compile time and at execution time - in the latter case even naming the missing .so files.
  • Multiple executables or intermediate libraries can rely on the same file, sparing disk space
    • (not actually a huge advantage here because we are going to bundle them into the .apk anyway)
  • They can easily be switched to other versions or forks of the library without recompiling the software - a property that makes it easier to comply with licenses like the GNU LGPL (see point 4/d)
    • (again, the .apk needs to be reassembled anyway)

By contrast, a statically linked library is bundled into the executable.

  • It is easier to distribute the executable, as it is mostly self-contained.
  • So far I did not get flaky results with them, as opposed to dynamically linked libraries: Android 1.6 seemed to be able to use the same shared library that Android 1.5 could not.
  • In theory, only the relevant functions and parts are going to be copied into the target executable or shared library.

I’m using what worked - static libraries in this case.

Whether I used static or shared libraries, I had to compile them one by one. I’ve done that and even automated the process using PKGBUILD recipes for Arch Linux. You may find these files on GitHub. There is even a Dockerfile for a compilation environment. The package android-ndk-r10e-s-arm contains a cmake.toolchain file to make using the libraries easier.

With the compiled libraries under my belt I was able to successfully craft the ARM executable.

#include <kc.h>
#include <stdio.h>

int main(void) {
    printf("Starting kc-core server\n");
    kc_start_accepting_requests();

    printf("Hit RETURN to stop server\n");
    getchar(); /* block until the user hits RETURN */

    kc_stop_accepting_requests();
    return 0;
}

Please note that this did not work on the first try. Not even on the second. But, anyhow, the second obstacle was solved too.

Creating the application for Android

Now this was not easy. In the previous article I wrote about hunting down the old SDK.

It was time to fumble with the appropriate Eclipse version that works with the appropriate ADT version - as that was how one maintained an Android app before Android Studio. When I first tried the latest Eclipse with the first ADT version I found working, the setup was buggy, but the .apk was somehow produced anyway. It happened in a weird way: Eclipse literally and consistently built it on every second attempt to run the app, and even then I got no notification about it because of other errors related to Eclipse trying to communicate with ADB. Frustrated with the errors, I absent-mindedly opened the bin/ folder of the project and found the treasure I was looking for. I could push it to the emulator and it ran!

So, it worked… mostly. …well…. then… I decided to clean up my machine and reinstall Arch, as it was really filled with all the unnecessary garbage I had loaded onto that more-than-two-year-old installation.

Relatively quickly I set up the SDK and the emulator again, then reinstalled the prebuilt packages I had made out of the libraries. I got back to the point where basic-demo-static worked. But then Eclipse could not produce an APK anymore, even though the new setup was meant to be identical to the old one.

Traveling back in time to around 2009

That was the point where there were so many possibly misaligned and unsupported versions producing weird phenomena and bugs that I considered a new strategy. I looked up the Wikipedia page on the release dates, found that Android 1.5 came out in April 2009, and started to assemble an environment that resembled a developer’s tooling from around 2009 or 2010.

After some more googling and trial and error, I settled on the final environment:

Tool                          Version                               Release date (month)
VirtualBox & Guest additions  latest (6.1.4), from package manager  (not relevant)
  (hint on compatibility)
Ubuntu                        8.04 LTS, i386 version                April, 2008
Android SDK                   1.6 r1                                September, 2009
Sun Java JDK                  6 (from Ubuntu’s archived packages)   December, 2006
Eclipse                       Ganymede, 3.4.2 (“SR2 package”)       June, 2008
Eclipse ADT                   0.9.5                                 December, 2009

The hardest part was finding the relevant ADT version, as I did not find a table of versions and release dates. What I did find, however, thanks to a Stack Overflow user, was the old version download links.

Luckily, the WaybackMachine came to the rescue. The snapshot in the image was captured in January, 2010.

Clue that I need Android ADT 0.9.5

Putting the VM together

It is a miracle that the virtual machine can not only produce .apk files in a reliable way, but also run the Android emulator relatively smoothly, and the emulator integrates well with Eclipse. With one click I can deploy the application from the IDE and test it.

Last challenge: can the Java frontend use our small hello-REST C library?

The theory is relatively straightforward. In order to use the native library from Java, first I need to fit it to a JNI interface.

That means that if I want to use these functions defined in my header:

// kc.h:
#pragma once
#include <netinet/in.h>

void kc_start_accepting_requests();
void kc_stop_accepting_requests();

Then I need to create a “translator” lib.

First I define the same functions in the Java interface:

package com.codekuklin.kc_droid_frontend;

public class KC_Core {
	static {
		// The library name here is an assumption; load whatever the bridge .so is called.
		System.loadLibrary("kc-core-bridge");
	}

	public native void startAcceptingConnections();
	public native void stopAcceptingConnections();
}

With the System.loadLibrary() calls I specify all the libraries that are required for the native functions below to run. In this case, it is only the bridge library itself, as it is designed to be a self-contained shared library.


  1. I need to compile the application.
  2. After changing directory to the bin/ folder (where the compiled classes are), I need to generate the header of the translator lib:
.../bin $ javah -o kc-core_bridge.h -classpath . com.codekuklin.kc_droid_frontend.KC_Core

Here is the result:

// kc-core-bridge.h:

/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class com_codekuklin_kc_droid_frontend_KC_Core */

#ifndef _Included_com_codekuklin_kc_droid_frontend_KC_Core
#define _Included_com_codekuklin_kc_droid_frontend_KC_Core
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     com_codekuklin_kc_droid_frontend_KC_Core
 * Method:    startAcceptingConnections
 * Signature: ()V
 */
JNIEXPORT void JNICALL Java_com_codekuklin_kc_1droid_1frontend_KC_1Core_startAcceptingConnections
  (JNIEnv *, jobject);

/*
 * Class:     com_codekuklin_kc_droid_frontend_KC_Core
 * Method:    stopAcceptingConnections
 * Signature: ()V
 */
JNIEXPORT void JNICALL Java_com_codekuklin_kc_1droid_1frontend_KC_1Core_stopAcceptingConnections
  (JNIEnv *, jobject);

#ifdef __cplusplus
}
#endif
#endif
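The odd _1 sequences in the generated symbol names come from JNI’s name mangling: a '.' in the class name becomes '_', and a literal '_' becomes "_1" so the two cannot collide. A toy re-implementation of just these two rules (my own sketch - javah handles more cases, such as unicode escapes and overloaded methods):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Toy sketch of JNI short-name mangling: '.' -> '_', '_' -> "_1".
 * Only these two rules are covered; not a full javah replacement. */
void jni_mangle(const char *class_name, const char *method,
                char *out, size_t outlen) {
    size_t n = (size_t)snprintf(out, outlen, "Java_");
    for (const char *p = class_name; *p != '\0' && n + 3 < outlen; p++) {
        if (*p == '.') {
            out[n++] = '_';
        } else if (*p == '_') {
            out[n++] = '_';
            out[n++] = '1';
        } else {
            out[n++] = *p;
        }
    }
    out[n] = '\0';
    snprintf(out + n, outlen - n, "_%s", method);
}
```

Running it on the class and method names above reproduces exactly the symbol names javah generated for me.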

Now it is time to fit my library to it by implementing this header:

#include "kc-core-bridge.h"
#include <kc.h>
#include <stdio.h>

JNIEXPORT void JNICALL Java_com_codekuklin_kc_1droid_1frontend_KC_1Core_startAcceptingConnections
  (JNIEnv * env, jobject obj) {
	kc_start_accepting_requests(); // no rocket science, just calling the appropriate function
}

JNIEXPORT void JNICALL Java_com_codekuklin_kc_1droid_1frontend_KC_1Core_stopAcceptingConnections
  (JNIEnv * env, jobject obj) {
	kc_stop_accepting_requests();
}

…then compile it into a shared library:

$ arm-linux-androideabi-gcc \
        -Iinclude \
        -Wl,--start-group ./alllibs/*.a kc-core-bridge.c -Wl,--end-group \
        -shared \
        -o target/

This command line produces a self-contained shared library with no intermediate dependencies.

Note that, by default, the linker resolves dependencies by iterating over the libraries in the given order only once. Here, the libs matched by ./alllibs/*.a are going to be substituted in alphabetical order. In my case, leaving that ordering as-is leaves the produced lib with a lot of unresolved, unlinked functions, because the libraries are not ordered by their dependencies.

With this start/stop-group ugly hack, we are instead asking the linker to repeat this iteration over and over again until all the unresolved dependencies are resolved.

Using this method is also a way to resolve circular dependencies among these libraries (which, ewww, you should not have between them! What’s the point of having two separate modules if they can only be used together?).

According to Eli Bendersky’s blog, the reason the default is to run this resolution pass only once - while expecting a topologically correct order of the dependent libraries - is performance: repeating the iteration has a significantly higher cost, that is, it would be slower.
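The difference between the default single pass and the grouped repeat can be sketched as a fixed-point iteration. The toy resolver below is purely illustrative (not how GNU ld actually works, and real archives define many symbols each): libraries listed in alphabetical order, each depending on one listed before it, need three sweeps, while a missing provider leaves symbols unresolved:

```c
#include <assert.h>
#include <string.h>

#define MAXSYMS 8

/* Toy model of -Wl,--start-group/--end-group: keep sweeping the archive
 * list until a full pass pulls in nothing new. All names are made up. */
typedef struct {
    const char *provides;  /* the one symbol this archive defines */
    const char *needs;     /* the one symbol it depends on, or "" */
    int pulled_in;
} toy_lib;

/* Returns how many sweeps were needed, or -1 if symbols stay
 * unresolved (the "lot of unresolved functions" case). */
int resolve_with_group(toy_lib *libs, int nlibs, const char *root_need) {
    char unresolved[MAXSYMS][32];
    int n_unres = 0, sweeps = 0, progress = 1;
    strcpy(unresolved[n_unres++], root_need);
    while (n_unres > 0 && progress) {
        progress = 0;
        sweeps++;
        for (int i = 0; i < nlibs; i++) {
            if (libs[i].pulled_in)
                continue;
            for (int u = 0; u < n_unres; u++) {
                if (strcmp(unresolved[u], libs[i].provides) == 0) {
                    libs[i].pulled_in = 1;
                    n_unres--;                     /* symbol satisfied */
                    if (u < n_unres)
                        strcpy(unresolved[u], unresolved[n_unres]);
                    if (libs[i].needs[0] != '\0')  /* new dependency appears */
                        strcpy(unresolved[n_unres++], libs[i].needs);
                    progress = 1;
                    break;
                }
            }
        }
    }
    return n_unres == 0 ? sweeps : -1;
}
```

A single-pass linker corresponds to stopping after the first sweep, which is exactly why the alphabetical ordering failed for me without the group flags.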

But, for now, who cares, right? ;) (This is how a typical sentence reads that is surely going to backfire, sometimes within two hours.)

After some messing around, it looks like the simplest method of bundling the lib is to drop it into ${PROJECT_ROOT}/libs/<abi>, where <abi> is armeabi in our case. The library is then going to be bundled into the APK. You may examine its content using a zip archiver to see if it is really there.

The last step is to call the library functions from the application. For testing purposes, I’m doing this at the first viable place in the code - when constructing the first activity:

KC_Core kc = new KC_Core();
kc.startAcceptingConnections();

Compile and start the application, then forward the appropriate port with:

$ adb forward tcp:5777 tcp:5777 # my API is served on the port 5777

And, you’re finally at the point that the video demonstrated!

I think next time I will investigate SCDaemon and try to hack it to report the presence of a fake smart card.

As always, thanks for reading!