Rafael Sanches

June 13, 2011

Google Analytics lags on Android. How to make it more responsive!

Filed under: analytics, android, maintainability, performance — mufumbo @ 5:55 am

Google Analytics can be your best friend for tracking your mobile users' behavior. Unfortunately, the current Android implementation has performance limitations, and the most problematic is that it uses SQLite to store your events.

Everyone who wants to write a responsive app knows that you can't do SQLite operations on the UI thread. Having to wrap every Google Analytics call in a separate thread can be painful, so I wrote a very simple helper that handles it on a background thread. I have many tracking events inside button-click handlers, and each was taking about 200ms to execute, which is too much for the UI thread. It's also too much inside "onCreate", because it delays the opening of your new activity.

This helper is also somewhat dirty because it keeps a static reference tied to the context. I do this in order to get better numbers for visits and "time on site". You can simply remove the static reference if you don't like that.

Notice that my implementation calls "Thread.sleep(3000);" before each tracking call.
This is so that repeated Google Analytics SQLite operations don't compete with my app's own inserts and reads.

This lag happens because SQLite uses internal storage, which can be very slow depending on many factors, including concurrent SQLite operations or internal storage that is low on free space.
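The core of the helper below is just a single-threaded ExecutorService: each tracking call is submitted as a task, so the caller returns immediately and the SQLite work happens serially on one background thread. Here is a minimal, Android-free sketch of that pattern; the `SerializedTracking` class, the `tracked` counter, and the `trackClick` name are stand-ins for illustration, not the real Analytics SDK:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class SerializedTracking {
    // One worker thread: submit() returns immediately, and tasks run one at a
    // time, so slow SQLite writes never block the submitting (UI) thread.
    static final ExecutorService worker = Executors.newSingleThreadExecutor();
    static final AtomicInteger tracked = new AtomicInteger();

    static void trackClick(final String button) {
        worker.submit(new Runnable() {
            @Override
            public void run() {
                // The real helper calls tracker.trackEvent(...) here,
                // which is the part that touches SQLite.
                tracked.incrementAndGet();
            }
        });
    }

    public static void main(String[] args) throws InterruptedException {
        for (int i = 0; i < 100; i++) {
            trackClick("button-" + i); // each call returns almost instantly
        }
        worker.shutdown();
        worker.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println("tracked " + tracked.get() + " events"); // tracked 100 events
    }
}
```

Because the executor has exactly one thread, events are also guaranteed to reach the tracker in submission order, which matters for page-view sequences.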

I hope it helps someone. Here’s the complete code:

package com.mufumbo.android.helper;

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import android.content.Context;
import android.util.Log;

import com.google.android.apps.analytics.GoogleAnalyticsTracker;

/**
 * Wraps GoogleAnalyticsTracker so that every tracking call runs on a single
 * background thread, keeping the Analytics SDK's SQLite work off the UI thread.
 */
public class GAHelper {
    String activity;
    static GoogleAnalyticsTracker tracker;
    static int instanceCount = 0;
    long start;

    // Limit the number of events before dispatching, due to OutOfMemoryError in the Analytics SDK
    final static int MAX_EVENTS_BEFORE_DISPATCH = 200;
    static int eventCount = 0;

    static final ExecutorService tpe = Executors.newSingleThreadExecutor();

    public GAHelper(final Context c, final String activity) {
        this.activity = activity;
        instanceCount++;
        if (tracker == null) {
            tpe.submit(new Runnable() {
                @Override
                public void run() {
                    tracker = GoogleAnalyticsTracker.getInstance();
                    tracker.start(Constants.GOOGLE_ANALYTICS_ID, Constants.GOOGLE_ANALYTICS_DELAY, c.getApplicationContext());
                }
            });
        }
    }

    public void onResume() {
        this.trackPageView("/"+this.activity);
    }

    public synchronized void destroy () {
        instanceCount--;
        if (instanceCount <= 0) {
            tpe.submit(new Runnable() {
                @Override
                public void run() {
                    Log.i(Constants.TAG, "destroying GA");
                    if (tracker != null)
                        tracker.stop();
                    instanceCount = 0;
                }
            });
        }
    }

    protected void tick() throws InterruptedException {
        Thread.sleep(3000);
        this.start = System.currentTimeMillis();
    }

    public void log (final String l) {
        if (Dbg.IS_DEBUG) {
            Dbg.debug("['"+(System.currentTimeMillis()-start)+"']["+eventCount+"] Logging on '"+this.activity+"': "+l);
            if (l.contains(" ")) {
                Log.e(Constants.TAG, "DO NOT TRACK WITH SPACES: "+l, new Exception());
            }
        }

    }

    public void trackClick(final String button) {
        checkDispatch();
        tpe.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    tick();
                    tracker.trackEvent(
                            "clicks",  // Category
                            activity+"-button",  // Action
                            button, // Label
                            1);
                    log("trackClick:"+button);
                } catch (final Exception e) {
                    Log.e(Constants.TAG, "Error tracking", e);
                }
            }
        });
    }

    public void trackEvent (final String category, final String action, final String label, final int count) {
        checkDispatch();
        tpe.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    tick();
                    tracker.trackEvent(
                            category,  // Category
                            action,  // Action
                            activity+"-"+label, // Label
                            count);
                    log("trackEvent:"+category + "#"+action+"#"+label+"#"+count);
                } catch (final Exception e) {
                    Log.e(Constants.TAG, "Error tracking", e);
                }
            }
        });
    }

    public void trackPopupView (final String popup) {
        checkDispatch();
        tpe.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    tick();
                    final String page = "/"+activity+"/"+popup;
                    tracker.trackPageView(page);
                    log("trackPageView:"+page);
                } catch (final Exception e) {
                    Log.e(Constants.TAG, "Error tracking", e);
                }
            }
        });
    }

    public void trackPageView (final String page) {
        checkDispatch();
        tpe.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    tick();
                    tracker.trackPageView(page);
                    log("trackPageView:"+page);
                } catch (final Exception e) {
                    Log.e(Constants.TAG, "Error tracking", e);
                }
            }
        });
    }

    public void checkDispatch() {
        eventCount++;
        if (eventCount >= MAX_EVENTS_BEFORE_DISPATCH)
            dispatch();
    }

    public void dispatch(){
        eventCount = 0;
        tpe.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    tick();
                    tracker.dispatch();
                    log("dispatched");
                } catch (final Exception e) {
                    Log.e(Constants.TAG, "Error dispatching", e);
                }
            }
        });
    }
}

October 18, 2009

RSS parsing optimization for bandwidth and processing time with SAX and HttpClient – polling scripts

Filed under: android, maintainability, performance, programming — mufumbo @ 3:55 pm

My server was receiving a constant 1.7 MB/s of incoming traffic for a service that downloads RSS feeds from the internet and processes them. Basically, it needs to return the latest updates of multiple RSS feeds. It's a very basic polling system, but it was downloading far too much data for just 15,000 active users. The growth wasn't looking feasible.

I was using the ROME Java library to parse the XML. So far so good; the problem was that it downloads the whole feed and processes all of it. For my application's scope I don't need to download the whole RSS feed, just the new entries that I haven't downloaded yet.

The solution was to use a custom SAX RSS parser, looping through the "<item>" tags and identifying the "<pubDate>". This way I can parse item by item and detect when the current item is no longer new, so I can abort the HTTP connection and stop downloading the feed. I wish ROME had an option to do that, like "stop processing when 'publishedDate' is older than...".

The impact on bandwidth usage and processing time was impressive.

If anyone is interested, I can post and explain the Java class. It's compatible with com.sun.syndication.feed.synd and uses the SyndEntry and SyndFeed interfaces.
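In the meantime, here is a minimal sketch of the idea (not my production class): a SAX handler counts items and throws a custom exception as soon as it sees a pubDate at or before the newest date we already have, which aborts the parse; with a real HttpClient stream you would abort the connection at that point too. The class name, `countNewItems`, and the ISO-style date strings are assumptions for illustration only; real RSS uses RFC 822 dates and needs a DateFormat.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

public class RssCutoffParser {
    /** Thrown to abort the SAX parse (and thus the download) early. */
    static class StopParsing extends SAXException {
        StopParsing() { super("cutoff reached"); }
    }

    static class Handler extends DefaultHandler {
        final String cutoffDate;                   // newest date we already have
        final StringBuilder text = new StringBuilder();
        int newItems = 0;
        boolean inPubDate = false;

        Handler(String cutoffDate) { this.cutoffDate = cutoffDate; }

        @Override public void startElement(String uri, String local, String qName, Attributes a) {
            if ("pubDate".equals(qName)) { inPubDate = true; text.setLength(0); }
        }
        @Override public void characters(char[] ch, int start, int len) {
            if (inPubDate) text.append(ch, start, len); // may arrive in chunks
        }
        @Override public void endElement(String uri, String local, String qName) throws SAXException {
            if ("pubDate".equals(qName)) {
                inPubDate = false;
                // ISO-style strings compare lexicographically; stop at the
                // first item that is not newer than what we already have.
                if (text.toString().compareTo(cutoffDate) <= 0) throw new StopParsing();
                newItems++;
            }
        }
    }

    /** Counts items newer than cutoff; parsing stops at the first old one. */
    static int countNewItems(String xml, String cutoff) throws Exception {
        Handler h = new Handler(cutoff);
        SAXParser p = SAXParserFactory.newInstance().newSAXParser();
        try {
            p.parse(new ByteArrayInputStream(xml.getBytes("UTF-8")), h);
        } catch (StopParsing expected) {
            // Aborted early; with an HTTP stream we would close the connection here.
        }
        return h.newItems;
    }

    public static void main(String[] args) throws Exception {
        String feed = "<rss><channel>"
                + "<item><pubDate>2009-10-18</pubDate></item>"
                + "<item><pubDate>2009-10-16</pubDate></item>"
                + "<item><pubDate>2009-10-15</pubDate></item>"
                + "</channel></rss>";
        System.out.println(countNewItems(feed, "2009-10-17")); // prints 1
    }
}
```

The key point is that SAX is streaming: once the exception propagates out of `endElement`, the parser stops pulling bytes from the input stream, so the rest of the feed is never downloaded.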

May 10, 2008

simple script to merge commits from a bugzilla id

Filed under: maintainability, programming — mufumbo @ 9:15 pm

Today I wrote my first Perl script!

For me it is very painful when the time comes to merge, into another branch, all the commits I have made on the trunk. I searched around a little and did not find anything that could magically solve all my problems. I know it's better to create a separate branch when there are lots of commits, but in some cases a super-simple feature can explode into a big ball of mud.

Practically, the script merges all the commits for a Bugzilla id into another branch. If someone knows a standard way to do this, please tell me!

The script takes three inputs:

  1. The starting revision ID to filter the search.
  2. The SVN address of the source.
  3. The search string to filter the results. Here you put your bugzilla bug id.

To run the script:

  1. Go to the directory of the destination branch.
  2. Execute the script: svn_search_merge.pl 0 https://svn.example.com/main/trunk/ “1: ”

Note that “1: ” is the Bugzilla bug id. What happens next is:

  1. The script runs svn log -r 1:HEAD https://svn.example.com/main/trunk/ to get the log of all commits from revision 1 to HEAD.
  2. For each log entry, it checks whether the string “1: ” appears in the commit message.
  3. When it matches, the script simply executes svn merge -r (ACTUAL_REVISION-1):ACTUAL_REVISION https://svn.example.com/main/trunk/

Source code of the script:

#!/usr/bin/perl

# Simple script to merge commits from a source branch to the current destination directory.
# https://mufumbo.wordpress.com/2008/05/10/simple-script-to-merge-commits-from-a-bugzilla-id/
#
# Example:
# $ cd my-branch-destination/
# $ svn_search_merge.pl 3000 https://svn.example.com/main/trunk/ "bug 673"
# Where 3000 is the starting revision and "bug 673" is the string to match in the comments.
#
use strict;
use warnings;

my $prev_revision = shift;
my $svnHost = shift;
my $searchStr = shift;

print "Starting Revision: $prev_revision\n";
print "SVN addr: $svnHost\n";
print "Search pattern: $searchStr\n";

my $buffer;
$buffer = `svn log -r $prev_revision:HEAD $svnHost`;
my $shouldContinue = "y";
LOGS: foreach my $changelog_entry (split(/----+/m, $buffer)) {
    if($changelog_entry =~ m/($searchStr)/) {
            #my (undef, $info, undef, $comment) = split(/\n/, $changelog_entry);
            #next unless $info =~ m/^r/;

        print "\n--------------------------------------------------";
        print $changelog_entry;
        my $revisionId = substr($changelog_entry, 2, 5);
        $revisionId =~ s/^\s+//;
        $revisionId =~ s/\s+$//;

        if ($shouldContinue ne 'a') {
            PROMPT: while(1) {
                print "\nShould continue with merge of revision '$revisionId'? (Yes,Always,Skip,Exit): ";
                $shouldContinue = <STDIN>;
                chomp($shouldContinue);

                last PROMPT if $shouldContinue eq 'y';
                last PROMPT if $shouldContinue eq 'a';
                next LOGS if $shouldContinue eq 's';
                die("User requested to stop.") if $shouldContinue eq 'e';
            }
        }
        else {
            print "\nAuto merging '$revisionId'\n";
        }

        my $pRevisionId = $revisionId-1;
        my $mergeBuffer = `svn merge -r $pRevisionId:$revisionId $svnHost`;
        print $mergeBuffer;
    }
}
