Swift and BigData – SwiftLondon Meetup

Talk for the Christmas Get-Together Swift London Meetup

This talk grew out of the blog post below, where I tried to show a different approach to using the Swift programming language, focused especially on memory management and performance.

iOS Data Collection Using Swift

Here at MyDrive we are completely data-driven and as such we need to collect
as much data as possible. This data is then analysed by our data scientists and/or
used for Machine Learning purposes. To collect all that data we have designed and implemented our
own iOS data collection app (a.k.a. the iOS awesome app) that silently records
all our movements (lovely, isn’t it?).

Now let’s get to the technical part. The application is written in Apple’s
brand new programming language, Swift. We started developing it while the
language was still in beta, which meant parts of our codebase had to be
rewritten to adopt the changes introduced by each new beta release.

This was my first hands-on experience with Swift, and I have to say that I like it,
although I liked it even more in one of the betas than in the current first release.
The idea of writing

if var v = some_method() {

}

or

if some_var? {

}

rather than:

var v = some_method()
if v != nil {

}

was, in my opinion, really a cleaner way of testing for nil, but for some reason
Apple decided to remove that feature.

Apart from that detail, I think the language has some ‘must have’ and cool
features such as closure support, tuples, multiple return values, type
inference and functional programming patterns. It also has some less appealing
ones, like optionals, confusing casting errors, separate internal and external
parameter names for functions, and the fact that it is still very young, so
things are likely to change in the short to mid term.
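A quick illustration of a couple of those features, tuples as lightweight multiple return values plus type inference and closures (an illustrative snippet in current Swift syntax, not code from the app):

```swift
// Tuples give multiple return values without defining a wrapper type.
func minMax(_ values: [Int]) -> (min: Int, max: Int)? {
    guard let first = values.first else { return nil }
    var result = (min: first, max: first)
    for v in values.dropFirst() {
        result.min = min(result.min, v)
        result.max = max(result.max, v)
    }
    return result
}

// Type inference plus a closure: no explicit types needed anywhere here.
let squares = [1, 2, 3].map { $0 * $0 }

if let bounds = minMax([3, -1, 7]) {
    print(bounds.min, bounds.max) // -1 7
}
print(squares) // [1, 4, 9]
```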

With my first impressions of Swift out of the way, let’s move on to the so-called
iOS awesome app, and in particular to its most technically interesting part:
how we managed to record accelerometer data for hours without running out of
memory. As you may have guessed, our first approach was to store all the
accelerometer observations in an array and then, once recording finished,
dump it for gzipping and upload to cloud storage.

That ‘all in memory’ brute-force approach worked reasonably well while we were
collecting data at 1 Hz, but as soon as we needed to record at higher frequencies
(30 and 60 Hz), problems appeared: at 60 Hz, a couple of hours of recording means
over 400,000 observations held in memory at once.

After spotting the cause of the issue, we decided to create a custom NSOperation,
running on a queue other than the main NSOperationQueue, that periodically dumps
the contents of the array holding the accelerometer data to a file on disk
through an NSOutputStream. That worked fine, except that after some time using
the app we realised the last batch wasn’t being fully dumped: we had failed to
wait for the dumping queue to finish before reading the file back for gzipping.
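The fix boiled down to blocking on the dump queue before touching the file. A sketch of the idea, with hypothetical names and current API spelling:

```swift
import Foundation

// Hypothetical setup: the queue holding the pending CSV dump operations.
let dumpQueue = OperationQueue()
dumpQueue.maxConcurrentOperationCount = 1 // dumps must stay sequential

func finishRecording() {
    // Block until every pending dump has completed; without this, the last
    // batch may still be in flight when the file is opened for gzipping.
    dumpQueue.waitUntilAllOperationsAreFinished()
    // ... now it is safe to read the file back, gzip it and upload it ...
}
```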

Once solved, the code looks more or less like this:

func addDataRow(...) {
  data.append(...)

  if data.count >= 300 {
    let toBeDump = data
    data = [Row]()
    dumpArray(toBeDump)
  }
}

This piece of code simply appends data to an array and, when it reaches a certain
size (300 rows here), copies it to another variable and resets it so it never grows too large.

func dumpArray(data: [Row]) {
  let op = CSVDumpOperation(file: filePath, data: data)
  if lastOp != nil && !lastOp!.finished {
    op.addDependency(lastOp!)
  }
  lastOp = op
  dumpQueue.addOperation(op)
}

The copy is then handed to a custom NSOperation to be dumped to disk outside the
main operation queue. These operations are chained with dependencies so they
execute sequentially and the data is never written out of order.
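Chaining each operation to the previous one works, but the same ordering guarantee can be had more simply by making the queue serial; a sketch of this alternative (not the original code, and using current API spelling):

```swift
import Foundation

// A queue with maxConcurrentOperationCount = 1 runs its operations one at a
// time, and operations of equal priority start in the order they were added,
// so no explicit dependencies between consecutive dumps are needed.
let serialDumpQueue = OperationQueue()
serialDumpQueue.maxConcurrentOperationCount = 1

var written: [Int] = []
for batch in 0..<5 {
    serialDumpQueue.addOperation { written.append(batch) }
}
serialDumpQueue.waitUntilAllOperationsAreFinished()
print(written) // batches complete in submission order: [0, 1, 2, 3, 4]
```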

The dump operation looks like this:

class CSVDumpOperation: NSOperation {

  let data: [Row]
  let os: NSOutputStream

  init(file: String, data: [Row]) {
    os = NSOutputStream(toFileAtPath: file, append: true)!
    os.open()
    self.data = data

    super.init()
  }

  override func main() {
    for row in data {
      let rowStr = "\(row.x),\(row.y),\(row.z)...\n"
      if let rowData = rowStr.dataUsingEncoding(NSUTF8StringEncoding, allowLossyConversion: false) {
        let bytes = UnsafePointer<UInt8>(rowData.bytes)
        os.write(bytes, maxLength: rowData.length)
      }
    }

    os.close()
  }
}

This CSVDumpOperation simply opens an NSOutputStream to the file and writes the
CSV-formatted contents of the given array to it.

And that’s it! With this simple approach, this simple application lets us
collect hundreds of hours of different activities for further analysis.

MIMO Masters – iOS Module II (2014)

I was happy to come back to MIMO Masters, and even better, this time with a whole module! During this 20-hour module I lectured on CoreData, MapKit, CoreLocation, Push Notifications, Sensors and iCloud.

Rails Push Notifications Gem

Over the last few months I’ve come across several Ruby backend projects that all required at least one common thing: mobile device push notifications.

Although you can find several open source gems out there, I found none that was fully featured, bug free and suitable for both iOS and Android, so I thought a new one could be a very good contribution to the community. After some time working on it and testing it, here is the result.

The Rails Push Notifications Gem is an intuitive and easy-to-use gem that allows you to integrate both iOS and Android push notifications into your Rails project.

Rails Push Notifications Gem supports:

  • Multiple iOS and Android applications/configurations
  • Ad-hoc and bulk notifications
  • Fully customisable notification contents
  • Both Apple sandbox and production environments
  • Apple’s feedback service

When sending a notification (or batch of notifications), you get feedback on whether each push succeeded or failed.

At the moment, I’m using it successfully in two live products and having a great experience.

Visit the GitHub repository at: https://github.com/calonso/rails-push-notifications for detailed installation instructions and a step by step usage example and, please, feel free to leave a comment here if you need help or just to tell us about your experience using it!

Enjoy.


Enumerados

Enumerados (Numbered) is, by far, the largest project that we faced at Unkasoft. It is a multiplayer board game whose goal is to build simple arithmetic operations (+ and -) by crossing the tiles in your hand with those already on the board (like Scrabble, but with numbers).

From the technical point of view it involved three different developments: a backend and two native frontends (iOS and Android). I personally built the entire backend, to which I was almost fully dedicated for the last two years. Lots of iterations, changes, migrations and new features were required to reach the final 3.7.1 version.

The backend is fully built with Ruby on Rails, starting at version 3.1 and finishing at 3.2.13. All the infrastructure was hosted at Heroku, where it used 4 web dynos on average, a single worker dyno and other add-ons such as AdeptScale, PGBackups, Scheduler, New Relic, Blitz, RedisToGo and a Crane PostgreSQL database.

The communication with mobile clients is done through an HTTP REST JSON API and, as the whole project evolved, the server ended up serving 3 different API versions concurrently.

By far, the most interesting and time consuming part of the development has been the optimisation, at all levels:

  • Database: Mainly using New Relic and the Postgres ‘EXPLAIN ANALYZE’ command, I designed structural modifications, query optimisations, PL/pgSQL procedures, etc…
  • Application: Again, New Relic played a key role in spotting the worst-performing transactions and bottlenecks, along with profiling and benchmarking runs on my local development machine and the Heroku Blitz add-on for load testing. At this level I implemented asynchronous query handling, background pre-calculations, cache-based (sync and async) solutions where possible, and some algorithm and data structure optimisations. I also improved database usage.
  • API and communication with mobile frontends: Here, using user feedback and data provided by Google Analytics and New Relic’s native mobile plugins, we improved the communication layer by merging/splitting queries, preloading some of them in the background, etc…

Another thing that I am very proud of is how the push notifications feature evolved. At the very beginning I used two different Rails gems, one for the Apple integration and another for Google. I chose third-party open source gems mainly because I expected it would take longer to develop, test and stabilise a new one from scratch. But after a while using them I started finding bugs and missing features that mattered to us, so I built a new Rails Push Notifications Gem from scratch, which after some time I open sourced on my GitHub profile.

Going back to the game itself, here you can find the download links:

Also, here you can see some screenshots of the game.

iOS Augmented Reality Engine

Some time ago I was asked to build an iPhone application with augmented reality capabilities (the typical feature that shows stores around the user’s location).

I used the http://www.iphonear.org/ project as the starting point for my solution, but I soon realised that it wouldn’t fully cover my requirements, so I started implementing my own ARKit engine. It has since become a fully featured and easy-to-use iOS engine, which I have open sourced, and I’m really proud of the results.

Its core features are:

  • Fully compatible with all versions of iOS from 5.0 onwards.
  • Supports all orientations.
  • Scales and rotates displayed objects based on distance and orientation.
  • Allows user interaction with displayed objects.
  • Displayed objects’ positions are updated live as the user moves.
  • Supports and distinguishes both front-looking and floor-looking modes.
  • Allows any custom overlay views.
  • Builtin support for radar view.
  • Fully customisable.

So far I have used it in these iOS applications, with really good results:

Go to the iOS ARKit GitHub repo to learn more or try it. I’ve open sourced it along with a demo project that shows how to use it.


iOS Paginas Amarillas (Yell)

Páginas Amarillas’ ‘Cerca de mi’ iOS application is one of the latest applications that we developed at Unkasoft, and probably one of the most complete. It includes lots of features and took a great deal of time and effort to develop, but I think the final result was really worth it.

As a non-technical comment, that project was particularly difficult for us because it was developed by the Unkasoft IT team while all the project management came from the client. I wouldn’t like to generalise, and there are certainly exceptions, but from that experience I learnt that this kind of IT team outsourcing is something companies should handle very carefully. As you can probably guess, in the early stages everything is courtesy and polite words, but as deadlines approach that lovely environment turns into a very uncomfortable and sometimes insane one: irrational pressure, the typical critical features that no one ever mentioned until demo day, big changes required for yesterday, and so on.

But let’s move on to some technical stuff. The application is the iOS frontend for this business search company, so we had to integrate with their old-style SOAP XML API. Beyond that, it was all about different ways of presenting search results and letting the user interact with them (comments, photos, …).

The application includes the typical list, map, routes and augmented reality sections for displaying search results. The latter is the one I am most proud of, for two reasons: it was the most technically challenging, and it was my biggest contribution to the project.

This Augmented Reality section uses my iOS Augmented Reality Engine, which I had started developing some time ago for a previous project and which I finally finished and tuned so it could be plugged into this one.

After some time and a bit more work on it I open sourced it under my Github account. You can see more about it here.

Going back to the full application, here you can see some links, some screenshots and a YouTube promotional video which mainly shows the Augmented Reality section. 🙂


McDonald’s 30th Anniversary

At Unkasoft, we were very proud to work with clients such as McDonald’s. In this case, we were responsible for developing the promotional application for the company’s 30th anniversary in Spain. I was the developer of the native iOS application and, although it looks very simple, it had a fair amount of functionality, mainly because it had three operation modes based on date and location. The application was built for a particular promotion held on a specific date at a specific place (Plaza de Callao, Madrid, Spain), so depending on the date (before, during or after the promotion) and the location (at Callao or not), the application behaved differently.

The result was a heavily promoted application (thanks to the brand, of course), pretty nice looking and with a very cool and curious feature: a candle-blowing simulator :).

The whole promotion, and the application, was built around the idea of the anniversary, and the chosen way of celebrating it was a cake with candles to blow out (hard to guess, huh?). The special thing is that, during the promotion, users present at Callao with the promo app installed could help blow out the candles on an enormous cake shown on an advertising panel, by literally blowing at the candles on their own smartphones! (Watch the video below for more.)

I went for the most realistic approach possible. First, the candles had to look like real burning candles, and I think I achieved a very good simulation using a particle system built with the Cocos2D engine. The second important part was their behaviour in response to the user’s blow: using the iPhone/iPad microphone, I ‘listen’ to the incoming noise and, depending on its intensity, the candles (or the smoke, if they have already been put out) vibrate, move or go out.
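As a sketch of that intensity-to-behaviour mapping (not the original code: AVAudioRecorder’s metering API reports average power in decibels, roughly -160 dB for silence up to 0 dB, and the thresholds below are invented for the example):

```swift
// Possible reactions of an on-screen candle to microphone input.
enum CandleReaction { case steady, flicker, blownOut }

// Map an average-power reading (in dB, as returned by e.g.
// AVAudioRecorder's averagePower(forChannel:)) to a candle reaction.
// The -30/-10 dB thresholds are made up for illustration.
func reaction(toAveragePower dB: Float) -> CandleReaction {
    switch dB {
    case ..<(-30): return .steady    // quiet: flame burns normally
    case ..<(-10): return .flicker   // some noise: flame vibrates and leans
    default:       return .blownOut  // loud blow: extinguish, show smoke
    }
}
```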

You can see the candles in action and the result of the whole marketing campaign in the videos below, along with app screenshots.


EcoCaixa

This time I want to show you a really simple 2D game. The client was a well-known Spanish bank that wanted to promote recycling among children.

In the game, various types of recyclable objects fall from the top of the screen and the player has to drag each of them to the corresponding recycling cube.

To make things a bit more complicated, recycling cubes swap their positions from time to time.

This game was developed from scratch, using nothing beyond Apple’s built-in frameworks (CoreGraphics, QuartzCore, …). The thing I’m most proud of in this project is having found (probably) the best way of moving layers around the screen with really high performance.


[CATransaction begin];
[CATransaction setValue:(id)kCFBooleanTrue forKey:kCATransactionDisableActions];
// Move CALayers here ...
[CATransaction commit];
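For reference, the same pattern in Swift (QuartzCore is Apple-platform only, hence the availability guard):

```swift
#if canImport(QuartzCore)
import QuartzCore

// Wrapping layer mutations in a transaction with implicit actions disabled
// skips Core Animation's default 0.25 s animations, so high-frequency
// position updates are applied immediately.
func moveLayer(_ layer: CALayer, to position: CGPoint) {
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    layer.position = position
    CATransaction.commit()
}
#endif
```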

Links:

Here you can see some screenshots: