On the eve of a new decade, I thought it might be fitting to share a cautionary tale about dates and time.
An iPhone app that I was developing had to send data to a server. Among the data items was a timestamp.
As a best practice during development, the app was tested on several types of devices with different versions of the OS. It was working great. Except on one particular iPhone.
There was nothing obviously different about this device. Same hardware and same OS as on other devices that were working fine. We restarted the device and reinstalled the app several times. Still it refused to communicate properly with the server.
I’ll spare you the details of the hours of debugging that followed, and skip directly to the solution…
Sending timestamps between different systems is a common source of confusion and errors. For this app we had settled on expressing the time as the number of seconds since the Unix epoch (January 1, 1970). This value was then to be sent as a URL parameter in an HTTP GET request. This is a convenient way to transmit a timestamp between systems, since most programming languages have ways to create a date and time object from this long value.
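To make this concrete, here is a minimal sketch of how such a request might be built on the iPhone side. The endpoint URL and the ts parameter name are made up for illustration; they are not the actual API the app talked to.

    NSTimeInterval now = [[NSDate date] timeIntervalSince1970];
    long long timestamp = (long long)now; // whole seconds since January 1, 1970 GMT
    // Hypothetical endpoint and parameter name, for illustration only.
    NSString *urlString = [NSString stringWithFormat:
        @"http://example.com/api/report?ts=%lld", timestamp];
    NSURLRequest *request =
        [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];
    // ... hand the request to NSURLConnection (or similar) to perform the GET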
If you print out [[NSDate date] timeIntervalSince1970] you will see a 10-digit number. And that was what the server was expecting. However, if you look at the number (1262122135 as I'm writing this) you'll notice that it wasn't that long ago that the value went from 9 to 10 digits. In fact, this happened on the 9th of September 2001 at 01:46:40 GMT, the moment the epoch offset passed 1,000,000,000 seconds.
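You can verify the rollover point yourself with a quick one-liner (nothing app-specific here):

    // One billion seconds after the Unix epoch — the moment timestamps grew to 10 digits.
    NSDate *rollover = [NSDate dateWithTimeIntervalSince1970:1000000000];
    NSLog(@"%@", rollover); // a date in September 2001: 01:46:40 GMT on the 9th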
Upon further examination, the obstinate iPhone that refused to run the app correctly turned out to have its system clock set to early 2001. Thus the server call contained a 9-digit timestamp instead of the expected 10 digits, and that is what caused the failure.
The moral of the story: never trust any data that you do not fully control. User input is such an obvious example that most developers always validate it. But the system clock can also be set by the user, and should therefore not be implicitly trusted either.
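One way to apply that lesson here would be a simple sanity check before trusting the device clock. This is only a sketch; the cutoff date is an arbitrary choice (anything safely before the app could plausibly be running will do), not something from the original app:

    NSTimeInterval seconds = [[NSDate date] timeIntervalSince1970];
    // 1230768000 = January 1, 2009 00:00:00 GMT — an arbitrary "this clock is clearly wrong" cutoff.
    if (seconds < 1230768000) {
        // The clock is set in the past: warn the user, refuse to send,
        // or let the server stamp the request with its own time instead.
        NSLog(@"Device clock looks wrong: %.0f seconds since the epoch", seconds);
    }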