New Swift language shows Apple history

Swift still reflects conventions going back to Apple’s original adoption of Objective-C

A closer look at Swift

Like many Apple programmers (and new programmers who are curious about iOS), I treated Apple’s Swift language as a breath of fresh air. I welcomed it when Vandad Nahavandipoor updated his persistently popular iOS Programming Cookbook to cover Swift exclusively. But I soon realized that the LLVM compiler and the iOS runtime have persistent attributes of their own that do not go away when programmers adopt Swift. This post tries to alert new iOS programmers to the idiosyncrasies of the runtime that they still need to learn.


Like all interactive systems, iOS notifies classes when something they care intensely about (a button press by a user, for instance) has occurred. To create a responsive app, you certainly want to know as soon as a user has pressed a button. Unlike many other user interface frameworks, iOS asks you to designate an object, called a delegate, to receive the all-important event telling you that the user has asked you for something. Sometimes a class can serve as its own delegate, but in every case you must understand the concept of a “delegate” and define one to receive the critical event.

Apple enthusiasts like to present the delegate notion (I refuse to enshrine it through the term “pattern”) as a nice division of responsibilities. I haven’t seen any explanation of why other systems, which also have to respond to user events, can let the class containing the important user interface elements respond to its own events. The delegate notion is an Apple oddity that survived the transition from Objective-C to Swift. Go along with it and learn to implement it.
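The notion is easier to see than to describe. Here is a minimal sketch in Swift; the Button, ButtonDelegate, and Screen names are hypothetical illustrations, not UIKit types:

```swift
// The event source declares a protocol describing what its delegate must handle.
protocol ButtonDelegate {
    func buttonPressed()
}

class Button {
    var delegate: ButtonDelegate?
    func press() {
        // Forward the event to whatever object registered as delegate.
        delegate?.buttonPressed()
    }
}

// Any class can volunteer as the delegate by conforming to the protocol.
class Screen: ButtonDelegate {
    var pressCount = 0
    func buttonPressed() { pressCount += 1 }
}

let screen = Screen()
let button = Button()
button.delegate = screen   // the screen registers itself
button.press()             // screen.pressCount is now 1
```

The button never needs to know what kind of object is listening; it only needs something that conforms to the protocol.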

Memory management

A bit of history for those who are not too impatient. Objective-C was a mostly ignored language in the 1980s (those who wanted C to support objects turned to C++, which had its own dragons), but was implemented by the Free Software Foundation in its GNU compiler, a great advance over other C compilers at that time. Steve Jobs reportedly had a soft spot for the GNU compiler and therefore based his mostly forgotten NeXT computer on Objective-C.

This historical footnote seems to be the only reason Objective-C, well supported by the LLVM compiler, became the official iPhone language, and thus, by some reckonings, why Objective-C went from one of the most irrelevant computer languages to one of the most critical ones to learn after the iPhone was released.

Thanks for your patience. Now, what was I talking about? (Andy, check your heading. Memory management.) Objective-C on the Mac offered garbage collection, but Apple rejected its use in iOS, because it can slow down an app at a critical moment. When you write your app in, say, Java, you never know when garbage collection will rear its head and take over the CPU to handle background logistics of no interest to the user. So Apple eschewed garbage collection, but instituted a fairly sophisticated memory management system of its own: automatic reference counting.

Please excuse another digression: why is memory management important? Well, C and C++ leave memory management up to the programmer, and the result is scads of apps that don’t release memory when they should. I have personally experienced the slowdown and eventual hanging of my laptop due to leaks from poorly managed memory; I’m sure others have too. If you need to reboot your system every week or so, it’s probably because some program depended on programmer-controlled memory management and introduced a bug that leaks memory.

Memory management is hard because programs can’t depend on the programming concept of scope to get rid of unwanted memory. Scope is supposed to protect you from memory leaks by removing the memory allocated to the data that a function or loop defines. But many functions return variables to their calling functions, and these variables must be kept in memory as long as they’re used. When can they be released? What if no function has clear ownership of a variable and no one ever releases it? Allocations pile up until memory runs out.

What does Apple do to improve memory management? Its automatic reference counting releases variables that are used only locally as soon as they go out of scope (good design), but an object stays in memory as long as any strong reference to it survives, and two objects holding strong references to each other form a cycle that is never released. You can specify, through the weak and unowned keywords, that a reference should not keep its object alive. If you use these keywords correctly, you won’t load down the user’s device with fatal memory leaks.
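A minimal illustration of the weak keyword; the Image type here is a hypothetical stand-in for any class:

```swift
class Image {
    let name: String
    init(name: String) { self.name = name }
}

var strong: Image? = Image(name: "photo.png")  // strong: keeps the object alive
weak var cached = strong                        // weak: does not

// The moment the last strong reference disappears, the object is freed,
// and the weak reference quietly becomes nil instead of dangling.
strong = nil
// cached is now nil
```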

It turns out that weak variables are critical to using delegates without introducing memory leaks. You need to learn to use delegates, so you need to learn iOS’s memory management system and weak variables. Bite the bullet.
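Here is a sketch of why the two go together, using hypothetical Worker and Controller types: the controller owns the worker, and the worker’s back-reference to its delegate is declared weak so the pair don’t keep each other alive forever.

```swift
protocol WorkerDelegate: AnyObject {
    func workDidFinish()
}

final class Worker {
    // weak breaks the cycle: the controller owns the worker,
    // and the worker points back without retaining the controller.
    weak var delegate: WorkerDelegate?
}

final class Controller: WorkerDelegate {
    let worker = Worker()
    init() { worker.delegate = self }
    func workDidFinish() {}
}

var controller: Controller? = Controller()
weak var probe: Controller? = controller
controller = nil
// probe is nil: the controller was freed. With a strong delegate
// reference, the Worker/Controller pair would have leaked.
```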

Optional variables

Swift also preserved another oddity of the iOS runtime: the use of optional variables.

A typical optional variable refers to some data allocated by your app. Suppose you try to retrieve an image or some other data, which may or may not exist. Success results in a variable referring to the image or other data. Failure is conveniently indicated by a special value called nil.
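The standard library’s failable Int initializer shows the pattern in miniature:

```swift
// Int's failable initializer returns Int?: a value on success, nil on failure.
let parsed = Int("42")      // Optional(42)
let failed = Int("forty")   // nil: no integer could be produced
```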

Nil is hard to explain. It’s kind of a lacuna in the universe. If you have a counter, for instance, it can reach zero, and you expect it to do so. A nil counter is very different–it means the counter has no meaning, could not possibly represent a value, and is invalid wherever you refer to it.

Relational databases allow NULL values, which don’t indicate zero, but instead “This has no meaning.” That’s a very powerful concept, but one that easily leads to errors. For instance, if you use a relational database such as Oracle or MySQL and don’t check for NULL values, you may get erroneous results.

The seminal C language, which formed the context in which other modern languages grew and is still used for key infrastructure, has long struggled with NULL values. Dereferencing a NULL pointer is still one of the most common errors in C, and programmers are advised to write code checking that a pointer is not NULL before looking for data there.

Fast-forward to iOS. It offers optional variables, which can either be nil or contain an actual value. (I’ll spare you the pedantic discussion of the meaning of nil and NULL.) A few other languages offer a similar “optional” concept, notably Haskell (its Maybe type) and the relatively little used OCaml, but it remains unfamiliar to most mainstream programmers.

Optional variables make sense in situations such as when you request a file or a resource over the Internet and you can’t get access to it. If the request fails, you get nil. So you have to be constantly alert to the possibility that an optional variable might be nil. Swift provides the ? character to reach safely through a value that might be nil, and the ! character to say, “Don’t worry; I’ve checked this optional value and it has real information in it.”
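A short sketch of the three idioms; the URL string is just an example value:

```swift
let urlString: String? = "https://example.com"

// ? chains safely: the whole expression becomes nil if urlString is nil.
let length = urlString?.count   // Optional(19)

// if let unwraps only when a value is actually present.
if let s = urlString {
    print("got \(s)")
}

// ! asserts "I've checked; this is not nil" -- and crashes if you're wrong.
let sure: String = urlString!
```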

In C, this kind of check is dangerous because an “if” statement can’t tell the difference between a zero for a numerical value, a false Boolean value, and a NULL pointer: all of them test false. Swift closes that hole. An optional can’t be used as a truth value at all; the compiler forces you to compare it against nil, or unwrap it, before you use it.

This also resolves what looks like a theoretical weakness of optional variables (how can you distinguish the legitimate value of zero from nil?). You already know the difference between a counter and a value you retrieve from an outside source, such as a web page. You know when you are checking a counter, and know what to do when it reaches zero. You know, in contrast, when you are checking a value retrieved from an Internet operation, where nil indicates some network failure. And because Swift makes you state which check you mean, your program will never confuse a zero counter with a nil Internet value.
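A sketch of the distinction, with hypothetical variable names:

```swift
let zeroCounter: Int? = 0    // a real value that happens to be zero
let noAnswer: Int? = nil     // no value at all

// An optional cannot serve as a truth value; the compiler demands an
// explicit nil comparison or unwrap, so zero is never mistaken for nil.
let counterIsPresent = (zeroCounter != nil)   // true: 0 is a real value
let answerIsPresent = (noAnswer != nil)       // false: nothing there
```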

The bottom line is to understand optional values in Swift and to handle them respectfully. Few mainstream languages or environments require this, although the concept of nil or NULL is nearly universal.

Conventional language features

In this article, I’ve tried to highlight iOS oddities that might slow down programmers who have studied other modern languages. Much of Swift will be comfortable to programmers who have kept up to date with modern programming practices. A few such features include:

  • Closures (called blocks in Objective-C), which Swift uses for callbacks.

  • Named and default arguments, which allow methods to be loaded down with optional arguments. The C language offers a roughly equivalent paradigm, which is to accept a struct as a single argument, but many modern language libraries prefer to string out separate arguments.

  • Immutable variables, which are most useful in functional programming contexts, but are offered by other languages as well.

  • Protocols, the Apple term for what Java calls interfaces.
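A compact sketch touching each of these features; the Greeter and Console names are hypothetical:

```swift
// Protocol: what Java would call an interface.
protocol Greeter {
    func greet(name: String) -> String
}

struct Console: Greeter {
    let prefix = "Hello"   // immutable stored value, declared with let

    // Default argument: callers may omit the name.
    func greet(name: String = "world") -> String {
        return "\(prefix), \(name)"
    }
}

// Closure used as a callback.
let shout: (String) -> String = { $0.uppercased() }
let message = shout(Console().greet())   // "HELLO, WORLD"
```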

If you feel comfortable with the concepts in this article, along with Apple’s conventions for long-named functions, arguments, and data constants, Swift should not present a high hurdle.

Editor’s note: Dive deeper into Swift with “Swift Development with Cocoa” by Paris Buttfield-Addison, Jonathon Manning, and Tim Nugent.

Public domain objective lens illustration courtesy of Internet Archive.



  • RahoulB

    Not often I comment on things, but you should do some reading on the history of Xerox, Apple, Objective-C and particularly Smalltalk.

    The “modern” practices (apart from immutability which is just as old), long-named functions plus Steve Jobs’ choice of Obj-C for NeXT all stem from Smalltalk and that visit to Xerox PARC.

    • dannyo152

      Yep. Also the “gcc soft spot” Steve Jobs supposedly had overlooks that Apple was sued and forced to follow the GPL with language and gcc changes. Once that was settled, Apple was faithful to the license, but GPL antipathy may explain why Apple (under Jobs’ leadership) loved non-GPL llvm/clang.

      NeXT hardly remembered? Maybe statistically among the 7 billion souls on this planet. But every book that teaches Cocoa programming mentions NeXT roots. GNU folks weren’t so enamored with [target selector: param] but rather with Objective-C as part of the NeXTStep development environment, replicated as GNUStep. Every biography of Steve Jobs written post-iPod mentioned prominently the NeXT years, mainly so as to simply explain why the second Apple run was so successful. If people don’t remember NeXT after that, then they don’t care about history. For these people, NeXT is as forgotten as Alan Turing or Grace Hopper, which is to say being forgotten is not a sign of lack of influence.

      Optionals are very much found in Haskell as the Maybe monad. Perhaps Mr. Oram has not studied as many languages, even paradigm-establishing languages (C, Smalltalk, Lisp, Fortran, Haskell), as he thinks.

      (Haskell and OCaml derive from ML. Those who would identify that as a better substitute for Haskell are welcome to do so. The identification of the monad — I still don’t really get it and these days prefer the heterogeneous lists of Racket — and the way it makes the developer change perspectives on programming is why I included Haskell, though I’ve never looked at ML and, therefore, may be saying something completely wrong.)

  • While your points about optionals might’ve been true in earlier versions of Swift (and Xcode), they were not correct at the time this article was published. The boolean value true and the number zero are no longer equal to false. (Also note you have an extra open brace in your code snippet.)
