Apple: We'll tailor Swift to be a fast new programming language

Apple stunned the audience at its Worldwide Developers Conference in San Francisco on Monday with a tool few expected: a new programming language for iOS and OS X software called Swift. There already is a programming language called Swift that was developed by the University of Chicago and Argonne National Laboratory for use …

COMMENTS

  1. ThomH

    What do you do when your commercial skills become obsolete?

    Read quickly, obviously.

    Honestly, the new language looks pretty good from the thirty-or-so pages I've read so far but I've yet to get to anything particularly complicated. E.g. if it uses the same runtime as Objective-C — reference counting (automatic or otherwise) rather than garbage collecting — then is it still a programmer's responsibility to avoid retain cycles? That's the main area where I felt the existing runtime (rather than the language) was looking kind of historic.
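
    From a quick skim further on, the answer looks like yes: ARC carries over, strong reference cycles included. A minimal sketch of the kind of cycle I mean (class and property names are mine):

    class Node {
        var next: Node?
    }

    var a: Node? = Node()
    var b: Node? = Node()
    a!.next = b    // a retains b
    b!.next = a    // b retains a: a cycle
    a = nil
    b = nil        // neither object is deallocated; the pair leaks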

    1. SuccessCase

      Re: What do you do when your commercial skills become obsolete?

      You either have the possibility of the inattentive generating retain cycles, or the inexactitude of garbage collection; there is no middle ground that doesn't make decisions you should be aware of and making yourself. Given the choice between the two, ARC, for the serious programmer, is IMHO the infinitely superior option because you have full and complete control. Plus the memory analysis tools are superb, so you can easily identify any accidental retain cycles. The only problem with ARC is that the use case for understanding reference counting is rarely encountered, but when it is, reference counting techniques need to be fully understood. That's OK for me and, from the sounds of it, you also, because we were raised with it. But for programmers who have never had to learn the art I feel truly sorry. 99% of the time they will never need the understanding, but when they do, boy will life be difficult for them.

  2. Anonymous Coward
    Anonymous Coward

    Swift looks...

    ...very much like a strongly typed Python.

    1. Fluffy Bunny
      Meh

      Re: Swift looks...

      That's interesting, because I very much got the feel of BASIC, probably because of how variables and constants are declared. It would be a very easy language to port over to a real computer platform like Windows.

      1. MacroRodent

        Re: Swift looks...

        "It would be a very easy language to port it over to a real computer platform like Windows."

        And risk getting hounded by Apple lawyers? No thanks. I will stick to languages that are not proprietary and owned by litigious corporations. There are still plenty of good choices.

        1. Ian Michael Gumby

          Re: Swift looks...

          If memory serves... Objective-C is part of gcc, so the core of Objective-C is open source. What is closed source is the libraries.

          So if you start with the gcc compiler (Apple/NeXT had to open source those bits under the GPL), use the same open APIs, and write your own libraries from scratch, you can do the same thing.

          IANAL so don't take this as legal advice, just that you could do it if you wanted to do it. It doesn't mean Apple won't attempt to sue you anyway...

          1. joeldillon

            Re: Swift looks...

            Apple not liking the GPL is why they switched from gcc to rolling their own C++ and Objective-C compiler with Clang. Swift is not a gcc front end, it's an LLVM front end, so they're under no obligation to release the source to any of it.

            You could, of course, write your own compiler from scratch that follows the same language spec (I find it doubtful that you can copyright a programming language's syntax) but that's a lot of effort for not much gain, especially since you'd also need to duplicate Cocoa in order to actually port any Swift programs from the Mac (and as Oracle has proved, you can at least make expensive court cases out of that).

            1. Yet Another Anonymous coward Silver badge

              I find it doubtful that you can copyright a programming language's syntax

              That's not what Oracle say about Java.

              Apple could sell Swift to some evil supervillain years down the road who could claim that they own your independent implementation.

              1. joeldillon

                Re: I find it doubtful that you can copyright a programming language's syntax

                You're exactly proving my point. Oracle aren't suing on the grounds that Google copied Java (the language itself), they're claiming Google copied the APIs of Java's standard library (which is why I referenced Oracle when discussing Cocoa).

                (And why posit a supervillain? Apple are plenty litigious themselves, at least as much so as Oracle)

              2. Walt French

                Re: I find it doubtful that you can copyright a programming language's syntax

                Oracle's case against Google was always based on the organization and implementation of the libraries, not the Java language itself.

                Might be a fine point for a programmer who doesn't care about what is in “the language” and what is in the libraries that make the language useful, but it's key for any organization big enough to actually consider implementing it.

            2. Ian Michael Gumby

              @Joe Re: Swift looks...

              And this is why I said ... If memory serves... :-)

              Sorry, but I'm dating myself: my Objective-C knowledge is from my NeXTStep days, way back in the early 90s.

              To the points raised... when a single company owns, controls and is the only adopter of a language, it never bodes well in the long run. Even if it's Microsoft or Apple.

            3. Walt French

              Cross-Platform Looks About as Easy as it Ever Gets

              It's definitely not worth MY time to write a Swift compiler, but LLVM is distributed under a very permissive license, and as others note, copyrighting a language seems just a tad unlikely.

              Seems that any firm with interest in having Swift on its platform (or, any firm wanting to add a Swift compiler to its existing stock of tools) should have relatively little difficulty. For instance, Google already uses the LLVM framework for much of its work; if they deemed the language to offer actual performance advantages over other languages, or if they wanted to facilitate cross-platform products, why wouldn't they?

              Seems interest is the key issue. NIH is strong in this world.

          2. Bronek Kozicki
            Windows

            Re: Swift looks...

            Yes ... the front end for Objective-C is included in the gcc suite, but it receives very little attention or maintenance and is no longer fully compatible with the "reference" implementation, which is in LLVM/Clang.

            If one company "owns" a language they usually need only one compiler and will rarely spend the money to help maintain alternative platform(s). Look at Apple, Oracle (Sun before it), etc. Microsoft's involvement with Xamarin is the exception to the rule. Still, if I only knew .NET, the icon above demonstrates what I would look like today (not that I look much better, but at least the clothes are clean most of the time).

            1. Ian 55

              If one company "owns" a language..

              .. it is a very good reason not to use it.

              They will, at some point, abandon it or do something else that breaks old code.

              Still, it's not as if there isn't a history of some people accepting the mantraps in an Apple walled garden.

              1. BlueGreen

                Re: If one company "owns" a language..

                > .. it is a very good reason not to use it.

                Word, bro! And that word is Dylan. Also developed by Apple, then dropped, burning a lot of devs in the process.

                BTW Dylan was a pretty nice dialect of Lisp.

              2. alcalde

                Re: If one company "owns" a language..

                Oh, now you've done it. You're going to get the Delphi users hounding you about why you should indeed use a corporate-controlled language (and pay $1500+ to do it). If you're in luck they'll also tell you that automatic memory management is for stupid people, type inference is "impossible" and that modern language constructs are "fads" added to "look cool" (all things I've been told on their forum).

  3. JDX Gold badge

    Just what the world needs

    Yet another new language. Apple are on thin ice using Objective-C already in my eyes, even MS don't force you to use their own MS-only language (.NET is only one option of many).

    Google did Go - which hasn't caught on. MS have gained traction with C# but that's really the stand-out counter-example in terms of a new "next best thing" language that has been adopted into the mainstream.

    1. Byz

      Re: Just what the world needs

      I train Objective-C; it is currently one of the most used languages due to the App Store.

      I've had a look at Swift and it is similar to Scala, so it will be straightforward for coders to convert.

      It seems to remove all the brackets :)

      1. JLV
        Thumb Up

        >It seems to remove all the brackets :)

        Getting rid of brackets would be great. Last time I tried putting together an iPhone app, Objective-C's syntax made my head swim.

        I liked the Objective-C <=> iOS API and integration model, very elegant. And I didn't mind Objective-C for its C ancestry - coming from Python I actually find plain old C rather elegant, simple and expressive, though I don't use it much.

        But I am also slightly dyslexic and just can't wrap my head easily around languages that use brackets and parentheses in exotic ways. Reading and coding Objective-C wasn't quite as bad as Lisp, but the bracket syntax just kept on getting in the way, so I bailed on my personal project. Yes, I could have used C, but almost everything is documented in ObjC and the iOS ObjC actually does a pretty good job of looking after a lot of low-level C-ish stuff.

        If it seems Swift is here to stay and good enough for purpose, I may have another look at iOS/Mac apps. Plus, marrying Python-type simplicity with C-type speed? Marriage made in heaven.

        As far as the world needing another language... look at ObjC's usage level. Huge, and that language was probably pretty minor before the app store made it popular. Swift, if well done, could easily get itself a pretty big mindshare.

        I think there is a huge opportunity to develop pedal-to-the-metal compiled languages that take on some of the elegant simplicity we have seen in scripting languages and their built-in shortcuts. I.e. something that layers Python/Ruby ideas on C, but eschews C++'s complexity and Java's obsequiousness. I had been thinking of looking at D at some point, just to step out of my usual scripting-language focus, but this may be a good alternative too.

        Apologies in advance to all folks who hate Apple on principle. Should be toeing your line, my bad.

        1. E 2

          complexity and obsequiousness

          Vs. toadying.

          WTH is wrong with brackets?

          At least they save me from diverse tab settings.

          1. Charlie Clark Silver badge

            Re: complexity and obsequiousness

            WTH is wrong with brackets?

            The standard answer is that the brackets are there for the compiler not the programmer. People tend to prefer other methods for structuring blocks, but if you like them then good luck to you.

            1. Anonymous Coward
              Anonymous Coward

              Re: complexity and obsequiousness

              "The standard answer is that the brackets are there for the compiler not the programmer."

              Block structuring based on typographical conventions may be convenient for a subset of programmers, but it is even more arbitrary than brackets and braces. In effect, the syntax of space indented languages implies an underlying grid on the page - a hangover from teletypes - whereas bracket syntax continues to work without any kind of underlying grid. It's much more mathematically (and syntactically) satisfying for that reason.

              1. Charlie Clark Silver badge

                Re: complexity and obsequiousness

                It's much more mathematically (and syntactically) satisfying for that reason.

                I suspect it depends very much on what you're programming: whether you're essentially programming mathematical functions or doing something else. Which is why we use brackets sparingly when writing. And even then it remains a personal preference.

                Indentation is no more arbitrary for structure, as it has to be consistent in order to be machine readable.

              2. Psmo

                Re: complexity and obsequiousness

                "In effect, the syntax of space indented languages implies an underlying grid on the page - a hangover from teletypes"

                Nope. More like a hangover from school where some teachers got you to structure your written notes leaving lines and space, so that logical flow was obvious and so that comments and marking didn't mess things up.

                " - whereas bracket syntax continues to work without any kind of underlying grid."

                Except that which is applied (differently) by every developer. More in a minute.

                "It's much more mathematically (and syntactically) satisfying for that reason.It's much more mathematically (and syntactically) satisfying for that reason."

                Perhaps you've never had to maintain or debug code written by 4 different people over the course of a decade and reindent several hundred lines of code because someone, somewhere, changed the logical flow due to a misplaced bracket or inconsistent bracketing style. Personally, I enjoy being able to read, reuse and maintain code I (and others) wrote years ago.

                1. JDX Gold badge

                  Re: complexity and obsequiousness

                  "Perhaps you've never had to maintain or debug code written by 4 different people over the course of a decade and reindent several hundred lines of code because someone, somewhere, changed the logical flow due to a misplaced bracket or inconsistent bracketing style. Personally, I enjoy being able to read, reuse and maintain code I (and others) wrote years ago."

                  If only there were incredibly easy ways to automatically impose a standard formatting... or if only it were possible for a lead developer to require people follow a convention.

                  Messy code serves a purpose: it tells you that the project has probably not been run well and you should be wary.

                2. Anonymous Coward
                  Anonymous Coward

                  Re: complexity and obsequiousness

                  "reindent several hundred lines of code "

                  This is what we have IDEs for. You set up your formatting conventions, you open the source code, you click on a menu item...everything is now laid out how you like it and you can check the control flow. But if the control flow is hard to follow, chances are some refactoring is needed. Which the IDE makes rather easy, especially if there is a set of coding standards that can be applied. I've cleaned up after enough departed programmers to speak from extensive experience.

                  Yes I have encountered programmers who regard line numbers as sacred, so don't allow any reformatting. And also programmers who insist on applying every single suggestion Netbeans offers, thus refactoring code without first checking with their superiors that this is OK. But let's face it, anybody who is using line numbers to maintain control of OO code is probably writing the wrong language, and the reckless editor needs to be taken aside and have the basis on which he gets paid (it's going to be a he) explained to him.

                3. Anonymous Coward
                  Anonymous Coward

                  Re: complexity and obsequiousness

                  "Perhaps you've never had to maintain or debug code written by 4 different people over the course of a decade and reindent several hundred lines of code because someone, somewhere, changed the logical flow due to a misplaced bracket or inconsistent bracketing style"

                  Perhaps you've never had to maintain or debug code that was working perfectly but now isn't because someone accidentally deleted a tab on the last line of a block which brought the line into the outer scope. Still perfectly syntactically correct, the code just doesn't work properly and there's no obvious reason why. Try deleting a single bracket in C++ and try compiling and see what happens.

                  Whitespace defined blocks are an abortion and no sane language designer would ever use them.

                  1. Psmo

                    Re: complexity and obsequiousness

                    "but now isn't because someone accidentally deleted a tab on the last line of a block which brought the line into the outer scope. Still perfectly syntatically correct , the code just doesn't work properly and there's no obvious reason why. Try deleting a single bracket in C++ and try compiling and see what happens."

                    My experience of multi-dev Python projects is limited, but that's probably why the Python style guide tells you to use spaces, not tabs. Cool, didn't get that before - thanks.

                  2. alcalde

                    Re: complexity and obsequiousness

                    1) Your code should be using tabs, not spaces.

                    2) There would be an obvious reason, because simply looking at the code will reveal the problem. The indentation can't lie to you.

                    3) Try having code where the indentation does not match the brackets. Raymond Hettinger tells of a large, complex C project with a huge cascading if section. The developers knew this was trouble, ran it through a prettifier (yes C developers recognize in their hearts the importance of indentation) and used extra scrutiny on this section of code. The indentation STILL managed to lie to them and ended up costing them over a million dollars thanks to the error!

                    4) Whitespace defined blocks are not "an abortion"; they've not only been adopted by one of the most successful languages, but other languages like Haskell have (optionally) adopted them as well. Today it's seen as a bold and innovative move and one of the major strengths of the Python language.

                    1. Anonymous Coward
                      Anonymous Coward

                      Re: complexity and obsequiousness

                      "2) There would be an obvious reason why because simply looking at the code will reveal the problem. The indentation can't lie to you."

                      Really? Is this last line in the correct place or not?

                      d = datetime.date(.....
                      t = datetime.timedelta(1)
                      for days in xrange(1,5):
                      ..do something
                      ..d -= t
                      print d.strftime("%Y/%m/%d")

                      With a missing bracket it would be obvious. With Python, unless you look through the rest of the code, it's simply not possible to tell.

                      "Today it's seen as a bold and innovative move and one the major strengths of the Python language."

                      Well, it's not innovative, since Occam was using it back in the late 80s, but it's certainly bold, since it's trouble waiting to happen, as I've found out. People use Python despite the block delimiting system, not because of it. There's a good reason 99% of programming languages, even new ones - languages designed by experts in their field - do NOT use whitespace block delimiting.

              3. PyLETS
                Coffee/keyboard

                Re: complexity and obsequiousness

                Block structure based on typographical convention has the interesting effect of encouraging you to use the main roads more because of these minor speed bumps, rather than trying to construct very long journeys using seemingly more familiar and understandable but minor and tangled roads and lanes.

                For projects requiring more than a couple of hundred lines of code, you should generally be focussed on the source files, packages, modules, classes and objects relating to the problem and solution, not on how you get function, loop and branch control done in order to patch together something that just about works but is neither scalable nor maintainable.

              4. Anonymous Coward
                Anonymous Coward

                Re: complexity and obsequiousness

                ... but the code is then impossible to parse as a pattern and requires mental gymnastics to visualise. Indenting has an important place, and it has nothing to do with the teletype.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: complexity and obsequiousness

                  Indenting is good for human readability - completely grant you that.

                  Visible character based block delimiters ensure that the UI element is separated from the syntactical element, but IDEs can then display code using whatever UI structure you prefer. The entire line and indent structure can be removed to represent an entire class as a single line of text, and can then be reconstructed visually at load time by the IDE. Isn't that good?

              5. alcalde

                Re: complexity and obsequiousness

                This doesn't follow for me at all. It's not about "typographical conventions"; it's about indentation levels. It's about the human eye. Even with brackets or begin...end pairs it's expected to indent properly anyway. Sadly, that's once for the compiler, once for the human. Syntactically significant whitespace lets you employ one means for both. Try reading a block of code with all of the spacing removed. As someone once put it, if you have a "lawyer's eye" you'll eventually be able to parse out what's going on. Now take the same code and leave the spacing but remove the block delimiters (braces or begin...end). Its meaning will still be crystal clear to you. Hence, it's absurd to argue that bracket syntax is in any way more satisfying.
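
                To make that concrete in the language at hand: the first fragment below is legal Swift with the indentation stripped, and the second keeps the indentation but drops the delimiters (deliberately not valid Swift, purely to make the visual point):

                for i in 1...3 {
                if i % 2 == 0 {
                println("even")
                } else {
                println("odd")
                }
                }

                for i in 1...3
                    if i % 2 == 0
                        println("even")
                    else
                        println("odd")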

            2. Anonymous Coward
              Anonymous Coward

              Re: complexity and obsequiousness

              The standard answer is that the brackets are there for the compiler not the programmer. People tend to prefer other methods for structuring blocks, but if you like them then good luck to you.

              +1

              As someone from the Pascal and then VB camp (and SQL), I still prefer Begins and Ends over harder-to-see spidery little { } things. Personal preference I know, but it's easier to see with my eyes.

              It's pretty sad in this day and age that languages aren't versatile enough to accept either, with the IDE showing you the ones you prefer. Hardly rocket science, dare I suggest.

              1. Dexter

                Re: complexity and obsequiousness

                As someone from the Pascal and then VB camp (and SQL), I still prefer Begins and Ends over harder-to-see spidery little { } things. Personal preference I know, but it's easier to see with my eyes.

                -------------------------

                #define begin {
                #define end }

                1. Sander van der Wal
                  Angel

                  Compound statements considered harmful.

                  Who cares how a compound statement is formulated, when the problem is that it exists? Compound statements are actually the root of all evil in programming, after the banishment of goto.

                  In Oberon it was done properly: statement lists. No BEGIN or {, except at the start of a procedure. A couple of keywords terminated a statement list, and END was one of them, next to ELSE and possibly UNTIL (I forget whether Oberon has a REPEAT UNTIL).

                  1. John Smith 19 Gold badge
                    Unhappy

                    Re: Compound statements considered harmful.

                    "In Oberon it was done properly: statement lists. No BEGIN or {, except at the star of a procedure. A couple of keywords terminated a statement list, and END was one of them, next to ELSE and possibly UNTIL (forgot whether Oberon has a REPEAT UNTIL)."

                    That's beautiful.

                    Remind me how many software projects have been built in Oberon.

                    Major corporate users?

        2. Fluffy Bunny
          Meh

          Re: >It seems to remove all the brackets :)

          Frankly, the only reason I see for learning Swift would be if it was a cross-platform tool. As an iPhone-only thing, you would just keep using a C variant.

          In its day, BASIC was the tool of choice. Not because it was powerful, but because it was 1) easy to learn; and 2) ubiquitous - it would run on anything with only a little modification.

        3. Tom 7

          Re: >It seems to remove all the brackets :)

          Get rid of brackets? Why - it seems like a good thing to do for the lazy, but have you ever tried, say, to do a web server page in Python? Piece of piss in PHP, but Python - any more than a couple of indentations and the meaning is lost. OK, you can put in brackets, but there's a false fear of them from fairy tales told by bad programmers, and most Python programmers seem to hide under the sheets when they are around.

          1. Charles Manning

            Python done right

            We had a discussion over this language yesterday and the consensus is that it fixes both the things that are broken in Python:

            1) Typing.

            2) Adding braces. Indenting can be screwed up badly. Braces fix that.

            1. alcalde

              Re: Python done right

              1. It's static typing that's screwed up, not Python's. About 80% of all "patterns" exist to get around the static typing straitjacket that one chooses to slip on in the first place to avoid errors that almost never exist and, if they do, are easily spotted.

              2. Indenting CAN'T be screwed up in Python. It's syntactically significant. With braces the indentation can lie to you. In Python, it can never lie to you.

              for (i=0; i<10; i++);
                  printf("Good morning\n");

              The indentation is lying to you about what's going to happen. This can't happen in Python. What will happen is always what it looks like will happen.
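
              Swift, for what it's worth, appears to close this particular trap: braces are mandatory around loop bodies, so a stray semicolon can't silently empty the loop (a sketch, assuming the published beta syntax):

              for i in 1...10 {
                  println("Good morning")
              }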

      2. Anonymous Coward
        Anonymous Coward

        Re: Just what the world needs

        "I train Objective-C it is currently one of the most used languages due to the app store."

        Perhaps in your world that means it's the most used, but back in the real world the Apple App Store is a piss in a pond compared to all the other programs being written and running in the world that weren't written in Obj-C. Stuff you never get to hear about unless you apply for a job, such as internal bank systems, or actuator control units, or airport baggage handlers, etc etc.

    2. ThomH

      Re: Just what the world needs

      In what sense does Apple 'force' you to use Objective-C? You can use as much C or C++ code as you like without having to hop any sort of barriers — the three can natively call each other directly, in the same source files. No managed/unmanaged border, no wrapper libraries, straight calls.

      Of the two on-the-box options offered by OS X 10.0 and 10.1 — Objective-C and Java — developers picked the former. Not Apple.

      As to whether the world needs a new language? It doesn't. But Apple needed one and decided they had to engineer it themselves because (i) they wanted it exactly to fit the existing runtime; and (ii) Apple usually thinks that way anyway.

      1. JDX Gold badge

        Re: Just what the world needs

        "In what sense does Apple 'force' you to use Objective-C? You can use as much C or C++ code as you like without having to hop any sort of barriers"

        How do I call system APIs using native C++? I genuinely thought I couldn't - you end up with C++/OBJ-C files which are normal C++ with lines like:

        UIInterfaceOrientation interfaceOrientation = [UIApplication sharedApplication].statusBarOrientation;

        How do I do that in pure C++, i.e. set the .cpp file as C++ rather than C++/OBJ-C in XCode? Maybe it's just because all the sample code I saw uses OBJ-C.

        1. ThomH

          Re: Just what the world needs (@JDX)

          From native C or C++? With the incredibly arbitrary restriction that the feature Apple specifically supplies for this — the ability to make Objective-C calls arbitrarily at any place within C++ code — isn't to be used?

          You'd use the C-level entry points to the Objective-C runtime: https://developer.apple.com/library/mac/documentation/Cocoa/Reference/ObjCRuntimeRef/Reference/reference.html

          You're working up to an objc_msgSend, probably, but you can also use class_getMethodImplementation to get the C function pointer for any method on any class. Then you just need to remember to specify the instance of the class as the first parameter, the selector for the method as the second and the other arguments in sequence after that (it's a va_list).

          For getting selectors, metaclasses and suchlike you'd use NSClassFromString, NSSelectorFromString and the gang. They take NSStrings as arguments but NSString is toll-free bridged — i.e. the two look the same in memory so just cast the pointer — with CFString, which is a pure-C API. So just use CFString.

    3. Adam 1

      Re: Just what the world needs

      .NET is not a language, it is a runtime. There is a pretty low-level language, MSIL, that many excellent C# programmers haven't even heard of; it is what your C# or VB.NET or whatever compiles into - the native tongue of the framework, if you like - which then JIT-compiles to assembler (usually x86/x64).

      But the point is that I can write some fancy component suite in C#** and you could buy it and use it within a VB.NET codebase (or J# or whatever floats your boat)

      ** I probably couldn't

    4. Jolyon Smith

      Um, Apple don't force anyone to use Objective-C

      You can also use Oxygene (Object Pascal) or Hydrogene (C#) for Cocoa and/or CocoaTouch development. (or .NET or Java / Android for that matter).

      www.remobjects.com/elements

    5. Charlie Clark Silver badge

      Re: Just what the world needs

      Go is finding fans all over the place, though mainly in systems, because of its concurrency support. C# is, er, a rehash of Java for the MS runtime.

      1. JDX Gold badge

        Re: Just what the world needs

        >>Go is finding fans all over the place, though mainly in systems, because of its concurrency support.

        A few users is not a success. Where is Go in real-world use, something like 0.01% of developers? I bet there are more active Haskell developers.

        >>C# is, er, a rehash of Java for the MS runtime.

        Yes, done very well. Who cares whether it's a brand new paradigm, as long as it works - which it does. It's very unashamedly "a better Java".

        1. SuccessCase

          Re: Just what the world needs

          Yes C# is one of the few developments to come out of Microsoft having a high degree of intellectual purity and rigour. It was designed by one of the best in the business and is IMO a very, very good language. And I'm saying that as someone who detests most things Microsoft.

        2. Charlie Clark Silver badge

          Re: Just what the world needs

          A few users is not a success. Where is Go in real-world use, something like 0.01% of developers? I bet there are more active Haskell developers.

          No idea and no idea how anyone can reliably collect such statistics: what about all the shell and embedded stuff that never appears online?

          There probably are more people involved with Haskell than Go, though that probably has a lot to do with the fact that it's taught at quite a lot of universities. The most recent Go conference managed to garner some 700 participants and I assume some of the sponsors are using it; we know Google and Canonical are.

    6. Mage Silver badge

      Re: Just what the world needs

      And C# is really MS's concept of Java, derived from J++.

      It's not that bad, and it does have Mono to run it outside .NET.

    7. Greg J Preece

      Re: Just what the world needs

      MS have gained traction with C# but that's really the stand-out counter-example in terms of a new "next best thing" language that has been adopted into the mainstream.

      And let's be honest, while C# is a good language, that's also partly down to what came before it being so hideous.

  4. Snowy Silver badge

    The name...

    It would be good if they'd used a new name for their new programming language rather than one already used by someone else's programming language. Did no one check the name to see if it was already in use?

    1. Anonymous Coward
      Anonymous Coward

      Re: The name...

      The other lot can just change their name. It's no big deal.

      Yours, S. Jobs

    2. Jordan Davenport

      Re: The name...

      The name's being in use already for similar products didn't seem to stop them with the iPhone or iOS...

    3. Jolyon Smith

      Re: The name...

      Yet another link with remobjects (of Oxygene/Hydrogene fame)...

      When Google decided they wanted to call their browser "Chrome", they had to ask RemObjects to change the name of their programming language. (I believe they were compensated for doing so, but even so, it didn't seem to occur to Google to simply change the name of their browser before it was released.) :)

    4. Anonymous Coward
      Anonymous Coward

      Re: The name...

      What, like Google did with Go? Oh...

      http://en.wikipedia.org/wiki/Go!_(programming_language)#Conflict_with_Google

    5. Adam T

      Re: The name...

      They did the same thing to Versions (the SVN client) when they implemented their fuck-me-where-is-Save-As?? feature. Nobody batted an eyelid.

      Long shadows cover a lot of indiscretions.

  5. Dan 55 Silver badge
    Meh

    No need to be so special, Apple

    The use case they give for Swift is quick game development, but that's already been done elsewhere (Lua), and for everything else there's a proper language like C++. If Apple had made them both first-class languages in Xcode everybody would have been happy.

    Oh, and the last time someone decided to make semicolons optional, it didn't work out too well (JavaScript).

    1. JLV

      Re: No need to be so special, Apple

      Lua is a scripting language. Not the same thing at all, re. speed.

      1. Dan 55 Silver badge

        Re: No need to be so special, Apple

        That doesn't really matter with LLVM; Lua and other scripting languages can be compiled.

        1. John Gamble

          Re: No need to be so special, Apple

          Yes, the LLVM aspect caught my attention too. But there are multiple languages now that can use LLVM (including C), so while it's a nice feature in a new language, it's not trend-setting.

          Name-collision problem aside, it looks nice, but so do Google's and Microsoft's offerings, and that's just the corporate side -- the independents' languages are also going strong (yes, even the ones with the scary punctuation marks).

    2. Nigel 11

      Re: No need to be so special, Apple

      Oh, and the last time someone decided to make semicolons optional, it didn't work out too well (JavaScript).

      Works out pretty well in Python. Given tuple assignment you don't often need semicolons, but you can put multiple statements on a line if you want to.

      As for Swift, I lost interest the moment I noticed that variable names are Unicode strings, not ASCII-alphanumeric strings. Bleugh. Immediate fragmentation of the programming world into human-written-language-script communities. I can process code written by (say) a Frenchman or a Finn. The variable names may be less helpful than ones created by a Canadian or an Ozzie, but at least the necessary processing skill is there in my visual cortex. Which it is not, for a string of Chinese, Japanese, Korean, Tamil, or umpteen other possibilities.

      To say nothing of the fact that there are multiple Unicode strings that generate the same visual representation (such as an e with an acute accent). It's bad enough dealing with O and 0, 1 and l and I. FAlL.
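
      For concreteness, this is the sort of thing Swift will accept (the identifiers are my own illustration):

      let π = 3.14159
      let 你好 = "hello"

      Perfectly legal, and exactly the fragmentation I mean.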

      1. Psmo
        Headmaster

        Re: No need to be so special, Apple

        Python v3 integrates Unicode identifiers too. Here's the official rationale:

        <PEP_3131>Python code is written by many people in the world who are not familiar with the English language, or even well-acquainted with the Latin writing system. Such developers often desire to define classes and functions with names in their native languages, rather than having to come up with an (often incorrect) English translation of the concept they want to name. By using identifiers in their native language, code clarity and maintainability of the code among speakers of that language improves.

        For some languages, common transliteration systems exist (in particular, for the Latin-based writing systems). For other languages, users have larger difficulties to use Latin to write their native words.</PEP_3131>

        1. Nigel 11

          Re: No need to be so special, Apple

          I think it's a mistake.

          Dealing with a small set of foreign glyphs that are universal in a global programming community is far better than the fragmentation that arises if every programmer uses their own script for their variables. It'll compile elsewhere, but it might as well be object code for all the use that the source will be outside that linguistic domain. I'll add that anyone who studies mathematics gets to learn the Greek alphabet, a few letters from the Hebrew one, and a handful of symbols not taken from any alphabet (e.g. union, infinity, ...). It doesn't give Greeks or Israelis any mathematical edge.

          I can imagine an alternative universe in which North America was settled by Russians. In that universe, the Cyrillic alphabet might be used globally by programmers. I'd be able to go along with that: learning to recognise a handful of new glyphs isn't hard.

          But learning 6000+ traditional Chinese glyphs in order to code: no way. I'd rebel and create a programming language based on the Latin alphabet. As for those in the far East... well, China, Japan and Korea have all chosen to map their languages onto the Latin alphabet. Because we got to IT first, or because there are intrinsic advantages to our small alphabet over their huge ones? I don't know, but in China this happened under Mao, when the West was the Enemy and before IT arrived there.

          1. Horridbloke
            Boffin

            Re: No need to be so special, Apple

            <PEDANT>

            Don't lump all those "far east" languages together...The Korean Hangul alphabet is small and reasonably phonetic.

            </PEDANT>

          2. Anonymous Coward
            Anonymous Coward

            @Nigel11 Re: No need to be so special, Apple

            wow, racism disguised as an argument about programming languages

            1. Anonymous Coward
              Anonymous Coward

              Re: @Nigel11 No need to be so special, Apple

              Racism? I see no racism in that post.

              Please explain.

            2. Anonymous Coward
              Anonymous Coward

              Re: @Nigel11 No need to be so special, Apple

              "wow, racism disguised as an argument about programming languages"

              Grow up you silly student.

          3. Ken Hagan Gold badge

            Re: No need to be so special, Apple

            "Don't know, but in China, this happened under Mao when the West was the Enemy, and before IT arrived there."

            Back in 1949, I don't think the West was the enemy. Japan and the Nationalists were the enemy. Marxism was openly acknowledged as a European idea, which is why Mao spoke of "socialism with Chinese characteristics". Mao was also friendly with Soviet Russia, which *we* might not think of as a western nation but the Chinese certainly do.

            Consequently, 1949 saw the adoption of the Gregorian calendar and the creation of "simplified Chinese" (as distinct from the "traditional Chinese" still used elsewhere). These were presented as good ideas from the West at a time in history when it was still possible to give credit where it was due.

          4. Anonymous Coward
            Anonymous Coward

            Re: No need to be so special, Apple

            > I think it's a mistake.

            Don't worry - Google will soon extend Google Translate to translate the code into English (or whatever) for you.

      2. Gary Bickford

        Programming languages could support internationalization

        Internationalization of programming languages should be fairly easy to do.

        I thought about doing the following for PHP a number of years ago. Have, as an optional first line, an instruction saying what language the file is written in. Provide support in the parser for the use of an internationalization table, like what is presently common in web pages, frameworks, and many software packages.

        This would require two sets of tables, one for the language's reserved words, and one for variable names defined by the programmer. The first would be essential, the second useful. For full support, it would also be useful to have either an identifier tag in every comment pointing to its translation, or to inline multiple languages in the comment.

        That's really all it would take, and then a code file could be managed simultaneously by developers in different languages. A good IDE could even make a stab at translating variables and comments on the fly. (I know, hilarity would ensue - but that would make programming more fun!)

        This is so simple, it's rather interesting/odd that it hasn't been done already (AFAIK).
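
        A sketch of the reserved-words half of the idea, written in Swift purely as illustration (every name below is invented):

        let keywordTable = [
            "si": "if",         // localized source keyword -> canonical keyword
            "sinon": "else",
            "tantque": "while"
        ]

        The parser would consult a table like this while tokenizing, before anything else happens.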

      3. JLV
        Thumb Down

        Re: No need to be so special, Apple

        >Immediate fragmentation of the programming world into human-written-language-script communities

        I fail to understand your point.

        Both of us would hate to read code written with Mandarin characters. So?

        Don't write your code with Mandarin characters. Avoid third party and OSS code using Mandarin characters.

        How is this radically different from avoiding ASCII-only programs written in languages other than English, unless that other language is one you are familiar with? How many German proggies are on GitHub? I personally avoid programs written in French as well, though I am bilingual - it makes it harder to collaborate with others.

        Giving Chinese coders freedom to write things the way they prefer in no way takes away any of your freedoms. Doesn't mean I am forced to pay or use any Chinese proggies myself, does it?

        It's a big world. Live and let live.

    3. Ian Joyner Bronze badge

      Re: No need to be so special, Apple

      Semicolons were introduced in ALGOL to make multiline statements possible.

      JavaScript might have made a mess of removing them, but it is entirely possible to have such a regular syntax in a language that they are not needed.

      Look at Eiffel for a decent language where semicolons are not needed.
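
      Swift itself appears to manage the same trick (a sketch, assuming the published syntax):

      let (x, y) = (1, 2)     // tuple assignment, no semicolons required
      let a = 3; let b = 4    // a semicolon only to put two statements on one line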

  6. h3

    What about just straight C ?

    1. stucs201
      Coat

      re: straight C

      Huh? Look at it. Definitely a curved letter. Nothing straight about it at all.

    2. Destroy All Monsters Silver badge
      Holmes

      What about just straight C ?

      Frankly, you are part of the problem.

      C was something like a cutter from a gore flick in the 80's. It still is now.

      The only good thing about it is the brevity of the manual. Which is, of course, very important.

      1. Anonymous Coward
        Anonymous Coward

        @Destroy All Monsters

        I'm wondering what C did to you to cause you to be so bitter about it.

        1. Anonymous Coward
          Anonymous Coward

          @Daniel Palmer

          It's too difficult for him to learn. Poor baby.

          1. Psmo

            No one really recovers from their first unexplained segfault on a programming assignment, do they?

            1. Anonymous Coward
              Anonymous Coward

              >No one really recovers from their first unexplained segfault

              > on a programming assignment, do they?

              If you are taught C but not taught how to use a debugger then you need a better teacher, I guess.

              C doesn't have super-fancy exception handling, and most of the time it's up to the OS to catch your mistakes, but you can make the same mistakes in almost any language. Your first segfault should serve to teach you how to find your mistake with a debugger (or the raw address and addr2line), what you did wrong, and why not to do it again in the future.

              I actually like turning up weird issues like compiler optimization errors - like GCC breaking alignment rules for floats on some ARM machines. Issues like that have taught me a fair bit and earned me some cash in the process.

          2. JDX Gold badge

            re:It's too difficult for him to learn. Poor baby.

            Considering C is easier to learn than C++ or Java, that seems a rather weak attempt at an arrogant put-down. C is one of the easier languages because there is so little to it, IMO. I learned C after BASIC and found it easy enough; learning C++ was a bigger jump.

            1. John Gamble

              Re: re:It's too difficult for him to learn. Poor baby.

              "Considering C is easier to learn than C++, or Java, ..."

              Hmm, agree with you on C++, but not on Java, which is pretty easy to learn (note that this says nothing about the relative merits of the languages). The only stumbling block I had was with the memory model, and it's a block that lasted all of five minutes before I got it sorted out.

              And I'm not sure that "easy to learn" should be a deciding factor in language use anyway.

            2. Fibbles

              Re: re:It's too difficult for him to learn. Poor baby.

              "Considering C is easier to learn than C++"

              I'd say C++11 is easier than straight C.

              "Wrap that in a smart pointer" is far easier than "malloc and for the love of god remember to free".

              Still wouldn't teach it as a first language though.

      2. Nigel 11

        C

        C is one of a small set of languages in which it's possible to write a useful operating system kernel. Don't knock it. But also don't use it, if you're not writing something that requires OS-like control over the fine detail of the generated code. And for heaven's sake don't teach it as a first language.

    3. lurker

      The main thing wrong with C is that life's simply too short. Operating systems written in C - great. Fart apps? Overkill really. Just not the right tool for the job.

    4. Ian Joyner Bronze badge

      C is an old language with holes so big you can drive a truck through them. C was developed on the very limited PDP-11 architecture and it shows. Time to move on, folks.

  7. stucs201
    Joke

    Is it a proper programming language?

    It's said it's possible to write FORTRAN in any language. So: can you? If not, it's no good.

    1. Jordan Davenport
      Joke

      Re: Is it a proper programming language?

      println("FORTRAN")

      1. Nigel 11

        Re: Is it a proper programming language?

        PRINT*,"Swift"

  8. Anonymous Coward
    Anonymous Coward

    "Cupertino spent years developing Swift as a 21st-century language that's suitable for everything from short programs to entire operating systems."

    How on earth do Apple keep initiatives like this from leaking to the press for such a long period of time?

    1. ThomH

      There was quite a lot of speculation around 2010 of a full-scale switch to Ruby; I guess Apple ended up deciding that they liked the idea but wanted a bit more control and a completely native coupling to the existing runtime?

      1. Charlie Clark Silver badge

        I don't think there was ever a real chance for that to happen and the fact that the presentation makes no reference to Ruby says a lot.

        Much as I like Python as a language, I also have to admit that it isn't always the best choice. Google still makes extensive use of it internally but developed Go for the heavy-lifting systems work. But having a simpler syntax for app development (assuming the runtime can handle most of the necessary magic) might work quite nicely.

      2. hammarbtyp

        "but wanted a bit more control"

        Ha Ha Ha Ha Ha ...Oh sorry that wasn't an attempt at irony

    2. TheOtherHobbes

      >How on earth do Apple keep initiatives like this from leaking to the press for such a long period of time ?

      Apple pretty much owns the LLVM team, which is not big anyway.

      The only possible leaks would have come from the third-party game houses, and I doubt they had many people working on this.

      So probably only 20-30 people were in on the secret.

      It's also why Apple dumped gcc. (That and crap performance.) With LLVM as the back-end, it's easy(ish) to put a new parser on the front-end without alarming anyone.

    3. Anonymous Coward
      Anonymous Coward

      another brick in the wall ?

      The press are also behind that Great Wall of China.

      Time for a cuppa.

  9. tempemeaty

    Interesting how this creates another advantage for Apple

    It looks to me like a good way to get programs running faster on Apple hardware in a way that Microsoft can't follow on their own. I doubt Microsoft has the ability to create a good, faster language of its own like this.

    1. JLV

      Re: Interesting how this creates another advantage for Apple

      >a good faster language

      Hold your horses. This is a big step beyond slideware, true, and it looks interesting. But let's see how it performs in real life first, before endorsing it as a good language.

      Does it actually work as claimed? Will coders be productive in it? Will there be sufficient libraries and batteries provided? Will Apple support it sufficiently? Will you be able to get 3rd party tools & libraries? Will it be used on a large scale?

      Time will tell.

      Besides D, Go is also similar, but doesn't seem to be getting much traction either yet. To an extent, I guess that's because when we interact with Google's ecosystem we can mostly do it in JavaScript, other scripting languages or even Java. So the use case for Go is not as clear cut.

      Finally, MS's C# is not that bad a language from what I've seen. Syntactically at least, irrespective of reliance on MS libraries, shifting APIs, Visual Studio shenanigans and other aggravations.

    2. Fluffy Bunny
      Thumb Up

      Re: Interesting how this creates another advantage for Apple

      How about trying Swift.net... Don't tell me Apple has trademarked that? Ok, let's go with S#.

      1. Destroy All Monsters Silver badge

        Re: Interesting how this creates another advantage for Apple

        That looks like shit.

  10. Anonymous Coward
    Anonymous Coward

    I'm sure there's nothing at all retarded

    about the Objective-C strawman implementation of RC4.

    1. Kevin McMurtrie Silver badge

      Re: I'm sure there's nothing at all retarded

      And that brutally whipped Python example too. JIT-compiled languages are anywhere from several times faster to millions of times slower than the reference C implementation, depending on code style.

      Objective-C is so ugly that I can't stand to look at it. Creating a new language that nobody has seen before is an odd solution. Was it Steve's dying wish to kill Google's Go?

      1. E 2

        Re: I'm sure there's nothing at all retarded

        Apple merely stealing a page from Microsoft's book: rename the APIs and make a new language.

        And then some magical thinking... by the time the furor and Apple Tame Press and Apple fanboys quiet down, the Ghost of Jobs will have offered inspiration for New Product.

  11. steven W. Scott

    Rejoice Peasants!

    Verily, that which causeth our exile from Eden now blesses our bosoms with perfectification of language! Cast thy evil Java and C/C++ from thy houses, and stand with thy master Apple in righteousness! Let the Trinity of P's (Python/Perl/PHP) be cast down as derision!

    *...(dry hacking cough from third pew)...*

    Yeah, indeed, cast thy Google and Windows transgressions into the fire, lest they cause you more sin. The language of thy master Apple shall fall like coins, shimmering gold in the light, falling from thy wallet wherever ye may be found, so all may know ye as the people of thy master Apple. Rejoice indeed!

    *... (Choir sings hymn "Bringing in the Sheep")...*

    ...

  12. Kevin McMurtrie Silver badge

    Leaky

    Ugh. Swift uses Automatic Reference Counting, and Apple says it does leak on circular references. Apple outlines a technique for defining weak back references to avoid leaks in common data structures, but a complex application (something you might want on a Trash Can Mac) has a lot of threads, a lot of shared data, and a high degree of independence between threads. In those cases there are not always clear forward and backward references. I could be wrong, but it seems like a complex multithreaded program in Swift will have some of the same memory management nightmares as C++.
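
    The technique Apple outlines looks roughly like this (a sketch; the class names are mine):

    class Parent {
        var child: Child?           // strong forward reference
    }
    class Child {
        weak var parent: Parent?    // weak back reference breaks the cycle
    }

    Fine for a tree or a parent/child pair; much less obvious once threads share structures with no clear forward/backward direction.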

    Hopefully Swift can support memory tracing in the future to purge circular reference leaks.

    1. stanimir

      Re: Leaky

      Aside from cycles, ref counting (esp. automatic) blows for multi-core due to shared writes and the need for atomic operations.

      Indeed it's deterministic, but I don't see it as a systems language, as marketed.

    2. Ken Hagan Gold badge

      Re: Leaky

      I can see that automatic reference counting is a problem unless you also allow for explicit weak references. That was known back when Lisp was young and there's a "creation myth" about it in the Jargon File. I would reckon it "embarrassing" to introduce a new language in 2014 with this flaw.

      I don't see how you make the jump from there to the "nightmares" of C++. But then, I don't have any problem with memory management in C++, because I use the facilities of the language to handle it all automatically. Perhaps you were thinking of C.

    3. Ian Joyner Bronze badge

      Re: Leaky

      Lack of true GC is a disappointment. Objective-C does have it, but it was ditched for iOS. ARC still has many problems, and non-GC is a nightmare in multiprocessing environments.

      Bill Bumgarner even had the good book on GC by Boehm, which I'm pretty sure went into GC in Obj-C. I think it must only have been left out of Swift because GC still frightens C and C++ programmers.

      But GC is now widely deployed in Eiffel, Java (not such a good implementation), Python, Ruby, etc.

      Not having GC means Swift really can't be counted as a modern 21st century language.

  13. E 2

    Don't drink the Koolaid

    Plainly an OpenCL implementation.

    Apple got no new physical shiny things, so rolls out shiny ephemeral things.

    1. 45RPM Silver badge

      Re: Don't drink the Koolaid

      Did you notice that this was WWDC? It's not about hardware - it's about the nitty gritty of software - the most interesting part of any computer system.

  14. herman

    More secure, but it has problems with reference counting and it has type inference. I.e. it can do unexpected things to the programmer, just like Perl. Hmmm, maybe it needs a little more work - PASS...

    1. QdK

      No Perly crapness in sight

      Uh, type inference only comes into play when a variable is defined with an initial value; omit both the value and the type and you simply get an error:

        var test              // error: nothing to infer a type from

        var test = "bla"      // obvious inference: String

        var test = 1          // obvious inference: Int

        var test: String      // type declared explicitly, no error

  15. Destroy All Monsters Silver badge
    Holmes

    Yet another Dylan?

    Anyone remember Dylan?

    I was pretty excited about it - for a couple of months.

    It also sank without a trace.

    Well, back then Apple was throwing out people as fast as they could. Now they have the military budget of a small country in the bank. So....

    Anyway. Bartender, one LISP please. With added STM.

    1. Anonymous Coward
      Anonymous Coward

      Re: Speaking of LISP

      I once walked into an academic bookshop, in Manchester, and asked for a book about LISP. "Are you thakin' the pith," replied the sales assistant.

    2. Anonymous Coward
      Anonymous Coward

      Re: Yet another Dylan?

      > Anyone remember Dylan?

      Yes. Still got the install CD somewhere. No Mac old enough to run it on though. :-(

  16. 45RPM Silver badge

    Swift looks very interesting to me. Personally, I like Objective C (and plain old C too, of course). One of the things I like most about Objective C is that I can develop in it on Linux - it's cross-platform. If Swift is cross-platform then I look forward to playing with it.

    I hope I'm wrong, but Swift looks to be Apple-only. I don't like the look of that at all. If the milk turns out to be sour, I ain't the kind of pussy to drink it.

    1. Charlie Clark Silver badge

      I think they'd be stupid not to open source it (the core at least) and get academics, paid by someone else, to kick the tyres. It would surprise me greatly if they didn't.

      1. 45RPM Silver badge

        @Charlie Clark 'I think they'd be stupid not to open source it (the core at least) and get academics paid by someone else to kick the tyres.'

        I'm in total agreement with you. At the moment, I'd be surprised if they did open source it - but I seem to be in a little negative minority! If they do open source it then it'll be a great addition to my programming arsenal, and I look forward to learning it.

        If, on the other hand, it's Apple system only, I'll probably still learn it (and I'll feel all bitter and twisted about it at the same time!)

  17. psyq

    Yet another...

    Every time a company in the Valley becomes big enough (sometimes not even that), it has to have a go at making its own programming language.

    Apple has already done this; it looks like this is their second try.

    The world is full of "C replacements"; they come and go... but for some reason C is still alive and kicking, and something tells me it will be alive long after the latest iteration of the Valley's "C replacement" is dead and forgotten.

  18. Anonymous Coward
    Anonymous Coward

    "There already is a programming language called Swift that was developed by the National Science Foundation, some other government agencies, and the University of Chicago for use in parallel computing applications. This isn't that."

    Apple is into recycling, especially names and ideas.

  19. Rich 2 Silver badge

    ...without the baggage of C

    What "baggage"? C is one of the simplest and sparsest languages there has ever been. That's why it works.

    1. Nigel 11

      Re: ...without the baggage of C

      C is one of the simplest and sparsest languages there has ever been. That's why it works.

      Oh really? So why hasn't it been universally trumped by LISP? (And for that matter, why did they ever do C, given B?)

      1. Rich 2 Silver badge

        @Nigel 11

        Your comment makes no sense at all. Are you saying C DOES have baggage or not?

        1. Nigel 11

          Re: @Nigel 11

          No - I'm saying C is not the simplest or sparsest language. That honour surely goes to LISP, and B was simpler than C. Being simplest and sparsest is not the reason C is so popular in some programming communities. Like most successful languages, C has a niche, which is the writing of operating systems and realtime systems. I'll also grant that until computers became fast enough that interpreted languages weren't "too inefficient", C was probably the best general-purpose compiled language. (FORTRAN was and remains better for numerical coding, but only for numerical coding. Pascal, PL/I, and Ada never really caught on. I'll let someone else talk about C++ if they want to, because personally I loathe it.)

        2. Tom 7

          Re: @Nigel 11

          The C language carries no baggage - there are folk myths that follow the language (and many others) around that basically translate as "I wrote some shit code in 'languagenotofchoice', so the language must be shit, not the pack of lies I put in my CV when I applied for the job".

          I can write totally shit code in any language I choose - or I can use my experience (and that of a few million other programmers over 70 years) to paint all the potential fuckups in whatever language into a little corner where they don't hurt the end result.

          There are only 3 reasons for writing a new language:

          1) you can't be arsed to learn to use one of the ones that's already around properly

          2) you have a phobia of some part of the keyboard

          3) you think you can gain some proprietary lock-in

    2. Nuno trancoso

      Re: ...without the bagage of C

      That's a simplistic view. Does it work? Certainly. Does it lend itself to pushing out applications really fast? Speedbump...

      Apple's thinking seems to be, quite reasonably, that in this day and age what you REALLY need is a decent glue. The heavy lifting is going to be done by the OS/middleware, be it third-party or even your own. Thus, what you need is a better glue that brings all those bits and pieces together to provide the fastest (and easiest) path from prototype to application.

      And let's face it, that's precisely where most "conventional" languages don't really shine.

      To be fair, though, he shouldn't have pinpointed C "baggage", but "baggage" in general.

  20. Simon B
    Devil

    As usual, trademarks only apply to Apple. Do as we the gods say, whilst we do the complete opposite. Twats.

  21. Rustident Spaceniak
    Thumb Up

    It's the first time in twenty years I've been tempted to write software again!

    Really, I never could get used to coding in C, and somehow never had time to learn any of the newfangled ones. But this thing sounds like a nice, clean new approach that might be easy to learn for someone who still remembers Turbo Pascal. I'll sure give it a try tonight and see how far I get.

    Not that I could lay claim to being anything like a developer, just a poor amateur. But if it's something usable for us amateurs, I'll write a personal email of thanks to that Mr Federighi.

  22. Alan Denman

    Nightmare on a fruity Elm Street.

    It was already in the ten Apple commandments.

    Thou shalt not write code for any other platform. This will make it more and more mandatory to write two entirely separate sets of code down the line.

  23. Pig Dog Bay
    Thumb Up

    C#+Thesaurus

    Quick guide to Swift for C# programmers:

    this...self

    interface...protocol

    var...let

    Thumbs up as anything is better than Objective C

    1. ThomH

      Re: C#+Thesaurus

      Swift uses var for variables, let for constants.
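
      To make the distinction concrete (a minimal sketch; the names are invented for illustration):

        let maximumAttempts = 3      // 'let' declares a constant
        var currentAttempt = 0       // 'var' declares a variable

        currentAttempt = 1           // fine
        // maximumAttempts = 4       // error: cannot assign to a 'let' constant

      So the closer C# analogue of 'let' is arguably 'readonly' rather than 'var'.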

  24. Stevie

    Bah!

    Well, if this doesn't drive a nail into the coffin-lid of "Apple is Unix" once and for all, I dunno what will. C out, Swift in (not that I disagree that C is godawful, poorly represented by clever compilers, well overdue for a complete chassis-up rebuild, and like that).

    I love that they reused the name of an existing computer language too. I guess they felt they had the better claim.

    Remind me again: how is this a different approach to that of the hated Redmond?

  25. Anonymous Coward
    Anonymous Coward

    I wonder...

    ... how much of a factor possibly breaking the cross-device family kits was/is? I haven't gone that direction, but it does seem an Apple/Jobs kind of thing to do. In any case, it looks like all the rest of the dozens that I've had to deal with over the decades. Gin up a crib sheet and I'll be fine.

  26. poopypants

    C is popular because

    - performance (generated object code is fast)

    - deterministic resource use

    - direct access to low level hardware

    - you can find a C compiler for the vast majority of embedded devices

    - the runtime and generated executable are small

    Keep in mind that there are far more embedded devices running C code than desktops running anything else.

  27. Anonymous Coward
    Anonymous Coward

    Cocoa Framework Integration

    I like some bits of the language: tuples, for example, and the comparison options on switch statements.

    However, other areas are a bit odd: functions, for example, where parameters can be named but aren't by default - unless a parameter has a default value, in which case it is named by default. Better and simpler, IMHO, would be to have all parameters named by default unless explicitly turned off.

    Also, functions can have 'inout' params, but you still need to prefix them with '&' when calling the function? Surely the compiler knows?
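
    Roughly, in the Swift as announced (a sketch; the function and parameter names are invented for illustration):

        func addTo(inout value: Int, amount: Int = 1) {
            value += amount
        }

        var total = 10
        addTo(&total)              // '&' still required for the inout argument
        addTo(&total, amount: 5)   // the defaulted parameter is named automatically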

    Regular expression pattern matching is still AWOL.

    I'd need to see some examples of Cocoa framework integration before I can decide: being able to write println("hello world") is all very well but real-life apps draw windows, scroll table fields and respond to mouse movements / touches.
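
    From what's in the documentation, the bridging looks direct enough that a toy view controller reads like this (a sketch in Cocoa Touch rather than Cocoa proper, and untested):

        import UIKit

        class HelloController: UIViewController {
            override func viewDidLoad() {
                super.viewDidLoad()
                let label = UILabel(frame: CGRectMake(20, 40, 280, 44))
                label.text = "hello world"     // same UIKit classes, Swift syntax
                view.addSubview(label)
            }
        }

    Whether the delegate-heavy parts of AppKit/UIKit feel natural in Swift is exactly the open question, though.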

  28. Ian Joyner Bronze badge

    The claim is that it is a modern 21st-century language. In some ways yes, but disappointingly they have not ditched a lot of C baggage that should have been ditched. Shouldn't Apple lead the way into languages that ditch monospaced fonts? C syntax looks terrible in other fonts, so it needs to be in Courier or Monaco or similar; C syntax looks old now, so Swift also looks dated.

    It's not just artefacts of syntax, though. It's things like the auto-increment and decrement operators - known horrors from C (side effects). Why have enums and structs when they could be unified with class? Including this stuff just so as not to upset C programmers is "tasteless". And the use of CamelCase - we do have underscores on our keyboards now (CamelCase came about because Xerox keyboards did not have underscores, and it really makes programs look dated).

    On the other hand, the Python-like lists, tuples and dictionaries are nice; in fact the unified treatment of lists and dictionaries is nicer than Python's. Optional values might be nice - do they give void-safe programming, though?
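
    For what it's worth, optionals do look like a step towards void-safety (a minimal sketch; findUser is an invented function that may return nil):

        func findUser(id: Int) -> String? {
            if id == 42 {
                return "Arthur"    // a String? either holds a String...
            }
            return nil             // ...or holds nothing at all
        }

        if let name = findUser(42) {
            println("Hello, \(name)")   // inside the binding, name is a plain String
        }

    The compiler refuses to pass a String? where a String is expected until it has been unwrapped, which is at least the shape of a void-safety guarantee.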

    And they could have done garbage collection right. It was done in Obj-C many years ago, but dropped for iOS. So programmers will still have to deal with memory issues under ARC - bookkeeping that programmers should not have to do. ARC seems to cause a whole lot of issues and is frequently discussed in programming groups.

    Swift is a move in the right direction, but overall, I'm disappointed. It's quite a long way short of Eiffel.

This topic is closed for new posts.
