5 posts • joined Wednesday 10th January 2007 23:34 GMT
Good to see
I'm a Mac developer who moved to Cocoa about three or four years ago, and has never looked back (I was using C++ with Carbon before that). It's good to see articles like this coming from a not obviously Mac-related site. As a professional developer I find Cocoa to be the most productive API (and, with Xcode, toolset) I have ever used, bar none. And that's by a considerable margin.

I'd say to anyone wanting to dip their toes in the waters of Mac development: go for it. Obj-C's syntax might seem odd at first, but after a while you'll appreciate its readability if not actually love it. And you can build a very functional small app (say, a complete word processor) in an hour or two, which should be enough to impress the most hardened old coders.

As for Java - well, it may be cross-platform, but (Mac) users know a Java app when they are forced to use one, so if you really want to make "proper" Mac apps, Cocoa is about the only reasonable way. I also find Java's documentation and class libraries bewilderingly complex compared to Cocoa's. Cocoa isn't perfect and has a few holes, but one of the great things about it is that plugging them is really easy (thanks to categories, among other things). I could go on... but I think I've made my point.
Never ignore a Pooh-pooh!
You know, if there's one thing I've learned from being in the army, it's never ignore a pooh-pooh. I knew a major: got pooh-poohed; made the mistake of ignoring the pooh-pooh -- he pooh-poohed it. Fatal error, because it turned out all along that the soldier who pooh-poohed him had been pooh-poohing a lot of other officers, who pooh-poohed their pooh-poohs. In the end, we had to disband the regiment -- morale totally destroyed ... by pooh-pooh!
Ribena concentrate marked with toxic HazChem symbol
Vitamin C aside, I used to live close to a Ribena bottling plant. The drums of concentrate were covered in HazChem "toxic" symbols. Made me look at the drink in a new light, I can tell you.
Followup regarding separate stacks
I'm not familiar with the 80xxx instruction set (though I am with a number of others), but I'm not fully convinced by the backward-compatibility argument. For that to hold, code would have to be doing something very direct, such as an explicit "load the last address in the stack frame into the program counter" in order to do a subroutine return. In fact most ISAs have a separate "RTS"-type instruction, which performs this operation implicitly. In that case, the internal implementation of RTS is not subject to the backward-compatibility constraint: it can load an address obtained from any stack into the program counter. Stack frame layout might be subject to compatibility - in which case such a processor can simply push an unused value onto the stack frame as a placeholder.
Small change to CPU design could fix this at a stroke
One thing I've often wondered about buffer overflow vulnerabilities - why don't CPU manufacturers simply keep the data and code/return address stacks separate? This could be done in a way that is totally seamless to the software, and would fix this kind of vulnerability at a stroke. Seems to me that the hardware builder that employed such a chip would have an instant marketing advantage too. Seems so obvious there must be a very good reason why it's NOT done - but what is it?
- It's true, the START MENU is coming BACK to Windows 8, hiss sources
- iSPY: Apple Stores switch on iBeacon phone sniff spy system
- Chinese gamer plays on while BMW burns to the ground
- Pic NASA Mars tank Curiosity rolls on old WET PATCH, sighs, sniffs for life signs
- How UK air traffic control system was caught asleep on the job