Of Intel Macs and Red Herrings
Scott Byer at Adobe put up a very nice post about the switch to Intel and some of the growing pains many of us are going through. While I've added my own comments to Scott's piece, a number of my colleagues in Mac BU have asked me to weigh in on the subject here on my blog.
First off, it's difficult for me to discuss specific facts. I'm under an NDA with Apple that precludes me from discussing specific issues with the transition, particularly issues related to the tools.
Nevertheless, there are some red herrings out there, and I'm going to try to dispel a few of them.
1) Steve said it was just a recompile! What gives?
I've discussed this one before, but it bears repeating. When Steve Jobs did his demo at the last WWDC, he was talking to an audience of developers. He knew, as did every other developer in that room, that getting the code to recompile was only the first step in a long process of testing and verification. No two compilers generate code in exactly the same way, and no software developer worth the name goes through a compiler switch without extensive testing.
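To make that concrete, here's a minimal, hypothetical sketch (it's not from any real code base) of code that recompiles cleanly everywhere and yet can behave differently once the compiler or the underlying architecture changes:

    #include <cstdio>
    #include <cstring>

    // Hypothetical illustration: both snippets compile without so much as a
    // warning, yet their behavior depends on the toolchain and the CPU.
    int main() {
        // 1) Byte order. A big-endian PowerPC build stores 0x12 first; a
        //    little-endian Intel build stores 0x78 first. Any file or network
        //    format written this way breaks silently after the switch.
        unsigned int value = 0x12345678;
        unsigned char bytes[sizeof value];
        std::memcpy(bytes, &value, sizeof value);
        std::printf("first byte in memory: 0x%02X\n", bytes[0]);

        // 2) Implementation-defined behavior. Whether plain 'char' is signed
        //    is the compiler's choice, so this test can flip when you change
        //    compilers, even on the same hardware.
        char c = '\xFF';
        std::printf("plain char is %s on this build\n", c < 0 ? "signed" : "unsigned");
        return 0;
    }

Neither of those is a compiler bug. They're simply the kinds of assumptions that have to be hunted down and tested when both the toolchain and the chip change underneath you.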
A few reporters who've never written a line of code in their lives, and who didn't actually take the time to go out and interview a variety of developers to get a different angle on the story, propagated the notion that developers would be able to move to Intel quickly, but that's not what Steve was saying. Steve was only saying that the tools wouldn't be a major obstacle--or at least that Apple was prepared to get the tools to the point where they wouldn't be an obstacle.
2) Apple's been advising developers to move to XCode for years. All you had to do was follow Apple's advice.
At the time of Steve's speech at WWDC, most of the major applications weren't built using XCode. That's true for both Adobe and Microsoft, and that fact alone complicates matters considerably. When this fact has been pointed out, a variation on the "Steve said it was just a recompile" meme has emerged. According to this meme, all we really needed to do was follow Apple's advice, and, well, we're the bad guys for not having done so.
While Apple did advise developers to move to XCode, Apple was rather tight-lipped as to the fundamental reason why. In the meantime, developers had to weigh that advice against the fact that Metrowerks' toolset was significantly faster than the XCode/GCC combination and generated better code. No sane developer would sacrifice both a significant level of productivity and the quality of their product merely because Apple said so.
But all of that's very much beside the point. Even if we had gone through the pain of porting to XCode/GCC for some earlier release of our products, we'd still have had to go through this pain. The time spent doing that work then would have come out of the features we were, instead, adding to our programs. Arguing that we should, somehow, have absorbed this pain earlier has little bearing on its nature and extent. We'd still have had to do the work, and customers would have suffered as much then as they are now. The only difference is that, now, we have a tangibly legitimate reason for the suffering.
3) If you'd just ported from Carbon to Cocoa, your problems would have been solved.
I love this one. It so clearly demonstrates an astounding level of ignorance. I've already posted on the "Carbon vs Cocoa" issue, so I won't belabor the point here. I'm just amazed to see people arguing that porting not only from one compiler to another but also from one application framework to another would, somehow, magically be less expensive than simply porting from one compiler to the other.
4) Some apps have done it, why can't you?
In some ways, this is a legitimate criticism, but it glosses over significant differences between, say, PowerPoint or Photoshop and BBEdit. In this regard, there are two points worth considering.
The first is that the amount of work required to port a code base does not grow linearly with the size of the code base. There are a number of reasons for this, all of them related to complexity. For example, a more complex C++ program is far more likely to make use of certain language constructs (e.g. RTTI, templates and multiple inheritance) than is a less complex program. With more complex projects, the amount of work grows far faster than linearly: doubling the size of the code can require nearly ten times the amount of work to port.
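For the curious, here's a small, contrived sketch (not from any shipping product) of the kinds of constructs I mean. Every one of them leans on compiler-specific machinery: object layout for multiple inheritance, mangled names for template instantiations, and the RTTI data behind typeid and dynamic_cast, all of which have to be revalidated when the compiler changes.

    #include <iostream>
    #include <typeinfo>

    // Contrived example of constructs whose underlying machinery differs
    // from one compiler to the next.
    struct Saveable {
        virtual ~Saveable() {}
        virtual void save() const = 0;
    };

    struct Printable {
        virtual ~Printable() {}
        virtual void print() const = 0;
    };

    // Multiple inheritance: object layout and this-pointer adjustment are
    // ABI details that the compiler decides.
    class Document : public Saveable, public Printable {
    public:
        virtual void save() const  { std::cout << "saving\n"; }
        virtual void print() const { std::cout << "printing\n"; }
    };

    // Template: every instantiation produces a separately mangled symbol.
    template <typename T>
    void describe(const T& obj) {
        // RTTI: the string from typeid(...).name() is itself compiler-specific.
        std::cout << "type: " << typeid(obj).name() << '\n';
    }

    int main() {
        Document doc;
        describe(doc);

        // RTTI again: this cross-cast depends entirely on the compiler's
        // RTTI implementation.
        Saveable* s = &doc;
        if (Printable* p = dynamic_cast<Printable*>(s)) {
            p->print();
        }
        doc.save();
        return 0;
    }

Every line of that compiles on any reasonable C++ compiler, but none of it can simply be assumed to work the same way after a compiler switch; it all has to be re-verified.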
The other is specific to GCC. GCC on Mac OS X uses STABS to describe symbolic debugging information, which lets the debugger map the code it's observing back to the symbols in your source so that the programmer doesn't have to do that translation by hand. For those who care to take the time, Red Hat has decent documentation on the STABS format.
The problem with STABS is that it's very verbose compared to other formats (particularly the xSYM format that CodeWarrior uses). It's fairly easy to verify this. If you have XCode installed on your computer, create a simple "Hello World" Carbon project. Rename the main.c file to main.cpp (C++ compilers mangle names to support overloading), and add a couple of simple classes to your app. Now, build both the debug configuration and the retail configuration. To be even safer, run strip on your retail build. Then compare the difference in size between the two builds.
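If you'd rather not set up a full Carbon project in XCode, here's a rough command-line approximation of the same experiment. The class names are invented, and the exact sizes will vary with your compiler version and SDK:

    // main.cpp -- a rough, hypothetical stand-in for the "Hello World plus a
    // couple of classes" experiment described above. Build and compare along
    // these lines:
    //
    //   g++ -gstabs+ -o hello_debug  main.cpp   # "debug" build with STABS info
    //   g++ -O2      -o hello_retail main.cpp   # "retail" build
    //   strip hello_retail                      # to be even safer
    //   ls -l hello_debug hello_retail          # compare the sizes
    #include <iostream>
    #include <string>

    class Greeter {
    public:
        explicit Greeter(const std::string& name) : name_(name) {}
        virtual ~Greeter() {}
        virtual std::string greeting() const { return "Hello, " + name_ + "!"; }
    private:
        std::string name_;
    };

    class ShoutingGreeter : public Greeter {
    public:
        explicit ShoutingGreeter(const std::string& name) : Greeter(name) {}
        virtual std::string greeting() const { return Greeter::greeting() + "!!!"; }
    };

    int main() {
        ShoutingGreeter greeter("World");
        std::cout << greeter.greeting() << std::endl;
        return 0;
    }

Most of the gap between the two binaries is debugging records describing the mangled C++ symbols, and that gap grows quickly as the code does.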
I can't say which application, but I have it on good authority that a modern C++ application has actually hit the virtual memory wall (i.e. the combined code and generated symbols produced an executable larger than would fit in the process's virtual address space). And, no, it's not an application that I've ever worked on or contributed to in any way.
Before I close this, I want to make two points very clear. First, I agree wholeheartedly with Scott's primary thesis: I can't imagine any developer who would prefer a long, drawn-out process of porting code from one build system to another over writing features that solve people's problems. We are already doing as much as we can possibly do. Indeed, one of the reasons I haven't posted much of late is that there really is a great deal of tedious work to do.
Secondly, I want to commend the tools group at Apple for the yeoman's job they've been doing to help us make the transition. I'm reluctant to name names, but I honestly wish I could. I've met with these people, and they really are an outstanding crew. Thanks, everybody. You know who you are.
Rick
Currently playing in iTunes: Dixie Chicken by Little Feat
Comments
Anonymous
March 24, 2006
PingBack from https://scobleizer.wordpress.com/2006/03/25/a-bunch-of-stuff/
Anonymous
March 25, 2006
Well I don't know what you know about computers but anyone who is listening to Little Feat is on top of things.
Anonymous
March 25, 2006
Thanks for your intelligent and no-finger-pointing post.
Who cares who's at fault? It's always a combination of factors. All we can do is learn from it and use the info to try to avoid doing the same thing again.
Thanks for working hard. There are users out here eager to use your stuff to accomplish great things.
Anonymous
March 25, 2006
Good job Rick. I think that if more of the major application devs from Microsoft, Adobe, etc., were to make these kinds of posts, it could really help avoid the "Adobe ported Photoshop to Carbon in two weeks, what's your problem?" syndrome that we are seeing now and that was really bad early on in Mac OS X.
Anonymous
March 25, 2006
Byer on MacTel, II: Photoshop engineer Scott Byer's essay this week on porting to new hardware/OS hit a need and was heavily linked... this morning I see he has collected an incredible number of comments at the blog. I can see waves coming in from the
Anonymous
March 26, 2006
Hey, your ports will still probably beat Vista to market :-)
Anonymous
March 27, 2006
The depressing side of this story, from the perspective of one of the original Metrowerks developers, is that Metrowerks possessed the technology necessary to make this transition painless for Microsoft and Adobe: an x86 compiler using the same C/C++ front-end that ran on the Macintosh and generated (decent, if not equal to gcc) x86 code. With a small amount of engineering work on the object file format and linker, and some assistance from Apple on debugging, they could have had an x86 version of Codewarrior available for the transition.
Sadly, the relationship between Apple and Metrowerks was never a good one, and the revenue from Macintosh tools was never enough to be worthwhile to Freescale. Add in the fact that aiding the transition would just hasten the drop in Freescale's PowerPC revenue from Apple, and it was a nonstarter.
I always hate to see a good product die (THINK C, CodeWarrior) and it irks me no end that Apple was able to achieve its desired goal of replacing CodeWarrior without having to actually build a better product.
Porting a large software product (or, in the case of Microsoft and Adobe, 5-10 large software products) from one toolchain to another is an unpleasant, time-consuming, and mostly unrewarding task. I am waiting patiently for the Microsoft apps before switching to a MacBook Pro, and I am not blaming Microsoft for the delay...
Anonymous
March 27, 2006
PingBack from http://www.secretweaponlabs.com/words/2006/03/27/porting-isnt-as-easy-as-it-seems/
Anonymous
March 27, 2006
Now how about some cleanup of your code when you port to Intel? Can we PLEASE get rid of the Microsoft User Data folder from Documents and put it in Library where it belongs?
I understand Office 2004 is a huge product, but I tend to feel developers like Adobe and Microsoft never plan to make their code more portable so that these kinds of sudden changes from Apple would be a tad less painful.
Though I am not a developer, I do work in IT and have seen LOTS of projects suffer from the "We don't have time for that!" syndrome, where the bare minimum is done to get something done, knowing full well that there will be huge pain down the road when cleanup time comes.
Anonymous
March 27, 2006
Rick Schaut has a great post on why moving to Intel is not just a simple recompile as some make it out...
Anonymous
March 27, 2006
PingBack from http://www.perardi.com/blog/?p=74
Anonymous
March 27, 2006
I wonder... would it be cheaper for Microsoft or Adobe to buy CodeWarrior from Freescale and make that compiler do what they need, rather than do all the work moving their software to Xcode?
Anonymous
March 27, 2006
Nope. CW sold their x86 compiler to someone else before being bought (or so I read).
Anonymous
March 28, 2006
"...According to this meme, all we really needed to to was follow Apple's advice, and, well, we're the bad guys for not having done so.
"While Apple did advise developers to move to XCode, Apple was rather tight-lipped as to the fundamental reason why. In the mean time, developers had to consider that advice along with the fact that Metrowerks' toolset was both significantly faster than the XCode/GCC combination and generated better code than XCode/GCC. No sane developer would sacrifice both a significant level of productivity and the quality of their product merely because Apple said so."
I don't want to be accusatory, because I know Rick is in the Mac BU, and I have a friend there, and I know you guys are on my side. But I have to wonder about the contrast between the emphasis on fast-code generation here, and my experience with Mac Office.
I use Office X. When I use one of the built-in invoice templates, it takes about 300,000,000 clock cycles for each character I type to show up on the screen. In general, in all other contexts on my computer characters show up instantly as I type (including right now as I type this message.) I have a hard time believing that the code in Microsoft Office X for Macintosh is really as optimized as, well, as almost any other program I've ever seen or used. Also, I somehow suspect that Office on Windows is dramatically more efficient than the Mac version, and that no tears are shed in upper management that non-Windows users might have a somewhat sub-optimal experience...
I realize this is quite tangential to the question of porting to x86, but I'm just curious if someone who knows what they're talking about better than I do might have something illuminating to say about this. For instance, I'd like to think things would be getting better as time goes on; but I tried Office 2004 and it was even slower and harder to use...
Anonymous
March 28, 2006
I switched about six months ago to a PowerBook in part because I preferred Mac Office to the Windows alternative. I can point to a few things like the Formatting Palette but, in general, it just felt easier to use and interact with. And it is able to do that with relative feature parity for all of the things I'd use, or want to use, in Office 2003.
I have to admit that after watching this video (http://www.microsoft.com/office/preview/asx/OfficeUIIntro.asx) I am really impressed with what Microsoft is doing with the interface of Windows Office, and I yearn for some of what they are doing as far as the new interface and ready templates to create good-looking documents and effects. It also struck me that their interface is very Mac-ish: it is the first MS interface on Windows that I think they could plunk down on a Mac machine and it wouldn't look too out of place with very little change. The categories on the Ribbon even look quite a bit like the top of the Apple website.
I know that the MacBU is going to have their hands full trying to get Office ported to the new XCode and Intel platforms and architectures, but to what extent and at what point can we expect some of that goodness? It would be the first thing in a long time that made me jealous of my Windows-using friends if their Office has a nicer interface and can easily make better-looking documents than my Mac Office...
Anonymous
March 28, 2006
Apple is on the working group that released the DWARF-3 spec in January 2006, so I'm not sure why Apple is still using STABS when FSF GCC/GDB moved to DWARF a while ago.
I am surprised, though, to hear of virtual address space exhaustion due to debugging symbols! Either GCC is mind-numbingly terrible when doing STABS or the project doesn't have any concept of modularization at all. All bigger C++ projects (especially dealing with UI) already have the concept of doing everything via 'plugins': only load them when they are required, and optionally dump them when not required. It's hard to understand how one would hit the 2 GB limit just with debug information; something is horribly wrong.
Anonymous
March 29, 2006
It has all been well known for a very long time, including the fact that Mach-O binaries are the native format for OS X, not CFM. If Kevin Browne was right, more than half of your codebase is shared with the Windows version anyway (I forget the source of that interview, but it has been said in public).
In other news, Final Cut Studio is a Universal Binary shipping today. That doesn't seem like a small project to me. If XCode cannot be used for large projects, how did Apple compile this beast?
Even Quark has a beta out... I guess this is more a problem of justifying the cost of a full version upgrade. I understand that; the effort must be worth it, after all.
But PLEASE be more honest with your users. It may not be simple to do the transition, but even the Intel announcement is 9 months old, even older if you believed Scoble, your very own apologist, err, evangelist.
Anonymous
March 31, 2006
Rick:
About your comment that GCC "uses STABS to describe symbolic debugging information".
GCC 4.0.0 on Mac OS X 10.4.5 says it can generate debugging info in at least 7 different formats:
-g Generate debug information in default format
-gcoff Generate debug information in COFF format
-gdwarf-2 Generate debug information in DWARF v2 format
-ggdb Generate debug information in default extended
-gstabs Generate debug information in STABS format
-gstabs+ Generate debug information in extended STABS
-gvms Generate debug information in VMS format
-gxcoff Generate debug information in XCOFF format
-gxcoff+ Generate debug information in extended XCOFF
I realize that VMS and COFF are other platform/dead formats, but just for grins, have you guys tried to use anything besides STABS? "cc -gdwarf-2" works for me (and produces a "Hello, world!" binary 10% smaller than using "cc -gstabs" :) )
Anonymous
March 31, 2006
"All bigger C++ projects (especially dealing with UI) already have concept of doing everything via 'plugins'."
The application in question uses a lot of plugins. And that is a big part of why it hit the address space limit: duplicated symbols, lots of duplicated debugging info, address space fragmentation, etc.
Plugins are not a panacea.
DWARF: I think Apple's GCC builds are a little behind the FSF (technically, they're a branch with a lot of custom code added), and Apple would have to adapt all of their tools to handle DWARF before they can make that jump.Anonymous
March 31, 2006
Ralph: CFM is just as native to OS X as Mach-O. The OS can load and use both binary formats. And both have the facility to contain binaries for multiple architectures, languages, etc.
Avie, er, I mean Apple, may prefer Mach-O -- but there are a number of drawbacks to using Mach-O, and very few benefits.
Don't you suspect that Final Cut had a little advantage because they're an Apple application and have been using XCode/GCC already for a few years? And have you asked any of the Final Cut engineers about how difficult their transition really was?
The fact is, Rick is being honest, and Scott is being honest.
And both are holding back a bit.
Anonymous
March 31, 2006
I agree fully that Metrowerks CodeWarrior was a great product and that Office is a great and huge piece of software to port.
Apple has not been behaving very professionally with all these changes over the past 3 years. They have a history of messing with their partners; now they have enough money to buy some and make a sub-micropoly.
Microsoft's Mac BU, on the other hand, should fix Office 2004, especially the VBA and memory management, because Office 2004 is buggy software even without the Intel story. That's partly because OS X Server file sharing support is flawed and partly because Office looks like a monster to debug. The latest patch doesn't fix anything in the VBA area.
Fix Office, let Rosetta run it, and take a year to port your code to Intel: who cares about being native as long as Office works?
Anonymous
April 01, 2006
I agree about the disaster that Word 6 was as far as the Mac experience is concerned. What a relief when Word 98 came out. And I am also glad to see that Microsoft ports to Intel CPUs at all.
But I cannot see how MS Office is really optimized in any way for the Mac: The Microsoft User Data folder has been mentioned.
Worse than that, I find that all Office apps take up CPU when they should be sitting idle in the background. Open an empty Word, Excel, or PowerPoint document and Entourage, then check in Activity Monitor how they each eat anywhere between 0.5 and 5% of the CPU while doing nothing for the user. What a nuisance, especially for laptop users.
Then there's the font issue: many of the great typographic features of OS X's font system are not used in Word X (I cannot say for Word 2004, although the non-idle issue has been confirmed for the current version). Just type "Zapfino" using the Zapfino font in TextEdit and in Word and see the difference. And there is incredibly much more if you open the "Typography" menu and play with the advanced options of OS X's text system.
This is not to bash MS. Despite these deficiencies I still find it the best office package available for the Mac, and I recommend it whenever people ask, although the price beyond the student/teacher version is steep.
I very much hope that MS uses the UB "opportunity" to bring out a great Mac Office that is indeed optimized and feels native, not just in the GUI sense but also behind the scenes.
Anonymous
April 01, 2006
anonymous - Why would you need to load all plugins at one time? If you don't dynamically load a plugin, why would it consume address space, present duplicate symbols, and duplicate debug information?
I don't think that debug info hitting address space limits can easily become a practical problem.
Riot Nrrrd™ - GCC 4.0.3 says it can, and so does Apple's documentation, but if you try to use -gdwarf it errors out saying unknown option to -g.
Anonymous
April 01, 2006
Riot Nrrrd™ - Scratch that - I had 3.3 in my path; 4.0.1 works fine with -gdwarf-2.
Anonymous
April 02, 2006
Hey Rick, you made eweek!
http://www.eweek.com/article2/0,1895,1944730,00.asp
Anonymous
April 03, 2006
Clarence!
We sorely miss your talents.
Yes, I did make eWeek. What's more, the reporter actually got it right! I'm stunned.
Anonymous
April 03, 2006
hmm, intel cpu not so bad...
Anonymous
April 21, 2006
I'm a couple of weeks behind in my podcasts, which is why I just noticed that the folks over at Your...
Anonymous
May 22, 2006
loose lips sink ships
Anonymous
May 31, 2006
PingBack from http://www.schwieb.com/blog/archives/5
Anonymous
June 01, 2006
What do you do in the down time between feedings when you're on parental leave pulling night duty with...