How to run an iPhone app in the simulator without using Xcode
I spent a lot of time looking around on the Internet for this answer, and the results were basically nonexistent. The answer is: iphonesim on github. (Despite its name, iphonesim isn't an iPhone simulator; you still need the iPhone SDK to be installed so it can use their simulator.)
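For the record, the whole dance looks roughly like this. (A sketch from memory of the iphonesim README, so double-check against the project itself; `MyApp.app` and the build paths are placeholders for whatever your own build produces.)

```shell
# Build the iphonesim tool itself. Ironically this one step still uses
# Xcode's command-line tools; run it inside a checkout of the iphonesim source:
xcodebuild

# Then launch your app's simulator build. iphonesim seems to want an
# absolute path to the .app bundle, hence the $PWD:
./build/Release/iphonesim launch "$PWD/build/Release-iphonesimulator/MyApp.app"
```

Note that the .app you hand it must be a simulator (x86) build, not a device build; the simulator won't load ARM binaries.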
The bad news is there's no obvious way to run your app in a debugger using this system. Hopefully someday that'll be added or I'll figure it out, and then I'll be rid of Xcode for good.
Why the iPhone Simulator is Awesome
While we're here, I'm very impressed by the whole concept on which the iPhone simulator works. Most embedded devices (including Blackberry, Android, and other phones) use software to emulate the embedded CPU, which then runs the embedded OS, which then runs your app. This kind of sucks, because the emulator has to work really hard (it often runs at only a fraction of the speed of a real device), and if you crash it you have to reboot it. Plus loading apps onto a simulated device is extra crappy, because you have to simulate a slow USB connection, and so on.
The iPhone simulator works nothing like that. Instead, you compile your app for your native CPU, and the iPhone simulator is just a native program that runs on your workstation and provides the iPhone API (using native libraries). You simulate and test your program, and when you're finally happy with it, you recompile your app for the target CPU that actually runs on an iPhone. Then it won't work on the simulator anymore.
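To make that concrete: the same project builds for either world just by switching SDKs. (A sketch; the exact SDK names depend on which SDK version you have installed, so check `xcodebuild -showsdks` on your own machine.)

```shell
# List the SDKs installed on this machine:
xcodebuild -showsdks

# Build an x86 binary that runs directly on your Mac, inside the simulator:
xcodebuild -sdk iphonesimulator

# Build an ARM binary for a real iPhone; this one won't run in the simulator:
xcodebuild -sdk iphoneos
```

Same source, same frameworks, two different native binaries; that's the entire trick.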
The result is that the simulator starts instantly and there's no insane two-layer debugging scheme in which you're running a native debugger and decoding non-native (and usually JITted) CPU instructions.
Some people would argue that this method is "less accurate" than precisely emulating the target CPU, and thus the simulator doesn't add much value, since you'll have to test the native app in the end anyhow. It's true that simulating this way is inaccurate and you should do final tests on a real device. The misconception, though, is that the old, annoying, slow, "emulate everything" method is any more accurate. In fact, it's worse.
The fact is, emulators are never perfect. CPU/hardware emulators are really hard to get right, especially if you're trying to make them run fast. If you're not trying to make them run fast, you have a whole different set of problems, because now your simulator is way slower than the real device, so all the animations/etc will be wrong. Try debugging an OpenGL app when your framerate is 1/10th what it should be.
By contrast, the iPhone simulator's method seems magically wonderful. Since the iPhone OS is Mac OS X, all the kernel APIs are the same. The natively-compiled frameworks, libraries, and display engine are built from the same source code, so you know they're the same too. And your Mac's CPU is a lot faster than the iPhone's CPU, so the simulator can slow down your program to iPhone speed, which is a lot easier than speeding it up (although admittedly imperfect).
In fact, with this method, the only potential sources of incorrect simulation are a) speed (which they seem to have gotten right); b) cross-platform bugs in gcc (I don't know of any); or c) differences in memory layout making memory corruption behave differently. (c) could be a problem, but they seem to provide a lot of debugging tools and you shouldn't be depending on memory corruption anyhow.
Incidentally, this design justifies the fact that you have to have a Mac to do iPhone development, and you have to have the latest Mac OS X (Snow Leopard) to run the latest SDK. This annoyed me when I first heard of it; I thought Apple was just trying to lock more people into buying a Mac. But now it totally makes sense: iPhone OS *is* Snow Leopard, so if you want to run the native simulator, of course you need Snow Leopard, or the simulator can't possibly work.
That's a really brilliant design tradeoff with huge benefits. And they get to lock more people into buying a Mac.
Update 2010/04/08: A few people have pointed out that the Blackberry "emulator" is apparently not actually an "emulator" but in fact runs a natively-compiled version of the Blackberry JVM. Okay, I guess, but that's not really the point. The point is that it still spends upwards of 30 seconds booting the "virtual Blackberry" before it even gets to the point where you can run your program. (And you have to do this every time you want to run your program.) This is annoying, slow, and pointless, and the (apparently native??) JVM still runs everything horrendously slowly - slower than a real Blackberry. So if it's not a native device emulator, then congratulations, it's somehow even stupider. Yes, I've done real Blackberry development, and the difference between the Blackberry and iPhone simulators is night and day.
Open source is stupid
Not the software. The people. And not the people who make the software. The people who comment about it. I guess now that includes me, which is appropriate, since writing this post will obviously have no positive outcome.
Background: I love Linux. I've written far more Linux software (commercial and open source) than software for any other platform. Back in the 1990's, I even wrote a Linux kernel driver and poem that's (to my ongoing dismay) still in use today.
So yesterday's random post, in which I said something nice about Apple, naturally tagged me instantly as an "Apple Fanboy." (Of course, the accuser here is "somewhat of an exception" because he "really loves Linux." Uh huh. Yay you.)
By the way, yes, thank you for asking, I do have a Blackberry, and I have used the SDK, and the Blackberry simulator is 100% pure crap compared to the iPhone simulator. This is obvious after 0.5 seconds of comparing the two, after which the iPhone one has finished loading your app and the Blackberry one hasn't even made its window appear yet, let alone booted the simulated Blackberry OS.
Nevertheless, it's true. I am probably officially an Apple fanboy now. I mean, I've had an iPod since 2005, which is nearly the beginning of time. I even upgraded to an iPod Touch recently. Also I have a Mac laptop because they're the only ones where power management actually works. Plus I totally downloaded their SDK last week.
Speaking of which, Xcode sucks.
Anyway. Fanboy. Yes. Probably. I do have to admit that it's interesting following along with the whole Evil/Artist dichotomy. Or is it a dichotomy at all? I mean, how can you be a serious artist and then let people use Java? Yes, Android, I'm talking to you.
And oh, speaking of Android. While I'm flaming people needlessly, let me just add one more thing:
Top 10 Paid apps in the Android App Store
See the list here.
Remember, folks, Google is the worldwide expert at showing the very best stuff at the top of your search list. Just think what the next 10 look like!
Guys. This is what happens when you let Java people write apps for your platform. Who's evil, again?
Top 10 Paid apps in the Apple App Store
You will note that Apple actually explicitly prohibits Java developers from coming anywhere near their SDK. It's part of the license. I'm not even kidding. This is not a coincidence.
There. Glad I got that out of my system.
apenwarr. Lowering the quality of discussion since... the 1990's sometime. Aw, who can remember exactly when. Whatever.
I may be internet famous, but I am not a primary source of original research
Yeah, I remember my teacher gave me a 10/10 on that one. 15 years ago. But seriously.
(Thanks to Eduardo for pointing out what happens when you search for "apenwarr" or "Avery Pennarun" in Google Books. Answer: silly things.)
Why alienating developers is a winning strategy
I love this new iPhone SDK rule - the one that says you have to write all your apps in pure Objective C with no translation layers. Not because it's good for me (it isn't), but because it's fun to watch an old-school titan - Apple - play the platform game like they mean it. There hasn't been fun like this since Bill Gates left Microsoft.1
Here's the story so far:
(You might have seen this story before; see Crossing the Chasm and its highly-relevant-to-this-discussion sequel, Inside the Tornado.)
Then, suddenly, Apple clamped down even further on its SDK and app store.3 Shocking! People argue that this breaks the positive feedback cycle: this will mean fewer developers, which means fewer apps, which means less awesome, which means fewer users, which means other platforms can compete, and so on.
Here's the paradox. Apple does need developers to maintain the cycle of awesomeness. What Apple's doing is not good for developers. And developers will continue the cycle anyway.
This is why:
Apple owns the platform; if their platform wins, they win bigger than anybody else. All they need to do to win is to continue to deliver awesomeness to end users as fast as possible so that nobody can catch up. Because they're the biggest platform, they have the most money, so this isn't that hard to do.
The other players are the end users. They have to buy Apple stuff in order for the cycle to continue. The awesomeness must be there or they won't buy stuff. Java and Flash are the opposite of awesomeness. Thus, Apple rejects them outright.4
Developers are not part of the strategy. As a developer, you don't make decisions based on awesomeness at all. You might think you do, and the first iPhone developers ("early adopters") did, but that's not you. The modern developers, the ones with a pre-made market, a fully-debugged SDK, and profitable clients paying them to write iPhone apps, build for iPhone only because it's the leading platform.
In the long term, developers like you would be better off if they would boycott Apple and only develop awesome apps for something more open, like Android or even Blackberry.5 Users, as they do, would rapidly switch to the platform with the widest variety of awesome stuff, and everybody would win.
But you aren't going to do that, are you? Because nobody else will either. Unless all the developers switch, the only developers who switch will be suckers. You don't want to be a sucker. You want to make as much money in the short term before the whole thing inevitably implodes. Because it will implode, right? ...right?
Microsoft won on the desktop by being developer-friendly and Apple won in mobile by being developer-hostile. Developers never had anything to do with it.
1 The closest we have right now is Google vs. Yahoo vs. Microsoft, ie. a bunch of clueless losers shaking their fists at each other. Google isn't winning in search/advertising because of their awesome strategy; they're winning because the competition keeps producing crap. Which is an okay reason to win, but it's not thrilling. Apple vs. World is thrilling.
2 Actually #4 isn't even strictly true; last I heard, there are still way more Blackberries than iPhones in active use. But people believe it's true, which is all that matters for this discussion.
3 To be honest, the Apple app store was kinda fascist from day 1, so we're just comparing on a relative scale here. But people get upset anyway.
4 All apps written in Java or Flash are ugly and stupid, so end users benefit directly from this restriction. Another reason for Apple to reject such cross-platform apps is admittedly self-serving: if you own the leading platform, you will always get the app. So if you make it hard to port apps between platforms, you're sabotaging the other platforms, not yours.
5 Ha ha, I just called the Blackberry "open," even though for years the only language you could use to develop apps for it was Java. Somehow Java people manage to spin horrible restrictions as features. "100% Pure Java!" and so on.
Three types of distributed system designers
1. Paranoid privacy nuts. Systems designed by these people never become popular because paranoid people don't have any friends. (Examples: ZKS, Freenet, GPG.)
2. Redundancy leeches. These people want to back up their files (encrypted) to your computer for added redundancy. Unfortunately, you gain nothing by doing this for them; there's no way to force a leech to contribute space back to other leeches. So these tend to end up as for-profit services. (Examples: AllMyData, Dropbox, S3.)
3. Sharers. These people have data they want to share with other people. They benefit by giving you the data; you benefit by receiving the data, and if you like it, you'll feel nice by sharing it further. (Examples: Debian, Wikipedia, BitTorrent.)
(Free) distributed storage systems in groups 1 and 2 never seem to succeed, because there's no network effect to drive growth.
Systems in group 3 succeed regularly. And they don't need encryption.