You might not realize it, but there's an imminent phone number shortage. It's been building for a while, but the problem has been mitigated by people using "PBXes", which basically add a 4-5 digit extension to the end of your phone number to expand the available range. The problem with PBXes is that they don't work right with caller ID (it makes it look like a bunch of people near each other all have the same phone number), and you can't easily direct-dial PBX extensions from a phone's integrated address book unless your phone has some kind of special "PBX penetration" technology. (PBX penetration is pretty well understood, but not widely implemented.)
Even worse: it's no longer possible to route phone calls hierarchically by the first few digits. Nowadays any 10-digit U.S. phone number could be registered anywhere in the U.S., and area codes change all the time.
So here's my proposal. Let's fix this once and for all! We'll double the number of digits in a Canada/U.S. phone number from 10 to 20. No, wait, that might not be enough to do fully hierarchy-based call routing, let's make it 40 digits. But that could be too much typing, so instead of using decimal, we can add a few digits to your phone dialpad and let you use hexadecimal instead. Then it should only be 33 digits or so, with the same numbering capacity as 40 decimal digits! Awesome!
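The digit arithmetic above checks out, as a quick back-of-the-envelope sketch shows (this is just the logarithm, not part of any real numbering plan):

```python
import math

# How many hexadecimal digits have the same numbering capacity as
# 40 decimal digits? Solve 16**h == 10**40, i.e. h = 40 / log10(16).
decimal_digits = 40
hex_digits = decimal_digits / math.log10(16)
print(f"{hex_digits:.1f}")  # a little over 33, hence "33 digits or so"
```

(Strictly you'd round up to 34 whole digits to cover the full range, but "33 digits or so" is close enough for a proposal of this rigor.)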
It'll still be kind of a pain to remember numbers that long, but don't worry about it, nobody actually dials directly by number anymore. We have phone directories for that. And modern smartphones can just autodial from hyperlinks on the web or in email. Or you can send vcards around with NFC or infrared or QR codes or something. Okay, those technologies aren't really perfect and there are a few remaining situations where people actually rely on the ability to remember and dial phone numbers by hand, but it really shouldn't be a problem most of the time and I'm sure phone directory technology will mature, because after all, it has to for my scheme to work.
Now, as to deployment. For a while, we're going to need to run two parallel phone networks, because old phones won't be able to support the new numbering scheme, and vice versa. There's an awful lot of phone software out there hardcoded to assume its local phone number will be a small number of digits that are decimal and not hex. Plus caller ID displays have a limited number of physical digits they can show. So at first, every new phone will be assigned both a short old-style phone number and a longer new-style phone number. Eventually all the old phones will be shut down and we can switch entirely to the new system. Until then, we'll have to maintain the old-style phone number compatibility on all devices because obviously a phone network doesn't make any sense if everybody can't dial everybody else.
Actually you only need to keep an old-style number if you want to receive *incoming* calls. As you know, not everybody really needs this, so it shouldn't be a big barrier to adoption. (Of course, now that I think of it, if that's true, maybe we can conserve numbers in the existing system by just not assigning a distinct number to phones that don't care to receive calls. And maybe charge extra if you want to be assigned a number. As a bonus, people without a routable phone number won't ever have to receive annoying unsolicited sales calls!)
For outgoing calls, we can have a "carrier-grade PBX" sort of system that maps from one numbering scheme to the other. Basically we'll reserve a special prefix in the new-style number space that you'd dial when you want to connect to an old-style phone. And then your new phone won't need to support the old system, even if not everyone has transitioned yet! I mean, unless you want to receive incoming calls.
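The reserved-prefix mapping could be sketched like this - the prefix "00FF" and the 33-hex-digit width are made-up values for illustration, not part of any real numbering plan:

```python
# Sketch: embed an old-style 10-digit number under a reserved prefix
# in the new-style hex number space. Prefix and width are hypothetical.
OLD_STYLE_PREFIX = "00FF"
NEW_STYLE_DIGITS = 33

def to_new_style(old_number: str) -> str:
    """Map an old-style 10-digit number into the new-style space."""
    digits = "".join(c for c in old_number if c.isdigit())
    if len(digits) != 10:
        raise ValueError("old-style numbers have exactly 10 digits")
    body = format(int(digits), "X")  # encode the decimal value as hex
    pad = NEW_STYLE_DIGITS - len(OLD_STYLE_PREFIX)
    return OLD_STYLE_PREFIX + body.rjust(pad, "0")

print(to_new_style("416-555-0199"))
```

Any switch seeing the "00FF" prefix would know to hand the call off to the old network - much like how IPv4-mapped IPv6 addresses work, if you catch my drift.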
Or, you know. We could just automate connecting through a PBX.
June 30, 2014 05:43
I've heard it said that you can just alternate between two UI themes once a week, and every time you switch, the new one will feel prettier, newer, and more exciting than the old one.
This is a natural tendency. The human mind is intrigued by change. That's where fashion comes from, and fads. It gives you a little burst of some chemical, maybe adrenaline (fear of the unknown?), or endorphins (appreciation of the unexpected?), or perhaps some other kind of juice I heard of somewhere but I don't really know what it does.
In tech, this kind of unlimited attraction to the unexpected is the main characteristic of the first phase of the Technology Adoption Lifecycle, the so-called "Innovators."
(Diagram: the Technology Adoption Lifecycle curve. Source: Wikimedia Commons)
Perhaps people are happy to be included in the Innovator category. But Innovation isn't just doing something different for the sake of being different. Real innovation is the *willingness* to take the *risk* to do something different, because you know that difference is expensive, but that it will pay off in some way that more conservative sorts will fail to recognize until later.
In fashion, the end goal is to catch people's attention; if you do that, you are innovative. That's why fashion repeats itself every few years: because you can be innovative over and over again with the same ideas, rehashed forever.
In technology, we can hold you to a higher standard. Innovation requires difference, but it also requires a vision of usefulness. Change is expensive. Staying the same is cheap. Make it worth my while. Or if I'm an Innovator, or even an Early Adopter, at least give me a hint about how it's worth my while so I can exploit it while others are too afraid.
Every needless change creates expensive fragmentation. Microsoft ruled their market by being change averse. So did IBM. So did Intel. Even Apple. Whenever they forgot this, they stumbled.
Change aversion works because what makes a platform successful isn't so much the platform as the complementary products. For a phone, that means third-party power adapters, car chargers, headphones with integrated volume controls, alarm clocks with a connector to charge your phone *and* play your music at the same time. For a PC, it could be something as simple as maintaining the same power supply connector across many years' worth of models, so that anyone who standardizes on your brand will have an ever-growing investment in leftover power supplies plugged in wherever they might want them. For an operating system, it means keeping the same approximate style of UI for a long time, so that apps can learn to optimize for it, and a really great app made two years ago can keep on selling well, perhaps with bugfixes and new features but no need for rewrites, because it still looks like it's perfectly integrated into your OS experience. That sort of consistency allows developers to focus on quality instead of flavour, and produces an overall feeling of well-integratedness. It makes people feel like when they buy your thing, they're paying for quality. And yes, people - moving beyond the innovators into the more profitable market segments of the curve - will definitely pay for quality.
Real design genius lies in the ability to make something look pretty, and with gentle updates to keep it modern looking, without causing huge disruption to your whole ecosystem every couple of years. Following fashion trends, while not caring about disruption, does not require genius at all. All it requires is a factory in a third-world country and some photos of what you want to copy.
Ironically, even app developers mostly fail to recognize just how bad it is for them when a platform changes out from under them unnecessarily. Instead, they get excited by it. Finally, I get to rewrite that UI code I really hated, and while I'm there, I can fix all those interaction bugs I knew we had but could never justify repairing! Because now I *have* to rewrite it!
Redesigning things to match a moving target of a platform is really comforting, because it's a ready-made strategy for your company. The truth is, you don't have to think about what customers want, or how to make the workflow smoother, or how to eliminate one more click from that common operation, or how to fix that really annoying network bug that only happens 1 in 1000 times. Those bugs are hard; this feels like freedom. We'll just dedicate our team to "refreshing" the UI, again, for another few months, and nobody can complain because it's obviously necessary. And it is, obviously, necessary. Because your platform has screwed you. Your platform changed for no reason, and that's why your users can't have what they really need. They'll get a UI refresh instead.
And although they are less productive, they will love it. Because of endorphins, or sodium, or whatever.
And so you will feel good about yourself in the morning.
June 12, 2013 07:35
Apple interface guru Bruce Tognazzini tells this story. The in-box tutorial for novices, "Apple Presents... Apple," needed to know whether the machine it was on had a color monitor. He and his colleagues rejected the original design solution, "Are you using a color TV on the Apple?" because computer store customers might not know that they were using a monitor with the color turned off. So he tried putting up a color graphic and asking, "Is the picture above in color?" Twenty-five percent of test users didn't know; they thought maybe their color was turned off.
Then he tried a graphic with color names shown in their own colors - GREEN, BLUE, ORANGE, MAGENTA - and asked, "Are the words above in color?" Users with black-and-white or color monitors got it right. But luckily the designers tried a green-screen monitor too. No user got it right; they all thought green was a fine color.
Next he tried the same graphic but asked, "Are the words above in more than one color?" Half the green-screen users flunked, by missing the little word "in". Finally, "Do the words above appear in several different colors?"
That was what Apple did for a single throwaway UX question in a non-core part of the product - before its first release. It apparently took about 5 iterations of UX design followed up by UX research before they finally converged on the right answer.
The lesson I learned from that: usability studies are important. But you can't just take the recommendations of a usability study; you have to implement the recommendations, do another study, be prepared to be frustrated that the new version is just as bad as the old one, and do it all again. And again. If that's not how you're doing usability studies, you're doing it wrong.
Maybe I should re-buy that book. I gave mine away at some point. It's kind of indispensable as a tool for explaining software usability research, if only for its infamous "Can you *not* see the cow?" photo.
July 14, 2014 06:32
Today I have new evidence that the human brain is made up of multiple interoperating, loosely connected components. Because I was out buying dryer sheets and there's one with "Fresh Linen" scent. And while one part of my mind was saying, "That's rather tautological," another part was saying, "That's what I always wanted my linen to smell like!" So I bought it, and now you know which one wins.
In the same aisle I found a new variant of soap with the tagline "inspired by celtic rock salt." Now, inspiration can be a hard thing to pin down, but this soap contains nothing celtic and no salt. I'm not even sure there is such a thing as celtic rock salt, or if there is, that it differs in any way from other rock salt, or other salt for that matter. Moreover, the whole purpose of soap is to wash off the generally salty, sweaty, smelly mess you produced naturally, so we'd probably criticize them if it *were* salty, for the same reason people criticize shampoos for stripping your hair of its natural oils only to sell the oils back to you in the form of conditioner. Also, how long has soap had an "Ingredients" section on the package? Why not a nutritional content section? And is it bad when the first ingredient is (literally) "soap"? But I bought it anyway, because Irish. Salt. Mmmm.
Finally, a note to you people who would argue that I'm overanalyzing this. You might define overanalyzing as analyzing beyond the point required to make a decision. Since the analysis figured not one bit into my purchasing decision, by that definition, any analysis at all would be considered overanalysis. And frankly, that just doesn't seem fair.
June 30, 2013 19:38