Can a little friction be a good thing?


A couple of years ago I posted on the issue of transaction costs. That post was focused on the many unseen and unaccounted for costs of enterprise purchases of new products. I think all those costs are still relevant today and B2B startups would do well to consider how to reduce transaction costs for their customers.

But what about transaction costs in the consumer market? I hadn’t given that any thought until reading Kevin Roose’s article in The New York Times entitled Is Tech Too Easy to Use? My initial reaction was, absolutely not! Given the problems my wife and mother have in using new technology, and my occasional frustrations, especially over passwords, tech is still too hard for many people.

Mr. Roose equates friction with making a product more difficult or time-consuming to use. He makes a good case that social media like Facebook and Twitter have enabled bad actors to spread disinformation because it’s both too easy to create accounts and too easy to post and share information – in both cases by design.

I rarely use YouTube but, having finished the newspapers and perused Flipboard, my go-to news aggregator, I was looking for something else for a diversion. So I decided to see what might be of interest on YouTube. And there was a video interview with Bryan Ferry, founder of Roxy Music, one of my favorite bands. It was well worth watching. But before I had a chance to give any thought to what else I might like to look at, YouTube took over the user controls and started playing another video! They claim this is a feature – it eliminates friction – but as far as I’m concerned it’s a bug. I don’t mind recommendations, such as those from Amazon and Netflix; occasionally they point me to an interesting book or video. But I do mind YouTube deciding not only what I should watch but auto-playing it before I have had time to think about what else I’d like to view. Talk about friction – I have to stop the video that YouTube chose for me, go back to the home page, and choose another video. I’m sure their research shows most people are too lazy to do that – just as they’ve proven too lazy to change TV channels.

Mr. Roose has many other examples of the problems tech can generate when it makes things easier and faster to use. After reading his article I agree with him that quick and easy is not always best. As he writes, “We wouldn’t trust a doctor who made speed a priority over safety. Why would we trust an app that does?”

But what was really striking to me was the last story in his article, about the co-founder of Tulerie, who sent out a Google survey, which she thought would act as an invitation to join, to hundreds of prospective members. Long story short, it failed – only one person filled out the survey! But like any good, persistent founder, Merri Smith tried again to acquire members. This time she took a completely different tack. Rather than the very easy-to-use survey, this time anyone who wanted to join had to conduct a brief video call with a company employee first. This worked almost too well.

Logically, the new strategy should have failed. But it was a huge hit. Prospective members flooded the invite list, filling up the company’s interview schedule weeks in advance. By creating a more complex sign-up, Tulerie had signaled that its service was special and worth the effort. “It goes back to values,” Ms. Smith said. “People perceive it as harder to get into, and they want to be a part of it.”

So while I’m still a fan of making things quick and easy, as I’m a very impatient person, I can see the merit of doing just the opposite, when appropriate. When you are designing your next product, give some thought to adding a little friction, or maybe not taking so much out, as your use case might parallel that of Tulerie, where users actually assign value to an onboarding process that requires some effort. And perhaps even some thought!

The pros and cons of “defaults”


I first came across the concept of a default in a computer program when I was product manager for VisiCalc, the first electronic spreadsheet. The idea that a programmer could pre-select a setting or option for the user seemed very powerful to me. Having options is always good – except when too many options become annoying or result in “paralysis by analysis” and cognitive overload.

The art of choosing defaults or what are now commonly called “preferences” is a balancing act between annoying or confusing the user and providing them with the ability to personalize the app to meet their needs.
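That balancing act can be made concrete in a few lines of code. Here is a minimal sketch of the pattern most apps use – ship sensible defaults, and let any explicit user choice win. The preference names here are invented for illustration:

```python
# A common defaults pattern: ship sensible defaults, let explicit
# user choices win. Preference names here are hypothetical.
DEFAULTS = {
    "badge_icons": True,       # pre-selected for the user
    "dark_mode": False,
    "notifications": "banner",
}

def effective_prefs(user_prefs):
    """Merge the user's choices over the defaults; unset keys fall back."""
    return {**DEFAULTS, **user_prefs}

prefs = effective_prefs({"dark_mode": True})
print(prefs["dark_mode"])    # True  – the user's explicit choice
print(prefs["badge_icons"])  # True  – the untouched default
```

The design point worth noticing: an upgrade should only ever rewrite the fallback table, never the user’s own saved entries.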

It’s been a few years since I’ve been involved in designing software, so I hadn’t given the concept of defaults much thought, if any, until recently. Like a good Apple user I dutifully downloaded and installed the latest version of iOS for my iPhone. All seemed fine until I noticed that the icon badges for email and messages had disappeared from my iPhone! Icon badges are very useful, as they inform me not only if I have new mail, but how many unread messages I have. This is the first thing I check on my phone after coming out of a meeting.

I turned to my universal tech support provider, Google, to find that iOS has a preference to turn off badges, not only for email but for text messages as well. Somehow the new version had reversed the icon badge setting. I had to dig into my iPhone’s preferences to change the badge icons from “Off” to “On” for mail and messages. Not that difficult, but annoying, and I found a change made for me without notification a bit disturbing. What other defaults might be reset by new versions of iOS, with no notice or choice?

Those of you designing software for smartphone apps: give careful thought to which features have default options and how you set those defaults. Those choices can result in either a smooth, streamlined user experience or one marred by annoyance. And changing defaults without notice to the user erodes trust in your app or operating system.

The concept of defaults can also be used in contexts other than computer user interfaces. For example, in my first startup we decided that everyone should get stock options, depending on two parameters: when they started with the company and what level of position they held – VPs got more than directors, who got more than managers, who got more than staff (though star individual contributors could demand, and merited, more stock options). But we soon learned that the default of granting everyone options was not a wise move. It turned out that a significant minority of our hires would have preferred a higher salary in lieu of stock options. This group tended to be the family breadwinners (what a strange, archaic phrase that is!) who needed more cash to pay their bills. Young, single people with no family to support and no mortgages preferred stock options. So we redesigned our compensation plan to give new hires a choice: take a higher salary with no stock options, or opt for a lower salary with stock options. Our CFO calculated the trade-off numbers for new hires, and we found that new hires were now more satisfied with their compensation plans.

Note we only offered two choices, for simplicity’s sake. While some people might have wanted a slight salary decrease in exchange for some stock options, it is hard enough managing your employee compensation plan and cap table without having an infinite number of combinations of salary and stock options.

Going beyond a simple “on” or “off”, or “opt in” or “opt out”, risks generating needless complexity. Thus the brilliance of Facebook’s Like button vs. the typical five stars used for ratings on other platforms, such as Amazon. Facebook makes the decision very easy: “Like” or don’t. And the generator of the content can simply measure the number of Likes, rather than attempting to compute an average star rating. Or should that be the median number of stars? Or maybe the mode?
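To see why the Like model is simpler for the content creator, compare the arithmetic. A quick sketch using Python’s standard statistics module, with ratings invented for illustration:

```python
from statistics import mean, median, mode

# Hypothetical five-star ratings for one product
stars = [5, 5, 4, 1, 1, 1, 3]
print(round(mean(stars), 2))  # 2.86 – the average
print(median(stars))          # 3    – the middle value
print(mode(stars))            # 1    – the most common rating

# The Like model collapses all of that into a single count
likes = [True, True, False, True]
print(sum(likes))             # 3 Likes – nothing to argue about
```

Three defensible summaries of the same star data, versus one unambiguous count of Likes.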

There’s another term I like to borrow from software engineering: combinatorial explosion. A combinatorial explosion results from the multiplier effect of stacking choices or options on top of other options, such that the number of combinations the user faces grows exponentially. That’s what our CFO would have faced had we offered new hires any combination of salary and stock options that struck their fancy.
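A back-of-the-envelope sketch makes the growth vivid (all the counts here are hypothetical):

```python
# Independent choices multiply. With our binary plan the CFO
# priced exactly 2 packages; free-form mixing multiplies that.
salary_levels = 10          # hypothetical: 10 possible salary figures
option_grants = 10          # hypothetical: 10 possible grant sizes
print(salary_levels * option_grants)  # 100 packages to price

# And options stacked on options grow exponentially:
binary_prefs = 20           # twenty independent on/off settings
print(2 ** binary_prefs)    # 1048576 distinct configurations
```

Two choices versus a hundred, and twenty toggles versus a million configurations: that’s the explosion.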

As Einstein is reputed to have said, “Everything should be made as simple as possible, but not simpler.” I’d say that applies well to options or preferences, whether in technology or business.


What makes a good product?


The truism about a good product is that it solves a customer problem. But that’s just one clue amongst many. Jeff Davidson’s article in Medium’s The Startup, The 10 Commandments of Good Products – What Defines Value?, will help the entrepreneur build a lasting product that delivers value.

1 · It Makes an Undesirable Task Easier

Ease of use is certainly one of the defining qualities of a great product. Apple built an entire company around that value. And almost by definition, if you think about it, if you are solving a problem you are making an undesirable task – the problem – easier by providing a solution. But engineers and marketeers can be afflicted with scope creep – adding feature after cool feature until ease of use gets buried under a blizzard of features. Mitch Kapor hit the grand slam home run of products by brilliantly integrating the spreadsheet with charts and graphs. But his later products suffered from trying to jam 20 pounds of features into a 10-pound bag – Symphony, Jazz, and Agenda, to name three straight failures. The other reason for Apple’s great success was Steve Jobs’s insistence on simplicity and his vaunted ability to say “no” to features – sometimes even to Apple’s detriment. As Einstein supposedly said, “Make things as simple as possible, but no simpler.” Ease of use is tied in with simplicity of use. As the film directors say, be willing to leave your babies on the cutting room floor if those favorite scenes don’t advance the story.

2 · It Has Focused Value

Jeff has a truly great insight here and he knows it, as it’s in bold: Users have to imagine value before experiencing it. Jobs’s genius for marketing provides yet another example with his tagline for the iPod: “1,000 songs in your pocket” – imagine that! And millions did! Frankly I’m not a fan of the phrase “focused value,” but the translation – your product does a few things, very well – is the lesson here.

3 · It Lasts

Jeff has two great examples. The iPhone is basically unchanged in value delivery and form factor after 10 years – its specs have improved dramatically but its functionality is basically the same. The Vespa scooter is another product that has remained relatively unchanged for decades. The term for something that lasts is classic. In today’s disposable, attention-deficit culture, strive to create a classic, not a pop hit – here today, gone tomorrow. Jeff’s other point about lasting is that great products don’t fall apart or break down repeatedly. One reason why Apple leads the world in customer satisfaction surveys is durability. You need to engineer your product to survive every use case you can imagine. For physical goods you need not only great engineering but great manufacturing to build lasting products. The mirror image of lasting is planned obsolescence: products engineered to fall apart or become less useful to drive the customer to buy the next model. While this seems like a profitable strategy in the short term – please Wall Street with great quarterly revenue numbers – it’s a poor strategy in the long term. Why? Because you betray customer trust! And when you lose trust you lose loyalty. By building products that last you will build customer trust and loyalty – that’s where real value lies.

4 · It Has Aesthetic Appeal

Apple leads the world in this category. But how about Tesla for a change? Not just a breakthrough concept, but a beautiful car, inside and out. While there’s an occasional counterexample, like Craigslist – born ugly and still ugly today – in general you want your product to be attractive in both senses: in the verb sense, it attracts customers, and in the adjective sense, it appeals to the senses.

5 · It is Intuitive

The saying “You only have one chance to make a first impression” is generally applied to people, and often used to coach candidates for job interviews. But it is equally true for products. If your product has a steep learning curve, like Mitch’s Agenda, relatively few people will invest the time to learn how to use it. The ratio of value to time invested must be high. As Jeff points out, we buy when we try, which is why smart vendors offer free trial periods and friendly return policies. The days of shipping 5 pounds of manuals with a PC software application are long gone – replaced by perhaps three or four slides on the product’s home page, if that. Patience is going down; demand for immediate gratification is going up. That means you need to deliver attractive value immediately, even if it takes a while to learn all the features. Jean-Louis Gassée had the concept of the Z-axis in products: your immediate impression is that the product is intuitive, but it’s also deep, not shallow. Of course, there are exceptions to this rule, like Photoshop, but products like Photoshop are generally aimed at professionals who are willing to invest more time and effort to get a greater return – a concept I call ROTI: Return on Time Invested. We often quickly forget how much we paid for something if the time we invest in learning to use it is small compared to the benefit the product delivers.

6 · It is Efficient

The term I like, which I learned early on from software engineers, is elegance. The scientific definition is pleasingly ingenious and simple. As Jeff points out, great products are usually built by small teams, not, as Bill Gates once said of IBM, by “masses of asses”! The joke amongst developers is that a camel is a horse designed by a committee. Watch out for superfluous additions such as heavy stylization. Great products have a high value-to-resource ratio.

7 · It is Visceral

Here’s a term I have not run across, despite being a psychology major: stimulus-response compatibility. Stimulus-response compatibility refers to the degree to which a person’s perception of the world is compatible with the required response. Being visceral means going beyond simply looking good – eye candy. Products should feel good, sound good, and even smell good. A good product is pleasing to many senses.

8 · It Satisfies the Seeking System

The term seeking system is also a new one on me, but it’s what’s behind the drive for novelty, which I wrote about in the post Novelty – the driving force for today’s consumers. Jeff defines the seeking system as humans’ intrinsic drive to explore, try novel things, challenge themselves, and learn. It’s the driving force behind entertainment and social products. The seeking system is an intrinsic motivation – people want to be in control but they also want to explore. Thus the enduring value of the search engine.

9 · It Serves as an Expression of the User

Thorstein Veblen discovered this decades ago; it’s called conspicuous consumption. Fashion is the ultimate “expression of the user.” This is why aspirational marketing works so well. A product is purchased not for its utility value, but as a reflection of the status of the consumer. Apple’s white earbuds soon became such a symbol – they broadcast the message that the wearer was cool because they owned an iPod! Being the least fashion-conscious person I know – I was once called by a Lotus executive “the worst dressed executive I ever met” – I’ll leave this more to the marketeers than the product designers. However, Jeff throws in something I think is much more important to the product designer: the IKEA effect: people become attached to their creations and products, and they start to define themselves by them.

10 · It Helps People

Helping other people is the number one value I look for in a venture. If your mission statement doesn’t say something of the form: “Our product helps people do X” then I tend to be skeptical about the venture. I look for products that deliver utility, not novelty. Utility lasts, novelty does not. If your product helps people it will last, it will become a classic, it will succeed. That’s why I would list this quality first, not last.

I don’t know anyone who’s built a great product by following a checklist. Founders need to absorb the characteristics of great products, not try to mimic them. One way to do that is to own or experience great products. Steve Jobs owned and used a BMW motorcycle and MartinLogan electrostatic speakers – best-in-class products. You don’t need to buy everything, but try borrowing products that are classics – test drive a Tesla even if you can’t afford one now. By using and studying great products, whether the classics of yesterday by German consumer products company Braun, or the classics of today from Apple and Tesla, you will instinctively incorporate these qualities into your product. They must come from the inside out; they can’t be pasted on from a checklist.

A Prototype Is Worth 1000 Meetings


As the saying goes, “Great minds think alike.” Not sure that it’s really true, but in any event I actually wrote two posts on Mentorphile based on the same saying, A picture is worth a thousand words:

Why a prototype is worth 1,000 pitch decks

Sequel to a prototype is worth 1000 slides

But the article by Tom and David Kelley, of IDEO fame, “If a picture is worth 1000 words, a prototype is worth 1000 meetings,” still has some valuable points for founders doing pitches. (And if you are not familiar with IDEO, they are a global design firm – check out their site.)

Given how important design has become, I recommend all founders learn at least the rudiments of design. Here are some valuable words of advice from the article:

Getting early feedback and making updates in a prototype is much more convenient and costs almost nothing compared to making changes in an implemented system.

and

Without a prototype, it’s only a concept. It can be difficult to get a potential client to commit to the purchase of a concept. –Steve Upton


When the platforms change the players change. Watch out!


I’m not sure where I heard the adage that when the technology platforms change, the players – meaning companies – change with them, but it’s a good one for would-be entrepreneurs: those with a real bias towards doing a startup, like three MIT grads I met with recently who really wanted to do a startup but didn’t know where to start. Watching for the next wave is one way to launch a business that will have the advantage of a megatrend behind it.

I went through two of the major instances of platform changes enabling new competitors, which we missed not once, but twice. If you are running an established company rather than a startup, you or a co-founder should spend a few percentage points of your time scanning for imminent platform changes in the technology environment, and when you see a massive change coming, get all hands on deck to prepare for it.

The Apple 6502 Platform

Here’s the first example. In 1979 Dan Bricklin and Bob Frankston started a company called Software Arts, Inc. to commercialize Dan’s invention of the electronic spreadsheet, to be named VisiCalc. There were lots of PCs around at the dawn of the personal computer era in the late 1970s, but no single company had yet developed a platform, meaning a framework or environment in which end-user applications can be developed and run. Back in those days there was a strict delineation between systems software, what we now call the operating system, and applications software, like word processors, now called apps. Dan and Bob wisely chose the Apple II from Apple Computer, Inc. and its 6502 processor as their delivery platform. If I recall correctly, a Prime minicomputer played a key role in development, though VisiCalc was coded in assembly language for the 6502 processor. Choosing the 6502 and Apple over the 8080 used by a number of other companies had a direct and highly beneficial effect on Apple’s skyrocketing rise to define the personal computer. As highly influential analyst Ben Rosen wrote of VisiCalc, it was the software that wagged the hardware dog. You can learn a lot more about the early history of Apple in a plethora of books, and though no one has sought to write the history of Software Arts, there’s plenty of information about VisiCalc on Wikipedia. I’ll limit myself to providing you with a very insightful quote from Ted Nelson, a pioneer of hypertext and a hero of mine (deserving of a full post):

VISICALC represented a new idea of a way to use a computer and a new way of thinking about the world. Where conventional programming was thought of as a sequence of steps, this new thing was no longer sequential in effect: When you made a change in one place, all other things changed instantly and automatically.

— Ted Nelson
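The non-sequential model Nelson describes can be sketched in a few lines. This toy (not how VisiCalc was actually implemented) holds each cell as either a plain value or a formula and recomputes on every read, so a change in one cell propagates everywhere automatically:

```python
# Toy spreadsheet: cells are plain values or formulas (callables).
# Reads recompute from current values, so changing one cell
# automatically changes everything that depends on it.
# (A real engine tracks dependencies instead of recomputing.)
class Sheet:
    def __init__(self):
        self.cells = {}

    def set(self, name, value):
        self.cells[name] = value

    def get(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

s = Sheet()
s.set("A1", 10)
s.set("A2", 32)
s.set("A3", lambda sh: sh.get("A1") + sh.get("A2"))  # =A1+A2
print(s.get("A3"))  # 42

s.set("A1", 100)    # change one place...
print(s.get("A3"))  # ...and the sum updates automatically: 132
```

That instant, automatic propagation – no sequence of steps to rerun – is the “new way of thinking about the world” Nelson is pointing at.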

The Apple II became a platform through two breakthrough ideas by Apple co-founder and technical genius Steve Wozniak: building seven slots into the motherboard, which enabled scores of hardware developers to build compatible cards, and by providing an early version of the BASIC programming language, thus making the Apple II accessible to those of us not equipped to program in assembler.

Software Arts built a very nice business on the Apple II, and the strategy was to go broad but not necessarily deep. In other words, a lot of the company’s resources were devoted to creating versions of VisiCalc for other computers – a process called porting – for Atari, Commodore, Radio Shack, and probably a couple of others I’ve forgotten.

The IBM PC PC-DOS Platform

But then along came the IBM Personal Computer, developed in a skunkworks located far from the gravitational pull of IBM’s HQ. The IBM PC was the massive platform change that, unfortunately, Software Arts missed. We were given a very early version of the machine – code-named “Peanut” – which came in on a plywood board and sported an operating system not yet called PC-DOS, an OS presciently purchased by Microsoft from a small developer who had no idea that the market-dominating IBM was soon going to enter the PC business. Well, I have to be careful or I’ll end up regurgitating everything I know about the early days of the PC, which isn’t my point. Others have done that quite well.

What happened with the PC was that Software Arts made a very large mistake. Instead of treating the IBM PC like the market and platform creator that was soon to dominate the business computing world, it treated the IBM PC as yet another port, much like Atari or Radio Shack. So rather than coding a new version of VisiCalc from the ground up to take advantage of the PC’s leading hardware features – function keys, a larger memory address space, built-in floppy drives, a higher-resolution screen, and more – the decision was made to use a cross-assembler. It got VisiCalc onto the PC fairly quickly. But Mitch Kapor, who had been the product manager for VisiCalc at Personal Software, the publisher and distributor of VisiCalc, had left that position to develop his own program for the Apple II: VisiTrend/VisiPlot. That graphing program used the same file format as VisiCalc, so it was easy for users of VisiCalc to import their data into VisiTrend for analysis and plotting. Mitch wisely sold the program to Personal Software. Many of us at Software Arts who knew Mitch, like me, expected him to retire to Hawaii, given his penchant for Hawaiian shirts. But Mitch had developed several other programs, so he was deeply embedded in the personal computer industry, and he presciently spotted the IBM PC for what it was – a groundbreaking new platform that would enable programmers to take advantage of its more sophisticated OS, 8088 processor, larger memory, extended keyboard, and many other advantages over the Apple II.

Although Steve Jobs brazenly took out full-page ads welcoming IBM to the personal computer industry, the Apple II soon became a speck in IBM’s rear-view mirror. Mitch took on as a partner Jonathan Sachs, whose programming wizardry rivaled that of Software Arts’ Bob Frankston – or so I recall Dan telling Bob in an exasperated tone. To cut to the chase, Mitch was the designer of the most popular program for the PC, and Jonathan Sachs the programmer. By designing and developing Lotus 1-2-3 from the ground up for the IBM PC and incorporating the graphing and data analysis features of VisiTrend/VisiPlot, Lotus soon left VisiCalc a speck in its rear-view mirror. The platform changed, from the Apple II to the IBM PC (and its clones), and with it the players changed: from Software Arts, Inc. to Lotus Development Corporation.

The Microsoft Windows Platform

But sharp as Mitch was, it was his turn to miss the next platform change. Bill Gates at Microsoft had developed a competitor to VisiCalc; like all the other competitors I tracked as VisiCalc product manager, the pre-IBM Multiplan went nowhere very quickly. But it gave Microsoft valuable experience in developing a business spreadsheet, experience that was put to good use as Microsoft drove the next huge platform change: from the command-line interface of PC-DOS and MS-DOS to the GUI of the Windows operating system. I remember first seeing an early version of Windows, and it looked very clunky. But Microsoft made up in persistence for what it may have lacked in visual design, and by Windows version 3.1 Microsoft owned the next, and virtually last, platform for the PC, becoming a multi-billion-dollar public company on the back of its graphical user interface and the Excel spreadsheet, written by Microsoft to take full advantage of its own operating system. There’s tons more inside-baseball history to this platform change; Wikipedia and Google can fill in the many details. But the moral of the story is that Bill Gates, like Mitch Kapor, leveraged his early foray into PC application programs to totally dominate the next OS of his time.

The Smartphone Platform and the Rise of Mobile Computing

But next it was Microsoft’s turn to miss the platform change, and yet again the players changed with it. That platform change was driven by Apple and its development of the market-making iPhone and its brilliant accompanying driver, the App Store. I vividly recall the derision of the iPhone by Steve Ballmer, who had replaced Bill Gates as CEO of Microsoft. Check out the YouTube video Ballmer Laughs at iPhone or read That Time Steve Ballmer Laughed at the iPhone. Ballmer and Microsoft had become fat, happy, and complacent. They totally missed the smartphone and app platform change, and Apple did what no one ever believed possible: it not only became a more valuable company than Microsoft, but created a new platform that blew away Microsoft and the rest of the legacy PC world as well.

So you know the saying: those who don’t learn from history are condemned to repeat it. Well, platform change has happened yet again! I knew well before most anyone that AI would become the next platform change. But that knowledge did me no good, as I didn’t know when or why. But neural networks, which had been left dead and buried by influential MIT computer scientist Marvin Minsky, rose from the dead in the form of today’s machine learning. The infrastructure that enabled that resurrection was vastly more powerful computing, almost infinite storage, and, most importantly, the reams upon reams of data to feed into neural networks to enable machine learning. Google spotted the AI trend early on and wisely scooped up many of the world’s AI experts to help it make use of the power of AI. Apple, which led the iPhone platform change, was too busy taking selfies with celebrities to spot this platform change, which, unlike the others, was not an operating system change but an application driver – AI – change.

The Smart Assistant and the Rise of the Voice UI

And there is yet another platform change to discuss: the move towards voice control of consumer devices, from iPhones to TVs, and a wholly new platform pioneered by, of all companies, Amazon, with its Alexa smart assistant brilliantly wedded to its family of smart speakers, the Echo.

Again Apple had the early lead with smart assistants, just as it had with the Apple II, in the form of Siri, the virtual assistant purchased by Apple. But Apple made much the same mistake as Software Arts. Instead of capitalizing on its early lead with Siri and investing the huge resources at its fingertips, Apple applied the porting strategy, making Siri available across its family of operating systems – iOS, macOS, watchOS, and tvOS – and on its very-late-to-the-party, overpriced HomePod smart speaker. Apple is now hellbent on catching up with Google, evident in its poaching of Google’s head of AI to take the same position at Apple. And having this position report directly to CEO Tim Cook demonstrates the attention Apple is now putting on AI. But they are indeed playing catch-up: their HomePod smart speakers were generally lauded for sound quality but heavily criticized for the very weak version of Siri they shipped with. Jeff Bezos, after failing mightily with Amazon’s attempt at manufacturing a smartphone, has mimicked Apple’s iPhone killer app, the App Store. Only Bezos calls smart speaker apps Skills. But whatever you want to call them, he’s got thousands of programmers developing for the Alexa platform, with 10,000-plus skills available. And in a move that mirrors Microsoft’s licensing of Windows and Google’s licensing of the Android operating system for smartphones, Amazon has licensed Alexa for use with other companies’ devices, including cars!

In Summary

So there you have it. Multiple platform changes from about 1981 to the present, all missed by the incumbent king of the previous platform. The lesson here is the by now stale quote from hockey all-time great Wayne Gretzky: I skate to where the puck is going, not to where it’s been. Unlike my call on AI – I’d become fascinated by reading many of the AI books from my employer, publisher Addison-Wesley Publishing Company – I have no idea what’s coming next. Certainly one could argue that cryptocurrencies are the platform du jour. And like the platforms before it, in all probability one or two companies will rise to dominate the industry. Cryptocurrencies and their partner in crime, the blockchain, are about where the personal computer was before the release of the IBM PC – just getting started, with no dominant player. So whether it’s robotics, quantum computing, biological computing, gene editing, virtual reality, or my personal bet, holograms, there are many candidates to choose from. Entrepreneurs need to know some technology history and, just as important, keep their antennae up for the Next Big Thing by reading voraciously and attending industry events outside their domain of expertise. I’ll leave you with a well-known quote from Alan Kay: The best way to predict the future is to create it. And here’s Alan on YouTube explaining what that quote means for developers today.



Let’s get rid of those damn beeps, please!


I don’t know about you, but my world is infested with beeping machines, from my car to our microwave oven to my wife’s flip phone and our dishwasher. Beep! Beep! Beep! Some are even so stupid as to never time out – forcing me to find the source of the beep, not always easy in a three-story house, to shut the damn offending gizmo off.

I’m certainly not opposed to alerts and notifications, in fact LetMeKnow was one of my many startups that didn’t go anywhere, as during the customer discovery process I learned that my target market was wedded to using Twitter to alert their followers and had no interest in my specialized mobile phone alert app.

I do understand the historic origins of the beep. Decades ago memory was measured in bytes, or eventually kilobytes. That’s right folks, Bob Frankston wrote VisiCalc, the first electronic spreadsheet, in 16 KB! Try that, coding hot shots! So bits and bytes were hoarded, which led Dan Bricklin, the designer of VisiCalc, to substitute inscrutable commands consisting of a slash followed by the initial letter of the command – he didn’t want to spend bytes on entire words. By the time the IBM PC launched and eclipsed the Apple II – birthplace of VisiCalc and hundreds of other apps, then known as programs or software – memory was in sufficient supply to enable Mitch Kapor to not only use entire words in his command line but also to package together a spreadsheet and a graphing program: a winning combination.

But enough of history! Today we are blessed with gigabytes of memory. Even my dishwasher probably has more memory and computing power than the Apple II. Yet we are still stuck with this infernal beeping. The shining star in nagging customers elegantly, however, is the smartphone. Ringtones became an industry in and of themselves in the early days of the lowly flip phone, and today I have such ringtones as Apex, Beacon, Hillside, Playtime, Ripples and Sencha. Yeah, and a free trip to Cupertino for anyone who can tell me the rest of the ringtone set without reference to their iPhone. Users loved ringtones and the ability to change the sound of their notifications. Not only were they fun, but they increased the chances that users could tell their ringing phone from everyone else’s.

Now, in an endeavor to deliver the smart home, we have refrigerators with cameras inside them so we can see what’s inside, and what’s not, while shopping. So manufacturers, please don’t tell me your gadget lacks the memory or processing power to deliver a pleasant chime or other calm notification. And why can’t I buy a microwave oven alert from Brian Eno? Brian would have a whole new income stream, and my early morning ill humor would be banished as I listened to a calming chorus of ambient sound alerts from my cluster of kitchen appliances.

Not being the political type, I have no idea how to get today’s appliance (and car) designers to banish the beep, or at least reserve it for when it’s needed as an alarm, not just a simple notification. Personally I’d still prefer the sound of waves crashing on a beach when my phone wants to tell me it needs recharging. But your tastes may differ, which is my whole point: the 21st century is the age of personalization. Wake up, designers, and banish the beep!

Design fiascos!



While I’ve been a loyal Apple customer since 1980, and must have spent tens of thousands of dollars personally and through my companies on their equipment, software and services, I’ve got a gripe about Apple. They seem to be sacrificing functionality and usability for appearance.

I no longer own a laptop. I write all my blog posts at home on my nice 27″ screen iMac. It’s several years old but very serviceable. Unfortunately that doesn’t seem to be the case with the latest MacBook and MacBook Pros. According to Mashable, a class action suit was filed on Friday against Apple over their defective keyboards. The plaintiffs complain that the new “butterfly” keyboards fail often – twice as often as older models – and are very expensive to replace. The reason, as I understand it, is that Apple went with the butterfly keyboard because it allowed the notebooks to be thinner. I get it, Tim Cook subscribes to that saying, You can’t be too rich or too thin. Adding insult to injury, to accommodate the thinner form factor the keys have less travel. Travel is the distance the key goes from its normal position to being fully depressed. Touch typists like me demand sufficient travel from Apple – and they normally comply, as with the very nice wireless keyboard I’m typing on now. I guess I won’t be buying an Apple laptop any time soon.

But at least no one has died from using a defective keyboard! Today’s story in The New York Times got my normally high-pressure blood boiling: Deadly Convenience: Keyless Cars and Their Carbon Monoxide Toll. As usual, The Times sub-title tells you all you need to know:

Weaned from using a key, drivers have left cars running in garages, spewing exhaust into homes. Despite years of deaths, regulatory action has lagged.

Without having to turn and remove a key to shut off the motor, drivers can be lulled into mistakenly thinking that the car has stopped running.

Have these car designers never used a financial site on the web, like their bank or PayPal? Software designers have a best practice for sites or apps that hold sensitive information: the timeout function. The concept is simple, as is the execution: the app monitors keyboard and mouse activity, and after some set period of inactivity, usually about 5 minutes, it logs you out. So if you are using a computer in a public place, for example, and forget to log yourself out, the program will do it for you. Last night I even got logged out of a site that sells CDs because I’d been inactive for about 10 minutes. Bravo to them!
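For the programmers in the audience, the timeout function really is that simple. Here’s a minimal sketch in Python; the class name `IdleSession` and the 5-minute default are my own illustrative choices, not any particular site’s implementation:

```python
import time

class IdleSession:
    """Sketch of an inactivity timeout: log the user out after a
    set period with no keyboard or mouse activity."""

    def __init__(self, idle_limit=5 * 60):  # default: 5 minutes, in seconds
        self.idle_limit = idle_limit
        self.last_activity = time.monotonic()
        self.logged_in = True

    def touch(self):
        # Call this on every keystroke, click, or request.
        self.last_activity = time.monotonic()

    def check(self):
        # Call this periodically; logs the user out once the idle
        # limit has elapsed since the last recorded activity.
        if self.logged_in and time.monotonic() - self.last_activity > self.idle_limit:
            self.logged_in = False
        return self.logged_in
```

Real web apps do this server-side with expiring session tokens, but the logic is the same: every bit of activity resets a clock, and when the clock runs out, you’re logged out automatically.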

So why can’t car designers emulate this very important function in keyless cars, where drivers use a wireless key fob and a push button to start the car? You can read the whole article to learn about the villains in this tragedy, which has cost many people their lives and cost many others their quality of life and brain power. I’ve got one of those cars, and my engine is so quiet that I’ve gotten out of the car without shutting it off, especially in a noisy area where I don’t hear the warning beep when I exit the vehicle without pressing the button that both starts and shuts off the engine. Fortunately for me, I don’t have a garage, or I might be one of those victims The Times writes about, who drove their car into their attached garage and left the engine running, and running, and running, until well after the carbon monoxide killed the occupants of the house.
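The same auto-logout idea maps directly onto a keyless car. Here’s a hypothetical sketch, again in Python; the class name, the signals (`fob_in_range`, `in_gear`), and the 10-minute delay are all my assumptions about how such a feature might work, not a description of any real car’s control software:

```python
import time

class EngineMonitor:
    """Hypothetical sketch: auto-shutoff for a keyless-ignition car,
    mirroring a web app's inactivity timeout."""

    def __init__(self, delay=10 * 60):  # assumed: 10 minutes, in seconds
        self.delay = delay
        self.engine_on = True
        self.last_attended = time.monotonic()

    def driver_present(self, fob_in_range, in_gear):
        # Any sign the driver is attending the car resets the clock.
        if fob_in_range or in_gear:
            self.last_attended = time.monotonic()

    def tick(self):
        # Called periodically by the car's control unit; shuts the
        # engine off after too long with no sign of the driver.
        if self.engine_on and time.monotonic() - self.last_attended > self.delay:
            self.engine_on = False
        return self.engine_on
```

If the fob has left the car and the transmission has sat in park for ten minutes, shut the engine off. That’s all the timeout function asks of a car designer.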

One startup I’ve often thought of but never pursued is a non-profit that would track, record, and broadcast best practices across industries and institutions like hospitals. Clearly the software development best practice of automatically logging users out after a set period of inactivity absolutely must be emulated by car manufacturers. Caution: reading this article may get your blood pressure well above normal!