Lazy Developers November 17, 2008
Posted by Chuck Musciano in Technology. Tags: Interfaces, Irritants, Software, Users
You see it all the time on web forms: the little bit of “advice” next to entry fields for phone numbers and credit cards: “No dashes or spaces.” This drives me crazy!
Let’s understand this: the developer is asking you, the unfortunate user, to format your data to match exactly what his code expects. Because… it’s so hard for computers to get rid of characters that aren’t numbers? No. Because the developer is too lazy to write the code to strip out the unwanted characters you may type.
Ever type in a phone number with dashes, only to be dinged with an error popup chastising you to not type the dashes? Ever put spaces in a credit card number only to be similarly admonished? If so, you have been the victim of a lazy developer, one who deserves to have their keyboard seized and their pocket protector revoked. That’s shameful coding, and it should be punished.
In case you were wondering, it is trivially simple to automatically strip non-numeric characters from numeric fields. It’s just as simple to configure the field so you can’t type them in the first place. In fact, it takes more work to check for the errant characters and pop up a window to irritate you than it does to fix the $%&^# field in the first place! Bad developers will spend more time writing bad code that irritates the user than they will writing good code that makes life easier for the user. Go figure.
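For the record, here is a minimal sketch (in Python, on the server side) of what that cleanup might look like. The function names and the length checks are mine, purely for illustration; they aren’t anyone’s production validation rules:

import re

def strip_non_digits(raw):
    # Throw away anything that isn't a digit: dashes, spaces, parentheses, dots.
    return re.sub(r"\D", "", raw)

def clean_phone(raw):
    # Accept "(321) 555-1212", "321.555.1212", "3215551212", and friends.
    digits = strip_non_digits(raw)
    if len(digits) not in (7, 10, 11):    # loose, illustrative sanity check
        raise ValueError("unexpected number of digits in phone number")
    return digits

def clean_card(raw):
    # Accept "4111 1111 1111 1111" or "4111-1111-1111-1111" alike.
    digits = strip_non_digits(raw)
    if not 13 <= len(digits) <= 19:       # typical card-number lengths
        raise ValueError("unexpected number of digits in card number")
    return digits

print(clean_phone("(321) 555-1212"))        # 3215551212
print(clean_card("4111-1111-1111-1111"))    # 4111111111111111

A couple of lines of cleanup, and the user never sees that pop-up again.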
All software development is about the end user experience. Period. The user experience should be natural, easy, forgiving, and rewarding. It should not be filled with pedantic errors and foolish activities better left to the machine. Developers who deliver anything less should be ashamed, and users should complain vociferously when they are forced to use such systems. Stand up and demand better!
Another Ancient Artifact November 3, 2008
Posted by Chuck Musciano in Random Musings, Technology. Tags: Computing, History
I had another “really old” moment with my son the other day. My first job out of college was with Harris Corporation, and I was explaining how Harris evolved from a company called Radiation. Back in the 1950s, Radiation got its start building telemetry equipment for the space program. I told my son that it was very clever technology for the time, capturing real-time data from rockets and recording it on magnetic tape.
And then I got the blank look. “Magnetic tape? What’s that?”
Certainly we haven’t reached this point with magnetic tape, have we? I scrambled for some common point. Finally I settled on cassette tapes. “Remember how we used to have those cassette tapes? The tape in them is magnetic tape. It’s plastic, coated with iron oxide, and you can record data and music on it. The telemetry was recorded on tape like that, but wider.”
My son nodded in understanding, but it was clear that this was a distant memory, at best. And why not? He grew up in the tail end of the CD era, the last physical media we’ll probably ever know. He manages his data online, shuttled between various devices via networks both large and small. He still likes to buy CDs for the cover art and liner notes, but immediately rips them to iTunes and puts the CD on his shelf.
I’m proud to report that I actually have a nine-track, 6250 bpi tape. (That’s bits per inch, by the way. Much denser than the old 1600 bpi tapes.)
When I moved from my first job at Harris (writing compilers) to my second (researching parallel computer architectures), I dumped all my mainframe programs to tape in case I ever needed them again. Fat chance! I’ve never read that tape, and I’ve never had a need for a crucial snippet of PL/I to complete a project. But I still have that tape because, well, you never know if the need will arise. Now I just need to track down a nine-track, 6250 bpi tape reader. And a matching channel controller for it. And an IBM mainframe. And a 3270 console. eBay, perhaps?
Head In The Clouds June 19, 2008
Posted by Chuck Musciano in Technology. Tags: Computing, History
The latest rage in the world of IT is “cloud computing.” The “cloud” is the internet, often represented as an all-connected puffy blob in countless network diagrams and PowerPoint presentations.
Cloud computing moves your applications away from your local servers and desktops and houses them on servers located in the cloud. Managed by great, benevolent entities like Google, Amazon, and Microsoft, your systems will run better and faster. As butterflies dance around your worry-free head, you’ll be able to focus on your “core competencies,” whatever they may be.
Hmmm. Centralized computing services with local display technology. Where have I heard this before? Oh, that’s right! We used to call it “mainframe computing!” And that local display technology? A 3270 terminal! In the ’80s, we built dedicated display devices called X terminals and used them to connect to centralized servers, where we would run our applications. In the ’90s, we deployed “thin client” devices, moving the storage to the server but shifting the computing power to the device.
Those who forget history are condemned to repeat it.
Still using any of these? Of course not. If we have learned one thing in the past 50 years of computing, it is that users demand more and more local power, control, and capability. With that power they will do new and unforeseen things that will dramatically alter how we use information. Every effort to pull that power in, to restrict what people do, has failed. Trying to pull applications off the desktop and run them remotely may be possible technologically, but it will never succeed socially.
I say this even as I continuously try to standardize and manage a far-flung IT infrastructure for my company. The difference? I accept that there will be local applications and capabilities. My standards seek to embrace and manage that local element, instead of trying to pull it back and eliminate it.
Don’t misunderstand: you can shift certain services and capabilities to the cloud with great success. My company has outsourced several business processes to external service providers. My personal data at home is backed up to an external service called Mozy, which works very well. This blog runs on WordPress.com, instead of some server I manage myself. My personal email is externally hosted as well.
The idea of moving all of my personal data to the cloud and accessing my applications there is incomprehensible. Imagine doing everything (everything!) at the speed of your current internet connection. I have several thousand photos on my laptop at home. I manage them with Adobe Photoshop Elements, which provides a fast, high-fidelity interface that lets me flip through hundreds of pictures in a few seconds. Ever tried that on the web? Go to Flickr and try to preview a few hundred pictures. That’s an enjoyable experience. Now extend that to hundreds of documents that you’ll want to edit and manage. No way. Word and Excel are slow enough running locally; they (or their equivalent) will never be better at the other end of a long wire.
Speed isn’t even the real problem. People like to use their computers anywhere, anytime. High-speed connections are not pervasive, and your cloud computing experience is only pleasant at very high speeds. It stops entirely when the connection breaks. Cloud proponents are struggling to provide an offline equivalent of their services so you can keep working while disconnected. Here’s a thought: since they cannot predict what you might want to do while offline, you’ll probably want to keep a copy of everything you need on your local machine. You know, just in case. And you’ll probably need to keep copies of the applications as well, so you can access your data. After all, data is useless without the application. Let’s see: local storage, local data, local application, local display and keyboard… it’s like your own personal copy of the cloud, but you can use it anywhere, anytime. We’ll call it… the Personal Computer!
No Free <Lunch> June 18, 2008
Posted by Chuck Musciano in Technology. Tags: Computing
I’ve noticed a disturbing trend in sales pitches and product literature these days. When I ask if a particular product can easily import or export data with our existing systems, vendors often reply, “Of course! We can export XML!”
XML, for those readers with actual lives, stands for eXtensible Markup Language. It is a way to express data so that it can be processed and managed in fairly standard ways. Essentially, you surround your actual data with keywords, attributes, and plenty of angle brackets to make it more understandable to both computers and humans.
To hear some people tell it, anything expressed in XML is instantly recognizable by any other computer anywhere on earth. In fact, if you place two systems that use XML at opposite ends of your data center, by the next day they’ll have met in the middle, network cables and power cords wrapped around each other in an XML-inspired embrace.
Please. As we like to say in the computing business, “bits is bits.” Data, no matter how it is represented, can only be understood by a system that has been explicitly programmed and tested to process that data. XML may make the data easier to process, but someone still has to write, test, and support that code. And in many cases, XML makes things more complicated.
For example, today is June 18, 2008. Here is one way to represent that date for transmission between two systems:
20080618
I’ll bet most of you have decoded this particular data representation: four-digit year, two-digit month, and two-digit day. Here is the same date in a bit more old-school format:
08170
Slightly more cryptic, but not too hard to program: the first two digits are the year and the next three are the day of the year (June 18 is the 170th day of 2008). Notice the retro, pre-2000 two-digit year? It’s like shag carpeting for programmers!
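If you’re curious, both of those compact formats take only a line or two apiece to decode. Here is a rough sketch using Python’s standard datetime module (the choice of language is mine, not something any of these systems prescribe):

import datetime

# Four-digit year, two-digit month, two-digit day: "20080618"
iso_style = datetime.datetime.strptime("20080618", "%Y%m%d").date()

# Two-digit year followed by the day of the year: "08170"
ordinal_style = datetime.datetime.strptime("08170", "%y%j").date()

print(iso_style)      # 2008-06-18
print(ordinal_style)  # 2008-06-18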
Here is the date in one potential version of XML:
<date>
  <day>18</day>
  <month type="numeric">6</month>
  <year>2008</year>
</date>
More understandable? Maybe. Self-documenting? Sure. Easier to read, parse, and decode? No way. You’ll need an XML parser, a definition of the document’s structure (known as a DTD), and a competent developer to make sense of this particular data stream.
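To make the comparison concrete, here is a sketch of what consuming that fragment might look like with Python’s standard ElementTree parser. The element names come straight from the made-up example above; they belong to no real standard:

import datetime
import xml.etree.ElementTree as ET

xml_date = "<date><day>18</day><month type='numeric'>6</month><year>2008</year></date>"

root = ET.fromstring(xml_date)
parsed = datetime.date(
    int(root.findtext("year")),
    int(root.findtext("month")),
    int(root.findtext("day")),
)
print(parsed)  # 2008-06-18

Even this toy case needs a parser, agreement on the element names, and error handling that the eight-character string never required.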
When all is said and done, very little in computing is inherently easy or automatic. At every level, someone is designing, building, and testing all the little pieces that make that level work. You may build on that level, but you’ll have issues of your own to deal with. Never underestimate the difficulty of making systems play well together, and never believe what the salesmen say without digging into the details first.
Those Who Forget History… March 4, 2008
Posted by Chuck Musciano in Technology. Tags: History
One of my favorite magazines is going through a reorganization, having been bought out with the promise of being kept in print. I’m referring to Invention & Technology, which offers up a widely varying history of technology four times a year. Here’s hoping they find their way out of their current difficulties and keep publishing.
I find the history of technology fascinating. The lesson learned, over and over, is that what we are doing today, with our fancy computers and network gadgets, is no different from what other people did with sewing machines, or dishwashers, or airplanes, or cameras. Throughout history, and certainly in the past 150 years, technology has brought about sudden and unexpected societal change. That change is accompanied by bad investments, poor project management, horrible design decisions, unexpected outcomes, and undeserving heroes.
In many cases, the stories that originate in the 1800s provide the backdrop for the founding of many modern companies. Josephine Cochrane invented the dishwasher as a way to make a living after her husband died; her great-grandfather, John Fitch, had a working steamboat service in operation 20 years before Robert Fulton. Josephine’s machine was a tremendous success, but women were not well regarded as heads of industry; she sold her company, which became, after a few name changes, KitchenAid.
Electricity was all the rage at the turn of the last century. Varying load demands led power companies to generate power only at night, when demand was more consistent. Earl Richardson, a supervisor at a California power plant, developed an electric iron to encourage housewives to use power during the day. The iron was popular because it provided heat all the way to the tip, unlike other irons of the day. The resulting appliance company? HotPoint, of course.
On and on the stories go, concisely captured for our review and edification. We ignore them at our own peril; the lessons they teach about innovation, persistence, and leadership are as pertinent today as they were way back then. Fortunately, the full twenty years of the magazine are preserved online, with a nice search engine atop the archive. Take a moment to see how all these problems were solved by our forefathers; you just may find the solution to the problems you are battling today.
