Nineteen Eighty-Three, revisited.

In the early 80s, when I was but a child, I wanted the coolest new toy that any of my friends had: an Atari 2600 games console.  My parents were of the opinion that a computer had to be more than just a games machine – it had to be something that the whole family could benefit from.  And so we ended up with a TRS-80 Colour Computer with “Extended BASIC” and 16K of “memory”.  (This memory was divided into RAM and ROM, so you ended up with just over 8K of “useful” memory.)  They could not have realised it at the time, but they pretty much set me on my path to a career in the computer industry.

“Secondary storage” for the TRS-80 was a tape drive.  A proprietary plug on one end of a cable led to three connectors for microphone, speaker and “remote control” plugs.  Any tape deck that had the mic and speaker plugs could be used.  The baud rate for the tape device was quite fast for the medium used.  As a result, “saving” your work was very much “in the hands of the computer-gods” and the chances of success were certainly improved with the crossing of fingers and holding of one’s breath.  If memory serves me correctly (this was 25 years ago, so maybe it doesn’t…) there was also a nasty price to pay: when loading saved programs from tape, any program in memory would be overwritten.  Now, if loading the program failed – and they did fail – you were left with nothing.  So not only was saving a dubious process, but you were left with no way to check without performing “destructive testing”.  I learnt two things from this:

  1. Always keep more than one copy of your work.  (i.e. perform multiple saves)
  2. Patience – sometimes even three copies of your program weren’t enough to ensure success.

It was sheer madness to expect that this solution for secondary storage was ever going to be acceptable to the general public.  It was far too unreliable – but worse, it was unforgiving.

Skip ahead to the current day and allow me to draw a parallel with web-based applications.  Have you ever found yourself writing a vast amount of “stuff” into a text box on a web form, only to hit the submit button and be confronted with some sort of page error?  It’s happened to Eric Sink:

I actually spent about half an hour wordsmithing a multiple-paragraph response.  But when I hit the submit button to post it, WordPress gave me a generic error page.  Presumably something timed out while I was crafting my reply.

And when I hit the Back button, my comment was gone.  :-(

*&%$#@!
My mind raced.  What are my options here?

Maybe I should just re-type the whole thing?  It was only 300 words or so.  Nah.  The text I wrote was perfect.  I probably won’t be able to remember it just the way it was.  And why should I have to?  Firefox and WordPress screwed this up, not me!

Now Eric’s story had a happy ending – he got his masterpiece back.  Some of the comments on his post suggested other approaches – including writing the post “off-line” in Notepad and copying the text into the on-line editor.  (That’s the strategy I use for composing most of my blog entries.)  But non-computer folk use the Internet too!  It is simply not acceptable to throw away people’s thoughts because of some silly error!  One of the commenters mentioned that the Opera web browser wouldn’t have lost the text.  If this is truly the case, kudos to them!

I’m not sure what the best answer to the problem is.  Maybe it includes the Google Gears project?  Maybe browsers should just be better?  Maybe a web application for composing long comments isn’t the way to go?  The only thing I know for certain is that I’m closer to my death now than I was in 1983.  In those days I had more time to waste!  Please don’t make me waste my time needlessly now!
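For what it’s worth, here’s a minimal sketch of the sort of safety net a site (or a browser extension) could provide: keep a copy of a long comment in the browser’s localStorage as the user types, so that a failed submit isn’t fatal.  The “comment” element id and the five-second interval are purely illustrative – this isn’t how WordPress or Google Gears actually work, just one way the problem could be attacked.

```typescript
// A minimal sketch of auto-saving a long comment as the user types.
// Assumes a <textarea id="comment"> on the page; the element id and the
// 5-second interval are made up for illustration.
const DRAFT_KEY = "comment-draft";

function getCommentBox(): HTMLTextAreaElement | null {
  return document.querySelector<HTMLTextAreaElement>("#comment");
}

// Periodically copy whatever the user has typed into localStorage.
function startDraftAutosave(): void {
  window.setInterval(() => {
    const box = getCommentBox();
    if (box && box.value.length > 0) {
      window.localStorage.setItem(DRAFT_KEY, box.value);
    }
  }, 5000);
}

// On page load, offer the saved draft back if the box is empty
// (i.e. the previous submit never made it to the server).
function restoreDraftIfAny(): void {
  const saved = window.localStorage.getItem(DRAFT_KEY);
  const box = getCommentBox();
  if (saved && box && box.value === "") {
    box.value = saved;
  }
}

// Call this once the server has actually accepted the comment.
function clearDraft(): void {
  window.localStorage.removeItem(DRAFT_KEY);
}

restoreDraftIfAny();
startDraftAutosave();
```

Nothing clever – but it would have saved Eric’s 300 words.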

It’s not a push-bike!

Segways aside, I love two-wheeled transport.  I’ve always enjoyed cycling and, to me, motorcycling was just a natural progression of this.  I still enjoy riding an “acoustic motorbike” – even if I don’t ride as often as I once did.  (I’m more of the baggy T-shirt kind of cyclist, rather than the lycra-clad streamlined type.)  I only mention this as I don’t want any potential cyclist to think I’ve got something against their chosen mode of transport…

From time to time, I end up having conversations with people who would otherwise not have contact with the motorcycling community.  Invariably, it’s when I’m doing something that you’d never have seen Marlon Brando do in “The Wild One”.  Things like turning the motorcycle around in a confined space on uneven ground.  I weigh just under one third of what my motorcycle does and I’m not likely to represent Australia in any weight-lifting competitions.  So it’s fair to say that it can take me some effort to move the motorcycle on soft or uneven ground.  Occasionally, if the ground permits, I’ll do the “cool bike shop” manoeuvre of pulling the motorbike onto the side stand and pivoting it with the wheels in the air – but this was definitely an easier trick with my previous bike (a CBR929) as it was about 40kg lighter.  I’m no civil engineer, and this sort of behaviour does make you wonder how strong the side stand really is.

This is the first thing that surprises the non-motorcycling community.  The statements go like this: “It doesn’t turn around quite as tightly as I was expecting”.  This is quite likely the first time they’ve ever really considered the size of the bike you are riding.  You can tell by their next question: “Is it easy to pick up when it falls over?” (the question never seems to be *if* it falls over…)  Then comes the question “How often have you fallen off?”, as if remembering summer days of grazed knees from their childhood.  Strangely enough, no-one ever asks me how often I’ve crashed a car…  (Mind you, I’ve never tried to push a car around someone’s front yard, either!)

For as long as I ride, I don’t think I’ll ever be free of such lines of questioning.  I suspect I’m not the only motorcyclist who is asked such questions.  Quite frankly, I’d rather answer those sorts of questions than be run out of town by an angry mob who thought “The Wild One” was a documentary. But before you join that crowd, remember there’s one answer that takes care of most of your questions: “It’s not a pushbike” :-)

The problem with the Internet (Part 2)

As I have mentioned earlier, one thing the Internet does well is allow people to collaborate.  With the now commonplace Web 2.0 sites, there is no end to the social interactions available on-line.

The problem here is that not all things work well “on-line”.  Asynchronous or not, the reliance of AJAX applications on communicating with a Web server can be a right pain.  This is probably a more pronounced issue in Australia.  We are a long way from the US and even further from the UK and Europe.  There is around a 200-300 msec hit taken for data to cross the Pacific.  Unless there are some major breakthroughs in quantum physics, this delay is simply unavoidable.  Put simply, even things moving at the speed of light take a notable time to traverse this distance.  Three hundred milliseconds may not sound like a lot.  Rest assured, you notice it.  Visiting US developers certainly can’t believe the slight delay that happens on every web request (presuming the web server is not located in Australia, of course) and conversely most Australians abroad can’t believe just how much more responsive the Internet is in the US or the UK.
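A quick back-of-the-envelope calculation shows why no amount of clever engineering will make that delay disappear.  The numbers below are rough approximations of my own – about 12,000 km for the Sydney-to-Los-Angeles hop, and light travelling at roughly two-thirds of its vacuum speed inside optical fibre – but they’re enough to show where the time goes.

```typescript
// Back-of-the-envelope check on the trans-Pacific delay.
// Both constants below are approximations: the great-circle distance from
// Sydney to Los Angeles, and the rule of thumb that light in optical fibre
// travels at about two-thirds of its vacuum speed.
const SPEED_OF_LIGHT_KM_PER_S = 300000;
const SPEED_IN_FIBRE_KM_PER_S = SPEED_OF_LIGHT_KM_PER_S * (2 / 3);
const SYDNEY_TO_LA_KM = 12000;

const oneWayMs = (SYDNEY_TO_LA_KM / SPEED_IN_FIBRE_KM_PER_S) * 1000;
const roundTripMs = oneWayMs * 2;

console.log(`One way:    ~${oneWayMs.toFixed(0)} ms`);    // ~60 ms
console.log(`Round trip: ~${roundTripMs.toFixed(0)} ms`); // ~120 ms

// Real cable routes are longer than the great circle, and routers, TCP
// handshakes and multiple requests per page all add to the total - which is
// how you end up at the 200-300 msec you actually see.
```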

This delay is hardly anyone’s fault, of course.  But unless you experience this delay yourself, you are not likely to ever take it into consideration when building your application.  Herein lies “the rub”.  Web applications are not speedy.  Oh yes, in AJAX land – where sunshine and rainbows occur in green fields of joy – things are much better than they once were (forms are much more “dynamic”, etc.), but the performance is still woeful compared to the previous generation of desktop applications.

This gives rise to the second issue.  No-one wants to develop desktop applications anymore.  The computer industry is at least as guilty as the fashion industry of being swept along with the “latest craze”.  If you aren’t developing web-based applications (regardless of whether they are practical, or in any way benefit from their onlineness*), you aren’t in demand as a programmer.

Do customers care about the technology used in the applications that they use?  Of course not – as long as the application does what they need it to do.  This lack of caring should allow the industry to stick with tried and trusted technology, rather than invent new technologies.  However, the inverse ends up being true.  That is, because the customers don’t care what technology is used, they don’t end up demanding that the industry use the tools that work best for whatever problem is at hand.

Sometimes the Internet is the right tool for the job.  Unfortunately, the trend has been to bash it into the only tool for all software jobs.  And that leaves those who use the tools with a less optimal solution than they might otherwise have had.

* I made this term up to suit my needs… but the Internet assures me I wasn’t the first to do so!

Best use for Internet – EVER!

Okay, I admit it.  It’s probably not the best use of the Internet ever created, but it is a web site that does do what the Internet does best.  What the Internet does best is join together geographically displaced people with a common interest.  In this case, motorcyclists.  (Now can you see why I’m a little enthused?)  There are plenty of motorcycle forums “out there on the web” catering for all different types of motorcyclist.  The web site I am talking about has a forum too.  But where this web site differs from other web sites is its primary focus.  The site is www.motowhere.com and, according to its banner, it “helps motorcyclists discover the best places to ride”.

Members can sign in and create routes to share with other motorcycling enthusiasts.  Anyone (i.e. including non-members) is free to browse these routes and print off the “cue sheet” to take with them on their next ride.  The routes are displayed overlaid on Google Maps and can include a running commentary on what to expect, where to stop or things to enjoy along the way.  I still stand by the comments I made regarding the accuracy and level of detail of Google Maps, but I applaud the web site creator(s) for their simple (and simply brilliant) idea.  (I suspect they chose Google Maps as you can drive them free of charge! – Whether that bothers Google or not, I haven’t bothered to find out…)  At any rate, I can’t blame them!

As with all “community content” web-sites, there are no promises of accuracy and the level of detail differs from route to route.  Still, with any luck, the next sunny day off you get you won’t be stuck for a choice of new roads to explore.  Have fun out there and ride safe…

A fair price for software

How would you set the price for a software application?  When a customer purchases software, they are purchasing “Intellectual Property”.  These days, Internet downloads of applications are a common distribution channel, so the consumer receives no physical object.  At work, I have purchased a single software licence costing in excess of US$1,000 (at a time when the exchange rate meant that it was worth something!).  In exchange for the company’s money, I got an e-mail with a product key.  Everything else, I had already downloaded from the Internet!

Software and intellectual property are things that are very difficult to put a price on.  People outside the computer industry can’t relate to how much hard work went into the production of something they can’t touch.  It’s probably fair to say that even with a manufactured product, most people do not have a good level of comprehension of how much effort went into building it (including the development of the product).  But with a physical object, at least they can see the craftsmanship and feel as though the money they paid is worth something.  There is also the matter of “comparative works”.  If you were to purchase a television, you have some idea about how much it should cost.  Once you have decided on a feature set, screen size, technology etc., you will know approximately how much you could expect to pay.  In software markets where there is direct competition from various manufacturers, it appears as though the pricing is fairly well formalised as well – for instance, in the market for video editing software, most shrink-wrapped products are around the same price.
 
It would be a fair assumption that the cost of producing a television in a given quantity would be fairly similar, no matter who manufactures the product.  Therefore, the prices at which they sell can be neatly aligned.  By and large however, there can be a great disparity in the cost of producing software.  The price at which it sells, however, is determined by the level of competition, not the cost of design and production.  As a result, price does not guarantee quality, nor features, nor usefulness.  Whilst this is strictly true in most markets, shoddy products simply cannot continue to be sold at high prices: the market works it out and the product stops selling.

The intangible nature of pricing software, plus the ease with which files are copied, leads to an inevitable problem: software piracy.  Given the choice of “free” versus any dollar figure, most people will choose free.  You can build a commercial model that competes against “free” and does so successfully – but that’s not the argument I’m trying to make here.  The point is that most people “in the know” will look for freeware and/or open source projects* that suit their needs.

I would argue that once you get away from the computing industry, the average person in the street would not have a clue about “open source software”, even if they consider themselves “computer savvy” / “computer literate”.  They would, however, know about pirated software…

Pirating software is not something that I can condone.  Put simply, I have not heard an excuse that makes it in any way acceptable to me.  I feel this way, undoubtedly, because it hurts the industry that provides me with a pay-cheque.  It’s akin to “stabbing my fellow developer in the back”.  Ignoring that software piracy exists would be naïve, though.  Outside of the actual software industry, there often seems to be a mentality that it is an acceptable thing to do.  Possibly this is due to the excessively low conviction rate for software piracy, or a “safety in numbers”/“everyone else is doing it” mentality.  The excuses start with things like “I’m only an individual and not using it for a business” / “I’m only going to use the software once”.  The variations of this excuse are many, but most revolve around the idea that it is acceptable for them to pirate software as long as someone else pays for it.

So there is no good answer to “what’s a fair price for software?”.  The only answer I can offer is “the price the vendor charges for it”.  There are almost always freeware/open-source alternatives to commercial software.  If you aren’t willing to pay dollars for a feature set, look for the alternative open-source solution.  It may be clunky (then again, it may well not be!) or unfamiliar, but that’s the price you pay for not paying a price!

* – Yes, I am aware that “open source” is not the same as “freeware” and, depending on the open source licence used, the software can cost money.  Typically (especially for “home use”) it tends not to…

Honda’s Dual Combined Braking System

In my post about braking, I mentioned that my current motorcycle features a “linked braking system”.  Honda refer to the system fitted on my motorcycle as “Dual combined braking system”.  It’s their contribution to “rider safety”.

Hydraulics work on the principle that liquid cannot be compressed.  When a brake lever is “squeezed”, this moves a piston in the master cylinder.  Moving the piston displaces brake fluid.  We know that this brake fluid cannot be compressed, so the end result is that it must be displaced elsewhere in the system.  At the other end of the brake line is the brake caliper.  It contains one or more cylinders with pistons in them as well.  The brake fluid therefore pushes the piston(s) in the caliper – which in turn push the brake pads into contact with the disc.  Typically the pistons in the calipers are much larger than the piston in the master cylinder.  This acts as a “gearing”, allowing a greater force to be applied to the brake pad than the rider exerts at the lever.  But, because the surface area of the caliper piston is greater, it won’t move as far.
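To put some rough numbers on that “gearing”: the pressure in the fluid is the same everywhere, so the force each piston sees scales with its surface area (and its travel scales down by the same ratio).  The 14 mm and 27 mm piston diameters in the sketch below are illustrative guesses of my own, not Honda’s actual specifications.

```typescript
// Pascal's principle in numbers: the same fluid pressure acts on every
// piston, so force scales with piston area while travel scales inversely.
// The 14 mm master-cylinder and 27 mm caliper piston diameters are
// illustrative guesses, not the real VFR figures.
function pistonAreaMm2(diameterMm: number): number {
  return Math.PI * Math.pow(diameterMm / 2, 2);
}

const masterArea = pistonAreaMm2(14);
const caliperArea = pistonAreaMm2(27);

const leverForceN = 100;                      // what the rider squeezes with
const pressure = leverForceN / masterArea;    // N/mm^2 - equal throughout the fluid
const padForceN = pressure * caliperArea;     // force on one caliper piston

console.log(`Multiplication: ${(caliperArea / masterArea).toFixed(1)}x`); // ~3.7x
console.log(`Pad force:      ~${padForceN.toFixed(0)} N`);                // ~370 N
// The caliper piston only moves about 1/3.7 as far as the master-cylinder
// piston - the "gearing" trade-off mentioned above.
```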

On the Honda, there are three brake calipers, each with three pistons.  Two of these calipers grab large discs on the front wheel, whilst the third grabs a smaller disc on the rear wheel.  Larger discs mean better leverage and thus can be described as having a larger “braking force”.  (Remember that up to 90% of efficient braking force can be applied to the front wheel.)

If the bike were fitted with a conventional braking system, the right-hand lever would activate all six pistons of the calipers used on the front wheel, whilst the right-foot pedal would activate the three pistons of the rear caliper.

The right-hand lever on the VFR activates five of the six pistons on the front calipers.  That is, all three pistons of the front right caliper and the outer two pistons of the front left caliper.

[Image: Everybody knows brake fluid is blue!]

The right-foot pedal activates the outer two pistons of the rear caliper via a “Proportional Control Valve” (PCV), and the middle piston of the front left caliper.  As near as I can tell (based on Internet research) the PCV acts as a “pressure reducer”.  Judging by the workshop manual, the PCV is strictly a mechanical device with no requirement for electrical input.

[Image: Or maybe it's green?]

Those who have been paying close attention may now be asking: “How does the middle piston of the rear caliper get activated?”  Well, that is answered by how the front calipers are mounted to the bike.  The right-front caliper is attached in a fairly standard way to the right fork leg.  On the left-hand side, the caliper is not.  Instead, it is mounted at the bottom by a pivoting joint.  At the top is what is referred to as a “Secondary master cylinder”.  The piston for this master cylinder is attached to the fork leg.

[Image: It used to always be that clean!]

Once the left front caliper grabs the front disc, the anti-clockwise motion of the disc (presuming the bike is moving forward) pivots the left caliper, forcing the piston further into the secondary master cylinder.  This secondary master cylinder is attached to a brake hose, through another PCV and then onto the middle piston of the rear caliper.

So, to summarise how the lever and pedal work in isolation:
When the rider applies the “front” brake lever:

  • Five of the six front pistons are applied.
  • The left front caliper exerts force on the secondary master cylinder.
  • This in turn applies the middle piston of the rear caliper.

[Image: tada!]

When the rider applies the “rear” brake pedal:

  • Two of the three pistons in the rear caliper are applied.
  • The middle piston of the front left caliper is applied.
  • The left front caliper exerts force on the secondary master cylinder.
  • This in turn applies the middle piston of the rear caliper.
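To keep those two lists straight in my own head, here’s the whole linkage boiled down into a toy lookup – nothing more than the description above re-stated, with labels of my own choosing:

```typescript
// A toy restatement of the DCBS linkage described above: which pistons each
// control ends up applying.  The labels are my own, not Honda terminology.
const frontLever: string[] = [
  "front-right caliper: all three pistons",
  "front-left caliper: outer two pistons",
  // the pivoting left caliper then works the secondary master cylinder...
  "rear caliper: middle piston (via secondary master cylinder and a PCV)",
];

const rearPedal: string[] = [
  "rear caliper: outer two pistons (via the PCV)",
  "front-left caliper: middle piston",
  // ...which again pivots the left caliper and works the secondary master cylinder
  "rear caliper: middle piston (via secondary master cylinder and a PCV)",
];

console.log("Front lever applies:", frontLever);
console.log("Rear pedal applies: ", rearPedal);
```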

[Image: Pity the mechanics who bleed 19 hoses]

The next obvious question to ask is: does it improve braking performance?  The only answer I can give you is that I expect it does – I should qualify that by stating I expect it reduces stopping distances for most riders.  Honda fit DCBS to several of their models.  Over time, the system has evolved.  Differing pistons have been utilised in previous versions – as have “delay valves”, which staggered the intervals at which the pistons were applied.  I suspect (but do not know) that maybe even the PCV has been developed over the years in an attempt to improve the system.  I know that some readers may want a better answer than “I expect it works”, but that’s a story for another time.

Still looking for a fair EULA (Part 2)

Earlier, I talked about the End User Licence Agreement (EULA) – the document that binds software users to legal restraints, preventing lawsuits against the software manufacturer in the case of failures.  In that post I stated: “abdicating all responsibility for the software failing upon the user is a cop out.”  In my mind, there are two immediate problems preventing the abolition of this cop out.  The first I shall call:

The tale of a bad Meat Loaf song and an oddly scaled reflection:

I stand by my “cop-out” statement.  But on the flip side of the conversation, I also want to state: “People need to take responsibility for their own actions”.  By this I mean: where a user makes errors using computer software due to their own lack of understanding, they do not deserve legal protection.  This is the counter-argument that prevents fair EULAs from being a black and white issue.  Years of pop culture have seen me take the viewpoint that lawyers live to make money by suing those who already have it.  If the great mediums of film and television are to be believed (even partially believed), a person’s stupidity is no defence for the company/organisation/individual being sued.  This probably explains the “Objects in mirrors are closer than they may appear” stickers that appear on car mirrors.

Without any legal protection, the software industry would be forced to produce software that provides absolute protection for the absolutely clueless. I can see the “Are you sure?” dialogues now…
Software that molly-coddles users to such extremes – looking after those who hadn’t figured out that mirrored images may be a little difficult to judge scale by – will likely be tedious and unproductive to use.

I couldn’t think of a tricky title for the second “immediate problem” I see… so I have simply called it:

The blame game:

A likely area of confusion is figuring out exactly who is liable in the event of an error.  Normally, the part of a software application that fails is the one closest to the user.  Say, for instance, an error in CAD software means a building is not constructed strongly enough.  In this example, the error will likely be in the calculations performed by the CAD software.  But this may not be the case:

  • An operating system function may not have returned the correct value.
  • An error in rounding floating point numbers may have caused a value to be “just the wrong side” of the required value.  (There’s a small example of this just after the list.)
  • The compiler used to compile the CAD program may have generated the wrong assembler / Intermediate language instructions.
  • The processor may have returned the wrong answer for a division.

Exact determination of who is at fault would no longer be a case for lawyers alone.  Evidence to find the guilty party may prove to be elusive.  Suing the software producer would probably succeed in tying up the programming resources of the company whilst they ascertain whether they should counter-sue someone else.  The only people who benefit from the long proceedings that would likely ensue would be those who are getting paid by the hour…  To me, it appears as though only the lawyers come out of this scenario any better off.

In summary:

The threat of legal recrimination may well force software developers to put more effort into thorough practices that raise the quality of their wares. As you may have noticed, quality software is something I am passionate about. But I am a realist. I don’t think I will live to see the day that all software released is “bug free”. I am convinced that nothing short of starting the computer software industry again would achieve such a goal. The financial, economic and cultural shock required to “stop the industry whilst we get our act together” is not one the rest of the world would be prepared to bear.