Category Archives: Technology Industry

VMware Player changes software demos!

As reported by Slashdot, VMware just released a free VMware Player that will play any VMware virtual machine. I’m quite surprised, but pleasantly! This means that ISVs, consultants, and trainers can all prep an absolutely “perfect” PC at a point in time and distribute it to others in a method that ENSURES it will fire up and work correctly. The days of “installing” software demos are done… Demos have to be WICKED easy and this does it! This will be invaluable for setting up classrooms at my OWB Workshops / Courses! Hooray!

It’s brilliant!


As I look around my office I see a few rack servers, desktops, and a couple of laptops… Mostly Dell, some off-brand, and my Gateway 200ARC notebook. I am 100% certain that the notebook will be the last piece of revenue I ever contribute to Gateway.

I am, apparently, not the only one who feels this way, as Gateway has been losing money (and revenue) for the past three years. Based on my latest experience with them, their decline is not only likely, but desirable. Gateway product support and customer support were the worst I’ve received from ANY software or hardware vendor, ever (I’ve worked in IT since 1995, so that might tell you something). No company that can perform as poorly as Gateway did with me should thrive in economies of healthy competition. If Gateway makes it, Adam Smith will roll over in his grave and demonstrate the “Doh” made famous by one Homer Simpson.

If readers aren’t interested in descriptions of poor reasoning, bad business sense, baffling company policies, and rude customer service agents, then stop now! If you are curious how a well-known brand can make a series of mistakes that will cost them a customer (and perhaps many more, if the warnings in this blog are heeded), read on.

It’s quite simple, really; the 200ARC was manufactured with a defect that allows the hard drive to become disconnected. This means that after a business trip, one needs to whip out a screwdriver and reseat the drive. No biggie, unless it happens while the machine is sitting on your desk running XP. This happened to me last Friday and corrupted my XP operating system.

After four calls to Gateway support (all with relatively helpful CSRs who competently executed a series of troubleshooting procedures) it was clear what had happened. Due to the defect, the drive was partially corrupted, but no physical damage was done. So my data was safe, which should be a good thing… unless you are talking to Gateway.

Since no physical damage occurred, the solution is to use the recovery disk. Most reading this are tech savvy, so you know what this does: completely wipes out your hard drive. Complete loss of data (which, for me, meant rolling back to a full backup from 2 weeks prior). That was not acceptable, as I knew (from a quick Linux CD boot) that “My Documents” was intact and not corrupted.

So, I needed a way to get going on the recovery while still being able to retrieve the data files that had changed in the past two weeks. I needed an identical drive that I could use to reinstall; I would then recover the files from the other drive over the next couple of days.

Nope… No replacement drive. Their faulty product caused this, and I am covered by their highest-tier, full-coverage, accidental super duper warranty program (which cost an extra $200 USD at purchase). I couldn’t get Montie (badge CA358) to send me a drive. How about a loaner? Just send me a drive and I’ll send you mine (which you assure me is just fine) in 30 days? Nope, no way… “Data corruption is not Gateway’s problem, even if it was caused by our product” (that was an actual quote).

So here I am with a major pain-in-the-butt situation, where a $50 hard drive could mitigate a huge customer concern, and some knucklehead thinks it’s more important to honor a corporate guideline than a customer relationship.

That’s what sealed the deal: Gateway is clearly unaware that their business is about more than parts, UPCs, and customer support procedures. Dell gets it. The newest Dell commercials have nothing to do with the amount of RAM in their computers; it’s a guy calling Dell support to make sure Dell will value him as a customer and help him do the things that matter to him (email Aunt Maude and browse the web).

I can’t stress this enough to those reading: please heed a warning from a knowledgeable purchaser of technology (notebooks, desktops, and servers). Gateway doesn’t get it, so buy from them only if you really want the hardware and nothing more!

There is a relatively happy ending, and I love the fact that it comes down to these sweet words: Linux saved the day! 🙂 After a simple download and ISO burn of Trinity Linux, I was able to “scp” all of my files to a server and then proceed through the process of wiping out my hard drive per Gateway procedures.

My parting advice: download this small bootable CD and burn it right now. Its simple memory-only boot, file system, and network support converge into exactly what one needs in recovery situations.
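To make that rescue-and-copy routine repeatable, here is a minimal sketch of my own (nothing shipped with Trinity Linux) that builds the per-file scp commands once the rescue CD has mounted the damaged drive; the mount point and destination are assumptions for illustration:

```python
import os

def scp_rescue_commands(mount_point, remote_dest, skip_suffixes=(".tmp", ".swp")):
    """Build one scp command per file found under the rescued drive's
    mount point, skipping temp-file droppings. Nothing is executed here;
    the caller can run each command (e.g. via subprocess.run)."""
    commands = []
    for root, _dirs, files in os.walk(mount_point):
        for name in files:
            if name.endswith(skip_suffixes):
                continue
            src = os.path.join(root, name)
            # -p preserves modification times so restored files keep their dates
            commands.append(["scp", "-p", src, remote_dest])
    return commands
```

With the drive mounted read-only at, say, /mnt/hda1, calling `scp_rescue_commands("/mnt/hda1", "me@server:/rescue")` yields the full copy list for review before anything touches the network.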

Good riddance Gateway and good job Trinity Linux!

Gibberish Talks at Conferences

I thought this deserved a notable mention; it was originally posted on Slashdot. Have you ever attended a presentation at a conference that made absolutely no sense? In this case, some computer scientists were able to prove that one can get ENTIRELY GENERATED AND BOGUS PAPERS accepted for conference presentations.

Rooter: A Methodology for the Typical Unification of Access Points and Redundancy
Many physicists would agree that, had it not been for the congestion control, the evaluation of web browsers might never have occurred. In fact, few hackers worldwide would disagree with the essential unification of voice-over-IP and public-private key pair. In order to solve this riddle, we confirm that SMPs can be made stochastic, cacheable, and interposable.

We've come a long way

I’ve had some moments over the past few days that have all served to solidify that the world has come a very long way in the last 10 years. I recall 1994, when I was about the only person I ran with socially who had an email address, and the idea that I had a website (hosted by a research project for Sun) with my resume was very fringe and nerdy.

How all that has changed… As I write this, I am sitting at a corner seat at Tully’s coffee shop in Seattle, WA with only a power cord connected to my laptop (WiFi is brilliant). I am connected remotely and securely to a customer location, working on building a web traffic and customer analysis Data Mart. I have just ordered a 6.0/768 Mbps line for my new home office that will be provisioned in one week. I have ordered voice service over that dedicated data line for just $30/mo more. The progress is astounding… I am listening to a radio station from the UK clearer than local FM. Put it all together and I, along with others, expect the next few decades of human progress to truly transform society and life on this planet. These are great times we live in…

Six Apart gets nod from Investors

The makers of TypePad and Movable Type have long been regarded as the makers of the professional blogger’s tools. There are copious amounts of good, free, open source blogging tools available, which are fine for a great many purposes. However, the quality and support that come with a nominal license fee ensure that those wishing to run a professional-looking and professionally operated blog have a source. Ran across someone who thinks highly of the company, as well as the technology, on a blog:

Of course, there’s more to Six Apart than powerful software. As elegant as I think both Movable Type and TypePad are, to quote myself again, “it’s the people, stupid!” The Six Apart team is phenomenal and getting better every week. I don’t invest in products or ideas or business models.

Note: The blogger has invested in the company… 🙂

VPN over SSL is only acceptable

Companies have different ways of implementing and securing their corporate networks. The ways to secure a corporate network are too numerous to mention here, let alone add any substance to. There are a variety of vendors providing copious amounts of networking and security hardware.

I have a customer that is using an SSL-based VPN solution to allow access for its employees and partners from offsite locations. This is one of those technology solutions that looks fabulous on the vendor’s proposal to an infrastructure group, but leaves a little to be desired when it comes to using it on a day-to-day basis.

Take, for instance, a Juniper Networks product formerly called NeoTeris. The premise is simple: install the device in your “Public Zone” and give it access to your corporate network. Configure it to authenticate against a Windows Domain server. Employees hit a website, log in with their username and password, and then it’s just like being on the corporate network without any special software. Well, not exactly.

There are some limitations to this approach… mostly related to performance, but some related to functionality. The implementation of this system involves the download of an applet that routes traffic via SSL to and from the remote server. When accessing remote applications (SQLNet, JDBC, VNC, ssh, Windows shares, etc.) the most notable drawback is latency. The network packets are being processed by an embedded Java applet before being routed over SSL; for those familiar with those two technologies, you’ll understand why they aren’t well suited to quick, instantaneous packet forwarding. Throughput also suffers, as it appears (I don’t know if this is, in fact, true) that the applet chunks data into packets according to a rigid configuration, rather than using TCP/IP’s great feature of ratcheting the window up as conditions allow.
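To see why rigid chunking hurts, here is a back-of-the-envelope sketch (my own toy model, not Juniper’s actual implementation) comparing the round trips needed for a fixed chunk size acknowledged one chunk at a time versus a window that doubles each trip the way TCP slow start does:

```python
def round_trips_fixed(total_bytes, chunk_size):
    """Round trips when every chunk is a fixed size and acked individually
    (stop-and-wait, the pessimistic model of a rigid applet)."""
    return -(-total_bytes // chunk_size)  # ceiling division

def round_trips_slow_start(total_bytes, initial_window):
    """Round trips when the window doubles each trip, as TCP slow start
    roughly does (ignoring congestion events, for illustration only)."""
    sent, window, trips = 0, initial_window, 0
    while sent < total_bytes:
        sent += window
        window *= 2
        trips += 1
    return trips
```

In this model a 1 MB transfer in fixed 1400-byte acked chunks needs 715 round trips, while the doubling window needs about 10 — which is the kind of latency gap the applet introduces.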

The features are somewhat limited as well: because the system has not actually placed you on the remote network, you are required to configure applications port by port. For most applications this is acceptable, as you connect to them on a known port, and life is good. But some applications change their ports, coming configured by default to “move to another port” or built to do so (and you can’t turn it off). Because one must configure the system’s ports individually, a user of the SSL-based system has no chance of using these applications, because they couldn’t possibly guess the port ahead of time.

Overall it’s a product that provides remote access for a great number of people, with a great provisioning model, works for most applications, and is alright for occasional home-based use. My experience stretching one customer’s installation to its limits to do my work for them indicates that it’s not very suitable for full-time remote workers. Remote workers who expect their remote applications and access to the corporate network to feel like the experience of their commuting peers will be disappointed with VPN over SSL. Companies should know that if their needs are limited, this is a good solution; if their remote workers expect a first-class remote experience, they should stay away from VPN over SSL.

Open defined by customer

Found an interesting post on Jonathan Schwartz’s Weblog about the true definition of Open. While I don’t agree 100% with everything in his article, I think he’s hit the nail on the head.

He’s absolutely right.

…the definition of open that matters most is the one experienced by a customer or end user – “openness” is defined by the ease with which a customer can substitute one product for another

It’s interesting reading, and I hope to have some more time to add my own thoughts here…

Take a look in the Mirra

Found a neat-looking network appliance today. Mirra is a network backup appliance that does a continuous backup, from wherever you are on the network, to your Mirra drive.

It stores up to 8 previous versions of changes to a file. It does so continuously, and in the background, so that you are always backed up. As a consultant often at client sites, my notebook computer is my lifeblood. This could save me from having to lug my 250 GB Maxtor OneTouch with me. According to the manual, the Mirra software queues up data changes (i.e., backup information) when detached from the network and then sends those along once reconnected. Perhaps I’ll buy myself this for Christmas, or my birthday; heck, Arbor Day would probably be enough of an excuse. 🙂
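The keep-the-last-8-versions behavior is easy to picture with a toy model (my own sketch; Mirra’s real software obviously works differently under the hood):

```python
from collections import deque

MAX_VERSIONS = 8  # the per-file limit the Mirra manual describes

class VersionedBackup:
    """Toy continuous-backup store: each save of a file keeps at most
    the last MAX_VERSIONS copies, oldest dropped first."""
    def __init__(self, max_versions=MAX_VERSIONS):
        self._history = {}   # path -> deque of file contents
        self._max = max_versions

    def save(self, path, contents):
        q = self._history.setdefault(path, deque(maxlen=self._max))
        q.append(contents)   # deque(maxlen=...) evicts the oldest entry

    def versions(self, path):
        return list(self._history.get(path, []))
```

Save the same file ten times and only the most recent eight drafts remain; a bounded deque gives that eviction behavior for free.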


Competitive Advantage for Open Source?

I was reading on Slashdot this morning that Mozilla has been officially recognized as a 501(c)(3) by the United States federal government. Qualifying as a charitable non-profit that gives software to the world can be a significant competitive advantage for Mozilla directly and for Open Source in general.

Being a non-profit provides significant advantages to Mozilla and its respective aims. There will be opportunities to decrease outlays on both goods (hardware, servers, etc.) and services (professionals donating time can reasonably deduct the hourly rate for their pro bono work). Mozilla could, depending on how far they wish to stretch the limits of non-profit status, provide tax breaks to open source developers in the US contributing at a reasonable rate. I have no idea if they plan on doing this, but it’s an interesting premise all the same, and I think it would be just brilliant. There are also advantages from a revenue perspective.

Companies wishing to support Open Source initiatives previously had to do so by funding the work internally through developers’ time, etc. While this time is deductible as a business expense, it counts directly against the business unit/department/project that is making the contribution. Companies now have the ability to make a greater contribution and have that contribution to the world of science and humanity reflected in their tax bill. In theory, if Mozilla manages their fundraising efforts properly, they may be able to significantly increase the amount of money they could spend on a central development team, adding clarity and continuity to projects that are full of heart but sometimes lack focus.

I’m not saying that Open Source is just as worthy a cause as many other humanitarian and charitable organizations. At the end of the year, I’d still likely spend a few hundred dollars on causes that have a direct effect on saving and improving lives. Open Source does that too, but in different and proportionately smaller ways. However, providing this logistical benefit to companies wishing to support Open Source is a move in the right direction.

Time and Money MUST be part of an IT architecture recommendation

Why? Good architecture is good architecture, right? I don’t ever remember the advanced OO design and analysis courses I’ve taken covering cost as part of their curriculum. OO and Design Patterns are supposed to be universal. The scope of the project shouldn’t matter – encapsulating data is good, no matter what!


  • That’s what the engineers say.
  • That’s what computer science and programming books teach us.
  • As a practitioner of good OO I used to evangelize that.
  • Academics and architects say that.

How would one arrive at the conclusion that good architecture concepts, such as encapsulation, polymorphism, etc., are good architectural concepts regardless of the context? We must FIRST look at the definition of what GOOD ARCHITECTURE is.

ENGINEER: A collection of systems, programs, and data that supports leading application and system development methodologies, concepts, patterns, and thinkers to minimize defects and increase reliability and extensibility. Examples: polymorphism, encapsulation, web services, component-based architecture, standards-based development, etc.

CEO/SHAREHOLDER (BIZ): Capital investment (of time/money) to:

  • decrease deployment and maintenance costs and increase reliability
  • decrease time to market for internal and external applications

Examples: virtual clusters, enterprise monitoring, standardizing on J2EE, investing in coding standards, building infrastructure services and components.

Fundamentally these are not in opposition. They are complementary and hardly ever directly opposed. The engineer’s view has pieces that will enable many of the BIZ requirements for good architecture. So if they line up and are mostly complementary, when might they be in conflict?

BIZ users evaluate their capital expenses in a variety of ways… Nearly all boil down to a very simple question of ROI: what is the benefit to me over time if I spend this money today? The strategy will be (no real surprise here) to invest capital that will have maximal benefit for minimal investment (do more with less). There is no shortage of differing calculations, tweaks, and estimations involved in this process.

So let us say that, as an engineer or an organization engaged in commerce (i.e., an IT department at a corporation), given there are no motives of academic excellence or intellectual achievement, the true definition of GOOD ARCHITECTURE outside those endeavors is that of the BIZ:

  • Decrease costs
  • Increase reliability
  • Decrease time to market for new technology

Again, the engineers view is not in direct contradiction. They line up quite nicely and indeed are mostly congruent. Good engineering principles, from an academic perspective, can have immense benefits to a business including decreased costs, increased reliability, and shorter implementation times.

Let us, for the sake of clarity, label these two camps that are nearly always in agreement the Academic (predominant view of an engineer) and Applied (view of a business owner or executive management) architectures.

Yeah… but you promised to tell us when they are in conflict… We’re getting there…

Ok, so when might an architectural decision be good Applied architecture but bad Academic architecture? Let’s take a practical example:

Two particular architectures supporting two separate methods (A & B) for producing widgets:

  • METHOD A : Requires X staff to produce Y widgets. Rates an 8.5 on an abstract ACADEMIC QUALITY SCALE.
  • METHOD B : Requires X-10% staff to produce Y widgets. Rates a 9.5 on the ACADEMIC QUALITY SCALE.

Let’s look at some example numbers. There is a proposal to expend capital C (a one-time $10k) on architecture/infrastructure to enable method B. The staff cost X is $5k per year.

The engineer inherently evaluates on ACADEMIC quality, and straightforwardly concludes (according to their engineering precepts) that CAPITAL C is money well spent on GOOD ARCHITECTURE because it increases the academic quality and saves 10% of X per year.

The biz evaluates on the APPLIED model, looking for ROI. In this case, it would take 20 years to recoup the cost of the capital with no increased revenue (still Y widgets). In this example the cost required to implement method B, while significantly improving productivity and the academic quality of the architecture, is actually greater than the benefit. In this case, doing the right thing (from an academic perspective) is actually a bad architecture based on the BIZ definition.
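The payback arithmetic in that example is simple enough to write down; here is a minimal sketch (the function name and inputs are mine, not any standard formula library):

```python
def payback_years(capital, annual_staff_cost, savings_fraction):
    """Years of staff savings needed to recoup a one-time capital outlay.

    capital:           up-front architecture/infrastructure spend (C)
    annual_staff_cost: what staffing costs per year under method A (X)
    savings_fraction:  the fraction of staff cost method B removes
    """
    annual_savings = annual_staff_cost * savings_fraction
    if annual_savings <= 0:
        raise ValueError("with no annual savings the capital is never recouped")
    return capital / annual_savings
```

Plugging in the numbers above, `payback_years(10_000, 5_000, 0.10)` gives 20.0 years — far beyond any horizon a BIZ owner would fund, which is the whole point of the example.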

In cases where the projected COST OF BAD ACADEMIC QUALITY (decreased output, slower response times, difficult QA periods, etc.) is less than the capital required to implement the GOOD ACADEMIC QUALITY ARCHITECTURE, it’s a better business decision to go with the poorer academic quality.


I’m not saying engineers striving for good architectural concepts is bad. It’s noble in precept and may also make great business sense. But one can’t put a solution up on a whiteboard and say “trust me, it’s good architecture,” because depending on what your IT charter is, it might not actually be. You have to prove it – add project estimates and efficiencies, use statistics from trade organizations. Tell the people funding the cost of the shiny, well-tuned machine that it’s in their best interest because they’ll get 30% more application throughput, a 30% reduction in time to delivery for requests, and a 75% reduction in the cost of maintaining systems X, Y, and Z.

UPDATE: I thought it would be prudent to point out that I’ve observed that the most successful implementations of sound engineering techniques have always been done by groups of people who fundamentally understand the concepts, know where they currently are and where they want to go, and then just consistently and opportunistically “realize” that vision through projects over time. If you see opportunities to make things better, do so.