Originally published in The Clarion | August 31, 2011
The technology marketplace is ever-changing. Just when you think things have settled, a game-changing technology emerges, a mammoth corporation swallows up another, or a historical heavyweight spins off major parts of its business simply to survive. If you keep up with the world of technology and your head doesn't spin, I envy you. The world of IT is often a cut-throat business where only the strongest of the strong survive. The events of a recent few-day span are a perfect example of just how volatile the IT arena is and how, even with instant communications (and the communications leaks that seem to have become the norm), it is still possible for the impossible to happen right before our eyes.
Within a three-day span in mid-August, news from two IT giants shook the geeksosphere. One was a substantial acquisition, the other a substantial downsizing, and the combination exemplifies the volatility in the realm of Information Technology. The first of these two announcements came as a total surprise, to me at least: Google announced that it would purchase Motorola Mobility for a cool $12.5 billion in cash. The implications of this purchase are nothing short of enormous. With the popularity and continued success of Google's Android mobile operating system, this acquisition will put Google on equal footing with Apple: full and total control of both the hardware and software in their mobile offerings. Only time will tell where this puts Google, but one can only imagine that things are looking really good for them.
On the flip side of the coin, three days after Google's big announcement, Hewlett-Packard (HP) announced a retreat from a large segment of its core business. HP's announcement that it was ceasing development and production of its line of tablet devices, including its webOS platform, signified just how competitive this area of IT is. Since 2001, HP has merged with Compaq and made several large acquisitions, including 3Com and Palm, Inc. The merger with Compaq made HP the largest PC manufacturer in the world, surpassing Dell. Now, only a short decade after that merger, HP's backtracking signals trouble for the company. With additional rumors that HP may leave the PC manufacturing arena entirely, one has to wonder what is next for them.
Today is a very difficult time in Information Technology. With a small handful of giants making things happen, more and more corporations are merely scraping by. Downsizing, sell-offs and closures are a sign of the times, with even companies like Cisco and Microsoft showing signs of weakness. It is going to be interesting, to say the least, to see how things shake out over the coming months and years.
Originally published in The Clarion | August 24, 2011
A reader recently emailed me with a pretty common IT-related question, one I hear talked about quite often but have never really investigated beyond the obvious. The question, "In the last couple of weeks, I have gotten the 'Blue Screen of Death'. Does this mean I'm about to crash? What can I do to prevent it from happening again?", may bring a slight chuckle to those of us who work in the IT realm, and a possible gut pain to those who have been down this road before. Whatever the case, the now-infamous Blue Screen of Death (BSoD) in Microsoft Windows operating systems has been a thorn in our sides since, well, Windows 3.0 debuted in 1990.
If you have never experienced the BSoD, odds are good that you either a) are a Mac or Linux user, b) never use personal computers or servers, or c) are just plain lucky. In general terms, the Windows BSoD is the mechanism the operating system's kernel uses when a fatal error has occurred. While the display is typically filled with all sorts of data, to most people it is simply gibberish. The data is useful in identifying (and eventually remedying) the source of the problem, although a typical BSoD can almost always be cleared only by manually turning the machine off. Severity of the underlying issues aside, I am rather fond of blue screens, as they remind me of the display on the Commodore 64 computer. All joking aside though, if you experience a BSoD, odds are good it won't be your last.
Now you may be asking yourself how to permanently fix a BSoD problem. For starters, the BSoD is not itself the problem but an indication of an underlying issue that is causing the system to crash. As painful as it may be, your best bet when presented with a BSoD is to get out a pen and paper and document exactly what is displayed before your eyes. Believe it or not, there should be valuable information amongst the gibberish that will help you identify the source of the problem and an eventual resolution. Once you have documented the BSoD output, manually shut the machine off and boot it back up. Sometimes, depending on the source of the problem, your machine may work like a champ for days or even weeks. This is not always the case though, as some system issues can cause the machine to fail as quickly as it booted up.
With your BSoD notes, you can often find good information by searching the Web for keywords from the display. Sometimes the fix is a simple hardware driver update. Other sources of a BSoD include faulty or failing memory, a power supply on its last legs or components that are overheating. Whatever the case, issues that cause a BSoD can almost always be fixed without loss of data.
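For readers comfortable with a little scripting, the hexadecimal STOP code is usually the single most searchable item on the display. The sketch below is purely illustrative: the sample message is made up, and the function is my own invention, but it shows how a few lines of Python could pull that code out of a hand-transcribed note before you head to a search engine:

```python
import re

def extract_stop_code(bsod_note):
    """Find the first 8-digit hexadecimal STOP code (e.g. 0x0000007E)
    in a transcribed Blue Screen message, for use as a Web search
    term. Returns None if no code is present."""
    match = re.search(r"0x[0-9A-Fa-f]{8}", bsod_note)
    if match is None:
        return None
    # Normalize to the conventional lowercase-0x, uppercase-digits form.
    return "0x" + match.group(0)[2:].upper()

# A made-up transcription for illustration only:
note = "*** STOP: 0x0000007E (0xC0000005, 0x804E518E, 0xFC938104, 0xFC937E04)"
print(extract_stop_code(note))  # -> 0x0000007E
```

Typing that code into a search engine, along with the name of any driver file mentioned on the screen, is often all it takes to narrow the problem down.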
Originally published in The Clarion | August 17, 2011
Sometimes, in my eyes at least, people take the powers of technology a bit too far. I'm not referring to embryonic stem cell research or cloning sheep or anything like that. I'm talking about the use of time, and time is money after all, lots of time, to provide information and services that just seem pointless to me. In an article back in February, I wrote about pointless apps, including a smartphone application whose sole purpose was organizing one's wardrobe. Sure, I'm a tad jealous of developers who have an uncanny ability to throw together a conglomeration of code and spit out slick software applications. It's beginning to seem, though, that all of the good ideas have been taken, and the results of that shortage typically leave me shaking my head.
I recently heard a radio advertisement from The Weather Channel that had me wishing my car stereo were a recorder so I could rewind and confirm what I had just heard. Thankfully, I was able to remember the Web address once I was back at my home computer. Before getting to the punch line, let's talk a bit about weather technologies and how they, like so many other technologies and sources of data, have invaded our lives.
For starters, weather is quite the popular subject in our neck of the woods. Having endured the results of a gravity wave in 2009 (who among us had ever heard of a gravity wave?) and the tornadoes of April 2011, it is probably fair to say that most of us are at least a little more weather-aware compared to a few years ago. I often watch the evening news from Huntsville and question how meteorologists made a living before our modern technology age. I’m still convinced that much of what they do is a guessing game, but taking data from computer models and satellite imagery sure has to be a lot easier than trying to get things right without such tools. Today we have up-to-date weather on our gadgets, phones and websites. It’s seemingly everywhere.
Now to the punch line, and maybe I am missing the point. Beyond the typical weekly forecasts, with their high and low temperatures, rain chances, pollen counts and UV warnings, The Weather Channel is now a source for, are you ready for this, mosquito forecasts. Don't take my word for it; look it up yourself. Apparently someone finds it important or beneficial to know how active mosquitoes are going to be, predicted in two-hour increments. Sure, mosquitoes are aggravating, but so are many drivers on our highways. Maybe I am just letting off a bit of steam, but for me this one takes the cake. It is a waste of technology in my opinion; surely the time (money) spent forecasting mosquito activity could be better used for more accurate predictions of severe weather.
Originally published in The Clarion | August 10, 2011
As time has passed, America has become more politically correct than ever. While there is no disputing that they still exist, prejudices including racism, sexism, homophobia and religious discrimination (just to name a few) do not seem to be as prevalent or dominant in today's workplace as they may have been in previous decades. There seems to be an attitude of "hire the best person for the job" instead of "make sure the applicant meets our strict guidelines in relation to age, gender and social status". Because of these changing trends, an article I recently read really opened my eyes to what some may call the obvious: the lack of women in the world of Information Technology.
As it turns out, women were once very prevalent in many computing professions. Before the introduction of the personal computer, women were widely employed by corporations as the commercial computer industry grew out of its infancy. The mid-to-late 1960s brought an enormous boom in commercial computing, and women were right in the mix making things happen. Various tasks, including computer programming, were gender-neutral; there was a need for capable and proficient employees, and the hardware didn't care about the gender of the fingertips sending it instructions. There were even articles in publications like Cosmopolitan that portrayed computer programming as "women's work". Unfortunately, and for many reasons, things would soon change.
The introduction of the personal computer in the early-to-mid 1980s created a stigma that our society still seems to hold on to today: computing is a guy thing, and the more introverted and antisocial the guy, the better he is at what he does. Surveys show that enrollment by women in computer-related college courses took a nosedive in the 1980s and remained extremely low until very recently. Thinking back to my college days (barely a decade ago), I would guess that no more than twenty percent of my peers in computer and technology-related courses were female; ten percent may be even more accurate. From what I recall, all of my female classmates were above the curve. They were good at what they did, and most seemed to have quite the competitive attitude. There is no doubt in my mind that they are all successful professionals in their chosen fields of Information Technology today.
Recent surveys show a resurgence of women in the various computing curricula, which is very encouraging. The world of Information Technology could use a good swift kick in the pants from a group of young, energetic and, most importantly, professional women. As this group of new female graduates enters the workplace, my hope is that the introverted, antisocial loner label may eventually be wiped away from our profession. It could be wishful thinking, but I am holding to it. The things we do in Information Technology are certainly not magic, nor are they in any way biased. Good things will come as more women become exceptional contributors to the world of technology.
Originally published in The Clarion | August 03, 2011
Continuing from last week, to be fair it should be pointed out that Verizon Wireless is most definitely not the only wireless provider to implement data caps on its top-tier services. I only used them as an example of where leading-edge technologies are headed and the ironies that often come bundled with such services. As of this writing, only Sprint offers unlimited 4G wireless downloads. Beyond wireless carriers, other traditional ISPs have been implementing, and continue to implement, data caps on their subscribers. The practice is very common in Canada, and some service providers in the States use it to curb (or cash in on) utilization.
One question I received in my call for questions relates to wireless technologies and where I see them a decade from now. From last week's article, it should be obvious that something (if not several somethings) has driven most major wireless service providers to implement data caps. The first thought that comes to mind is a simple lack of capacity on the network, especially its last segment: from the tower to the mobile device. Any Internet Service Provider could theoretically sell any level of service, no matter how the packets are transported to the customer's device. In the real world, though, there are always limitations, regardless of which technology delivers the content. Common sense tells me that even 4G wireless simply cannot deliver uber-bandwidth transfer rates to many customer devices at the same time.
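The capacity squeeze is easy to see with back-of-envelope arithmetic. The figures below are purely hypothetical, chosen as round numbers rather than taken from any carrier's actual specifications, but the principle holds: a tower sector is a shared pipe, and every active device divides it further.

```python
# Hypothetical round numbers for illustration -- not real 4G specifications.
sector_capacity_mbps = 100.0   # total downlink capacity of one tower sector
active_users = 50              # devices pulling data at the same moment

# In the best case, the shared pipe divides evenly among the active users:
per_user_mbps = sector_capacity_mbps / active_users
print(f"Best case: {per_user_mbps:.1f} Mbps per user")  # -> Best case: 2.0 Mbps per user
```

Double the number of active users and even the best case is cut in half, which goes a long way toward explaining why carriers would rather cap usage than promise speeds they cannot deliver.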
Because of the apparent limitations of 4G wireless services, it should go without saying that if mobile wireless technologies are to be much different ten years from now, new technologies must come about. One newer technology that could feasibly fill the gap and exponentially increase the throughput to mobile wireless devices is the 'white space' technology I introduced back in October. By making use of unused frequencies in the broadcast television spectrum, it's not too far-fetched to anticipate wireless carriers hopping off their traditional frequencies and selling data services in the 'white space'. Testing in these areas of the broadcast spectrum has shown quite promising and attractive data transfer rates. Only time will tell whether the wireless carriers (and the FCC) proceed with this technology and where it may go. Without a different or improved method of delivery, I feel it is quite fair to say that data caps, at least on the highest tiers of wireless service, are here to stay.