Originally published in The Clarion | June 27, 2012
Looking back at computer operating systems over the decades, it is quite astonishing how far technology has come. As machines grew faster and more robust, graphical user interfaces, or GUIs, replaced simple text-only displays. For many years, essentially every GUI – regardless of the platform or operating system – remained basically the same. Recently though, with the influx of mobile computing and the now-popular tablet-type devices, graphical interfaces have begun taking on a new flavor. Since the display on most mobile devices is also the input mechanism, software and operating system makers have redesigned the GUI to make these devices as user-friendly as possible. Unfortunately, at least in my opinion, some software vendors have taken the GUI a little too far, and the decisions they have made are going to have an impact on each and every one of us.
The problem with the new era of GUIs is that most operating system vendors have adopted a one-size-fits-all mentality. My first experience with this unfortunate situation came a few years ago with the Ubuntu Linux distribution. Seemingly overnight, Ubuntu revamped its GUI into one better suited to the then-popular netbook devices. While the new interface looked nice on smaller screens, in my experience it was quite haphazard on a typical desktop computer. Other GUI suppliers followed suit, causing frustration in the realm of productive computing. Now Microsoft has jumped on board and entirely revamped the GUI for its upcoming release of Windows 8.
I mention this now simply because the GUI in Windows 8 is so drastically different from every other version of Windows that it has the potential to create a plethora of disgruntled consumers once it is released. A few weeks ago I had the (unfortunate) opportunity to try out a pre-release version of Windows 8. I had read reviews of the new GUI and had even seen images of what to expect. Until I sat in front of it, though, I was in no way prepared for what I was presented with. Microsoft has put all of its eggs in one basket, offering an interface that, regardless of the device it runs on, really belongs only on some sort of touch-screen device. The familiar “desktop” is still there, but only after it is chosen from a grid of awkward boxes – quite frustrating to say the least. Navigating from place to place on the machine is cumbersome at best. It took only a couple of minutes for my frustration to grow to the point of throwing in the towel. If you plan to purchase a new personal computer with Windows 8, please take the time to try out the new GUI before you do. Nothing is more frustrating than being stuck with a counterproductive computing device.
Originally published in The Clarion | June 20, 2012
Last week we finished our look at the four primary components of a personal computer (and most any other computing device, for that matter) with a closer look at the hard drive. While virtually all computing devices contain these four primary components – the main board or motherboard, memory, central processing unit and some sort of fixed media – most of us know that a typical home or office computer includes additional components. Today we will run down a short list of pieces that, along with the four primary components, complete the puzzle of computers.
It should go without saying that any computing device typically includes a case or enclosure in which the individual parts reside. Computer cases, along with every other component, have evolved quite drastically over the years. While the function of cases has not really changed, their style, design and aesthetics have. Not so many years ago, personal computers were almost always what are referred to as desktop models. Their enclosures were horizontal and intended to sit on top of a desk or table. They were bulky and heavy and, more often than not, were placed atop a desk with a mammoth CRT display monitor on top. Today, most “desktop” computers come in vertical “tower” cases, allowing the user to put them on the floor, the back corner of a desk or most anywhere else out of the way. This design has become the norm, and while it is not impossible to find a horizontal “desktop” case, most cases on the market today are of the tower style.
Another design modification that has come about in recent years focuses on fan cooling. Today’s faster, more robust computer components generate an enormous amount of heat. To overcome this potentially fatal aspect of computer components, most computer enclosures now include at least one case fan, integrated into the case and strategically placed to assist in cooling the devices inside. Some cases, depending on the application, are designed to include several fans – as many as four or even more. Also, modern CPU chips come with a heat sink that sits directly on top of the CPU, pulling heat away from it, and quite often include a fan on top to help cool the heat sink. Extreme computers (typically built by gamers or hobbyists who like to push the limits of their machines) are often modified with liquid cooling – circulating water, liquid nitrogen or any number of other liquids through the machine to carry heat out. This, of course, should be done with extreme caution, as liquids and electricity simply do not mix. A final consideration for case fans is that, in order to function properly, they must be kept clean and free of obstructions and debris – primarily dust. An occasional cleaning with a can of compressed air can help ensure a long life of service from cooling fans.
Originally published in The Clarion | June 13, 2012
Last week we continued our look at the four primary components of a personal computer (and most any other computing device, for that matter) with a closer look at the central processing unit – what it is and how it fits into the mix. As a reminder, the four primary components of a personal computer are the main board or motherboard, memory, central processing unit and some sort of fixed media. Today we will look at the last of our four components, the fixed media – typically referred to as the hard drive. Contrary to popular belief, a personal computer does not necessarily need a hard drive to function, although almost every computer has one. Thanks to technologies like live discs and client/server computing, it is possible (and even practical) to have a functional machine that includes everything but a hard drive. While appropriate in some scenarios, though, such a setup is not the norm in typical home and business computing environments.
Hard drives are the meat and potatoes of typical computers. When they are performing properly, all is well. When things turn sour, on the other hand, they can get really bad in no time. Along with power supplies, I feel it is fair to say that hard drives are almost always the most vulnerable components of a typical desktop or server machine. A few underlying factors contribute to this, including their physical structure and the inherent vulnerabilities of the operating systems and software we install on them. Unlike most other PC components, the hard drive is a mechanical device. It has moving parts that are constantly on the go, and as with most any other mechanical device, things eventually wear out or break. Unlike the transmission in an automobile, which can and should be serviced on a regular basis to ensure functionality, the mechanical parts of a hard drive are not serviceable. Hard drives simply work until they quit, and once they do quit there is typically no getting them back.
I have written many articles over the last couple of years stressing the importance of data backups. The number one reason for this recommendation is the failure rate of hard drives. Whether due to physical damage or a virus infestation, hard drive failure is a “when,” not an “if.” Sure, problems like virus infections can be remedied, but doing so often results in data loss. It is also possible to retrieve data from a physically broken hard drive, but such services are provided only by professionals and are never cheap. Remembering that your hard drive will someday fail helps reinforce that routine data backups are a must.
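As a simple illustration of what a routine backup can look like, here is a minimal sketch in Python that zips a folder into a dated archive; the folder names are hypothetical stand-ins, and any backup tool or external drive you trust accomplishes the same goal.

```python
import shutil
from datetime import date
from pathlib import Path

def backup_folder(source: str, dest_dir: str) -> Path:
    """Create a dated zip archive of the `source` folder inside `dest_dir`."""
    Path(dest_dir).mkdir(parents=True, exist_ok=True)
    base = Path(dest_dir) / f"backup-{date.today().isoformat()}"
    # shutil.make_archive appends the ".zip" extension itself
    return Path(shutil.make_archive(str(base), "zip", source))

# Hypothetical example: archive a photo folder onto an external drive
# backup_folder("C:/Users/Me/Pictures", "E:/Backups")
```

The key habit is not the tool but the routine: run something like this on a schedule, and store the archive on a different drive than the one that will someday fail.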
Originally published in The Clarion | June 06, 2012
Last week we continued our look at the four primary components of a personal computer (and most any other computing device, for that matter) with a closer look at the motherboard – what it is and how it fits into the mix. As a reminder, the four primary components of a personal computer are the main board or motherboard, memory, central processing unit and some sort of fixed media. Today we will look at the central processing unit, or CPU. Without the CPU, a motherboard has extremely limited functionality, and your fixed disks (and the operating system, software, files and so on stored on them) are essentially inaccessible. While the motherboard can be viewed as the central nervous system of a machine, think of the CPU as its brain. The CPU ties all of the “nerves” on the motherboard together, sending and receiving instructions to and from every component in the system. Just as with humans, some brains (CPUs) are more capable than others. Unlike the human brain, though, a CPU does not store information; it simply moves instructions from one place to another, much as the human brain tells my fingers which letters to type as I compose this article.
With this explanation, it should be very apparent how vitally important a CPU is to a computing device. Without it, the digital photo or music collection on your fixed disk is inaccessible. At the same time, the more capable your CPU, the more you are able to do on your machine at once. Even CPUs from as recently as ten years ago are quite limited compared to those available today. Thanks to multi-core technology, one physical CPU chip may actually contain two, four, sixteen or even more individual processors. When paired with a compatible operating system, a single physical machine can run a plethora of applications simultaneously, spread across the individual cores. This capability has enabled a newer server technology called virtual machines (VMs). Larger companies (and anyone else, for that matter) can run several individual “virtual” machines simultaneously on one physical server with one physical CPU – resulting in extreme cost savings on hardware, rack space and electricity. For example, a server with a 16-core CPU can easily function as a 16-server box: one core dedicated to the physical VM host itself plus 15 individual VMs all running at the same time, each providing any number of different functions and services.
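To make the idea of spreading work across multiple cores a little more concrete, here is a minimal sketch in Python using its standard multiprocessing module; the squaring function is just a hypothetical stand-in for real work.

```python
from multiprocessing import Pool

def square(n: int) -> int:
    """A stand-in for a real unit of work handed to one core."""
    return n * n

if __name__ == "__main__":
    # A pool of worker processes lets the operating system spread
    # these calls across whatever cores the CPU provides.
    with Pool(processes=4) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Each worker in the pool is a separate process, which is the same basic principle a VM host relies on: the operating system schedules independent tasks onto independent cores.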
CPUs are quite rugged devices. As long as they are kept cool with proper fans, they rarely cause problems. One important note about CPUs is that, unlike devices such as CD-ROM drives, they must be properly matched to the motherboard on which they are installed. Odds are very good that a new four-core CPU chip will not work with your five-year-old motherboard. CPUs must be paired with a compatible motherboard, and upgrading one almost always requires upgrading the other.