Did the “low-end” upgrade, or did the “high-end” downgrade? Why are we trading the feel of a real workstation for the feel of a smartphone or tablet?
In the mid-90s, when I started my professional career in graphics and print, I progressively encountered increasingly powerful PCs. The contrast between “consumer-grade” and “professional-grade” machines was considerable.
But first, let me share something I have noticed over the years. In the late 80s, computers were very immature by today’s standards. Displays were modest, and the transfer from display to hard media was unreliable (at least on the vast majority of non-dedicated machines back then).
In the early 90s, WYSIWYG, together with advances in graphics and displays, made our machines more usable and more productive, and no longer simple hobby items.
By the mid-90s, these machines were able to do absolutely everything we do now. What has changed is how much bigger files have become and how much less time they consume. For example, in 1995 one could create a 100×70 cm, 300 dpi raster file, but every step of working on it took tens of minutes. That single file most probably consumed a tenth of the hard drive, while today, given the limits of printing, the same file takes a few seconds to save and perhaps occupies a millionth of the drive.
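To put that 1995 example in numbers, here is a quick back-of-the-envelope calculation. It assumes an uncompressed CMYK file at 4 bytes per pixel; actual on-disk sizes varied with format and compression, so treat this as a rough sketch, not the exact file the author worked on.

```python
# Rough size of a 100 x 70 cm raster file at 300 dpi.
# Assumption: uncompressed CMYK, 4 bytes per pixel.
CM_PER_INCH = 2.54
DPI = 300

width_px = round(100 / CM_PER_INCH * DPI)   # ~11811 px
height_px = round(70 / CM_PER_INCH * DPI)   # ~8268 px
megapixels = width_px * height_px / 1e6
raw_bytes = width_px * height_px * 4        # 4 bytes/px for CMYK

print(f"{width_px} x {height_px} px, {megapixels:.0f} MP")
print(f"~{raw_bytes / 2**20:.0f} MB uncompressed")
```

Several hundred megabytes against the roughly 1–4 GB drives of the mid-90s is indeed a large fraction of the disk; against today’s multi-terabyte drives it is a rounding error.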
In our current era, there are many thresholds we are unable to overcome, due to the disproportionate growth of the many technologies that need to advance together. A quick example: in smartphone R&D, efforts are being made to increase processors’ computational power while simultaneously reducing their power consumption, because battery technology is evolving at a snail’s pace.
Another, more elaborate example: in digital cameras, a sensor’s evolution has three factors: (1) pixel count, (2) SNR (signal-to-noise ratio), and (3) power consumption. If these three tracks don’t stay synchronized, the result is a failure for the maker, or at least a major setback in the camera model’s overall progress.
On the other hand, here is a good sign of proportionate technological progress: a 6-megapixel image needed 1–3 seconds to open in Photoshop in 2002, on an industry-standard Xeon-based workstation. The same time is now needed to open a 35-megapixel image in current Photoshop on a current Xeon workstation. Refer to Moore’s Law.
Meanwhile, display technologies are not advancing at the same rate (excluding ultra-high-end and specialized solutions that rarely make it into the mass professional market).
In short, for computers as standalone devices, nothing fundamental has changed in the last 20 years, nothing comparable to the milestone of 1990, or to that of 1980 before it. (Connectivity, such as networking and the internet, is a whole different plane and is not covered here.)
Now let’s get back to the article’s main point. There is something I have noticed in the evolution of most professional tools in our hands: once again, the seam between the high end and the low end is getting smoother and the gap smaller, either because the “high” is not getting higher or because the “low” is catching up. For example, in camera sensor manufacturing, entry-level DSLRs have sensors 90% as good as the flagship model’s; the refinements that make the flagship superior now mostly concern feel and handling, whereas a few years ago they were drastically qualitative. Modern laptops compete with workstations on almost every type of work, and the timing gaps are no longer what they were even 10 years ago, when jobs that ran on a decent workstation were impossible on a consumer PC or laptop. One might think the design is the same and the specifications are merely cut down by the manufacturing process. Perhaps a workstation’s scope of operation is wider, but most jobs fall within the range where both the laptop and the workstation perform great. The same applies to entry-level cameras versus flagships when conditions are optimal.
Customizing a workstation with cards and peripherals is no longer an art. In 2009 I returned to Apple when they embraced the Xeon processor, which I had been using since 2003; now, at Apple, our margin of freedom peaks at selecting the highest specifications among what little they offer. It is also odd to see top features implemented in lower models yet not even offered as options on the flagship (the same happens with Nikon cameras).
With those blistering OS X upgrades, a workstation no longer feels or behaves like a workstation; it is now an ultra-fast iPhone with a 27-inch display. Personally, I have lost the feeling that I am running a machine that demands dedication and concentration, one to be pushed to the inaccessible limits where lesser machines fail. That feeling of downgrade is exactly the feeling I got when the computer made its way into the living room.
In the 1984 Macintosh commercial by the almighty Ridley Scott, Apple sent a loud message about killing Big Brother (IBM at the time), but unfortunately the years have turned Apple into one. Our computers will once again become terminals for input and display only, if not tomorrow, then the day after.
The power of this beast, and its form factor, is unprecedented. But the iterations of the operating system are downgrading the experience, update after update.
Customizing a workstation to one’s own rhythm leaves one with no excuses. A good workstation should be as fluid as its user’s thoughts.
Real-time performance, that is, responsiveness and the absence of lag, is key to making the machine an extension of one’s body. (Photo from 2001.)
Always push your machine to its limits, or at least make it perform where it performs best. (One of a pair of Intel Itanium II demo processors I once got. Unfortunately, I had to settle for my 2.4 GHz dual-socket and 3.1 GHz dual-socket Xeon computers, since I couldn’t find a board or the proprietary software to run them.)
Listen to every word, and you shall deduce how tools help in achieving qualitative supremacy. Note the use of “real time.”