The mid-2000s were full of change in the personal computing world, most notably when Microsoft introduced Windows Vista in 2006. Vista was Microsoft’s first new consumer operating system (OS) in over five years, the longest gap between consumer OS releases to that point. It was the successor to the long-tenured Windows XP.
Although Windows Vista failed to meet expectations, the new OS brought with it a host of new technologies, both hardware and software. Some important ones are still alive and kicking, while others have been left in the archives with the now-defunct operating system. Let’s take a look at four technologies that Vista helped bring to light.
Windows SideShow
SideShow made headlines at CES 2007, but its time in the limelight was short-lived. In fact, its existence was so brief that only a few supporting devices ever made it to market.
SideShow was a secondary display built into notebooks and tablets, though the technology could also be embedded in other devices, like remote controls. The SideShow display could be full color or monochrome, and was primarily intended for quickly viewing information, such as email, calendar appointments, and data presented by other gadgets, like the weather. It worked even when the host device (usually a notebook) was powered off, running from a small memory bank that cached data from the operating system (Windows Vista). The idea was to provide convenient access to data without having to stop and open up your computer.
So what happened? The modern smartphone made SideShow obsolete before it could get off the ground. The original Apple iPhone was announced in 2007, and the rest is history. Imagine taking your bulky notebook out of your bag in public to poke at a minuscule display on its backside, trying to make out emails and other data. It’s a ridiculous concept in light of what the smartphone is capable of today.
One of SideShow’s faults was that when the host machine was off, the data it displayed was merely cached, not updated in real-time. You’d have to turn the host computer on to get actual updates on the display. In addition, user input was limited to an arrow keypad. In other words, SideShow was useful for basic read-only or acknowledgement-based tasks, and little else. Not a whole lot could be seen or presented on such a small display in the first place. But at the very least, we can say it looked cool.
Here’s our original in-depth SideShow article.
Intel Turbo Memory
Intel Turbo Memory, the first mainstream flash memory cache technology, was available mostly in high-end notebooks with Windows Vista, like the Lenovo ThinkPad X61. A flash memory cache is designed to supplement the performance of the computer’s storage drive. Storage is typically the slowest part of a computer, and in the Windows Vista era, storage in personal computers was almost certainly a mechanical hard disk, as Solid State Drives (SSDs) were then in their infancy. (In an ideal world, we’d use SSD storage exclusively, but the cost per GB makes it prohibitive. Flash caches used in conjunction with traditional hard drives help provide SSD-like performance in certain situations, while still primarily using economical mechanical storage.)
The original Turbo Memory 1.0 in Windows Vista was a 1GB flash memory card connected to the computer’s motherboard via a PCI Express mini-card slot. Half of it was dedicated to Vista ReadyDrive, which acted as a write cache for the operating system, and half to Vista ReadyBoost, which cached small, frequently used files. Both technologies existed to improve the overall responsiveness of the computer when running programs.
Turbo Memory 2.0 was released in 2008, and was available in 2GB and 4GB cards. Unlike version 1.0, it supported Intel Pinning, a software technology that allowed you to select applications to cache on the Turbo Memory card. The idea was that the applications would be more responsive running off the faster flash memory than they would be running off the computer’s main hard drive.
Although Intel has long discontinued Turbo Memory, the concept is still alive and kicking. Storage performance continues to be the bottleneck in most computers, so any improvement there is always welcome.
The Turbo Memory concept spilled over into hybrid hard drives. Seagate was the first to introduce one of these in 2010, dubbed the Momentus XT. The original Momentus XT combined a 500GB hard drive with a 4GB SSD cache. It used a proprietary Adaptive Memory technology to automatically cache frequently used data and applications in the SSD. If you used a certain program often, for example, the drive would identify that behavior and store the program’s files in the cache, making future launches of the application faster than reading it off the hard disk itself.
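The adaptive-caching behavior described above can be sketched in a few lines of Python. This is a hypothetical illustration of the general idea (promote data to a small, fast cache once it has been read often enough), not Seagate’s actual Adaptive Memory algorithm; the class name, threshold, and capacity are all illustrative assumptions:

```python
from collections import defaultdict

class FrequencyCache:
    """Toy model of a flash cache sitting in front of a slow disk.

    Blocks read at least `threshold` times get promoted into the
    fast cache; subsequent reads of promoted blocks skip the disk.
    """
    def __init__(self, disk, capacity_blocks, threshold=3):
        self.disk = disk                      # dict: block number -> data
        self.capacity = capacity_blocks
        self.threshold = threshold
        self.read_counts = defaultdict(int)
        self.cache = {}                       # promoted "hot" blocks

    def read(self, block):
        if block in self.cache:               # fast path: flash hit
            return self.cache[block], "flash"
        data = self.disk[block]               # slow path: spinning disk
        self.read_counts[block] += 1
        if (self.read_counts[block] >= self.threshold
                and len(self.cache) < self.capacity):
            self.cache[block] = data          # promote frequently read block
        return data, "disk"

disk = {n: f"data-{n}" for n in range(10)}
cache = FrequencyCache(disk, capacity_blocks=4, threshold=3)
for _ in range(3):
    cache.read(0)          # third read promotes block 0 into the cache
print(cache.read(0))       # now served from "flash"
```

Real drive firmware works at the block level with wear and eviction policies this sketch ignores, but the core loop — count accesses, promote hot data, serve hits from flash — is the same.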
When flash memory became less expensive, notebook computers started to include SSD mini-cards to act as a boost cache for the operating system. These were typically 32GB or less in size. The HP Envy 4 TouchSmart we reviewed in 2012 featured such an arrangement, with a 32GB SSD coupled to a larger hard drive. The SSD didn’t actually show up in Windows as a drive, but was slaved to the hard drive behind the scenes. The first 32GB of data written to the hard drive was mirrored on the SSD, and whenever the computer read that first 32GB of data, it was reading the SSD.
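As a rough model of the arrangement described above, a read can be routed to the SSD whenever it falls entirely inside the mirrored region, with everything else going to the hard drive. The simple offset check below is an illustrative assumption about how such routing might look, not HP’s or the drive vendor’s actual logic:

```python
MIRROR_BYTES = 32 * 1024**3   # first 32GB of the hard drive mirrored on the SSD

def pick_device(offset, length):
    """Decide which device serves a read of `length` bytes at byte `offset`.

    Reads entirely inside the mirrored region come from the SSD;
    anything touching bytes beyond it falls back to the hard drive.
    """
    if offset + length <= MIRROR_BYTES:
        return "ssd"
    return "hdd"

print(pick_device(0, 4096))               # "ssd": inside the first 32GB
print(pick_device(MIRROR_BYTES, 4096))    # "hdd": beyond the mirrored region
```

The payoff of this scheme is that the operating system and core applications, which live near the start of the drive, get SSD read speeds without the OS ever seeing two separate drives.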
Even today’s entry-level computers are starting to come with real SSDs of usable capacity, thanks to lowered flash memory prices. 256GB and larger SSDs are readily found in notebooks priced at a few hundred dollars, whereas they wouldn’t have been found in notebooks south of four figures just a few short years ago.
If flash memory were as cheap as hard drive-based storage, gap-filling technologies like Intel Turbo Memory and flash caches wouldn’t have a purpose for the average user. Now that flash memory prices are at an all-time low (and continue to fall) and SSDs are becoming ever more affordable, flash caching in personal computing is becoming limited to specific applications that demand extraordinarily high bandwidth, beyond what a traditional SSD offers.
Manually Switchable Graphics
Back in the Windows Vista era, it wasn’t technically feasible to switch the OS’s display adapter (i.e., graphics card) on the fly. However, it was possible to have more than one display adapter and switch between them by restarting the computer.
The idea behind switchable graphics was the same then as it is now: pair a dedicated graphics card for 3D work with integrated graphics for power-sipping battery life when 3D prowess isn’t needed, so a notebook can offer both good battery life and good 3D performance.
The Sony VAIO SZ was the leading example of a notebook that supported manually switchable graphics. The first iteration we reviewed in 2007 had integrated Intel GMA 950 graphics on its motherboard, as well as a dedicated Nvidia GeForce Go 7400. The former was dubbed Stamina mode, and the latter Speed. A hardware switch above the keyboard allowed switching between the two.
Manually switchable graphics have been almost entirely replaced by dynamically switchable graphics, notably in the form of Nvidia Optimus. This technology automatically detects when 3D performance is needed, and switches to the dedicated graphics card without user intervention. Technological improvements since Windows Vista have made this on-the-fly switching feasible.
(Nvidia’s BatteryBoost is a related technology, designed to improve battery life while playing games on the dedicated graphics card. It throttles the performance of the graphics card to meet the desired level of in-game performance, thereby consuming less power than running the game as fast as it can. For example, you might decide to play a game on medium visual quality settings while on battery so the graphics card doesn’t have to work as hard, therefore extending the amount of time you have to game.)
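The frame-rate-capping idea behind BatteryBoost can be illustrated with a simple frame limiter: render a frame, then sleep away the remainder of the frame budget so the hardware idles instead of running flat out. This is a generic sketch of the concept in Python, not Nvidia’s implementation; the function names and parameters are illustrative:

```python
import time

def run_capped(render_frame, target_fps=30, frames=5):
    """Call `render_frame` at most `target_fps` times per second.

    Sleeping out the unused portion of each frame's time budget is
    what lets the GPU drop to a lower power state between frames.
    """
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < budget:            # frame finished early: idle
            time.sleep(budget - elapsed)

# 10 trivial frames capped at 50 FPS take roughly 10 * (1/50) seconds
start = time.perf_counter()
run_capped(lambda: None, target_fps=50, frames=10)
print(time.perf_counter() - start)
```

Capping the frame rate trades peak smoothness for power: the GPU does the same work per frame but fewer frames per second, which is exactly the battery-life trade-off described above.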
Windows Vista Sidebar and Desktop Gadgets
Desktop icons were simply static little objects you could double-click, until Windows Vista introduced desktop gadgets. These interactive pieces of software were several times the size of a normal desktop icon. They could show all kinds of information, such as the current weather, an RSS feed, or battery usage, and some accepted user input. Gadgets lived in Windows Vista’s Sidebar, a dedicated section along one side of your display, and many could exist there at once.
The Sidebar concept was left behind with Windows Vista. Gadgets continued to live on in Windows 7, however, where they could be placed anywhere on your desktop, as opposed to just living in the sidebar. (It makes you wonder why it wasn’t like that in Vista.)
But starting with Windows 8, gadgets fell by the wayside, too. They were replaced by the live tile-driven Start screen. Tiles are in large part similar to gadgets, but more versatile in the sense that they can be clicked to launch an actual app or program, whereas a gadget’s functionality lived entirely within the gadget itself.
Although Windows Vista didn’t garner the best reputation, it’s hard to argue it didn’t benefit the tech industry. It was the necessary precursor to Windows 7, Microsoft’s most successful operating system to date. The technologies we looked at in this article were all part of the futuristic technology wave that surrounded Vista, each benefitting the personal computing industry in some way, and many of the underlying concepts are still very much alive today. Intel Turbo Memory, for example, was the first mainstream flash memory cache technology, designed to improve overall system performance; flash memory caches are still widely used in storage, from hybrid hard drives to dedicated cache SSDs. On the software side of the equation, Windows Vista’s desktop gadgets were the forerunners of the tile-driven Start experience in Windows 8 and 10.
The overall takeaway is that change in the personal computing industry usually comes in waves spurred by a firm with industry-wide influence. Apple and Microsoft are two that routinely bring about change in the way we use technology, for better or worse.
If you liked what you saw here, take a look at our Top 10 Iconic Notebooks article, which goes back through a decade of notebook computer history to explore products that changed the industry.