The One Thing Windows Vista Did Right

Photo: Justin Sullivan, Getty

Vista was bad. Coming five years after XP, it was heavily anticipated by Windows users who were impatiently awaiting something interesting from Microsoft as Apple's star was on the rise. Yet when the OS dropped publicly in January 2007, it was immediately reviled by, well, everyone (except our expert reviewers).

It was slower than XP, had annoying DRM that grossly restricted what people could do and removed a ton of features people liked. It is not hyperbole to say it might be the most hated software product Microsoft has ever produced — impressive for the company that gave us Internet Explorer and Clippy. But Vista did one thing very, very right and 11 years later, it's never been more in fashion.

So what was Vista actually prescient about? Translucent design elements.

All the way back in Vista, Microsoft introduced Aero, a design language intended to be a futuristic update to XP. Aero's most eye-catching feature was the Glass theme, which could make elements throughout the UI transparent. When it was released, it didn't get more than a passing nod from reviewers who noted it was slick if somewhat irrelevant to the actual performance of the OS.

Aero lasted through Windows 7 — Microsoft's most critically lauded OS until Windows 10. Then in Windows 8, Microsoft introduced a new design language: Metro. Metro actually kicked off another major trend in user interface design: flat design elements. But it still maintained some of the cool translucent effects introduced in Aero.

Those translucent effects were carried over to Windows 10 and are easily seen in Edge, the Start menu and the Notifications panel. They're so popular, some Windows 10 users are even hacking the OS to add translucency and transparency to everything else!

The effect is super noticeable in the start menu. (Screenshot: Windows 10)

The trend isn't limited to Windows. Apple seems to have been inspired, too. That's because UI designers, like everyone else, are subject to trends. Once upon a time, everyone tried to make their app icons and buttons look rounded because of iOS.

Then, after Windows and Android embraced a flatter look, iOS followed suit with iOS 7 in 2013. It also began sprinkling that sweet, sweet translucent design throughout.

Look at these pretty menus! (Screenshot: macOS Mojave)

The translucent elements first appeared in Mac OS X 10.5 Leopard as an option to turn the menu bar translucent. That was in October 2007, nearly a year after Vista launched. Apple seriously began showing off translucent elements when iOS 7 added translucent menus and notifications in 2013. OS X 10.10 Yosemite began embracing translucency a year later.

Since then, both Apple operating systems have added more and more translucent elements. The most recent additions come courtesy of the betas for macOS Mojave and iOS 12. That's because both are adding dark translucent elements, which seem to highlight the translucency effect even more.

It is reminiscent of glass that's been frosted and tinted. It's very attractive. Sometimes I get distracted into marvelling at it instead of doing work.
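Mechanically, the frosted-glass look is just two compositing steps: blur whatever sits behind the pane, then alpha-blend a tint over the blurred result. Here's a toy pure-Python sketch of that idea; the function names and values are illustrative only, not drawn from any real UI framework:

```python
# Frosted glass in two steps: blur the backdrop, then blend a tint over it.
# Works on a tiny greyscale "image": a list of rows of values 0-255.

def box_blur(img, radius=1):
    """Average each pixel with its neighbours (a simple box blur)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total // count
    return out

def frost(img, tint=255, alpha=0.3, radius=1):
    """Blur the backdrop, then alpha-blend a tint colour over it."""
    blurred = box_blur(img, radius)
    return [[round(p * (1 - alpha) + tint * alpha) for p in row]
            for row in blurred]

# A hard black/white edge: blurring softens it, the white tint lightens it.
backdrop = [[0, 0, 255, 255]] * 4
pane = frost(backdrop)
```

Real implementations do the blur on the GPU and resample the live backdrop every frame, but the compositing maths is the same.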

I mean just look at it in Safari!

GIF: Alex Cranz, Gizmodo

It's so good, I find myself using Safari instead of Chrome just so I can watch stuff I'm scrolling through turn blurry as it hits the browser frame.

The transparent elements, while not as ubiquitous in iOS, are still present there too — particularly in the iOS 12 beta, which has done away with the garish white panels in the notification centre and embraced a dark and translucent look.

Screenshot: iOS 12 Beta

Since Microsoft introduced Aero in 2007, the transparent elements of the Windows UI have evolved and been refined from an operating system's splashy party trick to an elegant element you might not even notice. Apple has embraced the trend and even Android is now flirting with translucency.

Since Android Oreo was released last year, more and more translucent design elements have appeared throughout Android. It's especially noticeable in the beta for Android P, the next version of Android expected later this year.

From left to right: The Notification menu in Android P. Top view of open apps in Android P. The notification menu on a Samsung Galaxy S9. (Screenshot: Android)

Google's Android, like Apple's platforms, is embracing the trend that began with Vista. Which means, yeah, one of the touchstone design ideas in operating systems and apps today didn't come from Microsoft's best operating system. It came from its worst.


Comments

    It is not hyperbole to say it might be the most hated software product Microsoft has ever produced

    Nope. Windows ME was a hundred times worse.

      Oh that's right, totally forgot about ME. I skipped Vista so missed out on all the fun, but knowing me and my urge to try all the new stuff back then, I probably did try out ME.

        the best part about ME was the CD key: all zeroes was a valid CD key, LOL

      I'm one of the few people on the planet that had no problems with ME. In a shared house, its inbuilt internet sharing was most helpful.

      Thing with just about every "bad" Windows OS was that they were followed by a great one. Which was built on the foundations of that "bad" version. A lot of ME made it into XP, Windows 7 is heavily based on Vista, and there aren't that many differences between 8 and 10.

      The newer versions just don't have the stigma and hate attached to the name. But look under the hood and there ain't much difference.

        Sounds like you were lucky, everyone I knew that used WinME had no end of trouble. XP was a godsend in comparison.

        Windows 8 onwards has been Microsoft's shift to iterative OS development. I don't know if it's still their plan but it was at one point their intent that Windows 10 would be the last major Windows version moving forward and future development would go into iterative updates to that instead. Understandable, since it helps keep everyone on the same version and reduces their LTS burden for older operating systems, and things like the Spectre fix really drilled home how important it is to have everyone on the latest version.

        I know a lot of people disagree, but telemetry controversy aside, Windows 10 is definitely the best operating system they've made to date. The kernel is incredibly stable, the rendering pipeline is mature, minimalist and clean, and it has by far the best error handling/recovery to date. As a software developer and PC enthusiast I genuinely recommend it for everyone, even the XP/7 holdouts.

          Vista was the only problem OS I didn't use, but from personal experience most people's trouble was a bit overstated. More relevantly, the guts of most of them got recycled into something people loved, so the ideas weren't BAD, just badly implemented.

          I remember with Vista as an example, Microsoft made the mistake of asking people what they wanted, and got told "more security". Which they gave by dialling it up to 11. That security was still in 7, you just didn't get asked 5 times if you wanted to install something.

          It's the asking-5-times crap that people remember though, not the much better security versus XP.

          Windows 8 saw the tiled approach, which people hated. They didn't want to change from the Start Menu approach (and I can't blame 'em), so rejected it. That launch screen was really just a full-screen start menu though. And you still see it in 10 in the current Start Menu, just dialled back to what people can accept.

          I love 10 myself, it's a great summary of everything that's evolved since 95. Only downside is the lack of support for older software, and how hard it is to get some older games working. Games I still don't mind playing.

          To the point I'm tempted to get a cheap lappy, and downgrade its OS to something like 7 or XP just to play them. I still have an unopened copy of 7 somewhere.

            I actually haven't had any issues with old games not working on 10, but I haven't tried that many to be fair. DosBox works just fine for the particularly old generation, I guess it's more stuff designed for the Win95 era that have the most trouble.

            Virtualisation is an option if your PC is beefy enough, although I've never used it to run hardware-accelerated games so I don't know which suites can handle it and which can't.

              Most games work fine thanks to DosBox and compatibility modes, but there are a few games that just don't fit the mould. The Movies, for example, which I don't mind running through once or twice every couple of years. It just doesn't play nicely with Windows 7 onwards, and I assume Vista as well. I was able to get around it in 7, but haven't been able to with 10. Unfortunately, there's no other game like it that I can replace it with.

              There are a few others from roughly the same period that aren't set up well to work in the newer environments. In the end it's mostly the loaders, and there are often workarounds out there, but it can be some work. Compatibility modes don't fix things either.

                Windows 10 did basically drop support for certain types of copy protection. Mainly older versions of TAGES and SecuROM.

                Maybe that is the cause of your problem?

                  Yeah that would make sense. I know some of them broke because of the SecuROM issue, but not all of them. The Movies for example I could get working in Win 7 by running a secondary exe rather than the boot exe, but that doesn't fix it in 10.

                  It's not the end of the world, I have plenty of other games to play, but when I get the whim to play one of those old games it's annoying when they don't work. I'll always have Transport Tycoon and Rollercoaster Tycoon installed for example. If they stopped, I'd be devastated, but thankfully I have versions that have no probs.

                  @skrybe yeah, I've seen the VB solutions. I just wasn't THAT interested in playing it, tbh. It's a lot of work for a game I'd end up uninstalling a month later.

                  It's just a bunch of old games I like loading up for old times' sake. I don't really want to go to too much trouble, it's just a shame they no longer work.

                  Plus I don't really like those types of solutions for what you suggest - they're pathways to viruses, and with how many of those solutions get the job done, you can never be sure if the virus checker alert is a false positive or not.

                  @grunt: While I'm sure that some game patches (cracks) have viruses in them, I've never encountered one. And when the patch is on a major fansite for a game it's going to have a lot of eyes on it. Typically if it was dodgy it'd be called out very quickly.

                  And fair enough, I understand the sentiment. I have a few old games I wouldn't mind playing but they don't run on first install so the interest wanes rather quickly when there are dozens of other games to be played.

                  Side note: Not advocating piracy, but when a game you own is unplayable and a crack fixes it then I reckon go nuts.

                  @scribe I have no problem with cracks that make playing games you own easier. No-CD exes, for example, just so I don't need to go searching for a disc every time I change games.

                  Convenience is important and one of the biggest reasons Steam got so big so fast, and it still applies to non-Steam games.

            I tried Vista on my main PC but by default it had some QoS tweaks that messed with network traffic. I remember trying to move files around and it was taking literally twice as long as it did on XP. It didn't last long on my PC before I reinstalled XP.

            I find 10 frustrating partly because of the "we're gonna update whether you like it or not" approach and partly because of their design decisions. Stuff like having a settings app and a control panel. Dammit, put everything in one place! Plus tablet style controls and UI elements that look like ass when displayed on a 30" 4k monitor. They may be fine for a tablet but not for a desktop.

      Sometimes the brain blocks out traumatic memories.

    I used ME and Vista, and both products for me were superior to the alternatives on the market (98 and XP at the time).

    Whilst I'm willing to allow ME the benefit of the doubt (MS removing protected DOS did mess with some people, although it was massively overstated IMO), Vista's issues for most came down to two issues, and only one was Microsoft.

    Microsoft's performance killer was that Indexing was on by default, which hurt performance pretty hard for those of us not running a RAID array. If you turned off indexing, Vista worked pretty well.

    The second is almost entirely Nvidia's fault, and their bugtastic drivers that bluescreened Vista like crazy.

      It has been a while since I installed either ME or Vista on anything, but I recall both being very solid.

      I do recall being pretty sad that ME's support for native DOS had departed, but it was time to move on.

      My only recollection from Vista was how SoundBlaster dropped their SB Live soundcard drivers like a hot meat pie (because Vista was going to do away with game ports (the old pin-interface connector for joysticks) in favour of USB).
      The bit that pisses me off about that was that literally a year or two earlier, MonsterSound were still making much better soundcard hardware for gamers, and lost out to SB...
      The joke is on SB though, I’ve been using onboard sound ever since... no looking back now.

        The Aureal based soundcards were amazing. I had the Diamond one and it kicked the crap out of the SB cards when it came to sound positioning. It was almost like having a cheat on in games that supported it (counter-strike). Was disappointed when they went under then got bought out by SB. Like you I haven't used a dedicated sound card in years now. Onboard realtek has been good enough for ages.

      People forget that NVIDIA had ages to write a driver for WDDM that wasn't complete shit, and they completely failed at it.

      WDDM was a major step forward - a crashing display driver could just be restarted without killing the entire system!

        WDDM is the biggest thing that Vista did right, not "translucent UI" like this stupid article states.

      Among its many issues, Windows ME had straight-up memory leaks in the OS itself. As best I can recall, it was never fixed by Microsoft during ME's normal run (perhaps never fixed at all?). Microsoft seemed more interested in focusing on NT kernel supremacy instead, with XP being its next major release.

        I was going to ask what your personal horror stories were for Windows ME (I've only used it in a VM without issue), but I don't think I need to hear more.

        If an OS has a memory leak in itself then it is clearly fundamentally unstable.

        "Restarting" to clear memory was an issue with Win98 as well from what I remember.

    AMD had the same issues with drivers.

    The change to the new WDM caused tons of issues with hardware.

    ME was a terrible buggy mess and W98 was a far better OS.

      This is actually false. ATi's drivers were relatively stable at Vista's release.

    My favourite part of vista was the ability to put commands right into the start menu, no more going to RUN first.

    Vista was pretty unfairly maligned, in my opinion. It came at a rough time. Personal PC usage increased massively with Windows XP - more people were using computers than ever, often for the first time. This was all during the mainstream rise of consumer gadgets and a huge spike in 'average' people using PCs. So for a lot of people - Windows XP was all they knew. Vista dropped, and for a lot of people it was the first new thing they'd seen, and it was more a case of 'THIS IS NEW AND DIFFERENT, THUS I HATE IT.'

    It also came at a time when RAM specs were really starting to shift, and we had the rise of budget laptops and desktops - stuff around the $500-$800 mark, which unfortunately were made so cheap by massively underspeccing them on RAM, amongst other things. Most budget PCs came with 512MB, which isn't enough for Vista to work well - then couple that with the bloatware these budget PC manufacturers (Compaq/HP, etc) filled them with and it was a recipe for disaster.

    A lot of these PCs were doomed from the start, and people squarely blamed Vista for it. Sure it wasn't perfect, but it was nowhere near as terrible as people made out. Strip the bloatware out, and put 1-2GB of RAM in, and it trundled along beautifully.

    I was working in a local computer repair shop at the time, and lost count of how many budget PCs came in with people screaming 'VISTA IS TERRIBLE AND I WANT XP BACK!' In a lot of cases it wasn't possible because there weren't XP drivers for these machines. But with a small RAM upgrade and a bit of love, most people were shocked at how well their PCs ran.

      That generation of laptops with Vista crammed on, nowhere near enough RAM and zero accountability for bloatware was the stuff of nightmares. I know so many people who threw away serious chunks of cash on laptops that were never going to work properly.

    While not horrible by any means, I hope nobody minds if I share my Vista tale.

    It sticks out in my mind because it was also the OS I installed on the first PC I built myself (in the past I had a shop build it for me).

    And talk about being dumb - I installed 32-bit Vista because I failed to notice my Core2 Quad (a Q6600 from memory) was a 64-bit processor.

    That aside though, despite the constant horror stories, I didn't have issues with Vista and I even had an 8500 GS back in the day (yes, I did try to get an 8800 GT back then but they weren't to be found for love nor money!).

    It was slow (I only had around 8 GB of RAM at the time) but it still did the job and I got by.

    And of course, eventually the betas for Windows 7 were floating about. So I got a copy off my co-worker (I didn't use the key, I was only going to use it for a few days), and installed it into VMware player.

    Needless to say, it was limited in its resources (only 2 cores and 4 GB of RAM).

    But here was the kicker - the Windows 7 beta in my VM was running better than my native install of Vista.

    Granted, there could have been other factors but needless to say the difference was great enough that when Microsoft opened the public beta, I signed up and eventually got my download image.

    Even though it was in beta, it was also where I composed my PhD thesis and was happy to get the full version of Professional when the time came.

    In all, I like to feel that Windows Vista suffered the same headaches as Windows ME (granted, the latter of which is academic to me as I never owned nor used it during its heyday).

    When it comes to operating systems, they are often only as stable as their buggiest driver. The removal of real-mode DOS clearly disrupted things and most likely broke the inner workings of some drivers, thus lending a hand to the instability. At the same time though, it was the turn of the millennium so ME was (in honest truth) a cash grab.

    Vista had the same problem. The architecture for its drivers had changed but some (not just nVidia) either got lazy or basically just said "FU" to Microsoft (I'm looking at you, Creative!).

    So to me, Windows 7 was not only a significant improvement in terms of architecture and optimisation - it was also damage control for the relationship between Microsoft and driver manufacturers.

      Scratch that part about real mode DOS.

      I always thought it was removed but nope, I was wrong. It was just restricted.

      It can still have an effect, but the claim I made was still wrong.

      I'm trying to remember, was Vista the first one where MS blocked low level access to hardware? If so that was the reason drivers from companies like SB really suffered. The added layer between hardware and driver was a real problem.

      I had Vista preinstalled on a laptop I bought and found the same thing you mentioned. Windows 7 ran way, way better on it. I even installed windows 10 on it a couple years ago as my test device, and Windows 10 ran better than Vista.

        I'm trying to remember, was Vista the first one where MS blocked low level access to hardware?

        Er, I think that was NT 3.5 (?) where NT was the OS. Windows 3.11 all the way up to the 9x series (including ME to a degree) were graphical shells on top of DOS.

        NT went on to be NT4, 2000, XP and so forth.

        So no joke, Windows 10 had its origins all the way back in NT 3.5.

        If so that was the reason drivers from companies like SB really suffered.

        I'm assuming by SB you mean SoundBlaster (sorry in advance if this is wrong).

        From my experience, I had an Audigy 2 Value (still a good card for older systems) which supported EAX (environmental audio effects) but in Vista onwards that capability was missing.

        What put me off was a modder created custom drivers to bring that functionality back into Vista and Creative responded by setting their lawyers on him.

        The capacity was there, Creative just didn't want to put in the resources and were trying to push their (then current) X-Fi products.

        It's why I ask about any manufacturer other than Creative when it comes to sound cards as any good there is long gone.

          I'm not just talking about a HAL. From memory there were drivers using that for a while. But there was a version of Windows that basically stopped all drivers from bypassing the HAL. Which caused major headaches for soundcards like the SoundBlaster ones. I thought (but could be wrong) that was Vista. I haven't been able to find a relevant article via Google or I'd confirm it (too many hits and they don't seem to be related).

          I've used Windows since 3.0, we went to NT4 (skipped NT3.5 for the standard desktops for some reason) at work while I stayed on the Win95/98 branch personally until we started using Win2k at work then I jumped to that.

          I feel like Creative bought and buried Aureal simply to keep pushing their (at the time anyway) inferior alternative. I'm pretty sure I used those modded drivers on the Aureal card I had back in the day. I remember it running fine for ages then needing to tinker to get it working after a windows update - so it's probably the same thing you're talking about.

        WDDM was a thing back in XP, and you could still negotiate access to hardware, just device driver manufacturers really didn't want to update their drivers to comply with it. Which was sad.
