As I stated, I've used PCs, Linux boxes, UNIX systems, and Sun SPARCs. Thanks to commonalities in graphical user interface concepts, they've all been easy to use. (I'm not saying I logged on and was an instant power user; I'm saying that I've been able to log on to a variety of systems and work as needed.)
Exactly. I believe the main issue people have with different UIs is that they try to take the paradigms from one UI and apply them in another. That is, they treat, say, Ubuntu or the Mac OS the same as they would Windows, and then, when they fail, they blame the operating system's GUI.
A prime example would be my experience with the Mac OS, which was ages ago (OS 8.6); I wasn't super-familiar with Windows either. Anyway, back then a window had a square box in the upper left, which closed the window, and on the right a "restore" button that maximized it and a "roller-upper" thingy that shrank the window down to just its title bar.
At first, I mentally translated the square button to be the same as the Windows "X" button: that is, it closed the window. However, I later discovered that the analogy ended there. While it closed the window, it did not quit the application, as the task switcher made evident; the applications were still running. I even assumed for a while that quitting a program was impossible, but it became clear that I could switch to the application, the menu bar would change accordingly, and I could then select the proper command to quit.
Once I figured that out, things went a lot more smoothly... (namely because I no longer had 20 or so instances of Internet Explorer running).
These same sorts of nuances are also found (more subtly, of course) in successive versions of Windows.
Take the switch from Windows 3.1 to Windows 95. Basically, the focus shifted from application-centric (you start the program, then use its Open command to open the file) to document-centric, where you opened a document and Windows launched the proper application with it. The idea was that you no longer needed to know to start, say, Excel in order to open an Excel spreadsheet. Of course, you could still start the application first and open the file from there, too.
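The document-centric model boils down to a lookup: the shell keeps a table mapping file extensions to handler applications, so "opening" a document really means resolving and launching the right program. Here's a minimal illustrative sketch of that idea in Python; the table contents and function names are hypothetical (real Windows 95 stored associations in the registry under HKEY_CLASSES_ROOT, not in code like this):

```python
import os

# Hypothetical association table; Windows 95 kept the real one in the registry.
ASSOCIATIONS = {
    ".xls": "excel.exe",
    ".doc": "winword.exe",
    ".txt": "notepad.exe",
}

def resolve_handler(document_path):
    """Return the application associated with the document's extension."""
    _, ext = os.path.splitext(document_path)
    try:
        # Extensions are matched case-insensitively, as on Windows.
        return ASSOCIATIONS[ext.lower()]
    except KeyError:
        raise ValueError(f"no association registered for {ext!r}")

# resolve_handler("budget.xls") → "excel.exe"
```

The user never names the application; double-clicking "budget.xls" performs the lookup and the shell launches the result with the document as an argument.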
Additionally, Windows 95 overhauled the user interface. Windows 3.1 was for the most part "flat" in appearance; while 3-D controls and libraries were available, the core Windows controls still drew in monochrome. Windows 95 rewrote these default drawing procedures to render controls and windows with 3-D borders and shading, in a fashion similar to what those add-on libraries had done under Windows 3.1.

It also moved the various window control buttons. Windows 3.1 featured three buttons in its windows: the control-menu button (which probably had the strangest icon of all), plus the minimize and maximize buttons. 95 created a new button, the close button, which never existed previously (unless you count double-clicking the control menu), placing it on the right, where, in Windows 3.1, the maximize button lived. The control-menu button became a control icon, which would display the newer small icons that Windows 95 programs carried (or, for a 3.1 application, a shrunken version of its 32-pixel icon). This icon retained the functionality of the old control-menu button and did away with the strange toaster-like icon that really didn't make sense. (What was it? I beg of you!)

At the same time, however, this introduced migration issues for longtime 3.1 users, who would automatically click the right-most button to maximize. In 95, doing so closed the window... which was the idea behind separating the close button by a few pixels from the other two, a gap that usually made people pause before clicking it. While not a paradigm shift in as many ways as on other operating systems, it still meant that users had to change their behaviour.
In general, it is this required change of behaviour that causes people to become frustrated with a new operating system, or with a different operating system than the one they are used to. How receptive people will be to a UI redesign is, in fact, quite personality-specific; the opposition's position is essentially that "the user will decide when to change". Unfortunately, given the opportunity, they never will. You can proclaim that a certain feature will be removed and replaced, and that they should learn the new method... but they will still use the old method, will still complain just as loudly when the feature is finally removed, and will still be clueless about its replacement...