Re: In the old days
The current approach is a bit like buying a car that is constantly in the garage for repairs.
A garage, also known as a "service" station by some in the US?
Also the funky names e.g. Anniversary Update, Spring Creators Update...SatNad thinks major updates to an operating system are like expansion/content updates for a MMORPG.
You're not the first one to make the comparison, though I never thought about it in terms of the update names. MMO customers demand frequent updates to keep things from getting stale, and bugs are tolerable. Microsoft seems to think that's a good model for an operating system, which is pretty crazy.
If they're looking for things to remove, why not start with Cortana, Edge, Microsoft Store, and every other UWP "app" on the system? And with those gone, why not just get rid of the entire UWP subsystem?
It's not wallpaper and regedit that are making Windows 10 so bloaty. Rather than get rid of things that are useful and that should be a part of an actual OS, how about getting rid of those things that are not useful? They are relatively easy to identify... just make a list of everything you've added to Windows since Windows 7 SP1 and consider that your chopping list. There's sure to be a few things in there that should be kept, but not a whole lot.
Making something "lean" by trimming out the meat and leaving just the fat doesn't really work. Need I break out the Inigo Montoya meme again? I'll do it, Microsoft, I swear I will...
happily running 10.13.4 on a mid-2011 iMac (6 years old! Still perfectly useable! can't get my head around that...)
Moore's law is in a coma, if it is still alive at all (at least as it is usually interpreted). I have not been keeping up with transistor counts, but the days when a reasonably current PC was rendered worthless by obsolescence a year or two after purchase are well behind us at this point.
I'm reading/writing this from a desktop PC whose motherboard also dates from 2011 (Sandy Bridge/Cougar Point). I have no plans to upgrade at present, and if I did, it wouldn't be about the actual performance of the CPU/RAM/PCH themselves, but about the desire for things like USB 3.1 or an M.2 slot without having to worry about add-on cards. I use a single discrete GPU that doesn't even come close to saturating the PCIe bandwidth I have now, and I don't predict that changing anytime soon either.
My main laptop (which sees about as much daily use as the desktop) was manufactured in 2008, and originally came with Vista preinstalled. The desktop, being originally shipped as parts, didn't originally come with any OS.
Both machines are perfectly usable right now, despite their age. Both are dual-boot setups with Linux Mint 18.3 Cinnamon and Windows 8.1, 64-bit all around.
Now, if your comment was about the fact that a machine that old is able to run the newest version of the OS... well, I am running the newest release of Linux Mint, which is where I spend most of my time, and as for Windows... the reason my machines remain perfectly usable is that they lack Windows 10, which is perfectly worthless.
For the record, I did try 10 on both of these machines, and it worked about as well as 10 could be expected to run... no bluescreens, performance issues, or glitches occurred. The problems I had (and have) with Windows 10 are the things it does on purpose.
For the first time since I bought the laptop in 2008, I bought a new PC in the last two weeks of 2017. It came with 10, and it's now running Mint like its stablemates. Despite being over nine years newer, it's far slower than the 2008 laptop in CPU speed, GPU speed, and disk speed, and it has less RAM. The screen resolution on the newer laptop is lower, and it lacks the swappable battery and the touchpad with discrete buttons (which I prefer to multitouch any day).
What the new one does have as a definitive advantage is that the built-in battery lasts all day, while the ten year old laptop manages to exhaust its swappable battery in well under three hours of light use. Sure, I can swap the battery (and I have several, so I can), but by the time I get done hauling the laptop itself (which is not really all that thin and light in the first place) and lots of batteries, I'm too tired to use the laptop anyway.
It still amazes me that my newest PC is also my slowest. Even though it is definitely low-end and neither of my main machines were low-end in their day, just the fact that a ten year old machine has any relevance at all is something quite remarkable to anyone who remembers the frenetic rate of improvement of PC hardware through the 90s.
It's pining for the fjords.
When you hide or uncheck (deselect) any given update, you're only hiding that specific version of that update. If MS releases a new version of any given update, it is treated as a new update that you've never seen before, so even if the previous version was deselected and hidden, the new one is shown and selected (if that is the default option for the update in question) once again. This is not unique to KB2952662, but it's probably one of the few that gets rejected often enough to catch people's attention.
I must be the only one who hates win 10 flat everything and prefers 3D widgets
For the Win32 bits, you just need to change the theme. You will need something to remove the Microsoft prohibition on installing unsigned themes (and they only sign their own), but there are several solutions for that (several patchers and one service that does the patching in memory. Just pick one!).
For the UWP garbage... as far as I know, that can't be themed. I avoid it by avoiding any OS that has UWP in it.
To get rid of the ribbon, there's Old New Explorer, and I think WinAero also has something that does this. I use Old New Explorer in Win 8.1 (also afflicted by the ribbon in Windows Explorer).
You have to go back a long way to not have the Intel Management Engine built in, which is an obvious way to backdoor your system.
You only have to go to Haswell or older, according to what I've read, as far as consumer CPUs go. Not only that, but to exploit the vulnerability, you have to connect using the ethernet port whose controller is integrated into the PCH (formerly called chipset or southbridge).
I have a bunch of PCs, and only one is newer than Haswell... my low-end laptop that really should have been a Chromebook, but it came with Windows. It has no ethernet port at all, so its otherwise vulnerable CPU/SoC doesn't present a threat.
My main desktop is Sandy Bridge, so it is way too old to be vulnerable. Even if it were vulnerable, though, the motherboard has two built-in ethernet ports. One's the Intel, the other is Realtek... if I were concerned, I could just use the Realtek and disable the other one in the UEFI.
My main workhorse laptop is a Core 2 Duo, which is much older than Haswell... but in addition to that, it has only a Realtek ethernet controller.
None of my other PCs are vulnerable either, for multiple reasons. My Ivy Bridge backup server is too old; my other desktop system is too old and also has dual NICs onboard; my laptop that's even older than the C2D is AMD, and so is the one that's even older than that. And my Compaq Portable Plus luggable... let's just say it is not subject to this either.
I've made no effort to try to buy gear to mitigate the vulnerability. Even though I own eight functional PCs, I have none that are vulnerable to this. It might not be as hard to avoid as you may think!
In some areas, San Francisco for example, we seem to be descending into the Valhalla problem. Only heroes are allowed in Valhalla so who is going clean the toilets and feed them all?
It's a self-solving problem. If none of the service workers that are needed to operate a city can afford to live there, they commute; if they can't afford the cost of crossing the bridges and parking, they take BART (public transit); if they can't afford BART, a shortage of people to do those jobs forms, toilets go uncleaned, and those people who need toilets cleaned offer more money, and they keep offering more until they get the problem solved (or they do it themselves). The more money is offered for the service, the less the barriers like high rent or high commuting costs matter, and the greater the incentive for toilet cleaners to figure out a way to make it happen (or for their employers to do so for their own benefit).
Feel free to reward all the workers that strive for more with more; is it too much to ask that society be fair?
Yes, it is. Every attempt to make life more "fair" has only achieved the opposite. Reality is what it is; wishing things were different (even if you have the legislative power to pass laws, which is distinctly different from being able to effect the change you want) doesn't make it so.
But even if it was not asking too much for society to be 'fair', whose definition of 'fair' do we want to use? Someone demanding that you pay a person three times what a service is actually worth isn't fair. If you want to pay the neighbor kid $20 to shovel the snow off your driveway, and he agrees to the deal, but some government official steps in and tells you that that's not acceptable, you have to pay him $60, 'cause they held a vote and decided that what you and the kid agreed to isn't enough, would that be 'fair?'
There's no such thing as something that is universal in terms of being "fair" or not. If you have something of little value that you want to sell because you are desperate, is it fair that you can't get much for it even though you really need the cash? If all you have to sell is a bag of apples, do you expect to get paid as if it were a bag of truffles because you really need the money? Would that be fair to the buyer, to be expected to pay more than something is actually worth in fair market value?
Away with your communist nonsense crap, you still need the cleaner no matter what, why not pay them a living wage?
Because a job is not a welfare benefit where your pay is based on what some other person thinks you need. When you work, you are selling your labor on the open market, and if the job you are performing has low value, you receive low pay. If you have far more people who have no marketable skills than there are job openings for them, the pay is going to be low, necessarily, and that's the situation that we find ourselves in. You're purchasing a service from the employee, and not all services are worth whatever you consider to be a living wage.
Trying to legislate the low pay away with minimum wage hikes doesn't help the unskilled poor. In the case of fast food jobs, it has led directly to replacement of human workers with robots, and once that genie is out of the bottle, it's not coming back. Robots never quit without warning, show up late, complain, or get sick.
For things that don't (yet) lend themselves to that level of automation, wage laws simply encourage employers to hire illegal workers (who won't complain to the authorities, given that they are not allowed to be working or present in the country in the first place). Where that doesn't happen, you get the perverse situation of employers only hiring overqualified individuals; if you have to pay $15 an hour for office cleaning, why not hire someone whose work history and skills (related to what the business actually does, if not his actual role within it) actually command $15 an hour, even though you're only going to be using them for cleaning? Why hire a $6 an hour worker for $15 an hour when you can hire a genuine $15 an hour worker? If their skills beyond cleaning are ever needed, which is a virtual certainty in time (attrition and what not), they are already part of the organization and ready to step up, whereas the person who can't do anything other than chase a vacuum around doesn't offer that possibility.
As long as there are large numbers of workers without skills, but only a relatively small number of jobs for them to do, this is always going to be a problem. You don't fix the problem of an oversupply of workers who can't do much of anything with legislation; that just pushes them even further to the margins. There are only a finite number of unskilled labor positions around, and as long as there are far more people who want those jobs than there are jobs, it's always going to be a miserable situation for someone in that position. The thing to do here would be to try to reduce the size of the pool of people who can't do much... demanding that the laws of supply and demand be waived doesn't work, and it never has.
I don't know where you live, but where I go, every place that has drinks with aspartame or sucralose has the sugary ones at the same price, and in far greater quantities. Often the zero cal ones are sold out and the sugary ones (which consume 3/4 the shelf space) are all that is left.
When people ask why I don't use it I explain it's shit and invasive and I've already got phone, e-mail and even a functioning mouth for archaic low-bandwidth vocal comms.
I've been waiting to have that conversation with someone, but so far it has never happened. Turns out that I don't actually know anyone who uses Facebook... they're all like me.
So how are ex-users / non-users of Facebook supposed to opt out?
There's no way!
Of course there is. It's just not official, like opting out of Windows 10 doing whatever the hell it wants.
Set your script blocker (you should be using one-- if not, fix that), ad blocker (same), and/or firewall to block all communications with facebook.com and facebook.net. Set your browser to deny third party cookies, and use an addon that deletes all cookies after the tab is closed. Use an on-demand cookie crusher before and after logging into anything dodgy, like Google, and never use it for any service that involves personal data (such as anything on any Android phone-- it's a tracking device that has complete access to everything you do on it). No Gmail, Google Calendar, or any other thing that you do not want public. And for God's sake, never use your real name or photo in anything!
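For the firewall/hosts-file layer of the above, here's a minimal sketch of a hosts-file block. The domain list is illustrative only (Facebook also serves content from other domains such as fbcdn.net), the `block_facebook_hosts` helper name is mine, and modifying the real /etc/hosts needs root:

```shell
# Sketch: null-route Facebook's main domains via the hosts file.
# The domain list below is illustrative, not exhaustive.
block_facebook_hosts() {
    hosts_file="$1"    # e.g. /etc/hosts (needs root to modify)
    for domain in facebook.com www.facebook.com facebook.net connect.facebook.net; do
        # only append if an entry for this exact domain isn't already present
        grep -q "[[:space:]]$domain\$" "$hosts_file" || \
            printf '0.0.0.0 %s\n' "$domain" >> "$hosts_file"
    done
}

# Real use (as root): block_facebook_hosts /etc/hosts
```

A script/ad blocker rule such as uBlock Origin's `||facebook.com^` and `||facebook.net^` catches the same domains at the browser level, which also covers setups where you can't touch the hosts file.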
I use what amounts to a dummy Google ID to post on sites like this, but I never use the associated email address, and I don't do anything within it that I don't expect to be public. Posts like this are meant for public consumption, so it's okay for me by my own standards. Only you can decide how far you are willing to go, or not go.
When I am about to make a post, I kill all cookies (to deny Google the ability to read anything from any cookies that may have been set since the last deletion at the moment I sign in), then sign in (which will set new cookies), write the post, and again kill the cookies. Yes, I do have to sign in each and every time I write a post, but this isn't a post about convenience... it's about privacy. Use a password manager to lessen the stress on yourself, and make sure its store is well encrypted.
My dynamic IP from my ISP assures that I won't be using the same IP addy two days in a row, and since I live near a large city and use a large regional ISP, the pool from which the IPs are drawn is large. That stops the two most common and pervasive forms of tracking... but the bad guys are always coming up with more. You can read up on it by searching for "browser fingerprinting"; not all of it can be fully mitigated unless you're using Tor (and even then it breaks some sites), but just by blocking the stuff in this post, you stop the vast majority of it.
Once again, Ballmer was not calling Linux itself "cancer," although his choice of words could have been better. It was the GNU GPL (General Public License) under which Linux operates that Ballmer was criticizing when he said that "Linux is a cancer that attaches itself in an intellectual property sense to everything it touches" (emphasis added). If there is any doubt that he's talking about the GNU GPL, just read what Ballmer said immediately after that.
The GNU GPL has long been a source of consternation for proprietary software vendors, and Microsoft is by no means the only one. To some degree, that's by design, since GNU GPL's author, Richard Stallman, is on record stating that closed software is itself unethical in concept. That's like sunlight to a vampire in the classic movies (not the terrible ones in which vampires sparkle); the very idea of having to release the source to any given product is anathema to a tightly closed software house like MS.
The GNU GPL, which states that all derivative works that contain any GNU GPL code must also be released, source included, under the GPL, was not meant to empower or encourage closed-source vendors, but Stallman himself explains that it wasn't meant to shut them down cold either with regard to Linux (or as he prefers, GNU/Linux). Microsoft is free to release any proprietary software they want for Linux, under any license they wish, so long as they do not include any GNU GPL code in the project, even in separate libraries, if they are distributed along with the proprietary code.
Ballmer wasn't completely wrong when he spoke of the metastatic nature of the GPL in that case, but he wasn't completely right either, and he goes further off the rails when he derides the concept of open-source in general and paints closed-source software as morally superior and "available to everyone," which is such a stretch that I have to wonder if he didn't have to suppress a chuckle.
"oldies that went on sale between 2007 and 2011, so it is likely few remain in normal use."
Yeah, I was thinking about that line too... you know how we keep hearing about the tragic decline in PC sales? The reason is that the end of Moore's Law (such that it has been called) means that older kit stays usable much longer, and people are using it much longer. I certainly am, and I know several others running gear old enough to be on Intel's "wontfix" list. I think you might be surprised at how much old computer equipment is still in use-- and why not? For most computing tasks, older gear is still very usable today. We've reached a point that a great many people only replace PC gear when it stops working, not because it's too slow... they're like toasters or other commoditized items. If it works, keep using it until it doesn't.
Sure, Intel can put a lot of resources into fixing 8+ year old chips, which are probably used by less than 3% of the market... but doing so will likely stop Intel from providing good raises or other benefits for its employees,
and/or raise the cost of the next computer you purchase by a couple of hundred dollars.
You think that releasing a microcode update for each of the "wontfix" CPUs on the list (the ones they promised had fixes incoming) is going to add that much to the cost of my next computer? How do you figure that?
The last computer I bought (Dec 2017) cost less than a couple hundred dollars as it was, but even if it was a high-end desktop instead of a Chromebook-spec Windows laptop (well, used to be a Windows laptop), that figure is still pretty ridiculous. Microcode updates are a regular part of development for a given CPU; mine have received several over the course of their lives, as OS updates.
You think issuing just one more microcode update for a CPU that has already had several over its lifetime is going to cost that much?
Also, why would Intel's difficulties have anything to do with the cost of an AMD system? 'Cause, fsck Intel if they're not going to stand behind their products OR keep their word.
And almost all, if not all, of that kit will not be in use in such an environment where any of this matters.
None of it matters for any PC anywhere as long as the threat remains theoretical, but it remains to be seen if it will. My C2D Penryn laptop is assuredly in an environment where this could matter, browsing the web and what not...
Meltdown should definitely be patched as soon as possible, and is safer because it doesn't involve microcode updates; it's an OS patch.
Microcode updates can be delivered that way too. I'm not recommending any of the firmware patches for Spectre that have been released... just do it at the OS level. In Windows, I believe this requires downloading the microcode update directly from the Windows catalog, as it is not being delivered by Windows Update, for some reason. For Linux, of course that depends on the distro... I use Mint, so all I need to do is... nothing. It appears in the updates when it's ready.
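On the Debian/Ubuntu family (which includes Mint), the OS-level route amounts to installing the distro's microcode package and letting the kernel load it at boot. A sketch, assuming an Intel CPU; the `check_microcode` helper is my own naming for illustration:

```shell
# Show the microcode revision the CPU is currently running with.
# Linux reports it per core in /proc/cpuinfo.
check_microcode() {
    cpuinfo="${1:-/proc/cpuinfo}"
    grep -m1 '^microcode' "$cpuinfo" 2>/dev/null \
        || echo "microcode revision not reported"
}

check_microcode

# To receive future microcode fixes through normal OS updates:
#   sudo apt-get install intel-microcode    # amd64-microcode for AMD CPUs
# After a reboot, "dmesg | grep -i microcode" shows whether an
# early-boot update was applied.
```

Compare the revision before and after installing the package (and rebooting) to confirm the new microcode actually loaded.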
...that is, of course, if the PC in question was not one of the ones that just got shit on by Intel, after they promised for months that a fix was incoming. My Braswell laptop already has a fix available (in the form of firmware, so no thanks), but my Core 2 Duo laptop is now "wontfix". Even though the C2D unit is far faster and more capable than the Braswell across the board, I guess it's obsolete, but the Braswell isn't.
Strangely, no one from Intel ever contacted me to ask whether my C2D laptop was "closed" to the internet; I guess I'm not one of the "customers" Intel talked about. I wonder who was.
You really think delaying updates until they've been properly tested (by other end users) and fixed means that his computer must be a malware-ridden disaster? That's absurd.
99% of avoiding malware is not doing something stupid. Most people with malware on their systems installed it themselves. While drive-by malware infections exist, they're not the norm, and the individual never said anything about running an out of date browser.
I'd lay better odds on an unpatched system being clean if it is used by someone who understands the threat vectors than on a completely up-to-date system used by a regular user who has little understanding of anything tech. If a person falls for some fake virus warning somewhere on the web and is tricked into installing the malware himself, there's not much a patched OS can do to stop it if the user willingly hands the intruder the keys to the kingdom. Even if the user is prompted with an antimalware program warning, he's liable to click whatever it takes to get the warning off the screen, because that's what people who don't understand (or care about) malware do.
An experienced user with an unpatched system isn't even in the same league in terms of risk. He recognizes the fake antimalware and browser update warnings for what they are; he rejects email with suspicious attachments or suspicious links. He often has security countermeasures in his browser, like NoScript. When he wants to download a program, he verifies that the publisher is legit and that the URL he's getting it from is as well. He checks the hash of the downloaded file against the published data (hopefully from another site than the one from which he downloaded, in case it has been breached). He checks the signature on the file before running it.
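That hash-and-signature routine can be sketched like so; the `verify_download` function name and the filenames are hypothetical, and the gpg step assumes you've already imported and verified the publisher's signing key:

```shell
# Sketch: verify a downloaded file against a published SHA-256 hash.
verify_download() {
    file="$1"
    published_sha256="$2"   # copied from the publisher (ideally a second source)
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$published_sha256" ]; then
        echo "hash OK"
    else
        echo "HASH MISMATCH -- do not run this file" >&2
        return 1
    fi
}

# If a detached signature is also published (hypothetical filenames):
#   gpg --verify someprogram.tar.gz.sig someprogram.tar.gz
```

The hash only proves the file matches what the site published; the signature is what ties it to the publisher even if the download site itself was breached.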
I know which of the two I would feel better about giving my full (non-guest-isolated) wifi password to.
I asked our local friendly Microsoft rep about this statement; apparently 'Mobile First' refers to mobile computing, rather than phones
I don't think so. Laptops are functionally like desktops, from an OS perspective, and that's where Microsoft has always been (on desktops and laptops). There would be no need for a "mobile first, cloud first" slogan if they were talking about what they had always been doing. Clearly, Nadella intended for that phrase to indicate the new direction of Microsoft, not the old one.
That is, of course, unless you believe that the phone UI grafted onto Windows in its last two versions means something other than what we all interpreted "mobile first" to mean. When Windows on the PC looks like a mobile phone OS and the CEO is telling us they're mobile first now, I don't think "he must be talking about laptops, like the ones I have been using since the Win2k era."
I like laptops. I have a couple of them. I'm using one now to write this. They're not my only tools, though; I have desktops too, and they each have their role within my electronic hierarchy. This laptop I'm on now is only a few months old, but I have its OS (Linux Mint) set up exactly as I do on my main desktop. Why would I need anything with the OS or the UI to be different? I'm still the same individual with the same usage patterns, preferences, and workflow.
Certainly, the laptop has more power-saving features than my desktops, but it runs the same programs as my desktop. The laptop's user input devices consist of a hardware keyboard and a non-touchscreen pointing device, like on my desktop. The display is smaller on my laptop, of course, but it's still larger than an iPad, and way larger than a phone, and it doesn't need the oversize controls that work best with big fleshy fingers on a touchscreen.
In other words, I still have no need or desire for the inane phone UI even though it is a laptop. It doesn't have a touchscreen, which was intentional on my part when I selected the unit. I don't need or want a touchscreen on a laptop! A touchscreen is necessary for handheld devices where a mouse or keyboard is impractical, but it's redundant on a laptop that already has a real keyboard and a touchpad. Most of the people I've communicated with who use convertible 2-in-1 units say they seldom or never use the touchscreen when the display is snapped to the base, and I wouldn't either. It's just too tiring to hold my arm out to touch the laptop screen! I can use a touchpad for hours (I find the older ones with dedicated buttons to be better, but I am adapting to the annoyances of the clickpad on my cheap new laptop), and a real mouse is even better.
Ergonomically, a touchscreen is just too hard on the old arm muscles to use for more than a short while when the screen is in a fixed upright position (as it is in a laptop or desktop). The mouse or touchpad allows you to easily hit targets of only a few pixels, enabling better user interfaces that don't have to have kludges like disappearing UI elements or hamburger menus. The separate point and click actions on a mouse or a touchpad also allow hover effects that simplify and accelerate a lot of things people tend to take for granted. They're not new or trendy, but mice and touchpads are just better for nearly everything for which people use laptops.
With that in mind, it makes little sense to throw away the UI advantages of an interface designed around the mouse and to saddle it with the same old compromises as are necessary on touchscreen devices. If laptops are what Microsoft meant by "mobile first," the UI of Windows 10 is still just as ill-suited to that use as it is on desktops.
I have been using Windows since 3.0 in 1990, 27 years ago. I've used 3.0, 3.1, 95, 95 OSR2.1, 98SE, ME, XP, 7, 8.1, and 10. I've been using Linux as a primary OS for under a year, and part-time for under two, ever since I saw what "the last version of Windows ever" was like, though I keep Windows in a dual-boot setup too. But yeah, I don't understand Windows 'cause I am a Linux user.
That is the price tag, yes. But at least Chrome OS works almost flawlessly on any halfway decent device, which start from less than £200.
I bought a low-end Windows laptop just after Christmas for $180 US, and immediately cleansed it of its Windows 10 infection. It's running Mint now, and it's pretty close to your description of "almost flawlessly" ("almost" because nothing ever is really flawless): it never crashes, everything works, and it's fast enough to be used by a reasonable person who does not have the patience of Job. Also, no slurping, and I don't have to use Chrome. It's kind of like a Chromebook where I get to use Waterfox with all of my addons and without telemetry!
It's not the fastest laptop... it's not even MY fastest laptop (and my other one is a 9 year old C2D). The new one does, though, have long battery life, and it was cheap-- the two things I was going for. Really, this thing should not have ever been sold as a Windows unit, as its onboard storage is definitely Chromebook-spec at 32GB (too small to even install Win 10 1709, by some accounts), but the good part of that is that it has a regular PC UEFI and not the Chromebook specialty firmware, which makes it really easy to install whatever I want.
As for the MS tax... it must have been offset by the crapware that came on it, as it is definitely priced in the range of similar-spec Chromebooks.
Where is ChromeOS in the list?
"GNU/Linux." It's a distro, at least according to some (including the Linux Foundation, apparently).
Edit = in case you really don't know, Apple has a thing for not speaking to El Reg.
MS only talks in places where fanboys hang out, like their Insider forum after they purged all of the people who had any real criticism. Then they drink in all of the cooing and adulation, which they will later claim means they "listened to our users." They do listen to their users-- they are just very selective about which ones they listen to. It's the ones that are saying the same things Microsoft is saying, in a grand circle of logic.
Hardly any calls since then for OS glitches: maybe it's helped that with ClassicShell and anti-slurps I've made them all look just like XP.
It takes more than Classic Shell (which is no longer being updated, thanks to Windows 10's update schedule). You need something else to get rid of the ribbon (I use Old New Explorer), and something else again to enable the custom themes that are needed to make it resemble XP, somewhat. And how can you get rid of all of the UWP crap that comes up and looks decidedly foreign since it blatantly ignores whatever theme you have in place and ignores all former conventions of Windows UI design? More and more of the system dialogs are being moved into UWP with each subsequent release of Windows 10. It just keeps getting worse!
Windows doesn't include adverts unless you count a static crap app or two in the start menu.
I do. The presence of OneDrive in my taskbar (when I have done nothing to suggest I want that service) is an ad for OneDrive, and its presence in the navigation pane of Windows Explorer is another. The existence of the "Xbox" app on the computer is an ad for Xbox. The "Get Office" popups that begin as soon as Windows 10 starts for the first time are clearly ads. The notification for a sale on OneDrive storage that appeared in Windows Explorer some months ago was a blatant ad. And, of course, the thing that got many people into this mess in the first place, GWX, was a particularly nasty piece of adware.
The entirety of Windows 8 and 10 in general used to be ads for Windows Mobile, when there was such a thing. They look very obviously like what they are, which is to say a UI element designed around mobiles. Each time a user opened the start screen or menu, he'd be getting an ad for Windows mobile-- which is presumably why Microsoft only shrunk the tiles down for 10 when everyone had been demanding their removal.
The Metro and UWP elements that infect Windows 8.x and 10 are the same. They look like they belong in a mobile OS for a reason... they're there to remind you that this is, in fact, a mobile OS, so why not make your next phone a Windows phone? Even the proliferation of the word "app" in Windows is an ad for Windows mobile. On PCs, we've long called those software thingies "programs." Remember, they're installed in \Program Files, and they're uninstalled with "Programs and Features," right? So why suddenly must Windows ask me what "app" I want to use to open a program? Apps are for phones... and people associate them with phones. It's like MS wants to remind everyone (if they missed all those other reminders I mentioned) that Windows is a mobile OS!
If you think I am nitpicking, I can assure you that I am not. Megacorporations like MS don't make changes to long-standing nomenclature haphazardly. There's a reason for everything they do.
None of this mobile emphasis on desktop PCs was ever meant to benefit PC users. It was meant to sell phones, no question about it. Why they continue to forge on with their mobile-first OS when they have apparently recognized their failure in the phone market is anyone's guess, but you can bet there's a reason. It may be a dumb reason; it WILL be a cynical reason that benefits Microsoft at the expense of its users. Whatever it is, it exists. They just won't tell us what it is.
But I do know Windows 10 is an inconsistent mess, and I would love the Windows 2000 GUI again ....... but I cannot, so I will go for what I think is the next best thing, MacOS.
Cinnamon is pretty good too. I also like KDE, but despite its wealth of options (something I really like), it has always had one or the other thing that kept it from being "just right." Cinnamon makes me dig a little deeper to customize things that would be simple in KDE, but it has that warm, fuzzy feel that I also get with an interface like Win2k (also my high-water mark as far as Windows UIs go).
Why didn't Apple lead the drive for 2:1 or multi use of laptops?
Because it's a bad idea, maybe. Touchscreens are a kludge that becomes necessary when the form factor of a handheld device makes a mouse or touchpad and a real keyboard unwieldy or impossible. When the hardware keyboard and mouse/touchpad ARE available, as when a 2 in 1 is in the docked or laptop configuration, touch isn't needed, and that means the massive UI compromises made to accommodate touch (at the expense of workflow, intuitiveness, information scent, convenience, and resistance to fatigue) can be jettisoned as well.
That dual mode also means dual UI and dual input regimes, and that leads directly to the kind of "neither fish nor fowl" Windows 8, Windows 10, GNOME 3, or Unity user interfaces that have each caused much consternation and gnashing of teeth, not to mention the exodus of former users. The only way I could ever see that kind of thing working would be if the devices had the full desktop UI and the full touch UI on board, with each "app" also having separate and distinct designed-in UIs. By this, I mean that both the OS and each "app" that runs on it would have two distinct user interfaces designed for the two usage regimes, not one that is designed for mobile and "adapts" to desktop, because that does not work.
It takes a lot more than replacing a hamburger menu with a menubar and reflowing things to make a mobile UI into a *proper* desktop UI. There are a lot of subtle differences (like the inherent precision of a mouse and its ability to use hover effects, not to mention right and middle click) that, when added up, make the two environments too different to ever work under one UI regime.
That's why Apple never did it... because it could never be done to Apple's standards of UI consistency. Microsoft has no standards... just look at the mess we've had since 8. There's a Settings app and a Control Panel, and it has been like this for five years. Some things are only in Settings, others only in Control Panel, and it's up to you to guess which is where. The UI of both 8 and 10 has been half "mobile" and half desktop for just as long, with a level of inconsistency and lack of polish that Apple probably would not even accept in a beta, let alone in release for half a decade.
It's time we give up on this idea of one UI that works across multiple devices. It doesn't. We need one UI designed as a touch UI for use on touch devices with apps that are designed to use touch, and another designed as a traditional PC interface for use with traditional PCs and programs. There's no reason for the two even to come in contact with one another if the device is not a convertible or 2 in 1, as the vast, vast majority aren't.
I disagree that Win 8.x is anything but abominable.
Out of the box, certainly it is, and that stopped me from even considering it at first (I bought my first copy of 7 after 8.1 had been released, and I skipped 8.x intentionally. Until then I happily used XP, but it was time to enter the 64-bit era, and XP x64 never really seemed all that good).
When I saw how much modification it took to get 8.1 reasonable, I scoffed at first... why would I want to buy something that I have to put so much work into just to get it into a state of basic usability? Then I realized that even though 7 is far better than 8.x out of the box, I had still installed all kinds of stuff to tweak 7 to be even better. I still used Classic Shell, including Classic Start and Classic Explorer, a custom theme of my own design, 7+ Taskbar Tweaker, and all kinds of registry edits, to customize 7 and make it into exactly what I wanted instead of just kind of what I want.
If I used all those same things in 8 (and one more: Old New Explorer, to get rid of the ribbon in Windows Explorer), I could transform it from "WTF is this crap?" to being a dead ringer for 7 (and I mean really a dead ringer; people have said that 10 is "just like 7" on the desktop, but it's not even close). If I am going to be modifying it with all kinds of addons anyway, what difference does it make if it started out "not too bad" or "WTF is this crap," so long as you end in the same place?
Apple as a company reminds me of Google: a company which, in the public eye, has an undeservedly inflated reputation, which, like Google, it abuses.
Google has that? Everyone I know hates Google... I know a guy who gave away a Chromebook in disgust when he realized he had to create a Google account just to use it. I would have kept it and put Linux on it, but he's no tech guy; he wanted a cheap way to get to the web that didn't involve Windows, and it didn't work out for him. Google has its fanboys, but then so does MS (oddly enough). I loathe them all, personally.
Win 10 is awful. The only reason that people accept it is because it replaced the even more awful Win 8.
Win 8's problems, though, are generally of the skin-deep variety. It takes some effort, but with things like Classic Shell (which should keep working, since 8.x has not been in active development in years, even though it only exited the mainstream support period this year), Old New Explorer, install_wim_tweak, and a custom theme of your choice, Windows 8 can become pretty good. I am a stickler for the traditional UI, and Win 8.1 is the only version of Windows I still have installed on anything I use on a daily basis.
I don't see Win 10 as being an improvement over Windows 8.1, really. Over the original Windows 8, sure, but even then, Classic Shell and the other programs would have fixed it all up nicely. I don't see how the Win 10 start menu, the disjointed mess that it is, is any better than the full-screen version of the same in Windows 8. I hate them both, but the 8.1 version is somewhat less bothersome to me. With that, the return of the start button, and the ability to boot straight to the desktop, I don't see how 10 is really any better than 8.1. They both stink pretty bad, but only one can feasibly be corrected with aftermarket tools.
You still get to turn updates off on 8.1; no silliness with "metered connections," where MS promises to only download updates if it thinks you really really need them (so it's still their choice), or active hours (which you can't set to 24 hours a day, as I would; you must still allow MS to have some time). You can defer updates with some versions of Windows 10, but even then MS may introduce a "bug" that causes it to go ahead and install feature updates even when they are in deferral, as reportedly happened with 1709 four separate times.
Of course, if you use 10 Home, you don't get that option at all, so stop complaining, you beta tester drudge. Get back to work! We don't pay you to not test Windows! (Well, actually, we don't pay you at all. You pay us! But still, get back to work.)
Windows 8.x never had the permanent beta quality arising from "Windows as a Service," nor did it ever impose more than an hour of downtime whenever it damn well felt like installing an upgrade you don't even want, during which time I really hope you didn't need your PC for anything important, 'cause you're not getting to use it. "Windows as a Service" is the worst Windows 10 "feature" by a long shot.
Windows 8.1 has the abominable Settings app like Windows 10 does, but unlike 10, most of the functionality of Control Panel is still intact, so you can consign Settings to oblivion and still be able to do things. The few things that 8.1 did move to Settings can easily be done by alternative non-Metro means. You can't do this in Win 10; the UWP garbage is just unavoidable. I never see any Metro anything in my 8.1; every app (including Store) is long gone, and they haven't come back. I haven't seen Settings in the better part of a year, and I don't need it.
Win 10 is worse by far than Windows 8.1, and had 10 not come along, 8.x would have continued to evolve in the right direction. Win 8.2 was in the planning stages, and was codenamed "Threshold" at the time that the previews were published in the tech press. Win 8.2 would have had more desktop-friendly features, including the option for the Windows 7 style start menu. MS had miscalculated badly how 8 was going to be received, but back then they still had the crazy idea that they should deliver what customers want to improve the adoption rate. Then Nadella came along, with his new "friendlier" Microsoft, and the idea was to make Windows into something that obviously would not be well-received, but to force-feed it to us by virtue of their monopoly status. So friendly!
Of course, "Threshold" mutated into something terrible after that. There would be no Windows 8.2; now the "Threshold" project was repurposed to be the first release of 10, and the Windows 7 style start menu was history. So was the idea that the PC's owner should have any control over his machine. So was my future in the Windows ecosystem...
Mac OS has been effectively on life support for years.
You know what would breathe new life into it? They could release it as a standalone software product for PCs. With Windows in the loo as it is now, and a lot of people afraid to make the jump to the scary world of Linux (not my sentiment; I use Linux myself), the market share of MacOS (perhaps renamed back to OSX for its PC version?) would skyrocket overnight.
For this to work, Apple would have to thread the needle between the traditional Mac model of MacOS/OSX only running on Apple hardware (but generally working well, since they control both) and the traditional PC model of the OS running on anyone's hardware-- with the OS publisher catching the blame for substandard hardware and drivers over which they have no control (which has long bedeviled Microsoft). They would have to preserve the idea that Apple products "just work" while emphasizing that any problems you have with OSX on your PC hardware are a function of your PC not being "Apple" good.
I don't buy into the legendary quality of Apple hardware myself, but Apple knows that a lot of people still do, and they would want to preserve that. That would be one reason not to ever do what I suggest; some people, regardless of how Apple spins it, are going to blame Apple when OSX runs like crap on their "Great Value" brand PC. Still, with the Mac platform languishing, there's not much to lose. The Apple devotees will never buy anything but Apple hardware; they have bought into the hype already. The rest of us are an untapped market, and doubling or tripling the MacOS/OSX market share overnight would spur software development which would in turn make genuine Macs more appealing.
I know it's just a pipe dream, but I would love to see Microsoft hoisted by its own petard here. Forcing people to accept crap via monopoly tends only to work for a short while... just ask the Internet Explorer devs. Personally, I find Linux more than good enough, but I know a lot of people want the greater hand-holding that MacOS/OSX can provide. I'd even try it myself... I may not end up buying it, but I would certainly give it a fair shot. Only on my hardware, though; I won't ever be buying their hardware.
three years [Still waiting!] before it switched back, with Windows 10 name TBD.
FIFY, as best I could given the paucity of data regarding when, or if, this switch back will ever happen.
Given the difficulties that Canonical had trying to make a new top layer/UI, I tend to think that it is a bit harder than most people think it is.
Had the Canonical devs followed the Unix philosophy of "do one thing and do it well," they would not have been trying to create one UI for all devices. Hell, if the Unix philosophy is too highbrow for them, try Curly's fireside chat from the 1991 movie City Slickers, in the scene where Curly (played by the legendary Jack Palance) was explaining his philosophy to Mitch (Billy Crystal). One UI to rule them all is one of those ideas that seemed like a good idea at the time, but once attempted, soon revealed itself to be unworkable.
In my diatribes against Windows 10, I've often cited Apple as having gotten this one right by opting out of the unified UI, and I've remarked that Apple would never have released a disjointed mess like Windows 8 or 10. As much as I have always seen Apple as the "bad guy," one thing I cannot fault them for is demanding excellence in their UI design, whatever the particulars of that design may be. If they stick to that, this "half and half" idea will necessarily be rejected once again by Apple as it has been in the past, for the very same reasons Tim Cook and others have articulated.
People object to profanity because certain expressions convey either the speaker's intent to offend the listener or his lack of concern over whether the listener is offended, and it is the lack of consideration that is rude and offensive, not the word itself.
They don't have to be the government.
They are a company and subject to the laws of the US, the same way they are subject to Gender laws.
Um, yeah they do have to be the government. The First Amendment, like all of the Bill of Rights, is a list of things the government is not permitted to do to people. That's what a right is-- something you can do without government interference.
The government has no rights... the government has powers, and those powers are delineated and restricted by the Constitution and other laws. Private entities have rights, and Microsoft is a private entity that has its own right to dictate what kinds of expression are permissible on their network. If Microsoft was owned by the government, there would be a real problem here, but it's not.
Well any business that has been stupid enough to think cloud computing - whether with Microshite, Amazon or anyone else is sensible...
It does seem like the counterpoints are coming in fast and furious these days, like all of the cloud providers are out to demonstrate to people why cloud services are a crock. Amazon's music storage service is going away, and Microsoft's Groove service (I think) already has. Photobucket alienated a ton of people by effectively locking up a ton of pictures hosted there and shown in various internet forums, going from "free" to "orders of magnitude more costly than it would ever really be worth" all at once.
I've used Photobucket to host pics on a few different forums scattered across the internet, and there's no way that I am going to go back and edit those ancient posts (if I even can) to point to a new host, and to reupload all of those pics to that host. There's also no way I'm gonna pay $400 a year for low-traffic forum hosting; if I was running an eBay business or something like that, I'd probably think differently (though I would still have cheaper and better options), but for my use, Photobucket is pointless now. There's a lot of information in forum posts like that that is essentially lost now, as all you get is an obnoxious demand for money in each of those posts instead of whatever the poster was trying to illustrate.
Sure, Photobucket can do what it has done if it wants, since it owns the service, but that's just the point... it owns the service, so it can do whatever it wants, whether or not that conflicts with my interests. That's the wonder that is the cloud... keeping your files on someone else's computer, subject to their business plan, their security practices, and their whims. Great, where do I sign up?
Yep then you have smart post. Which starts off FedEx but is delivered by USPS even on sundays.
Surely you jest. The Post Office delivers... anything? I can't get them to deliver a letter on any day of the week (no one in my zip code can), and packages... the best I ever got was a little slip that said I had something waiting in the post office.
Or was it because she tried to enact some modicum of gun-control legislation, resulting in a useful ban on Assault rifles, the ending of which shows a marked increase in homicides.
Wow, you really don't know what really happened, do you?
"Assault rifles" were only ever used in well under 1% of gun murders (closer to a tenth of that). Bare hands kill far more people in the US each year than "assault weapons," without exception. Even if NO ONE could get "assault weapons" despite the ban (prohibition worked wonders with drugs, right?), and if NONE of the criminals who killed people with them would have done so by other means, the most the murder rate could have declined was way under 1%.
Also, the so-called assault weapon ban (AWB) ended in 2004. Completely. Like it never even happened. The murder rate has continued to drop since then (far more than 1%), even as the number of states that went from never issuing (or practically never issuing) concealed gun licenses to issuing them to anyone without a criminal record skyrocketed. At the start of the AWB (1994), 20 states issued to anyone without a criminal record (a short one-day training course is typically required, but this is but a formality; no one fails it), 17 issued with discretion, meaning that law enforcement gets to choose whether to issue them on an individual basis, and 12 did not issue any permits to anyone.
Now we have thirteen states where no license _at all_ is required to carry concealed guns, up from only one (Vermont) prior to 2004. Twenty-nine more issue them to anyone without a criminal record. The remaining eight issue them with discretion by law enforcement. Zero states of the fifty don't issue them at all now.
While all of that change in carry laws was happening, the US murder rate was in a steep decline. It is now _half_ what it used to be during the peak of the crack epidemic, which was about when the trend of liberalized concealed carry started as well as the now-defunct AWB.
These are facts; if you don't believe any of them, please feel free to look it up. You may not like that this is how things are, but that doesn't change that it is. Downvote if you must, but be aware that you're downvoting verifiable and objectively true statements, and think about what that means...
That is an *awfully* long title. I can only imagine how bad the article must be.
The idea that "a real-time, continuously-changing environment" is necessarily a good thing is naive in the extreme.
If anyone actually believed it, certainly. That's just a glob of rather obvious marketing doublespeak, meant to make the victim believe that stability is actually a bad thing. Consistency and a coherent UI that is actually designed for the way people use the OS is bad, but constant code churn and permanent beta quality (along with significant down time while the thing keeps changing, continuously, one time-wasting update at a time) are good.
We've heard this nonsense from Microsoft before in their efforts to tell us that in today's world, only Windows as a Service can possibly hope to keep up with the evolving threats (because it's apparently news that threats change and adapt), even while more than half of the users of Windows out there were still happily using an OS that hasn't seen a "feature update" or anything like it since 2010, fully eight years ago as I write this. If MS wasn't "inadvertently" breaking Windows 7 with updates that do more harm than good, it would be keeping up with the threats just fine. Find a security hole, fix the security hole; lather, rinse, repeat. What's new about that?
Windows 10 has far more untested, unproven code than Windows 7 (when 7 was written, they actually paid people to do the testing, and then it went through nearly a decade of post-release usage in the real world). Windows 10 has more attack surface, with the Win32 bits (which 7 also has) as well as the useless UWP portion (with which Windows 7 is blissfully unencumbered). Nearly every month, Windows 10 gets as many or more security fixes as 7; clearly, the same issues exist in both of them, plus some new ones in 10 only.
You don't need "feature" updates every six months for security purposes. In terms of security, you'd be far better off without them... each one of them introduces more code churn and thus more bugs, and all of it for "features" that very few, if any, of Windows 10 users asked for. The things they have been asking for have thus far been met with laughter or the sound of crickets...
That's why I am getting off this crazy train after 27 years on board. If Windows 10 is the last version ever, it will be the last version I ever say "Hell no!" to as well.
OH NO! NOT THE TERMS OF SERVICE!
I use SteamOS (I think every game I've bought for the last 2 years are available on Linux/SteamOS) or an old Win7 setup for what is not on Linux.
I'm not much of a gamer, but I do like to play from time to time. I've been using Windows since 3.0, but Windows 10 is the bridge too far, and I'm migrating to Linux (really, I have migrated, but I say "migrating" because I still keep Windows around, on bare metal as a dual boot, in case I need it).
Because I am not much of a gamer, I don't need to play any given game at any given time. I'm not dying to play any of the latest AAA releases; chances are, I would not even recognize those titles if I heard them. I just browse the games that are available (and on sale, generally) and buy them if they seem interesting. Thing is, if they don't have the little symbol indicating that they run on Linux, I don't even look at them. When you don't know what you're looking for, and you're just going to pick one of the ones that is offered, it doesn't matter if a lot of stuff isn't on Linux. To me, at least.
I've always resisted Steam as I am opposed to DRM platforms in general, but on the other hand, they singlehandedly have made Linux into a credible gaming platform, even if the "best" games are still Windows exclusive for the moment. That's the one thing that got me to end my Steam prohibition and try it... and only on Linux. It's not installed in any of my Windows setups. I want game devs to know that the effort to port their titles to Linux is worthwhile, 'cause if they don't, their game isn't worthwhile to me, or to a growing number of others. Windows 10 is that bad!
Kids these days don't understand that we used to have to buy a card specifically to get sound out of a computer, then configure each application for it.
Not just sound, either. You also had to buy a card specifically if you wanted any I/O ports (serial or parallel), a network card, a floppy drive, a hard drive, or a picture on a screen. The AT keyboard port could be plugged directly into the motherboard, but that was it. Everything else came on an ISA card, though in those days we just called them "8-bit cards" or "16-bit cards."
I wouldn't say that kids these days don't get that... while I am sure it's true, it's so far in the past for them that they don't even have a frame of reference with which to think of it at all.
Ubuntu among others, but primarily Ubuntu has sold their soul to the devil in order to receive UEFI certification from Microsoft
UEFI is an open standard. No one needs to bow to Microsoft or anyone else to use it.
(UEFI is not secure boot. Just turn it off.)
If you're trying to use the version of VirtualBox in the Ubuntu repo (which is also what Mint uses), it's too old to work with kernel 4.13. If you get the newest VirtualBox from Oracle directly, it will work quite well.
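For anyone wanting to go that route, a rough sketch of swapping the distro package for Oracle's build on Ubuntu/Mint follows. The repository URL, signing-key URL, and package name are from memory of Oracle's published Linux install instructions circa the 5.2 series; verify them against virtualbox.org before running any of this.

```shell
# Remove the outdated repo version first (assumes an apt-based system)
sudo apt-get remove virtualbox

# Add Oracle's signing key and repository. URLs and suite name follow
# Oracle's documented instructions, but check virtualbox.org for the
# current details before trusting this verbatim.
wget -qO- https://www.virtualbox.org/download/oracle_vbox_2016.asc | sudo apt-key add -
echo "deb [arch=amd64] https://download.virtualbox.org/virtualbox/debian $(lsb_release -cs) contrib" | \
    sudo tee /etc/apt/sources.list.d/virtualbox.list

# Install the current release series (5.2 as I write this)
sudo apt-get update
sudo apt-get install virtualbox-5.2
```

Mint users note: `lsb_release -cs` returns the Mint codename, not the Ubuntu one it is based on, so you may need to substitute the matching Ubuntu codename by hand in the sources line.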
Meh, you beat me... but the Borg never said "I."
We are Pentium of Borg.
Division is futile. You will be approximated.
If the software is free, then how can you complain when it forces you to use its own browser rather than another free browser?
Well, it wasn't free, as someone else has already explained. Even if it was free, though, that has nothing to do with its suitability as an operating system. An OS needs to serve the user, and how it serves that user is also up to the user. If it fails that, it's not fit for purpose, even for free.
Biting the hand that feeds IT © 1998–2018