100 posts • joined 20 Jun 2012
What about the Chief Technical Officer?
Brendan Eich was not just a short-lived CEO. He was also co-founder and Chief Technical Officer, and now Mozilla has lost that role.
What impact is this going to have on Mozilla? Did the office of CTO actually do anything important? I heard that Eich was known for being "humble," so he may have downplayed his contributions, if there were any, but I'm concerned about Mozilla's future. I'm disappointed that no journalist has covered this aspect.
Google is using Chrome to promote new DRM technologies. Apple and Microsoft are using Safari and Internet Explorer to promote patented codecs. Opera has abdicated its technology leadership. The W3C has capitulated to hostile special interests. We need a technology leader that advocates for user rights, and Mozilla has provided that leadership. Now that Mozilla's chief of technology has been forced out, and a former chief of marketing put in his place, what impact will there be on Mozilla's technology leadership?
Re: More issues with OpenSSL
I'm going with the theory that the OpenSSL core developers don't deserve more volunteers. SSL is incredibly difficult to do right, and OpenSSL is badly written and badly maintained code. Fixing it is like throwing good money after bad. It should be replaced. The trouble is that it's hard to find another library to standardize on, and to be sure that it's correct.
The discussions also include anecdotes about how hard it is sometimes to get improvements into OpenSSL. But if OpenSSL improves drastically due to Google's involvement, then it may become a good idea to standardize on OpenSSL again.
Re: For sufficiently small values of "wide"
I require a good reason why I would bother learning and using Cobol, when other languages are readily available and widely supported.
It doesn't feel like this is the case with Cobol. GNU Cobol is obviously making things better, but it's implemented in C. I wonder just what is the advantage of using Cobol, when there are so many languages that are better-integrated with my operating systems, and when I personally don't have any Cobol legacy code.
The worst mistake? I think that would be C.
As Poul-Henning Kamp writes, C gave us The Most Expensive One-byte Mistake: null-terminated strings, and more generally data structures without built-in bounds checks. Most recently, that type of programming is in the news because of the Heartbleed bug. (Yes, I know Heartbleed doesn't specifically use strcpy; the principle is the same.)
The menace from C is insidious and inescapable. Modern processors are built to be good at running C, not some safe programming language, and runtimes for other languages are implemented in C. For example, in theory, Java is a very safe programming language, but in practice Java is the vehicle of countless security vulnerabilities. Because of so much legacy investment in C-based systems, including Windows, Mac, and Unix, I don't see an easy way out of this trap.
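To make the class of bug concrete, here is a minimal sketch of what a missing bounds check looks like in C, and what the manual fix looks like. The `record` struct and both functions are hypothetical illustrations, not taken from any real codebase:

```c
#include <assert.h>
#include <string.h>

/* A hypothetical fixed-size record, the kind found all over C codebases. */
struct record {
    char name[8];   /* room for 7 characters plus the terminating '\0' */
};

/* Classic C: strcpy() trusts the caller completely. A name longer than
 * 7 characters silently overflows the buffer and corrupts whatever
 * happens to sit next to it in memory. Nothing in the language stops it. */
void set_name_unsafe(struct record *r, const char *name) {
    strcpy(r->name, name);
}

/* The fix has to be written by hand every time: check the length
 * explicitly before copying, because the data structure carries no
 * bounds of its own. */
int set_name_checked(struct record *r, const char *name) {
    if (strlen(name) >= sizeof r->name)
        return -1;                        /* reject instead of overflowing */
    memcpy(r->name, name, strlen(name) + 1);
    return 0;
}
```

Heartbleed was the same class of bug at a larger scale: the code believed an attacker-supplied length field instead of checking it against the actual payload size.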
Who expects Microsoft software to run flawlessly?
even if “... a certain non-Windows guest has integration services that were based off earlier Hyper-V protocols, the guest is expected to run flawlessly on newer Hyper-V releases.”
Oh, like anything that Microsoft produced has "run flawlessly." What universe are these people writing from?
Technically excellent people at Red Hat with horrible people skills
It's not just the Linux kernel maintainers. Debian's technical committee almost didn't recommend switching to systemd, precisely because many people have a... different... standard of how to do technical collaboration.
Though, it's alarming that Kay thinks he can just ignore the upstream kernel people. Lennart Poettering is trying to develop kdbus and get it into the kernel, so Linux would have a more useful RPC system. Lennart seems to be siding with Kay here, and Red Hat does maintain its own sets of patches away from the upstream Linux kernel. This doesn't bode well for collaboration.
Gentoo is already forking udev away from systemd. Since systemd is free software, perhaps the solution will be forking systemd at some point, like all those forks that GLIBC and GCC used to get until their respective leaderships changed.
Windows 95 was useful, not great
I remember when Windows 95 came out. Shortly afterwards, I visited some computer stores, and I was assailed by that Windows 95 boot sound from all sides. I interpreted that to mean that Windows 95 was not especially stable. Almost 19 years ago... Feels like a completely different world, one where Microsoft was considered a hero by everyone except the Apple Evangelists (an actual thing, headed by Guy Kawasaki), and people lined up to buy copies of Windows 95 at launch. At least Ctrl-Alt-Del worked reliably.
Windows 95 was slower than Windows 3.1, but I used it anyway. It was the mid-90's: why keep living with cramped 8.3 filenames? Windows 95 also had preemptive multitasking among Win32 processes, and its own built-in TCP/IP stack, and Plug-and-Pray so you didn't need to configure devices with text files as much, and it did useful things with that right-click button that exists on every single mouse intended for a PC. It still didn't do anything useful with middle-click, but middle buttons were less common.
I think of myself as an optimist. Computers are stupid. Every OS sucks. I hate computers. I think I am an optimist because I consider upgrades to be an opportunity to approach, ever so slightly, the ideal of a computer that actually works for you. Windows XP is insecure and slow and bad at 64-bit and bad for the Internet, so it needs to be eliminated. Windows 8 is terrible, but I think it's better than XP.
"What other computer platform has been more-or-less supported for 20 years?"
Well, the MC6800 series is still represented by the HC08 family of processors. That's more than 30 years now.
I don't think the computing platform should be part of the device. The interface should have open specifications, so whatever platform in the future could be adapted to drive it. For example, the best replacement for the Commodore 64 disk drive might be a flash memory adapter, not an exact replica of the original disk drive with all its original problems.
Re: Keeping Windows XP alive is not good for anyone
"'Do you expect a PC platform to last for 20 years? That's insane.'
Meanwhile, out here in the real world...."
...decisions are made by insane people with money?
Well, "ignorant" might be more charitable, but there's only so much charity I can tolerate.
Re: So does OSX and Linux...
For Mac OS X, that is a valid criticism. Apple wants you to replace everything on a regular basis.
But for Linux, the comparison is not apt. Linux does not have a single support option, because there is not one Linux. There are Mint, and Ubuntu, and Debian, and Red Hat, and Arch, and Slackware, and many others. And there are the non-Linux operating systems. There is less incentive for long-term support of Linux distributions, considering that most maintainers are paid little and release their work for free, but if you want very long support there are Red Hat/CentOS and SUSE Linux Enterprise Server.
And, because Linux and most of the software built on it are free and open source, you have the option of downloading the source code and fixing it or contracting it out yourself. That is inconceivable to a mind trained on Microsoft and Apple technologies. That is why Richard Stallman is right in the long term.
Also, Microsoft never intended to support Windows XP for so long. It was a horrible historical accident, due to XP's insecurity being exposed during the rise of broadband, and due to Windows Vista being such a poor fit. Microsoft is not committing to support any other system for more than 10 years and some months.
Keeping Windows XP alive is not good for anyone
This type of thinking would have been foreign in Microsoft's earlier days. After all, Windows 3.1 came out in 1992, and nobody was calling for it to be supported in 2005, 13 years after it was released. Granted, Windows XP compares much better against Windows 8 than Windows 3.1 did against Windows XP.
Keeping XP alive is bad for Microsoft. It means that Microsoft has lost control of the Windows APIs. As HiDPI screens finally appear, Microsoft needs programmers to switch to APIs that work well with these displays. Otherwise, people have a horrible experience and continue switching to tablets running Android or iOS.
Keeping XP alive is bad for the Internet. Besides the obvious security issues and botnet zombies, XP sucks at IPv6 and web standards. XP also never will support exFAT, and will gradually lose the ability to run newer hardware, as the drivers are written for newer versions of Windows.
Keeping XP alive is even bad for the people who use it. Industrial equipment requires Windows XP? That's your industry's own fault for tolerating proprietary drivers. Everything should have been open, so you could drive it with whatever operating system exists in 20 years. Do you expect a PC platform to last for 20 years? That's insane. 20 years ago, Intel's latest processor was the 100 MHz Pentium, running in a Socket 7 motherboard with maybe four 32-bit PCI slots, some ISA slots, an IEEE 1284 parallel port, and RS-232 serial ports. It's now hard to find a PC with any of these interfaces, though you can get cards and adapters for everything except the ISA slot. It's best to think of the junky Windows XP box as part of the otherwise fancy machine, and woe will befall you when the weakest, least replaceable part finally fails.
Using Windows XP indefinitely is a horrible idea. It needs contact with Microsoft to be officially activated, which means it requires Microsoft to keep its activation systems running. Microsoft will do so for now, but the day could come when your system fails, you fix it somehow, that triggers a reactivation, and Microsoft's servers have stopped responding. It's better to use a system that doesn't require activation in the first place.
And when can you use this in Android?
It's great to see this in Java, but when will it actually get to Android?
Java 7 was released in 2011, but Android was stuck on Java 6. Only in late 2013 did Android finally gain Java 7 language features, and it still can't use the Java 7 libraries.
Basically, I think of Android as a fork of Java 6.
Re: He's right... and wrong!
Wow, I wasn't even thinking about hard drive firmware. If you're really paranoid, you can employ full-disk encryption on that. Bus master peripherals are trickier...
I was thinking more of coreboot, the project formerly known as LinuxBIOS. There's no fundamental reason for motherboard firmware to be proprietary software, when motherboard makers suck at writing firmware and free alternatives exist.
What about DNSSEC, etc?
HTTPS is an inconvenience for the Great Firewall, but since the Chinese government controls a certificate authority and spoofs DNS answers, it's not an insurmountable barrier.
What we need is end-to-end trust. Google can start by signing the google.com zone, so a validating DNS resolver will refuse any spoofed responses. Google can then publish the certificates that google.com uses in DNS records using DANE or similar, so future browsers can refuse fake certificates without out-of-band techniques such as certificate pinning.
That's still not foolproof. Clearly, we can't trust google.cn. The Chinese government might decide to run its own DNS root, and outlaw domestic use of the IANA root. With the US finally deciding to get out of the business of running ICANN, the future of the root authorities could come into question.
Humans can't handle the semantic web
Piaget taught that humans reach a "Formal Operational" stage of cognitive development around the time that they reach physical adulthood. Going by his classification, I think most people barely get into the "Concrete Operational" stage, which is supposed to end near puberty.
Semantic tagging is very abstract. Most people don't handle abstraction well. And even if you do understand it, it's a waste of effort to add appropriate metadata when there are no programs to process it. It's just much easier to stick to ad-hoc textual conventions. That's why Google needs all those PhD researchers: to extract the semantic information from the mess of text.
What about Opera?
I guess Opera is not as exciting because it's a private company in the wastelands of Norway, so it doesn't publish its internal workings. And now Opera is dead to me, because they've decided to stop developing Presto and become a thrall of Google Chrome for some reason.
The death of the 1920x1200 screen was very sad, but what I'm watching now are the 4K monitors. They're starting to become affordable, for sufficiently stretchy definitions of "affordable." Sometimes a 4K IPS screen even goes below $1000. If I had plenty of money sitting around, I would so get that.
It has to be unobtrusive
For health monitoring to go mainstream, it has to be effortless. Any little difficulty means an exponential drop-off in the number of people doing it. I think having to wear an ugly watch is a substantial barrier to successful monitoring.
But I just got the first exercise report from the Moto X that I bought a couple of weeks ago. I didn't install anything to make that happen, and I certainly wouldn't activate it if I had to do so every day. Now that I have a report estimating how far I've walked and how far I've biked, I find it genuinely interesting.
Re: 2007 hardware obsolete?
My 2004 Pentium 4 desktop is also running Windows 8.1. It's the 32-bit version, but it runs fine. However, it has an NVIDIA GeForce 7800 GT video card; I wouldn't count on the motherboard's integrated video running as well.
Re: OSX Mavericks
It's not just arbitrary hard-coding that prevents Mountain Lion and Mavericks from running on older Macs.
It's device drivers. Apple is not bothering to support the Intel GMA 900 with 64-bit device drivers. Apple is still writing drivers for the GeForce 9400M, so Macs with that GPU are supported. So, you can get a newer OS installed on a GMA 900 machine, but it will have miserable performance and probably won't display at native resolution.
Re: What are we waiting for?
Not all of the Core 2 Duo systems can run Mavericks.
The CPU can do it, but Apple never bothered to write 64-bit drivers for the GMA 900 video processor on the Intel 915GM chipset. The first "unibody" Macs introduced the NVIDIA GeForce 9400M, which Apple is still supporting with drivers. So, I think some adventurous people managed to get Mavericks to install, but it doesn't do native resolution and the performance is miserable.
This lack of support sucks for me. I've been avoiding installing Lion on my early-2008 MacBooks, because they have only 2GB of RAM, and Lion reportedly runs poorly with 2GB compared to Snow Leopard. I didn't think it was a good use of limited funds to upgrade them to 4GB of RAM or an SSD just so Lion would run well.
What can make you wish for Scott Forstall to come back?
I've been reconsidering Jonathan Ive's actual design taste ever since he replaced all the fonts with spindly light text on low-contrast backgrounds. I thought the home movies with the unreadable white text were amateur mistakes, but now the marketing materials on Apple.com feature that, too. It's hideous.
Now this. Ive should never have been let out of the industrial design lab.
Why so negative about Hi-Def?
In my organization, we have a traditional standard-definition DVR, and it's almost completely useless. We haven't blanketed the outside of our building with cameras, so a single camera has to cover a large area.
Multiple times now we've had perps come in and vandalize some part of our property. Then we look at the recording and say, yep, that's them. That blurry block of pixels. The police can't do anything with this information.
My TV has had high definition for a while now. I anticipate high-definition security cameras some day. Eventually.
And what about services?
This comes so soon after the news broke about Lumias sending info back to Redmond. Nokia actually classified the OS as a third-party component when denying that it had violated anyone's privacy.
It wasn't so long ago that Microsoft accidentally applied its Chinese censorship across the globe, either.
Of course, Microsoft tries to follow the laws in the countries where it does business. Even if they are draconian laws from totalitarian countries.
No privacy benefits to Microsoft
This is an awfully bad time to be touting the privacy benefits of being Microsoft instead of Google.
Nokia smartphone leaks information abroad
PowerBook 5300 wasn't that bad, either
The PowerBook 5300 didn't sell very well, especially the high-end model, but it worked decently. Except for the part where they disabled Li-ion batteries, because of the fires that no customer actually experienced, and then didn't enable them again when Sony sorted out their problems. The PowerBook 3400 and first-generation PowerBook G3 batteries were physically compatible, but the PowerBook 5300 wouldn't take them.
The CD-ROM drive was a slight problem, but it could take external CD-ROM drives through the SCSI port. Even then, I didn't use CD-ROMs that often, and I even reused the external drive (originally bought for my 1991-era Quadra) on my PowerBook 3400 instead of buying an internal drive. Apparently, a CD-ROM drive was built for the 5300, but it could take only small discs, and was only seen in that horrid Independence Day movie.
I think the biggest problem with the 5300 was the narrative the press wanted to build, that Apple was failing and doomed, a narrative that Steve Jobs eventually managed to reverse.
"They didn't innovate, but they didn't fail either, so hooray for them."
Microsoft did innovate. Remember Microsoft Bob? While that was a commercial failure, it did give Bill Gates a wife and family.
Microsoft has been a lot more open with what they're doing than Apple, so you can see their failures. Things like Singularity, WinFS, Longhorn, and Courier would never have made it to the public in a modern Apple. A few innovations actually do get out of the lab, too, such as the big table Surface, not to be confused with the failed tablet Surface.
Apple //c was not bad
The Apple IIc wasn't that bad. It was essentially everything from the Apple IIe in a compact case, including the floppy drive but excluding the expansion slots. For ordinary use, that was sufficient. My school had two Apple IIcs and about ten Apple IIes in the computer lab, plus one Apple IIgs. I preferred the IIc's keyboard. I had an Apple IIc at home, too. But I don't know how expensive they were, nor exactly how well they sold.
Re: ...they can be persuaded to switch to a Mac
"OpenOffice is great but it barely can hold up with ancient Microsoft Office 2003, let alone 2007 or newer. LibreOffice is even worse, as it's essentially a features whore (why finally getting these annoying bugs fixed when we can have skins!)."
Spoken like somebody who never uses OpenOffice or LibreOffice. In fact, OpenOffice and LibreOffice are horribly glitchy and slow. But they are legally free, and as long as you're aware of their limitations then you can avoid trouble.
The major difference between LibreOffice and OpenOffice is that LibreOffice actually has a community behind it, so it has bug fixes and new features. Apache OpenOffice is the result of Oracle throwing in the towel on any commercial ambitions for OpenOffice, but being unwilling to join a real open-source community. So, they're getting contributions from IBM, but that's about it.
Almost done with Opera
Well, I for one stopped using their desktop browser when all the technically fascinating stuff, such as Opera Unite, became a neglected afterthought. I'm still using Opera Mini on my phone, because the phone is just too weak for a modern browser, but I will stop as soon as I get a real smartphone.
Re: Things I hope for...
"Have you heard of FreeBSD, Samba and Mono yet?"
Surely you mean OpenLDAP or something, and not the platform from the guy who's so in love with Microsoft technology that he named his clone of .NET after the kissing disease.
Re: MULTIPLE SCREENS!!
My memory of the era is a bit fuzzy, but I'm pretty sure you could extend the display on a 512K Mac by attaching a display adapter to the CPU. As in, open the thing, pull out the motherboard, and clamp an adapter precariously to the pins that attach the 68000 to the motherboard.
I'm not sure anybody actually built a display adapter that did that, though. I definitely remember some adapter designed to be clamped on like that, but I'm not 100% sure what it was.
Re: Strange Article
AppleTalk was great, and I loved how much faster the Chooser was than the Network Neighborhood, but your anecdotes don't seem to jibe with reality.
I don't know where you got your ImageWriter, but I've never seen one with an AppleTalk card installed. From the documentation, and from the drivers that came with the Macintosh OS, I know they existed, but I've never seen one. Likewise, I never bought the software that would let me share my StyleWriter with the other Macs on the network. That was the domain of businesses that actually had money to spend. Also, you did have to worry about which serial port your printer was attached to, except Apple labeled them Printer and Modem instead of COM1 and COM2.
Multiple screens were nicer than on a PC, but rare. If you wanted multiple screens on a Mac, you needed an additional NuBus card, or, later, a PCI card. PowerBooks could run only one screen at a time, at best mirroring it to an external display.
SCSI was nice, but that's what you get when you put workstation technology on a PC. Here are some shortcomings:
1) It was not hot-pluggable. All my Macs had a copy of SCSIProbe to activate any device that wasn't there when the computer first booted up.
2) It used manual addresses. 7 and 0 were SCSI controller and internal hard drive, respectively. But what about the rest? In the words of some forum philosopher, "WHO FUCKING CARES?"
3) It turned users into amateur electricians, because it required terminators to eliminate reflections.
4) Apple didn't keep up with storage technology. My Quadra 900 had 5 MB/s SCSI in 1991. My Power Mac G3 had 5 MB/s SCSI in 1998. Ultra Wide SCSI (40 MB/s) existed, but was the domain of expensive workstations and servers.
In one way, PCs were even worse than you describe. IBM introduced the PS/2 mouse and keyboard connectors. Now, instead of two different ports for two different peripherals, you had two identical-looking ports that were not interchangeable. Plug the mouse into the keyboard port and vice versa, and you got an error at boot. Macs were so much better: you could plug keyboards and mice into ADB in any combination you wanted. Keyboards even had their own ADB ports and power buttons, so you needed only one extra-long ADB cable to put the computer somewhere far away and get some quiet.
Re: Locked into enforced throw-away
"I've got a decent monitor, but now I have to throw it away and use the crappy one in my new computer."
No, you don't. Most people have pretty crappy monitors, and now Apple refuses to sell an iMac with anything less than a 1080p IPS panel with an anti-reflective coating.
But if you do have a good monitor, you can connect it to any Mac. The iMac can even drive 2 external monitors using only mini-DisplayPort adapters.
"I've got a decent computer, but now I have to throw it away and use the crappy one with my new monitor."
No, you don't. You think you did because it's big and you spent money on it years ago, but even the slowest iMac is faster than most computers I see in people's homes. But nobody is forcing you to buy an iMac. I don't see the point of getting an iMac instead of a nice IPS display if you truly don't want to use the iMac.
I wish my MacBook Pro had a palm rest!
The palm rest is a feature that I unexpectedly miss in my new MacBook Pro. Jony Ive has gone all minimalist industrial in his designs. To keep the lines all straight and clean when the laptop is closed, the laptop's base is now ringed by a sharp edge that cuts into my wrist if I rest on it.
Later passive matrixes weren't that bad
These days, I'm looking for high-DPI IPS or OLED screens, but back in the day I had a PowerBook 190cs. Apple actually continued to use passive matrix screens until they finished selling the budget "Wallstreet" PowerBook G3 in 1998.
The early passive matrix screens were a blurry mess. You had separate brightness and contrast controls, where the contrast varied between washed out and completely dark, with no good image in between. Operating systems included a pointer trails feature, because the screen updated so slowly that you would easily lose your pointer if you moved it faster than 1 mm per second.
The later passive matrix screens weren't so bad. Sure, due to the crosstalk between the rows and columns of the display, there was a massive amount of image bleeding, and the colors were horrible. But the display refreshed quickly enough to be usable, and most importantly they were relatively cheap.
Boycott the USA
The USA is becoming quite the hazardous place for a security researcher. I still remember the Sklyarov affair. Most recently, Adi Shamir, the "S" in "RSA," has been having a hard time procuring a visa for his lecture tours in the USA. American law allows customs and immigration officers to be jerks at their discretion.
I don't know what's a good place to have a security conference. This year, I guess TrustyCon needed to work around people's existing travel plans. But next time they should find some place safer.
Re: What we want to know is...
TIFKAM is like Marmite. Some admittedly love it, but too many people out there do not, too many for a company like MS, with around 90% of the desktop/laptop market, to just force it unconditionally onto everyone without creating a backlash.
And many people do not like the Start menu. It seems like a major waste of space to have the entire screen in front of you and have to scurry into the corner to reach important controls. People are just used to the Start menu, because it has been the main interface since 1995.
Note, I rarely use the Start button. I vastly prefer to use the keyboard to open the Start menu/screen.
Patents more harm than good
How can there possibly be a patent covering the idea of Flash memory on a DIMM? Ever since flash memory was invented, it has been placed on the memory bus to provide persistent storage. Linux has an entire device subsystem devoted to that sort of device: the MTD subsystem.
I concede that putting Flash memory on a DIMM is a new invention. I claim that it is obvious to anybody skilled in the art, that nobody did it before only because it wasn't economical, and that it therefore shouldn't have been patented. Beyond the technical slog of actually making it work, the main barrier I saw was the software to use it effectively. I seriously doubt that Netlist had anything to contribute to Diablo and SMART's product.
Chrome is now unstable and unusable
On a non-touchscreen, when you try to reach Chrome's start button in the bottom-left, the Windows Start button appears and covers it up.
The Metro mode doesn't work properly with Windows 8.1's Snap view. It doesn't resize its window to fit half the screen, so it's sort of useless.
Furthermore, now I'm seeing Chrome crash while editing Google Drive. Does anybody at Google actually test Chrome on Windows anymore?
Of course, another possibility is that the Chinese government wants information on how to run a major wireless chips manufacturer, for the benefit of their own industry.
Linksys is overpriced
It's sort of nice that the WRT54GL is such a stable target, as in never changing, but I don't see a good reason why I should spend so much money on such an obsolete device.
54Mbps 802.11g, stuck on a Linux 2.4 kernel
200MHz MIPS CPU
For $80 in 2014
This is just sad.
For that price, I'd rather get a Buffalo WZR-600DHP:
"600Mbps" dual-channel 802.11n
Gigabit Ethernet, with a separate Ethernet interface on the WAN port
680MHz MIPS CPU
It even comes with DD-WRT, and is very well supported by OpenWRT. The platform is also very similar to the CeroWRT reference platform (the Netgear WNDR3800), so it shouldn't be too difficult to adapt.
Abolish patents to create a clean slate
We have too many low-quality patents. Just improving the rules about new patents would still leave at least 20 years of bad patents for us to sort through. Patent reformers have tried making it easier to reexamine existing patents, but these efforts have been rebuffed by lobbying from Microsoft.
If we abolish the patent system, then we can cheaply abolish all bad patents. Then we can start patenting again with a clean slate.
Re: How about taking a cue from the music industry for derivitives?
In remixes, the samples came from somewhere. With decent record-keeping, you can trace where your sample came from, because it definitely did not come from you. Or you can make your own sounds, and then nobody else can assert copyright over them.
We are deluged with low-quality patents. You can't use them to build anything. You can only invent your own thing, and then get sued by a patent troll. Good record-keeping can't track which inventions you're using, because they're independent inventions. Patents you never knew about will surface, and their owners will sue you.
I'm not entirely pleased with music licensing, either, but that's a different story.
Re: Canonical feels like the Apple of Linux Distros
If you paid attention to the latest findings from psychology, and the common practices of the best marketers, you would already know that users don't really know what they want. Not even you.
When the iPhone and iPad came out, they were revolutionary in their category. Yet, millions of people who are familiar with the old paradigms bought them. There's a major disconnect between your point of view and reality.
The paradigm may be already lost
We think we need a full-power Linux phone. Some of us already saw it coming, in the Maemo project. That had everything. You could do devops on a Maemo phone. You could use a Maemo phone for monitoring (or cracking) WiFi networks. Then it became too visible and was killed by tragic mismanagement.
Modern phones may be weak compared to modern desktops, but modern phones are very powerful compared to desktops 10 years ago. Remember when Windows XP was still new? Those desktops could certainly run desktop operating systems. So, smartphones can, too.
But I think that we're moving into a different paradigm. When PCs took over from mainframes, we were not running mainframe programs on PCs. We were running vastly weaker systems with no concept of memory safety and a whole lot of security problems. When PCs grew to many multiples of mainframes' original performance, we still didn't run mainframe programs. We finally got multitasking and virtualization and so on, but they were tacked onto existing PC systems, and it has been a struggle to make them safe.
So, I think that the iPhone paradigm will dominate. I'm not sure of the exact drivers of that. The mainframe systems had the disadvantage that the major mainframe OS vendor, IBM, wouldn't license its systems to run on PCs, whereas desktop Linux is free. I suspect that cell phone companies like phones to be devices that depend on being connected, so customers don't dare run the thing without a cell contract. Also, people don't like change unless it brings a status improvement or a drastic usability improvement. Compared to Android, I don't think desktop Linux offers most people those kinds of improvement.
When I hear that you can't work on a mobile device, I'm reminded that PCs were said to be unable to do work. All that naysaying was interpreted as a challenge. People discovered that they could indeed do work on a PC. So it will be with these horribly limited mobile platforms.
Protesters are protesting the wrong things
The protesters are protesting change. You can't have life without change. If you try, all you get is death.
The 1960's-style hippies were not the first people in San Francisco. Nor were the gold miners of the 1860's. Nor were the missionaries of the 1780's. Ever since white people encountered San Francisco, it has been under constant change. I don't see why the hippies should be so privileged to keep their position in it.
Re: "Brock and mortar"
I already buy my clothes online. The local stores don't carry the styles that I want, and their brands are way too expensive. I do careful research, learning the vocabulary of fashion, to find clothes for my body.
It makes terrible financial sense to buy a new car. Car dealerships are some of the least trusted businesses in existence. There's a reason Tesla is selling their cars direct, and why car dealers are trying to outlaw Tesla's sales model. Used cars are big business for eBay.
Online reviews suck. When a product has reviews, most of them are superficial, and there's a lot of fraud. There's safety in bigger brands, but a lot of the time you just need to try it for yourself. It helps to have a great return policy. Especially when Amazon pays for the return shipping.
Re: Becoming a bank
Well, eBay is constantly nagging me to get a PayPal MasterCard. I consider PayPal the most hated way to pay online, so I steadfastly refuse. PayPal looks like a bank and sucks like a bank, and in Europe it's regulated as a bank. But they need to stop sucking so much.
Apple loves to push the risk onto others. The cost for the iPhone subsidy is borne by the carriers, and Apple imposes minimum order contracts. Apple sells Macs on credit, but the debt is with Barclays, not Apple.
Google and Amazon have a better chance. I have no idea why I would trust Facebook and Microsoft with my money.