51 posts • joined Wednesday 20th June 2012 06:13 GMT
Re: Becoming a bank
Well, eBay is constantly nagging me to get a PayPal MasterCard. I consider PayPal the most hated way to pay online, so I steadfastly refuse. PayPal looks like a bank and sucks like a bank, and in Europe it's regulated as a bank. But they need to stop sucking so much.
Apple loves to push the risk onto others. The cost for the iPhone subsidy is borne by the carriers, and Apple imposes minimum order contracts. Apple sells Macs on credit, but the debt is with Barclays, not Apple.
Google and Amazon have a better chance. I have no idea why I would trust Facebook and Microsoft with my money.
Auto-flushing urinals: Awesome. Auto-flushing toilets: The worst.
From the era of motion sensors, we have motion sensors attached to flushing mechanisms. It's nice not having to touch a dirty handle, though arguably a no-water urinal would be more environmentally conscious. It's funny, though, that the no-water urinals are out of service a lot more often than the conventional urinals.
But automatically flushing toilets are horrible. I squat forwards so I can reach my hand underneath, and the next thing I know, there's a bit of mist and my toilet seat cover is gone. Fortunately, I've been doing Yoga, or else I would have to sit back down on the now-exposed and undoubtedly contaminated toilet seat.
No! Elop for MS CEO!
Elop's specialty is destroying a business so it can be acquired by somebody else. If any company deserves that fate, it's Microsoft.
The Register is way behind here
You're well behind. Ars Technica already published both Marlinspike's critique, and Levison's official response.
Escher Girl pose
Oh my goodness, what is her hip doing underneath that skirt? She must keep her chiropractor in business.
Re: Epic whoring, but why?
"It's like next he will be telling us [other operating system] is a state sponsor of terror, responsible for spreading hatred and extremism throughout the ecosystem"
Prices going down is a problem?
Do you get upset when you buy a new computer and, immediately afterwards, a better computer comes out at the same price? Apple is just doing the same thing with the software that comes with the computer.
So it turns out that the $15 that Jasper spent for Pages and Garageband would not have been necessary if he had waited, what, a few weeks? I would be feeling a small amount of buyer's remorse, too, but only if I didn't bother to use those apps. I thought you get computer stuff to do a job, and if you sit around waiting for the price to drop then you won't get your job done. You pay the current price to do things now.
Also, unless you're in the grips of poverty, $15 is just not that much money. Does The Register need to give its writers a raise?
DRM does not belong in the standard
Okay, so DRM is going to be around? The least-bad solution is to marginalize it. Netflix wants to DRM their videos? Fine. Let a hundred Netflix apps bloom. Google wants the Netflix app to go through the ChromeOS browser? Let them collaborate with Netflix to make a proprietary browser. It should always remain clear that DRM is limited and proprietary, like we always understood Flash and Silverlight to be proprietary.
With the WHATWG doing the actual HTML5 development, the W3C should be used to having real development happen outside its bureaucracy, anyway.
Douglas Engelbart was sad that his Mother of All Demos deeply influenced computing, but he was never able to raise enough influence and money to finish the project. I guess Tim Berners-Lee is the opposite case: He had an innovative idea, but his influence continues long after he ran out of benefits to bring to society.
Re: Sure let him
The problem is that not everything will be available DRM-free.
Sure, movies and video games will be cracked as soon as they're released. But what about Internet-connected cameras that currently use ActiveX plugins? What about bank web sites?
And what about future developments? For example, I can imagine, with little imagination, very bad consequences of DRM in the eventual mass-market version of Google Glass.
DRM is a pernicious threat to personal computing, and I oppose it.
Re: Anyone fancy reading that?
"The entire post is based on the assumption that Microsoft doesn't add their IPs to pools and allow people to licence them thereby preventing industry growth when the entire opposite is true.
Microsoft's mobile earnings are almost completely made from licensing IPs to Android OEMs"
But Microsoft doesn't help develop Android. Microsoft is not a member of the Android Project, nor do they submit patches.
Instead, they sue Android manufacturers over independently developed and sometimes obvious parts of the Android system. For example, having the operating system contain certain UI elements for all applications to use (US Patent 5,889,522), or attaching notes to a read-only document by using a separate file (Patent 6,957,233), or using a phone to schedule a meeting (Patent 6,370,566). But their deep pool of patents and the court system's presumption of the patents' validity mean it's a challenge to fight them.
The dubiousness of the patents means Microsoft can't shut down Android entirely, but they can tax it and make it more expensive than it deserves to be.
Re: Over my cold dead browser
And you're an idiot.
What we're worried about is a return to the bad old days when you needed to run proprietary software to see content that you do want to see. We've always been able to compile our own browsers, but we haven't always been able to view content using them.
In the old days, it was DHTML and then ActiveX. In the less remote past, it was Flash. Java was there, too, but its bloated runtime always kept it from being dominant. Web sites built on all those technologies were unavailable in open source browsers. That is a problem when the web site is for your bank or your home security system, for two examples that I've encountered recently.
Now, because of Firefox and then Steve Jobs, those proprietary technologies are fading away. Good riddance. But the evil people at Google, Microsoft, and Netflix are trying to bring proprietary back, and now they've got Berners-Lee's blessing to put it into the official standard. I'm severely disappointed at how shortsighted that is.
Non-profit discounts now apply to religious non-profits, too.
Microsoft has long tried to lure non-profits into using current Microsoft software by providing severe discounts through organizations such as TechSoup. Now, I just noticed that they added religious non-profits, so there are fewer excuses for your random parish office to stick to pre-2010-era Microsoft software.
Can't standardize, yet. Innovation still needed.
You know what I miss about the old barrel plugs? The ability to insert the plug in any orientation. (Almost) Everybody has standardized on MicroUSB, but it's difficult to tell which way you're supposed to plug it in.
I'm not exactly happy about the clutter generated by new Lightning cables, but I think Apple is proving that there are still desirable improvements that we can implement in our mobile device cables.
Re: Hard and soft platform?
Most hardware manufacturers don't own any industry. Customers don't buy hardware because it has great specs. They buy hardware to run software. The developers of the software own the industry.
If the manufacturers had pushed harder for Linux, then the hardware manufacturers would have been part of a community developing their own software and owning their own industry. But, for most companies, the transition would have been painful because most of the programs that customers need are developed for proprietary platforms. So, to avoid short-term pain, they allow software companies to push them around.
Re: Waiting ...
1. The "open source" was never a significant factor in Android. They carefully avoided packaging any software owned by anybody with a GNU philosophy, and the Google Apps were never open source. I don't think they'll charge for it. Controlling the Google Play platform within the Android platform is probably enough.
2. Note how Amazon has already forked Android. Forking Android is legal and possible. I think Amazon will be in a dominant position when there are no other national retailers. They're already almost the only national bookseller, and they're starting to use differences in pricing to try to extract profit from rich/careless customers. Not to mention that Kindle book sales are almost pure profit.
Who would listen to Carl Icahn?
Why would Tim Cook listen to Carl Icahn? Does Icahn have any advice worth listening to for a company that doesn't plan to kill itself?
Isn't $20 billion enough for Mr. Icahn? It's not like he has a big family full of heirs to distribute his possessions to when he croaks.
Where is the SEC in all of this? Isn't the SEC supposed to protect the interests of shareholders against harmful individuals?
Re: Bill Gates still needs Microsoft
"There is no market, there is no money."
Well, then Google must be excessively evil or stupid to propose this thing. Because there is very little profit in advertising to people who have no money.
No, greater access to the market gives greater access to money. For example, I chose the shampoo that I currently use because I wanted to experiment with unscented soaps, and the Everyday Shea brand was the cheapest unscented shampoo I could find at my local store. The soap is made using fair trade shea butter from West Africa. Instead of letting the shea nuts rot on the ground while the people scavenge for food, global markets have allowed Africans to harvest the nuts and turn them into boutique soaps for pampered Westerners.
Bill Gates still needs Microsoft
"But this may be because Gates has taken his money and distanced himself from day-to-day activities at Microsoft, allowing him to run the charity without being conflicted by business goals."
On the contrary, Bill Gates is trying to hurt Google because Google is bad for Microsoft. His charity is funded mainly by his Microsoft stock, so it's still in his interest to promote Microsoft. I think it's bad form for one charity to criticize another just because it's working on different goals. And the Internet does have a philanthropic effect.
Greater communication allows greater access to the market. It allows people to make deals with other people at larger distances, thereby finding the best prices for their wares. It frees them from having to sell to the nearest market, which is subject to great price variation.
With better access to money, people become richer and are better able to afford mosquito counter-measures, which we take for granted in the United States. And that reduces the infection rate of malaria, Bill Gates' pet project.
Also, there is stuff on the Internet besides LOLcats. Notably, MOOCs.
Smart meters may actually lead to remote control
Well, smart meters don't currently allow the government to remotely control your appliances. The power companies just want to reduce the staff needed to check the meters every month. But futurists do imagine a world where that is possible. For entirely altruistic reasons, of course: so we could avoid taxing the power grid when the renewable power sources aren't working at full power.
Probably the government won't have direct control, though. If remote-control appliances do happen, they would be run by the power company, which in many places is a government-regulated monopoly. I don't like the proposed erosion of privacy, I don't like the possibility that these energy-saving techniques might randomly combine to make my refrigerator turn off too long and spoil my food, and I especially don't like the risk that a hacker could take over and make my house very uncomfortable. So, I'm not sure what would be the uptake on remote-controllable appliances if they actually happen.
Of course, Rush is just in it for the money.
MacOS is there for you
If you want hierarchical icons and consistently placed menus, you should be using MacOS. Apple has always been better at having Human Interface Guidelines (even if they don't always follow them), and they haven't dramatically changed the interface since the switch to MacOS X. The downside is they've never fixed the OS X Finder, which still sucks at the basics compared to the Classic MacOS Finder.
I think a search-based interface is crucial to manage the complexity of modern systems. A hierarchical clicking system is fine when you have only a few files, but as the number of files goes up then it becomes unwieldy. The worst is the Windows XP Start Menu: Non-alphabetical, multi-column, with sub-menus jumping out at you like bogeymen if you don't move the pointer to the correct selection with the proper grace. When MacOS X 10.4 and Windows Vista introduced searching that worked, I was pleased to use it.
I didn't realize there was a competition for "worst possible plug design."
What about component video via RCA plugs? Too many pins, and you have to arrange them manually. Always a fun activity to do on an HDTV mounted to the wall. Now I finally have a use for the flash on my phone's camera.
But the greatest fun is if you unexpectedly find yourself needing to plug into a projector that takes component via BNC plugs.
I'm not terribly fond of barrel plugs, either. Nothing intrinsically wrong with them, but they come in a great diversity of sizes. I especially dislike how a couple of my HP laptops have very slightly different sizes for the barrel plug, making their power supplies incompatible.
USB as better? No!
Weren't you just complaining about how USB is horrible because it can be plugged in only one way? Also, I don't know what your visual acuity is, but mini-DisplayPort hardly "does not look much smaller to the original." It's less than half as wide and somewhat thinner, and I don't remember Apple ever using full-size DisplayPort. My problem with mini-DP is that the original version didn't carry audio, so I have to worry about how old the MacBook Pro is and which mini-DP-to-HDMI adapter I'm using.
The Lightning connector is irritating because I need to collect all new adapters, but it can be plugged in both ways. That ease of use resolves a lot of my angst, but my angst increases again when I think about how it's patented. However, my need for plugs has declined quite a bit since I got an Apple TV.
Meanwhile in USB land, I have to contend with not 1 USB, but 7 connectors and 3 standards. There are USB A, USB B, mini-USB, micro-USB, and USB 3 versions of USB A, USB B, and micro-USB. I have an entire drawer full of mini-USB cables that have become redundant because of micro-USB, which "does not look much smaller to the original" in my view. But it's easier to tell which way is "up" on mini-USB. And now I need to worry about whether I'm plugging my device into a USB 3 or USB 2 port, in addition to any USB 1 hubs that might still be floating around.
The sooner we can switch everything to standard wireless, the better.
Palm's troubled history
I think Palm's problems go back to the original PalmPilot. I seem to remember Palm selling themselves to USRobotics to benefit from their "infrastructure and resources," but immediately USRobotics sold themselves to 3Com. The CEO of 3Com, Benhamou, drove away Hawkins and Dubinsky, the founders and visionaries, leaving Palm rudderless at a crucial time.
Even after Hawkins and Dubinsky came back when Palm bought Handspring, Palm meandered and frittered away any advantages they had. So, this ending for Palm is sad, but not out of character.
Shocking lack of imagination
I've frequently been annoyed by people who say that they can't imagine needing faster broadband because Netflix doesn't take any more than 6 Mbps. (Well, it now takes 12 Mbps in 3D Super HD, and it's bound to only go up.) So, the market is mostly made up of slow connections, and the major ISPs can point to low adoption as proving low demand instead of ridiculously high prices.
But the real crime is the lack of upload. Cisco had a very pretty networked webcam that flopped hard, mostly because Cisco sucks at the consumer market, but also because it required 1.5 Mbps upload for the 720p version and 3.5 Mbps for 1080p. Nobody on DSL has that kind of upload speed.
Not to mention all the other ideas that have to be shelved because speeds are just too low.
Re: Yes and no.
When some company starts using Linux, Linux people see it as a success. When Apple started using FreeBSD, many Linux people saw it as a failure. Why is that?
Maybe it's because of the BSD license. Apple freely takes from FreeBSD, but rather selectively gives back. I think, if the FreeBSD people are happy with that arrangement, then that's their business. Vive la différence and all that junk.
The GPL encourages companies to contribute to a community. Sometimes they have to be dragged in kicking and screaming, especially embedded device vendors, but mostly it has led to the greatest cooperative projects we've ever seen in software.
Re: In an ideal world...
It's true that DRM will not prevent pirate copies from appearing 10 seconds after the content goes live, but the people who are paying still get a gimped service.
Gimped in what manner?
DRM has not been practical for preventing piracy. They use it to control average users. So far, DRM has been especially effective in DVD, BluRay, HDCP, Apple iTunes, and Amazon Kindle. I don't care about video games, but it's there, too.
In DVD and BluRay, DRM prevents you from buying a disc in one region and playing it in an average disc player in another region. The media companies want to use differential pricing and staggered release dates to maximize revenue. Sometimes they decide that a region isn't worth the cost of releasing a product. Tough beans for you if they decide never to release a disc in your region. Also, DRM prevents you from skipping ads and propaganda about fictional laws that they insert at the beginning of movies.
HDCP theoretically prevents you from copying movies as you're playing them. In practice, the HD stream is so big that, even though the HDCP master key has been leaked, nobody bothers using it to pirate. You get better speed and quality from breaking the AACS encryption on the BluRay. Instead, HDCP ensures that you're using properly restricted media players and properly restricted output devices to play videos exactly the way they're intended to be played. But what if some minor flaw makes the DRM in your fancy-expensive device refuse to work with the DRM in your other device? Too bad, so sad.
Apple and Amazon use DRM to make sure that, if you buy any content for your fancy gadget, then you're trapped with that brand of fancy gadget for the rest of your content's life. Apple does it to make their line of fancy gadgets more attractive to buy, making more money on hardware sales than music sales, though now also making a lot of money in the App Store. Amazon is in the low-margin high-volume market, and wants you sticking around for them to shove ads in your face. Also, Amazon uses DRM to remote-delete content from fancy gadgets, including, fittingly, copies of George Orwell's 1984.
DRM has not yet been effective at preventing piracy, but it has been great at causing problems for people who want to use content legally.
DRM is exactly what Jeff Jaffe claims to decry
DRM is about ensuring that somebody other than the user is in control of the user's own computer. You can't do that without making the DRM module closed-source, preferably tied to a hardware component like TPM. That makes it fundamentally, mathematically, incompatible with the Open Web.
The Web was built from the ability of people to make their own browser. We wouldn't even have had HTML5 and the <video> tag if Blake Ross et al were not able to create Firefox, their own web browser that worked with the content on the Web. If DRM is part of the standard, then it would reserve the ability to make a standard web browser only to companies big enough to ensure that the DRM is effective for the average user. DRM marginalizes open source, which means it closes the Web and eliminates a lot of its potential for disruption.
Adding DRM to the Web doesn't even solve anything. Google and Microsoft already add Flash to their browsers, and they're free to add whatever else they want. In no way does this belong in the standard.
I expected this sort of evil from Microsoft, but I'm disappointed in Google.
Enemy of Fitts' Law
I see they've brought their signature design element to Chrome: A 1-pixel border above the tab bar. Google and Mozilla make their tabs infinitely tall, by having the tab right at the top of the screen. Microsoft ignores Fitts, except for making the close button have the largest Fitts' Law area. Maximum destructiveness, just what you'd expect from Microsoft.
But Opera is unique. Opera puts the tabs almost at the top of the screen, but not quite. There is a border between the top of the tab and the top of the screen. Just 1 pixel tall. This new version puts the O menu at the top of the screen, while the old version had a 1-pixel border, but the tab bar is still barely not at the top.
Re: Tried it this morning
Looks like we're starting to see the results of Opera's firing of all their browser engineers. I wouldn't stick with the old version for too long, because now nobody is releasing security updates for it.
On the upside, I now get to play with other browsers. First, I'm trying Dolphin.
Tired? Try spoiled and lazy.
It has been how many years since Windows 7 came out? Microsoft came out with the next version right on schedule.
IT administrators have been spoiled rotten by the decade-long reign of Windows XP. That was an anomaly, and a hindrance to progress. After Vista was released, Sinofsky got Microsoft back on track, releasing new versions of Windows on time.
Before Windows XP, Microsoft released new versions of Windows every couple years. After Windows XP, Microsoft is again releasing new versions every couple years. This is how it should be.
DRM promotes interoperability? Wrong!
It's sad that the CEO of the W3C would be so wrong.
The Encrypted Media Extensions would standardize the APIs for the content decryption modules, but it won't standardize the content decryption modules (CDMs). It can't standardize the CDMs, because that would make the encryption scheme unworkable, due to pre-existing W3C policies on open source. Therefore, the content "protected" by the CDMs would be restricted to "applications inaccessible to the Open Web or completely locked down devices." Exactly what Jaffe said he didn't want.
EME will not make the web more open. It will only make life easier for people who try to restrict users' freedoms. It's the <video> tag all over again, but even worse.
Re: Oh Dear
Of course, he's referring to the N9. The N900 runs Maemo, and was released before Elop became CEO of Nokia.
For one thing, there's the matter of scale. Shuttleworth is bankrolling Ubuntu, but he has only about $500 million to play with. Microsoft has over $60 billion to use in any way as long as a regulator doesn't stop them.
For another thing, Microsoft rarely competes on the merits. They use unfair contracts and misleading advertising to spread into new markets. To develop a market, you need to spend time in it, iterating and improving your product, until you have something good. Most companies need to be profitable pretty early in this process, but Microsoft has demonstrated that they will lose billions of dollars to conquer a new market. It's exceedingly unfair to compete against a player using negative margins, and it has destroyed many innovative people.
IBM used to be the evil empire, so we used to hate them. Their management was incompetent, so they fell to morally neutral status a while ago.
Re: Solving PI vs. time to mine one Bitcoin...
If you really want to understand Bitcoin, then you should learn from the people who work on it, such as http://bitcoin.org/en/how-it-works
Question 1: Is mining Sisyphean?
I can't tell whether Bitcoin mining is Sisyphean. The story of Sisyphus is a morality tale about the futility of undermining the rules of the gods. The problem with Sisyphus was that his ultimate opponent had divine powers. Bitcoin has no gods as enemies, as far as I know. Besides, not everybody is involved in Bitcoin for the monetary reward.
The calculations are not there to slow the rate of expansion. They do slow it, but their purpose is to reinforce the integrity of the Bitcoin system. Bitcoins are traded from one wallet to another via transactions. Zero or more transactions are bundled together in a block, along with some additional data such as the identity of the previous block, and the blocks form a chain going back to the first block. A block is validated as part of the block chain by having a header whose hash meets the current difficulty target.
The hard part is finding such a header. The difficulty is adjusted up and down depending on how quickly the previous several blocks were found, so that each block takes ten minutes on average. The Sisyphean aspects are that:
1) The more miners working on the blockchain, the less likely you are to find a block, so your mining equipment becomes less valuable. CPU mining is already worthless, and GPU mining should become unprofitable due to competition with FPGA and ASIC.
2) The reward for finding a block is a combination of the transaction fees and the new Bitcoins themselves. The number of new Bitcoins is gradually decreasing, by design, until just under 21 million Bitcoins will be generated by 2140. Assuming the Bitcoin system is still operational at that time. Miners will still have an incentive to mine because they receive the transaction fees, but that depends on Bitcoin becoming an active currency with enough volume to make the transaction fees significant.
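To make the shape of this concrete, here is a toy proof-of-work loop. This is a simplified sketch, not the real Bitcoin scheme (real mining uses double SHA-256 over a binary 80-byte header and a compact-encoded target; the header format and names here are made up for illustration). The miner varies a nonce until the hash of the header falls below a target, and anyone can check the result with a single hash:

```python
import hashlib

def mine(prev_hash: str, transactions: str, target: int):
    """Search for a nonce whose header hash is below the target.
    (Toy model: not the real Bitcoin header format.)"""
    nonce = 0
    while True:
        header = f"{prev_hash}|{transactions}|{nonce}".encode()
        digest = hashlib.sha256(header).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest  # found a valid block header
        nonce += 1

def verify(prev_hash: str, transactions: str, nonce: int, target: int) -> bool:
    """Verification is one hash, vastly cheaper than mining."""
    header = f"{prev_hash}|{transactions}|{nonce}".encode()
    return int(hashlib.sha256(header).hexdigest(), 16) < target

# A smaller target means higher difficulty; this one is kept easy,
# requiring roughly 3 leading hex zeros (a few thousand tries).
target = 2 ** 244
nonce, digest = mine("0" * 64, "alice->bob:1BTC", target)
assert verify("0" * 64, "alice->bob:1BTC", nonce, target)
```

The asymmetry is the whole point: finding the nonce takes many hash attempts, while checking it takes one, which is why the rest of the network can cheaply audit every miner's work.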
Question #2: Could Bitcoin be used for utilitarian purposes such as SETI or Pi?
It would have to be a different protocol, not Bitcoin, because Bitcoin is inextricably linked to the use of cryptographic hashes to find headers of appropriate difficulty. I guess you could make a newer version of Bitcoin that uses a different hash, if SHA-256 turns out to have a fatal flaw within the next hundred years, but the scheme doesn't work for general-purpose computing.
The great thing about the cryptographic hash is that anybody can verify a header far more cheaply than it cost to find. You could hypothetically mine using SETI, but there's no way for somebody to independently verify that the SETI-miner has done the work. They would have to download the same blocks from SETI and redo the calculations. That's a lot of work and a single point of failure.
Also, "solving" pi as a method of compression is extremely impractical. Pi's digits never end, so there is no finished solution you could store and look things up in; you would have to recompute digits on demand, which more than wipes out any savings. Worse, by a simple counting argument, the position at which your data first appears in pi's digits takes, on average, at least as many bits to write down as the data itself, so in general you save no space in the transmission. I'm sure better mathematicians than I have proven it somewhere.
Re: Dumb question
"I gather that it's designed to get significantly harder to generate a new bitcoin as more are brought into existence, with a theoretical maximum number that could ever exist.
So eventually a new bitcoin will require near-infinite processing power."
The algorithm sets a hard limit of 20,999,999.9769 BTC. That's part of the design. The Bitcoin system generates blocks at a fairly steady rate. (It temporarily increases the speed when new mining hardware comes online, and it temporarily decreases the speed when mining hardware goes offline. Likewise, the processing power to find a block goes up, on average, as new mining hardware comes online.) But the number of Bitcoins per block decreases until, eventually, there will be no more Bitcoins generated. Currently, the reward is 25 BTC per block. Block #6,929,999 will generate 0.00000001 BTC, and then there will be no more BTC generated. To say that you are making more BTC after that would be to engage in fraud, and it would not be recognized as valid by the Bitcoin network.
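That hard limit falls straight out of the halving schedule: the block reward starts at 50 BTC, is cut in half (with rewards counted in integer satoshis, so the halving rounds down) every 210,000 blocks, and rounds to zero after 33 eras. A few lines of arithmetic reproduce the exact figure:

```python
# Reward starts at 50 BTC = 5,000,000,000 satoshis and is cut in
# half (integer division) every 210,000 blocks.
HALVING_INTERVAL = 210_000
reward = 50 * 100_000_000  # satoshis per block in the first era

total = 0
while reward > 0:
    total += reward * HALVING_INTERVAL
    reward //= 2  # the halving; rounds down once rewards get tiny

print(total)                 # 2099999997690000 satoshis
print(total / 100_000_000)   # 20999999.9769 BTC
```

The last era pays 1 satoshi (0.00000001 BTC) per block, which is why block #6,929,999 (the end of era 33) is the last one to generate new coins.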
The new Bitcoins go to the miner who first makes a solution to the mining algorithm, but that's not the primary purpose of the mining operation. The primary purpose of the mining operation is to validate transactions as having happened. Sort of a very expensive COMMIT operation. The transactions are grouped into the blocks, and the miner receives the transaction fees that are attached to the transactions. Thus, mining should remain profitable after the end of new BTC because the miner receives the transaction fees.
This was actually a pretty clever solution. By dribbling out Bitcoins using an algorithm, Satoshi solved the problem of distributing money without a central authority. By rewarding transaction blocks with new BTC, Satoshi created an incentive for people to validate transactions before there was a profitable volume of BTC transaction fees.
Of course, that's assuming nobody finds a major flaw in the algorithm (certainly not for want of trying), or the economic conditions drastically change (World War IV as Total War?), or people become disillusioned with Bitcoin for some reason. Bitcoin has already survived several flaws in implementation of Bitcoin services, and even a flaw in the core Bitcoin program, but it's still vulnerable. Everything is vulnerable, but Bitcoin is relatively new and people are more aware of its vulnerability.
Re: Dumb question
No, like any other commodity, the price will fluctuate according to the laws of supply and demand. It's independent of the cost of producing it. Right now, there is an alarmingly high demand for Bitcoins, so the price is pretty high.
The algorithm is designed to produce Bitcoins at a steady rate by adjusting the difficulty of the algorithm to the availability of miners. The miners have an incentive to mine because each block they successfully validate gives them a certain number of Bitcoins. But the cost of mining depends on the cost of acquiring the hardware, the cost of the electricity to run it, and the cost of the Internet connection to connect it to the Bitcoin network. And, as a practical matter, there is also the cost of spending time on Bitcoin mining instead of doing something else.
Many miners mine because their costs are lower than the wealth they can receive by validating blocks, and that drives investment into Bitcoin mining. But as more miners join, the difficulty goes up to keep Bitcoin production steady. Essentially, you get more people competing for the same resources. The return on investment for the less efficient miners eventually becomes so low that they drop out, which provides a sort of balance to the system. Some people will continue to mine just for the fun of it, but a lot of the mining is done for a profit.
The amusing part about this story is that the script kiddie is doing CPU mining. CPU mining became unprofitable a couple of years ago; the cost of electricity and the opportunity cost of setting it up have made it impractical compared to other techniques. In recent months, specially built chips (ASICs) have come online, so now a majority of the Bitcoin mining proceeds are going to people who invest in devices that can do nothing but mine Bitcoins. Soon, a majority of GPU miners will also go offline, just because it will be unprofitable to mine with GPUs.
In summary, you have it backwards. The inflation of Bitcoin doesn't depend on the processing power to mine it. The inflation of Bitcoin depends on the algorithm. The processing power loosely depends on the exchange rate, which is dependent on people's valuation of the currency.
Amazon and Google, please work together.
Google Shopping is almost useless. Retailers pay to be included, and they generally have poor selection and high prices.
Amazon Search is almost useless. It frequently returns items not related to the search terms. It's getting better, but it's still a chore to sift through. Also, sorting by price is a bad joke.
In my ideal world, I would use Google Shopping to search for things from Amazon.
No, the problem is still there.
"The problem that gave rise to VP8 and WebM, namely Mozilla declining to support H.264 in 2010 for fear that it might be target of an patent bomb from MPEG LA, the overseer of MPEG IP, has therefore passed."
No, the problem is still there, and you're an idiot, Richard Chirgwin.
The problem is that every implementation and every commercial use of H.264 requires a paid license, which is incompatible with free software licenses such as the GPL and many open source licenses.
Mozilla finally caved on the use of H.264 in playback because the vast majority of systems where Firefox is installed have paid licenses of H.264. It's different for bare-metal systems such as Firefox OS, and it's different for encoders and potentially commercial uses such as WebRTC. If Mozilla required H.264 in these situations, then they would be adding new obligations to users, which is incompatible with the principles of free software.
This basic incompatibility between H.264 and Mozilla's mission is probably why Microsoft and their pet Nokia are trying to kill VP8.
Re: Doesn't Nokia have a point?
"Given I take H.264 video off my HD camera, edit it, encode it with x.264 and play it back all within Linux I'm not sure you understand the situation"
No, I'm not sure that you understand the situation.
To use H.264 legally, you need a license. You need a license for the codec, and you need a license to use it.
Windows can use H.264 because Microsoft pays a license for the codec. Flash can use H.264 because Adobe pays a license. MacOS can use H.264 because Apple pays a license. Android and Raspberry Pi can use H.264 because their respective vendors buy licenses. Mozilla is finally doing H.264 in Firefox because the vast majority of Firefox installations are on systems for which somebody paid a license.
For Linux, in general, nobody paid a license. The x264 developers are flagrantly ignoring the patents, and they are deliberately based in countries with lax patent enforcement. Mainstream Linux distributions are reluctant to add H.264 because the license requirement is incompatible with the GPL and even the BSD license.
Then, to publish videos with H.264, you also need a license. For now, MPEG LA is letting people use H.264 for personal purposes, but do anything commercial and you're supposed to pay.
That's not to mention all the submarine patents that could derail H.264, just like Nokia is trying to do with VP8.
Morons like streaky are why screen resolutions haven't increased in 10 years.
The point of having so many pixels is so you do not see individual pixels. That's what Apple was advertising with the Retina branding, and what Google is now calling the Chromebook Pixel.
I don't want to see pixels. I've seen plenty of pixels. I want the pixels to be so small that I see smooth fonts and sharp pictures. That's the point of having such sharp screens.
Re: So many issues I hardly know where to start...
"1. Most obvious question – even if you're a big Chrome fan, why not buy a MacBook Air and access your favorite Google apps and services from it without giving up the benefits of local capabilities?"
Most obvious answer – Because you need to maintain the OS on a MacBook. You don't need to maintain the OS on a Chromebook. Just think of the children. Or the parents.
Except for not being compatible with any apps or devices, so you have to retrain them on the Google Cloud way of doing things. You can't have everything. Mac users should already be familiar with this.
Who wants a federal monopoly?
What a weird strawman, Mr. Orlowski. I don't think any reasonable person wants a federal monopoly. I certainly don't want the same organization that gave us the TSA to give me last-mile Internet. What I tend to hear is that Google, et al, want the last mile of Internet access to be regulated and opened to competition, but still privately owned and maintained.
You say Americans have a reputation for being doers instead of whiners, but existing regulations mean we just aren't allowed to do. As a resident of San Francisco, I'm sure you know of Monkeybrains' attempt to bring micro-trench fiber to the city. Well, they couldn't figure out how to file the paperwork to get their construction approved. Sonic.net is able to move forward only by turning itself into a phone company and taking on all the regulations that come with that, which means no naked DSL.
I suspect that Cyrus Farivar at Ars is a bit anxious about the Comcast thing because it's a 6-month or 12-month promotional deal. At the end, he'll have to face the choice of paying 2-3 times as much for the same service, or having his service cut to 1/2 or 1/4 of its current speed. He knows he won't get anything comparable from AT&T, so he can't threaten to leave to get better prices. Also, frankly, $45 for 24 Mbps is pathetic compared to many other places.
Re: No surprise, I predict that there will be more to come
Well, my plan for achieving migration is saying, "I will make it happen," on my network. If you are a network administrator, now it is YOUR personal duty to enable IPv6 connectivity on your network. IPv4 was deployed by millions of individual decisions to join the Internet. IPv6 will be deployed by the same.
In my section of the USA, the ISPs are trying to eliminate the home router market. When you get new Internet service from Comcast or AT&T, you get a combination modem and wireless router, too. The upside is that the routers they've started shipping in the last few months support IPv6. This means homes in the USA are gradually shifting to IPv6 without consumers having to learn new technology. That's a positive development.
Embrace, Extend, Extinguish
"[Microsoft] maintains it wants to make WebRTC more flexible… That makes the technology more readily adaptable to a given developer’s needs, but it also limits interoperability."
Good to see that some parts of Microsoft are still up to their old tricks.
I suspect that one reason SIP never caught on is that it allows a bewildering array of codecs. If you have a SIP client from one vendor and want to communicate with a SIP client from another vendor, you have to carry so many codecs, most of them patented. Skype is just simpler to use and more reliable. So, I see Microsoft is trying to preserve the value of the investment in Skype.
Finkelstein old-fashioned, irrelevant?
I find any Worst CEO of 2012 list that does not include Stephen Elop to be seriously suspect. I mean, doubling down on a losing strategy, while tossing your institutional knowledge overboard, seems like it should be a reason Why Smart Executives Fail.
But castigating Zuckerberg for not wearing a 19th Century period costume is pretty low. Zuckerberg signals that he doesn't care about the traditional finance people. In fact, he doesn't. As long as he has controlling shares, it doesn't really matter what other people think, as long as he doesn't break any laws. If Zuckerberg can reduce his mental burden by wearing a hoodie every day, then he can concentrate his energy on stuff that really matters to his shareholders (especially himself).
On the other hand, what has Sydney Finkelstein done? Trained a bunch of executives? Made friends with the 1%? I require better reasons why I should pay attention to Finkelstein instead of Zuckerberg.
UPS? So lucky!
My company just moved into a new facility. I work in the AV department, and one of the items I put on my wish list was a UPS system. The company hired a consultant to build the AV system, and he denied the request, saying, "The power from the utility is reliable."
Um, part of my work is digital recordings. I have to buy my own UPS units. :(
Automatic memory management
Of course, this sort of problem is absolutely impossible in a system with automatic memory management, because the programmer has no direct access to pointers. For example, Java.
Re: Update on exit
It's true, you really don't know.
Linux (and other modern Unix-like systems) has the concept of the inode that is separate from the filename. The filename is merely a link between the directory structure and the inode, and a file can have more than one link. When the number of links reaches 0, then the file is deleted.
So, when a program is running, it creates an in-memory link to the inode. It's possible to remove the file from the directory structure, deleting it, but it will still be on disk because the in-memory links keep the number of links from reaching 0.
It's not perfect, if you consider badly written programs. Some programs depend on files that load after the program loads, and the in-memory link thing can cause confusion. Many times people have been working on a file, deleted the old version, hit save, and then found that the new version was not there. That's because the program was holding onto the file's inode, and didn't verify that the inode still had a link to the directory structure when the user hit save.
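The name-versus-inode behavior above is easy to demonstrate. This little Python sketch (POSIX-only; it will not behave this way on Windows) unlinks a file while keeping it open, then keeps reading it through the open descriptor:

```python
import os
import tempfile

# Create a file, keep it open, then unlink it: the directory entry
# disappears but the inode survives until the last open handle closes.
fd, path = tempfile.mkstemp()
os.write(fd, b"still here")

os.unlink(path)                  # removes the name, not the data
name_gone = not os.path.exists(path)

os.lseek(fd, 0, os.SEEK_SET)
data = os.read(fd, 100)          # the open descriptor still reads the data
os.close(fd)                     # only now does the kernel free the blocks

print(name_gone, data)           # True b'still here'
```

This is also exactly how a long-running program can keep working after an update replaces its files: the old inodes stay alive until the program finally restarts.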
Everybody's doing it
No surprise, Skype and Symantec having problems getting people to install their latest software. I hate them both.
But everyone's doing it. Apple demands that you leave your Mac unusable for a long time while it installs updates. (Apple on Windows also proactively shuts down whatever you're doing so it can update. So horrible.) Java, ATI, and Adobe sometimes try to sneak some unwanted ad-ware on your computer. Mozilla randomly plays 20 questions with you about your plugins.
Part of the problem with Windows is that open files are locked, so you can't replace them in place, and you actually need a window when the program isn't running to apply updates. On Linux, you can update at any time and restart the program when convenient.