Anyone who doesn't want the crappy web-video performance they are likely to get with HTML5?
"how long before some shysters file suit in order to get the likes of Cortana, siri etc declared a sentient being and have full human fights?"
This does seem rather dangerous. It's only a matter of time before these voice assistants get cybernetic bodies, and begin fighting humans. Even a room full of humans might not stand a chance when the robots have knowledge of each of their individual strengths and weaknesses. All powered by The Cloud.
Re: virtual monitors
"Not yet, but they are improving. There are Oculus prototypes that are full 1080p, plus for business purposes you don't need stereoscopy; a single screen, even a Cardboard solution with a sufficiently-high-res smartphone will suffice."
The VR headsets like the Oculus Rift, HTC Vive, and other upcoming models that have been getting attention lately probably wouldn't be great as monitor replacements for at least the near future, simply because they're designed more for spreading their resolution out over a wide field of view. You don't need a 100+ degree viewing angle for a virtual monitor, so under that usage scenario, much of their resolution would be wasted. For a privacy-minded head-mounted display that isn't concerned with putting people in immersive 3D environments, a much narrower field of view with pixels more tightly packed together would probably be ideal.
And even if you're not sending different images to each eye, you'll still need a separate display for each eye (or half of a larger display dedicated to each eye) since optics aren't going to let you view the entirety of a screen right in front of your face with both eyes at once. And again, the design of these headsets that use a single smartphone screen divided in two are more suited to providing a wide field of view than they are a sharp central resolution. And of course, you probably won't want to be using a bulky solution with a screen much larger than you need for any considerable length of time.
For "business purposes" you would be better off with a headset that makes use of two much-smaller screens that could be optimally positioned in front of each eye. And if you plan to use the thing in a public place, you'll probably prefer an augmented reality solution to something designed for virtual reality. What good is the security gained from using the headset if you're getting pickpocketed in the process?
I agree that the tech is improving though, and within a few years or so, there may be AR headsets that are not much bulkier than a pair of glasses, that can provide dual-screen output suitable as a proper monitor replacement.
Re: Private Viewing...
There actually are collapsible anti-glare / privacy shields like this. I think they're typically marketed more for reducing glare though, either for people working with screens in bright outdoor conditions, or for graphic artists wanting to eliminate reflections from room lighting without affecting the image quality of their display.
I suppose that if they were used in a public place though, they might attract more attention, and perhaps even encourage people to look over your shoulder to find out what you're trying to hide, whereas a screen filter would probably be more discreet.
Re: Interesting complaint
"I personally tend towards the view that the issue at hand is cultural differences between Germany and the USA."
No, it's more a case of social media enabling anyone to make nonsensical claims to get their moment in the spotlight, and news companies jumping on any sensationalist story that gets picked up, however absurd it actually is. I'm pretty sure no one is actually really offended by it, even if they might claim to be to get their name in the news.
Re: RAID 1
They could recover files off computers, or put questions to employees. And despite popular myth, companies do still tend to use lots of paper documents, even today, so there could easily be incriminating papers around somewhere.
Re: To be honest...
I think you're missing the point that they were not only being deceptive to those testing for emissions, but they were also lying to their customers directly. They repeatedly made a point of advertising their cars as offering "clean diesel", when clearly they were not. For many people, the supposed "environmental friendliness" was a major selling point that convinced them to go with their cars, when in reality, it was false advertising.
And no, it was certainly possible for them to make the cars both perform the same AND offer emissions within the necessary guidelines. Not doing so was likely more a matter of them cutting corners in an attempt to reap larger profits and undercut the competition.
Now the question is how this will affect those who bought the cars. It's certainly going to negatively impact their resale value, so their customers have already lost money due to it. There's going to eventually be a recall, but how will that affect the cars? A simple software fix will not likely be enough, because the cars aren't built to be physically capable of offering their advertised performance at the required (and advertised) emissions levels. So, both a software fix and a more-costly hardware replacement will likely be necessary to bring each car up to code. And then you have to rely on the dealership actually installing the hardware correctly.
I imagine they'll be careful to avoid issues like this in their upcoming cars for at least the near future, since they're under increased scrutiny now, but they may very well cut corners in other areas. At the very least, a company going out of its way to deceive both its customers and regulators does not seem like a very good reason for you to go out of your way to purchase from them in the future. Perhaps you have stock in the company though. >_>
Re: So no AdBlock, NoScript, or RequestPolicy then
"Extensions are in fact going to be dropped too."
"No Dan, your link just says they're changing how you build them, not dropping them"
Actually, in many ways they ARE dropping extensions as we know them, and replacing them with what amount to "Chrome-compatible" extensions. Firefox has had an extension system that gives a lot of control to extension developers, and in turn anyone using those extensions. They're getting rid of much of this advanced functionality, so extensions will be quite a bit more limited in what they can do. Many existing extensions simply won't be possible to port to the new system, or they'll need to drop major features in order to work with it. While it may not be the topic of this article, it is likely to be something that's going to upset a lot of people when it happens.
"After reading this about them dropping NPAPI after having read about them them dropping XUL extensions, I now think Mozilla don't have any idea what they've got and are just dumping features that aren't in WebKit and making it look more like Chrome."
I suspect their ultimate plan may be to go the Opera route, eventually discontinuing their browser and replacing it with a low-maintenance Chromium re-skin with a few extensions built-in, that they can profit off of without spending any significant development budget on. These are the same kinds of things Opera was doing before that happened, and why the "Opera" desktop browser of today is just a generic featureless Chrome-derivative instead of the feature-packed Internet suite it once was. They may very well be making their browser look and behave like Chrome in an attempt to ease the transition.
Re: Waste of time and money, you can't break the rules of biology.
"ONLY if you give it enough corresponding bandwidth to compensate. Otherwise, you force it to cram and create even larger artifacts that offset the resolution improvement."
Yep, if broadcasters were interested in improving image quality, they would increase resolution and bitrates to make proper use of what existing HD screens can already offer. Practically no cable or satellite companies even air anything at full 1080p yet, and their 1080i broadcasts are typically compressed down to bitrates comparable to that of a standard-definition DVD.
Online streaming services are even worse when it comes to image quality. Youtube's "1080p" runs at about half the bitrate of an SD DVD, and as such, when a scene is in motion, everything tends to devolve into a blurry mess. Youtube's 4K streams increase the bitrate, but it's still only a little higher than that of a DVD, or similar to 1080i cable. In scenes that are mostly stationary, it's possible to get a sharper image, but with any significant amount of onscreen movement, things will again get blurry and artifacts will become visible.
Of course, even with high-bitrate encodes, most people won't be able to discern the difference between 1080p and 4K at typical television viewing distances. Ultimately though, image quality is much more limited by the broadcasters and streaming services than it is by a television's physical resolution, and they're going to continue using the bare minimum image quality that they can get away with, since any increase in bandwidth will cost them more money, for something that the majority of their subscribers won't even notice.
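To make the bitrate comparison above more concrete, here's a rough bits-per-pixel calculation. The bitrates are my own ballpark assumptions for illustration, not measured figures, but they show why a higher-resolution stream at a similar bitrate ends up with far less data per pixel:

```python
# Rough bits-per-pixel comparison. All bitrates are illustrative
# assumptions (Mbps), not measurements: (bitrate, width, height, fps).
sources = {
    "DVD (480p SD)":  (6.0,   720,  480, 30),
    "YouTube 1080p":  (4.0,  1920, 1080, 30),
    "Cable 1080i":    (8.0,  1920, 1080, 30),
    "YouTube 4K":     (15.0, 3840, 2160, 30),
}

def bits_per_pixel(mbps, w, h, fps):
    # How many bits the encoder gets per pixel per frame, on average.
    return (mbps * 1_000_000) / (w * h * fps)

for name, (mbps, w, h, fps) in sources.items():
    print(f"{name}: {bits_per_pixel(mbps, w, h, fps):.3f} bits/pixel/frame")
```

Under these assumed numbers, the SD DVD gets nearly ten times the bits per pixel of the "1080p" stream, which is exactly why motion-heavy scenes fall apart at higher resolutions without a corresponding bitrate increase.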
How much growth did they expect when their devices still cost thousands of dollars for something that amounts to little more than a hobby item for most people? Their products are targeted toward a niche group interested in designing things themselves, and there are now lots of other competing devices targeting that same group, with many that perform better or cost less. The vast majority of people simply won't see the worth in spending a thousand dollars or more for a device to print mostly useless trinkets that still need to be cleaned up and detailed following a lengthy and unreliable print process.
Unless they can make a decent, reliable 3D printer targeting consumers at a price point of a few hundred dollars or so, they'll have a hard time expanding their business much.
My first impression was that I wasn't fond of the new logo, but then your article reminded me that I didn't actually like the old logo all that much. The typeface was fine, but the "weird blobby thing" was always so weird and blobby. (In actuality, it's an abstraction of their even older logo, which was an eye that looked like it had been drawn in MSPaint.) I think we had only become desensitized to it from seeing it so much. The new logo isn't bad, and the old one probably should have been updated a decade ago.
And be sure to watch their new brand video if you like having seizures...
Re: Another bit of idiot naming...
Also, WWE wrestling isn't even an actual competitive sporting event. The matches are choreographed and scripted, and while a lot of the moves require athletic skill to pull off, the events themselves could be considered largely fictional.
Re: "early development hardware"
"3D alone won't guarantee success, look at the Nintendo 3DS."
But the 3DS has been very successful. It's sold over 50 million devices so far, during a time when there has been massive competition in the portable gaming market from cheap tablet and smartphone games. Granted, a lot of this comes down to people wanting Nintendo's games, but from a technical standpoint, the 3D display is a big selling point differentiating it from what tablets can do.
In any case, 3D visuals alone are not the major selling point of Hololens. Rather, it's the augmented reality provided by a wearable computer that can make objects appear and be interacted with in one's environment. The 3D visuals are just a part of that.
Re: Virtual Practicality?
"Peripherals would add so much to the experience. Imagine a reasonably realistic F1 steering wheel for an F1 game."
They have those, only they're not for use with Kinect, and are not $5 peripherals. A flimsy plastic steering wheel floating in the air is not going to provide a much more realistic racing experience than your bare hands floating in the air. They're certainly not going to "give the player the feeling that they are really driving an F1 car". For that, you're going to need something sturdier connected to a base, securely attached to a table. And if you're already doing that, it won't cost much more to add some basic sensors and a wire connecting it to the game system, removing the need for a Kinect entirely. There you have your typical racing wheel peripheral, readily available for PCs and consoles, starting at prices significantly less than a Kinect sensor. Cheap floating wheels 'could' be made for the Kinect if Microsoft allowed for it, but it would still be highly unlikely that they would provide as good of a racing experience as the console's standard controller.
I do agree that cheap peripherals for other types of games might work though. A hilt for sword games, or a stock for shooters would undoubtedly feel a lot better than just swinging your arm around and pointing. I suspect that the accuracy probably still wouldn't be quite up to the level of other motion controllers with built in sensors though, like the Wii MotionPlus, so Microsoft might not want such direct comparisons to be made.
Re: No. Just No!!!
The article is focused almost entirely on augmented reality, which involves projecting objects and information over your surroundings, without actually blocking out your view of what's around you. The headset discussed here isn't even intended as a consumer-level device, so it's probably not something your child will be using. Of course, there will be more-immersive VR headsets coming soon as well, which will block out one's view of the outside world, but you of course have the option of not buying one for your child if you think it could result in eye-poking.
On the other hand, I should point out that had you been wearing an AR headset yourself, it might have protected your eye from getting poked. : 3
Re: This wouldn't be (much of) a problem...
While I agree that a write-protect jumper or switch could help prevent remote attackers from updating a BIOS, it doesn't sound like it would help much at all in the scenario described in the article. A jumper is definitely something a maid or border official could handle within a minute or so. While the article describes it as being performed by someone "unskilled", this might not be an actual maid, but someone posing as one, who's had some practice performing this task. They'll know exactly where the jumper is for the target device, and how to get to it in an efficient manner. Even if they were a "complete noob", whoever put them up to it would have surely shown them how to do it. I doubt many maids will be randomly compromising BIOSes on their own.
And of course, the manufacturer isn't going to hide the jumper in some inaccessible location if they intend for people to actually apply patches. On a laptop or mobile device, it might be accessible from the battery compartment, or some other relatively convenient location.
Also, there should be no need to boot the device to verify that it worked. If there's only a few minutes available, it can simply be assumed that the patch worked. Otherwise, they can try again the next time an opportunity presents itself. If they happened to brick the device, its owner will probably just assume it broke in transit.
As for soldering in a new chip, that would obviously greatly increase the necessary time and skill requirements, as well as the failure rate. There's a pretty big difference between moving a jumper and soldering dozens of tiny pins in close proximity to one another. Again, the whole point of this is that it's something that can be done by someone with little training in a very short amount of time. And sure, there are many other ways a system could be compromised by someone with direct access, but not so many that would allow such relatively undetectable low-level hardware access.
Re: Cut to the chase
I kind of doubt printing your own microchips would really improve security. How do you verify that the microchip-printing machine wasn't compromised? Thoroughly examining each chip with a microscope? Good luck with trying to analyze the security of nanometer-scale circuits. Maybe for the most basic, pocket-calculator type chips it could work, but it will be next to impossible for any advanced chips as they continue to become more compact and complex.
They should have gone with hovertoasters...
People will pay to fund anything thrown on kickstarter, so long as it makes sufficiently sensationalist claims. It's an interesting project, but the hoverboard itself seems unlikely to ever become an actual product.
Compared to a simple skateboard, it's deficient in practically every way. Its biggest issue is that it can only function on certain metal surfaces, whereas the main draw of the fictional hoverboard that it's trying to play off of was that it could operate smoothly over any type of surface. The chances of there ever being specially designed copper-plated skate parks to use them in seem incredibly slim. It also appears to be much heavier and far less maneuverable than a skateboard. They also seem to have conveniently silenced the horrible ear-piercing screeching noise it makes during operation in their kickstarter videos. What's the point of a hoverboard if it's clearly worse in every way than a piece of wood with wheels on it?
As for using it to stabilize buildings, I can't see how hovering a building would work any better than existing earthquake-proofing techniques used in modern structures, and it would probably just increase the potential for failure. And that's assuming you can even reliably hover a skyscraper over its foundation. There's probably some use for their electromagnets somewhere, but they don't seem to have any clear idea of what that might be.
Then there's this quote from their "Will it ever be affordable" question...
"Look at computers - only 15 years ago memory cost around $100 per Gigabyte; now its around $.01!"
What kind of memory are they talking about? RAM was around $1000 per GB 15 years ago, and is around $10 per GB now, while hard drives were around $20 per GB, and are now around $0.04 per GB. Neither of those add up to anywhere near the reduction in price they suggested for that time frame. And of course, their product isn't even the kind to benefit from cost-reduction due to miniaturization, making it a terrible analogy.
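The mismatch is easy to check. Using the rough price figures above (all approximations), the claimed drop is an order of magnitude or two larger than what either RAM or hard drives actually managed over that period:

```python
# Sanity-check the Kickstarter's "$100/GB to $0.01/GB" claim against
# the rough price figures from the comment: (price then, price now) per GB.
claims = {
    "Kickstarter claim": (100.0, 0.01),
    "RAM":               (1000.0, 10.0),
    "Hard drive":        (20.0, 0.04),
}
for name, (then, now) in claims.items():
    print(f"{name}: {then / now:,.0f}x cheaper over 15 years")
```

The claim implies a 10,000x reduction, versus roughly 100x for RAM and 500x for hard drives.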
Re: Windows 7 with a flat theme
"But on the whole it just looks like W7 with a flat theme. (This flat scheme and colours thing is really off-putting and really not encouraging)."
This probably isn't the final theme. I imagine that they'll want it to look at least a little more different from Windows 8 to help distance it from that release. Perhaps there will be a new look ready to show off by the consumer preview in the spring.
It might not have a user-serviceable battery, but the large capacity should give it more room to degrade before that becomes an issue. Years down the line, when other smartphones only last half a day before needing a charge, this one should still be able to make it through the entire day with no problem.
Re: "keyboard doubles as a capacitive multitouch trackpad"
They are physical, moving keys. You should be able to tell they're not flat just by seeing how the light reflects off them in the photos here. Of course, this is a phone keyboard, so the keys aren't raised all that much, and they shouldn't get in the way of swiping. You can do a quick search on your favorite popular video sharing site if you want a video.
It's the Most Perfect Feature Yet!
Apple's patented Dynamic-Bend technology is what makes the new iPhone more perfect than ever! Apple noticed that the size and shape of lesser portable-tablets would cause discomfort among wearers of tight pants, which happen to make up a majority of their users, so they held off on releasing one of comparable size until they had its ergonomics perfected. Now, the new iPhone conforms to you, perfectly matching the curves of your body. This requires no complex setup by the owner, but intelligently matches one's shape over the course of a few days. Simply put, "it just bends".
Re: can it run Crysis?
Judging by the video they provided, Printer-Doom looks more like several colors of barely recognizable noise past the splash screen, so I wouldn't exactly say that the printer can run it 'playably'. I suppose what performance is there is so that it can serve its web interface relatively smoothly though.
Re: "running most game CPU usage sits at < 20%"
"If the virus writer is at all intelligent, he will know that gamers tend to notice when their game doesn't run smoothly, and will have written his virus to use the GPU when no game is running - in other words, the rest of the time."
The majority of people downloading these games likely have a dedicated graphics card in their system, and most gaming cards have a fan that audibly ramps up under load. So, the user is bound to notice this and in most cases hunt down the cause relatively quickly. It would probably go unnoticed by some, particularly those with lower-end hardware, but it's not likely to get past those who would notice a reduction in FPS while gaming.
In any case, the post was talking about the idea of companies legitimately making their games available for "free" by including mining routines in the software, as an alternative to ads or micro-transactions. It might not be a bad idea in theory, but ultimately you would still be paying for the game in the form of increased electrical use, and more importantly, the returns wouldn't be anywhere near enough to pay for the development of the game, let alone turn a profit. The developer likely wouldn't average more than a fraction of a cent per hour that the game was running on typical PC hardware, or significantly less still on smartphones or tablets, since the use of specialized mining hardware has pretty much made Bitcoin mining on a standard PC obsolete.
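For a sense of scale, here's a back-of-the-envelope estimate of expected GPU mining revenue. Every number below is an assumption chosen for illustration (a hash rate typical of a high-end GPU, an ASIC-dominated network hash rate, and a plausible exchange rate for the period), not real data:

```python
# Back-of-the-envelope Bitcoin mining revenue for one gaming GPU.
# ALL figures are illustrative assumptions, not measurements.
gpu_hashrate = 500e6        # ~500 MH/s for a high-end GPU
network_hashrate = 300e15   # ~300 PH/s network, dominated by ASICs
block_reward_btc = 25.0     # block subsidy at the time
blocks_per_hour = 6         # one block every ~10 minutes
btc_price_usd = 250.0       # assumed exchange rate

share = gpu_hashrate / network_hashrate
usd_per_hour = share * blocks_per_hour * block_reward_btc * btc_price_usd
print(f"Expected revenue: ~${usd_per_hour:.8f} per GPU-hour")
```

Under these assumptions the expected return is on the order of hundredths of a cent per hour, well below the electricity cost of running the GPU, which is why the "free game funded by mining" model doesn't add up for legitimate developers.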
However, the idea could potentially be profitable for those attaching malware to pirated software, since they don't actually need to pay for the development or distribution of that software, nor do they have to pay for the electricity. They probably won't make large amounts of money from it, but installed on thousands of PCs, they could pull in some extra cash on the side.
A difficult economic climate...
"Sorry guys, but due to current market conditions, we can no longer afford to keep you on our staff. We value you as employees, and have tried everything in our power to retain your positions, but the money just isn't there. As a parting gift, feel free to pick up a free copy of Minecraft as you exit the front door."
How can you people go on about the use of the term 'billion' and not notice the more glaring error in the article...
"3.5 by 4 kilometres wide – or to use the Reg standards converter, it's 29 by 25 Brontosauruses."
This conversion would place the length of a brontosaurus at around 140 meters, when they are actually only estimated to have averaged around 23 meters or so in length. Even their largest known cousins didn't reach half that long. If the Register intends for brontosauruses to become a worldwide standard for measurement, they need to more clearly define their size.
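Working the article's own numbers backward makes the error obvious. A quick sketch (using the article's figures and the ~23 m average length estimate mentioned above):

```python
# Implied brontosaurus length from the article's conversion:
# "3.5 by 4 kilometres ... 29 by 25 Brontosauruses"
length_m_per_bronto = 3500 / 25   # 140 m (4000 / 29 gives ~138 m, consistent)
actual_bronto_m = 23              # rough estimated average adult length

print(f"Implied length: {length_m_per_bronto:.0f} m")
print(f"Oversized by:   {length_m_per_bronto / actual_bronto_m:.1f}x")
```

So the Reg's unit comes out roughly six times too long.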
Re: Why is Win 8 and Win 8.1 seperated?
"According to netmarketshare their desktop OS market share data (cited in the article) covers all versions of Windows, with no exceptions being given, which seem to indicate that it includes Windows 8 phone..."
The key word is "Desktop". They have a separate list for mobile/tablet marketshare, in which the OS is specified as "Windows Phone OS 8.0" and "Windows Phone OS 8.1", so I can't imagine they would count them twice for both lists. The "Desktop" list might potentially include Windows tablets, but Windows isn't exactly a major player in that space, so it would only throw off the results by a very small amount. How the results are obtained would cause much more inaccuracy, which I touched on a couple posts up.
Re: 8 / 8.1
Windows major releases aren't really getting any more frequent yet as far as I can tell. Vista was released in 2006, 7 was released in 2009, 8 was released in 2012, and it stands to reason that 9 will be released in 2015. Aside from the 5+ years of XP, which wasn't their initial plan, and the occasional half-cycle releases like ME, they've been on a roughly 3 year release cycle ever since the very first versions of Windows. The numbering of the "8.1" service pack was really just an attempt to get people to take a second look at Windows 8 after its poor initial reception.
"The problem is though you pay for iOS / OS X through hardware purchases and Android is free, so unless Microsoft start giving away their OS I can't see many people being willing to pay to upgrade"
People pay for Windows through hardware purchases just as they do for OSX. The cost of OSX isn't 'free' after all, but incorporated into the cost of the device, just as it is for end-users purchasing Windows systems. Only a small minority of end-users actually upgrade existing Windows computers to the latest version, but that's how it's always been. "Fragmentation" of the install-base isn't really a problem outside the corporate environment, since new releases of Windows generally have very good backward compatibility with software, as far as operating systems go. And Android might be "free", but that's because Google is an advertising company profiting off their ability to track you and feed you ads, which the OS helps them to accomplish. It's not particularly well suited to a desktop or notebook productivity environment at this point either.
It is very possible that Windows may eventually move to a subscription model though, and might even offer such an option with their next major release. I would expect them to ease people into it though, like by offering the option to upgrade a computer to the latest version for $10 a year, with maybe some higher-priced tiers providing access to the latest versions of other software like Office as well. I can't see them forcing people into a subscription model right away though, as there will undoubtedly be many businesses and individuals who won't be interested in an operating system that's constantly changing. A subscription model could be a reasonable business plan for the future, as hardware takes longer to become obsolete for many users. That, or they could specify that all Windows computers require a non-serviceable battery pack with a five year life span. : P
Re: Why is Win 8 and Win 8.1 seperated?
8.0 and 8.1 arguably should be counted together for the purposes of the graphs here. It's a free update, and aside from the update method, it's not really very different from some of the relatively large changes made in service packs for prior Windows OSes.
HOWEVER, it should also be noted that it's extremely unlikely that both added together would reach anywhere remotely near the actual install base of active Windows XP systems. Remember that these numbers are derived from Internet stats based off trackers on certain public websites, so they're not going to include systems only accessible from internal networks, or those that see little to no web use. Additionally, the stats by Statcounter are based off total page views rather than unique visitors, so systems that see the most web browsing use, which tend to be consumer systems, will end up greatly overrepresented. Statcounter is also only used on around 2.5% of web sites, so it can't see the vast majority of the web. Meanwhile, Net Applications is installed on far fewer sites still, and it doesn't release much information on how it comes to its numbers, though they seem to be based more on estimates of unique systems rather than page views.
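The page-view weighting point is worth illustrating with a toy example. The population below is entirely hypothetical (made-up machine counts and browsing volumes), but it shows how lightly-used systems nearly vanish from page-view-based stats:

```python
# Toy illustration of page-view bias: (number of machines, page views each).
# Hypothetical numbers: many XP boxes that browse little, fewer Win8
# consumer boxes that browse heavily.
systems = {"XP": (1000, 5), "Win8": (200, 100)}

total_machines = sum(n for n, _ in systems.values())
total_views = sum(n * v for n, v in systems.values())

for os, (n, v) in systems.items():
    print(f"{os}: {n / total_machines:.0%} of machines, "
          f"{n * v / total_views:.0%} of page views")
```

In this made-up sample, XP is 83% of machines but only 20% of page views, so a tracker counting page views would report it at a quarter of its actual install-base share.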
Neither web-tracking service is likely to be a particularly accurate way to measure the relative install-base of various operating systems though. They undoubtedly overrepresent certain geographic regions and demographics of users, while underrepresenting others. The best you can say is that within the sample obtained, these are the results they came to, but without knowing exactly what that sample consists of or what those methods are, the results are of limited use.
If you know the sampling method and demographic on the other hand, the results can be a lot more meaningful. For example, Steam's Hardware and Software Survey likely provides a reasonably good representation of systems in use by the core PC gaming market. In that case, approximately 60% are on Windows 7, 28% on Windows 8/8.1, 5% on Windows XP and 2.5% on Vista. Additionally there's around 3% on OSX and 1% on Linux. This is useful information for someone looking to develop a game or piece of gaming hardware. Of course, it's not at all representative of the wider operating system install base, since it's almost entirely based on data from relatively new consumer systems used for gaming. Web trackers like Statcounter and Net Applications target a more diverse audience, but you still need to remember that they undoubtedly misrepresent worldwide systems as a whole by a pretty wide margin, and it's extremely difficult to determine just what demographics they do cover.
Drone delivery is so last year. Much better options are now possible. We're about to open a Kickstarter for a completely REVOLUTIONARY new delivery method, called Dropit. Inspired by FUTURE TECHNOLOGIES we wanted to complete a COMPLETELY DIFFERENT, ORIGINAL delivery system.
Currently we don’t have a fully working prototype, but here's a computer rendering of what we envision the device to look like...
Re: You've got it all wrong....
This is pretty much what I was thinking (right down to the title).
Google isn't interested in making deliveries. They're interested in getting up-to-date, high-resolution aerial photography without requiring a fleet of spy satellites. Toss a flock of drones with downward-pointing cameras over each major city, and you have maps that are constantly up to date. If they fly at a high enough altitude with zoom lenses to reduce distortion, they won't need to worry much about object avoidance either. They can just circle around the city all day like aerial Roombas.
Re: Dust removal
"Not scratching the surface" is kind of a silly argument. If the options were either to A) have the entire surface covered in dust and lose the rover, or B) have the solar cells get scratched up and lose 10% of their efficiency, I think it's obvious which the better option would be. I'm sure the dust buildup has a far greater effect on performance than scratches ever would.
Of course, the real reason was likely that their goal was to make a rover that would last at least 90 days, and within that time frame, dust buildup probably wasn't their biggest concern. During the initial planning stages, they may have figured that other components would be more likely to fail first, like from dust damaging motors and such. In that sense, the other components might be considered to have been overengineered, seeing as they held up far longer than necessary. And they probably did consider the possibility that gusts of wind might clear off dust, even if it wasn't something that was guaranteed.
In any case, actual "wipers" probably wouldn't have been the best solution anyway. Some sort of tiny air compressor or compressed air canister would probably work better. Or even just a clear plastic sheet covering each panel like a screen protector, that could be peeled back once with an attached wire in the event that dust became an issue.
Re: Lack of ambition...
That might work, aside from them being on opposite sides of the planet.
Re: What pleb uses Opera 12 still?
The "new Opera" is largely pointless. If you like it, then you might as well be using any other Chromium derivative, as it's ultimately just a re-skinned Chromium with a few extensions built-in and a few other features missing. The real reason Opera moved to Chromium was simply to save face when they discontinued their browser suite. Rather than admit that desktop Opera has been discontinued, they released a low-maintenance hack based off an open-source program to take its place while they focused on mobile. I wouldn't count on them ever restoring most old features of the browser. And really, the main thing Opera had going for it was its extensive feature set and customizable UI, both of which were dumped in the Chromium version. People stuck with Opera despite the occasional site incompatibilities because of its feature set. The "new Opera" might have roughly the same site compatibility as other browsers, but that's the only thing it has. It lacks almost everything that made the browser worth using over the competition.
"A quick look at their Desktop Blog comments, and it's a VERY small minority of users that are EXREMELY vocal."
It's probably a small minority because most others gave up hope and abandoned the thing long ago. I switched to FF as my primary browser last year, after using Opera for many years, and haven't really been visiting their development blog since. Just looking at various web stats, you can see that desktop Opera is on a continual downward trend in terms of its userbase, not counting the mobile browser. Most long-term users won't be interested in the new browser, as it's "Opera" in name only, and the previous versions can only hold up for so long. The only way I'd likely consider coming back to Opera as my primary browser is if they open-sourced their old desktop suite as "Opera Classic" or something.
I only have a couple Gigs of RAM on this older system, and unfortunately, Firefox does seem to be very wasteful of memory. This is more of a concern when you work with relatively large numbers of tabs, where the browser can end up using all your available memory rather easily. Tab grouping helps, but I get the impression that the browser is wasting some resources for tabs that aren't even loaded. On the other hand, Chromium-derivatives seem to be even worse at efficiently handling large numbers of tabs, so it's not like there's a decent alternative. Some older versions of Opera are far better at managing lots of tabs, but they're outdated and have performance issues in other areas at this point. It seems like after the big "browser war" rush of a few years back, things have kind of stagnated in terms of performance and features in recent years, despite there being lots of room for improvement.
Yeah, why should Google get all the money for that?!
Don't forget Opera. They took their awesome feature-packed Internet suite and replaced it with an almost featureless reskinned Chromium last year, effectively removing any reason to use it over any other Chromium-derivative.
Re: For some reason...
"...this reminds me of that other site (check it daily just to be sure!): http://www.hasthelhcdestroyedtheearth.com/"
That site is kind of disappointing compared to the LHC webcam site. : 3
They have a point...
Actually, in this case, you kind of have to agree with them, or at least agree that seriously applying for the Mars One "mission" is probably a bad idea. It seems extremely unlikely that Mars One will manage to land anyone on Mars in the next decade, if ever, and if they were to somehow do so, there's little chance that the "colonists" would remain alive for long. Even if they somehow manage to make the landings happen, there simply isn't enough time to perform the necessary tests to make sure things work as planned and people will be able to survive the trip as designed, let alone an extended stay afterward. Then there's the predicted budget, which might seem like a lot, but is really quite low considering what they claim to be trying to do. $6 billion to set up a manned colony on Mars? The Apollo program cost over $100 billion (adjusted for inflation) to send a handful of manned missions to the moon, which is a much easier feat to accomplish, and that was without the need to set up a colony or keep people alive for more than a couple weeks round trip.
The way this is likely to go down is that we'll see the reality TV show pop up within the next couple years, complete with nonsensical eliminations and voting and drama, and the show will last a few years, during which time the important parts of the mission necessary to make it happen will be continually pushed back and finally cancelled for lack of funding. I suppose if someone's an attention-whore who can pretend to be willing to risk their life to go to Mars, this could be a reasonable way to get attention, but for anyone else, it's probably mostly a waste of time.
Re: Audiophile's? Audionuts is more like it
That is why music should only be listened to naked in the center of an anechoic chamber buried a hundred meters underground.
I was thinking that as well, but a quick search turned up that the capsule was actually intended to be opened in the year 2000, likely as a sort of millennium event, but they apparently lost track of exactly where it was buried, and didn't want to dig up the whole yard. So, it actually ended up being buried much longer than intended. Here's an article I found from a few years back that provides more details...
Apparently there was a muffin and beer in the capsule too. For some reason, they didn't make the news headlines though.
Re: These are not the pixels you are looking for.
"Try quantifying bitrate and encoding system in a manner as easy as 'More Pixels' - it's hard to explain to someone nontechnical (almost all 4k TV buyers will be nontechnical) why their current HD picture looks so bloody awful."
For transmission quality, they could simply advertise the bitrate in megabits per second: 5 Meg, 10 Meg, etc., much like how broadband speeds are marketed and how the quality of downloadable music is advertised. Perhaps if someone wants to pay extra for their streaming video service to use a 30 Meg transmission, they can have that option.
Of course, this isn't really all that useful to TV manufacturers, as it relates to transmission quality rather than display hardware. For them, "FOUR TIMES THE PIXELS!" is the marketing pitch for 4k, whether or not people will ever see those pixels. Netflix might be planning to offer a 4k stream at 15 Mbps, but compressed Blu-rays at 1080p already offer twice that bitrate. The increased resolution might provide a slightly sharper image, but that will only really be noticeable if you sit very close to a large screen. At typical television viewing distances, you would need a massive wall-sized display to notice a significant difference, and that's not something everyone needs. The move to HD was a significant upgrade that provided a noticeable boost in quality on even average-sized screens, but 4k's potential benefits mostly apply to the high end, making widespread adoption unlikely in the near future.
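To put rough numbers on that comparison, here's a back-of-the-envelope sketch. The bitrates are the figures mentioned above; the 24 fps frame rate is my assumption for film content, and it deliberately ignores codec efficiency differences (a newer codec can do more with fewer bits):

```python
# Back-of-the-envelope: average compressed bits available per pixel per frame.
# 15 Mbps is the reported Netflix 4k figure; ~30 Mbps is typical of a 1080p Blu-ray.

def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    """Compressed bits available for each pixel of each frame, on average."""
    return (bitrate_mbps * 1_000_000) / (width * height * fps)

bluray_1080p = bits_per_pixel(30, 1920, 1080)   # ~0.60 bits/pixel/frame
netflix_4k   = bits_per_pixel(15, 3840, 2160)   # ~0.075 bits/pixel/frame

print(f"1080p Blu-ray: {bluray_1080p:.3f} bits/pixel/frame")
print(f"4k stream:     {netflix_4k:.3f} bits/pixel/frame")
```

So the 4k stream has to describe each pixel with roughly an eighth of the data the Blu-ray gets, which is why "more pixels" alone says little about picture quality.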
And their game was only successful because it was a generic clone of another successful game. A year or so from now, most of their players will have moved on to whatever generic game becomes popular next, and the chances of it coming from them are rather slim.
We recently got a new quiet energy-efficient dishwasher, and it works quite well. Sure, it takes around 2 to 3 hours depending on the settings and how long the sensors determine it should run, but the length of time generally shouldn't be a problem, since you'll typically be running a dishwasher between meals or overnight. It also offers a quick, less-efficient 1-hour mode, in case you need something right away. Whether you pre-rinse your dishes before loading them likely makes a difference in cleaning performance as well. Also, at least in the US, recent environmental regulations have made many detergents perform worse than they used to, so the powdered detergents generally aren't recommended anymore, but rather the tablets and gel-packs. The detergent formulas likely vary from country to country though.
Re: Fishnapper was right
I'd hardly say the fishnapper was right, since it was more likely the act of transporting them around in glasses and other conditions associated with the move that did them in. If they really thought there was a problem with the fish's accommodations, they'd have been better off discussing that with whoever was in charge.
Re: Lifetime plans
My thoughts while reading the article were that this sounded like a scam, or at least a very poorly thought out business plan if they actually intended it to be sustainable. I'm sure the company's founders made a lot of money off of it before things imploded, leading me to think their goal was simply to bring in quick money with an unsustainable business model while tricking their customers into thinking they'd have a service that would last them for years. At the very least, they could have given their users more than two weeks' notice that their "lifetime" plans would be ending. Even most free services give their users months of advance notice when shutting down. The company had to know for quite a while that their income was unsustainable.
Even so, anyone thinking of purchasing these plans should have been suspicious. One should be wary of a startup company offering "lifetime" access to its product. Really, the only time a "lifetime" model works for a business is when the product lacks significant recurring costs, or when the company can easily upsell users to another service in the near future.
@Daz555 "The new consoles from MS and Sony have finally made the move to a 1080p native resolution at a decent framereate and this is a huge jump from the low res, upscaled games of the PS3 and 360 generation."
No they haven't. The vast majority of games on the Xbox One are getting upscaled from 720p, and from around 900p on the PS4. Over the course of the console generation, as developers try to push more demanding graphics out of the hardware, these resolutions are likely to drop further, as will the framerates. The resolutions will likely stay higher than they did last generation, but still won't make use of the full resolution of standard 1080p screens for more than a handful of big games.
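For a rough sense of how much upscaling that involves, a quick sketch of the pixel counts (assuming the standard 16:9 dimensions for each resolution):

```python
# Pixel counts for common console render resolutions, relative to full 1080p.
# Standard 16:9 dimensions assumed for 720p, 900p, and 1080p.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

full_hd = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / full_hd:.0%} of 1080p)")
```

In other words, a 720p render is filling in less than half the pixels of a 1080p screen, and 900p about two-thirds; the scaler interpolates the rest.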
Of course, I agree with your main point, that even higher-resolution consoles won't be seen any time soon. As much as television manufacturers would like to sell people 4k screens to replace their existing 1080p ones, they simply aren't necessary in most viewing scenarios. Not everyone needs a massive screen covering their wall, and at more common television sizes and viewing distances, the difference in sharpness will be virtually invisible. It's hardly like the move from SD screens to HD ones. Because of this, it's unlikely that 4k source material like movies, television, and games will become common any time soon.
Yep, this exactly summarizes the problems the Wii U has faced. The Wii sold great because Nintendo was selling to a new market of more "casual gamers" in addition to its usual fans. The problem is, these people don't keep tabs on new gaming hardware. They mostly learned of the Wii through word of mouth, for which the console's name "Wii" was memorable and unique. From its first announcement, I could tell the new console's name was a mistake, since these more casual gamers would have difficulty differentiating it from the Wii. Even the console's logo looks pretty much identical to its predecessor's, aside from the barely identifiable U shape to the upper right of it. To them, the new console is just some sort of overpriced touchscreen add-on for the machine they occasionally play bowling and party games on. The benefits of such a controller aren't as easy to grasp as those of the original motion controllers, so they're unsure how exactly such a device offers anything over what they already have. And any of these casual gamers who did pick up the Wii U undoubtedly found it lacking in the game department. For a long time after launch, Wii Sports, the series that likely sold them the original console, was nowhere to be found. It took a year after the console's launch for Nintendo to release anything Wii Sports-related, and that only came in the form of two sports available for Wii Sports Club, where people are expected to buy or rent updated versions of the individual sports from the original game as they become available. So rather than having a pack-in Wii Sports title included with the console, these casual users are expected to figure out how to purchase them through Nintendo's online store.
Meanwhile, for those more familiar with gaming, the new console seems like a repeat of what the Wii had to offer them last generation. The hardware capabilities are much improved, but still only marginally better than what the prior generation of consoles from Sony and Microsoft had years ago. The controller is interesting, but there aren't really many games that make good use of it yet. And Nintendo remains a bit behind the other console makers in terms of non-gaming features. Many also recall the dearth of decent third-party games available for the Wii during its latter years, and suspect the same will happen with the Wii U, perhaps even sooner due to its slow initial sales. The only thing they expect to get out of it are Nintendo's core first-party games, and Nintendo hasn't shown them all that much on that front yet. They still haven't shown a new Zelda, Metroid, F-Zero, or Star Fox. It's certainly possible Nintendo might turn this around over the course of the next year though.
Of course, the Wii U still could benefit from another price drop as well. The current pricing is closer to the Wii's launch price, but the console lacks momentum at this point. I suspect there will be another price drop at least by the next holiday season. A basic Wii U for $199 USD or $249 for the premium version with a pack-in title would be a much more attractive price point for both casual gamers and those looking to get one as a second console. Even if Nintendo is losing money on each console sold, they'll make that back on the inevitable sale of a couple Nintendo exclusives, and it would help build their install base to keep the third-party developers from abandoning them.
Learn to read tables, Reg! : P
"It also pruned the predicted sales figures of its portable 3DS from 1.8 million to 1.35 million units."
I suspect you weren't paying much attention to the text at the top of the table that reads, "sales units in ten thousands". That should be a sales figure revision from "18 million to 13.5 million units", ten times the quantities you reported. While their sales prediction for the last year may have been a bit overly optimistic, the 3DS is actually selling quite well, especially compared to Sony's Vita, which is selling slower than even the Wii U. The 3DS has already sold around 43 million units worldwide over the last three years, and is definitely still going strong.
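The conversion itself is trivial, but for illustration, here's a sketch of the arithmetic. Note the raw table values of 1,800 and 1,350 are inferred from the corrected figures, not copied from Nintendo's table directly:

```python
# Nintendo's table reports sales "in ten thousands", so the raw figures
# must be multiplied by 10,000 to get actual unit counts.
def table_to_units(table_value):
    """Convert a ten-thousands table figure to actual units."""
    return table_value * 10_000

original_forecast = table_to_units(1_800)   # 18,000,000 units, not 1.8 million
revised_forecast  = table_to_units(1_350)   # 13,500,000 units

print(f"{original_forecast:,} -> {revised_forecast:,}")
```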
Re: "Google has not been able to dredge up anything really useful for a long time now"
It's not just the results either. The interfaces of their sites are going through the floor. For everything they improve, something else gets made worse. And recently, they've been forcing unwanted Google+ integration into all their sites. Google has long since abandoned the simplicity and usability that made them popular to begin with in favor of coercing people into using their services to further their goals of monopolizing the connected world.