73 posts • joined 11 May 2009
A difficult economic climate...
"Sorry guys, but due to current market conditions, we can no longer afford to keep you on our staff. We value you as employees, and have tried everything in our power to retain your positions, but the money just isn't there. As a parting gift, feel free to pick up a free copy of Minecraft as you exit the front door."
How can you people go on about the use of the term 'billion' and not notice the more glaring error in the article...
"3.5 by 4 kilometres wide – or to use the Reg standards converter, it's 29 by 25 Brontosauruses."
This conversion would place the length of a brontosaurus at around 140 meters, when they are actually only estimated to have averaged around 23 meters or so in length. Even their largest known cousins didn't reach half that long. If the Register intends for brontosauruses to become a worldwide standard for measurement, they need to more clearly define their size.
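To put rough numbers on it, here's a quick back-of-the-envelope check in Python (the pairing of dimensions to brontosaurus counts is my assumption, chosen to match the ~140 metre figure):

```python
# Implied length of one Register-standard brontosaurus, pairing
# 3.5 km with 25 brontos and 4 km with 29 (an assumption on my part).
implied_m = [3500 / 25, 4000 / 29]
print([round(x) for x in implied_m])  # -> [140, 138]

ACTUAL_BRONTO_M = 23  # commonly cited average length in metres
print(round(implied_m[0] / ACTUAL_BRONTO_M, 1))  # -> 6.1, i.e. ~6x too long
```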
Re: Why is Win 8 and Win 8.1 seperated?
"According to netmarketshare their desktop OS market share data (cited in the article) covers all versions of Windows, with no exceptions being given, which seem to indicate that it includes Windows 8 phone..."
The key word is "Desktop". They have a separate list for mobile/tablet marketshare, in which the OS is specified as "Windows Phone OS 8.0" and "Windows Phone OS 8.1", so I can't imagine they would count them twice for both lists. The "Desktop" list might potentially include Windows tablets, but Windows isn't exactly a major player in that space, so it would only throw off the results by a very small amount. How the results are obtained would cause much more inaccuracy, which I touched on a couple posts up.
Re: 8 / 8.1
Windows major releases aren't really getting any more frequent yet as far as I can tell. Vista was released in 2006, 7 was released in 2009, 8 was released in 2012, and it stands to reason that 9 will be released in 2015. Aside from the 5+ years of XP, which wasn't their initial plan, and the occasional half-cycle releases like ME, they've been on a roughly 3 year release cycle ever since the very first versions of Windows. The numbering of the "8.1" service pack was really just an attempt to get people to take a second look at Windows 8 after its poor initial reception.
"The problem is though you pay for iOS / OS X through hardware purchases and Android is free, so unless Microsoft start giving away their OS I can't see many people being willing to pay to upgrade"
People pay for Windows through hardware purchases just as they do for OSX. The cost of OSX isn't 'free' after all, but incorporated into the cost of the device, just as it is for end-users purchasing Windows systems. Only a small minority of end-users actually upgrade existing Windows computers to the latest version, but that's how it's always been. "Fragmentation" of the install-base isn't really a problem outside the corporate environment, since new releases of Windows generally have very good backward compatibility with software, as far as operating systems go. And Android might be "free", but that's because Google is an advertising company profiting off their ability to track you and feed you ads, which the OS helps them to accomplish. It's not particularly well suited to a desktop or notebook productivity environment at this point either.
It is very possible that Windows may eventually move to a subscription model, and might even offer such an option with their next major release. I would expect them to ease people into it, perhaps by offering the option to upgrade a computer to the latest version for $10 a year, with some higher-priced tiers providing access to the latest versions of other software like Office as well. I can't see them forcing people into a subscription model right away though, as there will undoubtedly be many businesses and individuals who won't be interested in an operating system that's constantly changing. A subscription model could be a reasonable business plan for the future, as hardware takes longer to become obsolete for many users. That, or they could specify that all Windows computers require a non-serviceable battery pack with a five year life span. : P
Re: Why is Win 8 and Win 8.1 seperated?
8.0 and 8.1 arguably should be counted together for the purposes of the graphs here. It's a free update, and aside from the update method, it's not really very different from some of the relatively large changes made in service packs for prior Windows OSes.
HOWEVER, it should also be noted that it's extremely unlikely that both added together would reach anywhere remotely near the actual install base of active Windows XP systems. Remember that these numbers are derived from Internet stats based off trackers on certain public websites, so they're not going to include systems only accessible from internal networks, or those that see little to no web use. Additionally, the stats by Statcounter are based off total page views rather than unique visitors, so systems that see the most web browsing use, which tend to be consumer systems, will end up greatly overrepresented. Statcounter is also only used on around 2.5% of web sites, so it can't see the vast majority of the web. Meanwhile, Net Applications is installed on far fewer sites still, and it doesn't release much information on how it comes to its numbers, though they seem to be based more on estimates of unique systems rather than page views.
Neither web-tracking service is likely to be a particularly accurate way to measure the relative install-base of various operating systems though. They undoubtedly overrepresent certain geographic regions and demographics of users, while underrepresenting others. The best you can say is that within the sample obtained, these are the results they came to, but without knowing exactly what that sample consists of or what those methods are, the results are of limited use.
If you know the sampling method and demographic on the other hand, the results can be a lot more meaningful. For example, Steam's Hardware and Software Survey likely provides a reasonably good representation of systems in use by the core PC gaming market. In that case, approximately 60% are on Windows 7, 28% on Windows 8/8.1, 5% on Windows XP and 2.5% on Vista. Additionally there's around 3% on OSX and 1% on Linux. This is useful information for someone looking to develop a game or piece of gaming hardware. Of course, it's not at all representative of the wider operating system install base, since it's almost entirely based on data from relatively new consumer systems used for gaming. Web trackers like Statcounter and Net Applications target a more diverse audience, but you still need to remember that they undoubtedly misrepresent worldwide systems as a whole by a pretty wide margin, and it's extremely difficult to determine just what demographics they do cover.
Drone delivery is so last year. Much better options are now possible. We're about to open a Kickstarter for a completely REVOLUTIONARY new delivery method, called Dropit. Inspired by FUTURE TECHNOLOGIES we wanted to complete a COMPLETELY DIFFERENT, ORIGINAL delivery system.
Currently we don’t have a fully working prototype, but here's a computer rendering of what we envision the device to look like...
Re: You've got it all wrong....
This is pretty much what I was thinking (right down to the title).
Google isn't interested in making deliveries. They're interested in getting up-to-date, high-resolution aerial photography without requiring a fleet of spy satellites. Toss a flock of drones with downward-pointing cameras over each major city, and you have maps that are constantly up to date. If they fly at a high enough altitude with zoom lenses to reduce distortion, they won't need to worry much about object avoidance either. They can just circle around the city all day like aerial Roombas.
Re: Dust removal
"Not scratching the surface" is kind of a silly argument. If the options were either A) have the entire surface covered in dust and lose the rover, or B) have the solar cells get scratched up and lose 10% of their efficiency, I think it's obvious which would be the better option. I'm sure the dust buildup has a far greater effect on performance than scratches ever would.
Of course, the real reason was likely that their goal was to make a rover that would last at least 90 days, and within that time frame, dust buildup probably wasn't their biggest concern. During the initial planning stages, they may have figured that other components would be more likely to fail first, like from dust damaging motors and such. In that sense, the other components might be considered to have been overengineered, seeing as they held up far longer than necessary. And they probably did consider the possibility that gusts of wind might clear off dust, even if it wasn't something that was guaranteed.
In any case, actual "wipers" probably wouldn't have been the best solution anyway. Some sort of tiny air compressor or compressed air canister would probably work better. Or even just a clear plastic sheet covering each panel like a screen protector, that could be peeled back once with an attached wire in the event that dust became an issue.
Re: Lack of ambition...
That might work, aside from them being on opposite sides of the planet.
Re: What pleb uses Opera 12 still?
The "new Opera" is largely pointless. If you like it, then you might as well be using any other Chromium derivative, as it's ultimately just a re-skinned Chromium with a few extensions built in and a few other features missing. The real reason Opera moved to Chromium was simply to save face when they discontinued their browser suite. Rather than admit that desktop Opera had been discontinued, they released a low-maintenance hack based off an open-source program to take its place while they focused on mobile. I wouldn't count on them ever restoring most old features of the browser. And really, the main thing Opera had going for it was its extensive feature set and customizable UI, both of which were dumped in the Chromium version. People stuck with Opera despite the occasional site incompatibilities because of its feature set. The "new Opera" might have roughly the same site compatibility as other browsers, but that's the only thing it has. It lacks almost everything that made the browser worth using over the competition.
"A quick look at their Desktop Blog comments, and it's a VERY small minority of users that are EXREMELY vocal."
It's probably a small minority because most others gave up hope and abandoned the thing long ago. I switched to FF as my primary browser last year, after using Opera for many years, and haven't really been visiting their development blog since. Just looking at various web stats, you can see that desktop Opera is on a continual downward trend in terms of its userbase, not counting the mobile browser. Most long-term users won't be interested in the new browser, as it's "Opera" in name only, and the previous versions can only hold up for so long. The only way I'd likely consider coming back to Opera as my primary browser is if they open-sourced their old desktop suite as "Opera Classic" or something.
I only have a couple Gigs of RAM on this older system, and unfortunately, Firefox does seem to be very wasteful of memory. This is more of a concern when you work with relatively large numbers of tabs, where the browser can end up using all your available memory rather easily. Tab grouping helps, but I get the impression that the browser is wasting some resources for tabs that aren't even loaded. On the other hand, Chromium-derivatives seem to be even worse at efficiently handling large numbers of tabs, so it's not like there's a decent alternative. Some older versions of Opera are far better at managing lots of tabs, but they're outdated and have performance issues in other areas at this point. It seems like after the big "browser war" rush of a few years back, things have kind of stagnated in terms of performance and features in recent years, despite there being lots of room for improvement.
Yeah, why should Google get all the money for that?!
Don't forget Opera. They took their awesome feature-packed Internet suite and replaced it with an almost featureless reskinned Chromium last year, effectively removing any reason to use it over any other Chromium-derivative.
Re: For some reason...
"...this reminds me of that other site (check it daily just to be sure!): http://www.hasthelhcdestroyedtheearth.com/"
That site is kind of disappointing compared to the LHC webcam site. : 3
They have a point...
Actually, in this case, you kind of have to agree with them, or at least agree that seriously applying for the Mars One "mission" is probably a bad idea. It seems extremely unlikely that Mars One will manage to land anyone on Mars in the next decade, if ever, and if they were to somehow do so, there's little chance that the "colonists" would remain alive for long. Even if they somehow manage to make the landings happen, there simply isn't enough time to perform the necessary tests to make sure things work as planned and people will be able to survive the trip as designed, let alone an extended stay afterward. Then there's the predicted budget, which might seem like a lot, but is really quite low considering what they claim to be trying to do. $6 billion to set up a manned colony on Mars? The Apollo program cost over $100 billion (adjusted for inflation) to send a handful of manned missions to the moon, which is a much easier feat to accomplish, and that was without the need to set up a colony or keep people alive for more than a couple weeks round trip.
The way this is likely to go down is that we'll see the reality TV show pop up within the next couple years, complete with nonsensical eliminations and voting and drama, and the show will last a few years, during which time the important parts of the mission necessary to make it happen will be continually pushed back and finally cancelled for lack of funding. I suppose if someone's an attention-whore who can pretend to be willing to risk their life to go to Mars, this could be a reasonable way to get attention, but for anyone else, it's probably mostly a waste of time.
Re: Audiophile's? Audionuts is more like it
That is why music should only be listened to naked in the center of an anechoic chamber buried a hundred meters underground.
I was thinking that as well, but a quick search turned up that the capsule was actually intended to be opened in the year 2000, likely as a sort of millennium event, but they apparently lost track of exactly where it was buried, and didn't want to dig up the whole yard. So, it actually ended up being buried much longer than intended. Here's an article I found from a few years back that provides more details...
Apparently there was a muffin and beer in the capsule too. For some reason, they didn't make the news headlines though.
Re: These are not the pixels you are looking for.
"Try quantifying bitrate and encoding system in a manner as easy as 'More Pixels' - it's hard to explain to someone nontechnical (almost all 4k TV buyers will be nontechnical) why their current HD picture looks so bloody awful."
For transmission quality, they could simply use the bitrate in megabits per second, such as 5 Meg, 10 Meg, etc, much like how the quality of downloadable music is advertised, in terms similar to what's used for broadband. Perhaps if someone wants to pay extra for their streaming video service to use a 30 Meg transmission, they can have that option.
Of course, this isn't really all that useful to TV manufacturers, as it's related to transmission quality rather than display hardware. For them, telling people they get "FOUR TIMES THE PIXELS!" is what they're trying to market 4k on, whether or not people will ever see those pixels. Netflix might be planning on offering a 4K stream at 15 Mbps, but compressed Blu-Rays at 1080p already offer twice that amount of bandwidth. The increased resolution might provide a slightly sharper image, but that will only really be noticeable if you sit very close to a large screen. At typical television viewing distances you would need a massive wall-sized display to notice a significant difference, and that's not something everyone needs. The move to HD was a significant upgrade that provided a noticeable boost in quality on even average-sized screens, but 4k's potential benefits mostly just apply to the high end, making widespread adoption unlikely in the near-future.
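To make the bitrate comparison concrete, here's a rough per-movie size calculation (my own sketch, assuming a 2-hour runtime and a constant bitrate; the 15 and 30 Mbps figures are the ones mentioned above):

```python
# Approximate data per movie at a given constant bitrate.
def movie_gigabytes(mbps, hours=2):
    bits = mbps * 1e6 * hours * 3600  # megabits/s -> total bits over the runtime
    return bits / 8 / 1e9             # bits -> bytes -> gigabytes

print(movie_gigabytes(15))  # Netflix's claimed 4K stream -> 13.5 GB
print(movie_gigabytes(30))  # a typical 1080p Blu-ray rate -> 27.0 GB
```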
And their game was only successful because it was a generic clone of another successful game. A year or so from now, most of their players will have moved on to whatever generic game becomes popular next, and the chances of it coming from them are rather slim.
We recently got a new quiet energy-efficient dishwasher, and it works quite well. Sure, it takes around 2 to 3 hours depending on the settings and how long the sensors determine it should run, but the length of time generally shouldn't be a problem, since you'll typically be running a dishwasher between meals or overnight. It also offers a quick, less-efficient 1 hour mode as well though, in case you need something right away. Whether you pre-rinse your dishes before putting them in likely makes a difference in cleaning performance as well. Also, at least in the US, recent environmental regulations have made many detergents not perform as well as they used to, so it's generally not recommended to use the powdered detergents anymore, but rather the tablets and gel-packs. The detergent formulas likely vary from country to country though.
Re: Fishnapper was right
I'd hardly say the fishnapper was right, since it was more likely the act of transporting them around in glasses and other conditions associated with the move that did them in. If they really thought there was a problem with the fish's accommodations, they'd have been better off discussing that with whoever was in charge.
Re: Lifetime plans
My thoughts while reading the article were that this sounded like a scam, or at least a very poorly thought out business plan if they actually intended it to be sustainable. I'm sure the company's founders made a lot of money off of it before things imploded, leading me to think their goal was simply to bring in quick money with an unsustainable business model while tricking their customers into thinking they'd have a service that would last them for years. At the very least, they could have given their users more than two weeks notice that their "lifetime" plans would be ending. Even most free services give their users months of advance notice when shutting down. The company had to know for quite a while that their income was unsustainable.
Even so, anyone thinking of purchasing these plans should have been suspicious. One should be wary of a startup company offering "lifetime" access to its product. Really, the only time a "lifetime" model works for a business is when the product lacks significant recurring costs, or when the company can easily upsell the user to another service in the near future.
@Daz555 "The new consoles from MS and Sony have finally made the move to a 1080p native resolution at a decent framereate and this is a huge jump from the low res, upscaled games of the PS3 and 360 generation."
No they haven't. The vast majority of games on the Xbox One are getting upscaled from 720p, and from around 900p on the PS4. Over the course of the console generation, as developers try to push more demanding graphics out of the hardware, these resolutions are likely to drop further, as will the framerates. The resolutions will likely stay higher than they did last generation, but still won't make use of the full resolution of standard 1080p screens for more than a handful of big games.
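For anyone curious about the actual pixel counts behind those render resolutions, a quick sketch:

```python
# Pixel counts for the common render resolutions mentioned above.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Native 1080p pushes 2.25x the pixels of 720p and 1.44x those of 900p,
# which is why hitting it at a steady framerate is such a stretch.
print(pixels["1080p"] / pixels["720p"])  # -> 2.25
print(pixels["1080p"] / pixels["900p"])  # -> 1.44
```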
Of course, I agree with your main point, that even-higher resolution consoles won't be seen any time soon. As much as television manufacturers would like to sell people 4k screens to replace their existing 1080p ones, they simply aren't necessary in most viewing scenarios. Not everyone needs a massive screen covering their wall, and for more common television sizes and viewing distances, the difference in sharpness will be virtually invisible. It's hardly like the move from SD screens to HD ones. Because of this, it's unlikely 4k source material like movies, television and games will become common any time soon.
Yep, this exactly summarizes the problems the Wii U has faced. The Wii sold great because Nintendo was selling to a new market of more "casual gamers" in addition to its usual fans. The problem is, these people don't keep tabs on new gaming hardware. They mostly learned of the Wii through word of mouth, for which the console's name "Wii" was memorable and unique. From its first announcement, I could tell the new console's name was a mistake, since these more casual gamers would have difficulty differentiating it from the Wii. Even the console's logo looks pretty much identical to its predecessor's, aside from the barely identifiable U shape to the upper right of it.

To them, the new console is just some sort of overpriced touchscreen add-on for the machine they occasionally play bowling and party games on. The benefits of such a controller aren't as easy to grasp as those of the original motion controllers, so they are unsure of what exactly such a device will offer over what they already have.

And for any of these casual gamers who did pick up the Wii U, they undoubtedly found it lacking in the game department. For a long time after launch, Wii Sports, the series that likely sold them the original console, was nowhere to be found. It took a year after the console's launch for Nintendo to release anything Wii Sports related, and that only came in the form of two sports for Wii Sports Club, where people are expected to buy or rent updated versions of the individual sports from the original game as they become available. So rather than having a pack-in Wii Sports title included with the console, these casual users are expected to figure out how to purchase them through Nintendo's online store.
Meanwhile, for those more familiar with gaming, the new console seems like a repeat of what the Wii had to offer them last generation. The hardware capabilities are much improved, but still only marginally better than what the prior generation of consoles from Sony and Microsoft had years ago. The controller is interesting, but there aren't really many games that make good use of it yet. And meanwhile, Nintendo remains a bit behind the other console makers in terms of non-gaming features. Many also recall the dearth of decent third-party games available for the Wii during its latter years, and suspect the same will happen with the Wii U, perhaps even sooner due to its slow initial sales. The only thing they expect to get out of it are Nintendo's core first-party games, and Nintendo hasn't shown them all that much on that front yet. They still haven't shown a new Zelda, Metroid, F-Zero, or Starfox. It's certainly possible Nintendo might turn this around over the course of the next year though.
Of course, the Wii U still could benefit from another price drop as well. The current pricing is closer to the Wii's launch price, but the console lacks momentum at this point. I suspect there will be another price drop at least by the next holiday season. A basic Wii U for $199 USD or $249 for the premium version with a pack-in title would be a much more attractive price point for both casual gamers and those looking to get one as a second console. Even if Nintendo is losing money on each console sold, they'll make that back on the inevitable sale of a couple Nintendo exclusives, and it would help build their install base to keep the third-party developers from abandoning them.
Learn to read tables, Reg! : P
"It also pruned the predicted sales figures of its portable 3DS from 1.8 million to 1.35 million units."
I suspect you weren't paying much attention to the text at the top of the table that reads, "sales units in ten thousands". That should be a sales figure revision from "18 million to 13.5 million units", ten times the quantities you reported. While their sales prediction for the last year may have been a bit overly optimistic, the 3DS is actually selling quite well, especially compared to Sony's Vita, which is selling slower than even the Wii U. The 3DS has already sold around 43 million units worldwide over the last three years, and is definitely still going strong.
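For anyone wanting to double-check, the conversion is trivial once you apply the table's stated unit (this is just my illustration of the arithmetic):

```python
# Nintendo's investor tables report "sales units in ten thousands",
# so the figures shown need multiplying by 10,000.
UNIT = 10_000
old_forecast = 1_800 * UNIT  # the figure the table actually shows
new_forecast = 1_350 * UNIT
print(old_forecast, new_forecast)  # -> 18000000 13500000, i.e. 18M and 13.5M
```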
Re: "Google has not been able to dredge up anything really useful for a long time now"
It's not just the results either. The interfaces of their sites are going through the floor. For everything they improve, something else gets made worse. And recently, they've been forcing unwanted Google+ integration into all their sites. Google has long since abandoned the simplicity and usability that made them popular to begin with in favor of coercing people into using their services to further their goals of monopolizing the connected world.
Re: New year's resolutions
Is this a big potato or a small potato? What kind of potato is it? Does this include the skin, which you may or may not be eating? Google's quick result gives "163 calories" for "1 Potato medium (2-1/4" to 3-1/4" dia)". According to the detailed nutrition information off to the side, such a potato weighs "213g". Interestingly though, clicking through the first search result also provides the nutrition facts for a presumably equal-sized "1 potato medium (2-1/4" to 3-1/4" dia)", but says it's only 129 calories and weighs 173g. They at least note that these values include the skin though. So, which is closer to a typical-sized medium potato? Should I trust the site Google most recommends, which specializes in providing nutrition information for various foods, or Google themselves, who leave out key details and tack on a photo of a mysteriously deformed heart-shaped potato that looks like no potato I've ever seen before? Either way, you'll probably want to weigh your potato, then divide and multiply to extrapolate the estimated calories in your particular potato, since Google doesn't appear to let you add a specific weight to your search query to calculate that automatically.
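The divide-and-multiply step is simple enough to sketch; the 300 g weight below is purely a hypothetical example, and the two reference values are the conflicting ones quoted above:

```python
# Scale calories linearly by weight against a reference potato.
def potato_calories(weight_g, ref_calories, ref_weight_g):
    return weight_g * ref_calories / ref_weight_g

my_potato_g = 300  # hypothetical: weigh your actual potato
print(round(potato_calories(my_potato_g, 163, 213)))  # Google's figures -> 230
print(round(potato_calories(my_potato_g, 129, 173)))  # the site's figures -> 224
```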
Re: Bit spooky
It's been like this for years, on all platforms. How do you think achievements are tracked? If a game has an achievement for killing X number of enemies or using an item a certain number of times, those stats are constantly getting uploaded while you are playing the game. It doesn't matter if it's Sony, Microsoft or Valve; if you're playing games on a system that's online, they're recording the exact time periods when you are playing each game, along with all sorts of stats about how you play those games. If your user profile is publicly viewable, much of this data is available to anyone wanting to monitor and catalog it for themselves.
Re: Check out...
The "technical foul" penalties are not something added by Microsoft, but by the game's developer, for added realism and to make for a "more civilized online environment" according to the game's publisher. They are present in the PS4 version of the game as well, if you have the Playstation Camera, and were in last year's version of the game too, if you happened to have a Kinect for the 360. You also have the option of disabling them, by turning off voice commands in the game's options. I think it's a pretty amusing feature, actually. Putting it in more games could do a lot toward trolling all the twelve year olds who think they look big by swearing at others in online games. : P
Also, it sounds like that "Skype ban" thing was just a myth. What people can get banned for is uploading gameplay videos that get reported for excessive swearing, which is against the terms and conditions you agree to. That's not to say such calls aren't being recorded, but the same could be true for calls through any service.
I do agree about not liking the idea of having a camera and microphone always watching though. However, that's not just limited to Microsoft. Do you really trust the front-facing cameras and microphones found in smartphones and tablets?
Re: Tell us who has won .....
It's also possible that Microsoft and/or Sony have hundreds of thousands of consoles stockpiled in warehouses in an effort to artificially boost perceived demand. If your console is sold out at stores this holiday season, that will create the impression that it is in demand and flying off the shelves, even if the actual reception is a bit more lukewarm.
Re: Come on Gabe Newell @ VALVe
One thing a lot of people don't seem to realize about SteamBox is that it's based on Linux, and unless a game's developer releases a Linux port of their game on Steam, you won't be able to play it natively on the system. Native Linux support is still not all that common among big game releases. Valve offers Linux versions of many of their games, but even they haven't released ports of games like Portal 2 and CS:GO for Linux yet, though I suspect they'll have those by the release of SteamBox. But among games already released by publishers who don't have a vested interest in SteamOS, it's quite unlikely they'll invest the millions of dollars necessary to port their previously released games to Linux. So, most of your existing Steam library likely won't run on SteamOS. Apparently, there will be the option to run games on a Windows system and stream the video to your SteamBox over your home network, but this is obviously not an ideal solution.
Re: No thanks....
But if it doesn't go through their server, how are they supposed to record all your sound data, perform voice recognition on it and index the results for quick retrieval by interested parties? Without that, what good would this product be?
In case you didn't notice, many of the Register's headlines and articles use bits of language intended to be mildly amusing. And who are you to say that their typical readers are "technically illiterate"? It's a long-running technology news site, so I suspect that most of their regular visitors' level of scientific knowledge is above average. Think of it more as a somewhat nerdy person calling their friend a nerd in good humor, than someone doing so in a demeaning way.
Re: And a lovely player it was
I skipped version 3 of Winamp, as it performed poorly and had some functionality issues. Thankfully, Nullsoft abandoned it as well, and the following year released Winamp 5 based off version 2's codebase, but with most of version 3's additions replicated. There's also since been a "Winamp Lite" released, that cuts out many of the side features added over the years.
As for the other bundled garbage that can get installed with it, that just requires going through a custom install to avoid those things getting added. That can be a bit more tedious, but it's not like you really need to reinstall Winamp frequently. Many people get along fine using versions of the software that are years old. Existing versions of Winamp will continue to function fine for many years to come, so I expect it to continue to be used for quite some time into the future, even if it is no longer actively developed.
So, Apple made a smartphone that looked nice, so no one else is allowed to make a smartphone that looks nice? The iPhone was little more than a PDA with a phone built in, which had been done before. They just packaged it up to look nice and be a bit more consumer-friendly. If we get into their claims that aspects of their user interface were "stolen" by their competitors, it should be noted that Apple's own company was largely built on OS features "borrowed" from others. If it were some small company having its product design copied by a big corporation, I could see the complaint, but an industry-leading corporation should expect its competitors to iterate off of its product designs. It makes for a healthy, competitive market in which the consumer wins.
Re: 36 cameras
I was thinking the same thing. Using wide fisheye lenses, this could technically be accomplished with as few as six cameras. You would need high quality lenses to avoid blurriness at the edges though, which would increase the cost, and such lenses would need to protrude from the surface of the device, making them likely to get broken. 12, or maybe 16 cameras with just moderately-wide lenses would be more practical. However, I suspect the creators of the device were looking to keep costs down, and as such, they appear to be using lower-end off-the-shelf cell phone cameras, which tend to have a rather narrow field of view. In bulk, such cameras might only cost a few dollars or so each, so using 36 of them may actually cost less than using fewer cameras with wide-angle lenses added.
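As a rough sanity check on those camera counts, you can estimate how many cameras full-sphere coverage takes from each lens's field of view. This is purely a back-of-envelope sketch; the FOV figures and the 1.5x stitching-overlap factor are illustrative assumptions, not specs of the actual device:

```python
import math

def solid_angle(h_fov_deg, v_fov_deg):
    """Solid angle (steradians) covered by a rectangular field of view."""
    h = math.radians(h_fov_deg)
    v = math.radians(v_fov_deg)
    return 4 * math.asin(math.sin(h / 2) * math.sin(v / 2))

def cameras_needed(h_fov_deg, v_fov_deg, overlap=1.5):
    """Rough camera count for full-sphere coverage.

    overlap > 1 accounts for the stitching overlap real rigs need;
    1.5 here is an arbitrary illustrative factor.
    """
    sphere = 4 * math.pi  # full sphere in steradians
    return math.ceil(overlap * sphere / solid_angle(h_fov_deg, v_fov_deg))

print(cameras_needed(60, 45))    # narrow phone-style lens -> 25
print(cameras_needed(120, 120))  # wide-angle lens -> 6
```

With narrow phone-camera lenses you land in the dozens, which lines up with the 36 used, while wide lenses get you down to a handful.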
This is a neat idea, though the image quality doesn't seem particularly great. The combined resolution is good, but the images appear prone to exhibiting blown highlights and color casts. I also suspect that most tosses will result in unusably blurry images, and that the "tough clear plastic case" won't survive a missed catch over a hard surface. They really should have made the areas between lenses out of a soft foam or rubberized honeycomb material, seeing as the device is intended to be thrown. It does provide an interesting view with minimal effort though, so I'm sure many will be willing to overlook the fact that it's a $500+ camera with mediocre image quality. If one were willing to do some manual editing in an image stitching program, better results could be achievable with multiple shots from a regular camera on a tripod, though of course getting a top-down view would be harder, and you wouldn't be able to capture an instantaneous moment in all directions.
The distortion in that image is because they stretched a 360-degree view into a flat rectangular image. There's no way to get around that short of viewing the image in spherical form. It's similar to how a flat map of the world greatly distorts the shape and size of landmasses the closer you get to the edges, while a globe does not. For panoramas, a sphere can be simulated on-screen showing only a portion of the image at a time, such as with QTVR, or the similar tablet app shown in their video. So long as the field of view is set correctly, edge distortion shouldn't be noticeable when viewed that way. You can't expect a flat photograph showing everything around you without significant distortion, though.
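The stretch is easy to quantify: in an equirectangular image, a row of pixels representing directions at some angle above the horizon is stretched horizontally by 1/cos of that angle, exactly like latitude lines on a flat world map. A quick illustration:

```python
import math

def horizontal_stretch(latitude_deg):
    """Horizontal stretch factor at a given angle above the horizon
    in an equirectangular (flat 360-degree) projection."""
    return 1 / math.cos(math.radians(latitude_deg))

for lat in (0, 30, 60, 80):
    print(f"{lat:2d} deg from the horizon: {horizontal_stretch(lat):.2f}x stretch")
```

At the horizon there's no stretch at all, at 60 degrees up everything is doubled in width, and near the poles the factor blows up, which is why the top and bottom of such images look so smeared.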
They added that for the sole purpose of attempting to humanize their business venture. In reality though, they are cold, hard mimetic poly-alloy robots, sent back in time from the future to kill the leader of the human resistance.
Re: Helium is a finite resource
Helium is a finite resource just as natural gas is a finite resource. And aside from a small portion of natural gas deposits with elevated concentrations of helium, it typically only appears in relatively trace amounts. Once it escapes from containment, it rises to the upper atmosphere and is lost into space. Creating helium in a laboratory environment is prohibitively expensive for large-scale use. Sure, helium might not become rare enough in the next couple of decades to be an immediate issue, but eventually its price is bound to rise to a level that will impede its use for many scientific and industrial purposes.
Re: ...another completely inexperienced poster.
Sure, he may have been exaggerating, but you can't just look at the maximum continuous throughput as a measure of the relative performance between the two technologies either. Mechanical drives are still as abysmal as ever at access times, an area that hasn't really improved much over the years. If you're just using the drive to read huge files, this might not matter, but small files and random accesses will slow the drive's performance to a crawl. In terms of access times, SSDs can in fact be hundreds of times faster than traditional hard drives. The real-world performance results are not quite so extreme, but you're still looking at performance many times that of traditional hard drives for most purposes.
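To put some rough numbers on that (purely illustrative ballpark figures, not benchmarks of any particular drive): when every small read pays a full access-time penalty, throughput collapses to roughly access rate times block size:

```python
# Back-of-envelope: why access time dominates small random reads.
# The millisecond figures below are illustrative assumptions.

def random_read_mb_per_s(access_time_ms, block_kb=4):
    """Throughput when every read pays one full access-time penalty."""
    iops = 1000 / access_time_ms   # operations per second
    return iops * block_kb / 1024  # MB/s

hdd = random_read_mb_per_s(10)   # ~10 ms seek + rotation for a hard drive
ssd = random_read_mb_per_s(0.1)  # ~0.1 ms for a typical SATA SSD
print(f"HDD: {hdd:.1f} MB/s, SSD: {ssd:.1f} MB/s ({ssd / hdd:.0f}x)")
```

With those assumed access times, the hard drive manages well under 1 MB/s on scattered 4 KB reads while the SSD is about 100x faster, despite the two having broadly comparable sequential throughput.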
Of course, he is completely ignoring the fact that SSDs still cost many times as much per gigabyte, making them unsuitable for backup media or mass storage, which is something you could have pointed out. In situations where random read performance isn't an issue, the over 10x cost of SSDs probably isn't worth the price in most cases.
Re: do they get paid?
"would they be entitled to a slice of the pie?"
Yes, they are entitled to one slice of pie every other Tuesday, as provisioned by their meal-scheduling program.
Re: Now our Mars missions can look forward to on-demand 3D porn!!
Depending on the alignment of the planets, the ping time to Mars and back ranges between roughly 6 and 44 minutes at the speed of light, averaging around 25, so that wouldn't make for a particularly comfortable web browsing experience. Sure, you might be able to download a Blu-ray in 5 minutes, but you'll be waiting far longer than that for the video to start. "On-demand" will typically mean waiting close to half an hour from when you click play to when the stream starts on your end. Plus, you'll undoubtedly be sharing the connection with a bunch of filthy Mars-men, so expect your share of the bandwidth to be a fraction of that. Of course, there would likely be a colony-side datacenter that would continually download and cache videos, web sites and so on, but if you want anything that's not cached, expect to wait quite a while for it. And that's assuming whoever's in charge even lets you have a say in what gets downloaded.
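For what it's worth, the round-trip delay falls straight out of Mars's closest (~54.6 million km) and farthest (~401 million km) distances from Earth:

```python
# Round-trip light delay to Mars at closest and farthest approach.
# Distances are approximate published figures, in km.
C_KM_S = 299_792.458  # speed of light

def round_trip_minutes(distance_km):
    """Light-speed round-trip time for a given one-way distance."""
    return 2 * distance_km / C_KM_S / 60

closest = round_trip_minutes(54_600_000)    # ~54.6 million km
farthest = round_trip_minutes(401_000_000)  # ~401 million km
print(f"best case: {closest:.1f} min, worst case: {farthest:.1f} min")
```

That works out to about 6 minutes at closest approach and around 45 at the far end, and that's before any processing or queuing delay on either end.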
Even if it had been a $25 extra, it could have been a killer feature. According to a teardown analysis, the component cost of the Touch Cover was just around $16, making its $120 price tag almost pure profit. Or at least, it would have been, had the Surface actually sold. The cost to manufacture the Surface RT itself was estimated to be well under $300, so Microsoft could have priced it far more aggressively than they did. You can't expect to charge the same premium for a new product that the established leaders in the field with rabid fanbases are charging, especially when your platform has a significant lack of software compared to the competition. Had Microsoft priced the Surface RT around $400 from the start, with the Touch Cover either bundled or an inexpensive extra, they could have established themselves as a major contender in the field. The platform would have sold itself. Instead, they spent more on advertising the product than they made on sales.
Re: Unfortunately, there's nobody to trust right now
Of course, even if the standards are "thoroughly tested", that doesn't mean that governments won't be able to invest billions of dollars into finding vulnerabilities in the code that they keep to themselves, perhaps found through the assistance of quantum supercomputers or other technology that non-governmental organizations won't have access to, or might not even know exists. And who is to say that these organizations aren't secretly working with one government or another? And even if the encryption standards are seemingly "secure", what if the hardware manufacturers are doing things within their chipsets to purposely circumvent or weaken the encryption in a particular way?
Re: Steam OS for games, Google Docs / something else for the office work
I have doubts that Steam OS will be a suitable one-stop replacement for gaming on Windows any time soon. The big issue here is with backward compatibility, which has been a big draw of PC gaming over the years. Most existing games are simply unlikely to ever get ported to Linux, and hence, won't be running natively on Steam OS any time soon. The vast majority of new games are still only getting released on Windows, with a portion getting Mac support, but very few see a Linux release. When Steam OS launches, this is likely to improve for new releases, but publishers are not likely to invest in porting their existing games to the OS.
Those with large Steam libraries will likely find that 90% of their games won't run natively in Steam OS, and those games that do will be mostly limited to a subset of indie titles. Steam OS is slated to offer streaming capability that will allow you to run Windows games on another system and play them over a network, but this still requires you to be running a copy of Windows on one of them, and to have both systems running the game at once. Perhaps they'll eventually add an OnLive-like cloud service for game streaming, but again, such services have their limitations. And of course, until Steam OS (and desktop Linux in general) gets a decent install base compared to Windows among gamers, things like graphics card driver support are bound to be substandard, even if the OS is designed to run games more efficiently. Steam OS might be a decent solution for playing some of your PC games in the living room, and for doubling as a home theater PC, but I think it will be some time before most gamers will be willing to give up access to their Windows game libraries to switch to it exclusively.
Windows 8 will undoubtedly cause Microsoft to lose some users, but it's hardly going to be anywhere close to a majority, so Windows can't even remotely be considered 'dead'. It's more of a shot to the foot than a head shot. I do suspect Windows will lose more of its usage share in the years to come, but it's difficult to say how new management could affect that. The biggest advantage of Windows over its competitors is its backward-compatibility and the familiarity of its interface, which is an area where Microsoft clearly messed up with Windows 8. Even if drastic changes to the interface were to make for an improved experience, if you're forcing users to relearn how to use their systems, you've brought yourself down on par with the competition on that front. At the very least, Microsoft should have provided native 'classic interface' options to keep power users happy. They might make up a minority, but are vocal, resulting in a lot of bad press that is bound to turn off a lot of the more casual users. Of course, Microsoft might still turn this around with the next major release of Windows, as they did with Windows 7.
Re: HL3? What's the point?
"In fairness, you've got to compare it to other shooters at the time, and on that basis I think it holds up well." -Steve Crook
This is exactly what people need to keep in mind. Half-Life 2 came out nearly NINE years ago at this point, and had spent around five years in development prior to that, so of course parts of it are going to look dated by today's standards. When it came out though, it was a lot more open and interactive than most other shooters. Sure, the level layouts were rather linear, but Valve provided lots of interesting things to play with within those environments to give players unique ways to approach situations.
One of Half-Life 2's biggest selling points was the relatively large number of physics objects within the world, and your ability to interact with them using the gravity gun, explosives, and so on. Previously, furniture and other clutter within games was mostly static, while in Half-Life 2 it made for a dynamic source of cover, obstacles and improvised weapons. The game also brought interesting drivable vehicle segments into its gameplay, which was relatively rare among shooters at the time. And then there were friendly NPCs tied into the actual gameplay, which was something else that was pretty much unheard of in shooters. Sure, they lacked any sort of advanced AI, but they were much better than anything else seen in comparable games at the time.
Another thing the game did well was that it made the most of its linearity to provide a highly "cinematic" experience, more so than any other games I can think of from the time, expanding on what the original Half-Life had done before it. Since then, many other games have done the same, and it's become something of the norm, but this was rather innovative at the time.
"But after playing the Fallout series, I definitely don't want to bother with Halflife again!" -Brian Miller
Aside from the obvious fact that Fallout 3 came out four years after Half-Life 2, they're arguably not even in the same genre. As has been said, Fallout 3 is more an RPG (or rather CRPG, if you want to get technical), where the focus is much more on placing people in a sandbox environment and letting them do what they want, than on providing a cohesive story with careful pacing. Sandbox-style games can be great, but that doesn't mean there's no room for games that tell stories in a more controlled fashion. Both provide different kinds of experiences.
I suspect Half-Life 3 will contain more open environments, but will continue to be much more linear than something like Fallout. Fans of the Half-Life series aren't looking for such an open, non-story-centric experience where you could spend dozens of hours crafting thimbles and taking quests from locals, instead of saving the world or whatever it is Gordon Freeman is doing. That kind of gameplay simply doesn't fit what most people want from a Half-Life experience. I do think Half-Life 3 will again try to be best in its class at various aspects of gameplay though, but it's difficult to tell what those might be at this point.
It would be very "un-Valve-like" to make Half Life 3 an exclusive to Steam OS. In recent years, they've been pushing to make the Steam experience work smoothly across multiple platforms, so it seems highly unlikely that they would backtrack on that in an attempt to push the platform. And when it comes down to it, the SteamBox isn't even like a traditional console in that it's mainly just a free operating system and specifications for hardware manufacturers and end-users to put together their own systems with. Valve's profit comes from people using the Steam service to make purchases on these platforms, no matter which platform they decide to use. Of course, users of a system with a Steam-centric operating system will be more likely to purchase games and other content through Steam than through competing digital distribution services, so it is in Valve's best interest to get people using the platform. However, one of the big selling points of the platform is cross-platform compatibility with people's existing systems. Valve isn't expecting PC gamers to give up gaming on Windows anytime soon, as a large part of the Steam catalog is only available on Windows.
At most, Valve might give early access to the game on Steam OS, allowing users of the operating system to get their hands on it a day or so ahead of other users. I could also see Valve holding off a bit more on releasing the game for other consoles. Making it a launch title for Steam OS also seems likely, though there will no doubt be beta-test releases of the OS well before the release of Half-Life 3.
Re: By the time kids today are old enough to be pilots
"SA comes from high-end equipment as long as your target is out of sight. When it comes close to you, all your high-tech equipment becomes almost useless - and you would need a fairly complex 360° 3D system to remote fight in such a situation. Same when engagement rules ask you to identify your target visually."
That doesn't seem like something that would be a problem at all with existing technology. A "360° 3D system" is not something that would be prohibitively complex. It would simply involve taking multiple camera inputs and piecing them together into a single surround image. With multiple sensor domes on top of and below the aircraft, you could obtain a view in literally any direction from a virtual cockpit, unrestricted by the aircraft's body. This same unrestricted surround view could be transmitted in multiple vision modes as well, such as near-infrared and thermal imaging to see clearly in situations that a pilot's "eagle eye" would be completely blind to.
On the controller's end, the remote pilot could view the surround image using a high-resolution head-mounted stereoscopic display, with motion tracking to enable them to look in any direction just by turning their head. Resolution should not be a problem with such a system, and if the camera feeds provide a higher resolution than the display, they would have the ability to seamlessly zoom their view, effectively giving them telescopic vision. The pilot would likewise have the ability to view a zoomed-out, ultra-wide field view as well. There could additionally be one or more co-pilots viewing the same feed, able to look in any direction independently at will, assigning targets and so on. Each could have their view overlaid with a custom HUD providing them with relevant info.
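As a toy example of the zoom point: comparing the angular pixel density of the stitched feed against that of the display tells you how far the pilot could zoom before the feed runs out of detail. All of the numbers here are hypothetical, just to show the arithmetic:

```python
def max_lossless_zoom(feed_px_per_360, display_px, natural_fov_deg):
    """Zoom factor at which the display starts asking for more
    pixels per degree than the camera feed actually provides."""
    px_per_deg_feed = feed_px_per_360 / 360
    px_per_deg_display = display_px / natural_fov_deg
    return px_per_deg_feed / px_per_deg_display

# e.g. a hypothetical 15360-pixel-wide stitched surround feed viewed on a
# 2048-pixel-per-eye headset with a 90-degree natural field of view:
print(f"{max_lossless_zoom(15360, 2048, 90):.2f}x")
```

In other words, "telescopic vision" only goes as far as the surplus resolution of the camera feed; beyond that point the system is just upscaling.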
As for transmission lag, computer-assisted aiming and flight assists could largely make up for it, allowing the aircraft to automatically start reacting to a situation instantaneously. Of course, if the plane is doing something that it shouldn't, the remote pilot will still be able to correct it a fraction of a second later. Autonomous control of vehicles has come a long way in recent years, and it stands to reason that it should continue improving in the years to come. Even if an enemy managed to completely jam a remote-controlled aircraft's transmissions to its pilot, we're getting to the point where that aircraft could still fend for itself, or at least make a quick escape.
Of course, this would mainly apply to aircraft purpose-built with remote and autonomous piloting in mind, although systems could be experimented with on existing aircraft as well. I have little doubt that such systems are already being worked on and experimented with for eventual wide-scale use, and that we'll see those systems in place in the coming decades.
I believe I found the article you're referencing, but the guy didn't include a total estimated cost, just that he estimated it might cost around $1 more to produce than some comparable chargers for competing products, which sell for around $6 to $10. At the time, Apple was selling their charger for around $30. They still likely cost under $5 to produce, and even selling them at $10, Apple would be making a decent profit. If they really were concerned about people using third-party chargers, they would make $10 the normal price for them.
The guy also did a followup article comparing the performance of a variety of different chargers...
While the Apple charger fared well, so did all of the official chargers from competing products. They each performed better at some things and worse at others. The only ones that performed poorly all-around were the "counterfeit" Apple chargers, made to look like the official ones, but with the bare minimum hardware inside needed to convert voltages.
Also worth noting in the article is how Apple (and some other manufacturers) use non-standard voltages on the data lines of the USB connector to prevent many otherwise-compatible chargers from working with their devices. Of course, this doesn't do much to block the poorly made counterfeit chargers from working, since they simply copy that as well. It does make for an array of incompatible signals sent from chargers to devices though, making it less likely that you will be able to use one charger for multiple devices.
Re: A good opportunity to sell stock at cost and look big'n'caring
Actually, the connector isn't proprietary on the chargers. They may use a non-standard plug on the phone's end, but the other end of the cable is just standard USB. The charger, likewise, just has a standard USB plug, and even Apple's overpriced charger doesn't include a cable. It's just a simple AC to USB power converter.