991 posts • joined 23 Sep 2009
"Fine, but you can only access it via shitunes. FAIL!"
Perhaps that's why it "doesn't got a lot of stuff on it"
@Ainteenbooty: "In what possible manner did you determine that an iPad SHOULD be used in portrait?"
Just look at the unlock screen, for one: it shows in landscape. Also, the camera (which should be on the top, right?) is only on the top in landscape. Otherwise your hand could potentially be blocking it (as it would be on the right or left, vertically centered). Then there's the whole connector-locations bit that points it toward being meant for landscape. If you want to get software-specific, there are one or two screens that don't rotate out of landscape mode.
If you notice something from the Netherlands case, the court ordered Apple and Samsung to reach a licensing agreement. What does this mean? It means APPLE WAS NOT PAYING ROYALTIES as it should have been. They've been using the 3G patents without having a license for them. Apple is definitely in the wrong there. Whether Samsung is asking a "fair price" is open for debate, but keep in mind the license amount has a range of "fair prices" set that they must abide by, so even picking the highest "fair price," it is still an agreed upon "fair price."
Remember? The iPad2 was said to have a 2048x1536 display too. Knowing Apple, if you take the best screen resolution available for Android near the time of release, then knock a bit off or assume same resolution, you're likely going to be right:
iPad2 = 1024x768
Samsung GalaxyTab 10.1 = 1280x800.
Motorola Xoom = 1280x800.
So, the iPad2 falls into the "knock a bit off" category. (yes the Tab "released" a few weeks after the iPad2, but the Xoom existed well before).
For the obvious:
A 500GB drive (assuming platter-density gains of 6x, as per the article) would scale to become a 3TB drive, based solely on the same platter size with the density gains.
Granted, for those who didn't catch this, the explanation will probably be lost on you as well.
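For the arithmetic-inclined, the scaling above works out exactly; a trivial sketch (the 6x figure is the article's assumption, not a measured value):

```python
# Same platter count and size, 6x areal density (the article's figure).
base_gb = 500
density_gain = 6
print(base_gb * density_gain / 1000, "TB")  # 3.0 TB
```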
"Samsung have not gone out of their way to place obvious logos on any of their recent devices"
You mean, besides the fairly large, obvious "SAMSUNG" logo they slap on the back of their device? Or if you're talking phones, the "SAMSUNG" logo they put on the front at the bottom... My Samsung Droid Charge looks nothing like an iPhone, esp with the big logo on the face....
Not only can you NOT play that, but you'll pay us $500 per device for the privilege of us telling you "no."
Rather than counting electric cars' pollution via the coal burned to generate their power, perhaps with nuclear reactors providing most of a country's power we could measure the cuts in pounds of waste (which is likely to be significantly less than the pounds of CO2 pollution just cited in the article).
VDI was designed to remove the cost of your end-points (including management and hardware) and put them in your datacenter. Considering each end-point in a "rendering" workload (read: Maya- or AutoCAD-type renderings, likely) saturates gigabit Ethernet to the desktop, why would you assume that a measly bonded-gigabit or quad-gigabit uplink would NOT get saturated when hosting tens of virtual desktops?
Careful though, the device is reasonably rectanglish and has rounded corners and a black frame surrounding the screen. Better not tip Apple off!
Just use your iPhone as the glorified iPod Touch that it is... you know, those things Android et al. users refer to as MP3 players? (iTunes converts the files, so they can't really be called MP3s at that point.)
Let's not forget that this was a 1.6GHz AMD vs a 2.3GHz Intel. AMD is already behind from a per-clock performance standpoint. This "comparison" was flawed from the start.
"Where he is taking the company"
Because the world would be a better place if all the Wintards moved to MacOS? Sorry, I'll stick with the lesser of two evils. (No, Linux people: Ubuntu or otherwise would not become the "next big OS." You know as well as I do they'd run to Apple because "their iPhone is hip.")
You did read that the memory reclaimed was due to a framework or methodology of detecting and squashing memleaks, right? It's not that they unload pages that are in the background or the like; they're simply fixing buggy code that leaves allocated memory floating unreferenced.
Another reason the lay computer user can't/won't use Linux. If they download their lovely Chrome installer, how are they going to know they have to go into the properties and mark the file as executable? Perhaps have a helpful info page: "You're running Linux! Let me show you how you can run this program. Oh, you're running XFCE? Here are the instructions. Oh, Gnome? Here are the OTHER instructions. Unity? Bah, open a terminal window (if you know how, or hit Ctrl+Alt+F2), and do a chmod u+x on the file (if you know where it is, likely somewhere in your ~ folder), then open it."
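For the curious, here's what that `chmod u+x` step boils down to, done programmatically in Python; the installer filename is hypothetical, and the script creates a stand-in file so it can run anywhere:

```python
import os
import stat

# Stand-in for a freshly downloaded installer (hypothetical name).
installer = "chrome-installer.bin"
open(installer, "wb").close()  # pretend the browser just saved it here

# Fresh downloads land without the execute bit, so double-clicking fails.
# This is the programmatic equivalent of `chmod u+x <file>`: add the
# user-execute bit to whatever permissions the file already has.
mode = os.stat(installer).st_mode
os.chmod(installer, mode | stat.S_IXUSR)

print(bool(os.stat(installer).st_mode & stat.S_IXUSR))  # True once marked executable
```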
More secure? Sure. It just saves people from themselves because it doesn't hold their hand while walking them into an oncoming train.... But only 25% of /actual computer users/ would even be able to use it to a decent degree.
I do give Apple credit at making a *nix box that is at least usable by the masses. I just don't like their "culture."
Manned Space Exploration
Manned space exploration is all well and good, but by focusing primarily on robotic exploration, to be eventually followed by manned exploration, we could do so much more. Robots can withstand Mars, likely Venus (but not for too long), Titan, and a dozen other places, and do so at astoundingly cheaper mission costs, even if we sent TWO of each robot just in case one fails/blows up/etc. We could do tens of missions simultaneously for the same cost as having an astronaut golf on the Moon....
Once we get a decent set of building bots on Mars putting together a habitat or digging/finding a hole, THEN we can send some 'nauts up to put boot-prints on top of rover tracks. This will likely be the best method of space exploration. We won't even mention how horridly overpriced the monopoly of space launches has been. Elon has shown us that.
Same "template" base?
Even if they use the same template, the resulting file will be different, so file-level matching based on hashing the encrypted data is out. It says if you share even a single slide between PPTs, they dedup it, which suggests block-level dedup, perhaps with some understanding of filetypes. Likely they "stripe" intelligent chunks (slides, pages, or simply data blocks) across HDDs and match those chunk signatures. They may not know what's in the file server-side, but I'm sure the client-side is quite aware, to pull this off.
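A minimal sketch of the block-hashing idea, assuming fixed-size chunks and SHA-256 signatures — the provider's actual chunking and hashing scheme isn't public, so every name and size here is illustrative:

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size chunks; a real system might use content-defined chunking

store = {}  # hash -> block payload (simulates the deduplicated block store)

def dedup_write(data: bytes) -> list:
    """Split a blob into blocks, store each unique block once,
    and return the list of block hashes (the file's 'recipe')."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # only stored if unseen
        recipe.append(digest)
    return recipe

# Two "presentations" sharing one identical slide-sized chunk:
shared_slide = b"S" * BLOCK_SIZE
deck_a = shared_slide + b"A" * BLOCK_SIZE
deck_b = shared_slide + b"B" * BLOCK_SIZE
dedup_write(deck_a)
dedup_write(deck_b)
print(len(store))  # 3 unique blocks stored, not 4
```

The shared slide is stored once; each deck's recipe just references it by hash.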
New Security Policy
Put all prototypes on a Kensington lock lanyard. But don't give the employee the code.
Done and Done.
Question of a question
"do typical variations in the sun's output have a greater or lesser effect on global climate than the accumulation of CO2 from all that fossil burning we're doing?"
One hates to answer a question with a question, but just consider what the CO2 in the atmosphere causes to be "trapped?"
There's a couple "run a full Win7/Linux" tablets out there. Two of the better ones are the MSI Windpad 110W and the Fujitsu Q550. Check them out before complaining about "glorified phone OSes" for your only options.
Trade dress or no....
"Samsung didn't say how much this spec will cost you, but you can save a bit by settling for a 14in screen or a Core i5 CPU."
Trade dress or no, it will likely "save a bit" compared to buying the Real Thing(tm).
Ribbons and statistics
Since the "most used" commands are likely Ctrl+C and Ctrl+V and the Delete key, I'm sure that will skew the weights for how often New Folder is selected from the context menu or Send To -> USB drive. I know that I WON'T use a fancy button on a ribbon for cut and paste, just because it's the "most used" commands or hotkeys. Newbies might love the visible button, and I know it will be a great feature to add to (listen Microsoft!) the touch GUI (yes, imagine that: a GUI designed for touch, as opposed to the "regular" GUI for keyboard+mouse). But we know MS doesn't listen to anything but TechNet subscribers....
"...and Apple's Tim Cook, were approached by AMD and rejected the position..."
Well, we now know why Tim Cook rejected AMD's offer....
Space remaining is simple enough to see. In Win7 Explorer, click "computer" and it shows the drives in the system with a bar showing used vs free space. Click on any of the drives and it shows "Space free:" in the status bar. Not as simple as always showing X free diskspace in the status bar all the time, but usually you should have an idea of how much disk space you have left....
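If you'd rather not trust the status bar, the same figure is one stdlib call away. A quick sketch using Python's `shutil.disk_usage` (the path queried is illustrative):

```python
import shutil

# Query total/used/free space for the volume containing the given path.
usage = shutil.disk_usage("/")
print(usage.free // 2**30, "GiB free of", usage.total // 2**30, "GiB")
```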
What I hope they improve is the file renaming, so if you have, for instance, a bunch of photos named "DCIM_0001", "DCIM_0002", etc, you could select all of them and rename them to end up as "Birthday2010_0001", "Birthday2010_0002" (retaining the original numbering in the part of the filename you didn't change), rather than renaming them all to your new name + _00X. It's even worse when you already have them all named, but wanted to add a short prefix to all 100 of them....
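The rename being asked for is straightforward if you swap the stem and keep the numbering; a rough sketch with hypothetical filenames (it only builds the new names — the actual `os.rename` calls are left out):

```python
import re

# Hypothetical filenames: replace the "DCIM" stem with a new one
# while keeping each file's original numbering intact.
names = ["DCIM_0001.jpg", "DCIM_0002.jpg", "DCIM_0003.jpg"]

renamed = [re.sub(r"^DCIM", "Birthday2010", n) for n in names]
print(renamed)  # ['Birthday2010_0001.jpg', 'Birthday2010_0002.jpg', 'Birthday2010_0003.jpg']
```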
At least it has microUSB, miniHDMI, and an SD card slot. At the same price point as the Samsung Galaxy Tab 10.1, if the screen is as nice, this might lure me away....
Suing for using "F150" for a car name, I can give Ford that. But suing because a car is black with chrome accents and has "aerodynamic" (read: "rounded") corners and you drive it with a circular device (steering wheel) and a gear-shift.... I propose that all car manufacturers sue for injunctions against each other for obviously stealing each other's "look'n'feel" and failing to be "innovative." Perhaps when we're all riding in horse-and-buggies again, bureaucrats will wake up and realize that the world could be a better place without such petty wastes of (consumer) money.
(yes, it is us that eventually have to pay for all these lawyers)
"will know that the Apple device will work, and be easy to use"
I think the biggest thing Apple has going for them is the iPod. A lot of people have/had one. They're used to being forced to use iTunes to manage their device. They like the idea of their iPad having the same interface, being able to use the same apps, etc. Everyone thinks "iPad" when they think of a tablet now, simply because it's "the thing to have." I've talked to a lot of people that want to use a tablet for everything an iPad CAN'T do (very well, at least...), but they still want the iPad because they think it will do it "best." They won't even hear alternatives. It's quite sad, really... like trying to give someone purified tap water as opposed to the bottled water at the store... "It's just not my <insert brand here>"
You mean plug-in developers have to, in advance, test their plug-in semi-regularly (every 6 weeks) on a beta build of Firefox and release at least minor updates to keep up with the current compatibility versioning? Heaven forbid! 10-year-olds do as much and more with their World of Warcraft addons.
Even better, IE needs to have this as an auto-disabled option. To make it better still, make users have to click through a non-auto-popup menu to actually enable the toolbar. People who end up with toolbars get them because they just have to hit Yes on a popup: and they always hit Yes.
Why not just isolate the video clip background? It should be the /exact same/ each time, unless it too is randomized (such as the bubbles example).
Either way, the high contrast and wildly different colors vs the background should make any program that is intelligent enough to pull individual frames out of the animation easily able to figure out the characters.
...when HDDs are on par with SSD performance perhaps? There's a reason Core i7s cost more than Core i3s. Same applies to the performance difference for SSDs vs HDDs. Horses for courses and all that.
"If Google/Samsun had bothered designing some slightly different icons, they wouldn't be in this mess"
Yes, because everyone knows a button should be a triangle, not a square with slightly rounded corners like the one I'm going to hit for "submit"... oh wait...
Likely, the "feature phones" or the even dumber ones would lose battery life having to encrypt/decrypt 128+bit comms. Perhaps this is a way to mask the battery drain from having signal towers too far apart?
@Tony: write wear
Since your TB written increased as your time range decreased, thus making your info fairly hard to parse, let's go off the assumption you write 14.5TB over a 3-month period of time...
14.5TB x 1.6 (for write amplification) x 1000 (convert to GB) / 120 (120GB Vertex 2 drive size) = 193.33 full drive writes every 3 months. 3000 (expected cell endurance) / 193.33 (full writes from above) * 3 (months: time period) = 46.55 months (3.88 years) expected life span on a 120GB Vertex 2 SSD, not accounting for overprovisioning. Put a couple of 240GBs in a RAID1 and you can double that and have redundancy. If you're waiting more than 5 years to replace your primary DB server disk drives, you don't care enough to bother with SSDs.
Lessons about "wear" and "ware" and "their" and "there" are for another post.
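For anyone wanting to check the arithmetic, here is the same back-of-the-envelope estimate as code; all inputs are the assumptions stated above (14.5TB per quarter, 1.6x write amplification, a 120GB drive, 3000 P/E cycles):

```python
# Back-of-the-envelope SSD endurance estimate.
tb_written = 14.5          # TB written over the 3-month window (assumed)
write_amp = 1.6            # write amplification factor (assumed)
drive_gb = 120             # drive capacity in GB
pe_cycles = 3000           # expected program/erase cycles per cell
months = 3                 # length of the observation window

full_writes = tb_written * write_amp * 1000 / drive_gb
lifespan_months = pe_cycles / full_writes * months

print(round(full_writes, 2))       # ~193.33 full-drive writes per quarter
print(round(lifespan_months, 2))   # ~46.55 months (~3.88 years)
```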
My modem uses USB.
"A FYI time travel is now officially impossible so don't pull that one"
Only time travel that requires going faster than the speed of light to achieve it. There are still other ways.
Indilinx is OCZ's "cheaper" controller; the one they put in their low-end drives. Thus, they're either moving up to replace SandForce in their high-end, or they're bringing the low-end up to spec with their current high-end drives in preparation for even more next-gen SandForce controllers.
Either way, should make acquiring a 120GB boot/progs drive a bit lighter on the wallet.
Who'd want to learn finances from a guy who supports giving out shovel loads of cash for horridly overpriced devices? Sounds like the same teacher the US Gov't had when they drafted the bailouts....
(Don't believe me on "overpriced"? Just look at their operating costs and profit margins)
"claims to have deciphering [sic] the final"
"only one of which has ever [sic] been decrypted."
See? Plenty to do with IT. Don't rely solely on your spell checker for your article. Try proof-reading it once in a while.
Don't use DRAM (like Intel) in the SSD. SandForce controllers use flash cells as their "workspace" so anything that's on the drive-side is essentially non-volatile. Now, whether the controller/hostOS knows how to handle recovering from a sudden power loss is anyone's guess....
Intel puts out 300mm wafers for their Core i-series chips. Does this mean each CPU is ~11.8in? Nope. The wafer gets cut up into the individual dies.
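The same point as rough arithmetic: a quick upper bound on gross dies per wafer, ignoring edge loss and scribe lines (the die area used is illustrative, roughly that of a quad-core Sandy Bridge):

```python
import math

# Rough gross-die count: wafer area divided by die area.
wafer_d_mm = 300   # standard wafer diameter
die_mm2 = 216      # illustrative die area, ~quad-core Sandy Bridge

wafer_mm2 = math.pi * (wafer_d_mm / 2) ** 2
print(int(wafer_mm2 // die_mm2))  # ~327 gross dies per wafer
```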
Probably the same IQ as Paris?
"I also note the "Two-year Microsoft® Office Personal license" - so what happens after 2 years? Do you have to buy the whole thing again?"
Answer is: likely, yes. Although I doubt you'll want to continue using Win7 on a 1.2GHz Atom after 2 years. If you do feel like continuing, just install LibreOffice instead. It will likely support Microsoft's next office format before MS makes a readability plugin.... (remember the conversion to docx?)
Even these new security measures won't protect Joe User from himself when he clicks the fake antivirus message which prompts him to run a downloaded file which then in turn installs the virus with user-level permissions, which can then use <insert vuln here> to escalate privileges (or simply be happy with user-level-priv keylogging) to install the "virus" (read: fakeAV or its ilk)
Budget constraints and performance metrics are part of the problem, yes. However, an even larger elephant in the room is that of compartmentalization. Often, you'll have a group of IT guys who specialize in one thing. You ask the "Network Infrastructure" guy to fix a syslog error on the Win 2003 server and he'll likely not know how. Ask the server guy to fix a syntax error on a jsp page and he won't know where to start. Diverse skillsets in even one person will help diagnose a "slowness" issue more than having 3 whole teams on the matter. Why? Because, chances are, they won't be able to see outside their scope. They may not even know you're running that tomcat webapp server in a VM that was only given a single vCPU without any MHz reservations, but happens to coexist on the same physical machine as the reporting server...
Any way we can get a RegCast transcript?
Mass Kinetic Energy
@bazza "I was thinking of the whole machine": Just how much mass do you think this heatsink has? From the picture, it's nowhere near the mass required to harvest kinetic energy to power a complete rig. Perhaps if it were on the order of Scythe's Mugen 2, then you might be able to help an SSD clear its cache (Intel SSDs and the like, since SandForce doesn't use DRAM caches). But to help the whole machine, you're looking at several kilos at the least.
Do you know nothing?
"What happens if the datacentre aircon fails and the SDDs get too hot ?"
I take it from this statement that you know nothing of SSDs. However, to feed the conversation for those honestly seeking answers: SSDs can work well within the temperature range a server room would hit if the "AirCon" goes out. I'd be more worried about your CPUs falling over and sending garbage to the SSDs than about the SSDs failing to function normally.
You must not be aware of how things are done in Open Source....
First, it is not "they" that need to support Linux-based OSes.
Second, Linux may very well get support (or at least better support) for IOMMUs before Windows.
If it's open, it's usually available. That's why nVidia drivers exist for Linux, but Radeon drivers are still "experimental": ATI didn't support an open driver interface.
Quick thought for you:
I'd rather have a few %CPU taken from an i5 (or X4, depending on your faith) than offload to what is likely a 400MHz chip (if it's lucky). Most hardware RAID cards bottleneck themselves by not being able to compute parity fast enough and not being able to handle the volume of data transfers. Even high-end RAID cards can't handle the throughput that software RAID can. If you don't believe me, go get two high-end RAID cards, stick 8 SSDs in RAID-0 on them, then software RAID-0 the two RAID volumes together. It scales quite linearly, which you wouldn't see if the software RAID were the bottleneck.
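To see why striping across member volumes scales with their count, here's a toy sketch of RAID-0 in software: stripes alternate across two byte buffers standing in for member volumes (stripe size and data are illustrative; real arrays use much larger stripes):

```python
# Toy illustration of RAID-0 striping: round-robin fixed-size stripes
# across N underlying "volumes" (here just byte buffers).
STRIPE = 4  # bytes per stripe; real arrays use 64KB+ stripe sizes

def stripe_write(data: bytes, volumes: int = 2) -> list:
    """Distribute data round-robin across `volumes` buffers."""
    vols = [bytearray() for _ in range(volumes)]
    for i in range(0, len(data), STRIPE):
        vols[(i // STRIPE) % volumes] += data[i:i + STRIPE]
    return [bytes(v) for v in vols]

a, b = stripe_write(b"AAAABBBBCCCCDDDD")
print(a, b)  # b'AAAACCCC' b'BBBBDDDD'
```

Each volume receives half the data, so sequential reads and writes can proceed on both members in parallel.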
"You'll be lucky, BTW, to get a Macbook Air for £650 after a year or have a Dell depreciate by only 40% of it initial value."
The reason? Line-up refreshes. In one year, you may get a speed bump (proc/RAM) or an HDD size upgrade for the MBA/MBP line-up. In six months, the Dell et al. laptop will have gone through 2 revisions, and in a year it will be shipping with the next-gen CPU (Ivy Bridge in this case). Who would want to shell out £350 for a second-rate, old-tech PC laptop when you could drop another £500 for the New Shiny, and still be barely more expensive than the original purchase cost of that MBA? Granted, if you're speaking portability, the MBA does get around fairly easily, but there are PC counterparts to that. The point is more along the lines of why you can still get last year's MacBook for a high price: you can't get anything (much) better now. They just don't move that fast. Hence why the MBA was using a slow, underclocked ultra-low-voltage Core2Duo (for three years) up until their SNB refresh a couple months ago.
A few corrections
As stated in an earlier reply by someone else, but with a lot more bias, you're wrong.
The new AMD kit is a mere $5-10 premium over the "on sale" i3-2100. If you compare vs the i3-2105 (with the HD 3000 gfx, rather than the far worse HD 2000 of the original i3-2100), the price matches exactly.
Platform is also a consideration, since the A75 boards, when features are compared, are generally cheaper than their 1155 counterparts. Not by much mind you.
Also, as a correction to the biased reply: it is true the i3 is a dual core with hyperthreading; however, hyperthreading gains a lot more than a mere 10% (the OP stated "2.2" factoring in the dual cores) performance increase. There's upwards of a 40-60% increase in threaded situations over running with hyperthreading disabled. What was never touched on is that the i3 cores, MHz-for-MHz, perform better than the AMD cores.
In the end, the i3-2105 (yes, better GPU than the 2100) has better performance in single-threaded or general light loads. The AMD A8 chip does better when you're taxing the system with heavily-threaded loads (far better, btw). However, you're not likely to do that unless you're encoding video or doing many things at once, at which point you bought the wrong CPU either way. The GPU core in Sandy Bridge can't even hold a candle to the A8's GPU core. The A8 has 2x the performance of the HD 3000, hands down. There's just no comparison. The only advantage the HD 3000 has is QuickSync. But then again, the A8 has DX11.
Who's the victor? Anyone who buys the $500 Walmart machines. Why? Because they'll have an AMD CPU.