A year or two ago, yes. Now? It's just too unstable.
Can't remember the last time I managed to crash it, myself. Flash I can crash, but that's easy to recover from.
2406 posts • joined 10 Jun 2009
Is it bad that I quite like this thing? I saw it the other day and the animation/design is pretty impressive. You have to admit, it's a step up from the 90s Microsoft ads. Who else remembers "W-w-w-Windows, Windows, 386!"
Time was, an N64, was an N64
Until you plugged the memory expansion module into it.
Hardware revisions have been going on for as long as there have been computers. No biggie.
Where are the games?
The creep toward major publishers putting games on Linux is starting. Metro: Last Light has just been put on Steam for Linux, Tripwire have been putting everything on Linux, as have Double Fine, Valve themselves, and Egosoft. All we need is for someone like Activision to start doing it, and a console might be that trigger.
Steam for Linux now has (I think) over 300 entries in the store.
For media playback, I assume; not having discs is the whole point of Steam, no?
Actually, Steam does support installing from disc, and games only ever get bigger, so optical disc transfer might still be viable, at least for this generation.
This wasn't already illegal? They're pushing for gay marriage, but it's still legal to sack people for being gay?
Do they re-use the pile of junk they have created?
Isn't the point of the exercise to find out whether they can?
Ever hear of the disposable society?
You know most people see that as a bad thing, right?
So you should throw an Android out after less than two years?
Yes, that's what they implied by saying they wouldn't put Kit-Kat on the phone. In fact, if you can't get Kit-Kat on your phone, it instead downloads an update that makes it explode, so best to throw it away first. There's no possible way it could keep working otherwise.
I can't replace the engine (myself) in my car either...
But you can replace the battery, surely? Or swap out a wheel when you get a flat? Or change the oil?
No-one's asking to be able to completely disassemble the motherboard and replace the A7 (the correct comparison to a car engine), just to be able to have some serviceable parts, especially the low cost ones that fail most often.
How often do you take your phone or tablet apart?
If it's broken? Pretty regularly. I'll happily rip a gadget to pieces if I can fix it myself.
It's one of the most polished and modern-looking alternative OSes out there, too.
*Glances at screenshot.*
How exactly did they dupe you if they never said it could do any of those things you're now complaining it can't do? I mean, in terms of media it sounds fairly similar to my PS3, which I use for media all the time, so...yeah, that's about the level I expected. It's a feckin' games console, and you're complaining that it focuses on games. I'm glad it focuses on games!
Hang on a minute...so of the "limitations" you're complaining about with the PS4, one is that it can potentially do 4K, but they haven't enabled it? OK, so it's vastly superior to the Xbone that's still upscaling 720p for some games, but that's a limitation?
And it can do 3D, but no-one's making 3D games at launch because they're mostly pointless, and that's a limitation of the console?
And it can't play CDs, like no-one in their right mind has for years (and I still buy CDs), and that's a limitation?
It has features that you have no particular interest in using, and that's a limitation?
Hey, El Reg, this just in - the PS4 can't play Betamax either! What a restricted, locked-down POS! Seriously, this article was straining for things to nitpick in the console. Face it, it's still a better prospect than its peers.
The lack of backward compatibility does annoy me though. I mean really, how different is a Dualshock 4 to a Dualshock 3? If the Move can work with it, the PS3 controllers should be able to.
Something about not spending any more money on non-essentials I think.
Of course, the problem we've had for years is convincing the Americans that their grossly OTT armed forces are "non-essential".
PC gaming is back on the rise, too. After the original Xbox One announcement I heard plenty of voices within the gaming community saying "right, that's the final straw, back to PC". Steam has apparently just eclipsed Xbox Live in terms of active subscribers, and the PC platform has really been leading the way in gaming in recent years, more so than usual. Better online services, cross-platform gaming, cheaper prices, more games, more peripherals and huge support for a thriving indie scene full of interesting ideas.
Given that my preference has always been for the PC platform, it's great to see the revival happening, especially after a period of time where it looked like PC versions of games were either going to be half-arsed across the board, or abandoned altogether.
Riiight, because Mozilla have a track record of giving a toss what advertisers think. They partly created the proposed Do Not Track header, have had click-to-run for outdated Flash versions for a long time, opposed DRM in web standards, etc, etc. I think you're thinking of Chrome.
Greg J Preece sounds like the sort of pig-ignorant person that this article is featuring. Obviously never been brought up to have some kind of good manners and quite obviously unable to interact and give other people undivided attention.
Obviously. Yep, from three posts that you clearly didn't read properly, you've got my entire personality nailed.
Why are you at a meeting/presentation so boring that playing with your shiny-shiny is necessary?
Christ's sake, can at least 10% of you attempt to read a post properly before leaping onto high horses?
If the meeting/presentation is worth your time, put the toys away unless you are on call
Which I am. And you'd know that if you'd read my posts.
In any case, if you are an operator responsible for machines and your system is *that* flaky that you need to be checking a mobile phone every minute or two, then might I suggest you have some serious problems?
Now what part of what I've said at any point implies that I'm constantly checking my phone? Alerts aren't something I generate - they come in when necessary. We have a whole bunch of servers hooked up to the same alerting system, some far more critical than others (including development servers that temporarily scream murder every time someone deploys a new test image). Unfortunately, the mobile client I use only lets me have one ringtone for each alert type, not each machine, and on vibrate it's all the same, so when that thing rings you can bet your wages I'm checking it.
Alerts are just an example I was using of how not all mobile phone use is talking to friends on the Twatters or idly playing games. That might be what the OP uses his phone for, but smartphones were originally invented for other uses.
I don't know about you, but it read to me like it is a company mobile and he's getting company-based alerts. I think he'd be fired for ignoring them, rather than answering them, but that's just me and my strange ideas of reading what people actually wrote before commenting.
You are correct, have an upvote. Checking the alerts on my phone to see if a server just keeled over is not "playing with my toy", especially given that it's a toy the company bought me for that exact purpose. Not sure the rest of you get how priorities work when you're a developer/admin.
As for the implied threat of "long memory", given that I posted with my real name, I hope that future employers do read this and understand that I have the required dedication to my role, which is partly to keep everything working.
I give presentations too, you know.
More than 59 percent of men said it was okay to check text messages at a power lunch
And 0.2% of the non-suited population said it was OK to ever refer to a meal as a "power lunch".
Another comment thread full of "back in the good ol' days" and pompous twits going on about how superior they are.
To the point where, if I'm giving a presentation and someone is using a phone or tablet, I will stop the presentation, explain to everyone present that I will continue at their earliest convenience, and wait until anyone messing about with their gadgets has returned their attention to the meeting.
Man, I bet you're a riot at parties.
I also refuse to hold conversations with people who keep diverting their attention to their phone. I generally give them three warnings, after that I will simply leave, ignore them, or go about my own business, even if they try to pick it up again when they're done.
If you were ever so patronising to do that with me, I think I would pull out three different gadgets to get you to leave more quickly. I'm more than capable of listening to someone and simultaneously investigating whether the incoming alert on my phone is just another friend blathering on Facebook or something more important than you that demands my attention. And yes, it's quite possible in my line of work that an alert on my phone is more important than whatever you're banging on about, in which case I'd save you the trouble of bemoaning connectivity and leave the conversation first.
If you need someone to lock their gaze on your steely visage at all times whenever you're speaking to avoid being offended, perhaps the problem isn't them.
Actually, my Nexus 7 is rather spiffing. Admittedly, it might not be running Android any more, but that's the nice thing about non-Apple kit.
I hear Ford Focus cars are popular too, but would you really expect anyone to care if you mentioned you owned one?
I think you're using the wrong metrics when deciding what type of bloody phone to buy...
Looks like we've been having nicer weather than the UK recently. :-p
Stop broadcasting an SSID altogether and configure your devices manually?
I doubt it. The thing Apple got right was that you can't stick a keyboard/mouse interface on a phone, something MS has been trying (and failing) to do for ages. If they have forgotten that lesson, their design prowess is shallower than I thought.
It was 2011 when Lion started the conversion of OSX into iOS, with its pointless Launchpad crap and the ditching of Rosetta (which is fair enough in a way, but was still part of the strategy). Jobs was still around then, just about, and I don't doubt for a second that it was his idea.
I also think we're already several years past the point where people who like to upgrade their computers would consider a Mac?
I dunno. There's a difference between laptops not being that upgradeable, and not even being able to jam more RAM into it. With the new Macbooks you can't easily replace anything. Custom hard drive, RAM soldered to the motherboard (seriously Apple, piss off), etc. I'm typing on a Macbook Pro right now, and the first thing I did when I got it was shove 16GB of RAM into it. Apple were going to charge me 300 quid for an upgrade that cost me 100 quid, so I did it myself. Bollocks to the new Pros - this one's already had two hardware failures that I fixed for $100, and replacing those parts in the new ones would have cost me thousands.
They'll pry 10.6.8 out of my cold, dead hands.... But now the latest versions of Netbeans require Java 7 and you have to hack the installer to get it installed at all. Works just dandy of course.
Yeah, that caught me off guard, but then I remembered I was on Kubuntu. One quick add-apt-repository later and I was running Netbeans 7.4 :-) Haven't bothered upgrading the OSX partition yet for exactly this reason, along with all the other reasons listed in the comment thread above.
Steve would not have let this happen
Yes he would. Wasn't slowly turning OSX into iOS one of his master plans in the first place?
And I can see the quality of Apple's products decline — getting sloppy. Jobs would never have let this crap get through.
Jobs was around for the stupid i4 antenna nonsense, and the beginning of the iOS-ification of OSX, and I think he was still with us when they started gluing stuff together again.
Which is why one of the last things I do before committing any code is have the IDE look up any @todo tags in my current module. ;-)
Think my favourite comment was from my old boss, who left in one class:
"I think I've got this working, but then it was written by <ex programmer>, and <ex programmer> was a lazy incompetent twat."
I think they get paid by the line, including comments
I do a lot of Java. I doc the shit out of my method and class headers, because it makes code inspection and merging later far easier, and the IDE will pick up on them. Within my methods though, I keep the comments to what's required. I go with the philosophy that comments should not explain what you're doing, but rather why you're doing it.
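That "why, not what" philosophy can be sketched in a few lines of Java. This is a hypothetical example: the class, method, and the load-balancer rationale are all invented purely to illustrate the style, not taken from any real codebase.

```java
/**
 * Illustrates "comment the why, not the what": the doc header feeds
 * the IDE and code inspection, while the inline comment explains the
 * reasoning rather than restating the arithmetic.
 */
public class BackoffCalculator {

    /**
     * Returns the retry delay in milliseconds for the given attempt.
     *
     * @param attempt zero-based retry attempt number
     * @return delay in milliseconds, capped at 30 seconds
     */
    public static long delayMillis(int attempt) {
        // TODO: make the cap configurable per service.
        // Cap at 30s because the (hypothetical) upstream load balancer
        // drops idle connections after 60s, so retries must land well
        // inside that window -- the "why", not a restatement of the shift.
        long delay = 1000L << Math.min(attempt, 5);
        return Math.min(delay, 30_000L);
    }

    public static void main(String[] args) {
        System.out.println(delayMillis(0));  // 1000
        System.out.println(delayMillis(10)); // capped at 30000
    }
}
```

The documented header is what an IDE surfaces on hover and what a `@todo`/`TODO` scan picks up before commit; the inline comment would be useless if it merely said "shift 1000 left by attempt".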
One wonders, therefore, why the NSA bothered hacking it in the first place.
Because they could? It's pretty apparent at this point that the NSA's only oversight is a man who always says yes.
Based on iPad3 & 4 benchmark results, I see no reason to disbelieve them. iPad2 smashes iPad1, iPad3/Retina smashes iPad2, iPad4 smashes iPad3. The stats are on the web for all to see.
I'm not contesting that newer chips are quicker, that just makes sense, but if the numbers are real, then why use bullshit graphs to show them? Why not have a real graph, if you're not exaggerating things?
and it also includes the battery-saving M7 motion processor that's in the top-of-the-line iPhone, offloading sensor-monitoring and processing duties from the A7
That would be the motion processor that's turned out a bit dodgy?
Oh, and graphs without numbers == meaningless. Those are practically the same curve, yet one is "8x faster" and one is "72x faster". Get lost.
Jet2 are another one I'd never fly with again, given the option. After a particularly uncomfortable flight to Barcelona, where three ordinary sized guys couldn't all sit back at the same time because our shoulders wouldn't fit next to each other (not guts - shoulders), in an "extra legroom" seat that actually had less legroom than the others, I was actually given shit by a flight attendant for being unhappy about it. Fuck those guys.
Until Windows 8 has reasonable takeup the stock MS browser on it will remain largely irrelevant to Google.
OK, Windows 8 takeup is still way behind 7, and XP continues to get in the bloody way, but I'd argue that 8 is still relevant. Its market share is already above that of OSX, for example.
To be fair, IE9 and IE10 have generally been way easier to support in the first place, and equally, killing support for them is unlikely to mean that pages will no longer render properly in IE9; they just won't be tested there.
Rubbish. All Valve proved is that if you spend man-months of developer time optimising a single game on Linux, but don't spend the same effort optimising the Windows version, then it can run faster.....
I'm sorry, but are you seriously implying that Valve hadn't previously put any effort into optimising Left 4 Dead 2? A popular game based on their well-known and established Source engine? Never mind that it also has an OSX release, and a 360 release, and the Linux version was still faster. Oh, no, course not - it's a deliberate cover-up of the way you see things.
As I said before, the original Eyefinity demos were Linux (i.e. not done by Valve). But I suppose that was just a graphics manufacturer doing something evil like optimising code before showing it off.
And if top-notch gaming isn't possible in Linux because of the apparent fundamental flaws you've been on about in your previous two posts, I'd love you to explain the ridiculous performance jump I just got after upgrading to 13.10. My game of Killing Floor (also not a Valve game, nor a Valve engine) went from a crap frame rate on the lowest graphics settings to running silky smooth on the highest. That's 6 months difference - one minor kernel version, and hugely improved AMD drivers. It's almost like the drivers really were crap before, like AMD had only bothered to properly optimise them for the platforms games were generally available on...
First time in a while I've seen the reality distortion field from a Windows jockey. Geez...
Just install Services for UNIX:
Doubt it'll bring Windows completely up to scratch, but that still looks useful for when I'm in Windows. Thanks. :-)
The Windows advantage is primarily in the Windows kernel performance, rather than the drivers....
Given that Valve have already proven that Linux builds of games can run faster than their Windows counterparts, I'm not really convinced by that. To be honest, there's absolutely no architectural reason why games can't run fast on any of the three main OSes, and Linux builds have been used to demo very high-end graphics previously. Weren't the original multi-screen Eyefinity demos actually running on Linux?
I actually got so sick of Android on my Nexus 4, waiting constantly for Ubuntu's promised image, that I've resurrected my Nokia N900 and started using that again. Which leaves my Nexus wide open for a bit of Linux hackery, I reckon!
TL;DR - Think I'll be joining you on the brickwagon.
Why not install Cygwin, then? http://cygwin.com/
It's not just the command line that makes Linux better for dev work, you know. There's a lot of tools that Cygwin can't run, GUI utilities that Windows doesn't have, and just straight-up better windowing systems. I'd rather work with KDE any day of the week, just because I can configure the crap out of it.
Windows 8 is still faster for gaming:
I'd expect it to stay faster for a little while yet, while the Linux driver optimisation comes up to scratch. That said, if the Linux drivers are at least capable of running compatible games without noticeable slowdown, there's no need for Windows any more.
Ever since the beta of Steam for Linux dropped, gaming has been my primary reason for upgrading distro, kernel and graphics driver. My machines tend to run AMD, and the AMD Linux drivers have previously lagged behind nVidia's. If an upgrade gets my Linux box to the point where I'm no longer rebooting for a round of Killing Floor, that's alright by me.
They had huge penetration into schools and made a difference teaching kids how to program. I programmed I/O stuff on one in high school as late as 2001, and I own another unit myself. Wasn't that the point of the effort? To teach people?
Instead they will see people with British accents, entertaining them, whilst teaching them the values of friendship, helping others and politeness. They will be educated without realising it. All of which is absent from Nickelodeon, Disney Junior or, even worse, Tiny Pop.
I'm not entirely certain why British accents are required, but your post does imply a certain...bias towards the homelands?
And I believe Tiny Pop now has the new My Little Pony, which is in my opinion one of the finest, best written cartoons produced for children in quite some time. It is absolutely superb, and I'm pretty certain that a series subtitled "Friendship is Magic" covers all the bases of "friendship, helping others and politeness". But I'm sure you'll dismiss it because it's made in Vancouver.
If you don't like the BBC, that's your loss, you obviously enjoy trash TV of the lowest standard.
Well that's not snobbish or patronising at all.
Go live in America or Australia for a while, watch the local stuff, and then realise what you're criticising actually produces some of the finest TV programmes available, anywhere.
As a British ex-pat currently hiding out in Canada, I can tell you you're surprisingly ignorant. Sure, the US has some quite astonishing crap, as does the UK, and the BBC is more than capable of producing both quality and crap - see "Two Pints" or the state of the modern Panorama for examples of the latter. That said, shows like The Wire, Deadwood, The West Wing, and so on all come from our American friends, and are quite excellent.