Re: Intel based
Who cares? If the performance/price combination matches up to expectations, what does it matter?
I've seen this both on TV and in cinemas - the camera pans quickly and you can't really see clearly what's happening, like it's not really that smooth.
I don't understand how that happens on a modern TV or projector - is it actually part of the recording, the display technology, or my eyes/brain? It's certainly worse in some cases - F1 on NowTV streaming for instance - but considering I see it in big budget movies in the cinema I am starting to wonder if it's me?!
I have, and last time I was there you could spot the 4K TV from the other side of the store, the difference was so big.
It made every other TV look crap, which was possibly a bad thing, as people who couldn't afford 4K might've decided to buy nothing!
It would be interesting to get figures on how many hours are spent watching films in cinemas per week, compared with TV. I don't know if the data is available but you could for instance find a 1-cinema town and ask how many tickets they sell per week, multiply by 2. Scale up based on town population Vs national population and compare to official TV viewing figures.
My total guess is cinema viewing is tiny in comparison, like 2% of TV watching 'man hours'. Anyone able to do some guesstimates or provide data?
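A back-of-envelope sketch of the guesstimate described above. Every input number here is an invented placeholder for illustration, not real data:

```python
# Back-of-envelope guesstimate: cinema viewing vs TV viewing, in person-hours per week.
# All inputs below are made-up placeholders -- swap in real figures if you can find them.

town_population = 20_000           # a one-cinema town (assumption)
tickets_per_week = 2_000           # weekly ticket sales at that cinema (assumption)
avg_film_hours = 2                 # the "multiply by 2": roughly 2 hours per ticket

uk_population = 67_000_000         # rough national population
tv_hours_per_person_per_week = 25  # rough official TV viewing figure (assumption)

# Scale the one-town cinema hours up to the national population.
cinema_hours_national = tickets_per_week * avg_film_hours * (uk_population / town_population)
tv_hours_national = uk_population * tv_hours_per_person_per_week

print(f"Cinema: {cinema_hours_national:,.0f} person-hours/week")
print(f"TV:     {tv_hours_national:,.0f} person-hours/week")
print(f"Cinema as % of TV: {100 * cinema_hours_national / tv_hours_national:.1f}%")
```

With these particular placeholder numbers the ratio comes out under 1%, which is at least in the same ballpark as the 2% guess.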
Based on what counter-argument? More and more often you'll see people in a communal living area watching Youtube videos on their laptops/tablets/phones, even while the main TV is on.
If he can't see a difference between HD and 4k in the carefully contrived showroom then I suggest maybe his eyes ARE the problem.
Either that, or the person who set up the demo units should be fired.
As you say - SOME cases. But not to the extent that most people bother buying dedicated HiFi equipment any more; they just use their phone or PC. TVs are still in the "everyone has one" phase, like HiFi was 20 years ago.
I'm not so sure TVs are on the way out but it's certainly a possibility.
Isn't the ideal that we reach a point that the mundane jobs are done by robots, and people don't HAVE to work - we have enough resources that rather than having to be a "benefits culture" claimant, work becomes a choice rather than a necessity?
"My body just got re-united with my soul and mind, the parts of me that matters and that never can be held hostage"
What? Surely that's the exact opposite of "you can imprison my body but not my spirit".
Amazon Prime UK works fine on my wife's Android tablet. It's a Nook whose firmware she replaced with CyanogenMod's Android.
My HomePlug unit acts as a WiFi network as well as having ethernet sockets - so you can simply have one WiFi network per room (I have one in the garage).
Especially when games consoles also have all these apps, and you can (I think) also plug in your iPad to display content from all these iOS apps on the TV... presumably Android allows this too and obviously so does a laptop.
HomePlug might be another great option.
Given the stuff you see is from people you told FB you were friends with, "how utterly stupid a large proportion of the population is" reflects more on your social circle than the population as a whole.
While I agree most of the population are stupid or boring (to me) I only make friends in real life with those who don't fit that description... why not follow the same rules online?
FB is FAR more useful and less annoying when you use it to network with people you'd actually talk to. If you open it up to people whose opinions you aren't interested in, of course it will seem rubbish.
I thought so too.
It is a sensible thing though because often there are people who you ARE friends with, who decide to post 5 pictures of their baby/cat every day or link FB posts to every app and game they play.
He has more evidence than you, since he was an eye-witness. Clearly he was a blind eye-witness, or maybe you just have your own viewpoint you're blindly following?
Clearly if you disagree with what was written, it's wrong/rubbish.
It's also not bias if it's reported as opinion, which it was.
They didn't seem that bothered about keeping their masks on either
Nominate as most arrogant comment of the day... if only they were as informed, smart and awesome as me they'd clearly take my viewpoint because that is objectively the only sensible viewpoint any right minded individual COULD take.
Get over yourself.
"Tie yourself and others into proprietary formats, and be at the whim of a single entity who's only purpose is to extract as much money from their customers as possible?"
Proprietary? They're open formats, recognised standards that other major players support.
Oh, no, a company who exists to make money. If only we could all work for free!
Just highlights how/why Android is harder to develop for than iOS.
In the old days you HAD to know how to do minimal coding and monkey about with your config scripts just to USE a PC effectively.
Then why are you commenting on the story you prawn?
"The new Ferrari is useless, I won't be buying one for sure. Oh, also I can't drive"
Why would any features be blocked/crippled in a FREE version? If they put all the good stuff in, nobody would pay.
Seems they've given enough for people to write letters and personal spreadsheets, anyone using it day-to-day for more serious stuff would need to pay.
How is that LibreOffice iOS app working out for you?
It's not just about writing the code; it's the fact you have to rewrite big chunks for each OS, especially the UI side, which is pretty darn important. Clearly the effort went into iOS first, and that would mean using Obj-C (well, not having to, but on a project that size it would make sense), which means it's likely an entirely separate development team from those who would work on a WP version.
People who aren't paranoid I imagine, and the subset of geeks who aren't obsessed with the NSA.
Whether it only phones home when it hears its name is quite relevant, I think. The article suggests not, but could just be inaccurate.
As for why: don't know. Being able to control music playback in your home, lighting, etc, could be neat. But that doesn't really require cloudy stuff, surely PC software can do it.
Seems to me one problem is simply the mic won't hear you unless you're in the right room. For this to be as transparent and useful as possible it needs to "just work" no matter where you are... so you can wander between rooms.
Why would sending a few analysed voice commands be an issue? It may be listening the whole time, but is it like having Skype turned on all day, or does it only open a channel when it actually hears its name?
Either way, I rather think it is the point. It's a techy thing for techy folk.
Better not have that MMR either. I mean, you can't be too careful.
I wonder how many people get eye injuries due to their glasses every year (a serious question, actually, considering how many people injure themselves getting out of bed every day).
That's a crock. You gamble with your health every time you eat barbecued food or go outside on a sunny day... a rogue particle could knock a free radical loose. You gamble every time you eat food you haven't prepared or use crockery someone else has 'washed'.
You gamble every time you play sport - a knock could detach your retina or any number of other things.
Having dental treatment has risks too. You gamble that saving your tooth or reducing your toothache is worth the tiny chance of something seriously wrong happening.
As you get older and get a cataract, you gamble that improving your eyesight is worth the small risk of making it even worse.
Heck, even contact lenses have risks.
They say doctors make the worst patients.
The statistics are available to doctors and patients alike... maybe the former are just more conservative.
I'm fairly sure the guy who did my consult had had the treatment.
I also now need eye drops but then I spend all day staring at a screen which is about the worst thing for that.
Still worth it just to be able to go outside in the rain, or walk into a hot room in the winter, and still see.
Gee Les, what an original and witty remark.
It doesn't actually make any sense but hey it's a nice pithy meme.
Oh for goodness' sake, stop holding the web up as some idealistic paradise. It only exists in its current form because massive companies and governments have ploughed untold billions into the infrastructure, in the belief they can make money from it in some sense.
If you post on someone else's site, they can take it down at their discretion. Internet forums and social networks are the "Wild West", if a grumpy moderator doesn't like you then tough.
Of course if you are making part of your living through a page on FB it gets murky. Has nobody sued FB for this yet, and at least made it to court?
Lots of expensive things don't last as long as cheaper ones. I'd wager a 500k supercar will last far less time than an old Ford Focus.
It's for the super rich, if they have enough money this counts as disposable income then good for them, let them buy it.
I'm surprised Apple don't team up WITH a brand like Rolex or whoever for the fancy designer versions. Brand is important in that world - gadgets and expensive watches are very different markets.
But now 16GB IS available... and that's the maximum you can officially install. So unless you can actually cram in more than is supported (you could with some older MacBooks), things are a bit different.
The main issue is simply that it costs you more now. I'm not sure how much more, though... Crucial sell 16GB kits for £125, although of course that way you'd end up with the old 4GB left over, which you might be able to sell.
Interesting. I wonder if that's a new thing or if they did this previously? It would make sense given that low power use and less need for cooling are important on the Mini.
Your Mac Mini has a better GPU than the Pro?
I don't think it runs a laptop CPU. At least the 2012 one doesn't.
It also doesn't use laptop RAM or laptop hard disk (from what little I know about laptop parts) given that you can fit a standard desktop SATA SSD.
And laptop motherboards don't generally have 4 USB3 sockets.
They don't market it as a server now but why would that need upgradable RAM, when you can spec it up with the most RAM it can support? That's a cost issue, not a spec issue.
More importantly, and the article didn't seem to mention this, you cannot buy a quad-core Mini any more. That really hurts for use as a serious desktop or server. My guess is Apple realised the Mini is predominantly used as a media server or basic desktop and doesn't need those things.
Personally, I immediately bought the cheapest quad-core 2012 Mini from my local store while I still could - I was waiting for a refresh to buy a Mini for fear of buying an older spec, but it turned out the opposite. If I'd known in advance, I might've bought up a load of them and sold them on for a tidy profit!
"But at least it used to be a doddle to expand the previous incarnation’s memory: rotate the base, unclip the Wi-Fi card’s antenna, unscrew a metal plate, and there’s your RAM slot."
That's not even nearly true. You simply rotate the base, and the RAM is RIGHT THERE (in the 2012 model anyway).
I also very much doubt Yosemite is really using 7GB of RAM out of the box.
Why is El Reg so keen on that photo? Does a man posing while not wearing a shirt automatically count as gay and/or porn to you lot? A topless man is a long way away from a nude selfie.
If they really want them, they can probably get them anyway. And no I don't mind, if Obama is that bored he is free to look.
...it's disgusting that they promote destroying the planet with reckless printing. Surely little Tommy should've set up a viral #BringBackOurIguana campaign on social media and championed the cause on national TV.
It's not because the population are idiots but because they love to sue. Other countries have as many idiots but they wouldn't try and sue the company when they drive on the wrong side of the road after watching an advert.
That sounds like something only an Australian would say. I'd never considered a croc to be appealing.
Considering how all the major players keep claiming their browser is the new fastest JS machine...
Why don't the various major browser makers just agree to add a proper language to their browsers, i.e. JS 2.0 or something? It would take a long time, but we got there with HTML5 in the end.