377 posts • joined Wednesday 31st October 2007 00:58 GMT
I don't much care if the originator of the acronym pronounced it wrong. It sounds dumb the way he says it.
1 out of 3 ain't bad.
I doubt that phones and headsets rely on this API, unless they in turn depend on apps which rely on the API.
The key link in the chain being "apps". Specifically /desktop/ apps.
The reaction and participation on the petition - or rather the distinct lack thereof - is a pretty good indicator of the impact and importance of this API. The internet says "meh".
Re: Still not enough
Not really. Users want some of the things that the Start Menu gave access to but certainly not everything.
But I agree that just a button that takes you to the start screen is not what most people want either.
Re: Tablets shrinking, not phones growing?
Why wouldn't it be backwards compatible ?
WinPhone API is currently a sub-set of WinRT + some additions.
If WinRT is the focus and is expanded so that WinPhone is at some point completely subsumed within the RT API to become WinPhone 9 (for example), then WinPhone 8 is - by definition - a subset of WinPhone 9 and there is no reason to think that a WinPhone 9 system could not therefore run both WinRT (8) and WinPhone (8) apps.
In exactly the same way that - for example - an Android 4.2 device can run Gingerbread apps.
Re: One Ring to Rule Them (and in the darkness bind them)
A building that size creates its own shelf.
A lot to cover...
Thinking a little more literally ....
Only one Apple product needs a "cover"... the iPad.
Needing to cover "a lot" could be a clue to a bigger iPad - a rumour recently scotched, but perhaps Apple have managed to keep a larger iPad successfully under the radar. Highly unlikely, but possible.
Or it could just be a significant number of iPad announcements.
Could be anything, but why bother running around in circles trying to second-guess? The 22nd isn't so far away; just waiting won't kill anyone. :)
"No major user interface tweaks" ?
I don't know how you measure "major", I can only imagine that the author uses a quantitative yardstick rather than a qualitative one.
I know that I for one am really looking forward to file system level file tagging and tabbed Finder.
Only 2 changes, granted, but in terms of the impact on usability, I anticipate that these will be HUGE.
Re: Love all the "won't buy one" comments when no one knows what it'll be
All my orgasms are already wireless.
Re: Super-fast hard drive? Now that's an oxymoron!
A steam-driven truck could actually be a huge success. ;)
Except that these are not JUST chargers
Where was the EU when phones only had to plug in to chargers and there was far greater/worse diversity in play ?
The micro-USB connector on my Galaxy S2 is the one thing that causes me to cast envious eyes at my fiancée's iPhone charger.
Even the old 30-pin connector... sure it was bigger, but somehow it slots home more naturally and easily than the micro-USB, which even after 2+ years I still can't plug in without checking the orientation of the male connector in my hand and carefully lining it up.
The new orientation-agnostic lightning connector is pure genius.
I was once told that it was known within IBM as a "Kingston Reset". Anyone able to confirm or explain ?
Oxygene needs to be mentioned
Between the free (as in beer) FreePascal and the expensive (as in: who needs two kidneys anyway?) Embarcadero offering sits "Oxygene" from RemObjects.
Like Delphi, Oxygene is an Object Pascal based language. But unlike Delphi - which created a portable runtime (FireMonkey) that all Delphi apps have to target if they wish to deploy to mobile devices or Mac - the Oxygene compilers inhabit the natural world of each platform.
Oxygene for Java works directly with Java packages and emits Java byte code.
Oxygene for Cocoa works directly with Cocoa and emits LLVM code.
Oxygene for .NET ... well, you get the idea.
Using Oxygene means that you are working with the platform tools as the platform creators intended, but you are not confined to the language that those same creators used (Java / Objective-C), and you avoid all of the limitations and pitfalls that a lowest-common-denominator, one-size-fits-all approach suffers from.
Worth bearing in mind is that Delphi for Android / iOS does not offer universal support for either Android or iOS.
For Android you can only target Android 2.3.3 or later, because Delphi relies on the "NativeActivity" class introduced in that version to bootstrap the FireMonkey runtime. Even if you have a modern Jelly Bean device you could be out of luck, as the runtime demands of FireMonkey mean that NEON support is mandatory. Whilst perhaps less of a problem in the future, it currently rules out Tegra 2 chipset devices, for example - Samsung Galaxy Tab 10.1, ASUS TF101 etc.
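Incidentally, one way a device's NEON support can be checked is to look at the "Features" line of an ARM /proc/cpuinfo dump. A toy Python sketch (the parsing logic and the truncated example entries are mine, purely illustrative - nothing to do with Embarcadero's own runtime check):

```python
def has_neon(cpuinfo_text: str) -> bool:
    """Return True if an ARM /proc/cpuinfo dump lists the 'neon' feature."""
    for line in cpuinfo_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() in ("features", "flags"):
            # Tokenise so 'neon' isn't matched inside another flag name
            if "neon" in value.lower().split():
                return True
    return False

# Illustrative (heavily truncated) entries: a Tegra 2 style CPU lacks NEON
tegra2 = "Processor : ARMv7\nFeatures : swp half thumb fastmult vfp edsp vfpv3d16"
exynos = "Processor : ARMv7\nFeatures : swp half thumb fastmult vfp edsp neon vfpv3"
print(has_neon(tegra2))  # False
print(has_neon(exynos))  # True
```

On a real device you would feed it `open("/proc/cpuinfo").read()` instead of a canned string.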
For iOS the embarrassing performance of FireMonkey on older iOS hardware has resulted in Embarcadero now mandating iOS 6 as the minimum for FireMonkey apps, ruling out 1st gen iPads for example. iOS 5.1 may only account for 6% of active iOS devices but that's still 24 MILLION+.
But more important than potential users being unable to use your kit: if - like me - the spare gear you have knocking around to use as a development test rig happens to be either or both of these sorts of devices (in my case a TF101 and an iPad #1), then you will need to add the cost of some hardware to the already eye-watering cost of Delphi.
Oxygene on the other hand - like the native tool chains - supports anything that the platform vendors support.
It's a more sophisticated approach and cheaper into the bargain ($699 initial, $499 annual subscription thereafter. Unlike Embarcadero, you aren't required to pay your first year of subscription renewal up front).
Delphi Pro XE5: $1,000 + $500 for mobile support + first year support and maintenance = $2,000+
5s on top/at the back | 5c below/in front
Did I win ? Where do I collect my prize ? :)
Your "regret" was in fact a "disappointment"
That is unless you were involved in the decision as to where to position the power button.
Point of order m'lud....
That's not "bigger" than your desktop displays, only "more detailed".
As you were...
Re: Kindle Refresh?
3G coverage was always global. In theory.
The Amazon coverage map is not to be relied upon, however. That map shows, for example, that New Zealand is covered. We ain't. The map shows where 3G is available, not where 3G carriers have agreements in place with Amazon, and that is the important part. If you are not in an area serviced by a carrier with an agreement with Amazon, then you are stuck using a USB rat-tail and a PC to manage the content on your Kindle DX.
I love my Kindle Touch - it's great for most reading I do. But it is less than ideal for reading technical references. The larger Kindle DX would be a perfect complement to my Touch for me, if only they added a touch screen and wi-fi either instead of or as well as 3G.
Still waiting ... :(
Re: And use real stopmotion special FX, while you're at it
Primitive ? Maybe
Emotionless and full of character ? Absolutely not.
The greatest folly of the CGI industry was to convince the movie industry that looking "real" was the key to success - as if the 100-year history of cinema counted for and taught us nothing about an audience's ability and willingness to invest emotionally in visuals that, on their own, were clearly unreal but which, when presented as part of a well-formed narrative, were completely convincing.
They are only now starting to learn the lesson that it doesn't matter how real or impressive your visuals are if you don't have the foundation of story and character for those visuals to build on.
Re: Future Digital
Yes, at about the same time that digital pixel resolutions finally caught up with the detail resolution of film grain... approximately 16MP on a 35mm sized sensor. But it isn't a direct comparison.
When you blow up a pixel-formed image too far you get ... pixelation, which looks horrible.
When you blow up a grain-formed image you get the grain revealed which can be incredibly pleasing.
Re: Originals were great?
And that was 100% true.
Lucas can't act for sh*t. Harrison Ford however can claim to do the Kessel Run in 12 parsecs and we believe him, despite his units of measure being all out of whack.
I have a 27" iMac at work and one at home, bought within weeks of each other.
The one at work isn't affected. :)
The one at home is. :(
I wonder if they'll fit a 2nd HDD for me while I'm there ? That would rock.
If you do enlist the help of a mate and carry your 27" iMac to the store...
Please, PLEASE take the time to stop at least once on the way, hold the beast to your head and shout...
I'M ON MY IPHONE! YEAH! THE NEW ONE! WITH THE BIGGER SCREEN!
Why not print the data in "2D bar code" form ?
Then load into scanners when you need to read it ?
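The "2D bar code" idea above - lay data out as a grid of black/white cells, print it, scan it back later - can be sketched in a few lines. This toy version is mine and deliberately minimal: real formats like QR codes add error correction, alignment markers and quiet zones, none of which appear here.

```python
import math

def to_grid(data: bytes, width: int = 16):
    """Toy '2D barcode': lay the bits of `data` out as rows of 0/1 cells."""
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    rows = math.ceil(len(bits) / width)
    bits += [0] * (rows * width - len(bits))  # pad the final row
    return [bits[r * width:(r + 1) * width] for r in range(rows)]

def from_grid(grid, length: int) -> bytes:
    """Read `length` bytes back out of the grid (the 'scan' step)."""
    bits = [b for row in grid for b in row]
    out = bytearray()
    for i in range(length):
        byte = 0
        for b in bits[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

msg = b"backup me"
grid = to_grid(msg)
assert from_grid(grid, len(msg)) == msg  # round trip survives
```

In the paper scenario the grid would be rendered as ink and recovered by a scanner, so error correction would be essential in practice.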
This would also require that more trees be planted to provide the pulp required to produce the storage medium (paper), consuming CO2 in the process.
By contrast, flash memory consumes irreplaceable raw materials and only ever contributes CO2, during manufacture and subsequent use.
Well the advances made in the 10 years to 2010 didn't result in anything more usable or reliable than the ones in 2000.
You think there has been more significant progress in the 3 years since ? And Apple the only/first ones to realise and put it in their phones ? Given that technology in iPhones tends to lag wider applications somewhat ?
I'll have two of whatever you're drinking.
Absolutely. Pointless gimmicks have no place on an iPhone.
Re: 'this is simply unacceptable'
There is a very clear difference between "this low resolution copy is so crappy I can't tell the difference between this 6 and this 8" and "I can clearly see this is an 8 yet on the original it is clearly a 6"
Such warnings are proven to be highly effective.
I mean, just look at the way film piracy all but disappeared when they put those PIRACY IS THEFT warnings on all the DVDs and Blu-rays etc. That sure gave those pirates a wake-up call.
You know it makes sense.
The "early days" myth...
It constantly amazes me that people have such short memories (or, for those whose memory genuinely doesn't extend that far, such an inability to research).
It is not "early days" for this sort of technology. Far from it. It has been knocking around in one form or another for 30+ years. That's plenty of time for people to come up with applications - in terms of what the technology can be usefully applied to, and the software to achieve it - while the technology itself caught up.
We have already reached the point where the technology "evolution" is reduced to ever smaller and more convenient iterations of the same thing. Actual progress really isn't occurring any more. But every time a smaller, cheaper or just slightly different implementation of the tech is announced, the terminally amnesiac or plain uninformed declare that "the technology is finally here". It turns out that it still sucks, and a new generation of people is born who just think "it's early days" and that the technology merely needs to mature.
The tech in question is recognising and tracking in 3D space. Every "new" technology that claims to do this better than ever before is in reality just doing it differently than before. Accuracy reached levels where applications could do something useful with the inputs a LONG time ago, and yet it is issues with precision and accuracy that constantly plague those applications.
This leads the terminally optimistic to believe that the technology still needs to be and can be improved, when in fact the problem lies not in the accuracy with which the input device - the human limb - can be tracked, but in the accuracy with which the operator can control their own device. i.e. their arm.
Mice don't move in "straight lines"
A mouse is controlled by a hand (typically), so all the same observations about the movement of floaty hands apply - when it is in motion. But importantly, a mouse stays stock still unless deliberately acted upon, whereas a floaty hand requires arguably greater effort - certainly concentration - to remain still, inverting the principle and subverting the ergonomics of interaction.
Drawing a straight line with a mouse is actually very difficult - much easier with a pen/stylus device in fact.
What *is* true of a mouse however (or pen/stylus+tablet combo) is that the device is limited to 2 dimensions of movement.
Which makes perfect sense when you think about it. Your screen is 2D so having a 2D controller is obvious to the point that it makes you wonder what all these 3D spatial controller doohicky designers are even thinking of (when it comes to desktop navigation replacements).
Which is why, whenever someone claims to have come up with a spatial recognition/tracking device that will replace the mouse, I don't even have to try it to know it simply ain't gonna work.
1) What a dumb assertion.
You can insert anything in that statement in place of "X11 License" and it will be just as meaningful (i.e. not at all).
Here, let me show you:
There is only one MIT License. There is no "cheddar cheese". Look it up.
>>> provides link to a 3rd party reproduction of a non-copyrighted license that fails to mention cheddar cheese as "proof" of non-existence of cheddar cheese.
In common use, "X11 License" is used interchangeably with "MIT License". You can argue this is inaccurate, since the X11 license is technically only AN MIT License, not THE MIT License. But it does nevertheless exist, even if in practice nothing distinguishes it from THE MIT License - which tends to be the case in such examples of "interchangeability". Take your own advice - "Look it up" - and you will find that MIT/X11 is treated as predominantly synonymous in the licensing field.
(I haven't yet found an example of a supposed distinction between an X11 License and the Expat license (also AN MIT License, but more widely acknowledged as identical to THE MIT License) which includes any additional clauses - despite assertions in some quarters about a prohibition on unauthorised use of the author's name in any sales or promotional activity.)
Re: 2 year update cycle
I find the 2.3 > 4.1/2 differences to be very significant. Possibly small, but still significant. Swipe-to-clear notifications, infinite scrolling home screen pages, 5 home screen icons vs 4, more direct access to toggles via the notifications icon bar. The list goes on.
All small changes in themselves, but big improvements in usability.
Just a damned shame we couldn't get all that good stuff without also having our battery life, performance and stability go south on us (see my other post on the effect that 4.1.2 had on my Galaxy S2).
Re: I just got upgraded last Friday
Exactly the same story here!
I am this close (holds fingers in the air, almost touching) to taking it back and having it put back to Gingerbread. As well as the dire battery life, I find it sluggish and irritatingly unresponsive, with enormous delays between touch and response, plus it is far less stable than before, with apps that never had any issues now crashing on a frustratingly regular basis.
Yesterday my phone crashed-OFF entirely while I was browsing a news web site. Twice.
It's either back to Gingerbread or root and custom ROM the damned thing. 4.1.2 is simply unlivable with (on a Galaxy S2).
Re: It was "stereoscopic", not "3D".
Um, unless you have a number of cameras arranged spherically around a given point of interest capturing the subject simultaneously, the image captured - whether using a lightfield camera or not - will always be from the point of view of the camera.
Lightfield cameras have no greater application in 3D than stereoscopic cameras. The only advantage they would give is the ability for the viewer to adjust the focus at the time of viewing, rather than it being set by the director/cinematographer in a way intended to direct the attention of or evoke the emotion in the audience. In other words, completely subverting the artform.
Indeed. I ran a pub in the nearby village of Blunham when they were filming BATMAN BEGINS at the hangars. One of our customers had a commercial laundry business. As well as washing the (then) Jordan F1 pit crew overalls, she was doing the laundry for the GCPD (Gotham City Police Department). Who knew what laundry could be such a COOL profession! :)
Also worth mentioning is that Gotham's Dark Knight has other local connections. Just a few miles away is St Neots (where I hail from originally). There is a gas turbine power station on the southern edge of that town, built on the site of an old coal-fired station that was demolished in the '80s.
The derelict buildings of the old, demolished facility were used as a location for the AXIS Chemicals factory scenes in the Tim Burton/Michael Keaton BATMAN movie.
Re: Watch your back
All these people commenting without understanding the issue and the principles involved. Sheesh.
The "copyright" is not in the IDEA, but in the visual representation in an artwork. Sure, floating mountains "existed" before Roger Dean painted them, but he isn't suing Hack Cameron for using the IDEA of floaty lumps of rock in general, only for the specific use of his original representation of that idea.
If Hack Cameron had come up with his own entirely original vision of floating mountains then there would be no case. Arguably, even if floaty mountains had been the ONLY element of similarity between AVATAR and Roger Dean's work, there would also be a much harder case to make (two people can have the same idea/vision independently). But when there are multiple points of similarity between two works, there comes a point where legitimate inspiration stops and copyright infringement begins.
Tickets for the clue train can be purchased from the desk in the lobby...
Re: Rip Offs or just similar ideas
When you say "Douglas Adams works", do you mean paintings ?
The issue here isn't the IDEA of floating castles or the IDEA of a huge f***-off tree, it's the direct copying of a particular visual representation of those ideas.
As for the amount of stealing involved in AVATAR, I doubt you would get very far defending a charge of bank robbery by pointing out that you had in fact robbed LOTS of different banks. ;)
"Damage" is an overloaded term in the legal context.
In this case, the argument will be that Cameron should have negotiated with him for the use of his intellectual property. Had he done so, an agreement might have been reached for a few thousand bucks, if that was what the owner of the IP felt it was worth or was likely to be worth to the derivative work.
Since no such agreement was made - and no attempt made to broker any such agreement - the owner of the IP now has a right to say that Cameron profited from the infringement of copyright and that this profit was at the cost of any fees that would have been due to him under a licensing agreement. With the benefit of hindsight it can now also be established what the value of that copyrighted work was, given the commercial success of the derivative work.
As others have said, if AVATAR had flopped there wouldn't be a case. But not because there wouldn't be an opportunity to make money, but rather precisely because no money had been made. It's a subtle difference but an important one.
As for the $50m damages... that represents less than 1.8% of the international box office gross of the AVATAR movie - a pittance given the significance of the work's influence over the derived work (arguably a significant proportion of the production design and at least one key plot element), and it doesn't even take into account merchandising earnings from the movie or the earnings of the anticipated sequels.
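The "less than 1.8%" figure checks out, assuming AVATAR's widely reported worldwide gross of roughly $2.78bn (the gross is my assumption; the post doesn't state it):

```python
# Sanity-check the claim that $50m is under 1.8% of AVATAR's gross.
gross = 2.78e9    # assumed worldwide box office, ~$2.78bn
damages = 50e6    # claimed damages
print(f"{damages / gross:.1%}")  # 1.8%
```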
Re: No chance
Those statements on the internet will be cited as evidence that, without prompting, coaching, encouragement or incentive, a number of people have independently formed the opinion that the world of AVATAR is to a significant extent derived from his art works.
And that is no laughing matter, as any copyright lawyer will tell you.
PC games have a different problem to address...
You need high frame rates on PC/video games because in those cases there is no "motion blur" to provide the hints to the visual system about rates of movement for objects in the frame.
That's why HFR is important for video games.
The only reason it is important for movies is that it makes post-processing the CGI cheaper - you don't need time-expensive algorithms and processing to "create the motion blur" necessary to make fast moving CG graphics integrate seamlessly with the optically captured images.
This is also why, when you then skimp on those motion artefacts in your post processing, the result is something that looks like a video game. If your movie consists primarily of CGI elements, it is rather predictable that it will look like other CG media.
But don't make the mistake of thinking that HFR is intended to improve the experience for the audience. It is merely a way to cut costs for the movie maker.
Re: film vs tv
If you need to "think about it" in order to decide that it works or it doesn't then it quite simply hasn't worked.
Either you watch 48 fps and you immediately appreciate the difference and feel it is an improvement, or it feels false and artificial and far from being "immersive" pulls you out of the illusion and is a constant reminder that you are watching the results of a pixel wrangling server farm in a data centre in Wellington/Hollywood.
Or you don't even notice the difference.
All I know is that far more of the people I know who saw the 48fps Hobbit fell into the "No difference" or "Hated it" groups than into the "Wow, it was great" category.
For all that your "thinking about it" might have led you to conclude, or whatever theories of human vision analysis might suggest we should prefer, simple reality presents a stark contradiction.
To which the common response is "We only prefer 24fps because it's what we're used to".
No. That is an argument for explaining people's ability to spot the difference and even describe it ("it looked like a video/computer game, not a movie"). It does not explain the fact that people simply do not like it.
A great article with just one point that I'd take issue with...
HDTV was - and is - a significant and dramatic improvement over SDTV... on a big enough screen.
The author neglects to address the dramatic increase in screen size (in the domestic setting) that coincided with the roll-out of HDTV around the world. Not so much in the US - they already had big-ass back-projection CRT 60"+ sets when we Brits were still squinting at 26" tubes. But LCD and plasma screens made big-screen luxury practical even in the rabbit-hutch houses of Blighty.
And once you get to/above 40", the static resolution of SDTV does become intrusive. Maybe more so in slow, relatively static images, but then not all TV is high-action sports, so it isn't unreasonable to devote at least a little effort to making sure that those sedate, static images look good, when the eye ISN'T tracking a fast-moving subject across the field.
We have also seen significant improvements in the codecs of both broadcast and distributed media. Blu-ray delivers HD images, but it also delivers images with greater dynamic range - true blacks and far less intrusive colour and contrast stepping and blocking (to the point of being non-existent for most people).
So I don't think it's fair to say that the industry hasn't acknowledged that static resolution isn't the only goal; nor is it quite accurate to say that static resolution isn't at all important - as I say, as a medium, film and TV have to allow for the fact that some images will be (at least relatively) static, so you can't just ignore that aspect completely.
Re: FFS - It's a development kit, not a prototype or manufacturing sample
Wrong. The PS3 dev kit you are referring to was a cut-price kit ($2k) introduced after the console itself was already on the market, in an attempt to kick-start the home-brew/budget games community.
The original PS3 dev kit ($10k) - the one that is comparable to any current PS4 dev kit - was a hulking great box that looked more like a 1st gen Betamax player than a PS3. And - GASP - it ran a different OS than the console itself. Who wouldathunk it possible!?
FFS - It's a development kit, not a prototype or manufacturing sample
Does nobody these days have even the most basic grasp of how these things work ?
HINT: It was some time before there was any development environment for creating Windows applications that actually ran ON Windows. I can sit here today with a Microsoft Windows based Development Kit that produces code that runs on OS X or even Android or iOS.
To really bake your noodle, I can use a Windows guest OS in a VM running on an OS X host, using a Windows based development kit to produce Android / iOS / Windows / OS X software.
For the dense of skull: just because the Development Kit is running a FreeBSD variant/derivative, that is no proof that the code produced by the back-end compilers in that SDK is targeting the same - or even a similar - OS.
It isn't to say that it isn't either. But you might just as well speculate that the PS4 will be based on .. oh, I dunno... just make something up.
Um.... given that the original work was by a Korean, I wonder if in our rush to ridicule, perhaps something has been lost in translation.
Perhaps the intention is to send a message which appears to be digital garbage, essentially using the font as a Caesar cipher. PRISM, the NSA, GCSB, GCHQ etc see "Xy kxtrt", but when you view that message using a particular font you see "Hi there".
And given that lowercase and uppercase characters have different code points, you can even make letter-distribution analysis more difficult by employing a different Caesar shift for upper and lower case (not to mention rotations that mix non-letter and letter characters etc).
So even if the original effort was not along these lines and really was as naive as the report of the report of the blog post suggests, the essence of the idea has some validity.
i.e. send messages which make no sense when examined as a series of character code points, and which only make sense when rendered visually using a very specific font. Adding CAPTCHA style obfuscation then becomes your last line of defence against snooping, should the authorities render your message using the required font and attempt to OCR it.
The foolishness of that final defence is only that if the snoopers have reached a level of awareness that they know with which font to render before attempting to OCR, then with only a small effort with a few correctly rendered messages, humans could quickly decode the Caesar shift in the font and apply that algorithmically to all other similar digital messages - no need to OCR.
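The per-case shift idea above is simple to sketch. In the scheme proposed, the shifts would be baked into the font's glyph mapping; this toy Python version (names and shift values are mine) just shows the character-level transformation, with an independent rotation for upper and lower case:

```python
def shift_cipher(text: str, upper_shift: int, lower_shift: int) -> str:
    """Caesar-style shift with independent rotations for upper and lower
    case letters. Non-letters pass through unchanged."""
    out = []
    for ch in text:
        if "a" <= ch <= "z":
            out.append(chr((ord(ch) - ord("a") + lower_shift) % 26 + ord("a")))
        elif "A" <= ch <= "Z":
            out.append(chr((ord(ch) - ord("A") + upper_shift) % 26 + ord("A")))
        else:
            out.append(ch)
    return "".join(out)

enc = shift_cipher("Hi there", upper_shift=16, lower_shift=15)
print(enc)                          # "Xx iwtgt" - garbage without the font
print(shift_cipher(enc, -16, -15))  # "Hi there" - negated shifts decode
```

As the paragraph above notes, this is trivially breakable once the snoopers know the trick - a couple of known plaintexts recover both shifts.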
Re: One good U-Turn deserves another...
They may not be better in the long run but they certainly aren't as bad yet.
Getting shot in the head is never nice, but given the choice between being shot in the head this Christmas or in a few Christmases' time, I think I know which I'd prefer. But if you have a hankering for an extra hole in the head sooner rather than later, that's your affair. ;)
Re: OK I will spell it out. @ AC 13:24
This is called trying to "have your cake and eat it too".
You simply cannot buy a "cheap dual core gaming PC" that will "spank both consoles". A modern spec'd gaming PC might be able to do that, but not a comparably priced PC, certainly not a "cheap" one, where "cheap" really has to be talking about comparable price. Otherwise the comparison is meaningless.
Why? Obvious. Of course something that costs 2-3 or 4-5x the price of something else is going to be better. Duh. It's completely pointless to say that a "cheap Ferrari" will "spank a Ford Mondeo". For example.
And you also left out the part where you buy your much more expensive PC in the hope of getting all these enhanced effects in the game, but with no guarantee that any one game will take advantage of your particular selection of equipment, because of the variation and constantly shifting/improving state of the hardware out there. It simply isn't cost effective for publishers to spend time tweaking their products to make the most of every possible bit of kit it might run on, and certainly not on the older gear. Dammit, people should just upgrade already.
The consoles don't have this problem at all. And ironically, as consoles age, so the games that come out actually get better, even though the hardware hasn't changed. Simply because with a fixed target to aim at, the developers get better and better at extracting the most from that hardware. This is a cycle that has been repeated since the very first PlayStation - look at the first gen games on that console and then look at things like Gran Turismo which came much, much later. The hardware stays the same (and gets cheaper and cheaper) but the games get better and better.
Compare and contrast with the PC gaming mentality which is that developers don't need to worry too much about making the most of the hardware because the punters can just upgrade the hardware or downgrade the game settings if their rig isn't up to the job.
I know this flies in the face of the "truthy" thinking about how such things work, but it is the reality.
One good U-Turn deserves another...
Microsoft ran scared of losing sales in a catastrophically humiliating fashion when their XBone went up against PS4 at launch and were facing a DOA product. It really is no surprise that they would do a U-Turn in such circumstances.
But they showed their hand.
We know where they want to take gaming. So once they have sucked enough punters into the new XBone ecosystem, they will simply U-Turn again, with some spurious claims about "regretfully being forced to close loopholes that enabled piracy" etc etc. As far as I can tell, the U-Turn on game sharing and re-selling isn't even really a U-Turn. This "new policy" is all about gamers being free (as in speech) to share and re-sell. Nowhere do they appear to say that they guarantee - for any period, let alone in perpetuity - that it will be free (as in beer) to do so.
You really would have to be some prize idiot to buy this as a sincere change of attitude.