924 posts • joined Monday 23rd April 2007 10:09 GMT
Re: Not convinced..
For all the apologizing that everybody seems to be doing, saying that it is just a developer preview, that the design is still in flux, that it's merely a "mid-stride snapshot," yadda, yadda; the fact is that Apple released this design to the world: they put it squarely on the front page of their website, added a new section showcasing the new design, and made TV commercials for it. The new design was released, even if the software was not.
This was not a half-baked (well, aesthetically it is), let's-show-what-we-have-so-far effort--they thought they were done with it and were very proud. It may have been rushed, but they were satisfied enough with it to announce the new design publicly, outside the context of the WWDC, and to wax philosophical to the world about it.
If they change those icons now, or tone down the garish design before the actual release of the software, it'll be precisely due to the vociferous criticism and ridicule they have just received--it won't be because they planned it that way.
That, or the entire effort was an unmanaged and confused mess of people doing their own thing without communication or coordination, and completely lacking someone with a bit of taste and sufficient power to stop them from showing a half-done product.
Either way, it's a new kind of Apple, and it's not pretty.
Re: Not convinced..
I agree. The new look smells too much of designer masturbation for the sake of being different. It reminds me of those logo-type boutique mission statements that El Reg is so fond of ridiculing.
It went something like this...
Cook - "We gotta do something! This Samsung ripping us off has got to stop! Google, Nokia, HP, they can't leave us alone! We gotta do something. Get me Jony!"
Cook - "Jony, is there a way to make our look and feel 'copy-proof'? You know, add some fancy DRM or something that would prevent others from copying it...? Is there?"
Jony - "Hmmm, I don't think it works like that, sorry."
Cook - "Argh! There's gotta be a way!"
Intern #1 - "Sir, I think there may be a way..."
Cook - "Give it to me!"
Intern #1 - "You may not like it..."
Cook - "I'm desperate, hit me!"
Intern #1 - "What if... What if we made the screen... ugly? What if we designed it so crap-tastic that nobody would touch it?"
Intern #2 - "With funky psychedelic colours!"
Intern #1 - "And tiny illegible fonts!"
Intern #2 - "And weird indecipherable icons!"
Intern #1 - "Better yet--cute, childish My Little Pony icons!"
Intern #2 - "Yeah!"
Cook - "You know, that is so crazy, it just might work!"
Cook - "Jony?"
Jony - "I'm on it!"
Cook - "Oh and Jony, don't forget to write it up as the pinnacle of design... You know, for the snobs..."
Jony - "Got my thesaurus and Architectural Digest right here."
Extra! Extra! Founder of secret cabal accused of plotting to take over the world denies actual plans to take over the world!
News at 11.
Re: Bad timing last time?
The thing that you--and the article author--are missing is that H.264's success was not due to its ubiquity on the Web, but to its pervasive use throughout the entire content industry, including consumer media-delivery electronics.
This is the reason why VP9 will also fail against H.265: as long as Google keeps directing its efforts at the "Web delivery" side, they will miss the introduction and proliferation of H.265 throughout the content creation pipeline. This includes direct hardware support for the codec in every aspect of production, such as cameras, film-editing devices, etc. Why would anybody even consider re-encoding their product for the Web when the major and popular browsers already support the native encoding of their content?
This has happened many times before. The JPEG format became popular on the Web because it was a standard for digital photography that existed prior to its inclusion on web pages. It was only natural to include support for it in web browsers and enable myriad images at once, rather than force everyone to re-process all their photos just to put them up on a web page.
As long as content is produced off-Web first, non-Web-specific codecs will permeate the industry and the Web will have to adapt to support them. Enterprises like Apple and Microsoft understand this. This is why QuickTime and AVI were always intended to be content industry standards, not plain PC codecs. Google's VP9 is not being proposed as a new standard for the film industry as a whole, but as a Web media solution. As such, it'll remain a niche player, if it remains at all.
British Phonographic Institute
Wow! You guys have a British Phonographic Institute over there? How do you enroll, is there a test? Do they offer night classes?
I truly believe this is the future of machine-human interfaces. It's not there yet, and it will need to be commoditized and miniaturized much more, but that will come in time.
The part you are missing is that, pay attention now, IT IS NOT A VIDEO PORT. Did you get it? It is a generic, multi-purpose, serial link providing raw information.
This significantly simplifies the internal design of the device, lowering its cost and failure potential. One single port, as opposed to myriad dedicated sockets.
The thing to keep in mind is that not everyone will require wired video output (indeed, most within the Apple ecosystem will just stream wirelessly). They will be spared the expense and complexity of having a dedicated port for it.
Those that need such a thing can buy an adaptor that implements the necessary transcoding of the signal. Being controlled by software means that it can easily be upgraded and improved.
It is a rather clever and elegant solution to future-proofing the device.
Re: Yet another reason to shun the gootards.
Please enlighten me... what is that "Better than Heroin" thing that you talk about?
Re: Android now king of tablets too
And they'll still rake in more money than the rest combined.
Happy B'day, Perl!!!
I've been using Perl since the late '90s, ever since I was "baptized by fire" by a boss giving me a text-processing/CGI task, handing me the Camel Book, and demanding that it be completed in 6 days.
I discovered, to my amazement, that the book read so pleasantly from start to finish; a testament to Mr. Wall's wit and literary prowess. To my further amazement, I discovered that Perl, too, was a pleasant tool to work and play with--again, testament to Mr. Wall's background and experiences.
I love Perl. I particularly adore its two most important features, in the words of Tom Christiansen: "getitdoneness" and "whipituptitude."
Happy Birthday, Perl! Here's to another 25!!!
To this day, whenever I need a translation, I always type "babelfish.altavista.com" into my browser's address bar.
Ah! Those were the days.
Re: When will Google realise
They need to continue pounding on the ChromeOS nail. You see, Android is a fluke. It was intended to tide them over in the mobile space (giving them an entry against the BlackBerry behemoth), while they worked on their real project: the takeover of the PC market and expansion of their online ad/search business with an always-on, browser-only, cheap netbook-like PC.
That was the plan. The world would use cheap netbook PCs that ran *everything* in the browser, and required a constant connection to (guess what?) Google for all its services, not least web search. This is where Google makes its money, remember? Up until 2006 or 2007, every tech company was banging on about how the Web would be the platform of the future.
All roads would lead to the Web and they all would pass through Google.
Then something weird and unexpected happened: the iPhone and then the iPad showed how mobile was to be, and it wasn't the mighty Web. It was small devices running myriad native applications, including games. Android--the free, open source side project, of all things--became immensely popular. Who knew?
So Google has tried since then to follow through, but they just can't shake it: the world does not want an always-on, web-browser-only cheap netbook. The world wants smartphones and tablets that include the web as a mere addition to the many other features they have, most of them including native applications.
Android keeps getting more popular and ChromeOS keeps falling deeper into obscurity. But they can't let that happen, you see, because (and here's the secret, so pay attention), THERE IS NO MONEY IN ANDROID.
Re: Forstall was a polarizing figure, so this may be good for Apple
You have a point. However, keep in mind that, as polarizing as he may be, Steve Jobs managed to keep him around and make him and everyone else work together on many successful products for 14 years.
If anything, this shows an issue with Mr. Cook's leadership and ability to keep the lights on and the trains running on time after Jobs's passing.
Re: 'Large marine animal', yes...
Because if it were the wrong eye, it would be of a different marine animal. DOH!
Is that "adult" as in "adult entertainment," or as in "for grown-ups"?
Re: Old News?
No. The news this time is that the FAA has approved its use, in-cockpit, at all stages of flight.
Re: "developers... must build "wrapper" UIs that skin Apple's own Safari browser"
But the user doesn't care: he can "browse" the web the same way.
I like Safari too.
You win 3 Internets today, sir!
Backed by HP
I noticed that they had to mention "backed by HP" twice in two consecutive sentences, as if to ensure it's taken seriously.
"No, really: backed by HP. I swear. They're backing us up. Stop snickering at the back, you!"
Legions of consumers complain about the myriad buttons and complex nature of modern remote controls, yet they still use them, rather than walk over to the darn set to switch the channel.
And now you expect them to stand up and walk over to the other side of the room every time they want to show a picture or play a song?
Less than useless.
Re: I simply don't get...
That's because Samsung's lawyers are absolute idiots that couldn't defend themselves out of a paper bag, and obviously didn't even know how to show prior art to ridicule Apple. Had they even shown a single old smartphone from 2004, or even a screenshot of a Star Trek episode, the judge would have immediately thrown out the case in favor of Samsung.
Perhaps the case is a bit more complex than what you are supposing, and the prior art that you suggest does not really defend the *actual* claims of the suit. Perhaps the evidence presented to the jury by both sides has more depth and nuance than what you have picked up from blogs or Twitter feeds. Perhaps the case has more merit than what nerd-rage fueled posters in a tech rag can conceive.
Re: OSX is shoddy, hardware is commodity, only iOS?
>> "I don't really understand..."
>> "franken-GUI with all sorts of crap tacked on..."
>> "the outmoded, rubbish video drivers..."
>> "I would call it pretty..."
Wow. Talk about emotionally charged phrases. Such angst! Such passion!
You should really let them be and get on with your own life. Stop wasting yourself hating a faceless company that has done nothing to you.
You'll be happier, trust me.
Re: "it’s hard to argue that it’s still the best."
First, the hardware components are not necessarily the same. Second, the operating system is not the same as the cheap crap you buy without brand.
And finally, when you say "everyone thinks," you mean "you think," right? Or did you take a poll across the planet?
No, you are wrong. That is the definition when the phrase is used in a philosophical or rhetorical context. As such, it is a term of art.
However, when used in normal communication, as the article does, it is not a term of art; it means exactly what the English words would mean in that sentence: raise a point that has not been dealt with; invite an obvious question.
When being pedantic, it helps to read actual reliable reference sources and not, say, Slashdot--where your complaint is usually thrown about rather vociferously. The definition above, for instance, comes from the Oxford Dictionary, which also includes the definition to which you alluded.
Nice application of the spirit of the Rubaiyat to an observation on technology.
That's actually one of my favorite verses, from Fitzgerald's translation:
"The Moving Finger writes; and, having writ,
Moves on: nor all thy Piety nor Wit
Shall lure it back to cancel half a Line,
Nor all thy Tears wash out a Word of it."
Strangely sad and profound.
>> They chose seven inches for two reasons: it's more mobile and - perhaps the really important criterion - it's more readily distinguishable from the iPad.
Really? And here I was--with the rest of the rational people of the planet--assuming that the reasons the competitors chose a 7" tablet instead of ~10" were:
1. Production costs - This cannot be overstated: all manufacturers discovered that they could not compete with the iPad at the same price point, and the only reasonable way to lower the price was to produce one with cheaper components. Some did this (Asus, I'm looking at you!) and others just picked smaller screens.
2. Availability of materials - It's been widely publicized that Apple bought up pretty much all the manufacturing capacity of certain components, which prevents other device manufacturers from acquiring them in large enough quantities for a consumer product.
Re: Those were the days...
Psst! Those games are still around, and there's a thriving community of programmers keeping the platforms and the retro-style alive.
Head on over to <http://www.atariage.com/> and take a looksie.
Me? I'm in the "Intellivision Programming" forum garnering some enthusiasm for my upcoming title. ;)
A few corrections
First, in 1977 Atari released the "VCS," or Video Computer System. It wasn't until 1982 that it was renamed the "Atari 2600."
Second, the article gives the impression that Jobs and Woz worked on the Pong arcade machine. Jobs may have assisted (he was in and out of Atari's employ at the time), but Woz was recruited by Jobs to work on the Breakout arcade game later on.
And third, one of the reasons Atari had an advantage over their competitors was a cheap license from Magnavox. At the time, the full potential of the video game business was not yet understood by anybody, and so Magnavox sold Atari a license for what turned out to be a very shortsighted price.
Later on, when the video game business went gangbusters, there was not a soul in the industry that could match Atari's cost of production, in part because everyone else was paying high license royalties to Magnavox and the arcade franchises.
And finally, let's not forget the huge influence that MOS Technology (and later, Commodore Business Machines) had on Atari. Bushnell was smart and Atari was good, but they could not have built their empire on their own.
Wow, what a bunch of whinging trolls here.
I, for one, respect Mr. Orlowski and Page, and appreciate their take on this subject. It's very easy to side with every other blog joint out there, but it takes some special balls to rise above the "I read it on the Internet, so it must be true" mentality.
I like El Reg; I've been reading it for over 10 years. Some articles may be over the top, and still others may seem like just click-bait for trolls. But so it has always been, and it's all very much worth it for the unique point of view and quick wit; overall, most articles are indeed good.
Re: Fahrenheit 451
Funny, that. The copy of Fahrenheit 451 I had included a foreword written by Mr. Bradbury himself, explaining how, in the years since he wrote the book, it had been banned from schools and communities due to offensive language. He relished the irony.
I personally discovered the book when playing the eponymous game for the Commodore 64. A friend of mine mentioned it was a real book, and I immediately went to the school library to check it out. I read it overnight in a hurry (for I had a test the next day, for a completely unrelated subject). I proceeded to re-read it at a more leisurely pace the next week.
I've subsequently read the book maybe 4 times more since then.
It was a pleasure to burn.
Das ist alles.
Did you know that "trial" is the noun form of the verb "to try"?
Re: Accounting can be very creative
>> i dont see how money can be brought into this at all.
That's possibly because you're not a lawyer, and haven't read the actual goings-on of the trial, including the opposing arguments and the judge's comments.
The reason money "can be brought into this at all" is that a "fair use" defense requires certain conditions be met--one of which is that the use benefits society. This is typically qualified by the intent of the derived work: educational uses strengthen this argument, while purely commercial ones weaken it.
By the way, I'm also not a lawyer, but I read some of the documents from the court.
Reading and knowledge, they're a dangerous thing.
>> "It means you need to separate the concept of the Java programming language from the Java runtime environment. The language is what the code is written in. The runtime environment is where the compiled code gets executed. The compilation to byte code is what separates Dalvik and the JRE."
Actually, he did, but you didn't. When he said that "Java is fragmented," he is talking about the mind-share of developers using that language.
You seem to be obsessed with trying to keep Android and Java separated by grasping at straws, delineating their technical differences. Who cares?! The reason this is at trial, and a judge is seriously considering the issue (as opposed to dismissing it right from the outset), is that it is not as clear-cut as you claim.
Sun's (and now Oracle's) intention with *both* the language and framework was to have developers expend effort on training for a single platform that will then run everywhere, including mobile platforms. Android throws a wrench into the works by splintering the development efforts of Java developers into essentially two platforms, and arresting Sun's (now Oracle's) potential to release an official mobile Java platform.
Part of Oracle's argument is that *this* was Google's intention from the beginning: to avoid having to compete with yet another programming language and exploit an existing large group of developers already trained in the language--without paying for a license to clone it or its API.
OK I give up!
I'll never be able to understand the Japanese. WTF?!
Re: Ughh... Still shudder when I recall those days
Branches are jumps; they add an offset to the program counter. That they react to the status flags is just a technicality; they're still GOTOs.
The typical micro BASIC dialect promoted the following idiom for control flow:
IF (X = 1) THEN GOTO 300
That's assembled as (here in CP-1610 dialect) a compare followed by a conditional branch:
MVII #1, R0     ; load the constant 1 into R0
CMP X, R0       ; compare X with R0, setting the status flags
BEQ L300        ; branch--i.e. jump--to line 300 if they matched
That was the point of the poster you responded to.
Re: Sounds believable!
You are incorrect. They stored all data slurped, unencrypted *and* encrypted. That's why they modified the Kismet software.
Plus, the headers of all transmissions are sent unencrypted, even when the payload is encrypted. So they slurped the sources and destinations, and additional information, of all transmissions sniffed, even those that were clearly intended to be private.
This is why they are in trouble.
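To make the point concrete: in an 802.11 data frame, the frame control field and the MAC addresses always travel in the clear; only the payload gets encrypted. Here is a minimal illustrative sketch in Python (the function name and the fabricated frame are mine, not from Kismet or any real capture) that parses such a header:

```python
import struct

def parse_80211_header(frame: bytes) -> dict:
    """Parse the (always cleartext) header of an 802.11 data frame.

    Even when the payload is WEP/WPA-encrypted, the frame control
    field and the MAC addresses are transmitted in the clear, so a
    passive sniffer can always see who is talking to whom.
    """
    fc, _duration = struct.unpack_from("<HH", frame, 0)
    mac = lambda b: ":".join(f"{x:02x}" for x in b)
    return {
        # the "Protected Frame" bit signals an encrypted payload
        "protected": bool(fc & 0x4000),
        "receiver": mac(frame[4:10]),     # address 1
        "transmitter": mac(frame[10:16]), # address 2
        "bssid": mac(frame[16:22]),       # address 3 (infrastructure mode)
    }

# A fabricated frame: cleartext header, opaque "encrypted" payload.
header = (
    struct.pack("<HH", 0x4108, 0)        # frame control (data, protected), duration
    + bytes.fromhex("aabbccddeeff")      # receiver MAC
    + bytes.fromhex("112233445566")      # transmitter MAC
    + bytes.fromhex("665544332211")      # BSSID
    + struct.pack("<H", 0)               # sequence control
)
payload = b"\x99" * 32                   # stand-in for ciphertext
info = parse_80211_header(header + payload)
print(info)
```

Note that the parser never touches the payload at all: the who-talks-to-whom metadata is fully recoverable from the header alone, which is exactly the privacy issue at stake.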
Re: Am I alone?
>> We seem to be moving from a data-centric view of the world to a view which encompasses the application which created the data, and I don't think I like it.
I must say that this has always been the goal of personal computing. Why is it important to access files directly when you can only use them with specific applications? Isn't it the job of the applications to handle their own data types, rather than deferring all this administrative work to the user?
I want my photo application to have access to my photos and show them to me, and allow me to edit them. I further want this access abstracted. I profit not in the least by having them thrown in with a bunch of text files and other documents of myriad type; it just adds to the confusion and the maintenance burden.
Likewise for word processor documents: why would I ever need to see them thrown anonymously in a folder without context? If I ever need to access them, it is to read, edit, or share them, and a suitable application would allow me to do so.
Compartmentalizing the file system by document type (or application-specific function) using folders goes some way towards this, but why not extend this to its logical conclusion and abstract the entire file system?
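The idea above can be sketched in a few lines of Python. This is a toy model of my own devising, not Apple's actual implementation: each application asks a store for documents of the types it handles, and never walks a raw folder tree.

```python
from dataclasses import dataclass, field

@dataclass
class DocumentStore:
    """Toy model of an app-scoped document store: each application sees
    only documents of the types it registered for, instead of one shared
    folder hierarchy that the user must curate by hand."""
    _docs: dict = field(default_factory=dict)  # doc_type -> {name: content}

    def save(self, doc_type: str, name: str, content: bytes) -> None:
        self._docs.setdefault(doc_type, {})[name] = content

    def documents_for(self, doc_type: str) -> dict:
        # An app asks for "photo" or "text"; it never sees other types.
        return dict(self._docs.get(doc_type, {}))

store = DocumentStore()
store.save("photo", "beach.jpg", b"...jpeg bytes...")
store.save("text", "notes.txt", b"meeting notes")

# The photo app sees only photos; text files never clutter its view.
print(sorted(store.documents_for("photo")))
```

The point of the sketch is the interface, not the storage: the user never files anything, yet every application finds exactly its own data.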
Re: Not a great solution
But the price is set by the publisher. If they don't sell books, they don't make money. It is in their interest to price their books reasonably.
Amazon, on the other hand, sells abso-fscking-lutely everything under the Sun, and can afford to lower its prices below cost (as it was doing before) to prevent competition.
Re: What kind of article is this?
Actually, Amazon is doing to e-book Publishers what Wal-Mart does to their suppliers: they are a de facto monopoly in the market and get to dictate the terms to them. Amazon did not allow any publishers to raise their wholesale price, and would decide what price to set. Most of the time, the price was artificially low and at a loss, to bar entry into the market by competitors.
This prevented the publishers from being able to sell their goods to someone else, and in essence made them beholden to Amazon's whims. This was the problem. This is what the author is decrying in this article: where was the DOJ then?
The prices are a bit higher now, while the market is still nascent. They are expected to fall as competition enters the market. This is the point.
Moreover, the prices are higher because before they were kept artificially low by Amazon. Yes, it may have been great for consumers in the short term, but the complaint from the publishers has always been that it was not sustainable for the industry.
Does it really matter?
If Apple are making a lot of money, much more than Google is from Android, why should they care about market share?
In a nutshell, Amazon went crazy with VC money growing their automated distribution and inventory management systems at the turn of the Century (as many other start-ups did at the time), and when the dot-com bubble burst there was pressure to make money out of their existing infrastructure.
They came up with some clever marketing to convince large and mid-size corporations to hand over computer processing to their hosted environments, harking back to the days of centralized mainframes and "utility computing" in the 1970s.
Amazon weren't the only ones; many other companies were caught by the economic downturn with large data centers and little to process.
It will fail.
The main reason is elucidated by the article:
>> "There are two advantages here."
Great! Not one, but two. I'm excited!
>> "First, content owners need only hold a couple of copies of each title, one in SD, the second in HD."
Uh.. Sounds fine for them. Alright, what do I get?
>> "Secondly, if a studio decides to offer, say, 4K by 2K copies, it can do so easily."
Hum... That's for *them*, too. Surely, there's something good for me, since it's designed to improve my experience, and make it more convenient than pirating, right?
>> "But, yes, all this involves DRM, to prevent folk giving content away to all and sundry."
DOH! I should have known. That, and the focus on a "rental" model, are the real reasons why this endeavor will surely fail.
I'll stick to ripping my own DVDs into my computer, and playing them on my Apple TV.
But how much of a "runaway success" is it really, if the product makes no money? It's true, Google created a service which ended up being best of breed and loved by all, but which had to be modified from its original design in order to make money and support a business.
The problem is that, as long as Google "owned" the Web, as they seemed to do for some time, they could avoid burdening their flagship service with too many intrusive ads, because the sheer scale of its usage made enough money with minimal ads. However, this is predicated on Google being the de facto portal to the Web and all online services and destinations.
The truth is that they didn't contemplate this changing, or at least not so soon.
It's not that Google needs to be Facebook in order to survive in the current marketplace; it's that they need to be *something other* than what they are right now. People are accessing online services through myriad resources that are not Google, and "Web search" is much less relevant at the moment.
Some may call these "silos," but in essence, they are specialized utilities. Just like the electric company provides you electricity and the water works company provides you with water and sewage service, different online resources provide different services. That they require discrete information from you to do this, well, that's par for the course--the water and electric companies also need to know where you live and how you like to pay, and by extension of you being a customer, will always know how you consume their services.
Google chose to be Facebook, because they thought that turning search algorithms for online web pages into a social-graph analysis machine would be simple (perhaps it is), but mostly because Facebook was raking in the money, and Google wanted some of that.