From the Omni Consumer Products division at Playtex, introducing the new Maidenform T2 Series, with RoboCup (tm) technology.
In Puerto Rico, we say "la Internet." This comes by metonymic gender assignment, from "la inter-red," "net" being an anglicism of the proper Spanish word "red" (net).
Surprising that La Real Academia chose differently. Now for the important stuff, ¿dónde está la cerveza?
The definition of re-broadcasting or public performance does not stipulate the size of the audience. Do you mean to say that if I am the only person sitting and watching a movie in a theater, the owner no longer has to pay licensing fees? That is ridiculous.
The point of the ruling is that Aereo is renting access to a mechanism that allows content re-broadcasting. Regardless of whether it is to a single person or many, the fact that *they* are facilitating the broadcast (and making money out of it) infringes on content owners' property rights.
To the second point that everybody seems to make, it is prohibited by Copyright law to make copies or re-distribute protected content. Individuals are granted an exception for personal use under the fair use clause.
However, this exception is acknowledged specifically because personal use has a self-limiting effect on any such infringement. The moment that this use is mechanised by clever technology to overcome such limiting factors--and especially when its sole purpose is to make money off it--it is no longer fair use. It is as simple as that.
You want to lay a long cable from your house to an external antenna outside the city, fine: you deal with the costs and risks, and it is your personal problem.
You end up with a nifty setup and want to start renting it out to others? That's no longer the same, and you will be found infringing.
As such, Aereo was exploiting a loophole with clever technology, and was caught.
You forgot to say, "Pip! Pip! Cheery-oh."
>> My point is there isn't any features in Glass that is not in a phone so logically banning Glass from resale should mean phones should also be banned from resale for the same reasons.
"Logically," a phone is different because,
1. It has other functions (e.g., phone);
2. Under normal and common usage, it is not ever-present on the face of the user, ready and willing to snap photos or record video at an instant;
3. It is harder to use for surreptitious recording, since a stock phone needs to be placed in line of sight, which is not necessarily the optimal position for common personal usage;
4. It is not as creepy.
Not being able to perceive nuance in real-life situations is the mark of a basement-dwelling nerd. So, take your self-involved, anti-social, Asperger's view of the world elsewhere. In the real world, things are a bit more complex than merely saying "x looks like y, ergo x = y. QED. I HAS TEH LOJIKS."
>> I'm really struggling to see their rationale for this.
It's simple, and the article spells it out explicitly: Microsoft have recognized that they have lost their monopoly position, and so in order to ensure their future survival, they are changing business strategies to concentrate on large enterprises--the ones that have the money. This is the same strategy Oracle took, ignoring the SME and hobbyist, and focusing on the big bucks.
Supporting small to medium businesses does not pay the bills--certainly not any enterprise that counts its pennies so much that it feels the need to "cheat" by using trial licenses for actual lab and deployment testing. That won't do, and Microsoft needs to let go of any "dead weight" that does not add to the bottom line.
I'm not saying it is the correct strategy. However, it is not wrong either. It is just a realization by MS that they just can't do whatever they want anymore--they need to focus on making money and guaranteeing their future success.
That is a fallacy. It is equally dumb as suggesting that all male priests are child-molesting sociopaths because the vocation requirements attract that sort of sensibility (or lack thereof).
I assert that sociopathic, non-empathetic nerds, unskilled in inter-personal relationships, are NOT the most productive nor the best qualified for IT positions, including computer programming. For the same reasons that every other vocation or industry of substance in modern civilization tries to immerse its apprentices in the humanities, IT personnel would do well to enrich their experiences with art, history, social sciences, and all other human endeavors.
Problem solving, logic, and thought are enriched and heightened by human experiences, and are not the exclusive domain of robots. In fact, they most categorically exclude robots.
Because it doesn't work as you say.
Case in point: "AntennaGate" and Jobs' response. The company maintained secrecy and admitted to nothing, and when Jobs finally came out to say something, he was assertive and direct, and told everybody how it was all blown out of proportion, and that it was a problem common to all mobile phones.
Do you recall what happened next? The problem went away. It was no longer in the news and only the hard-core anti-Appleist continued to talk about it. The public relations imbroglio was defused, and the conversation changed to the next Internet meme.
Most of the time, Apple never responds to those issues publicly, and eventually they go away just the same.
Contrast this against the "iOS Map Incident" and Cook's response. Cook came out with his tail between his legs, crying mea culpa at the media and the masses, admitting the error of their ways while begging profusely for forgiveness, and promising to make it all better.
And what was the result of that? To this day, whenever Apple is in the headlines of any newspaper or online publication--especially if it's due to a problem--they seldom fail to mention the "recent problems with the release of their mapping application," and proceed to make a comparison. The conversation changed to focus on Apple being in a precarious position of weakness, and their serious problems at product releases.
Right or wrong, "AntennaGate" went out of the public sphere of discussion, while "MapGate," being a self-admitted public defeat, remains a point of attack forever.
You should go back to Customer Relations 101, and review your notes. Then offer some pointers to Cook as well.
Except that nobody cares any more. IT departments are not "allowing" users to bring their own devices--the practice was imposed on them, due to executive and business pressures, in spite of their policies.
I really doubt that enterprises are now going to seriously entertain going back to some limited functionality, encompassed in some alien platform interface, for the sake of some purported security benefit. Especially on iOS devices.
For all the apologizing that everybody seems to be doing, saying that it is just a developer preview, that the design is still in flux, that it's merely a "mid-stride snapshot," yadda, yadda; the fact is that Apple released this design to the world: they put it squarely on the front page of their web-site, added a new section showcasing the new design, and made TV commercials for it. The new design was released, even if the software was not.
This was not a half-baked (well, aesthetically it was), let's-show-what-we-have-so-far effort--they thought they were done with it and were very proud. It may have been rushed, but they were satisfied enough with it to announce the new design publicly, outside the context of the WWDC, and to wax philosophical to the world about it.
If they change those icons now, or tone down the garish design before the actual release of the software, it'll be precisely due to the vociferous criticism and ridicule they have just received--it won't be because they planned it that way.
That, or the entire effort was an unmanaged and confused mess of people doing their own thing without communication or coordination, and completely lacking someone with a bit of taste and sufficient power to stop them from showing a half-done product.
Either way, it's a new kind of Apple, and it isn't pretty.
I agree. The new look smells too much of designer masturbation for the sake of being different. It reminds me of those logo-type boutique mission statements that El Reg is so fond of ridiculing.
Cook - "We gotta do something! This Samsung ripping us off has got to stop! Google, Nokia, HP, they can't leave us alone! We gotta do something. Get me Jony!"
Cook - "Jony, is there a way to make our look and feel 'copy-proof'? You know, add some fancy DRM or something that would prevent others from copying it...? Is there?"
Jony - "Hmmm, I don't think it works like that, sorry."
Cook - "Argh! There's gotta be a way!"
Intern #1 - "Sir, I think there may be a way..."
Cook - "Give it to me!"
Intern #1 - "You may not like it..."
Cook - "I'm desperate, hit me!"
Intern #1 - "What if... What if we made the screen... ugly? What if we designed it so crap-tastic, that nobody would touch it?"
Intern #2 - "With funky psychedelic colours!"
Intern #1 - "And tiny illegible fonts!"
Intern #2 - "And weird indecipherable icons!"
Intern #1 - "Better yet--cute, childish My Little Pony icons!"
Intern #2 - "Yeah!"
Cook - "You know, that is so crazy, it just might work!"
Cook - "Jony?"
Jony - "I'm on it!"
Cook - "Oh and Jony, don't forget to write it up as the pinnacle of design... You know, for the snobs..."
Jony - "Got my thesaurus and Architectural Digest right here."
Extra! Extra! Founder of secret cabal accused of plotting to take over the world denies actual plans to take over the world!
News at 11.
The thing that you--and the article author--are missing is that H.264's success was not due to ubiquity on the Web, but to its pervasive use throughout the entire content industry, including consumer media-delivery electronics.
This is the reason why VP9 will also fail against H.265: as long as Google keeps on directing its efforts at the "Web delivery" side, they will miss the introduction and proliferation of H.265 throughout the content creation pipeline. This includes direct hardware support for the codec on every aspect of production, such as cameras, film-editing devices, etc. Why would anybody even consider re-encoding their product for the Web when the major and popular browsers already support the native encoding of their content?
This has happened many times before. The JPEG format became popular on the Web because it was a standard for digital photography that existed prior to its inclusion on web pages. It was only natural to include support for it in web browsers and enable myriad images at once, rather than force everyone to re-process all their photos just to put them up on a web page.
As long as content is produced off-Web first, non-Web-specific codecs will permeate the industry and the Web will have to adapt to support them. Enterprises like Apple and Microsoft understand this. This is why QuickTime and AVI were always intended to be content industry standards, not plain PC codecs. Google's VP9 is not being proposed as a new standard for the film industry as a whole, but as a Web media solution. As such, it'll remain a niche player, if it remains at all.
Wow! You guys have a British Phonographic Institute over there? How do you enroll? Is there a test? Do they offer night classes?
I truly believe this is the future of machine-human interfaces. It's not there yet, and it will need to be commoditized and miniaturized much more, but that will come in time.
The part you are missing is that, pay attention now, IT IS NOT A VIDEO PORT. Did you get it? It is a generic, multi-purpose, serial link providing raw information.
This significantly simplifies the internal design of the device, lowering its cost and failure potential. One single port, as opposed to myriad dedicated sockets.
The thing to keep in mind is that not everyone will require wired video output (indeed, most within the Apple ecosystem will just stream wirelessly). They will be spared the expense and complexity of having a dedicated port for it.
Those that need such a thing can buy an adaptor that implements the necessary transcoding of the signal. Being controlled by software means that it can easily be upgraded and improved.
It is a rather clever and elegant solution to future-proofing the device.
Please enlighten me... what is that "Better than Heroin" thing that you talk about?
And they'll still rake in more money than the rest combined.
I've been using Perl since the late '90s, ever since I was "baptized by fire" by a boss giving me a text-processing/CGI task, handing me the Camel Book, and demanding that it be completed in 6 days.
I discovered, to my amazement, that the book read so pleasantly from start to finish; a testament to Mr. Wall's wit and literary prowess. To my further amazement, I discovered that Perl, too, was a pleasant tool to work and play with--again, testament to Mr. Wall's background and experiences.
I love Perl. I particularly adore its two most important features, in the words of Tom Christiansen: "getitdoneness" and "whipituptitude."
Happy Birthday, Perl! Here's to another 25!!!
To this day, whenever I need a translation, I always type "babelfish.altavista.com" into my browser's address bar.
Ah! Those were the days.
They need to continue pounding on the ChromeOS nail. You see, Android is a fluke. It was intended to tide them over in the mobile space (giving them an entry against the BlackBerry behemoth), while they worked on their real project: the take-over of the PC market and expansion of their online ad/search business with an always-on, browser-only, cheap netbook-like PC.
That was the plan. The world would use cheap netbook PCs that ran *everything* in the browser, and required a constant connection to (guess what?) Google for all its services, not least web search. This is where Google makes its money from, remember? Up until 2006 or 2007 every tech company was banging on about how The Web will be the platform of the future.
All roads would lead to the Web and they all would pass through Google.
Then something weird and unexpected happened: the iPhone and then the iPad showed what mobile was to be, and it wasn't the mighty Web. It was small devices, running myriad native applications, including games. Android--the free, open source side project, of all things--became immensely popular. Who knew?
So Google has tried since then to follow through, but they just can't shake it: the world does not want an always-on, web-browser-only cheap netbook. The world wants smartphones and tablets that include the web as a mere addition to the many other features they have, most of them including native applications.
Android keeps getting more popular and ChromeOS keeps falling deeper into obscurity. But they can't let that happen, you see, because (and here's the secret, so pay attention), THERE IS NO MONEY ON ANDROID.
You have a point. However, keep in mind that, as polarizing as he may be, Steve Jobs managed to keep him along and make him and everyone else work together on many successful products for 14 years.
If anything, this shows an issue with Mr. Cook's leadership and ability to keep the lights on and the trains running on time after Jobs' passing.
Because if it were the wrong eye, it would be of a different marine animal. DOH!
Is that "adult" as in "adult entertainment," or as in "for grown-ups"?
No. The news this time is that the FAA has approved its use, in-cockpit, at all stages of flight.
"pushing the envelope of Action Sci-Fi towards an art form in itself," LOL! Spoken like a true nerd.
But the user doesn't care: he can "browse" the web the same way.
I like Safari too.
You win 3 Internets today, sir!
I noticed that they had to mention "backed by HP" twice in two contiguous sentences, as if to ensure it's taken seriously.
"No, really: backed by HP. I swear. They're backing us up. Stop snickering at the back, you!"
Legions of consumers complain about the myriad buttons and complex nature of modern remote controls, yet they still use them, rather than walk over to the darn set to switch the channel.
And now you expect them to stand up and walk over to the other side of the room every time they want to show a picture or play a song?
Less than useless.
That's because Samsung's lawyers are absolute idiots that couldn't defend themselves out of a paper bag, and obviously didn't even know how to show prior art to ridicule Apple. Had they even shown a single old smartphone from 2004, or even a screenshot of a Star Trek episode, the judge would have immediately thrown out the case in favor of Samsung.
Perhaps the case is a bit more complex than what you are supposing, and the prior art that you suggest does not really defend the *actual* claims of the suit. Perhaps the evidence presented to the jury by both sides has more depth and nuance than what you have picked up from blogs or Twitter feeds. Perhaps the case has more merit than what nerd-rage fueled posters in a tech rag can conceive.
>> "I don't really understand..."
>> "franken-GUI with all sorts of crap tacked on..."
>> "the outmoded, rubbish video drivers..."
>> "I would call it pretty..."
Wow, talk about emotionally charged phrases. Such angst! Such passion!
You should really let them be and think of your own. Stop wasting yourself hating a faceless company that has done nothing to you.
You'll be happier, trust me.
First, the hardware components are not necessarily the same. Second, the operating system is not the same as on the cheap unbranded crap you buy.
And finally, when you say "everybody thinks," you mean "you think," right? Or did you take a poll across the planet?
No, you are wrong. That is the definition when the phrase is used in a philosophical or rhetorical context. As such, it is a term of art.
However, when used in normal communication, as the article does, it is not a term of art; it means exactly what the English words would mean in that sentence: raise a point that has not been dealt with; invite an obvious question.
When being pedantic, it helps to read actual reliable reference sources and not, say, Slashdot--where your complaint is usually thrown about rather vociferously. The definition above, for instance, comes from the Oxford Dictionary, which also includes the definition to which you alluded.
Nice application of the spirit of the Rubaiyat to an observation on technology.
That's actually one of my favorite verses, from Fitzgerald's translation:
"The Moving Finger writes; and, having writ,
Moves on: nor all thy Piety nor Wit
Shall lure it back to cancel half a Line,
Nor all thy Tears wash out a Word of it."
Strangely sad and profound.
>> They chose seven inches for two reasons: it's more mobile and - perhaps the really important criterion - it's more readily distinguishable from the iPad.
Really? And here I was--with the rest of the rational people of the planet--assuming that the reasons the competitors chose a 7" tablet instead of ~10" was:
1. Production costs - This cannot be overstated: all manufacturers discovered that they could not compete with the iPad at the same price point, and the only reasonable way to lower the price was to produce one with cheaper components. Some did this (Asus, I'm looking at you!) and others just picked smaller screens.
2. Availability of materials - It's been widely publicized that Apple bought up pretty much all the manufacturing capacity for certain components, which prevents other device manufacturers from acquiring them in large enough quantities for a consumer product.
Psst! Those games are still around, and there's a thriving community of programmers keeping the platforms and the retro-style alive.
Head on over to <http://www.atariage.com/> and take a looksie.
Me? I'm in the "Intellivision Programming" forum garnering some enthusiasm for my upcoming title. ;)
First, in 1977 Atari released the "VCS," or Video Computer System. It wasn't until some time in the 1980s that it was renamed the "Atari 2600."
Second, the article gives the impression that Jobs and Woz worked on the Pong arcade machine. Jobs may have assisted (he was in and out of Atari payroll at the time), but Woz was recruited by Jobs to work on the Breakout arcade game later on.
And third, one of the reasons Atari had an advantage over their competitors was a cheap license from Magnavox. At the time, the full potential of the video game business was not yet understood by anybody, and so Magnavox sold Atari a license at what turned out to be a very shortsighted price.
Later on, when the video game business went gangbusters, there was not a soul in the industry that could match Atari's cost of production, in part because everyone else had to pay high license tributes to Magnavox and the arcade franchises.
And finally, let's not forget the huge influence that MOS Technology (and later, Commodore Business Machines) had on Atari. Bushnell was smart and Atari was good, but they could not have built their empire on their own.
Wow, what a bunch of whinging trolls here.
I, for one, respect Mr. Orlowski and Page, and appreciate their take on this subject. It's very easy to side with every other blog joint out there, but it takes some special balls to rise above the "I read it on the Internet, so it must be true" mentality.
I like El Reg; I've been reading it for over 10 years. Some articles may be over the top, and still others may seem like mere click-bait for trolls. But so it has ever been, and it's all very much worth it for the unique point of view and quick wit; overall, most articles are indeed good.
Yes, but how much is that in Football Fields, or Libraries of Congress?
Funny, that. The copy of Fahrenheit 451 I had included a foreword written by Mr. Bradbury himself explaining how, in the intervening years since he wrote the book, it had been banned from schools and communities due to offensive language. He relished the irony.
I personally discovered the book when playing the eponymous game for the Commodore 64. A friend of mine mentioned it was a real book, and I immediately went to the school library to check it out. I read it overnight in a hurry (for I had a test the next day, for a completely unrelated subject). I proceeded to re-read it at a more leisurely pace the next week.
I've subsequently read the book maybe 4 times more since then.
Das ist alles.
Did you know that "trial" is the noun form of the verb "to try"?
>> i dont see how money can be brought into this at all.
That's possibly because you're not a lawyer, and haven't even read the actual goings-on of the trial, including the opposing arguments and the judge's comments.
The reason money "can be brought into this at all" is that a "fair use" defense requires certain conditions be met--one of which is that it benefits society. This is typically qualified by the intent of the derived work: educational uses strengthen this argument, while purely commercial ones weaken it.
By the way, I'm also not a lawyer, but I read some of the documents from the court.
Reading and knowledge, they're a dangerous thing.
>> "It means you need to separate the concept of the Java programming language from the Java runtime environment. The language is what the code is written in. The runtime environment is where the compiled code gets executed. The compilation to byte code is what separates Dalvik and the JRE."
Actually, he did, but you didn't. When he said that "Java is fragmented," he is talking about the mind-share of developers using that language.
You seem to be obsessed with trying to keep Android and Java separated by grasping at straws, delineating their technical differences. Who cares?! The reason this is at trial and a judge is seriously considering the issue (as opposed to dismissing it right from the outset) is because it is not as clear-cut as you claim.
Sun's (and now Oracle's) intention with *both* the language and framework was to have developers expend effort on training for a single platform that will then run everywhere, including mobile platforms. Android throws a wrench into the works by splintering the development efforts of Java developers into essentially two platforms, and arresting Sun's (now Oracle's) potential to release an official mobile Java platform.
Part of Oracle's argument is that *this* was Google's intention from the beginning: to avoid having to compete with yet another programming language and exploit an existing large group of developers already trained in the language--without paying for a license to clone it or its API.
I'll never be able to understand the Japanese. WTF?!
Branches are jumps; they add an offset to the program counter. That they react on the status flags is just a technicality, they're still GOTOs.
The typical micro BASIC dialect promoted the following idiom for control flow:
IF (X = 1) THEN GOTO 300
That's assembled as (depending on your CPU dialect):
MVII #1, R0   ; load the immediate value 1 into R0
CMP X, R0     ; compare X against R0, setting the status flags
BEQ L300      ; if equal, "GOTO" line 300 by adjusting the program counter
That was the point of the poster you responded to.
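To make the equivalence concrete, here is a minimal sketch in Python (the opcode names and program are made up for illustration, not any real instruction set) of an interpreter where a conditional branch is literally a guarded addition to the program counter:

```python
# A conditional branch is just a guarded GOTO: BEQ adds an offset
# to the program counter only when the zero flag is set.

def run(program, x):
    pc = 0          # program counter
    acc = 0         # accumulator
    zero = False    # status flag, set by CMP
    out = []
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "MVI":      # load an immediate value into the accumulator
            acc = arg
        elif op == "CMP":    # compare the accumulator against x, set flag
            zero = (acc == x)
        elif op == "BEQ":    # branch: add offset to pc if the flag is set
            if zero:
                pc += arg
        elif op == "OUT":    # emit a value, so we can observe control flow
            out.append(arg)
    return out

# Models: IF (X = 1) THEN GOTO past-the-next-instruction
prog = [("MVI", 1), ("CMP", None), ("BEQ", 1),
        ("OUT", "x != 1"), ("OUT", "done")]
```

Running `run(prog, 1)` skips the first OUT via the branch, while `run(prog, 2)` falls through to it; either way the "branch" is nothing more than arithmetic on `pc`.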