Hmm. It does look too similar to be a coincidence.
But... I don't think Apple's designers and managers are idiots. As with the "iPhone" trademark, I suspect they felt it would cost less to let SBB contact them and name their price if a trademark really was involved (although how you "trademark" a machine's user interface I've no idea; trademarks are usually static images).
I also can't help noticing a pattern emerging here. I wonder if SBB said "No," knowing full well that they'd get a ton of free publicity by doing so and demanding Apple pay them on their own terms. (And by "their own terms", I suspect they first had some meetings with Apple's people to decide what those terms would be.)
Whatever you may think of Apple, "incompetent" isn't a term you can apply to their management team. There's more to this than meets the eye.
Re: iOS 6 Maps - Cloud Based Satellite Imagery
Not even Google's "satellite imagery" is particularly detailed. Satellite imaging isn't that good. In fact, the problem you're referring to is entirely due to the poor resolution of the available satellite images.
To get those close-up images of individual cars parked in driveways, you need to pay someone to fly over the country and take photographs. It's called "aerial photography" and Apple are clearly going to need time to fill in the gaps. Even Google still has a way to go to achieve 100% coverage of the planet's land masses.
Here in Italy, great swathes of the countryside are devoid of detail. And that's in Google Maps, not just Apple's new Maps app. So it's not just Apple.
Mapping the planet is a constant work in progress—a permanent beta. Nobody has "perfect" coverage. Ever.
For those whining that Google's Maps app was "better"...
... try living in the Italian countryside sometime. Until iOS 6, my home town was hidden by clouds. (Yes, exactly the problem others are now complaining about in the Apple Maps app.)
Maps are ALWAYS a work in progress. They're never done. And with a dataset as big as an entire planet, expecting perfection out of the box is idiotic. Sorry, but it just is.
Yes, the aerial photography needs work, but Google's early efforts were no better and certainly aren't glitch-free even today. Oh sure, you'll get excellent details of major cities like London and Rome, but the further out you get from either, the lower the detail and accuracy.
Google also clearly couldn't be bothered to update their iOS Maps app for years, so Apple naturally got fed up and decided to go for the nuclear option. I can't say I blame them.
Google will doubtless release a magically upgraded and improved iOS Maps app sooner rather than later. I'm betting you'll be encouraged to enter your Google account details to "get the most" out of it. (I.e. give Google even more data about your every single move. For free. Nice con, that: convincing your raw materials suppliers to give it all to you for nothing, so you can benefit from fat profit margins. And making them feel like they're doing you a favour? Priceless!)
Re: Translate me a Spaceship
Bullshit. Stop assuming everyone is a feckless imbecile unless they have a beard and a pocket protector. All you need to do to fix the 'problem' is to ensure that a jury of my peers comprises people with a similar level of intelligence and skills.
Re: No Thanks...
"User friendly" means "easy to use".
Being able to "tweak almost all settings and features to suit the user" is known in the trade as customisability, and iOS offers plenty of such features too.
As for the Samsung Galaxy SIII, I've played with one too. Nasty screen, irritating GUI (sorry, Google, but you still don't get it, do you? It's consistency that people want, not battery-draining widgets and overblown transitions).
And, of course, it's a Samsung, so it's a blatant rip-off of Apple's design language as well. Why buy a bad photocopy of a Van Gogh, when I can have the original for about the same price?
As for the childish "walled garden" and "fanboi" bollocks: I'd rather live in a gated community than the Big Brother house.
One of my cousins is a famous musician / celebrity here in Italy. I've worked on lyrics with him, and am a (very) amateur musician myself.
Preparation and planning are crucial: it's the difference between making a profit and losing the shirt off your back. Insurance, travel, venue hire, catering, sets, etc. are very much dependent on the kind of touring you're going to do. U2 will be renting rooms at the best hotels, with plenty of room service. An indie band is going to be staying in B&Bs and eating at a greasy spoon—assuming you even have that much money to spend.
Many of the items you list aren't paid for up-front, so the revenues from a concert can often be used to pay them off. Indies aren't going to go for costumes (unless they're Lordi, whose costumes are integral to the band's image).
Instruments and technology are things you'd bring with you, not hire on the day. The only exception is the front-of-house mixing desk, which is often provided by the venue, but may need to be hired in if the concert is being played outdoors. (In Italy, many bands play small villages and towns, setting up prefab stages in their piazzas. The local communities often pay towards the costs in the hopes of making the money back from visitors at concession stalls. The UK tends to prefer pubs and similar venues for young bands who are still learning the ropes.)
As for studio time: six figures tops. Any songwriter worth their salt will prepare as much of each track as they can in advance using their own home studios (i.e. for free). This speeds up the studio recordings, which are expensive if you need to hire in an orchestra as well. Very few studios can handle that—we're talking AIR or Abbey Road—so the cheaper option is to hire a much smaller, cheaper, studio for a little longer and record the orchestra sections separately instead. (I.e. the string section, the woodwinds, percussion, brass, etc. don't get to play together.)
That "six figures" estimate assumes every single track on the album will require an orchestra. This is not necessarily the case: most orchestral backing tracks heard today are actually sampled orchestral sound banks, such as those sold by Garritan. These "sample instruments" are now so good that you can usually get away with using them exclusively; Garritan's own demos are impressive given that they're not being performed by a real orchestra.
As for promo videos: one such video, made in 2003 and shown at the ICA in London in 2004, cost all of about £2K, most of which went on developing the 16mm film.
The only 'cheats' there were the music and actors, all of whom worked for free. Had library music been used, the cost would have been bumped up by £100 or so; paying the actors Equity rates (currently £130-ish per half-day) would have added a little more. Let's assume AP cannot find a single close friend or relative to do her a favour, so she has to pay everyone. Even then, we're talking not much more than £10K or so for a straightforward "film the band playing live from a few angles" music video.
Merchandising—T-shirts, fancy posters and other folderol—isn't that expensive either. Only an idiot would order massive supplies of everything up-front; the sensible way to do it is to order just enough for two or three concerts (she'd know how many people can be accommodated at each one well in advance), then use the income from the concerts to pay for the next batch of supplies. Thus your up-front capital expenditure isn't that great. Again, we're talking about a pretty small portion of that 1.2 million from Kickstarter. Unless she's playing Wembley, she only needs to pay for a few thousand of each standard piece of merch. (The Kickstarter bait items are also unlikely to be more than a few grand or so in total, and they're a one-off expense.)
Finally, much of this can be set against taxes—which may well account for her decision to give the album itself away for free. AP is married to a multi-millionaire and clearly doesn't need money.
No matter how you try and wiggle your way around this, AP's request for a 'fan-orchestra' is hard to justify.
So let me get this straight...
... I'm supposed to lead on the platform with the worse ROI, using Java? (You know: the language currently most famous for its "write once, malware everywhere" philosophy?)
I think I'll pass, thanks.
I've driven from London to Rome a few times now, without even touching a map en route—except once, when there was some construction going on around Strasbourg.
There are these wonderful inventions called "road signs" that tell you which way to go. It's not that difficult. Check the route before you go. Make a mental (or written) note of which key cities you're likely to be heading for on each leg, then pin that list to your sun visor. In Europe, you can often just make a note of the EU-wide "E" routes, many of which run across multiple countries.
(If memory serves—it's been a couple of years since I last made such a journey as the petrol costs have become prohibitive—French motorway signage always includes the "E" route number as well as the national number. In Switzerland, they often don't bother displaying their own numbers on the motorway signage and just use the "E" number instead.)
A paper map as backup should be all you need.
Can't be done.
There are no factories elsewhere anywhere near the same scale as the Chinese plants. Take a good look at the sheer scale of some of Foxconn's and Pegatron's plants and you'll see the problem: they're basically city-sized factories. Just one Foxconn plant employs more people than currently live in Guildford.
The only example of manufacturing on a similar scale is Ford, who have a mega-factory that takes in raw materials (steel, rubber, etc.) and spits out complete cars: you'd need to create a brand new supply and logistics chain on a similar scale to that.
In the US, where human labour is orders of magnitude more expensive, robots are taking over, but you pay the price in flexibility and capital costs: those robots might not require a salary, but they do require an initial purchase (think six figures) plus ongoing servicing and maintenance. And a production line filled with robots is much harder to repurpose than one filled with people: a robot cell designed for, say, sandblasting an aluminium casing isn't going to be much use at soldering.
That's why China's manufacturing conglomerates are so successful: Foxconn can simply repurpose last year's production line to produce wannabe "me-too" products for other clients.
Apple would effectively need to fund the construction, fit-out and tooling of a brand new mega-factory and supply chain, from scratch. If you think Apple would then be dumb enough to hire *people* to do work that a robot can do, you've another think coming. So no, there wouldn't be much benefit to the US unemployment figures either way.
Apple, like Dell, Asus, HP and their ilk, chose China because it offers flexibility and massive economies of scale. Nobody else can crank out products in the quantities demanded by the West, in the timescales demanded by the West, for the prices demanded by the West.
But Apple DID buy the iPad trademark from ProView!
Or so they were led to believe by ProView's Taiwanese subsidiary at the time.
ProView (China) is claiming that ProView (Taiwan) did NOT have the rights to sell the "iPad" trademark, despite ProView (Taiwan)'s claims to the contrary during the original sale. That would suggest ProView (Taiwan) sold IP rights when it wasn't the legitimate rights holder. Either way, ProView (China or Taiwan) is at fault here as, presumably, someone in the corporation's Chinese mothership was aware of their Taiwanese arm's activities.
THAT is what this case is all about: ProView have been acting in bad faith. If Apple's allegations are correct, then ProView are guilty of "trademark trolling".
If Apple does lose this case, it's going to make doing business in China more expensive as more arse-covering layers will be required when dealing with IP issues. (I.e. if US Corporation A tries to buy Chinese Corporation C's IP, a whole layer of legal middle-men in both nations will have to be hired to perform the necessary checks, all of whom will also require expensive liability insurance as Corporation A is damned well going to sue somebody if it turns out they've been sold a pup.)
DJs have a target audience...
... that couldn't give a shit whether the audio is 44.1 kHz or 192 kHz, as long as it has solid beats and a good melody.
The sound systems used even by the most famous of rock bands and their ilk have barely changed since the 1950s: it's all just monaural crud cranked out at the highest possible volume using technology that wouldn't look out of place during William Hartnell-era episodes of "Doctor Who".
The quality of any recording is only ever as good as the quality of the sounds being recorded. Recording a live concert isn't going to give you amazing sound worthy of a 24-bit, 192 kHz DAC, simply because the technologies used to perform a live concert are, frankly, shit.
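For a sense of scale, the raw numbers behind that "24-bit, 192 kHz" label are easy to work out. A quick sketch (the figures are just standard PCM arithmetic for stereo audio):

```python
# Raw PCM data rates: bits per sample x samples per second x channels.
def pcm_bitrate(bits_per_sample, sample_rate_hz, channels=2):
    """Raw (uncompressed) PCM bit rate in bits per second."""
    return bits_per_sample * sample_rate_hz * channels

cd = pcm_bitrate(16, 44_100)        # Red Book CD audio
hires = pcm_bitrate(24, 192_000)    # "audiophile" 24-bit / 192 kHz

print(f"CD:     {cd / 1e6:.3f} Mbit/s")     # 1.411 Mbit/s
print(f"Hi-res: {hires / 1e6:.3f} Mbit/s")  # 9.216 Mbit/s
print(f"Ratio:  {hires / cd:.1f}x")         # ~6.5x the data
```

Roughly six and a half times the data of a CD, spent largely on capturing stage rigs that haven't meaningfully improved since the valve era.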
Even orchestral music has a large portion of extraneous noise—you've got anywhere up to 80 people breathing, scraping stuff against other stuff, hitting stuff, blowing, or just shuffling their feet and occasionally coughing. If you can't hear any of that in your recording, it's not an accurate recording, but a highly processed and unnatural one. (Recording an orchestra is still considered something of a dark art even today; there's no single standard method for doing it.)
The only sound sources that can possibly be recorded faithfully, without any extraneous sounds, artefacts and other noises, are digital synthesisers, which can bypass the analogue conversion stages entirely.
Ergo, if you're not listening to electronic music, you're actually spending all that money just to hear the bits the record producers never wanted you to hear.
I spent about a month's worth of my free time 'ripping' my parents' vinyl collections into AAC format.
I wasted more time trying to get rid of all the hisses, clicks and scratches in Audacity than I spent actually playing the tracks into the machine in the first place. Some of the records dated from the 1950s and had been played so often that they'd become audibly worn, with all sorts of nasty artefacts.
I can only assume that the truly hardcore vinyl fans are like comic book fans: they buy the records and never, ever, take them out of their original packaging to listen to the things.
As an audiophile and occasional musician myself, I have no time for vinyl. It was, and is, hugely overrated as a format.
"Don't forget you need a Macbook Air and iPhone 4 to use that with!"
Er, no. You don't.
iOS 5 effectively removed the tethering requirement. Aside from charging my (v1) iPad, I have no need for the USB cable that came with it today.
Perhaps you should send a memo to Jonathan Ive at Apple then. The iPad—"The World's Favourite Tablet"—has had the more sensible 4:3 aspect ratio since the first model.
Also, you might want to read up on exactly how many "widescreen" aspect ratios the movie industry has used over the years. Most movies today are shot in 1.85:1 or 2.39:1 aspect ratios, neither of which will fit a 16:9 display exactly either, so you're still going to get some letter-boxing.
Photographers have a lot more aspect ratios to pick from too, including 3:2, 5:4, 6:1 and 4:3, so a 16:9 screen doesn't offer anything particularly useful to that sector either. (In fact, the 4:3 aspect is a pretty good compromise for them.)
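The letterboxing point is easy to quantify. A quick sketch (assuming the film is scaled to fill the screen's width, which is how letterboxing normally works):

```python
# How much of a screen is lost to black bars when showing a wider film.
def letterbox_fraction(screen_ratio, film_ratio):
    """Fraction of screen height lost to bars (0 if the film fits)."""
    if film_ratio <= screen_ratio:
        return 0.0
    return 1 - screen_ratio / film_ratio

for film in (1.85, 2.39):
    for name, screen in (("16:9", 16 / 9), ("4:3", 4 / 3)):
        lost = letterbox_fraction(screen, film)
        print(f"{film}:1 film on a {name} screen: {lost:.0%} lost to bars")
```

A 2.39:1 film still wastes roughly a quarter of a 16:9 panel, so "widescreen" tablets don't escape the bars either; they just make them smaller.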
So, er, no. 16:9 offers precious few benefits and is clearly not a priority for Apple, who are by far the market leader in the tablet industry.
Yes.... and no.
One of the less-known secrets of Macs is that they tend to hold their value very well.
Because every Windows PC manufacturer insists on cranking out a dozen new models and variants every few weeks, their machines are often obsolete before they've even hit the retailer's shelves, let alone your home office. PC owners simply aren't used to thinking in terms of trading up: the assumption is that their laptop isn't worth much, so it ends up being given to some (usually older) relative for free instead. Many long-time Mac users have never paid the full whack for a Mac after their first: they've sold their old models and simply paid the difference.
This is, incidentally, one of the reasons why more people than you might expect opt for Apple's extended AppleCare warranty scheme: it makes a still-fairly-new machine easier to sell on, because it removes the risk that the buyer will end up with a dud. (Students buying on an educational discount even get that extended warranty thrown in as part of the discounted price, so it's often a no-brainer for them.)
That said, Macs may cost more up-front, but they will usually last for many years, so buying a refurb or secondhand model is not unusual. Yes, Apple will stop officially supporting older models after a while, but the machines will continue to run the likes of *BSD and Linux just fine. And OS X has long supported distributed computing ("Xgrid", since OS X 10.4), so older machines can still be useful for offloading rendering work from the likes of Final Cut Pro or Logic Pro.
Think of your computer as an investment. You're going to be spending an awful lot of time with it, so it makes sense to buy something that isn't going to have you wanting to hurl it out of the nearest window within a week.
If it really is a minor revision...
... then the iPad 3 will probably lack the "retina" display. The processing power : pixel density ratio will be terrible otherwise—much as the iPad 1's lack of memory causes it to perform rather poorly with today's more memory-hungry, iPad 2-optimised apps. (Speaking of which, the iPad 3 had better have a minimum of 1GB RAM too. And I do mean "minimum". Four times the pixels = 4 x the data storage for the graphics elements of your GUIs.)
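The "4x the data" claim is simple arithmetic. A back-of-the-envelope sketch (assuming 32-bit pixels, i.e. 4 bytes each; real iOS memory use will of course differ):

```python
# Framebuffer sizes at 4 bytes (32 bits) per pixel.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one full-screen pixel buffer in mebibytes."""
    return width * height * bytes_per_pixel / (1024 ** 2)

ipad2 = framebuffer_mb(1024, 768)    # iPad 1 / iPad 2 panel
retina = framebuffer_mb(2048, 1536)  # doubled in each dimension = 4x pixels

print(f"1024x768:  {ipad2:.0f} MB per full-screen buffer")   # 3 MB
print(f"2048x1536: {retina:.0f} MB per full-screen buffer")  # 12 MB
```

Every full-screen texture, bitmap and backing store scales the same way, which is why quadrupling the pixel count without quadrupling the RAM would hurt.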
Of course, the new processor might be a blancmange, with quad-soufflé cores and powerful graphics sponge cakes with greatly improved jam rendering. And a cherry on top.
As others have pointed out...
... Apple's iPhone 4S uses a Qualcomm chip for its 3G GSM features, with a Skyworks chip used in the GSM version of the iPhone 4. Apple designed neither of these chips; they're (more or less) off-the-shelf components.
Both Qualcomm and Skyworks are already paying the licensing fees for the use of 3G / GSM-related patents in their products, so their customers don't have to. Therefore, Apple are well within their rights to be absolutely livid about Samsung's behaviour: unless Samsung have some weird patent covering the use of antennas (that allegedly don't work properly), I can't imagine what the hell it thinks it's doing.
Apple have plenty of money to spend on this kind of legal circus. If nothing else, it could help clarify a bunch of legal grey areas, as well as killing a few of the more ridiculous patents approved by the USPTO on the way. Other companies may also benefit from this wholesale clearing of the legal air.
Samsung seem to be behaving very oddly. You don't license patents for things like GSM to end users. Nor do you license the patents to design companies that assemble devices that use off-the-shelf components for the patent-related functionality. Samsung bloody well know this, so Codd knows what they think they're doing. I hope, for their sake, that they have a good reason.
Browett only joined Dixons in 2007. The economy went pear-shaped shortly afterwards and Browett's strategy involved "raising Dixons' game" to the extent that TWO rivals effectively gave up.
Browett's strategy is clearly explained in the article: he stopped the sales commissions to reduce the pushiness. He trained some 14000 staff to reduce their ignorance. He got out of foreign territories—a good move given the state of the economy in many of them, especially Italy—and began a programme of store refits.
Note that the latter process takes ages to complete; you can't just shut down every store in the land for a couple of weeks to do it. The manpower simply isn't there, so it's inevitably going to be a rolling programme of surveying each site, drawing up plans for the new layout, ordering the furnishings, signage, etc., then hiring shopfitters to fit it all over a period of a couple of weeks. (Less if you're lucky.)
To put this in context: the Italian equivalent to Texaco, previously known as "AGIP", rebranded itself as ENI a few years ago. Despite this, and despite the new brand still using the old "Roman wolf" logo, there are still some old "AGIP"-branded petrol stations in Lazio's countryside, where I live. One only got the refit treatment a couple of months ago. The other is still there, on the Via Cassia.
A variable-width layout would also let you cram more / longer headlines and / or subheadings onto the front page. (Key articles could even have a short 'teaser' intro. Perhaps some more pictures could be used as well—like the ones you often use for stories you highlight in the sidebar.)
Not that I'm expecting the suggested changes to be done right now, but fully half of my MacBook Pro's 17" display is empty space when I open the browser in full-screen mode. That's a waste.
Remember, too, that "variable-width" layouts will adjust automatically to smaller screens, so users with tablets needn't worry either. Hell, it might even work well enough on mobiles that you don't even need to maintain a separate site for those.
On a separate note...
... is there a particular reason why The Register's layout still assumes very narrow, tall, screens?
Surely it'd be better to use a variable width layout and lose the ridiculously wide margins that appear on a widescreen display? You already have a separate "mobile" site for those afflicted by phones with terrible web browsers, so why try to cater for them on the main site as well?
This would also make the forum easier to work with as long posts won't end up hogging an entire screen's worth of space.
I just ran into this issue myself...
(I love the sound of my own typing, and I don't tend to edit, or even proofread, my posts if I'm replying as a break between translation jobs.)
A JavaScript character counter, please! I consider the lack of one a "Class A" showstopper bug. It's literally impossible to tell when I've gone over the limit, and I've wasted far too much time copying and pasting into Word to check.
Alternatively, as someone suggested elsewhere, allow arbitrary length posts, but have a "Read More" link after a predefined limit. In fact, this would probably be more elegant.
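For what it's worth, the "Read More" approach is only a few lines of code. A minimal sketch of the idea (all names hypothetical; this is nothing like The Register's actual forum code):

```python
# Accept posts of any length, but only render the first LIMIT characters,
# cut at a word boundary, followed by a link to the full text.
LIMIT = 2000

def render_post(text, limit=LIMIT):
    """Return the text to show inline, truncated with a Read More marker."""
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit)  # don't chop a word in half
    if cut == -1:
        cut = limit
    return text[:cut] + " [Read More]"

print(render_post("A pithy comment."))        # shown in full
print(render_post("word " * 1000)[-40:])      # long post ends in the link
```

The nice thing about this over a hard limit is that nothing is ever rejected; long posts just stop hogging the page.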
I've always thought it suspicious...
... that the heating graphs invariably start climbing rapidly around 1950. This may be due to the different sources of data being used—and the Earth's population being quite a bit smaller back then—but... what about all the "Clean Air" laws that began to be enacted around the world from around that same period?
A thought occurs: Create a global "Energy Generation Fund" and subsidise the replacement, worldwide, of polluting energy production infrastructure. Require every nation to wire up every city, town and village to a national grid too, if they haven't already done so. This would create jobs and leave a lasting legacy of clean, low-pollution energy. As there would also be a lot of nuclear plants, it would also be easier to standardise their construction and maintenance costs, thus reducing overall TCO even further.
That same electricity can also be used to pump water to villages suffering from shortages. The electricity generated by all those nuclear plants would also be cheap enough to justify running desalination plants where needed, reducing the fresh-water problem likely to affect much of the world's population as it continues to grow, and improving irrigation and agriculture, which increases the food available too.
I'm generally lumped in with the "skeptics" camp, but this is missing the point...
The problem I have isn't the science, which exists and can be verified / falsified as necessary. The problem is the presentation of the science in the media, and the attempts to convince us that the worst-case scenarios are the only scenarios ever likely to happen, regardless of the topic. EVERYTHING is going to kill us! We're apparently in the worst recession EVAR! Muslims are going to wipe us all out! North Korea is going to wipe us all out! And so on and, tiresomely, on. Naturally, there's no shortage of snake-oil salesmen abusing the science to claim that their magic bullet will somehow save us all.
The signal:noise ratio is shocking. There is, quite literally, nowhere for an interested lay population to inform itself impartially of the facts.
Unlike most of the mainstream media, The Register isn’t interested in peddling just one or the other side of a story. They have a "Harry Hill" editorial policy: “Who’s right? There's only one way to find out: FIGHT!”
I'm not interested in being a skeptic for the sake of being skeptical. I'm skeptical because I’m a layperson in this particular field: I don’t have the time to trawl through the science journals and read up on the current state of the art myself. I’m reliant on other people—i.e. the media—to report back on it all for me. That’s what they’re for!
Richard's article points at new information added to the pool of Climate science, which is a good thing. The leap from "the oceans are doing X, therefore it's mankind's fault" did jar with me, but it's clearer when you read the original document that there is some actual science linking these two assertions. A hyperlink directly to the relevant text—or just a pop-up "Here's Why" boxout—would have been better than the "As if by magic..." impression the article gives at present. But it’s still good.
You, sir, are a troll.
For those who couldn't be bothered to read the actual letter, here is the full list of signatories:
Claude Allegre, former director of the Institute for the Study of the Earth, University of Paris; J. Scott Armstrong, cofounder of the Journal of Forecasting and the International Journal of Forecasting; Jan Breslow, head of the Laboratory of Biochemical Genetics and Metabolism, Rockefeller University; Roger Cohen, fellow, American Physical Society; Edward David, member, National Academy of Engineering and National Academy of Sciences; William Happer, professor of physics, Princeton; Michael Kelly, professor of technology, University of Cambridge, U.K.; William Kininmonth, former head of climate research at the Australian Bureau of Meteorology; Richard Lindzen, professor of atmospheric sciences, MIT; James McGrath, professor of chemistry, Virginia Technical University; Rodney Nichols, former president and CEO of the New York Academy of Sciences; Burt Rutan, aerospace engineer, designer of Voyager and SpaceShipOne; Harrison H. Schmitt, Apollo 17 astronaut and former U.S. senator; Nir Shaviv, professor of astrophysics, Hebrew University, Jerusalem; Henk Tennekes, former director, Royal Dutch Meteorological Service; Antonio Zichichi, president of the World Federation of Scientists, Geneva.
Do you seriously believe I should ignore the opinions of all of these people in favour of some random internet troll who thinks "NomNomNom" lends his posts that extra touch of depth and gravitas so lacking in these forums?
And there I was thinking that the President of the World Federation of Scientists might actually have a clue how science is supposed to work! Clearly, I was mistaken.
I mean, naturally, I should always go with the side that has more money. There's certainly an awful lot of money sloshing about in the pro-"Chicken Little" camp—not least because many, many companies stand to benefit from the lavish grants and subsidies governments are now encouraged to fork out, despite many nations' economies slowly floating down Shit Creek without any sign of a canoe, let alone a paddle.
I have been convinced by NomNomNom's clear, lucid and unbiased response! More windmills, I say! Never mind the naysayers who point out that we've tried relying on wind power in the past and didn't exactly stick with the technology when "alternative fuels" appeared on the scene! More solar photovoltaics are urgently needed to cover those unsightly roof tiles that blight cities like London! More! More!
Context: it's a real thing. You may want to look it up sometime.
Macs have a damned sight more than 6% of the consumer PC market, which is the only market Apple have targeted until very recently. Unfortunately, nobody seems to be surveying that market alone as getting solid data for it is much harder than just letting a computer read HTTP headers from lying web browsers, servers, and all those PCs sitting in offices all over the world doing bugger all for 16 hours each day.
Almost every single consumer also has access to a work computer of some sort. The latter is the computer they're going to be sitting in front of for about 8 hours of every working day. Breaking out the figures that apply only to the consumer market is therefore tricky, although it'd be interesting to see what Apple's market share becomes if you only check data from weekends.
"...he was getting ten or twenty FOI requests each week..."
So that'd be just 2-4 emails per day, then? Big whoop.
Also: he only had to dig out the information once! He could then trivially send the same ZIP file to each request, along with a standard form reply—a process that would have taken mere moments!
The whole point of science is that hypotheses and theories can be falsified! You are REQUIRED to show your working, including the raw data and the results of your processing. It's not optional. And, no, you don't get to pick and choose who does that. Not liking the cut of someone's jib is insufficient reason to deny them access to your data.
There really is no excuse for the UEA's behaviour in this affair. None. It's the very definition of "unscientific".
For what it's worth, I do agree that we should be reducing pollution in general, but I've always felt that way. And so do most good business owners: pollution = inefficiency = wasted money.
However, CO2 is not produced exclusively by humans. It is also emitted by any number of other processes. There must, therefore, be a "baseline" level we should be aiming for. What is it? And will we need to cull the burgeoning global population of humans to achieve it? (That's a question almost everyone seems to avoid, but it's a crucial one: there are 7 billion of us now. How many humans is "too many"?)
Ditto for methane and, of course, the #1 Most Wanted "Greenhouse Gas": "dihydrogen monoxide vapour". (Yes: the gaseous form of that stuff covering 70% or so of our planet's surface.)
Right now, nobody seems to know what the base levels of each gas' emissions should be, so how will we know when we've done enough to cancel out our own species' input into the various complex systems we're talking about?
It's still throttled.
Unless you've discovered a way to substantially increase the water pressure at the point of entry (and assuming your plumbing can handle it), you're always limited to the volume of water the supplier can feed into your house over time (i.e. your "bandwidth"). How long does it take to fill a bath with the taps fully opened? That's your "water bandwidth". If you want to fill your bath up more quickly, you _will_ be expected to pay more for the privilege.
Industrial businesses that have to run water-reliant processes pay big bucks to have lots and lots of water available when they need it. It's substantially more than you get at your front door.
So, yes, you are being limited. Turn all your taps on and measure the quantity of water that comes out per day. That's your daily ration. It _is_ limited. Inherently so.
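The bath-tap sum, sketched out (the flow and bath figures below are illustrative assumptions, not anyone's actual water supply):

```python
# Flow rate is your "water bandwidth"; run it flat out all day to find
# the inherent cap on what the pipe can ever deliver.
def fill_time_minutes(bath_litres, flow_l_per_min):
    """How long the bath takes to fill at a given flow rate."""
    return bath_litres / flow_l_per_min

def daily_ration_litres(flow_l_per_min):
    """Maximum possible delivery in 24 hours at that flow rate."""
    return flow_l_per_min * 60 * 24

flow = 12.0  # assumed litres/minute from a typical domestic main
print(f"Bath fill time: {fill_time_minutes(150, flow):.1f} minutes")
print(f"Daily 'ration' at full flow: {daily_ration_litres(flow):,.0f} litres")
```

Whatever the numbers come out as for your house, the point stands: the cap exists whether or not you ever hit it, exactly like a broadband line.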
I don't think ARM have too much to worry about yet.
The Tilera chips seem to require a lot more power than ARM cores. Fine for servers, which will hurt AMD and Intel's business, but ARM's core (sorry!) business strength is in mobile and embedded markets, not servers. Mobile and embedded customers consider performance per watt pretty bloody important.
Even when you add up all the CPUs in all the data centres in the world, it's still peanuts compared to the *billions* of mobile and embedded chips out there. That's why Tilera's pricing starts at $many and rises very quickly to $lots.
ARM do need to get their 64-bit architecture out, but I don't think they're too worried about the server market; compared to mobile and embedded, it's unlikely to become a primary source of revenue for them.
Remember, too, that ARM's philosophy is that their cores can be coupled easily to other _specialist_ processors, such as GPUs and DSPs. Tilera's approach is to assume everyone wants the exact same core lots and lots of times. Both have their advantages and disadvantages, so ARM aren't necessarily going to lose out here. If most of your processing is better done on a DSP or GPU, you don't need lots of 64-bit cores as well: a bunch of 32-bit cores should be plenty to parcel out the jobs to the specialists.
There's a big difference between building the widgets...
... and _designing_ them.
Apple are a design company.
Write it out 1000 times and maybe, some day, it'll eventually sink in that it's not just about the bits _inside_ the box.
There used to be an ostrich farm not far from where I live here in Italy. I was wondering why it disappeared a year or so back. That link would explain it.
It's quite common to find horse meat here in Italy. I've never tried it myself—it's not sold at my local supermarket and I prefer chicken and pork over the 'red' meats anyway—but I'm told it tastes a lot like beef, only leaner.
I do draw the line at seafood. Not just because it barely touches my insides as it accelerates rapidly towards my arse, but because it's basically either giant underwater insects; squishy things with eyes, beaks and more tentacles than are strictly necessary; or snot in a shell. I'd rather eat baby seals.
(Now there's an idea for a new Jamie Oliver series: "Jamie goes Clubbing!" Whack the seal lightly over the head. Peel, and fry gently in olive oil, with a few cloves of garlic, then serve on a bed of something green with a badly translated name*. Pretend you did all this in just 10 minutes. Pukka!)
* Any Italian noun with an "i" on the end is the plural form. Thus "biscotti" and "panini" are plurals. If you're having just one, it's a "biscotto", or "panino". Note the "o" at the end. That's the correct, singular, form for just one of either item.
While I'm at it, "biscotto" literally just means "biscuit" (literally "cooked twice"), while "panino" just means "sandwich" (literally "little bread"). The only reason for using the Italian words for a biscuit or sandwich is if you're being deliberately pretentious. Especially "biscotto": "biscuit" is already a French loan word! It's just as Latin as the Italian one.
It's bad enough having to correct grocers' apostrophes without pointing out that the correct English word for "two slices of bread with some stuff in the middle" is "sandwich", not the Italian plural form of "little bread".
Grrarrrgh! Grammar Hulk ANGRY! Grammar Hulk CORRECT SPELLING WITH EXTREME PREJUDICE!
But what about the humble number?
Does "1" trump "A", for example? A computer sees an "A" as a "65", so I'd go with 'yes'. But I suspect it's more complicated than this.
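For what it's worth, a plain code-point sort bears this out (Python shown here, but any language that sorts strings by character code behaves the same):

```python
# '1' has code point 49, 'A' has 65, 'a' has 97 — so in a naive
# code-point sort, digits come before upper case, which comes
# before lower case.
print(ord('1'), ord('A'), ord('a'))   # 49 65 97
print(sorted(['a', '1', 'A']))        # ['1', 'A', 'a']
```

So "1" does trump "A", at least until a locale-aware collation algorithm gets involved, at which point it really does get more complicated.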
If we're not careful, digits will start roaming the streets taking pot-shots at letters. With such character assassination, we could well end up destroying civilisation as we know it!
I demand an EU Directive on this!
Personal data should be defined as belonging to one or more layers...
LAYER 1. Data required to be stored for reasons of basic business law.
E.g. you can't demand that the information Amazon need about you for their financial records be "deleted"; that'd be illegal as it violates the integrity of their customer sales database. How can they know what their tax liabilities are if a bunch of their invoices and receipts have error messages where the purchaser's details should be?
This is basic business and finance.
LAYER 2. Data that is legally required to be stored for Data Retention laws, but which can be safely hidden from public view. This is data law enforcement offices may need to access. Checks and balances are needed to ensure this privilege is not abused.
This layer is for data that is used to answer questions like: "Was Suspect A _really_ messaging Person B when the murder took place?"
In a society increasingly reliant on IT, we do need _some_ level of data retention, or the police's job becomes effectively impossible.
LAYER 3. Data which should NOT be stored UNLESS specifically sanctioned by a legal mechanism, such as a warrant issued by a judge.
This includes—for example—text messages sent via IM protocols.
There's no justification for having such conversations recorded in perpetuity by a central server: text messages take up very little storage space and, should a user at either end of the conversation desire a permanent record, there's nothing to stop the client software doing the recording itself.
If law enforcement officers really do need to see what two potential suspects are discussing, they should require a warrant to have such conversations 'tapped' and recorded by the central server, just as is already the case with telephones. However, they do not have the right to demand every word you've ever written since you signed up.
A point to note is that, in order to prove Suspect A's alibi—that he was chatting via Skype with Person B at the time the crime took place, for example—it is only necessary to know that Suspect A _was logged into Skype and sending IMs_. It is _not_ necessary to know details about the actual conversation.
Hence the "layers": a telephone company will usually log when a call was placed, to which number, and for how long, but they don't record the conversation itself unless specifically asked to do so by a suitably worded warrant. And even then, they only record conversations for the period _after_ that warrant was issued, until its expiry.
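A minimal sketch of that metadata-versus-content split (all the names and fields here are my own invention, purely illustrative): Layer 2 metadata is logged by default; Layer 3 content is recorded only while a warrant is in force.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MessageEvent:
    sender: str
    recipient: str
    timestamp: float
    body: str

@dataclass
class RetentionStore:
    # Layer 2: who messaged whom, and when — kept by default.
    metadata: List[Tuple[str, str, float]] = field(default_factory=list)
    # Layer 3: message content — kept only under an active warrant.
    content: List[MessageEvent] = field(default_factory=list)
    warrant_active: bool = False

    def record(self, ev: MessageEvent) -> None:
        self.metadata.append((ev.sender, ev.recipient, ev.timestamp))
        if self.warrant_active:
            self.content.append(ev)
```

Enough to prove Suspect A was logged in and messaging Person B at the time in question, without a single word of the conversation itself surviving on the server.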
Any new data protection system needs to take all the above layers of data into account. Lumping all personal data under the same label will never be workable.
"Who mentioned duplicating anything?"
So you always delete your own copy of a digital work after copying it onto your friend's USB flash drive, do you?
Of course you would, you paragon of virtue, you!
Forgive me if I find that rather difficult to believe. Especially after what happened in Tottenham and Croydon last year.
If memory serves, as your society becomes larger and more complex, so do its user interfaces. Etiquette is essentially social grease. The more cogs, gears and other folderol you have in close proximity, the more grease you need to keep everything moving smoothly. If you only have a simple wheel spinning all on its lonesome, you can spend a lot less on its maintenance.
In a rural village, with a low population, you don't need to worry about formal introductions as everyone knows everyone else already. It's all just one big tribe.
Once you reach a certain population level, that "one tribe" system fails: it becomes impossible to remember everyone's names, and you end up with separate groups forming within the über-tribe, each of which then has to interface with the other groups. The more groups you have, the more complex the interfaces become, until you end up with people who spend their lives analysing and codifying those very interfaces in the form of etiquette.
The rise of social media technologies in recent years has had the effect of reducing the number of interfaces required to socialise with even a very large circle of friends and acquaintances, as the computer is doing the remembering of names and faces for us. The result is a gradual erosion of what many of us older readers think of as basic politeness and social graces.
Polite society—as we know it, at least—is ending. There will be a new society, but it's unlikely anyone has a clue what it'll look like. We're in the transitional phase now.
you can create a porn app for the iPhone easily. just offer a homepage shortcut icon for your porn website.
sorry for the lack of capitalisation, but my right hand is otherwise engaged.
Apple may charge $99 per year for the App Store...
... but they do a damned sight more than just shovel your app onto it. If your app is good enough—and it is, right? You're not just developing yet another "torch" app, are you?—then Apple may pick it for their App Store's front page billboard areas. That's free advertising, right there, where your potential customers are looking.
What do Google do for your $25?
Curation has advantages for developers too: it means potential customers will assume your app _isn't_ a piece of malware, which remains a psychological advantage iOS has over Android. (No, Apple's curation isn't perfect, but Google's is _non-existent_.)
Finally, Apple's iOS SDK _is_ free. You can develop apps for as long as you like while you're learning the ropes. It's only when you get to the point where you want to sell it that you have to pay, but Xcode is right there in the App Store. And it costs precisely nada.
"to develop for iPhone, you need to have a mac."
And to develop for other platforms, you need to have a PC of some sort too. What's your point, exactly?
All businesses have to invest money in tools and other paraphernalia. Compared to the cost of actually _doing_ business, the price of a Windows, Linux or Apple computer is negligible: how much money do you expect to make from developing apps per month? If it's more than £2000, that's a top-of-the-line Mac right there. Every month of sales after that is gravy.
Investing in the tools of your trade is something you need to think long and hard about. From personal experience, I would strongly advise against going for the cheapest, nastiest piece of crap computer you can find on eBay; invest instead in something that's going to last you a while and which comes with excellent after-sales support.
Say what you like about Apple's prices, but I've found their support and customer services to be second to none. And that counts for a lot when you're relying on your computer hardware to pay the bills.
Bugger me, it works!
Does anyone see weird 'filler' symbols instead? It'd be interesting to see how far Unicode has spread.
(I.e. if you see multiple 'empty square' symbols, please reply with your platform, browser, etc.)
FYI: I posted on a MacBook Pro running Safari 5.1.2 on OS X 10.7.2.
I suspect the "filled Apple" symbol (in the row above "Giovanni disse") may not appear correctly on other platforms.
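As I understand it, Apple's logo glyph sits in Unicode's Private Use Area (U+E000 to U+F8FF), where code points have no agreed meaning, so other platforms typically draw an empty box. A quick way to spot such characters in a string:

```python
def private_use_chars(text):
    # Characters in the Basic Multilingual Plane's Private Use Area
    # (U+E000-U+F8FF) have no standard assignment, so fonts on other
    # platforms usually render them as 'empty square' fallback glyphs.
    return [c for c in text if 0xE000 <= ord(c) <= 0xF8FF]

print(private_use_chars("\uf8ff test È é"))  # only the PUA glyph survives
```

Ordinary accented letters like È and é are standard Latin-1 code points, so they should render fine anywhere with a halfway decent font.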
Quick test of accented letters...
(... 'cos I'm half Italian.) And some other random symbols I can get from my keyboard...
È é è ‰ ™ ® ü Ü û î Û ö Ö ¥ ñ Ñ ÷≥≤«æ…‘“π≠–ºª•¶§∞¢#€¡
«Non e possibile! Funziona!»
– Davvero? I caratteri Unicode si vedono tutti?
All the Español I know:
"Well," said the DFS salesman, shortly before he was arrested after producing his concealed pun, "sofa, so good."
X vs. Y! Think X is better than Y? Explain your reasons here! What if you think X is worse than Y? Then you, too, have come to the right place! LET X = BCPL, birds (angry or otherwise), bacon, beef, Brits or bits! Prefer tabs? Let X = Tabs! LET Y = C—sharp, double-plus, or objective! Vi and Emacs! WordStar or EDLIN! Tabs or Spaces*! Up or down! Motorola MC680x0 Assembly Language and Intel's X86 Assembly Language! Sean Connery or George Lazenby! This is the place to be for pointless nerdwankery of the highest order: Let the "Which is better?" Debates begin!
No "Edit" feature?
At present, if I spot a mistake in a post I've just submitted, I have no way to go in and correct it in-situ. I have to copy the post into the clipboard, use the "Withdraw" button to delete my original post, then paste the clipboard into a new post, edit it and submit anew.
(Incidentally, the copy-paste procedure above seems to mess up the line endings. You might want to look into that.)
Beef? HERETIC! Burn him! (Okay, grill him!)
Clearly, some people here are under the impression that beef is a good form of meat. Not so!
Ladies and gentlemen of the jury, despite the evidence put forward by my learned colleague, Mr. H. Lecter, for the defence, I put it to you that the humble pig, not Man, is the pinnacle of culinary evolution! I intend to demonstrate beyond a shadow of a doubt that, clearly, bacon > beef, and that the defendant in this case, Mr. T. Pott, is clearly deluded and requires immediate psychological evaluation.
To begin with: the pig can be made into sausages, bacon, pork chops, and more! In Italy, the traditional New Year dish of Cotechino e Lenticchie is ambrosia! (Albeit in small doses, otherwise you'll want to keep a window open...) Even the pig's head can be used to give the illusion that you're in the Middle Ages by the mere addition of a simple apple, so it even has entertainment value!
The defendant would have us believe in the cow: a sad, laughable creature that exists primarily to inject methane into our already saturated atmosphere. It provides milk (okay, yes, that's a neat trick, but milk is not meat, and meat is our topic of debate here) and beef. And beef alone. But beef can be tough, fatty, gristly and, unless minced and used in a ragù, is just expensive and pointless.
Furthermore, beef is the only meat the cow provides. It's a low-level meat. It is the Z80 assembly language of meats. Fine if you want to eat the 1970s 8-bit processor of foods, but you can't wrap a bit of beef around another beef-based product to make it better.
Contrast with bacon, verily the Meat of Meats! The mighty bacon can be added to anything and always makes it taste better! Wrap it around chicken! Serve with fried Cheddar cheese! With eggs! (Who eats "Beef and eggs" for breakfast? Nobody except the criminally insane, that's who!)
Bacon can even be wrapped around a pork sausage and, despite that sausage also coming from a pig, it still tastes better!
Finally, we all know that the pig has contributed far more to the world of literature, comedy, and innuendo. It has, for example, given us the verb "to pork". It has also given us a slang term for law enforcement personnel and another term for overacting!
What has the cow given us? Clichés! Nothing more.
Bacon is truly the meat of the gods themselves! By comparison, beef is just congealed Bovril with bits in.
Clearly Mr. Pott is insane and should be committed. Don't let his smooth talking on Climate Change™ fool you: this man merely offers us a passable illusion of intelligence and wit, but is clearly not fit to walk the streets!
The case for the prosecution rests!
... like that.
Note that the editor itself doesn't have to be WYSIWYG: just have the buttons wrap highlighted text with the required markup, or—if nothing is selected—make the switch toggle between its related opening tag and the closing tag.
But, yes, either a traditional toolbar or, use text buttons if you prefer. (Icons are often easier to localise though, for what that's worth.)
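The wrapping logic itself is trivial — a rough sketch as a pure function (the tag strings are hypothetical; substitute whatever markup the forum actually uses):

```python
def apply_markup(text, sel_start, sel_end, open_tag, close_tag):
    """Wrap the selected range [sel_start, sel_end) in a tag pair.
    With an empty selection, insert both tags at the caret so the
    user can type between them."""
    if sel_start == sel_end:
        return text[:sel_start] + open_tag + close_tag + text[sel_start:]
    return (text[:sel_start] + open_tag
            + text[sel_start:sel_end] + close_tag
            + text[sel_end:])

print(apply_markup("make this bold", 5, 9, "<b>", "</b>"))
# → "make <b>this</b> bold"
```

A real toolbar button would also need to restore the caret position afterwards, but that's the whole of the non-WYSIWYG approach.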