Re: "keyboard doubles as a capacitive multitouch trackpad"
It's the latter: a tactile keyboard with touchpad functionality integrated.
It was built into AmigaOS, even on the original hardware back in 1985.
You might be thinking of the Amiga which offered Screens back in 1985.
Of course, Screens were more like the way current Mac OS X does full-screen applications than a giant desktop you can view one area of at a time. Arguably this is more sane: you only have as many screens open as you need at any time, and one desktop (Amiga: Workbench) screen for things you want to do within the OS - tools, config, etc.
Why? Differentiation of products, and power management.
I hate the self checkout machines, simply because I will have an issue with "unexpected item in bagging area" (yeah, the item I just swiped), and if I buy anything like a beer, I have to get someone to approve me.
Sadly, because Tesco are the worst supermarket ever and have cut staffing down to just the operators of the self-checkout tills, you are forced to queue for ten minutes before you can use them, whilst looking wistfully at the ten empty cashier tills.
Half of them have the red flashing light on indicating a fault too. Don't break, my arse.
Not quicker either: a cashier till has someone to pack the bags - you. A self-checkout takes longer because it's scan, then bag; scan, then bag. No parallelism.
And a typical RRAM die will probably use a tiny fraction of a gram of Pt/Au.
Anyway, what matters here is scale of manufacture, and hence the cost of the product, as well as the capacity offered, and the speed it can operate at and bandwidth it can offer.
Well, if these subterranean voids are common (these are only holes down into them), then growing crops is a matter of lighting the voids, adding viable soil and growing the crops, and keeping them watered.
Of these, I think that adding lighting is doable, once we have the power plant.
Soil? We would have to manufacture it on the moon, with some way of seeding it with the required microbes and fauna. That doesn't sound easy to me. I expect people are working on it. Hydroponics will probably be the initial system.
Water - well, that requires a means to extract water from the lunar soil/rock. It's meant to be there, but the equipment wouldn't be easy to get onto the moon in the first place.
Building a small manned base under the overhangs of one of these things seems easy in comparison.
I had problems in London, so they were clearly lying about it being the north east only.
The return on investment for having adequate security measures in place should be being allowed to stay in business (i.e. companies that do not invest in security get shut down by the government; I might even allow companies one shot at fixing their problems before being shut down).
Yes, that's what happens when sea ice melts. It's already in the sea.
Sadly, this article is about glacial ice, which is not already in the sea.
So your example would be a full bowl of water, with ice cubes being added to it. Does it overflow? What happens when you add ice cubes at twice the rate?
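The bowl experiment can be sketched with Archimedes' principle: a floating cube displaces exactly its own mass of water, so its meltwater fits precisely into the volume it was already displacing. A minimal illustrative sketch - the class and figures here are invented, not from any source:

```java
class BowlSketch {
    static final double RHO_WATER = 1000.0; // kg per cubic metre

    // A floating ice cube of mass m displaces m / rho_water cubic metres
    // of water (Archimedes' principle).
    static double displacedVolume(double massKg) {
        return massKg / RHO_WATER;
    }

    // Melted, the same mass of ice becomes exactly that volume of water,
    // so the full bowl never overflows, however fast you add cubes.
    static double meltwaterVolume(double massKg) {
        return massKg / RHO_WATER;
    }
}
```

The two volumes are identical for any cube mass, which is why melting sea ice (already floating) is level-neutral while melting land-based glacial ice is not.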
Again, sea ice is not glacial ice (it melts every year and doesn't contribute to sea level rise). More sea ice forming is evidence of a local cooling effect in the oceans where it is forming. That cooling effect is coming from glacial ice melting into the ocean nearby. These 2km-thick glaciers and ice basins are not resting on the sea, so when they melt, they contribute to sea level rise.
I expect it's because your joke was indistinguishable from the "drunken chucklefuckery" that many people write about their non-scientific opinion of global warming! You should have put a smiley in, or something!
Hmm, so what could be cooling the sea more than normal to create more 2m-thick sea ice?
Hmm, maybe it's those 2km thick glaciers that are melting faster and putting a lot of very cold water into the ocean.
Note how I emphasised that sea ice is 2m thick, and these glaciers are 2km thick.
More sea ice is evidence of warming, in this case. Bad luck (to everyone).
Also there are multiple glaciers in this system in Antarctica, and the study studied just one glacier's contribution to sea level rise. All the other glaciers will also be contributing (they're retreating too). And once one of these glaciers hits the inland ice basin, all bets are off.
Because it seemed sensible at the time to create a hierarchical namespace under country-code TLDs: split the namespace up by organisation type, and delegate responsibility for some second-level domains like .sch.uk. The current mess is what happens when a system is not enforced (if you're not a charity/non-profit/etc, you can't have the .org.uk) and is allowed to evolve for twenty years.
In hindsight we found out that people just registered their names in all namespaces to protect them. Nobody uses .ltd.uk or .plc.uk either. Nominet was stupid and didn't do geographical second level domains either (.ldn.uk, .man.uk, etc), and now we have .london because ICANN opened that can of worms up too.
So ... designed in good faith by technical people who didn't take reality into account, not taken to the logical conclusion of said design by the entities managing the namespace, and then abused by marketing/sales/etc regardless.
Poor sod. Still, if you use Hibernate with annotations and Spring with annotations (or Guice with programmatic modules), things get a lot better within an IDE with regard to not getting errors because you had the sheer temerity to rename or move a class, etc.
There are things that are good about Hibernate, compared to raw JDBC. Not a lot if you've got the JDBC architecture correct, but sadly too many people get it wrong with regard to transactions, traversal depth in the DB, try/catch/finally in the DAO, etc.
As you can see - once you've got it right, then there's no reason to really use Hibernate, although boilerplate code in DAOs is a PITA to write.
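For what it's worth, the transaction-handling part of getting raw JDBC right can be sketched with try-with-resources. This uses a stand-in resource rather than a real java.sql.Connection (no database required), and every name in it is invented for the sketch:

```java
class TxSketch {
    // Stand-in for java.sql.Connection, so the sketch runs without a DB.
    static class FakeConnection implements AutoCloseable {
        boolean committed, rolledBack, closed;
        void commit() { committed = true; }
        void rollback() { rolledBack = true; }
        @Override public void close() { closed = true; }
    }

    // try-with-resources guarantees close() in all paths; the inner
    // try/catch decides commit vs rollback while the connection is open.
    static FakeConnection runInTx(boolean fail) {
        FakeConnection con = new FakeConnection();
        try (FakeConnection c = con) {
            try {
                if (fail) throw new RuntimeException("simulated SQL error");
                c.commit();
            } catch (RuntimeException e) {
                c.rollback();
            }
        }
        return con;
    }
}
```

With real JDBC the shape is the same, just with Connection, SQLException, and setAutoCommit(false) in place of the stubs - and it's exactly this boilerplate, repeated per DAO method, that makes people reach for Hibernate.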
As for dependency injection, quite simply - use it.
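The core of dependency injection needs no framework at all - it's just constructor injection, as in this sketch (all names invented; Spring or Guice merely automate what `wire()` does by hand):

```java
class DiSketch {
    interface Greeter { String greet(String name); }

    static class PoliteGreeter implements Greeter {
        public String greet(String name) { return "Hello, " + name; }
    }

    // The service never constructs its own dependency; it receives it.
    static class Service {
        private final Greeter greeter;
        Service(Greeter greeter) { this.greeter = greeter; }
        String welcome(String user) { return greeter.greet(user); }
    }

    // A container would do this wiring from config or annotations;
    // a unit test can inject a stub Greeter instead.
    static Service wire() { return new Service(new PoliteGreeter()); }
}
```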
Yup, I'd say to be able to say "FO work" and live in luxury you need around a million quid (up front) for every decade you intend to live the luxury lifestyle. Super luxury will still not be an option, but you might have a nice houseboat and holiday home in a warm place (and you'll need them to occupy yourself for 52 weeks of holiday a year).
Get rid of the luxury aspect, and you can probably get by with £250k for each decade, paid upfront of course so you get the house, car, pension, etc all done and then live on what's remaining. Maybe a spot of contract work would make it far easier too. Plenty of lottery winners come in this category, blow the lot early on, and then have no spare cash to live on after a few years because they overestimated what that money could do for them. You might still need to downsize/move at retirement time to unlock money though. Overall it's probably still worth it, work's work after all.
Right, firstly they need to save for a deposit in those barren years between leaving university and being 35. Also their partner will hopefully have done the same. Maybe they will have previously had a small flat rather than renting, and thus have some capital already.
Secondly, they will need a job in London, they will need to be good at it, and thus get promoted. Yes, they will have management skills and presumably be a team lead. That goes with having >10 years experience as a developer.
I don't think having a £350k house today is unachievable for a senior software developer in London. Even if that means buying a £300k dump and doing it up, or getting lucky with the housing market. And yes, it will mean a commute, nothing near the worthwhile jobs will be affordable.
Now in ten years' time, if houses in London and nearby keep rising at 10% a year ... then the current 25-year-olds are totally screwed. That £350,000 will buy a one-bed studio flat in a less salubrious suburb of Croydon by then!
If they've still got 30 years of career ahead of them, and we aren't adjusting the term "millionaire" for inflation, then it's highly likely that a good portion of them will go on to be millionaires by the end of their career. Of course, that will be a time when 10-20% of the country are millionaires (indeed, most people owning a decent house in or near London could be) and the term is basically meaningless to define someone who is rich.
Most of that is down to assuming that they buy (get a mortgage on) a £350,000 house by the age of 35ish that appreciates at an average of 4% over 30 years, to be finally worth around £1,135,189 by the time they retire. Getting such a mortgage should be possible for any senior level software developer in London. The only way to make use of that money by then will be to sell the house and live in a camper van (or a bungalow somewhere up north).
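The house-price arithmetic above is just compound growth; a throwaway sketch (method name invented) reproducing the ~£1,135,189 figure:

```java
class MortgageMath {
    // Future value of a principal compounding annually at the given rate,
    // rounded to the nearest pound.
    static long futureValue(double principal, double rate, int years) {
        return Math.round(principal * Math.pow(1.0 + rate, years));
    }
}
```

So £350,000 at 4% for 30 years is 350000 x 1.04^30, which comes out at the £1,135,189 quoted above.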
Of course, the story is really saying that a significant percentage think they will be multi-millionaires with fast cars and expensive holiday homes, and (as a whole) those people are delusional.
Java performs a lot better than Ruby. And it's not far behind C++.
And one major attraction of Java is its scalability (or the ease that common Java enterprise frameworks enable scalable application design). That's why it is used extensively in the real world.
Luckily Java has had multi-threaded development capability since day one, and this isn't the first time it has been made easier to use - the java.util.concurrent framework is very old now, for example.
What lambda expressions actually do is allow the programmer to express concisely (a big problem with Java: previously you would have needed a bulky anonymous inner class implementing a functional interface) a more functional model of programming that also happens to be easily multi-threaded.
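As a rough illustration of the point (names invented for the sketch): the same functional interface that once needed an anonymous inner class becomes a one-liner, and the stream form parallelises almost for free:

```java
import java.util.stream.IntStream;

class LambdaSketch {
    // Pre-Java-8 style: a bulky anonymous inner class for one line of work.
    static Runnable oldStyle() {
        return new Runnable() {
            @Override public void run() { System.out.println("hi"); }
        };
    }

    // Java 8 style: the same functional interface as a lambda.
    static Runnable newStyle() {
        return () -> System.out.println("hi");
    }

    // And a parallel stream spreads the mapping work across threads
    // without any explicit Thread or Executor code.
    static long sumOfSquares(int n) {
        return IntStream.rangeClosed(1, n)
                        .parallel()
                        .mapToLong(i -> (long) i * i)
                        .sum();
    }
}
```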
The only well-paid group of people here is the global warming/climate change deniers, funded by companies with vested interests in keeping the status quo on emissions, oil/coal use, and so on. Billions of dollars go there, and yet the science that supports them is under 1% of published papers.
Oh quit it with this line of argument. They're in there so that the signatories to the agreement don't tell the media or publicise the results of the agreement widely, they don't stop reasonable disclosure within immediate family, etc.
The father had a perfectly reasonable expectation that family financial matters would be kept within the family, and that his daughter wouldn't gob it all over the world. Even so, he should have reminded her when he told her, that she couldn't give details. He probably did, but she wanted to brag about her "win by family circumstance". Lesson for parents - don't tell immature offspring anything when there's a lot at stake.
In the end, that $80k was probably just about reasonable recompense for the stress the whole situation (firing, court case, etc) put him through, and she has lost him that with her large gob. What could have been a decent college fund (or parent-funded downpayment on her first house) for her is no longer, so she'll now be racking up the student loans, so hopefully she will learn something from this.
"The "terms"; ie "the agreement between the two parties", was that he recieves 80K for not disclosing the deal to anyone...... That means "no-one"."
I am almost 100% certain that it doesn't mean immediate family, that would be an unreasonable restriction.
Also, is his daughter old enough to be a party to such an agreement? I doubt it.
I think this man's lawyers have let him down. But not as much as his daughter, who has learned that you don't just gob your mouth off about everything that happens in your family to everyone on a public forum.
I presume this is good for things like wireless video streaming, where you could put your phone/tablet down near to your TV and use it as a lossless wireless display that can actually cope with video, games, etc.
It's only about 120 times faster than my internet connection supposedly is, so it's certainly not for that!
I don't think the motion controller was the worst idea in the world, and certainly it made the Wii a good party console.
But the Wii U's expensive-to-include gamepad certainly is a bad idea, and it's making the Wii U too expensive compared to other options, with little room to reduce the price in the future. Some of that money could have been spent on the lacklustre CPU in the Wii U.
The PS4 is simply so much better, but not vastly more expensive. The Wii U is this generation's Dreamcast ... but will it even sell 10 million in the end?
Maybe it's not too late for Nintendo to release a Wii U sans Gamepad but with a classic controller, drop the price, and ensure popular games can run without the Gamepad present.
"Great" for London-only businesses I guess.
TBH decent internet search has done away with the need for really local domain names.
Maybe there will be subdomains, like putney.london and foresthill.london ... that would let you have really local domains.
God, I can't get excited by this really, I can't even bring myself to use an exclamation mark anywhere.
A truly great family machine - games for the kids (and at the time they didn't care about 50Hz pixel-perfect smooth scrolling, but they did care about lots of colours), but dad could use the excuse of "doing the family finances" too (hence the adverts showing both games and business stuff at the same time) :-)
I did school work in Tasword on my CPC 6128 with DMP2000 printer. More than adequate for essays.
You should check out the CPC Wiki then - http://www.cpcwiki.eu/index.php/Main_Page they'd love to hear from someone who wrote such software.
I presume you worked for Kuma software then :p
Indeed so - IIRC Amstrad got the 3" drives cheap from Hitachi because the format was failing.
They kept it alive enough for a double density version to be included in the Amstrad PCW range, although the PCW 9512 used 3.5" disks in the end.
Data exchange was less of a concern because the disc formats were different on all the home computers anyway. The CPC, despite the 3" discs (yes, "discs" :/) did have an option to use CP/M formatting on the drives which actually made the system more compatible in some ways than other computers. And you could always buy an external 3.5" drive.
The article forgets to mention the old PcW16 - this used a 16MHz Z80.
Is there anything actually factually accurate in this post, apart from possibly the assertion that most families bought the green screen version? And indeed, being able to use the computer when mum or dad were watching the TV was invaluable.
The CPC did suffer from Spectrum ports, that's true. But the games that made use of the hardware were far superior and prettier.
The C64 was smoother for games, especially when scrolling was involved, that's true. But it looked crap, with graphics the colour of mud.
The CPC 464 suffered minor slowdown due to the screen display, the C64 did too and managed half the resolution.
The external modulator was rubbish. But as soon as SCART came out you could have a direct RGB signal to your TV very easily. And vice-versa, if you had the colour monitor, you could add an external TV tuner and gain a cheap second TV. I bet even today someone is watching Freeview on a CPC monitor somewhere!
YouView was a dead duck as soon as it was revealed it would be a £200+ set top box.
Since then there has been a raft of cheap media players with iPlayer and 4od installed - ITV Player often lagging behind. The cheapest is the £10 Now TV device, which is subsidised by Sky I believe, but I don't think there is any obligation to buy Sky's premium content on the device.
Of course the problem is that all of these catch-up channels have their own separate applications and UIs, ITV Player's often being the worst (a web portal with a terrible interface, at least on the PS3). Sadly the other half needs to watch Coronation Street on catch-up after the nipper has been put to bed, so we have to suffer the terrible UI and the PS3's screen saver kicking in, which doesn't happen in the native PS3 iPlayer and 4od apps.
So Freeview Connect needs to deliver a single, consistent, usable UI. I suggest they make use of the current excellent iPlayer UI and extend it to host catch-up TV and archive TV from the other channels.
So ANPR is a bit rubbish - which suggests the current numberplate system is not ideal.
A QR-code of the numberplate would be quite small, and could be done in the form of a window sticker that is suitably reflective, and can be mounted on the rear and front windscreens (avoiding muddy numberplate syndrome).
ANPR readers should be able to use suitably high resolution cameras that can actually detect these QR-codes in the images.
So all that's required is for the government to issue these QR-codes to everyone in the country, and to make not displaying one an offence. Any image in which the ANPR can't detect a QR-code can go to human verification of the number plate, with a fine for not displaying one (or for driving with obscured windows).
But yeah, you're not going to sell such tech for a grand a box, are you?
No, Apple license the ARM Instruction Set in the form of an architecture license - in the case of the A7 chip, that's the ARMv8 architecture.
The ARM CPU cores in the Apple A6 and A7 are full custom in-house designs, the first was called Swift and the current one is called Cyclone. http://en.wikipedia.org/wiki/Apple_A7
We don't know yet whether other architecture licensees (and ARM themselves) also incorporate what this patent describes, and have the appropriate license for the patent if they do. The patent has to be far more specific than "branch prediction with speculative execution" ...
I'm guessing that they bet that corporate IT departments don't choose laptops based upon screen resolution, but all the other factors written about in the article.
But I agree that in a 13.3" laptop, 1366x768 has had its day. 1600x900 would be a good "non-HD screen" target to go for (rather than the 2560x1440 displays).
A real shame for the employees, just before Christmas too. I hope this was done whilst there was enough cash in the bank to pay their final salaries and redundancy.
As for potential purchasers, we have Google as a major company that could do with owning its own hardware company for server uses (and search appliance uses), and of course Facebook (with that employee on the board of Calxeda) would also be interested.
This also shows how hard it will be for AMD to compete in the ARM Server market with their forthcoming ARM server chips, but at least AMD have industry contacts and fabric (Seamicro). Indeed this could be what put investors off sticking another $30m into the company.
Come and Avago if you think you're hard enough!
So what's that £91m in assets that will be fully depreciated in five years? Does it include the hardware within redundant datacentres? Or is it non-reasonable depreciation only?
I am always astonished at the cost of these systems, but when you get large software consultancy firms involved it's hardly a surprise. Poor code at a high price, delivered late and not to spec.
How about the courts also stop foreign media from publishing the names in the first place?
The law is good, to protect the victims even if that means a perpetrator gets anonymity. But the law is no good if it can be worked around by such a simple thing as a foreign paper publishing the names online for all to read in this country. Arguably, the paper should be taken to court for making this information available in the UK (they could apply some form of IP based restriction to the article). At the very least, that media disrespected our laws (even though they don't operate within them themselves).
Heh, 4K is a horizontal measure of pixels, not vertical like all previous resolution labels. Those who want to remain sane will say it refers to a class of resolutions similar to 3840x2160 (2160p), but also including 4096x3072 (4x 1024x768 - the 4K that this article refers to) and some others.
(Those that want to get annoyed and have an argument will state it's 3840x2160 only, and that's not even 4K and RAGE, etc.)
I think it should be called 16:9 2160p or 4:3 3072p, but it appears that many humans can't deal with such large numbers, so we've wrapped around to 4K for some reason (movie related).
I agree that a multiple of 1024x768 is inevitable, and thus the @3x 3072x2304 is the easier, and more logical, resolution to aim for in a 12.9" tablet. It would be less stress on the GPU (around 2.25x the pixels of @2x instead of 4x), and possibly would be doable with a plain old A7 with a much faster GPU, or an "A7X" with 50% more (and faster) GPU instead of 100% more (and faster) GPU; a same-sized GPU clocked 200% faster is not really an option.
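The pixel arithmetic behind that, assuming the current 2048x1536 @2x iPad panel as the baseline (a quick sketch, class name invented): @3x is 3072x2304, which is 2.25x the pixels of @2x, versus 4x for a full 4096x3072 @4x panel.

```java
class PixelMath {
    // Total pixel count of a display mode.
    static long pixels(int w, int h) {
        return (long) w * h;
    }
}
```

So 3072x2304 = 7,077,888 pixels against 3,145,728 at @2x: a ratio of exactly 9/4 = 2.25, roughly half the fill-rate burden of going straight to @4x's 12,582,912.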
I don't think the retina name is an issue. There's more to eyesight than pixel detection at certain distances and DPI - there's angular resolution as well, for which retina is still not enough.
I'm guessing iOS is a given, due to the touch interface and lack of suitability of Mac OS X in such a device.
That's always the bad thing about sharing a name with someone who becomes a hated criminal!
Luckily none of the affected people are taking it further, but I can easily imagine that another person would involve the police if they weren't aware of why they were suddenly receiving hate messages.
The ARM1 was designed as a BBC Micro add-in processor, and Chris was referring to the BBC Micro in that statement.
At least it's getting most of the TV catchup services (well, no 4od yet, and nobody cares about itv player) and streaming video services (Netflix, Lovefilm) on its release date in the UK.
But the important thing for a games console is that they've got a decent controller and the console's hardware can do the games at 1080p. Sure, it needs more good games right now, but it's the same with every new console launch.
They just need to improve the capacities - 2MB chips aren't useful, even for phones, even if they stack in-package to 16MB. They need to be around 500x more capacious.
"not all items will only be 40 yards away in a straight line."
I would seriously hope that the computer overseeing a picking job would organise the things to be picked in an optimal manner for each meatbot, presumably taking a circuitous route from the empty truck zone to the full truck ready for unloading by other meatbots, before the driver returns to the empty truck zone to start again.
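A greedy nearest-neighbour ordering is the simplest sketch of what such an overseeing computer might do. This is an illustrative toy under that assumption - not whatever Amazon actually runs - and every name in it is invented:

```java
import java.util.ArrayList;
import java.util.List;

class PickRoute {
    // Order pick locations by repeatedly walking to the nearest
    // unvisited item, starting from (sx, sy). Items are {x, y} pairs.
    static List<Integer> order(int[][] items, int sx, int sy) {
        boolean[] done = new boolean[items.length];
        List<Integer> route = new ArrayList<>();
        int cx = sx, cy = sy;
        for (int step = 0; step < items.length; step++) {
            int best = -1;
            long bestD = Long.MAX_VALUE;
            for (int i = 0; i < items.length; i++) {
                if (done[i]) continue;
                long dx = items[i][0] - cx, dy = items[i][1] - cy;
                long d = dx * dx + dy * dy; // squared distance is enough
                if (d < bestD) { bestD = d; best = i; }
            }
            done[best] = true;
            route.add(best);
            cx = items[best][0];
            cy = items[best][1];
        }
        return route;
    }
}
```

Real warehouse systems do far better than this greedy heuristic (batching, aisle-aware routing), but even this avoids the "40 yards in a straight line per item" worst case.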
And I think I saw on TV recently that Ocado went for the full automated picking factory - clearly this is where Amazon could be going soon. And then the meatbots will be complaining about fewer and fewer jobs that they are qualified to do.
I'd hate that. Then again I'd hate a lot of manual jobs, and at least this one is under cover.
The main issue I have with it is the constant timing and countdown clock thing, which is a clear and obvious source of stress, especially if you overrun often - I can imagine a red light blinking by your name on some manager's tablet.
The analysis pulls figures out of thin air, and pricing the AMD APU at $100 is tantamount to saying "we don't know". Besides, how is it "three times as big" as any other 28nm chip? It's far smaller than high-end 28nm GPU dies, for a start. So they've pulled the yield figures out of their rears as well, without noting that the APU has redundancy built in - 20 CUs instead of the 18 used, for example. Yield will be higher than the 66% they are quoting: the process is mature, it's well known, and the die has redundancy.
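The effect of that redundancy can be sketched with a simple binomial model: if each CU independently works with probability p, a die with 20 CUs needing only 18 good ones survives up to two dead CUs. The figures below are illustrative assumptions, not real process data:

```java
class YieldSketch {
    // Binomial coefficient n-choose-k, computed incrementally as a double.
    static double choose(int n, int k) {
        double c = 1;
        for (int i = 0; i < k; i++) c = c * (n - i) / (i + 1);
        return c;
    }

    // Probability that at least `need` of `n` units work, each with
    // independent success probability p.
    static double atLeast(int n, int need, double p) {
        double total = 0;
        for (int k = need; k <= n; k++)
            total += choose(n, k) * Math.pow(p, k) * Math.pow(1 - p, n - k);
        return total;
    }
}
```

With an assumed p = 0.95 per CU, a die needing all 20 CUs good yields only ~36%, but needing just 18 of 20 lifts that to ~92% - which is why a flat 66% yield estimate that ignores the redundancy looks pessimistic.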
The $88 price for the GDDR5 memory is probably realistic (although Sony's individual deals with the memory manufacturer aren't known), but the price will drop over time, as will the price for the APU.
Yeah, whatever happened to local loop unbundling?