Re: Hololens hype
...me too, and one of them (highway surface inspection and maintenance) does indeed involve someone wearing them while driving, or at least being driven.
@Halfmad, Apple is an unlikely candidate, as the company doesn't sell into those markets too much. Apple does have a small "enterprise" market, but those customers are in the design and media industries (print, tv, cinema, web) - all very much 2-dimensional (with the half-exception of cinema stereoscopy).
In industrial product design, biomedical research, architecture and what used to be called CAD/CAM, Windows is the predominant OS.
"Uh ... no. Show me real-time code that is written in C++."
Uh ... yes:
Hey, here's two more that you can see the code for:
Whether a kernel is C or C++ depends more on when it was started than on any other factor. C++ is very nearly a superset of C; anything that needs C for "efficiency" is just as possible in C++ code.
"You can write OO code in any language if you are perverted enough."
Ah, you've used glib GObject, then...
At this point, I'll have to confess to writing quite a bit of OO code in 68000 assembly, although I didn't recognise it as such at the time. (I even had vtables)
@Dan55 above on C being "faster". It's not. C++ code runs exactly as fast as the equivalent C code - if anything, C++ offers a good compiler more optimisation opportunities. Non-virtual method calls are simply C function calls, in-function variables are allocated on the stack just as in C, and exceptions/RTTI can be disabled if your module doesn't require them. (Just specifying throw(); at the end of a method declaration removes the overhead of exceptions in that method, even if the rest of your code uses them.) C99 borrowed a lot of its nice features from C++ (it's a shame that C++ took so long to take "null" on board).
Just because C++ lets you quickly write inefficient code (like copy-by-value parameter passing for very large types), it doesn't mean that C++ itself is less efficient, just that some people don't know as much about programming as they think they do. (The small consolation of such dumb behaviour is that (a) it's less likely to cause bugs than naive use of pointers, and (b) you can optimise the problem away later.)
I'm happy to accept the argument that C++ leads developers to use less speed-efficient data structures like the STL containers for tasks where a hand-rolled equivalent would be superior, and that C++ code can thus be slower as a result. But that's trading dev-time for run-time. Unlike a hand-rolled data structure, the STL version will work reliably straight away; and in general, I heed Dr Knuth's warnings about premature optimisation...
Yes, BSD's kernel is written in C.
If Apple had acquired Be rather than NeXT in 1997 (i.e., if they had made their decision on purely technical grounds, rather than on who the CEO of NeXT was), then their kernel would now be written in C++.
(BeOS lives on as Haiku, which is Open Source, if anyone is curious)
As it is, Cupertino's current device driver and I/O layer is written in C++, and so are many of the low-level libraries unique to OS X. The remainder are C, or Objective-C for the less performance-critical ones.
This illustrates only that competent developers use a variety of tools. OS X was not a clean-sheet design; it was actually something of a rush-job, as Apple had fallen far behind Sun and Microsoft in OS capabilities and desperately needed to catch up: remember that even in 2000, Mac OS was only co-operatively multi-tasked, so one bad application could often kill your entire system. OS X as released was an amalgamation of many different sources: each was chosen because it was a proven, viable subsystem, not because it was written in the One Holy Language.
But, going back to kernels: the reason the BSD kernel is written in C is that AT&T's UNIX kernel was written in C, and that in turn was because C was the language Dennis Ritchie developed specifically to allow the UNIX OS to be portable across AT&T's various system architectures.
You have completely missed the point of the glasses. They're a tool, not a toy. Nobody will wear them outdoors ... unless they're doing work that requires them.
I was sceptical about this when I saw the presentation, but today I can think of hundreds of applications for this in industry.
What this can do is provide a standard interface and API for 3D visualisation, one that allows anyone (Autodesk, for example, or SolidWorks) to add holographic modelling and manipulation to their product. And there's a mass-produced client device to lower the cost of using the technology to the point where small practices can afford it. If that doesn't sound interesting to you, you're not in an industry where showing people things, in 3D, that don't actually exist yet, is important. Here are some examples:
You're an architect, and you're trying to show a client what you mean about moving a ceiling, or you're a surveyor, and you're trying to perform an initial layout of a building site on the ground. Or you're a car designer, and you're trying to get a feel of your work in three dimensions. Using this is a hell of a lot cheaper than clay modelling, and your engineering department can see the changes you make, and can work remotely with you as if you're all standing around the same life-scale model. When you're done, your brand manager can have a look, and even if she's in Michigan, Turin or Paris she can see how the design is progressing, "walk" around it, and make informed comment.
Outside of creative tasks, how about maintenance of industrial equipment: the glasses can provide you with annotations in real space showing you the assemblies that you need to repair, or an X-Ray view of the equipment you're servicing.
Speaking of X-Rays, this kind of technology is already very useful in medical training: there are existing surgical training systems using VR; this product lowers the cost of delivering these, allowing their use in teaching to be broadened.
All of these applications are possible now, but the barrier has been that there's no common API or client device specification, so my 3D visualisation system might not work with your CAD package, for instance. More importantly, nobody mass-produces the equipment, so it's horrendously expensive. Microsoft's product, and the addition of the APIs to Windows, makes both of these adoption barriers smaller.
(The iPhone did not create the idea of mobile applications, what it did was lower the barriers to developing and selling mobile applications)
Google's failure was that there was no actual need for Glass - it basically gave you a smartphone that you didn't have to take out of your pocket. Glass was a consumption device, not a creation device, and it was expected to be worn all the time, rather than only when there was a need for it. Very different usage models, very different capabilities, and very different applications.
Cortana isn't just voice recognition. It's actually very smart at determining the context of your queries based on your previous questions.
Also, and most important for a desktop/enterprise system, you can drive it entirely by text, where it becomes a kind of free-form CLI, for want of a better analogy.
I can imagine a lot of people not wanting all of their stuff in US clutches once they understand what this implies.
I'm surprised you missed one of the biggest ongoing data-privacy cases of 2014, but here's a summary:
tl;dr = Microsoft won't store your private data on servers under US jurisdiction.
"The better solution would be to broadcast from a transponder that, nominally at least, only covered the UK and Ireland"
Yes, and it'd be even better if not for the fact that the UK and Ireland are not the same country. This leads to the situation where Irish TV-licence payers are unable to watch their own channels on Astra without paying Sky a subscription.
The alternative, of Free-To-Air broadcast on Astra 2A, would have the effect of increasing the Irish broadcasters' reach by a factor of 16 (from 4 million to around 64 million viewers). Sounds good? Well, no, not for a public broadcaster it isn't - the cost of live events and other bought-in programming would now also go up to account for the larger addressable viewership. It wouldn't matter that 90% of that market have no interest in watching: in TV, you pay for how many people can view something, not how many actually do.
(Yes, there is a free satellite system in Ireland, but it uses Ka-band, so is incompatible with the Astra Ku-band receivers that most people own, and is only intended as infill for areas that cannot receive DVB-T signals.)
But broadcasting is one part of the media market where it's harder to make a clear-cut case for a single, flat market; there are valid societal reasons to not force broadcasters in small countries to play at the same scale as their larger neighbours.
However, when it comes to direct sale of digital media to customers (Amazon, iTunes, Qobuz, etc.) it shouldn't be so difficult to fathom: the EU Single Market rules mean that a publisher cannot stop me buying a book from Germany/Hungary/Slovenia/wherever. But if I try to purchase an eBook of the same text, from the same EU seller, I can't; because now, suddenly, it's not a "good" any more, and they don't have the right to sell it to me. Yet if there's something that's only available as an eBook, which they cannot sell me, I can ask them to print it out and ship me the hard copy, and suddenly they're allowed to sell to me again. That is madness.
"But why is HS2 so much more expensive than building the original railways in the 19th century, which connected up just about every village and hamlet?"
Those railways were very expensive for the people who invested in them... http://en.wikipedia.org/wiki/Railway_Mania
I think if you include land acquisition costs in Musk's plan, the price advantage will shrink somewhat, but Musk's offer has a higher risk of overrun, or of not working at all. Doing a proof of concept will help him get acceptance from the people who are being asked to pay for it. Right now, you'd need to be insane to pour billions of taxpayers' money into an idea that hasn't even been demonstrated as viable.
Yup. There's no place in sausages for offal. (unless you're making something like leberwurst, of course)
"Cheap cuts" is not the same as "bad meat". Stews and pies use cheaper cuts with more fat, but a good pie needs those to come from good quality meat.
The fat in these cuts holds much of the flavour, so lean cuts like chicken breasts or fillet steak are bad pie candidates. And yet, many home cooks end up using these because supermarkets don't stock anything except lean cuts, and then get disappointed that their stews or pie-fillings are tasteless and rubbery.
Find a butcher. There are still a few of them about, and they're cheaper than you think.
(And I won't have a word said against vegetarian pies either: a good pie doesn't need meat, it just needs to taste good)
Agree. Who thought that babies have such good motor skills that they'd be able to grip the sensor in the first place?
By the time a child can actually hold the thing reliably, they're old enough to shout for attention.
The most important function of a monitor is to detect if a baby stops moving, not to measure their temperature or heart rate, so you'll still need to shell out for a sleep monitor.
And as for measuring heart-rate in the first place, I'm reminded of George S. Patton's (paraphrased, and possibly apocryphal) quote: "just because you can measure something doesn't mean that it's worth a damn to know it"
"They still have to pay you for the time to do so, if you're an employee."
True, but in a development position, they're only paying for your time because they'll own the copyright on what you produce for them.
It's possible to make an argument that maintenance programmers don't fall into this rule, but you've then got to ask where the software to be maintained would have come from without copyright? ("Open Source" is the wrong answer, because that also depends on copyright to enforce the sharing of everyone's contributions)
"Copyrighting code has had a far worse impact on business development than any perceived benefits it has provided."
That would be the same copyright that allows you to work as a contract developer (without copyright, the work you produce for an employer belongs to nobody, and they don't have to pay you for it), or is it the copyright that prevents people fleecing FOSS projects and including Open Sourced code into proprietary systems?
Copyrights are not the same as Patents. I don't agree with Software Patents, but you'd have to be a fool to want to abolish copyright.
"British enough to serve in the RA". Milligan was born in India, but his father's place of birth was in Ireland, then (1918) a part of the United Kingdom. The family moved to England in the 1920s, so as a British resident, he was conscripted into the British Army in 1939 and served for six years.
However, Milligan was never entitled to automatic British citizenship, and once India became an independent country, his automatic right to a British passport was cast into doubt, and he eventually became stateless. After a long, long battle around his refusal to swear the Oath of Allegiance (out of stubbornness, and the not unreasonable argument that six years service in the Army should count as prior proof of allegiance to the Crown), he exercised his right to Irish citizenship: Ireland's law gave him an automatic right to citizenship as the child of a British citizen born in Ireland before 1922, but British rules required him to have been born in the UK too.
Spike always considered himself Irish, but never lived in Ireland. However, his grave carries the Irish inscription "Dúirt mé leat go raibh mé breoite" ... "I told you I was ill" (the council refused to allow the English version).
You don't need physical access to the laptop. All you need is access to the projector connector in the meeting room where its owner is going, and a small laminated card saying "Mac users: Sorry, but the projector doesn't hot-plug with Thunderbolt displays. You need to plug in the cable, then restart your laptop"
The "Bing" name replaced Microsoft's previous search engine portals in 2009... a year after that post about Stack Exchange.
Joel is good, but he's not that good.
A flame war is not harassment. Criticism is not harassment. This isn't a law against harsh words; it's a law against actions.
Forum arguments are not bullying: The forum posts are the record of the conversation, and the person being "abused" has the right of reply right there by posting a rebuttal. It's still free debate, even if it degenerates into abusive debate; also, reading the transcript, it will become clear who's being abusive, and who isn't, should the matter need to be taken further. Also, the ability to edit posts allows those people who type before thinking a chance to retract and apologise for their ill-judged remarks, and most forums are moderated to some extent.
This law isn't about someone making one or two nasty posts on a forum; it's about premeditated, concerted actions to damage another person's career or reputation. "Cyber-bullying" isn't someone getting annoyed because some other nerd thinks BSD is a better licence; it's someone nursing a grudge to the point where they're prepared to impersonate their opponent online, execute anonymous smear campaigns, spread malicious rumours and fabricate "evidence" for these. That, I hope you agree, is wrong, and should be a crime.
The reason for this consultation is this: while the current law makes running a smear campaign a crime, it does not explicitly mention the case where the instrument of that campaign is the Web.
The change in law is to make sure that if someone is accused of defaming or bullying online and there's sufficient evidence to bring a charge, the wording of the law itself will not allow the offender to walk free on a technicality. ("But, Your Honour, when this law was drafted, publishing meant printing, and my client did nothing more than enter some text on a web server. My client had no idea that this information would be read by anyone.")
There's no surveillance involved. If a complaint is made, the police will try to find evidence that will identify the suspect, just as in any other crime, but there's no suggestion at all of recording everything just in case.
People crying wolf? Well, this is a criminal offence we're speaking about, so offences must be reported in the first instance to the police. I don't know who'd go to their local Garda station with a printout of an LKML thread where they've been called a clueless amoeba, but I'm pretty sure that if they did, the matter wouldn't go much further than the station desk.
The Dodge Dart is a very different car. It's wider, longer and --crucially-- much heavier than the Alfa, despite sharing its architecture.
Wait until Summer, when the 3-Series sized Alfa is to be launched: that will most definitely be sold in the USA. If you can't wait, well, there's always the 4C ;)
Nice to see a reviewer who gets why people buy Alfas. Soft-touch dashboard plastics are no substitute for having fun while driving.
@Robert - the car to (finally!) follow the 159 will be launched in Summer. There's supposed to be a Sportwagon version either at launch or shortly after.
I have a Giulietta (but a 2.0 diesel rather than this QV model -- diesel is far cheaper than petrol here in Ireland, and I have nobody to pay my fuel bills!). I had two 147s before that. I do believe that the "reliability" story is now as much perception as reality: my previous two 147s were completely reliable, and never stung me with "additional service items" in the twelve years I had them (five years for the first car, seven for the second; both bought new).
I did, however, need to replace wishbones prematurely on the 147s: they're almost a consumable part thanks to the overuse of too-tall speed bumps in our towns. (Tellingly, I don't remember seeing many speed bumps when driving in Italy a few years ago).
Hardcastle, would you like to share the details of when you drove a Giulietta, what model it was, and what you compared it with?
I test-drove the A3, 1-Series, Volvo V40, Focus 3 and Golf 7 against the Giulietta, and bought the Alfa. Just because I'd owned Alfas in the past didn't mean I'd automatically go for the new Alfa: I wanted to see what else was out there. I liked the 1-Series, but not the horrible pedal placement or the body. The Volvo was too hard (a sin in a Volvo), and the other three just didn't do anything for me (the A3 in particular was utterly devoid of any kind of driver feedback).
I'm happy to hear first-hand experiences, but yours sounds like the usual "all Italian/French/Japanese cars are crap" nonsense that stopped me buying magazines long ago.
Any other text editor? Try SublimeText. Cost/benefit probably is in vim's favour, but it shows that there are alternatives.
I write my python and PHP projects in SublimeText; but for anything more than casual editing of my C++ and C# code, I'll end up turning to Qt Creator, Visual Studio or XCode. You'll notice I didn't mention Eclipse: that's intentional.
Picture 1: Twitter Inc is trying to coax you into giving them access to your Outlook.com email account. (They want to find out which of your Twitter contacts are also known to you via email)
When you foolishly click "Yes", Twitter's server starts the process of obtaining an authentication token from Microsoft that it can later use to slurp your email info.
Picture 2: Microsoft is asking you to log in and prove that you are the owner of that outlook.com account. Once you do that, you will be asked to give permission for Twitter to grab your contact list.
There is no business connection between Microsoft and Twitter. If you had used a GMail account as your Twitter account's email account, then Picture 2 would show you the Google signin process instead.
Diminishing returns set in once you get to the modern codecs, but there isn't one of the sample images where the WebP version is of comparable quality: BPG pulls out more detail without the edge artefacts that mar the WebP images.
For example, on the Alfa Romeo 6C picture, http://xooyoozoo.github.io/yolo-octo-bugfixes/#vintage-car&webp=s&bpg=s , the BPG image allows you to see that the maker's badge reads "ALFA ROMEO MILANO", while the WebP image has obscured the word "ROMEO" entirely. (WebP doesn't have the horizontal green ripple below the top of the car grille, but that's the only noticeable difference).
At the other end of the frequency spectrum, BPG also excels in its treatment of gradients and out-of-focus image areas, which are the weakest part of JPEG's block-based coder.
In fairness to the JPEG committee, they were primarily working to solve the problems of the photographic printing industry within the computational constraints of the late 1980s - the longevity of JPEG JFIF is a testament to how good a job they did.
For what it's worth, LexisNexis is not US-owned, but part of the UK/Dutch publishing conglomerate Reed Elsevier, and has been for the last twenty years.
Incidentally, a credit bureau (or credit ratings agency) does not "decide how much credit you get", any more than TripAdvisor, Inc. decides whether a particular hotel is good or bad. The ratings agency only reflects the experiences of those companies who have either lent you money or sold you goods and services on credit in the past.
In the same spirit, on iOS phones, you can use the settings under "Settings > Notifications > Do Not Disturb".
Similarly, for Windows Phones that have Cortana, look under "Settings > quiet hours".
Both of these are time-based (WinPhone's version can also activate based on your calendar appointments) but time-based switching is usually good enough to guarantee some time away from callers. Both also allow selected callers to bypass the block.
" If the original 500 had an engine of only 479 cc what does that say about the 1.4 or 1.7 litre engines of the new car?"
The FIAT 500 and 500C models are available with an 875cc two-cylinder engine (but producing 85 or 105 bhp, because time has moved on since 1957), and it suits the car very well.
To my mind, FIAT's revival of the 500 is the best relaunch of a historic model by a manufacturer (and to be clear I'm talking about the 2007 model, not the X or L, which are different members of a family of models). BMW's MINI was always much bigger than BMC's original Mini: the original car was what we'd now call a City Car (A segment), but the relaunched model was a "supermini" (B Segment). The new one takes this enlargement even further to almost be a "Small hatchback" (C Segment). And I'd rather not think about the Countryman or Paceman.
"The new cars are not in the same class as the originals"
FIAT has at least kept their 500 in the same size category as its original: a city car. The fact that it's bigger than the 1957 car is more to do with 50 years of crash protection regulations and raised customer expectations regarding cabin noise and comfort, than any desire to bloat the model.
(Incidentally, the 2001 BMW MINI and the 2007 FIAT 500 share the same designer: Frank Stephenson, who now heads design at McLaren. Stephenson left BMW long before the MINI facelift that ended up looking like a Chinese car maker's knock-off of the first one)
As someone who proudly owns an original Mini, I'm sure you understand the idea of owning a car because it has character, even if it might be more expensive and less practical than the alternatives.
If you want the "car as a cheap utility" side of the old 500, FIAT will sell you its descendant: the Panda, a superbly useful and reliable car at an affordable price. The 2007 500 isn't that, and can't be that, because customers' expectations of a "utilitarian car" are so much higher now: hence the Panda's boxy, space-maximising shape. The new 500 model was about recapturing the emotional bond that people had with the old 500 - and with the aforementioned twin-cylinder engine, it's got a lot of that same fun to it.
The 500 L and this X are not relaunches of any old model (although you might make a case for the 500L being a spiritual successor to the 600 Multipla, even if it looks completely different). These two new cars are more closely related to each other than to the 500, both mechanically and in terms of size. "500" is now a sub-brand covering what you might call the "want" cars (heart over head); the FIAT brand will deal with the "need" cars (head over heart) like Panda, Punto, etc.
My own thought is that, now that cinema presentation is mostly digital, there's no need to stick with the same frame-rate all through the feature. Certain types of action, particularly close combat, would definitely benefit from a higher frame rate, but that doesn't mean that the rest of the feature should be similarly high-rate. (Cinema projectors project each 24fps frame multiple times as it is, because this minimises the perception of flickering)
I just wonder if it's possible to do this change of display-rate in a way that won't provoke jarring changes in a viewer's perception.
The answer is that cinema features are filmed at twenty-four frames per second. That's enough for the illusion of movement, but not for smooth panning. Cinematographers and directors go to great lengths to distract the viewer from this phenomenon when tracking their subjects.
The reason you're seeing it on the F1 Now feeds is different, and could be a display-rate mismatch between the source stream (50fps if it's a British broadcaster) and your tablet's display (60fps, as most LCDs are) - you'll get a kind of "6:5 pulldown": six displayed frames fed by five frames of input material, so one is doubled. The streaming server could also be dropping alternate frames to save bandwidth, thus reducing the time resolution to 25 images per second.
The rest of this is long and a little rambly, so you can stop reading here.
This low frame rate is a legacy of the technology that was available in the 1930s. At that time, the available mechanisms could not pull a new frame into the camera any faster without tearing or slipping. (24 frames a second isn't fast, but remember that the new frame is pulled up into position in the tiny fraction of a second that the projector's shutter closes, so the mechanism needs to be much faster).
Raising the frame-rate of cinema presentations is possible (has been since the early 1970s), but experiments with the viewing public showed that audiences do not like the effect of high frame rate cinema: it looks less "real" than the slower rate, and the reason has nothing to do with technology:
Traditionally, lower-budget TV drama was either broadcast live or captured on videotape, because video is far, far cheaper than film production (not least because videotape is reusable and takes can be reviewed instantly). Video did have one advantage over film, however, which is that it has a time resolution of 50 or 60 fields-per-second, which makes motion, and especially panning, much smoother. Higher-budget TV shows were still filmed on 16mm or 35mm cinema stock, at 24fps, because this allowed exterior and interior shooting (cheap drama was studio-bound; video cameras were too at first) and it got around the issues of selling your programming to a station with an incompatible video system. A desirable side effect is that these telecine presentations looked like "a real film" rather than "a TV show".
But, thanks to the historic use of video for cheap TV drama, a high frame rate is now almost indelibly associated in the audience's mind with low-budget videotaped TV shows or live events. Simply displaying a film feature at a high frame rate suggests the same low-budget, "unrealistic" experience to cinema viewers, or makes it seem like they're watching a live broadcast - by making it look like the actors are "right there", it also makes it look more like they're "just actors", thereby destroying some of the suspension of disbelief.
The recent release of Peter Jackson's "The Hobbit" was offered at both 48 and 24 fps, but audiences responded that the 24 fps showing was "grander" and more "epic" than the 48. The 48 fps version was, by contrast, reported as being "like watching TV", and "fake". The 48 fps presentation was either dropped for its sequel, or very few cinema owners took it on.
If you've got a TV with motion interpolation, as most modern LCDs do, watch a BluRay source of a big blockbuster film first at its native 24fps, and then with all of the motion gubbins turned on. The latter might look more "real" but it also looks less "cinematic".
And, showing the same effect from the opposite side: pretty much all TV productions are now filmed digitally and broadcast at 50/60 fps, but big-budget drama is either shot natively at 25 or 30 fps or is de-interlaced down to a 25/24 frame-rate in post-production to achieve a "filmic" look.
Like it or not, 24 frames per second is considered as a sign of "quality" by the viewing public.
Video gamers spend more time in front of "a TV" than most TV viewers. In this market, the immersiveness of a bigger display panel will beat any colour or contrast benefits of a tablet.
I wouldn't put much store by colour reproduction as a deal-breaker either: bear in mind that 8% of males (and 0.5% of females) have colour-deficient vision - about 40% of these aren't even aware that they have such a condition until tested.
Feel free to swap your IT industry salary for the matching percentile on the scale of what songwriters earn.
I definitely wouldn't.
Just to clarify: I wasn't working at Apple in the Jobs/Scully era, and I wasn't in California - I'm too young and too Irish for that. All of those stories are other people's, and you can read them at the folklore.org site if you've an interest in the PC boom times of the 1980s or Apple in particular.
I did experience the slow death-spiral of the Apple of the 1990s, the return of Steve Jobs, and the effect that it had on the company, and in my particular case, the in-sourcing of all tech jobs to Cupertino that resulted in what a friend had already described as "being made an offer I could most definitely refuse".
I'd also note that the Apple that was dying on its feet was actually a fun place to work; and unlike today's industry-dominating behemoth, when you went home after work, you weren't still at the beck and call of a sleep-deprived stress junkie in Cupertino who's forgotten what timezones are.
As far as I can tell, he's not saying he would locate in Ireland, but rather if he had a global business that was looking for a headquarters, then the choice would be between Ireland and Singapore.
In Sculley's own words, his biggest mistake was not the firing of Jobs, but rather the decision to move Apple to Motorola/IBM's PowerPC RISC architecture, rather than spending the same effort to adopt Intel's CPUs instead. In hindsight, and considering that Apple eventually moved to Intel ten years later, it's probably true, but at the time, I remember everyone saying that RISC was the future, and Apple was going to speed ahead of Microsoft by jumping early. As we know, Windows 95, the explosion of commodity PC chipset suppliers, and the poor performance of Apple's emulated 68k software on PPC gave the lie to those predictions, but at the time you could see why Sculley made the decision he did.
I think Sculley was right to fire Jobs. Yes, the Macintosh became a success, but it was a Macintosh line whose development spending was under the strict control of Sculley. Had Jobs still been in charge, I do feel that the product's costs would have spiralled out of control in a search for an impossible "perfection", just like the Lisa's had.
http://www.folklore.org is a great site for the inside story of Apple during the first Jobs reign. You get a feel for a Jobs who was admired and feared by those who worked for him, but also a man whose ambition was not yet tempered by failure. By firing Jobs, Sculley could take some credit for Apple's later success when Jobs was hired back by the underrated Gil Amelio.
After Sculley, Spindler did nothing to address the reality: that the WinTel duopoly driving down prices was killing Apple's business while the company's R&D was draining away into a hole named Copland, whose ship date was slipping by about five weeks every month. Amelio shored up the holes, killed the money-pit projects, and chose to bring in the only person who could herd the various gangs of cats that actually did things at Apple, and that person was... Jean-Louis Gass... no, it was Jobs, of course. (Although we all thought Gassée's BeOS, not NeXTStep, was going to be the shoo-in replacement for Copland.)
(Still, Amelio's description of standing on the bridge of a crashing supertanker where the captain's wheel does nothing is a perfect summary of how bad Apple's internal management structures were - although I shouldn't complain too much, as I was hired at this time as part of a major group expansion that was done without the knowledge of anyone outside the building.)
In his first stint in charge, Jobs had always spoken of a mythical factory where "sand goes in one end, and computers come out the other", but it says a lot about his later, more mature, approach that Jobs's most successful products, iPod and iPhone, were outsourced as much as was possible. Do what you're good at; leave others to do what they're good at.
"stupidity and laziness on the typical end users part"
There is such a thing as user stupidity, but it's very rare. Mostly, problems are down to developer stupidity in assuming that everyone who will ever use your software actually gives a shit about software, or computers. They pay money to developers precisely because they don't want to give a shit about this stuff.
To you and me, it's obvious that you need to explicitly disconnect the iMessage service, because you and I understand how it's likely to work, and can then ask ourselves "what would happen if my number is still in the lookup tables when I no longer have this phone?".
Apple's software should have automatically de-registered the device from iMessage when the holder of that device does a system wipe, or when the SIM in the device changes (I suspect it does the latter already, so why not the former?). Then, somewhere in Cupertino, a database process would spot that a given subscriber number no longer has any iDevice UIDs attached to it, and thus delete the iMessage record. That would be the simple, obvious and transparent process. Personally, I'd also age the records used to determine if a number is still in the service, so that a UID mapping for a device that hasn't checked in in a long time would be suspended until it reappears: that would [eventually] fix the problem of a phone falling overboard.
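To make the record-aging idea concrete, here's a minimal sketch of how that housekeeping process might decide what to suspend and what to delete. This is purely illustrative: the field names, data shapes and the 30-day window are my own assumptions, not anything Apple actually does.

```python
from datetime import datetime, timedelta

# Assumed policy: a device UID that hasn't checked in within this
# window is suspended until it reappears (the "phone overboard" case).
SUSPEND_AFTER = timedelta(days=30)

def active_uids(registrations, now):
    """Return only the device UIDs that have checked in recently.

    Each registration is assumed to be a dict with 'uid' and
    'last_seen' (a datetime) - a stand-in for whatever Apple's
    real schema looks like.
    """
    return [r["uid"] for r in registrations
            if now - r["last_seen"] <= SUSPEND_AFTER]

def should_delete_number(registrations, now):
    """A subscriber number with no active iDevice UIDs left should
    have its iMessage record deleted, so SMS delivery resumes."""
    return len(active_uids(registrations, now)) == 0
```

With that aging rule, a wiped-and-sold iPhone simply stops checking in, its UID goes stale, and once the number has no live UIDs the routing record disappears without the ex-customer having to do anything.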
Apple's solution is a kludge, but it's one that benefits them by making the problem look like it's not theirs ("Hey, I can't believe your crappy LG/Lumia/Samsung can't even receive SMS - look at these I sent you yesterday, and you never got any of them. Man, you should have stayed with Apple").
It doesn't matter how it worked technically. Network TV broadcasts are provided under the condition that unauthorised rebroadcasting of the content you receive is prohibited.
Aereo was a system for rebroadcasting of network TV. (And, yes, a parallel set of unicast streams is a broadcast.)
It was a service based on the old saying that forgiveness is easier to obtain than permission. What that saying neglects to mention is that forgiveness is only one of the options you'll receive after your offending actions, and there's no guarantee you won't get punishment instead.
Using Xamarin would allow them to write the iOS, Android and upcoming Windows Modern apps in C#.
I suspect those Windows versions will arrive at the same time as the next Office release for Windows. Office 16 is due in Spring 2015.
Run lots of applications, and starve the bastard of CPU time...
You've hit on the Achilles' Heel of FOSS projects. There usually isn't a business requirement to meet, so the code is free to go wherever its devs want it to. That's both good and bad, depending on whether you're a developer or a user...
The other problem I have with this is that using github as a source will over-represent the work of very inexperienced programmers. These days, everyone's "first code project" ends up on github. If the developer's own inexperience is the major limiting factor, how can you accurately judge the effect of the language they've chosen on the code quality?
I use both OSX and Windows, and have to say Windows is now far less RAM-hungry than OS X is.
I can understand the reviewer's mistaken comment about OS X being "leaner" than Windows, because once upon a time this was indeed the case (10.5/10.6 versus Vista). It's just that since then, Apple's middle-aged spread has coincided with Microsoft's serious performance improvements in Windows 7 and 8.
I don't think any reasonably-priced infrastructure could deal with the sudden influx of 50,000+ devices (a modest average of 2.5 per attendee; it's probably more like 3.5) all opening twenty or thirty connections to various HTTP-borne services.
And, because this is the Web Summit, I suspect many of the cries of "network is down" are really "network is up, just slow, and I'm too impatient to let the sessions establish"
... is one area where Bing definitely exceeds Google. Google Translate looks like it's been abandoned, and still has a nasty habit of randomly inverting the sense of German sentences that use "es sei denn" (="unless" in English), or simply transliterating Japanese hiragana rather than translating the words written in them.
Google has a very good index, but that doesn't mean that everything else they do is equally good.
"Much if not most of the blame for it lies with the West and that is a subject of another discussion."
Bullshit. Blame lies with Putin and his band of crooks who have stolen Russia from its people. The Russian people don't realise that the root of their problems is not gays or Ukrainians; Russia is already a great nation, actually respected as such abroad, and doesn't need a circus strongman running the show to "prove" it. Putin is the abusive husband who convinces his victim that only he really loves them, and that they're worthless without him. The electorate really think that he's defending Russia... but from whom? The only people oppressing Russians are their own leaders.
"The West" would rather see Russia run as a stable, democratic nation with a functioning economy. For the security and happiness of its people, and also "ours".
Didn't you hear? When you can deploy instantly, you don't need to test anymore. Just unit-test your scripts, and fix bugs as users report them.
A friend works for a company making and selling a class of online software engineering tool (I won't say what class of tool, but it's not testing), and they do not formally test any of their web backend code. My friend, who does mobile client dev, thought they were joking with him when they said this, but no, they just run the backend in constant firefighting mode.
And then, when the backend fix breaks the iOS client, they don't understand why users are annoyed that they've no service for a week while Apple approves the updated client...
You consciously remember parts of Ayn Rand books? Wow... I managed to forget most of the one I read, but every so often, a little comes to the surface, like some kind of turd that won't flush down the toilet.
The real sin is in using the words of someone who by all accounts was a decent enough guy, who said we shouldn't be hating and killing each other quite so much, to back up exactly the sort of hatred that he was so keen for us to quit engaging in.
The part of the Bible you're quoting also contains lots of other rules for life that nobody much cares to follow anymore. It's funny how the one passage that lets you hate people you don't know is The Genuine Incontrovertible Word of God, while the ones about wearing mixed fabric, or not eating bacon butties, are somehow only guidelines that are open to interpretation.
(I'm not gay, that's the luck of the draw, but I have a number of friends who are, and I've heard stories through them that would make you wonder about human nature.)
These are definitely not my thing, but the UV monitor sounds useful to this Celtic-skinned reader, mainly because UV index is the one thing that your body doesn't tell you ... until the damage is done.
(That said, I'm sure jake has trained himself to develop extended-spectrum vision, and can see from the 5GHz band all the way to gamma rays, but for us mortals...)
Incidentally, it looks like the accompanying app is being launched on iOS and Android too, not just Windows Phone (http://www.microsoft.com/microsoft-band/en-us - about halfway down).
If you can find a way of measuring blood sugar without a pin (and the unstable, one-time-use enzyme that checks the blood), you should clear October from your calendar... you might need to go to Sweden to collect a medal.
The only thing preventing the development of an implantable insulin pump is that there is no suitable implantable method of continuously measuring blood sugar.
Not quite: you can always understand the meaning of written Chinese messages - it's as expressive as any other written language, and a lot more compact than many. The problems arise in verbal communication.
Mandarin Chinese has a very small repertoire of sounds compared to other languages, so the number of homophones is very high: cases like the English "red" and "read" are much more commonplace. The fact that Mandarin is actually a second language for most of the population just adds to this confusion.
So when speaking face to face, it's not uncommon for Chinese to make a "writing" gesture of part of the word that they're saying, in order to disambiguate a word that sounds the same. On the phone, this option is not available.
Still, it sounds like someone at Apple inventing an application for the watch, rather than inventing the watch to fill a need.
Did you not catch the bit about how people who make judgements based purely on specs without looking at the product are easily misled? The screen is superb, despite being 720p, and the phone does most things very quickly despite being on a "slow" System-on-chip.
The price quoted is SIM-free. What "flagship" mobile phone sells for £300 without contract or lock-in?
It is what it is: a good-value mid-range phone whose performance is better than the spec-sheet numbers would suggest. Basically, the sort of "crap on paper, but really good in use" kind of phone that Nokia have always done well. The one thing they've never really done well was "flagship" phones...
Rivals, yes, but not from other music stores. Today's big "casual music listener" demographic is still there, but they've grown up in an era where you just don't pay for music: YouTube has given them a free, personalised, All-Requests, All-the-Time version of MTV that means you never really need to buy that song they like.. just keep listening for free until you go off it.
Music fans - people who enjoy music as an art form, and will sit and do nothing but listen to music - will still pay, but they are a tiny part of the music market. The paying casual listenership has almost completely fallen away, leaving only the core of people who actively enjoy music.
On the other hand, there's still money in back-catalogue, particularly in classical and jazz - but those are not really part of the same "Music Business" that the tech media are so obsessed with, despite often belonging to the same companies.