I was under the impression that the Jet database engine was deployed with the Windows operating system and has been for years.
Yup. It's part of MDAC, which has been part of Windows since Windows 2000.
I have published source without including or mentioning a license and I just leave the users to figure it out, but that is probably not the best way to handle that.
Definitely not; it may also discourage people from using it. I certainly wouldn't in a business environment.
Basically, it comes down to what you consider to be important. There are various opinions, but it boils down to how you wish people to be able to use your code.
The BSD (and similar) licences are fairly permissive: they basically allow you to do what you like with the code so long as you retain the copyright/licensing notices (i.e. give credit where due). The argument for this kind of licensing is that it's rather pointless reinventing the wheel, so why not make your stuff available? An example would be the BSD TCP stack: why rewrite a complex and error-prone piece of software when there's one available for you to use and potentially improve? If you choose to give back your changes, that's all well and good, but it's not the main thrust or intent of the licence. This is the kind of licence used by companies like Apple; a lot of OS X is based upon FreeBSD, quite legally, and in the spirit of the licence too.
The other side of the coin is GPL-style licensing. This is more about the freedom to understand and share the code behind your product. You're free to add to and change the code to your heart's desire; however, you must (in theory) make your changes available for all, so they can be inspected, reviewed and adopted by others. It also, in theory, prevents you from distributing software that can't be understood or examined, as you have to make the source code available. This is the approach taken by Linux and other projects, although this intent has been subverted somewhat by larger companies. For example, Google don't release any of their internal system code as they don't distribute binaries, you just /use/ the software; and with Android they're moving more and more code into proprietary components, in effect having their cake and eating it too. It's this kind of behaviour Redis are attempting to address.
Most of the open-source licences fall between these two ends of the spectrum; which you choose is as much about your views on the use of your software as it is about anything else.
If you want people to be able to use your source code, but you're not bothered if they make any changes public then the BSD/Apache licences are the way to go about it, otherwise the GPL and variants are probably where you should be looking.
And it's the internet, so whichever one you choose, someone will probably tell you you're wrong. :)
In an email to The Register, Paul Berg, an open-source licensing expert who advises the Idaho National Laboratory in the US, suggested this is not so much a move to help open-source developers as an effort to put the work of developers who collaborated on the Redis Modules under the control of Redis Labs.
How does this work? If contributors have signed away their copyright, or given Redis re-licensing rights, then sure, but you can't just take my work and re-license it without my say so. Have they had buy-in from all contributors, or are they one of those companies that makes you sign an onerous re-licensing agreement when contributing?
So really unless you are doing something requiring a lot of reliability, why bother with geo-redundancy, it's more ballache than it's worth in many cases.
This. Sometimes being 'good enough' is really 'good enough'
Also, best practices have to be married with some level of competence too. I remember a previous company I worked at had their primary B2B site running on a Solaris box in the server room with an Apache front end hosted by a 3rd party. No real redundancy (other than the test box that could be repurposed if needed), however it pretty much just 'worked' most of the time as the servers were set up correctly and well maintained. (Sure, we were at risk of serious outage if a fire or flood occurred, but still.)
Years passed and we got acquired by another company; they came and gave us the fancy presentation on their shiny new data centre. The site was moved to this wondrous place, hosted across 6 servers with load balancing and all the 'best practice' boxes well and truly ticked.
Unfortunately it ticked boxes but perhaps wasn't set up quite right; that site had more downtime in the next 3 months than it had over the previous 5 years. (Mainly due to randomly crashing servers, sticky sessions and load balancers configured to blithely ignore whether the servers were actually up and responding...) Whilst the site was now more 'disaster proof', our customers were much less happy.
Sometimes working is good enough.
(Ironically, we did have full multi site geographically separate DR for our core systems, it's just this new fangled web thing had appeared in the meantime...)
The 'chosen for price not anything else'* provider of my home SSL cert sent me an invoice that said 'Paid', which was very confusing when my SSL cert never got renewed. Turns out my credit card had expired and I'd forgotten to update it (well, ignored the email as I thought 'I'll do that when they bug me at renewal date').
Never once got a communication saying payment was declined, but hey ho, turns out Let's Encrypt wasn't much faff at all... their loss.
* shame really, as they were easy as 1....2... 3...
Funny how apple get a free pass, and this will undoubtedly go pretty much unreported by the mainstream media, who are all keen to remain on the payola train of free kit, and lavish all expenses paid product launches and PR events. All paid for of course by the brain-dead cretins who paid £300 over the true value of their phone because it's got a fruity logo, BECAUSE the press told them how great it was and buried anything that suggested otherwise.
Wow. You do seem overly angry about what is essentially consumer electronics. Take a step back and chill for a bit; you don't want an aneurysm before you're 20.*
Do you really believe this btw? Do you really think people base their purchasing decisions on what they read in the press? Year after year?? Is it not possible that maybe some people just have different criteria than you? Or are you the only person who's noticed the glaring holes and if only you could make people see??
It's a vicious circle, the more braindead consumers continue to buy £1000 phones that cost £150 to make, the more money Apple have to spunk on keeping the press on side, and so it carries on...
Okay, so Apple do make a decent profit per phone, but that figure you're quoting is the bill of materials; it'll only get you a pile of expensive-looking components. The phone costs much more than that to make. Unless you think marketing, packaging, software development, testing, distribution, support (returns/repairs) and the rest all come for free.
As Ringo says... Peace and love... Peace and love....
* On a brighter note, if you're that angry about consumer electronics your life is probably in general pretty much okay, and probably better than a lot of the world, so there is that.
If price is less of a worry then the Intel NUCs make great Kodi boxes. After having a few slow-but-okay Kodi boxes based on Pis and the like, I took the plunge and got a NUC, as I wanted something with enough oomph to do x265. Whilst it turns out the Intel drivers aren't great for x265, the box is powerful enough to do it in software, and as a small quiet media box it's absolutely great. Works well over wireless, is unobtrusive in the bedroom, has built-in IR out of the box so worked with the remote I had... marvellous.
IPv6 supports NAT and Dynamic IP 100%, people telling you differently are spreading fake news.
Isn't that the point? I must confess I've not read up on IPv6 for a while now, but the impression I got last time I did the reading is that I'd have all my internal devices on the private range (terminology??) and then use NAT to translate the first 64bits (or whatever size subnet the ISP gives me) to the external range.
Then I'd have fixed internal IPs and bidirectional NAT would still allow everything to be externally addressable if I so desired as there's a one to one mapping with the last 64 bits.
Or has all this changed or I misunderstood?
Seemed like an elegant way of having a dynamic IP and publicly addressable stuff.
(Obvs there'd be a firewall in there too so you'd have to explicitly allow access, but still...)
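For what it's worth, what's being described here sounds like NPTv6 (RFC 6296) stateless prefix translation, which Linux supports via the ip6tables SNPT/DNPT targets. A rough sketch, with made-up prefixes (fd00:aaaa:bbbb:cccc::/64 as the internal ULA range, 2001:db8:1::/64 standing in for whatever the ISP delegates, eth0 as the WAN interface):

```shell
# Rewrite the source prefix on the way out (internal ULA -> global prefix);
# the host bits (the last 64 bits) are carried through one-to-one.
ip6tables -t mangle -A POSTROUTING -o eth0 \
    -s fd00:aaaa:bbbb:cccc::/64 \
    -j SNPT --src-pfx fd00:aaaa:bbbb:cccc::/64 --dst-pfx 2001:db8:1::/64

# Rewrite the destination prefix on the way in, so inbound connections
# to the global addresses reach the matching internal hosts.
ip6tables -t mangle -A PREROUTING -i eth0 \
    -d 2001:db8:1::/64 \
    -j DNPT --src-pfx 2001:db8:1::/64 --dst-pfx fd00:aaaa:bbbb:cccc::/64
```

If the ISP hands you a new prefix, only the global side of those two rules changes; the internal addressing stays fixed, which is pretty much the elegance being described above.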
The guy could not understand the concept of a linked library only put in one location once on each system deployment and then for each version of the application a symlink to the library in the application home directory....
Okay, I'm probably missing the subtle reason for this, but surely you could just configure the linker to look in that location?
Did Ubuntu ever use it? I thought they were doing their own thing called Mir. Which they have since abandoned?
I've not made heavy use of it, and I will confess to a real dislike of X, but Wayland seems to be pretty responsive the few times I've tried it. (I always find vanilla X feels like there's a slight lag between me and the PC.)
Also new is support for the Wayland Architecture, a popular remote desktop tool.
Do you mean the Wayland display server stuff? If so, yay! It's getting annoying not being able to use Wayland under VirtualBox, might give Workstation a look.
[...]So he left an anonymous voice message along the lines of "Hello - You're an annoying, inconsiderately loud bastard who won't shut up".
What a cunt.
When said annoying person played that one, the message got home and the problem was (amusingly) solved!
Yay! Glad the problem was solved, I love group shaming people too, makes me feel like one of the herd. I hope they pointed and laughed just to make extra sure.
I have to disagree. Ideally the screen should....
This is an Apple related article.
This is The Register.
Common sense and reason have NO PLACE in these comment pages.
When will people learn.
(Yes, I'm basically saying 'You're commenting on it wrong'.... <hangs head in shame>)
Can someone please explain what backhoe-proofing is
I could draw you a diagram, but it crosses the line of taste and decency; and may even be illegal in some states. (You watched it! You can't unwatch it!)
IMHO, the involvement of these sharks in a business is as good as the Grim Reaper telling you that you are going to die.
It works the other way too, businesses don't have to take the money.
A problem particular to Linux is that the system-call numbers are different on different architectures (notably between x86 32-bit and 64-bit). I'm sure it's not an unsurmountable issue, but fixing it does require the final-stage compiler to know a little more about the bitcode than a straight compiler would.
This would only affect static binaries and the odd thing that actually makes system calls directly. Most stuff links against libc, so it wouldn't be an issue.
Why have neither Intel, AMD, ARM or Samsung developed a similar approach, or bought this particular technology in from academia? You can argue that Intel, and to an extent ARM are victims of their own success and would dismiss it as "not invented here", but AMD could certainly do with a technology break out.
Do those companies know something that MS don't?
Do Samsung have a history of innovating in the CPU space? I get they're a core ARM licensee, but they do more 'mass market' stuff, don't they?
ARM are stuck with ARM, and AMD doesn't have the thing that Microsoft have lots of... spare cash. (Remember, whilst we don't care any more, Windows is still the dominant computer OS, and still makes Microsoft a lot of money).
Companies with lots of money can spend it doing R&D, hence this, and why Apple do their own CPUs and GPUs now.
I admit the tech sounds interesting, but this almost sounds like using microcode as an instruction set, the same *kind* of way RISC made CPUs lightweight but not necessarily faster (ARM still catching up to x86 tech).
Isn't this kind of backwards? Remember, ARM were an order of magnitude faster than x86 when they debuted; it's just that their focus switched to low power/embedded while Intel's remained on high performance.
Microcode was the way of getting the benefits of RISC on the horrific x86 instruction set.
It wasn't until the advent of OOO and superscalar stuff around the P5 that the Intel stuff really took off speed-wise; by that point Acorn was almost dead and the market for ARMs was phones and PDAs and the like. Ironically, the 'everything can have a conditional flag' approach ARM took with their instruction set didn't lend itself well to the OOO and superscalar stuff.
A modern ARM core, especially a 64-bit one tuned for speed, can be quite quick too. See the CPUs coming out of Apple, for example.
This sounds like a laudable goal. Good luck!
"The recent ZTE incident made us see clearly that no matter how advanced our mobile payment is, without mobile devices, without microchips and operating systems, we can't compete competently,"
I expect that will be the main issue; as I understand it, OEMs are prohibited from making any device running AOSP if they wish to keep their Android licence. This is why Amazon had to turn to relatively small manufacturers for the Fire devices.
Mind you, this was a few years ago now, recent events have probably made OEMs less eager to comply and larger players are starting to realise the control they've ceded so perhaps the time is now.
M$ is cancer, everything it touches dies,
In the 90s I'd've agreed with you, nowadays I'm not so sure.
worst company on planet.
I would suggest your perspective is slightly warped here. I'm not even sure they're the worst company in IT any more. (*cough*Oracle*cough*)
I have no real view on any of the companies I linked to but was rather trying to illustrate that I don't think anything MS have ever done carries a significant environmental impact, death toll or similar...
Android -> CyanogenMod (embrace) -> CyanogenMod (extinguish) as M$ "sponsored" Cyanogen Inc to destroy CyanogenMod
Blame Google and the 'migrate everything into play services' approach for that one. If you've not got the Google infrastructure, keeping a decent working fork of AOSP these days is pretty much impossible. (Or at least one that has enough traction to make itself financially viable, rather than be a curio for a handful of developers.)
Are all the people leaving GitHub going to stop using git too?
After all, the work on this filesystem came about because of the issues MS encountered, and the major speed improvements they made, when they adopted git as their system of choice for the Windows codebase. (They're probably quite unique in their codebase size and history length.)
As someone who remembers when MS were basically unable to use anything not developed in-house (and the evilness that went with it), I find the new, more modern MS much more refreshing; you wouldn't have got things like this or the Linux subsystem under Ballmer or Gates.
They're still a large software company, and they're still money driven, but they really don't seem to be the MS of old. (Which in some ways is really weird: they still have the desktop monopoly they've always had, but people don't care any more; it still makes them pots of cash though, PCs in companies aren't going away any time soon.)
It's okay though, the anger and hatred transfers nicely to Oracle these days. ;)
This blog is worth a read.
You know this means I have to go back to therapy, right? (oh dear God, why! why!)
Just came here to post pretty much that.
Some things aren't worth joking about... or reminding people of.....
I actually worked with SCCS where someone had edited the .rcs file (was it??) so we actually had a different record of history that didn't match reality.....
Now WHO'S LOCKED THE FILE???
<goes for a lie down>
If they could recover it and won't, then they should be prosecuted for littering.
If you actually want to tackle the crap-in-the-oceans problem, there's much lower-hanging fruit you could aim for. Single-use medical waste, for example.
Dropping lumps of toxic waste into the ocean simply isn't acceptable any more. Recovery should be the new legal minimum.
Given SpaceX are the only rocket company to have repeatedly flown reusable boosters, I would imagine they'd be very happy with this.
Basically, the desired condition is what existed before World War I: only the civilized democratic nations have any appreciable military force;
Which were the democratic nations that existed before WWI?
Britain wasn't (No votes for women)
America wasn't (No votes for blacks)
Norway looks like it was (just)
Was France? Germany?
Linux and BSD were once the only places you could go to avoid the OS Snooping. No longer.
I’m pretty sure Debian has had data collection for many years.
Both of them need to keep Alex Jones frothing at nothing at all.
However I bet Matt Baker doesn't give a shit.
Weirdly, I could see this as a segment on The One Show, probably presented by Gyles Brandreth, sandwiched between a bubbly interview with Sonya and a short film from Jay Rayner about pies.
I get The Reg is irreverent, and the red top of the IT world, but for some reason the constant use of 'design blunder' to describe a subtle interaction between disparate parts of a CPU that went unnoticed for well over 20 years seems a tad disingenuous.
I know we now live in a world where all commentators are perfect and mistakes are to be vilified but still...
Contributors to GPL projects have a right to expect reciprocity in the form of source code.
It's not contributors though, is it? Isn't it just people who've bought Teslas? Doesn't the GPL give you the right to the source for the binaries you've received, not a blanket requirement to publish the source for all and sundry? (So you can modify/change the software as you see fit.) Of course this doesn't then mean they can't distribute the source as they see fit, but it is a subtle distinction.
Except BSD, perhaps, but they didn't choose that for whatever reason.
Not perhaps. The BSD licence doesn't require source code disclosure.
The reasons are well documented too, it's just a different ideology.
Here's one example:
[...] As stated above, we want anyone to be able to use the NetBSD operating system for whatever they want, just as long as they follow the few restrictions made by our license terms. Additionally, we don't think it's right to require people who add to our work and want to distribute the results (for profit or otherwise) to give away the source to their additions; they made the additions, and they should be free to do with them as they wish.
Personally, I don't see why the BSDs don't get more use in stuff like this; it would save companies the expense of GPL compliance, which once you get above a trivial product size must become significant, and is often overlooked.
I don't really follow the GPL ideology myself, however I do feel quite strongly that licences should be followed, and these days it's really not acceptable to plead ignorance or naivety.
While it's understandable that the enormous value of music ranging from Ella Fitzgerald to Elvis Presley to David Bowie should be protected,
Can someone explain this to me? Why is it understandable?? I'm confused. What 'value' are you protecting? It's not the cultural value that would be realised if these were in the public domain?
I get that if I create something it should be protected for a time so I can benefit from it and others who didn't contribute are prevented from benefiting without my say so.
But why should someone receive protection long after their death? My descendants won't receive payment for the software I write. If I were a cabinet maker, my descendants wouldn't receive payment for resale of my work after I die, despite it potentially having taken a lot of skill and effort on my part.
I mean, I'm over 40, likely over halfway through my life, yet the work of someone who died before I was born is still protected by copyright? How is that beneficial to society as a whole, or indeed fair?
Microsoft already have the top-tier ARM architectural licence AFAIK. [...]
Indeed they do if this random internet news site is believable. ;)
And lets not point out the issue with "do an Apple and control the hardware"...in that they use ARM designs as well.
No, Apple control the hardware, they use the Arm instruction set, not Arm CPUs.
Like MS they have an architectural licence, which means they're allowed to design their own CPUs implementing the Arm instruction set, rather than designing SoCs using an Arm CPU core, such as a Cortex, like most Arm licensees do.
There's 2* levels of Arm licence: core and architectural. A core licence allows you to take an Arm-designed CPU core and add your own IP around it to create a SoC. The (much pricier) architectural licence allows you to design the CPU portion too; it just has to pass the Arm validation tests.
This is why Apple's CPUs are so quick compared to the competition.
* Okay, that's not quite true.
What he neglected to mention was that German workers are paid so well because they held multiple strikes to force Amazon to actually pay a decent wage, and were supported by strong national employment laws that allowed them to do so.
I've highlighted the important bit.
Asking most people to give up money (be they rich or not) is often a dead end. Society as a whole has to agree.
Mind you, how you effect this change when many of the rich are also in power is beyond me.
Every time you use [something vaguely useful or life saving]
[it was probably invented by someone from Scotland]
(Seriously, it's amazing quite how prolific the Scottish are at coming up with cool stuff).
TV, marmalade, the coma scale, the tractor beam, Grand Theft Auto, the list goes on...
The last phone you had had a sucky battery after a while, so you went out and got a more expensive version from the same manufacturer?
Are you really surprised that not everyone makes purchasing decisions based on the same criteria you do?
Truly, one born every minute.
...it always seems to be the affluent ones too.
It annoys me when I have to walk past a pub where the door opens directly onto the street and there's a half dozen or more people vaping and smoking, standing all across the pavement. Getting past means walking through a cloud of noxious fumes. It's then not my choice to be intoxicated by these clouds of Christ knows what.
1. Hold your breath
2. Cross the street
3. Learn to effectively rate the risks of things.
the thing is, I know I am not allergic to penicillin so if I have a minor injury that looks a little too red around the edges, I should be able to go the pharmacy and buy some Amoxicillin over the counter.
Those of us on monthly salaries are paid to do a job, our contracts rarely include overtime (but may allow for bonuses or extra time off). There's nothing USAian about that (I'm not from the US). This person seems to have a monthly salary but wants overtime as if she were an hourly worker.
The problem is this idea has become culturally acceptable, and people defend it.
I'm a salaried developer, so no overtime; however, project cost estimations include my time at a per-hour cost, so it's obviously a metric that exists and is how the company budgets my time. Therefore, unless I mess up, I work the hours I'm contracted to, as they're the hours I'm paid for. My company doesn't 'care' about me; I'm a resource used to produce a thing. I'm not doing my company a favour by turning up to work, I do it because they pay me.
The reality is, sometimes things take longer than estimated, which incurs extra costs, some of which are people's time. Otherwise you're just making accountants look good.
Why is there this assumption that the extra effort we put into being skilled makes us value our time less? If we carry on perpetuating this idea that we should just work extra for free, people will keep taking advantage of it. In 10 years' time your company won't care; if you've moved on you may not even remember, but your partner or children will remember that time you weren't around because you were working late, or that school play you missed because... work.
Your employer isn't a person, it's not your friend; it's a business that remunerates you for your skill and time. We all seem to value the first; let's value the latter a bit more too.
In my experience people with monthly salaries earn more, per hour, than hourly waged staff, and part of that is because we are sometimes called upon to do additional work.
Which you should be paid for. You're paid more because you're higher skilled; not because you're willing to work for free. (Or you shouldn't be).
Frankly, I'm sick of this idea in IT that we should just be happy to have the piss taken out of us because it's somehow 'professional'.
Anyone who genuinely thought Apple would axe their latest greatest iPhone has been sucked in by the marketing trick.
It's a pretty good marketing trick though. I find Apple's 'guerrilla marketing' (for want of a much better term) fascinating. Take this story for example: where has it come from, where did these rumours start? Apple is famously tight-lipped about things like this, only revealing what sales info it has to by law in financial calls, and often not breaking figures down into specific models; yet our unnamed source is telling us that the iPhone X isn't selling well? How do they know? What were the sales targets? (It's a Veblen good; I expect its targets weren't high compared to a mid-range model. How is it selling in relation to the 8, for example?) Suffice it to say, I bet few outside Apple actually know how well or badly any model of iPhone is currently selling.
So where did the rumour come from? It seems like a lot of these Apple stories are self-generating. Apple, being the large company it is, generates page views and marketing research sales, so analysts speculate and rumours start. Then the self-citing internet kicks in and it snowballs. It works too: this article, like most Apple-related ones, attracts lots of comments, so even more views.
Combine that with the weird myopia that seems to occur around Apple's phone product line and it makes for some very interesting watching. It almost seems a given when reading Apple articles that they make the iPhone, but they don't: they make a whole range of iPhones. There's a comment further up this page suggesting Apple should make a cheaper, no-frills iPhone X. They do, it's called the 7. Which they still sell. Want a cheap (for Apple) iPhone? Then buy the SE. Yet it often seems in the media that these phones don't exist, when in reality I expect they make up the bulk of Apple's sales. Think about that: Apple have basically managed to have their cake and eat it. They're perceived as a high-end luxury brand selling a single expensive product, but they're not; they're a higher-end consumer electronics company selling a range of products from the eye-watering to the fairly reasonable. They've managed to extend further down the market, but without the perception that they have. A lot of companies would kill to be able to pull off this trick.
When all's said and done this story is just 'from this date you must use the latest SDKs and ensure you support the latest devices', which isn't unreasonable, or that unusual; yet it's about Apple, so you get millions of page views, a boatload of comments and idiots like me going off on one for paragraphs. That reality distortion field really is living on....
I hope they make them* cough up for that as its largely their desperate cost saving attempts that have screwed this poor person over.
She was a highly paid presenter, reportedly on over 100K a year. If she's not been paying NI on that wage then she's certainly not 'poor'.
Christ, the hospitals are at breaking point, there's fewer and fewer police on the streets, yet it's a tragedy when people are asked to actually pay their fair contribution to society.
I appreciate she's been given bad advice, and it's not her fault. But I, and a good chunk of the people reading this, have to pay NI; I don't have a lot of sympathy for those that don't.
So now we've got days of increasingly breathless and mainstream media coverage by supposedly sensible adults who've managed to make it all the way to maturity yet seemingly never learnt about wood.*
One possible solution is to place a coaster or other protective surface between the speaker and the wood.
What? You mean one of those things you put under things to stop them damaging surfaces? Who knew??
You mean The value of your investments can go down as well as up doesn't just apply to the little people?
You can't any longer make money in development tools, so why invest in them?
One word: JetBrains
(Seems that if you're going to charge you just have to make it good... who knew?)
Linux is built on an outdated model. Of course, you're able to do much better yourself, right? Open Source also includes the BSD family.
Linux (the kernel) is many things, and has done many things, but from a computer science point of view it is based on an outdated model. BSD is a descendant of that outdated model. Whilst obviously Linux has evolved over the years, the basis is still quite old in OS terms.
If you want *nix-like but more 'modern', then Minix with its everything-in-user-space approach is worth a look. Even OS X with its OO driver model, and 'can I have a handle please' NT, are worth a mention.
Or you've got some of the more 'modern' but never really made it stuff like Plan 9 that came from follow up research after Unix. Or the stuff like VMS that just... faded away...
Now there's probably a really interesting open source vs closed source debate as to why many of these failed or fizzled out, but it still doesn't make Linux 'modern'.
modern used throughout to mean about 30-40 years old
...Or the Yorkshire version could be with beef carpaccio, horseradish and some kind of cold onion gravy.
Erm... do you deliver?
Use a spring form cake tin to make a plate sized Yorkshire pud.
Fill with roast potato and slices of delicious roast beef. (maybe some carrot + swede mash to get another 2 of your five).
Cover in onion gravy.
Behold its beauty.
Behold the empty plate.
What ( I think ) would be cool would be for TV manufacturers to do a deal with Sky and add Sky as a dummy source ( with the Sky Q EPG, a minimum processor/ram speed, big disks for recording ). Although there's no reason they couldn't do that already if they licensed Sky's card standard.
You can blame Sky for this. What should've happened is you should just be able to buy a CAM from Sky and use whatever receiver took your fancy. (The CAM is a module that the card goes in; it handles the decryption.) If you have a satellite-enabled TV, that's what the big hole in it is for. This would have allowed a separate market for satellite/DVR boxes from the content they're used to consume.
However, Sky had good lawyers and managed to get round this requirement. (I heard a (probably apocryphal) tale that the regulations required they sell a CAM module, so they did: just the one; once.)
If it weren't for corporate greed then we'd maybe have managed to decouple the provider of the shows from the provider of the equipment, however that never really happened, in fact Sky ended up integrating even more, they now control the entire hardware production chain.
This is why their subscriber base can decline year on year but their profits don't. It's only now that the likes of Netflix are providing some decent competition that this is starting to change.
Personally, I would've subscribed to Sky years ago if they'd allowed this, as I use a PC for my TV PVR functionality, and would happily have subscribed if I could've used Myth or TVHeadend rather than Sky+ for my recording needs.
Hopefully there will be a better frame rate. Apparently Now TV is only 25fps at the minute, and its very noticeable, particularly for football.
Isn't all UK TV 25fps?
Pointing to a decade of product launches that follow in competitors' footsteps – Google+ (Facebook), Google Cloud (AWS), Google Home (Amazon Echo), Allo (WhatsApp), Android Instant Apps (Facebook, WeChat), and Google Assistant (Apple/Siri) – he concludes Google has lost the ability to develop its own new ideas.
I remember 10-15 years ago this criticism was levelled against Microsoft. Isn't it just the case that large companies are perceived as innovators simply due to having made enough money?
MS weren't particularly innovative; licensing DOS rather than selling it was probably their most significant act (given that it led to their dominance). Apple, you could argue, had the sense to understand the importance of the experience, but even the iPhone was an evolution. Facebook just became the social platform that was good enough and in the right place at the right time.
Same with Google: they came up with a decent search engine, but after that isn't it just acquisitions and mergers, like every other company? Also, they don't need to innovate; they need to feed the marketing machine that pays the bills.
What was that saying again? You reap what you sow?
Sure it's not 'You get what you pay for'??
The slowdown happened years ago: when Java and outsourcing to cheap code shops both became popular.
The slowdown happened because the IT industry has grown far faster than the ability to train competent software engineers.
Not being able to 'see' the Heath Robinson machines that comprise most software applications helps with this too.
If you asked someone to build a bridge, and the result was constructed out of twine and empty kitchen roll holders, even if it was demonstrably able to cope with the weight you'd still be wary of using it. Software doesn't have a physical manifestation that you can inspect so it's much harder to tell.
In 'traditional' engineering you wouldn't hire an enthusiastic DIYer, you'd hire a trained engineer, no matter how well the shelves were put up in their house. In IT we tend to hire the DIYers as there's not enough engineers to go around.
I've on more than one occasion seen programming jobs advertised with 'programming experience would be nice but not essential'. I've also worked with the result of this policy too.
Biting the hand that feeds IT © 1998–2018