I don't know which is more depressing: that a machine beats humans, or that there are humans whose job is to be a "professional SC gamer".
60 posts • joined 14 Aug 2015
"cloud native technology"
WHAT does that even mean? Does it work on a computer in the basement? Yes. In a data center? Yes. So, moving on?
That's like saying "strategic leadership". scalable == adaptive, no?
"professionalising people and technology"
I'm sorry... what?!?!
"Open source is the default choice for modern IT solutions."
The statement is true, if you pay attention to the word MODERN. Unfortunately, modern solutions are not so prevalent. Look at government. Look at just about any large organization that has been around for a while (long enough to be infiltrated by PHBs, dancing around in a macabre mockery of what the place used to be). Those are not using open source, as they are far from receptive to the concept of "modern".
>> In other words, IBM's proprietary storage software will still
>> effectively see Red Hat software as competition.
Remember when MySQL was the one database everyone was flocking to? What has Oracle done with it? Right. It doesn't make sense to support development of a lower (or zero) cost product eating into your high margin one.
"[...] those who are fully bought into the Office 365 platform and ecosystem"
You know a place where everyone got M$ Office slapped on their computer? Yes.
You know a place where this Office 365 is used? I've heard of this thing for years, and unless it's just a buzzword, I've never dealt with anyone using it. I have no idea what it looks like or what it does, but from what I hear, it's made for people who (once upon a time) thought the Palm would revolutionize their productivity.
Not a single mention of Replicant in this article? Replicant is based on Android, minus the non-free components. And that's hardly new, though I never did anything with it other than know it exists (I have the feeling I'm not the only one!).
The main problem facing its use? Proprietary drivers, what else?! Phones and other SoC devices are very unlike generic x86 PC boxes, which have just about every necessary driver already built into the Linux kernel.
Oh, but wait: even with Android, drivers are a massive problem. When a new version of Android comes out, it seems the drivers need to be ported to it before the phone can receive the upgrade (how hard is it to recompile sh*t? Is it all written in assembly for specific micro-architectures?!). On my computer, I don't need new drivers in order to receive... a new version of KDE or whatever I fancy. Conclusion: Android is a pitiful design.
The people who don't make mistakes are the people who don't get any work done.
That being said, mistakes are not the fundamental issue here; the sheer stupidity or incompetence behind this code is. It's not even about testing: how can someone even write that?! It's even worse than the "rm -rf" incident in Valve's Linux Steam client a few years ago, because this time the target is very deliberate.
When you go on the web page of "Open R", you see a drawing of a monkey. Thank you. Now we know what MVP stands for! Most Valuable Primate!!
> R language is extended atm with incompatible M$-only features
I use R very much, and my first reaction to this article: WTF is that POS "Open R"??
The question is not what, but why? And the answer is simple. M$ brings forward their toxic commits to GNU R, which are rejected for very obvious reasons. Then they create their own thing. Relax, the abomination is contained.
By the way, I would like to thank M$ for aptly naming this creature of theirs.
Open R == open source
GNU R == free software
Even though the source of this "Open R" is out there, its sole purpose is to facilitate the integration of proprietary technologies.
Unlike those other projects, Java has not been forked. Why? Simple: nobody likes Java.* No sane person will produce code in Java unless there's a gun to their head. That gun could be adequate compensation, or just some PHB asking for "enterprise" stuff, whatever that means.
(*): People who know nothing outside of Java possibly love it. Many "computer science" programs are more like "Java science" programs.
Why would anyone rely on GTK3 when those behind it make weird decisions, such as removing menu icons and mnemonics? And why? Because... it is not cool enough for them? Because you don't have those in most phone apps, you should not have them in a "desktop app"? (app: crappy and embarrassing piece of software)
Over the years I have heard lots of rants against GTK3 and praise for Qt5 (because it seems to let people do what the f*** they want to do, without being judgemental).
"complex technical requirements from a small number of externally provided highly specialised applications"
So I guess they are talking about stuff made with VB6 that does not work nicely past XP. Or would it be Pascal or COBOL? Let me guess: they "acquired" the license to use that software, but the provider ceased to exist ages ago.
Listen: in France, they developed GendBuntu to move the gendarmerie from XP to Linux. How about they get in touch?
I missed KDE 3.5 very much, then finally accepted KDE 4 as it became more mature. Now I find KDE Plasma 5 too experimental (last time I tried: maybe 10 months ago), and I absolutely despise how it drank the FLAT Kool-Aid. It's about time to let KDE 4 go (lxde-qt is not quite ready), so I might as well return to KDE 3 through Trinity.
Which version of KDE are you talking about? Past disappointments with an early, very unstable version of KDE 4? Or disappointments with _any_ version of KDE 5 (or whatever they call it)? KDE 5 is one example of flat-inspired design, with hard-to-distinguish monochrome icons (extremely stupid) and countless idiotic decisions. For some "traditional" features, you must now use the thing called "Activities", which I never figured out (it's easier to compile your own kernel, really, and that says it all). I have managed to stay on KDE 4 for the time being.
Confused about KDE? Try GNOME 3! I gave it a try and binned it relatively quickly. Eventually, the GNOME developers decided that everything that makes a desktop UX as we commonly know it must be completely stripped from the GTK library, forcing programs to look like phone apps. Unsurprisingly, I hear that GTK does not have the following it used to have; there seems to be a convergence toward Qt.
Insanity just about everywhere.
One thing the author of the article pointed out quite correctly: the packaging system in Python. I believe it's a good one, and it works, but it is so complicated to penetrate, even for the simplest package! You've got "source distributions" and "built distributions", "eggs" and "wheels"... and when you try to figure out what any one of those is and which you should use, you face one big heap of technobabble. The book on Python project development and packaging has yet to be written, probably because, if it were written, it would amount to saying "It's all one big work in progress".
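For what it's worth, the modern minimum has shrunk to roughly a single `pyproject.toml` file; this is a sketch assuming a recent setuptools, and the project name is a made-up placeholder:

```toml
# pyproject.toml -- minimal packaging metadata; "example-pkg" is hypothetical
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "example-pkg"
version = "0.1.0"
```

Running `python -m build` (the third-party `build` tool) in that directory produces both flavours at once: a `.tar.gz` "source distribution" (your code as-is) and a `.whl` "built distribution" (pre-processed, ready to install). Eggs are simply the wheel's deprecated predecessor.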
My attitude toward BTRFS has long been: "This is very promising. We can use it now? It doesn't seem like it's a simple drop-in replacement for ext4... I'll wait for others to try this Kool-Aid." Seems like it does not taste so good after all.
In the meantime, I'm happy with JFS on my system, and I'll take the odd-ball label.
"For all the criticism of the R language – it's hard to understand, slow and a memory hog"
If you find R hard, then you need to spend some time with SAS, which consists of four or five languages patched together because, individually, each of them is utterly inadequate to get you anywhere. Whatever you do in R with one or two lines of code will take at least five times that in SAS, and forget about the concept of a "package". R is made for smart people by smart people, and it had better stay that way to keep the Excel and VBA kids at bay.
The article mentions Python as a great language (I agree, it is), but R's internal documentation is so much better than Python's. Why? Because it always features examples, even for simple things! Whenever I look into Python's standard library reference for a module that could be of some use, I have to look somewhere else to figure out how to do anything with it (OK, so this is some instance of a class; then what do we do with it?). Without Google, most people using Python would have a really hard time (myself included). Without Google, R would be an inconvenience, but I could still find my way around.
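To be fair, the kind of worked example R help pages always carry is easy to imagine for Python too. Something like this four-liner (using `configparser` purely as an illustration) would answer the "I have an instance of this class, now what?" question directly in the reference:

```python
# The sort of end-to-end example R help pages provide and Python's
# stdlib reference often omits: from "I have an instance" to a value.
import configparser

cp = configparser.ConfigParser()
cp.read_string("[db]\nhost = localhost\nport = 5432\n")

print(cp["db"]["host"])          # -> localhost
print(cp.getint("db", "port"))   # -> 5432
```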
That it should be a "memory hog" on a server (I never had any issue with it, but then I never ran it on a production server either) is not surprising, because its development did not focus on making it a daemon.
To make a non-brainless and non-disappointing movie out of Dune is like making one out of K.S. Robinson's Red Mars, that is to say, highly unlikely. Both books are complex in nature, with a plot that is almost secondary to exploring the underlying narrative universe, interspersed with various essays from the authors.
I have come to believe that such works cannot be satisfactorily rendered by movies, period. Further, keep in mind that the vast majority of movies that came out in the last several years were targeted at teenagers and dimwit adults (and abundantly, if not exclusively, featured such like-minded characters). All this points to a massively dumbed-down production. There could be some hope in a TV series (I did not see the miniseries), as there you can afford the luxury of character and plot development (which could explain why people with a brain have moved away from the big screen lately).
If you want to make a movie, why not create something **NEW**, tailored to be a movie?
The treatment of the narrative in Lynch's movie is merely "OK" (and very disappointing if you read the book beforehand), but what saved it for me are the visuals and the audio, which are absolutely perfect. When I read the novel, I am pleased to recall what was depicted in the movie. It seems accurate, and for things that are difficult to picture, the movie is an excellent companion.
The soundtrack gave the movie a uniquely mystical flavour, very palatable yet quite different from, say, the John Williams style. Remember who made it: Toto, a rock band. In fact, the story is that Lynch offered a recording of Shostakovich's Symphony No. 11 to the young musician. "You liked it?" "Oh yes," he replied. "Good, I want something done in this style."
Weirding modules were a weird addition, but how else can you depict the Fremen as warriors so formidable as to best Sardaukar in melee combat with little effort?
Another "not in the book" the cardiac plugs on Harkonnen subjects. It's not in the book, but it is cunningly fitting.
IoT: When it survives a cost-benefit analysis, we may reconsider. Wait. We can simplify that to a mere benefit analysis.
5G: That's exactly what I was wondering. So do we really have 4G now? Really?? It's like IPv6. It's been out for a while. A really long time. We are still waiting.
VR: I keep hearing that pr0n will make this a big business. You can hide whatever you were looking at by pressing alt-tab. Not so easy to conceal your dark helmet.
IT jobs sexy: I don't know of many jobs that are sexy. It's easy to be fooled by the "sexiness" of the end result of something (check out TV dramas on lawyers or physicians), not realizing that there is a lot of effort and a fair share of tedious work to do in order to get there. This applies to any serious job. So if your job is sexy, it must be a very easy one.
"Today Thunderbird developers spend much of their time responding to changes made in core Mozilla systems and technologies. At the same time, build, Firefox, and platform engineers continue to pay a tax to support Thunderbird."
Excuse me, but there hasn't been a full-time developer working on Thunderbird for some time now. That "tax" must be pretty small. So their finances must really be in the gutter. What did they do with the millions upon millions they received from their deal with Google?
The atrocious stupidity of matching the Thunderbird version number to Firefox's did not help. Mail clients are like databases: you don't want a new version every five minutes, and if there is a new version, there must be a valid reason to upgrade.
"I believe Thunderbird would thrive best by separating itself from reliance on Mozilla development systems and in some cases, Mozilla technology."
That is one gigantic confession of failure. Of course it's hard for Thunderbird to be "synchronized" with Firefox technologies, because just to figure out what's going on with those is in itself a full-time job. Let me reformulate this in another way:
"Thunderbird would thrive best by separating itself from reliance on" Netscape Communicator 4 technology.
I'm puzzled that Google Drive gets mentioned as some praiseworthy feature. The Dropbox client on Linux has always worked great for me, for as long as I have had Dropbox (even before it could be used on Linux without a browser; better, even before Google Drive existed). At least Dropbox likes to have you think that they don't nose around in your data.
The subtitle of the story mentioned Mir, which is a very interesting topic, but there was nothing about it in the article. As I don't use Ubuntu, I am very enthusiastic whenever they throw in new and rather experimental stuff (until it pisses off people sufficiently to revert back to Micros~1).
As much as I praise the move to a "fresh" X framework (Wayland, Mir, whatever, bring 'em on), I am afraid it will take forever. First, it has to be stable. That's the easy part, and we are not there yet. Then everything has to be ported, from what I understand, as it's not quite a drop-in replacement. What about the stuff that has gone unmaintained for a long time? It will be left behind. And so many will just stick to old X because... the new thing removes more than it adds.
I recently switched from openSUSE to Gentoo, and I like the extra freedom in controlling my system: no systemd forced on you, no PulseAudio, no Btrfs thanks, no need for hundreds of irrelevant kernel modules, etc. I still keep an install of openSUSE (13.1) for the few times I need to access one of those smelly things I enjoy avoiding on Gentoo (Flash, Java). I might look at Leap once 13.1 reaches the end of its Evergreen life, if all goes as planned.
8TB? Maybe some in the server market could enjoy that. But in the consumer segment, I just don't see what intelligent life form could need this. What about people taking N pictures of themselves a day, in high resolution, where N is a huge integer, and keeping them all? Again, I said intelligent life form.
Or maybe a superior life form. You know, those who can really see a meaningful difference between high definition and "ultra" high definition. Necessarily, those individuals can also see x-rays and smell dark matter.
There are two ways to escape this hell: choice and sound management.
Take Adobe Acrobat Reader. Install size of the latest version (15?): 350 MB. Purpose: document viewer (not even an editor). Something has gone horribly wrong here! Yet there are alternatives; you have a choice! Of course, when it comes to "office suites", you are limited to a choice between proprietary bloated monsters and a free bloated package. But that's software done wrong, for people who ask for software done wrong.
As for those sad legacy systems in many corporate environments, their situation is explained by a complete lack of understanding from managers who know nothing about software development and who believe that writing a program is like making a table: once it's done, it's done. Er, no. It's perpetual development. Maintenance! Documentation! Instead, they chose to have it completely frozen, fired the developers as soon as it got past the second beta, and did not even bother to ask for a copy of the source code because... that was a decision beyond their scope.
Also, if an organization selects an operating system (Windows) whose backward compatibility with what you use or have built is known to vary over time, then of course IT is frustrating, but that is the result of a willful choice of helplessness!
Exactly. Not a good trend, but not quite in a bad shape! "declining revenues" != "actual losses", and given the amount of net revenues and liquidity they have, this is not quite a sad day.
The market has become seriously saturated. With a Pentium, you had a good reason to ditch that 386. With a Pentium III, you had a good reason to ditch that Pentium. With a dual- or quad-core, you had an excellent reason to ditch that Pentium III. Now? Still no end in sight for the machine I bought in 2009.
Just like the 56k modem businesses and countless others had to find something more exciting, a similar thing might apply to that x86 fella.
What is the "information bomb"? Easy. It's all those computers running M$ products that are so pirate-friendly. The solution to this has been known for a long time.
And add to this a large number of smartphones that have not received a security patch in a long time (*cough cough* Android *cough cough*), but I'm not too worried about this, as I don't think there is much sensitive information there, other than... browsing history and questionable pictures?
That's very interesting, but I would like to know how that magic operates... What data must one provide? I seriously doubt that taking a picture of the field is enough to optimize on a square-metre basis. But then, I understand if those startups are a little secretive for the time being.
Anyway, I'm very happy the author boldly pointed out how much "big data" has otherwise been one huge mammoth buzzword.
I have always been skeptical of cloud stuff and never really drank that Kool-Aid. The ability to share massive infrastructure can have obvious returns to scale, but clearly, if we are talking about something "important", nothing beats the good old "on metal".
The services that were down are anything but important. I'm even OK if they go down from time to time due to a glitch or (better) some maintenance. What if it's down: Netflix? Read a book. Amazon? So you're done reading all the books you ordered? Tinder? Learn how to read!
Kids + even more computer spending + even more computer science (presumably elementary/high school) == questionable.
Programming is good for many, but not for everyone. Many should learn _some_ programming, to the extent that it would be extremely useful in their work, even if they are not actual software developers/engineers. But when will that specific context arise? Four to ten years after an initial course on the subject? And what was learned as a kiddo might not be relevant anyhow to the needs encountered in the real world. Not an optimal use of scarce time at school. The purpose of elementary/high school is **not** to train technicians! There is another level of education geared toward exactly that.
Anyone seen this?
Biting the hand that feeds IT © 1998–2019