I remember buying an issue with Mandriva on a CD-ROM in the late '90s; installed it, and it worked pretty well ...
shutdown -h now for a second time: Mag editor fires parting shot at proprietary software
Linux Journal has closed with "no operating funds to continue in any capacity", according to a notice on its site. First published in March 1994, Linux Journal was founded by Phil Hughes and Bob Young, the latter being the co-founder of Red Hat. The first issue, which you can read online, includes an article by Linus Torvalds …
"even some still in print, so it is possible to make it work"
In the UK there are at least 4 I regularly see in large branches of WHSmith: Linux Magazine, Linux Format, Linux <something else> and Ubuntu User.
So it's quite possible to make it work, at least in the UK. Perhaps trading conditions are different in the USA, but if not it sounds like they're just making excuses for bad management.
Linux Journal had articles like "Address Space Isolation and the Linux Kernel" whereas Linux Format has articles like "Manage Fonts in Ubuntu" - it was more akin to Dr. Dobb's Journal in its target market. Just another casualty of development by means of Stack Overflow, I suppose...
If you want to read about address space isolation in the Linux kernel, it is unlikely that the magazine is running the article that month, so you would look it up online, and as long as it isn't so old that a new version of Linux has completely changed things, it will still be useful.
In the past, I suppose you would have gone to the library to look at back issues.
... more akin to Dr. Dobb's Journal in its target market.
Ah, Dr. Dobb's. Sad that that has gone. Maybe there just isn't the interest in computer orthodontia these days ... though programming anything worthwhile in Tiny BASIC these days would be about as pleasurable as having one's teeth pulled!
"In the UK there are at least 4 I regularly see in large branches of WHSmith: Linux Magazine, Linux Format, Linux <something else> and Ubuntu User."
And if all of them were online-only magazines, how many of us would even know they existed? There doesn't seem to be an equivalent of WHSmith or Menzies (or supermarkets, for that matter) online, where you can spend 5 minutes browsing a huge range and variety of magazines. Once a magazine goes online only, it's in its death throes unless there's a large and ongoing marketing budget. Only the hardcore readers will stay, there'll be few if any new, first-time subscribers, and there'll be those who really don't like reading a magazine format on a screen because you lose the convenience of the physical format. The "cool kids" think dead-tree magazines are archaic, but they lose out on discovering new stuff they might not otherwise come across. They search for what they are interested in, don't see the other stuff, and almost by default end up in an echo chamber.
The counterpoint is sites like Hacker News, or this Humble Establishment. How much are you paying for them?
Reading a double-page spread on tool X where the first page is installing the toolchain, vs a blog post with a link to the installation guide followed by the actual stuff you're there to find.
Where the blogosphere (does that word age me?) falls down is in the WHY area. A search for "I want to do X" will be met with results of "How to do X in Y tool", without much further examination as to why you'd pick that particular tool. Would you even ask the question if you hadn't heard of the other tools?
My point is that while I agree with you regarding the echo chamber of search results, hub sites fulfill a similar role to the magazines. With the bonus that you don't feel you've wasted your money if you only read 10% of the articles.
"Eric Raymond wrote... about the problem of critical internet services being dependent on unpaid volunteers... The big corporations will always step in to fix things, but to Raymond that is part of the problem. "Wouldn't you like to have an internet that's less beholden to the mercy of large corporations and governments?"
Unpaid beardies who can't afford to keep services running, or corporations that can. Pick one; you can't complain about both, Eric.
How about finding some way to support the unpaid beardies instead of saying it's either/or
Let's just say that the likes of Stallman don't live in the real world. Stallman especially can afford not to because someone else pays him a living, so reality is kept at a comfortable distance. But you learn very quickly never to let a suit near the bearded socks-n-sandals wearers if you want to convince the suits to sponsor something - if anything, they'd flee to Microsoft in a New York second.
I tend to find ways to buy support, because that's a valid business argument I can get a budget for, and it helps keep the lights on for the people who put a lot of time and effort into writing sometimes truly amazing code. But my business experience with FOSS enthusiasts is that they tend to live quite close to the line where enthusiasm becomes fanaticism - it's not a good idea to let an inexperienced manager/budget holder anywhere near them. Sadly, the pitch would unsell itself quickly.
The number of projects I've seen going down the drain because of tech people not knowing when to shut up is quite impressive.
It is the key reason why I make sure business and technical briefings are now kept wholly separate - ideally I get them scheduled at the same time in different locations. That way, we can get things organised and I can actually ensure they get the budget they need.
Management is not into charity other than when there is a direct benefit (their job is to make money), so as soon as someone starts preaching the commune, all management hears is that they're paying for others, which is not something you want in their heads when they're deciding on budgets.
Can they meet? Sure, just keep exposure to under 5 minutes, and not for critical decisions.
Possibly because when they started, it was just a hobby project, not critical infrastructure. And since I imagine it's quite gratifying when your hobby project becomes really successful, they keep working at it (even) as it morphs into something a lot of others depend on. But at some point the workload exceeds a personal threshold, and only then does the question get asked.
They paid me a good deal of money for mine (an early so-called "portal"). Then promptly ruined it, and got bought out themselves. The third owner begged me for help. I fixed it in their image (for shares in the company + cash), they promptly sold it for an astonishing number considering what it was, we all made bank ... and naturally the new owners broke it again. They begged me to fix it, but I refused. Been there, done that ... and was comfortably pseudo-retired.
Why are unpaid people bothering to maintain critical infrastructure?
I am of the opinion that FOSS is the best way to gain control over that criticality - typically, it's organic growth. I have no issue with paying for FOSS software maintenance, because (a) it helps, possibly even ensures, continuity, and (b) it means I can get the authors to address things I'd want changed, improved or even invented.
I have zero control over proprietary code - I can't give that to others when the original author/company decides to call it a day, so it's quite important. I call it the Zune factor.
The problem with volunteers is that while they're really into [some project] it flies along, with frequent updates, bug fixes, new features etc. But once they get bored and start disappearing off to the next fun thing, it all starts to go pear-shaped. The dev world is littered with half-supported or dead OSS projects that could have been great if kept going.
And yes, OK, the corporate world isn't slow to put a bullet in its products, but - with the exception of Google - at least they give you plenty of advance warning.
"the corporate world isn't slow to put a bullet in its products, but - with the exception of Google - at least they give you plenty of advance warning."
Twenty-four hours is good enough for you, then? That's the typical warning time for anything run by a US corporate.
Anything from Oracle that's not the current release?
It's actually pretty common for a version to go from on-sale and fully supported to not sold and not supported overnight - but you can upgrade to the new version for only a few hundred dollars.
Of course these days they try to push you into the monthly-fee version instead, which is not only unsupported if you stop paying but ceases to exist entirely.
At least in Google's case we know to count to ten and then just assume it'll get dropped because...
I think what annoys me most about Google is not that they abandon projects, but that they don't think to dump the code into a repo so other people (those who like X) can keep it going if they wish to take it over.
"...but that they don't think to dump the code into a repo so other people (those who like X) can keep it going if they wish to take it over."
They hired people to write the code, provided them with boxes, rooms, seats and lights all of which costs money. Google *own* the results. The results may, someday, be useful as part of something else (what? You've never reused bits of code in other projects?) or even be sold to someone else who wants to do something similar.
Capitalism. Never give away *anything* unless the return massively exceeds the loss.
The dev world is littered with half supported or dead OSS projects that could have been great if kept going.
Sad, very sad.
But 100% true.
The Linux eco-system has a huge amount of available manpower which, if properly organised and oriented, could achieve a whole lot more.
But it is hopelessly dispersed, as every T, D & H out there wants to roll their own whatever.
I mean, choice is OK, but ...
Just how many editors, desktops, filesystems, etc. do you really want/need?
Certainly there is not and can never be a single filesystem that's perfect for all use cases.
Access patterns, bandwidth, latency, capacity per physical replaceable module, layout of the modules etc all vary spectacularly and have very different tradeoffs.
A jack-of-all-trades filesystem like NTFS or ext4 is probably fine for most general-purpose desktops, but an embedded, high-availability or massively distributed system (e.g.) is almost certainly better off with something else.
I can't help thinking that the penetration of the Raspberry Pi running Linux will start to swing the balance back.
For thirty-odd quid anyone can have a computer they're not afraid to break - one they can afford to 'try' things on even if it breaks an I/O port or two.
Those people, and there will be plenty of young 'uns, will grow up, like the Speccy/Amiga/etc. generation, used to the idea of delving into their computers and it will all be on Linux. It will take a few years for the impact to become clear but I don't think Linux is going to become totally invisible.
I also suspect that the long-term impact of Steam making more and more games available on Linux will grow the market of people who want to play games but don't want a sealed, proprietary box and who therefore become increasingly familiar with Linux.
Or maybe I'm being hopelessly optimistic?
I think the problem with the Pi, compared to the Spectrum/C64/Amiga era is that the Pi isn't quite so "plug and play". You can't shove the lead in the back of Granny's portable and hack out a bit of BASIC in the same way as you could with the VIC-20 or the ZX81...
"there were 23 million Pis"
Now subtract all the people that got a Pi and then put it aside when they got some new shiny-shiny.
Now subtract all the people that started with a 1B, then got a Zero, then a 2... That's me. I never upgraded beyond a 2 because I saw no reason to, but I would imagine the upgrade path alone would divide the 23 million by four.
Now subtract those who have multiple Pi dotted around the house quietly doing tasks.
23 million Pi does not necessarily mean 23 million users, and for what it's worth, none of my Pi run Linux.
Then you've got folks running Cluster HATs with four Pi Zero boards on top, and the likes of Mythic Beasts running lots of them in data centres for customers. Other ISPs do Pi hosting too now, but I haven't seen anything on quite this scale.
23 million Pi does not necessarily mean 23 million users
The Raspberry Pi Foundation tells us half of all Pi computers built have gone into commercial and industrial products, so we can halve that number before starting to calculate how many other users there are. Many are in educational use, where the kids aren't programming as such, just pushing blocks around to move a cat. We can discount a good number who are using them as NASes, streaming boxes and game emulators. Others are just using them as a cheap PC.
Five million or so Pi owners doing some kind of 'Linux programming' would be my guesstimate.
"23 million Pi does not necessarily mean 23 million users, and for what it's worth, none of my Pi run Linux."
And then subtract all those kids who simply won't ever engage as much as we'd like, because they look around and everything has been "done" already. Back in the 8-bit days and the early 16-bit days of Amiga/ST, there was no internet, not even a BBS for most kids, so they were trying stuff out for, in their eyes, the first time. They were inventing new stuff, because either it was actually new or they just didn't know that others had already done it.
Now, don't get me wrong, I'm not dissing the RasPi or the people benefiting from it, I just don't see it as being the groundbreaking device the early stuff was. A lot of kids will play with it, but they already know the clever stuff has been done or needs teams of well-funded people to do nowadays. They only have to look on the web to find examples of so many simple or obvious ideas, and that can be very off-putting for young minds, especially if they don't have enthusiastic parents or teachers to help out. Its appeal is to the type that will do well in most hobby areas, those that see the achievement in doing stuff themselves. I see it as the difference between people who build and fly model aircraft and those who buy a drone they can take out of the box and fly instantly. One group is significantly larger than the other, and guess which it is.
"One group is significantly larger than the other, and guess which it is."
Thanks JB. Of course in the distant days you had to build it yourself (even if from a kit) as there was no alternative. The radio controller for my first plane had just one button - press once for left turn, twice for right turn, the servo was driven by a rubber band, and you had to align the channel yourself. You needed both ingenuity and attention to detail to build and fly under such conditions.
We can only hope our future engineers will be drawn from the builders of the model aircraft, not the drone flyers. Otherwise we're doomed.
In order to learn the essential first principles (the fundamentals of engineering) you need to start learning on systems and with tools that are simple enough to get your head round quickly. The problem with most current offerings (including the Pi) is that they're extremely complicated, so there's a temptation not to investigate how they work at the metal level but just to use them to do jobs. Consequently, fewer and fewer folks learn about how to design new stuff from the metal upwards, so real engineering practice progressively deteriorates (and it's visibly happening already).
"how to design new stuff from the metal upwards"
Exactly. Once upon a time it was more or less obligatory to know some degree of assembler and with it a number of tricks to minimise cycles and/or bytes.
Now? Now there's no point. You could store each element of a bitfield in a big integer array and nobody would notice. And counting cycles is pointless on a gigahertz speed multicore processor that might not even execute your instructions in the same order that you wrote them...
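The bitfield trick alluded to above - squeezing a list of flags into one integer instead of one array slot each - can be sketched in a few lines. This is just an illustration of the general technique; `pack_flags` and `unpack_flags` are hypothetical names, not from any particular library:

```python
def pack_flags(flags):
    """Pack a list of booleans into a single integer, least-significant bit first."""
    value = 0
    for i, flag in enumerate(flags):
        if flag:
            value |= 1 << i  # set bit i when the flag is on
    return value

def unpack_flags(value, count):
    """Recover a list of `count` booleans from the packed integer."""
    return [bool((value >> i) & 1) for i in range(count)]
```

The point stands either way: on modern hardware the byte-per-flag version is rarely worth avoiding, which is exactly why the old habit has died out.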
Indeed, the use of assembler over a high level language hurts portability - you'll notice that Linux has just enough assembler to get itself going, and no more.
Plus, on machines like the Pi, doing basic stuff is hard work - instead of bashing a weird serial device for getting data from a keyboard, you either use the onboard serial port and another computer to act as a terminal, or you write a USB stack. You know, like one does on a rainy Thursday afternoon...
The side effect of this is a gradual decline in those who understand the lower level behaviour of a computer.
"or needs teams of well funded people to do nowadays"
Here lies an important point. Back in the 8 bit days, it was entirely possible for a smart child to look at something and think "I could do better". Quite a number of games and utilities were written by bored teenagers.
Now what is the bar? Full office suites as good as anything people used to pay Microsoft for are available for free; the state of games is full-scale virtual immersion like "The Last of Us", which gives you a character in a world, not a small collection of brightly coloured pixels that needs imagination to make real.
Where is the "I can do better" mentality? Apart from, perhaps, bolting some bits together to make an IoT gizmo better than cheap Chinese rubbish with its non-existent "security", the concept doesn't exist. A teenager isn't going to single-handedly write a popular game any more, as people's expectations are so much higher these days.
IMO the learning curve is too steep now for a lot of kids. Back in the 80s it was switch-on-start-typing and you'd get a result straight away. Now it's switch on, wait for it to boot, open a command line, type "python" (probably), and then, apart from simple stuff like print, they have to remember the syntax of a complex professional language before they can get anything running.
People may laugh now at noddy old BASIC but there was nothing like typing
10 PRINT "DIXONS ARE ****"
20 GOTO 10
into a Speccy or C64 in the shop and then leg it, to get an early flavour of what programming was about (particularly government projects) :) Good luck doing that now with anything in PC World, if there's even one remotely close to you.
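For comparison, the nearest modern equivalent of that two-liner is a hypothetical Python sketch like the one below - bounded here for demonstration, since the original `GOTO 10` ran until someone pulled the plug (or the shop staff caught up with you):

```python
def speccy_banner(times):
    """Return the output of the classic two-liner, limited to `times` repetitions."""
    return ["DIXONS ARE ****"] * times

# The in-shop version would loop forever; three lines make the point.
for line in speccy_banner(3):
    print(line)
```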
"Back in the 80s it was switch-on-start-typing"
I think that made kids a bit more bold with coding - if you messed up, you could reset and start again in a couple of seconds, without having to wait for the OS to reboot, or worrying about corrupting the file store.
(That didn't make "R Tape loading error, 0:1" any less frightening though)
Some SBC projects like the Maximite have tried to recreate that experience, and FUZE used to do a version of their kits with a Maximite installed instead of a Pi - the cases were yellow instead of red.
The problem today is that there is so much more on computers to entertain kids. Nowadays there are so many high-quality games and so much content on YouTube, etc., to watch that it's easy for young kids to just drift into that and stay there. Back in the day that many of us started, you literally had to write your own games or utilities if you wanted to get something done.
Shame, now that someone has pointed me to it, it looks like it has had some great content over the years. I do feel with Linux, like with Engineering, the roots of the subject are often well taught and understood but learning the "local details" specific to your implementation can only be absorbed by exposure over time.
Sources like this are precisely the kinds of places you can get that exposure, albeit not necessarily the time to absorb it all... (Or filter out what is or is not relevant anymore).
There is a difference between printed technical magazines in the UK and US. I remember reading a comment by a reader from the States to the effect that UK magazine cover prices were very high whereas US magazines were cheaper but had a lot more advertising. That might explain why the UK can support a number of titles, as there is only so much advertising whose natural home would be a FOSS magazine.
It doesn't matter about GNU/Linux being in the background, or in a VM.
GNU/Linux is just one OS that makes use of FLOSS software. There are others.
The important thing is to talk and advocate about the licensing and the ideas behind FLOSS. Focusing on just the OS is falling into the same mistake the Open Source group made, where they removed all talk about protecting rights and freedom and only talked about openness as a means of improving code quality. Now look where we are, with Linux Journal suddenly realising that things are getting locked up and that we are lacking freedom. They blame how GNU/Linux is used rather than the real issue: that nobody is properly making the effort to talk about that freedom. Well, Richard Stallman and the FSF are, but thanks to the efforts of the open source movement he finds he has to constantly remind those he talks to that he has nothing to do with open source at all.
Free Software incorporates the benefits of open source while focusing on protecting it. It's a political idea. Open Source tried to avoid all that; now we have this situation where companies grab our code and use it as a platform to create proprietary systems.
If we apply this to cars: we need to find ways to push people to think not about the car, but about how they are or are not restricted in what they can do with the car, or where they can go.
Funnily enough, this is precisely what Richard Stallman and the FSF have been talking about over the last several years. GPL version 3 tries to address this, yet there are those who stick to GPL v2 thinking that's all they need, or who don't like GPL v3 because they secretly want to be a bit more proprietary, or who sell out and pander to big corps because they get a thrill when a big corp uses their code. GPL v2 does what it says in the licence text, but clever people are leveraging ways to subvert and work around it, which is what got us into this situation; adoption of GPL v3 would (should) have fought against this.
There are also the other Free Software users/developers who prefer total freedom. They develop and use software under MIT licences, which are Free Software licences but do nothing to protect you from bad actors. They are so into freedom that they believe there should be no restrictions at all. That sounds great, but it includes the bad actor who wants to lock you up in their code/protocol. With the car analogy, this would be like having a car that you are totally free to do anything to, anything with, and take anywhere. Then you leave it in a car park where the car park owner takes a liking to it and legally transfers control of it to himself. You come back to find you must pay to access your car, and that the car will now only start on a Sunday, just because the new owner thinks it should. A GPS has been fitted, tied into the ECU, so when he sees that you are driving to Coventry, he can kill your car, because for some reason he hates Coventry and thinks you should pay more to go there.
The Open Source movement got the ideas and licences into business by dumping the political arguments of freedom. This worked, but it now looks to have worked too well. Now it's time to unify the Free Software and Open Source movements so we can start working on getting control of the car back.
There are many points I could respond to, but I'll pick on this one specific point: "Then you leave it in a car park where the car park owner takes a liking to it and legally transfers control of it to himself. You come back to find you must pay to access your car, and that the car will now only start on a Sunday, just because the new owner thinks it should."
The thing you are missing here is that the car park owner takes a COPY of my car and makes it his, and his copy of my car only works on Sunday. My ORIGINAL car is still there and still works exactly as I had intended it to. The car park owner could replace my car's open design with whatever closed one he wants. It won't make the remotest bit of difference to my car. It would be like suggesting that if I go into the garage and spray-paint a giant dick onto my car, every Citroen C1 in the world would suddenly sport a badly drawn phallus. It just doesn't work like that.
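The copy argument is the same one you would make about any data: modifying a copy leaves the original untouched. A minimal sketch, with the car rendered as an illustrative dictionary:

```python
import copy

# The original, openly licensed "car".
car = {"make": "Citroen", "model": "C1", "starts_on": "any day"}

# The car park owner takes a copy and locks his copy down.
car_park_copy = copy.deepcopy(car)
car_park_copy["starts_on"] = "Sunday only"

# The restriction applies only to the copy; the original is unchanged.
print(car["starts_on"])
```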
Unless, of course, that was some sort of reference to software patents, which are bullshit, and no licence on Earth is necessarily going to save you from a company with deep pockets if they think you "stole" their idea.
Really sorry to hear about the end of Linux Journal - it was a quality mag in a world of quick answers via Stackoverflow etc.
That is the real problem - we live in a TL;DR world, and the people who are still willing to invest time in reading in depth are few and far between. They are also not people who spend money - free beer is what brought them to Linux in the first place!
What LJ should have done is sell their brand rather than their words - get their logo onto the talking circuit that makes up so much of the vacuous world milking Linux - because that is the reality of Linux these days: a few people write some genius code, and everyone else sells the potential, style over substance.
Not that any of this is really news. Stealing BSD code is what actually built the internet; all that has changed is that the lawyers have more complex licensing to get paid to play with.
Biting the hand that feeds IT © 1998–2019