Just open source it; you'll soon have developers helping out, if it's as quick and light on resources as claimed.
What were you doing in 2010? The Space Shuttles were still flying, Toy Story 3 ruled the cinemas, and Apple released its very first iPad. Oh, and Linux distro Elive locked down its last stable build. Until now. Designed to run on minimal hardware, Elive is very much a passion project of its leader, Samuel F Baggen. Based on …
Just open source it; you'll soon have developers helping out, if it's as quick and light on resources as claimed.
"Just open source it"
That's sarcasm, right? Looks like it, it's so preposterous, but lacking a /s at the end or a funny icon, I prefer to be sure.
No, it definitely IS sarcasm ... wait, wait, is it really sarcasm? No, it can't be possibly anything but sarcasm ...
What does it run like on a modern 8th gen i7, 32GB RAM, a 4GB video card, & 250GB SSD?
At a guess? Either not noticeably faster than a standard OS or even potentially worse. Assuming it's 64-bit (initial release was 2005 so this may be stretching matters) then the main benefit to running it over, say, Win10 is that it won't try to think for itself when you're trying to work on it.
Yeah, if you haven't noticed, OSes haven't actually got quicker over the last few decades, even though they're not doing very much more. Windows 10 was about the first Windows OS to actually *improve* performance on existing machines, and even that's debatable.
To be honest, I baulk at the sight of needing 512MB of RAM to draw a few windows on the screen. We're really still doing something drastically wrong. I know there's a lot more to it than that; I know there are myriad background services and all kinds of connectivity and layers and security and god-knows-what going on to facilitate showing those windows. But I can remember when I had to upgrade from 1MB to 2MB just to show a 640x480 window (when I already had 1MB of video RAM) and I couldn't fathom why that was necessary.
Fact is, we're not making the OS any faster but taking everything modern and paring it down. If you want to do that, you have to throw out 99% of what the OS / window manager currently does that's unnecessary.
We've been doing that forever. Intel would trot out a new CPU that was blazingly fast, then Microsoft would trot out a new Windows that ate all that performance and then some.
Every time a new, much more powerful video card came out, you could be sure that there was a game that would bring it to its knees gasping.
Let's face it: we've only just arrived at a time when low-end models are good enough for everyday, home office stuff. That in itself is quite something, and handily explains why people are not buying so many laptops and PCs any more - they don't feel the need; what they've got works well enough.
Even though hardware is nowhere near as expensive as it used to be, a hardware upgrade is nowhere near bringing the same impression of increased power we used to have either. It used to be a thrill to put €850 into a new video card because wow, look at how much better my games run. These days I see new €1000 cards or CPUs and think: why bother? For the 5% increase I'll get in performance, it's just not worth it.
"But I can remember when I had to upgrade from 1MB to 2MB just to show a 640x480 window"
Shoddy PC architecture! The Amiga 500 could do 640x512 on just half a meg of chip RAM (plus a whole sixteen different colours!).
"What does it run like on a modern 8th gen i7, 32GB RAM, a 4GB video card, & 250GB SSD?"
This question makes me think of the scene from Lost in Space (the 1998 movie not the Netflix series), where they find the ship from the (relative) future and the computers are so fast they struggle to use them.
Also, the fact that Matt LeBlanc, Heather Graham or any of the other stars from that incredibly cheesy movie haven't been given a cameo in the Netflix series seems a missed opportunity.
Even though hardware is nowhere near as expensive as it used to be, a hardware upgrade is nowhere near bringing the same impression of increased power we used to have either.
There you go: A 2012 Intel Core i7-3720QM is about as powerful (as benchmarks go) as a 2017 Intel Core i7-7700HQ.
Yes, 640x480x16 comes out at exactly 150KB for the primary frame buffer, assuming 4 bits per pixel.
You would have struggled to get a dual-frame buffer working, but for 2D single buffer, it was ample.
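The arithmetic above checks out, and is easy to sketch; the resolutions and bit depths below are just illustrative examples:

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Size in bytes of a single linear frame buffer."""
    return width * height * bits_per_pixel // 8

# 640x480 at 16 colours (4 bits per pixel): 153,600 bytes, i.e. exactly 150 KB
print(framebuffer_bytes(640, 480, 4) // 1024)    # 150

# For comparison, one 1920x1080 buffer at 32bpp is already over 8 MB
print(framebuffer_bytes(1920, 1080, 32) // 1024) # 8100
```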
When it comes to things like graphics intensive games, sometimes they used to render a frame in main memory, and blit it out to the graphics card's frame buffer, so for some purposes, it may have been necessary to increase main memory.
Remember that unlike today, where the graphics card can do complete object rendering with texture mapping and light sources, and even now full-blown ray tracing, early VGA and SVGA cards did not have a huge amount of intelligence, so often the main processor did most of the work.
1.5 MB 24-bit color Retina card here. And that was on top of a 16 MHz 68030, 4 MB of 32-bit memory, a SCSI controller and a hacked 386SX Bridgeboard. I even hacked it for 1 MB Chip memory and multiple Kickstart ROMs. Way faster machines around here, I favor dual Xeon setups, and not one is 100x+ faster than that Amiga. Pretty disgusting, really.
Wait, they made a movie AND Netflix series out of an awesome TV show?! #Danger
Don't forget, the stock Amiga had those custom chips in it which helped a lot with graphics operations back in the day.
I had the pleasure of getting an old A500 with multiple floppy drives back up and running for the owner; I was surprised the disks were still good, let alone the drives.
Getting the IIgs I have back into operation, that's going to be tricky as a) it didn't come with an ADB keyboard or mouse (or 3.5" drives), and b) the gits on fleabay want FAR too much money for parts in even crummy shape. ("It's VINTAGE! We can charge $stupid for it and some schmuck will buy it!") Fortunately, there's a guy making modern bridge hardware for the input devices, and a plug-in box that'll emulate every drive ever made for that platform, and access disk images on an SD card.
Yes. Both are very different from the original and from each other. The Netflix series is barely recognizable. Don't watch it for nostalgia; watch it for its own sake.
The new Dr. Smith is WAY nastier. Prettier, though.
"The most amazing achievement of the computer software industry is its continuing cancellation of the steady and staggering gains made by the computer hardware industry."
- Henry Petroski
To everyone that replied, Thank You. Enjoy a pint in gratitude.
I honestly wondered how software designed to run on such low-grade hardware (think 386 with 512MB of RAM) would run on modern high-grade hardware. Like training a race car driver on an old electric golf cart that can't do much beyond a walking pace, then sticking them behind the wheel of an NHRA dragster capable of 300MPH quarter mile rocket sprints. Will the driver hack it or will they lock up & wind up spreading themselves like thin chunky salsa all over the place?
In a similar vein, will the software run like a champ on a 4GHz multithread CPU with gigabytes of system RAM, more gigabytes of VRAM, & an SSD with more storage space than it even previously thought possible? Or will it see all the possibilities before it & lock up in awe thinking the computer version of "My god, it's full of stars!"?
The mental image that flashed through my head was of trying to run a game coded for the XT age of computers on the i7/4GHz/32GB monster some folks have at their disposal. The phrase "Blink & you've missed it" kept cackling merrily in my head. =-)p
Fun fact - many old games counted CPU cycles or some other trickery and timed their AI speed to that - the original Command and Conquer does this I'm told. That's one situation where DosBox's random maths feature comes into its own - otherwise the AI player just magically gets an enormous base.
I remember trying to play an old Golf game (it was either PGA or Microsoft Golf 2) on a modern machine, it was unplayable as you could not time the swing bar as it took less than a second.
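The fix games eventually adopted is to scale movement by elapsed wall-clock time (delta time) instead of stepping a fixed amount per frame. A minimal sketch of the difference, with all numbers hypothetical:

```python
# Simulate one second of game movement at a given frame rate.
# Old style: fixed step per frame, so game speed scales with frame rate.
# Delta-time style: movement scales by elapsed time, so it's machine-independent.

def simulate(frames_per_second, seconds, per_frame_step, units_per_second):
    frames = frames_per_second * seconds
    dt = 1.0 / frames_per_second
    frame_counted = per_frame_step * frames       # faster PC => faster game
    time_based = units_per_second * dt * frames   # identical on any machine
    return frame_counted, time_based

# The same loop on a 70 fps machine and one running 10x faster:
print(simulate(70, 1, 1.0, 70.0))   # (70.0, 70.0)  - intended speed
print(simulate(700, 1, 1.0, 70.0))  # (700.0, 70.0) - frame-counted runs 10x too fast
```

The frame-counted value blowing up at 700 fps is exactly the "swing bar takes less than a second" problem.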
My test used to be SimCity (The original. Not the abomination EA released a few years back).
OK. Install SC. Start it. Hit PAUSE immediately.
How many Months / Years gone by?
OK. Nice upgrade!
Comparing laptop CPUs? Seriously?
You're not even in the same ballpark. Here: two CPUs, both running at 4GHz, from Q4 2017 and Q2 2018. That's a comparison!
Intel Core i3-8350K @ 4.00GHz: price $193.15, CPU Mark 9279
Intel Core i7-8086K @ 4.00GHz: price $552.96, CPU Mark 17062
Considering the movie offered that same courtesy to Dick Tufeld (Robot, voice), Mark Goddard (Major Don West), June Lockhart (Maureen Robinson), Marta Kristen (Judy Robinson), and Angela Cartwright (Penny Robinson), it does seem a missed opportunity.
Actually, the system boots before you even press the power button.
Replaced your HD with an SSD yet? That's the speed increase you've been looking for.
A CLI version should be pretty OK. After all, Raspbian runs fine on a Pi, in CLI.
A CLI version would probably be indistinguishable from native Debian.
I remember the days when you could run Windows for Workgroups in 2MB RAM on a 386SX, etc etc.
A modern Pi is fine with a sensible GUI. XFCE is a good choice, but there are plenty of other lightweight GUIs to suit all but the people who need their windows semi-transparent and rotating over the surface of a Klein bottle.
I can't really see the use case for a Pentium with 256 MB of RAM. Anything that still needs one will probably also require whatever software was running on it already. Otherwise, a better computer can be had for $5 with the Raspberry Pi Zero. A better computer with a screen can be had for $25-40 if you look for a used laptop being sold on your used-goods-emporium of choice. Power consumption means they aren't even good for places that can't afford modern systems. What's the point?
>I can't really see the use case for a pentium with 256 MB ram.
However, that takes this distribution into the space already served by more dedicated micro OSes and Linux/FreeBSD distributions with better support. A router, for example, doesn't need a particularly powerful processor to handle the UI. Likewise, whilst your smart TV could benefit from a decent CPU, doing so would massively increase the price. [Aside: personally, I prefer my TV to be pretty dumb and leave the 'smart' stuff to the Xbox/Playstation/etc.]
As for power consumption, we do need to be a little careful here. Whilst an ancient desktop PC with a Pentium may be a power drain, I'm not sure about newer versions of these 'older' chips that Intel may be producing for specific applications.
There was a time when I used an old PC as a router. Then I saw what Mikrotik routers could do while still running off a micro USB power input. I've already got the cost of one from the electricity saving and everyone else is happier at the size saving.
Of course you are right. There are a lot of good use cases for machines with 256 MB or less of memory. However, my original point with respect to memory was basically these two:
1. I wouldn't recommend a machine with 256 MB or less of memory as a desktop, running the GUIs of multiple applications,
2. I wouldn't recommend a Pentium computer ("Pentium" here meaning the typical age of chips by that name when they might regularly have shipped with 256 MB of RAM) for any purpose. Among the reasons for this are power efficiency, raw processor speed, memory speed, and the speed and reliability of the disks typically found inside these things.
Low resource use is not Elive's goal but just another of its features, a very good one in the sense that the system is just very fast on any computer, and it can keep old computers usable.
There are countries, like Venezuela, where people cannot afford a normal computer at all, and there are still many old computers around, in schools, in storage rooms, and running in houses, wasting more than an hour a day of people's lives just to check their email on a strange, ugly system called "Vista".
"we'd frankly baulk at running it on anything much slower than a 533MHz Core 2 with at least 512 MB RAM."
I am assuming this is a typo, as I don't believe Core 2 CPUs were produced at speeds less than 1.06GHz. And I wouldn't call a Core 2 particularly low-end anyway; I have a 600MHz Pentium 3 box with 256MB of RAM that usually runs a headless Gentoo distro, which I am using with VDR and a DVB-T card as a Freeview recording and playback box. But I can also boot it into Puppy Linux with a full desktop, and it runs much better than old versions of Windows do on this machine.
Pretty sure he meant Pentium 2, because at least then the architecture, clock rate and RAM size all come from the same era at least.
OS programmers should all be sat down in a room, and shown a film about that "chess in 1k" on the ZX81.
That's an x86-bootable chess game in 512 bytes.
OS programmers should all be sat down in a room, and shown a film about that "chess in 1k" on the ZX81.
It's not necessarily their fault, but rather the involvement of the higher-level "let's make the system helpful to the point it tries to do most of the thinking for the user (and misses the mark, of course)" types - i.e. marketing etc.
I'm sure even Windows was nippy before the clipboards, messaging (or whatever it is that keeps feeding ads), update managers, telemetry and whatever other junk mistaken for essential services got rammed in.
"That's an x86-bootable chess game in 512 bytes."
But not fully legal IIRC, as in it doesn't follow all the rules.
The problem with the "good old days" is that unlike what a lot of people imagine, computers didn't manage to do all the heavy lifting in less memory and processing power than today, not really. What they did was offload the heavy lifting onto the user.
Yes, I remember using WordPerfect on DOS 3.x, and it was awesome. But if you wanted to see what your document would look like when you printed it, you printed it. Eventually, yes, they added preview mode, which would give you a graphic rendition of what it would look like, but you couldn't edit in it, you had to flip back and forth. And it required a WordPerfect specific driver for the video card. And a WordPerfect specific driver for the printer model you had, too.
In 1990, I replaced a video card with a model from a different vendor, and I had to reinstall something like 18 different applications, from scratch, and then reconfigure them each to use the new card. Today, you replace a video card, boot to the lowest resolution, download drivers from the internet, reboot, and you're done.
There's definitely room for improvement in terms of programmer efficiency, but the idea that we're using a thousand times the CPU and memory as we were thirty years ago and it's all being wasted is a fantasy. Back then, less than 5% of the population had a computer, and there was a reason for that. PCs were very arcane, and difficult for non-hobbyists to use. Today, we get a lot of ease of use under the covers. That takes a lot of resources and effort.
The idea that programmers thirty years ago were more brilliant and efficient than the ones today is completely wrong. I know; I was one of them thirty years ago. Believe me, we weren't any smarter or better than the current generation.
To my shame, it could easily beat me. "Mazogs" was another triumph of programming in 1K. The sky was the limit once I invested in the 16K RAM pack plus Blu-Tack to keep it steady. As the adverts of the time explained, it was enough to control a nuclear power station, though I didn't actually try that.
They had WYSIWYG even on BBC Micro word processors, and you could also send a bitmap to print, just like Windows. However, given the printing time for a bitmap relative to ASCII, which would be rendered using the printer's built-in font data (i.e. no memory cost to the computer), bitmaps tended to be reserved for images. Along comes Bill and buys up a load of fonts, and the next thing you have is Windows-only printers, which are a quarter of the price since they don't have to store the font data and can have dumber electronics, and a new printing standard.
As to WordPerfect, it was a product that preceded the technology you seemingly take for granted on the PC, namely the GUI, cheap RAM, fast CPUs and communications. That WP continued to be used after Windows arrived was because it was a standard that, once learned, was quicker to use than even the latest Word.
As to your video card, and your belief that everything happened without some smart people doing work that you are clearly oblivious to: there are lots of reasons why things were more efficient in the old days, but the most obvious one is that the computers could not be made fast and cheap enough to do otherwise. I could explain why nothing has actually changed with regard to drivers except assumed access to the internet and PnP, but suffice it to say you clearly do not know what you are wittering about.
I have no idea what you were programming 30 years ago but clearly you were unaware of what other programmers and the rest of the industry were doing.
Here is a question for you: given that Linux and Windows are based upon Unix and VMS respectively, what exactly has changed since they were invented by the old programmers? The correct answer is cost.
Everything is relatively cheap and plentiful now, and that means programmers can be sloppy where 30 years ago it was not an option for any significant project. Thus writing efficient code is no longer a requirement to be able to call yourself a professional programmer.
As to WordPerfect, it was a product that preceded the technology you seemingly take for granted on the PC, namely the GUI, cheap RAM, fast CPUs and communications.
Well, yes, that was largely the point.
The application developers of yesteryear spent great amounts of time and effort developing installation branch paths and reams of drivers. Today that's all done in the operating system, where it belongs. But that technology that we "take for granted" comes at a cost. And that cost is what a lot of people criticize as bloat.
People complain that today's 16GB machines do the same things as machines did thirty years ago in 640K. But thirty years ago, it was perfectly acceptable to expect end users to know their PC's memory map and its IRQ settings, and to be able to juggle device driver load sequences themselves.
As to your video card, and your belief that everything happened without some smart people
I never said that. If you can't make your point without lying, your point isn't really worth much.
As for what I was doing thirty years ago, I was working on embedded real-time controllers for fighter and commercial aircraft. I have friends still doing that today. If you think writing DSP software allows developers today to be sloppy, and that efficiency is not a requirement, try writing a landing gear controller interface in 768 bytes.
Of course there were brilliant programmers in 1990. And there are brilliant programmers today. And there were horribly inefficient coders in 1990, just as there are today, too. The idea that the industry has taken a step backwards is a myth. People pining for "the good old days" rarely lived through them.
IOW, many computers weren't expected to be turnkey solutions 30 years ago. They weren't for the masses unlike today. If people pine for the "good old days", they probably pine for the days when people had brains and could remember reams of information and do trig on a slide rule like E. E. "Doc" Smith in the Lensman series.
I have Lubuntu running quite nicely on stuff almost as old as that. The killer is not the OS, though, it's the apps. Chrome/Chromium munch around 500MB for the first tab and 200MB per tab after that. Firefox used to be even worse but has improved enormously recently.
Did you ever get WiFi running? I tried every light distro out there on an old Pentium M based Acer, and could never get WiFi running properly.
And why are you even trying with the stock Wi-Fi card?
If I'm correct, that's Centrino (the Pentium M is part of Intel Centrino) ... And I don't understand why these don't work ... the kernel has their support built in.
You can try a cheap USB Wi-Fi card if you don't want to compile a mainline kernel.
Also Google the specific Wi-Fi card PCI vendor and product IDs.
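The `[vendor:device]` pair at the end of each `lspci -nn` line is the thing to Google. As a sketch, here's how to pull it out of one such line (the sample line below is illustrative, not from the machine in question):

```python
import re

# One sample line of `lspci -nn` output; the [vendor:device] IDs at the
# end identify the exact chip, regardless of what the marketing name says.
sample = "02:00.0 Network controller [0280]: Intel Corporation PRO/Wireless 2200BG [8086:4220]"

match = re.search(r"\[([0-9a-f]{4}):([0-9a-f]{4})\]\s*$", sample)
if match:
    vendor_id, device_id = match.groups()
    print(vendor_id, device_id)  # 8086 4220
```

Searching for that pair ("8086:4220 linux driver") usually turns up the kernel module and any firmware package the card needs.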
Have you tried antiX / MX17? WiFi has worked on all machines I've loaded it on.
It doesn't have systemd, which might or might not be of concern to you.
I had Linux Mint MATE running with wifi on both Thinkpad T40 & T41 and Acer Travelmates, all of them Pentium M. The wireless cards had to be upgraded to wireless G to access the wireless at work, but they worked with both 2GB (max) or 1GB of RAM. No drivers had to be found to get wireless working.
Tried antiX, Lubuntu, and a few others; couldn't be bothered with Puppy as it was for an adult. None could get WiFi running. It's really old, as in the battery is square and sits kinda in the middle of the bottom case (still held a charge tho!).
I believe I left it with Lubuntu and a Cat 5 cable.
I just really hate throwing serviceable equipment away.
Biting the hand that feeds IT © 1998–2018