Just open source it and you'll soon have developers helping out, if it's as quick and as light on resources as claimed.
What were you doing in 2010? The Space Shuttles were still flying, Toy Story 3 ruled the cinemas, and Apple released its very first iPad. Oh, and Linux distro Elive locked down its last stable build. Until now. Designed to run on minimal hardware, Elive is very much a passion project of its leader, Samuel F Baggen. Based on …
At a guess? Either not noticeably faster than a standard OS, or potentially even worse. Assuming it's 64-bit (the initial release was in 2005, so this may be stretching matters), the main benefit of running it over, say, Win10 is that it won't try to think for itself while you're trying to work on it.
Yeah, in case you haven't noticed, OSes haven't actually got quicker over the last few decades, even though they're not doing very much more. Windows 10 was about the first Windows release to actually *improve* performance on existing machines, and even that's debatable.
To be honest, I baulk at the sight of needing 512MB of RAM to draw a few windows on the screen. We're really still doing something drastically wrong. I know there's a lot more to it than that; I know there are myriad background services and all kinds of connectivity and layers and security and god-knows-what going on to facilitate showing those windows. But I can remember when I had to upgrade from 1MB to 2MB of RAM just to show a 640x480 window (when I already had 1MB of video RAM) and I couldn't fathom why that was necessary.
Fact is, we're not making the OS any faster but taking everything modern and paring it down. If you want to do that, you have to throw out 99% of what the OS / window manager currently does that's unnecessary.
We've been doing that forever. Intel would trot out a new CPU that was blazingly fast, then Microsoft would trot out a new Windows that ate all that performance and then some.
Every time a new, much more powerful video card came out, you could be sure that there was a game that would bring it to its knees gasping.
Let's face it: we've only just arrived at a time when low-end models are good enough for everyday, home office stuff. That in itself is quite something, and handily explains why people are not buying so many laptops and PCs any more; they don't feel the need, because what they've got works well enough.
Even though hardware is nowhere near as expensive as it used to be, a hardware upgrade is nowhere near bringing the same impression of increased power we used to have either. It used to be a thrill to put €850 into a new video card because wow, look at how much better my games run. These days I see new €1000 cards or CPUs and think: why bother? For the 5% increase I'll get in performance, it's just not worth it.
> Even though hardware is nowhere near as expensive as it used to be, a hardware upgrade is nowhere near bringing the same impression of increased power we used to have either.
There you go: A 2012 Intel Core i7-3720QM is about as powerful (as benchmarks go) as a 2017 Intel Core i7-7700HQ.
Comparing laptop CPUs? Seriously?
You're not even in the same ballpark. Here...
Two CPUs, both running at 4GHz, from Q4 2017 and Q2 2018... that's a comparison!
Intel Core i3-8350K @ 4.00GHz: $193.15, CPU Mark 9279
Intel Core i7-8086K @ 4.00GHz: $552.96, CPU Mark 17062
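To put those figures in value terms, a quick back-of-the-envelope calculation (a minimal Python sketch using only the numbers quoted above) shows the cheaper chip actually wins on CPU Mark per dollar:

```python
# Price/performance from the figures quoted above:
# CPU Mark per dollar (higher = better value).
cpus = {
    "Core i3-8350K": (9279, 193.15),   # (CPU Mark, price in USD)
    "Core i7-8086K": (17062, 552.96),
}
for name, (mark, price) in cpus.items():
    print(f"{name}: {mark / price:.1f} marks per dollar")
# Core i3-8350K: 48.0 marks per dollar
# Core i7-8086K: 30.9 marks per dollar
```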
1.5 MB 24-bit color Retina card here. And that was on top of a 16 MHz 68030, 4 MB of 32-bit memory, a SCSI controller and a hacked 386SX Bridgeboard. I even hacked it for 1 MB of Chip memory and multiple Kickstart ROMs. There are way faster machines around here (I favor dual Xeon setups) and not one of them is even 100x faster than that Amiga. Pretty disgusting, really.
Don't forget, the stock Amiga had those custom chips in it, which helped a lot with graphics operations back in the day.
I had the pleasure of getting an old A500 with multiple floppy drives back up and running for the owner; I was surprised the disks were still good, let alone the drives.
Getting the IIgs I have back into operation is going to be tricky, as a) it didn't come with an ADB keyboard or mouse (or 3.5" drives), and b) the gits on fleabay want FAR too much money for parts in even crummy shape. ("It's VINTAGE! We can charge $stupid for it and some schmuck will buy it!") Fortunately, there's a guy making modern bridge hardware for the input devices, and a plug-in box that'll emulate every drive ever made for that platform and access disk images on an SD card.
Yes, 640x480x16 comes out at exactly 150KB for the primary frame buffer, assuming 4 bits per pixel.
You would have struggled to get a dual-frame buffer working, but for 2D single buffer, it was ample.
When it comes to things like graphics intensive games, sometimes they used to render a frame in main memory, and blit it out to the graphics card's frame buffer, so for some purposes, it may have been necessary to increase main memory.
Remember that unlike today, where the graphics card can do complete object rendering with texture mapping, light sourcing and now even full-blown ray tracing, early VGA and SVGA cards did not have a huge amount of intelligence, so the main processor often did most of the work.
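As a quick sanity check on the numbers above, here's the arithmetic as a minimal Python sketch (assuming a plain uncompressed frame buffer):

```python
def framebuffer_kb(width: int, height: int, bits_per_pixel: int) -> float:
    """Raw size of one uncompressed frame buffer, in KB."""
    return width * height * bits_per_pixel / 8 / 1024

# 640x480 in 16 colours (4 bits per pixel) is exactly 150 KB...
print(framebuffer_kb(640, 480, 4))    # -> 150.0
# ...while the same resolution at 24-bit colour needs six times that,
# which is why double-buffering it wouldn't fit in a 1 MB card.
print(framebuffer_kb(640, 480, 24))   # -> 900.0
```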
"What does it run like on a modern 8th gen I7, 32Gb RAM, a 4Gb video card, & 250Gb SSD?"
This question makes me think of the scene from Lost in Space (the 1998 movie, not the Netflix series) where they find the ship from the (relative) future and the computers are so fast they struggle to use them.
Also, the fact that Matt LeBlanc, Heather Graham or any of the other stars of that incredibly cheesy movie haven't been given a cameo in the Netflix series seems a missed opportunity.
To everyone that replied, Thank You. Enjoy a pint in gratitude.
I honestly wondered how software designed to run on such low-grade hardware (think a 386 with 512MB of RAM) would run with modern high-grade hardware at its disposal. Like training a race car driver on an old electric golf cart that can't do much beyond walking pace, then sticking them behind the wheel of an NHRA dragster capable of 300MPH quarter-mile rocket sprints. Will the driver hack it, or will they lock up and wind up spreading themselves like thin chunky salsa all over the place?
In a similar vein, will the software run like a champ on a 4GHz multithreaded CPU with gigabytes of system RAM, more gigabytes of VRAM, and an SSD with more storage space than it ever previously thought possible? Or will it see all the possibilities before it and lock up in awe, thinking the computer version of "My god, it's full of stars!"?
The mental image that flashed through my head was of trying to run a game coded for the XT age of computers on the i7/4GHz/32GB monster some folks have at their disposal. The phrase "Blink & you've missed it" kept cackling merrily in my head. =-)p
Fun fact: many old games counted CPU cycles or used some other trickery and timed their AI speed to that - the original Command & Conquer does this, I'm told. That's one situation where DOSBox's cycle throttling comes into its own - otherwise the AI player just magically gets an enormous base.
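For anyone curious what that trick looks like, here's a minimal illustrative sketch (not C&C's actual code) of the difference between game logic tied to loop speed, which runs away on fast CPUs, and logic tied to the wall clock, which an emulator's cycle throttling effectively restores:

```python
import time

def ai_tick() -> None:
    pass  # placeholder for one step of AI logic (build a unit, etc.)

# Old pattern: one AI tick per loop iteration. Tuned for the CPUs of the
# day, so on a modern machine the AI runs absurdly fast.
def cpu_bound_game_loop(iterations: int = 100_000) -> None:
    for _ in range(iterations):
        ai_tick()  # rate depends entirely on how fast the CPU spins this loop

# Fix: tick at a fixed wall-clock rate, independent of host CPU speed.
def wall_clock_game_loop(tick_seconds: float = 0.1, run_seconds: float = 1.0) -> None:
    next_tick = time.monotonic()
    deadline = next_tick + run_seconds
    while time.monotonic() < deadline:
        if time.monotonic() >= next_tick:
            ai_tick()
            next_tick += tick_seconds
```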
I can't really see the use case for a Pentium with 256 MB of RAM. Anything that still needs one will probably also require whatever software was already running on it. Otherwise, a better computer can be had for $5 in the form of a Raspberry Pi Zero, and a better computer with a screen can be had for $25-40 if you look for a used laptop on your used-goods emporium of choice. Power consumption means they aren't even good for places that can't afford modern systems. What's the point?
> I can't really see the use case for a Pentium with 256 MB of RAM.
However, that takes this distribution into a space already served by more dedicated micro OSes and Linux/FreeBSD distributions with better support. A router, for example, doesn't need a particularly powerful processor to handle the UI. Likewise, whilst your smart TV could benefit from a decent CPU, adding one would massively increase the price. [Aside: personally, I prefer my TV to be pretty dumb and leave the 'smart' stuff to the Xbox/PlayStation/etc.]
As for power consumption, we do need to be a little careful here. Whilst an ancient desktop PC with a Pentium may be a power drain, I'm not sure about newer versions of these 'older' chips that Intel may be producing for specific applications.
Of course you are right. There are a lot of good use cases for machines with 256 MB or less of memory. However, my original point with respect to memory boils down to these two:
1. I wouldn't recommend a machine with 256 MB or less of memory as a desktop computer running the GUIs of multiple applications;
2. I wouldn't recommend a Pentium computer (meaning the typical age of chips called Pentium, from when they regularly shipped with 256 MB of RAM) for any purpose. Among the reasons are power efficiency, raw processor speed, memory speed, and the speed and reliability of the disks typically found inside these things.
Low resources are not the goal of Elive but just another of its features, and a very good one in the sense that the system is very fast on any computer and can keep old computers usable.
There are countries, like Venezuela, where people cannot afford a normal computer at all, and there are still many old computers around - in schools, in storage rooms, and running in houses, losing people more than an hour a day of their lives just to check their email on a strange, ugly system called "Vista".
"we'd frankly baulk at running it on anything much slower than a 533MHz Core 2 with at least 512 MB RAM."
I am assuming this is a typo, as I don't believe Core 2 CPUs were produced at speeds less than 1.06GHz. And I wouldn't call a Core 2 particularly low-end anyway. I have a 600MHz Pentium 3 box with 256MB of RAM that usually runs a headless Gentoo install, which I use with VDR and a DVB-T card as a Freeview recording and playback box. But I can also boot it into Puppy Linux with a full desktop, and it runs much better than old versions of Windows do on this machine.
OS programmers should all be sat down in a room and shown a film about that "chess in 1K" program on the ZX81.
It's not necessarily their fault - more the involvement of the higher-level "let's make the system helpful to the point where it tries to do most of the thinking for the user (and misses the mark, of course)" types - i.e. marketing etc.
I'm sure even Windows was nippy before the clipboards, messaging (or whatever it is that keeps feeding ads), update managers, telemetry and all the other junk mistaken for essential services got rammed in.
The problem with the "good old days" is that unlike what a lot of people imagine, computers didn't manage to do all the heavy lifting in less memory and processing power than today, not really. What they did was offload the heavy lifting onto the user.
Yes, I remember using WordPerfect on DOS 3.x, and it was awesome. But if you wanted to see what your document would look like when you printed it, you printed it. Eventually, yes, they added preview mode, which would give you a graphic rendition of what it would look like, but you couldn't edit in it, you had to flip back and forth. And it required a WordPerfect specific driver for the video card. And a WordPerfect specific driver for the printer model you had, too.
In 1990, I replaced a video card with a model from a different vendor, and I had to reinstall something like 18 different applications, from scratch, and then reconfigure them each to use the new card. Today, you replace a video card, boot to the lowest resolution, download drivers from the internet, reboot, and you're done.
There's definitely room for improvement in terms of programmer efficiency, but the idea that we're using a thousand times the CPU and memory as we were thirty years ago and it's all being wasted is a fantasy. Back then, less than 5% of the population had a computer, and there was a reason for that. PCs were very arcane, and difficult for non-hobbyists to use. Today, we get a lot of ease of use under the covers. That takes a lot of resources and effort.
The idea that programmers thirty years ago were more brilliant and efficient than the ones today is completely wrong. I know; I was one of them thirty years ago. Believe me, we weren't any smarter or better than the current generation.
They had WYSIWYG even on BBC Micro word processors, and you could also send a bitmap to print, just like Windows. However, given the printing time for a bitmap relative to ASCII (which would be rendered using the printer's built-in font data, i.e. at no memory cost to the computer), bitmaps tended to be reserved for images. Along comes Bill, buys up a load of fonts, and the next thing you have is Windows-only printers - a quarter of the price, since they don't have to store font data and can have dumber electronics - and a new printing standard.
As to WordPerfect, it was a product that preceded the technology you seemingly take for granted on the PC, namely GUI, cheap RAM, fast CPU and communications. That WP continued to be used after Windows arrived was because it was a standard that, once learned, was quicker to use than even the latest Word.
As to your video card, and your belief that everything happened without some smart people doing work you are clearly oblivious to: there are lots of reasons why things were more efficient in the old days, but the most obvious one is that computers could not be made fast and cheap enough to do otherwise. I could explain why nothing has actually changed with regard to drivers, apart from assumed access to the internet and PnP, but suffice it to say you clearly do not know what you are wittering on about.
I have no idea what you were programming 30 years ago, but clearly you were unaware of what other programmers and the rest of the industry were doing.
Here is a question for you: given that Linux and Windows are based upon Unix and VMS respectively, what exactly has changed since they were invented by the old programmers? The correct answer is cost.
Everything is relatively cheap and plentiful now, and that means programmers can be sloppy where 30 years ago that was not an option on any significant project. Thus writing efficient code is no longer a requirement to be able to call yourself a professional programmer.
> As to WordPerfect, it was a product that preceded the technology you seemingly take for granted on the PC, namely GUI, cheap RAM, fast CPU and communications.
Well, yes, that was largely the point.
The application developers of yesteryear spent great amounts of time and effort developing installation branch paths and reams of drivers. Today that's all done in the operating system, where it belongs. But that technology that we "take for granted" comes at a cost, and that cost is what a lot of people criticize as bloat.
People complain that today's 16GB machines do the same things as machines did thirty years ago in 640K. But thirty years ago, it was perfectly acceptable to expect end users to know their PC's memory map and IRQ settings, and to be able to juggle device driver load sequences themselves.
> As to your video card, and your belief that everything happened without some smart people
I never said that. If you can't make your point without lying, your point isn't really worth much.
As for what I was doing thirty years ago: I was working on embedded real-time controllers for fighter and commercial aircraft. I have friends still doing that today. If you think writing DSP software allows developers to be sloppy, and that efficiency is not a requirement, try writing a landing gear controller interface in 768 bytes.
Of course there were brilliant programmers in 1990. And there are brilliant programmers today. And there were horribly inefficient coders in 1990, just as there are today, too. The idea that the industry has taken a step backwards is a myth. People pining for "the good old days" rarely lived through them.
IOW, many computers weren't expected to be turnkey solutions 30 years ago. They weren't for the masses unlike today. If people pine for the "good old days", they probably pine for the days when people had brains and could remember reams of information and do trig on a slide rule like E. E. "Doc" Smith in the Lensman series.
To my shame, it could easily beat me. "Mazogs" was another triumph of programming in 1K. The sky was the limit once I invested in the 16K RAM pack (plus Blu-Tack to keep it steady). As the adverts of the time explained, it was enough to control a nuclear power station, though I didn't actually try that.
And why are you even trying with the stock Wi-Fi card?
If I'm correct, that's Centrino (the Pentium M is part of Intel's Centrino platform)... and I don't understand why these don't work... the kernel has support for them built in.
You can try a cheap USB Wi-Fi card if you don't want to compile a mainline kernel.
Also Google the specific Wi-Fi card PCI vendor and product IDs.
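If you'd rather not guess, here's a minimal sketch (Linux only, reading sysfs; roughly what lspci -nn reports) for pulling out those vendor and product IDs:

```python
# Minimal sketch (Linux only): list PCI vendor:device IDs from sysfs.
# Look the pair up online (e.g. 8086:4220 is an Intel PRO/Wireless 2200BG)
# to identify the exact Wi-Fi chipset before hunting for driver/firmware.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    vendor = (dev / "vendor").read_text().strip()  # e.g. "0x8086" (Intel)
    device = (dev / "device").read_text().strip()
    print(f"{dev.name}: {vendor[2:]}:{device[2:]}")
```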
Tried AntiX, Lubuntu, a few others; couldn't be bothered with Puppy as it was for an adult. None could get WiFi running. It's really old, as in the battery is square and sits kinda in the middle of the bottom case (still held a charge though!).
Believe I left it with Lubuntu and a Cat 5 cable.
I just really hate throwing serviceable equipment away.
I had Linux Mint MATE running with wifi on both ThinkPad T40 & T41 and Acer TravelMates, all of them Pentium M. The wireless cards had to be upgraded to Wireless G to access the wireless at work, but they worked with either 2GB (the max) or 1GB of RAM. No drivers had to be hunted down to get wireless working.
Just this past weekend, as chance would have it, a live CD with the previous iteration of Elive on it was the first fossil I found in the drawer when I had to try and reboot an old Mac Pro which had the flu. Booted, ran, GUI, the lot. Love it! I'll definitely give this new-fangled version a twirl (knowing I can wind it back five years if the need arises, and I shall be happy as a sandboy).
I recall it was a buggy, crashtastic mess, but with some "cool" screenshots. Fine if you didn't actually have to use it for anything, I suppose.
Seems others have noticed the bugginess too.
This one best left on the compost heap of history.
That's nice ...but, in all seriousness, why bother?
It's based on an older version of Debian (rather than the current one), and is running an old kernel.
Not wishing to knock anyone's own pet project, and the effort that they have put into it, but really the last thing the Linux world needs is Yet Another Distro.
I can't help but think the developer's time would be better spent working alongside the developers of one of the other lightweight (and relatively popular) distros instead, to magnify the power of all their efforts into making one distro better.
There is definitely a place for Linux distros of varying degrees of lightness, but I'm not really sure that a one-man-band (with what sounds like not a lot of spare time) can really make a go of it.
But fair play to him for having a hobby that he enjoys, at least.
As you said, it is a hobby. There are just a handful of major Linux distributions that can be considered for real work, or even for home use for "ordinary" users (meaning those wishing to mainly use applications as opposed to those enjoying developing the platform itself). All the rest are really hobby distributions, or very specialized ones filling some small niche.
Making a Linux distribution that fits into a "small" machine is perhaps akin to building a ship in a bottle. I sometimes wonder myself whether a modern kernel could ever run on the oldest machine in my house, a Pentium MMX with 128 MB of RAM. It currently has an old Mandrake Linux on one partition (with a KDE GUI) and Windows 95 on the other.
"It's based on an older version of Debian".
Not quite old enough unfortunately. That's when poor old Debian got itself fully encumbered with systemd. Wake me up if the next version migrates to Devuan and becomes systemd free.
I'd give it a whirl then for sure.
I still have it - £699, I remember, bought for the wife, who hated it; she loved the £80 Kindle Fire I bought her a few years later. It still works perfectly, with a long battery life, but there are ZERO apps that I can install from the App Store, so it's effectively useless.
It is a tablet running a Unix variant, after all. You could jailbreak it and install something like QMole (an abandoned collection of GNU and other *nix utilities ported over, including X), or maybe an RDP/VNC viewer, a remote control; heck, lots of things are possible with a terminal and that battery life!
God, you could even sell it - it wouldn't earn you much, but at least you'd save on storage space in your drawer.
But as we're techies here, possibilities abound.
> but there are ZERO apps that I can install from the App Store, so it's effectively useless.
Well, for some apps, if you have a newer iOS device you can install the app on that device first and then install it on the iPad 2; this process seems to give you access to the last compatible version of an app, if one exists.
That said, I've had similar problems with Android; I made the mistake earlier this year of doing a factory reset on a Samsung phone running Android 2.3.4 and then had the fun of reinstalling updates and apps...
In general, whatever distro you already use, minus some stuff that requires a lot of disk access. If, for example, Ubuntu is your wish, just install it, pick a desktop that you like and that doesn't lag much when you run it on the oldest computer you have, and install the utilities and applications you'll use when the machine you're plugging your USB disk into isn't connected to the network. I'd suggest making a partition on the disk for general data storage that can be safely mounted and written to by other systems, so you can continue using your USB drive as a drive and for dealing with data stored on encrypted disks you can't mount when booting directly, and then you'll have all you need. With very few exceptions, any Linux distro will run well enough. Some desktops use a lot of resources, especially disk, so they might not be usable while retaining your patience, but there are many, including MATE and KDE, that run perfectly well.
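For the shared data partition suggested above, here's a rough sketch of one possible layout (assuming a hypothetical /dev/sdX stick; the device name, the tools and the 12GiB split are illustrative only, and it needs root):

```python
import subprocess

DISK = "/dev/sdX"  # hypothetical device - replace with your actual USB stick

# Sketch of the split described above: the first partition carries the
# installed distro, the second is a FAT32 data partition anything can mount.
commands = [
    ["parted", "-s", DISK, "mklabel", "msdos"],
    ["parted", "-s", DISK, "mkpart", "primary", "ext4", "1MiB", "12GiB"],
    ["parted", "-s", DISK, "mkpart", "primary", "fat32", "12GiB", "100%"],
    ["mkfs.ext4", f"{DISK}1"],             # root filesystem for the distro
    ["mkfs.vfat", "-F", "32", f"{DISK}2"],  # shared, safely writable anywhere
]

for cmd in commands:
    subprocess.run(cmd, check=True)  # abort on the first failure
```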
Well... I've got a 1GHz Pentium III machine I wouldn't mind finding a use for. Granted that it's probably not worth it, but it's such a cute little compact mini-desktop that I can't stand to toss it.
I have to grant that because I just (last weekend) installed Mint Linux 19 Cinnamon x64 on a 5ish-year-old Lenovo laptop (i3 2.53GHz, 8GB RAM, SSD-240) and I'm running a personal Subversion server and remotely accessing it via VNC as I type this. It's sitting on an otherwise empty shelf in my closet and connected via 2.4GHz WiFi.
So the Cool Gadget Nerd in me wants to run this low-resource-requirement Linux on the PIII, but the Practical Geek keeps reminding me that I have another i3 laptop looking for something to do other than collect dust, and why do I need an ancient PIII desktop computer again?
You can make your OS as lightweight as you want, but if you want to browse modern web pages, Firefox/Chrome/etc. are going to chew up at least a couple of GB of RAM on their own, and make you sorry you tried if you have less. I only wish there were a lightweight browser out there that could render the modern web. Now THAT would be a huge productivity boost for everyone.
Just when you were ready to take out your gun and help that old hardware out of its misery, someone comes up with another tiny distro and wants you to spend hours trying to keep it alive.
Since processors are not sentient I'd suggest they deserve recycling, rather than retirement on the back-burner.
Tried two times to download Elive the correct way (the last time just two days ago). Both times I waited the hour for the download, but sadly both times no download happened. Sad to say, after the frustration of two failed attempts, I've honestly become convinced that Elive is a FAKE!!!! If the hard-working guy wants to make Elive a success, then why is he putting people off it? I love open source (Ubuntu etc.), but why do I feel discriminated against, twice, in trying to download this new open source offering?