58 posts • joined 12 Mar 2008
I used 3.5-inch MO drives from the '80s to the mid '90s and they were highly reliable if you used the right suppliers. As one example of a failure mode, I had the optical lens fall out of IBM drives from rough handling. Don't remember any media failures.
The old 3.5-inch drives are impossible to get now, and any current drives may have a similarly short production life. In the '90s I moved all the data onto CDs and DVDs for archival purposes.
I would go with LTO if I had the money to spend, but a good RAID NAS with USB backup is a more cost-effective solution. If you have two physical locations with fast Internet between them, rsync or similar tools can be used to copy data between NAS units. I do this every night with my company's critical data.
I do not consider the NAS and USB drives produced by the major disk companies as quality units. They quite often have poor cooling systems and will burn drives up if used 24/7.
Re: We told you it was shit
Sorry to disagree, but most modern GUI design tools allow you to create one action (like Save As...), attach it to a handler, and then connect that same action to a menu item, a toolbar icon, a ribbon icon, etc. as many times as you like. Change the action's properties in the program and the change automatically propagates to all the linked items. You can also easily show or hide all the linked items. No muss, no fuss and no expanded test matrix.
So I will be lucky to be 70
to see something not quite as fantastic as what I watched as a young boy on TV!
So now the idiot
who almost ran them into the ground before Jobs came back thinks they will listen to him? Perhaps when Elop is fired from Nokia, they can hire him as the new Apple CEO to take them into future markets. What a bunch of bull!
Re: Some comments from a small network manager (spare time)
Very sorry for the long delay getting back. I've mostly got everything deployed currently.
Yes, I'm crazy enough to use the Mac Mini for virtualisation. I wanted something simple and easy to maintain with a low power footprint. I'm not running any heavy applications or large data stores on these servers; otherwise I might do it differently. I plan to shut down the VMs and periodically copy the images to the QNAP. Retrospect does nightly backups through a client on the VMs. We have a Time Machine backup for the Mini itself, with the VM directory excluded from it.
I've tested the Retrospect bare-metal restore and it appears to work well, but I haven't done a complete test on 2008R2 yet. I'm mostly worried about the data stores being backed up. Restoring the OS image gets me halfway there and the nightly Retrospect data backup gets the rest. The nice thing is being able to move it to other hardware if the Mini breaks.
I prefer to keep our web site on a VM at the ISP. The fewer hackers we attract to our internal corporate IP address the better. I'm not opposed to running our own web server, and we do have a QNAP configured for high security on a DMZ to support file transfers for customers and our field service. We also use this DMZ to repair returned computers in case they carry a virus that could spread onto our normal network. The Netgear ProSafe firewall makes it very easy to set filters on which devices on our internal net can communicate with the QNAP on the DMZ.
Some comments from a small network manager (spare time)
Thanks for all the great advice Trevor.
I help manage the IT equipment at our small company (about 25 people) and am currently doing some upgrades. We build industrial control systems but our IT infrastructure is not as modern as it should be. Here are some items I am using:
Our Server 2003 system, running on a Supermicro motherboard with a Pentium 4, is being upgraded to Server 2008R2 running under VMware Fusion on a Mac Mini. I would have gone with Server 2012 Essentials, but MAS 90 (yuck, not my choice) doesn't officially support it.
Our Linux file server has been running for about the same amount of time (old Fedora system) and has suffered several hard drive failures. I'm upgrading this to a QNAP TS-669L running the new WD Red 3TB drives in a RAID 6 configuration.
I use a Win XP system running Retrospect (Windows multi-server version) with several 2TB drives to back up all our computers; this is a disk-to-disk backup. An LTO tape drive is used every so often to move the backups to tape for offsite storage. I wish this were better, but no one has time to do it right. We have too much data to pump it to a cloud solution. I just wish all the email clients out there were like Opera and used a group of small files instead of one great big file of all the emails; this would reduce our backup storage greatly.
I've deployed several other QNAPs and have found their RTRR sync features very nice for copying file shares to/from our branch office. Our main office has a Netgear ProSafe SRX5308 firewall, which provides VLAN separation between our internal office and service networks with near wire-speed packet filtering. This high-end router is really needed to match our Comcast 50 Mbps service; lower-end routers would not give full-speed filtering. I can also limit the branch office QNAP to respond only to our fixed IP address, so hackers won't be trying to hack it all the time. I don't VPN our branch office because many services don't run reliably over VPN and I don't have time to handhold users much.
Our web page and email are provided offsite by a hosting ISP to keep this traffic away from our company firewall and reduce my maintenance headaches. I would highly recommend the Mac OS X program Sandvox to anyone wanting to maintain a simple HTML5 business or personal web site. While we use a CentOS VM at the ISP with WordPress for our company site, I wish I had seen Sandvox sooner. No one wants to keep the WordPress site up to date, so there it sits, just as it was over a year ago.
Computers were not the DNA of the early HP
Hewlett and Packard started out making scientific instruments. Their venture into computers was the buyout of a minicomputer company to supplement their calculator business and help run instruments. Toward the end of the '70s it gradually grew into a small-business product line with small timesharing systems. Engineers and scientists loved the instruments, and the computers were quite good for running experiments.
They trashed all that when they purchased Compaq to become the #1 PC maker, having spun off the instruments business as Agilent. The innovation was lost and they turned into simply a PC and printer company with plenty of competition to worry about. They lack the DNA they once had and will never be the same.
The box makes it look like I'm entering the Twilight Zone!
Yes, and had I purchased a Nokia WP Lumia, how do you think I would feel about Ballmer instantly bricking it by not allowing an upgrade to WP 8? What a POS phone! Who would ever want another after this experience?
Phone 8 will also support native code in C and C++
So does this mean the Qt framework will run? I thought they were not allowing this, but if you can take a desktop C++ program and run it on the phone, why not Qt? Ballmer and Elop need to explain this.
Re: Nokia needs Microsoft but does MS need Nokia?
MS already owns a good part of the IP. Elop sold much of it to a patent troll and both Nokia and MS retained rights to use without paying fees. MS didn't even need to pay Nokia anything for the rights.
Metro SP1 for XP and 7?
Come on, Ballmer. We must have Metro everywhere. Why not use Windows Update to push a Metro SP out to all XP and 7 desktops? No need for users to have any say. After all, it's the future, and it makes the computer more secure because users won't know how to run anything, so they won't be able to break it anymore.
MS collects $5 to $15 for every Android Phone. How'd they do that?
MS likes to give money to others and let them do the dirty work like the SCO/Linux fight. This is just more of the same.
Re: Good discussion -
Yes, till the Brits started the Opium Wars and then it all went to shit for the Chinese...
subduing the locals who were living there
We Martians welcome our new Intergalactic Federation King Almighty and Commander of the Universe! (Where did you put that Earth destructor beam, Exigius 12½?)
Re: Leyden jars
"electric aversion therapy and subliminal messaging" = watching TV with commercials
And after the sales associate got done filling out all the paperwork on the return and got no extra commission for selling them a new Android, how do you think they would feel about selling another WP7?
You forgot to mention Crimes of Passion which was one of my personal favorites along with The Devils and Tommy. RIP Ken. Your film style will be sorely missed.
"now they are much more keen to get the boffins - and teachers - in"
"spam in the can"
What about disk tools?
I guess this means the Acronis boot disk will not load when I need to restore a virus crapped out Microsoft system. But oh wait, Windows 8 has the imaging feature built in. Clever those people from Redmond. Back to their usual monopolistic ways I guess.
My ISP disables all EXE downloads by default. Sounds like the BOFH didn't know what they were doing, but then again, I see many similar things at companies I visit.
Distortion is not Loudness!
Compression by itself is not the problem, because properly done it doesn't add distortion. Almost all newly mixed recordings push the sound through hard limiters, which apply flat sections to the tops and bottoms of waveforms. They don't always do it to every instrument track, either; usually just the bass and possibly the drums. This causes distortion and listening fatigue.
In the old days of the LP the sound was always modified somewhat. You couldn't pan low-frequency bass too far off center or the needle would skip. There were also "S" (de-essing) filters to remove the high frequencies from vocals, which could otherwise cause adjacent grooves to bump sideways into each other and cause skips.
When CDs and digital came around this all went away, but other problems hit. If you recorded a quiet section several dB down, you weren't using all 16 bits. For example, halving the volume uses only 15 bits; drop to 1/16 volume and you are at 12 bits, which makes quiet classical passages sound not quite as good as they should. Thus recordings were always pushed as close to 0 dB as possible, and compressed somewhat as well.
Another problem with CD reproduction has to do with the filtering going on to reduce digital artifacts in the output. The early filters didn't do a very good job and if you pushed the sound up near 0 dB then you could get output distortion with certain waveforms. More modern circuits have solved this problem somewhat. The 16-bit CD is not really as good as they make them out to be.
The real problem is the distortion present in most modern pop music. Push an amplifier beyond its limits and you get clipped waveforms and noise, but at least you can fix that by lowering the volume. Modern recordings have this "feature" added by the sound engineers and there is nothing you can do about it, except refuse to buy the product, unless you enjoy playing distortion on the expensive sound system you spent many $$$ on precisely to avoid it.
Oh, and in my opinion the Grateful Dead Wall of Sound was not so much about loudness but about low distortion quality sound. Most of the PA systems at concerts of the time were total crap and the Dead were simply taking the control into their own hands. Many sound systems were rented locally rather than coming with the band. I was quite impressed when I saw them back in the 70s. Quite nice seeing the topless fans dancing away to the quality sound!
A while back it was communism and a cancer
but now that they can make money with open source, everything is just fine.
That was a joke
As Lewis appears to like to downplay the seriousness of the situation at Fukushima, I was simply joking when I suggested he act as a cheerleader for the workers. I do have a physics degree and had a chance to intern at Hanford but declined. There *WERE* likely zirconium fires that caused the hydrogen explosions in the early days, so there.
My main problem is with plant operators and pro-nuke industry people who don't look at the full range of issues, such as forgetting to pour water on the spent fuel rods and placing the emergency generators at an elevation low enough for the tsunami to wipe them out. I've even heard they delayed pumping cooling water at the start because the PM paid a visit and they didn't want radiation steaming around while he was there. Had someone been thinking about these two simple-to-fix problems, perhaps this mess wouldn't be so bad.
If the reports yesterday about shipping concrete pumps are true (and I believe the one I read) then TEPCO is going to simply bury the problem if they can't get it under control soon.
Go help them Lewis...
"As the situation at Japan's Fukushima Daiichi nuclear powerplant slowly winds down"
Meanwhile, Russia is sending an Antonov 225 cargo airplane to pick up a 190,000-pound concrete pump from the US Savannah River Site MOX plant so it can be flown to Fukushima. They don't plan to return it as it will be too hot when they are done with it.
I also understand TEPCO is hiring nuclear workers/engineers from the US at high rates of pay to work on the problem. Perhaps Lewis Page would like to donate his services to the effort to help bring it under control. He could act as a cheerleader to all the men as they pull on their suits to go put out the nuclear fires. His cheery attitude should boost their mood and make them want to work two shifts in a row.
Paris because she would know how to motivate the workers.
According to the New York Times...
the exposure to their feet was 2000 to 6000 mSv, a much higher dose than initially reported by their upper body monitors. This level is high enough to cause skin to slough off after a few weeks. Meanwhile, they may not have any symptoms. Perhaps you are the one who needs to read up on things.
Someone is talking bullshite here!
Mr. Page: "Their personal dosimetry equipment later showed that they had sustained radiation doses up to 170 millisievert."
IEEE Spectrum article: "The three TEPCO subcontractors were laying electrical cables in the basement of the turbine room behind the No. 3 reactor building when they stepped into water contaminated with radiation, and received doses of between 173 and 180 millisieverts."
New York Times: "It said that the amount of radiation the workers are thought to have been exposed to in the water was 2 to 6 sievert."
How could the NYT get it so wrong? If the dose was that high it is very serious. Beta burns are not like sunburn: beta radiation goes much deeper into the tissue and can cause more damage, but the exact nature of the damage depends on the type of emitter, and we don't know the details yet.
I'm willing to bet that the initial 170 mSv was based on their wearable detectors which were higher on the body and not near the water. The direct dose on their skin could be much higher by the factor reported in the NYT.
My opinion is the manager who sent them into this area with boots that were too short should be fired. TEPCO's handling of this situation has been very poor from the start.
Perhaps with this statement Mr. Lewis Page will finally be shown up as a fool.
UAC fails too
A few weeks ago I repaved our salesman's laptop, which ran Vista. He had clicked a link from a Facebook "friend" telling him about a photo of himself on said site. Vista UAC didn't slow the malware down a bit. It installed two DLLs which claimed to be related to gaming. When I tried to disable their autorun registry settings, they just re-enabled them. His AV didn't detect them, and some other AV scanners I have detected only one of them. I didn't want to nuke his hard drive, but once one of these things gets onto a system I don't trust it anymore.
UAC is not the answer.
Does it do multi-core compiles?
In my case (C++ because .NET is too slow for image processing) I love how Qt's Creator IDE makes use of all 4 of my cores while compiling with cl. The MS compiler is clearly compute bound most of the time so I really do see a 4x improvement. I know there are some third party add-ons for VS, but I find that small is beautiful. The only thing I miss is the class browser, which I can live without.
Also interesting to know: how many third-party tools are broken and will take a year or so to be updated for the new IDE? I avoid such kits where possible because of this issue.
Not a new thing that Mark 9
"Mundanely the Mark 9 seems to have been intended for torpedoing ships at anchor or in harbour from a distance"
PBS's Nova TV science show recently had a special about the five mini-subs Japan turned loose in Pearl Harbor during WWII. Four were lost but the show had convincing evidence that one got through and let go with a torpedo that helped sink one of the US ships.
The sad thing
is that the CIA had to ask the French for help to prove the guy was a fraud!
Lean and mean C++ IDE
Nokia's Qt Software built a complete IDE called Qt Creator in about a year. It's really fun to watch all 8 cores on an i7 run the command line cl and fly through builds. It isn't perfect and relies on a low level MS debugger, but otherwise is lean, mean and has plenty of useful features that can be easily configured to your liking. You can get it and the source code under LGPL.
The trouble with MS is that they have too many legacy APIs to support, and with everything piled into VS 2010 to support them, no wonder it's slow. Not to mention that .NET has memory problems because of the way it handles virtual memory.
Reminds me of my very first virus
This brings back fond memories. Back in the 80's I went to my local Mac store and copied their new OS disks to install on my Mac (that being the way upgrades were done in the early days). After that my machine started to act funny. Had to go and buy my first AV which at the time was Norton for Mac. Never managed to get a virus on my DOS PC before then, but didn't have Internet either.
I recently used the Qt Concurrent framework to speed up a statistical calculation (lots of floating point but not much memory access). Basically you break up the problem, put each part's data into an object that also has a method to process the data, and turn a list of these objects over to the framework. It was rather fun to see my quad core hit 100% CPU and solve the problem in 1/4 the time. It is loosely based on Google's MapReduce process. Breaking the data into objects also makes it easy to keep the multi-threaded access running without locks.
As I see it, the main problems yet to be solved are (1) the ability to run fast on all cores when doing memory intensive operations such as image processing and (2) the ability to run multiple short operations where the thread scheduling time becomes a problem. The first is basically due to hardware limitations and won't be solved in software. The second may be solved by better thread scheduling or programming language extensions.
It may not be Mono that causes the patent issue
but suppose someone extends Mono into another area, such as an interface to some media product that MS has a patent on. One must be very careful how applications built with Mono are used and interfaced to. It isn't so much the use of Mono or Linux themselves that causes patent problems; it's how you put things together to make a product (with things like VFAT) that causes trouble.
@iAPX432 and all that
Yes, the good ole iAPX432. Ran slower than shit downhill. Rumor is that the CIA let the Russians have the design to set them back 10 years on trying to copy the microprocessor. Great idea!
When are ideas an invention?
"We file patent applications on a variety of ideas that our employees come up with," reads a canned statement from the company. "Some of those ideas later mature into real products, services or infrastructure, some don't."
The patent office has clearly gone crazy! They should go back to requiring an actual physical invention that works. The possibility of patenting an "idea" without full disclosure of a practical implementation is crazy. I recently looked at a patent from the '80s that didn't work and was also vague on the implementation. When someone later comes along and provides an actual working system, such a patent only allows the trolls to take it away.
Not that this Google invention wouldn't work. It just sounds like they like to file patents for the fun of it.
@INI vs Registry
"If your applications are so unstable..."
Actually they are quite stable. Millions of $ of product are processed by the programs every day. I just don't see the point of using a bloated registry that stores the data in a format that isn't easy to edit without the system running. Perhaps it's better than putting the data in a section of the WIN.INI file, and if I needed a tree structure I would possibly use it.
Have you ever run the Sysinternals Procmon program and looked at the amount of registry access going on in an idle Windows system? Many programs waste tons of CPU cycles reading the same registry keys over and over again.
"Idiots like you are why Vista has had so many problems with old apps. I bet you think admin access is a given right too."
Yes, my apps require admin rights due to several factors such as using real-time priority. They run critical machines and must do so 24/7. Yes, I could hack the local security settings, but it's just easier to run admin. Access to the machine is controlled so users can't muck it up.
Actually, the reason Vista has so much trouble with old apps is because a bunch of idiots wrote it! You can't add security to an insecure API without breaking old apps. You also can't clamp down too much or people will move on to something else that allows them to get the work done without the complications. Just say NO to UAC!
There are reasons we programmers don't use the registry...
When you need to make it easy for field techs to back up software settings, an INI file saved in the program's directory works quite well. Try explaining to a customer halfway around the world, or to a field tech, how to back up your specific branch of the registry. The registry is also very bloated on many computers, which results in slow access. INI files are also easy to edit if that becomes necessary.
I'm no Linux fanboy, but I'm using the Qt framework so I can switch if necessary. I only received the final Vista drivers needed to run our flagship product just a few weeks ago and am not very trusting in how our hardware will continue to perform on Win7/8. I'm tending to pick only vendors that support both Win/Linux just to be on the safe side.
Not that Linux doesn't have problems too. Almost every release breaks drivers due to low level kernel API changes, but the user level API has been fairly stable. And I'm not forced to move to a new version as often or troubled with activation or DRM.
My only question about building a VM Win XP into Win 7 is when will the EU decide this is bad for other VM vendors with similar offerings? Will the XP VM be added for free or be an extra cost item? And considering VMware has much better USB support, how about letting them shim their VM into Win 7 also? Would MS be willing to allow this?
Fire at 41,000 feet
My first flight on NASA's Kuiper Airborne Observatory was interrupted by a smoking HP power supply. The telescope data system was all HP equipment (2100 mini-computers or earlier) and something shorted out during the flight. As the smoke increased everyone started to get a little worried; fires on airplanes are not fun. Everyone went back to their seats, and someone grabbed a fire ax and started looking for some equipment to attack. The system had a ground-fault protector that finally tripped and shut down the whole system. We had to land with the telescope enclosure open (a huge open hole in the side of the airplane), which had never been done before. Not exactly the best flight to start out with.
This is a link to some photos I took on the airplane:
Or just use Knoppix and partimage
Backup to network but some command line needed...
RCA and TV
Yes, RCA tried to rip off the TV inventor, Philo Farnsworth. They brought in their own inventor and tried to work around Farnsworth's patents. After a long battle, Farnsworth won, but never made much money. Wonder if Ballmer knew about this?
Are Allen's shares common stock?
"As part of that deal shareholders have been frozen out and will receive nothing"
Just wondering what Mr. Allen's exposure is?
Got a new variation Jan 1
Got it from a hotel wifi. I run XP Pro and was patched up to SP2 but not SP3 because it caused problems with VMware which I need for development. Symantec Endpoint Protection didn't even see it. Neither did McAfee on my wife's laptop. I noticed I couldn't go to McAfee.com but thought it was down at the time. No warnings at all from the AV.
Spread to my work machines on Jan 5 and started to attack other work PCs. We use McAfee Total Protection at work and I figured it out by checking the logs and saw lots of buffer overflows being blocked. Had to use the Linux server to surf to the McAfee console. By luck it didn't spread to two customers I visited while infected.
Running scans with either Symantec or McAfee failed to remove it on all PCs. Only F-Secure's removal program worked 100%. The problem is that you must remove all infected PCs from the network, disinfect them, then patch them before reconnecting, or you will get reinfected. Miss one infected computer and you get to start over.
Almost makes me consider running Linux with VMware to run Windows with disks set to read-only. What I don't get is how Symantec and McAfee downplay it.
Drivers and Bloat
Can't copy network files fast because the network stack is throttled so as not to break up audio playback. Sounds like driver and kernel bloat to me. XP does audio quite well and is not throttled, thank you.
What MS needs to do for Win 7 is release an XP level for business without all the disk indexing and other services that cause slow operation. They had to disable all this junk for Server 2008 because MS must not let the new server run slow like Vista.
And I'm still waiting for Matrox MIL 9 with Vista drivers. Many large companies are only now releasing official Vista drivers. The driver SDK model changed from XP, adding to the delays, and there's a whole new TCP stack.
Good news is Linux will soon run standard GigE machine vision cameras (see http://www.baslerweb.com/beitraege/news_en_85526.html).
Are they sure?
"The Gulfstream Vs are particularly interesting to us because they usually fly between 40,000 and 45,000 feet, which is higher than most planes fly and we'll be able to get very interesting data that we've never collected before at those altitudes."
When I was there around '74 they were flying a scientifically instrumented U2 which I believe they still have (but may not fly now due to the noise). I'm not sure I buy this line....
And then you have an idiot screw it up
Well the bill just passed so we will see where it goes. The following actions are why the models will never work:
Insurers dive on Reid's 'bankrupt' quote
Paris because she really knows about screwing things.
Can't use a keyboard because of war injuries?
Then he won't be able to punch the codes into the nuclear football to send off the bombs!
Another reason to elect an expert like myself. Let's go a-hunting for Osama bin Laden in Pakistan!
The club card
Interesting that the club card uses the old police DUI photo from when Bill was young and impulsive.
To borrow a little from recent political events: Vista is still a pig with lipstick in my opinion.
The Qt framework lets you do things like this
QMutexLocker locker(&mapMutex);  // scoped lock; mapMutex is a QMutex guarding the map
MyObject obj = myHashMap.get(key);
Result: same as "atomic" and just as easy. Because the QMutexLocker releases the lock when it goes out of scope, the lock is released even when an exception occurs.
I'm not sure what all the fuss is about.