Maybe there should be a minimum spec.....
for the quality of the content !
2394 posts • joined 24 Mar 2010
"I find it very frustrating to have to keep changing tabs every few minutes."
I find it very frustrating to have to keep changing tabbies every few minutes - shirley
"The biggest problem with spreadsheets which I've come across is that they are really easy to modify accidentally and very difficult to verify."
So those of us who use them extensively have to develop working practices/mitigations to ensure that we do stay safe.
If any data transformation becomes a standard requirement I code it in C.
"Spreadsheets are useful for simple stuff but the problem becomes such simple stuff grows without proper design."
Spreadsheets are useful for very complex modeling and calculation - the caveat is that they are not short cuts: they require as much thinking as anything else complex, and need as much care and checking as necessary. For example, when using large sheets to manipulate/calculate large datasets I used to be paranoid and always added large amounts of known good data with known outputs into the set as a check. All of this is especially important if macros are used - I try to avoid their use personally.
"I'm beginning to notice a trend: global evaluative statement about oOo/LO but then when engaged with I can never get any detail..."
As someone else mentioned, Office is one of MS's cash-cows - they, and their 'friends' shall we say, will probably do a lot to protect it.
A further note on performance. I used Excel extensively for data manipulation/modeling a few years ago but when I first started using OO I was disappointed with Calc's performance. It was much slower than Excel for the size of dataset I was manipulating. That's all changed - it's now very fast.
"possibly doing egg-sucking tutorial here but is that calling sine with same argument or random/varying argument? LO and Excel can cache results &c."
That's having 400000 sines, each dependent on a previous cell's value, then changing the first cell manually (but at random) and waiting for the last cell to change.
So as far as I'm concerned it forces 400000 sequential sine calculations plus a spreadsheet output refresh. I'm sure better benchmarks could be used but for the sort of numbers I use (~100K rows with calculations this complex) it's plenty fast enough. Anything more complex and I have a number of programs written in C to handle the data.
Mostly retired so I don't need this so much but I do find it all so amusing when people complain about LO's speed.
"For me, the performance difference is real because the startup time of Libre office is much longer than MS office."
Try it either with the internet connection turned off or by loading an .xls file - I certainly have this weird problem when travelling (not on my home wifi) where starting LO whilst connected produces a VERY long lag before the sheet is ready. It's easy to work round and only applies on first opening of an .ods file.
From memory it seems that it goes looking for my networked printer - which at home it finds. As to why, I've no idea.
"Also 350k spreadsheet loading over wifi from fileserver ~2-3 secs."
That time includes starting LO BTW
"1 million row Monte Carlo simulation in LO Calc 4.x, re-calculation time around 25 sec, same in MS Excel 2010, around 10 sec (core-duo/3Gb ram) so yes slower on bulk arithmetic I'll grant you."
Can't compare with Excel but 400000 sine calculations here are too quick to measure. Also a 350k spreadsheet loading over wifi from the fileserver takes ~2-3 secs.
That's on an 8GB i7 OpenSUSE 13.1 LibreCalc 4.1...
"...that one started life as a Fortran programmer..."
May the FORTH be with you
The ~200kW figure is just the 270HP figure mentioned in the video (270 HP x 0.7457 kW/HP ≈ 201 kW). However that seems very high: each of those small fans would need to be 6kW, which seems a very high figure for a fan a few inches in diameter.
"it takes roughly 6.25 x 10^18 of these per second to make 1 amp."
The 6.25 x 10^18 electrons is 1 coulomb. To quote, admittedly, Wikipedia (and I've not had a chance to check this):
two negative point charges of −1 C, placed one meter apart, would experience a repulsive force of 9×10^9 N, a force roughly equal to the weight of 920000 metric tons of mass on the surface of the Earth.
This goes some way towards explaining the problem of banging up too many electrons in close proximity - hence the requirement for balancing positive charges, which add to the mass and the volume - hence someone's comment about 'the laws of Physics'.
"You could now easily get, ohh, say fifteen minutes out of the current device, although not on a public road of course."
270HP out of a cigar lighter socket - WOW !! (~200kW)
'lightweight cable' vs 200kW - WOW !
"a very simple device quickly put together with off the shelf parts as a publicity stunt by a small company "
This small Romanian/USA company has kerosene/LOX rockets capable of sub-orbital flight (so their publicity states) ! (https://en.wikipedia.org/wiki/ARCA_Space_Corporation)
Having the first flight made by a PR person - don't want to waste a skilled test pilot !
'B' Ark next !
"The Tesla model S charges in under 10 hours from a standard outlet, so you are looking at a substantial power bill for 6 minutes of fun."
I'm not sure what you are trying to say - without intending to be patronising: not all batteries are equal. The battery in the Tesla has many, many times the energy storage capacity of the 'hoverboard', hence it needs far more energy (and a longer time) from the charger.
"A failure to do that, the arrogant attitude of "Microsoft Knows Best", will be the end of MS products on all these systems."
No point just talking about it - you need to do it !
"I have a lot of material in DV AVI format, and a growing amount of material in HDV.I need to capture both formats."
Not a lot of help I'm afraid. Kdenlive should capture a number of formats, including DV AVI & HDV, over FireWire - did you want to use USB? No real experience, as I changed from tape to flash quite a while ago. There were certainly several other programs that would capture via FireWire. Some people seem to have had success with USB using the program dvgrab, which I've also used for FireWire - I think that's its default.
Note there should be no blank lines between the filenames - I can't seem to get rid of them here !
Can I suggest generating smaller files - say ~15 mins - and then concatenating them by creating a file, e.g. test.txt, in the same directory, consisting of one line per clip in the form file 'clip1.mp4' (the filenames here are just examples).
Save it, cd to the same directory and run ffmpeg :
ffmpeg -f concat -i test.txt -c copy output.mp4 - that will join the .mp4s and sort out the timebase
One Kdenlive gotcha I've just remembered is that I find that kdenlive, ffmpeg, melt and mlt need to be from the same repo (Packman).
"It would be extremely bad if web servers needed rebooting more than once a year or so since they've been pretty much doing the same kind of things for decades."
Sorry, I should have explained. My 'fileserver' is also a print server, compute server (for scientific software), the SSHD entry point for my network, and is also used for video transcoding (1080p/50 -> 720p/25) as well as a load of misc server tasks including a daemon for my PIC micros and a media server.
I repeat: Kdenlive often used to crash but has been rock-steady for ~3 years (for me, on my systems). I have some form in running high-intensity software, as ~12 years ago I was running protein modeling software (on RH Linux) on a dual Xeon at ~100% CPU for days.
"Surely software on all systems crashes and has random bugs"
Maybe so but, for example, Linux runs my fileserver 24/7 for months, only rebooting for kernel updates or power-cuts.
"If the author was using 0.9.10 they would likely hit troubles. "
Just to note I'm using 0.9.10 without any issues. I also rendered a 4K video yesterday just to try it - no problems.
Certainly Linux has been 'blessed' with many authors trying their hand at software. Some programs are interesting in utility, speed, ease-of-use, scope etc. Most don't manage everything. Also most don't have the time/resources/knowledge to polish, or even plan how their creation will develop. Many indeed are treating it all as a learning exercise.
However in almost all categories of software there are examples that stand out. In the graphics areas I find :-
Kdenlive for video editing
Darktable for RAW photo development/editing
Inkscape for vector graphics
to be exceptionally good, stable and well-documented. There are others, I'm sure, but given the time required to become completely familiar with programs of some complexity/subtlety and indeed the concepts they are implementing I tend to stay with trusted tools for routine use.
@ Martin an gof
I'll clarify on the rock-solid - that's what I mean. (Using 1080p/50 source and outputting the same as H264 mp4 usually with a file size of ~1MB/sec)
4-core i7/8GB/spinning rust OpenSUSE 13.1/KDE renders in ~twice real-time. To convert to 720p/25 for my lesser devices I use ffmpeg
"Kdenlive does have its quirks, including the fact that it seems to be very crash-prone on Linux Mint, so much so that I ended up doing my testing in Debian 8, where it worked fine."
Certainly I've used Kdenlive for ~6 years, including 1080p/50 for the last 3. In the early days its stability varied but it's been rock-solid for ~3 years (under OpenSUSE/KDE)
"The VMS OS created to run on Dec Alpha CPUs. It was even better than most UNIXes."
Having written code for VAX/VMS and for Alphas I can tell you your chronology is wrong.
VAX/VMS ~1977, Alpha ~1992 - VMS ran on the VAX long before the Alpha existed.
"affing about in the console to remove a directory ~/. is not something an average or below par user is going to do to fix a problem (I'm looking at you mint)"
Merry Xmas - why would you need a console to do that ? It's just another file manager op: in Dolphin (FM), Show Hidden Files from the menu, right-click on the directory, Delete from the menu.
That's for Mint/KDE
"The continuing practice of publicly funded academics publishing their papers behind paywalls ..."
Not always their fault. Some journals, in fact, require the authors to pay a charge to be published.
"Basically, you give someone a shell on the remote system and from that on he can attempt anything."
Basically, you give an authorized user access to one account on the remote system and from then on he can use anything that he is authorized to use.
"Here in Indiana, USA, pi is no longer 3.14159265359...."
Apart from the obvious madness, how many people would use pi in mental or long-hand arithmetic and 'need' it to be simplified ?
I think legislating petrol as carbon-neutral might be next !!
"I've done quite a few translation/merges of data using Excel and Word for partial automation,"
We routinely moved 1-2 million records between databases - due to mergers of companies.
"The Git remembers drooling over a Silicon Graphics Indigo machine running Irix and some interesting mathematical/graphical software."
We used to use SGI with 3D graphics and a lot of backup horsepower (compute servers & Linux farms) for protein modeling etc. Around ~2003 we changed to Linux/dual Xeons and saved a large amount of money on hardware - although the 3D graphics card and LC spectacles added a lot to the cost. Some experiments were done with porting some of our in-house software to Windows (W2000) but it almost always crashed. (We ran on SGI or Linux at ~100% CPU for 2-4 days or more, so we gave it some stick. Once we had a 2048-core Linux farm it became a lot easier to do more speculative runs very rapidly.)
"And its no good bleating about how manufacturers "should" provide non-windows alternatives"
I didn't but there again I didn't mention anything about medical environment either.
On the other hand a quick Google suggests that most areas of specialized medicine that involve detailed analysis of MRI, CAT etc. use software that is often available for Windows, Mac & Linux. It seems to be in the areas of databases and messaging that Windows is used solely.
If you want specialist areas, I know of dozens of scientific programs only available for Unix/Linux, some of them eye-wateringly expensive. I'd also point out that MPLAB is highly specialized.
"Who is going to pay that extra cost?"
Well Mozilla (Firefox & Thunderbird), Google (Chrome, Google Earth), Gimp, LibreOffice, VLC, Skype, VirtualBox, Apache, MPLAB (PIC dev/programmer) and a number of others.
"Sounds like the unofficial news sources are now just as bad as the official ones. The horror of it all!"
I think the 'official' news sources seem to be getting worse - maybe reflecting less rigorous research or a lack of knowledge/skepticism. Certainly, some of the time when they report on areas of which I have in-depth knowledge, the ignorance displayed can be breathtaking. And then cut-and-paste journalism multiplies the errors rapidly.
" any computer you don't assemble yourself is going to come with a valid Windows license"
That's certainly not true in the UK. I'm writing this on a very similar laptop to yours (minus the NVidia graphics) which was bought new without any OS and has had OpenSUSE 13.1 on it from new.
"@Chemist, there are plenty of mechanisms for getting to distant places, only a matter of time."
That is so obvious it wasn't worth mentioning. I assumed people wanted to travel to the stars within their own lifetime.
I might add that personally I'd be delighted if there was some evidence for potential mechanisms for distant travel but realistically it may be that the physics of this universe don't allow for such things.
Wishing for something may drive technology but not science.
"It's no wonder we're stuck where we are right now with people like you and your 'It's impossible because we know it to be so' attitude."
To be fair the skepticism is due to a lack of any theoretical framework to give any hint as to how this all might be achieved. Idle speculation can only achieve so much.
"Just because sodium fizzes under water here doesn't mean it won't sprout flowers in space."
That's NOT science - that's mere idle speculation.
"More like a migraine strategy"
I vote for 'micturition-about' strategy
"and need to look up something on my home computer. I can ssh via my phone"
Well I don't know what phone you are using but I generally access files from home using a file manager via the fish protocol. My home file system is just another folder in my file manager.
"I get them all the time, just ignore them"
Staggering isn't it. I have, as I say, just had the one attempt on the non-standard port. I get attempts against the usual suspects all the time, but no other ports are open. I do other things when I'm being paranoid - like limiting the time the port is open to a small window every day. But I do use it for real all the time when travelling.
"Downvoted 10 of millions..."
Say 1.5 billion desktops at ~1.5% Linux - that's 22.5 million, i.e. >20 million.
Suggested reading :
I know you are joking about "the six of us", as there are 10s of millions of desktop Linux users. The infection vector is content-management systems that have already been fixed. So unless you are running such a system unpatched and exposing it to the internet, it seems very unlikely that you will face this problem.
I've been running SSHD for >10 years and in all that time only one attempt has attacked the (non-standard) port - without success, I add. But that is the only internet-exposed port I have. Indeed I have all lower ports blocked at several points: my ISP, router and firewalls. If you expose ports to the internet you need to be responsible enough to maintain configs/software/logs, as well as the usual care with installation sources, permissions, e-mails etc. I usually browse in a VM as well.
"To my personal knowledge Thalidomide was certainly trialled against multiple myeloma around 2001"
I've never worked directly on cancer but I know a number of cancer areas have been interested in Thalidomide over the years
Sorry dogged, that's not how I'd define 'surreptitiously'
"a compute with _NO_ network connectivity is rather useless in this day and age."
Certainly I have several PIC 18Fs connected by RS232/USB converters to my file server and laptops and they are very useful - and accessible over networks by running a daemon to control/interrogate them.
"and want to serve adverts in your application or game, you need some way to uniquely identify the user or player"
Luckily not in my world !