Re: Musicians too
"My daughter's boyfriend is a runner, and violinist in the local orchestra."
At last! A use for the Oxford comma.
16459 posts • joined 16 Jun 2014
"25+ miles a day by dint of fitting a treadmill to his desk"
I take it this was an electric treadmill that he just left running whilst he got on with other stuff.
Personally I'd just have fixed it to a wheel of the car. If it didn't score enough on revolutions it'd have added a good few more for the bumps over the pot-holes.
With CEOs like that it's easy to see why the business ended up where it did.
"Cleaner ISAs with cheaper more reliable hardware will tell in the end."
I agree. What I had in mind is whether Intel/AMD can survive attempts to mitigate without losing performance and whether, in the end, they have to get replaced with something else.
Price is the determining factor in the long run. Back in the hey-day of VAX, HP, Sparc etc. we were likely to develop on the same architecture as the application ran on. We might even have developed on the deployment machine; a separate development box would have been a luxury not many could afford for in-house development. It was cheaper hardware that enabled the move to Intel, and the availability of server OSs that led to the migration from those platforms. Xenix and SCO enabled Unix applications to move onto Intel; in fact, if SCO hadn't attempted to follow a high-price model I doubt Linux would have come to dominate the Intel Unix-like market. There were plenty of small businesses in the '90s running on SCO and dumb terminals. There was a sub-industry producing multi-port serial boards to plug into PC motherboards just to support them. There were also bigger businesses running on multiple Intel processor boxes with Unix OSs such as Dynix.
The move to individual workstations for developers is a separate matter. The PC made it economical for developers to have their own workstations. For someone like Linus that made it possible to develop outside a corporate environment. For traditional architectures a shared server sufficed for many of us working on server/dumb terminal-based applications. I suspect Netware might have prompted the adoption of individual workstations in corporate development. My limited encounter with that left me with an impression that development on a server, especially one that was also running production, would have been too fraught. A separate workstation would have been needed for graphical applications, even for Unix, as the X-terminal never really caught on. This, of course, is where Windows came in and changed things.
As to the future, ISTM that the determining factors over the next few years will be the ability to mitigate the likes of Meltdown in the Intel/AMD architecture* in the next generation of products and the adoption of ARM in workstations in a configuration consistent with that of servers.
* An interesting aspect is whether Intel and AMD diverge in the process.
"Theranos company and everything connected to it was always a scam."
The odd thing here is running a big scam and not having a bolt-hole for when it's eventually discovered. Did she really believe she had something that worked? Was it simply rich-kid's delusion? You can easily visualise someone with that sort of background not being able to judge what's feasible because up to now money has bought everything. If they fall lucky they have a Facebook and if they don't reality catches up in jail.
"Those people were there because they are famous names and inspire trust."
If you see famous people lending their names to something they collectively know damn-all about (excepting, possibly, the one surgeon in there) you should at least consider they've been hoodwinked and avoid the same. As I wrote in another comment, work in a lab with a high enough profile and you see these bigwigs being shown round and not understanding a thing they're seeing or being told; they'd be easy targets for any bullshitter.
The big names who should really inspire trust in a venture in this field would be an assortment of medical and chemical Nobel Laureates. There weren't any.
If you were thinking of making an investment there and didn't understand the nature of the business the list of directors should have been a warning, not a reason to trust.
By and large they look like the sort of party you'd see being escorted around a lab trying to look interested but with glazed looks in their eyes and ears acting as a through duct to anything explained to them.
Been there, worn the lab coat, had the visitations.
"Since the AI system would base its decision on information like 'lions are large carnivorous predators with large teeth', thus the fluffy bunny is probably preferable above the lion."
An AI basing its decision on no more than this would just as likely prefer the lion, or refuse to give an answer at all. In order to give a sensible answer it needs to "know" the implications of large carnivorous predators for human beings and, indeed, that the sharer is a human being.
Numbers and the like have an unfortunate effect on people. People tend to believe them. It leads to quoting results to infeasible levels of precision. It leads to measuring and acting on stuff which is easy to measure and ignoring the stuff which is more difficult to measure even if it's more meaningful (a simple example is the setting of arbitrary speed limits and installing equipment to enforce them whilst ignoring tailgating).
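A quick sketch of "infeasible precision", with made-up numbers: average a handful of readings that are only good to the nearest unit, then quote the mean to ten decimal places the data can't possibly support.

```python
# Made-up readings, each only accurate to about +/- 1 unit.
speeds = [31, 28, 33, 30, 29]

mean = sum(speeds) / len(speeds)
print(mean)            # 30.2 -- already one more digit than the data supports
print(f"{mean:.10f}")  # 30.2000000000 -- digits the data cannot justify
```

The extra zeros look authoritative but carry no information at all; that's the effect being complained about.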
"Usually those rules were associated with reasons, so an expert system could not only make a decision but also support that with the underlying reasons."
The shortcoming of that is that it can only make the decisions it was given reasoned rules to make.
A real expert, OTOH, will have a degree of understanding that helps them, when confronted with a novel situation, to undertake new reasoning and at least suggest what the best decision might be.
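The distinction can be sketched in a few lines. This is a toy rule base (the rules and facts are entirely hypothetical, not from any real expert system): each rule carries its reason, so every decision comes with an explanation, but a case no rule anticipates gets no reasoning at all.

```python
# Toy expert system: rules paired with the reasons behind them.
# Rules and facts are illustrative only.
RULES = [
    (lambda facts: facts.get("large_carnivore"), "avoid",
     "large carnivorous predators are dangerous to humans"),
    (lambda facts: facts.get("herbivore") and facts.get("small"), "approach",
     "small herbivores pose no threat"),
]

def decide(facts):
    """Return (decision, reason) for the first matching rule."""
    for condition, decision, reason in RULES:
        if condition(facts):
            return decision, reason
    # A novel situation: unlike a human expert, the system cannot
    # reason from first principles -- it can only give up.
    return "unknown", "no rule covers this situation"

print(decide({"large_carnivore": True}))  # ('avoid', 'large carnivorous ...')
print(decide({"venomous": True}))         # ('unknown', 'no rule covers ...')
```

The second call is the point: a human expert confronted with "venomous" could reason from what they understand about danger; the rule-matcher can only report that nothing matched.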
Let me offer you a small example: the Panel on KDE. That's the bar (usually) at the bottom of the screen with the start button at one end, the system tray at the other and, between them, minimised windows and any useful bits you want to put there.
In KDE3 this could be set to auto-hide and then to reappear when the cursor went to a prearranged place. This could be anywhere along the edge or just in the corner.
If you selected the edge the ballistics of a mouse cursor would ensure you hit it whenever you went for a control at the bottom of a window sitting at the bottom of the screen. You then had to wait a moment for the panel to unhide and rehide.
The better option was to select the bottom left corner as the target. This is where the menu button is which is often what you want from the panel anyway. Even if it isn't it's quick enough to go for the corner and then slide along to whatever you want. Very effective and very unobtrusive.
In KDE4 the corner option was done away with. Just that. Annoying panel behaviour unavoidable.
In KDE5... no, they haven't seen the light and restored the corner option. They've just added, whether as a bug or by deliberate design, another breakage. The panel now bobs up if a window opens at the bottom of the screen. Not only that, but it stays there until the mouse cursor visits and leaves it.
What's worse, if the panel is hidden and a confirmation dialog is invoked, say to confirm binning something, it's invoked minimised on the panel which, just to be contrary, doesn't unhide. To confirm you have to bring the mouse cursor down to the bottom of the screen, unhide the panel, click on the minimised dialog to restore it and then click to confirm. At least this shares a fix with the unhide-by-window problem: the only workable solution is to sacrifice screen real estate to a permanently visible panel.
And did I mention that in 5 the background can no longer be a gradient unless you make a gradient as an image and set it as the wallpaper? Or the eye-rattling image as the default wallpaper? Or the bland scribbles that have become the default icon set?
The only rationale I can think of is that they didn't want Microsoft to be the only ones with crap UI design.
To be nit-picky, a 100% CUA-compliant browser ought not call its first menu item "File" but something like "Page" instead.
Oddly enough, out of the 14 items on the menu only one, "Open Web Location", is page-specific. The rest wouldn't look out of place on the File menu of any other application. Even Page Setup is there to configure the printer settings.
"I think the problem with a hamburger menu is that it only works if people have learned that a bunch of horizontal lines in a corner somewhere means a button that can be clicked to show settings or options."
It gets out of hand PDQ. Bing on Waterfox. Waterfox has a hamburger menu top right. Bing has its own hamburger menu just below it.
It's hamburger menus all the way down! I look on them as the graphical equivalent of the acquired speech impediments such as scattering "like" into sentences where it doesn't belong or the introductory "So" which seems to mean "I'm a millennial and I'm starting to speak".
"The main factor is time. If you have time, you can do anything"
It's a matter of how you use that time. Do you use it, and budget for using it, upfront to investigate the requirement? Or do you use it, unbudgeted, later on, trying to make the product fit reality and probably throwing in a few bugs because you're under pressure trying to do things in time you haven't really got?
"1. It's not actually a link, it needs to be copied and pasted.
2. You don't know what it is beforehand so you don't know if it's relevant or not."
Copied and pasted is preferable depending on your level of paranoia. Unless you're in the habit of hovering over a link to check the URL it resolves to you don't know what it is anyway so you don't know whether it's safe or not.
"Does it need a manual? If you need a manual then the design can take a step back. "
Even better, write the manual and call it the design. OK, things get iterated and the draft manual needs to get updated, but another gem from TMMM (The Mythical Man-Month) is that the manual is the first thing that gets started and the last thing that gets finished.
"Exactly because a hamburger menu does state what it does."
A hamburger menu allows you to choose hamburgers. That's what it does.
If you mean a menu that's hidden behind three lines, called a hamburger menu because that sounds kool, the problem isn't with the menu; the problem is with it being hidden.
Right now I'm looking at a browser that has the classic CUA menus. If I were to click on File I know what to expect because, give or take application-related variations File menus do similar things.
If I use a browser with a "hamburger" menu I've no idea what it will do and, more to the point, if I know what I want to do there's no obvious way of finding how to do it unless I try everything in turn until I find it. I want to bookmark something? My CUA interface has a menu "Bookmark" - I think I've found it. My "hamburger" browser - no, it's not on that menu. Oh, look, it's hidden under an icon of a star. What numpty thought of that?
Let me make it plainer. The advertising industry - and that includes FB - don't need to influence (if you prefer) any members of the public, rich, middle or poor. All they need to influence are the people who buy advertising, and all they need to influence them to do is buy more advertising, preferably at higher prices.
"Do you believe rich people buy a house, car or a boat because they've seen it on Facebook?"
Maybe not houses and cars but the advertisers presumably believe that they'll respond to adverts and as rich people buy more stuff in general and more expensive brands that's why they'll pay to advertise at them. Whether the ads justify the money spent is another matter but remember Facebook is part of the advertising industry and the advertising industry doesn't sell houses, cars and boats. It doesn't sell phones or trainers. It doesn't sell soap powder. It sells advertising. Only advertising. And all it has to do is to persuade advertisers to buy advertising. What people buy because they see it on Facebook is irrelevant.
When I use a piece of software, I expect the right to ENJOY the use of it without interference (so no "time-restricted trials" or whatever have you). I expect the right to STUDY its operation etc.
Even amongst FOSS S/W users (and I include myself) you are exceptional and even more so amongst software users in general. Most of us (almost all, actually) don't include studying software as one of its uses.
Most users have no understanding of any computer language, nor should they need to. Very few will have fluency in all the languages in the stacks of a typical Linux distro and I doubt anyone at all has sufficient understanding of the minutiae of all the sub-systems to avoid inflicting damage if they stray outside their competence* - hence the frequent advice not to try writing your own cryptography.
* Remember the great Debian random number fiasco.
"But don't forget the Red Hat paradox... Despite the massive contribution of unpaid developers, large FOSS projects often need some source of income, and paid support is often the solution."
No paradox. What you're not realising is that by and large somebody is paying the developers, and that somebody is often Red Hat. In other cases it's often people with some other financial interest, such as Intel, who expect to sell the processors the S/W will run on.
The actual principle behind the funding is that a group of businesses who have something to gain from a project find it more effective to get together, do their bit at their own expense and share it, rather than each of them trying to do the whole thing themselves with huge and pointless duplication of effort.
"They seem to have a monopoly, which is why they get away paying no tax as they know people will keep buying stuff from them."
As written that's a non sequitur. You need to turn it around. The reason why they (a) have a monopoly and (b) pay little or no tax is because they keep investing in building the business. That means that the operating profit gets swallowed up so there's no profit on the bottom line and the result of the investment makes it impossible for others to compete. At some point, however, they'll have to stop growing. Any growth curve that looks exponential is really the start of a sigmoidal curve.
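The exponential-versus-sigmoidal point can be seen numerically. This is a sketch with made-up parameters, not real revenue figures: early on, a logistic (sigmoidal) curve is nearly indistinguishable from an exponential with the same starting value and growth rate, but it flattens towards its ceiling while the exponential keeps climbing.

```python
import math

def logistic(t, ceiling=1000.0, rate=0.5, midpoint=20.0):
    """Sigmoid: grows roughly exponentially while far below the ceiling."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def exponential(t, scale, rate=0.5):
    return scale * math.exp(rate * t)

scale = logistic(0)  # match the two curves at t = 0

# Early on, the curves track each other closely...
for t in (0, 2, 4, 6):
    print(t, round(logistic(t), 3), round(exponential(t, scale), 3))

# ...but much later the logistic has saturated near its ceiling
# while the exponential has run away past any market size.
print(40, round(logistic(40), 3), round(exponential(40, scale), 3))
```

Watching a few years of growth tells you only that you're on the left-hand side of one of these curves; it can't tell you which one.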
The trick is going to be deciding when to stop eating the operating profit and start paying dividends. If they invest in facilities they don't need once they start to saturate the market they'll be throwing away cash. That happens. Right at the start of my working life I found myself in the middle of a small business, well known in its field, going bankrupt because it had done just that, and the investment wasn't even in the stuff that made its profits.
They're not unique in investing all the available income for years before they start paying divies; I remember the same things being said about Microsoft.
"LOL believers in the current system thinks that continuous economic growth is sustainable."
LOL believers who equate money - which is a tool to trade what you do for what you want - with continuous economic growth.
Of course continuous economic growth isn't sustainable. But then neither is a society running on barter at anything above subsistence level.
Because if it doesn't, a Chrome "Now guaranteed with more ads!™" might have a hard time keeping its #1 position.
Especially against a rival Chromium fork without ads. In fact, if you were going to the trouble of forking Chromium, why not build the ad-blocker in as part of the browser?
Biting the hand that feeds IT © 1998–2019