1290 posts • joined 10 Apr 2007
do you know what "solar wind" actually is?
IIRC, in some of the recent theories on the Earth and Moon, there was one body initially and something comparatively large smacked into it, possibly shattering and then leaving, or possibly merging with the resultant mess. From the debris that was left, the Earth reformed out of the larger set of debris and the Moon formed from the accretion disk.
It's a neat solution to the problem of why Earth has such an enormous satellite and as I understand it, the chemical make up of both bodies does lend some support to it.
Re: Netware 3.11 - the good old days...
Yes there was. It's such a long time ago that I can't remember what the pre-requisites for this were, but it was an amazingly useful feature and saved a lot of blushes.
There were many good things about Netware.
File and directory security wasn't entirely fubar'd... the Windows security model, even now, is still entirely messed up and is not as capable or effective as what was available on Netware. Access rights were centrally stored and administered, which was a huge advantage when managing accounts: it was possible to see what rights a user had without having to check every single device and share on the network to see what arbitrary rights had been assigned there. Not that this model scales overly well, but it was a lot easier to manage and more transparent.
Want to prevent a user from moving a directory? Easy with Netware, "impossible" with Windows... how many and how often are file shares dragged from one location to another and "lost"?
From my point of view, it all started to go wrong with Netware 5 and the continued fragmentation of the user interface... some tasks could only be done on the server in its awful and extremely inefficient GUI, some in "legacy" client tools and others through the text-based interface. Its implementation of TCP/IP was massively improved, but that didn't make it any more of a joy to configure.
I suppose the active changes that Microsoft made to continually break the Netware client, and the removal of the login / authentication plugins forcing Novell to work around things all the time, couldn't have helped either.
Descent: Was this the first one that removed the horizontal floor and vertical wall "restriction" that seemed to be a feature of the earlier games?
Re: @Nick Ryan
Thanks. I'm obviously rather behind the times on satellite transmission comms like ribbon aerials, but I have no need other than an interest to be up to date on these things. And a ribbon aerial (now I've looked them up) would fit nicely with the cube sat scheme.
I was wondering about the communication problem as well.
Generally speaking, it would have to be a dish-style communications method, as it's much more efficient to only transmit in the direction you want it to be received in. However, the larger the dish relative to the wavelength, the tighter the "beam" and the more accurately it needs to be pointed to hit the target; a smaller dish has a wider beam but much lower gain, which is why small consumer satellite dishes still have to be aligned carefully to collect enough signal.
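As a rough illustration of the dish size / beam trade-off, a common rule of thumb for the half-power beamwidth of a parabolic dish is θ ≈ 70°·λ/D. The frequencies and dish sizes below are illustrative assumptions, not figures from the post:

```python
# Rule-of-thumb half-power beamwidth of a parabolic dish:
# theta (degrees) ~= 70 * wavelength / diameter.
# This is a rough approximation, not a precise antenna model.

C = 299_792_458  # speed of light, m/s

def beamwidth_deg(freq_hz: float, dish_diameter_m: float) -> float:
    """Approximate -3 dB beamwidth in degrees for a parabolic dish."""
    wavelength = C / freq_hz
    return 70.0 * wavelength / dish_diameter_m

# A 45 cm consumer dish vs a 3 m dish, both at Ku band (~12 GHz):
print(beamwidth_deg(12e9, 0.45))  # roughly 3.9 degrees
print(beamwidth_deg(12e9, 3.0))   # roughly 0.58 degrees: tighter beam, fussier pointing
```

The bigger dish concentrates the signal into a much narrower beam, which is why it needs the more accurate pointing.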
Re: Could not agree more
Thank you for remembering that it's really about what customers NEED.
Far too often I've been given what a customer wants, and spent a (relatively) long time with them separating their requirements from their chosen solutions to get to the bottom of what they actually need. Usually they've been suckered by promises of automation solution nirvana (MS, IBM, Oracle, etc... they're all very guilty of this) where they lose sight of what they need. I've only once come across a client who, when I redefined things like this, didn't appreciate the distinction... maybe I've been lucky, or it's the way I've pitched it.
On a few occasions I've had to challenge the customer to run their newly designed processes, that they want to automate, on paper first. While this may sound odd I've found that if a relatively small company can't run their new (basic) processes on paper, there's usually no hope of them ever running them on a computer system either. Naturally testing a scheme, even on paper, really shakes down requirements as well.
Re: Linux Graphics... is rubbish
To add to the optimisation talk... when you establish an OpenGL context in Linux it can be linked directly to the display device (direct rendering) rather than going through the X server. It's not a very portable solution though.
Re: Stack ranking.
That's one of the biggest things that's still killing MS from inside.
...and in the news recently they want to introduce a similar scheme for our Civil Servants.
Re: BS Corporate marketing language
I wish I could find it, but there was some staggering research into processing politicians' speeches and distilling them down to just what, if anything, meaningful was said. The same would apply here and would probably produce a similar result to your distillation of marketing BS words.
MS' XBOX One plans would have worked, and worked quite well, and been reasonably fair... but only if the games were released at a reasonable price taking into account the restrictions. In my mind that would have been 20% or so of the current sales prices of console games.
However we all know that this would never happen, particularly with the studios carefully telling us how many tens of millions of dollars it takes to make a current hit clone / sequel / cut-scene-delivery-mechanism. There is also the problem that console hardware is usually sold as a loss-leader, with the cost recovered in the sales price of the games.
This is a bit of a mixed thing, really. On one hand, few things cause more problems than the "special" configurations that were foisted by default on users of Small Business Servers; on the other, the cost saving of the Small Business Server bundle compared to the alternatives made it a good solution.
No surprise that MS want to force everything possible onto Sharepoint (they've been beating this drum for years) and their cloud or, more accurately, their subscription offerings.
This isn't so much the increase in speed for a discrete user's connection, this is backbone technology.
i.e. You're in a street with 150 houses, 1/3 of which have an active Internet connection at 20Mb/s. Just that one street is looking at a peak throughput of 50 x 20Mb/s (1,000Mb/s). This street is neighboured by 9 other similar streets (10 x 1,000Mb/s = 10,000Mb/s). This traffic has to get in and out of this neighbourhood; how many similar neighbourhoods are bundled together before the traffic starts diverging?
Obviously, these are simple peak throughput examples, but when you start to use Video on Demand services these start chewing through inordinate bandwidth when taken in just a small area.
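The back-of-envelope peak figures above can be sketched like this (the street and neighbourhood sizes are the made-up numbers from the example):

```python
# Back-of-envelope peak bandwidth for the street/neighbourhood example.
# All figures are illustrative, matching the worked example above.

houses_per_street = 150
active_per_street = houses_per_street // 3  # one third have an active connection
speed_mbps = 20                             # per-connection speed, Mb/s
streets = 10                                # this street plus 9 neighbours

street_peak_mbps = active_per_street * speed_mbps        # 50 x 20
neighbourhood_peak_mbps = street_peak_mbps * streets     # 10 streets

print(street_peak_mbps)         # 1000 Mb/s peak for one street
print(neighbourhood_peak_mbps)  # 10000 Mb/s (10 Gb/s) for the neighbourhood
```

Real contention ratios mean actual backbone provisioning is far below these peaks, which is exactly why sustained video-on-demand traffic hurts.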
Re: Excellent Stuff!
Yep. And before this article I didn't even consider how lifts and lift designs were tested. Obviously it makes sense that they are and should be, but it's nice to see it so well (and visibly) demonstrated.
Re: Holy shit ...
Hahahaha.. That's almost as utterly messed up as the Monster Cables website.
So far the gem of the site has to be the "13 amp - High Performance Hi-Fi Fuses"... from only £34.94.
And the inevitable techno-babble bullshit: "High performance fuses will protect the circuit but the integrity of the supply line is vastly upgraded. Like a good power cable, allow 50+ hours to bed in."
I'm also going to have to have serious words with my "bad" power cables that have a very nasty habit of obeying the laws of physics (and sense) and tend to work straight away without requiring 50+ hours to start to work properly.
Just to add to the flickering (light) topic here... human eyes have considerably more movement sensors on the periphery than in the centre, interestingly balanced by having almost no colour sensors in the periphery, where we see in monochrome and the brain fills in the detail with what it remembers (or guesses from experience).
As a result, many household bulbs don't flicker when looked at directly, but look (!) at them from the corner of your eye and you'll see the flicker. This flicker can also be seen when the light is reflecting off a surface. It's one of the (many) causes behind offices fitted with fluorescent bulbs giving staff headaches. Interestingly, the flicker is also one of the reasons that these bulbs often come in pairs (or more), as this not only gives fail-over in the event of tube failure but also reduces the impact of the flicker through it being masked by neighbouring tubes.
Incandescent bulbs also flicker, at twice the power supply frequency, but the effect is negligible as they operate by heating an element and this element does not cool enough between cycles for the flicker to be noticeable. However you can make the flicker visible if you use a high-output bulb and reduce the output to minimal using a dimmer switch.
Re: Alternative to Windows?
Re: Start button - Never used it anyway.
Unfortunately you are a "power user" and have entirely missed the point.
99.999% of Windows users are not power users. They have not memorised arcane keyboard shortcuts; they often don't even know that there are keyboard shortcuts (especially helped by MS's stupid insistence on hiding them as much as possible). They can just about wield a mouse in anger, often don't understand the shift key compared to the caps-lock key, or know how to cursor through text instead of hitting backspace until they find the offending mistake and then retyping what they just deleted.
Users require an indication that there is some functionality available, this is a basic, fundamental aspect of good user interface design. They should not be given an artistically blank and meaningless screen and expected to somehow "know" how to bring up some functionality on it by clicking / thumbing arbitrary screen locations. This is why buttons were created in user interfaces (and hyperlinks in HTML documents) to give a user an indication that there is an action available. Unfortunately now we've gone backwards and the "artistic" (form over function) trend is to hide anything vaguely functional so a user is left having to randomly thumb an interface or patiently wave a mouse cursor over it hoping for something interesting to be revealed.
For what it's worth, I've been a specialist in User Interface design for over 20 years. I also use keyboard shortcuts extensively :)
Re: Sourceforge & Classic Shell to the rescue! (@AndrueC)
Precisely. I've done that for years on dedicated kiosk systems where they need to log in (auto-login is a feature that's been there for a long time), setting the explorer.exe replacement for that particular user.
It's staggering just how many developers of "embedded" (kiosk) systems using full Windows don't know this and calmly boot to explorer as the shell then auto-load their application. All it takes is a user to alt-tab and they have total control of the system. If a maintenance user of the embedded / kiosk system needs access to the explorer shell it's a simple matter of running explorer.exe from within the kiosk application and access is given.
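For reference, the per-user shell replacement described above is done via the Winlogon `Shell` value under HKEY_CURRENT_USER; the kiosk application path below is an illustrative assumption:

```reg
Windows Registry Editor Version 5.00

; Replace the shell for the current (kiosk) user only.
; The per-user Shell value under HKCU overrides the machine-wide
; explorer.exe shell set under HKLM, so other accounts are unaffected.
[HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Winlogon]
"Shell"="C:\\Kiosk\\KioskApp.exe"
```

Applied to the auto-login kiosk account, the kiosk app starts instead of explorer, so there is no desktop or taskbar to alt-tab back to.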
Re: *looks at Eadon and laughs*
Rebuild to change the IP address?
I think you're getting confused with the nightmares of Windows NT4 service packs. Arrrrgghhh! It sends shivers down my spine just thinking about the farcical things we had to do to NT4 to make what should have been simple changes.
As for Linux vs BSD - they share a lot of code and features and there's a lot of movement going both ways. This makes a lot of sense and saves reinventing the wheel code wise.
Partly Political Broadcast
They already do... those kind of entertainment shows have to be fronted by a clear, unambiguous "This is a Partly Political Broadcast on behalf of <insert party here>" statement.
This notifies of an upcoming show where we all need to be prepared to suspend our disbelief circuits and get our chunder buckets ready to watch the grinning lunatics hug otherwise previously innocent, unsoiled children.
When you describe it like that, I completely agree about having a good range of devices - that have differentiating features.
The problem I have is that we're left looking at a line of devices that are (superficially) very similar to look at and who's to know which ones are actually worth the money and which ones are more land-fill?
Take a look at this page: http://www.htc.com/uk/smartphones/
Aside from the two Windows 8 devices (proving that it's not just Nokia that makes them), the rest of the phones on the page are distinguishable only by small variations in size (don't forget, they're all scaled to one size), by name... err, HTC One, One SV, One X+, One XL, One X, One S and One V... wtf? By the marketing tag rubbish such as "Simply stunning" or "Exceptional performance comes standard" and the inevitable near 5-star rating lies that we expect to see on a manufacturer's own website.
There's probably only one or two phones on that page that are worth the bother for the money, a couple that are penis extensions for those with money and the rest? Probably land fill.
Looking at a similar Nokia page: http://www.nokia.com/gb-en/phones/lumia/ the problem's the same. Other than a couple of more rounded, possibly smaller, devices they all largely look the same and feature the same baffling product numbers that seem to make little sense, and there's no order to them on the page.
Another Lumia? If they're not careful they'll be as bad as HTC are/were and Samsung are getting, with so many damn devices it's hard to know which are the turkeys and which are genuinely good devices for the money.
The build and specifications look good, and while I like some features in WinPhone as some are well thought out and work well, I find a lot of the basics extremely irritating.
As a designer and UI specialist I find (subjectively) that the interface is ghastly and there's too much "hidden" functionality that is not obvious; instead you're left searching around the interface for arbitrary ellipses (...) or swiping randomly in the hope of finding what you're looking for. But then the latest Android versions have gone backwards on this invisible "..." interface front as well, which isn't good.
Re: Does not compute
I read it that DirectAccess functionality is required on every device on your network. :)
Re: Does not compute
I'm not entirely sure how DirectAccess could protect a light bulb... unless this light bulb is also running Windows, in which case that's a scary prospect - both the additional requirements and the sheer inefficiency of running 2GB of bloat on a light bulb (because we can bet these technologies won't be available in the embedded form).
But back to the other problem... I have a single Internet connection for my home; this is shared between multiple devices and systems as they don't have their own connections, and I'm definitely not stupid enough to run an open routing gateway. Where does it make sense to put the protection? On each device, or on the gateway? Not that DirectAccess couldn't prove useful for Windows-only environments where you don't mind (or care) about the inevitable lock-in - it could be a very useful additional tool - but for protecting arbitrary devices it just doesn't read like it's the right tool.
My guess, on this much more important question than Crysis framerates ;), is that it'll be able to complete the task in a time span marginally longer than the total read time of the disc. Unfortunately Blu-ray drives are not exactly known for read speed...
Armed forces are running their own locked down network that's not the Internet? That's revolutionary! Maybe some of ours might like to consider that doing that (properly) is a good idea?
Re: Coat's First Law Of Optical Media
There must be a similar observation regarding tapes as well...
I, for one, welcome our new metal bug overlords.
See below ;)
Again, we see that all you need for Computing is Maths. And yet, almost without fail, the worst developers that I've come across have also been maths graduates.
I've had Maths specialists spray me with spittle for hours on end, telling me that the "next generation" of computer languages will write themselves, there's no point in learning development and that every application can and should be mathematically described. Even the simplest of optimisation / performance demonstrations between a mathematically described process and one that's been thought about failed to sway most of these spittle spreaders. It was almost a religion to many of them.
There are many things you need to be a good developer, and a maths degree is far from the most important unless you're going to be specifically using maths in what you develop - in which case pray for the sanity of those that take over what you've developed if your maths is good but your coding skills are poor. Coding skills? Entirely separate from maths; they're more about organisation and planning, logical awareness, experience and knowledge, with a hearty dose of artistry thrown in, than any degree course title.
Re: No you choose your degree at 13
When I got my choices at school as to what I studied, many of the combinations that I'd have liked to do were incompatible due to scheduling. It wasn't just an arbitrary decision that I couldn't take both English and Computer Studies (as my computer course was known): there were three streams, and in one of the choices I could pick either English, Computer Studies or Art. There was some sense to it, so the more science-related people could choose science-related topics, hence my "maths", "science/physics" and "computer studies" selection, but I'd have liked to do English and Art and these were excluded.
So around the age of 13 is the time that many of these life-long decisions are made.
So if it's only operational when transmitting crash data in response to a crash... just how the hell is this going to help when the vehicle is stolen unless the thief subsequently crashes the vehicle?
Some things just don't add up
Last time I flew back from the US (including an internal flight) I didn't even have a passport (it was lost/stolen).
It was almost comedy... "ID?" "I don't have any, I have lost my passport and am flying home" Oh, carry on then.
Landing home in the UK "ID?" "I lost it, but am a UK citizen" "OK, fill in this form" (form filled in - basically, name and address) "OK, welcome back to the UK"
Re: 5 screens sizes
Personally I've found the annoyance of different device and screen aspects and resolutions more annoying on iOS than on Android. Not that the wide range of screen resolutions on Android isn't an issue, but it feels like I have better inbuilt tools to deal with one application and multiple resolutions and ratios than in iOS.
Yes, there are different versions of Android to deal with - currently two main ones unless you want to be cutting edge. But even that's not too hard as you can target the cutting edge and have fallback to the older versions as the support libraries work quite nicely (at times :-) ). It does require testing but if you develop applications properly and cleanly separate functionality from interface (Model - View - Controller) then even if you have entirely different interfaces it is not always that difficult to develop, after all, many of us develop apps that can be operated in landscape or portrait mode and this kind of model is normal to us.
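As a sketch of the model/view separation described above (the class and function names are illustrative, not from any real app):

```python
# Minimal model/view separation: one model, multiple "views" for
# different orientations. Names and layout rules are illustrative.

from dataclasses import dataclass

@dataclass
class ContactModel:
    """The model: pure data, no knowledge of screen size or orientation."""
    name: str
    phone: str

def render_portrait(m: ContactModel) -> str:
    # Narrow screen: stack the fields vertically.
    return f"{m.name}\n{m.phone}"

def render_landscape(m: ContactModel) -> str:
    # Wide screen: put the fields side by side.
    return f"{m.name}  |  {m.phone}"

contact = ContactModel("Ada", "555-0100")
print(render_portrait(contact))
print(render_landscape(contact))
```

Because the model carries no layout knowledge, adding a third view for yet another screen shape touches no existing code - which is the point being made about handling multiple resolutions cleanly.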
Re: CEO of company slags off major competitor
"Jobs spent much more time during these events talking about how great Apple stuff was"
...and this is exactly how you should do it. Jobs may have had his faults, but Apple's fortunes did turn around when he was there and negative marketing (criticising competitors) is a defeatist way to operate and is usually a sign of weakness, poor judgement and lousy marketing.
Most products and systems have advantages or disadvantages compared to others, more so when the operating environments are different - and don't forget that while iOS and Android are nominally similar, their operating environments are different: Apple has a tight rein on the hardware, OS and applications where Android is much looser and more open. [This isn't an argument as to which is "right", just stating facts - both approaches have major positive and negative points].
So when this guy starts to criticise a competitor like this (negative marketing) then it's an indication of weakness in him and likely his products too. Would you rather deal with somebody who is positive about their own products or somebody who is busy being negative about a competitor's where they should be telling you about theirs?
Well, the HDD (or other storage manufacturers) started to make a mess of things a long while ago. All in the concept of clarity, or maybe just sales and marketing lies...
The two sets of figures now run in parallel - the base 10 units (1TB = 1000GB, 1GB = 1000MB, 1MB = 1000kB and so on) and the base 2 units (1TiB = 1024GiB), the latter being what operating systems have traditionally reported.
So when a HDD manufacturer quotes a capacity in TB the total number of bytes compared to what they put on the packaging and what you might expect can be quite different as they'll use the base 10 values and if you're using the base 2 then you're going to be quite annoyed...
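The gap between the two conventions compounds with each prefix; a quick calculation (pure arithmetic, nothing assumed beyond the unit definitions):

```python
# How a "1 TB" (base-10, as printed on the box) drive looks
# when reported in base-2 units, as many operating systems do.

bytes_in_tb = 10**12   # marketing terabyte: 1,000,000,000,000 bytes
gib = 2**30            # binary gibibyte: 1,073,741,824 bytes
tib = 2**40            # binary tebibyte

print(bytes_in_tb / gib)  # ~931.3 "GB" as the OS reports it
print(bytes_in_tb / tib)  # ~0.909 TiB: about 9% apparently "missing"
```

Nothing is actually missing, of course; the bytes are all there, just counted in different units.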
But this isn't a justification for BYOD. This is a need for a responsive IT / Facilities / whatever department that is progressive enough to supply the right equipment for the job rather than a one-size-fits-nothing (bare minimum) approach.
Re: 70 per cent of US mobile workers now pay for their own kit
Yes. The article seems to be deliberately worded to make it appear that it's the purchase of (computer) equipment, where the reality is much more likely to be mobile phones or sat-navs, but then equipment could even include laptop bags, there's nothing to say that it's even electrical equipment. The original survey may clarify it more, but I don't have the time or inclination to go through a registration process to view it.
So it just looks like this "article" is just another BYOD sales advertorial trying to convince us that insanity is the best way forward as it's already happening and we should follow them off the cliff.
Any sane company that I've encountered recently has a set of laptops for their road botherers to pick from, as the size of a laptop can be quite a personal preference and tastes and requirements can vary quite drastically. Few still say "here's the one laptop model you may use", as there's little point in standardising like that: unless you buy all your laptops in one batch, by the time you next come to order one your chosen model is inevitably obsolete and has been replaced.
I see your point, but Intel pushing a bit harder (and fairly - hah!) should push ARM to get better just like ARM's given Intel the severe kick it needed to stop producing small thermal generators and start making more power efficient chips instead.
Intel wiping out ARM is a different prospect to Intel vs AMD, as ARM licences its chip designs and sub-components to be manufactured by a wide variety of companies, some of which are at least as influential as Intel.
I thoroughly agree that x86 is often not the correct chip for the job. These are desktop PC chips at heart, and while subverting them for use in mobile phones and tablets isn't inappropriate - these are genuinely portable computers, often with capabilities far in excess of what we had on desktop systems ten years ago - for lesser systems such as basic PLC and control systems they are overkill, and the convoluted and inefficient instruction set, combined with the higher integration requirements and subsequent costs, really holds them back as well.
Competition. Yay! Good! ...and so on.
...now if they were to mandate well (and clearly) defined interfaces between systems then they could pick and choose suppliers as they feel fit and choose the solutions that are most effective, reliable and cost-effective for the job.
Unfortunately we all know that this won't happen and, from past experience with the five listed, each will assign twenty project "managers" to each project. These project managers will change every project meeting and the chance of meeting the same one twice will be slim, nobody in these five suppliers will take overall responsibility for anything, and all the actual, real work will be sub-contracted out to sub-contractors almost as frustrated as the client (Network Rail).
Re: I've said it before.
Oh no... and it's not even Friday.
My prediction, as ever, is "Yet Another Rubber Faced" loon. Like the previous few.
Noddy Holder would make it special though!
Cloud (online data storage mainly) security, the real side of it, is a continual minefield.
In the end, it's safest to work on the assumption that if the data is replicated out of the EU then it has gone to somewhere insecure that has no real concept of privacy, such as the US. The US Safe Harbor agreements are never enforced or checked, and are usually so specific in scope that your data will bypass the agreements and won't be covered by them... and the Safe Harbor agreements only dictate what the company may voluntarily do with your data in their possession; as noted here, they have nothing to do with legal, government or other processes.
Where I work, a small amount of our data is extremely confidential and sensitive as it relates to high profile events and court cases, a sizeable chunk is commercially sensitive such that the company involved would not like a competitor to access it, but most is trivial and of little interest to almost everyone. To be safe we work on the basis that it's all very confidential, and as a result there's no way we can seriously consider cloud data or cloud application hosting: not only might the service provider mirror the data outside of the EU, but we'd also be implicitly trusting all staff, contractors and other third parties that are involved with the hosting.
The thing is, compared to contemporary PC games at the time of each release, Halo has always been found wanting. The less-than-optimal controls are fine for a console but poor in comparison to the PC FPS gamers' choice of mouse and keyboard. The game itself was unremarkable compared to the features, visuals and gameplay of the PC FPS games that arrived at the same time, and the conversion attempts of Halo from XBOX to PC really only served to highlight the gulf in quality. It doesn't help that XBOX hardware was static while PC hardware always moved on, but that's the nature of console game design and development.
Not that Halo isn't a fun game; it's polished enough that it is good fun to grab some friends and shoot them, or even shoot with them! But compared to similar games it's nothing special, so its cult following on the XBOX does tend to leave PC gamers with bemused expressions.
I'll get me towel...
While some viruses do jump species, in reality complete jumps are very, very rare. We're all subject to millions of viruses every day, especially those who work or otherwise live with animals, most of these viruses are just not capable of properly infecting humans.
The reason is down to what viruses actually are. To put it in a fairly basic manner, they are a form of life that cannot reproduce on its own and instead has to invade other life (cells) and hijack the mechanisms of these cells in order to reproduce itself - essentially the lowest form of parasite, and just like higher forms of parasite, they need to be specialised to do this effectively. From Wikipedia: "A virus is a small infectious agent that can replicate only inside the living cells of an organism." (http://en.wikipedia.org/wiki/Virus).

To start with, many viruses have to survive in the open environment, or at least outside of a host organism. HIV, for example, is not particularly tough and instead has to be transmitted without using the open environment... which is why you won't get HIV through (non-intimate or blood-sharing) contact with a carrier, through touching what they've touched, or by inhaling when the carrier has sneezed or coughed.

A virus then needs to be lucky enough to find cells that it can utilise, seeking these through the chemical markers of the cells, latching onto these cells, invading them and hijacking the inner mechanisms of the cell. Through all these steps the virus is exposed to being cleaned up, removed, disabled or rendered useless by basic biological and chemical processes - or, in the cross-species case, is just not able to get far as it will target the wrong genes, mechanisms or chemical signatures. Once in a cell a virus then uses and controls the cell's mechanisms to reproduce itself (some viruses also cause the cells to reproduce, which is the general cause of virus-induced cancers). Even once in a cell, if a virus causes the cell to outwardly reflect that the cell is not working correctly then the cell, and the virus in it, will be destroyed by the body's immune system. It's not easy being a new virus!
Viruses mutate a lot, therefore when living continually with animals that are shedding viruses there is always a chance that one virus strain might happen to be able to infect a human. A certain level of "infection" is quite common but usually entirely harmless as the viruses fail at some step of their hijack and reproduction process. For example, the virus might be able to infect a cell but not to reproduce. Even should a virus happen to manage to reproduce itself, it still has to find its way from one human to another, and that's a very different problem as that involves getting out, crossing between hosts somehow (HIV takes shortcuts on this and needs direct transmission) and then evading the immune system and defences of the new potential host.
Many of these are designed to facilitate a Bring Your Own Device (BYOD) environment, such as the ability to print using Wi-Fi Direct, share the screen using Miracast, pair with printers using near field communication (NFC), and have Windows 8.1 devices act as Wi-Fi hotspots via built-in broadband tethering.
Huh? Just how much BYOD FUD is being paid for around here?
Print using Wi-Fi Direct... most useful for tablets and laptops printing to non-domain or non-local printers, i.e. go to a friend's house and print something on their printer. A BYOD system in an office environment will connect to the local network and be given access to printers through the carefully controlled printer-access functionality that the BYOD sellers are selling.
Share a screen with Miracast. Nope, I'm at a loss. This is particularly related to BYOD how?
Pair with printers using NFC? Sounds like basically the same as Wi-Fi Direct...
Windows 8 Wi-Fi Hotspots... so, useful for the home, bugger all of use in a corporate environment and nothing to do with BYOD.
Well that's an odd one isn't it?
In theory, if a company's selling point, like the previously mentioned Abercrombie and similar, is that they have fit, healthy-looking, attractive staff to front their store, should they not be allowed to advertise and look for these people? Likewise, when a fashion house is parading its clothes it needs models of particular looks and sizes to model them; are they wrong to have this requirement? Do the car companies that pick models to drape themselves over their latest luxury cars have requirements as well?
Not to say that it's right or wrong, but political correctness can go too far and some jobs do need above average staff - not necessarily just on the physical looks front either.
Re: At least with this site...
Yes. And better to accept this and understand that this is the way things are than to try to deny it.