Or more likely they're too busy laughing at the sharepoint users who have a much more borked system that somehow managed to inherit a lot of access issues.
I think I may have a few of them here... data could easily be stored in the existing central ERP system. But no, it's stored in spreadsheets that are emailed between team members.
I don't hate myself enough to push a user to MS Access. It'll just cause more pain for me later.
But with Google Glass, recording video is always on, and unlike CCTV it is mobile; unlike a video camera or digital camera, it is not obvious that they are using it. I.e. you have no idea if the person is reading an email, doing nothing, or recording everything you say and do...
Video recording isn't always on, and video recording drains the battery like you wouldn't believe. When recording is in progress a little LED is lit (while this is technically maskable, it is more obvious than a mobile), and if somebody is using Google Glass it's quite obvious due to their focus point. When you see one for real, rather than repeating hyperbole, you'll understand.
I wouldn't say that a smartphone needs to be aimed, or that it's obvious when you're taking pictures or recording. Just with the bare thought of the situation now, I could set my smartphone to record, stick it in my shirt pocket and merrily record.
OK, the damn thing would doubtless fall out within minutes, the shirt pocket on the shirt I'm wearing today would cover the lens, and pointing my chest at people could be obvious... but the principle is still there.
But all of this is possible with current mobile devices, tablets, phones and digital cameras. It was the case before digital cameras, it's just the speed and dissemination is now much faster.
It has been the case for years, and still is: if you are in a public space, anybody may take a photograph of you - you have no expectations or rights of privacy. In general terms these pictures may be used for any purpose that doesn't unjustly misrepresent the person or imply consent or specific endorsement for any particular goods or services. For example, a picture of you in a crowd, on a bus or train, is representative of a general situation, whereas a picture of you standing next to a specific item or service could imply your endorsement and therefore cannot be used without your specific permission.
Putting it like that, it is an interesting question. AFAIK black holes, whatever they are exactly defined as, have a velocity through space, and it's predicted that black holes would rotate as well (as distinct from the orbital spin of the matter collapsing into them). Gravity, whatever the hell it actually is, would have to "escape" the clutches of the black hole, otherwise there would be no force of attraction, which would mean no black hole could form (or at the least grow). Gravity tends to work universally, therefore a black hole would be attracted to any other nearby (massive) object, such as another black hole, which, given some velocity, is all you really need for one to orbit the other.
This has doubtless already been answered elsewhere in a much better theoretical way, thoroughly non-understandable to the likes of myself.
...and a long time ago as well.
Likely to have been a default home Windows install: auto-login with no password.
SIP memory modules
Wow, I had managed to entirely block SIP memory modules out of my memory. A quick look at pictures of them again brings it all back...
Not sure that this is a real problem; I read it as a statement that currently these devices are not permitted by the US FAA. In other words, new regulations and controls would have to be put in place to cover what is in reality a new flying protocol: autonomous devices, out of line of sight of the operator, navigated by satellite (or otherwise).
A lot of licensing is still done on trust, particularly in the larger corporate environments (or where I currently work, at a top university). Server licences particularly so, as the user / access licences are worded so vaguely or inclusively that, for example, a data access middle tier would only require a single licence to connect to the database tier, yet could be serving hundreds of distinct users connecting to it. It can be argued that requiring one user licence per end user is fair, but where does it stop? Are viewers of reports included? How about live (cached) reports hosted on an intranet? To add to it, you then need a few maintenance, service or operational accounts as well.
Too right. Statistics: For the perpetually confused or gullible, I'd recommend that a lot of people read the book "How to Lie with Statistics". A nice, easy read and assassinates most statistics that you regularly see.
Somehow. Despite reported serious inconsistencies in her statements, and denials ranging from "They made me do it" all the way to "I had no idea they were doing it".
The whole story behind the Transputer is quite interesting... as well as the reasons behind its eventual failure. El Reg has a bit of it here: http://www.theregister.co.uk/2011/08/18/heroes_of_tech_david_may/
I agree... I probably didn't get the right level of cynicism over. If you want real fun, try thinking of "Wearable tech" that isn't medical or health fanatic focused.
With the fridge I was trying to demonstrate that while it may currently be a relatively basic example, there are potentially useful things that can be done. Even for these examples I would agree it's a toss-up whether it's worth the effort, but that's something else; once these things become commodity, the effort is diluted sufficiently and what was previously a gimmick or "nice to have" feature becomes standard. The same arguments about "it's not worth the effort" were probably thrown at a lot of the technology that we currently use, take for granted and would be inconvenienced without. And we'll have forgotten the stuff that really wasn't worth the effort at the time.
An IP stack may be rather complicated but like most technology these things are built on a stack of previous modules incorporating standards, knowledge and experience. Even a serial interface isn't particularly simple if you have to build one from scratch from the physical layer up (and you wouldn't believe the mistakes that I've come across when people have tried), however modern IC systems can handle most of the annoyances and details for you. So while an IP stack is rather complicated it's becoming an established module that is just another tool that a higher level layer can use.
The only question is how your suggested enhancements will benefit from Internet connectivity.
I don't consider that the "Internet of Things" is specifically about giving everything possible an Internet connection - it's about connecting devices that previously weren't (or currently aren't) connected to anything beyond local or personal area networks. IoT has typically evolved as a marketing term / buzzword from Machine to Machine (M2M) communications, which doesn't sound particularly inviting, but we've been doing this at various levels for years. What it boils down to is that IoT is little more than the miniaturisation (and feature stripping and simplification) of much larger and more unwieldy systems, down to the level where they can become commodity / consumer items. There will be a lot of pointless ideas that fail, and it won't be anywhere near as big and pervasive as the habitual "industry" tech-predictors predict, but it will become more and more pervasive once people find useful things to do with it. Devices like the Arduino, Raspberry Pi and similar are the start, and allow a lot of interesting, sometimes useful, experimentation.
On the fridge example above, while I agree that in general the simpler something is the better, it doesn't mean that things can't be enhanced. For example, the temperature of the fridge could be monitored allowing an alert to be generated if the temperature goes outside defined limits for a period of time, for example when a toddler (or drunken / sleepy adult) merrily raids the fridge and leaves the door open, the cooling unit fails or some other miscellaneous and annoying problem that'll ruin your morning when you find the milk is off. Hell, just the opening of the fridge between certain hours could raise an alarm if you really want to stop midnight fridge raids. A little more sophisticated could be humidity sensors, where if something leaks a similar alarm could be raised. These are just a couple of simple enhancements to a basic fridge, nothing complicated, nothing that can't be easily implemented right now.
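The temperature alert described above is trivial to sketch. This is a hypothetical illustration (the class name, thresholds and grace period are my own, not from any real product): only fire the alarm when the temperature stays outside the limits for a sustained period, so a brief door opening doesn't trigger it.

```python
# Hypothetical fridge-temperature alert: raise an alarm only when
# readings stay out of range for longer than a grace period.
from typing import Optional

class FridgeMonitor:
    def __init__(self, low_c: float = 1.0, high_c: float = 5.0,
                 grace_seconds: float = 600.0):
        self.low_c = low_c                  # acceptable range, deg C
        self.high_c = high_c
        self.grace_seconds = grace_seconds  # tolerated excursion time
        self._out_since: Optional[float] = None

    def sample(self, now_s: float, temp_c: float) -> bool:
        """Feed one reading; return True when an alert should fire."""
        if self.low_c <= temp_c <= self.high_c:
            self._out_since = None          # back in range: reset
            return False
        if self._out_since is None:
            self._out_since = now_s         # excursion just started
            return False
        return (now_s - self._out_since) >= self.grace_seconds
```

Hook the return value up to whatever notification you like; the point is that the logic is a dozen lines, not a research project.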
Much of the IoT press is just marketing fluff and noise, but there are useful things to be had from it all.
Right. So a "trader" in a virtual market, who performs promissory non-transactional trades in this trust-based virtual market as quickly as possible while making money from others doing the same, claims to have lost some unverifiable virtual money because they weren't making these phantom transactions as quickly as they expected.
Maybe I'm just too cynical, but I've yet to find somebody who can genuinely explain just where the money (and value) generated from these high-speed, non-transactional, trust-based, non-interactive transactions on finite resources comes from.
Exactly, the CPU itself is very unlikely to have backdoors or anything specific in it. Exploits or backdoors are going to be in the supporting services that surround the CPU, the support chipset, the OS and the OS's device drivers.
By its basic nature, the OS that runs a system requires full access to the CPU, including all operation levels and all metrics and support. There is no point in a "super-duper-secret-access-mode" function in a CPU; this level of access can be performed using normal operations. Access to more privileged operation levels in a CPU is managed by the OS.
The support chipset, on the other hand, will have direct memory access to the entire system outside of the scope of the OS, will be able to send and receive network packets without the OS ever knowing that anything is amiss - this kind of communication will be undetectable inside the system itself, however observable outside through packet monitoring.
Device drivers also tend to have enhanced access to the system, including DMA access and direct access to hardware. At this level they are more readily monitored, and the executable code can be decompiled and assessed for potentially unwanted behaviour. Depending on how well written the driver code is, the OS is likely to be unaware of unwanted behaviour in the driver: these are trusted components.
The OS itself can easily have backdoors and access code in it. This is more readily detectable as the executable code can be decompiled and assessed for potentially unwanted behaviour, however if written well it should be relatively easy to mask as the OS provides this functionality.
The applications on top of the OS are even more likely to have back doors or access code, or to be exploitable through programming defects.
In the end the most likely source of leaks is the bag'o'flesh in front of the device. Many will happily sell their passwords for chocolates, use easily guessable passwords or just email or print and lose important information.
Or linked through there, http://novafusion.pl/
(I don't have anything to do with these packages)
Great film. However it's feeling more and more like an accurate prediction than just a movie...
...or for ChromeCast.
I've wondered that with the pull-along rotary-dial phones... young children are unlikely ever to see such a device outside of old movies and museums, yet new pull-along rotary phones are still produced. Somehow my daughter even learnt to pick up the phone handset and talk to it. She's also glued to a more current phone toy model, and learnt to mug adults for their touch phones at an early age.
Good. Especially making sure to include the enhanced developer tools in the same stream.
This type of pre-release makes a lot of sense, and reduces the pain of having to rush to check a released version against your work at around the same time that world+dog is already using it.
I assure you that the encryption involved in some postcards would be enough to baffle most government agencies.
At least that's the impression I get whenever I get a postcard from my parents. So far I've managed to decrypt 50% of the one I received last week.
Don't worry, you should be safe mentioning "plans for the revolution" as long as there's no related mention of Al-Qaeda, The Terrorists Cookbook or other subversive material such as "The Little Book of Common Sense", 1984, or indeed any mention of Brazil.
Technically, Association Football.
It's generally septics that call it 'soccer' to differentiate it from their local sport of rugby in armour... where the ball very rarely comes into contact with a foot.
[Sigh] Whoever initially used the phrase Global Warming and prominent public figures who repeat this do a lot of damage to the environmental cause.
The considered term is Climate Change. Humans are proven to be polluting the environment, and through this there is proven disruption to environmental processes, both local and wider. The exact impact of the disruption to these processes is the main contentious issue: some are relatively trivial or have a narrow impact, some when disrupted are replaced by other processes, and some are more immediately obvious when disrupted, such as the hole in the ozone layer. The difficulty is that there is a huge number of environmental processes, many of them somehow interlinked and many hidden or obscured by others, which makes it incredibly difficult to predict what may happen when one or more fails or is disrupted. Therefore the considered acknowledgement is that while we are proven to be damaging the environment, we don't know exactly what will happen; there will be changes, and if we don't stop damaging the environment then these changes are statistically likely to be less predictable and potentially more severe.
It is not impossible that the processes could be disrupted in a manner leading to Global Cooling: not Hollywood action-movie style, but potentially as disruptive as Global Warming, because it would affect precipitation, which would have a catastrophic effect on food crops and the distribution of fresh water.
I guess if more radar (EM) waves are observed then "brighter" would be an adequate term for the amount of EM radiation received: the only difference between what we consider "light" and radar is the frequency.
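To put numbers on that: wavelength is just the speed of light divided by frequency, and the same formula covers both. A quick sketch (the example frequencies are illustrative round numbers, not from the article):

```python
# Visible light and radar are both EM radiation; only the frequency
# (and hence wavelength) differs.
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(frequency_hz: float) -> float:
    return C_M_PER_S / frequency_hz

green = wavelength_m(5.45e14)  # visible green light: roughly 550 nm
radar = wavelength_m(10e9)     # X-band radar: roughly 3 cm
```

Same physics, same equation; the receiver just integrates power at a frequency our eyes can't see.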
Oh dear, reminds me of the farcical time in a previous company where a department's self-serving nutjob decided to rebrand the company's "admin" team as "Central Services", assigning titles such as "central services executive". This resulted in two things:
1) The girls in the office having to explain to potential new employers, friends and so on that the job title really meant "admin assistant", and having to put that on their CVs to make it clear.
2) The IT support manager renamed his department "Essential Services". No electricity, network or computers? No administration... :)
upvote for "Giggidybits" genius
Agreed. It would make a sterling addition to the El Reg standard units.
Please restate your question using proper and correct measurements. It makes a lot more sense in Linguine.
For your reference and correctional education: http://www.theregister.co.uk/2007/08/24/vulture_central_standards/ or for the slide rule shy (*): http://www.theregister.co.uk/Design/page/reg-standards-converter.html
* I'll be buggered if I know how to use a slide rule either.
Yep, they've generated just as much sympathy as the tube drivers who are already paid more than most people for sitting in their cabin, pushing buttons and occasionally ranting at the paying passengers. The same tube drivers who decided that on top of their ordinary and extraordinary overtime they would also need further extra money during the Olympics because, err, honestly: the excuse was so wafer thin I can't even recall it now.
My solution to the presentation problem was to fit a dedicated PC, configured and locked down to the resolution of the projector. Audio went through dedicated speakers, all connectors were screwed in place.
Users could either access their presentation across the network or plug in a USB stick - the latter being daft when they've saved the presentation to a network location, and non-embedded "embedded" content always causes fun and games.
I still had some users busily unscrewing everything in order to present a high resolution (aka stupidly large amounts of small text) presentation using a low resolution laptop. They could have configured a secondary display with the correct resolution, or even aspect ratio, but then they wouldn't have been able to see the presentation on the small screen at the same time - and staring at that defeats the entire concept of delivering a presentation in the first place, which is to present, not to sit there mumbling at a laptop keyboard. To make it worse, I even had a dedicated external connector for users who couldn't understand that they could use the dedicated PC, or for external visitors... but even this didn't stop the attempted disassembly of the main system at times.
All this and there's still the point that's most often forgotten: PowerPoint is not a presentation; PowerPoint is merely a tool that allows you to enhance a presentation. [but frequently used to kill one]
I "dodged" that one as well... specifically the cybernetics course at Reading. In the end I avoided AI as much as I could because I quickly considered that none of what was being taught as AI was in fact AI: at best it was Logical Reasoning.
As for Professor Warwick, I consider that he's a very good promoter of the subject, rather over-enthusiastic at times, and he does, in his own way, raise the profile of a lot of interesting problems that could do with being raised - for example the boundaries between human and machine. Eccentric, outspoken, often technically wrong but largely harmless.
It would be interesting if after all this time he could be persuaded to directly speak with El Reg...
I'm pretty sure that this default display of FOG is not truly random and your assertion that they avoid showing this when it is actually foggy would back this up. In my experience they seem to target bright sunny afternoons more than any other time of day.
Unfortunately that's a sensible plan, and not something that will have been dreamed up by a politician or parcelled out by accountants therefore it will never happen. Just like the good plans of building canals to ship heavy non-time sensitive goods up and down the country and to shift water from where there's lots of it (I'm thinking of you, Manchester) to where there isn't so much.
In Oxford it was always regarded that PPE is a bullshit waffle degree where you can write whatever you want as long as you make up some half arsed justification in your text. It was rumoured to be pretty much marked on word count... [I didn't do PPE]
Worse to come is the water / food crisis:
1) Sell off the reservoirs so houses can be built on them.
2) Sell off prime agricultural land so houses can be built on it.
3) Look confused as to why concreting over enormous amounts of land causes drainage problems.
4) Look confused as to why with increasing oil prices food becomes even more expensive when most of it has to be imported.
It's not necessarily pandering to the green vote, it's pandering to the oil vote. Shutting down nuclear reactors benefits non-sustainable power sources.
What we really need is good energy storage. Renewables can become useful then.
Is it me, or do these kinds of systems blur the difference between a system running many processes and many processes running on a system?
A virtual machine is little more than a set of processes (with communications and storage), so how much of the unnecessary junk can be removed from a virtual machine before it becomes hard to differentiate between it and a group of processes running on another system?
Just a lunchtime thought; it may even make sense tomorrow.
Yes. You're so obsolete that you don't even know it yourself.
Intriguing read about the "Halting Problem". And it goes some way to justifying my loosely held general belief that mathematicians should steer clear of programming.
I've had countless arguments with mathematicians pretending to be programmers... from those that claimed "5GL" languages would make programmers obsolete, to those that can't grasp that while small parts of a typical application can be represented in a mathematical manner, it quickly becomes pointless trying to apply such an unsuitable technique to wider applications or algorithms. While it is of course possible, the dataset rapidly becomes a ludicrous set of multi-dimensional possibilities, and even if the analysis can be streamlined, the sheer processing power required to model and validate the entire thing renders any attempt pointless. In the end the algorithm effectively degenerates into a simulation. In many ways this is similar to computer chess.
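The blow-up above is easy to demonstrate on a toy case. This is only an illustration of the scaling (the function and flag model are mine, not from the article): exhaustively checking a system of n independent boolean flags means visiting 2**n states, which becomes hopeless long before n approaches the richness of a real program's state.

```python
# Toy state-space enumeration: every combination of n boolean flags.
from itertools import product

def count_states(n_flags: int) -> int:
    # Walk the full state space and count it, as an exhaustive
    # verifier would have to.
    return sum(1 for _ in product((False, True), repeat=n_flags))

# 10 flags is a trivial 1,024 states; 30 flags is already ~10^9,
# and real applications have far more state than 30 booleans.
```

Hence the point about the analysis degenerating into a simulation: past a very small size, you're no longer proving properties, you're just running the thing.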
That's the problem with, in this case, a historical lack of understanding. The brain isn't a binary device and while any individual component doesn't run especially fast, they do run in parallel. The concept of a machine fooling a human in a blind test is still a clever device, even if the understanding and predictions were out.
This kind of historical take on something is often quite interesting; for example, Asimov's robots could not speak but could understand, and it was later advances in technology that led to the "artificial voicebox" in his books. From a biological point of view it was correct - babies and toddlers can understand much more than they can speak - however from a technology point of view it's reversed, as speech synthesis is simple compared to contextual comprehension.
Pretty close to the "reviews" many goods, services, holiday locations or restaurants receive. It's almost uncanny how many always include the same key points.
The difference between these looks to be the IP rating (no note of this on the eBay item; IP54 for the LaCie), and I'd hope the interior circuitry / connectors are more rugged in the LaCie than in the eBay item.
Yes and no. Hence messy. For example, online file streams do not contain metadata.
All of the metadata (file streams) attached to an individual file would have to be verified to ensure consistent operation on the off chance that code within that module, or any other for that matter, checks the metadata and changes behaviour as a result.
The vertical deployment of the drives looks sensible from the purely spatial point of view as it means all the non-drive space (power, data, cables and support) is put into one plane which should optimise the use of space. Vertical stacking would remove the need for cables in the same way that commercial removable external HDD units work (if you're in the business of swapping out HDDs, these kind of exposed external HDD "caddies" are invaluable).
I can't see any details from the picture, but if I were designing this I would combine the cooling and support elements into one form, a thin metal (e.g. thermally conductive) caddy that ensures that the drive sits true on the connectors and doesn't topple or otherwise shear or twist the connectors. It would effectively make the caddy a part of a monster heat pipe.
They will be a bugger to deal with, though, particularly when you need to swap a drive in the unit at the top of a 42U rack. Servers are annoying enough, and although these probably don't have a lid to contend with, the drives would have to be removed carefully so as not to interfere with the operation of adjacent drives.
EDIT: Just googled the SS88460 user guide and aside from the unit looking different to the datasheet model and the image here on El Reg, it has slots for pairs of drives and enclosed caddies for each HDD.