Re: @Evil Auditor (Was: Re eucalyptus weeds)
"Back when I was picking coffee beans in Guatemala..."
Is there anything that WonderJake can't do?
Be truthful? :)
I just find it entertaining...
Sci-Fi is just a setting
Sci-Fi is a setting - in general, the more a film treats it as one, rather than as an effects-laden crutch for a poor script, dialogue and ideas, the better the film is.
The first Star Wars film (ep:IV) succeeded because it took commonly used story elements and a relatively standard plot line and placed them in a Sci-Fi setting. The effects weren't cheap for their time, so they didn't detract from the plot; likewise, the effects weren't the focus of the film either.
The second film (ep:V, The Empire Strikes Back), continued more or less along the same lines and while there was greater emphasis on the technology and effects, they generally didn't feel like they were shoved in just because they could be, and a plot was fitted around them afterwards.
The third film (ep:VI, Return of the Jedi) showed a bit of promise but, riding on the marketing (merchandising) success of the previous two, piled in merchandising features, often to the detriment of the film. It still worked because it wasn't too grossly overdone, but it did detract from the film.
The recent films (ep:I, II & III) were built by working out what (and who) to merchandise, fitting special effects around them and then trying to shoe-horn in any form of plot, but only if there was space available.
The "Star Trek" relaunch was much the same, but started from the other end: pointless special effects and incompetent plots, with a few "popular" actors thrown in and a few nods to the originals on top to keep some of their "spirit" in there.
The entire concept of a film, and where it came from, is often lost. For example, Mission: Impossible was originally all about a team of people working together, not one single "super character" (played by a famous actor); the film became a more effects-driven copy of any James Bond film.
Re: How long till bandwidth maximum?
While you have a point regarding the "last mile" bandwidth, it's getting the data from the Content Delivery Network (CDN) that is the problem.
In principle there should be no problem streaming 17 HD films down your 70Mb link itself. However, those 17 HD films have to get to you from the content provider's network. If 1,000 people in your city are also concurrently streaming 17 HD films each, that's 17,000 HD streams that need to be delivered concurrently, with minimal packet loss and no pauses (buffering during playback); a short amount of buffering is always in place with streaming, and this happens as you start the video. That's a hell of a strain on the infrastructure.
Net Neutrality is a weird one
The principle that a company, certainly one that originates a lot of content, can, or should, pay more for better delivery of that content is fair, but it's potentially problematic in practice.
Any company can arrange for a high speed link to the Internet. Throw more money at it and you will get a better link: lower latency, higher throughput, burst capacity provisions and so on. If a business relies on getting content onto the Internet, then it should have a high level of redundancy internally and should also have redundant (physical) Internet connections, meaning multiple arrangements with different ISPs. This redundancy could be used as a fail-over, for load balancing content, or anything in between.
Few would argue that the above is a bad thing: The company is paying more to provide the amount of content it delivers, which is fair.
In order to improve efficiency, a company can pay even more money to get closer to the core of an ISP's networking setup. Cutting out a few network hops here and there may not seem that important, however every network hop adds latency and slows things down overall; with a high volume of content this adds up to a lot of potential loss of overall speed, and a customer's perception of quality is often dictated by speed, or more accurately the lack of it (which makes it remarkable that many set top box manufacturers still push cruddy, low spec, badly programmed kit). An additional point in favour of this arrangement is that by bypassing network hops that the content provider shares with other companies, it helps to optimise throughput and should improve the experience for those other companies too.
Again, few would argue that the above is a bad thing. The company is paying even more for its connection and optimising the route is sensible on a lot of fronts.
All this is wonderful if you happen to be a customer that uses the same ISP, or one of the ISPs as the content provider. The delivery of content is optimal and the customer gets a great service. However there are many ISPs and if you're not with one that the content provider uses then you will get a worse service than the content provider would like to deliver.
A natural solution to improve this situation is to enhance the peering infrastructure that ISPs already have between each other, and it's at this point that it starts to get murky and less than ideal. ISPs are not equal in size or capacity, and the choice of ISP is usually constrained by physical location. On the one hand, it is arguably good that a heavy content delivery company would pay for additional peering from its core ISPs to other ISPs - after all, it is using a huge amount of bandwidth. On the other hand, this runs into the problem of selecting which ISPs to peer with, which will usually disfavour the smaller or regional ISPs, and the actual implementation of the additional peering. If the content provider pays for additional peering bandwidth, few would argue that this is bad; however, if the content provider starts taking a higher share (or priority) of existing bandwidth, then there are definite downsides.
In the end, it's all down to implementation and control. I believe that a company should be able to pay for better delivery of its services, but that it should be in a controlled and regulated manner, should not disfavour smaller or regional network providers and should not impact other content providers' share of existing bandwidth.
Re: "Surface ***3***. i.e. this is our 3rd attempt after 2 dismal failures"
The phrase "Bing it on my Zune" still raises a smile from me. It just sums up perfectly the quality of Microsoftness, like the kid who is trying so very hard to be cool.
Or more like the politician who's trying so very hard to be cool, or down with the masses, or whoever they're targeting today who isn't a political donor.
I managed to head off a supplier of ours from switching to Silverlight when, a couple of years ago, they announced at an event that they were going to start using it. A choice: HTML (enhanced with HTML5 tech) or Silverlight? It's a no brainer.
Re: No problems here
I'm getting by without Flash on my personal PCs, as well as no Silverlight. Usually it's OK but I still come across fucknut websites that are "written in Flash" rather than "enhanced by Flash", though the number is reducing. Apple can take some credit for this. Now if the BBC were to ditch Flash as well...
Java is also blocked from running in my browsers. This has had even less of an impact than not installing Flash, as you'd imagine. Java is still installed for applications that require it, just no browser plugins.
Competition! It should be good for us consumers.
Interesting trend, seeing this and the Moto E remove the front facing camera and ship "not very good" main cameras, no flash, etc. that are OK for quick snaps in good daylight but not great for much else. However, a huge chunk of the market doesn't care about these features (or removable batteries or storage, which will doubtless be screamed about by the usual fanatics), so why should a budget phone have them? Want better or more features... buy a different, or non-budget, phone.
Wouldn't say that I specialise in this kind of thing, but over the years I have developed more than a passing professional and personal interest...
I've been through all the pain of many solutions, including a PC based file server, which while quick and easy to get going chews through 'leccy like nobody's business. Most of the kit doesn't have to be expensive, however if you aren't careful it will get expensive very quickly.
You don't need gigabit ethernet for streaming and, as noted above, most of the kit just doesn't do it anyway. Wifi is a waste of time for streaming: while the headline speeds may seem good, the real speeds are never close, and as soon as you get multiple devices connected (or neighbours with wifi) the performance rapidly drops below useful levels.
NAS: Currently I'm using a ZyXEL unit of some form, with pretty much all features turned off. Stick to the basics, ignore the "value added" functions and most commodity NAS devices will do the job. I chose this ZyXEL device because of its support for Linux and Windows sharing, the reviews reported success with both, and it powers down to low power usage. It's a single 2TB drive; while RAID of some measure might sound like a good idea, it adds needless complications and power demands (it's not that I'm against RAID, it's just not always useful). Redundancy and backup? Easy: buy a second NAS and that's your backup. If you're careful and upgrade regularly you can buy a new NAS device with extra capacity, copy the old content onto it and have an instant backup from when you last copied the content. Unless you throw your devices about and are willing to replace an HDD every couple of years, you are considerably more likely to lose content through accidental deletion than HDD failure. If you're particularly paranoid, power the NAS through a reasonable UPS so you get both power smoothing and a few minutes of battery backed power (I do).
Playback: It's generally best to have a single playback system for each screen. This gives you maximum redundancy but also keeps the cabling and communications sane. While you could have one super playback system playing four independent streams, you would quickly suffer internal bandwidth issues, but more importantly you would have to both stream the video and audio content from this one box to each screen and feed IR remote control signals back the same distance; the further the distance, the nastier both of these become. I'm currently using an Acer Revo box with a Microsoft Media Center IR controller connected for each screen. These PCs are cheap (£200-£250), small and pretty low power, particularly if you fit an SSD or low power HDD to them. Many are effectively silent or fanless as well. On the software front I get the cheaper Revo devices that don't have Windows on them, wipe the junk they do come with and install a build of XBMC. In the past I've had annoyances with audio playback, but these days the drivers are all just there and the Linux audio layers have matured sensibly.
Security: Fit everything into your own wired network. You can add usernames and passwords for sharing files and for a home system you really shouldn't need much more. XBMC supports content levels so you can protect minors from inappropriate content.
Re: Do not touch the switch ...
Maybe they've upgraded the notice from a post-it note to one applied using tape?
Re: I think a certain water bureau might not be very good at their job
The interesting thing about arsenic is that we need some, in the correct form, in our diet. The not so great thing about arsenic is that even a little too much is very unpleasant. While it has to be in the correct form, the same can be said for Carbon, Oxygen, Hydrogen, Iron and so on.
arsenic is the 20th most common element in the Earth’s crust and the 12th most common element in the human body
I'm unable to find a clear attribution of this statement, however we do have arsenic in appreciable quantities in our bodies which would indicate that it's required. I have seen peer reviewed articles reporting this however can't find them due to the expected "arsenic news" noise in Internet searches...
Re: Over reaction?
The pictures in the article clearly show an open reservoir. Adding human pee to that would hardly contaminate it compared to everything that is in it already.
Pharmacies will never be perfectly stocked all of the time; there are upwards of 3,500 commonly prescribed drugs/doses and there's no way that your local pharmacy will have all of them to hand, definitely not the less common ones.
The concept of the electronic prescription service is that electronic prescriptions can either be sent to a specific pharmacy or the patient can be given a printed identifier which any pharmacy can use to retrieve a prescription. A pharmacy's dispensing and drug management system is required to support this functionality and the messaging is carried by the N3 network (direct or tunnelled), which for those that don't know is the largest private internet there is.
The advantage of sending a prescription to a pharmacy is that the pharmacy can use this to get the prescription ready for you to pick up, which reduces your potential wait time. It also allows them to smooth out the pharmacy assistants' picking process and to make better use of the dispensing pharmacist's time, because the pharmacist must double check every prescription before it is handed over.
The electronic prescription is also designed to remove the step where a paper prescription must be confirmed by the pharmacy with the prescribing doctor prior to picking and hand over. This is an obvious check to have in place, as a paper prescription is nothing more than an easily forgeable pre-printed slip, but the check does add quite a lot of burden to the process.
Re: Medicines Act 1969
As I understand it, this is to dissuade pharmacies from cherry picking the supply of drugs, whether for stocking, financial or other considerations.
Re: Massive Budgets
Why, when I think of Charles Koch, do I think of the tobacco industry? Its product was repeatedly proven to be very damaging to health, and it later leaked that the industry had known this even earlier. However they squashed the research, justified their entire business to themselves and carried on as long as they could. Follow the money.
When it comes to climate change, so much damage has been done by those who are involved, including the green movement (nuclear power), governments (ignoring or adjusting reports as they feel fit and pandering to cronies) and I'd most definitely like to slap senseless the muppet (aka greedy economist) who came up with "carbon trading".
Re: Funny how they only discover these things after the data makes it clear
Climate modelling is ludicrously complicated, and as models are built new feedback loops and effects are discovered. It is impossible to model such a complicated system from the start while taking into account every effect; you have to start a model somewhere. So when something is observed but the cause isn't yet understood, it is still modelled, with the understanding that it can be improved upon later.
The only perfect model will be one that looks back at what happened. By then it's too late, of course, and even that model would not be perfect for predicting future events; it'd be a start but that's it.
At present, Samsung's own apps and services are largely shunned by the hordes who buy its hardware.
Because they are gut wrenchingly appalling, interfere with anything standard and generally make the phone experience somewhat worse? Samsung make great phones, but are utterly clueless about software.
MrXavia's point about the purchases not being transferable from one device to another unless they're both Samsung devices is a good one - just hadn't thought about that before. If Samsung introduced a really good, unobtrusive and well managed app-store that could also be used on all Android installs (it could still include Samsung exclusives), then that's one thing. However with the heap of unremovable junk that Samsung shovel onto phones they aren't helping themselves.
The May release will be the first in more than a decade to not include any bulletins for Windows XP. The venerable OS was officially retired from support by Microsoft last month, though subsequent exploitation of flaws in the OS by miscreants has forced the company to issue an out-of-band update.
...and there I was thinking that the out-of-band update last month was to patch Internet Explorer, and the security-hell-hole that is ActiveX within it. That's not a flaw in XP as such.
Don't know why you've been downvoted, but any discussion of Apple tends to bring out weird responses...
In theory they should all be built as well as each other regardless of manufacturer, as they should pass the same tests that Apple will have defined. Multiple, competing manufacturers should also be in competition with each other, and therefore quality should improve overall - or cost should fall at the same quality. Either way it shouldn't be a bad thing.
Spreading manufacture across providers is a wise decision - it's a high risk having a single manufacturer, particularly if Foxconn were to be hit by urgent priorities for other customers, potential ethical and legal issues over its workforce or anything else that could cause severe disruption at just the wrong moment.
As for the iPhone 6 and its rumours... I don't especially care for them as it's likely to be a largely incremental update rather than anything spectacular. This is not an attack on Apple, just an observation as to where the market is.
This is, of course, how apps for modern operating systems are programmed: they do little or nothing until the user interacts with the UI, or data comes in from the network, say. But it’s not a common approach with embedded devices, such as Arduino.
Errrr... what is the writer of this smoking? Any vaguely sane (*) embedded developer spends their life with event driven code. Depending on the embedded system it will either have hardware support and masks for event triggers or for simpler systems the developer has to implement everything in an execution loop but it's still event driven. Or does the writer think that having events built into a high level language or API makes an application event driven?
* I've seen some complete balls-ups from developers who plainly had no idea what they were doing at a low level (and often at a high level either). It was usually quicker to rip everything up and start again than attempt to step through and debug a spaghetti mess of interwoven interface and state checking code. The basic flow usually never needed to be complicated, often something like this:
1) Initialisation code
2) Check something
3) Do something depending on what happened in 2.
4) Exit code set? No, go back to 2.
5) Cleanup code [often considered optional]
Re: Migratory Robin?
UK robins only live a couple of years (fast paced years admittedly) and while some migrate, others don't. Even the robins that do migrate might not migrate very far, many were recorded as migrating just a few hundred miles if that far. They're adaptable (omnivorous) little birds that don't especially need to migrate in the UK as our winters and summers are mild enough. The situation is more confusing because populations of robins swap places...
Re: @Seanie Ryan
As a side note, am I the only one thinking the film title should be 'Star Wars Episode 6 1/2: The Smell of Fear'...
It should be "Spaceballs 2: The Search for More Money"
...perhaps followed by "Spaceballs 3: The Search for Spaceballs 2"
Please no, not Zac Efron or any other largely talentless Disney "child star or teen-idol". And no ****ing time travel either - although I'll accept that there's a chance that an actor like Zac Efron might not ruin the film totally, adding time travel to it will.
Re: They'd never get me
I've been offered bribes of sex and cash in the past. Unfortunately it was a long time in the past and when I was working on the gates to a concert, nothing IT related.
Re: Well if it involves...
Yes I'm that shallow.
I'm not. How tall is Anna Chapman? :)
"Effective or necessary" depends on your point of view. Certain certified environments mandate that all systems have antivirus software installed and that the images are regularly updated (I never did get a real answer as to what they meant by "regularly" though).
In some ways AV software is like a biological immune system where you get a shitty cold and get protection only after the infection, but you are protected from a repeat of the same virus - the difference is that your protection can be shared with others. One serious problem with AV software virus detection is that it is retrospective - it takes time for a virus to be detected in the wild as the "best" viruses avoid detection for a while and then for an AV vendor to produce working detection rules and to check these in house against various permutations of the virus and known "safe" software that shouldn't come up with a false positive. The AV vendors are always behind, and end users always suffer as each detection rule added to AV software increases the number of checks that need to be made, necessarily slowing your system down to a crawl.
It's not a situation where the AV vendors can ever win; the only "fix" is to improve prevention, and this requires careful operating system design. When the most common PC operating system has its roots in a system that was designed specifically for single user, stand-alone use, with everything else cobbled on top in a frequently changing direction and "new" products and platforms regularly abandoned and left hanging, it's no wonder we're in the mess we are in. A more secure system is a more closed and controlled system, but how closed can it get before we start reacting to the loss of the freedom that we enjoyed before?
Re: Bring back Ada
The heartbleed bug could be merrily implemented in any language that supports memory access; it was an algorithm error, not a bounds violation of any form.
Modula-2 might be ok but it was ruined by the inane insistence of the designer that it was going to be a single-pass compilation process. In reality this just doesn't work and you either wind up with horrible kludges to the code or progressively more unwieldy development environments.
I'm becoming increasingly convinced that there is simply *no excuse* for writing stuff in C (and C++) any more. There's just better ways to do it these days.
No one language is so superior to all the others that it is usable at all levels, from device driver all the way to up to user script level. As a general rule: the closer you get to the hardware, the lower the level of language that is appropriate for use. Efficiency really matters at the lower level, while wasting thousands of CPU cycles with boilerplate and support code is almost acceptable at the application level, it most definitely is not for an API call that could be called hundreds of thousands of times a second. Like everything there are always trade offs balancing code security and with efficiency.
Re: GOTO be GONE?
GOTO statements still have some relevance, but in higher-level languages they should generally be avoided. An algorithm can usually be written in a more structured, clearer manner where a GOTO is no longer required.
I would much prefer to see a GOTO statement than a "BREAK <n>" statement where you have to work through the layers of conditionals and loops to work out how many levels are actually being skipped out of in the parameterised version of the BREAK statement. "COMEFROM" would be clearer :)
Lower level, of course, you will see the exact functionality of GOTO everywhere because it is a fundamental control structure - JUMP and (conditional) BRANCH operators are key to assembly language processes. It's just that with progress we've abstracted their use away to reduce the number of problems they cause.
Re: Note to all C programmers
The style definitely doesn't help - and I'm certainly not a "friend" to many of the code formatting styles out there which encourage poorly indented and defined conditional blocks.
It's an absolutely appalling bug to be in place because:
1) An automatic code formatter applied to the code would have shown the problem with ease in a visual review.
2) The compiler would have produced a warning that the code block following it is never executed. Modern compilers are helpful like that. Then utter fuckwit developers either turn off the warnings or ignore them as there are so many. Hint for the clueless: the warnings are there for a reason, deal with them.
3) Testing should have revealed this bug very quickly as the function would not have behaved as expected. To be fair, what probably happened was that the code was tested, then the developer hit Ctrl-D while the cursor happened to be on the badly formatted line, duplicating it (Ctrl-D is a common shortcut in many IDEs), probably followed by Ctrl-S to save the current source file. Even so, a commit of the source and the subsequent diff should have revealed this error straight away; if it was introduced as part of a larger block of changes, the unit tests should have been re-run for all cases and the fault identified.
Re: A brain is not the answer.
A brain is a massively parallel pattern matching system. It also has a relatively ingenious lossy compression memory that manages to overlay many memories over each other while somehow keeping them intact enough to remain separate.
Any individual, small part of the brain can be implemented faster in silicon, however the ability to form and reform a wide mesh of analog connections between many neurons (quite flexible processing "cores") is something that is very costly to implement.
Repetitive, exacting processes are ideal for procedural computers, however other processes such as pattern matching, approximation and detail substitution are much better suited to neural networks. While one can substitute for the other in most circumstances, it is far less efficient.
Re: @Nick Ryan
You need to get a copy of Settlers 10th anniversary.
Oooh... thanks for that. I think I may need to purchase that straight away!
Re: My fondest gaming experiences have been two+ player:
Settlers... with a cardboard screen splitting the screen in two so we couldn't so easily see what the other one was doing. Unfortunately Settlers got worse and worse for 2-player with every new version and lost all the charm of the first. The latest is completely ruined by Ubisoft's moronic insistence on everything being about meaningless "micro-purchases", a ratings ladder and very limited (if pretty) maps - they don't even permit a ****ing save game feature in two player because it might be mis-used in the ratings ladder. Guess what, we don't give a flying rat's about the rating system, we just want to play the game. And without Ubisoft's DRM and other intrusive nonsense getting in the way as well.
Sensible Soccer (tournaments) - we took teams, played against each other, drank beer. Some days just never got better.
Re: Well, it was only a matter of time......
I'd much rather "TouchWiz" were an optional skin that could be installed (vendor locked) or uninstalled as desired. Same for the other manufacturers' launchers. If I never see another bit of "carrier" content again I'll be happy too; I remember too many phones utterly ruined by the total trash that the networks put onto devices while simultaneously removing anything useful that competed.
On a side note, if The Demon Spawn of Redmond, AKA M$ or Windoze (or whatever tired and unimaginative insults you can come up with) had announced plans to do this to their OEMs (or if they were to do so in future), what do you imagine would have been/will be posted on a thread like this one....hmm?
To be honest, this Google plan doesn't sound very different to how MS currently operate with Windows Phone, so they'd be hard pushed to make such an announcement.
While it is an extension of the Nexus devices, which in my opinion appear to be there to keep the other manufacturers on their toes, I'm not sure if this is going too far.
Maybe I'm missing something, but other than the headline where did it mention ARM? The Basic Qualifications section of the role lists "In-depth experience in optimizing workloads for high-performance x86 architecture" with no mention of ARM anywhere.
Intel is also working with integrated and custom dies, and while it's a rather different licensing model to ARM's, the basic principle is similar.
Headline writers... grrr... it's like they're attempting to catch our attention or something :)
Re: I actually like Windows phone
Why so many downvotes for somebody simply for saying that they like a product? Bizarre.
Because this topic is fan-troll bait, covering as it does both Windows Phone and Android, so even rational posts are going to get lots of spurious down votes but few counter-arguments.
The only comment I'd have about the above poster's comparison is that IMHO he's not comparing like-for-like devices, but given the number of devices and combinations of features it's hard to compare them objectively.
Re: "reacts like a teenager whose divorced dad has been seen dating a young stripper."
I think we need a "I think this would be better demonstrated using playmobil figures" icon... :)
Re: Low power tasks???
Yes, video playback is a relatively low CPU power task: the processor has to do little more than orchestrate the passing of data to the dedicated video decode hardware, which is genuinely efficient. Hence low power, as in a low-power CPU can perform the task.
The display will take more (electrical) power to display the video...
Re: Luckily for me
It also aids productivity because it ensures (*) that you concentrate on one thing at a time rather than continually flit like a geriatric lunatic between different tabs and downloads.
* as in, it could only do one thing at a time itself, therefore that is how you had to operate. No downloading in the background, no seeing the page until it was loaded, no tabs (don't remember an "open in new window feature" either)... and no .png support, no scripting... errr... I'll just load up lynx thanks. Did it even support marquee and flashing text?
I think I like the term "earslab" better though.
Re: Pottie "Moz's C/C++ replacement Rust"
"....what about C#?...." The problem for C# for many of the Penguinistas is that they see it as firstly a Microsoft product, and it is, in their eyes, therefore too tainted for them to consider, despite it now being an ISO standard CLI language. They even get huffy over the FOSS Mono version, calling it an MS Trojan horse.
C# is a Microsoft product - while it is labelled as an "ISO standard CLI language", we all know how Microsoft rigged the standard for Office documents.
Once the Microsoft dependencies (libraries) are stripped out, there is unfortunately not a lot left to C#. While the same could be said of other languages, at least many of the others have working alternatives for the functional libraries; without its Microsoft dependencies there is not a lot of real incentive to use C# over C++, as there are relatively few compatible libraries and pools of example code, although recently more have been released. AIUI it's also quite a bit slower than C++ for many tasks due to the additional baggage that comes with managed code - in theory it is safer though.
It now has 540 million such profiles, of which around 300 million people are said to be active in the Google+ "stream".
Is this "around 300m people" the ones who haven't figured out how to disable, or simply haven't yet disabled, the g+ "integration" options on everything Google?
Re: Not that easy
Unless there's a head crash, inserting old Amiga floppy disks into an old Amiga disk drive shouldn't damage them. There is a chance that if the data is magnetically "faded" (not sure what the correct term for this is) then it could be flipped by the read head but in this case the data is probably knackered anyway. Still, the caution that they exhibited wasn't entirely unwarranted given the potential value of what may be on them.
Amiga disks didn't operate at a variable speed; that was a feature of the Macintosh systems. The actual physical drive mechanisms used by the Amiga 3 1/2" DD floppy disk drives and PC 3 1/2" DD drives were the same - it was the interfaces that were different. The biggest problem was that PC operating systems were designed such that supporting formats other than their own was very difficult. AmigaOS, on the other hand, had a very flexible disk operating system and supported different formats with relative ease. Most problems with this support came down to the primitive PC file systems and their inefficient use of disk space - e.g. 8.3 uppercase file names compared to case-preserving but case-insensitive full-length file names, and 720k capacity compared to 880k.

While annoying, it is easy enough to copy content from an Amiga to a PC using a DD floppy disk, although if you want to preserve file names then it's a good idea to compress the content into an archive file of some form - lha and lzh are supported by many PC archive applications. Other transfer alternatives are null modem cables and the huge number of transfer suites that are, or were, available for this, and even IPv4 networking if you have the patience to get it working. One of the most useful tools I remember was software that mounted an FTP site as just another drive on the Amiga - this allowed you to relatively painlessly copy files to and from an FTP server using whatever application you wanted.
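The 720k vs 880k figures fall straight out of the disk geometry: both are 80-track, double-sided DD disks with 512-byte sectors, but MS-DOS formats 9 sectors per track while the Amiga's trackdisk layout fits 11. A quick sketch of the arithmetic (Python, just to show where the numbers come from):

```python
SECTOR_BYTES = 512   # bytes per sector on both formats
TRACKS = 80          # cylinders per side
HEADS = 2            # double-sided

def capacity_kib(sectors_per_track):
    """Raw data capacity in KiB for an 80-track double-sided DD 3.5" disk."""
    return TRACKS * HEADS * sectors_per_track * SECTOR_BYTES // 1024

print(capacity_kib(9))   # PC/MS-DOS: 9 sectors per track -> 720 KiB
print(capacity_kib(11))  # Amiga: 11 sectors per track    -> 880 KiB
```

The Amiga managed the extra two sectors per track by writing a whole track in one go rather than using inter-sector gaps the way the PC controller did.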
Converting data from the majority of IFF files, which encompassed ILBM and a lot of other formats, is not a particularly troublesome task given even basic coding skills. Again there are a few tools still going that help with this.
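To give a sense of why IFF conversion is so approachable: an IFF file is a "FORM" header followed by a flat sequence of chunks, each a 4-byte ID plus a big-endian 32-bit length, with payloads padded to even sizes. A minimal chunk-walker sketch in Python (`iff_chunks` is my own name for it, not a standard tool):

```python
import struct
from io import BytesIO

def iff_chunks(data):
    """Yield (chunk_id, payload) pairs from the body of an IFF FORM.

    An IFF file starts with b'FORM', a big-endian 32-bit length and a
    4-byte type (e.g. b'ILBM'); the chunks inside use the same
    id + big-endian length layout, padded to even byte boundaries.
    """
    f = BytesIO(data)
    form_id, form_len, form_type = struct.unpack(">4sI4s", f.read(12))
    if form_id != b"FORM":
        raise ValueError("not an IFF FORM file")
    while True:
        header = f.read(8)
        if len(header) < 8:
            break
        chunk_id, size = struct.unpack(">4sI", header)
        yield chunk_id, f.read(size)
        if size % 2:          # skip the pad byte after odd-sized chunks
            f.read(1)
```

From there, converting an ILBM is mostly a matter of reading the BMHD header chunk, the CMAP palette and unpacking the BODY planes - tedious, but not hard.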
Re: Some clarification
There is also the issue that what is traditionally referred to as "junk" DNA is in reality not junk, and is critical. As a result, comparing a "few" marker genes is in no way a complete comparison of species - it's a starting point though. The functions controlled by this "junk" are very interlinked and resilient, and there are clearly documented cases where different arrangements of it trigger the same end result.
Whatever it is for (most likely AppleTV of some form), a speech control interface is welcome. Siri may have its faults, but it's a step in the right direction.
Would save so many problems with losing the remote controls all the time...
Re: The odds are not too shabby @ Bilby
Come friendly asteroids, land on Milton Keynes?
Just doesn't quite have the same ring to it as bombing Slough. Although I'd argue for Slough, Milton Keynes, Luton and a good few other places as well.
I am far from a luddite (maybe rather closer to a closet tech-geek), but why do the damn interfaces on these things have to be so awful?
It is much nicer to use a push-on/off rotating dimmer switch than dual-function up/down buttons. I hate button re-use with a passion; it makes for some of the worst interfaces. It's not as if switches have to mechanically control the circuit any more, so a digital rotating control, perhaps with a mechanical stop, plus a push on/off button is not hard. And get rid of the bloody LEDs. I have too many of these things glowing away for no readily useful reason, and while a nice subtle LED lighting a switch in a dark room isn't an entirely bad thing, a "burn your eyes out it's so bright" blue LED is what tends to get fitted these days.
And as for the remotes... the cheapest, nastiest OEM remotes with... wait for it... dual-function, barely decipherable icon-labelled buttons. Gits.