33 posts • joined Tuesday 22nd May 2007 13:49 GMT
Power Puff Girls might have got him, at least the plastic shopping bags did ;)
Last night it was on the news. I think it was dead; it had eaten something like 6 or 12 plastic shopping bags, an inflatable float, and another thing or two.
You guys are a bit slow on the funny news, aren't you. Parliament finished last Thursday and they have gone back to the capital.
> By Peyton Posted Friday 31st October 2008 13:35 GMT
> Surely someone can just call the Powerpuff Girls in to deal with it...
> (http://www.cartoonnetwork.com/tv_shows/ppg/ if that reference doesn't translate across the pond)
The irony does not escape people from the rival city of Cairns. I once thought of going to Townsville, finding that great big "Welcome to Townsville, the capital of North Queensland" sign, and putting a great big "The Home of the Powerpuff Girls" sign underneath it. I had a nice big framed Powerpuff Girls poster too, which I picked up cheap at the $2 shop, to potentially go with it. Don't suppose I can do it now; too old, and the police would be able to trace me through this post :(.
You guys beat me to it. Yes, a peep hole; guess there's no patent for such an "unobvious" improvement to the invention. I write science fiction stories and come up with a lot of fancy ideas, and the peep hole is something I use for situations like this, but with some improvements.
A peep hole to relay an image can be so small that you don't notice it; think pin hole with an image sensor (like those security cameras). The peep hole can be made to emit a light spread that matches what people should see from that point anyway, with a matching hole on the other side to reproduce its light (interpolating over and closing the peep hole can also hide it).
Another alternative is to let enough light through for a sensor (or the naked eye) to see, but with so little dimming in broad daylight that it is unnoticeable to the naked eye.
A material can detect what hits and passes through it, relay that information to those on the inside, and reproduce the light where it should pass out the other side. I do some innovation design, and have put some thought into techniques useful for this.
Another option is that no light has to pass through at all: the photons can be detected at the surface, relayed, and reproduced on the inside.
Realistically, it will look wonky for some time, as light is imperfectly bent around or reflected (as in the Predator films).
A better way.
Yi! 95 years? Who gave them that; it used to be 50 years. This is a ridiculous money grab. They should have 20 years of exclusive copyright, then open licences for 50 years at prescribed low royalty rates, or the lifetime of the author plus 20 years, whichever is later. In this scheme, companies would sign up with a central authority after the first 20 years and use the music under a simple contract, with the content registered with the authority.
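To make the proposed scheme concrete, here is a minimal sketch of the term arithmetic (the `Work` class and the year figures are my own illustrative assumptions, not anything from the article):

```python
from dataclasses import dataclass

@dataclass
class Work:
    published: int    # year of first publication
    author_died: int  # year the author died

def exclusive_until(w: Work) -> int:
    """First 20 years: full exclusive copyright."""
    return w.published + 20

def open_licence_until(w: Work) -> int:
    """Then 50 more years of open licensing at prescribed low
    royalty rates, or the life of the author plus 20 years,
    whichever is later."""
    return max(w.published + 20 + 50, w.author_died + 20)

w = Work(published=1960, author_died=2000)
print(exclusive_until(w))     # 1980
print(open_licence_until(w))  # max(2030, 2020) = 2030
```

Either way, the total term tops out well short of the 95 years being complained about.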
Schultz, they have not made enough of our money yet.
Copy right and wrong
95 years just sounds crazy to me. Successful musicians and performers will have made their money long before then, and the others...
What about a solar sail like wrap?
I remember, many years ago, waking up and smelling the most beautiful spaghetti. So I got out of bed, went to the kitchen, and had a little of it on the spot; boy did it taste good. I then looked over at the chopping board and saw blue mince bits on it, and it dawned on me that my father had grabbed the kangaroo pet mince from the fridge ;).
I don't know what he did; I think it is the best (apart from my Asian style) that I have ever smelled, and I haven't been able to replicate it myself. Hmm, I wonder if it is time to jump back to the pan ;).
Invisible troops photo.
Nice photo, but I can't see the squashed grass; you didn't say they had invented anti-gravity as well.
But have you guys forgotten that Japanese inventor who invented a cloak to bend light around him? Not that effective, but nice.
I still think holographic storage was possibly behind the decision to go with cartridges for the Nintendo 64. Before it came out, there had been a breakthrough in cheap holographic storage (a "coming soon" sort of thing, but as usual this did not eventuate).
While cartridges were costly and limited in capacity (by cost, for one), the holographic part promised lower cost and more capacity, while further securing the storage cartridge from simple copying.
It will be interesting to see what the strategy is now.
You notice the address bar now searches the bookmarks as well; that was one of my suggestions (I don't think I suggested address bar history, it has been around for years, and it was a long time ago to remember). I have over 12K bookmarks, and trying to find web pages through Google is frustrating, so I suggested this. Trying to use bookmark search is also frustrating, as it is too simple, so one feature I wanted to see was more comprehensive search functions in bookmarks and email, even googling across the page list in bookmarks for some once-seen information. These are the types of features I mentioned that Opera seems reluctant to incorporate. Just those suggestions are a revolution for a researcher trying to find something they once saw.
Not having to save a backup of an email to drafts was something I was also onto them about. I am pretty sure the discard button was as well, but it is placed such that you can (and I have) accidentally press it, and it wipes the draft and the current compose session.
Here is another suggestion: I suggested they move the tab window controls into the blank space next to the menus, which they did (though it may be a little too close to the close-application button). I think I also suggested that they could go into the application bar, if possible. In reality, if Windows allowed it, I would prefer the panels, new tab, and closed-tabs icons in the blank space in the menu bar as well (unless there was not enough room, in which case Opera would default to the current configuration), and the whole lot moved into the application bar, with the title also inserted in smaller text, or at least the tab controls. It would be extreme, unless you use large-sized fonts like I do.
About Firefox stealing Opera's ideas: they could always patent those features, if they hired the original inventor of the features ;).
9.50 was probably a major clean-up after all the bugs previously reported; 9.51 was just an easy clean-up of a few missed ones that had been reported recently.
It is a shame most of you guys will have disappeared by now (for this site does not email notifications when new comments are received).
A lot of the work-flow improvements and fixes for stability issues are from reports I put in to Opera. I spend a lot of time doing research on the web with Opera, having over 400 page tabs open at times. Because I am a heavy user, I tend to see all the bugs and stability problems you don't notice with ten tabs or short sessions (the transfer memory leak and the printing-related stability issues were some of the recent problems I found and reported, and a lot of adjustments and a number of work-flow improvements in 9.5 were ones I submitted). I am also a computer scientist, and do work-flow related design.
I have gone through the ups and downs of Opera releases, and there are normally problems, sometimes worse than normal (like the release of the 9 series), and you get sick of finding new issues and sending reports; a ten hour day is a lot slower if the issues cost an extra 20%, or more, on your workload. I notice that Visual C/MS featured in a fair few bugs. I don't know what happened, whether they converted to an MS-based development system, a new team, or whatever. There are a lot of improvements from my suggestions in the way the work-flow works, and at last I am happy with 9.5, but they tend to ignore some of the big new feature improvements I want, ones that would just crush the competition from a professional user's viewpoint (and also be very good for normal users). Opera is probably still the top dog, but if I took all my suggestions over to Firefox, would it be? All Firefox would need (if they don't have it already) is a comprehensive testing team with brilliant ideas for improvements, and a brilliant development team, with more ideas, that could take all that and program without errors. You would also need a couple of brilliant individuals to guide the process and strategy.
I would say give it a try, but when you do, just remember there are a lot of hidden features that might not be obvious without some use (also read the documentation to find them).
Sorry for the quality of the writing, it is rather late here.
Forgot to mention: Browns and Taipans are aggressive, Taipans the most. During mating they will defend territory; you can come within 20 metres of one and risk being bitten, and they can travel at a fast running pace. Cycle riders have had snakes chase them down the road and struggled to get away from them (probably hilly).
Death adders just sit there waiting for you to step on them; they don't get out of the way, and are hard even for trained trackers to see. They attract prey by wagging their tail like a worm. One brushed my ankle, and when I looked behind me it was only the eyes that gave him away. I went away and came back to the exact spot, and it took a while to find him; he just looked like one of the sparse mango leaves or sticks on the green grass. Very well camouflaged.
It is not so dangerous; many snakes get out of the way, and you can always stay out of tall grass and bush.
I come from near Cairns, where the Taipan was originally identified. What's an inland Taipan? I know that the Western Brown is the deadliest in the world. A Taipan can produce enough venom to kill a hundred people, but I think it is more like 30-40 normally. There is a documentary around where a guy in nearby Mareeba milks the angriest snake (a Taipan) I have ever even heard of, and just after he finishes telling them that those snakes can produce enough venom to kill 30 or 40 people, the snake drops in what looks like enough to kill a hundred. The same documentary has a six year old child that a scrub/carpet snake tried to eat. The record carpet snake in Australia was in my local neighbourhood.
Most of the deadliest snakes in the world are local.
Roger dodger, and here's a big fat Aussie Trouser Snake to you, Register.
Quickly scanned the article; a passing note. Being confined in a military-like room with heaps of electronic equipment emitting electromagnetic fields, vapour fumes, and heat, and producing positive ions, doing a stressful job with little physical activity, is prone to make people exhausted, stressed, and fatigued (not to mention diets of junk/fast food, Crunchie chocolate bars, and Coke, instead of low-burn physical-fitness/mental-alertness diets of veggies, salad, and some meat, with moderate doses of carbs). These people are doing stressful jobs, like air traffic controllers, and should be given similar breaks and pay.
So give them fresh air, install a negative ion generator and active carbon filter, give them good diets and exercise, and put them on air traffic controller-like workloads and pay.
To all Anonymous Cowards
>Your "anti-sniper radar for Katyusha" has one flaw: the perpetrators are
>nowhere near the rockets when they are fired. You'd have to spot them
> setting up, and they're probably pretty good at Maskirovska.
I think you are misunderstanding (actually mixing up separate methods directed at different things).
The sniper is directed by visual confirmation of where the fire comes from (ever hear of continuously flying drone systems?). So sensors in the sky and on buildings are sufficient.
The missile shoot-down is done by radar (and visually if needed); it doesn't matter where it is fired from or at what elevation, what matters is where it passes.
As per the method, NO EXPLOSIVE ROUND IS FIRED INTO CROWDED AREAS; THE METHODS ARE DESIGNED TO ISOLATE OUT INNOCENT NON-COMBATANTS (unlike many present methods). Re-read it.
I hope you realise that you don't need a border checkpoint to drop in weaponry, in particular if it was done prior to isolation. We developed the most advanced over-the-horizon radar system in the world in Australia, and still it was penetrated undetected. The systems might be more advanced nowadays, but there are still ways around them, in particular tunnelling.
Sorry for the spelling; I haven't been well and wrote it quickly.
More effective to
I came up with an idea for a solution to this sort of threat years ago.
Establish a visual sensor network to identify the source of the launch, and an auto-targeting system to target the area of launch (there is a way to determine the target that the military should be aware of), then hand control over to firing operators who determine whether the target, and which targets, are legitimate, and launch. By using a long range bullet-like round (and wind radar, etc.), culprits can be targeted sniper style. Make no mistake, it is still pretty sophisticated, but a Korean firm has sensor technology able to target night and day with enough ability to clearly see culprits. Alternatively, satellite tracking can kick in to track assailants' movements so they can be apprehended. It also may be possible to tag assailants and track them, or identify them by the tags (fired at them at a velocity designed to cause little damage except for the tag), which is desirable when you don't know if a person is an assailant or a bystander; they can then be questioned (please, no jokes about the likelihood of the army doing that, this is about what could be done if done correctly). My favourite would be the system where two laser beams are fired to ionise the air and electricity is run down the channel. With such a system the area can be swept within a second of the launch, and then the potential suspects picked up and compared to video evidence. Eventually (given resolution and processing), ultra high definition video evidence could be examined to determine who was involved prior to launch, and also processed, by image analysis, to determine if a launch is being set up.
The Australian Navy has been using a rapid-fire targeting system on their ships for years to target missiles and shoot them down. This would probably be much more economical and effective.
It is not warranted to dismiss the chemical lasers simply on the basis of cost and awkward hazardousness (though maybe they were astronomically expensive), as the use of the laser will act as a deterrent to the continued use of missile attacks, which eventually should lead to decreased attacks and decreased use of the hazardous lasers. It is more credible to dismiss them because the lasers can't target or act quickly enough to ensure effectiveness (which is a possibility). However, they are probably more effective against longer range ballistic missiles, which is probably their designed purpose. I do some design work on laser scanners, and I am sure it might be possible to target anything in a 360 degree arc in a fraction of a second to a second (which is probably what the Americans are having trouble with).
I would like to state that I am not affiliated with an army or defence contractor, but am merely an innovator in consumer-oriented electronics.
Extreme self-commenting code.
Not going to read all this but:
Naivety. Sure, self-commenting code helps a lot, but the very best, most efficient code can be extremely complex to read and understand (so comment where necessary). The smarty pants that wrote it probably factored into one step what would take others several steps, so how do you expect others to suddenly get the insight he had in that factoring just from the self-commenting code? To further add to the problem, said smarty pants, if he did comment it, might think it all so obvious that his comments are so abstract they don't help much either. So, comment where necessary, down to the standard the competent or average programmer can understand (whichever is less). If you have below-average programmers, comment down to their standard. And don't assume that a competent, or average, programmer bears a direct relation to your high standard of intellect.
I tend to write down to a level of reasonable readability. But I had an engineering friend at uni, and she could immediately write something simpler than several of my attempts. Even in my VOS design documents, I sometimes write descriptions that seem to have obvious meaning, but that I cannot fathom later. The problem is we assume something is obvious because we already know the knowledge, experience, or answer, so we should never con ourselves that the meaning is so obvious.
One of the problems I imagine in this debate is the common use of high level languages, where convoluted code can be inefficient code, and the assumption that truly efficient code can be written in such a language leads people to the false security that self-commenting code is more effective. Low level languages, however, tend to allow a broader range of possible combinations of instructions and techniques, producing more efficient but convoluted code.
I know somebody that wrote an IDE hard disk interface driver in 100 instructions, or 100 bytes, I forget which. The code was not as readable to others as it might have appeared. Actually, the self-commenting code of this class of extreme programming is not so readable.
It is like an ego thing: "yes, spend ten years of zen to understand my program..". The last rule is: why put people through all that pain to understand your code, when you can just adequately comment it for them to zoom in on the area of interest and change it? "Because I don't like commenting.." maybe; in other words, couldn't be bothered, it is somebody else's problem.
When they can extreme-program in assembler so that 90% of programmers can easily follow what they are doing, then I'll believe some of these things.
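To illustrate the point about "self-commenting" one-liners, here is a small sketch using a classic bit trick: the function name tells you what it does, but only the comment tells you why the trick works, which is exactly what the clever original author tends to leave out.

```python
def count_set_bits(n: int) -> int:
    """Count the 1-bits in n using Kernighan's trick.

    n & (n - 1) clears the lowest set bit: subtracting 1 flips the
    lowest set bit to 0 and every bit below it to 1, so the AND keeps
    only the bits above it.  The loop therefore runs once per set bit.
    Without this comment, the one-liner is "self-commenting" only to
    someone who already knows the trick.
    """
    count = 0
    while n:
        n &= n - 1   # drop the lowest set bit
        count += 1
    return count

print(count_set_bits(0b101101))  # 4
```

The name alone would pass a "self-commenting" review, yet a maintainer who has never seen the trick still has to rediscover it without the docstring.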
PS James's lead looks promising.
Search and software.
They have got to be joking. There is a lot of improvement that could be done to Google; maybe the improvements are tied up in illegal software patents (illegally forcing people in other countries, where they are not legally enforceable, to suffer second rate service). There are a number of functions that have been in other browsers for years that I sorely miss. I can't even tell Google to look for exact spelling, because it allows for multiple endings of words, even, I think, when you put it in quotes, which doesn't help if you are looking for an exact word or phrase to filter down the number of results, or punctuation like "." after an initial, or a tab, or a model code/name or initial, when it grabs in similar words. And there is no real wild-card, like * for a combination of letters and numbers in any part of the word, and ? for each letter. The web browser companies really need to give some sort of semi-proper, comprehensive search facility, like you get in the data processing script languages under Unix. The same happens with file search in Windows; you have to dig deep into the indexing system to find something, and that is difficult to get at. I'll mention a new feature to think about, "sounds like", where you can feed in a word or phrase and the search is done on any phrase that phonetically sounds like it; the search engine would store words along with their phonetic equivalents, so this would not require correct spelling of exact phrases. These sorts of things would have saved me many hours, many times, in searching. I have come up with a lot of search technology over the years, and a lot of engines are only scraping the surface of human-oriented performance.
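The "sounds like" idea is essentially what the classic Soundex algorithm does: words are indexed by a phonetic code rather than their spelling. Here is a simplified sketch (the exact rules vary between Soundex variants, and this version assumes plain alphabetic input):

```python
def soundex(word: str) -> str:
    """Simplified American Soundex: words that sound alike map to the
    same 4-character code, so a search for 'Smith' also finds 'Smyth'."""
    # Consonant groups share a digit; vowels and y/h/w get no digit.
    codes = {c: d for d, group in enumerate(
        ["bfpv", "cgjkqsxz", "dt", "l", "mn", "r"], start=1)
        for c in group}
    word = word.lower()
    first = word[0].upper()
    digits = []
    prev = codes.get(word[0])
    for ch in word[1:]:
        code = codes.get(ch)
        if code and code != prev:   # collapse runs of the same code
            digits.append(str(code))
        if ch not in "hw":          # h/w do not break a run
            prev = code
    return (first + "".join(digits) + "000")[:4]

print(soundex("Smith"), soundex("Smyth"))  # S530 S530
```

A search engine storing `(word, soundex(word))` pairs can then match queries by phonetic code instead of exact spelling, which is the behaviour being asked for above.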
I call on Microsoft and Google to tear off the patent shackles and use all the techniques in service to countries that do not support software patents. There is nothing illegal in that; it is actually applaudable and admirable. All they have to do is locate the servers in countries that don't have software patents too (some of them will be most glad for the income). Then merely put notes on the help pages of all countries, under functions covered by software patents, saying: this command MAY NOT be available in your country, because it MAY BE covered BY SOFTWARE PATENTS. And also, on screen, an explanation of how many countries don't have software patents and consider such a thing to be invalid, etc. etc. etc.
I might also say, about complaining about Google apps: so much software seems to be faulty or not well defined, and Google is not the only one. Microsoft, Google, and others need to crack down on this trend in the computer industry. Making lots more, faster, has been a recipe for disaster for users, and also for code patching.
Real man=real solutions.
In other words, hire a competent technician before a code factory worker in a suit or an overly confident nerd.
I want a titanium one, with an actual titanium case, and weatherproof for the beach, or diving ;).
They could at least make a solid aluminium one, or use that stuff they make those shiny mugs out of.
Now, there would be something worth buying.
RE: Pravda - Russian?
It does cause you to think strangely. I tried a veggie diet for a week and a half or two weeks. Sort of soft, spongy, perverted thinking/logic; the sort of thing you need if you are being led.
I am more interested in how true their claims are (without reading through over 100 comments). In summary, did anybody find that any of it was not supported by evidence and untrue?
It is funny that the bad behaviour symptoms they ascribe to vegetarians sound like the symptoms of ill health from malnourishment. The only vegetarians I have personally known have not suffered from this, but then again I don't know for certain that they did not have some meat. Now vegans have a more restrictive diet; I wonder if they used studies that used them. I am not saying that vegetarianism is OK for most people; it is harder to get an adequate dietary intake, from what I have been told.
About the lack of meat leading to low birth rates in Russia after the collapse of the Soviet Union (where did you get that from?): I remember, back in that time, news reports of whole groups of miners not getting paid for months, and others not getting paid (for whatever period). Food was scarce, and they were depicted eating a few vegetables. So that part might be true, but I also imagine it has something to do with wages and the standard of living going down, making people reluctant to invest in having large families (which the state used to promote). The other factor that might, on a slimmer chance, have something to do with it is the allegation that certain chemicals turning up in the food chain, and from plastic, could be producing the falling sperm counts in western men. Basically, one mimics oestrogen and is used in food manufacture (I think also as a waste product of manufacture); it gets into the environment and into our food, the wrapping for our food (even tinned food is lined with plastic nowadays), and our microwave food containers, and tries to neuter our sperm count a bit.
Ahhg! How long has this Amiga Anywhere revolution been coming, ten years? Sounds familiar, doesn't it: Duke Nukem Forever = Duke Nukem Anywhere style.
The Taos Intent stuff was really good, I am into OS design, and they were my only real technical competition.
They really killed it when they decided to charge for the Amiga player software. They could have made money, like Flash did, from selling development tools, and anybody could have got into buying software for the platform, or even web pages using it instead of Flash (another possible market).
The Amiga has had the most brilliant run of BAD luck that I can remember in the computer industry. Amiga did not advance the system enough to produce quality demand to keep enough money coming in. We then got the takeovers, here, there, etc., promising all sorts of things (like the Amiga vacuum-cleaner-like case). After the first one or two takeovers, the Amiga hardware was looking worth about as much as the Sinclair QL compatible computer industry. These days, those designs are really only suitable as cheap Chinese 1000-in-1 game systems. The Amiga Anywhere software was a good shot in the arm that did not go very far.
What can we do these days? Development of Linux is so far advanced that the drab desktop screens of the revolutionary Amiga OS just don't cut it (get an ergonomic GUI stylist in, guys). Hardware is even worse. The cost of graphics development is so high that even the console industry uses PC graphics technologies. The cost of processor development is also so high that the PC and console industries already have many things covered. The consoles have become the Amigas of this day. So, unless they want to invest seriously big money, how can they compete on conventional technology?
There are a few avenues left: massively parallel arrays, used on chips like the Ambarella codec, ClearSpeed, etc.; roll-your-own massive ARM processor arrays; and processor-in-memory processing arrays are some of the few low-cost ones. There are also some other technologies, including new ways to do physical displays, 3D graphics, and simulation, like I have been considering. But the reality is, few companies have the right staff to make the breakthrough and pull ahead.
I identified this sort of issue a while ago. Any simple hash function must produce the same value for countless combinations. Though most of those combinations might have limited or no practical use to a hacker, some intelligence can narrow them down to a useful design.
So I devised a comprehensive idea (unimplemented) of multiple hash values that nullifies this potential threat. In reality, there may be no file system supporting this level of comprehensiveness, so separate hash values would have to be maintained, and a comparison made when a file change happens (i.e. time consuming), until OS companies can incorporate such hash systems. Until then, the present hash systems can be combined with file change details and file lengths. A file change date indicates something has happened (though the date might be got around, so OS companies will have to enforce it, or at least enforce a flag that a change has happened). File length matters because it is easy to make a hack that changes the file size without compromising file functionality, but hard to make one the same size, unless it refers out to another file, in which case the files that a file passes execution to need to be pre-defined, to easily detect unusual behaviour.
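A minimal sketch of the "multiple hash values plus file length" idea (the function names and the choice of SHA-256 and BLAKE2b are my own illustrative assumptions): forging a file that collides under one hash is conceivable, but making it collide under two unrelated hashes *and* keep the same byte length at once is far harder.

```python
import hashlib
import os
import tempfile

def fingerprint(path: str) -> dict:
    """Record several independent checks for one file in a single pass."""
    sha256 = hashlib.sha256()
    blake = hashlib.blake2b()
    size = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            sha256.update(chunk)
            blake.update(chunk)
            size += len(chunk)
    return {"sha256": sha256.hexdigest(),
            "blake2b": blake.hexdigest(),
            "length": size}

def unchanged(path: str, recorded: dict) -> bool:
    return fingerprint(path) == recorded

# Demo with a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello world")
    name = f.name
record = fingerprint(name)
print(unchanged(name, record))   # True
with open(name, "ab") as f:      # tamper with the file
    f.write(b"!")
print(unchanged(name, record))   # False
os.remove(name)
```

As the comment above notes, the modification date and change flag would still have to come from the OS, since a hostile program can rewrite content without this check running.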
Combustion engine efficiency
Too much to read, but I did not see the following before I stopped reading.
Your article points to the energy density of fuel, even compared to recent battery advancements, but this is not the full story.
Petrol engines are very inefficient. Though you might be able to get 20-30% efficiency (old figures) at an ideal rev range (such as in a hybrid car), in reality the engine operates over a range of less efficient speeds, and the act of changing the operating speed during travel probably uses a fair bit of extra fuel too. Now, put that through a transmission, etc., and you may be looking at an ideal rev range delivering less than 10% efficiency. Now subtract for engines running at less than ideal revs, and for braking (though a normal car could be designed with regenerative braking, they are not as well set up for it). So now the differences between the batteries are much less.
What is the answer for the internal combustion engine? I have lost track of the number of promising fuel efficiency technologies for the internal combustion industry that have gone walkabout in recent decades. These petrol companies want to be broke in 20 years, leaving the rest of us to suffer. If all the technologies were put into hybrid engines, we could look at efficiencies closer to 50%, going to an electric power train that preserves more than 80% of the energy (conservative), and these companies could be in business for far longer. Add to this an idea like using a bank of super capacitors (research is going on for a major Australian car), or a compact battery like the one referred to, that can be charged at home and reduce fuel consumption. The other possibility is fuel cells; if fuel cells can be made sufficiently efficient, we can reduce the size and weight of cars.
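A quick back-of-the-envelope for how the losses chain together (all three percentages below are illustrative assumptions in the spirit of the figures above, not measurements):

```python
def wheel_efficiency(engine=0.25, off_peak=0.7, transmission=0.85):
    """Chain the losses: a 25%-efficient engine, run off its ideal rev
    range (keeping roughly 70% of that), through a roughly 85%-efficient
    transmission, delivers only about 15% of the fuel's energy to the
    wheels -- and braking losses would pull it lower still."""
    return engine * off_peak * transmission

print(round(wheel_efficiency(), 3))  # 0.149
```

Multiplying the stages like this is why the headline engine efficiency overstates what actually reaches the road, and why the gap between battery chemistries matters less than the raw energy-density comparison suggests.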
This was local. Lucky he wasn't in the toilet; it would have led to confusion as to what sort of "piss" he was taking while leaving the book face up on the ledge.
If you think that is bad, try getting on a local plane and telling them the book is "not a bomb" (seriously, don't; the laws give long prison sentences, and there have been a number of local incidents ;).
Political paranoia gone mad, maybe. If the dimwits only knew how easy it was to seriously do something, they might have some sort of sense of humour. Back in the 70's (or maybe early 80's) there was a sixteen year old who designed a serious bomb as a party trick; according to law enforcement, all it needed was the actual materials.
Let's see: if you were reporting on white supremacists, the south of the US might feature heavily (apart from a more obvious choice); if you were reporting on 419 scammers picking on people of another country/race, then where they come from might feature more prominently. If I made a homemade helicopter from a Civic, I might be lucky to feature in the Register (unless they found out about it, in a non-racist way, except for terms like "Australian suicidal nutter").
Actually, since I saw that article on the home made jet pack, I have been tempted to make a mock one, invite my impressionable friends over, stage a leak, and pretend that it is going to immediately explode. If I could get them on film, with me running after them pleading/begging them to help me get the thing off before it blows up at any second, even better (please note: do not try this at home; such a thing is extremely dangerous, accidents and heart attacks, resulting in dead people, do happen, and the police are sure to arrest you for public nuisance, possibly fake bomb making, even being part of a terrorist plot, and beat you up).
Of course, a lot of that was said in my "House" voice (thanks to God, that he allowed Hugh Laurie to exist)..
That is my plan; I am researching a number of technologies and techniques that would make this possible across Australia (and allow us to corner the dwindling supply of the timber market).
I have been worried about what this article is talking about, and a cascading failure of Antarctic ice shelves, for some time, and it looks like it might be true.
Apparently, carbon is stored in water that sinks to great depths and travels for hundreds of years, eventually coming up again. There are probably a number of possible reasons for what the article is reporting; this may or may not be related. Guess what happened in previous centuries: the industrial revolution. If it is this water returning, then we can expect absorption to go down as more polluted water comes through over time, maxing out in hundreds of years. This could be in combination with saturation from present absorption as well, or mainly related to present trends; as long as it is not some cascading effect, it will be slow after it quickly gets a lot worse than we thought. So, time to powder and fertilise that scrap metal and throw it in the dead zone.
The other thing is a recent article that worried about a 1 metre short-term sea level rise, and 5 metres (by 2070, or the end of the century, I think) with failures of Antarctic sheets. I wonder what that guy will be revising his sea level rise predictions to, given this latest news? I have been wondering about all this blustery Antarctic weather we have been getting this year, after the sea ice collapse in Antarctica. That cold air where warmer air should be means all that heat must be going somewhere. A 5 metre sea rise is nothing compared to the possible sea rise.
The good news is that, according to Al Gore's "inconvenient" graphs ;), carbon levels should go up and crash as they have before (we hope), eventually. By those graphs, I think there is a feedback mechanism, and an interesting thing I have seen speculated is algae-full seas (with jellyfish). There you go: no need to worry about seafood stock depletion, or a need to burn more coal.
Is this broken?
Far too obvious: one click instead of more than one, a shopping cart, all too obvious. You are not supposed to be able to patent obvious improvements; the US system is just too extreme.
NOBODY should have to pay to get an obviousness review done. Such reviews should be largely unneeded, as the patenter should be liable to check and patent only a valid patent, and criminally liable where they are deliberately trying to slip a "sly one" by for commercial gain, and the Patent Office should be liable to carry out proper searches and procedures to largely eliminate patent "mistakes" and invalid patents. Fighting the rejection or exploration (including hindering) of a patent claim should be taken as a criminally liable act, unless it is provable or has a reasonable expectation/belief behind it.
As an incentive, even a reward could be offered for successful challenges of patents, based on claims struck down or modified.
No money should be required to defend a patent.
Also, companies/law firms that specialise in breaking patents, by finding patents owned by individuals or small businesses that can't afford to defend them or prosecute unlicensed use of their patents, should be held legally, and criminally, liable, with the assets of the people/entities involved in these criminal collaborations seized (to the extent of their involvement) and the people put in jail. With no time limit on prosecution of individuals, or of entities that have not changed out of the hands of the conspirators.
All this will lead to a higher quality patent database, more patents, and lower cost plus higher productivity as a consequence. Some pain, for much, much gain.
There is a lot of garbage being planted here, no pun intended. By the accumulated effect of different methods we can make a difference. The oceans accumulate deep down, and over hundreds of years it turns up at the other end; how long since the industrial revolution? How much can it take? The so-called dead zone in the South Pacific is devoid of a lot of biomass, and apparently they don't know exactly why. But what they have shown is that the addition of a little iron, or fertiliser, will encourage great growth (blue sea to greenish sea). It is being thought of as a potentially big carbon sink, but not so directly predictable. So the reality is that making money out of trees is much more attractive.
One of the problems/solutions is that too much of the earth is arid or semi-arid; if it all had substantial rainfall, much of the problem could be abated with biomass.
I thought that when a tree decomposed it didn't all go up in CO2 so quickly. But I would like to suggest, as pointed out elsewhere, that apart from storing it like previously suggested (like filling up deep, deep valleys), how much of it is stored up in use makes a difference. Chip a rock multiple times, and you are likely to end up with a bigger chunk out of it.
The Internet is being scourged with useless advertising. If it is being put on a website, it should be paid per view, or per period, period. The websites are doing the advertiser a publicity service and should be paid for it; otherwise they are wasting their time and space, and it is up to the advertisers to actually sell something. Worse still, the ads raise awareness of the brand and products, but the user might go to the advertiser's website at a later stage, or from another website because they checked it out at the previous site, and the original website will not be paid.
What is happening, instead of straightforward, honest, non-underhanded advertising, is that these alternative schemes, in general, are forcing website owners to put more advertising on their websites, copious amounts actually, overloading users and making them pay much less attention to it. It also makes pages and load times much larger. This shoots everybody in the foot: the advertisers get lost in the humongous crowd, as they don't stick out (unless they put up an annoying advertisement), and the website owner gets little return as users are now avoiding the advertising, forcing the website to depend on even more advertising. If we had stuck to per-period, or per-view, advertising, then advertisers could pay to be the only advertiser per page, and websites could get away with only a couple of advertisements per page, which the users would be much more interested in responding to.
Quote of the month: "What is Google AdSense, is it used around here?" says the person who is so used to ignoring too much advertising that he doesn't realise it exists.
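To put some rough numbers on the payment-model point above, here is a back-of-envelope sketch (all the rates and figures are hypothetical, purely for illustration): a site paid per view keeps earning from ad-blind readers, while a per-click site earns almost nothing from them.

```python
# Back-of-envelope comparison of per-view vs per-click payment models.
# All rates and figures below are hypothetical, for illustration only.

def per_view_revenue(views, rate_per_thousand):
    """Pay-per-view (CPM-style): the site is paid for every impression shown."""
    return views / 1000 * rate_per_thousand

def per_click_revenue(views, click_through_rate, rate_per_click):
    """Pay-per-click: the site is only paid when a user actually clicks."""
    return views * click_through_rate * rate_per_click

views = 100_000
cpm = per_view_revenue(views, rate_per_thousand=2.00)      # hypothetical $2 CPM
ppc = per_click_revenue(views, click_through_rate=0.001,   # 0.1% CTR from ad-blind users
                        rate_per_click=0.50)               # hypothetical $0.50/click

print(f"per-view:  ${cpm:.2f}")   # $200.00
print(f"per-click: ${ppc:.2f}")   # $50.00
```

With users trained to ignore ads, the click-through rate collapses, and with it the per-click site's income; the per-view model pays the same regardless.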
Our modern programming industry is a bit brain-dead. I agree that only experts should do optimisations (and that we need to be better trained) and that hardware tricks are undesirable (because they don't transfer over to new machines and hardware etc.) but this illustrates the problems in the industry of sheer inefficient bloat and slowdown. From the performance of PC software, it just doesn't seem to be highly optimised, and it would not surprise me if it just isn't being done well.
If the hardware, then the firmware, then the OS, then the design of the program were optimised in the first place, then the programming could be more easily optimised, and Vista and Office could maybe fit in 10MB (depending on graphics/sound/data file use etc.) and run at least twice as fast. Instead we have slobbery design from deep down in your computer and up, adding layer on layer of inefficiency with much extra complication and code, which introduces many more errors. We have lost the plot as far as realtime embedded design principles go, which are a sideline from the regular mega OS/application programming of the history of computers.
The alternative approach is simpler: specifically talented, hand-picked people to govern and design the basics of the computer, the firmware, and large sections of the OS, eventually transferred to machine code long-term (the design being good enough for long-term use to justify this). This should vastly shrink the code size, and error rate, of these components just by itself. It means that only a very small fraction of the programmer/engineering population is needed to achieve this (there are not many of those "100+" times better programmers available anyway). Middleware, development tools, authorware, and tradable code modules would be done by expert programmers, and the dumb masses of application programmers would use these to make applications, with scripting/authorware for the rest. This level of design optimisation should enable normal programmers to more quickly produce more efficient code.
Does it really matter? We struggle with our present systems after such short times as the industry strives to fill up the extra space. With optimisations we can fit more on our computers and have them work faster; in everyday language, that's still good. But optimisation is an "experts" game.
(PS. Having just read some of the other posted comments, I am glad that others can see the sense in optimisation as well. The optimisation I talk about above is an example of extreme practical optimisation that came out of my exposure to the Forth language and MISC processor communities, among others.)
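As a small, everyday illustration of the kind of optimisation being discussed (my own toy example, not from the article): the same result computed two ways, where the naive version rebuilds a string on every iteration (quadratic work) and the idiomatic version does a single join (linear work).

```python
# Two ways to build the same string: the naive loop copies the whole
# string on every iteration (O(n^2) total work), while join makes a
# single linear pass. Same output, very different cost at scale.

def build_naive(words):
    out = ""
    for w in words:
        out = out + w + " "   # allocates and copies the whole string each time
    return out.strip()

def build_optimised(words):
    return " ".join(words)    # one pass over the input

words = ["optimise"] * 10_000
assert build_naive(words) == build_optimised(words)  # identical result
```

The point is the one the comment makes: the optimised form is not harder to write, it just has to be designed in from the start rather than bolted on.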
The good ol' days, the future.
It was cool, because the game side was run by UK game types, originally. What has happened now? Progressively we have seen a move to US, then Japanese, overemphasis in the industry.
I am not from the UK, but have been exposed to UK and US games for quite some time, from the early days of the home computer. The UK game industry was often more innovative and fun, and many computers I owned, as well as games and magazines, came from the UK. Considering that Apogee games were a breath of fresh air in the US industry, it really shows how bad the industry was. I know that the UK industry, in general, has a bad record of failing, but where are you guys now? We need you.
Speaking of the failings of the PSP, let's see: too expensive, too big, no touch screen, no keyboard (a pop-out one, like on smart phones, would do), low resolution display (VGA or DVD at least), and no embedded camera with HD video. A CHEAP PSP mobile smart phone is needed that is not hobbled (or restricted to mobile downloads). A larger UMPC-like tablet (even running PS2 disks), which could have faster hardware (to compete with the sometimes mentioned GameCube-based Game Boy Advance 2), would be good. Computer functionality with home development, Linux or Java, is also desirable.
Unfortunately, for the price of the PS3 now, it really should have computer and HDTV digital video recorder functionality, and media editing (and a DX10-level GPU). Until the current machine goes to half the price, it will probably not be on the level it should be. Remember the promise of the PS2 as a computer, as a networked broadband device, an AV centre, and the PS2 chip permeating its way through to other products; what happened? We now hear similar statements for the PS3. If Sony wants to survive, they really need to release advanced functionality PS3s at a small profit, soon, not later at a high AV price, and cheaper PS3s with well supported computer kits as well (Mac OS X please, or Linux even).
As I remember, the reason for UDI was so that companies could get out of paying a relatively small licence fee for the HDMI interface and content protection scheme. Now it looks like they will have to equip monitors and cards with two ports anyway, license the content protection scheme anyway, and foot the bill to the customer. There is little reason for anything other than HDMI, except maybe a cut-down version for computers and laptop LCD panels under the HDMI standard. Kill off DisplayPort and UDI.
I wish somebody made a universal IO interface that could also stream to monitors, HDTVs, and hard drives. Then we could have only one sort of IO port, which anything could be plugged into, outside and inside our computers. USB 2.0 is long overdue for an upgrade to USB 3.0. Adapted HDMI 1.3, 10/GigE, PCI-E 2.0, external SATA etc. would be interesting candidates.
FireWire was clearly superior except for price, but then price differences fell, and more so with USB 2.0. FireWire's design required a lot less processor time being tied up, and had better timing. As USB 2.0 approaches its maximum data rate these can become substantial factors; imagine the system having to service 6 high data rate USB 2.0 devices. USB caused a major side track in the industry, had slow penetration, and delayed, and headed off, the arrival of the authentic alternative, FireWire. We are all the worse off as users for it. Now, is this being repeated here with these video standards, and DisplayPort?
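The shared-bus arithmetic behind that "6 devices" worry can be sketched roughly (this ignores protocol overhead and the host-CPU servicing cost, which in practice make USB 2.0 look even worse; 480 and 400 are the nominal signalling rates of USB 2.0 high speed and FireWire 400):

```python
# USB 2.0 is a shared, host-driven bus: its nominal 480 Mbit/s is split
# across every active device. FireWire is peer-to-peer with DMA, so
# devices talk without tying up the host CPU. The even split below is a
# best-case simplification that ignores protocol overhead.

USB2_MBIT = 480       # nominal USB 2.0 high-speed signalling rate
FIREWIRE_MBIT = 400   # nominal FireWire 400 rate

def per_device_mbit(bus_mbit, active_devices):
    """Best-case even share of a shared bus among active devices."""
    return bus_mbit / active_devices

share = per_device_mbit(USB2_MBIT, 6)
print(f"6 busy USB 2.0 devices: ~{share:.0f} Mbit/s each")  # ~80 Mbit/s each
```

So six busy devices are each down to roughly a tenth of a single device's nominal rate, before any overhead is counted, which is the substance of the comment's complaint.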
Who started up The Inquirer?
A lack of evidence is proof of nothing; empirical evidence is better than nothing, as it is still evidence. Studies can be "guided" to desired results, but that does not mean they are right. If the government banned commercial organisations from funding potentially conflicting external research, or interfering with it, we might have better results.