16 posts • joined Saturday 22nd September 2007 13:14 GMT
@ Chris C
I think that data includes the detected putative permafrost and the water-erosion-like features of the landscape. They think that the water is released every few million years by volcanic activity, and that Mars may sometimes even have had oceans.
OK, none of it is proof, but there is a lot more evidence for liquid water interactions than you allow, so I guess you can forgive them for forming a hypothesis?
I actually downloaded the nucleotide sequence for ricin from NCBI BLAST for no particular reason. To my surprise, it was freely available (and it would be a cinch to use it to design PCR primers to clone the gene). The strange thing was that afterwards my ZoneAlarm registered about 200 alerts a minute for days, every time I logged on to the web. I did a whois, and the information that came back said "this block of IPs is reserved for special uses" or something like that. I used one of those Google Maps-linked whois IP-locating tools and it indicated a point on the north bank of the Thames in central London. This is absolutely true; I would encourage you to try it yourselves. Seems that us biochemists are being watched by Big Brother......
Sadly, I had thought that the readership of the Reg was a little more sophisticated than BBC HYS ranters or Ebaumsworld.
Yes, I am a Scientist. In fact I am sitting in my lab typing this now. I can assure you that the Scientific Method is more important than anything to the vast majority of us.
Like most scientists, although I have studied for longer and in more depth than a student medical doctor to get my PhD, my earnings are less than the average bricklayer's. We do this work because it is important to us and to humanity. Perhaps this suggests to you that we are not a bunch of snide, dirty liars who will do anything to get our hands on some cash? Had I put my time and effort into becoming a banker or a businessman or a medical doctor instead, I am sure that I could be a very rich man by now.
So far, the majority of evidence tested by the highly stringent Scientific Method suggests that global warming is very real. I find the uneducated layman who is prepared to declare his superior knowledge an interesting, if frightening, scientific study in itself - e.g. some Christians do not believe in evolution because, if the Bible is wrong, there is probably no heaven (and therefore no afterlife), while global warming deniers may not want to admit the possibility because they do not want to change their lifestyle (ditch the new 4x4/BMW, stop flying to Ibiza for two weeks every year, etc.), or simply cannot face the frightening reality that confronts us.
@ Paul R. As far as "oh well, humanity will deal with it" goes - wrong. The last time the ice caps melted, the ocean conveyor stopped, the seas became stagnant and most sea life died. Huge amounts of poison gas were emitted into the atmosphere, and most of the larger life forms became extinct. Humanity could probably not survive this.
Licensing is not a viable (or vaguely intelligent) solution!
A very partisan and elitist opinion. There is much truth in what you say, but in pandering to your site's demographic, you have comprehensively failed to address the ramifications of your suggestions. A restriction on computer use would result in the stagnation of the technological development that has come from the proliferation of PCs across the world. Far fewer, and more expensive, killer apps would be available; game and CGI development would never have been able to proceed to its current level, due to more archaic technology and higher costs from fewer sales. Think of the impact of cost and logistics on schools and universities. Also, bear in mind that much of the malware that entraps these "non-IT industry" mere mortals was written by highly talented, if unscrupulous, software engineers, who would just work harder to entrap the brave new breed of licensed users.

What is needed is not legislation to stop less talented people from using information technology, but a worldwide consortium that REALLY takes ownership of problems such as botnets, ID theft and viruses, while relentlessly pursuing and prosecuting the perpetrators. Anything else is pure fantasy. If you don't like it, stay beardy, fire up that old 386, and log on to Usenet.
Antimatter does not equal antigravity, charge is not magnetism.
Antimatter has the opposite CHARGE to matter, not negative MASS, so no antigravity for you, Trek fans.
The charges on atomic/subatomic particles are not the same as magnetism, which for atomic particles is based on spin (and for molecules also on dipole moment), so anti-hydrogen is no more magnetically neutral than an antiproton. Moreover, a magnetically neutral element could not be held in a magnetic field. Sorry to piss on your chips, chaps.
I also imagine that the discovery of interstellar antimatter will not lead to laser swords, time travel, hovering skateboards, teleportation, warp drives or the possibility of finally meeting and having sex with humanoids sporting an amazing variety of different shaped latex foreheads and eyebrows.
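Incidentally, the point about magnetic neutrality can be made precise: a neutral (anti-)atom is confined via its magnetic dipole moment, not via any net charge. A minimal sketch, assuming the standard magnetostatic treatment:

```latex
% Potential energy and force on a neutral atom with
% magnetic moment \mu in a static field B:
U = -\,\boldsymbol{\mu}\cdot\mathbf{B},
\qquad
\mathbf{F} = -\nabla U = \nabla\,(\boldsymbol{\mu}\cdot\mathbf{B}).
% A "low-field-seeking" spin state has U \propto +|\mathbf{B}|,
% so it is pushed towards a field minimum and can sit in a
% magnetic trap; a state with \boldsymbol{\mu} = 0 feels no
% force at all and cannot be held.
```

This is why anti-hydrogen, with the same magnitude of magnetic moment as ordinary hydrogen, is trappable, while a genuinely magnetically neutral species would simply fall out of any trap.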
This is not news!
Nice to see Brighton Polytechnic *cough* I mean University coming into line with what we at traditional universities have always specified anyway.
Sources absolutely have to be peer-reviewed to show that they are true knowledge, to the best of the ability of the author or researcher. Although I love wikis and search engines and feel that they are a valued day-to-day resource, they have no place in academic work.
Any amateur, liar or charlatan can post any (mis)information they like on the net, and at the university where I work web links are rejected as a matter of course as sources for coursework. I am surprised that Brighton Uni ever accepted them in the first place.
Consider that during the Falklands War, 20 Harriers managed to thwart 120 Argentinian Mirages by shooting -a lot- of them down using the air-braking/enemy-overshoot strategy. The Argentinians (and the French) also thought that the Harriers would be easy targets, and how wrong they were. It's a great strategy that has been conclusively proven in a theatre of war.
IT killed the movie star
No doubt vast swathes of CGI and no plot whatsoever (rather than vast lumps of Plasticine/latex and no plot whatsoever, as per the original).
A common theme coming out here? Maybe they should come up with a computer program that can write good plotlines.
Should be a good earner for you embearded ones in the IT business though.
P.S. Why not a movie of Larry Niven's Ringworld, or something by Iain M. Banks? There is so much really great stuff out there to blow a few mil on.
This article read by no one
Are we to take the ironic lack of comments on this article as evidence that the world's IT workers are all down the pub, instead of pretending to work by reading El Reg, or playing Counter-Strike at their workstations and answering the odd phone call with "have you switched it off and on again?" before returning to the surreptitious web-surfing or departmental LAN fest?
I am sitting in my lab waiting for my TCA precipitation to centrifuge while all the other scientists have gone to the Genome Centre knees-up, or down to the "Mongolian Barbeque" in Brighton for some nosh and booze. (I am not that sad - I planned the experiment while forgetting that the piss-up/skive was on today. D'oh!)
Lily Allen, a truly modern entertainer and social warrior - don't need talent, she's got cash.
As far as I can see, Lily is as vacuous, sarcastic and spoilt as her father is arsey and unpleasant. Let's see: dad Keith employs a whole band's worth of very expensive session musicians for her online musical debut, ensuring that her rise to fame is as undeserved as his own. So relying on dad's large pot of cash to buy celebrity is a shining example to all feminists, is it? Frankly, here in Brighton you can go to any open-mic night and see girls who are exponentially more talented than Lily, and who almost certainly know exponentially more about social politics too.
They HAVE been trying it for a long time, but the neurons just don't like it and tend to shrink away from artificial attachments. It's also hard to attach them and keep them from being yanked out, and it's a bit of a complicated wiring job when each of thousands of neurons is just a few microns across and Mother Nature forgot to colour-code them, so you don't know what each one does. You also need a chip that can either individually detect chemical neurotransmitters, because that's what you get in neuronal synapses, or detect the Na+/K+ ion charge differences caused by the wave of depolarisation in each stimulated neuronal axon. Tough one, that. Also, it could be frustrating when your battery goes flat during periods of intensive arm movement.
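For a sense of the signal such a chip would have to resolve: the potentials involved in depolarisation are set by those Na+/K+ gradients, and the textbook Nernst equation gives their scale. A rough sketch, using typical illustrative mammalian concentrations (not measured values):

```latex
% Nernst equilibrium potential for an ion of valence z:
E_{\mathrm{ion}} = \frac{RT}{zF}\,
  \ln\!\frac{[\mathrm{ion}]_{\mathrm{out}}}{[\mathrm{ion}]_{\mathrm{in}}}
% At body temperature (37 C) the prefactor is about
% 61.5 mV per decade, so for K+ with roughly 5 mM outside
% and 140 mM inside:
E_{K^{+}} \approx 61.5\,\mathrm{mV}\times\log_{10}(5/140)
        \approx -89\,\mathrm{mV}
```

So an electrode on a single axon is chasing swings of a few tens of millivolts across a membrane a few nanometres thick, which is part of why the wiring job is so hard.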
I remember seeing an old vid of something very similar to this (but a bit more basic); it must have been from the 1960s or something. They fly in towards the camera, land, take the wings off, fold them up and stow them in about a minute, then drive off like a hokey-looking car with aeroplaney wheels, from what I remember. It didn't catch on then; it probably won't now.
Only read this if you really care how viruses cause cancer.
Ack! Sorry to correct some people too, but there are many cases of cancer being caused by viruses, including some recent evidence that breast cancer is transmitted by viruses in mice.
To say that the cancer is due to the "reaction" of the body of the infected individual is not really very accurate. Probably the best characterised are the avian sarcomas, caused by viruses carrying the growth-factor-signalling (and therefore cell-division-stimulating) protein Src, which takes its name from them.
To put it simply with a couple of examples:
v-src is a homologue of cellular src (or c-src) that is encoded by the virus's own genome and therefore manufactured in the host's cells during infection. Unlike c-src, v-src is constitutively active even in the absence of growth factors, which means that cell division continues to be stimulated, and the result is cancer. This is an "oncogene".
Other viruses can cause cancer by inserting their genome into the host's genome right where a gene that inhibits cell division (a "tumour suppressor gene") lies, thereby disrupting it. So again cell division continues uncontrolled and cancer ensues.
(Yes, I'm a biochemistry researcher and not a computer programmer and sorry, there is no IT angle).
Ha ha! You all miss the point, surely?
First off, I'm no IT professional; I'm a research scientist. I know little about what *makes* an operating system. Like many of you, I'm sure, I learned BASIC on a ZX and a Speccy, then a 386, then did some C stuff at night school, so I'm not a total dimmock, just an educated amateur. I use Macs sometimes at conferences, as they are reliable and don't crash halfway through a talk. I installed Linux once, but it didn't do much, and as I had no particular interest in half-finished home-written software to run on it, or in hacking servers, and I am not a particularly skilled software developer, I uninstalled it.
What REALLY drives micro$oft's monopoly? Simple: two things, GAMES and MICROSOFT OFFICE.
Office is the global standard for business, and it is a slow, difficult and expensive thing for a business to change. The only things that would even begin a global change in business OS use would be cross-compatibility across all file types (which would surely restrict competitiveness and the development of new business software innovation), or a drastic reduction in the cost of a Mac (I can buy a Dell workstation for a couple of hundred quid, VAT-free, through the university where I work).
As for the common home user (almost everybody), even those who primarily bought their box to edit photographs or home movies rather than to play games: many of them have kids (or husbands) who want games or easily available educational software. Not many contemporary games are available for the Mac (although this is changing a little), and I can't see kids getting much cred from bringing their mates home to play "Snake" on the family Linux system.
Yes, I am a gamer, but I also use Windows PCs to drive confocal microscopes and qPCR machines, as well as for DTP, molecular biology and office apps in my office, so I am hardly an ignorant fanboy. Until the price of Macs comes down and the general availability of Mac software increases, I'll have to stick to Windows PCs for my work and gaming; Linux remains a platform for the niche enthusiast or the simple, cheap legacy user. Like me, the vast majority of users are not IT professionals, and until you all develop the ability to see past your own front yard, you'll never realise the big picture.