It's a shame no-one complained about the story!
Derivative, to say the least.
In my view, this was NOT a good start to Peter Capaldi's term as The Doctor.
At least it makes it easy to get better.
"Nice satellites you've got there, tovarishch. Would be a shame if they ended up in the wrong orbit, nyet!"
Only 30 addresses? My DHCP server struggles to allocate addresses even though it has ~100 to play with (the other 100+ addresses are in reserved ranges for static IP addresses). And I have used something like 30 of these static addresses for machines I want to have fixed addresses - like the main laptops for each of the kids, so that I can monitor/arbitrate who is using the most traffic as well.
I have seven adults in the house, with WiFi mobile phones, tablets and eBook readers, laptops and larger gaming rigs. Add to this all the consoles and hand-held games, set-top boxes, and a smattering of infrastructure devices (WiFi hubs and routers), and we've used up a significant part of a Class-C subnet in just one house! I'm really not looking forward to transitioning to IPv6 (I'll probably set up an IPv4 island when I have to!)
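For what it's worth, the arithmetic of a home "Class C" sized (/24) network can be sketched with Python's standard ipaddress module; the address range and the pool split below are illustrative placeholders, not anyone's actual configuration:

```python
import ipaddress

# A typical home /24 ("Class C" sized) network; 192.168.1.0/24 is a
# placeholder private range.
subnet = ipaddress.ip_network("192.168.1.0/24")

# The network and broadcast addresses are not usable for hosts.
usable = subnet.num_addresses - 2          # 254 usable addresses

# Illustrative split: ~100 addresses in the DHCP pool, the rest held
# back in reserved ranges for static assignments.
dhcp_pool = 100
static_reserved = usable - dhcp_pool       # 154 left for static addresses

print(f"{usable} usable, {dhcp_pool} DHCP, {static_reserved} static")
```

With seven adults' worth of phones, tablets, consoles and infrastructure, it's easy to see how even 150+ static slots get eaten.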
The ATPL board had some jumpers, but I think that it synthesized a write enable from the address bus.
As a result, you could not move the various buffers (like the disk buffers) or workspace for DFS into sideways RAM. There were hacked DFSs (I think the Watford DDFS was one) that could work in shadow mode, but they did that by changing the addresses of the buffers in the code, not by re-directing the addresses.
The Solidisk board for the Model B was more sophisticated, but I believe that it required a wire either inserted into one of the chip sockets in parallel to the chip pin, or a fly lead soldered to the board.
The way that the ATPL add-in worked was basically that any write to an address above &8000 got directed into the (single) bank of static RAM, regardless of the ROM select register. Some ROM providers got canny to this, and during ROM setup, would do a write to overwrite some of the ROM image (Wordwise was the first one that I came across) to cause the initialisation to crash the BEEB if it was running the ROM image from RAM. This could be prevented by adding a switch to the write-enable line of the static memory (there was a solder link and pads for a switch on the ATPL board) that would disable the writes to the RAM. The sequence would be load the image, write protect the RAM, and reset the BEEB (in fact you did not need to reset the BEEB, there was an OSCLI call to initialise the new image - something I used to enable switching between the runtime and compile ROMs in RAM of the Acornsoft ISO Pascal system, which came as 2 ROMs).
Back to Wordwise, when I got a Master 128 (at work), which did not have a write defeat switch for the sideways RAM, I hacked Wordwise to remove the offending code in the image to still allow it to work. Not that I used Wordwise. If I was using the BEEB as a word-processor, I preferred View, but if I just wanted an editor, I used the one built in to the ISO Pascal runtime. Most of my documentation was actually done on my (well, work's, but I was the sole sysadmin, so it was "mine") UNIX box using nroff and a Qume Sprint 5 daisy-wheel printer.
The shuffling of the programmes down was something that was done way before the B+ or B+128. You would load a small piece of machine code into the cassette buffer or somewhere, *LOAD the cassette image into higher memory, and then move the data down before changing the video mode.
Some of the ROM toolkits did this for you. I think that both DISK DOCTOR and the ROM based BEEBUG monitor had this feature.
What the B+ and B+128 did do, however, was allow the disk subsystem to use 'shadow' memory for the various disk buffers, meaning that PAGE remained at &0E00, rather than the &1900 that was normal for a machine with Acorn DFS on either the Intel 8271 or WD1770 disk controllers, or &1A00 for a system with disk and Econet, or &2100 (I think) for a system with ADFS (yes, you could get ADFS for BBC Model B's, it was used to run the 10MB hard-disk in a Level 3 Econet server).
They also moved the screen into shadow memory so that memory up to &7FFF was available regardless of the screen mode. The primary use for the extra 64KB of memory in the B+128 was to hold RAM copies of sideways ROM packages. I have an ATPL Sideways RAM board for that (but only 16K of static memory) so I never invested in a B+ or B+128, or a Solidisk add on shadow RAM board.
Must have a play again sometime.
The first UNIX system I ever used had 2 RK05 cartridge disks, each 2.5MB in size, and 128KB of memory (this pre-dated the PC by several years). It was never about the size of the disk, it was about the speed of the disk and the model used for running commands, especially if they were chained together in a pipeline.
I used a system that had a minimal UNIX-like OS (it was so similar, I wondered whether it was a direct port of V6) on two floppy disks. One was the system, and the other was used for user/application data including the pipe files (if you remember back as far as UNIX Version/Edition 6/7, you will remember that unlinked files were used to keep the data that was in the pipeline).
The amount of thrash that went on between the two disks whenever you ran something as simple as "ls -l | more" (IIRC it was a port of UNIX V6 with some BSD 2.3 enhancements, possibly called IDRIS) was more than anybody could bear, and for these systems, you could only really use the OS as an application launcher, not in the way that a UNIX power user would use it.
AFAIK, all systems that Ken worked on either had Core memory, which was persistent and had the OS loaded from paper tape or DECtape, or had hard-disks. There were no floppy based UNIX systems at Murray Hill.
PDP-11s (except for the very smallest ones) had MMUs that allowed them to address up to 256 KB or 4 MB of memory, depending on which model they were.
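Those two figures follow directly from the 18-bit and 22-bit physical addressing of the different PDP-11 MMU generations, which a couple of lines of arithmetic confirm:

```python
# 18-bit physical addresses (earlier PDP-11 MMUs) give 256 KB;
# 22-bit physical addresses (e.g. the PDP-11/70) give 4 MB.
kb = 1024
assert 2 ** 18 == 256 * kb          # 262,144 bytes = 256 KB
assert 2 ** 22 == 4 * kb * kb       # 4,194,304 bytes = 4 MB
print("18-bit ->", 2 ** 18, "bytes; 22-bit ->", 2 ** 22, "bytes")
```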
I am positing an alternative history in which PC-DOS was never provided by Microsoft. If CP/M-86 had been the OS for the IBM PC, then MS-DOS, OS/2 and Windows would never have happened, and the PC would have evolved into a multi-tasking, protected-mode machine as the hardware became cheap enough, because the rudimentary features were already in CP/M-86. With a proper multi-tasking OS, a windowing desktop would have followed quite naturally.
I've deliberately not mentioned UNIX, although it has been my career, because I'm well aware that in the early '80s, the requirement for a hard disk that UNIX has would have prevented it from appearing on commodity hardware.
Yes, I admit that some historic features of UNIX may be undesirable, particularly the security model, which is effective but probably too simplistic for what is required today. But I would again suggest that if UNIX had been more prominent outside of the server room, there would have been more pressure to modernise some of its least desirable features. In some respects, UNIX is a victim of being as good as it was when it was written. It has been just about capable as written, so people were able to work around problems, never requiring a significant re-write.
Nice to see the PDP-11 architecture being used as the reference for mini-computer memory management. Should always be regarded as a classic architecture.
But the final analysis is flawed. There were micros with MMUs available when the IBM PC was produced. There were MMUs for 68000s and Z8000s that would have allowed proper protected-mode OSes like UNIX or Concurrent CP/M-86 to run on the desktop. They were, however, too expensive for the type of machine that IBM envisaged (a single-user, single-task machine that worked like an Apple ][, but with a more 'modern' processor). Cost and maximising profit were the main causes of using poor hardware that did not have the capabilities required for security.
It was a failure of imagination that led to the development of the IBM PC and PC/MS-DOS in the first place, and once out there, nobody was going to be able to shake the dominance of these platforms on the desktop, even though they were technically flawed and limited, even when they were new.
Imagine if Gary Kildall had actually met IBM and agreed to supply the OS for the IBM PC. I'm absolutely sure that with a CP/M-86 derived OS - multi-tasking, potentially multi-user, and running protected-mode processes under a supervisor-mode kernel - such capabilities would have appeared in desktop machines way before WinNT.
Windows even now is still living with the legacy of poor design decisions taken in MS-DOS and early versions of Windows, which persisted well into the times of hardware (and indeed Windows core security capabilities) capable of running properly protected.
I would not say that it was only disks and tape drives that broke. I've been involved with many other hardware failures across the spectrum, but the one thing RS/6000/pSeries/Power systems will do is actually tell you what is most likely to have failed.
It also had (actually, still has) very good hardware diagnostics (for AIX systems) to back up the POST and BIST checking, although almost everybody has forgotten them. Add in the HMC call-home and remote console functions that were added somewhere around the millennium for the pSeries systems, and you have a platform that is robust, stable and supportable, and is IMHO still best-of-breed (of the UNIX systems) when it comes to running a service.
And people say that changing settings in Linux is obscure and convoluted!
I know, I just could not resist the double-entendre.
Chances are that the canards probably won't have a huge effect at the launch altitude because of the rarefied atmosphere.
They will come into their own as the 'plane descends.
That would probably be all of the servers that they are currently running, as most companies depreciate capital assets including IT over a period of 3-5 years.
And that's just the financial side. At the current rate of change, they would be technically obsolete before then.
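The depreciation side can be sketched with straight-line write-off (the cost and the 4-year period below are illustrative, within the 3-5 year range companies typically use):

```python
# Straight-line depreciation sketch: a server written off over a fixed
# number of years. Figures are illustrative.
def book_value(cost, years_depreciation, age_years):
    """Remaining book value of an asset after age_years, floored at zero."""
    annual = cost / years_depreciation
    return max(0.0, cost - annual * age_years)

print(book_value(10000.0, 4, 1))  # 7500.0 - one year into a 4-year write-off
print(book_value(10000.0, 4, 5))  # 0.0 - fully written off, worth nothing on the books
```

By the time the book value hits zero, the finance department sees no reason to keep the kit, however well it still works.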
BTW. My home IT infrastructure is built on obsolete or discarded systems, so if anybody wants to get rid of their working 5 year old Xeon or Core Quad system, I would be quite happy to discuss giving it a home (running Linux, of course).
Harmful emissions from CRTs weren't all nonsense; it's just that the concerns persisted well beyond the point where they were relevant.
Shooting high-power electron beams in the direction of people, even with some form of screen between the beam generator and the viewer, did result in various types of radiation, from visible light through to X-rays, low-energy beta and possibly even alpha particles or fast ions.
Very early CRTs probably did emit small amounts of harmful radiation. But by the time they were commonplace in offices and homes, the problems were sufficiently well understood that any alpha particles (which would probably have been stopped by the glass anyway), beta particles and even X-rays were being blocked by coatings on the glass or diverted away from the person sitting at the screen. There is not enough energy in the electron beam to generate gamma radiation.
So any terminal/monitor made after the late 1970s was not a problem, but the outdated information persisted.
An interesting page is the description of the stickers on the tubes of Lear Siegler ADM3a terminals (an early glass TTY) at http://www.tentacle.franken.de/adm3a.
The interview on Radio 4 this morning was talking about posting video clips obtained from television coverage onto YouTube or other social media. With no fair-use provision in UK copyright law, any video obtained from transmitted material that is redistributed is a breach of copyright, unless allowed by a specific waiver of copyright.
What is not a copyright breach is using a phone in the grounds to record part of the match, and then posting that. That may breach the terms and conditions of the ticket, but would not be a copyright offence (unless the owners of the advertising objected to it appearing - but they'd be stupid to complain about wider distribution of their adverts!)
You forgot longevity. A pen with only one colour probably has more of that ink, and will last longer.
Another similarity to WinXP!
While this is generally true, it depends on what counts as a degree.
I used to think that a degree meant that the person had succeeded in achieving an advanced qualification, requiring learning and diligence and often independent thinking, without being watched all the time (like at school), and all the time exhibiting restraint against the worst excesses of the results of being free from parental oversight.
This lack of oversight was one of the primary differences between universities and polytechnics. Polys kept a close watch on their students, and offered better support services to advise students and keep them on their courses. Universities often just let the students sort themselves out, or fail.
Nowadays, it seems to me that students are given subjects that are less rigorous, and also have much better support services that attempt to prevent the students from failing. This means that University is much less academically and personally demanding (although I acknowledge that there are financial pressures), resulting in the value of a degree being diminished.
I know that I am generalising. I'm sure some universities are still turning out excellent graduates. But many aren't, and this means that industry no longer values a degree as a guarantee of certain qualities, and that is what is damaging.
Bring back the rigour that a degree used to represent, and I will agree wholeheartedly with your statement.
P.S. I graduated in 1981 from one of the long-established universities, after nearly failing my degree at the end of the first year. The fact that I nearly failed was scary, and taught me a lot, and I believe that it enhanced my resultant work ethic and character.
is actually because the Government believed the crap they were being fed about more graduates in the job market leading to higher productivity and a move to a skills-based economy.
They forgot that in order to have that number of students, it was necessary to actually have courses that kids wanted and were able to do, and that led to degrees in the most unlikely and useless subjects. To cap it all, they encouraged all the Polytechnics, which were turning out useful people with lesser qualifications suited to industry, to become second-class universities (I worked at a Poly. before the switch, and it was excellent at what it did, but that was not turning out degree graduates). Couple that with the travesty that is a "foundation" degree, which pollutes the meaning of a degree, and it's a real mess.
And then, because there were more students, they could not afford the grant system, so introduced loans, which are not saving *any* money because of the poor rate of pay-back (often graduates do not pass the threshold at which they start paying the loan back, because they are not using [or can't work in the field of] their degree).
We need to go back to Universities being elitist, turning out the right number of the right people for the jobs that really need a degree, and move back to apprenticeships and on-the-job training for the majority of young people. Competition for fewer university places means that those that want to go work hard at 'A' levels, and stay working hard to keep on the courses.
What should be done is to make the panic PIN work in the hand-held devices as well, and have it dispense money the first time it's used in an ATM while silently alerting the bank and the Police. The mugger won't know that they don't have the proper PIN, and hopefully will release the victim.
The bank can then flag the card to cause any ATMs to go out-of-service (rather than declining the card) whenever the card is used again, hopefully leading the mugger to be unsure whether the card has been blocked (in case they demand that a second transaction is done by the victim), or whether the ATM is truly faulty. All the time, you pass the location on to the police whenever the card is used.
The customer and the bank may argue who pays for the first cash withdrawal (the bank will want to make sure that it really was a withdrawal under duress), but that should be a small problem.
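The flow described above could be sketched as follows; every name here is invented for illustration, and this bears no relation to any real banking system's API:

```python
# Hypothetical sketch of the panic-PIN flow: first duress use dispenses
# and alerts; every later use of the flagged card feigns an ATM fault.
def handle_pin_attempt(card, pin, state):
    """Return an ATM action for one PIN entry; 'state' holds per-card flags."""
    if pin == card["panic_pin"]:
        if not state.get("panic_seen"):
            state["panic_seen"] = True
            # First duress use: dispense as normal, alert silently.
            return {"action": "dispense", "alert_police": True}
        # Subsequent use of a flagged card: go "out of service".
        return {"action": "out_of_service", "alert_police": True}
    if pin == card["real_pin"]:
        if state.get("panic_seen"):
            # Card already flagged: keep feigning faults everywhere.
            return {"action": "out_of_service", "alert_police": True}
        return {"action": "dispense", "alert_police": False}
    return {"action": "decline", "alert_police": False}

card = {"real_pin": "1234", "panic_pin": "4321"}
state = {}
print(handle_pin_attempt(card, "4321", state))  # first duress use
print(handle_pin_attempt(card, "4321", state))  # card now flagged
```

Note that after the flag is set, even the real PIN gets the "faulty ATM" response, which is what keeps the mugger guessing.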
The original question was "Can you tell me which other OS was ported twice to an other processor architecture?"
It said nothing about serial ports.
I admit I got it wrong about MacOS. Maybe I should have said NeXTstep(68000)->OSX(powerpc)->OSX(x86-64)!
I think that if you look at the myriad of Linux ports out there, you will find at least one that is more than one port away from x86 anyway.
And I know there are a lot of UNIX ports out there, but how about AIX (ROMP) -> AIX (POWER) -> AIX (IA-64, although that port did not last long)? And along the way there were S/370 and x86 ports as well.
Ummm, off the top of my head.
BeOS (may be stretching things here)
It's more common than you might think.
I'm constantly infuriated by my Linux colleagues who assume that Linux is a POSIX compliant operating system, and that anything written for Linux can be easily backported to UNIX or other POSIX compliant operating systems.
I currently work supporting an AIX HPC system in an environment where Linux is used extensively for other data manipulation and modelling work. I keep getting questions like "Why is Linux tool X or Y or Z not available on the HPC?", and I have to patiently explain that a back-port is almost impossible, because the tool requires the complete KDE or Gnome environment, or relies on dbus or udev or KMS (none of which are in the POSIX standard), or has any number of cumulative package dependencies.
They cannot see that Linux has done the Embrace and Extend, and is well down the Extinguish path against UNIX and POSIX in a manner that would make Microsoft proud.
And I would not mind too much if there was a new POSIX standard that was extended to specify parts of the Linux and GNU tool chain that genetic Unices could be extended to include, but there is no such thing! There was the LSB, but that's an unmaintained standard that everybody ignores.
There is no workable Linux standard! And to cap it all, there is almost no Linux distribution that has even got full POSIX 1003 compliance, much less the more recent UNIX V7 <rant>(FOR GOD'S SAKE - UNIX V7 ALREADY MEANT SOMETHING! COULD THEY NOT HAVE USED ANOTHER NAME!)</rant> standard.
UNIX is standardised. Linux is not. Linux should work like UNIX, not the other way round.
Still I think I approve of the extended lease of life for VMS.
This is what the previous administration was attempting when they designed the databases to back up the Identity Card scheme that the Conservatives were so keen to put down. By adding a super-key associated with someone's identity to all the other databases, it would have enabled them to join together disparate information sources however they wanted.
They tried again in 2009 with Clause 152 of the Coroners and Justice Bill.
I seem to remember one "David Cameron" was particularly keen to oppose the measures.
I'm sure every Government wants to do this, but there are safeguards called Information Sharing Orders that deliberately restrict how government departments share data, so as not to upset the citizen vs. state balance. If this plan is implemented, they will be tearing up all of these, to the advantage of the state against its own citizens.
This is interesting. What about US companies operating government contracts in other countries?
For example, in the UK, IBM run parts of the IT for the DVLA, the ID and Passport Service, parts of DEFRA, and probably other government or civil service entities. I think HP has a strong relationship with the Inland Revenue, and I'm absolutely certain that there is one or more US company associated with running the NHS IT systems.
And the UK Government has said that it intends to use Office 365 (although how that sits with the ODF statement recently, I don't know).
I ought to point out that on Linux it is perfectly possible to whitelist your udev rules so that only known devices (manufacturer, ID and function) can be configured.
Of course, this will not prevent a device masquerading as another by using the ID strings of another device, but it would make the attack surface much smaller in that the miscreant would have to know which devices are allowed.
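One illustrative way of expressing such a whitelist (the rule file name and the vendor/product IDs below are placeholders for your own hardware) is to deauthorize new USB devices by default via the kernel's `authorized` attribute, then re-authorize only the known ones:

```
# /etc/udev/rules.d/99-usb-whitelist.rules (illustrative sketch)

# Deauthorize every newly added USB device by default...
ACTION=="add", SUBSYSTEM=="usb", ATTR{authorized}="0"

# ...then re-authorize devices whose reported vendor/product IDs are
# known. The IDs here are placeholders.
ACTION=="add", SUBSYSTEM=="usb", ATTR{idVendor}=="1234", ATTR{idProduct}=="abcd", ATTR{authorized}="1"
```

Rules are applied in order, so the default-deny line must come before the specific allows.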
The other thing that I'm spotting here is a suggestion that the code in the USB device could examine the system. I'm not sure whether that is possible, particularly if it is appearing as a keyboard. The flow of data is predominantly one-way for a keyboard. Those that offer programmability in the hardware (gamers' keyboards, for example) generally appear as more than one USB device anyway, with the non-keyboard device being used as a control point for the controller generating keyboard scan codes. You could block all but the keyboard device.
If it is configured solely as a keyboard, I don't think that the OS would send any data to it for it to be able to look at the system. At least not for a USB device. If it were FireWire, then all bets would be off.
In order to axe all of the repeats, it would be necessary to produce many times the current amount of new programmes. This would either mean programmes created on extremely low budgets, or the cost of watching TV increasing significantly, whether directly through subscriptions or through increased advertising.
Face it. All the time that there is something like the current airtime available, repeats will happen.
I think that there may be scope in eliminating some of the channels. Maybe set a limit of a dozen channels, but make sure that they cover a wide spectrum of quality programmes to appeal to a broad audience.
I'm going senile, obviously!
The TV manufacturers want a repeat of the 'flat panel' effect. It won't happen.
For most people, TVs are a long-term purchase. Provided one still works, they would not normally consider replacing it.
Flat-panel TVs, once they became cheap enough, shifted the paradigm. People replaced perfectly functional CRT TVs, not particularly because the picture was better, but because flat-panel TVs occupy much less space than a CRT. Couple that with a significantly reduced power consumption for LCD TVs at a time when people were being made energy aware, and the CRTs went down to the recycling centres by the truckload. That enabled people to reclaim space in their living rooms so that the TV was no longer the major piece of furniture it had been, feel good about reducing their energy footprint and, by the way, have 'better' pictures (although I still know people who prefer high scan rate CRT TVs over flat-panels).
This was reflected in how fast CRT TVs disappeared from the shops once flat-panel TVs got to within spitting distance of the price of CRTs. And often, it was not the high-cost TVs that generated the profits. It was the wholesale replacement of hundreds of thousands (millions?) of TVs with low-to-midrange price tags that earned the money.
We won't see this happening again unless there is some overwhelming technology leap that provides a must-have feature. 3D and 4K are not that, and I can't really see anything on the horizon that would. Maybe a virtual floating screen so that you don't even need to dedicate wall space, but I doubt that is within current technology.
Planned obsolescence is the manufacturers' best bet to keep TV sales ticking over (maybe that is why they use such damned poor Chinese capacitors - the single most common cause of TV failure), but I'm sure if it was revealed that this was a deliberate policy, the consumer groups would be up in arms!
Sky in the UK delivers 1080i60 (at least that's what my telly and Wikipedia say). That's an interlaced 1920x1080 image at 60 fields a second, so two successive fields make up a full 1920x1080 frame, effectively halving the frame rate (most televisions do some form of de-interlacing on such a signal by combining the 'odd' and 'even' line fields into a single frame, and actually displaying it at half the field rate).
This means that in most cases, provided that the original was shot at 30 frames per second (and most made-for-TV programmes are), there should be no effective difference between 1080i and 1080p (1080p would transmit two identical frames, while 1080i constructs a single frame from two adjacent fields). Of course, any material shot at the full 60 frames per second will suffer de-interlacing artefacts when transmitted as 1080i.
You can also get frame-rate conversion artefacts (judder) if the original was shot at 24, 25 or some other number of frames a second. There will be some of this type of error whenever the original frame rate does not match the display rate.
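The combining of the odd and even line fields into one frame (commonly called "weave" de-interlacing) can be shown as a toy sketch, with rows represented as strings rather than real pixel data:

```python
# Toy "weave" de-interlacing: interleave an even-line field and an
# odd-line field into one full frame. Frames are lists of rows here;
# a real implementation would operate on pixel buffers.
def weave(even_field, odd_field):
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)   # lines 0, 2, 4, ...
        frame.append(odd_row)    # lines 1, 3, 5, ...
    return frame

even = ["e0", "e2"]   # field carrying the even-numbered lines
odd = ["o1", "o3"]    # field carrying the odd-numbered lines
print(weave(even, odd))  # ['e0', 'o1', 'e2', 'o3']
```

When the two fields come from different moments in time (true 60-fields-per-second material), this naive weave is exactly what produces the combing artefacts mentioned above.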
You obviously don't mean that you save it from View on your BBC. As far as I'm aware, all support and updates for that stopped at least a decade before XML and ODF were defined.
I'm supposing that you are using something that understands View format (as it is from a much earlier age, it's a much simpler format, and one that probably leaked in its entirety into the public domain), and can write ODF.
I don't appear to have any View files convenient at the moment, but the version of LibreOffice I have installed does not appear to have explicit View support, although it may be there.
I was of course talking about a term of a government mandate, i.e. the time between elections, not the overall length of government.
What I was alluding to is the fact that this could be being done to go in their election propaganda manifesto. If it wins them votes, then they benefit, and can work out whether it was a good idea or not, but they're still in power. If they lose, then it's not their problem anyway.
Remember that UK Governments last no more than 5 years. This means that a wholesale switch to SaaS will show expense removed from the balance sheet before the next election.
The ongoing costs will be the problem of the next administration. Like with PPP and PFI.
You may also find that software counts as a Capital expenditure, so reducing that is also a win (when presented to the weak-minded electorate) for them in apparently reducing the costs of Government.
It's all a bit smoke-and-mirrors.
Points well made. Have an upvote.
Thank you for pointing this out. I had not considered whether the Canton system (which is still a representational democracy) in Switzerland was applicable in the UK, but I suspect that such a system would not work here.
The Swiss population is small at just under 8 million. It has 26 Cantons, and Switzerland itself is a federal state made up of these Cantons. The populations of the Cantons range from under 1.5 million down to just over 15 thousand. If we had something similar in the UK, we would end up with something over 200 regions, each with a population over 100,000. We have 650 parliamentary constituencies, which means that one Canton would equate to something like 3 parliamentary constituencies. Trying to run a federation of this number of states would be much worse than in Switzerland.
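The rough arithmetic behind that comparison, using Switzerland's actual count of 26 cantons and approximate round-number populations:

```python
# Back-of-envelope comparison; population figures are approximate
# round numbers, not official statistics.
swiss_population = 8_000_000
swiss_cantons = 26
avg_canton_population = swiss_population / swiss_cantons   # ~308,000

uk_population = 64_000_000
uk_canton_equivalents = uk_population / avg_canton_population  # ~208 regions

uk_constituencies = 650
constituencies_per_canton = uk_constituencies / uk_canton_equivalents  # ~3

print(round(uk_canton_equivalents), "regions,",
      constituencies_per_canton, "constituencies each")
```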
Alternatively, we could escalate the county structure to become more state-like. This would give a much smaller number of states, but would end up with huge inequities, as there is huge variation in the population and revenue of the current counties, and would leave some 'states' running a permanent deficit.
Either way, the resultant federal government would be difficult to run, and would still end up with things like surveillance and security policy having to be centrally run in a way that would not be that dissimilar to our current parliament. It would still be necessary to arrange voting blocks to get any large decisions made.
I suspect that the reason why it apparently works in Switzerland is because of how insular they are. They do not have a prominent role in foreign politics (you particularly mention warmongering), or world trade. The result is that there are fewer issues that require a referendum. They regard themselves as being too uninteresting to be invaded, and this policy served them relatively well in the world wars last century. This may be a good thing, but if every democratic country moved to a foreign policy where they hid under a rock, it would not be very long before more aggressive and territory hungry regimes were knocking at their borders.
Switzerland does not really have to worry about this at the moment because they are surrounded by relatively benign states (France, Germany, Austria, Italy and Liechtenstein), and are not very attractive to invade anyway. If they had a border with a country like Pakistan, Somalia or maybe even Russia, I suspect that they would be significantly less insular, and worry about defence and foreign policy rather more than they do at the moment.
It will be interesting to see whether, when water becomes a constrained resource, Switzerland alters its foreign policy!
That's an interesting point of view, and of course I cannot argue against it because I don't like party voting and the whipping system, but I wonder how the democratic movement intended to run the country at the turn of the 20th Century, when there was no mass communication, rapid transport was still fairly basic, and the public at large were largely uneducated?
Elections or referendums took weeks to organise and count, and at the time, only selected people had the vote anyway (remember the suffragettes).
If you are arguing that the political party system is an issue, then I guess there is some mileage in that, but even if you disbanded the party system, and had each MP stand for what their constituency believed, you would still get them banding together in voting blocks, not dissimilar to a party in order to get anything done.
You could also argue that the method of electing MPs is flawed, but I don't like the idea of party lists being used in a PR system, which is what seems to be touted as an alternative. I want to vote for a person, not a list.
Nowadays, in theory, it would be possible to have technology led referendums of the entire voting population (as long as you can fix the voter identity issue - machine readable ID cards anyone?), but how long do you think your average couch potato would give to looking at today's issues and voting on them? Enough time to actually understand the issues?
My guess is that if you had an hour a day to present all issues and take a vote, only a small fraction, probably <10% of the electorate would actually take the time to sit in front of their computer/television to watch any arguments. Of that <10%, probably a significant number would not understand enough of the background to make sensible decisions.
And you also have the problem of who presents the arguments. Without sufficient background, it would be entirely possible to present a totally biased view of any issue to get a particular result.
No, for the majority of issues that are debated day-to-day, a two-house system, with the two houses selected in a different manner from each other, is about the best I can see at this time. The real problem is that the minutiae of day-to-day decision making are just not interesting enough to the general population to make any general referendum system workable for anything except really important issues, so representative democracy is here to stay. Maybe one issue a week could be handled by a technology-run referendum.
I would like some more democratic control of my elected representative, especially on certain important issues, and I think that steps toward this may slowly be happening. The powers that be have been discussing the possibility of a constituency sacking an MP. That may make them more respectful of those they represent.
When Winston Churchill said that quote, he was probably actually paraphrasing someone else. Looking into it, the statement was preceded with "It has been said...".
In his life, he was a statesman, a soldier with extensive foreign service, an historian and a writer, and was awarded the Nobel prize for Literature. This makes him more qualified than many of his generation, and most of us now, to make this type of statement with some authority.
How did I get 13 (and counting) down votes for my response to Forget it?
Come on. I'm not defining the system, just saying how it is.
If you don't like the current system, do something like lobby your MP, or stand for parliament yourself.
Anybody fancy establishing the Vulture party? After all, we have quite deep thinking (as well as some shallow - but I'll gloss over that) on these forums.
(P.S. I don't want to be the leader. After not running a company well for a few years, I don't think I would run a country any better!)
The time allowed for debate has nothing to do with whether it is undemocratic or not. There is nothing enshrined in the UK political system that requires an MP to consult their constituents before voting on a bill. It's good form for them to, but if you look at when the system developed (admittedly before there was any effective distant communication or rapid travel possible), it was often the case that the MP completely ignored the people who elected them once they were in office!
One serious problem is that you cannot currently sack your MP. They can be deselected by their party, but that does not force a by-election, which means that they can sit, not representing you, until the next election.
Couple that with the whipping system that can force an MP to toe the party line, and that's undemocratic.
We really could do with a local referendum system that allowed us, the constituents, to force our MPs to ignore the whip on particular issues. That might offset some of the major stumbling blocks with our system.
Please note that I agree with you that what's happened is an utter travesty, but it's not undemocratic, at least not according to the system.
It really is representational democracy. Your constituency selected a representative (your MP) by a majority of those who bothered to get up off the sofa to vote, and they have voted on your behalf. Just because they did not represent your view does not make it undemocratic.
So does that make you a banana?
What's not right is the fact that the MPs and Lords have not had enough time to debate the bill before having to vote on it.
"Democracy is the worst form of government, except for all those other forms that have been tried from time to time." - Winston Churchill
I think it should be in Hansard tomorrow.
<pedant>You actually need to know who opposed the bill, not who did not vote for it. Unfortunately, not every MP will be in parliament today, and those not there will not vote either way. In addition, MPs in the UK do not sign legislation, and at this point, it's not even legislation. It's a draft bill on its second reading in the House of Commons.</pedant>
Strictly speaking, it would be a 3270 emulator. A 'glass TTY' is normally regarded as something like an ADM3, Wyse 50 or VT100.1
I'm surprised TN3270 is not already available. There are several versions in Google's Play store for Android.
1 Other terminals used to be available!
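On the protocol side, 'TN3270' is largely plain Telnet plus terminal-type negotiation: the client announces an IBM 3270 model name via Telnet option 24 (RFC 1091), which is how the host knows it isn't talking to a glass TTY. A minimal sketch of the client's reply bytes (the function name and the "IBM-3278-2" model string are illustrative choices, not from the original post):

```python
# Telnet constants from RFC 854 / RFC 1091
IAC, SB, SE = 255, 250, 240      # Interpret As Command, Subnegotiation Begin/End
TERMINAL_TYPE = 24               # Telnet option 24: TERMINAL-TYPE
IS = 0                           # subnegotiation verb: "my terminal type IS ..."

def terminal_type_reply(term_type: str = "IBM-3278-2") -> bytes:
    """Build the IAC SB TERMINAL-TYPE IS <name> IAC SE reply a TN3270
    client sends after the host asks for its terminal type."""
    return (bytes([IAC, SB, TERMINAL_TYPE, IS])
            + term_type.encode("ascii")
            + bytes([IAC, SE]))

print(terminal_type_reply().hex())
```

Once the host sees a 3270 model name here, it switches to the block-mode 3270 data stream rather than line-at-a-time TTY output.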
They and their distribution channels do act as VAT collectors for the Government, though.
They don't pay Corporation tax, as apparently they don't make any profit in the UK. As if...
It was added as an amendment by the opposition. So this should mean that there is an opportunity to have it reviewed by the house some time in the next parliament. But it's not a proper sunset clause, just a review.
Whether this will actually cause it to be changed is another matter entirely!
I read some of the debate from Tuesday. The existing legislation that requires Service Providers to keep call 'metadata' was passed into UK law as Secondary Legislation. This means that it had not been debated in the House of Commons at all, merely in committee.
The Home Secretary obviously decided that only being Secondary Legislation meant that although it is still UK Law, it is weaker and could possibly be neutered if it were challenged in the Supreme Court by the Service Providers, particularly those in other countries.
Rushing DRIP through means that it will now be Primary Legislation, and would be harder to challenge. This is something that I only understood yesterday.
I hate the Government, regardless of their colour, using Secondary Legislation for something that is as important as this. It's what finally convinced me that the UK Identity Card scheme was a really bad idea, because the bill to authorise the ID card system was deliberately designed so that it could be extended by Secondary Legislation without it being debated in either house. Once set up, the underlying database could have been used for anything that the government wanted without proper scrutiny.
I think you know the answer. But does it affect all versions?
Read what I wrote.
Many system builders (like Dell) bought Windows 7 OEM licenses upfront (remember all those stories about MS claiming that Win 7 had a fast uptake rate because of counting these pre-purchases as shipped systems), so have a stock of licenses they can use to put on newly built machines. As I understand it, MS are no longer allowing OEM Win 7 licenses to be purchased, so they will run out at some point.
So you have one of those pre-bought licenses on your machine. The only reason you have to apologise is for not reading my post.
Touching the screen leaves nasty greasy marks, even if you've not been eating crisps, and not having a clean screen drives me crazy! In fact, I've been known to snatch a pen that someone points too close to my screen, and make to stab their hand with it.
Also, the keyboard is normally arranged within arm's reach. The screen is often not. I'm waiting to see how workstation position guidelines are changed to prevent RSIs when stretching your whole arm out to touch the screen.
Stargate assumes that there has been migration or seeding, possibly with terraforming as a result of having FTL travel. In those circumstances, it's not surprising that there are human-like people with green vegetation.
It also makes production of the TV programme cheaper!
... the vagina/vulva discussion again?
Please remember that EU legislation, although providing a template for national laws, is not directly enforceable in the member countries. This is because we are not the United States of Europe, at least not yet, not until Jean-Claude Juncker starts pressing for closer European ties.
A directive is passed in the EU parliament, and then that directive has to be enacted by each country's parliament in their own national legislation, which then becomes law in those countries.
The converse is true. If an EU directive is overturned, then that does not automatically mean that the national legislation is also overturned. In the UK, this requires a modification of the national legislation, which means action in the UK parliament.
Between the EU directive being deemed invalid, and the corresponding changes in a country's national laws, the government of that country can be taken to one of the European courts for not complying with EU law, but that is unlikely to happen in the short-term, because there is a reasonable amount of time allowed for national laws to reflect changes in EU directives. What is reasonable is open to debate, but can be several years.
So what this means is that the existing UK legislation was still effective, and would be until amended, something that could have waited until the next term. This latest knee-jerk reaction was not required, so there really must be something hiding in there that Mrs May did not want examined too closely!