Welcome, stranger: Inside Microsoft's command line shell

PowerShell is everywhere, it seems. Not just in Windows Server, SharePoint, SQL Server, Exchange, Lync and Azure cloud, but it’s in third-party software, too. Take VMware PowerCLI – that’s an extension of PowerShell. With many in the Windows world chewing on this fat PowerShell server software sandwich it’s easy to take …

  1. This post has been deleted by its author

    1. Lee D Silver badge

      And then you discovered 4DOS. Which was damn amazing. And the 4DOS tools worked on any DOS you put them on, too.

      But, for years, my computer's scripting - every line from booting to loading drivers to starting games - was a combination of batch files, PC Pro / PC Magazine command-line utilities, and simple freeware.

      STRINGS, CHOICE, AMENU, it's all coming flooding back.

      Those were the days. When the computer did what you told it to and no more. And if you wanted, you could get 638Kb of base RAM out of 640Kb with enough drivers to play game X or run app X and just stick it in a menu. And a reboot took seconds and pushed you into a nice menu that loaded up the exact configuration you needed for a program, and you never even saw the jiggery-pokery to make it all work but you could at any point.

      Can't even squeeze a webpage into 640Kb now. Still have no control over what starts up or in what order when you start Windows, and half the stuff you can't turn off without breaking completely unrelated features (did you know that if you stop the Windows Search service, you can't then add a new keyboard language?).

      Never used a batch file compiler because - well, you never needed to. A 386 was more than capable of churning through a batch file in no time at all.

      1. Andrew Orlowski (Written by Reg staff)

        +1 for 4DOS. It was the first thing to install on a new box.

        1. Bloakey1

          "+1 for 4DOS. It was the first thing to install on a new box."

          I liked DR DOS, twas far superior to MS-DOS.

          I used to do a lot of command line stuff on an NT DEC Alpha. For various reasons I ran an x86 emulator to shut down 64 Progress databases and back the buggers up. All done through the auspices of a little batch file.

          NT4 on the DEC Alpha was far superior to all the other OSes I had, ranging from NT4 x86 to Citrix WinFrames and VAX.

          1. billdehaan

            I liked DR DOS too, but you can't really compare it to 4DOS, they were different things. And yes, I have run 4DOS on DRDOS (4DOS v3 on DRDOS v6, I believe, around 1991).

            The only real problems I had with DRDOS were that (1) it wouldn't run Windows reliably (no great shock), (2) there was some funny bug that caused the internal "xdir" command to crash the PC occasionally for a reason I could never determine, and (3) it had problems running in a DOS box under OS/2, and *really* didn't like HPFS partitions. But as a standalone DOS replacement, it was great.

            1. tekHedd

              Windows under DR DOS?

              I didn't have problems with windows under DR DOS 6, but then the last versions of DR DOS were overshadowed by DOS (er) 5 (?) that actually had some advanced features.

              I ran DESQview for a while. Worked great, but wow the strange things we did to juggle processes.

              1. Richard Plinston Silver badge

                Re: Windows under DR DOS?

                > I didn't have problems with windows under DR DOS 6, but then the last versions of DR DOS were overshadowed by DOS (er) 5 (?) that actually had some advanced features.

                DR-DOS 3.4x supported large disk partitions when MS-DOS was stuck with 32Mbyte per partition (some OEMs (Wyse, Compaq, ...) also had large partition support).

                DR-DOS 5 offered EMS and HiMem and many utilities and was contemporaneous with MS-DOS 4.01. 20 months later MS caught up with MS-DOS 5. Then DR-DOS 6 added task switching and better memory management which took the best part of a year to almost catch up with MS-DOS 6. In the meantime MS contracted its OEMs with illegal 'per box pricing' so that users had to pay for MS-DOS even if they bought DR-DOS.

                DR-DOS 7 (later Novell-DOS 7) added real multi-tasking as well as task switching.

                The other feature that DR-DOS had is that it would _run_ from ROM and not just load. This made embedded systems much faster and more secure.

                I don't know what you thought that MS-DOS had that was 'advanced', it was always behind.

              2. Alan Brown Silver badge

                Re: Windows under DR DOS?

                "I ran DESQview for a while. Worked great, but wow the strange things we did to juggle processes."

                I used DESQview/X to run a multiline BBS. Multitasking on a 286, etc. Those were the days (no I don't want to relive 'em)

        2. billdehaan

          4NT for the win

          It still is.

          It's called Take Command now, and it's an all-singing, all-dancing command processor as well as a terminal on steroids (think of xterm in terms of functionality).

          The command processor can run separately; it's called TCC (Take Command Console), and there's a dumbed-down freeware version (still orders of magnitude above the Command Shell) called TCC/LE. You can get it at JP Software, and it's strongly recommended.

          I've played with PowerShell, but I still find I can knock out a TCC/4NT/Take Command shell script in a tenth of the time, and it does a hell of a lot more, and easier, than the PowerShell script.

          1. BillG Silver badge

            Re: 4NT for the win

            Just think about how much better the world would be if IBM based the PC on the Motorola 68000 instead of the Intel x86.

        3. yoganmahew

          "+1 for 4DOS. It was the first thing to install on a new box."

          What is this 'install' thing? Do you mean put the 5andaquarter in?

          Assuming you haven't sat on it and bent it...

          1. (AMPC) Anonymous and mostly paranoid coward

            Oh yes.... CMD.EXE /C

            For many years, I did almost everything I needed with DOS (2.0 to NTDos) from building and maintaining uniform directory and permission structures for file systems to piloting remote desktop installations and operations with other CLI tools like psexec. I loved and hated it, really.

            DOS had many flaws, shortcomings and weaknesses (still does). It is not even comparable to the power of mature *nix command shells, like BASH. However, it was very often good enough for the job, particularly when used in conjunction with other command line programs.

            Today's IT youngsters (most of whom are morbidly ignorant of the CLI) don't know what joys and frustrations they have missed. I'm not sure that is altogether a good thing.

            As a colleague of my generation once said, "they don't remember how easy it was to completely f*k up a system with a few keystrokes" and the discipline that encouraged.

            1. Anonymous Coward

              Re: Oh yes.... CMD.EXE /C

              Be assured, it is still plenty easy to fuck up a system with a few keystrokes....

        4. GrantB

          I remember setting up many machines with a basic toolbox of programs like 4DOS (and later 4NT), grep, PKZIP etc before I moved on to just using cygwin.

          I think everybody in IT from the 80s and 90s who used DOS/Windows would have a collection of tools to extend and make MSDOS/CMD actually useful.

          The thing I never understood about MS was the apparent 'not invented here' approach to releasing better tools. They could surely have bought the rights to bundle tools like 4DOS into Windows and quickly improved the rudimentary command line.

          Some bundled Windows utilities like Notepad don't seem to have changed much since Windows 3.11 days, despite Microsoft having better free editors available in-house.

      2. Tom 13

        I was an SDIR and Norton Utilities man myself, but yes you could do a heck of a lot back in those days. We were using QEMM and Quarterdeck for our memory manipulations. And of course every time MS released a DOS update, they broke.

        1. Ian 55

          'I was a Norton Utilities man myself'

          They licensed 4DOS for NU 7 and NU 8 and called it NDOS.

        2. Long John Brass Silver badge

          QEMM and Quarterdeck

          Those are names that bring back somewhat fond memories :)

          And the Borland C compiler

    2. Uncle Slacky Silver badge

      I used to write entire application installers with the PowerBatch compiler.

    3. Anonymous Coward

      "PowerShell was also unofficially touted as "the Linux version of the command line"

      Nope - it's far more powerful than Bash, etc.

  2. AndrueC Silver badge

    "Piping is the next powerful ability of PowerShell. Piping uses the pipe symbol | to split commands and feed the latter to the former. So get-childitem | where name -notlike Windows would show you the directory listing, but excluding anything that matches the name Windows. You can't do that with a single command prompt line."

    Yes you can. The MSDOS command shell supported piping. Try this:

    dir c:\*.* /s | more

    That's not what the example asks about but it does demonstrate that piping commands was available in MSDOS and had been since 3.x - maybe earlier for all I know. I don't think there was a built-in command you could pipe to that excluded by name but it wouldn't have been difficult to write one.

    1. boltar Silver badge

      "piping commands was available in MSDOS and had been since 3.x"

      Except you could only pipe into certain commands and IIRC you could only pipe once - you couldn't daisy chain them. MS never really "got" the purpose of stdin, stdout & stderr. They still don't as far as I can see.

      1. AndrueC Silver badge

        Except you could only pipe into certain commands

        Well..yes. Piping only worked with programs that had been written to use stdin/stdout. It's unfortunate that for performance reasons a lot of command line programs chose to perform direct I/O rather than going that route, but I'm not sure you can call that a limitation of MSDOS. MSDOS piping works with any application that sticks to the MSDOS API.

        IIRC you could only pipe once - you couldn't daisy chain them

        dir | sort | more

        :)

      2. Vic

        Except you could only pipe into certain commands and IIRC you could only pipe once - you couldn't daisy chain them

        In days of yore, piping worked perfectly. You could pipe anything into anything, and use as many pipes as you wanted to. It worked.

        Then they brought in long filename support, including spaces in filenames. This inherently broke the pipe system, leading to the situation you describe.

        But prior to that, it was all good...

        Vic.

        1. Alan Brown Silver badge

          "Then they brought in long filename support, including spaces in filenames. This inherently broke the pipe system"

          As with *nix, enclosing the filename in speechmarks solves that issue.

          1. Vic

            As with *nix, enclosing the filename in speechmarks solves that issue.

            Are you sure about that? ISTR the piping being performed entirely differently when LFN came in. I'm pretty sure the pipe mechanism was entirely re-written...

            Vic.

    2. Kristian Walsh Silver badge

      PowerShell commands output structured data, not text. In the absence of a consumer, the data is converted to text, but if you pipe it, then the consumer receives objects. In the example in the article, the "where" command filters its input objects by examining their "name" attribute.

      If you've ever had to write a lot of shell script on Linux, I'm sure you'll appreciate how useful this could be... especially as many really useful Linux tools produce such machine-unfriendly output.
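A quick way to feel that pain in a POSIX shell, where everything in the pipe is flat text (the /tmp path and filenames below are purely illustrative):

```shell
# In a text-only pipeline you parse columns by position, and one
# awkward filename breaks the parse. PowerShell sidesteps this by
# passing objects with a .Name property down the pipe instead.
mkdir -p /tmp/objdemo
touch /tmp/objdemo/plain.txt "/tmp/objdemo/name with spaces.txt"
ls -l /tmp/objdemo | awk '{print $NF}'   # last column only: the spaced name is mangled
rm -rf /tmp/objdemo
```

The awk line prints only "spaces.txt" for the second file; with object pipes there is no column-guessing to go wrong.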

      1. Daggerchild Silver badge

        "PowerShell commands output structured data, not text. In the absence of a consumer, the data is converted to text, but if you pipe it, then the consumer receives objects. In the example in the article, the "where" command filters its input objects by examining their "name" attribute"

        Ah, I was wondering when this would start appearing. I could smell XML inter-proc pipes a while back. I remember an XML 'ls' somewhere, with the output terminal taking cues. Next up, p2p negotiation, maybe ending up with JIT compilation... mm.. evil..

    3. Anonymous Coward

      MSDOS piping would just write the entire output to a temporary file, the same as:

      dir c:\*.* /s >tmpfile

      more <tmpfile

      The second command wouldn't start processing until the first had finished.

      Microsoft didn't really "get" the idea of pipes or concurrency.
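The distinction is easy to demonstrate in a POSIX shell, where a real pipe runs both sides concurrently; the second half of this sketch is roughly what COMMAND.COM did behind the scenes (filenames and data are illustrative):

```shell
# A real pipe: producer and consumer run at the same time, and
# 'sort' can start reading before printf has finished writing.
printf 'banana\napple\ncherry\n' | sort

# The MS-DOS fake: run the producer to completion, park its output
# in a temporary file, then run the consumer over that file.
tmp=$(mktemp)
printf 'banana\napple\ncherry\n' > "$tmp"
sort < "$tmp"
rm -f "$tmp"
```

Both halves print the same three sorted lines; the difference only shows when the producer is slow or endless, where the temp-file version stalls until it finishes.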

      1. AndrueC Silver badge

        Microsoft didn't really "get" the idea of pipes or concurrency.

        Ah now, that I agree with. Sorta. Except that MSDOS was never claimed to be a multi-tasking operating system so it's a bit harsh to criticise it for the workaround of using a temporary file. Clearly Microsoft were aware of the importance of piping and went to some lengths in order to fake it.

        As for not supporting objects - well yes that's true but the article didn't use objects for that example. It quite specifically mentioned directory listings. As another commentard with more time on his hands (or a better memory) has pointed out that there was such a filtering program available so the specific example given in the article is entirely possible under MSDOS.

        And a note to the downvoters - I'm not attacking PS here. I know it's better and I love it - have interfaced to it from C# on several occasions. The only reason I've posted these comments is to point out a factual inaccuracy in the article.

        1. Robert Helpmann?? Silver badge

          Microsoft didn't really "get" the idea

          The only reason I've posted these comments is to point out a factual inaccuracy in the article.

          And there are others... I get the impression the author doesn't use the Windows command line much except for PowerShell, if that. Also, there were other MS scripting possibilities not mentioned in the article (e.g. cscript/wscript, VBScript, JScript). I've had the... joy? of working with one incarnation of MS-DOS AKA CMD or another for 30 years now. While I think PowerShell is interesting in the way it does things and am pleased with the return to using the command line as the default in MS OS administration, I find the change from CMD to PS as jarring as moving from anything else to Windows 8. I've written scripts to be run on a variety of *NIXes and am having a harder time shifting to PS than I had learning any of these from scratch. Maybe I have just gone from getting old to being old.

          PS has a few neat tricks like being able to specify output types that are native to MS Office formats, but I have been able to do that more generically using CSV and RTF for years. Except for things that were designed and created with PS as the default scripting language, I haven't run into anything that I couldn't do previously with CMD.

          Essentially, MS has done to admins what they have been doing to all their other users: changing everything, telling us it is for our own good, and forcing us to relearn things that we have been able to do just fine for years. Not much of a production boost as far as I can see, but it is the Microsoft way.

          1. Richard Plinston Silver badge

            Re: Microsoft didn't really "get" the idea

            > changing everything, telling us it is for our own good,

            If MS didn't change stuff then there would be no reason to buy the next version.

        2. Vic

          Except that MSDOS was never claimed to be a multi-tasking operating system

          But it *nearly* was.

          The original design had all task-context data in a swappable chunk - the SDA. By changing the SDA pointer, you changed the context. There was some grief with changing that whilst certain non-thread-safe DOS operations were ongoing - hence the InDOS flag - but it had clearly been built with multi-tasking in mind.

          Sadly, this doesn't ever seem to have been completed (hence the single-tasking nature of what shipped), and each new version seemed to have more and more static data that wasn't in the SDA, leading to lots of work-arounds and side-effects :-(

          Vic.

          1. AndrueC Silver badge

            hence the InDOS flag

            Wow, I just got that lovely 'plink' from my brain as you unlocked another chunk of memory. Thank you, sir, for reviving an old memory. Have an upvote :)

      2. Anonymous Coward

        DOS was a single process, single thread operating system...

        1. Alan Brown Silver badge

          DOS

          Um, no. DOS was an interrupt-driven bootstrap loader.

          As soon as any program ended, the system had to reload command.com (and if it wasn't there, things would break)

      3. Ian 55

        Pipes

        After MS-DOS 1, Microsoft promised proper pipes and multitasking, then delivered MS-DOS 2 with the 'make the first program finish before letting the second one see the results' bodge that lasted for the rest of MS-DOS.

        1. david 12 Bronze badge

          Re: Pipes

          For those too young to remember, I'd just like to clarify that you could "see the results before the first one finished". That was not the problem.

          The problem was that MS-DOS would only run one program at a time. The "second one" wouldn't start until the 'first one' finished, even though "the results" were ready and available.

          And yes, it was possible to work around this limitation of programs run by DOS, but it was a work-around, not a natural part of the system.

          Just like (while I'm here), DOS 3.x did support "partitions larger than 32 MB", through resident driver chaining, and from DOS 2.x supported large disks through installable block devices drivers.

          Unlike the native support for command line editing and recall, using the Fn keys, which was a natural part of the system, not some little-remembered work-around

          1. Richard Plinston Silver badge

            Re: Pipes

            > Just like (while I'm here), DOS 3.x did support "partitions larger than 32 MB", through resident driver chaining, and from DOS 2.x supported large disks through installable block devices drivers.

            Not from Microsoft it didn't. There were 3rd party add-ons. Some OEMs modified the system, in different ways, to support larger partitions, for example I used 'Wyse-DOS' 3.31 with this. IBM was annoyed that other OEMs had features that were not in PC-DOS (or standard MS-DOS) so they wrote code to create PC-DOS 4.0 and gave it back to MS for MS-DOS 4.0x

            http://www.os2museum.com/wp/dos/dos-4-0/

            """Perhaps the most significant change in DOS 4.0 was the introduction of 32-bit logical sector numbers and the consequent breaking of the 32MB partition size barrier. That change wasn’t strictly speaking new, having been first introduced in Compaq’s DOS 3.31 in late 1987. However, beginning with DOS 4.0, every OEM version (starting with IBM’s) supported large partitions."""

            1. Alan Brown Silver badge

              Re: Pipes

              "IBM was annoyed that other OEMs had features that were not in PC-DOS (or standard MS-DOS) so they wrote code to create PC-DOS 4.0 and gave it back to MS for MS-DOS 4.0x "

              The legend is that MS said "DOS is done", fully baked, no more work needed, etc.

              IBM added the extra stuff as proof of concept and MS immediately took it(*) to sell as DOS 4.0 - which was unfortunate, as it was bug-riddled. A lot of early adopters lost the entire content of their systems.

              (*) The licensing conditions for DOS included a clause that any modifications were the property of Microsoft. This wasn't unusual - Rockwell included the same clauses in its modem chip licensing.

              Thankfully at least one machine I owned (Sanyo MBC550) wouldn't run anything newer than DOS 2.11, so I was spared that carnage until well after the event.

    4. oldcoder

      Actually, that isn't a pipe.

      What it did was create a tmp file with the output of the first command, which was then read by the second command.

      1. Anonymous Coward

        Easier than ever nowadays

        Oy intern,

        sort this out will ya

      2. tom dial Silver badge

        If I recall correctly, Unix specifications at the time did not require that pipes be implemented in any particular way, and the Microsoft way would have been suitable, although less than ideal. What really counted, though, was that the operating system provided for such things, and pretty much the entire set of standard utilities used stdin and stdout and allowed the shell to connect them fairly arbitrarily using pipes.

        1. Richard Plinston Silver badge

          > Unix specifications at the time did not require that pipes be implemented in any particular way, and the Microsoft way would have been suitable, although less than ideal.

          Named pipes are a feature of the Unix (and Unix like) operating systems. They provide arbitrary data connections between programs. It happens that various shells can use pipes to connect stdout of one program to stdin of another. MS-DOS doesn't have pipes but the shell can provide an emulation in some cases.
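For anyone who has only met the shell's anonymous | operator, the named variety Richard mentions can be seen on any Unix-like system; a short sketch (the fifo pathname is arbitrary):

```shell
# A named pipe is a filesystem object that two otherwise unrelated
# processes can open; data flows through the kernel, not a temp file.
fifo=$(mktemp -u)                       # just a fresh pathname; mkfifo creates the pipe
mkfifo "$fifo"
printf 'hello via fifo\n' > "$fifo" &   # writer blocks until a reader opens the pipe
cat "$fifo"                             # reader: prints the line as it arrives
wait
rm -f "$fifo"
```

Because the pipe is a name in the filesystem, the writer and reader need not share a parent shell at all, which is the "arbitrary data connections" property described above.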

    5. Simon Harris Silver badge

      "dir c:\*.* /s | more

      That's not what the example asks about"

      But

      dir | find /V "Windows"

      would do pretty much the same at the DOS prompt as get-childitem | where name -notlike Windows (albeit with the output formatted slightly differently).

      1. Paul Renault

        Actually, that one would fail, as all directory listings were uppercased, so you wouldn't get any files listed.

        One very handy piped command I used to use a lot:

        CHKDSK /V | FIND "textstring"

        CHKDSK /V all by itself would give you a scrolling list of every file on the disk. Piping that output to the FIND command would result in a list of only the files where textstring matched. A 'file find' program built in to DOS, and it was free.
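The same "list everything, filter the listing" trick survives in every modern shell; a throwaway POSIX sketch of the pattern (the /tmp tree and names are made up for the demo):

```shell
# Build a small tree to search.
mkdir -p /tmp/findemo/sub
touch /tmp/findemo/notes.txt /tmp/findemo/sub/report.txt /tmp/findemo/sub/image.bmp

# CHKDSK /V | FIND "textstring", in Unix clothing:
find /tmp/findemo -type f | grep 'report'   # prints only the matching path

rm -rf /tmp/findemo
```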

      2. AdamFowler_IT

        You're right - well done!

    6. Anonymous Coward

      From the article:> get-childitem | where name -notlike Windows

      I've not used powershell a lot but the article's command "get-childitem | where name -notlike Windows" didn't work on this Windows 7 'box'.

      dir | find /v "Windows" which is case sensitive does work, and is one command.

      dir | find /v /i "Windows" is the case insensitive version.

      1. theOtherJT

        Re: From the article:> get-childitem | where name -notlike Windows

        This is true.

        get-childitem | where {$_.name -NotLike "Windows"}

        is what you want here.

      2. AdamFowler_IT

        Re: From the article:> get-childitem | where name -notlike Windows

        Just tested again on a Windows 7 box from the standard Windows PowerShell window, and it works for me:

        get-childitem | where name -notlike Windows

        What error are you seeing when trying? PowerShell is pretty good at telling you what's wrong.

    7. This post has been deleted by its author

    8. AdamFowler_IT

      How is "dir c:\*.* /s | more" the same as "get-childitem | where name -notlike Windows"? Yours will stop at each page waiting for a keypress; mine lists everything excluding the directory name 'Windows'.

  3. Anonymous Coward

    Obscure knowledge got me a job ....

    1996 - for various reasons been out of IT for a few years. Had to start low, so applied for a support job. Technical Director asked me:

    "Imagine you haven't got a text editor. How would you create a file."

    I guessed he was expecting echo "This text">C:\FILE.TXT. What I said was "I'd use the DEBUG command to write to a text file.".

    Turns out he'd not heard of that one, and he said so. MD heard him and hired me.

    1. Anonymous Coward

      Re: Obscure knowledge got me a job ....

      Or you could just use EDIT.

      I guess for some people proving they know something abstract is more important than the simple answer.

      1. Anonymous Coward

        Re: Obscure knowledge got me a job ....

        Erm, the constraint was "you haven't got a text editor"

        1. Tom 13

          Re: the constraint was "you haven't got a text editor"

          I would have opted for Edlin. Nobody who has ever used it has ever confused it with a text editor.

          1. david 12 Bronze badge

            Re: the constraint was "you haven't got a text editor"

            Edlin was a Line Editor. Which was still a well-known kind of thing in the world when PC DOS was first introduced.

      2. Dave 126 Silver badge

        Re: Obscure knowledge got me a job ....

        copy con reply.txt

        Of course you could also do this

        And when you had finished, you would

        Ctrl-Z
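The Unix cousin of copy con is redirecting cat from the terminal and finishing with Ctrl-D; a non-interactive sketch, with a heredoc standing in for the lines you would type:

```shell
# 'copy con reply.txt', Unix style: cat reads "the console" until
# end-of-file (Ctrl-D at a live terminal; a heredoc here).
cat > /tmp/reply.txt <<'EOF'
first line typed at the console
second line typed at the console
EOF
cat /tmp/reply.txt
rm -f /tmp/reply.txt
```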

        1. Ol'Peculier

          Re: Obscure knowledge got me a job ....

          Glad I wasn't the only one that thought of that solution.

          Grief. I feel old...

        2. omnicent

          Re: Obscure knowledge got me a job ....

          Damn, I wanted to add the "copy con" command, that was my fav....

          1. david 12 Bronze badge

            Re: Obscure knowledge got me a job ....

            Real programmers use

            copy con program.exe

          2. Anonymous Coward

            Re: Obscure knowledge got me a job ....

            Hell, copy con wasn't my favorite DOS command, but it was the first one ever shown to me.

            It was only 30 years ago, but this article makes it feel like yesterday, I feel more stuff dripping out of my UMB with every comment. Thanks for taking us all back, Adam.

            1. AndrueC Silver badge

              Re: Obscure knowledge got me a job ....

              My memory actually goes back a little further than copy con...

              pip test.txt=con:

              :D

        3. AndrueC Silver badge

          Re: Obscure knowledge got me a job ....

          And when you had finished, you would

          Ctrl-Z

          Nah. I'd hit F6 instead :)

      3. HipposRule

        Re: Obscure knowledge got me a job ....

        Edlin surely, didn't edit only come in with DOS 5?

        1. Colin Miller

          Re: Obscure knowledge got me a job ....

          Edlin was in MS-DOS 3.1

    2. Anonymous Coward

      Re: Obscure knowledge got me a job ....

      Ah: con, aux, prn... they still lurk in today's cmd line.

      copy con "file.txt" (CTRL-Z)

      Not so long ago a test program I was using, designed to test file system security permissions, would occasionally break for no apparent reason. The youngish chap who wrote it was randomly generating file names. After digging through a lot of tracing, I found that every now and then the random file name generator was attempting to create files called "con", "aux" and "prn", throwing a big spanner into the works where none was expected.

      Just occasionally old arcane knowledge comes in handy.

      1. Pirate Dave

        Re: Obscure knowledge got me a job ....

        Ehhh, I still use copy con when I'm creating small test files. Old habits die hard.

        And actually, on several of my 2012 servers, I use batch files for backup jobs (along with an rsync client). Easier and more comprehensible than PowerShell, IMHO. And more straightforward. Put too many PowerShell one-liners in a script, and a year later I'm like "what the fuck did I do here?" The only time that happens in a batch file is if I try to get really fancy with a FOR command.

        PowerShell has some uses, although after 4 years of using it, I really think its main strengths are dealing with "new" Windows or Office 365 features that are PowerShell-enabled. For stuff like that (especially dealing with O365, and to a lesser extent, AD), some of it is much, much easier to do in PowerShell than in a GUI, webpage, or a batch file. But many times it's easier to do something using a simple batch file and an executable utility than to try to figure out the arcanum to invoke it in PowerShell (or worse, have to drop to the underlying .NET stuff).

        And the key to keeping your sanity is to remember Powershell is a SCRIPTING language, not a PROGRAMMING language, even though MS tried really, really hard to make it look like programming.

        1. John 104

          Re: Obscure knowledge got me a job ....

          The trick to remembering "what the fuck did I do here" is to write comments in your script so you don't have to remember in 6 months...

          And in case you don't know how to do that....

          #

          Personally, I've been using it since inception and it is hands down better than the endeared DOS command line. And it blows the shit out of VBS.

          Stay current, learn new tools, stay employed...

          1. Pirate Dave

            Re: Obscure knowledge got me a job ....

            To each his own, I guess.

            I never cared for VB script. For places where I could have used it, I'd usually go into VB6 instead. Always seemed easier to just copy an EXE around to various machines than deal with the scripting engines.

        2. Adam 1 Silver badge

          Re: Obscure knowledge got me a job ....

          >Only time that happens in a batch file is if I try to get really fancy with a FOR command.

          Or any other processing involving the system date; stuff like "rename that zip file with the prefix 20150428" is a right PITA with batch files.

          1. Pirate Dave

            Re: Obscure knowledge got me a job ....

            "stuff like rename that zip file with the prefix 20150428 is a right PITA with batch files."

            Yes, but the newer versions of the SET command can do that kind of stuff fairly easily (for some values of "easily"); it's just dog-ugly syntax and a real "WTF?" moment a year or two later without comments explaining it.
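For the curious, the dog-ugly syntax in question is substring slicing of %DATE%, which depends on the system's date format (hence the WTF a year later); the POSIX equivalent, shown for contrast, is a single date call. Filenames here are illustrative:

```shell
# cmd.exe version (commented out for comparison; assumes a US-style
# %DATE% like "Tue 04/28/2015", which is exactly the fragile part):
#   set prefix=%date:~10,4%%date:~4,2%%date:~7,2%
#   ren backup.zip %prefix%_backup.zip

# POSIX shell: one format string, no locale guessing.
prefix=$(date +%Y%m%d)
touch /tmp/backup.zip
mv /tmp/backup.zip "/tmp/${prefix}_backup.zip"
ls "/tmp/${prefix}_backup.zip"
rm -f "/tmp/${prefix}_backup.zip"
```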

    3. Alan Brown Silver badge

      Re: Obscure knowledge got me a job ....

      COPY CON (which lots have beaten me to)

      And F3/F8 plus others for command line editing.

      I actually went out and bought a copy of IBM DOS 3.3 specifically because the manual which came with it came with full explanations and examples for every single command.

      The MS-DOS manuals were poor things by comparison.

      That IBM manual stayed around until the late 1990s, because it was a useful reference tome. It went walkies when shifting house and I suspect one of my "helpers" decided it was more use to him than to me.

  4. This post has been deleted by its author

    1. launcap Silver badge

      Re: paths

      >> “Bad command or file name” message

      >To be fair, that's true in *n*x shells as well.

      However - most *n*x shells are not set up (unless the sysadmin is *really* clueless) with . as one of the PATH entries.

      1. Alan Brown Silver badge

        Re: paths

        Even most clueless unix sysadmins generally set "." as the _last_ PATH entry.

        With DOS it was the first one, which made trojan horsing much easier

        I had to deal with a number of infestations caused by that and the old BIOS flaw of looking at the floppy before trying to boot off HDD. Every single computer I worked on that could have the boot order changed, did so (often to the consternation of "experts" who would advise users not to boot with floppies in, and then have their demonstrations fail)

        1. AndrueC Silver badge
          Meh

          Re: paths

          With DOS it was the first one, which made trojan horsing much easier

          True, and arguably it's even worse than that, because command.com always looks in the current directory and only resorts to PATH after checking the cwd. So even if your PATH variable is empty you'll still execute programs in the cwd. I'm pretty sure there is no way to stop command.com looking in the cwd first, but I vaguely recall that with 4DOS at least, if PATH did contain '.' then it overrode the default behaviour, and thus at least allowed you to push it further down the list.

          1. david 12 Bronze badge

            Re: paths

            Yes, PCDOS 1.x was primarily a floppy disk system: it always ran programs off the current floppy disk, and did not require a path to do so. When it morphed into something completely different, this started to be a problem.

            No, it did not require a path, and dot was never an element of the path. It did become common to put things like root or dot-dot into the path, and the order of the path was commonly set for optimum speed, because, until very late in the piece, MS-DOS did not cache the file listing, and even on hard disks a path search could be noticeably slow.

        2. tom dial Silver badge

          Re: paths

          Not US DoD administrators, and never, ever, for those with privilege.

  5. boltar Silver badge

    Piping and conditional logic

    That's where command lines excel. GUIs are great for single tasks that can be visualised - e.g. drag and drop or button clicking - but for disparate, non-visual or abstract tasks that need to be linked together and require some glue logic and maybe looping... well, some GUIs have been designed that can do that (the Scratch programming language, for example), but it's easier just to use an old-fashioned command line, whether it's PowerShell on Windows or bash on Linux.
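
    A minimal bash sketch of the kind of glue logic meant here - loop, condition, arithmetic, then a summary - which has no natural GUI equivalent (file names and paths invented for the demo):

    ```shell
    # Glue logic: loop over files, apply a condition, accumulate, report.
    mkdir -p /tmp/glue-demo
    printf 'one\ntwo\n' > /tmp/glue-demo/a.txt
    printf 'three\n'    > /tmp/glue-demo/b.txt

    total=0 count=0
    for f in /tmp/glue-demo/*.txt; do
        [ -r "$f" ] || continue        # conditional logic
        lines=$(wc -l < "$f")
        total=$((total + lines))       # arithmetic glue
        count=$((count + 1))
    done
    echo "$total lines across $count files"
    ```

    which prints "3 lines across 2 files".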

    1. phuzz Silver badge

      Re: Piping and conditional logic

      To use the example of Exchange: if you have to check that the pager number for a particular employee is correct, you're probably better off using a GUI.

      If you have to change the pager number for a whole group of people then the command line is a better choice.
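
      The bulk change might look something like this in the Exchange Management Shell - a sketch only; the cmdlet and parameter names (Get-User, Set-User, -Pager, the 'Operations' department) are assumptions to be checked against your Exchange version:

      ```powershell
      # Hypothetical sketch: update the pager number for everyone in one
      # department in a single pipeline, instead of clicking through each
      # person in the GUI. Verify cmdlet/parameter names locally.
      Get-User -Filter {Department -eq 'Operations'} |
          Set-User -Pager '555-0100'
      ```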

      Your level of comfort with a command line could be quantified by how many times you're willing to repeat a manual task before busting out a batch file to automate it.

      (Note to kids: a pager was a sort of proto-smart watch that you wore on your belt rather than your wrist, but it had its own radio so you didn't require a phone)

      1. John 104

        Re: Piping and conditional logic

        (Note to kids: a pager was a sort of proto-smart watch that you wore on your belt rather than your wrist, but it had its own radio so you didn't require a phone)

        Wins the internet for the day.

    2. Jim 59

      Re: Piping and conditional logic

      Big up the command line in general. For many of us, it was also a natural progression from the 1980s home computer usage.

    3. Alan Brown Silver badge

      Re: Piping and conditional logic

      "GUIs are great for single tasks that can be visualised - eg drag and drop"

      If you've ever seen what goes on behind the scenes with "drag and drop" you'd run screaming.

      Multiselect (Unixen, Macs and Windows alike) results in a series of INDIVIDUAL commands. I've seen many systems brought to their knees by that kind of shitty behaviour (particularly if the target directory is on a network fileserver)

    4. Adam 1 Silver badge

      Re: Piping and conditional logic

      Many PowerShell-aware applications basically generate the appropriate cmdlet for what you clicked. This lets you do it through the UI, then grab the script and do it in bulk.

  6. Extra spicey vindaloo

    "Commands and paths had to be typed in full in the MS-DOS days, there was no fancy time-saving command-line auto completion. This feature popped up much later in the picture, when the command prompt had almost become a forgotten artifact of the pre-GUI era."

    DOSKEY was available from MSDOS 5.0 and later.

    PowerShell is powerful, but I use it maybe once a month, and then mostly for Chocolatey.

    But I have a command window open on my machine nearly all day. At the moment there are 3 of them.

    Dir *.txt is a hell of a lot faster to find a file than trying to find it in a window.

    1. TonyJ Silver badge

      But I have a command window open on my machine nearly all day. At the moment there are 3 of them.

      Similar here, but I use a PS window. Since it does everything a command window does, plus the PS bits, I prefer it that way.

      Each to their own. :)

      1. Pirate Dave

        One of the things I miss most frequently in PS is the switches to the DIR command.

        dir /a-d

        dir /o-g

        those are mighty handy at times. I seem to recall they can be sort of emulated with scripting in your profile, but it would have been nice if MS had added them by default.
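
        For what it's worth, rough equivalents can indeed be dropped into a PowerShell profile - a sketch only; the function names are invented, and -File needs PowerShell 3.0 or later:

        ```powershell
        # dir /a-d (files only, no directories) - rough equivalent
        function dirf { Get-ChildItem @args -File }

        # dir /o-g (group directories last) - rough equivalent:
        # files sort before directories because $false sorts before $true
        function dirg { Get-ChildItem @args | Sort-Object PSIsContainer, Name }
        ```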

    2. billdehaan

      DOSKEY was available from MSDOS 5.0 and later.

      And there were many other keyboard and command line enhancers before then. My personal favourite was Chris Dunford's CED (command editor) back in 1987, three years before DOS 5 came out. I liked it enough that I bought the professional version, PCED. Sadly, it had some conflict with the SmartKey keyboard enhancer, as I recall, but by then we were already playing with the (late, but not lamented) Command Plus shell, and later 4DOS, rendering DOSKEY moot.

      Sadly, I now have a quirky application which for unknown reasons doesn't like to run in TCC/Take Command, and so I *must* launch it from a DOS shell, forcing me to learn/relearn DOSKEY, thirty years later...

    3. david 12 Bronze badge

      Fn keys were available from DOS 1 and later. Giving you, well, very very basic command line completion.

    4. AdamFowler_IT

      You can still do that command in a PowerShell window.

  7. Len Goddard

    Discovery

    I was completely unaware of Powershell until I found it on the Win 10 Tech Preview. It is a definite step in the right direction, I suppose, but for me it won't replace the cygwin linux command line / utilities package I always install on a new system before I do anything else.

    1. Lee D Silver badge

      Re: Discovery

      I stopped using PowerShell the moment I realised that a simple AD command to do something (I think it was related to promotion of a certain role, but I can't remember off-hand) had gone from an 8-character name to something so long and unguessable that, even with autocomplete, there were ten similarly named, stupidly long, easy-to-confuse commands. In any tutorial they had to be written out correctly, and not jump off the sides of the screen when you typed them, because otherwise it was too easy to hit the alternates.

      That's not what you want when you're playing with AD in a Powershell box.

      1. Anonymous Coward
        Anonymous Coward

        Powershell - I knew it well

        There I was on site in Foreign lands trying to get Powershell to failover a Server 2008R2 CORE cluster.

        The problem was that the Cluster commands were not available.

        The problem was that a 'Mandatory' (WTF?) patch had disabled/removed them.

        An OPTIONAL patch, if it had been applied, would have re-enabled them. Doh!

        How many hours of head bashing resulted? Far too many.

        Onto the command language.

        It seems that the Powershell designers have taken the worst bits of DCL (Digital Command Language) from VMS and used them. After years and years of using VMS even I find the syntax frankly *******.

        1. A Non e-mouse Silver badge

          Re: Powershell - I knew it well

          It seems that the Powershell designers have taken the worst bits of DCL (Digital Command Language) from VMS...

          Windows NT was inspired by VMS due to the work of Dave Cutler who worked on VMS at Digital before moving to Microsoft to work on NT. en.wikipedia.org/wiki/Dave_Cutler

        2. Roo
          Windows

          Re: Powershell - I knew it well

          "It seems that the Powershell designers have taken the worst bits of DCL (Digital Command Language) from VMS and used them. After years and years of using VMS even I find the syntax frankly *******."

          I'm glad someone else has had that thought. I spent a couple of years using DCL, tried bourne shell and never looked back... Until I tried PowerShell, which reminded me why I didn't go back to DCL. ;)

          For the record, I didn't actually mind DCL when I was using it - but then again I didn't know what I was missing until I got a couple of weeks of Bourne shell under my belt...

      2. jsnover [MSFT]

        Re: Discovery

        I always designed with the admin at 3am in the middle of an IT crisis in mind. I thought about the desperation that person would feel if they were trying to understand what had occurred and opened up a PERL script and needed to understand it. When that person opens up a PowerShell script, they will be able to read it and understand what happened.

        That is why things tend to be more verbose - because verbosity is your friend when the chips are down. As you correctly point out, verbosity is not your friend for interactive use. That is why we provide aliases, positional parameters, wildcards, etc.
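
        The trade-off shows up directly in the two ways the same pipeline can be typed - illustrative only, the paths are invented:

        ```powershell
        # Script form: verbose and self-documenting at 3am
        Get-ChildItem -Path C:\Logs -Filter *.log |
            Sort-Object -Property Length -Descending |
            Select-Object -First 5

        # Interactive form: aliases and positional parameters, same result
        gci C:\Logs *.log | sort Length -Descending | select -First 5
        ```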

        At the end of the day, we build tools to make you successful so if you are successful with the tools you are using - then it's all good.

        Jeffrey Snover [MSFT]

        Distinguished Engineer

        1. HmmmYes Silver badge

          Re: Discovery

          But why does my PowerShell take several tens of seconds before I can enter a command?

          Is PowerShell meant to be a programming environment, or an interactive one? It seems to be stuck between the two.

          I'm trying to move all my Windows stuff to PowerShell and I'm finding it a PITA. It's powerful, sure, but a shell???

          Oh, and don't get me started on all the different versions. Did you not learn anything from WinSock?

        2. Roo
          Windows

          Re: Discovery

          "I always thought about the admin at 3am in the middle of an IT crisis in mind. I thought about the desperation that person would feel if they were trying to understand what had occurred and opened up a PERL script and needed to understand it. When that person opens up a PowerShell script, they will be able to read it and understand what happened."

          People can write unreadable code in pretty much any language out there, and at 3AM the chances are grokking anything is going to be harder than usual... So instead of forcing people to learn new tools, syntax and conventions at 3AM, how about just presenting them with something familiar & well proven - like Python packaged with a bunch of libs to facilitate doing tasks on Windows boxes?

          I'm guessing it's down to our old friend "Not Invented Here".

    2. sawatts

      Cygwin

      - ditto -

      I tend to use Cygwin on every Windows box I have to use. I know that some people are thrilled with PowerShell, but I use the same Bash scripts on UNIX and Windows.

      Anyone else remember that Windows NT originally came with a POSIX-subsystem?

      1. Roo
        Windows

        Re: Cygwin

        "Anyone else remember that Windows NT originally came with a POSIX-subsystem?"

        Yes, I do. I mainly remember because it wasn't actually shipped with the POSIX subsystem in working order (as of NT 3.51); you had to install it off an extra CD. The advertising was very misleading. In my experience that feature was more successful at convincing mentally defective PHBs that NT could run code currently running on UNIX boxes than it was at actually doing its job...

  8. JimmyPage Silver badge
    Linux

    GUIs can be great ...

    Ongoing argument with my penguin mad brother over preferring GUIs - in *some* situations.

    Because a good GUI can help a great deal towards presenting a quick logical overview of what on earth it is you are doing. You can grey out controls, or link them so that you know selecting an option requires additional parameters. You can also ensure mutually exclusive commands can never be issued. And you can provide tooltips to assist with more obscure or lesser-used options. Best of all, you tend to work in generics rather than specifics - you want the outcome to be "Delete temporary files on completion" - or is it -B? -D? --delete-temp-files-on-exit? --cleanup?

    However, I do like Linux, so --->

  9. Yugguy

    Repetitive, mind-numbing menial tasks are IMPORTANT

    It's what stops uppity 2nd-line people from developing ideas above their station.

    1. harmjschoonhoven

      Re: Repetitive, mind-numbing menial tasks are IMPORTANT

      It's what stops uppity 2nd-line people from developing ideas above their station.

      Bare Windows® gives me the feeling I have a teaspoon to hammer a nail in a wall. UnxUtils is a lifeline, Cygwin a must and WinBatch fun.

      BTW, after 10,000+ days as UNIX superuser I do not trust anybody with that power, including myself. Cmd.exe is more benign.

  10. John Miles

    re: Windows XP was the first PC operating system to drop the MS-DOS

    So what happened to Windows 2000 Professional and Windows NT 3.5/4 Workstation - which were all PC OSes (and used on PCs, not servers)?

    1. Paul Crawford Silver badge

      Re: re: Windows XP was the first PC operating system to drop the MS-DOS

      I think he meant the first consumer-facing system. They ran in parallel with 95/98/ME and were intended for serious applications (proper 32-bit programs, multi-user, etc).

      Sadly in the push to make consumer & professional lines converge and be fast enough for gaming, compatible with older badly written software (some of it MS' of course!), etc, a lot of dumb decisions were made w.r.t. security, etc.

      1. david 12 Bronze badge

        Re: re: Windows XP was the first PC operating system to drop the MS-DOS

        Also, I think he messed up about Windows ME - one of the great complaints about it was that it wasn't possible to just "boot into a pure MS-DOS prompt by pressing the right start-up bypass keys", because they had removed that feature.

        1. AdamFowler_IT

          Re: re: Windows XP was the first PC operating system to drop the MS-DOS

          Happy to be corrected, but I don't think so:

          http://www.mdgx.com/msdos.htm

    2. davidp231

      Re: re: Windows XP was the first PC operating system to drop the MS-DOS

      NT3.1 introduced CMD.EXE (aka Command Prompt).

    3. AdamFowler_IT

      Re: re: Windows XP was the first PC operating system to drop the MS-DOS

      The same paragraph explained that:

      Windows XP was the first PC operating system to drop the MS-DOS Prompt and change it to Command Prompt, due to a change to the NT kernel. The Windows NT family has used the newer Command Prompt since it started with Windows NT 3.1, so it was nothing new on that side of the fence.

      1. John Miles

        Re: The same paragraph explained that:

        But XP derived from NT, not DOS & Windows 95 - it was the first version of the NT family sold without the NT label.

  11. jake Silver badge

    Funny thing is ...

    ... those of us who were already using UNIX[tm] in the early-mid 1980s found MS-DOS's command.com to be a brain-dead command interpreter. To this day, Microsoft hasn't really figured the concept out. IMO, of course.

  12. Alan Sharkey

    Back in the 90s I used to teach courses on how to get the most amount of memory out of 640Kb. I also wrote DOS programs (anyone remember EasyEdit - the best text editor at the time, so Byte said) which used overlays to move stuff in and out of extended memory to leave the most free real memory.

    It was fun in those days. Windows has dumbed us all down.

    Alan

    1. Jamie Jones Silver badge
      Facepalm

      "Windows has dumbed us all down."

      Indeed. I'm sure we've all heard:

      • Have you tried switching it off and on again?
      • Yeah that file is corrupt, you need to reinstall the whole operating system
      • Yeah, all computers slow down if you don't reboot them every few days, and rebuild them once a month

    2. jake Silver badge

      @ Alan Sharkey

      I just fired up my 1988 386sx16, math-co, 8megs, 40meg, 1meg on VLB video card ... DOS 5.0 (mouse driver loads high automatically! [was HUGE back then ...]), DESQview, QEMM, Windows 3.0, Lotus, dBaseIII+, WP ... Still runs nicely, is pretty snappy, even. Windows ran nicely under DESQview, but it seemed rather pointless ...

      The "640K should be enough" attributed to Bill Gates is a myth. On the original IBM PC, MS/PCDOS could use 760K(ish) of so-called "low-mem", before it ran into IBM's built-in hardware stoppage. Which was an IBM hardware issue, not a Microsoft coding issue. Eventually, we figured out how to use nearly 950K of low-mem.

      The real "should be enough" quote was from Steve Jobs, when demoing the original Apple Macintosh at the Home Brew Computer Club, a couple weeks before the official unveiling. He said, and I quote, "256K should be more than enough for home users" ... and he had a point. We had flight simulators running in 64K of RAM back then.

      EasyEdit? I've been using vi from time immemorial ...

      Sometimes I look at the modern world and despair over the sheer waste ...

      1. Richard Plinston Silver badge

        Re: @ Alan Sharkey

        > On the original IBM PC,

        On the _original_ IBM PC (5150 Model A) it would only support 256Kb max, no matter how many cards you could afford. Base memory was 16Kb for ROM BASIC and the cassette port. Model B (I have one here) supported max 640Kb.

        > MS/PCDOS could use 760K(ish) of so-called "low-mem", before it ran into IBM's built-in hardware stoppage.

        IBM reserved the areas above 640Kb for hardware adaptor memory. The CGA card occupied addresses at 640Kb. If only an MDA or Hercules card was used then another 64Kb could be used to give 704Kb. Anything beyond that required memory management hardware such as an EMS or EEMS card that could switch address spaces around.

        However, later MS-DOS (5 or later), DR-DOS, QEMM or others on a 286 or later could emulate EMS and could shift the OS into high memory or beyond 1Mb.

        > Eventually, we figured out how to use nearly 950K of low-mem.

        Not on an 8088-based PC or PC XT you didn't. There were machines that could support almost the full addressable 1Mb of an 8086/8088 - the SCP Zebra series, for example, or other S100-bus based systems. The Sharp MZ-5600 that I have here could also utilise 512Kb for OS and programs; the other 512Kb of address space was reserved.

        I do have other 8088/8086 machines that can use the full 1Mb but they run Concurrent-CP/M-86 on several serial terminals.

        1. jake Silver badge

          Re: @ Alan Sharkey

          "On the _original_ IBM PC (5150 Model A) it would only support 256Kb max"

          You never piggy-backed RAM, Richard Plinston?

  13. Liam Proven

    From /In the Beginning was the Command Line/ by Neal Stephenson

    [...] Note the obsessive use of abbreviations and avoidance of capital letters; this is a system invented by people to whom repetitive stress disorder is what black lung is to miners. Long names get worn down to three-letter nubbins, like stones smoothed by a river.

    This is not the place to try to explain why each of the above directories exists, and what is contained in it. At first it all seems obscure; worse, it seems deliberately obscure. When I started using Linux I was accustomed to being able to create directories wherever I wanted and to give them whatever names struck my fancy. Under Unix you are free to do that, of course (you are free to do anything) but as you gain experience with the system you come to understand that the directories listed above were created for the best of reasons and that your life will be much easier if you follow along (within /home, by the way, you have pretty much unlimited freedom).

    After this kind of thing has happened several hundred or thousand times, the hacker understands why Unix is the way it is, and agrees that it wouldn't be the same any other way. It is this sort of acculturation that gives Unix hackers their confidence in the system, and the attitude of calm, unshakable, annoying superiority captured in the Dilbert cartoon. Windows 95 and MacOS are products, contrived by engineers in the service of specific companies. Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic.

    What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again--making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.

    http://steve-parker.org/articles/others/stephenson/oral.shtml

    1. Anonymous Coward
      Anonymous Coward

      Re: From /In the Beginning was the Command Line/ by Neal Stephenson

      +1 for the Neal Stephenson reference. I have a copy on my bookshelf.

  14. Liam Proven

    Which is to say...

    The point being, important lessons were learned building the Unix shell. Yes there's cruft too -- it's over 40 years old. But it's polished smooth, for all that.

    PowerShell learns few of those lessons.

    George Santayana said: "Those who do not remember the past are condemned to repeat it."

    Henry Spencer modified this to: "Those who do not understand Unix are condemned to reinvent it, poorly."

    Microsoft is still learning to reinvent Unix -- slowly separating text-mode core OS from graphical layer; learning the importance of a rich command line; learning to write graphical commands that emit said CLI, easing automation. But it's not doing it terribly well.

    The trouble is that the Stockholm Syndrome world of corporate IT has been brainwashed into believing that it's the only way and to frantically deny the Great Heresy that is Unix.

    1. Anonymous Coward
      Anonymous Coward

      Re: Which is to say...

      "slowly separating text-mode core OS from graphical layer"

      More accurate to write: "slowly reseparating". Recall that Cutler's team started with text-mode before Bilge ordered the GUI bolted on regardless.

      1. Roo
        Windows

        Re: Which is to say...

        "More accurate to write: "slowly reseparating". Recall that Cutler's team started with text-mode before Bilge ordered the GUI bolted on regardless."

        As good as some of Cutler's work has been and as smart as he is, I feel people are a bit too quick to put him on a pedestal when it comes to WNT.

        1) I would fully expect WNT development to have started out with "text-mode" - simply because developing all those graphics drivers, GUIs and supporting libraries would have taken a very long time. I would *expect* Cutler et al to have debugged & interacted with those early kernels via "text-mode" over an RS232 port, or perhaps via a VGA card (text only - natch).

        2) When Cutler was hired and developing NT, GUIs were the thing people wanted to buy, so he should have known up front that a GUI would be the main way of interacting with the new OS; he would have to have been deaf, dumb and blind not to see which way the wind was blowing at Redmond. To give Cutler his due, I am fairly certain he would have had a big problem with a lot of aspects of the bits outside of the kernel on WNT, and would agree that WNT would have looked totally different if Cutler had full control over its development... Pretty sure he would have strangled Win32 in its cot for starters. ;)

        The reason why OSes & drivers were often developed in "text-mode" is that driving an RS-232 interface or VGA card doesn't require much in the way of code, and there is very little to go wrong with it. For those reasons a lot of UNIXen, their admins & users have carried on using "text-mode". That said, I fully expect pretty much any Linux distro to boot into a GUI and work by default these days. ;)

    2. John 104

      Re: Which is to say...

      Microsoft is still learning to reinvent Unix -- slowly separating text-mode core OS from graphical layer; learning the importance of a rich command line; learning to write graphical commands that emit said CLI, easing automation. But it's not doing it terribly well.

      You could say the same thing in reverse for Linux and the desktop. I can't recall the number of flavours of Linux GUIs/apps I've tried over the years just to toss them out because they were too much hassle to make work. In the end, as a consumer of a desktop OS, I want to use it for productivity.

      Ubuntu is the latest trend and it is getting better, but I would never throw it at my users.

      And managing users in nix is a joke. LDAP is the king, and MS has the single best implementation of that technology to date.

      Back on topic, Unix systems have had, hands down, the best command line power for decades. At this point, I'd say PowerShell is getting MS to where it needs to be to be a serious tool for command line junkies. But it sure wasn't there to begin with!

      1. Roo
        Windows

        Re: Which is to say...

        "LDAP is the king"...

        Sure, and it's been available for UNIX boxes forever...

  15. Bleu

    Having had a Unix and proprietary mainframe

    upbringing, and micros at home, I am less than impressed by Microsoft's efforts on the shell and command-line fronts.

    After all, there was clearly a fight over whether or not to continue it at all. Nice to see that sanity prevailed.

    That said, its roots in DOS are very clear, and it isn't much good as a consequence. Still useful, though.

  16. Kubla Cant Silver badge

    Not dead yet

    Curiously, there seems to be evidence that the command prompt in Windows still has a tiny spark of life in it. (Or maybe I'm late discovering features in the obscure and hard-to-find documentation.) "set /?" now delivers three screens of help, and includes features like string replacement and delayed variable expansion. You can write surprisingly capable scripts now. Unfortunately there seems to be some rule that any new feature has to be invoked by obscure metacharacters. I suspect that this is a legacy of the original feeble MS-DOS parser.
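
    For anyone who hasn't waded through "set /?", the two features mentioned look like this (a small sketch; the variable names are invented, and the metacharacter soup speaks for itself):

    ```batch
    @echo off
    setlocal enabledelayedexpansion

    rem String replacement: %VAR:old=new%
    set greeting=hello world
    echo %greeting:world=cmd%

    rem Delayed expansion: !VAR! is re-read on every loop pass, whereas
    rem %VAR% was fixed once when the parenthesised block was parsed.
    set n=0
    for %%f in (a b c) do (
        set /a n+=1
        echo item !n! is %%f
    )
    ```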

    I'm reasonably sure that the first versions of MS-DOS did offer command-line editing. It used function keys F1 to F9(?) and it's still available in Windows 7, although some of the functions now produce a popup prompt which obviously wasn't there in MS-DOS.

    1. Vic

      Re: Not dead yet

      I'm reasonably sure that the first versions of MS-DOS did offer command-line editing.

      I don't remember a version that didn't - although I might simply have forgotten some...

      The early stuff used F1-F3; F1 would repeat the next character in the history buffer. F2 and a character would repeat up until the next occurrence of that character (and ISTR you could prefix a number to repeat up until the nth occurrence), and F3 would repeat the whole line.

      Vic.

    2. david 12 Bronze badge

      Re: Not dead yet

      >seems to be some rule that any new feature has to be invoked by obscure metacharacters.

      Not the legacy of the original feeble MS-DOS parser - it's the legacy of ALGOL, as also seen in PowerShell, combined with the legacy of people who are single-finger typists.

      Some people like FORTRAN and COBOL because they can type. Some people like obscure metacharacters because they can't.

  17. David Harper 1

    They say imitation is the sincerest form of flattery

    It's nice to see that Microsoft is finally catching up to where Unix was, circa 1975 :-)

  18. Jon Massey
    Trollface

    That's awkward...

    <tab> is a bit of a complicated completion command sequence to remember - by the time you'd typed all that you could have written the whole command out instead!

  19. Mage Silver badge
    Facepalm

    Written badly

    "Windows XP was the first PC operating system to drop the MS-DOS Prompt and change it to Command Prompt, due to a change to the NT kernel. The Windows NT family has used the newer Command Prompt since it started with Windows NT 3.1, so it was nothing new on that side of the fence."

    All versions of NT from 3.1 to 5.1 (XP) had the choice of a 32-bit native console (which looked like a DOS prompt) or running the MS-DOS shell via NTVDM (a real DOS prompt).

    NT also could run OS/2 scripts and console executables as well as native NT scripts.

    MS Services for Unix (1999) added Bourne Shell to NT4.0

  20. John Sanders
    Linux

    Powershell

    It behaves more like a base scripting language (a summary of the worst of bash, PHP and Perl).

    Then you need application extensions to provide extra functionality before the base scripting language is of any use.

    In Unix you have scripting languages, and anything that you install/add to the system is already available to anything else you can call from the shell, or anything that can start a shell, regardless of the language.
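
    A bash sketch of that composability: three unrelated tools glued by nothing more than plain text, with no object model taught to any of them (the input lines are invented for the demo):

    ```shell
    # Count unique first fields: nothing here knows about anything else;
    # plain text on a pipe is the entire contract between the tools.
    printf 'vmware up\nexchange up\nexchange down\n' |
        cut -d' ' -f1 |     # any field extractor
        sort | uniq -c |    # any counter
        sort -rn            # any sorter, biggest first
    ```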

    Say whatever you want about PowerShell's object-oriented usage: you will not do much with Exchange objects by feeding them to VMware unless VMware adds support for them, nor would you be able to use those objects on tomorrow's latest fart without MS's intervention.

    As I said to a Windows colleague once, "Relax; you just have discovered scripting, I was as excited as you when I discovered what you could do with .bat files back in the late 80's"

  21. b166er

    'MS-DOS was lacking other features, too, that many would now consider unforgivable. After typing out an incredibly long command and realising there was an extra letter at the very beginning, all you'd end up with was an unusable chunk of text.' - a bit like when you've composed a txt or email then realise you have a spelling error and are sadly using an iPhone instead of a phone with editing capabilities.

    1. AIBailey

      ... or you'd hit space, then press F3 to repeat the rest of the command. Not sure what version of DOS that keyboard shortcut was added in, but it's saved me a whole load of typing in the past.

    2. david 12 Bronze badge

      all you'd end up with was an unusable chunk of text.'

      >After typing out an incredibly long command and realising there was an extra letter at the very beginning, all you'd end up with was an unusable chunk of text.'

      Except it wasn't actually like that. You just backspaced through it, deleted the character, then replayed the characters.

      If you had already attempted the command, you could use the right function key to replay the very long command, then backspace through it.

  22. Permidion

    UNICODE/UTF8

    PowerShell still doesn't correctly support Unicode/UTF-8 characters.

    Try pasting 漢字 into it: it won't work (funnily enough, you can even crash PowerShell that way), even after changing the codepage.

    PowerShell is maybe better than the old command terminal, and it is fine for managing a Windows system, but it is still a long way from a really usable terminal window.

    I still have to use a Cygwin terminal or some specific terminal (PythonWin) when handling Unicode/UTF-8 and displaying something more useful than lines of ?s.
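    The underlying mismatch can be shown outside PowerShell entirely: legacy console codepages such as cp437 simply have no representation for CJK characters, so anything that renders through them falls back to replacement characters. A minimal Python sketch of that effect:

    ```python
    # Why 漢字 turns into "??" on a legacy-codepage console: the characters
    # exist in UTF-8 but not in OEM codepages like cp437 or cp850.
    text = "漢字"

    # UTF-8 handles them fine: each of these two characters takes three bytes.
    utf8_bytes = text.encode("utf-8")
    print(len(utf8_bytes))  # 6

    # A legacy OEM codepage cannot encode them at all...
    try:
        text.encode("cp437")
    except UnicodeEncodeError:
        print("cp437 cannot represent these characters")

    # ...so consoles substitute replacement characters: the "lines of ?".
    print(text.encode("cp437", errors="replace"))  # b'??'
    ```

    The same substitution is what a terminal stuck on an OEM codepage does on display, which is why switching the codepage (chcp) alone often isn't enough.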

    1. ratfox Silver badge

      Re: UNICODE/UTF8

      That's pretty bad. I understand that PowerShell is considered by its users to be superior to bash, but at least that's a problem bash does not have.

      I can totally imagine the reasons why MS would have developed its own shell rather than going with bash: bash was considered the competition, it would have been losing face to adopt it, they were intelligent enough to create something better and influential enough to get their solution accepted, and so on…

      Feels a lot like something Google would do nowadays. MS seems to have grown humble in comparison.

    2. Not Terry Wogan

      Re: UNICODE/UTF8

      http://stackoverflow.com/a/5808445/465415

  23. martinusher Silver badge

    Bash?

    The MS-DOS shell, like most things Microsoft, is just a poor copy of a standard program: in this case 'sh' or, in Gnuese, 'bash'. Even with tweaks and enhancements it doesn't do most of the things you can do with a real shell, including starting scripts as programs with the '#!' construct.

    If you do a lot of embedded work but you're stuck with a Windows platform (a common scenario) then you find that the tools for the most part are 'ix' based, using Cygwin as a shim to give you something like proper OS functionality. This has the side effect of not only giving you a bash to work with if you need it but also being able to use standard commands directly from the Windows command prompt.
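    The '#!' construct mentioned above can be demonstrated with any interpreter. A minimal sketch, assuming a Unix-like system with `/usr/bin/env` available (this is exactly the mechanism cmd.exe lacks):

    ```python
    # Write a tiny script whose first line is a '#!' (shebang), mark it
    # executable, and run the file directly -- the kernel reads the shebang
    # and launches the named interpreter for us.
    import os
    import stat
    import subprocess
    import tempfile

    script = "#!/usr/bin/env python3\nprint('started via shebang')\n"

    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(script)
        path = f.name

    # chmod +x: the execute bit is what lets the kernel honour the shebang.
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)

    # Run the file itself, not "python3 file" -- the shebang picks the interpreter.
    result = subprocess.run([path], capture_output=True, text=True)
    print(result.stdout.strip())  # started via shebang
    os.remove(path)
    ```

    Under Cygwin the same trick works from a Windows box, which is much of the appeal described above.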

  24. W. Anderson

    Upgrading Windows programming with NIX concepts

    For more than a decade, most Microsoft Windows users were totally ignorant of the power and productivity of the Command Line Interface (CLI) and bash/other shell scripting tools in *NIX operating systems (OS). They condemned the non-Windows OSes as "stone age" and unsophisticated, not knowing that a significant amount of administrative work and system configuration was virtually impossible to perform "efficiently and quickly" via any Windows GUI, especially on Windows servers, in networking operations, and for many applications programming projects in PC desktop environments.

    Most of my severest Microsoft Windows critics from 8-10 years back are "now" mentally blank about their earlier views, as they take to (painfully) learning and understanding the inevitability of Windows PowerShell in modern technology.

    My, how times have changed.

    1. Richard Plinston Silver badge

      Re: Upgrading Windows programming with NIX concepts

      > condemned the non_Windows OS as "stone age" and unsophisticated,

      Microsoft worked hard to make their CLI very poor so that they could point out how useless it was in order to convince users to switch to GUI. Even when MS wrote a semi-decent CLI enhancer for Windows 95/98 they didn't install it automatically, didn't mention it in the manual and hid it away.

      They even seem to have removed command line options from programs (such as net) so that users were forced to use the GUI rather than have a batch file do stuff automatically.

  25. Vinyl-Junkie
    Coat

    Falcon 3.0, the ultimate test of your config.sys/autoexec.bat skills?

    What I remember from my dim & distant past is that Falcon 3.0 required around 605K of the 640K available to run in, AND needed access to extended memory. So you had to fit the extended memory driver (himem.sys), a mouse driver (it needed one) and a sound driver enabler into 35K of memory, as well as the core OS...

    This was made slightly more, erm, interesting, by the fact that the order in which they were loaded would affect how much memory they took up. Creating bootdisks for Falcon 3.0 was an art!

    Kids today..... etc, etc (wanders off mumbling)

    1. theOtherJT

      Re: Falcon 3.0, the ultimate test of your config.sys/autoexec.bat skills?

      My personal demon was MechWarrior 2, which needed to be run off a parallel-port-connected Zip disk because there wasn't room for it on the C drive. Himem, CD-ROM drivers, sound drivers, joystick drivers, LAN drivers and Zip drive drivers... Took DAYS to finally get that bastard to start.

    2. Chris Holford
      Unhappy

      config.sys/autoexec.bat fail!

      I remember spending Christmas morning struggling with config.sys and autoexec.bat to get 'Magic Carpet' to work on our 386 computer. It had been a present for 10 yr old son; after about 3hrs I just got it to work, by which time son was totally disillusioned.

      1. Roo
        Windows

        Re: config.sys/autoexec.bat fail!

        "I remember spending Christmas morning struggling with config.sys and autoexec.bat to get 'Magic Carpet' to work on our 386 computer. It had been a present for 10 yr old son; after about 3hrs I just got it to work, by which time son was totally disillusioned."

        Funnily enough I'm still going through that nightmare with our < 10 year old kids at the moment. I installed Win 8.1, dutifully slotted in a Disney DVD and waited for it to play... OK, so there's no DVD playback, kicked off a VLC download and figured I'd put on some music from the DLNA server while we waited... Ah of course Win 8.1 doesn't support FLAC presumably because Microsoft can't afford to pay someone nothing to bundle it... As with most prior Windows installs it turned into a really boring afternoon packed with disappointment.

        It looks like I've been spoilt by Linux distros. :)

    3. PJF

      Re: Falcon 3.0, the ultimate test of your config.sys/autoexec.bat skills?

      Similar situation with an irrigation program.

      It needed a mouse, sound, and a 4-color light pen driver to be loaded before it ate about 600K of the 640. This was on a "true" IBM 286 with the 287 math co-pro.

      This was the t-o-t-l (top-of-the-line) system: 16 colors, 4 voices, a 2-button mouse, and an 84(?)-key keyboard. A proprietary 8-bit comms card and software to drive it, a dual UART RS-232 for a dedicated weather station (on a short-haul modem) and their "custom" mouse/brick (with a "special" pad), a Centronics 36 parallel port to a 9-pin (IBM) printer, and a DB-9 to the "largest" (14-inch) color display..

      Ahh, kids these days... I don't know.. That crap music they play, touch screen this 'n that, social media....

      WHEN I WAS A KID......

  26. John Brown (no body) Silver badge

    command-line programs, such as diskpart,

    Was there a DOS diskpart back then? Surely the default MS-DOS tool was fdisk.

  27. This post has been deleted by its author

  28. John 104

    Alias

    Don't fret, children. You don't really have to type get-childitem to get the contents of a directory: those memorable commands still work as aliases. You can also type ls; that's what I usually use. I'm thankful they added that as a default alias. Switching between Linux and Windows systems, I used to always manage to type the wrong one... :)

    As for the cmd line being dead: it isn't, really. You can still launch cmd from the Run box or a PowerShell window and get all your old commands back.

    You can also run commands the old fashioned way in PS by using invoke-expression or invoke-command. Then you can do all sorts of nifty management things with it like error control, writing events to log files or the windows event log, etc.

    To all of you die hard batch writers, give PS a chance. You may just expand your scripting chops and stay relevant to the IT world while you are at it...

  29. disgruntled yank Silver badge

    fondly

    "It's easy to look back now and wonder how people put up with such a manual, non-user-friendly system, but personally I still look back on it fondly. Many hours were spent learning every command available, all the switches and what they do."

    Yes, indeed. I remember the customer's employee who was working his way through the commands manual one night, and zapped most of the data.

    I have done a little scripting with PowerShell, but have not yet warmed to it. I'm not sure why.

  30. Michael Duke

    Quote: Windows XP was the first PC operating system to drop the MS-DOS Prompt and change it to Command Prompt, due to a change to the NT kernel. The Windows NT family has used the newer Command Prompt since it started with Windows NT 3.1, so it was nothing new on that side of the fence.

    Umm Windows NT 3.5 Workstation, Windows NT 3.51 Workstation, Windows NT 4.0 Workstation and Windows 2000 Professional would like to talk to you behind the bike shed :)

  31. PJF
    Mushroom

    Anyone...

    Does anyone else still try to keep their file names in an "eight-point-three" format?

    When MS went to 256.256 (98?), I just threw my hands up in surrender..
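    The "eight-point-three" constraint can be sketched as a quick check in Python. This is a hypothetical, simplified validator (real DOS allowed a few more punctuation characters), not anything shipped with DOS:

    ```python
    import re

    # Hypothetical checker for classic DOS "8.3" names: up to eight
    # characters, optionally a dot plus up to three more, no spaces.
    # (Simplified: real DOS permitted extra punctuation like ! # $ %.)
    EIGHT_POINT_THREE = re.compile(r"^[A-Z0-9_~-]{1,8}(\.[A-Z0-9_~-]{1,3})?$",
                                   re.IGNORECASE)

    def is_8_3(name: str) -> bool:
        """Return True if name fits the simplified 8.3 pattern."""
        return EIGHT_POINT_THREE.fullmatch(name) is not None

    print(is_8_3("AUTOEXEC.BAT"))                # True
    print(is_8_3("a really long filename.txt"))  # False
    ```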

  32. tom dial Silver badge

    ZCPR

    As someone else mentioned, MS seems finally to have approached the state of 1975 Unix (or maybe 1983, with Korn shell). I have to add ZCPR for Z80 based 8 bit systems. If I recall correctly, it had a passable imitation of the Unix shell, common utilities, and I think pipes, subject to the limitations of an 8 bit CPU, 64K memory, and lack of multitasking. I saw nothing better until I started playing with Minix 1 as part of a grad school class, and then found a used copy of IBM Xenix 1.0 at an amateur radio flea market. The last saved me a lot of time when learning to handle C pointers and references without the need to reboot at every mistake.

  33. Anonymous Coward
    Anonymous Coward

    Choices

    I'll grant PowerShell has certain advantages if you're running 2008R2/Windows 7 or later, where it's certain to be built in, or when performing administration tasks closely coupled with some Microsoft products. In other cases, though, why would I bother using something extremely platform-specific when there's the opportunity to use Python or a wealth of Unix-derived utilities to solve the problem, and not tie my skills to a particular OS?

    I've done some scripting, read the documentation and some of it is extremely well designed and rather neat. For most purposes, though, I'd still rather use Cygwin, Python or C++.

  34. Henry Wertz 1 Gold badge

    Re: piping

    The piping in DOS was also a nasty kludge; it did not support true pipes. It would write the ENTIRE stdout of the first command into a temporary file, and only when this was completely written out would it open the temp file and feed it into the standard input of the next command. I.e.,

    dir | more

    would write the entire result of "dir" into a temp file, then open the temp file and run it into "more".

    I assume (hope) that powershell uses actual pipes to implement pipes.
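    For contrast, a true pipe can be sketched in Python (standing in for what a proper shell sets up): both processes run concurrently and data streams directly from one to the other, with no temporary file:

    ```python
    # A true pipe: the producer and consumer run at the same time, with
    # the producer's stdout connected directly to the consumer's stdin --
    # no intermediate temp file, unlike COMMAND.COM's "dir | more".
    import subprocess
    import sys

    # Producer prints three lines; consumer upper-cases whatever it reads.
    producer = subprocess.Popen(
        [sys.executable, "-c", "print('one'); print('two'); print('three')"],
        stdout=subprocess.PIPE,
    )
    consumer = subprocess.Popen(
        [sys.executable, "-c",
         "import sys\nfor line in sys.stdin: print(line.strip().upper())"],
        stdin=producer.stdout,
        stdout=subprocess.PIPE,
        text=True,
    )
    producer.stdout.close()  # so the consumer sees EOF when the producer exits

    output, _ = consumer.communicate()
    print(output.strip())  # ONE / TWO / THREE, one per line
    ```

    (PowerShell's pipeline does stream objects one at a time between cmdlets, so the hope above is well founded.)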

    1. Charles 9 Silver badge

      Re: piping

      Are you SURE it went to a temp file and not RAM? I know at least once I overloaded a pipe which you wouldn't expect to happen with a temp file given enough free space.

      1. Richard Plinston Silver badge

        Re: piping

        > Are you SURE it went to a temp file and not RAM? I know at least once I overloaded a pipe which you wouldn't expect to happen with a temp file given enough free space.

        Yes. If it went to RAM during the first program, then when DOS tried to load the 2nd program it might not fit, or, more likely, loading it would overwrite the data.

        1. Charles 9 Silver badge

          Re: piping

          But that's what I'm saying. I've had cases of the pipe not working, probably because the second program tried to load after the first, couldn't, and DOS returned an error to that effect. Like trying to stuff a huge text file (~1MB I think) through more.

  35. timrichardson

    idiosyncratic

    A compromise between a useful shell and learning something which only has value on Windows (a platform of declining relevance to the world of servers) is bash. Git for Windows comes with bash, and it's nice to have one common shell as I move across OS X, Linux and Windows. PowerShell looks very interesting, but I have not been able to justify learning it yet. For more advanced admin, Python on Windows works well, and once again there is no new learning curve. I wonder if the new Microsoft would have done something as idiosyncratic as PowerShell.

  36. Robert Harrison

    Not so impressed with Powershell

    Maybe this has since been resolved, but I remember being underwhelmed by my first major outing with Powershell.

    The task was to replace some aging VB scripts that communicated with Exchange Server 2003 via CDO. The exchange server was being replaced with Exchange 2010 which of course no longer supported CDO in favour of Exchange Web Services (EWS).

    So we selected Powershell to use the EWS API. We ended up with a custom C# snippet in the Powershell script that implemented an accept-all certificate handler to work around the connection errors we were experiencing between the script and Exchange (as advised by MSDN and MS blogs).

    Oh, and of course make sure, for the love of god, that you selected the right number of bits (32 versus 64) when you executed the PS script, and that you downloaded and deployed *exactly* the right version of the EWS API DLL; otherwise PS just threw an exception.

    All this just to read certain subject lines from emails in an Inbox. Conclusion: Even in 2014 the tools felt like a poor beta.

    (Oh yes, code signing PS scripts with the cmdlet. Sometimes it just wouldn't. But take the script + code signing cert to a similar workstation and then it worked. Weird!)

  38. david 12 Bronze badge

    Just looking for a little Active Directory administration. How do I do it? Oh, here are some examples: In powershell I do it using exactly the same ADO objects I've been using for the last 10-15 years.

    1. Pirate Dave
      Pirate

      Yeah, just watch out for the icacls command. I wrote an all-in-one create-user script last year, and after an hour or so of aggravation I gave up trying to escape the colons and parentheses and just stuck the call in a batch file invoked from the create-user script. There are "native" ACL commands in PowerShell that are powerful, but they are even uglier than a batch file and take too much coding; or at least it seemed like a lot of code just to replicate the functionality of a single-line call to

      "icacls %2 /grant %1:(oi)(ci)(M,DC)"

      If it isn't making the job easier, why use it?

