Intel teaches machines to build own device drivers

Intel Labs is working to automate the tedious and error-prone process of writing device drivers and porting them to different operating systems. Explaining the need for a tool that could synthesize device drivers, Intel Labs software engineer Arun Raghunath told The Reg: "A bunch of studies have shown that the prime cause of …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Thumb Up

    Sounds good

    But the situation really shouldn't be as much of a mess as it is portrayed to be. Proper development practices would see one team concentrating on driving the device and providing an API to another team which knows the OS and what is needed by the applications which use it. So there shouldn't be problems due to one team needing to know it all.

    It may be a bit ambitious to think that the code can be auto-generated - someone is always going to have a reason to get their hands dirty.

    Still, good luck with the plan. I would advise Mr Raghunath, though, to avoid using the word 'leverage' - it's a massive bullshit indicator these days.

    A qualified thumbs up.

  2. John Burton

    I confidently predict this is the last we'll hear of this.

    ... you just know this isn't ever going to actually be used for anything

    1. Charles Manning

      The Last One

      http://en.wikipedia.org/wiki/The_Last_One_%28software%29

      I remember back in the 1980s seeing programmers quaking in their boots because automatic software generators were going to steal their jobs.

      Yeah, right!

      As for device drivers (of which I have written many), the challenge is not in accessing the devices but managing their state. For example most of the complexity in a USB driver is handling all the state transitions and building a robust state machine. Is that really something an automated generator can do?
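
      To be concrete, here's the sort of table-driven skeleton I mean, in C. The states, events and names are all invented for illustration - a real USB stack has many more of each, plus timeouts, retries and split transactions. The table itself is the easy bit; knowing what belongs in it, and what to do about the holes, is where the work is.

      #include <stdio.h>

      /* Made-up states and events; ST_INVALID (0) marks "no legal transition". */
      typedef enum { ST_INVALID, ST_DETACHED, ST_ATTACHED, ST_CONFIGURED,
                     ST_SUSPENDED, ST_MAX } state_t;
      typedef enum { EV_ATTACH, EV_RESET, EV_SET_CONFIG, EV_SUSPEND,
                     EV_RESUME, EV_DETACH, EV_MAX } event_t;

      /* Next-state table: anything left at 0 is an illegal transition
       * that the driver must reject rather than act on. */
      static const state_t next_state[ST_MAX][EV_MAX] = {
          [ST_DETACHED]   = { [EV_ATTACH]     = ST_ATTACHED },
          [ST_ATTACHED]   = { [EV_SET_CONFIG] = ST_CONFIGURED,
                              [EV_DETACH]     = ST_DETACHED },
          [ST_CONFIGURED] = { [EV_SUSPEND]    = ST_SUSPENDED,
                              [EV_RESET]      = ST_ATTACHED,
                              [EV_DETACH]     = ST_DETACHED },
          [ST_SUSPENDED]  = { [EV_RESUME]     = ST_CONFIGURED,
                              [EV_DETACH]     = ST_DETACHED },
      };

      static state_t handle_event(state_t cur, event_t ev)
      {
          state_t nxt = next_state[cur][ev];
          if (nxt == ST_INVALID) {
              fprintf(stderr, "illegal event %d in state %d\n", ev, cur);
              return cur;              /* robustness: stay put, don't crash */
          }
          return nxt;
      }

      int main(void)
      {
          state_t s = ST_DETACHED;
          s = handle_event(s, EV_ATTACH);     /* -> ST_ATTACHED   */
          s = handle_event(s, EV_SET_CONFIG); /* -> ST_CONFIGURED */
          s = handle_event(s, EV_RESUME);     /* illegal, ignored */
          printf("final state: %d\n", s);     /* prints 3 (ST_CONFIGURED) */
          return 0;
      }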

      1. John Smith 19 Gold badge
        Happy

        @Charles Manning

        "For example most of the complexity in a USB driver is handling all the state transitions and building a robust state machine. Is that really something an automated driver can do?"

        Various chip design tools will take a boolean logic spec and spit out the hardware layout (it used to be a PLA) for said state machine. This has been available since at least the mid-80s.

        Computers do logic pretty well, when they have been given the framework in which to operate.
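
        For the software types, a toy of what those tools do (minus the minimisation step), in C. Given the on-set of one next-state bit - the "spec" here is invented - emitting the sum-of-products terms a PLA would implement is purely mechanical:

        #include <stdio.h>

        int main(void)
        {
            /* On-set for one next-state bit, indexed by the minterm
             * number for (s1, s0, in). The "spec" is made up. */
            const int on_set[8] = { 0, 1, 0, 1, 1, 0, 0, 1 };
            const char *names[3] = { "s1", "s0", "in" };
            int first = 1;

            printf("next = ");
            for (int m = 0; m < 8; m++) {
                if (!on_set[m])
                    continue;                 /* minterm not in the on-set */
                if (!first)
                    printf(" + ");            /* OR of product terms */
                first = 0;
                for (int b = 0; b < 3; b++) { /* AND of (negated) inputs */
                    int bit = (m >> (2 - b)) & 1;
                    printf("%s%s%s", bit ? "" : "!", names[b],
                           b < 2 ? "." : "");
                }
            }
            printf("\n");  /* each term is one AND row of the PLA */
            return 0;
        }

        It prints: next = !s1.!s0.in + !s1.s0.in + s1.!s0.!in + s1.s0.in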

  3. A Non e-mouse Silver badge

    Moving the goal posts

    The problem with this system is that it requires people to write these spec documents correctly. Who says that writing these spec documents is going to be less error-prone than the current system of writing device drivers?

    I've seen articles mention, for example, how Linux has to deviate from the spec for X because Vendor A and Microsoft both made mistakes and fudged it under the bonnet.

    I've done some small programming with SOAP, and I've found that even though things appear to adhere to the spec, they don't work.

    I think it's a nice ideal, but I don't think it'll work in the real world.

  4. Graham Bartlett

    Alternatively

    You could always hire programmers who know what the fuck they're doing.

    The problem is that the majority of Windows programmers I've met have very little clue about the embedded world, and vice versa. Device driver development needs skills in both areas. If this is your job and you can't do it, you either need more training or you need to not be in that job.

    1. raving angry loony

      not the problem

      The actual problem is that those who DO know what they're doing want to get paid for their extensive knowledge. Managers, however, are invariably judged on how little they pay. The problem is therefore not the people writing the device drivers; the problem is cheap-arse managers with accounting backgrounds assigning the wrong people to the job because it's cheaper, and not having to pay for the inevitable cock-up out of their own budget, because they can hand it over to "support", get their promotion, and fuck off to wealthier climes.

  5. John Styles

    The 80s called, they want their formal methods delusions back

    Is this anything other than the formal methods stuff that British academia was so keen on, and which helped make the UK's operating systems / indigenous hardware platforms what they are today (ha ha), coupled to the 'throw random numbers at it' stuff also beloved of academics that never really gets anywhere?

    (Neural nets and GAs and so forth seem to me to be a good way of keeping mediocre PhD students busy for three years - they allow lots of papers and results to be generated without the students ever having to understand at a deep level whatever it is they're trying to optimise. It's kind of like experimenting on rats: not tremendously useful, but it keeps the PhD mill going. Personally I think they should cut out the middleman and have the medical students experiment on the computer science ones instead.)

    1. Anonymous Coward
      Stop

      No, I don't think it is...

      The idea behind formal methods was to be able to mathematically prove the functionality of the software. So you describe the functions of the software using set theory and boolean algebra, which can then be mathematically proved. The general structure of the code can then be auto-generated by a number of tools.

      The key being that the specification is created using maths, not vaguely descriptive paragraphs which brush over or sidestep required functionality.
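
      For anyone who never met it, a spec fragment looks something like this - pseudo-Z from memory, with an invented bounded-buffer example, written in LaTeX:

      \[
      \begin{array}{l}
      \textit{State:}\quad \mathit{buf} \in \mathrm{seq}\,\mathrm{BYTE}, \qquad \#\mathit{buf} \le \mathit{max} \\
      \mathit{Write}(x):\quad \textbf{pre}\ \#\mathit{buf} < \mathit{max}, \qquad \textbf{post}\ \mathit{buf}' = \mathit{buf} \frown \langle x \rangle \\
      \textit{Obligation:}\quad \#\mathit{buf} < \mathit{max} \;\Rightarrow\; \#\mathit{buf}' \le \mathit{max}
      \end{array}
      \]

      The proof obligation - that Write preserves the buffer invariant - falls out mechanically, and a tool can discharge it or refuse to, which is rather the point.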

      Formal methods were mostly used for creating extremely safety-critical software - aviation, military stuff etc... Mostly because the approach is extremely hard, detailed and time-consuming.

      However there are many other aspects of software development born in the universities which make a hell of a difference to the robustness, maintainability and reusability of code. Unfortunately most of it is ignored by shortsighted programmers who can't see past the next algorithm.

      Universities are the same as any other processing unit - it is the quality of the input which determines the quality of the output.

      Of course there are many kinds of computing degree on offer, and I have seen some designated comp sci which didn't even feature a single programming module in a three-year degree. Which does piss me off, because we all get tarred with the same brush.

  6. John Sager
    Thumb Up

    Good luck with that

    Not an entirely cynical comment - I really hope they get somewhere with this. Reading device driver comments, the frustration pours out. Few devices, especially complex ones, seem to behave entirely according to spec, and that prompts quite a few profanities!

    So something that can ferret out off-spec or unspecced behaviour would be magic, quite apart from the multi-OS capability.

  7. Anonymous Coward
    Anonymous Coward

    Can't wait but...

    From what they say, most problems are due to ambiguity or lack of clarity in specifications, as well as a lack of standards. So how do you eliminate that? Do you let the device write its own specifications? How? Do you formalise a standard yet somehow keep it backward compatible with what was non-standard before it? And how do you get all that to work? Using the elusive "figureOutWhat" and "figureOutHow" functions that I've never been told about?

    In which case, can someone email me the syntax for these functions? My boss and colleagues at work seem to think that those functions are readily available in every processor.

  8. Gerhard Mack

    nuts to that

    Wouldn't it be easier to make devices conform to a set standard and write drivers for that only?

  9. TeeCee Gold badge
    WTF?

    Game theory device drivers.

    <Rolls dice>

    Double-six. Damn. I won't be using the printer today then.......

    1. Col

      Double sixes?

      That's a critical success! The printer corrects grammatical errors, +2 to ink cartridge levels. Double ones, on the other hand, result in a 'printer on fire' error.

  10. Anonymous Coward
    Anonymous Coward

    Getting the info

    Good luck with getting any accurate info from semiconductor manufacturers. Most of my job as a device driver engineer is filling in the missing details.

  11. janimal

    But isn't the spec the problem in the first place?

    Surely if the manufacturers and OS vendors provided detailed enough specs in the first place, driver developers would need to infer less. Documentation has always been the big problem in software development of any kind.

    Despite 30 years of trying to move software development into the engineering domain, programmers and companies consistently provide substandard documentation (if any at all). Most companies I have worked in that do claim to follow a methodology simply go through the motions, ticking boxes.

    This is even odder considering I was developing chemical process and plant design software, and would be told not to spend more than a day or two on design and to get coding right away - by people who would spend a year designing a chemical plant before the foundations were laid.

    If they built chemical plants (or bridges, skyscrapers, MRI scanners etc...) using the same methodology as software, it would be a very dangerous world!

    Unfortunately commercial companies rarely notice the benefit of accurately designed and documented software.

    My team were once able to produce, in three weeks, an enhancement which had been allocated eight months of development time, because we were able to build on the excellent, flexible design of the original app. It was only at that point that my manager had one of those 'aaaaah, I see!' moments.

  12. M Gale
    Mushroom

    A strange game.

    The only winning move is not to play. So why don't we get some termites to do it for us?

    Nuclear explosion because, well... if you don't know, your geek credentials are hereby revoked.

    1. William Towle
      Boffin

      ...possibly more appropriately:

      "After very careful consideration, sir, I've come to the conclusion that your system sucks"

  13. -tim
    Go

    More recycled old tech?

    Microware's OS-9 (for the 6809 and 68000+) had a great modular device driver model, where a small file would say "I've got a serial port here on this IRQ, called Ser1, and you can check this bit to find out if I need to run", with a second copy under a different name and maybe a different IRQ or address for the other ports. Then they would have a driver module that only moved data between the chips and the OS, and the OS took care of most of the rest of the work based on a few data structures. That model worked well from the early days of the 6809 to the last days of industrial control systems with fast networks and disks, with very few extensions made to the core.
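
    Not the real OS-9 structures (my manuals are long gone), but a C sketch of the shape of the idea - per-port facts in a tiny descriptor, the code that moves bytes written once:

    #include <stdint.h>
    #include <stdio.h>

    struct descriptor {            /* one per port: "Ser1", "Ser2"... */
        const char *name;
        uintptr_t   base_addr;     /* where the chip lives */
        int         irq;           /* which interrupt it raises */
        uint8_t     ready_mask;    /* "check this bit to see if I need to run" */
    };

    struct driver {                /* one per chip type, shared by all ports */
        int (*read_byte)(const struct descriptor *d);
        int (*write_byte)(const struct descriptor *d, uint8_t b);
    };

    /* Hypothetical driver for a hypothetical UART; the OS supplies the
     * descriptor, so the same code serves every instance of the chip. */
    static int uart_read(const struct descriptor *d)
    {
        printf("read from %s @ 0x%lx (irq %d)\n",
               d->name, (unsigned long)d->base_addr, d->irq);
        return 0;
    }

    static int uart_write(const struct descriptor *d, uint8_t b)
    {
        printf("write 0x%02x to %s\n", (unsigned)b, d->name);
        return 0;
    }

    static const struct driver uart_driver = { uart_read, uart_write };

    int main(void)
    {
        /* Two descriptors, one driver: the "second copy with a
         * different name and IRQ" mentioned above. */
        struct descriptor ser1 = { "Ser1", 0xff00, 3, 0x01 };
        struct descriptor ser2 = { "Ser2", 0xff10, 4, 0x01 };
        uart_driver.write_byte(&ser1, 'A');
        uart_driver.read_byte(&ser2);
        return 0;
    }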

  14. Identity
    Stop

    I'm always amazed

    at the dichotomy between developing cool new automation tech and the need of the workforce to have jobs. Don't fire your coders *yet*, indeed!

    We really need to think ahead about how we will deal with an economy where having a paying job in order to eat is no longer something everyone can do, as jobs from supermarket clerk to computer programmer go to machines...

    1. Figgus

      Det. Spooner...

      "I suppose your father lost his job to a robot. I don't know, maybe you would have simply banned the Internet to keep the libraries open. "

    2. Tom 13

      I tend to go mostly to the clerks.

      Ironically, they are faster than the machines.

    3. Someone Else Silver badge
      Coat

      Oooooohh...

      "We really need to think ahead about how we will deal with an economy where having a paying job in order to eat is no longer something everyone can do [...]"

      The Teabaggers won't like that idea much as the implication is that "someone" will have to support the rabble, and someone would be the "damn gubmint". But for the 1-percenters, that is their Utopian wet dream!

  15. K. Adams
    Terminator

    I, for one, welcome...

    ... our self-hardware-interface-protocol-authoring overlords...

  16. Kev99 Silver badge

    Freely available for older devices

    This would be a really great piece of work if any user could download it and get their still perfectly functional "legacy" devices to work with modern bloatware, such as Windows #.

    Fat chance of that happening. The hardware companies would fight that no end.

  17. Primus Secundus Tertius
    Thumb Down

    DARPA ...

    This article reads like DARPAcrap - especially that dreaming diagram.

  18. John Smith 19 Gold badge
    Meh

    At what point does "device spec" become "device *model*"?

    The *idea* is good and simple

    But what happens when the result comes back ERROR and error recovery *depends* on diagnosing the error and working out the options from additional state information, because the hardware mfg has not *given* it to you?

    Humans would start firing up their debuggers at this point and expect to be staying late.

    But in *this* system all you would have is "this is an error". In reality the spec is incomplete, and the mfg can't be arsed to fill in the details because you have some of their hardware in front of you, so you can figure it out yourself.

    Unless it's *not* a spec but a black box with a fairly detailed model of the hardware's internal function.

    How many mfg's would let something like that out of their offices?

    IMHO device driver construction *could* have been automated *decades* ago but OS APIs and device interface specs have never been available in a form standardised enough and detailed enough to automate the process.

    Anyone wondering if the person giving this presentation has a cunning plan to make money involving underpants?

  19. ckaspereli

    Herding cats but the meanest bully on the block runs the show.

    Sure, it's like herding cats, but Intel has all the clout and moxie to muscle players that wish to remain - or have ambitions to become - relevant into buying into this simple scheme. They've done it before and they'll do it again.

  20. Julo

    Drivers are bad - write fewer of them

    Drivers are the main cause of OS failure, and due to the sizable number of drivers (over 5,000 for either Windows or Linux) complete testing of an OS is not feasible. Amazingly, each new generation of devices and systems adds more drivers without a real increase in function/performance/safety.

    Several years ago I led a team at IBM Research that suggested (and prototyped) a system architecture that took the drivers out of the I/O game. Like the mainframes (which are not plagued by the driver issue), it standardized the I/O interfaces while leaving room for innovation by adding programmability to the I/O language (unlike the mainframe, the scheme was both scalable and secure). I/O vendors would have to build their wares once for all CPU/OS vendors, and safety/security is built into the "interface". We still feel that this is more practical than automating the building of drivers, and better than all the "sandboxing" schemes that have been proposed in the past. Whether IBM (unlikely), Intel (possibly) or somebody else will continue this effort is hard to say, but in the multicore era that is probably the way to go - at least for drivers :-)
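
    I can't reproduce the actual prototype here, but a toy C sketch of the flavour - one command format, one entry point per device, no per-device driver in the OS; all names invented:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    enum io_op { IO_READ, IO_WRITE, IO_FLUSH };

    struct io_cmd {                 /* the standardized "I/O language" */
        enum io_op op;
        uint64_t   offset;          /* where, in device terms */
        uint32_t   len;             /* how much */
        void      *buf;             /* whose memory */
    };

    /* Every device exposes the same single entry point; the vendor
     * implements it once, for all CPU/OS combinations. */
    typedef int (*io_submit_fn)(struct io_cmd *cmd);

    /* A made-up device: a 1 KiB RAM "disk" that interprets commands. */
    static uint8_t ramdisk[1024];

    static int ramdisk_submit(struct io_cmd *cmd)
    {
        if (cmd->offset + cmd->len > sizeof(ramdisk))
            return -1;              /* safety checked at the interface */
        switch (cmd->op) {
        case IO_READ:  memcpy(cmd->buf, ramdisk + cmd->offset, cmd->len); break;
        case IO_WRITE: memcpy(ramdisk + cmd->offset, cmd->buf, cmd->len); break;
        case IO_FLUSH: break;       /* nothing to do for RAM */
        }
        return 0;
    }

    int main(void)
    {
        io_submit_fn dev = ramdisk_submit;   /* "discovered" device */
        char out[6] = "hello", in[6] = {0};

        struct io_cmd wr = { IO_WRITE, 0, 6, out };
        struct io_cmd rd = { IO_READ,  0, 6, in  };
        dev(&wr);
        dev(&rd);
        printf("read back: %s\n", in);
        return 0;
    }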

    Julian Satran

  21. mhenriday

    Wonder what the odds are

    on the USPTO granting Intel a patent on the process on the strength (?) of that charming diagram the Reg just published - if Intel were now to apply for one? Stranger patents have been granted. Something is rotten in the state of....

    Henri

  22. Anonymous Coward
    Terminator

    EON ... Jart

    http://newyorkzipcodes.net/videos-eon-worlds-within-worlds-%5BVrLo1qUkCH8%5D.cfm

    "How did you break through my barriers?"

    "Your understanding of certain algorithms is incomplete."
