So when they fixed it, it was still broken
So the nifty "safer" signals were in fact a kludge, then. In the desire to prevent signal handlers from damaging the interpreter's internals, they threw out the real feature of signals - that they can interrupt anything AND might well need to.
It would have been far, far better to implement a special data structure - a signal FIFO - that let the signal handler interface cleanly with your main code (which ought to know when it can safely take notice). What happened, of course, is that someone moved the decision point from "your code", where you needed to know how to write a signal handler properly and safely, to "their internal code", where you get a simplified and less useful version of signals. They now only go off at the times some PerlGod decided they ought to. And as you demonstrate, this means you lose all control when external libraries and processes are involved, EVEN THOUGH those are the two main times you need to worry about control, signals, timeouts and interrupts.
Did the PerlGods hire a M$ programmer to design this new signal handler? It reduces real-time signals to the status of Windows message passing. That's not a fix, it's a different beast entirely. As I said, a far better solution would have been a standard signal FIFO or event queue. That would have avoided the data-corruption risk while preserving as much of the real-time nature as the program's developer required.
It reminds me of a time at Reuters, I think, where I eventually uncovered that Windows (3.12 or 386, I think) handled certain things (like keyboard interrupts) while it was sometimes in an obscure part of the kernel with its own very, very small stack. So when complicated code was added for the traders' keyboards, the stack blew bigtime! Oh, AND an idiot had implemented the keyboard-interrupt code so that it could go recursive in its final two instructions! It was a nice combination. It was an interesting fix too, and rather better thought out than the Perl signal handler.
I concur with the posters who felt that an XML parser ought not to hang on bogus input!! All the XML parsers I've come across seem to be HUGE, SLOW and frequently BROKEN like this. Given that the syntax is simple BUT highly recursive, why oh why can no one write a simple, proper parser? Or is it the obsession with linearising the ENTIRE input bundle in one go, instead of building a naturally parsed tree, that does it? Anyone want to borrow a copy of "Understanding and Writing Compilers"....
Pirate's, as there is no Cowboy symbol.