I'm not entirely sure you know what you're talking about.
The classic web server flow is: receive network packet, read disk, send network packet. The thread pool shouldn't be blocked by disk reads, and you shouldn't be using a thread per connection either. So to write that sensibly you need an async, event-based framework such as Node.js.
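To make that concrete, here's a minimal sketch of that receive/read/send flow as a Node.js server. The file path is just a placeholder; the point is that the single thread registers callbacks and never blocks on the disk:

```javascript
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
  // The read is dispatched asynchronously; the callback fires on
  // completion, and meanwhile this thread serves other connections.
  fs.readFile('/var/www/index.html', function (err, data) {
    if (err) {
      res.writeHead(500, { 'Content-Type': 'text/plain' });
      res.end('read failed');
      return;
    }
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(data);
  });
}).listen(8080);
```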
And yes, JavaScript, on the server. Node.js runs on the V8 engine, which puts it a mile ahead of interpreted PHP in terms of performance. If people are happy running .NET or Java on the server, there's no reason not to run JavaScript there too.
I get the feeling you have one way you prefer to do things, and this isn't it. Fair enough, but that doesn't prove that Node.js is a crock. *This* proves that Node.js is a crock:
1. No support for threads; it pervasively assumes a single-threaded architecture. There are no synchronization primitives anywhere, and idiomatic code mutates shared state freely in ways that will never work on multiple threads (see the first sketch after this list). FAIL.
2. Not on Windows. Linux performs well without the event model, but Windows genuinely needs it, so this is a mismatch. They say they're fixing that, but I don't believe it - existing Node.js code relies on the UNIX user and filesystem model to a pretty large extent. That's presumably why their own Windows port of Node.js is not forthcoming: it needs a rewrite for Windows, and it will end up a pretty different system.
3. Linux support for async I/O is shit anyway. epoll handles sockets and pipes but can't wait on regular files, and POSIX AIO never caught on, so no one in the UNIX world ever provided a unified system for async I/O. Node.js papers over this by pushing file reads onto a thread pool, and still ships synchronous calls in its standard library (second sketch below).
4. Linux supports the thread-per-connection model well: it schedules threads within a process quickly and cheaply. So Node.js on Linux solves what is pretty much a non-issue there, using a method the platform doesn't support well.
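To illustrate point 1, here's a sketch of how ordinary Node.js code leans on the single-thread assumption. The counter below is shared mutable state touched from every request with no lock in sight; it's only correct because all callbacks run on one thread. On two threads this is a textbook data race:

```javascript
var http = require('http');

var hits = 0; // shared mutable state, no mutex anywhere

http.createServer(function (req, res) {
  hits++; // unsynchronized read-modify-write, safe only single-threaded
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('hit number ' + hits + '\n');
}).listen(8081);
```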
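And to illustrate point 3, the synchronous escape hatches sit right in the standard library, and as far as I understand the current implementation, even the "async" file calls are ordinary blocking reads shuffled onto a helper thread pool, because the kernel gives it nothing better for regular files. The path here is a placeholder:

```javascript
var fs = require('fs');

// Blocks the entire event loop until the whole file is read:
var config = fs.readFileSync('/etc/myapp.conf', 'utf8');

// The "async" version: the same blocking read, just moved onto a
// thread pool behind the scenes, since there's no unified async
// interface for regular files.
fs.readFile('/etc/myapp.conf', 'utf8', function (err, data) {
  if (err) throw err;
  console.log('read ' + data.length + ' characters');
});
```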