The massive blockbuster Avatar reintroduced 3D to the 21st century. The big difference from the previous 3D invasions was digital technology. Optically and physiologically the principle was the same: pairs of frames, representing a left eye view and a right eye view, are presented (near-) simultaneously to the viewer. The 3D …
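The left-eye/right-eye pairing the article describes can be sketched in a few lines. A minimal red-cyan anaglyph composite (illustrative only; `anaglyph` is a hypothetical helper, not anything from the article): the left view goes into the red channel and the right view into the cyan (green+blue) channels, so tinted glasses route each view to the correct eye.

```python
import numpy as np

def anaglyph(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Pack a left/right grayscale pair into one red-cyan anaglyph frame.

    left_gray, right_gray: 2-D uint8 arrays of the same shape.
    Returns an (H, W, 3) RGB image: left eye in red, right eye in cyan.
    """
    assert left_gray.shape == right_gray.shape
    h, w = left_gray.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    out[..., 0] = left_gray     # red channel  -> seen by the left eye
    out[..., 1] = right_gray    # green \
    out[..., 2] = right_gray    # blue  / cyan -> seen by the right eye
    return out
```

Polarised and shutter-glass systems deliver the two views differently, but the underlying principle is the same pair of offset frames.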
Who remembers how long it took to render the apple image in Real3D on an Amiga A500? Or trying to get anything except BLACK to render in the first version of Imagine? I still remember nights waiting for single frames to render on a 30-second, 900-frame animation, eating up hard disk space like nothing else could.
Some random realtime 3D coverdisk demo got me hooked; then I moved on to Real3D, didn't get on with Imagine, and ended up spending serious cash on an Amiga 4000/040 and one of the first UK versions of Lightwave 3.5 when it got unbundled from the Video Toaster.
That rig spent 4 months rendering a 20 minute video. I was paid the grand sum of £800 for my trouble.
Reminds me of another blast from the past
The "Driller" and "Total Eclipse" games for the 8-bits used one of the first true 3D engines of the time, called Freescape. The movement in the games was glacial at best, and it took a long time for the character to turn around....
....but it's from slow, small-scale stuff that the likes of Skyrim and Battlefield come. I'm not saying that 3D today is awesome (because frankly Avatar was fine in 2D for me and Conan 3D got on my nerves after a while), but it's only by building on existing technology that we get new technology.
I can't believe a whole article on CGI has passed by without one single mention of the former incarnation of SGI, or more properly IRIX, which at least had a coherent environment supporting it, instead of Linux's random ragtag GNU utils and conflicting sound drivers, which have meant that every single professional animator, lighter, creature TD, FX TD, compositor and rotoscoper in every major effects studio for the past 10 years has created 99% of all the big blockbuster films without a functioning sound card.
"...a huge post-production house with a 65,000-square-foot HQ in Mumbai, India." "They have thousands of people there, mapping out objects and projecting them onto models, converting 2D to 3D."
<sarcasm> And I'm sure they have licensed copies of every piece of software in use.
Plenty of CAD warehouses in Asia have 'hundreds' of CADmonkeys on tap. "need more? no problem!" At only £3000-£5000 a seat. Where did all my work go?
"Windows, Linux, what works best for the studios?"
I'm guessing Linux since it won't crash randomly in the middle of rendering!
Yeah, 'cause it kernel panics right after POST.
...restart automatically after installing a 'critical update' through Windows update.
"some TVs purport to be able to dimensionalise conventional 2D on the fly"
Yeah, and back in the early days of colour there were folks selling things that would colourise your b+w tv
Blue at the top, green at the bottom, and checkered in the middle.
That's got to be right at least as often as a "dimensionalised on the fly" 2d->3d conversion.
If anything, this type of gimmick will only turn people off 3D completely.
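For what it's worth, the naive version of "dimensionalising on the fly" amounts to guessing a depth map and shifting pixels sideways by a disparity proportional to depth. A toy sketch of one scanline (hypothetical names, illustrative only) shows where the artefacts come from: shifted pixels collide and overwrite each other, and gaps open up that the converter has to invent content for.

```python
import numpy as np

def shift_row(row: np.ndarray, depth: np.ndarray, max_disp: int = 4) -> np.ndarray:
    """Fake a 'right eye' view of one scanline by shifting each pixel
    left by a disparity proportional to its guessed depth (0=far, 1=near).
    Positions where no source pixel lands stay at -1: the holes a real
    2D-to-3D converter must fill by guesswork."""
    out = np.full_like(row, -1)
    for x in range(len(row)):
        d = int(round(depth[x] * max_disp))
        nx = x - d
        if 0 <= nx < len(row):
            out[nx] = row[x]    # near pixels may overwrite far ones
    return out
```

With a wrong depth guess, both the occlusions and the hole-filling are wrong, which is why the cheap conversions look so off.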
Windows is too expensive so they chose Redhat. Good one.
Get some decent sysadmins who can operate Linux without the safety hat on and save for real.
Do those decent "sysadmins" come with a financially backed SLA? No? Good one. :-)
Back to basics
What's wrong with some modelling clay? If it's going to take all that time to render a frame, go back to the KISS approach.
Nowt wrong with rendering on Windows
If you're a small shop and half your render boxes are also your artists' workstations, using something like Muster to schedule work and to drop them on and off the farm (and having a careful approach to things like Windows Update), rendering on Windows isn't a problem at all.
"and having a careful approach to things like Windows Update"
You mean like turning the bloody thing off? :-)
Great movie, created with Blender. Free download, suitable for work.
Seems to me that 3D is wanted mostly by the film companies and cinemas. Consumers appear to be saying "Meh" and shrugging when asked about 3D.
Personally, I think that if done well, 3D can add a lot to certain types of film (action, children's films etc), but not all. It's better for films with lots of Special FX, as they tend to have lots of action, rather than films that are more intellectual (for instance, I don't think Tinker, Tailor, Soldier, Spy would have benefited from 3D).
Do I think 3D will be anything more than a passing fad? Well, no. I know too many people who are given headaches by the glasses, or made to feel sick.
3D in the home also has the problem of the cost of decent glasses, and of wondering whether they'll work with your set. It looks like even the glasses-free 3D sets won't be without problems: from what I have read, they'll only work if the viewer is in one of several fixed positions.
Regarding the comment about the move from analogue film to digital, I think that the film industry is being extremely short-sighted. Why? Most digital cinema is shot at 2K or 4K resolution. Blu-ray is currently 2K resolution. It's likely that the next-generation home video system after Blu-ray will be 4K or 8K. The generation after that is likely to be 8K or 16K, but you won't see any improvement if the film was shot at 2K or 4K.
Analogue film has the advantage that its resolution is, to all intents and purposes, infinite. So, whether it's reproduced at 2K, 4K, 8K, 16K or whatever resolution we end up with, it should keep improving.
In fact, their attitude is rather surprising when you consider how much money they've made by re-releasing old films in newer formats (I'm thinking particularly of Disney here).
This is where you elucidated the key problem for 3-D:
"...are given headaches by the glasses..."
They need a projection system that doesn't require glasses and produces something you can walk around. Until then, it continues to be Epic Fail.
I would have agreed until recently, but I've recently seen a couple of 3D movies on LG TVs that use cheap passive glasses like the ones used in cinemas, and I have to say that the results were impressive - far more impressive than I expected.
I'm not about to run out and replace my 4-year-old 37" 720p TV just yet, but I think 3D support will be on my checklist when I do decide to upgrade the current set.
3D, properly done, is a great asset
"Personally, I think that if done well, 3D can add a lot to certain types of film (action, children's films etc), but not all."
I disagree. Used correctly, 3D could give a better feeling of reality even to a dining room scene. Not just making a car pop out at your head, but letting you see the distance between a fork and a plate.
"Analogue film has the advantage the resolution is, to all intents and purpose, infinite."
No, it is not. You are limited by the size of the silver grains on the emulsion - and they are a lot bigger than we think.
Two links that might interest you:
Just thought I would add...
I've met Jim Blinn.
Seem to remember lots of hair.
Actually, it should triple the workload. You render the 3D eye-views (x2) then you render the 2D view.
Except, clearly, you don't always. Arthur Christmas in 2D had these hideous jittering artifacts that I can only put down to some fucked-up reconstruction of the 2D movie from the 3D sources. It starts at frame #1, if you're on the lookout for it. I would have walked out if I hadn't been there with an 8-year-old in tow (who hates 3D; good taste).
Seriously, Hollywood, sort your shit out.
Animation's come a long way from Toy Story to Avatar
Sure, but if I remember correctly I enjoyed watching Toy Story.
You're not alone... Like many things, it seems the technology and effects have taken over from a good storyline. Toy Story didn't have the toys and spangles of Avatar, so the writers wrote a story which would be timeless, as opposed to re-hashing Pocahontas, or Last of the Mohicans, in space... with smurfs...
The same is true all over the place: ask an RPG fan their favourite Final Fantasy and 9/10 will prefer whiskers (no wait, that's cats; they'll say FF7), even though its gameplay mechanics, graphics and spangles have been superseded by a huge margin. When Square showed off the PS3 with a tech demo rendering the FF7 intro in high-def, I can't count how many people yearned for the full game, rather than the XIII that was thrust upon them in its place.
I think the biggest shame is the films which think "we've got threedee now, we must make the most of it" and so fill the film with as many instances of stuff rushing toward the camera as possible, completely lacking any substance to weave them together (Final Destination, I'm looking at you!)
There is at least one other Render Farm in the UK other than Render Nation that I am aware of, Render Monkey (www.rendermonkey.co.uk)
And if you use SideFX Houdini, you can batch render on Amazon's EC2. Very cool.
What I don't get...
There's got to be a way to get a quick and dirty approximation of what the fully rendered frame(s) will look like. I get that it won't be as pixel perfect as the zillion-core-rendered overnight job, but there has to be some compromise which is 'fast enough to provide some level of interactivity for the artists' but doesn't cost a zillion cores per artist.
Full disclosure: I write code, and my artistic ability never got much past stick figures. I'm familiar with most of the software mentioned in the article but couldn't, for the life of me, produce anything aesthetically pleasing with any of them. Therefore, I could well be completely wrong.
2D to 3D
Yesterday, I saw a trailer for Titanic 3D, which was presumably 3D-ized by the aforementioned army of pixel pushers in Mumbai. It looked terrible. Reminded me of peering into a View-Master.
"AutoDesk recommends nVidia cards. 'You can use a competitor's hardware,' says Howard, 'but AutoDesk don't guarantee it.'"
The correct way to phrase that is, "You can use a competitor's hardware, but Autodesk won't give you support if you do".
I was told to upgrade to a qualifying card in order to receive support (despite being a Subscription owner). I was also told I needed an extra twelve gigs of RAM. Turns out they were full of shit, and all I had to do was check "Preserve Memory" in the render dialog. This was for a bug that had nothing to do with the graphics card.
I fucking hate Autodesk. Their software gets buggier every year, and they make up for lack of innovation by buying everything around them and turning it into similarly buggy pieces of expensive rubbish.