I hope Adobe somehow standardizes access to the GPU within AE and opens it to 3rd-party devs. Right now, every plugin with GPU rendering functionality tries to check out huge GPU buffers and use all the resources without any coordination or control, which usually leads to crashes. Another issue: GPU features are highly dependent on the hardware make and model, and it's impossible for 3rd-party developers to test features on all modern GPU cards.
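The coordination gap described above can be illustrated with a toy sketch. This is entirely hypothetical (AE's actual SDK exposes no such broker, which is exactly the complaint, and real plugins are C/C++, not Python), but it shows the idea: a shared arbiter that grants or denies VRAM requests against one budget instead of letting each plugin grab memory blindly.

```python
class GpuMemoryBroker:
    """Hypothetical arbiter: plugins request buffers from a shared VRAM budget.

    Not a real AE SDK facility -- just an illustration of the kind of
    coordination the post above is asking for.
    """

    def __init__(self, total_mb):
        self.total_mb = total_mb
        self.allocations = {}  # plugin name -> MB granted

    def request(self, plugin, mb):
        used = sum(self.allocations.values())
        if used + mb > self.total_mb:
            return False  # deny gracefully instead of driving the GPU to a crash
        self.allocations[plugin] = self.allocations.get(plugin, 0) + mb
        return True

    def release(self, plugin):
        self.allocations.pop(plugin, None)

# Two hypothetical plugins competing for a 1 GB budget:
broker = GpuMemoryBroker(total_mb=1024)
assert broker.request("RaytracerFX", 768)            # granted
assert not broker.request("ParticlePlugin", 512)     # would exceed the budget
broker.release("RaytracerFX")
assert broker.request("ParticlePlugin", 512)         # now it fits
```

Without something like this, each plugin sees only its own allocations, and the first one to over-commit takes the whole application down.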
On Wednesday, May 9, 2012 at 9:21 AM, mylenium@mylenium.de wrote:
Nah. Cause and effect, you know. The problem is not GPU usage vs. CPU usage; the problem is simply this whole system being so poorly optimized. I mean, so far all the examples using this raytrace stuff are trivial beyond belief, yet all the brute force of a GPU can't compensate for those deficiencies when rendering some simple cheesy text. And you are wrong about CPUs not having evolved. Today's Xeons are completely different beasts than their predecessors were a few years ago, just as a 3rd-generation Core i7 is a different one than a first-generation one 3 years ago. Still, even that isn't really the point - my Core i7 here still renders such a scene in the same time in Cinema 4D, but without any GPU usage, simply because its CPU code has evolved over the years and is pretty optimized. And C4D doesn't even have the fastest renderer. modo, Lightwave, Mental Ray and others are even further optimized, not to speak of hybrid GPGPU renderers like Arion, Maxwell RT, iray, or even pure GPU renderers like Octane, built on CUDA rather than OptiX.

And you are over-obsessing over "raytraced" and "interactive". I've said many times that I could do this since 2003 in FPrime for Lightwave, and that one was 90% CPU-based with some OpenGL thrown in here and there. So no matter how you pose the question, something is not right about the road Adobe chose. My only consolation really is that hopefully they really are working on reworking all the API underpinnings, and then we may be talking when we have alternatives and need not rely on Adobe's good will. Until then I will remain extremely critical and skeptical of what I see so far. As far as I'm concerned, they simply have their priorities backwards, and - if you allow me - all this will ever be to me is a bling-bling feature that makes for some nice marketing but quickly falls apart in practical use. None of the people I talked to would upgrade to CS6 for this, but of course most of them are 3D artists...
Mylenium
[For Mylène, angel on earth]
-----------------------------------------
www.mylenium.de
Steve Oakley <steveo@practicali.com> wrote on May 9, 2012 at 18:00:
Ok, let me make some clarifications with the project open in front of me this morning -
I did make ONE render pass at the project with really high photons/pixel that did take a long time. It was too glossy looking; that's the one that stuck in my head.
The pass it seems I posted was actually 6 photons/pixel, render time 12 min 12 sec - that's 1080p24 with DoF and motion blur cranked up. Lots of DoF and MB make for longer render times; turn them down or off and render time can drop to next to nothing.
Why GPU? Because it provides a level of interactivity you can't get without massive processing. Remember, this is ray traced, not some OpenGL Phong shader. While 3D does work on the CPU, yes, it's slow.
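To put a number on that "massive processing", a back-of-the-envelope calculation at the settings Steve quotes for the posted pass (1080p, 6 photons/pixel) shows the primary-ray count alone, before any reflection, shadow, DoF, or motion-blur secondary rays:

```python
# Rough ray budget per frame at the quoted settings.
width, height = 1920, 1080       # 1080p
photons_per_pixel = 6            # the setting Steve mentions for the posted pass

primary_rays = width * height * photons_per_pixel
print(primary_rays)  # -> 12441600 primary rays per frame
```

Over 12 million primary rays per frame, multiplied again by secondary bounces and by 24 frames per second of footage, is why a brute-force ray tracer leans on the GPU.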
The next answer is simpler - CPUs have really kind of stagnated the last couple of years. Their performance gains have been marginal while the price of the top performers is through the roof. It's easy to spend $1k+ per Xeon CPU if you want the current screamer.
The latest GPUs are far, far cheaper. The newest 680 is what - $600? Adding that to your current, and probably a bit old, machine will give it a lot more life than dropping $5k just to get started with a dual-Xeon box. Seriously, my 8-core MBP with dual GPUs is still very viable for everyday work. The GPU investment has stretched out my purchase cycle for an entirely new machine quite a bit.
You don't have to get a Quadro... I'm going to have a blog post about this soon... but the short answer is, if you do this at home, casually at work, etc., a consumer GFX-class card will probably be fine for you. That's a few hundred dollars. OTOH, if you edit all day long, do 3D in AE all day, render at night, and just pound hard on the machine, then a Quadro is what you want, because they are built for this level of continuous use and abuse. If you make your living from the machine, it's not that hard a decision.
Premiere Pro also greatly benefits from multiple GPUs, or just faster ones.
So does SpeedGrade.
And DaVinci Resolve.
And Photoshop.
Given that all my main apps are seeing a lot of great improvements from more/better GPUs, I'm there. Also note that Premiere Pro has OpenCL acceleration now - so ATI GPUs are getting some love too.
It just amazes me... people say, "I want you to max out the CPUs, RAM, and GPUs of my machine. I've got 'em, please use them." Then when an app does....
S
On May 9, 2012, at 9:53 AM, Greg Balint wrote:
I agree Mylenium,
I was super excited to hear about the 3D stuff being added to AE. I thought, "Man, this will save me a lot of time and effort coordinating renders from C4D by just doing simple things right in AE." But I have to say I'm severely disappointed with this. I figured from the simple extrusion feature that this would be some quick-turnaround 3D feature, but with all of that horsepower, and still a 40-minute render? I could probably render the same scene, with native DoF and motion blur, directly in C4D in about 5 minutes.
Now I'm not bashing the example piece itself. But I can't see how I'd even be able to stand watching all of that processing power take forever for a 3D extrusion of some text with reflections and shadows.
No doubt this is a welcome feature for people who don't have any 3D package to work with, but for those of us that do, I don't see any of us using this feature for speedy production work.
I also wonder why the 3D implementation was designed to work solely on GPU processing, when the CPU is obviously more versatile and powerful, as seen with 3D packages that don't use the GPU at all for rendering. As a PC user with a beefy ATI card installed, it surely feels like Adobe has its pockets lined with Nvidia money.
Just my opinion here, but I'm way more excited about the caching features they've added.
////Greg Balint
///Art Director / Motion Graphics Designer
Nobody is arguing the need for some sort of 3D in AE (though not exactly that kind of 3D as far as I'm concerned) and the marketability of such a feature, but seriously, if it takes 40 minutes on a machine with a beefy 4000-series Quadro and an additional Tesla board, then something is severely askew, and one can only wonder why the AE team chose this approach in the first place. I wouldn't even wanna know how long it takes to render this if AE falls back to its software mode. 2 hours? 3 hours? 5 hours? In a day and age where software renderers in 3D programs churn out frames with full global illumination and all the bells and whistles in minutes, even without any GPU involvement, what AE has to offer compares poorly.
Mylenium
[For Mylène, angel on earth]
-----------------------------------------
www.mylenium.de
Steve Oakley <steveo@practicali.com> wrote on May 9, 2012 at 16:07:
I don't have or use C4D, so I couldn't render anything 3D that way. AE is my app of choice; it's what I have in front of me to use. Please keep in mind this is all new for AE - ray tracing and the start of real 3D. It's an entirely new look and set of capabilities. For how long have people asked for real 3D in AE? Forever. I don't think knocking it is right.
16 photons was about right. I had done some lower passes and it was too grainy. Not saying that wasn't a good look... I'll go check the project again.
I've seen some other really nice work done which will hopefully surface soon.
S
On May 9, 2012, at 8:37 AM, James Wilson wrote:
I think 16 photons might be overkill considering the dark nature and the minimal number of objects to bounce light off of. Maybe it could have been rendered in less time with the same results if you took that setting down. It does look nice.
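James's intuition matches how Monte Carlo sampling behaves: pixel noise falls off roughly as 1/sqrt(N), so going from 6 to 16 photons/pixel buys less visual improvement than the extra render time suggests. A minimal sketch with a hypothetical noisy pixel estimator (not AE's actual renderer) shows the trend; quadrupling the sample count roughly halves the error:

```python
import math
import random

def pixel_estimate(rng, samples, true_value=0.5):
    # Hypothetical pixel: average of noisy samples around the true value.
    # Each sample = true value + uniform noise in [-0.5, 0.5].
    total = sum(true_value + rng.uniform(-0.5, 0.5) for _ in range(samples))
    return total / samples

def rms_error(samples, trials=2000, seed=1):
    # Root-mean-square error of the estimate over many trial pixels.
    rng = random.Random(seed)
    sq = sum((pixel_estimate(rng, samples) - 0.5) ** 2 for _ in range(trials))
    return math.sqrt(sq / trials)

# Error shrinks roughly as 1/sqrt(N): 4x the samples for ~half the noise.
for n in (4, 16, 64):
    print(n, round(rms_error(n), 4))
```

That diminishing return is why dropping from 16 photons/pixel to a lower setting can slash render time while barely changing the image on a dark, sparse scene.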
Yeah, and the same stuff probably renders in the same time in C4D - without any GPU fancies required, and on a 3-year-old machine. Ridiculous and nothing to write home about.
Mylenium
[For Mylène, angel on earth]
-----------------------------------------
www.mylenium.de
Steve Oakley <steveo@practicali.com> wrote on May 9, 2012 at 07:27:
> Ok... now that AE CS6 is loose, I'll show you some titles I made for my web shows. CS6, rendered with a Q4000 + Tesla C2075 in 40 min as I recall... that's with cranked-up DoF, motion blur, 16 photons/pixel - i.e. real nice settings :) It's quick, don't blink. Totally done in AE.
> http://www.youtube.com/watch?v=X-ZNVyGSVIY&feature=youtu.be