Date: Wed, 9 May 2012 18:21:59 +0200 (CEST)
From: "mylenium@mylenium.de"
Reply-To: "mylenium@mylenium.de"
To: After Effects Mail List
Subject: Re: [AE] made in CS6

Nah. Cause and effect, you know. The problem is not GPU usage vs. CPU usage; the problem is simply this whole system being so poorly optimized. I mean, so far all the examples using this raytrace stuff are trivial beyond belief, yet all the brute force of a GPU can't compensate for those deficiencies when rendering some simple cheesy text.

And you are wrong about CPUs not having evolved. Today's Xeons are completely different beasts than their predecessors were a few years ago, just as a 3rd-generation Core i7 is a different one than a first-generation one 3 years ago. Still, even that isn't really the point - my Core i7 here still renders such a scene in the same time in Cinema 4D, but without any GPU usage, simply because its CPU code has evolved over the years and is pretty optimized. And C4D doesn't even have the fastest renderer. modo, Lightwave, Mental Ray and others are even further optimized, not to speak of hybrid GPGPU renderers like Arion, Maxwell RT, iRay, or even pure GPU renderers like Octane built on CUDA rather than OptiX.

And you are over-obsessing over "raytraced" and "interactive". I've said many times that I could do this since 2003 in FPrime for Lightwave, and that one was 90% CPU-based with some OpenGL thrown in here and there. So no matter how you pose the question, something is not right about the road Adobe chose. My only consolation really is that hopefully they really are reworking all the API underpinnings, and then we may be talking when we have alternatives and need not rely on Adobe's good will. Until then I will remain extremely critical and skeptical of what I see so far. As far as I'm concerned, they simply have their priorities backwards and, if you allow me, all this will ever be to me is a bling-bling feature that makes for some nice marketing but quickly falls apart in practical use. None of the people I talked to would upgrade to CS6 for this, but of course most of them are 3D artists...


Mylenium

[For Mylène, angel on earth]
-----------------------------------------
www.mylenium.de


Steve Oakley <steveo@practicali.com> wrote on 9 May 2012 at 18:00:

OK, let me make some clarifications with the project open in front of me this morning -
 
I did make ONE render pass at the project with really high photons/pixel that did take a long time. It was too glossy looking. That's the one that stuck in my head.
 
The pass it seems I posted was actually 6 photons/pixel, render time 12 min 12 sec - that's 1080p24 with DoF and motion blur cranked up. Lots of DoF and MB make for longer render times. Turn them down / off and render time can drop to next to nothing.
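As a rough sanity check on those numbers - assuming render time scales roughly linearly with photons/pixel, which is a simplification that ignores how the DoF and motion blur settings factor in - a back-of-envelope estimate looks like this (the estimate_minutes helper is purely illustrative, not anything AE actually exposes):

    # Back-of-envelope: ray-trace render time vs. photons/pixel,
    # assuming roughly linear scaling in samples per pixel.
    # Baseline from this thread: 6 photons/pixel took ~12 min 12 sec.
    BASELINE_PHOTONS = 6
    BASELINE_MINUTES = 12.2

    def estimate_minutes(photons_per_pixel):
        """Crude linear estimate; DoF and motion blur are ignored."""
        return BASELINE_MINUTES * photons_per_pixel / BASELINE_PHOTONS

    for p in (3, 6, 16):
        print(f"{p:2d} photons/pixel -> ~{estimate_minutes(p):.0f} min")
    # 16 photons/pixel comes out around 33 min, in the same ballpark as
    # the 40-minute figure quoted earlier for the higher-quality pass.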
 
Why GPU? Because it provides a level of interactivity you can't get without massive processing. Remember this is ray traced, not some OpenGL Phong shader. While 3D does work on the CPU, yes, it's slow.
 
The next answer is simpler - CPUs have really kind of stagnated the last couple of years. Their performance gains have been marginal while the price of the top performers is through the roof. It's easy to spend $1k+ per Xeon CPU if you want the current screamer.
 
The latest GPUs are far, far, far cheaper. The newest 680 is what - $600? Adding that to your current and probably a bit old machine will give it a lot more life than dropping $5k to just get started with a dual Xeon box. Seriously, my 8-core MBP with dual GPUs is still very viable for everyday work. The GPU investment has stretched out my purchase cycle for an entirely new machine quite a bit.
 
You don't have to get a Quadro... I'm going to have a blog post about this soon... but the short answer is: if you do this at home, casually at work, etc., a GFX-class card will probably be fine for you. That's a few hundred dollars. On the other hand, if you edit all day long, do 3D in AE all day, render at night and just pound hard on the machine, then a Quadro is what you want, because they are built for this level of continuous use and abuse. If you make your living from the machine, it's not that hard a decision.
 
Premiere Pro also greatly benefits from multiple GPUs, or just faster ones.
 
So does SpeedGrade

and DaVinci Resolve

and Photoshop
 
Given that all my main apps are seeing a lot of great improvements from more / better GPUs, I'm there. Also note that Premiere Pro has OpenCL acceleration now, so ATI GPUs are getting some love too.
 
It just amazes me... people say "I want you to max out the CPUs, RAM, GPUs of my machine. I've got 'em, please use them." Then when an app does....
 
S

On May 9, 2012, at 9:53 AM, Greg Balint wrote:

I agree Mylenium,
 
I was super excited to hear about the 3D stuff being added to AE. I thought, "Man, this will save me a lot of time and effort coordinating renders from C4D by just doing simple things right in AE." But I have to say I'm severely disappointed with this. I figured from the simple extrusion feature that this would be some quick-turnaround 3D feature, but with all of that horsepower, and still a 40-minute render? I could probably render the same scene, with native DoF and motion blur, directly in C4D in about 5 minutes.
 
Now I'm not bashing the example piece itself. But I can't see how I'd even be able to stand watching all of that processing power take forever for a 3D extrusion of some text with reflections and shadows.
 
No doubt this is a welcome feature for people who don't have any 3D package to work with, but for those of us that do, I don't see any of us using this feature for speedy production work.
 
 
I also wonder why the 3D implementation was decided to work solely on GPU processing, when the CPU is obviously more versatile and powerful, as seen with 3D packages that don't use the GPU at all for rendering. As a PC user with a beefy ATI card installed, it surely feels like Adobe has its pockets lined with Nvidia money.
 
Just my opinion here, but I'm way more excited about the caching features they've added.

////Greg Balint
///Art Director / Motion Graphics Designer
delRAZOR.com/

On May 9, 2012, at 10:20 AM, "mylenium@mylenium.de" <mylenium@mylenium.de> wrote:
 
Nobody is arguing the need for some sort of 3D in AE (though not exactly that kind of 3D as far as I'm concerned) and the marketability of such a feature, but seriously, if it takes 40 minutes on a machine with a beefy 4000-series Quadro and an additional Tesla board, then something is severely askew, and one can only wonder why the AE team chose this approach in the first place. I wouldn't even wanna know how long it takes to render this if AE falls back to its software mode. 2 hours? 3 hours? 5 hours? In a day and age where software renderers in 3D programs churn out frames with full global illumination and all the bells and whistles in minutes, even without any GPU involvement, what AE has to offer compares poorly.

Mylenium

 

[For Mylène, angel on earth]
-----------------------------------------
www.mylenium.de

 


Steve Oakley <steveo@practicali.com> wrote on 9 May 2012 at 16:07:

I don't have or use C4D, so I couldn't render anything 3D that way. AE is my app of choice; it's what I have in front of me to use. Please keep in mind this is all new for AE - ray tracing and the start of real 3D. It's an entirely new look / new capabilities. For how long have people asked for real 3D in AE? Forever. I don't think knocking it is right.
 
16 photons was about right. I had done some lower passes and it was too grainy. Not saying that wasn't a good look... I'll go check the project again.

I've seen some other really nice work done which will hopefully surface soon.
 
S

On May 9, 2012, at 8:37 AM, James Wilson wrote:

I think 16 photons might be overkill considering the dark nature and minimal number of objects to bounce light off of. Maybe it could have been rendered in less time with the same results if you took that setting down. It does look nice.

On May 9, 2012, at 2:47 AM, mylenium@mylenium.de wrote:

Yeah, and the same stuff probably renders in the same time in C4D - without any GPU fancies required, and on a 3-year-old machine. Ridiculous and nothing to write home about.

Mylenium

[For Mylène, angel on earth]
-----------------------------------------
www.mylenium.de

Steve Oakley <steveo@practicali.com> wrote on 9 May 2012 at 07:27:

OK... now that AE CS6 is loose, I'll show you some titles I made for my web shows. CS6, rendered with a Q4000 + Tesla C2075 in 40 min as I recall... that's with cranked-up DoF, motion blur, 16 photons/pixel - i.e. real nice settings :) It's quick, don't blink. Totally done in AE.

http://www.youtube.com/watch?v=X-ZNVyGSVIY&feature=youtu.be