
Hello!

I'm desperately trying to get the vector blur to work, but for some reason I always end up with a choked alpha channel. The interior of an object gets blurred fine, but the edges are just way too sharp.

I tested almost all the ways I could find to get UV velocity information from Maya to Nuke: lm2DV shaders with SmoothKit and ReelSmart settings, the puppet shader, a MEL script from Alias (originally for Maya 4, recompiled for 8) that converts Maya motion vectors stored in .iff files into rgb space - but I always end up with the choked edges. I haven't tried outputting mental ray velocity info via .mt files or control buffers so far, but I suppose there won't be such a big difference... soooo...

Maybe I'm doing something wrong in comping?

I'm piping the rg of the velocity info into the uv channels and using these as input for the vector blur, processing rgba. So far I've tested all possible options/settings in the vector blur node itself, at least I think so... maybe something else has to be done to the alpha beforehand, or it's necessary to provide a supplementary vector alpha, whatever... so far I was only using the r and g channels of the rendered velocity info.
Was testing with 16bit iff files.

Anyone out there who's got the vector blur working with good results?
Would be so nice to get some help on this...

PS: I also found out that Nuke can access the uv info stored in Maya .iff files when checking "keep motion vectors" - unfortunately it seems to decode it wrong, but nevertheless it can see that there's a uv channel. Maybe somebody can implement that Alias script (I also found the .cpp source code) in Nuke somehow, so that as soon as the choked alpha problem is gone you can access this info directly? Don't know its bit depth, though... nor do I have proper programming/scripting skills to do that...

Cheers, michi.

  • created Nov '06
  • last reply Aug '12
  • 20 replies
  • 25.8k views
  • 1 user
  • 6 links

You're saying you're using tiffs with the vector data in the red and green channels. Did you switch the Read node to linear colour space?
If not, Nuke will assume you are pulling in sRGB colour space and will apply a LUT to convert to linear. This would screw up your vector information, which might be what you're seeing.

Just a guess though.
cheers,
frank

Hey,

no, I was using Maya .iff files, which can store rgbaz+uv natively... I also switched the
colorspace to sRGB, but unfortunately that's not the reason for the choked alpha...

I'll try to put the files on a server pretty soon, think that says more than 1000 words..

But thx for the guess!

yep, would be good to have a looksee. I've used vector blur successfully in the past, but I think I was always using it as camera blur, not object blur.

I have a thread going on this very same topic. I put an email in to D2 support last week to try to get the skinny on what Nuke is expecting (-1 to 1, 0 to 1, etc.) but I haven't got any response. If I do, I'd be more than happy to write a shader or phenomenon or whatever we can get away with for this. So, if anyone can get that information, or even a file that is working with uv info, and send it to me, I'd be happy to share my results.

Frank, you mentioned that you've always used vector blur for camera blur, so what was your pipeline for object motion blur? (if you're willing to share)

Well, I can't recall having to apply object motion blur to a 3D render, at least not as a featured blur, more to fix low sampling in already existing motion blur.
If the object is animated in Nuke I tend to use multi sample blur which is much nicer (but of course way slower).
So I'm afraid I'm not of too much help here.

frank

So in other words, your object motion blur came entirely from the 3d guys? Render farms are a good thing...

Allright,

as mentioned before - here's the files I was testing and messing around today...

www.hdm-stuttgart.de/~mr20/downloads/vectorblur.zip - around 30MB

There are 8x 10-frame file sequences that should be named self-explanatorily, a Nuke script using them, and a .mel script (including cpp source code) that converts .iff Maya motion vectors into rgb files.

I rendered an .iff16 sequence in Maya with the Maya standard renderer containing motion vectors, converted them with the script (unfortunately only 8 bits so far...) and got a pretty decent, but not perfect, result with some processing before using vector blur and feeding them in as uv info. As a comparison served a sequence with rendered 2D MB (also included). Pretty clear that the quality is definitely not comparable with true 3D MB, but it's a start and can be a big time saver...

My tests with the lm2dv shader for mr were less successful; the ReelSmart setting doesn't work that well, SmoothKit seems to be a bit better... (see detailed info at http://www.revisionfx.com/generalfaqsMVFrom3D.htm )

Furthermore, I was testing a mental ray shader named p_motiontoRGB (http://www.puppet.cgtalk.ru/download/index_e.shtml) and rendered mental ray vectors into a float exr sequence... also with little success.

Unfortunately, I have no clue what the uv info vector blur expects is supposed to look like; I was also asking the guys from D2 but haven't got an answer yet...

We don't have renderman, so a shader like mentioned before doesn't help that much...but think that should also be possible in mental ray...

Or maybe somebody can take the source code for the mel-script and turn it into a tcl-script or gizmo, so that nuke can use the maya vectors directly....would be awesome....especially if that'd be in 16bit or even float...
unfortunately I don't have the skills to do that...

Maybe somebody finds a better solution or already has one, I'd be very happy to get feedback from you guys,

cheers, michi.

QUOTE(Scott212 @ 11/30/06, 12:11 PM) [snapback]253661[/snapback]
Render farms are a good thing...

true :smiley:

with all these mb issues floating around, maybe I should pick up an old R&D project again where a colleague and I set up a combination of vector blur and multi sample blur: every sample was vector blurred slightly before being blended together into the final image, so you could balance quality and render time.
I'd have to start from scratch though as I left all my nuke scripts at the company I was with last year.

I don't have too much time to look at the whole thing at the moment, but one thing that does jump out at me is that you are applying an sRGB lookup to the motion data, which is most likely not correct (unless Maya renders such data in non-linear sRGB space, which would be very odd).
Switch the colour space in the Read nodes for your vector data to "linear" or "raw"; otherwise, when left on "auto-detect", Nuke will assume that iff files are sRGB and apply a LUT.
While this may not be the only issue here, it certainly gives you a better start.

Another thing I remember from playing with motion blur in Nuke: with vector blur you have to use a shutter offset twice the size of a TimeBlur's (multi sample) to match the two.
This indicates that the shutter offset may be a culprit as well?!

As for how Nuke interprets motion data, it's easy to test.

Just attach a Transform node to something and animate it across X and/or Y, from 0 at frame 1 to 1 at frame 2 (for example). Attach a MotionBlur2D node and plug the animated Transform into its side input. This will generate Nuke's native uvs for vector blur, which you can then examine and measure in relation to the amount of motion you're applying through the Transform node.
A quick test reveals:
Moving something across frame (left to right) by 1 pixel over two frames will give you a u value of 0.5 at frame 1.5; moving it right to left will return -0.5. Moving it by 100 pixels will return 50 and -50 respectively. A few more tests, looking at values at keyframes where the motion stops or even changes direction, reveal the following:
Nuke takes the distance a pixel moves from the previous frame to the current frame, adds the distance the same pixel moves from the current frame to the next, and averages the two (which matches the 50/-50 readings above). The distance is in pixels (i.e. not normalized) and directions are expressed as positive and negative values (left to right or bottom to top is positive, right to left or top to bottom is negative).
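The convention measured above can be sketched in plain Python (the function and argument names are mine for illustration, not Nuke's API):

```python
# Hedged sketch of Nuke's uv convention as measured above.
# x_prev, x_cur, x_next: a pixel's horizontal position (in pixels) at the
# previous, current and next frame.
def nuke_u(x_prev, x_cur, x_next):
    """Signed u vector: left-to-right positive, right-to-left negative."""
    # distance prev->current plus current->next, averaged
    return ((x_cur - x_prev) + (x_next - x_cur)) / 2.0

# 100 pixels left to right over two frames:
print(nuke_u(0.0, 50.0, 100.0))    # 50.0
print(nuke_u(100.0, 50.0, 0.0))    # -50.0
```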

I think that's about all there is to know about the vector info in Nuke. Here are the nodes I looked at:

set cut_paste_input [stack 0]
version 4.5027
CheckerBoard2 {
inputs 0
centerlinecolor {1 1 0 1}
name CheckerBoard1
selected true
xpos -345
ypos -176
}
Transform {
translate {{curve l x1 0 k 1 s1 t-1} 0}
center {320 240}
name Transform1
selected true
xpos -345
ypos -107
}
set N2b4c5e0 [stack 0]
push $N2b4c5e0
MotionBlur2D {
inputs 2
name MotionBlur2D1
selected true
xpos -345
ypos -81
}

frank

Hi all,

Get the motion vector sequence rendered with LM_2DMV using the ReelSmart Motion Blur settings (use linear space), unpremult it by its alpha, and then add a color Expression node with the following:

For R set -> ((r-0.5)*width)/2
For G set -> ((g-0.5)*height)/2

I haven't done a stress test yet, but it seems to be working ok.
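The recipe above can be sketched per pixel in plain Python (a stand-in for the Unpremult and Expression nodes; the function name and scalar form are my assumptions):

```python
# Hedged sketch of the recipe above, per pixel: unpremult the normalized
# LM_2DMV values by alpha, then remap [0,1] around 0.5 into pixel offsets.
# Illustrative helper, not a Nuke API.
def mv_to_uv(r, g, a, width, height):
    if a > 0:                # unpremult, guarding empty pixels
        r, g = r / a, g / a
    u = ((r - 0.5) * width) / 2.0
    v = ((g - 0.5) * height) / 2.0
    return u, v

# a half-premultiplied pixel at 1920x1080:
print(mv_to_uv(0.375, 0.125, 0.5, 1920, 1080))   # (240.0, -135.0)
```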

cheers,

Hey folks,

with the tips from diogo and frank I finally got the vector blur to work properly - using the lm2dmv shaders, the motion blur looks pretty close to the rendered MR motion blur, thx!

I will still try to get the Maya .iff motion vector thing to work, it also seems an interesting approach to me - I'll let you know as soon as I've got something.

Glad to know it worked out.

Yeah keep us posted about the .iff MVs.

Cheers,

4 months later
15 days later

If anyone is interested, I have slightly modified this Renderman shader that I obtained from

http://www.3delight.com/en/modules/PunBB/v...p?id=61&p=16

(Thanks to the original author Mauritius).

...to get motion vectors from Maya in a Nuke-friendly format. The only modifications I made were to delete the normalising and clamping parameters, and to invert the red channel at the end. The output matches motion blurred 3Delight output quite closely when you adjust the shutter offset in VectorBlur (also, set the method to forward and use the alpha). Don't forget to render from Maya in float, and to turn "samplemotion" off [0] in the rib.

HTH

Michael


surface nukeVect()
{
    // Get the blur vector on the image plane, in pixels
    point rasterP = transform( "raster", P );
    point rasterPdPdtime = transform( "raster", P + dPdtime );
    vector pixelsMoved = rasterPdPdtime - rasterP;

    // Invert the red channel
    Ci = color( 1 - comp( pixelsMoved, 0 ),
                comp( pixelsMoved, 1 ),
                comp( pixelsMoved, 2 ) );
    Oi = 1;
}
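For reference, the same math in plain Python (assuming raster_p and raster_p_dt are the already-projected raster-space positions of P and P + dPdtime; names are illustrative, not part of any API):

```python
# Hedged sketch of the shader math above: per-pixel motion in raster
# space, with the red (u) component inverted as in the RSL.
def nuke_vect(raster_p, raster_p_dt):
    moved = [b - a for a, b in zip(raster_p, raster_p_dt)]   # pixelsMoved
    return (1.0 - moved[0], moved[1], moved[2])              # invert red

print(nuke_vect((10.0, 5.0, 0.0), (13.0, 7.0, 0.0)))   # (-2.0, 2.0, 0.0)
```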

11 days later
10 days later

QUOTE(NEO^AMiGA @ 04/17/07, 08:19 AM) [snapback]264236[/snapback]
Hey guys!

Can't really get it to work with this approach. Dunno what I'm doing wrong. Some screenshots can be found here:
http://www.vfxtalk.com/forum/showpost.php?...amp;postcount=51

You might need to change the expression you're using to convert normalized values to offset values to ignore the zeros:

r==0?0:((r-0.5)*width)/2
g==0?0:((g-0.5)*height)/2

And also, change knob
VectorBlur.method forward

from its default backward... it's slow though.

-Ean
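The zero-guarded remap above, sketched per pixel in plain Python (an illustrative function, not a Nuke expression):

```python
# Hedged sketch of the guarded expressions above: pixels with no vector
# data (exactly 0) stay 0 instead of being remapped to a large offset.
def remap_ignore_zeros(r, g, width, height):
    u = 0.0 if r == 0 else ((r - 0.5) * width) / 2.0
    v = 0.0 if g == 0 else ((g - 0.5) * height) / 2.0
    return u, v

print(remap_ignore_zeros(0.0, 0.25, 1920, 1080))   # (0.0, -135.0)
```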

9 months later

Do you still have the mel script for extracting an rgb image from the motion vector data in an iff? I've been looking for something like this for a while.

QUOTE(Michi0711 @ 11/30/06, 12:49 AM) [snapback]253608[/snapback]
I'm desperately trying to get the vector blur to work, but for some reason I also always end up with a choked alpha channel. [...]

1 month later

Hey,

I'll have a look - I might still have it flying around somewhere....

QUOTE(dziegler @ 03/13/08, 06:41 AM) [snapback]282541[/snapback]
Do you still have the mel script for extracting an rgb image from the motion vector data in an iff? I've been looking for something like this for a while.

3 years later

Ok guys, I had a problem with motion blur on premultiplied rendered 3d, where the edges were waaaay too sharp around the object...

solution: In the vector blur node there's an option where you can choose an alpha (above the mask options); set this to your object's alpha. And don't forget to set method to "Forward".

Hope it helps! Cheers