3D Models and Terragen - Basics

Started by j meyer, July 03, 2015, 03:17:21 PM


j meyer

Ulco, it would be nice if you could mail me a simple example of such a no-UV OBJ
with a strange preview, so I could look into it.

As for the smoothing thing:
1. Generally, as far as I know you don't smooth normals; you set a creasing angle which determines whether an edge is considered soft or sharp by the render engine, so that the "smooth normals" effect/feature of the render engine can take that into account. Otherwise it would constantly try to smooth out everything. (See the first post, and the small sketch at the end of this post.)

To avoid further confusion about that - at least in this thread - please let us all try
to use the same terms.

2. Yeah, I also have trouble understanding the way PoseRay handles, or rather defines, the angles. Pretty confusing. Still trying to figure it out, though.
3. Unless you need sharp edges on your humans or clothes you should not recalculate the normals at all. It's really only necessary in cases where hard (sharp, creased) edges are needed/wanted (or if the normals are missing, of course).
As for the 45° angle: it should work with a setting of 40-45°; anything above that and you see these "shadows". That's the same as the ugly black stuff in the first post, it just looks a bit different in P-ray. You can use that as an indicator of whether your edges are creased correctly or not.
You said you see that effect (the "shadows") with lower settings as well.
There seems to be some change in P-ray's behaviour at 35°, as with that setting it also creases angles greater than 90°.
Have to investigate that some more.
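
Here is the idea of a creasing angle in a few lines of Python, purely as an illustration (this is not how PoseRay or any particular engine implements it, and it assumes unit-length face normals):

[code]
import math

def edge_is_hard(face_normal_a, face_normal_b, crease_angle_deg):
    # Angle between the normals of the two faces sharing the edge
    # (both normals assumed to be unit length).
    dot = sum(a * b for a, b in zip(face_normal_a, face_normal_b))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    # Faces meeting at more than the crease angle keep a sharp edge;
    # anything below it gets smoothed (normals averaged across the edge).
    return angle > crease_angle_deg

# A 90° corner with a 45° crease setting -> hard edge:
print(edge_is_hard((0, 1, 0), (1, 0, 0), 45))  # True
[/code]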

paq

#16
Hello,

I'm not sure I'll help with the confusion here, my English is very basic.

The problem here has nothing to do with angle-based smoothing. Whichever way you split your normals (manually or angle-based), the raytraced self-shadow artifact will always happen until something is fixed in the renderer itself.

Gouraud shading smooths the normals to make a polymesh look smooth, but raytraced shadows use the 'real' geometry to compute the rays. As a result, Gouraud shading and the raytraced self-shadow don't match.
You can easily see why if you render the mesh without any smoothing.

As I said previously, some engines fix this problem by introducing a bias, either per light or per object. The bias discards any ray hit within the distance defined by the user and fixes the glitch.
I prefer the 'per object' method because the bias needed can be very different: a big ultra-low-poly mesh needs a much bigger 'fix' distance than an ultra-hires mesh.

A per-light bias will also shrink the shadows of every object in the scene, which is another reason a 'per object' solution seems better.
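
To make it concrete, here is a rough sketch of the trick in Python (this is not Terragen's code; scene.raycast and every other name here is made up for illustration, and the points/vectors are assumed to support + and *, e.g. numpy arrays):

[code]
def lit_by_light(hit_point, geometric_normal, light_dir, scene, bias):
    # Push the shadow-ray origin off the surface along the normal, and
    # ignore any hit closer than the bias distance, so the smoothed
    # (Gouraud-shaded) surface can't catch a shadow from its own
    # faceted geometry.
    origin = hit_point + geometric_normal * bias
    hit = scene.raycast(origin, light_dir, min_distance=bias)
    return hit is None  # nothing between the surface and the light
[/code]

With a per-object bias, a huge low-poly mesh can simply get a larger value than a dense high-res one.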

My guess is that there is already a minimal global bias distance in Terragen, to deal with double-sided geometry, but maybe it's not big enough to deal with very low-poly geometry like the one posted here.





Dune

So maybe Matt should introduce a bias slider per object?

I'll see if I still have an unmapped object, or I'll fix one. The problem with my workflow is that if I merge points in LW, I see torn edges in the maps in PoseRay. So I unweld everything in LW and only weld in PoseRay. Then I have to recalculate normals.

j meyer

paq - too technical for me, but thanks anyway.

Ulco - did Kadri's post with the P-ray quote help you?
I did some more experiments last night and think I understand how to make it work, for me at least. Ran out of time before I could render some examples. Hopefully tomorrow, if that's still wanted.
Do you do your UV mapping in LW before you merge points? If so, chances are good that that messes up your UVs and thus causes the torn edges. As far as I know you can split/cut parts after UVing, but welding pieces afterwards requires some repair work.



paq

#19
Hey Dune,

I'm still not really sure why you use PoseRay. To get hard/smooth edges working in Terragen?
It looks like there are a couple of options in the LW-Modeler .obj exporter:

The most important one seems to be 'obj write normals'; that should give you a 100% identical shading result between LW and Terragen.
If 'merge points' also merges UV coordinates, then don't enable that one, as it will indeed break your UV layout.

Keep in mind that .obj only supports one UV map, so check that your UVs are not split into multiple channels, and if they are, merge them all into one channel.
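
If you want to double-check what the exporter actually wrote, a few lines of Python are enough (just my own quick sketch, nothing official; the file name is only an example):

[code]
def check_obj(path):
    # Count vertex (v), texture-coordinate (vt) and normal (vn) records,
    # and see how many faces actually reference a normal index (v/vt/vn).
    counts = {"v": 0, "vt": 0, "vn": 0}
    faces = faces_with_normals = 0
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] in counts:
                counts[parts[0]] += 1
            elif parts[0] == "f":
                faces += 1
                # face corners look like "v", "v/vt", "v//vn" or "v/vt/vn"
                if all(p.count("/") == 2 and p.split("/")[2] for p in parts[1:]):
                    faces_with_normals += 1
    print(counts, f"{faces_with_normals}/{faces} faces reference normals")

check_obj("house.obj")  # hypothetical file name
[/code]

If vn is zero, or no face references a normal index, Terragen has nothing to work with and has to make up its own normals.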

I haven't used LW in years, but if I remember right, LW-Layout also has multiple exporter options, which use different .dlls from LW-Modeler.
Maybe you could also give .fbx a try (from Layout)?

j meyer

Thanks paq, that is really helpful!

TheBadger

^^
By chance, do you know the Maya equivalent to this? I'm having trouble determining why PoseRay works but Maya doesn't. And frankly, it would be really strange if PoseRay did something Maya can't.

paq

Hello,
Just to be sure, I did a nasty test to check Terragen's import capabilities, and it works perfectly, at least the .obj one.
The mesh was generated with NURBS software (MoI), so it's a collection of nasty micro triangles and quads.

The custom vertex normal data prevents ugly shading results, and Terragen uses the data perfectly.

Modo exports .obj fine too, out of the box.
Maya, I don't know, what options are available?

Maybe you have to compute and store the normals before exporting? (It looks like there is a 'lock normals' feature, but I'm not sure if that's the way to go.)
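
I don't have Maya here, so take this with a grain of salt, but from the docs something like this (Python/maya.cmds) should set the hard/soft edges from an angle and then lock the resulting normals onto the mesh before export:

[code]
import maya.cmds as cmds

mesh = "myHouse"  # whatever your object is called

# Let Maya decide hard/soft edges from a crease angle (here 40 degrees)...
cmds.polySoftEdge(mesh, angle=40, constructionHistory=False)

# ...then freeze ("lock") the per-vertex normals so the exporter writes
# exactly these values instead of recomputing them.
cmds.polyNormalPerVertex(mesh + ".vtx[*]", freezeNormal=True)
[/code]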

Kadri


"One OBJ file to rule them all" is not how it works, it seems.
There shouldn't be so many differences between different OBJ exporters and importers, but there isn't much to be done about it, it seems...
If the technical side gets too much in the way, I begin to dislike what I do.

Paq, that OBJ loader image looks good indeed.

Kadri


Paq, I remember that in one or more versions of LightWave in the past the OBJ export was badly broken.
Since then I mostly use PoseRay to convert LWO files. It probably works better now.
Not sure if this is one of the reasons Ulco uses PoseRay too.

Dune

I am very grateful for all your input guys, very helpful indeed. Though I didn't get a chance to read through the PoseRay manual yet, I saved it as a PDF, so I will later.
@paq: the problem might be that I still work in LW 9, and that doesn't export OBJs set up in layers, i.e. only one layer exports. If I put it all on one layer it works, though. I still have to experiment more, but I am kind of glued to my workflow, and it takes time to deviate and experiment. LW 9 doesn't have such a fancy exporter menu, it just runs. So I guess Kadri is right. And yes, that's why I use P-ray.
@Jochen: In LW I make a part, like a beam, and UV it. If I want specific parts to be oriented differently, I select the polys, cut and paste, or unweld, and move them around, or remap that part of the part. Then what I usually do is merge points.
But I also cut parts off finished beams in the normal viewports, paste them to another layer temporarily, and move them around for additional beams (under eaves, for instance), so I don't have to UV map them separately. Then I select them randomly or one by one and move them around or rotate them 180º in UV space, so they show different parts of one texture map.
Probably not the right way, as exporting this lot often gets the UV map torn.
Anyway, I haven't been modeling that long, so it's still a big learning process for me, and thanks again for all the info guys!

TheBadger

Quote: "Maybe you have to compute and store the normals before exporting?"
Yes, this may be something to go after. I read that normals have to be "applied", or rather any changes to them do. But there was no instruction, just the statement. So I have to search this out tonight.

My feeling is that somewhere in this is the answer to what PoseRay is doing and what the other software does automatically.

Where Maya is concerned, it feels to me that there are many, many tools but no automation. For automation they give you Python and MEL, which you can build your own auto tools with. But I can only write very little MEL, and don't know any Python, so I have to find all the manual ways of doing everything, or find scripts written by others.
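
For what it's worth, this is roughly the kind of export script I'm hoping to find (pieced together from the docs, completely untested on my end, so the option names may well be wrong):

[code]
import maya.cmds as cmds

# Make sure Maya's OBJ exporter plugin is loaded.
cmds.loadPlugin("objExport", quiet=True)

# Export the current selection with normals included ("normals=1"),
# which is what Terragen needs to keep the smoothing intact.
cmds.file(
    "C:/exports/myHouse.obj",  # hypothetical output path
    force=True,
    exportSelected=True,
    type="OBJexport",
    options="groups=1;ptgroups=1;materials=1;smoothing=1;normals=1",
)
[/code]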

paq

Hi Dune,

It might be a little bit tricky to find an LW 9 trial, so I'm afraid I can't help you that much for now.
'Merge layers before export' sounds familiar to me from the old days, but I'm sure there are scripts and plugins that can handle this... maybe even a native 'flatten layers' command in LW-Modeler?

As for the .obj export options, they are in the global options of LW-Modeler (so it's not something that pops up at export time). Try typing 'o' in Modeler?

j meyer

Ulco, you can access PoseRay's manual from inside the app, and in the folder where the program's files are there is a clickable .htm file to read in your browser.
Paq is right, LW should have global export options. That would have been my next recommendation, too.

I can't say that the manual is right about how it works. Maybe it was intended to work that way, but practice shows something else. But of course I might be too stupid again.
That's another thing I like about Wings3D: you don't have to care about creasing angles or how they are defined by the programmers. If you want an edge to be hard, you set it to be hard and that's it. That way you can crease as many different angles as you like, all in one single-piece object, no problem.


j meyer

There are three approaches to low-poly or mid-poly modeling for TG that I know of.

The first one would be to use hard/creased edges only.

The second uses hard edges in conjunction with a bevel. That catches a bit of light
and mitigates the sharp look of the first.

The third uses no hard edges at all. Instead you have sort of rounded-off corners/edges.
Take a look and you'll hopefully see what's meant.

Hard edges
[attachimg=1]

Bevel and hard edges
[attachimg=2]

Soft/uncreased edges
[attachimg=3]

That's how they look in TG
[attachimg=4]

to be continued in the next post