64-bit, network rendering and supersample prepass

Started by PorcupineFloyd, July 23, 2008, 07:53:42 am

Previous topic - Next topic

PorcupineFloyd

Hello,

I've recently pre-purchased TG2 and I have some questions regarding the future of this great software:

- Is a 64-bit version planned? Would the benefits of a 64-bit version really be worth releasing one?

- Is network rendering planned in the future? Or maybe GPU rendering?

- Does supersample prepass affect the scene without using GI surface details?

I'd also like to suggest a reference database explaining all of the current TG2 settings, to head off questions like mine regarding supersampling.
I've seen the TG2 documentation and FAQ page, but they're incomplete and focus more on tutorials than on the settings one by one.
It could be community-written and administered by Planetside staff.
What do you think?

jo

Hi,

Quote from: PorcupineFloyd on July 23, 2008, 07:53:42 am
- Is a 64-bit version planned? Would the benefits of a 64-bit version really be worth releasing one?


Yes and yes. 64-bit support is planned for some time after the TG2 final release. TG2 would greatly benefit from being able to use more memory via 64-bit support.

Quote from: PorcupineFloyd
- Is network rendering planned in the future?


The approach we've decided to take is not to support network rendering directly, but instead to provide scripts or similar to enable TG2 to work with existing network rendering solutions. I don't think we've settled on a particular solution to support yet. TG2 has a command line interface which allows it to be used with many network rendering/batch queuing systems, and in fact it already has been. At the moment network rendering is mainly useful for animations, but we may provide a way to render parts of a single image across multiple machines. You can do that to an extent now with crop regions, but it isn't automated.
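As a rough illustration, a script along these lines could split a frame into crop regions and hand one strip to each machine. The `tgdcli` executable name, its flags, and the host names below are hypothetical placeholders, not actual TG2 syntax — check the documentation for the real command-line options:

```python
# Sketch: splitting an image into horizontal crop strips so each render
# node handles one strip. Crop regions are expressed here as normalized
# (left, right, bottom, top) coordinates in the 0..1 range.

def crop_strips(n):
    """Split the image into n equal horizontal strips."""
    strips = []
    for i in range(n):
        bottom = i / n
        top = (i + 1) / n
        strips.append((0.0, 1.0, bottom, top))
    return strips

if __name__ == "__main__":
    machines = ["node01", "node02", "node03", "node04"]
    for host, (l, r, b, t) in zip(machines, crop_strips(len(machines))):
        # Hypothetical invocation -- substitute the real TG2 CLI syntax.
        print(f"ssh {host} tgdcli -p scene.tgd "
              f"-crop {l} {r} {b} {t} -o strip_{b:.2f}.exr")
```

The strips could then be stitched back together in any image editor, which is essentially the manual workflow crop regions allow today.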

Quote from: PorcupineFloyd
Or maybe GPU rendering?


No. Well, maybe waaaay in the future, but not in the foreseeable future. It isn't really practical at this point.

I can't address your other questions and such.

Regards,

Jo

Oshyan

A full node reference and completed "tutorial-style" documentation will be made available with the final release. Jo has answered the rest of your questions quite well.

- Oshyan

Matt

Supersample prepass and GI surface details do different things and are fairly unrelated.

GI surface details only affects the final pass, ray-tracing short-range contributions to global illumination which would otherwise be blurred and indistinct in the global illumination cache. For most scenes I would not recommend it because it dramatically increases render times. Always do crop renders to test how important this is in your image before enabling it.

Supersample prepass takes more closely-packed samples in the prepass so that it is less likely to miss small or narrow objects that don't have any other objects nearby. If you can see that the prepass is missing objects and there are no other objects nearby (e.g. blades of grass in the foreground, or narrow tree trunks), you should probably enable this. Missed objects in the prepass can result in the shadows (areas where GI is important) being too dark.
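To illustrate the idea with a toy sketch (this is just a 1-D analogy, not how TG2 actually samples): a narrow object can fall entirely between coarsely spaced prepass samples, so the GI cache never records it, while more closely packed samples make a hit far more likely.

```python
# Toy illustration: a narrow object (e.g. a grass blade covering 2% of a
# row of the image) can fall entirely between coarse prepass samples.

def sample_grid(n):
    """n evenly spaced sample positions across a unit-wide image row."""
    return [(i + 0.5) / n for i in range(n)]

def hits_object(samples, obj_start, obj_end):
    """True if any sample position lands on the object's interval."""
    return any(obj_start <= x <= obj_end for x in samples)

obj = (0.30, 0.32)  # a narrow object sitting between the coarse samples

print(hits_object(sample_grid(10), *obj))   # False: coarse prepass misses it
print(hits_object(sample_grid(100), *obj))  # True: denser prepass finds it
```

When the object is missed, its contribution never enters the GI cache, which is why those regions can come out too dark in the final image.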

Matt
Just because milk is white doesn't mean that clouds are made of milk.

PorcupineFloyd

Wow!

Now that's what I call an explanation :)

I'm doing a render right now at 3000x2000 px and I'm curious whether I really need 0.95 detail. It has been rendering for 24 hours now (Q6600 @ 3.2GHz) and it's about 3/4 done. However, the clouds still appear a bit noisy in the thinner parts. It seems that ~80 samples for cumulus clouds is far from enough.

I'll post the final picture in the appropriate forum after finishing this render.

Tangled-Universe

The number of cloud samples doesn't always determine the eventual quality. In fact, most of the time it doesn't.
Since the number of samples is mostly related to the depth and density of the cloud, it's better to look at the quality setting of the slider itself. 80 samples could be quality 1.0 but also quality 0.5; it all depends on your cloud-node settings etc.
If you have them at quality 1.0 and there is still grain, you might consider increasing the atmosphere samples (since the noise is in the thinner parts). Make small crop renders to experiment with these values; it's also useful to render them at the desired final render size and quality. To me that's quite logical, but I've read that many people don't do this.

Martin

reck

Quote from: PorcupineFloyd on July 23, 2008, 07:53:42 am


- Is network rendering planned in the future? Or maybe GPU rendering?



Would GPU rendering be a possibility with DirectX 11?

I saw this written about DX11 in an article earlier: DX11 will come with a "new shader technology that re-positions GPUs as general-purpose parallel processors". Basically this means that DX11 will bring GPGPU support.


Wikipedia says this about GPGPU:

"General-purpose computing on graphics processing units (GPGPU, also referred to as GPGP and to a lesser extent GP²) is the technique of using a GPU, which typically handles computation only for computer graphics, to perform computation in applications traditionally handled by the CPU."

So do you think this technology could be harnessed by TG in the future? It sounds like MS is trying to make it easier for software developers to make use of the GPU for things other than games. DX11 is also supposed to bring better support for multithreading, so that it's easier for developers to write multithreaded apps, something that's apparently not that easy.

The good news is that everyone with a DX10 card can use DX11.

EDIT: Also wanted to add that Intel's new graphics technology, Larrabee, is being positioned for the GPGPU market as well. It sounds like using the graphics card for things other than games is not too far around the corner now :)

Tangled-Universe

Quote from: reck on July 24, 2008, 08:22:22 am

The good news is that everyone with a DX10 card can use DX11.



Which by then will be the standard.

It just looks like MS is following Nvidia's CUDA idea?

rcallicotte

But Windows XP users can't take advantage of either DX10 or DX11.   :(
So this is Disney World.  Can we live here?

Moose

Just being ignorant here... but isn't DirectX proprietary to Windows, or can Mac and Linux make use of it too?

And slightly related... the Khronos Group is aiming to create an open standard. They have Apple (amongst other big names) on board, so I imagine Apple's proposed OpenCL is meant to be cross-platform, is it?

Khronos Group press release - http://www.khronos.org/news/press/releases/khronos_launches_heterogeneous_computing_initiative/
More on Apples OpenCL - http://www.anandtech.com/weblog/showpost.aspx?i=461

pixelmonk

Quote from: Tangled-Universe on July 24, 2008, 08:37:18 am
Quote from: reck on July 24, 2008, 08:22:22 am

The good news is that everyone with a DX10 card can use DX11.



Which by then will be the standard.

It just looks like MS is following Nvidia's CUDA idea?


I don't know if "following" is the right word. MS has been developing DX11 for some time, but other companies have been trying to develop software that can take advantage of GPUs for a few years now. One company actually created its own ray-tracing boards for use with its software.

reck

Calico, if/when this happens it's probably going to be another 18-24 months at least, and by that time Windows 7 will be out... maybe. If you're still using XP by then, it might be time to consider upgrading. The main reasons people have put off upgrading to Vista are compatibility and performance. By that point I'd say those things won't be an issue any more.

pixelmonk

Quote from: Moose on July 24, 2008, 09:36:06 am
Just being ignorant here... but isn't DirectX proprietary to Windows, or can Mac and Linux make use of it too?

And slightly related... the Khronos Group is aiming to create an open standard. They have Apple (amongst other big names) on board, so I imagine Apple's proposed OpenCL is meant to be cross-platform, is it?

Khronos Group press release - http://www.khronos.org/news/press/releases/khronos_launches_heterogeneous_computing_initiative/
More on Apples OpenCL - http://www.anandtech.com/weblog/showpost.aspx?i=461


The main problem with open standards is that they take years to approve, due to the selfish needs and reasons of the individual players involved. With more hands in the pot, the greater the chance of one not playing nice with the others for its own reasons. OpenGL had those issues for the longest time. Web standards... horribly slow to implement.

PorcupineFloyd

Quote from: Tangled-Universe on July 24, 2008, 08:10:04 am
The number of cloud samples doesn't always determine the eventual quality. In fact, most of the time it doesn't.
Since the number of samples is mostly related to the depth and density of the cloud, it's better to look at the quality setting of the slider itself. 80 samples could be quality 1.0 but also quality 0.5; it all depends on your cloud-node settings etc.
If you have them at quality 1.0 and there is still grain, you might consider increasing the atmosphere samples (since the noise is in the thinner parts). Make small crop renders to experiment with these values; it's also useful to render them at the desired final render size and quality. To me that's quite logical, but I've read that many people don't do this.

Martin


So what you are suggesting is to increase the atmosphere samples, not the cloud samples, to get rid of the noise in the thin parts of the clouds?