What is the right way now? Where should I grab the best-resolution sources currently?
I see I can import heightmaps in both .img and .adf formats from https://viewer.nationalmap.gov, but how am I supposed to match a color texture, or at least a topo map (for forest coverage, for example), and where is the best source for them? I recall people could grab a color image from Google Earth before (in SketchUp), but it looks like that's no longer an option.
The National Map Viewer and Earth Explorer both have color (satellite) imagery that includes georeferencing just like the terrain does. So to get it to align you would just load each of them into the respective Georeference-enabled shader. In the case of terrain, Geog Heightfield Load (Add Terrain -> Heightfield (load DEM)). In the case of satellite imagery, you apply color in the Shaders layout, and you want to Add Layer -> Colour Shader -> Geog Image Map. Then make sure "Georeference" is enabled in both and they should line up automatically in Terragen. Note however that they may appear far from your camera position since georeferencing places them in their real-world location on the planet. You may need to either move your camera, or adjust the "lat/long at apex" of the planet OR change the georeferencing offset manually. There is a lot of discussion of all of this elsewhere on the forums.
- Oshyan
Thanks Oshyan. That "Georeference" option seems not to be available in the Free version. Can I get a trial of the full version somehow? If it works for me, it could be an incentive to upgrade my old Terragen 2 license.
You can apply for an unrestricted 30 day evaluation license here:
https://docs.google.com/forms/d/1ZJS3ZFzcd48aFRZgEyWJvcfBdBOOrFiF-DzXeiNu7W4/viewform
- Oshyan
Thanks Oshyan, I got the pro trial. Still, I have a problem. I can load and see .adf or .img elevation formats from https://viewer.nationalmap.gov/basic/ just fine. Pasting the SW corner coordinates into the planet's "Lat long at apex" places the terrain right next to the camera's default position, as described in the help.
But the free NAIP color images (National Agriculture Imagery Program .jp2 files) load somewhere I can't even find on the planet, not matching the elevation data at all. Should I do something special? Switch the coordinate system or something? I'd bet they use a pretty typical one.
Oddly, in Terragen I see completely different numbers in the SW and other corners for .adf and .jp2 files downloaded from basically the same place on https://viewer.nationalmap.gov/basic/, while the .jp2 color texture has numbers in its file name suspiciously close to what I see in the .adf corners. So what am I missing?
ps. Whatever I was able to google regarding the issue seems to relate to older USGS file formats. So I wonder if a tutorial on the current approach exists somewhere?
Do those images appear at 0,0,0 if you disable their georeferencing? I don't have experience with the specific files or issues you're referring to and we've had no other reports of it, so it may be something specific to that data set (as you say, an older data set), or may be a bug in our handling of georeferencing that is simply rare. Can you provide more details on the *specific* data you're downloading so we can test?
- Oshyan
Yes, those NAIP .jp2 color textures do appear in view (at position 0,0,0) once I switch from georeferencing to an explicit position. And the files are not old.
I downloaded them a few days ago from the current USGS web site, https://viewer.nationalmap.gov/basic/, so I'd bet it's what they distribute currently.
I just tried downloading files for another US location and it's all the same, so it isn't specific to any one place or to possible errors in my original files.
I think those NAIP .jp2 images just use a different georeferencing system or something compared to the elevation formats.
I personally have no idea where this georeferencing is stored, where Terragen finds it, or how it is supposed to work, so I'm not sure what details I could provide. Could you please try downloading something from https://viewer.nationalmap.gov/basic/ to check whether it works on your side? I assume this is supposed to be a working feature in Terragen, and nationalmap is a typical source.
ps. I tried the 1/3 arc-second elevation and the NAIP color textures.
Here is a color file example: https://www.dropbox.com/s/chtvbv692rjzhvm/m_3611729_sw_11_h_20160703_20161004.jp2?dl=0
I can't even find it on the planet.
I am testing the .jp2 you provided and will let you know if I can address the issue or find any other details, bugs, etc.
In the meantime I do have a thought/suggestion: if you download the color photo imagery for the same location as a DEM, then once you load all of them you can use "Center on object or shader" in the right-click menu of the 3D preview to move the camera to where the DEM is (it doesn't work with Geog Image Map shaders unfortunately, hence this technique). You want to do this *without* offsetting the georef to 0,0,0. Your camera may end up inside the planet, but try to see if you can move and reorient it so you see the loaded DEM, and then you should hopefully see the JP2 on top of it. That's assuming it's loading correctly, of course. The thing about these images is they cover only a few km, so they could be *very* hard to find on the whole planet without some kind of reference/starting point like this.
- Oshyan
Update: as an example, the image appears as a *single pixel* in a 2000x2000 pixel render from orbit. This is with Georef turned *off*, so it's definitely rendering the image correctly, but it would be impossible to find at this point. It becomes only barely more visible with the image at 10x its actual, real-world size.
Having said all that, it's certainly still possible that it's just not rendering correctly when georeferenced. We're still investigating.
- Oshyan
I tried your method and still don't see the .jp2 image anywhere. They are smaller than the 1/3 arc-second elevation files, so you need a few of them to cover one elevation piece. Also, they come with an XML side file, like this one for the previous .jp2: https://www.dropbox.com/s/zebm9uor25rkltw/m_3611729_sw_11_h_20160703_20161004.xml?dl=0
I don't see any metadata at all in the .jp2 itself, just RGB channels plus an infrared one. Photoshop opens them as a grayscale image with 4 channels. So I wonder where Terragen is supposed to get the georeferencing info from; maybe from this XML?
ps. https://www.dropbox.com/s/gi5yad1pahwsvuy/w001001.adf?dl=0 is the DEM file I expected to see aligned with the .jp2.
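One way to check whether a .jp2 carries its own georeferencing at all: GeoJP2-style files embed a GeoTIFF inside a JP2 "uuid" box tagged with a fixed, well-known UUID. A minimal sketch in Python (a crude byte search, not a real JP2 box parser, but enough to tell embedded georeferencing from sidecar-only files):

```python
# GeoJP2 stores georeferencing as a GeoTIFF embedded in a JP2 "uuid" box
# tagged with this fixed UUID. If the UUID is absent, the image itself
# carries no georeferencing and a sidecar file must supply it.
GEOJP2_UUID = bytes.fromhex("b14bf8bd083d4b43a5ae8cd7d5a6ce03")

def has_geojp2_box(path):
    """Crude check: scan the raw bytes for the GeoJP2 UUID.

    A plain byte search rather than a full JP2 box parser, but good
    enough to tell whether georeferencing is embedded at all.
    """
    with open(path, "rb") as f:
        return GEOJP2_UUID in f.read()
```

If this returns False for a NAIP tile, the georeferencing would have to come from somewhere outside the .jp2, such as the XML side file.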
OK, thanks for checking. We are looking into it and may need to update our GIS data import library.
- Oshyan
Any news on the matter?
The Mac/GIS developer was not able to get to it until this week. We will hopefully have more info soon.
- Oshyan
KirillK – The .jp2 file just seems to be a big dumb JPEG 2000 file. I downloaded one out of curiosity and couldn't see any metadata attached to it when I opened it in Photoshop. But you were right about the related XML file. If you open that (in any plain text editor) you will see this element:
<bounding>
<westbc>-77.8750</westbc>
<eastbc>-77.8125</eastbc>
<northbc>43.2500</northbc>
<southbc>43.1875</southbc>
</bounding>
Just plug these values into your Geog Image Map shader:
[attach=1]
And load the DEM file. When I did this they seemed to line up just fine:
[attach=2]
Maybe the software wizards at Planetside can automate this down the road.
These are large files. I wouldn't want to try loading more than a couple at a time.
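With many tiles, the same lookup could be scripted. A minimal sketch, assuming each .jp2 ships with a same-named FGDC XML sidecar containing a bounding element like the one above:

```python
import xml.etree.ElementTree as ET

def read_bounding(xml_path):
    """Return (west, east, north, south) from a NAIP XML sidecar.

    Finds the first <bounding> element anywhere in the file, as in the
    snippet quoted above, and returns its four edges as floats.
    """
    b = ET.parse(xml_path).getroot().find(".//bounding")
    return tuple(float(b.find(tag).text)
                 for tag in ("westbc", "eastbc", "northbc", "southbc"))
```

Running this over all the sidecars would at least collect the four corner values to type into each tile's Geog Image Map shader.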
The Geog Image Map Shader *ought* to be loading that info automatically already. We have the GIS dev looking at it.
- Oshyan
QuoteThe Geog Image Map Shader *ought* to be loading that info automatically already. We have the GIS dev looking at it.
Good to know, Oshyan. But if KirillK needs a workaround in the meantime, it really isn't too much trouble to do it manually, assuming he needs only a few of the files in his scene.
Yep, appreciate the workaround suggestion for sure. :)
- Oshyan
Quote from: sboerner on October 24, 2018, 04:52:11 PM
QuoteThe Geog Image Map Shader *ought* to be loading that info automatically already. We have the GIS dev looking at it.
Good to know, Oshyan. But if KirillK needs a workaround in the meantime, it really isn't too much trouble to do it manually, assuming he needs only a few of the files in his scene.
Thank you very much, sboerner. Too bad I need to load almost 100 NAIP files for the piece of terrain we need.
It would be cool if we could export the terrain micromesh with UDIM or multi-texture style UVs. Eventually I need to bake the terrain into a low-poly shell with a normal map and displacement (for a game's distant background that stays partly 3D and dynamically illuminated), and it looks like I have no options in Terragen. I would love to see a baking render in Terragen. I bought Clarisse for exactly that, and it's not working properly there. I need something suitable for games, where I could influence the rays cast from the low-poly shell's pixels in a direction set by a secondary "cage" object, not just along the normals.
It would be a great feature.
Quote from: KirillK on October 25, 2018, 12:07:57 PM
I need something suitable for games, where I could influence the rays cast from the low-poly shell's pixels in a direction set by a secondary "cage" object, not just along the normals.
Can you tell me more about this? How would this work, and what are the advantages over normal maps?
I am not sure I understand your question. Or perhaps I am missing some possibilities in Terragen.
I assume it can render a normal pass from a spherical or top-down ortho camera. But for a game I need a tangent-space normal map matching the exact low-poly shell model, with low-poly tricks like alpha-mapped polygonal fins on top of a mountain ridge, for example, to mimic a complex silhouette with all the trees, buildings, and small details.
Projecting all this from a spherical camera works only for a very distant, flat background; for something 3D and closer to the main part of the game scene (the middleground) I need projection angles specific to the exact low-poly geometry.
ps. Perhaps I could cope with world-space normal colors and convert them to tangent space later in software like Substance Designer, but I would still need specific projection angles and the "cage" to set them for every target polygon vertex/corner.
I just think that instead of transferring all the complex high-res geometry and materials from Terragen to Max or Maya, it would be so much easier to import a low-poly shell model retopologized over the micromesh and use baking ray tracing to produce the necessary textures.
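To illustrate the cage idea, here is a minimal sketch of how cage-driven baking could pick a ray for one texel. This is an assumption-level illustration, not anything from Terragen or Clarisse: the shell and cage are assumed to share triangle topology, and the texel's barycentric coordinates pick matching points on both meshes.

```python
def cage_ray(shell_tri, cage_tri, bary):
    """Ray origin and direction for one texel in cage-style baking.

    shell_tri, cage_tri: the three (x, y, z) vertices of the same
    triangle on the low-poly shell and on the enclosing cage mesh.
    bary: the texel's barycentric coordinates within that triangle.
    Instead of shooting along the shell normal, the ray runs from the
    interpolated cage point toward the matching shell point, so the
    artist steers the capture direction by editing the cage.
    """
    def lerp(tri):
        # Barycentric interpolation of a point inside the triangle.
        return tuple(sum(b * v[i] for b, v in zip(bary, tri))
                     for i in range(3))

    origin = lerp(cage_tri)
    target = lerp(shell_tri)
    direction = tuple(t - o for t, o in zip(target, origin))
    return origin, direction
```

Whatever the cage intersects along that ray on the high-res terrain is what gets written into the texel, which is exactly the per-vertex/corner control over projection angles described above.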