"even out" textures
Posted: 2013-07-11T07:55:20-07:00
by thorax
Hi guys, first post here.
I'm working with satellite textures, putting them into a flight simulator to cover the elevation mesh.
These textures work great and give a very good level of realism, but they have a fatal flaw: my texture provider takes the pictures over time, and each batch usually has a different tone. So in the end result you can clearly see the tone boundary, which pretty much ruins the experience.
I was wondering if there is some sort of command to "even out" these tones and end up with a relatively uniform texture tone. I work with these textures as an array of many, many 2048x2048 tiles in X and Y that, when put together in the flight simulator, create the "big image" result.
So some textures may be fully dark, others fully bright, and others may contain both a bright part and a dark part.
Please see this photo to better understand the idea:
I'm working with many, many textures, so doing this manually is ruled out.
What do you think, is there any hope?
Thank you!
Re: "even out" textures
Posted: 2013-07-11T10:46:35-07:00
by fmw42
Many years ago I did this same kind of thing for a flight simulator database. See
http://www.fmwconcepts.com/fmw/ipt.html, though that example is not shown on this page.
What we did was get overlapping images and normalize them (before building the mosaic) to roughly the same brightness and contrast, using an algorithm similar to either my space script or my redist script (see the links below). At that time our large flight simulator database only used grayscale images, but the concepts should still apply. Because the images overlapped, we could then use a mask, ramped at the edges of the images, to blend them together.
You could try using IM -equalize on your images to normalize them, or try my scripts if you are on Linux/Mac.
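For example, a minimal loop over a directory of tiles (the paths here are just placeholders):
Code: Select all
# try to normalize every tile with -equalize (paths are placeholders)
mkdir -p normalized
for f in tiles/*.png; do
    convert "$f" -equalize "normalized/$(basename "$f")"
done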
Re: "even out" textures
Posted: 2013-07-11T10:57:59-07:00
by thorax
Of course, I'm on Linux. I'll investigate your links. Thanks so much!
Re: "even out" textures
Posted: 2013-07-11T11:05:23-07:00
by fmw42
Note that the space script probably works better in HDRI mode, but it may be slower. The redist script may work better. It allows you to specify the same common mean and standard deviation for every image and tries to achieve those results by histogram processing.
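If you only want the general idea with plain IM commands, here is a rough linear sketch (this is not the redist script itself, which does true histogram redistribution; the target values below are just examples):
Code: Select all
# remap a tile so its grayscale mean and standard deviation move toward
# common target values (targets are just examples and would need tuning)
target_mean=0.45
target_std=0.15
mean=$(convert tile.png -colorspace Gray -format "%[fx:mean]" info:)
std=$(convert tile.png -colorspace Gray -format "%[fx:standard_deviation]" info:)
gain=$(convert xc: -format "%[fx:$target_std/$std]" info:)
bias=$(convert xc: -format "%[fx:$target_mean-$gain*$mean]" info:)
# -function Polynomial "a,b" applies a*u + b to each channel
convert tile.png -function Polynomial "$gain,$bias" tile_normalized.png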
Re: "even out" textures
Posted: 2013-07-11T14:22:28-07:00
by thorax
This looks complicated. I've tested some examples with no luck.
This is a sample texture that features the boundary. The end result should be all green, the same tone.
Can you please give me a hint on the parameters to achieve this?
Thank you.
Re: "even out" textures
Posted: 2013-07-11T15:58:36-07:00
by fmw42
Is this one single photo or is it a cut-out from your mosaic? The techniques I mentioned only work on the original images, not on a cut-out from a mosaic that includes two different images with such different colors.
Re: "even out" textures
Posted: 2013-07-11T16:25:25-07:00
by thorax
This is one single photo. The thing is that my process outputs the textures already cropped, so at no point do I have the big picture. Handling it any other way is totally impractical: we are talking about 2500 textures of 2048x2048 each, and merging them into one single file would be gigantic.
I don't have access to the original images, only textures like this one.
Any hope?
Thanks.
Re: "even out" textures
Posted: 2013-07-11T16:42:49-07:00
by snibgo
From your sample image, which is 1024x1024 pixels, it is easy to write a script that makes the left portion look more like the right, eg with "-modulate":
Code: Select all
convert test_field.png -modulate 140,270,160.28 tf.png
A mask would determine which part of the image to change.
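For example (just a sketch; mask.png is a hypothetical mask that is white where the correction should be applied and black elsewhere):
Code: Select all
convert test_field.png \( +clone -modulate 140,270,160.28 \) \
  mask.png -composite tf.png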
However, if you have 2500 such images, you would also need to automate the process of determining which part of each image is correct and which is incorrect, and thus the required shifts in hue, saturation and lightness.
Re: "even out" textures
Posted: 2013-07-11T18:14:12-07:00
by fmw42
Does the brown area on the left come from a different photo than the green area that fills most of the image, or is it just something like a burned area in the one original photo?
I am trying to determine whether your 2048x2048 tiles have already been merged or whether each comes from a single photo. If a tile comes from just one photo, then you can try to use my scripts to bring all the tiles to some common statistics (mean and std). If the tiles are from mixed images, then you would need to use masking, as suggested by snibgo, on each part that comes from a different image.
The process I used years ago worked on each original image before merging and tiling, so I am not really sure what to suggest where you have mixed tiles. Tiles that have data from only one photo could be processed as mentioned above.
Re: "even out" textures
Posted: 2013-07-11T18:21:35-07:00
by snibgo
I suspect the tile is a composite made from two photos, in two passes of a survey aircraft.
Looking at Thu_Jul_11_114529_ART_2013.jpg, and thinking about how to automate the grading process for 2500 images:
You may have a fairly small number of grade sets.
(Terminology: "grading" is a term used in film and video to mean modifying footage that might be shot at different times in different places with different cameras to give the impression of the same place and time. Parameters that are modified include colour balance, saturation, contrast and lightness. Typically a number of shots will need the same grading. By "grade set" I mean a series of photos from one run of the surveying aircraft that all need the same grading.)
You might:
(1) Identify each grade set. Decide which one (if any) is "correct", and what grading needs to be applied to the others. This might be as simple as the "-modulate" I gave above.
(2) Mark up each input image as being entirely within one grade set (ie the image looks like it is from one photo), or within multiple grade sets (ie looks like a composite, a mosaic).
(3) Images entirely within one grade set can be automatically graded, no problem.
(4) How many are left? These ones will need multiple gradings. We know what the gradings are. The only difficulty is determining the masks that define which part of the image needs which grading. In the sample you provided above, the answer is fairly simple: the reddish part needs grading and the greenish part doesn't. So, when defining grade sets in step (1) you could define the exact meaning of "reddish" etc.
I think this would give a good first approximation. I can see two problems:
1. Lakes etc might mess up step (4). Perhaps these would be handled manually by painting the masks, or automatically by looking for holes in masks.
2. The lighting might change as a survey aircraft makes its run, so the grading for a grade set changes along its length. This would be more complex to correct.
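As a very rough sketch of the mask in step (4) for your sample (the "more red than green" rule and the blur radius are just guesses and would need tuning per grade set):
Code: Select all
# white where the pixel is more red than green (the part needing grading),
# then blurred so the graded and ungraded parts blend smoothly
convert test_field.png -fx "r>g ? 1 : 0" -blur 0x15 mask.png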
Re: "even out" textures
Posted: 2013-07-12T09:56:17-07:00
by thorax
Thanks a lot for the help, guys. Let me try to respond to the questions:
snibgo wrote:However, if you have 2500 such images, you would also need to automate the process of determining which part of each image is correct and which is incorrect, and thus the required shifts in hue, saturation and lightness.
Yes, that would be the case. Doing this manually is just crazy.
fmw42 wrote:Does the brown area on the left come from a different photo than the green area that fills most of the image, or is it just something like a burned area in the one original photo?
I am trying to determine whether your 2048x2048 tiles have already been merged or whether each comes from a single photo. If a tile comes from just one photo, then you can try to use my scripts to bring all the tiles to some common statistics (mean and std). If the tiles are from mixed images, then you would need to use masking, as suggested by snibgo, on each part that comes from a different image.
The process I used years ago worked on each original image before merging and tiling, so I am not really sure what to suggest where you have mixed tiles. Tiles that have data from only one photo could be processed as mentioned above.
Yes, it comes from a different photo of the same area, taken at a different time. Atmospheric effects like haze, etc. make the photos take on different tones, but the underlying area is the same. My goal is to even them out so they look the same. I don't care if it ends up more or less green; I just need it to be even, with no visible "boundaries" that ruin everything.
My 2048x2048 tiles are crops of the big merged photo. The big merged photo is so big that I handle it as small 2048x2048 textures that form a matrix.
Re: "even out" textures
Posted: 2013-07-12T10:01:13-07:00
by thorax
snibgo wrote:I suspect the tile is a composite made from two photos, in two passes of a survey aircraft.
Looking at Thu_Jul_11_114529_ART_2013.jpg, and thinking about how to automate the grading process for 2500 images:
You may have a fairly small number of grade sets.
(Terminology: "grading" is a term used in film and video to mean modifying footage that might be shot at different times in different places with different cameras to give the impression of the same place and time. Parameters that are modified include colour balance, saturation, contrast and lightness. Typically a number of shots will need the same grading. By "grade set" I mean a series of photos from one run of the surveying aircraft that all need the same grading.)
Yes, there are a couple of them, as you can see here:
You might:
(1) Identify each grade set. Decide which one (if any) is "correct", and what grading needs to be applied to the others. This might be as simple as the "-modulate" I gave above.
(2) Mark up each input image as being entirely within one grade set (ie the image looks like it is from one photo), or within multiple grade sets (ie looks like a composite, a mosaic).
(3) Images entirely within one grade set can be automatically graded, no problem.
(4) How many are left? These ones will need multiple gradings. We know what the gradings are. The only difficulty is determining the masks that define which part of the image needs which grading. In the sample you provided above, the answer is fairly simple: the reddish part needs grading and the greenish part doesn't. So, when defining grade sets in step (1) you could define the exact meaning of "reddish" etc.
I think this would give a good first approximation. I can see two problems:
1. Lakes etc might mess up step (4). Perhaps these would be handled manually by painting the masks, or automatically by looking for holes in masks.
2. The lighting might change as a survey aircraft makes its run, so the grading for a grade set changes along its length. This would be more complex to correct.
Regarding the lakes and rivers, I really don't care if they get messed up, because those are overwritten by the simulator's actual lakes that go on top.
The lighting may change slightly, but I think not too much, since each grade set comes from an individual image, I guess. So I think I'll try some of the ideas here and see how it goes!
This has been very useful, thanks a lot!
Re: "even out" textures
Posted: 2013-07-12T10:06:21-07:00
by thorax
What I have noticed is that the areas where the gradient changes are pretty much either vertical or horizontal lines.
Would it be possible to detect these lines automatically and feed that into the -modulate command?
Re: "even out" textures
Posted: 2013-07-12T10:18:21-07:00
by thorax
snibgo wrote:From your sample image, which is 1024x1024 pixels, it is easy to write a script that makes the left portion look more like the right, eg with "-modulate":
Code: Select all
convert test_field.png -modulate 140,270,160.28 tf.png
A mask would determine which part of the image to change.
However, if you have 2500 such images, you would also need to automate the process of determining which part of each image is correct and which is incorrect, and thus the required shifts in hue, saturation and lightness.
I'm wondering how you came up with these values: -modulate 140,270,160.28. Did you calculate them, or was it just trial and error?
Re: "even out" textures
Posted: 2013-07-12T11:56:29-07:00
by snibgo
thorax wrote:What I have noticed is that the areas where the gradient changes are pretty much either vertical or horizontal lines.
Would it be possible to detect these lines automatically and feed that into the -modulate command?
Looking carefully at your sample image, I think the line isn't straight. I'd do the job with a mask created in the software: black where the sample is reddish and white where it is greenish. Some experimentation would be needed to get the smoothest (least obvious) transition.
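Putting the two steps together for your sample tile, as a rough sketch (here the mask is the inverse of the one just described, ie white where the tile is reddish, so white marks the part to re-grade; the colour rule and blur are guesses):
Code: Select all
convert test_field.png \
  \( +clone -modulate 140,270,160.28 \) \
  \( -clone 0 -fx "r>g ? 1 : 0" -blur 0x15 \) \
  -composite tf.png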
thorax wrote:I'm wondering how you came up with these values: -modulate 140,270,160.28. Did you calculate them, or was it just trial and error?
Trial and error over three variables? I'm too lazy to do that.
The left part has approx hue = 330 degrees, saturation = 11%, lightness = 17%.
The right part has hue = 113 degrees, sat = 30%, lightness = 24%.
Hue: 330-113 = 217. 360 degrees is 100, so 217 degrees is 100*217/360 = 60.28. So Hue setting is 100 + 60.28 = 160.28.
Saturation: 30/11 = 2.7, * 100 is 270.
Lightness: 24/17 = 1.4, * 100 is 140.
Getting a close match would probably also need contrast (ie standard deviation).
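If you want to measure such numbers with IM itself rather than in an image editor, something like this would do it (a sketch; the crop geometry is just an assumed patch from the left part):
Code: Select all
# mean of each HSL channel over an assumed 200x200 patch from the left part
convert test_field.png -crop 200x200+50+400 +repage -colorspace HSL \
  -separate -format "%[fx:mean]\n" info:
The three printed values are hue, saturation and lightness as fractions of 1: multiply the first by 360 for degrees and the other two by 100 for percentages.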