Re: Texture_Overload
Posted: Fri Apr 01, 2011 12:40 pm
by GeoPappas
ROUBAL wrote:
As each compression adds new artifacts, if you save an image in jpg after working on it, the uncompressed image will be bigger after each new compression !
I don't believe that is correct.
The size in memory of an uncompressed image depends on three things: width, height, and color depth (bits per pixel). For example, if an image is 4000 x 3000 with 24 bits/pixel, it will take up 36,000,000 bytes of memory. It doesn't matter if the image is all black, all white, a gradient from black to white, or a mixture of all 16 million colors.
Compressing an image on disk is just used to save disk space. Some compression formats are lossy (some of the image data is discarded during compression), such as JPG, and some are lossless, such as PNG. But when the image is decompressed it will occupy the same amount of memory no matter how it was stored on disk (as long as the color depth in bits per pixel is the same).
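For what it's worth, that arithmetic is easy to check. A minimal sketch in plain Python (the function name is just for illustration):
Code:
    # Uncompressed footprint depends only on resolution and color depth,
    # never on how the file was compressed on disk.
    def uncompressed_bytes(width, height, bits_per_pixel):
        return width * height * bits_per_pixel // 8

    # The 4000 x 3000, 24 bits/pixel example from above:
    print(uncompressed_bytes(4000, 3000, 24))   # 36000000 bytes (~34.3 MiB)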
Re: Texture_Overload
Posted: Fri Apr 01, 2011 2:46 pm
by matej
If using compressed images in Octane were that easy or feasible to implement, Radiance would probably already be working on it. Instead of a simple memory lookup, the decompression algorithm would need to run every time some pixel info is needed in the pipeline - and that's width * height * no. of times a texture appears in the calculation * no. of textures, per refresh cycle. My educated guess is that the speed impact would be far from minimal.
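Plugging that formula into some made-up numbers shows the order of magnitude a per-fetch decompressor would have to keep up with (plain Python; the texture size, appearance count and texture count below are purely hypothetical, not Octane internals):
Code:
    # The rough fetch count above, with hypothetical numbers.
    tex_width, tex_height = 2048, 2048   # one texture's resolution
    appearances_per_cycle = 8            # times that texture is sampled per refresh
    num_textures          = 20           # textures used in the scene

    lookups = tex_width * tex_height * appearances_per_cycle * num_textures
    print(f"{lookups:,} texture lookups per refresh cycle")   # 671,088,640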
The whole node-based-materials thing is there (also) to save memory on texture data, which will become even easier once the new stuff is implemented.
**STK** wrote:Yep, try to be more creative
Exactly
Re: Texture_Overload
Posted: Fri Apr 01, 2011 5:09 pm
by **STK**
Ok, let's see...
Re: Texture_Overload
Posted: Fri Apr 01, 2011 5:12 pm
by **STK**
Ok Mr. Knowledge, here are some extreme thoughts from some people who want to see further than their own nose...
Re: Texture_Overload
Posted: Fri Apr 01, 2011 6:00 pm
by matej
**STK** wrote:Ok Mr. Knowledge, here are some extreme thoughts from some people who want to see further than their own nose...
Since you obviously* read the paper you posted, can you point me to the section that talks about using compressed textures for rendering by decompressing them on the fly?
And could you explain how you know that:
=> the techniques described in the paper are relevant to a GPU renderer such as Octane
=> [if the above is the case] the techniques described in the paper are applicable to and implementable on GPU architecture and CUDA technology
=> [if the above is the case] Refractive Software is willing to throw money and time at inventing & implementing compression algorithms, instead of at features like nodes, render kernels & memory optimizations on geometry
=> [if the above is the case] the users are willing to wait another year for Octane 1.0 because of this
(* that paper is not applicable to the problem in this thread)
---
On topic:
The devs are implementing instances, which will in some cases save a lot of memory. For everything else there is Masterc... er, nodes - which in some cases already allow you to create full-fledged materials (diffuse, specular, bump) from just one grayscale texture, or even with no textures at all.
EDIT: can anyone point me to a renderer that uses compressed textures and decompresses them on the fly during rendering?
Re: Texture_Overload
Posted: Fri Apr 01, 2011 6:02 pm
by ROUBAL
@GeoPappas: you are right:
As each compression adds new artifacts, if you save an image in jpg after working on it, the uncompressed image will be bigger after each new compression !
I wanted to say "the compressed image will be bigger after each new compression".
I agree that a BMP image will have a constant size, but even though PNG is lossless, a PNG converted from a JPG should be bigger than a PNG converted from a native BMP. At least, that was the case in my own tests.
Re: Texture_Overload
Posted: Fri Apr 01, 2011 6:12 pm
by dave62
GeoPappas wrote:
ROUBAL wrote:
As each compression adds new artifacts, if you save an image in jpg after working on it, the uncompressed image will be bigger after each new compression !
I don't believe that is correct.
The size in memory of an uncompressed image depends on three things: width, height, and color depth (bits per pixel). For example, if an image is 4000 x 3000 with 24 bits/pixel, it will take up 36,000,000 bytes of memory. It doesn't matter if the image is all black, all white, a gradient from black to white, or a mixture of all 16 million colors.
Compressing an image on disk is just used to save disk space. Some compression formats are lossy (some of the image data is discarded during compression), such as JPG, and some are lossless, such as PNG. But when the image is decompressed it will occupy the same amount of memory no matter how it was stored on disk (as long as the color depth in bits per pixel is the same).
I think ROUBAL is somewhat right if we talk about file format conversion. In detail, we have to keep in mind that JPG is not lossless: after decompressing a JPG, a lot of pixels have changed from their "original" values, most of the time in a bad way, so any compression algorithm can't be as effective any more when you save the image again. Conversion between lossless formats probably wouldn't increase the image size.
But of course, loading a JPG or a lossless PNG into Octane wouldn't make any difference to the memory usage.
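An easy way to check the round-trip effect, as a sketch assuming Pillow is installed and some test image "source.bmp" is at hand (file names and the quality setting are placeholders):
Code:
    import os
    from PIL import Image

    src = Image.open("source.bmp").convert("RGB")
    src.save("clean.png")                    # lossless PNG from the clean source
    src.save("roundtrip.jpg", quality=85)    # lossy intermediate

    # Lossless PNG again, but now of the noisier, JPG-damaged pixels.
    Image.open("roundtrip.jpg").save("from_jpg.png")

    print(os.path.getsize("clean.png"), os.path.getsize("from_jpg.png"))
    # The second PNG is usually larger, yet both decode to exactly the
    # same uncompressed size in memory.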
Re: Texture_Overload
Posted: Fri Apr 01, 2011 7:39 pm
by GeoPappas
matej wrote:
EDIT: can anyone point me to a renderer that uses compressed textures and decompresses them on the fly during rendering?
I don't think that you will find one. But that is because 95% of the renderers out there are CPU-based rendering engines, which don't have memory limit issues (when using a 64-bit O/S).
But GPU-based rendering is a different matter. Memory limits do come into focus for many people that are loading complex meshes, or loading lots of textures, or loading highly detailed HDRI backgrounds. This is one of the major complaints against GPU rendering.
So I thought that it might be worth looking into.
There was another topic that discussed trying to bring in the masses from the DAZ world over to Octane, but someone pointed out that many DAZ scenes contain lots of textures that would need gobs of memory. If Octane could get a handle on reducing memory usage, then they might have a much larger customer base (which would bring in more money for development).
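To give a feel for the numbers, a back-of-envelope sketch (plain Python; the scene make-up below is purely hypothetical):
Code:
    # A made-up DAZ-style scene: lots of 4k color maps plus bump maps,
    # compared against the VRAM of a typical card of the time.
    def tex_bytes(w, h, bytes_per_pixel):
        return w * h * bytes_per_pixel

    color_maps = 30 * tex_bytes(4096, 4096, 4)   # RGBA, 8 bits per channel
    bump_maps  = 30 * tex_bytes(2048, 2048, 1)   # single-channel grayscale

    total_mib = (color_maps + bump_maps) / (1024 * 1024)
    print(f"{total_mib:.0f} MiB of textures")    # 2040 MiB, already past a 1.5 GB card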
Re: Texture_Overload
Posted: Fri Apr 01, 2011 8:20 pm
by colorlabs
I did some experiments with VRAM and image size here:
http://www.refractivesoftware.com/forum ... f=9&t=5059
Re: Texture_Overload
Posted: Fri Apr 01, 2011 10:27 pm
by abstrax
To cut a long story short: we don't see a way to use compressed textures in Octane without making it slower than a standard CPU renderer.
Another problem: even if we used compressed textures, people would just throw even larger textures at Octane, which would result in the same problem again. -> You really have to be conscious about your usage of textures and geometry, and I think it will stay like that for quite a while.
Anyway, one thing I really don't understand: just 3-4 years ago almost no PC had more than 2GB of RAM or a 64-bit OS, and yet people were rendering away happily. Why is that not possible anymore?
Cheers,
Marcus