In fact, according to Topaz - THEY DO!
But they would say that.
OnOne software has been saying for years how their software is blazingly fast. Yeah right!
Well, this new software is anything but fast, which isn't surprising since Topaz says it performs MILLIONS of processes on every single pixel - but it works incredibly well.
Because of that I have changed the way I work.
My JPEGs are first converted to 16 bit TIFFs, then put through my Neat Image noise reduction software, then cropped to the final dimensions I want, then put through AI Gigapixel and saved as a 16 bit TIFF.
Then I put them through my HDR program to bring out the details, and finally there is a last edit where all the finishing is done before reducing in size and saving as a JPEG.
It may sound like an awful lot of extra work, but much of it is done by batching the files and just letting them run while I do other things - sleeping, eating etc.
And for me it works which is all I care about.
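For what it's worth, here is a minimal sketch of the first step of that batch workflow - the JPEG to 16 bit TIFF conversion - assuming Python with numpy, imageio and tifffile; the folder names are hypothetical stand-ins, and the rest of the chain (Neat Image, AI Gigapixel, the HDR pass) is separate software not shown here.

```python
# Sketch: batch-convert a folder of JPEGs to 16 bit TIFFs
# (the first step of the workflow described above).
# Assumes numpy, imageio and tifffile; the folder names are hypothetical.
from pathlib import Path

import numpy as np
import imageio.v3 as iio
import tifffile

src = Path("jpegs_in")    # hypothetical input folder
dst = Path("tiffs_out")   # hypothetical output folder
dst.mkdir(exist_ok=True)

for jpg in sorted(src.glob("*.jpg")):
    img8 = iio.imread(jpg)                 # uint8 array, shape (H, W, 3)
    img16 = img8.astype(np.uint16) * 257   # 0-255 -> 0-65535 (255 * 257 = 65535)
    tifffile.imwrite(dst / (jpg.stem + ".tif"), img16)
    print(f"{jpg.name}: written as 16 bit TIFF")
```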
.
This really makes no sense!
If you're going to do all that, why not just use the raw files? It would cut out a step, as they effectively start as 16 bit TIFFs.
Converting a JPG to a 16 bit TIFF doesn't make it a 16 bit file in reality - yes, the file has a 16 bit pixel depth, but that's just an 8 bit file with holes in it!
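That "holes" point is easy to show with a few lines of Python (my sketch, assuming numpy): scaling an 8 bit channel up to 16 bit gives you 16 bit numbers, but still only 256 distinct levels, spaced 257 codes apart.

```python
import numpy as np

chan8 = np.arange(256, dtype=np.uint8)   # every value an 8 bit channel can hold
chan16 = chan8.astype(np.uint16) * 257   # the usual 8 -> 16 bit scaling

print(np.unique(chan16).size)            # 256 - still only 256 levels out of 65536
print(np.diff(np.unique(chan16)).min())  # 257 - the gaps ("holes") between levels
```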
I do now use RAW files but also save the reduced JPEGs to the cloud and on HDDs and Blu-Rays.
For reasons already stated many times, this is the way I have worked for years, and it works for me because, in the end, no matter what you do you still end up with 8 bit JPEGs on your PC, the Web etc.
I have used this method for years and no one seems to have noticed that there are missing "holes".
And as the quality of the software available to us has improved immensely, especially with AI now beginning to be used more and more, the difference between RAW and JPEGs is, to me at least, unimportant.
If I were a professional photographer I have no doubt that I would use RAW files, but I'm not, so as long as I really only have myself to please with my photos I'm quite happy working the way I do.
.
That way of working really is stuff and nonsense...
Converting an 8bit file to 16bit is really a pointless step; those missing bits are not going to reappear out of the ethernet, or wherever missing bits go when they are discarded...
Then you are adding invented pixels (no matter whether the software is AI or not), tone mapping, and then reducing the file in size to save as an 8bit file again.
I just fail to see any rhyme or reason behind it.
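To put a rough number on the upsize-then-downsize objection, here is a sketch of the round trip (mine, using plain Lanczos resampling via Pillow as a stand-in for the AI upscaler, which will of course behave differently; the file name is hypothetical):

```python
import numpy as np
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")    # hypothetical input file
w, h = img.size

up = img.resize((w * 2, h * 2), Image.LANCZOS)  # stand-in for the AI upscale
down = up.resize((w, h), Image.LANCZOS)         # the later reduction back to size

# How far from the original did the round trip land us?
err = np.abs(np.asarray(img, np.int16) - np.asarray(down, np.int16)).mean()
print(f"mean abs difference after the round trip: {err:.2f} / 255")
```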
Furtim already stated the reason behind it regarding 16 bit TIFFs. And what are you looking at when you look at the BILLIONS of photos displayed on the Web?
JPEGs.
Nothing more need be said!
.
I get the impression it is because it is a busy process. Some people prefer busy to quiet excellence.
By all means use an 8bit TIFF; there simply is no reason to use 16bit, as it will still contain the same 8 bits of colour - 16 bits will add nothing, nor will it reinvent what has been discarded. It also makes no sense to upsize, tone map and then downsize again; simply convert to an 8bit TIFF, tone map that and save as a JPEG. In fact there probably isn't much point in converting to a TIFF at all - simply tone map your JPEG file; it takes a number of saves before degradation becomes apparent.
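That last claim is easy to test; here is a small sketch (mine, assuming Pillow and numpy, with quality 90 as an assumed typical setting) that resaves a JPEG repeatedly in memory and tracks how far each generation drifts from the first decode:

```python
from io import BytesIO

import numpy as np
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")   # hypothetical input file
first = np.asarray(img, dtype=np.int16)

for gen in range(1, 11):
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=90)   # quality 90 is an assumed setting
    buf.seek(0)
    img = Image.open(buf).convert("RGB")       # decode the freshly saved copy
    drift = np.abs(np.asarray(img, dtype=np.int16) - first).mean()
    print(f"save {gen}: mean abs drift from the original {drift:.2f}")
```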
.
There shouldn't be, Peter; granted, you may buy yourself a very small amount of headroom against banding etc. by going to a 16bit file, but it ain't going to be a great deal.
.
I have tried 8 bit TIFFs but degradation is VERY apparent from the start - no idea why.
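On the banding headroom point above, a rough sketch (mine, with an assumed strong tone curve; numpy only) of why a 16 bit intermediate buys a little room: pushing a smooth gradient through the same curve leaves noticeably fewer distinct levels - the recipe for visible banding - when the intermediate step is rounded to 8 bits rather than 16.

```python
import numpy as np


def curve(x):
    return x ** 2.2   # an assumed, fairly strong tone curve on a 0..1 scale


grad = np.linspace(0.0, 1.0, 4096)   # a smooth gradient across the full range

# 8 bit path: round to 8 bits before the curve, then again at the end
out8 = np.round(curve(np.round(grad * 255) / 255) * 255)

# 16 bit path: round to 16 bits in the middle, to 8 bits only at the end
out16 = np.round(curve(np.round(grad * 65535) / 65535) * 255)

print(np.unique(out8).size, np.unique(out16).size)  # fewer levels on the 8 bit path
```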
.