JPEG vs RAW

I do a mixture of photography.

Theatre - RAW + small JPG
Motorsport - Large JPG (max quality) + small JPG (max quality)
Athletics - generally a small or medium JPG on both cards simultaneously. The photo team is usually asked to produce images at between 1 MB and 2 MB (see the sketch at the end of this post).

Athletics - I never see the photos. I hand the cards back to the team leader and they do the captioning, keywording and tagging.

Motorsport & Theatre - initial cull based on the small JPG. I will publish a photo gallery of motorsport during the event, so I use the small JPG for that to speed things up and reduce processing / resizing time, then look at the large JPG back at base.
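
For anyone wondering how you would hit a 1-2 MB delivery window programmatically, here is a minimal sketch using Pillow. The helper name is made up, and this is only an illustration, not how any photo team necessarily does it:

```python
from io import BytesIO
from PIL import Image

def save_within_budget(img, path, min_q=50, max_q=95, max_bytes=2_000_000):
    """Binary-search the JPEG quality setting for the largest file under max_bytes."""
    best = None
    lo, hi = min_q, max_q
    while lo <= hi:
        q = (lo + hi) // 2
        buf = BytesIO()
        img.save(buf, "JPEG", quality=q)
        if buf.tell() > max_bytes:
            hi = q - 1                  # too big: try lower quality
        else:
            best = (q, buf.getvalue())  # fits: remember it, try to keep more quality
            lo = q + 1
    if best is None:
        raise ValueError("cannot fit under max_bytes even at minimum quality")
    q, data = best
    with open(path, "wb") as f:
        f.write(data)
    return q, len(data)

# Usage: quality, size = save_within_budget(Image.open("frame.jpg"), "delivery.jpg")
```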
 
I shoot both. I use the JPEGs for work pics and to scroll through the shots on my laptop before I drop the RAWs into the folder I use for Lightroom. I used to use Pentax cameras and the JPEGs were great, so I didn't bother with RAW much, but after switching to Nikon (D600) the JPEGs aren't as good, so I use RAW for even semi-serious stuff.
 
But they would say that.

OnOne software has been saying for years how their software is blazingly fast. Yeah, right!

Well, this new software is anything but fast - Topaz says it performs MILLIONS of processes on every single pixel - but it works incredibly well.

Because of that I have changed the way I work.

My JPEGs are first changed to 16 bit TIFFs, then put through my Neat Image noise reduction software, then cropped to the final dimensions I want, then put through AI Gigapixel and saved as a 16 bit TIFF.

Then I put them through my HDR program to bring out the details, and finally do a last edit where all the finishing is done before reducing in size and saving as a JPEG.

It may sound like an awful lot of extra work, but a lot of it is done by batching the files and just letting them run while I do other things - sleeping, eating etc.

And for me it works, which is all I care about.
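
As a very rough sketch of what that first step - JPEG to 16 bit TIFF - might look like as code (assuming the Python imageio and tifffile packages; the tools named above are GUI applications, so this is only an illustration):

```python
import numpy as np
import imageio.v3 as iio
import tifffile

def jpeg_to_16bit_tiff(src, dst):
    """Load an 8-bit JPEG and write it out as a 16-bit TIFF.

    Multiplying by 257 maps 0..255 exactly onto 0..65535
    (255 * 257 == 65535), but no new tonal information appears:
    the file still holds only 256 distinct levels per channel.
    """
    img8 = iio.imread(src)                # uint8 array, H x W x 3
    img16 = img8.astype(np.uint16) * 257  # spread the 8-bit levels across 16 bits
    tifffile.imwrite(dst, img16)

jpeg_to_16bit_tiff("photo.jpg", "photo_16bit.tif")
```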
 
 
Oh, and most of this is done inside a 32 GB Windows 7 VM running as a guest on Windows 7, for portability.

Only AI Gigapixel runs on the host machine - a Z600 workstation with two hex-core Xeons, 48 GB RAM, 2 x 500 GB SSDs in RAID 1, and a 256 GB SSD on the PCIe x16 slot, with 80 GB allocated to Windows 7 and the rest used as a scratchpad.

I also use this machine for X-Vids and can easily run 24 VMs with them all rendering - as long as I cap them so the CPUs don't melt down!

The VMs in this case are all 1 GB XP installs created by gutting XP with NLite.
 

This really makes no sense!
If you're going to do all that, why not just use the raw files? It cuts out a step, as they effectively start as 16 bit TIFFs.

Converting a JPG to a 16 bit TIFF doesn't make it a 16 bit file in reality - yes, the file has a 16 bit pixel depth, but that's just an 8 bit file with holes in it!
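
The "holes" are easy to show with a few lines of numpy - a hypothetical snippet, counting how many of the 65,536 possible 16 bit levels an up-converted 8 bit file can actually use:

```python
import numpy as np

levels8 = np.arange(256, dtype=np.uint8)     # every value an 8-bit channel can hold
levels16 = levels8.astype(np.uint16) * 257   # the usual 8-to-16-bit scaling

print(np.unique(levels16).size)  # 256   - still only 256 distinct levels
print(2 ** 16)                   # 65536 possible levels in a native 16-bit file
# 65536 - 256 = 65280 values can never occur: the "holes".
```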
 

I do now use RAW files, but I also save the reduced JPEGs to the cloud and on HDDs and Blu-rays.

For reasons already stated many times, this is the way I have worked for years, and it works for me because, in the end, no matter what you do you still end up with 8 bit JPEGs on your PCs, the Web etc.

I have used this method for years and no one seems to have noticed the "holes".

And as the quality of the software available to us has improved immensely, especially with AI now being used more and more, the difference between RAW and JPEG is, to me at least, unimportant.

If I were a professional photographer I have no doubt that I would use RAW files, but I'm not, so as long as I only have myself to please with my photos I'm quite happy working the way I do.
 

Don't get me wrong, there are aspects of this that make perfect sense. Taking a JPG and converting it to a TIFF (8 bit will suffice), then editing and saving the TIFF as many times as you like, and only converting back to JPG at the very end, is a good approach if all you have is a JPG. The key thing here is that the TIFF is lossless and the JPG is lossy, so every save of a JPG loses a little more fidelity against the original.
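
Generation loss is easy to measure; a rough sketch with Pillow and numpy (the quality setting and generation count are arbitrary assumptions):

```python
import numpy as np
from io import BytesIO
from PIL import Image

img = Image.open("original.jpg").convert("RGB")
reference = np.asarray(img, dtype=np.float64)

for generation in range(1, 21):
    buf = BytesIO()
    img.save(buf, "JPEG", quality=85)     # one lossy save per generation
    buf.seek(0)
    img = Image.open(buf).convert("RGB")  # reopen, as if edited and resaved
    rmse = np.sqrt(np.mean((np.asarray(img, dtype=np.float64) - reference) ** 2))
    print(f"generation {generation:2d}: RMSE vs original = {rmse:.2f}")

# The error grows over successive saves (quickly at first, then more slowly);
# a lossless TIFF intermediate sidesteps this entirely.
```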

I thought you were starting the process with a RAW, converting to JPG and then to TIFF, which would be a terrible idea - that's what brought about the confusion. A poor assumption on my part.

For sure, if the starting point is an old JPG, making the best use of the technology is a great thing. But for newly taken images I'd personally suggest either starting from the RAW file and converting to TIFF if you want to pass the image between various apps, or, if appropriate, working on the SooC JPG in a non-destructive package such as Lightroom, depending on the subject, importance etc.
 
I'd much rather rely on the actual pixels and data captured in a raw file than on some fancy AI trying to second-guess what should and shouldn't be in the image.

I just can't see the logic of converting your 8 bit, compressed JPG into a 16 bit TIFF. Just because you've converted it to 16 bit doesn't mean you suddenly get all that lost information back.

My previous comment regarding OnOne and speed was nothing to do with the speed of your AI Gigasnakeoil software; it was just to say that software companies tell you what you want to hear to sell their wares.
 
That way of working really is stuff and nonsense...

Converting an 8 bit file to 16 bit is really a pointless step; those missing bits are not going to reappear out of the ethernet, or wherever missing bits go when they are discarded...

Then you are adding invented pixels (no matter whether the software is AI or not), tone mapping, then reducing the file in size to save as an 8 bit file again.

I just fail to see any rhyme or reason behind it.
 

Furtim already stated the reason behind it regarding 16 bit TIFFs - and what are you looking at when you look at the BILLIONS of photos displayed on the Web?

JPEGs.

Nothing more need be said!
 
I get the impression the attraction of that workflow is that it is a busy process. Some people prefer busy to quiet excellence.
 

But those JPEGs on the web are the finished image, not the starting point for intense editing and tone mapping.
But whatever works for you.
 
By all means use an 8 bit TIFF; there simply is no reason to use 16 bit, as it will still contain the same 8 bits of colour - 16 bit will add nothing, nor will it reinvent what has been discarded. It also makes no sense to upsize, tone map and then downsize again; simply convert to an 8 bit TIFF, tone map that and save as a JPEG. In fact there probably isn't much point in converting to a TIFF at all - simply tone map your JPEG file; it takes a number of saves before degradation becomes apparent.
 

I have tried 8 bit TIFFs but degradation is VERY apparent from the start - no idea why.
 
There shouldn't be, Peter. Granted, you may buy yourself a very small amount of headroom against banding etc. by going to a 16 bit file, but it ain't going to be a great deal.
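
That headroom is measurable, for what it's worth. A hypothetical numpy comparison - push a smooth gradient through an aggressive curve in 8 bit and in 16 bit, then count how many distinct levels survive in the final 8 bit result:

```python
import numpy as np

# A smooth horizontal gradient, held as 8-bit and as 16-bit data.
grad8 = np.linspace(0, 255, 4096).astype(np.uint8)
grad16 = np.linspace(0, 65535, 4096).astype(np.uint16)

def strong_curve(x, peak):
    """Aggressive brightening (gamma 0.3) - the kind of edit that causes banding."""
    return ((x / peak) ** 0.3 * peak).astype(x.dtype)

out8 = strong_curve(grad8, 255)                                # edited in 8 bit
out16 = (strong_curve(grad16, 65535) // 257).astype(np.uint8)  # edited in 16 bit, then reduced

print(np.unique(out8).size)   # fewer distinct levels -> visible banding
print(np.unique(out16).size)  # more levels survive the 16-bit route
```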
 
Mostly JPG. Anything with lots of pics, like sports and events, goes straight to JPG. I've been getting to know the adjustments possible in camera to fine-tune those, e.g. custom JPG profile and WB / tint adjustments. Switching to JPG has been a massive time saver.

Raw for tricky landscape work or other times when PP will be needed for whatever special reason.
 
I use raw all the time... If I want the image as shot then I do a quick export; if tweaks are needed then I can auto-adjust.
If I want to totally change the image I've got the full raw file there... I can adjust and create as many JPEGs as I wish. Once I've shot in JPEG I cannot create a raw file.
 