Flash connected to camera via remote = black exposure vs. perfectly exposed when attached directly to the camera

Here are 2 photos

The first has the flash attached to the camera via a remote trigger (Hähnel Captur).

The second has the flash attached directly to the camera.

In both cases the flash is set to manual 1/1 power and the exposure is set exactly the same on the illuminated panel on the right of the photo.

The only other difference is the flash zoom, set at 35mm on the remote and correctly at 50mm when attached directly.

Edit: I've noticed the shutter was set slightly slower on the remote shot - possibly because the camera defaulted to its sync speed when I attached the flash directly.

Can someone explain why the setup via the remote (flash fired from the remote) exposes a completely black image, while the setup with the flash connected to the camera (fired from the shutter button on the camera) exposes fine?

[Attachments: IMG_5588.jpeg, IMG_5589.jpeg]
 
I've just realised I could probably plug the remote trigger in without it being on the hot shoe to get round this, but even so it would be nice to know why this happens - a sync issue?
 
I'm confused - why do you have the flash on the camera? Are you firing a second flash somewhere else, or is this the only flash you're using?
I don't have this make, but with the ones I have, the transmitter goes on the camera and the receivers go on the flashes off camera somewhere. I've never tried it with a flash on the actual camera as well.
 
Is that a receiver on the camera body with the flash attached?

Where is the trigger?

I think that cable is for triggering a remote camera from a master camera, not to fire your flash on the camera.
 
I do not understand what you are trying to achieve with that strange setup.
A remote is for use with OFF-camera flashes, not those mounted on the camera.

With most recent cameras and flashes, the flash can be used as a master when mounted directly on the camera, and a remote controller is not needed at all to fire other flashes. The flash on the camera can be set to fire itself or not, as required; if it does not fire, it acts as a controller only.
Alternatively the remote unit can be used as the master, controlling any off camera flashes, when it is mounted on the camera in place of a master flash.

Earlier remote sync units could only fire the off camera flashes, not control their power or the camera settings.
 
Just had a thought - what camera are you using? Some cameras don't have a "standard" hot shoe. While they will fire their compatible flashguns, they may not work with a third-party flash or trigger. I think Sony and some mirrorless cameras fall into this set-up, not sure.
 
If I've got this right, the remote is wanted to trigger the camera, not the flash - as per the OP's second post, which will have solved the problem.
 
We need to wait to see what he is using and what he is trying to achieve.
 
Am I correct in thinking that you are trying to fire your camera and flash remotely? i.e. you want to hold the transmitter and when you press that, take the picture with flash?

I guess that it does not work when you put the flash in the remote's hot shoe because, in that configuration, the flash fires at precisely the moment it receives the signal from the transmitter. There is no delay to allow for the time it takes the camera to activate the shutter - for example, to focus or possibly to try and communicate with the flash.
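
If it helps to picture it, here is a toy timeline in Python - the 60 ms shutter-lag figure is an assumption for illustration, not a measurement of this camera or the Captur:

SHUTTER_LAG_MS = 60.0  # assumed time from the trigger signal to the shutter being fully open

def frame_catches_flash(flash_fires_at_ms, shutter_opens_at_ms):
    # The sensor only records the burst if it happens once the shutter is open.
    return flash_fires_at_ms >= shutter_opens_at_ms

# Flash in the receiver's hot shoe: it fires the instant the radio signal lands (t = 0 ms),
# while the camera is still working through focusing and shutter lag.
print(frame_catches_flash(0.0, SHUTTER_LAG_MS))              # False -> black frame

# Flash on the camera's hot shoe: the camera itself fires it once the shutter is open.
print(frame_catches_flash(SHUTTER_LAG_MS, SHUTTER_LAG_MS))   # True  -> properly exposed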
 
Tim - I think you have nailed it there.

This is exactly what I am trying to do: a remote set-up with a flash attached that I can then trigger remotely.

When you say it, it sounds obvious now... obviously the flash is not synced to the camera because it's in manual mode and fires sooner than the shutter.

Out of interest, what stops this happening if I were to set up my remote with only a flash on it and not attached to the camera at all?

I.e. fire my camera with the transmitter attached, so the signal is sent when the shutter button is pressed, the remote flash fires and the shutter takes the shot - how is this different from the above method? In effect I am doing the same thing, but it works.
 
With everything set to full manual (camera and flash) I don't see a reason why it wouldn't work... there is no need for communication between the camera/flash or syncing required. There is a chance that the receiver cannot activate two triggers simultaneously... i.e. when a cable is plugged in it disables the hotshoe passthrough... IDK; I've never tried it and I don't have those specific receivers. If you are seeing the flash fire, then the only probability I can come up with is the camera confirming focus before firing. You could try placing the camera/lens into manual focus mode or release priority to see if that makes a difference.

Your best bet is to have the receiver plugged into the camera (hanging) and the flash in the camera hotshoe. That way the remote trips the camera and the camera trips the flash... which I'm guessing is what the second example is of. Also note that the second example would likely have put the flash into HSS (1/250); and the first example could be at the limits of wireless sync (1/160) with that setup, but I kind of doubt it.
 
Thanks SK66 - I will be doing what you suggested in your second paragraph. Figured it out as a workaround myself, lying in bed this morning!
 
It's not a 'workaround', it's the correct way to make it work.

A flash connected to a remote trigger will fire as soon as the signal lands. A camera connected to a remote also fires 'straight away', but that is a bit of a misnomer: first it checks whether it's focussed or whether it needs to ignore that signal, then it opens the shutter; once the first curtain has travelled the height of the sensor, that's the time for the flash to fire; then the second curtain starts moving to close the shutter. It sounds like a lot of work it's doing once you describe the process.
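
Put roughly as a sequence (the order comes from the description above; the timings are placeholders, not real measurements):

# Rough order of events after the remote release signal reaches the camera
sequence_ms = [
    (0,  "check focus / decide whether to honour the release signal"),
    (50, "first curtain starts to travel"),
    (54, "first curtain has crossed the sensor -> flash is told to fire"),
    (58, "second curtain starts moving to close the shutter"),
]
for t, event in sequence_ms:
    print(f"t ~ {t:2d} ms: {event}")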
 
Thanks all - you'll have to bear with me, I am still getting my head round it all!!!
 
Ordinarily A) the camera fires, B) it sends a TTL signal to the remote flash transmitter, and C) a short fraction of a second later the remote receiver gets that signal and triggers the flash to fire.

With SOME remote flash triggers, there is a propagation delay between B and C, resulting in the need to SLOW the shutter from its usual X-sync speed by about 1/3 EV, so a camera with X-sync at 1/250 has to use 1/200 instead.
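
As a quick worked example of that 1/3 EV slowdown (a sketch only - the exact safe speed depends on the particular trigger and camera):

x_sync = 1 / 250                   # camera's native X-sync speed, in seconds
slowed = x_sync * 2 ** (1 / 3)     # roughly a third of a stop longer, to absorb the trigger delay
print(f"1/{1 / x_sync:.0f} s -> about 1/{1 / slowed:.0f} s")   # 1/250 -> ~1/198, i.e. use 1/200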

With your setup - trigger transmitter in hand - the trigger fires the flash, perhaps before the camera's shutter is open, resulting in a blank frame; but when the flash is mounted on the camera, the flash is not triggered until the camera's shutter opens, so you get a good photo.

To trigger the camera with a remote AND have the flash go off successfully, you need TWO trigger-receiver pairs (sketched below):
  1. Trigger A in hand sends a signal to Receiver A, firing the camera.
  2. Trigger B on the camera sends the TTL signal when the camera shutter opens; Receiver B picks it up and triggers the flash.
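
Sketching that chain (the A/B labels are just the ones from the list above):

# Two independent radio links: one to fire the camera, one fired BY the camera to fire the flash
chain = [
    ("Trigger A (in hand)",         "Receiver A (on the camera) - fires the shutter"),
    ("Trigger B (camera hot shoe)", "Receiver B (on the flash)  - fires the flash once the shutter is open"),
]
for sender, receiver in chain:
    print(f"{sender} --radio--> {receiver}")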
 