Hi everyone,
I am performing simultaneous patch-clamp electrophysiology and widefield, one-photon (1P) voltage imaging. To synchronize the two recording channels, I use NI LabVIEW to send a TTL pulse to my Hamamatsu Fusion BT, starting its sequence acquisition at the same time as my patch-clamp amplifier starts recording.
I have noticed a perplexing artifact in these experiments. Only when I externally trigger the camera to start do I get an artifact in which the mean pixel intensity starts low, gradually increases for 1.5-2 s, then decreases. This occurs even with the lens cap on the camera, when no light should be reaching the sensor.
Here is an example of the mean intensity values for each frame across a 5000-frame recording, taken at 1 kHz with the lens cap on. Internally starting the recording via HCImageLive is on top, whereas externally triggering the start via TTL pulse is on bottom:
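In case anyone wants to reproduce the measurement on their own stacks, this is roughly how I compute the per-frame means (a minimal NumPy sketch; the synthetic dark stack here is only a placeholder for the recorded frames, which I actually load from disk):

```python
import numpy as np

def frame_means(stack):
    """Mean pixel intensity of each frame in a (frames, y, x) stack."""
    stack = np.asarray(stack)
    return stack.reshape(stack.shape[0], -1).mean(axis=1)

# Illustrative use with a synthetic dark stack (in practice, load the
# recorded frames from disk, e.g. with tifffile.imread):
rng = np.random.default_rng(0)
dark = rng.poisson(100, size=(5000, 64, 64)).astype(np.uint16)

means = frame_means(dark)  # one value per frame, plotted against frame index
```

Plotting `means` against frame index is what produces the traces above.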
I have been troubleshooting this for a number of days now, and I am completely stumped as to what could be causing this. Some notes from my troubleshooting:
- The effect is independent of the duration of the TTL pulse (1 ms, 100 ms, and 5000 ms all produce the same effect)
- It is independent of recording from the full sensor or a sub-array
- It is independent of how the images are stored (memory, disk, temporary buffer, etc.)
- The effect is time-dependent, not frame-dependent (the first ~2000 frames are affected when recording at 1 kHz, the first ~150 frames when recording at 90 Hz)
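The time- vs frame-dependence in the last point is just the affected-frame counts converted to seconds at each frame rate; a quick sanity check (using the approximate counts above):

```python
# Approximate affected-frame counts observed at each frame rate (Hz -> frames)
affected = {1000: 2000, 90: 150}

# Convert frame counts to seconds. Both rates give a window of roughly the
# same duration (~1.7-2 s), which suggests a time-dependent settling effect
# rather than a fixed number of bad frames.
durations = {hz: frames / hz for hz, frames in affected.items()}
# e.g. durations[1000] == 2.0 s, durations[90] is about 1.67 s
```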
I would be interested to hear any plausible explanations or, of course, potential fixes. Eager to discuss with folks – thank you for your time!