
Manual exposure control for raspistill #76

Closed
dmopalmer opened this issue Aug 28, 2013 · 28 comments
@dmopalmer
Contributor

This is a request to make the exposure time (shutter speed) an exposed parameter in the API (or, if it already is, to document how to use it).

I would like to set the shutter speed for raspistill. I want to do technical imaging where I can control what the camera does (and, secondarily, make it do it with as little lag as possible).

If I set --exposure=off then it always takes a 1/100 s image (according to the EXIF).

I have looked through the mmal_parameters_camera.h and don't see any way to set the exposure time, unless it is disguised as e.g. MMAL_PARAMETER_FRAME_RATE, MMAL_PARAMETER_FPS_RANGE, etc.

Since the Raspberry Pi is not an open-source project, I can't go in and see how it is setting the shutter speed.

This capability would be useful for, e.g.:

  1. HDR imaging
  2. Astronomical imaging
  3. Structured light scanning
  4. Photogrammetry
    and any other purpose where the data is more important than the aesthetics.
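The HDR use case above amounts to a bracketed series of fixed shutter times. A minimal sketch of such a bracket, computing shutter values in microseconds one stop (2x) apart; the `--shutter` flag name is taken from later in this thread, and the capture line is left commented out:

```shell
#!/bin/sh
# HDR bracket sketch: shutter times in microseconds, one stop apart,
# centred on a 10 ms base exposure.
base_us=10000
for mult in 1 2 4 8 16; do
  us=$((base_us * mult))
  echo "$us"
  # raspistill -o "hdr_${us}.jpg" --shutter "$us"
done
```

Each step doubles the exposure, so the five captures span four stops around the base value.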
@JamesH65
Collaborator

This is on my list of stuff to look at. If I remember the code correctly (and it's a nightmarish and very large bit of code to handle all the exposure modes), this won't be a simple thing to do.

@ghost ghost assigned JamesH65 Aug 28, 2013
@dmopalmer
Contributor Author

If there were some command to replace the 1/100 s that is used for --exposure=off, then that would probably be enough.

(But you know the code and I can see how that might screw up the timing for other modes, e.g. if that parameter is also the initial guess that is used in autoexposure.)

@JamesH65
Collaborator

JamesH65 commented Oct 2, 2013

Turned out not to be so bad. I've made exposure a command line parameter; it's currently going through merging, but because it needs changes on the GPU and ARM side, it will take a few days to get through the system.

There is an issue with exposures > about 1/3rd of a second - they lock the GPU. This was not introduced by this change, but has always been there. I'm investigating, but so far it looks like a driver or perhaps even a sensor problem. I've contacted Omnivision to see if they can help, because everything in the register settings looks right to me!

@dmopalmer
Contributor Author

As an astronomer, I am quite interested in exposures > 1/3 second, so I thank you for pushing this through.
Is the exposure time given in e.g. row times, and the problem occurs at a 2^n boundary, or is it something more random?

(There are limits due to dark current, but modern CMOS imagers should be able to handle seconds of exposure even at room+ temperatures, and I am willing to strap an ice-cube to the sensor to get longer exposures.)

@JamesH65
Collaborator

JamesH65 commented Oct 3, 2013

I've checked the rows and suchlike - there seems to be no obvious reason why it fails with the particular value it does - no boundary or similar - hence requesting help from Omnivision. Unfortunately the FAE I work with is a bit busy at the moment.

But it fails at a specific point, so they should be able to either replicate it or not fairly quickly.

@HeatfanJohn

@dmopalmer, just curious, how do you interface the Pi's camera to your telescope? Do you have any blog postings that detail how you are doing that?

JamesH65 added a commit to JamesH65/userland that referenced this issue Oct 4, 2013
Added  option to set the exposure/shutter time.

Needs a change on GPU binary, Brcm ref: 425060 and
a consequent change to
userland/interface/mmal/mmal_parameters_camera.h

There is currently a limitation of about 350ms beyond which
the camera will lock (depending on other settings). This
is being investigated, but is unrelated to this change
i.e. You can get the fault in other ways, not just by setting
the shutter speed.
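The new option takes its value in microseconds, so the 1/100 s default mentioned earlier converts as below. A sketch only; the `--shutter` flag spelling is confirmed later in the thread, and the invocation is left commented out:

```shell
#!/bin/sh
# Convert a fractional shutter speed to the microsecond value the new
# option expects: 1/100 s -> 10000 us.
shutter_us=$((1000000 / 100))
echo "$shutter_us"
# Example invocation (not run here):
# raspistill -o capture.jpg --shutter "$shutter_us"
```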
@dmopalmer
Contributor Author

@HeatfanJohn, I will be unscrewing the lens from the camera module, and 3d printing a mount that has a tube that goes into the eyepiece tube. The pixel size on the camera module is a good enough fit for the prime focus of my 5" Celestron. Eyepiece projection would allow more flexibility, but I may as well start easy.

I will eventually also print up something that holds the Pi and a battery to give a single unit that works as a web-enabled eyepiece (using my piCamServer), so I can view it from an iPad without any cables.

@dfrey

dfrey commented Oct 5, 2013

Hi David,
I am really interested in your plan for prime focus. Are you planning to share your plans for the mount?

Thanks

Davide

@dmopalmer
Contributor Author

When I start printing the mount, I will put it on Thingiverse and/or elsewhere (most likely on github along with any astronomical imaging software I develop for it. Lucky imaging is one of the things I want to try with it.)

I use OpenSCAD for 3D design, so I will make the draw tube diameter a parameter (0.965" on mine, but 1.25" and 2" are common, and parameterization means it doesn't have to be a common size). If I get fancy with internal baffles, I can set the minimum f-number as an additional parameter. Do you have a specific optical configuration I should try to make it compatible with?

(Some of this is that when you have a hammer, everything looks like a nail. If I didn't have a 3D printer I'd put something together with tubing from the hardware store and epoxy in an hour. If I had a machine shop and the appropriate skills I'd use a lathe and be done in an afternoon, unless it had a CNC mill in which case I could spend weeks getting it right.)

But I'm not ready to take the lens off the camera module yet because it's useful for debugging to have simple optics so I can just stand in front of the camera and wave my arms to test things out.

JamesH65 added a commit to JamesH65/userland that referenced this issue Oct 14, 2013
JamesH65 added a commit to JamesH65/userland that referenced this issue Oct 18, 2013
popcornmix added a commit that referenced this issue Oct 20, 2013
@dmopalmer
Contributor Author

@HeatfanJohn @dfrey
I have just released the design for the telescope interface to GitHub (its official home) and Thingiverse (which I consider a convenience copy, with a few more pictures).
https://github.com/dmopalmer/PiPiece
http://www.thingiverse.com/thing:181310

The field of view is quite small, as you would expect from the size of the imager chip. The Thingiverse picture of the raven head filling the frame was shot from ~40 meters away on a 5" telescope. (The Moon was shot from 10 million times further away and it was still much bigger than the FOV. From this we reach the scientific conclusion that Raven could not steal the Moon--unless he was very tricky.)

Whenever the exposure changes get folded into the closed-source component of the Pi, I can start doing astronomy.

@maxhal

maxhal commented Nov 17, 2013

Hi James, any news/feedback about this problem from omnivision?

@JamesH65
Collaborator

I have info from Omnivision but have not had the time to investigate the issue. It should work...

@coolcrab

I modded my Pi for astronomy too, so it would be great to have long exposures integrated!
So much potential and pretty pictures with this thing.

Any news yet? :P

@JamesH65
Collaborator

Got it up to about 2s, but failed to get it working with anything longer. Because of the way things work, over 2s needs a change to the driver to work in a different way, and I'm not sure the layers above can cope with it. No more time at the moment to spend on this.

@coolcrab

Could it be possible to opensource the whole thing? Then others could try :P
There is no real rush, but I'd love to see this happen.

popcornmix pushed a commit to raspberrypi/firmware that referenced this issue Nov 27, 2013
See: http://forum.xbmc.org/showthread.php?tid=169674&pid=1560992#pid1560992

firmware: fix for emmc_pll_core combined with avoid_pwm_pll getting wrong emmc freq

firmware: alsa drop samples on stop when requested
See: raspberrypi/linux#320

firmware: camera: fix for long exposure times causing lockup
See: raspberrypi/userland#33
See: raspberrypi/userland#76

userland: raspivid: Segmentation option for multiple files generated from a stream
See: raspberrypi/userland#123
popcornmix pushed a commit to Hexxeh/rpi-firmware that referenced this issue Nov 27, 2013
@JamesH65
Collaborator

No, I'm afraid all this code is on the GPU, which is closed source. It's also horribly complicated.

@astrorafael

Perhaps it would be possible if some frame averaging could be done in GPU memory in steps of 2 secs.

@jdunmire

Did the 2-second support make it into the source? The docs still say 330000 uSeconds, but I didn't see a range check in the code.

@digitalspaceport

I don't think the 2 sec exposure change made it into the release, as he indicated a change to the driver as well above. I would love to fork that base for us astronerds! 2 seconds is a pretty fair improvement over 1 sec when it comes to planetary work and getting ISOs trimmed.

@Ruffio

Ruffio commented Jun 27, 2015

@dmopalmer @jollyjollyjolly is this still an issue?

@dmopalmer
Contributor Author

I have not looked at the source code recently, but as of raspistill v.1.38
$ raspistill -o /tmp/foo.png --exposure verylong
gives a 1/15 s autoexposure, even when the camera is in a light-tight box

Reading #151 I see that the correct option is '--shutter' . When I use --shutter 5000000 I get an image with blown-out exposure and the EXIF indicates a 5 s exposure time, so it seems to be working.

It takes 36 s to actually get a picture this way; presumably the system is trying to find a way to make the exposure look good. I haven't read the docs recently, so I don't know if there is a way to tell raspistill to 'set the camera up like so, then just take the picture.'

But anyway, it does not seem to be an issue.

@tejonbiker

Hi, I noticed how long the whole capture takes with -ss at 6000000 (the max value), but I don't know when the camera starts to record. As far as I can see, the camera waits 6 seconds to capture something, then takes another 6-second capture (this image is shown in the preview on an HDMI monitor), then waits another 6 seconds and starts to record the final image (another 6 seconds). Is there any way to know when the camera starts to record the final image?

@JamesH65
Collaborator

There are some threads on the forum which cover exactly this topic of using long exposure times.

@Ruffio

Ruffio commented Jun 27, 2016

@dmopalmer @jollyjollyjolly is this still an issue?

neuschaefer pushed a commit to neuschaefer/raspi-binary-firmware that referenced this issue Feb 27, 2017
@bbozon

bbozon commented Sep 13, 2017

I'm still having the issue that the camera takes ~40 seconds for a 6 second exposure. Does anybody know how to fix this?

@JamesH65 , do you know the status?

Thanks in advance!!!

@JamesH65
Collaborator

@6by9 Cannot remember what we suggest as a solution for this.

@6by9
Contributor

6by9 commented Sep 13, 2017

It's almost certainly just mode switching, so pretty much inherent. 40 seconds sounds high though - I thought it was around 24.

Two constraints:

  • whenever the sensor starts streaming it produces one corrupted frame.
  • whenever the GPU has requested a frame, it completes it.

In starting raspistill, regardless of the -t setting, it starts preview and requests a frame. Both the corrupt frame and preview frame will require the exposure time of 6 seconds. T=12secs now.

Assuming the capture has been requested by 12secs, then the sensor gets stopped, reprogrammed for the capture mode, and restarted. Starting the sensor means dropping a frame, so another 12 secs.

If burst mode hasn't been requested, then as soon as the capture is completed it will mode switch back to preview and request a preview frame. I thought that shutting down would abort that frame, but I suspect that it may still complete it, taking another 12 seconds.

There are a couple of things that could be checked to improve this, but it's not a priority at the moment.
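The constraints above account for the long round trip: each sensor (re)start costs one dropped frame plus the requested frame. A sketch of the arithmetic for a 6 s shutter, which lands close to the ~36-40 s users report:

```shell
#!/bin/sh
# Timing breakdown for a 6 s shutter, per the two constraints above:
# every sensor start produces one corrupted (dropped) frame, and every
# requested frame runs to completion.
exp=6
preview_start=$((2 * exp))    # corrupt frame + first preview frame
capture=$((2 * exp))          # dropped frame on mode switch + the capture
back_to_preview=$((2 * exp))  # without burst mode: one more restart
total=$((preview_start + capture + back_to_preview))
echo "$total"                 # 36
```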

@bbozon

bbozon commented Sep 13, 2017

It works!!! Thank you very much. I forgot to put
-bm
-ex off
-tl 0

Now it works!
Sorry for the trouble...
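For reference, the flags from this exchange combine into a single invocation. This sketch only assembles the command string (shutter in microseconds, burst mode, auto-exposure off, timelapse 0) without running the camera:

```shell
#!/bin/sh
# Assemble the working long-exposure invocation from this thread:
# -ss shutter in microseconds, -bm burst mode (avoids the mode switch
# back to preview, per the explanation above), -ex off (auto-exposure
# off), -tl 0 (no timelapse delay).
shutter_us=6000000
cmd="raspistill -o long_exposure.jpg -ss $shutter_us -bm -ex off -tl 0"
echo "$cmd"
```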
