Manual exposure control for raspistill #76
This is on my list of stuff to look at. If I remember the code correctly (and it's a nightmarish and very large bit of code to handle all the exposure modes), this won't be a simple thing to do.
If there were some command to replace the 1/100 s that is used for --exposure=off, then that would probably be enough. (But you know the code, and I can see how that might screw up the timing for other modes, e.g. if that parameter is also the initial guess that is used in autoexposure.)
Turned out not to be so bad. I've made exposure a command line parameter; it's currently going through merging, but because it needs changes on the GPU and ARM side, it will take a few days to get through the system. There is an issue with exposures > about 1/3rd of a second - they lock the GPU. This was not introduced by this change, but has always been there. I'm investigating, but so far it looks like a driver or perhaps even a sensor problem. I've contacted Omnivision to see if they can help, because everything in the register setting looks right to me!
As an astronomer, I am quite interested in exposures > 1/3 second, so I thank you for pushing this through. (There are limits due to dark current, but modern CMOS imagers should be able to handle seconds of exposure even at room temperature and above, and I am willing to strap an ice-cube to the sensor to get longer exposures.)
I've checked the rows and suchlike - there seems to be no obvious reason. But it is a specific point at which it fails, so they should be able to either …

On 2 October 2013 23:55, David Palmer notifications@github.com wrote:
@dmopalmer, just curious, how do you interface the Pi's camera to your telescope? Do you have any blog postings that detail how you are doing that?
Added option to set the exposure/shutter time. Needs a change on the GPU binary (Brcm ref: 425060) and a consequent change to userland/interface/mmal/mmal_parameters_camera.h. There is currently a limitation of about 350ms, beyond which the camera will lock (depending on other settings). This is being investigated, but is unrelated to this change, i.e. you can get the fault in other ways, not just by setting the shutter speed.
@HeatfanJohn, I will be unscrewing the lens from the camera module, and 3D printing a mount that has a tube that goes into the eyepiece tube. The pixel size on the camera module is a good enough fit for the prime focus of my 5" Celestron. Eyepiece projection would allow more flexibility, but I may as well start easy. I will eventually also print up something that holds the Pi and a battery to give a single unit that works as a web-enabled eyepiece (using my piCamServer) so I can view it from an iPad without any cables.
Hi David, thanks! Davide

On Oct 5, 2013, at 3:35 AM, David Palmer notifications@github.com wrote:
When I start printing the mount, I will put it on Thingiverse and/or elsewhere (most likely on GitHub along with any astronomical imaging software I develop for it; lucky imaging is one of the things I want to try with it). I use OpenSCAD for 3D design, so I will make the draw tube diameter a parameter (0.965" on mine, but 1.25" and 2" are common, and parameterization means that it doesn't have to be a common size). If I get fancy with internal baffles I can set the minimum f-number as an additional parameter. Do you have a specific optical configuration I should try to make it compatible with?

(Some of this is that when you have a hammer, everything looks like a nail. If I didn't have a 3D printer I'd put something together with tubing from the hardware store and epoxy in an hour. If I had a machine shop and the appropriate skills I'd use a lathe and be done in an afternoon, unless it had a CNC mill, in which case I could spend weeks getting it right.) But I'm not ready to take the lens off the camera module yet, because it's useful for debugging to have simple optics so I can just stand in front of the camera and wave my arms to test things out.
@HeatfanJohn @dfrey The field of view is quite small, as you would expect from the size of the imager chip. The Thingiverse picture of the raven head filling the frame was shot from ~40 meters away on a 5" telescope. (The Moon was shot from 10 million times further away and it was still much bigger than the FOV. From this we reach the scientific conclusion that Raven could not steal the Moon--unless he was very tricky.) Whenever the exposure changes get folded into the closed-source component of the Pi, I can start doing astronomy.
Hi James, any news/feedback about this problem from Omnivision?
I have info from Omnivision but have not got the time to investigate the issue. It should work...
I modded my Pi for astronomy too, so it would be great to get long exposures integrated! Any news yet? :P
Got it up to about 2s, but failed to get it working with anything longer. Because of the way things work, over 2s needs a change to the driver to work in a different way, and I'm not sure the layers above can cope with it. No more time at the moment to spend on this.
Could it be possible to open-source the whole thing? Then others could try :P
See: http://forum.xbmc.org/showthread.php?tid=169674&pid=1560992#pid1560992
firmware: fix for emmc_pll_core combined with avoid_pwm_pll getting wrong emmc freq
firmware: alsa drop samples on stop when requested
See: raspberrypi/linux#320
firmware: camera: fix for long exposure times causing lockup
See: raspberrypi/userland#33
See: raspberrypi/userland#76
userland: raspivid: Segmentation option for multiple files generated from a stream
See: raspberrypi/userland#123
No, I'm afraid all this code is on the GPU, which is closed source. It's …

On 27 November 2013 22:40, coolcrab notifications@github.com wrote:
Perhaps it would be possible if some frame averaging could be done in GPU memory, in steps of 2 seconds.
Did the 2-second support make it into the source? The docs still say 330000 microseconds, but I didn't see a range check in the code.
I don't think the 2-second exposure change made it into the release, as he indicated a change to the driver as well above. I would love to fork that base for us astronerds! 2 seconds is a pretty fair improvement over 1 second when it comes to planetary work and getting ISOs trimmed.
@dmopalmer @jollyjollyjolly is this still an issue?
I have not looked at the source code recently, but as of raspistill v1.38, reading #151, I see that the correct option is '--shutter'. When I use --shutter 5000000 I get an image with blown-out exposure, and the EXIF indicates a 5 s exposure time, so it seems to be working. It takes 36 s to actually get a picture this way; presumably the system is trying to find a way to make the exposure look good. I haven't read the docs recently, so I don't know if there is a way to tell raspistill to 'set the camera up like so, then just take the picture.' But anyway, it does not seem to be an issue.
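For the "set the camera up like so, then just take the picture" use case, a minimal sketch of a fully manual invocation, assuming raspistill v1.38+ with the --shutter option (the specific ISO, exposure, and AWB values here are illustrative choices, not from the thread):

```python
import subprocess

def manual_capture_cmd(output, shutter_us, iso=100):
    """Build a raspistill command line for a manual capture.

    --exposure off and --awb off pin the auto-exposure and auto
    white balance so the requested shutter time is used as-is;
    -n skips the preview window and -t 1 captures almost immediately.
    """
    return [
        "raspistill",
        "-o", output,
        "--shutter", str(shutter_us),  # exposure time in microseconds
        "--ISO", str(iso),
        "--exposure", "off",           # disable auto-exposure gains
        "--awb", "off",                # disable auto white balance
        "-n",                          # no preview window
        "-t", "1",                     # take the picture after ~1 ms
    ]

cmd = manual_capture_cmd("long.jpg", 5_000_000)  # 5 s exposure
# On a Pi with the camera attached one would run:
# subprocess.run(cmd, check=True)
print(" ".join(cmd))
```

Whether this actually avoids the long auto-exposure settling time described above would need testing on real hardware.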
Hi, I noticed that the whole capture takes a long time with -ss 6000000 (the max value), and I don't know when the camera actually starts recording. As far as I can see, the camera waits 6 seconds to capture something, then takes another 6-second capture (this image is shown in the preview on an HDMI monitor), then waits another 6 seconds and starts recording the final image (another 6 seconds). Is there any way to know when the camera starts recording the final image?
There are some threads on the forum which cover exactly this topic of using …

On 28 August 2015 at 06:56, tejonbiker notifications@github.com wrote:
I'm still having the issue that the camera takes ~40 seconds for a 6-second exposure. Does anybody know how to fix this? @JamesH65, do you know the status? Thanks in advance!
@6by9 Cannot remember what we suggest as a solution for this.
It's almost certainly just mode switching, so pretty much inherent. 40 seconds sounds high though - I thought it was around 24. Two constraints:

On starting raspistill, regardless of the -t setting, it starts the preview and requests a frame. Both the corrupt frame and the preview frame will require the exposure time of 6 seconds, so T = 12 s now. Assuming the capture has been requested by 12 s, the sensor gets stopped, reprogrammed for the capture mode, and restarted. Starting the sensor means dropping a frame, so another 12 s. If burst mode hasn't been requested, then as soon as the capture is completed it will mode-switch back to preview and request a preview frame. I thought that shutting down would abort that frame, but I suspect it may still complete it, taking another 12 seconds. There are a couple of things that could be checked to improve this, but it's not a priority at the moment.
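The frame accounting described above can be sketched as a rough model. This is just my reading of the comment, not an official formula; it assumes each phase costs two full-length frames and that burst mode skips the final return-to-preview frames:

```python
def capture_wall_time(exposure_s, burst=False):
    """Rough wall-clock estimate for one raspistill still capture,
    per the frame accounting described in the comment above."""
    startup = 2 * exposure_s   # corrupt frame + first preview frame
    capture = 2 * exposure_s   # dropped frame on sensor restart + the capture
    # Without burst mode, the camera mode-switches back to preview and may
    # complete one more preview cycle before shutdown (two more frames).
    teardown = 0 if burst else 2 * exposure_s
    return startup + capture + teardown

print(capture_wall_time(6))              # 36, close to the ~40 s reported
print(capture_wall_time(6, burst=True))  # 24, matching the "around 24" estimate
```

The gap between the modeled 36 s and the reported ~40 s would be per-frame overheads (readout, reprogramming) that this sketch ignores.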
It works!!! Thank you very much. I forgot to put … now it works!
This is a request to make the exposure time (shutter speed) an exposed parameter in the API (or if it is already, to document how to use it.)
I would like to set the shutter speed for raspistill. I want to do technical imaging where I can control what the camera does (and, secondarily, make it do it with as little lag as possible).
If I set --exposure=off then it always takes a 1/100 s image (according to the EXIF).
I have looked through the mmal_parameters_camera.h and don't see any way to set the exposure time, unless it is disguised as e.g. MMAL_PARAMETER_FRAME_RATE, MMAL_PARAMETER_FPS_RANGE, etc.
Since the Raspberry Pi's camera firmware is not open source, I can't go in and see how it is setting the shutter speed.
This capability would be useful for technical imaging and any other purpose where the data is more important than the aesthetics.