Recent Posts

Pages: 1 ... 8 9 [10]
91
Chronos User Discussion / Re: Hyperfocal distance
« Last post by muringuets on June 24, 2021, 05:12:19 AM »
I hope this wasn't too confusing, and that it helped with your question.

It helped for sure. I'll dig into the numbers and try to compile a useful table with different apertures and different focal distances...
92
Chronos User Discussion / Re: Hyperfocal distance
« Last post by Nikon1 on June 23, 2021, 08:39:41 AM »
So the circle of confusion is mostly just two things:
 1.  On an ideal, theoretical lens that renders a perfectly sharp image of the focal plane (note that under this assumption only that infinitely thin plane, an actual geometrical plane, would be in focus, regardless of the lens aperture; everything else is considered unsharp. The depth of field would basically be 0.00 mm. This is just to make the theory easier to follow), the circle of confusion describes the size of the circle that a theoretical, infinitely small light source would be rendered as when out of focus, no matter how small that source is. You have surely seen or heard the term "bokeh balls", which a lot of photographers like to throw around when talking about super-fast lenses and such. Those are simply the circles of confusion of real-world (not quite theoretically ideal and infinitely small) point light sources, rendered as actual circles of light on the sensor.
 2.  On a real-world lens, the circle of confusion also comes into play when a lens designer sets out to create a new lens. It is a parameter they have to keep in mind, and it essentially represents all the flaws of the lens combined (speaking of the in-focus plane, where an ideal lens would render an absolutely flawless reproduction of reality and a geometrical point would be rendered as a point). Look up optical aberrations if you want to know more about them (https://en.wikipedia.org/wiki/Optical_aberration), but for now let's just say that real-world lenses make all kinds of "mistakes" when rendering an image. In this context, the circle of confusion means the maximum deviation from an ideal rendering of a point that is in focus. When a new lens is calculated, keeping this aberration-driven circle of confusion as small as possible is very important for a final image with as much detail as possible.
 Keeping the maximum circle of confusion as small as possible has its limits, however: beyond a certain point, improving it further demands increasingly high manufacturing precision, a more complicated lens design, a vast increase in size and/or weight, and of course a much more expensive production process (or several of those combined). That is why cheap lenses are often not as "sharp" as others, and why squeezing out the last bit of resolution and quality always makes a lens far more expensive (and bigger/heavier) than otherwise similar lenses.
 .
 So, as a conclusion: the circle of confusion is pretty much what limits your resolution. For film, it would be derived from whatever size you expect to print at, plus some viewing-angle rule (there is a rule of thumb for the smallest angular resolution the human eye can resolve; from that, the expected viewing distance, and the print size, you can calculate the maximum allowed circle of confusion before anyone can tell it's "unsharp". I don't remember the exact numbers and can't be bothered to look them up right now, but let me know if you want to know more). On higher-end film lenses, they sometimes even tried to limit the maximum circle of confusion to the grain size, since film physically can't resolve anything more than that.
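As a sketch of the print/viewing rule mentioned above: a common convention (often called the "Zeiss formula") takes the maximum circle of confusion as roughly the sensor or film diagonal divided by 1500, which bundles typical eye acuity, print size, and viewing distance into one constant. The 1280 x 1024 resolution below is my assumption for the Chronos 1.4, so check your own datasheet:

```python
import math

def coc_from_diagonal(width_px, height_px, pitch_mm, divisor=1500):
    """Approximate max. circle of confusion (mm) via the d/1500 rule of thumb.

    divisor=1500 is a common convention bundling eye acuity,
    print size, and viewing distance into a single constant.
    """
    diagonal_mm = math.hypot(width_px * pitch_mm, height_px * pitch_mm)
    return diagonal_mm / divisor

# Assumed Chronos 1.4 sensor: 1280 x 1024 px at 6.6 µm (0.0066 mm) pitch
print(round(coc_from_diagonal(1280, 1024, 0.0066) * 1000, 2), "µm")  # ≈ 7.21 µm
```

Interestingly, that viewing-based value lands close to the 6.6 µm pixel pitch anyway, so either criterion gives a similar answer here.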
 For digital sensors, the circle of confusion likewise limits resolution. If you look at machine-vision lenses, the better ones have megapixel ratings. That is nothing more than openly stating (and presumably testing) the maximum circle of confusion they are designed for, combined with the sensor size they are built for; basically a guarantee that they can resolve at least that many pixels across the full sensor and focus range.
 Now, in terms of depth of field, it's somewhat similar. To calculate your hyperfocal distance or depth of field, you need to know at which point something is considered "sharp" or "unsharp", because in theory only a single plane in space is ever actually in focus. In reality, as discussed above, there is a threshold to cross before we actually perceive something as unsharp. On a digital camera this is usually the size of the physical pixels on the sensor: if the circle of confusion is smaller than or as big as the pixel itself, you couldn't tell the difference from perfectly sharp, since it becomes a single pixel anyway; the sensor can't resolve more detail even if it tried. Only once the circle of confusion gets noticeably bigger than your sensor pixel will you be able to tell from the recorded image that it's unsharp, because the point now spills onto the surrounding pixels as well (or fully covers several of them).
 .
 For the Chronos, you would just look up the pixel size (or rather pitch, see below) in the datasheet and take that as your circle-of-confusion value for your app (or formula, if you do the calculation yourself).
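For reference, the standard hyperfocal formula is H = f²/(N·c) + f, with focal length f, f-number N, and circle of confusion c all in the same units. A minimal sketch; the 50 mm / f2.8 values are just example assumptions:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance in mm: H = f^2 / (N * c) + f."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Example: assumed 50 mm lens at f/2.8 with a 10 µm (0.01 mm) circle of confusion
print(round(hyperfocal_mm(50, 2.8, 0.01) / 1000, 1), "m")  # ≈ 89.3 m
```

Focused at H, everything from H/2 out to infinity is then considered acceptably sharp, which is what makes the table useful for action shots.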
 .
 According to the datasheet:
 1.4:
 Pixel pitch: 6.6 µm
 .
 2.1:
 Pixel pitch: 10 µm
 .
 Note that they give the pixel pitch here, not the pixel size: on a digital image sensor the pixel itself is usually smaller than the pitch, because all the circuitry also has to fit, so the light-sensitive area is usually less than 100% of the sensor area. The pixel pitch is the value to look for, since it is the spacing of the pixels. Given a circle of confusion of 10 µm on the 2.1 sensor, the image would still be perfectly sharp even though that is more than the actual pixel area, because it only ever covers a single pixel's worth of spacing.
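Plugging those two pitches into the usual hyperfocal formula H = f²/(N·c) + f shows how much the sensors differ; the 50 mm / f4 lens is just an example assumption:

```python
def hyperfocal_m(focal_mm, f_number, coc_um):
    """Hyperfocal distance in metres, with the CoC given in micrometres."""
    coc_mm = coc_um / 1000.0
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000.0

# Same assumed lens (50 mm at f/4) on both sensors:
for model, pitch_um in [("1.4", 6.6), ("2.1", 10.0)]:
    print(f"Chronos {model}: {hyperfocal_m(50, 4.0, pitch_um):.1f} m")
# The 2.1's larger pitch tolerates a bigger blur circle,
# so its hyperfocal distance comes out correspondingly shorter.
```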
 .
 I hope this wasn't too confusing, and that it helped with your question.
93
Chronos User Discussion / Hyperfocal distance
« Last post by muringuets on June 23, 2021, 07:02:07 AM »
Hello everyone.

I've been using the Photopills app for a while, and it has a very cool table of hyperfocal distances for a bunch of sensors. I looked for formulas to calculate the hyperfocal distances for the Chronos 1.4 and the Nikkor lenses I typically use, but most formulas need a sensor parameter I do not have: the circle of confusion.

Any idea how to calculate the hyperfocal distance for the Chronos sensors? A hyperfocal table would help a lot when focusing on moving objects and other complex scenes (I'm trying to catch a few bike jumps and other action takes).

Thanks
94
Chronos User Discussion / Re: Chronos 1.4 Footage Thread
« Last post by Nikon1 on June 22, 2021, 12:01:35 PM »
Found some 1.4 footage of electrically exploded flesh on YT:
 https://youtu.be/DQ67njnNaxw?t=274
 Also, damn, they did some things to that lens; looks like it's had a rough life.
95
Chronos User Discussion / Re: Chronos 1.4 Footage Thread
« Last post by 1022mm on June 22, 2021, 08:27:42 AM »
This year I've been shooting lightning with the 1.4 at 6,000 fps (the actual frame rate is 6,002 fps at 640 x 354). The following is the best storm I've had for this purpose, with 17 captures at that setting. I also had a 30 fps camera recording simultaneously for most of them.

https://www.youtube.com/watch?v=hKopbGvL93A
96
Chronos User Discussion / Re: Chronos 1.4 Footage Thread
« Last post by clkdiv on June 22, 2021, 06:10:48 AM »
Jesus, where did the Chronos 1.4 land? :-)

97
Aaaah! Ooooh! Uuuug! You are a magician!

Then it would be possible to alter the defaults for the Web UI, for example, because every time I start the web control I need to retype the SMB share information, the password, and so on.

98
The OS card can be fully accessed from VirtualBox with an Ubuntu ISO of your choice running from RAM and the card reader mapped to the VM; just tested. The file system of the other partition is ext3.
 See screenshot.
99
General high-speed discussion / Re: Fixed pattern noise removal
« Last post by clkdiv on June 21, 2021, 02:57:27 PM »
Yes, thanks. I will try it with the reference frames, but sometimes that is not possible. I would like to use the Chronos like a point-and-shoot, and even black calibration sometimes takes a lot of time, and saving takes even more.

It would be great to have FPN-removal software for cases without reference frames. But if one has references, your proposal is great. Thanks!
100
General high-speed discussion / Re: Fixed pattern noise removal
« Last post by Nikon1 on June 21, 2021, 02:04:13 PM »
Sorry about that; here is a .aep with only the two frames I posted. I put a comment next to the files indicating which one is the noise reference and which is the footage.
 I also wouldn't recommend using a static image as a reference; it just doesn't look natural at all. As I said above, that's why I would record at least 2 seconds of playback time worth of noise reference and loop it if needed (I would add a little fade in/out at the loop point) to make it more organic.
 ALSO, there is surely some smart way to do this with an algorithm, but good luck developing that or finding someone who can. I certainly can't, so I would do it with the noise-reference footage.
 .
 Sure, there is still a bit of FPN left in the output PNG, but compare it with the unprocessed one...
 By the way, I raised the exposure from the DNG quite a bit, so good luck getting anywhere near as clean an output image without that noise reference; I'll wait.
 Also, as I said, you can still run the result through your favorite denoising software and remove even more of the leftovers. Temporal interpolation in AE usually helps a lot with stuff like that (sadly with insanely long render times). The .aep I posted is a fairly quick edit, so if you were to spend more time on it, also used multiple layers, and maybe even different noise-reference sources from different exposure levels and a bunch of masks for different areas of the sensor, I am fairly certain you could still improve the output a lot, even without any actual denoising software, just with noise-reference footage. If you shoot a lot at the same settings, one single project file could well be enough to run all your footage through, with minimal fine-tuning needed, I assume. It becomes a bit more work if you use a bunch of different settings, however.
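For anyone who does want to try the algorithmic route mentioned above: the basic idea of fixed-pattern-noise removal is just subtracting a per-pixel average of dark/noise-reference frames from each footage frame. A minimal NumPy sketch; the array shapes and synthetic data are made up for illustration, and real footage would come from your DNG/raw loader:

```python
import numpy as np

def remove_fpn(footage, noise_refs):
    """Subtract the per-pixel mean of the noise-reference frames.

    footage:    (T, H, W) float array of frames
    noise_refs: (N, H, W) float array of lens-capped reference frames
    """
    fpn = noise_refs.mean(axis=0)           # per-pixel fixed pattern
    return np.clip(footage - fpn, 0, None)  # keep values non-negative

# Tiny synthetic demo: a constant pattern added to a flat scene
rng = np.random.default_rng(0)
pattern = rng.uniform(0, 5, size=(4, 4))
scene = np.full((4, 4), 100.0)
footage = np.stack([scene + pattern] * 3)   # 3 "frames" of footage
refs = np.stack([pattern] * 8)              # 8 reference frames
clean = remove_fpn(footage, refs)
print(np.allclose(clean, 100.0))  # True: the pattern is removed
```

Averaging many reference frames keeps the fixed pattern while averaging away the random (temporal) noise, which is also why a longer noise-reference recording helps.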