How does the phone count these particles? How can a regular phone do it?
In very simplified terms:
The phone's job is to take pictures all the time (for the results to be correct, the camera must be covered and the whole phone kept away from heat and light). With the camera covered, the resulting image is most often black (no event) and is immediately rejected; but if a particle with sufficiently high energy (such as a cosmic ray) passes through the material covering the camera, the camera sees it as a bright point, which is counted as a detection.
The detection results are sent to the server (api.credo.science), where a preview of the detections is available.
Of course, you will also find artifacts among the detections on the server: events that are of no great importance for science, but of great importance for improving the filters we are working on.
After I press start detections, should the camera be pointing at anything? Or should it just be on a surface with the camera pointing down? Or upwards? Should there be any distance between the camera and the ‘object’ it is looking at? The video shows someone with a black card, but it is not clear what he is doing with the card.
No matter how you position the camera, it will pick up detections. It is worth checking whether there is any dependence between the way the camera is covered and the quantity and quality of the detections. The black card in the video is a fridge magnet, a perfect way to cover the camera.
What should I do before I start using apps and what should I know?
The most important things for the application to work:
- charge the phone or monitor its battery status;
- cover the camera, even at night, to eliminate interference. So far, the most effective way to cover the camera is a fridge magnet; you can also use black insulating tape, or aluminum foil applied to a piece of paper, or another thin material completely opaque to visible light inserted between the camera and a silicone (or other) phone case;
- keep the phone away from sources of heat or light, such as a night light, an LED, a laser, or a windowsill lit by moonlight at night;
- keep the phone at a constant temperature: when the phone warms up on a windowsill or near another heat source, the noise on the camera sensor increases;
- run auto-calibration in the settings.
Generally, batteries do not like being kept at 100% charge all the time.
The ideal moment (with battery life in mind) to run the application is when you have to charge the phone anyway: when you charge it regularly (every day or two, once the battery is almost discharged through daily use of the phone).
Therefore, check that the battery and housing do not get too hot while the application is running. In the application settings you can set the maximum temperature a few degrees lower than the maximum temperature acceptable for your phone (you should find that information in the documentation or manual of your device).
Do the images get stored on my device? Will I run out of space for my own photos?
Images older than 10 days are deleted from the application on the phone, so you can be calm about the phone’s memory.
If wifi is on at the same time as the app is running do the images get uploaded to your servers immediately?
Yes, the results are sent immediately after detection (when Wi-Fi is available); you can see them at api.credo.science. Sometimes it may take a while before a result appears on the site.
The video shows the phone camera being covered by a piece of card the size of a credit card. Do I need to do exactly the same thing? If so, do I need a special piece of card, or is putting the phone face down to cover the camera good enough?
We showed the most effective method of covering the camera: using a fridge magnet.
You can also use black insulating tape, or aluminum foil applied to a piece of paper,
or another thin material completely opaque to visible light inserted between the camera and a silicone (or other) phone case.
If you discover an equally good or better way to cover the camera that works in different conditions, please write to us.
I can’t get my app to work due to the continual warning ‘Cover Camera!’
I have taped the camera over with black neoprene and black masking tape (the phone is now about the thickness of a brick) and put it in a box, even in a dark room, but it still will not work. Auto-calibration does not work either. What should I do?
Unfortunately, there are phone models on which auto-calibration does not work at all.
In this case, the detection thresholds have to be adjusted by hand, i.e. three values have to be changed:
“Max Factor”, “Average Factor” and “Black Factor”.
These values are different for each model, so we have to find them by trial and error.
Increase each value by, e.g., 10: if a value was 10, change it to 20; if it was 80, change it to 90.
After each change, check whether the application has started working or still tells you to cover the camera. If nothing has changed, repeat the operation.
Once it works, start slowly lowering the individual values to reduce the risk of background events.
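The trial-and-error procedure above can be sketched as a simple loop. This is only an illustration with hypothetical helper names (`tune_factors`, the `app_works` callback); in reality you change the values by hand in the app's settings screen:

```python
STEP = 10  # raise each factor by this amount per attempt

def tune_factors(app_works, factors, max_value=255):
    """Raise the three factors until the app accepts the covered camera.

    app_works -- callback standing in for 'check whether the app runs'
    factors   -- dict with 'max_factor', 'average_factor', 'black_factor'
    """
    while not app_works(factors):
        if all(v >= max_value for v in factors.values()):
            break  # thresholds cannot be raised any further; give up
        for name in factors:
            factors[name] = min(factors[name] + STEP, max_value)
    return factors
```

Once the warning disappears, the same idea would be applied in reverse: lowering one value at a time, checking after each change, to keep the background as low as possible.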
Of course, we cannot forget about Wi-Fi access and charging the phone.
How does the algorithm in the application work? And what exactly are Max, Average and Black Factor?
The algorithm's tasks are:
1. Checking whether the camera is properly covered
2. Searching for hits, and cropping them if any are found
Checking the coverage of a frame (single image) means that the brightness is examined pixel by pixel and the following are computed:
a) the average brightness of all pixels
b) the number of pixels below the "Black Factor" threshold
If the average brightness is below "Average Factor" and the per-mille share of pixels below the "Black Factor" threshold is higher than "Black Count", the frame (camera sensor) is considered properly covered and the search for hits (detections) begins.
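A minimal sketch of this coverage check, assuming an 8-bit grayscale frame given as a flat list of pixel values (0–255); the function and parameter names are mine, chosen to mirror the settings names:

```python
def frame_is_covered(pixels, average_factor, black_factor, black_count):
    """Return True if the frame is dark enough to count as 'covered'.

    average_factor -- maximum allowed mean brightness of the frame
    black_factor   -- brightness below which a pixel counts as 'black'
    black_count    -- minimum per-mille share of black pixels
    """
    average = sum(pixels) / len(pixels)
    black_per_mille = 1000 * sum(p < black_factor for p in pixels) / len(pixels)
    return average < average_factor and black_per_mille > black_count
```

Raising "Average Factor" or "Black Factor" therefore makes the check more lenient, which is why the manual calibration procedure above starts by increasing the values.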
The search for hits proceeds as follows:
a) the brightest pixel in the whole frame is found
b) if that pixel's value is brighter (greater) than "Max Factor", we consider it a hit; we crop it with a margin and send it to the server (if it is darker, the frame contains no hits and the algorithm ends)
c) the cropped fragment is then blackened and the algorithm returns to point a)
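The steps a)–c) can be sketched like this, for a 2-D grayscale frame given as a list of rows. The function name and the margin size are illustrative assumptions, not the app's actual values; the real app would upload each crop rather than collect it in a list:

```python
def find_hits(frame, max_factor, margin=2):
    """Repeatedly find the brightest pixel, crop a box around it,
    blacken that box, and continue until nothing exceeds max_factor."""
    hits = []
    h, w = len(frame), len(frame[0])
    while True:
        # a) find the brightest pixel in the whole frame
        y, x = max(((r, c) for r in range(h) for c in range(w)),
                   key=lambda rc: frame[rc[0]][rc[1]])
        if frame[y][x] <= max_factor:
            break  # b) nothing bright enough: no (more) hits, stop
        # b) crop the hit with a margin (this crop would go to the server)
        y0, y1 = max(0, y - margin), min(h, y + margin + 1)
        x0, x1 = max(0, x - margin), min(w, x + margin + 1)
        hits.append([row[x0:x1] for row in frame[y0:y1]])
        # c) blacken the cropped fragment and search again
        for r in range(y0, y1):
            for c in range(x0, x1):
                frame[r][c] = 0
    return hits
```

Blackening the cropped region before looping back is what lets a single frame yield several separate hits without counting the same bright spot twice.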