The idea: two sets of voltage and calibration profiles:
- high-sensitivity voltage and CPM/uSv ratio + dead time
- low-sensitivity voltage and CPM/uSv ratio + dead time
- a threshold (in CPM or uSv/h) for switching between the profiles
When the radiation level is too high, the device could switch to a different voltage with its own CPM/uSv conversion ratio and dead time: an underpowered, much less sensitive configuration. If the radiation drops back below the threshold, the tube switches again to its higher-sensitivity voltage and its CPM/uSv ratio + dead time.
1. It extends the tube's measurable range beyond its design limit: the CPM ceiling stays the same, but the tube registers far fewer counts per unit dose. The accompanying CPM/uSv ratio + dead time keeps the dose measurement accurate, and the tube may be able to measure several times more radiation than it normally could before hitting that ceiling.
2. Battery saving: producing fewer counts at high dose rates drains the battery less.
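The switching logic above could be sketched roughly as follows. This is only an illustration of the idea, not real GMC firmware: the profile values, the threshold numbers, and all names are made-up assumptions, and the dead-time correction uses the standard non-paralyzable model (true rate = measured rate / (1 - measured rate x dead time)). Note the two separate up/down thresholds, which add hysteresis so the device does not flip back and forth when the dose sits right at the boundary.

```c
#include <assert.h>

/* Hypothetical tube profile; all values below are illustrative
 * assumptions, not taken from any real device. */
typedef struct {
    double voltage;      /* anode voltage, V */
    double cpm_per_usv;  /* sensitivity: CPM per uSv/h */
    double dead_time_s;  /* non-paralyzable dead time, s */
} Profile;

static const Profile HIGH_SENS = { 380.0, 348.0, 190e-6 };
static const Profile LOW_SENS  = { 300.0,  60.0, 120e-6 };

/* Switch thresholds with hysteresis (assumed values). */
static const double UP_USVH   = 500.0; /* go low-sensitivity above this  */
static const double DOWN_USVH = 400.0; /* back to high-sensitivity below */

/* Non-paralyzable dead-time correction: n = m / (1 - m * tau),
 * where m is the measured rate in counts per second. */
static double corrected_cps(double measured_cps, double dead_time_s)
{
    return measured_cps / (1.0 - measured_cps * dead_time_s);
}

/* Dose rate in uSv/h from a measured CPM, using the active profile's
 * dead time and CPM/uSv ratio. */
static double dose_usvh(double measured_cpm, const Profile *p)
{
    double cps = corrected_cps(measured_cpm / 60.0, p->dead_time_s);
    return cps * 60.0 / p->cpm_per_usv;
}

/* Decide which profile to use for the next measurement interval. */
static const Profile *next_profile(const Profile *current, double dose)
{
    if (current == &HIGH_SENS && dose > UP_USVH)
        return &LOW_SENS;
    if (current == &LOW_SENS && dose < DOWN_USVH)
        return &HIGH_SENS;
    return current;
}
```

After a switch, the firmware would also have to ramp the HV supply to the new voltage and discard counts during the settling time, which this sketch ignores.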
What do you think?
If this can be made to work, it would be a feature unlike any other mainstream Geiger counter and would allow a uniquely high cap on the dose rate. For example, the SBT-11 version of the GMC-600 can only go up to around 3-4 mSv/h, but at a lower voltage it might reach 8-10 mSv/h.
On the GMC-500+ there are two tubes, one high-sensitivity and one very low-sensitivity, the latter because its effective detection volume is much smaller. Lowering the tube's voltage will certainly change too many parameters at once, and the effect will differ depending on whether you want to detect alpha, beta, or gamma. I don't think that's a solution.
To measure strong radiation with good precision, the GMC-500+ is the right candidate. The GMC-600 and GMC-600+ are made more for very weak, weak, and medium radiation levels, and also of course for alpha detection.
Mastery is acquired by studying; with it, everything becomes simple.