I'm interested in the subject, and work with ionizing radiation as part of my job. I wear a "film badge" (really a thermoluminescent dosimeter, or TLD) at work; it's required by law.
I obtained a real (calibrated) Geiger counter and took it on a plane trip, Sydney to Melbourne.
On the tarmac at Sydney, 0.08 microSieverts per hour.
At 16,000 ft over Kosciusko, 6.5 microSieverts per hour, occasionally spiking to 9.
Is this a problem? Well, you need about 0.5 Sieverts in a short time (say a day) to get close to acute radiation sickness. That's on the order of 100,000 times the dose you'd accumulate on the flight to Melbourne. Some people can tolerate more than 0.5 Sv before getting sick.
Your chance of cancer is assumed to increase linearly with dose, with no threshold, at a rate of about 5.5% per Sievert (not milliSievert or microSievert). The baseline lifetime risk of dying from cancer (pooled across the population) is about 20%, so if you cop a whole Sievert, you will likely suffer acute but mild radiation sickness and be roughly 5 percentage points more likely to die from cancer (about 20% rising to about 25%).
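To put rough numbers on the above, here is a back-of-the-envelope sketch in Python. The dose rate and risk coefficients are the figures quoted above; the assumption of roughly an hour at cruise altitude between Sydney and Melbourne is mine, so treat the output as order-of-magnitude only.

# Rough dose and risk arithmetic for the Sydney-Melbourne flight.
# Assumed: ~1 hour at cruise altitude (my guess); measured rate 6.5 uSv/h;
# LNT coefficient 5.5% per Sievert; ~20% baseline lifetime cancer mortality.

dose_rate_uSv_per_h = 6.5            # measured over Kosciusko
hours_at_altitude = 1.0              # assumption, SYD-MEL cruise
flight_dose_Sv = dose_rate_uSv_per_h * hours_at_altitude * 1e-6

acute_threshold_Sv = 0.5             # rough onset of acute radiation sickness
print(f"Flight dose: {flight_dose_Sv * 1e6:.1f} uSv")
print(f"Acute-sickness threshold is ~{acute_threshold_Sv / flight_dose_Sv:,.0f}x the flight dose")

lnt_coeff_per_Sv = 0.055             # 5.5% excess cancer mortality per Sievert
baseline_cancer_mortality = 0.20
extra_risk = lnt_coeff_per_Sv * flight_dose_Sv
print(f"Extra lifetime cancer risk from the flight: {extra_risk:.1e}")
print(f"A whole Sievert adds {lnt_coeff_per_Sv:.1%} absolute risk "
      f"({baseline_cancer_mortality:.0%} -> {baseline_cancer_mortality + lnt_coeff_per_Sv:.1%})")

With those assumptions the threshold-to-flight ratio comes out in the tens of thousands, i.e. the same ballpark as the 100,000 figure above.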
http://en.wikipedia.org/wiki/Radiation-induced_cancer
If you are interested in finding out your dose during the trip to Japan, don't rely on an iPhone module that has not been calibrated. You don't want to scare yourself; you do want to be better informed.
Regards,
Tony Barry
PS Had a look at the iPhone module. It uses a reverse-biased diode as a detector and is good for low levels of radiation. It requires a copper shield of a certain thickness to get the detector to approximate a Sievert (dose-equivalent) response, which is basically catering for biological impact. The engineers have done a nice job on the device, but the detector appears to be quite small and easily saturated. This is not a problem for people taking readings at normal background levels. Putting red dots on maps to indicate high radiation levels might be a bit much, though; "high" in terms of device sensitivity is not the same as high in terms of biological impact.
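To illustrate the sensitivity-versus-biological-impact point, here is a hypothetical sketch. The calibration factor and saturation threshold below are invented for illustration; they are not the module's actual specifications.

# Hypothetical numbers only - not the real module's specs.
def cpm_to_uSv_per_h(counts_per_min, cal_uSvh_per_cpm=0.01, saturation_cpm=2000):
    """Convert raw counts/min to an approximate dose rate in uSv/h.

    cal_uSvh_per_cpm: energy-compensated calibration factor (what the copper
        shield is trying to approximate) - assumed value.
    saturation_cpm: count rate above which a small diode starts to miss
        events - assumed value.
    """
    if counts_per_min >= saturation_cpm:
        # Once saturated, the reading is a floor, not a measurement.
        return None
    return counts_per_min * cal_uSvh_per_cpm

print(cpm_to_uSv_per_h(8))       # ~0.08 uSv/h: normal background
print(cpm_to_uSv_per_h(650))     # ~6.5 uSv/h: cruise altitude
print(cpm_to_uSv_per_h(5000))    # None: saturated, "high" but not quantifiable

The point being that a raw count only becomes a Sievert figure after calibration, and a saturated reading tells you "high for this detector", not "dangerous for a person".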
TB