TL;DR: No. At least, not chemically.
In a discussion about residential radon in which I linked this NIH study
which found that lung cancer decreases with rising residential radon levels, someone asserted that you could die by radon poisoning. I challenged that, saying that I'd calculate just what would happen if you had even 0.1 vol% radon in air. Then someone else said that people had died by radon suffocation in Appalachia, so I went and did it.
First, stipulate that 0.1% by volume is roughly two orders of magnitude below any concentration presenting an asphyxiation hazard. Radon has a density of 9.73 grams/liter, so 0.1% by volume would be 9.73 mg per liter, or 43.8 μmol per liter.
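That concentration arithmetic can be sketched in a few lines of Python (values from the figures above; this is just the density-to-moles conversion, nothing more):

```python
RHO_RN = 9.73        # g/L, density of radon gas at standard conditions
M_RN = 222.0         # g/mol, molar mass of Rn-222
vol_frac = 0.001     # 0.1% by volume

mass_per_L = RHO_RN * vol_frac     # grams of radon per liter of air
mol_per_L = mass_per_L / M_RN      # moles of radon per liter of air
print(f"{mass_per_L * 1000:.2f} mg/L, {mol_per_L * 1e6:.1f} umol/L")
# → 9.73 mg/L, 43.8 umol/L
```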
The half-life of Rn-222 (the only isotope which lasts long enough to get out of soil and hang around much) is 3.824 days. That corresponds to a mean lifetime (time to decay to 1/e of the original amount) of 5.52 days, so the fraction decaying per second is 2.1e-6 (1/(86400×5.52)). For 43.8 μmol, this is 9.2e-11 mol/sec decaying, or 55 TBq (terabecquerels).
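The activity calculation, as a sketch (the 43.8 μmol/liter figure is carried over from the concentration step above):

```python
import math

HALF_LIFE_S = 3.824 * 86400           # Rn-222 half-life in seconds
tau = HALF_LIFE_S / math.log(2)       # mean lifetime, ~5.52 days
decay_const = 1 / tau                 # fraction decaying per second, ~2.1e-6

N_A = 6.022e23                        # Avogadro's number
mol_per_L = 43.8e-6                   # mol Rn-222 per liter, from above
mol_per_s = mol_per_L * decay_const   # mol decaying per second, per liter
activity_Bq = mol_per_s * N_A         # decays per second (Bq), per liter
print(f"{mol_per_s:.1e} mol/s, {activity_Bq / 1e12:.0f} TBq per liter")
# → 9.2e-11 mol/s, 55 TBq per liter
```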
The decay energy of Rn-222 is 5.5 MeV, so those 55 TBq represent a total power output of 3.04e20 eV/sec. An electron-volt is 1.602e-19 J, so that works out to 48.7 watts per liter of air. The air in a room 3 m × 3 m × 2.5 m high (22,500 liters), spiked with 43.8 μmol/liter of Rn-222, would release about 1.1 megawatts of heat.
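And the final power figure, sketched the same way (the 5.53e13 Bq/liter activity comes from the decay-rate step above):

```python
EV_J = 1.602e-19               # joules per electron-volt
E_DECAY_EV = 5.5e6             # 5.5 MeV released per Rn-222 decay
activity_per_L = 5.53e13       # decays per second per liter, from above

power_per_L = activity_per_L * E_DECAY_EV * EV_J   # watts per liter of air
room_L = 3 * 3 * 2.5 * 1000                        # 22,500-liter room
room_W = power_per_L * room_L                      # total heat output
print(f"{power_per_L:.1f} W/L, {room_W / 1e6:.1f} MW per room")
# → 48.7 W/L, 1.1 MW per room
```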
Anything and anyone in such a room would catch fire in seconds. There would be no time to suffocate.
Do I need to mention that if such high concentrations of radon were found in nature, people would pump it into tanks and use it to boil water? It would be one of the most fantastic sources of free energy imaginable.