The compactness problem is an apparent paradox that arises from GRB light curve data[1]. On the one hand, a large fraction of the energy is emitted in the form of non-thermal, power-law spectrum photons. On the other hand, the short variations in the luminosity imply a very compact, and hence optically thick, source.

Naive Optical Depth Calculation

The total energy emitted in power-law photons is taken to be at least $ E = 10^{50} \, \mathrm{erg} $ (in some GRBs it can be orders of magnitude larger, but this only makes the following contradiction more severe). The luminosity varies on a time scale of $ \delta t \approx 10 \, \mathrm{ms} $, which, multiplied by the speed of light, implies a source radius of about $ R \approx c \, \delta t \approx 3000 \, \mathrm{km} $. To estimate the photon density we assume thermal equilibrium (in reality the spectrum is non-thermal, but close to the spectral peak this approximation holds), so the formulae for black-body radiation apply. The temperature is therefore given by

$ \frac{E}{R^3} \approx a T^4 \Rightarrow T \approx \left( \frac{E}{a R^3} \right)^{1/4} $
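Substituting $ E = 10^{50} \, \mathrm{erg} $ and $ R \approx 3 \times 10^{8} \, \mathrm{cm} $, with the CGS values $ a \approx 7.6 \times 10^{-15} \, \mathrm{erg \, cm^{-3} \, K^{-4}} $ and $ k \approx 1.4 \times 10^{-16} \, \mathrm{erg \, K^{-1}} $, gives the worked numerical step:

$ T \approx \left( \frac{10^{50}}{7.6 \times 10^{-15} \times \left( 3 \times 10^{8} \right)^3} \right)^{1/4} \approx 5 \times 10^{9} \, \mathrm{K} \Rightarrow kT \approx 0.4 \, \mathrm{MeV} \approx m_e c^2 $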

Substituting the values above yields a thermal energy $ kT $ very close to the rest mass energy of the electron, so the photons are energetic enough for pair production. The number density of photons is $ n \approx \frac{E}{kT R^3} $. The cross section for two-photon pair production, $ \gamma + \gamma \rightarrow e^+ + e^- $, is of the same order of magnitude as the Thomson cross section $ \sigma_T $. The optical depth can then be estimated as $ \tau \approx \sigma_T n R $. Plugging in the numbers yields an optical depth of order $ 10^{15} $.
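The chain of estimates above can be reproduced numerically. A minimal sketch in CGS units, using the fiducial values quoted in the text:

```python
# Order-of-magnitude estimate of the naive optical depth (CGS units).
E = 1e50             # total energy in power-law photons [erg]
c = 3e10             # speed of light [cm/s]
dt = 0.01            # variability time-scale, 10 ms [s]
a = 7.566e-15        # radiation constant [erg cm^-3 K^-4]
k = 1.381e-16        # Boltzmann constant [erg/K]
sigma_T = 6.652e-25  # Thomson cross section [cm^2]

R = c * dt                   # source radius ~ 3e8 cm (3000 km)
T = (E / (a * R**3))**0.25   # black-body temperature ~ 5e9 K
kT_MeV = k * T / 1.602e-6    # ~ 0.4 MeV, close to m_e c^2 = 0.511 MeV
n = E / (k * T * R**3)       # photon number density [cm^-3]
tau = sigma_T * n * R        # optical depth ~ 1e15
print(f"R = {R:.1e} cm, kT = {kT_MeV:.2f} MeV, tau = {tau:.1e}")
```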

There are two problems with such a large optical depth. The first is that photons are far more likely to produce electron-positron pairs than to escape. The second is that even if the electron-positron pairs annihilate back into photons, the radiation would emerge in thermal equilibrium, in contrast to the observed spectrum, which is better described by two power laws smoothly joined at a break.

Resolution with Relativistic Motion

The difficulty presented above can be circumvented if the source is moving toward the observer at a relativistic speed. In this case, the apparent variation time is shorter by a factor of $ \Gamma^2 $ ($ \Gamma $ being the Lorentz factor of the source) compared with the intrinsic variability time-scale. Accordingly, the actual radius can be larger by $ \Gamma^2 $ and still produce the same observed time-scale. This effect reduces the density of photons by a factor of $ \Gamma^4 $ (note that even though the radius is proportional to $ \Gamma^2 $, the density is not reduced by a factor proportional to $ R^3 \propto \Gamma^6 $, since the source is expanding relativistically; see: Optical depth of a relativistically expanding source) and reduces the optical depth by a factor of $ \Gamma^2 $.

Another consequence of relativistic motion is that the apparent energies of photons are larger than the corresponding energies in the progenitor's rest frame by a factor of $ \Gamma $. At photon energies large enough to enable pair creation, the spectrum is assumed to be a power law, $ N\left(E\right) dE \propto E^{-\alpha} dE $. In this case, the optical depth acquires another factor of $ \Gamma^{-2\alpha} $. For $ \alpha \approx 2 $ we obtain an overall reduction of $ \sim \Gamma^{6} $, thus requiring $ \Gamma \approx 200 $ to bring the optical depth down to order unity, consistent with the observed spectra.
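The required Lorentz factor follows directly from the scaling argument above; a short sketch, assuming the optical depth is reduced by the combined factor $ \Gamma^{2 + 2\alpha} $:

```python
# Lorentz factor needed to bring the naive optical depth down to
# order unity, assuming tau scales as Gamma^-(2 + 2*alpha).
tau_naive = 1e15   # optical depth from the naive calculation
alpha = 2.0        # power-law photon index assumed in the text
gamma = tau_naive ** (1.0 / (2 + 2 * alpha))
print(f"Required Lorentz factor: ~{gamma:.0f}")  # a few hundred
```

The result is a few hundred, of the same order as the $ \Gamma \approx 200 $ quoted above; the exact value depends on the fiducial numbers entering the naive estimate.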

References

  1. T. Piran, Gamma-Ray Bursts and the Fireball Model, Phys. Rept. 314 (1999) 575-667