The afterglow of a gamma-ray burst (GRB) is commonly thought to result from the continuous deceleration of a relativistically expanding fireball in the surrounding medium. Assuming that the expansion of the fireball is adiabatic and that the density of the medium is a power-law function of shock radius, i.e. $n_{\rm ext} \propto R^{-k}$, we study analytically the effects of the first-order radiative correction and of the non-uniformity of the medium on a GRB afterglow. We first derive a new relation among the observed time, the shock radius and the Lorentz factor of the fireball: $t_\oplus = R/[4(4-k)\gamma^{2}c]$, and a new relation among the comoving time, the shock radius and the Lorentz factor: $t_{\rm co} = 2R/[(5-k)\gamma c]$. We next study the evolution of the fireball using the analytic solution of Blandford & McKee, and find that radiation losses may not significantly influence this evolution. We further derive new scaling laws between the X-ray flux and observed time, and between the optical flux and observed time. Applying these scaling laws to the afterglows of GRB 970228 and GRB 970616, we find that if the spectral index of the electron distribution is $p = 2.5$, as implied by the spectra of GRBs, the X-ray afterglow of GRB 970616 is well fitted by assuming $k = 2$.
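As a minimal illustrative sketch (not code from the paper), the two timing relations above can be evaluated numerically; the radius and Lorentz factor used below are hypothetical sample values, not fits from the text:

```python
# Sketch: evaluate the abstract's relations
#   t_obs = R / [4 (4 - k) gamma^2 c]   (observed time)
#   t_co  = 2 R / [(5 - k) gamma c]     (comoving time)
# for a shock of radius R (cm) with bulk Lorentz factor gamma in a
# medium with density profile n_ext ∝ R^{-k}.

C_CGS = 2.998e10  # speed of light in cm/s


def observed_time(R, gamma, k):
    """Observer-frame time for shock radius R (cm) and Lorentz factor gamma."""
    return R / (4.0 * (4.0 - k) * gamma**2 * C_CGS)


def comoving_time(R, gamma, k):
    """Comoving-frame time for the same fireball parameters."""
    return 2.0 * R / ((5.0 - k) * gamma * C_CGS)


if __name__ == "__main__":
    # Hypothetical example: R = 1e17 cm, gamma = 100, for a uniform
    # medium (k = 0) and a stellar-wind-like medium (k = 2).
    for k in (0, 2):
        t_obs = observed_time(1e17, 100.0, k)
        t_co = comoving_time(1e17, 100.0, k)
        print(f"k={k}: t_obs = {t_obs:.1f} s, t_co = {t_co:.3e} s")
```

Note how a steeper density profile (larger $k$) lengthens both timescales at fixed radius and Lorentz factor, since the shock decelerates in a more dilute outer medium.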