I can help explain why. Radiative cooling relies on thermal radiation leaving an object; crucially, it also requires the object to absorb less thermal radiation than it emits. On earth we are surrounded by stuff, including air, that emits thermal radiation back at us. There is a band of wavelengths, called the atmospheric window[0], that lets part of that thermal radiation escape into space rather than being returned. Imagine shining a flashlight at tinted glass: depending on the color, the light either gets through and has escaped, or it is sent back and heats up your surroundings again.
Also, on earth the other methods (conduction, convection, and phase changes) are more effective; the earth itself can be used as a very big heat sink. On a spaceship or satellite you don't have that extra mass to dump the energy into, so radiative cooling is the only option.
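To put rough numbers on the emit-versus-absorb balance, here is a minimal Stefan-Boltzmann sketch. It assumes a grey body and ignores the atmospheric window entirely; the emissivity, area, and temperatures are made-up illustrative values, not from this thread:

    # Net radiative cooling for a grey body:
    # P_net = eps * sigma * A * (T_obj^4 - T_surr^4)
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    def net_radiative_power(eps, area_m2, t_obj_k, t_surr_k):
        """Net power radiated away (positive means the object cools)."""
        return eps * SIGMA * area_m2 * (t_obj_k**4 - t_surr_k**4)

    radiator = dict(eps=0.9, area_m2=1.0, t_obj_k=300.0)

    # Facing deep space (~3 K): almost nothing comes back, cooling works well.
    print(net_radiative_power(t_surr_k=3.0, **radiator))   # ~413 W

    # Facing ~290 K surroundings: most of the emission is effectively returned.
    print(net_radiative_power(t_surr_k=290.0, **radiator))  # ~52 W

The point is only that the net term depends on the fourth-power difference, so warm surroundings eat most of the benefit.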
m = ρV.
Let's simplify and assume we're using a sphere, since it encloses the most volume per unit surface area, giving V = (4/3)πr^3. Your shield is going to be approximately constant density since you need to shield from all directions (you can optimize by reusing other parts of your system as shielding).
m ∝ ρr^3
I'm not sure what is supposed to be decreasing here, or where the linear relationship comes from. To adjust this to a shell you just need to account for the thickness, Δr = r_outer - r_inner, and that doesn't take away the cubic relationship.
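A quick numerical sketch of the formulas above (the density, radii, and thickness are placeholder values, e.g. aluminium at roughly 2700 kg/m^3):

    from math import pi

    def sphere_mass(rho, r):
        """m = rho * V, with V = (4/3) * pi * r^3 for a solid sphere."""
        return rho * (4.0 / 3.0) * pi * r**3

    def shell_mass(rho, r_inner, r_outer):
        """Constant-density spherical shell of thickness dr = r_outer - r_inner."""
        return rho * (4.0 / 3.0) * pi * (r_outer**3 - r_inner**3)

    rho = 2700.0  # kg/m^3 (placeholder)
    for r in (1.0, 2.0, 4.0):
        # Doubling r multiplies the solid-sphere mass by 8 (the r^3 scaling).
        print(r, sphere_mass(rho, r), shell_mass(rho, r, r + 0.05))

Whether the shell mass keeps that cubic scaling then comes down to whether the required thickness stays fixed or has to grow with r.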
https://en.wikipedia.org/wiki/Thermal_radiation#Characterist...
https://en.wikipedia.org/wiki/Black-body_radiation
https://www.nasa.gov/smallsat-institute/sst-soa/thermal-cont...
https://ocw.mit.edu/courses/16-851-satellite-engineering-fal...
I am more concerned about heat dissipation: it should scale with surface area, while heat generation scales with compute volume.
[0]:
shell thickness, t
compute radius, r
shell volume is (r+t)^3 - r^3 = 3 r^2 t + 3 r t^2 + t^3 = O(r^2) for fixed t
shielding/compute is O(r^2)/O(r^3) = O(1/r), i.e. the linear decrease in question
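As a sanity check on that scaling argument, a tiny sketch (t and the radii are arbitrary illustrative values):

    # For fixed shell thickness t, shell volume (r+t)^3 - r^3 grows like r^2,
    # while compute volume grows like r^3, so their ratio falls off like 1/r.
    t = 0.1
    for r in (1.0, 10.0, 100.0):
        shell = (r + t)**3 - r**3   # shielding volume, O(r^2)
        compute = r**3              # compute volume, O(r^3)
        print(r, shell / compute)   # approaches 3*t/r for large r

Each tenfold increase in r cuts the shielding-to-compute ratio by roughly a factor of ten, which is the O(1/r) behavior above.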