Blender has been a hobby of mine since 2002, so as soon as I got involved with astronomy, I knew I wanted to use it to view 3D FITS files. On this page I describe some of my early efforts to get this working. Eventually I hit on far more successful techniques, but maybe these early ones are still worth knowing about, if only as a guide to what not to do.
My first approach to viewing FITS files was probably the most obvious. Each pixel in a FITS file can be imported as a vertex and given a special "halo" material that corresponds to (say) the flux or intensity value of that pixel. This works well enough if you just want to render the cube for a pretty movie, but it's only moderately useful if you want to view a cube in real time. Single-vertex halos can't be given any kind of real-time material in Blender, so if you need to display every pixel, all you'll see in the real-time view is a huge 3D grid of black dots: pretty when rendered, but nearly useless for interactive viewing.
Things get a bit better if you can apply an intensity cut when importing the cube. For the data sets I usually work with, most of the data is noise (which doesn't need to be imported), and the only interesting features are almost always unresolved. An example image is shown below; this was used on a poster I presented at the AAS meeting in Boston in 2011. In cases like this the method is not too bad: you can see the major structures clearly enough even in the real-time view. But as above, when very extended structures are present, this method really isn't much use at all.
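The data side of this thresholded point-cloud import can be sketched in plain Python. The function name and the synthetic test cube below are my own illustration; in practice the array would come from `astropy.io.fits`, and the resulting vertex list would be handed to Blender's `mesh.from_pydata(verts, [], [])` to build the point cloud:

```python
import numpy as np

def cube_to_vertices(cube, cut):
    """Return (x, y, z) vertex coordinates and flux values for every
    pixel in a 3D data cube whose value exceeds the flux cut.
    Pixels at or below the cut (mostly noise) are skipped entirely."""
    z, y, x = np.nonzero(cube > cut)      # indices of the bright pixels
    flux = cube[z, y, x]                  # intensity at each surviving vertex
    # Stack into an (N, 3) array of coordinates, in the form Blender's
    # mesh.from_pydata(verts, [], []) accepts for a vertex-only mesh.
    verts = np.column_stack((x, y, z)).astype(float)
    return verts, flux

# Tiny synthetic cube: Gaussian noise plus one bright unresolved feature.
rng = np.random.default_rng(0)
cube = rng.normal(0.0, 0.1, size=(8, 8, 8))
cube[4, 4, 4] = 5.0
verts, flux = cube_to_vertices(cube, cut=1.0)
```

With a cut well above the noise level, only the injected bright pixel survives, which is exactly the memory saving described above.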
This method is probably the least memory-intensive technique for importing files, but it has severe limitations. In Blender, meshes using halos could only be given a single material, so showing data at different intensity levels (even in the rendered view) requires importing separate groups of meshes, each assigned a different material. This quantises the data display and limits the dynamic range, which is far from ideal. And Blender can only handle a few million vertices at most, whereas the typical FITS files I need to inspect have ten or a hundred times that many pixels. That said, I've no idea how particles work in more modern versions of Blender (but see the n-body simulations page), so perhaps this old method could be given a new lease of life. One thing it was very useful for, however, was the glass cube project (link to be added), though I think some of the more advanced techniques would be suitable for that too.
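The quantisation step can be sketched like this (the function name and the hand-picked bin edges are illustrative, not my original code): each intensity bin collects its own set of pixel coordinates, and each set would become one Blender mesh carrying one halo material.

```python
import numpy as np

def quantise_cube(cube, levels):
    """Split pixels into intensity bins defined by the bin edges in
    `levels`. Each bin would become a separate Blender mesh with a
    single material, since halo meshes can only carry one material."""
    groups = []
    for lo, hi in zip(levels[:-1], levels[1:]):
        z, y, x = np.nonzero((cube > lo) & (cube <= hi))
        groups.append(np.column_stack((x, y, z)).astype(float))
    return groups

# Toy cube with one pixel in each of three intensity bins.
cube = np.zeros((4, 4, 4))
cube[0, 0, 0], cube[1, 1, 1], cube[2, 2, 2] = 0.5, 1.5, 2.5
groups = quantise_cube(cube, levels=[0.0, 1.0, 2.0, 3.0])
```

The coarser the bins, the fewer meshes and materials you need, but the worse the dynamic range of the display: that is the trade-off described above.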
To get something more useful in the real-time display, we need meshes with faces, which can be given compatible materials. This next method imports cubes instead of simple points. Unlike single-vertex points, Blender can correctly handle different colours (i.e. intensity values) and transparency levels for meshes with faces, in real time. While this makes the real-time display vastly more useful, it also eats up a lot more memory, and importing the cube can be quite slow (by comparison, importing point clouds is usually almost instant). You can't really import data sets of any real size with this method; as with the other limitations, this is down to Blender itself rather than the hardware, so even buying a high-spec PC won't help.
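The geometry generation for this cube-per-pixel approach might look something like the sketch below (a guess at the structure, not my original code): each bright pixel is expanded into the standard 8 corners and 6 quad faces of a unit cube, with face indices offset per pixel, in the flat-list form `mesh.from_pydata(verts, [], faces)` expects. This also makes the memory cost obvious: 8 vertices and 6 faces per pixel instead of 1 vertex.

```python
import numpy as np

# Unit-cube template: 8 corners and the 6 quad faces joining them.
CORNERS = np.array([(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
                    (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)], dtype=float)
FACES = [(0, 1, 2, 3), (4, 5, 6, 7), (0, 1, 5, 4),
         (2, 3, 7, 6), (1, 2, 6, 5), (0, 3, 7, 4)]

def pixels_to_cubes(coords):
    """Expand each bright-pixel coordinate into an 8-vertex cube.
    Returns flat vertex and face lists; face indices are offset by
    8 per pixel so all cubes can live in one mesh."""
    verts, faces = [], []
    for i, (x, y, z) in enumerate(coords):
        verts.extend((CORNERS + (x, y, z)).tolist())
        faces.extend(tuple(8 * i + v for v in f) for f in FACES)
    return verts, faces

# Two bright pixels become two cubes in a single mesh.
verts, faces = pixels_to_cubes([(0, 0, 0), (2, 0, 0)])
```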
Like the point-cloud method, this imports multiple sets of pixels into each mesh, since having each pixel/cube be its own object would use far too much memory, and for the same reason it also quantises the materials. The code is necessarily a lot more complex than the point-cloud version. It certainly didn't help that at the time astropy had a bug and often didn't convert the data correctly, claiming that large parts of the data were NaNs when they were nothing of the sort.
I made an abstract-art demo reel of these techniques, which you can see below.