Rhys Taylor's home page
Science, Art and Data Visualisation

Technical Skills

Observing Experience : Arecibo

I visited Arecibo in both 2007 and 2008 for observing runs, then spent over two years there as a postdoc (2011 – 2013). My observations consisted of HI mapping for the AGES and WAVES projects using the ALFA multibeam receiver (with the WAPP and Mock spectrometers), and follow-up observations using both ALFA and the L-band Wide receiver. I performed hundreds of hours of Arecibo observations both on-site and remotely, continuing to observe until shortly before the telescope collapsed. This gave me experience with the entire observing process, including :

  • Learning how to select a target field and optimise the area for efficient observing
  • Running scripts to generate the command files for CIMA. These would configure the instruments for the correct observing mode, select the targets, and control the integrations (e.g. for drift-scan mode or on-off observing)
  • Operation of the observing scripts in the CIMA software. This involved interactively choosing which targets to observe given the available time as well as monitoring the incoming data quality (especially to check for RFI and receiver problems)
  • Using CIMA scripts to check the drift-scan quality of the previous night's observations and making corrections to forthcoming observations accordingly, and also using IDL tools to perform quick, on-the-fly data reduction of follow-up observations
  • Learning what can go wrong, and what to do when a 900 tonne dome gets stuck at 2 am*

* The answer, of course, is to be nice to the telescope operator.



Observing Experience : Other Facilities

I have performed on-site observations with the IRAM 30m telescope (November – December 2017), carrying out CO(1-0) and CO(2-1) line observations using the EMIR receiver in wobbler-switching mode. I learned how to configure the telescope and its receivers, control the pointing, carry out the integrations, monitor data quality, and reduce the data in GILDAS.


I have some very limited experience with remote observations using the 100m Robert C. Byrd Green Bank Telescope. I also have experience in preparing observing scripts for the VLA and the Chinese FAST telescope.


Although my focus is on radio astronomy, my first observing run was a baptism of fire with the MDM 1.3m McGraw-Hill optical telescope at Kitt Peak. This involved manually steering the dome so the telescope could see out, filling the dewar with liquid nitrogen to cool the CCDs, visually inspecting the sky to check for clouds*, ensuring the telescope wasn't in danger of crashing into its own mount**, taking sky flats, bias, dark and other calibration images, and processing the data in IRAF. More recently (February 2023 – present) I have gained some experience of operating the Serbian 1.4m Milanković telescope for deep optical imaging.

* On a site which gets ~300 clear nights a year, four of ours were cloudy, and one night it even rained.

** Which was apparently a real possibility at certain zenith angles. This involved going into the dome between exposures to check the telescope's orientation.



Observing Support

At Arecibo I acted as a "Friend Of The Telescope" for seven projects. This involved providing all levels of user support for accepted projects, depending on the observer's experience. For first-time users this meant providing instructions on using CIMA in offline mode, to demonstrate how to configure and use the software for their own observations, and typically staying with them for the first night's observing in case of unexpected difficulties. For more experienced observers this was usually restricted to email conversations, e.g. to check data availability and scheduling. User support continued beyond the observations themselves, assisting with data reduction and analysis where necessary.


I also rewrote the "IDL Cheat Sheet" for Arecibo, which contained instructions for the most commonly-used IDL scripts and CIMA data-monitoring utilities. I re-ordered this to follow a more typical sequence of commands, and updated it to account for the latest, more functional scripts that obviated the need for certain earlier procedures. Additionally, I added instructions on configuring the ALFA data monitor for remote observers using VNC, which was essential for checking data quality in real time.


Since 2016 I have provided user support for accepted projects as a Contact Scientist of the Czech ARC Node for the ALMA Observatory. In each observing Cycle I typically support ~3 projects. As with Arecibo, duties range from non-existent (for experienced observers) to detailed helpdesk discussions of the telescope's technical capabilities, and especially change requests, co-ordinating the response with the Data Reduction Managers and the Joint ALMA Observatory.



Programming

I routinely write and use code in Python, including modules such as numpy, matplotlib, astropy, spectral-cube, PIL, scikit-image, and parmap for multiprocessing. For FRELLED alone I have written tens of thousands of lines of Python code, using Blender versions 2.4X, 2.7X and 2.9X, but I also write many standalone codes for data processing. These include baseline and bandpass estimation for more accurate data reduction and better visualisation, conversion of spherical-coordinate FITS files into Cartesian data, setting up and running batches of SPH simulations (executed in Fortran) and analysing and visualising the results, setting up aperture photometry for use with ds9, and the interactive, semi-automatic GLADoS source extraction program. A reasonably complete catalogue of the more useful code I've written can be found here.
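The baseline estimation idea can be sketched in a few lines of numpy (a simplified illustration of the technique, not my actual code — the function name and the synthetic data are purely for demonstration):

```python
import numpy as np

def subtract_baseline(spectrum, line_channels, order=2):
    """Fit a low-order polynomial to the line-free channels of a
    spectrum and subtract it, returning baseline-corrected data."""
    channels = np.arange(spectrum.size)
    # Mask out the channels containing the source's emission
    mask = np.ones(spectrum.size, dtype=bool)
    mask[line_channels] = False
    coeffs = np.polyfit(channels[mask], spectrum[mask], order)
    baseline = np.polyval(coeffs, channels)
    return spectrum - baseline

# Example : a curved baseline plus a synthetic "line"
x = np.arange(100)
spec = 0.01 * (x - 50)**2 + 3.0
spec[40:60] += 5.0                   # fake emission line
corrected = subtract_baseline(spec, np.arange(40, 60), order=2)
```

Real bandpasses are of course messier than a clean polynomial, which is where the more careful estimation methods come in.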


While Python is now by far my dominant programming language, I also have experience in IDL and Fortran. IDL was the language used for most of the observing, analysis and data reduction scripts at Arecibo. I wrote baseline subtraction algorithms in IDL as well as the first version of the source extraction program GLADoS.


My formal training in programming languages was in Fortran 90 at undergraduate level. During my PhD I acted as a teaching assistant in this same course, helping students with fundamental questions about programming and providing practical assistance, where necessary, with debugging their code.



Data Reduction And Analysis

I use the packages LIVEDATA and GRIDZILLA for calibrating and gridding the Arecibo ALFA observations. I analyse the results using MIRIAD, primarily with the mbspect task but using various others as well. The L-band Wide observations are calibrated and analysed using Arecibo's own IDL scripts. I also have some experience using GILDAS to reduce and analyse CO data from the IRAM 30 m telescope. I have some limited experience in aperture synthesis data reduction using AIPS++, CASA, and GILDAS.


For HI source extraction I rely on my own software packages FRELLED (for visual extraction) and GLADoS (automated), but I have also used POLYFIND (an early code developed for HIPASS data) and DUCHAMP. I have written more experimental routines for stacking HI data, both for pointed observations of individual galaxies and by spatially averaging differently-sized areas to search for diffuse HI.
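The principle behind spectral stacking is simple enough to sketch (an illustrative toy version, not the actual routines): align each spectrum on its galaxy's known velocity and average, so that signals too weak to detect individually add coherently while the noise averages down.

```python
import numpy as np

def stack_spectra(spectra, centre_channels, out_halfwidth):
    """Align each spectrum on its galaxy's central channel and
    average, boosting weak signals relative to the noise."""
    stacked = np.zeros(2 * out_halfwidth)
    for spec, c in zip(spectra, centre_channels):
        stacked += spec[c - out_halfwidth : c + out_halfwidth]
    return stacked / len(spectra)

# Fifty noisy spectra, each hiding a weak line at a different channel
rng = np.random.default_rng(42)
spectra, centres = [], []
for _ in range(50):
    spec = rng.normal(0.0, 1.0, 512)     # pure noise...
    c = int(rng.integers(100, 400))
    spec[c - 5 : c + 5] += 0.5           # ...plus a weak, offset line
    spectra.append(spec)
    centres.append(c)
stacked = stack_spectra(spectra, centres, 50)
# The line, invisible in any single spectrum, emerges in the stack
```

In practice the alignment is done in velocity rather than raw channel number, but the idea is the same.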


For optical data, I use ds9 with funtools for simple aperture photometry (I have also used GAIA for this), and IRAF packages (such as galfind, isomap, imarith and imcombine) for constructing surface brightness profiles, combining data fields, and data reduction. I also use IRAF for constructing unsharp masks and residual images for revealing faint, low-contrast structures in early-type galaxies.
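Unsharp masking itself is a very simple operation — subtract a smoothed copy of the image from the original, suppressing the bright, large-scale light so faint small-scale features stand out. A minimal sketch (using scipy here for illustration, rather than my actual IRAF workflow):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=5.0):
    """Subtract a Gaussian-smoothed version of an image from the
    original, revealing faint, low-contrast small-scale structure."""
    smooth = gaussian_filter(image, sigma)
    return image - smooth

# A smooth "galaxy" with a faint embedded ring-like feature
y, x = np.mgrid[0:128, 0:128]
r = np.hypot(x - 64, y - 64)
galaxy = np.exp(-r / 20.0)               # smooth light profile
galaxy[(r > 30) & (r < 33)] += 0.02      # faint low-contrast ring
residual = unsharp_mask(galaxy, sigma=8.0)
# The ring, swamped by the galaxy's light, dominates the residual
```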



Visualisation

See this page for details. In brief, I have been using Blender as a hobby since 2002, and for visualising astronomical data professionally since my PhD. In particular, development of FRELLED began in earnest in 2012 and has continued ever since. This combines the visual display capabilities of Blender with the data processing capabilities of Python, for example to display multi-volume 3D FITS data volumetrically (in real time), as isosurfaces, contours, or displacement maps, or even in virtual reality; importing n-body particles and vectors is also possible. Python code is used at all levels of the process, from cleaning the data, to optimising it for the different display options, to importing it into Blender in different formats (e.g. converting FITS arrays into a series of PNG images). Besides this, I am familiar, to varying degrees, with all aspects of Blender, such as video editing, armature rigging, animation, particles, UV mapping, and node setups for materials and compositing.


I have investigated alternative visualisation capabilities with Blender. For example, I produced proof-of-concept code that transforms data in arbitrary coordinate systems (e.g. spherical polar coordinates, often used both for disc simulations and for all-sky HI data) into fully-sampled Cartesian data. I experimented with displaying all-sky HI data as a series of nested spheres, with size based on velocity rather than true 3D distance, as well as encoding velocity as colour. I have visualised particle data as meshes, either pre-generating a series of meshes for rapid animations, or updating the vertex positions by incrementing the frame, which reduces the memory requirements. Particle data can be visualised as simple points, Gaussian "halos" (with sizes based on the SPH kernel for adaptive resolution), or even full Blender objects (e.g. an image of a galaxy, useful for public outreach); their colours can encode other information (e.g. temperature) to provide 3D maps of the different simulated quantities.
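The coordinate transformation can be sketched as follows — a deliberately simplified 2D nearest-neighbour version of the idea, not the proof-of-concept code itself, which handles the full 3D case:

```python
import numpy as np

def polar_to_cartesian(data, r_max, n_xy):
    """Resample a 2D dataset gridded in polar coordinates
    (radius x angle) onto a regular Cartesian grid by
    nearest-neighbour lookup; pixels beyond r_max become zero."""
    n_r, n_phi = data.shape
    coords = np.linspace(-r_max, r_max, n_xy)
    x, y = np.meshgrid(coords, coords)
    r = np.hypot(x, y)
    phi = np.arctan2(y, x) % (2 * np.pi)
    # Nearest polar cell for every Cartesian pixel
    i_r = np.clip((r / r_max * (n_r - 1)).round().astype(int), 0, n_r - 1)
    i_phi = np.clip((phi / (2 * np.pi) * (n_phi - 1)).round().astype(int),
                    0, n_phi - 1)
    out = data[i_r, i_phi]
    out[r > r_max] = 0.0
    return out

# A "disc" whose value depends only on radius : rows of constant
# value in polar coordinates become circles on the Cartesian grid.
polar = np.tile(np.linspace(1.0, 0.0, 50)[:, None], (1, 90))
cart = polar_to_cartesian(polar, r_max=1.0, n_xy=101)
```

Nearest-neighbour sampling is the crudest option; proper interpolation avoids the blocky artefacts it produces near the origin, where many Cartesian pixels map to few polar cells.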


Besides Blender and Python, I am also familiar with Steam VR Environments. Using the various workshop tools, I can set up objects, materials and lights, as well as configure information panels, trigger sounds, set up teleportation areas, and other basic facilities. I have written Python scripts to simplify the process of exporting data from Blender into Steam.


For quick visual inspection of HI data, I routinely use kvis from the Karma package (along with its Xray tool and others). I am also familiar with the virtual reality software iDaVIE for inspecting data cubes. For optical data I tend to rely on ds9.



Other

Although I do not claim any great proficiency with it, I have some knowledge of the FLASH grid-based hydrocode, setting up the initial conditions and running simulations of disc galaxies experiencing ram pressure and other effects. I have more experience with the SPH/n-body code "gf" (written in Fortran), used for performing simulations of galaxy harassment in a cluster. I developed Python code to set up and run simulations in batches, as well as to convert the output data into Blender-compatible formats.
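The batch set-up follows a common pattern — build one command line per point in a grid of physical parameters and launch each run in turn. A minimal sketch (the command-line flags here are purely illustrative, not gf's actual interface):

```python
import itertools
import subprocess

def run_batch(executable, parameter_grid, dry_run=False):
    """Run one simulation per combination of parameters. With
    dry_run=True, just return the command lines that would be run."""
    results = []
    keys = sorted(parameter_grid)
    for values in itertools.product(*(parameter_grid[k] for k in keys)):
        cmd = [executable] + [f"--{k}={v}" for k, v in zip(keys, values)]
        if dry_run:
            results.append(cmd)
        else:
            results.append(subprocess.run(cmd, capture_output=True))
    return results

# Dry run : build the command lines for a 2 x 2 grid of runs
cmds = run_batch("./gf", {"mass": [1e9, 1e10], "rperi": [50, 100]},
                 dry_run=True)
```

In practice each run also gets its own output directory and initial conditions file, but the grid-of-parameters loop is the heart of it.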


As an ALMA Contact Scientist I regularly use the ALMA Observing Tool, which configures observations at all the frequencies and array configurations available to ALMA. I participate in the annual testing of new AOT features.


I routinely use Linux, Windows, and the Windows Subsystem for Linux, often accessed remotely. In terms of basic software, I have extensive experience with LaTeX, Topcat (for managing astronomical catalogues), the Gimp, PhotoFiltre, VirtualDub, GoldWave, and Microsoft Word and PowerPoint.