Autophagic Symphony (completed)

This pilot project explored the potential of data sonification for understanding more about autophagy, the recycling system within cells.

Different sonification models were tested, including basic "audification" and "parameter-based" mapping approaches. The resulting sonifications of the image data make it possible to listen to the autophagy process. More work is needed before the sonifications become genuinely useful for analysis. Still, the pilot study has demonstrated the potential of sonification as an analysis method, and, most importantly, several modules are now available for continuing the exploration.

Sonification

Sonification is a method for turning data into audible sound. Just as data can be visualized in graphs or images, it can also be sonified using various music technology methods. This makes it possible to exploit the high temporal resolution of the human ear and explore data in ways that are difficult to achieve through visualization alone.

Autophagy

Autophagy is a vital process in which the body's cells "clean out" unnecessary or damaged components. Cells in living organisms are dynamic compartments continuously responding to changes in their environment to maintain physiological homeostasis. While basal autophagy exists in cells to aid in the regular turnover of cellular debris, starvation-induced autophagy is a critical cellular response to nutritional stress. However, it is of utmost importance that cells also terminate starvation-induced autophagy to prevent excessive cell damage. Still, surprisingly little is known about how cells terminate autophagy.

Pilot study

The pilot study was based on live imaging time-lapse data sets of Drosophila S2 cells during starvation and refeeding. Autophagy was initiated by starvation and terminated by refeeding. The cell images were obtained with a Nikon Sora SD confocal microscope, preprocessed, and exported as regular JPEG files. The image files were then processed with the Musical Gestures Toolbox for Python and a set of custom-built sonification modules.

Two approaches were tested during the pilot project: audification and parametric sonification. Both methods were based on encoding the image sequences into motiongrams and videograms: images that summarize each frame as its row- or column-wise average, stacked over time.

Sketch of the calculation of a motiongram.
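As a minimal illustration (not the toolbox code used in the project), the following NumPy sketch computes videograms and motiongrams from a folder of JPEG frames; the file naming and the greyscale conversion are assumptions made for the example.

    import glob
    import numpy as np
    from PIL import Image

    def load_frames(folder):
        """Load a folder of JPEG frames as a (time, height, width) greyscale array."""
        paths = sorted(glob.glob(f"{folder}/*.jpg"))  # assumed file naming
        return np.stack([np.asarray(Image.open(p).convert("L"), dtype=float) for p in paths])

    def videogram(frames, axis=1):
        """Collapse each frame to a 1-D average: axis=1 averages over image rows
        (one value per pixel column), axis=2 over columns (one value per row)."""
        return frames.mean(axis=axis)  # shape: (time, width) or (time, height)

    def motiongram(frames, axis=1):
        """Same reduction, but applied to absolute frame-to-frame differences,
        so only moving (changing) pixels contribute."""
        return np.abs(np.diff(frames, axis=0)).mean(axis=axis)

Each row of the resulting array corresponds to one frame, so plotting the array as an image gives the videogram or motiongram with time running along one axis.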

The audification treated the encoded images as spectrograms and resynthesized them by calculating the inverse FFT using the time axes of the motiongrams. These revealed only high-level patterns in the time series due to the noisy nature of the synthesis method.
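A rough sketch of this audification idea, assuming a (time, position) motiongram array like the one above: each time step is treated as a magnitude spectrum and converted to a short waveform frame with an inverse real FFT (zero phase), and the frames are concatenated. The project's actual module may differ in details such as frame length, overlap, and phase handling.

    import numpy as np
    import soundfile as sf

    def audify(gram, sr=44100, frame_len=1024):
        """Treat each time step of a (time, position) gram as one magnitude
        spectrum and resynthesize it with an inverse real FFT (zero phase)."""
        n_bins = frame_len // 2 + 1
        frames = []
        for spectrum in gram:
            # Resample the positional profile onto the FFT bins.
            mags = np.interp(np.linspace(0, len(spectrum) - 1, n_bins),
                             np.arange(len(spectrum)), spectrum)
            frames.append(np.fft.irfft(mags, n=frame_len))
        audio = np.concatenate(frames)
        return audio / (np.max(np.abs(audio)) + 1e-12)  # normalize to avoid clipping

    # Example (motiongram_x from the sketch above):
    # sf.write("audification.wav", audify(motiongram_x), 44100)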

The parametric sonification was based on sonifying the time series with sine waves. The row- and column-wise averages of the motiongrams were assigned to the frequencies of the sine waves (either as an "equal-tempered" distribution or as a harmonic series). This resulted in a rough mapping between cell position in the image and the corresponding sine wave in frequency space. Brightness was mapped to amplitude, and the time series to the sound's morphology. Because the red and green channels (derived from fluorescence imaging with red fluorescent protein (RFP) and green fluorescent protein (GFP)) were critical for imaging the autophagy process, they were sonified separately and used as the left and right channels of the stereo sonification. This way, the crucial moments when GFP and RFP overlapped in a particular area of the image were reproduced as sinusoidal tones in a specific frequency band that "floated" from the sides to the centre of the panoramic space of the sound (Table 1).

Image data                         Synthesis parameter
Pixel position (x/y)               Frequency (pitch)
Pixel brightness                   Amplitude (loudness)
Colour channel (GFP or RFP)        Audio channel (left or right)
Time (consecutive snapshots)       Time (sound evolution, duration)

Table 1. Mapping between image data and synthesis parameters.
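A minimal sketch of this mapping, assuming one (time, position) videogram per colour channel: each positional band drives a sine oscillator whose frequency lies on a harmonic series above a base frequency, its amplitude follows the brightness of that band over time, and the GFP and RFP sonifications are written to the left and right channels of a stereo file. The sine count and base frequency follow the parameters listed further below (16 sines, lowest frequency 110 Hz, harmonic series); everything else (duration, variable names) is illustrative rather than the project's exact module.

    import numpy as np
    import soundfile as sf

    def sonify(gram, duration=27.0, sr=44100, n_sines=16, f0=110.0):
        """Sine-bank sonification of a (time, position) gram:
        position -> frequency (harmonic series over f0), brightness -> amplitude."""
        n_samples = int(duration * sr)
        t = np.arange(n_samples) / sr
        frame_times = np.linspace(0, duration, gram.shape[0])
        # Split the positional axis into n_sines bands (rough position -> pitch mapping).
        bands = np.array_split(np.arange(gram.shape[1]), n_sines)
        out = np.zeros(n_samples)
        for k, band in enumerate(bands):
            brightness = gram[:, band].mean(axis=1)              # band brightness per frame
            envelope = np.interp(t, frame_times, brightness)     # stretch to audio rate
            out += envelope * np.sin(2 * np.pi * f0 * (k + 1) * t)  # harmonic series
        return out / (np.max(np.abs(out)) + 1e-12)

    # GFP to the left channel, RFP to the right:
    # stereo = np.column_stack([sonify(videogram_gfp), sonify(videogram_rfp)])
    # sf.write("sonification.wav", stereo, 44100)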

Sonification steps (pseudo code)

  • Load folder of images containing the ESF, PBSGA and PBSGB groups in starvation (ST) and refed (RF) stages, using RFP and GFP imaging: ESF (fed for 19 hours), PBSGA (starved for 19 hours) and PBSGB (starved for 4 hours and refed for 15 hours)
  • Create dataset, extract metadata from file names
  • Separate ESF, PBSGA and PBSGB groups
  • In each group, separate RFP and GFP colour channels
  • Render both ST and RF image sequences into videos for all three groups in both colour channels in all four visiting points, resulting in 2*3*2*4=48 videos (shared here)
  • Create X-axis and Y-axis videograms of each video, resulting in 48*2=96 videograms (shared here)
  • Sonify all 96 videograms (sonification parameters: 16 sines per image, lowest frequency is 110 Hz, the sines set to a harmonic series), resulting in 96 sound files (shared here)
  • For each group (see the mixing sketch after this list):
    • Set all ST parts to start at t=0 sec
    • Set all RF parts to start at t=28 sec (creating a one-second gap after the end of the ST parts)
    • Mix all visiting points of GFP channels to the left
    • Mix all visiting points of RFP channels to the right
    • Render stereo file
  • Normalize files together (keeping loudness ratios between them)
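A sketch of the arrangement steps above, under assumed file lengths and variable names: each group's ST sounds start at t = 0 and its RF sounds at t = 28 s, all GFP visiting points are summed into the left channel and all RFP ones into the right, and the finished group mixes are finally scaled by one shared factor so that the loudness ratios between them are preserved.

    import numpy as np
    import soundfile as sf

    SR = 44100
    RF_OFFSET = 28 * SR  # RF parts start at t = 28 s (one-second gap after the ST parts)

    def place(buffer, sound, start):
        """Add a mono sound into a longer buffer at a given start sample."""
        buffer[start:start + len(sound)] += sound

    def mix_group(st_gfp, rf_gfp, st_rfp, rf_rfp, total_seconds=60):
        """One stereo group mix: GFP left, RFP right, ST at 0 s, RF at 28 s.
        Each argument is a list of mono arrays (one per visiting point)."""
        left = np.zeros(total_seconds * SR)
        right = np.zeros(total_seconds * SR)
        for s in st_gfp: place(left, s, 0)
        for s in rf_gfp: place(left, s, RF_OFFSET)
        for s in st_rfp: place(right, s, 0)
        for s in rf_rfp: place(right, s, RF_OFFSET)
        return np.column_stack([left, right])

    # Normalize all group mixes with one shared factor to keep their loudness ratios:
    # mixes = {"ESF": esf, "PBSGA": pbsga, "PBSGB": pbsgb}
    # peak = max(np.max(np.abs(m)) for m in mixes.values())
    # for name, m in mixes.items():
    #     sf.write(f"{name}.wav", m / peak, SR)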

Although this parametric method was simple, it expressed the patterns in the data more clearly than the audification. Comparing the sonification with the image analysis, we found matching patterns, although buried in a significant amount of noise. This was most likely due to the coarse mapping between image and frequency space, the data-reducing nature of motiongrams, and the simplistic, direct mapping to acoustic parameters. Below are the results of the ESF, PBSGA and PBSGB groups shown as stereo spectrograms (X-axis: time, Y-axis: frequency) with links to their respective sonification sound files.

Cells in complete nutrition

Fed for 19 hours, basal autophagy, ESF group.

Cells without nutrition

Starved for 19 hours, starvation-induced autophagy, PBSGA group.

Cells without nutrition where nutrients are replenished after 4 hours

Starved for 4 hours and refed for 15 hours, starvation-induced autophagy followed by nutrient-induced autophagy termination, PBSGB group.
