
With the variety of modern data acquisition tools at our disposal, acquiring high-quality physiological data has become much less of a challenge than in times past. Following some basic guidelines will usually result in very good data without a lot of mystery or guesswork. But because there is no perfect scenario, there are occasions when unwanted noise or artifact creeps into the data, which can skew the analysis. In the past, we were stuck with these outliers and forced to ignore or compensate for them in some fashion.

An analogy to this can be found in the way music is recorded. In the early days of recorded music, not only was the sound quality poor, but the performance was recorded directly to a wax cylinder. If the performance was flawed, or the musician shuffled his feet or made some other bodily sound, it was immortalized. The stakes were very high for delivering a technically perfect performance. With the advent of magnetic tape, it became easier to correct errors by repeating a performance until a good take was achieved, or even by splicing together tape from different performances. In modern digital recording, however, the degree of control over unwanted artifacts or small errors is surgically precise. If noise creeps into an otherwise faultless recording, the flaw can often simply be snipped out, leaving the good parts intact. Larger errors can be corrected through edits or by equalizing out unwanted frequencies.

This is similar to what we can achieve with modern data acquisition software such as AcqKnowledge or BSL PRO. There is a robust suite of tools for eliminating noise or artifact from an otherwise usable experiment. Let’s briefly explore some of these tools.

If the data is simply a bit noisy overall, median smoothing is a very effective tool for removing artifact because it removes the fast-moving, noise-related components. Noise can be identified as “blips” in the waveform that are not physiologically related. Smoothing tones this artifact down and adds clarity to the waveform. Smoothing can be applied pre-emptively in a calculation channel while recording, or applied after the acquisition as a transformation.
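To make the idea concrete, here is a minimal Python sketch of median smoothing using SciPy. It illustrates the technique itself, not the AcqKnowledge implementation, and the sampling rate, test signal, and window width are assumptions chosen purely for demonstration.

```python
# Minimal sketch of median smoothing on a noisy trace (illustrative only).
import numpy as np
from scipy.signal import medfilt

fs = 1000                                 # assumed sampling rate, Hz
t = np.arange(0, 5, 1 / fs)
signal = np.sin(2 * np.pi * 1.0 * t)      # stand-in for a slow physiological wave
signal[2500] += 2.0                       # inject a fast, non-physiological "blip"

# A median filter replaces each sample with the median of its neighbors,
# suppressing short spikes while leaving the slower waveform largely intact.
smoothed = medfilt(signal, kernel_size=51)  # window width is a tuning choice
```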

There are also a number of filters that can be applied, such as FIR, IIR, Adaptive or Comb Band Stop. The type of filter used is often dictated by the type of data being acquired. For example, EDA skin conductance data can be enhanced by performing median smoothing followed by a low pass IIR filter fixed at 1 Hz.
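The same EDA recipe can be sketched in Python for illustration. A Butterworth design stands in here for the low-pass IIR filter, and the sampling rate and placeholder trace are assumptions; in practice the equivalent steps would be applied with the AcqKnowledge transformations themselves.

```python
# Sketch of the EDA recipe above: median smoothing, then a 1 Hz low-pass IIR.
import numpy as np
from scipy.signal import medfilt, butter, filtfilt

fs = 125                                   # assumed EDA sampling rate, Hz
eda = np.random.default_rng(0).normal(5.0, 0.05, fs * 60)  # placeholder trace

step1 = medfilt(eda, kernel_size=15)       # remove fast, non-physiological blips

# Second-order low-pass IIR fixed at 1 Hz; filtfilt runs it forward and
# backward so the filtered trace is not phase-shifted.
b, a = butter(N=2, Wn=1.0, btype="low", fs=fs)
step2 = filtfilt(b, a, step1)
```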

A quick way to remove a “blip” or noise artifact over a small area is to highlight the portion of the waveform containing the blip and choose Transform > Math Functions > Connect Endpoints. This action performs a linear interpolation between the left and right edges of the selected area, effectively removing the artifact and leaving the valid data intact.
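The underlying operation amounts to nothing more than drawing a straight line between the two edge samples. A small Python sketch, with hypothetical sample indices, illustrates the idea:

```python
# Illustrative sketch of what Connect Endpoints does: linearly interpolate
# between the samples at the left and right edges of a selected region.
import numpy as np

def connect_endpoints(data, start, stop):
    """Replace data[start:stop+1] with a straight line between its endpoints."""
    cleaned = data.copy()
    cleaned[start:stop + 1] = np.linspace(data[start], data[stop],
                                          stop - start + 1)
    return cleaned

# Example: remove a blip spanning samples 480-520 of a made-up waveform.
wave = np.sin(np.linspace(0, 10, 1000))
wave[480:521] += 1.5                       # injected artifact
repaired = connect_endpoints(wave, 480, 520)
```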

There are also processes in AcqKnowledge for setting up automated scripting to comb through the data and remove detected outliers. This is especially useful for long experiments where it might become tedious to scroll through and manually identify areas of noise and artifact. Once the script is set up, it can be accurately repeated on subsequent data files with one click.
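The scripting itself lives inside AcqKnowledge, so the sketch below is only a Python illustration of the general idea: flag samples that deviate sharply from a running median, then bridge each flagged region the same way Connect Endpoints does. The threshold, window width, and demo trace are all assumptions.

```python
# Illustrative automated sweep: detect outliers against a running median and
# bridge them by linear interpolation (not the AcqKnowledge scripting API).
import numpy as np
from scipy.ndimage import median_filter

def remove_outliers(data, threshold, kernel_size=51):
    """Flag samples deviating from a running median by more than `threshold`,
    then replace each flagged region with linearly interpolated values."""
    cleaned = np.asarray(data, dtype=float).copy()
    baseline = median_filter(cleaned, size=kernel_size, mode="nearest")
    bad = np.abs(cleaned - baseline) > threshold
    good_idx = np.flatnonzero(~bad)
    cleaned[bad] = np.interp(np.flatnonzero(bad), good_idx, cleaned[good_idx])
    return cleaned

# Demo on a synthetic trace with one injected artifact run (values arbitrary).
trace = np.sin(np.linspace(0, 20, 2000))
trace[900:910] += 3.0
cleaned_trace = remove_outliers(trace, threshold=0.5)
```

Once the settings are dialed in, the same sweep can be rerun on each new record without re-tuning, which is the appeal of scripting this step.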

This brief journey represents but a small sampling of the tools available for removing imperfections and non-physiological artifacts from data. For a more in-depth look at enhancing and refining data, we invite you to view the following free, on-demand webinars.

BIOPAC offers a wide array of wired and wireless equipment that can be used in your research. To find more information on solutions for recording and analyzing signals such as ECG, heart rate, respiration, and more using any of the platforms mentioned in this blog post, you can visit the individual application pages on the BIOPAC website.

Photo by Abraham Osorio on Unsplash

< Find more solutions at BIOPAC.COM