Classes
Classify([isLibrary])
Analysis methods for imhr.processing.preprocesing.
Metadata([isLibrary])
Process participants' metadata for analysis and export.
Processing(config[, isLibrary, isDebug])
Hub for processing and analyzing raw data.
raw([is_library])
Processing summary data for output.
redcap()
Downloading data from REDCap.
imhr.Webgazer.Classify(isLibrary=False)
Bases: imhr.Webgazer.classify.Classify
Analysis methods for imhr.processing.preprocesing.
Methods
Acceleration(time, data_x[, data_y])
Calculate the acceleration (deg/sec/sec) for data points in data_x and (optionally) data_y, using the time numpy array for time delta information.
Velocity(time, config, d_x[, d_y])
Calculate the instantaneous velocity (degrees/second) for data points in d_x and (optionally) d_y, using the time numpy array for time delta information.
VisualAngle(g_x, g_y, config)
Convert pixel eye-coordinates to visual angle.
hmm(data, filter_type, config)
Hidden Markov Model, adapted from https://gitlab.com/nslr/nslr-hmm.
idt(data, dis_threshold, dur_threshold)
Identification with Dispersion Threshold.
ivt(data, v_threshold, config)
Identification with Velocity Threshold.
savitzky_golay(y, window_size, order[, …])
Smooth (and optionally differentiate) data with a Savitzky-Golay filter.
simple(df, missing, maxdist, mindur)
Detects fixations, defined as consecutive samples with an inter-sample distance of less than a set amount of pixels (disregarding missing data).
Acceleration(time, data_x, data_y=None)
Calculate the acceleration (deg/sec/sec) for data points in data_x and (optionally) data_y, using the time numpy array for time delta information.
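As a rough illustration of the calculation (a minimal sketch with hypothetical 1D array inputs; the library's method also accepts an optional second trace), acceleration can be taken as the sample-to-sample change in velocity:

```python
import numpy as np

def acceleration(time, data_x):
    """Acceleration (deg/sec/sec) as the rate of change of the
    sample-to-sample velocity. Sketch only, for a single 1D trace."""
    dt = np.diff(time)               # time deltas (seconds)
    v = np.diff(data_x) / dt         # instantaneous velocity (deg/sec)
    return np.diff(v) / dt[1:]       # change in velocity per second

t = np.array([0.0, 0.1, 0.2, 0.3])
x = np.array([0.0, 1.0, 3.0, 6.0])   # visual degrees
acc = acceleration(t, x)             # array([100., 100.])
```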
Velocity(time, config, d_x, d_y=None)
Calculate the instantaneous velocity (degrees/second) for data points in d_x and (optionally) d_y, using the time numpy array for time delta information.
Notes
Numpy arrays time, d_x, and d_y must all be 1D arrays of the same length. If both d_x and d_y are provided, the Euclidean distance between each pair of points is calculated and used in the velocity calculation. Time must be in seconds.msec units, while d_x and d_y are expected to be in visual degrees. If the position traces are in pixel coordinate space, use the VisualAngleCalc class to convert the data into degrees.
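The notes above can be sketched as follows (a minimal illustration assuming already-converted visual-degree traces; the config argument of the real method is omitted):

```python
import numpy as np

def velocity(time, d_x, d_y=None):
    """Instantaneous velocity (deg/sec); if d_y is given, the
    Euclidean distance between successive samples is used."""
    dt = np.diff(time)                 # per-sample time deltas
    dx = np.diff(d_x)
    dist = np.hypot(dx, np.diff(d_y)) if d_y is not None else np.abs(dx)
    return dist / dt

t = np.array([0.00, 0.01, 0.02, 0.03])   # seconds
x = np.array([0.0, 0.1, 0.3, 0.6])       # visual degrees
v = velocity(t, x)                        # array([10., 20., 30.])
```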
VisualAngle(g_x, g_y, config)
Convert pixel eye-coordinates to visual angle.
Notes
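A minimal sketch of the underlying geometry (the parameter names and the source of screen size and viewing distance here are assumptions; the library reads these from config):

```python
import math

def pixels_to_degrees(px, screen_px, screen_cm, distance_cm):
    """Convert a pixel extent to visual angle (degrees) using the
    standard 2 * atan(size / (2 * distance)) relation."""
    cm = px * screen_cm / screen_px            # pixel extent -> centimetres
    return math.degrees(2 * math.atan2(cm, 2 * distance_cm))

# 100 px on a 1920 px wide, 53 cm wide screen viewed from 60 cm
angle = pixels_to_degrees(100, 1920, 53.0, 60.0)   # ~2.64 degrees
```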
hmm(data, filter_type, config)
Hidden Markov Model, adapted from https://gitlab.com/nslr/nslr-hmm.
Notes
References
[1] Pekkanen, J., & Lappi, O. (2017). A new and general approach to signal denoising and eye movement classification based on segmented linear regression. Scientific Reports, 7(1). doi:10.1038/s41598-017-17983-x.
idt(data, dis_threshold, dur_threshold)
Identification with Dispersion Threshold.
Notes
The I-DT algorithm has two parameters: a dispersion threshold and the length of a time window in which the dispersion is calculated. The length of the time window is often set to the minimum duration of a fixation, which is around 100-200 ms.
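A compact sketch of that procedure (hypothetical inputs and return format; the library's idt method operates on its own data structure and thresholds from config):

```python
import numpy as np

def dispersion(xs, ys):
    """Spread of a window: (max x - min x) + (max y - min y)."""
    return (xs.max() - xs.min()) + (ys.max() - ys.min())

def idt(t, x, y, dis_threshold, dur_threshold):
    """I-DT sketch: returns (start, end) times of detected fixations."""
    fixations = []
    i, n = 0, len(t)
    while i < n:
        j = i + 1
        # grow the window until it spans the minimum fixation duration
        while j < n and t[j] - t[i] < dur_threshold:
            j += 1
        if j == n and t[n - 1] - t[i] < dur_threshold:
            break                       # too little data left
        if dispersion(x[i:j], y[i:j]) <= dis_threshold:
            # keep growing while dispersion stays under the threshold
            while j < n and dispersion(x[i:j + 1], y[i:j + 1]) <= dis_threshold:
                j += 1
            fixations.append((t[i], t[j - 1]))
            i = j
        else:
            i += 1                      # slide the window forward
    return fixations

t = np.arange(0, 210, 10)                     # 21 samples, 10 ms apart
x = np.array([100.0] * 10 + [300.0] * 11)     # two stable gaze clusters
y = np.zeros(21)
fix = idt(t, x, y, dis_threshold=5, dur_threshold=100)
# fix == [(0, 90), (100, 200)]
```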
ivt(data, v_threshold, config)
Identification with Velocity Threshold.
In the I-VT model, a velocity value is computed for every eye-position sample and compared to the threshold. If the sampled velocity is less than the threshold, the corresponding sample is marked as part of a fixation; otherwise it is marked as part of a saccade.
Notes
From https://github.com/ecekt/eyegaze. Formula from: https://dl.acm.org/citation.cfm?id=355028
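That decision rule can be sketched in a few lines (1D positions and hypothetical label strings; the real method also takes a config argument):

```python
import numpy as np

def ivt_labels(time, x, v_threshold):
    """Mark each sample 'fixation' or 'saccade' by comparing its
    instantaneous velocity to the threshold."""
    v = np.abs(np.diff(x)) / np.diff(time)   # velocity per sample pair
    v = np.append(v, v[-1])                  # pad to align with samples
    return np.where(v < v_threshold, "fixation", "saccade")

t = np.array([0.00, 0.01, 0.02, 0.03])
x = np.array([0.0, 0.05, 2.0, 2.05])         # one large jump (saccade)
labels = ivt_labels(t, x, v_threshold=100)
# ['fixation', 'saccade', 'fixation', 'fixation']
```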
savitzky_golay(y, window_size, order, deriv=0, rate=1)
Smooth (and optionally differentiate) data with a Savitzky-Golay filter.
The Savitzky-Golay filter removes high-frequency noise from data. It has the advantage of preserving the original shape and features of the signal better than other types of filtering approaches, such as moving-average techniques.
Notes
The Savitzky-Golay filter is a type of low-pass filter, particularly suited for smoothing noisy data. The main idea behind this approach is to make for each point a least-squares fit with a high-order polynomial over an odd-sized window centered at the point. For more information, see: http://wiki.scipy.org/Cookbook/SavitzkyGolay.
Examples
>>> import numpy as np
>>> t = np.linspace(-4, 4, 500)
>>> y = np.exp( -t**2 ) + np.random.normal(0, 0.05, t.shape)
>>> ysg = savitzky_golay(y, window_size=31, order=4)
>>> import matplotlib.pyplot as plt
>>> plt.plot(t, y, label='Noisy signal')
>>> plt.plot(t, np.exp(-t**2), 'k', lw=1.5, label='Original signal')
>>> plt.plot(t, ysg, 'r', label='Filtered signal')
>>> plt.legend()
>>> plt.show()
References
[1] A. Savitzky, M. Golay (1964). Smoothing and Differentiation of Data by Simplified Least Squares Procedures. Analytical Chemistry, 36(8), pp. 1627-1639.
[2] W.H. Press, S.A. Teukolsky, W.T. Vetterling, B.P. Flannery. Numerical Recipes 3rd Edition: The Art of Scientific Computing. Cambridge University Press. ISBN-13: 9780521880688.
simple(df, missing, maxdist, mindur)
Detects fixations, defined as consecutive samples with an inter-sample distance of less than a set amount of pixels (disregarding missing data).
Notes
From https://github.com/esdalmaijer/PyGazeAnalyser/blob/master/pygazeanalyser/detectors.py
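The detector above can be sketched with plain arrays (a hypothetical argument layout; the library's simple method takes a DataFrame and a missing-data sentinel value):

```python
import numpy as np

def simple_fixations(t, x, y, maxdist, mindur):
    """Runs of consecutive samples whose inter-sample distance stays
    below maxdist, kept only if they last at least mindur."""
    dist = np.hypot(np.diff(x), np.diff(y))   # inter-sample distance
    fixations, start = [], 0
    for i, d in enumerate(dist):
        if d >= maxdist:                       # run broken by a jump
            if t[i] - t[start] >= mindur:
                fixations.append((t[start], t[i]))
            start = i + 1
    if t[-1] - t[start] >= mindur:             # close the final run
        fixations.append((t[start], t[-1]))
    return fixations

t = np.arange(0, 60, 10)                       # ms
x = np.array([0.0, 1.0, 2.0, 100.0, 101.0, 102.0])
y = np.zeros(6)
fix = simple_fixations(t, x, y, maxdist=25, mindur=15)
# fix == [(0, 20), (30, 50)]
```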
imhr.Webgazer.Metadata(isLibrary=False)
Bases: imhr.Webgazer.metadata.Metadata
Process participants' metadata for analysis and export.
Methods
predict(df)
Predicting screen size (cm) and device (e.g. macbook 2018).
summary(df, path)
Preparing data for use in analysis.
predict(df)
Predicting screen size (cm) and device (e.g. macbook 2018).
summary(df, path)
Preparing data for use in analysis.
Notes
You can either get data from all files within a directory (directory), or from a specific subject (subject_session).
Examples
>>> #if using path:
>>> df = getData(path=self.config['path'])
>>> #if getting data for single subject:
>>> df = getData(path=self.config['path'],subject_session=['1099','1', '0'])
imhr.Webgazer.Processing(config, isLibrary=False, isDebug=False)
Bases: object
Hub for processing and analyzing raw data.
Methods
append_classify(self, df, cg_df)
Appending classification to DataFrame.
classify(self, config, df[, ctype, …])
The I-DT algorithm takes into account the distribution or spatial proximity of eye-position points in the eye-movement trace.
dwell(self, df[, cores, isMultiprocessing])
Calculate dwell time for sad and neutral images.
filter_data(self, df, filter_type, config)
Butterworth: Design an Nth-order digital or analog Butterworth filter and return the filter coefficients.
getData(self[, path])
Preparing data for use in analysis.
getEstimatedMonitor(self, diagonal, window)
Calculate estimated monitor size (w, h; cm) using the estimated monitor diagonal (hypotenuse; cm).
onset_diff(self, df0[, merge, cores])
Calculate differences in onset presentation (stimulus, dotloc) using bokeh, seaborn, and pandas.
preprocess(self, df, window)
Initial data cleaning.
process(self, window, filters, gxy_df, trial)
Plotting and preparing data for classification.
roi(self[, filters, flt, df, manual, …])
Check if fixation is within bounds.
run(self, path[, task_type, single_subject, …])
Processing of data.
subject_metadata(self, fpath, spath)
Collect all subjects' metadata.
variables(self, df)
Output list of variables for easy HTML viewing.
append_classify(self, df, cg_df)
Appending classification to DataFrame.
classify(self, config, df, ctype='ivt', filter_type=None, v_th=None, dr_th=None, di_th=None, missing=None, maxdist=None, mindur=None)
The I-DT algorithm takes into account the distribution or spatial proximity of eye-position points in the eye-movement trace.
In the I-VT model, a velocity value is computed for every eye-position sample and compared to the threshold. If the sampled velocity is less than the threshold, the corresponding sample is marked as part of a fixation; otherwise it is marked as part of a saccade.
The simple model detects fixations, defined as consecutive samples with an inter-sample distance of less than a set amount of pixels (disregarding missing data).
dwell(self, df, cores=1, isMultiprocessing=False)
Calculate dwell time for sad and neutral images.
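As an illustration of a dwell-time computation (the column names here are hypothetical, not the library's actual schema), dwell time per image type is the summed duration of the fixations landing on that image:

```python
import pandas as pd

# Hypothetical fixation table: one row per fixation, with the image
# it landed on and its duration in milliseconds.
fixations = pd.DataFrame({
    "trial": [1, 1, 1, 2, 2],
    "image": ["sad", "neutral", "sad", "neutral", "neutral"],
    "duration": [250, 180, 120, 300, 90],
})
dwell = fixations.groupby(["trial", "image"])["duration"].sum()
# trial 1, sad -> 370 ms; trial 2, neutral -> 390 ms
```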
filter_data(self, df, filter_type, config)
Butterworth: Design an Nth-order digital or analog Butterworth filter and return the filter coefficients.
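For the Butterworth option, a sketch using scipy (the order, cutoff, and sampling rate below are assumed example values, not the library's defaults):

```python
import numpy as np
from scipy import signal

fs = 60.0                                   # assumed sampling rate (Hz)
# 2nd-order low-pass Butterworth at 10 Hz; returns filter coefficients
b, a = signal.butter(N=2, Wn=10.0, btype="low", fs=fs)

rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / fs)
noisy = np.sin(2 * np.pi * 2 * t) + 0.3 * rng.normal(size=t.size)
smoothed = signal.filtfilt(b, a, noisy)     # zero-phase filtering
```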
getData(self, path=None)
Preparing data for use in analysis.
Notes
You can either get data from all subjects within a directory, or from a specific subject (subject_session).
Examples
>>> #if using path:
>>> df_raw = getData(path=self.config['path']['raw'])
>>> #if getting data for single subject:
>>> df_raw = getData(path=self.config['path']['raw'],subject_session=['1099','1', '0'])
getEstimatedMonitor(self, diagonal, window)
Calculate estimated monitor size (w, h; cm) using the estimated monitor diagonal (hypotenuse; cm).
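The geometry is a Pythagorean split of the diagonal (the fixed aspect ratio here is an assumption for illustration; the actual method derives proportions from the window):

```python
import math

def estimated_monitor_size(diagonal_cm, aspect=(16, 9)):
    """Split a diagonal (hypotenuse, cm) into width and height (cm)
    for an assumed aspect ratio."""
    w_r, h_r = aspect
    unit = diagonal_cm / math.hypot(w_r, h_r)   # cm per aspect unit
    return w_r * unit, h_r * unit

w, h = estimated_monitor_size(33.8)   # a 13.3-inch 16:9 display
# w ~ 29.5 cm, h ~ 16.6 cm; hypot(w, h) recovers 33.8
```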
onset_diff(self, df0, merge=None, cores=1)
Calculate differences in onset presentation (stimulus, dotloc) using bokeh, seaborn, and pandas.
preprocess(self, df, window)
Initial data cleaning.
process(self, window, filters, gxy_df, trial, _classify=True, ctype='simple', _param='', log=False, v_th=20, dr_th=200, di_th=20, _missing=0.0, _maxdist=25, _mindur=50)
Plotting and preparing data for classification. Combined plot of each filter.
roi(self, filters=None, flt=None, df=None, manual=False, monitorSize=None)
Check if fixation is within bounds.
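A minimal bounds check (rectangular ROI with a hypothetical (left, top, right, bottom) pixel layout; the library's method derives its regions from filters and monitor size):

```python
def in_roi(x, y, bounds):
    """True if a fixation centroid (x, y) falls inside the ROI."""
    left, top, right, bottom = bounds
    return left <= x <= right and top <= y <= bottom

roi_bounds = (860, 340, 1060, 740)   # hypothetical pixel region
inside = in_roi(960, 540, roi_bounds)    # True: centre of the box
outside = in_roi(100, 540, roi_bounds)   # False: left of the box
```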
run(self, path, task_type='eyetracking', single_subject=False, single_trial=False, subject=0, trial=0, isMultiprocessing=True, cores=1)
Processing of data. Steps include cleaning data, fixation identification, and exporting data.
subject_metadata(self, fpath, spath)
Collect all subjects' metadata.
variables(self, df)
Output list of variables for easy HTML viewing.
imhr.Webgazer.raw(is_library=False)
Bases: object
Processing summary data for output.
Methods
download(self, l_exp, log_path, save_path, …)
Download raw data for use in analysis.
library(self)
Check if required libraries are available.
imhr.Webgazer.redcap()
Bases: object
Downloading data from REDCap.
Methods
cesd(path, filename, token, url, report_id)
Download CES-D data for use in analysis.
demographics(path, filename, token, url, …)
Download demographics data for use in analysis.
mmpi(path, filename, token, url, report_id)
Download MMPI data for use in analysis.
cesd(path, filename, token, url, report_id)
Download CES-D data for use in analysis.
demographics(path, filename, token, url, report_id, payload=None)
Download demographics data for use in analysis.
Notes
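For reference, a REDCap report export is a POST of a form-encoded payload to the API endpoint. The sketch below only prepares the request so it can be inspected (the URL, token, and report id are placeholders, not real values):

```python
import requests

payload = {
    "token": "YOUR_API_TOKEN",     # placeholder credentials
    "content": "report",
    "report_id": "12345",          # placeholder report id
    "format": "csv",
    "returnFormat": "json",
}
# Prepare (but do not send) the request to see what would be posted;
# it could be sent with requests.Session().send(req).
req = requests.Request("POST", "https://redcap.example.edu/api/", data=payload).prepare()
```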