Automatic detection of epileptiform events in EEG recordings

An electroencephalogram (EEG) is the most important tool in the diagnosis of seizure disorders. Between seizures, epileptiform activity appears in EEG recordings as spikes or spike-and-slow-wave complexes. Developing an automated EEG interpretation algorithm that is well accepted by clinicians has been a research goal for decades. As a participant in an NSF-funded Research Experiences for Undergraduates (REU) program hosted at the Clemson University School of Computing, I continued this endeavor by developing an automated system that detects epilepsy-related events, in real time, from scalp EEG recordings.

To find the optimal algorithm for this purpose, I constructed a multi-stage processing pipeline. In the first stage, I cleaned the clinical data gathered from 100 epileptic patients and partitioned them for cross-validation. Next, I used wavelet transforms to generate features from the EEG signal in a “sliding window” approach. I then applied machine learning algorithms and analyzed their performance in classifying data patterns as epileptiform versus other activity. At this stage I also explored using a hidden Markov model to fit the time sequence in which epileptiform events occurred. In the final step, I further separated target epileptiform events from noise by applying a statistical model locally, and stitched the outputs from different signal windows together. – source code
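The windowing and feature-extraction stages can be illustrated with a minimal sketch. This is not the project's actual code: the Haar decomposition, window size, and step are illustrative assumptions standing in for the wavelet features described above.

```python
import numpy as np

def haar_features(window, levels=3):
    """Energy of Haar wavelet detail coefficients at each level —
    a simple stand-in for wavelet-based EEG features."""
    x = np.asarray(window, dtype=float)
    feats = []
    for _ in range(levels):
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        feats.append(np.sum(detail ** 2))  # detail-band energy at this scale
        x = approx
    return np.array(feats)

def sliding_windows(signal, width, step):
    """Yield overlapping windows of `width` samples, advancing by `step`."""
    for start in range(0, len(signal) - width + 1, step):
        yield signal[start:start + width]

# Example: one second of a 256 Hz signal, framed into 64-sample windows.
rng = np.random.default_rng(0)
eeg = rng.standard_normal(256)
features = np.array([haar_features(w) for w in sliding_windows(eeg, 64, 32)])
```

Each row of `features` would then be fed to a classifier that labels the window as epileptiform or not; stitching adjacent window decisions back together is the final step described above.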

The automated detection results were highlighted in real time on the web interface of eegNet, a standardized EEG database developed at Clemson.

Automatic detection of epileptiform events in EEG recordings – poster


The Open Science Investigation

Barriers to practicing open science persist for a range of cultural and technological reasons. This undergraduate thesis, developed under the guidance of the Center for Open Science, seeks to understand the incentive structure for open science from a sociotechnical perspective and proposes a software solution to facilitate its implementation. The research paper, Incentive structure for Open Science in Web 2.0, elucidates how the current reward system needs to change to encourage wider practice of open science: to create incentives for researchers to open up their research materials to the broader community, organizations need to provide researchers with intrinsic rewards, proper credit allocation, and tangible career benefits. In the technical portion of the project, Designing Data Visualizations for Open Science, I prototyped an interactive research exploration and organization tool for the Open Science Framework. The thesis contributes to the collective effort toward open science by making the creation of incentives an explicit design goal for open science web applications. – thesis cover   |  STS paper

Quality control with statistical anomaly detection

While leading-edge 3D laser scanners provide accurate depictions of product geometry and allow potentially more efficient detection of production faults, they are not yet used for quality monitoring because such analysis frameworks are lacking. In this project I worked on a variety of statistical methods and algorithms that analyze the point cloud data generated by 3D scanners and detect production system failures.
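The core idea can be sketched as a statistical control rule on scanned heights. This is a hypothetical, minimal illustration, not the project's actual framework: it assumes a known nominal surface height and flags points whose deviation exceeds three standard deviations.

```python
import numpy as np

def flag_anomalies(points, nominal_z, k=3.0):
    """Flag scanned points whose height deviates from the nominal surface
    by more than k standard deviations (a 3-sigma control rule).
    `points` is an (n, 3) array of x, y, z readings; `nominal_z` gives the
    designed height at each (x, y). Hypothetical interface for illustration."""
    residuals = points[:, 2] - nominal_z
    sigma = residuals.std(ddof=1)
    return np.abs(residuals - residuals.mean()) > k * sigma

# Synthetic check: a flat plate scanned at z ≈ 0, with one dented point.
rng = np.random.default_rng(1)
cloud = np.column_stack([rng.random(200), rng.random(200),
                         rng.standard_normal(200) * 0.01])
cloud[0, 2] += 0.5  # simulated production defect
flags = flag_anomalies(cloud, nominal_z=np.zeros(200))
```

Real point cloud data would first need registration against the CAD model and local surface fitting; the statistical test above is only the final flagging step.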

Statistical quality control of point cloud data – slides


Healthcare User-Centric Design

In this project at the Center for Human-Computer Interaction, I conducted research on innovative narrative analysis techniques using 78,400 lines of interview transcripts with emergency room nurses. The research outcomes were used to identify faults in current hospital systems and the needs of healthcare practitioners.

Using Storytelling to Inform Design: Narrative Analysis of ER Stories
