Categorization of music plays an essential role in music appreciation and cognition. Studies show that genre is so important to listeners that the style of a piece can influence their liking for it more than the piece itself [1, 2]. Recognizing song genres automatically, however, is a challenging task: genres are subjective in nature, with no clear-cut boundaries between human-labeled categories.
Multiple studies have shown that machine learning approaches have the potential to achieve significant results on this problem. However, we believe that it is possible to further explore the potential of applying deep learning to music genre classification. While other works have aimed to adopt and assess deep learning methods that have been shown to be effective in other domains, there is still a great need for more original research that focuses on music primarily and utilizes musical knowledge and insight.
For reproducibility, we published our experiment worksheet on CodaLab. This contains our introduction to the problem, the datasets, code, and other artifacts from our various experiments.
Incident Analytics was an intelligent incident management tool we developed for AppDynamics DevOps customers during a Hackathon. AppDynamics customers could configure health rules based on a few key metrics of interest and get alerted when these metrics showed unexpected patterns. However, without access to historical data, DevOps teams might spend hours rediscovering a resolution that someone had already found for a similar issue. In this project, we built a tool based on machine learning algorithms to automatically identify root cause analyses (RCAs) for incidents, a task that previously took hours if not days of manual work. The solution helped customers understand the context around incoming incidents and reach a resolution much faster. We applied machine learning to grouping incidents together, correlating incidents with RCAs, and analyzing whether incidents were triggered by a global issue. This constitutes a big improvement over the current AppDynamics solution, which provides no out-of-the-box analytics.
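The source does not specify which algorithms we used for the RCA correlation step, so here is only a minimal illustrative sketch of one common approach to this kind of task: scoring an incoming incident against past incident descriptions with TF-IDF cosine similarity and surfacing the closest match (whose attached RCA would be the candidate resolution). All function names and the tokenization-by-whitespace are assumptions for illustration, not the actual Hackathon implementation.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute simple TF-IDF vectors for a list of tokenized documents."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))                      # document frequency per term
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    vecs = []
    for doc in docs:
        tf = Counter(doc)                        # raw term frequency
        vecs.append({t: tf[t] * idf[t] for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse dict vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_rca_match(new_incident, past_incidents):
    """Return the index of the most similar past incident;
    its recorded RCA would be proposed as the candidate resolution."""
    docs = [new_incident.split()] + [p.split() for p in past_incidents]
    vecs = tfidf_vectors(docs)
    scores = [cosine(vecs[0], v) for v in vecs[1:]]
    return max(range(len(scores)), key=scores.__getitem__)
```

For example, an incoming alert described as "checkout latency spike high gc pause" would match a past incident "high gc pause caused latency spike in checkout" rather than an unrelated "disk full on db node". A production system would also need incident grouping and global-issue detection, which this sketch omits.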
An electroencephalogram (EEG) is the most important tool in the diagnosis of seizure disorders. Between seizures, epileptiform neural activity in EEG recordings occurs in the form of spikes or spike-and-slow-wave complexes. Developing an automated EEG interpretation algorithm that is well accepted by clinicians has been a research goal for decades. As a participant in an NSF-funded Research Experience for Undergraduates (REU) program hosted at the Clemson University School of Computing, I continued this endeavor to develop an automated system that detects epilepsy-related events, in real time, from scalp EEG recordings.
To find the best-performing algorithm, I constructed a multi-stage processing pipeline. In the first stage, I cleaned the clinical data gathered from 100 epileptic patients and prepared it for cross-validation. Next, I used wavelet transforms in a “sliding window” approach to generate study features from the EEG signal. I then applied machine learning algorithms and analyzed their performance in classifying data patterns into epileptiform activities versus other activities. In this stage I also explored the use of a hidden Markov model to fit the time sequence in which epileptiform events occurred. In the final step, I further separated target epileptiform events from noise signals by applying a statistical model locally, and stitched the outputs from different signal windows together. – source code
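The sliding-window wavelet feature step can be sketched as follows. This is a minimal illustration, not the project's actual code: it uses a single-level Haar transform (the original work likely used a richer wavelet family and more decomposition levels) and emits per-window energies of the approximation and detail coefficients as features; the window and step sizes are placeholders.

```python
import math

def haar_step(segment):
    """One level of the Haar wavelet transform: pairwise averages (approximation)
    and pairwise differences (detail), each scaled by 1/sqrt(2)."""
    approx = [(segment[i] + segment[i + 1]) / math.sqrt(2)
              for i in range(0, len(segment) - 1, 2)]
    detail = [(segment[i] - segment[i + 1]) / math.sqrt(2)
              for i in range(0, len(segment) - 1, 2)]
    return approx, detail

def window_features(signal, window, step):
    """Slide a fixed-size window over the signal; for each position emit
    (approximation energy, detail energy) as a simple feature pair.
    High detail energy flags sharp transients such as spikes."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        approx, detail = haar_step(signal[start:start + window])
        feats.append((sum(c * c for c in approx),
                      sum(c * c for c in detail)))
    return feats
```

On a flat segment the detail energy is zero, while a spike inside a window shows up as a jump in detail energy; a downstream classifier (or the HMM over the window sequence) would consume these feature vectors.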
The automated detection results were highlighted, in real time, on the web interface of eegNet (a standardized EEG database developed by Clemson).
Automatic detection of epileptiform events in EEG recordings – poster