This repository has been archived by the owner on Dec 7, 2021. It is now read-only.

Commit

Merge branch 'develop'
AlessioZanga committed Jun 4, 2020
2 parents ac66b88 + c22ddd2 commit 211f36d
Showing 53 changed files with 625 additions and 750 deletions.
2 changes: 2 additions & 0 deletions .travis.yml
@@ -8,6 +8,8 @@ jobs:
os: linux
dist: xenial
python: 3.6
before_install:
- python3.6 -m pip install dataclasses
- name: "Python 3.7.0 on Bionic Linux"
dist: bionic
python: 3.7
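Aside: the CI change above installs the dataclasses backport by hand on the Python 3.6 job. The same requirement could also be declared with a PEP 508 environment marker so pip resolves it automatically; this is only a hedged suggestion, not something this commit adds.

```python
# Illustrative setup.py fragment (not part of this commit): the environment
# marker makes pip install the backport only on interpreters older than 3.7.
install_requires = [
    'dataclasses; python_version < "3.7"',
]
```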
19 changes: 19 additions & 0 deletions CHANGELOG.md
@@ -12,6 +12,25 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
### Removed
### Fixed

## [0.9.2] - 2020-06-04
### Added
* DOI reference link using Zenodo
* @dataclass decorator for the database tables ORM
* Install instructions for Python 3.6 dataclasses

### Changed
* Renamed JoinedPreprocessor to ForkedPreprocessor
* Renamed JoinDataFrames to ToMergedDataframes
* Renamed SinglePickleCache to PickleCache

### Removed
* Full Python 3.6 support, due to the @dataclass decorator
(workaround: python3.6 -m pip install dataclasses)
* Conda environment.yml configuration file
* ChunksPickleCache cache manager
* VerticalPipeline pipeline executor
* TextMiner textual analysis

## [0.9.1] - 2020-06-03
### Added
* Add min_value and max_value for each file Metadata
8 changes: 7 additions & 1 deletion README.md
@@ -1,6 +1,6 @@
# PyEEGLab

[![Project Status: WIP – Initial development is in progress, but there has not yet been a stable, usable release suitable for the public.](https://www.repostatus.org/badges/latest/wip.svg)](https://www.repostatus.org/#wip) [![Build Status](https://travis-ci.org/AlessioZanga/PyEEGLab.svg?branch=master)](https://travis-ci.org/AlessioZanga/PyEEGLab) [![Documentation Status](https://readthedocs.org/projects/pyeeglab/badge/?version=latest)](https://pyeeglab.readthedocs.io/en/latest/?badge=latest) [![Gitpod Ready-to-Code](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/AlessioZanga/PyEEGLab) [![codecov](https://codecov.io/gh/AlessioZanga/PyEEGLab/branch/master/graph/badge.svg)](https://codecov.io/gh/AlessioZanga/PyEEGLab) [![Maintainability](https://api.codeclimate.com/v1/badges/c55f67ee28e9e8bd8038/maintainability)](https://codeclimate.com/github/AlessioZanga/PyEEGLab/maintainability)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.3874461.svg)](https://doi.org/10.5281/zenodo.3874461) [![Project Status: WIP – Initial development is in progress, but there has not yet been a stable, usable release suitable for the public.](https://www.repostatus.org/badges/latest/wip.svg)](https://www.repostatus.org/#wip) [![Build Status](https://travis-ci.org/AlessioZanga/PyEEGLab.svg?branch=master)](https://travis-ci.org/AlessioZanga/PyEEGLab) [![Documentation Status](https://readthedocs.org/projects/pyeeglab/badge/?version=latest)](https://pyeeglab.readthedocs.io/en/latest/?badge=latest) [![Gitpod Ready-to-Code](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/AlessioZanga/PyEEGLab) [![codecov](https://codecov.io/gh/AlessioZanga/PyEEGLab/branch/master/graph/badge.svg)](https://codecov.io/gh/AlessioZanga/PyEEGLab) [![Maintainability](https://api.codeclimate.com/v1/badges/c55f67ee28e9e8bd8038/maintainability)](https://codeclimate.com/github/AlessioZanga/PyEEGLab/maintainability)

Analyze and manipulate EEG data using PyEEGLab.

@@ -18,6 +18,10 @@ PyEEGLab is distributed using the pip repository:

pip install PyEEGLab

If you use Python 3.6, the dataclasses package must be installed as a backport of the Python 3.7 dataclasses module:

pip install dataclasses

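For context, a minimal illustration of why the backport matters: the standard library only ships dataclasses from Python 3.7 onward, while the PyPI dataclasses package provides the same decorator for 3.6, so the code below runs unchanged on both. The class is purely illustrative and not a PyEEGLab type.

```python
# Works on Python 3.7+ natively and on 3.6 once the backport is installed.
from dataclasses import dataclass

@dataclass
class Recording:  # hypothetical example class, not part of PyEEGLab
    path: str
    channels: int = 0

print(Recording('session.edf', channels=32))
```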
If you need a bleeding edge version, you can install it directly from GitHub:

pip install git+https://github.com/AlessioZanga/PyEEGLab@develop
@@ -48,6 +52,8 @@ If you use this code in your project use the citation below:
title={PyEEGLab: A simple tool for EEG manipulation},
author={Alessio Zanga},
year={2019},
doi={10.5281/zenodo.3874461},
url={https://dx.doi.org/10.5281/zenodo.3874461},
howpublished={\url{https://github.com/AlessioZanga/PyEEGLab}},
}

19 changes: 0 additions & 19 deletions environment.yml

This file was deleted.

4 changes: 2 additions & 2 deletions examples/tuh_eeg_abnormal/example_cnn_dense_classification.py
@@ -21,12 +21,12 @@
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')))
from pyeeglab import TUHEEGAbnormalDataset, SinglePickleCache, Pipeline, CommonChannelSet, \
from pyeeglab import TUHEEGAbnormalDataset, PickleCache, Pipeline, CommonChannelSet, \
LowestFrequency, ToDataframe, DynamicWindow, BinarizedSpearmanCorrelation, \
ToNumpy

dataset = TUHEEGAbnormalDataset('../../data/tuh_eeg_abnormal/v2.0.0/edf')
dataset.set_cache_manager(SinglePickleCache('../../export'))
dataset.set_cache_manager(PickleCache('../../export'))

preprocessing = Pipeline([
CommonChannelSet(),
Changes to another example file (file name not shown)
@@ -21,12 +21,12 @@
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')))
from pyeeglab import TUHEEGAbnormalDataset, SinglePickleCache, Pipeline, CommonChannelSet, \
from pyeeglab import TUHEEGAbnormalDataset, PickleCache, Pipeline, CommonChannelSet, \
LowestFrequency, BandPassFrequency, ToDataframe, DynamicWindow, \
BinarizedSpearmanCorrelation, ToNumpy

dataset = TUHEEGAbnormalDataset('../../data/tuh_eeg_abnormal/v2.0.0/edf')
dataset.set_cache_manager(SinglePickleCache('../../export'))
dataset.set_cache_manager(PickleCache('../../export'))

preprocessing = Pipeline([
CommonChannelSet(),
4 changes: 2 additions & 2 deletions examples/tuh_eeg_abnormal/example_cnn_lstm_classification.py
@@ -21,12 +21,12 @@
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')))
from pyeeglab import TUHEEGAbnormalDataset, SinglePickleCache, Pipeline, CommonChannelSet, \
from pyeeglab import TUHEEGAbnormalDataset, PickleCache, Pipeline, CommonChannelSet, \
LowestFrequency, ToDataframe, DynamicWindow, BinarizedSpearmanCorrelation, \
ToNumpy

dataset = TUHEEGAbnormalDataset('../../data/tuh_eeg_abnormal/v2.0.0/edf')
dataset.set_cache_manager(SinglePickleCache('../../export'))
dataset.set_cache_manager(PickleCache('../../export'))

preprocessing = Pipeline([
CommonChannelSet(),
Changes to another example file (file name not shown)
@@ -21,12 +21,12 @@
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')))
from pyeeglab import TUHEEGAbnormalDataset, SinglePickleCache, Pipeline, CommonChannelSet, \
from pyeeglab import TUHEEGAbnormalDataset, PickleCache, Pipeline, CommonChannelSet, \
LowestFrequency, BandPassFrequency, ToDataframe, DynamicWindow, \
BinarizedSpearmanCorrelation, ToNumpy

dataset = TUHEEGAbnormalDataset('../../data/tuh_eeg_abnormal/v2.0.0/edf')
dataset.set_cache_manager(SinglePickleCache('../../export'))
dataset.set_cache_manager(PickleCache('../../export'))

preprocessing = Pipeline([
CommonChannelSet(),
8 changes: 4 additions & 4 deletions examples/tuh_eeg_abnormal/example_gat_lstm_classification.py
@@ -23,19 +23,19 @@
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')))
from pyeeglab import TUHEEGAbnormalDataset, SinglePickleCache, Pipeline, CommonChannelSet, \
from pyeeglab import TUHEEGAbnormalDataset, PickleCache, Pipeline, CommonChannelSet, \
LowestFrequency, ToDataframe, DynamicWindow, BinarizedSpearmanCorrelation, \
CorrelationToAdjacency, Bandpower, GraphWithFeatures, JoinedPreprocessor
CorrelationToAdjacency, Bandpower, GraphWithFeatures, ForkedPreprocessor

dataset = TUHEEGAbnormalDataset('../../data/tuh_eeg_abnormal/v2.0.0/edf')
dataset.set_cache_manager(SinglePickleCache('../../export'))
dataset.set_cache_manager(PickleCache('../../export'))

preprocessing = Pipeline([
CommonChannelSet(),
LowestFrequency(),
ToDataframe(),
DynamicWindow(8),
JoinedPreprocessor(
ForkedPreprocessor(
inputs=[
[BinarizedSpearmanCorrelation(), CorrelationToAdjacency()],
Bandpower()
Changes to another example file (file name not shown)
@@ -23,21 +23,21 @@
import sys

sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '../..')))
from pyeeglab import TUHEEGAbnormalDataset, SinglePickleCache, Pipeline, CommonChannelSet, \
from pyeeglab import TUHEEGAbnormalDataset, PickleCache, Pipeline, CommonChannelSet, \
LowestFrequency, BandPassFrequency, ToDataframe, DynamicWindow, \
BinarizedSpearmanCorrelation, CorrelationToAdjacency, Bandpower, \
GraphWithFeatures, JoinedPreprocessor
GraphWithFeatures, ForkedPreprocessor

dataset = TUHEEGAbnormalDataset('../../data/tuh_eeg_abnormal/v2.0.0/edf')
dataset.set_cache_manager(SinglePickleCache('../../export'))
dataset.set_cache_manager(PickleCache('../../export'))

preprocessing = Pipeline([
CommonChannelSet(),
LowestFrequency(),
BandPassFrequency(0.1, 47),
ToDataframe(),
DynamicWindow(8),
JoinedPreprocessor(
ForkedPreprocessor(
inputs=[
[BinarizedSpearmanCorrelation(), CorrelationToAdjacency()],
Bandpower()
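As a reading aid for the JoinedPreprocessor → ForkedPreprocessor rename shown in the two examples above, here is a hedged sketch of what a fork-style preprocessing step does conceptually: apply several branches of transforms to the same input and collect one output per branch. The function name and signature are invented for illustration; the real ForkedPreprocessor API (its inputs and output-merging arguments) may differ.

```python
from typing import Any, Callable, List, Sequence

def run_forked(data: Any, branches: Sequence[Sequence[Callable[[Any], Any]]]) -> List[Any]:
    """Run each branch (a chain of callables) on the same input and collect the results."""
    results = []
    for branch in branches:
        out = data
        for step in branch:
            out = step(out)  # pipe the data through this branch's steps in order
        results.append(out)
    return results

# Example: one branch computes the sum, another the mean, of the same window.
print(run_forked([1, 2, 3], [[sum], [lambda xs: sum(xs) / len(xs)]]))
```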
27 changes: 13 additions & 14 deletions pyeeglab/__init__.py
@@ -4,21 +4,20 @@
from importlib.util import find_spec
from mne.utils import set_config

from .dataset import TUHEEGAbnormalDataset, TUHEEGAbnormalLoader, \
TUHEEGArtifactDataset, TUHEEGArtifactLoader, \
EEGMMIDBDataset, EEGMMIDBLoader, \
CHBMITLoader, CHBMITDataset
from .dataset import TUHEEGAbnormalDataset, TUHEEGAbnormalLoader, \
TUHEEGArtifactDataset, TUHEEGArtifactLoader, \
EEGMMIDBDataset, EEGMMIDBLoader, \
CHBMITLoader, CHBMITDataset
from .io import Raw
from .cache import SinglePickleCache, ChunksPickleCache
from .preprocessing import VerticalPipeline, Pipeline, JoinedPreprocessor, \
CommonChannelSet, LowestFrequency, BandPassFrequency, NotchFrequency, \
ToDataframe, ToNumpy, ToNumpy1D, StaticWindow, DynamicWindow, \
StaticWindowOverlap, DynamicWindowOverlap, SpearmanCorrelation, \
BinarizedSpearmanCorrelation, CorrelationToAdjacency, \
Bandpower, GraphGenerator, GraphWithFeatures, \
JoinDataFrames, Mean, Variance, Skewness, Kurtosis, ZeroCrossing, \
AbsoluteArea, PeakToPeak, MinMaxNormalization, MinMaxCentralizedNormalization
from .text import TextMiner
from .cache import PickleCache
from .pipeline import Pipeline, ForkedPreprocessor
from .preprocess import CommonChannelSet, LowestFrequency, BandPassFrequency, NotchFrequency, \
ToDataframe, ToNumpy, ToNumpy1D, StaticWindow, DynamicWindow, \
StaticWindowOverlap, DynamicWindowOverlap, SpearmanCorrelation, \
BinarizedSpearmanCorrelation, CorrelationToAdjacency, \
Bandpower, GraphGenerator, GraphWithFeatures, \
ToMergedDataframes, Mean, Variance, Skewness, Kurtosis, ZeroCrossing, \
AbsoluteArea, PeakToPeak, MinMaxNormalization, MinMaxCentralizedNormalization

logging.getLogger().setLevel(logging.DEBUG)

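To summarize the public-API renames visible in the __init__.py diff above, a hedged before/after sketch (only the names shown in this commit; other import paths may have changed as well):

```python
# Old top-level names (<= 0.9.1)    New top-level names (0.9.2)
#   SinglePickleCache          ->     PickleCache
#   JoinedPreprocessor         ->     ForkedPreprocessor
#   JoinDataFrames             ->     ToMergedDataframes
from pyeeglab import PickleCache, ForkedPreprocessor, ToMergedDataframes
```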
2 changes: 1 addition & 1 deletion pyeeglab/cache/__init__.py
@@ -1 +1 @@
from .cache import Cache, SinglePickleCache, ChunksPickleCache
from .cache import Cache, PickleCache
60 changes: 2 additions & 58 deletions pyeeglab/cache/cache.py
@@ -9,7 +9,7 @@
from pickle import load, dump

from ..io import DataLoader
from ..preprocessing import Pipeline
from ..pipeline import Pipeline

class Cache(ABC):

@@ -33,7 +33,7 @@ def load(self, dataset: str, loader: DataLoader, pipeline: Pipeline) -> Dict:
pass


class SinglePickleCache(Cache):
class PickleCache(Cache):

def __init__(self, path: str):
super().__init__()
@@ -64,59 +64,3 @@ def load(self, dataset: str, loader: DataLoader, pipeline: Pipeline):
logging.debug('Dumping cache file')
dump(data, file)
return data


class ChunksPickleCache(Cache):

def __init__(self, path: str, chunks: int = 5000):
super().__init__()
logging.debug('Create chunks pickle cache manager')
Path(path).mkdir(parents=True, exist_ok=True)
self.path = path
self.chunks = chunks

def _load(self, key: str):
with open(key, 'r') as file:
index = json.load(file)
index = index['files']
data = {'data': [], 'labels': []}
for i in index:
with open(i, 'rb') as chunk:
try:
logging.debug('Loading cache file')
chunk = load(chunk)
for item in chunk.keys():
data[item] += chunk[item]
except:
logging.debug('Loading cache file failed')
return data

def _save(self, key: str, data: List, pipeline: Pipeline):
data = [data[i:i+self.chunks] for i in range(0, len(data), self.chunks)]
files = {'files': []}
for index, value in enumerate(data):
data[index] = pipeline.run(value)
path = key[:-5] + '_' + str(index) + '.pkl'
with open(path, 'wb') as file:
logging.debug('Dumping cache file %d', index)
dump(data[index], file)
files['files'].append(path)
with open(key, 'w') as file:
json.dump(files, file)
return data

def load(self, dataset: str, loader: DataLoader, pipeline: Pipeline):
logging.debug('Computing cache key')
key = self._get_cache_key(dataset, loader, pipeline)
logging.debug('Computed cache key: %s', key)
key = key + '.json'
self.path = join(self.path, dataset)
Path(self.path).mkdir(parents=True, exist_ok=True)
key = join(self.path, key)
if isfile(key):
logging.debug('Cache file found')
return self._load(key)
logging.debug('Cache file not found, genereting new one')
data = loader.get_dataset()
data = self._save(key, data, pipeline)
return data
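Because the body of PickleCache.load is truncated in the hunk above, here is a minimal, hedged sketch of the single-file pickle-cache pattern it appears to follow: derive a key, return the pickled result if the file exists, otherwise run the computation and dump it. The class name, key handling, and compute callback are simplifications for illustration, not the actual PyEEGLab implementation.

```python
import logging
from os.path import isfile, join
from pathlib import Path
from pickle import load, dump
from typing import Any, Callable

class MiniPickleCache:
    """Simplified single-file pickle cache (illustrative, not the PyEEGLab class)."""

    def __init__(self, path: str) -> None:
        Path(path).mkdir(parents=True, exist_ok=True)
        self.path = path

    def load(self, key: str, compute: Callable[[], Any]) -> Any:
        target = join(self.path, key + '.pkl')
        if isfile(target):
            logging.debug('Cache file found, loading %s', target)
            with open(target, 'rb') as file:
                return load(file)
        logging.debug('Cache file not found, generating a new one')
        data = compute()
        with open(target, 'wb') as file:
            dump(data, file)
        return data

# Usage sketch: cache.load('tuh_eeg_abnormal-<pipeline-hash>',
#                          lambda: pipeline.run(loader.get_dataset()))
```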
2 changes: 1 addition & 1 deletion pyeeglab/database/__init__.py
@@ -1 +1 @@
from .index import File, Metadata, Event, Index
from .tables import *