Optimized SPADE analysis and SPADE tutorial (#419)
* bug fixed dealing with units
* fixed same unit error in trial shuffling
* added Bielefeld fim.so and adapted filtering and added window param to fpgrowth
* removed max_occ test
* further unit stuff
* debugging for new fim version
* enabled multithreading in fpgrowth
* less verbose in spade
* set equal number of bins in bin shuffling wrt spade
* added tolerance to binning in spade everywhere
* Added accelerated FIM algorithm sources by Florian Porrmann.
* Enh/accelerated spade build (#82)
* Added cibuildwheel action
* Added Python requirements to wheel build
* Build only on 64bit machines, otherwise overflow
* Removed Windows for testing, as vc is not available
* Removed MacOS for testing, as -fopenmp is not available
* Removed pp- (pypy) builds since they lack C.
* Fixed removing pp- (pypy) builds since they lack C.
* Put Macos back in.
* Windows Hack
* Remove vcpython alltogether, ignore 2.7 Python
* Removed extra compile option, which breaks on Windows
* Removed more extra compile options, which breaks on Windows
* Try C++ instead of Gnu++.
* Try C++ instead of Gnu++ Windows style argument.
* Remove linux build while testing windows.
* Remove libraries.
* Differentiate Windows and Linux.
* Added missing import.
* Last mile: MacOS
* Remove openMP lib
* Remove openMP lib
* Add openMP lib
* More brew installs
* Mac is called mac on github
* Make sure C is reinstalled.
* Multilib
* Next try, new options
* Ignore warning about void type
* Update newsest fim package
* Revert "Ignore warning about void type" (reverts commit 3ff6b62)
* Revert to prior fim, new compiler argument.
* Revert "Update newsest fim package" (reverts commit f321f77)
* Definitely, gnu++17, but new try.
* Try C++
* Warning message
* llvm maybe?
* Added apple in source
* Small fixes for MacOS, but not comprehensive
* Limit to Windows and Linux for now
* Remove MacOS entry
* Fix fix from mindlessness
* Testrun
* Trying to include fim.so, despite its renaming by wheels
* Added newest version of original module
* Reverted previous breaking change commited by accident.
* Reverted package name from testing.
* Test focal as CI build
* Test bionic as CI build
* Understand installation issue on CI -- is importing elephant importing the installed version?
* Spelling error only
* Try to make sure travis loads the installed elephant, not the cwd.
* One step further -- which version will nosetests use?
* Switch to pytest as of PR #413
* Added authors of new FIM module and reference in new docs.
* Added authors of new FIM module and reference in new docs.
* Small text clarifications.
* Test if entry for fim.so/pyd in MANIFEST is now redundant.
* Update elephant/spade.py (Co-authored-by: Alexander Kleinjohann <[email protected]>)
* Update elephant/spade.py (Co-authored-by: Alexander Kleinjohann <[email protected]>)
* Added SPADE tutorial
* Prevent wheel building on every push, and limit scipy version workaround
* Pushed tutorial, removed file added in error
* New attempt to make mybinder install requirements.
* New attempt, dropping viziphant.
* Avoid recursive elephant installation by viziphant in postBuild
* Removed unit test that is fragile as it depends on the implementation of surrogate methods
* Add viziphant to RTD environment
* Typo in tutorial
* Add viziphant to travis doc tests

Co-authored-by: pbouss <[email protected]>
Co-authored-by: stellalessandra <[email protected]>
Co-authored-by: Alessandra Stella <[email protected]>
Co-authored-by: Alexander Kleinjohann <[email protected]>
1 parent 6479381, commit 8c388e4. Showing 30 changed files with 4,310 additions and 120 deletions.
New workflow file (+43 lines):
name: Build Wheels

# Trigger the workflow on push or pull request of the master
on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master

# Building wheels on Ubuntu and Windows systems
jobs:
  build_wheels:
    name: Build wheels on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-20.04, windows-2019]

    steps:
      - uses: actions/checkout@v2

      # Used to host cibuildwheel
      - uses: actions/setup-python@v2

      - name: Install cibuildwheel
        run: python -m pip install cibuildwheel==1.10.0

      - name: Install libomp
        if: runner.os == 'macOS'
        run: brew install libomp

      - name: Build wheels
        run: python -m cibuildwheel --output-dir wheelhouse
        env:
          CIBW_SKIP: "cp27-* cp33-* cp34-* cp35-* pp*"
          CIBW_PROJECT_REQUIRES_PYTHON: ">=3.6"
          CIBW_ARCHS: "auto64"

      - uses: actions/upload-artifact@v2
        with:
          path: ./wheelhouse/*.whl
New notebook file (+239 lines), the SPADE tutorial:
{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "# SPADE Tutorial"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "ExecuteTime": {
          "end_time": "2020-04-23T08:16:59.289299Z",
          "start_time": "2020-04-23T08:16:58.185541Z"
        }
      },
      "outputs": [],
      "source": [
        "import quantities as pq\n",
        "import neo\n",
        "import elephant\n",
        "import viziphant\n",
        "import random\n",
        "random.seed(4542)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "# Generate correlated data"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "SPADE is a method to detect repeated spatio-temporal activity patterns in parallel spike train data that occur in excess of chance expectation. In this tutorial, we will use SPADE to detect the simplest type of such patterns: synchronous events found across a subset of the neurons considered (i.e., patterns without a *temporal extent*). We will demonstrate the method on stochastic data in which we control the pattern statistics. In a first step, let us generate 10 random spike trains, each modeled as a Poisson process, in which a certain proportion of the spikes is synchronized across the spike trains. To this end, we use the `compound_poisson_process()` function, which expects the rate of the resulting processes in addition to a distribution `A[n]` indicating the likelihood of finding synchronous spikes of a given order `n`. In our example, we construct the distribution such that there is a small probability of producing a synchronous event of order 10 (`A[10]==0.02`). Otherwise, spikes are not synchronous with those of other neurons (i.e., synchronous events of order 1, `A[1]==0.98`). Notice that the length of the distribution `A` determines the number `len(A)-1` of spike trains returned by the function, and that `A[0]` is ignored for notational clarity."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "ExecuteTime": {
          "end_time": "2020-04-23T08:16:59.454207Z",
          "start_time": "2020-04-23T08:16:59.419213Z"
        }
      },
      "outputs": [],
      "source": [
        "spiketrains = elephant.spike_train_generation.compound_poisson_process(\n",
        "    rate=5*pq.Hz, A=[0]+[0.98]+[0]*8+[0.02], t_stop=10*pq.s)\n",
        "len(spiketrains)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "In a second step, we add 90 purely random Poisson spike trains using the `homogeneous_poisson_process()` function, such that in total we have 10 spike trains that exhibit occasional synchronous events and 90 uncorrelated spike trains."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "for i in range(90):\n",
        "    spiketrains.append(elephant.spike_train_generation.homogeneous_poisson_process(\n",
        "        rate=5*pq.Hz, t_stop=10*pq.s))"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "# Mining patterns with SPADE"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "ExecuteTime": {
          "end_time": "2020-04-23T08:17:01.595733Z",
          "start_time": "2020-04-23T08:17:01.591410Z"
        }
      },
      "source": [
        "In the next step, we run the `spade()` method to extract the synchronous patterns. We choose 1 ms as the time scale for discretizing the patterns, and specify a window length of 1 bin (i.e., we search for synchronous patterns only). Also, we concentrate on patterns that involve at least 3 spikes, which significantly accelerates the search by ignoring frequent events of order 2. To test the significance of patterns, we repeat the pattern detection on 100 spike-dither surrogates of the original data, created by dithering each spike by up to 5 ms in time. For the final step of pattern set reduction (PSR), we use the standard parameter set `[0, 0, 0]`."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "ExecuteTime": {
          "end_time": "2020-04-23T08:17:03.218505Z",
          "start_time": "2020-04-23T08:17:02.387311Z"
        }
      },
      "outputs": [],
      "source": [
        "patterns = elephant.spade.spade(\n",
        "    spiketrains=spiketrains, binsize=1*pq.ms, winlen=1, min_spikes=3,\n",
        "    n_surr=100, dither=5*pq.ms,\n",
        "    psr_param=[0, 0, 0],\n",
        "    output_format='patterns')['patterns']"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The output `patterns` of the method contains information on the detected patterns. In this case, we recover the pattern we injected into the data: a pattern involving the first 10 neurons (IDs 0 to 9), occurring 4 times."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "patterns"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Lastly, we visualize the detected patterns using the function `plot_patterns()` of the viziphant library. Marked in red are the patterns of order ten injected into the data."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "ExecuteTime": {
          "end_time": "2020-04-23T08:17:04.600606Z",
          "start_time": "2020-04-23T08:17:04.423012Z"
        },
        "scrolled": true
      },
      "outputs": [],
      "source": [
        "viziphant.spade.plot_patterns(spiketrains, patterns)"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": []
    }
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "Python 3",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.7.3"
    },
    "latex_envs": {
      "LaTeX_envs_menu_present": true,
      "autocomplete": true,
      "bibliofile": "biblio.bib",
      "cite_by": "apalike",
      "current_citInitial": 1,
      "eqLabelWithNumbers": true,
      "eqNumInitial": 1,
      "hotkeys": {
        "equation": "Ctrl-E",
        "itemize": "Ctrl-I"
      },
      "labels_anchors": false,
      "latex_user_defs": false,
      "report_style_numbering": false,
      "user_envs_cfg": false
    },
    "toc": {
      "nav_menu": {},
      "number_sections": true,
      "sideBar": true,
      "skip_h1_title": false,
      "title_cell": "Table of Contents",
      "title_sidebar": "Contents",
      "toc_cell": false,
      "toc_position": {},
      "toc_section_display": true,
      "toc_window_display": false
    },
    "varInspector": {
      "cols": {
        "lenName": 16,
        "lenType": 16,
        "lenVar": 40
      },
      "kernels_config": {
        "python": {
          "delete_cmd_postfix": "",
          "delete_cmd_prefix": "del ",
          "library": "var_list.py",
          "varRefreshCmd": "print(var_dic_list())"
        },
        "r": {
          "delete_cmd_postfix": ") ",
          "delete_cmd_prefix": "rm(",
          "library": "var_list.r",
          "varRefreshCmd": "cat(var_dic_list()) "
        }
      },
      "types_to_exclude": [
        "module",
        "function",
        "builtin_function_or_method",
        "instance",
        "_Feature"
      ],
      "window_display": false
    }
  },
  "nbformat": 4,
  "nbformat_minor": 2
}
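To make the role of the amplitude distribution `A` in the tutorial concrete, here is a minimal numpy-only sketch of a compound Poisson process. This is an illustration, not elephant's implementation; the names `toy_compound_poisson` and `mother_rate` are hypothetical.

```python
import numpy as np

def toy_compound_poisson(rate_hz, A, t_stop, rng):
    """Toy compound Poisson process: A[n] is the probability that a
    synchronous event involves n of the len(A)-1 trains (A[0] is
    ignored, following the tutorial's convention)."""
    n_trains = len(A) - 1
    orders = np.arange(1, n_trains + 1)
    probs = np.asarray(A[1:], dtype=float)
    probs = probs / probs.sum()
    # choose the mother-process rate so each train fires at ~rate_hz
    mean_order = float((orders * probs).sum())
    mother_rate = rate_hz * n_trains / mean_order
    n_events = rng.poisson(mother_rate * t_stop)
    event_times = np.sort(rng.uniform(0.0, t_stop, n_events))
    trains = [[] for _ in range(n_trains)]
    for t in event_times:
        order = rng.choice(orders, p=probs)           # order of this event
        for idx in rng.choice(n_trains, size=order, replace=False):
            trains[idx].append(t)                     # copy spike to `order` trains
    return [np.asarray(tr) for tr in trains]

# same A as in the tutorial: mostly order-1 spikes, 2% order-10 events
trains = toy_compound_poisson(5.0, [0] + [0.98] + [0] * 8 + [0.02], 10.0,
                              np.random.default_rng(4542))
```

As in the tutorial, the length of `A` fixes the number of returned trains to `len(A) - 1`.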
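The tutorial's significance test relies on spike-dither surrogates. Below is a rough sketch of uniform spike dithering, a simplified stand-in for the surrogate generation elephant performs internally (the function name is illustrative):

```python
import numpy as np

def dither_spikes(spike_times, dither, t_start, t_stop, rng):
    """Uniform spike dithering: displace each spike independently by a
    random offset drawn from [-dither, +dither], clipped to the recording
    window. This destroys fine temporal correlations between trains while
    approximately preserving each train's firing rate."""
    shifts = rng.uniform(-dither, dither, size=len(spike_times))
    return np.sort(np.clip(np.asarray(spike_times) + shifts, t_start, t_stop))

rng = np.random.default_rng(0)
spikes = np.array([0.1, 0.5, 1.2, 3.3])        # spike times in seconds
surrogate = dither_spikes(spikes, 0.005, 0.0, 10.0, rng)
```

Patterns that survive in fewer surrogates than the chosen significance level allows are reported as significant.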
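SPADE's actual mining uses an FP-growth variant over binned data; purely as an illustration of the discretization step described in the tutorial (binning at `binsize` and looking for bins where at least `min_spikes` neurons fire together), here is a rough numpy sketch with a hypothetical function name:

```python
import numpy as np

def synchronous_pattern_counts(trains, bin_size, t_stop, min_order=3):
    """Bin each spike train at `bin_size` and return the indices of time
    bins in which at least `min_order` distinct trains have a spike (a
    crude stand-in for the discretization preceding SPADE's mining)."""
    n_bins = int(np.ceil(t_stop / bin_size))
    binned = np.zeros((len(trains), n_bins), dtype=bool)
    for i, tr in enumerate(trains):
        idx = np.minimum((np.asarray(tr) / bin_size).astype(int), n_bins - 1)
        binned[i, idx] = True
    # number of distinct trains firing in each bin = "order" of the bin
    order_per_bin = binned.sum(axis=0)
    return np.flatnonzero(order_per_bin >= min_order)

# three trains share a near-coincident spike around t = 0; one does not
demo_trains = [[0.0005, 0.5], [0.0007, 1.0], [0.0009, 2.0], [3.0]]
sync_bins = synchronous_pattern_counts(demo_trains, 0.001, 10.0, min_order=3)
```

Unlike this sketch, SPADE also identifies *which* neurons participate in each frequent pattern and assesses their statistical significance against the surrogates.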