  Contribute to the Spike Train Analysis Toolkit



Introduction

The Spike Train Analysis Toolkit consists of several components:

  • Entropy methods estimate the entropy of a random variable from a vector of word counts (the plug-in estimate is sketched just after this list). Examples include plug-in, Treves-Panzeri-Miller-Carlton, jackknife, Ma bound, best upper bound, and Chao-Shen. Entropy methods are accompanied by variance methods, which can be used to generate confidence limits. Variance methods fall into two classes: general variance methods, which apply to all entropy methods, and specific variance methods, which apply to particular entropy methods.
  • Information methods estimate the mutual information between an ensemble of spike train responses and a set of experimental conditions. An information method may consist of several calls to entropy methods. Examples include the direct method, the metric space method, and the binless embedding method.
  • Components for reading .stad/.stam files into the input data structure we have developed.
  • Components for manipulating histogram data structures that we have developed.
  • Various utilities for processing method options, allocating/freeing memory, and mathematical/sorting operations.
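As a concrete illustration of the first category, the plug-in estimate is simply the entropy of the empirical distribution obtained from the word counts. The following C sketch of that formula is for orientation only and is not the toolkit's implementation, which operates on its own histogram structures rather than a bare array:

  #include <math.h>

  /* Plug-in entropy (in bits) of a vector of word counts (illustrative only). */
  double plugin_entropy_bits(const double *counts, int nwords)
  {
    double total = 0.0, h = 0.0;
    int i;
    for (i = 0; i < nwords; i++) total += counts[i];
    for (i = 0; i < nwords; i++)
      if (counts[i] > 0.0) {
        double p = counts[i]/total;   /* empirical probability of word i */
        h -= p*log2(p);
      }
    return h;
  }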
Members of the community are invited to contribute information or entropy methods.

Use of Matlab and C

The Spike Train Analysis Toolkit is written in a combination of Matlab and C. The computational code is written in C, which was chosen because of its fast execution, the availability of free compilers, and the potential for porting to parallel clusters. The user interface to the toolkit is written in Matlab, which was chosen for the ease with which data can be manipulated and visualized. The interface between the C code and the Matlab code is written in C using Matlab's MEX API.
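For orientation, the MEX layer consists of gateway routines of the form shown below. This fragment is a generic illustration of the MEX API, not a file from the toolkit:

  #include "mex.h"

  /* A MEX gateway: Matlab calls mexFunction, which unpacks mxArrays,
     calls plain C computational code, and packs the results. */
  void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[])
  {
    double x = mxGetScalar(prhs[0]);        /* first input from Matlab */
    plhs[0] = mxCreateDoubleScalar(2.0*x);  /* first output back to Matlab */
  }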

Members of the community may contribute implementations in Matlab or in C/MEX. Guidelines for each case are given below.

Information methods

In general, we recommend that an information method be divided into several modules, each corresponding to a distinct step of the computation and, where useful, returning intermediate results.

The preferred function format is

[out,opts_used]=method(in,opts)
You may want to have the main function call modules that return intermediate results of interest. These modules should themselves be Matlab functions, with intermediate variables passed among them.
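A minimal Matlab sketch of this layout is shown below; the method and module names are hypothetical and only illustrate the recommended structure:

  function [out,opts_used] = info_mymethod(in,opts)
  % INFO_MYMETHOD  Hypothetical information method in the recommended format:
  %   a thin main function that calls modules returning intermediate results.
  [counts,opts] = info_mymethod_words(in,opts);       % module 1: build word counts
  [out,opts_used] = info_mymethod_info(counts,opts);  % module 2: compute information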

If the underlying computations are implemented in C/MEX, you must add the compilation commands to the file make.m. We also recommend creating a subdirectory within the info directory and placing the associated files there.
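As an illustration only (the subdirectory and source file names are hypothetical, and make.m may assemble its compilation commands differently), the addition to make.m might look like:

  % Hypothetical entry in make.m: compile the MEX gateway for info/mymethod
  mex('-outdir','info/mymethod','info/mymethod/info_mymethod_mx.c');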

Entropy methods

An entropy method in Matlab

Entropy methods cannot be integrated seamlessly into the toolkit unless they are written in C using the MEX framework. If your method is written in Matlab, we will be happy to work with you to adapt it for integration with the toolkit.

An entropy method in C/MEX

For the remainder of this document, quant refers to the quantity being estimated (i.e., entropy or variance) and method refers to the method name.

  1. Add quant_method_c.c to the entropy/ directory.
  2. In entropy_c.h:
    1. Increment the value of ENT_EST_METHS, SPEC_VAR_EST_METHS, or GEN_VAR_EST_METHS.
    2. Add your function declaration (where struct_type is estimate for entropy methods, and nv_pair for variance methods):
      extern int quant_method(struct hist1d *in,struct options_entropy *opts,struct struct_type *out);
  3. In entropy_c.c:
    1. Add your method code name to either ent_est_meth_list or var_est_meth_list.
    2. Add quant_method to the statements that define the elements of the entropy_fun, specific_variance_fun, or general_variance_fun function pointers.
  4. In make.m add entropy/quant_method_c.c to the string entropy_files.
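The file added in step 1 provides the estimator's entry point. The skeleton below shows it for quant = entropy and a hypothetical method name mymethod; the function body, the assumed include, and the return-value convention are placeholders, and the actual members of struct hist1d and struct estimate are defined in entropy_c.h:

  #include "entropy_c.h"   /* assumed include for the declaration added in step 2 */

  /* Skeleton of entropy/entropy_mymethod_c.c (names are illustrative). */
  int entropy_mymethod(struct hist1d *in, struct options_entropy *opts,
                       struct estimate *out)
  {
    /* 1. Read the word counts from *in.
       2. Compute the entropy estimate, using any options in *opts.
       3. Store the result (and anything variance methods need) in *out.
       Returning 0 on success is assumed here. */
    return 0;
  }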

If your entropy method does not require any options

  1. In entropy_mx.c:
    1. In ReadOptionsEntropy(), add read_options_quant_null to the statements that define the elements of the entropy_fun or variance_fun function pointers.
    2. In WriteOptionsEntropy(), add write_options_quant_null to the statements that define the elements of the entropy_fun or variance_fun function pointers.

If your entropy method requires options

  1. Add quant_method_mx.c to the entropy/ directory.
  2. In entropy_c.h:
    1. Add the default parameter values to the #define statements.
    2. Add the options to the options_entropy structure.
  3. In entropy_mx.h, add the read/write options function declarations:
    extern void read_options_quant_method(const mxArray *in,struct options_entropy *opts);
    extern mxArray *write_options_quant_method(const mxArray *in,struct options_entropy *opts);
  4. In entropy_mx.c:
    1. In ReadOptionsEntropy(), add read_options_quant_method to the statements that define the elements of the entropy_fun or variance_fun function pointers.
    2. In WriteOptionsEntropy(), add write_options_quant_method to the statements that define the elements of the entropy_fun or variance_fun function pointers.
  5. In make.m add entropy/quant_method_mx.c to the string entropy_files.
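A sketch of the read/write pair for a single hypothetical option named my_param is shown below. The option name, the opts->my_param field (which you would add to options_entropy in step 2), the default value, and the assumption that the write function returns a modified copy of the options structure are all illustrative; only the declared signatures from step 3 come from the toolkit:

  #include "mex.h"
  #include "entropy_mx.h"   /* assumed include; the step 3 declarations live here */

  #define DEFAULT_MY_PARAM 1.0   /* default would normally be a #define in entropy_c.h */

  /* Read the hypothetical option "my_param" into the (hypothetical)
     opts->my_param field, falling back to the default if it is absent. */
  void read_options_entropy_mymethod(const mxArray *in, struct options_entropy *opts)
  {
    mxArray *field = mxGetField(in, 0, "my_param");
    if (field && !mxIsEmpty(field))
      opts->my_param = mxGetScalar(field);
    else
      opts->my_param = DEFAULT_MY_PARAM;
  }

  /* Record the option actually used in a copy of the Matlab options structure. */
  mxArray *write_options_entropy_mymethod(const mxArray *in, struct options_entropy *opts)
  {
    mxArray *out = mxDuplicateArray(in);
    if (mxGetFieldNumber(out, "my_param") < 0)
      mxAddField(out, "my_param");
    mxSetField(out, 0, "my_param", mxCreateDoubleScalar(opts->my_param));
    return out;
  }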


