# A large-scale spiking neural network model of the basal ganglia circuitry.

This model integrates fine-tuned phenomenological (Izhikevich) spiking neuron models that correspond to different cell sub-types within the BG nuclei, electrical and conductance-based chemical synapses that include short-term plasticity and neuromodulation, as well as anatomically-derived striatal connectivity.

In particular, this model comprises 10 neural populations that correspond to the four major nuclei of the biological basal ganglia and form their canonical circuit. These include the striatum (modelled in greater detail than the other groups) and the subthalamic nucleus (STN), the two input structures of the basal ganglia, the external segment of the globus pallidus (GPe), as well as the substantia nigra pars reticulata (SNr), one of the two output structures. Furthermore, the effect of the substantia nigra pars compacta (SNc) is realized through the concentration of the neurotransmitter dopamine in the different parts of the network. The network is divided into three microscopic channels, which are mutually inhibited and used to represent different action requests. A full description of this model can be found in the first two published manuscripts that follow.
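The basic unit of the network described above is the Izhikevich phenomenological neuron. As a rough illustration (not the project's actual code, which uses the Brian simulator), the sketch below integrates a single Izhikevich neuron with forward Euler; the parameter values are the classic regular-spiking set, whereas the model itself fits different parameters for each BG cell type.

```python
# Minimal sketch of an Izhikevich neuron (regular-spiking parameters assumed;
# the model fine-tunes these per BG cell sub-type and simulates them in Brian).
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.25):
    """Euler integration of
         dv/dt = 0.04 v^2 + 5 v + 140 - u + I
         du/dt = a (b v - u)
       with a spike emitted when v >= 30 mV, then v <- c, u <- u + d.
       Returns the list of spike times in ms."""
    v, u = c, b * c
    spikes = []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:               # spike threshold crossed
            spikes.append(step * dt)
            v, u = c, u + d         # after-spike reset
    return spikes

if __name__ == "__main__":
    print("spikes in 1 s:", len(izhikevich()))
```

With a constant input current the neuron fires tonically; varying `a`, `b`, `c`, `d` reproduces the different firing patterns (bursting, fast-spiking, etc.) that the full model assigns to the various BG populations.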

A list of citable manuscripts that used this model:

The latest version of this project can be also found on github: https://github.com/zfountas/basal-ganglia-model

## Prerequisites

The project requires Python 2.7 and the libraries Brian, NumPy and Matplotlib. To install them on a Linux machine, open a terminal and run:

```
(sudo) pip install -r requirements.txt
```

## Run simulations

To run a simulation, type:

```
./bgrun -argument1 -argument2 ...
```

where the available arguments are given as:

### BASIC ARGUMENTS

### INPUT TYPES

### RECORDING OPTIONS

### INPUT MODES

### AMPLITUDE OF INPUTS

### REST

## Authors

## License

This project is licensed under the GPLv3 License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments