04-05-2013, 02:57 PM
Data acquisition
INTRODUCTION
There are three main steps to building the virtual brain: 1) data acquisition, 2) simulation, 3) visualisation of results.
Data acquisition involves taking brain slices, placing them under a microscope, and measuring the shape and electrical activity of individual neurons. This is how the different types of neuron are studied and catalogued. The neurons are typed by morphology (i.e. their shape), electrophysiological behaviour, location within the cortex, and their population density. These observations are translated into mathematical algorithms which describe the form, function, and positioning of neurons. The algorithms are then used to generate biologically-realistic virtual neurons ready for simulation.
One method is to take 300 µm-thick sagittal brain slices from the somatosensory cortex (S1) of juvenile Wistar rats (aged 14 to 16 days). The tissue is stained with biocytin and viewed through a bright-field microscope. Neuronal 3D morphologies are then reconstructed using the Neurolucida software package (pictured below, far right), which runs on Windows workstations. Staining shrinks the tissue by about 25% in thickness and 10% in length, so the reconstruction process corrects for this. Slicing also severs 20% to 40% of the axonal and dendritic arbors, so these are regrown algorithmically.
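The shrinkage correction can be sketched as a simple rescaling of the traced coordinates. This is an illustrative helper, not Neurolucida's actual API; the function name and the coordinate convention (z along the slice-thickness axis) are assumptions.

```python
# Sketch: compensating for staining-induced tissue shrinkage during
# 3D reconstruction. The text reports ~25% shrinkage in thickness and
# ~10% in length, so traced coordinates are rescaled by the inverse factors.

def correct_shrinkage(points, thickness_shrink=0.25, length_shrink=0.10):
    """Rescale (x, y, z) sample points of a traced neurite.

    x, y lie in the slice plane (corrected for length shrinkage);
    z runs through the slice thickness (corrected for thickness shrinkage).
    """
    xy_scale = 1.0 / (1.0 - length_shrink)     # ~1.11x in-plane
    z_scale = 1.0 / (1.0 - thickness_shrink)   # ~1.33x through the slice
    return [(x * xy_scale, y * xy_scale, z * z_scale) for x, y, z in points]

# A point traced at (90, 90, 225) µm in the shrunken tissue maps back to
# approximately (100, 100, 300) µm in the original geometry.
print(correct_shrinkage([(90.0, 90.0, 225.0)]))
```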
The electrophysiological behaviour of neurons is studied using a twelve-patch-clamp instrument (pictured below left). This tool was developed for the Blue Brain Project and forms a foundation of the research. It enables twelve living neurons to be patched concurrently and their electrical activity recorded. A Nomarski microscope enhances the contrast of the unstained samples of living neural tissue, and carbon nanotube-coated electrodes can be used to improve recording quality.
Simulation
NEURON
Example NEURON cell builder window
The primary software used by the BBP for neural simulations is a package called NEURON. It was developed starting in the 1990s by Michael Hines at Yale University and John Moore at Duke University, and is written in C, C++, and FORTRAN. The software remains under active development; as of July 2012 it is at version 7.2. It is free and open-source software: both the code and the binaries are freely available on the website. Michael Hines and the BBP team collaborated in 2005 to port the package to the massively parallel Blue Gene supercomputer.
• Website: www.neuron.yale.edu
• Scholarpedia: NEURON simulation environment
Simulation speed
In 2012, simulations of one cortical column (~10,000 neurons) ran approximately 300× slower than real time, so one second of simulated time takes about five minutes to complete. The simulations show approximately linear scaling: doubling the size of the neural network doubles the time it takes to simulate. Currently the primary goal is biological validity rather than performance. Once it is understood which factors are biologically important for a given effect, it may be possible to trim components that don't contribute in order to improve performance.
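A back-of-envelope estimate follows directly from the two figures given here (the 300× slowdown for one column, and linear scaling with network size); the helper function is purely illustrative.

```python
# Rough wall-clock estimate from the figures in the text: one column
# (~10,000 neurons) runs ~300x slower than real time, and runtime scales
# approximately linearly with network size.

SLOWDOWN_PER_COLUMN = 300  # wall-clock seconds per simulated second, 1 column

def wall_clock_seconds(sim_seconds, n_columns):
    """Estimated wall-clock time, assuming linear scaling in network size."""
    return sim_seconds * SLOWDOWN_PER_COLUMN * n_columns

# One simulated second of a single column: 300 s, i.e. about five minutes.
print(wall_clock_seconds(1, 1))   # 300
# Doubling the network doubles the runtime.
print(wall_clock_seconds(1, 2))   # 600
```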
The simulation timestep for the numerical integrations is 0.025 ms and the timestep for writing the output to disk is 0.1 ms.
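The relationship between the two timescales can be sketched with a toy integrator: with a 0.025 ms integration step and a 0.1 ms output step, state is written on every 4th step. The leaky-membrane model below is a placeholder for illustration, not the BBP's actual cell model.

```python
# Sketch of the two timesteps mentioned above: numerical integration at
# 0.025 ms, output to disk every 0.1 ms (every 4th integration step).

DT = 0.025          # integration timestep, ms
OUTPUT_DT = 0.1     # output timestep, ms
STEPS_PER_OUTPUT = round(OUTPUT_DT / DT)  # 4

def simulate(t_stop_ms, v0=-65.0, v_rest=-65.0, tau_ms=10.0, i_inj=1.0):
    """Forward-Euler integration of dV/dt = (v_rest - V)/tau + i_inj,
    recording the state only at output timesteps."""
    v = v0
    outputs = []
    n_steps = round(t_stop_ms / DT)
    for step in range(1, n_steps + 1):
        v += DT * ((v_rest - v) / tau_ms + i_inj)
        if step % STEPS_PER_OUTPUT == 0:
            outputs.append((step * DT, v))
    return outputs

trace = simulate(1.0)
# 1 ms of simulated time = 40 integration steps but only 10 output samples.
print(len(trace))   # 10
```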
Workflow
The simulation step involves synthesising virtual cells using the algorithms that were found to describe real neurons. The algorithms and parameters are adjusted for the age, species, and disease stage of the animal being simulated. Every single protein is simulated, and there are about a billion of these in one cell. First a network skeleton is built from all the different kinds of synthesised neurons. Then the cells are connected together according to the rules found experimentally. Finally the neurons are functionalised and the simulation is brought to life. The patterns of emergent behaviour are viewed with visualisation software.
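The skeleton → connect → functionalise sequence can be sketched in miniature. Everything here is a deliberately simplified placeholder: the cell type, the 10% connection probability, and the single membrane-potential attribute are illustrative assumptions, not the project's actual rules.

```python
import random

# Toy sketch of the three-step workflow described above.
random.seed(42)

def synthesise_neurons(n, cell_type="pyramidal"):
    """Step 1: build a network skeleton from synthesised cells."""
    return [{"id": i, "type": cell_type, "synapses": []} for i in range(n)]

def connect(neurons, p_connect=0.1):
    """Step 2: wire pairs according to a (here probabilistic) rule."""
    for pre in neurons:
        for post in neurons:
            if pre is not post and random.random() < p_connect:
                pre["synapses"].append(post["id"])

def functionalise(neurons):
    """Step 3: attach dynamics (here just a resting membrane potential)."""
    for cell in neurons:
        cell["v_mV"] = -65.0

net = synthesise_neurons(50)
connect(net)
functionalise(net)
print(sum(len(c["synapses"]) for c in net))  # total synapse count
```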
A basic unit of the cerebral cortex is the cortical column. Each column can be mapped to one function; in rats, for example, one column is devoted to each whisker. A rat cortical column has about 10,000 neurons and is about the size of a pinhead. The latest simulations, as of November 2011, contain about 100 columns, 1 million neurons, and 1 billion synapses. A real rat has about 100,000 columns in total, and humans have around 2 million. Techniques are being developed for multiscale simulation, whereby active parts of the brain are simulated in great detail while quiescent parts are modelled more coarsely.
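The counts quoted above are mutually consistent, which a quick round-number check makes explicit:

```python
# Sanity arithmetic on the figures in this section.
NEURONS_PER_COLUMN = 10_000
COLUMNS_SIMULATED = 100              # as of November 2011
SYNAPSES_SIMULATED = 1_000_000_000
COLUMNS_RAT = 100_000
COLUMNS_HUMAN = 2_000_000

neurons_simulated = NEURONS_PER_COLUMN * COLUMNS_SIMULATED
print(neurons_simulated)                          # 1,000,000 neurons
print(SYNAPSES_SIMULATED // neurons_simulated)    # ~1,000 synapses per neuron
print(COLUMNS_RAT // COLUMNS_SIMULATED)           # rat: ~1,000x more columns
print(COLUMNS_HUMAN // COLUMNS_RAT)               # human: ~20x a rat
```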
Every two weeks a column model is run. The simulations reproduce observations that are seen in living neurons. Emergent properties are seen that require larger and larger networks. The plan is to build a generalised simulation tool, one that makes it easy to build circuits. There are also plans to couple the brain simulations to avatars living in a virtual environment, and eventually also to robots interacting with the real world. The ultimate aim is to be able to understand and reproduce human consciousness.
Computer hardware / Supercomputers
Blue Gene/P
The primary machine used by the Blue Brain Project is a Blue Gene supercomputer built by IBM; this is where the name "Blue Brain" originates. IBM agreed in June 2005 to supply EPFL with a Blue Gene/L as a "technology demonstrator"; the IBM press release did not disclose the terms of the deal. In June 2010 this machine was upgraded to a Blue Gene/P. The machine is installed on the EPFL campus in Lausanne and is managed by CADMOS (Center for Advanced Modelling Science).
DEEP - Dynamical Exascale Entry Platform
DEEP (deep-project.eu) is an exascale supercomputer to be built at the Jülich Research Center in Germany. The project started in December 2011 and is funded by the European Union's 7th Framework Programme; the three-year prototype phase has received €8.5 million. A prototype supercomputer performing at 100 petaflops is hoped to be completed by the end of 2014.
The Blue Brain Project simulations will be ported to the DEEP prototype to help test the system's performance. If successful, a future exascale version of this machine could provide the 1 exaflops of performance required for a complete human brain simulation by the 2020s.
The DEEP prototype will be built using Intel MIC (Many Integrated Cores) processors, each of which contains over 50 cores fabricated with a 22 nm process. These processors were codenamed Knights Corner during development and subsequently rebranded as Xeon Phi in June 2012. The processors will be publicly available in late 2012 or early 2013 and will offer just over 1 teraflop of performance each.
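Some rough arithmetic ties the numbers in this section together: at just over 1 teraflop per Xeon Phi, a naive count (ignoring interconnect, memory, and efficiency losses) shows how many processors the 100-petaflop prototype and a full exaflop machine would imply.

```python
# Naive processor-count arithmetic from the figures in the text.
TERAFLOP = 10**12
PETAFLOP = 10**15
EXAFLOP = 10**18

PHI_FLOPS = 1 * TERAFLOP          # ~1 teraflop per Xeon Phi
PROTOTYPE_FLOPS = 100 * PETAFLOP  # planned DEEP prototype
BRAIN_FLOPS = 1 * EXAFLOP         # estimate for a full human brain simulation

print(PROTOTYPE_FLOPS // PHI_FLOPS)  # 100,000 processors for the prototype
print(BRAIN_FLOPS // PHI_FLOPS)      # 1,000,000 for a full exaflop
```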
Funding
The project is funded primarily by EPFL, which in turn is funded by the Swiss government. EPFL is one of only two federally-funded universities in Switzerland, the other being ETH in Zurich. The BBP has additionally received funding from EU research grants, foundations, other entities, and individuals. Henry Markram mentioned in a 2009 interview that there was "one special visionary donor", but did not specify who.
In March 2012 the ETH Board requested CHF 85 million (€70 m) from the Swiss government to fund the Blue Brain Project during 2013 to 2016.
IBM has not funded the project, but they sold their Blue Gene supercomputer to EPFL at a reduced cost. This was because at the time the computer was a prototype and IBM was interested in testing the machine on different applications.
An application has been made for an EU FET Flagship grant for the Human Brain Project. This would provide €1 billion in funding over ten years. If the grant is awarded then the BBP will become a key part of the Human Brain Project and will share some of the funding. A decision on this award is expected in February 2013.