
\documentclass{seminar}
\usepackage{colordvi}
\begin{document}

Information in the optic nerve


\begin{itemize}
\item Given a visual stimulus, what patterns of activity will be induced in the optic nerve?
\item \textbf{Given a pattern of activity in the optic nerve, what does that pattern mean?}
  \begin{itemize}
  \item The neural code is unknown
  \item So, what can we do without it, or to learn it?
    \begin{itemize}
    \item Attempt to learn to reconstruct the signal from the spike train
    \item Constrain the neural code by estimating the amount of information in it
    \end{itemize}
  \end{itemize}
\end{itemize}


Stimulus reconstruction from spike train

Given a pair (stimulus signal, spike-train signal), learn to guess the stimulus signal when given the spike train.


Note: doing optimal reconstruction is a different problem from predicting the system's response to stimulus

\begin{itemize}
\item Optimal reconstruction requires $P(stimulus \mid response) = \frac{P(response \mid stimulus)\, P(stimulus)}{P(response)}$
\item Predicting the response requires $P(response \mid stimulus)$: the distribution of noise matters
\end{itemize}

Note: doing optimal reconstruction is a different problem from predicting the system's response to stimulus cont'd


Signal estimation


Signal estimation cont'd


Signal estimation cont'd

The filter

Here's the equation we'll use to predict the stimulus, given a spike train (it's a convolution of the spike train with some kernel $K$):

\begin{equation} stimulus(t) = \int K(\tau)\ spike\ train(t - \tau) d \tau \end{equation}


Signal estimation cont'd

The spike train

Model spike train as a sum of delta functions at the spikes (which occur at times $t_i$, $i = 1\ldots N$):

\begin{equation*} spike\ train(t) = \sum_{i=1}^N \delta(t - t_i) \end{equation*}
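As a numerical sanity check, the two formulas above can be combined: because the spike train is a sum of delta functions, convolving it with $K$ reduces to summing shifted copies of $K$ at the spike times. The Gaussian kernel, its delay, and the spike times below are made up for illustration (a NumPy sketch, not an actual fitted kernel):

```python
import numpy as np

# Hypothetical kernel: a Gaussian bump peaking 50 ms after each spike.
def K(t, delay=0.05, sigma=0.01):
    return np.exp(-(t - delay) ** 2 / (2 * sigma ** 2))

def reconstruct(t_grid, spike_times):
    """stimulus_hat(t) = int K(tau) spike_train(t - tau) d tau
                       = sum_i K(t - t_i), since the train is a sum of deltas."""
    est = np.zeros_like(t_grid)
    for t_i in spike_times:
        est += K(t_grid - t_i)
    return est

t = np.linspace(0.0, 1.0, 1001)
s_hat = reconstruct(t, spike_times=[0.2, 0.5, 0.52])
```

The two nearby spikes at 0.5 and 0.52 produce overlapping bumps that add, which is exactly the linearity assumption built into this filter.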


Signal estimation cont'd

The kernel

\begin{equation*}
K(t) = \int \frac{dw}{2 \pi}\, e^{-i w t}\, \frac{E\left[ \tilde{s}(w) \sum_j e^{-i w t_j} \right]}{E\left[ \left| \sum_j e^{i w t_j} \right|^2 \right]}
\end{equation*}
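A sketch of how this kernel estimate might be computed with discrete FFTs, assuming each trial gives a binned spike count vector: the trial-averaged cross-spectrum divided by the trial-averaged spike power spectrum, then inverse-transformed. The function name and the small regularizing clamp are our own additions, and sign conventions depend on the FFT library:

```python
import numpy as np

def estimate_kernel(stimuli, spike_trains):
    """Frequency-domain kernel estimate:
    K~(w) = E[ s~(w) sum_j e^{-i w t_j} ] / E[ |sum_j e^{i w t_j}|^2 ],
    with the expectations E[.] replaced by averages over trials.
    stimuli, spike_trains: arrays of shape (n_trials, n_bins), where
    spike_trains holds binned spike counts (the discretized delta sum)."""
    n_bins = stimuli.shape[1]
    num = np.zeros(n_bins, dtype=complex)
    den = np.zeros(n_bins)
    for s, rho in zip(stimuli, spike_trains):
        s_w = np.fft.fft(s)
        rho_w = np.fft.fft(rho)        # ~ sum_j e^{-i w t_j}
        num += s_w * np.conj(rho_w)    # cross-spectrum, summed over trials
        den += np.abs(rho_w) ** 2      # spike power spectrum, summed over trials
    K_w = num / np.maximum(den, 1e-12)  # clamp guards against empty bins
    return np.real(np.fft.ifft(K_w))
```

If the stimulus really is a shifted copy of the spike train, the estimate recovers a delta function at that lag, which is a useful unit check.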

Information rate


Collecting data

\includegraphics[scale=.3]{twoDataSets.eps}

\emph{red = ensemble data, green = conditional data}


Discretize data

Now each recording is equivalent to a \emph{string of symbols} over some alphabet $A$.

\includegraphics[scale=.2]{symbolStream.eps}
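For concreteness, here is one way to turn a spike train into such a symbol string over a binary alphabet: bin time finely enough that each bin holds at most one spike, then group consecutive bins into words. The bin size and word length below are arbitrary illustrative choices:

```python
import numpy as np

def to_words(spike_times, t_end, dt=0.003, word_len=8):
    """Discretize a spike train into dt-sized bins (0/1 symbols, alphabet
    A = {0, 1}), then group consecutive bins into words of length word_len.
    Returns each word as a hashable tuple so words can be counted."""
    bins = np.zeros(int(np.ceil(t_end / dt)), dtype=int)
    idx = (np.asarray(spike_times) / dt).astype(int)
    bins[idx] = 1
    n_words = len(bins) // word_len
    words = bins[: n_words * word_len].reshape(n_words, word_len)
    return [tuple(w) for w in words]
```

Each recording then becomes a sequence of these word-symbols, which is what the entropy estimates below operate on.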


Information rate cont'd

Overview of procedure

\begin{itemize}
\item Estimate the total entropy rate of the response (of $P(S = s)$) from the ensemble data
\item Estimate the noise entropy rate (of $P(S = s \mid stimulus)$) from the conditional data
\item Information rate $=$ total entropy rate $-$ noise entropy rate
\end{itemize}

\bigskip

Estimating entropy rates becomes very tricky once we account for temporal correlations, as we'll see later!
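Setting those subtleties aside for a moment, the basic procedure can be sketched with plug-in entropy estimates. Note one simplification: `entropy_bits` pools all the conditional words into a single distribution, whereas a real analysis would average the noise entropy over time slices of the repeated stimulus:

```python
import numpy as np
from collections import Counter

def entropy_bits(words):
    """Plug-in (naive) entropy estimate, in bits, from a list of hashable words."""
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_rate(ensemble_words, conditional_words, word_duration):
    """Information rate = (total entropy - noise entropy) / word duration,
    where the total entropy comes from the ensemble data and the noise
    entropy from the conditional (repeated-stimulus) data."""
    H_total = entropy_bits(ensemble_words)
    H_noise = entropy_bits(conditional_words)
    return (H_total - H_noise) / word_duration
```

With uniform toy data (four equiprobable ensemble words, two equiprobable conditional words) the rate is $(2 - 1)$ bits per word duration, which is easy to verify by hand.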


Information rate cont'd

Estimating an entropy rate


Information rate cont'd


Information rate cont'd

Temporal correlations

\includegraphics[scale=.3]{wordHist.eps}

\includegraphics[scale=.33]{wordLengthInfiniteData.eps}


Information rate cont'd

The danger of undersampling


Information rate cont'd

Conflicting biases!


Information rate cont'd

If we're lucky

If we're lucky, there will be a plateau where $N$ is ``just right''

\includegraphics[scale=.3]{competingBias3.eps}


Information rate cont'd

"Direct method"

In the past, people have approached the plateau from the left (i.e.\ from an overestimate of the entropy rate) and used their intuition to decide whether they were near the plateau. They then fit the entropy rate vs.\ $N$ curve with a 2nd-degree polynomial and extrapolated to the asymptote.
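One way to implement that extrapolation, assuming (as is common) that the quadratic is fit in $1/N$ so the asymptote ($N \to \infty$) is simply the fitted intercept:

```python
import numpy as np

def extrapolate_entropy_rate(word_lengths, entropy_rates):
    """Fit entropy rate vs 1/N with a 2nd-degree polynomial and read off
    the value at 1/N = 0, i.e. the extrapolated rate as N -> infinity."""
    x = 1.0 / np.asarray(word_lengths, dtype=float)
    coeffs = np.polyfit(x, entropy_rates, deg=2)
    return float(np.polyval(coeffs, 0.0))
```

Of course this inherits exactly the problem described on the next slide: the fit only means something if the measured points really sit on the asymptotic branch of the curve.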


Information rate cont'd

Plateau not guaranteed

But it's possible for there to be no plateau, or for it to be hard to detect.

\includegraphics[scale=.2]{noPlateau.eps}


Information rate cont'd

What to do?

More advanced methods are under development that automatically choose values of $N$, search for plateaus, etc. Some of these methods end up with no free parameters that a human must guess. However, there is always a price: these methods make assumptions about the underlying process, or at the least assume a prior probability distribution from which the underlying process was drawn.


\end{document}