Computational Neuroscientist

CoMind is building non-invasive brain-computer interfaces focused on dramatically increasing our understanding of the brain and changing how we interact with computers.

We are looking for a Computational Neuroscientist with a strong background in modelling and simulating neural activity and dynamics at multiple scales (biophysics, circuits, and populations/networks) and in integrating across these scales. You will use this background to create biologically realistic simulations of responses to different types of stimuli or perturbations, under different brain states and cellular/tissue/architecture parameters and assumptions. You will also apply your analytical and quantitative skills to extract the dynamics and components of activity from tasks and recordings in human subjects. The results of your work will play a key role in informing the development of neuroimaging and neuromodulation hardware, imaging software and neural decoding models, and you will have the opportunity to contribute directly to all of these research areas.

You will join a rapidly growing team building technology that will change how humans interact with computers. You will work alongside Optical Engineers, Hardware Engineers, Neuroscientists and Machine Learning Engineers to create (i) a new form of neuroimaging and neurostimulation, and (ii) a suite of BCI algorithms for interpreting recorded brain activity.

CoMind is a venture-backed start-up funded by some of the top investors in Silicon Valley and Europe.

Essential Functions Of The Role:

The role offers a unique opportunity to acquire new skills and develop your own creative solutions in an experimental environment. We are looking for creative thinkers who work well in a team, welcome new challenges, and solve problems from a fresh perspective, drawing inspiration from wide-ranging topics. You never shy away from difficult problems: you enjoy solving them and helping others do the same.
You should have a thorough understanding of, and experience with, creating and running simulations of neural activity at different levels of abstraction, from biophysical models to circuits and networks. Experience with well-established software packages for multi-scale neural simulation such as NEURON, Brian or NetPyNE (or others) is essential.
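To illustrate the kind of point-neuron network simulation that packages like NEURON or Brian handle with far greater biophysical fidelity, here is a minimal NumPy sketch of a leaky integrate-and-fire network. All parameters are illustrative placeholders, not drawn from any CoMind model.

```python
import numpy as np

# Toy leaky integrate-and-fire (LIF) network: a minimal sketch of a
# point-neuron simulation. Every parameter below is illustrative.
rng = np.random.default_rng(0)

n, steps, dt = 100, 1000, 1e-4          # neurons, time steps, step size (s)
tau, v_th, v_reset = 20e-3, 1.0, 0.0    # membrane time constant, threshold, reset
w = rng.normal(0.0, 0.05, (n, n))       # random recurrent weights
i_ext = 1.2                             # constant suprathreshold drive

v = rng.uniform(0.0, v_th, n)           # random initial membrane potentials
spike_counts = np.zeros(n, dtype=int)

for _ in range(steps):
    spiked = v >= v_th                  # neurons crossing threshold this step
    spike_counts += spiked
    v[spiked] = v_reset                 # reset the neurons that spiked
    syn = w @ spiked.astype(float)      # instantaneous recurrent synaptic input
    v += dt / tau * (-v + i_ext) + syn  # leaky integration toward the drive

print("mean firing rate (Hz):", spike_counts.mean() / (steps * dt))
```

A dedicated simulator replaces this Euler loop with exact or adaptive integration, conductance-based synapses and, in NEURON's case, full multi-compartment morphologies.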
You will use a strong quantitative and analytical background to analyse human neural recordings and extract low-dimensional projections (GPFA, PCA, etc.) that capture and describe neural dynamics effectively, then use induction to develop intelligible task/dynamics models from this analysis.
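As a minimal sketch of the projection step above, the following NumPy snippet recovers a low-dimensional trajectory from a synthetic multichannel "recording" using PCA via the SVD (GPFA adds a temporal smoothness prior that is not shown here). The latent signals, channel count and noise level are all invented for illustration.

```python
import numpy as np

# PCA sketch: embed a known 2-D latent trajectory in 50 noisy channels,
# then recover a low-dimensional projection of the population activity.
rng = np.random.default_rng(1)

t = np.linspace(0, 4 * np.pi, 500)
latents = np.stack([np.sin(t), np.cos(2 * t)], axis=1)   # (500, 2) latent dynamics
mixing = rng.normal(size=(2, 50))                        # latent-to-channel map
data = latents @ mixing + 0.1 * rng.normal(size=(500, 50))

# PCA: centre the data, take the SVD, project onto the top k components
centred = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
k = 2
projection = centred @ vt[:k].T                          # (500, 2) low-D trajectory
explained = (s[:k] ** 2).sum() / (s ** 2).sum()

print(f"variance explained by {k} PCs: {explained:.2f}")
```

Because the synthetic data are dominated by two latent signals, two principal components capture most of the variance; with real recordings, choosing k and interpreting the recovered dynamics is where the modelling work lies.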

You should be able to work effectively and cooperatively with a diverse, multidisciplinary team of engineers and physicists, and communicate your needs, concepts, and ideas clearly so the team can turn them into reality. You should have an entrepreneurial mindset and a very strong work ethic. You are a fast learner, able to adapt to new problems even in areas where you have less experience.

You Should Apply If You Have:

  • PhD in Theoretical/Computational Neuroscience, or a Master’s degree in the same field plus 3+ years of relevant research or industry experience.

  • Experience creating and running simulations of neural activity and dynamics at different scales from biophysics to populations/networks.

  • Papers published in peer-reviewed journals (please attach them to your application), or conference communications, in which you made a key contribution to this area of research.

  • Experience using well-established software frameworks for neural simulations (NEURON, Brian, NetPyNE or others).

  • Good communication skills, i.e. the ability to abstract the key properties and insights from your modelling work and explain them clearly to non-neuroscientists.

  • The ability to work collaboratively: your work will feed on and feed into the research work of optical and machine learning engineers and neuroscientists.

  • Great programming skills: you write and maintain intelligible code that others can understand and that runs fast and efficiently.

  • Experience exploring, analysing and visualising data.

  • Experience using TensorFlow or PyTorch, as well as NumPy, Pandas and Scikit-learn.

  • Creativity, fearless thinking and thriving in collaboration, brainstorming and bouncing ideas around with colleagues.

  • A highly creative, first-principles approach to hard problems, together with natural leadership and people skills.

Additional Preferred Skills:

  • Experience in approaches to solving inverse problems (EEG, MEG).

  • Experience in dynamical systems.

  • Experience simulating the recorded signals themselves (LFP, extracellular recordings, EEG) from physical principles, in addition to the underlying neural processes.

  • 2+ first-author peer-reviewed publications on a relevant topic.

  • Experience working with large and noisy datasets.

  • Experience in procuring, analyzing, and managing complex neural activity datasets.

  • Demonstrable experience implementing tools and analyses and replicating results published by others.

  • Excellent documentation and communication skills.

  • Experience working with HPC clusters.

  • Access to a solid network in the theoretical neuroscience community.

What We Offer:

  • Competitive salary plus stock options.

  • Pension.

  • Breakfast and snacks provided.

  • An opportunity to change how humans interact with computers and the potential to save lives.

  • The chance to help grow a company and its culture from the ground up.

  • A promising career-growth path: we would much rather promote people who have exceeded expectations in their current role than hire externally.


Don't see a role which matches your skill set? Get in touch.


The future of non-invasive neural interfaces.


To learn more about how we are leading the next cognitive and computing revolution, get in touch with us at: