The Interesting Interplay Between Biological Intelligence & Artificial Intelligence

Women Leading Visual Tech: Interview with Dr. Mackenzie Mathis

LDV Capital invests in people who are building businesses powered by visual technologies. We thrive on collaborating with deep tech teams leveraging computer vision, machine learning, and artificial intelligence to analyze visual data. We are the only venture capital firm with this thesis.

It’s been a year since we started our monthly Women Leading Visual Tech interview series to showcase the leading women whose work in visual tech is reshaping business and society!

Dr. Mackenzie Mathis, the founder of the Adaptive Motor Control Lab © Alain Herzog

Mackenzie Mathis is a neuroscientist and tenure-track professor at the Swiss Federal Institute of Technology (EPFL), working within the Brain Mind Institute & Center for Neuroprosthetics. Her lab is hosted at the Campus Biotech in Geneva, Switzerland, where she holds the Bertarelli Foundation Chair of Integrative Neuroscience.

Dr. Mathis founded the Adaptive Motor Control Lab to investigate the neural basis of adaptive motor behaviors in mice, to inform future translational research in neurological diseases. Her team’s goal is to reverse engineer the neural circuits that drive adaptive motor behavior by studying both artificial and natural intelligence. The researchers use the latest techniques in two-photon and deep-brain imaging, and they also develop computer vision tools such as DeepLabCut.

Before that, Mackenzie completed her doctoral studies and was a faculty member at Harvard University. Her work has been featured in Nature, Bloomberg BusinessWeek, and The Atlantic.

LDV Capital’s Abigail Hunter-Syed spoke with Dr. Mackenzie Mathis about her work to beat neurodegenerative diseases like Lou Gehrig’s, DeepLabCut, the computer vision tool she developed, and much more! (Note: After five years with LDV Capital, Abby decided to leave LDV to take a corporate role with fewer responsibilities, allowing her more time to focus on her young kids during these crazy times.)

The following is the shortened text version of the interview.

Abby: How do you describe what you do in simple terms?

Mackenzie: Humans can do remarkable things in terms of how we express ourselves. Our speaking, walking, writing, and other activities depend on the motor system. I’m interested in understanding how the motor system across different brain areas is orchestrating adaptability and this ease with which we carry ourselves around the world. 

It sounds like a simple problem because we do it all the time with no effort, but the underlying neural mechanisms are complicated. It's still a big, unanswered question.

Dr. Mathis at her lab © Cassandra Klos

Abby: How did your fascination with it start?

Mackenzie: When I was an undergraduate, I was planning to go to medical school and I worked for a doctor all through college. I saw patients with Lou Gehrig's disease. In a few words: your motor neurons are dying in the periphery and you lose the ability to move and eventually the ability to breathe. I wondered how this could become a druggable thing. Is that possible? Will it require cell transplants and different types of therapies? That's where my fascination with the motor system and how the brain orchestrates this movement came from.

Abby: How did your initial research transition into computational neuroscience? When did you start using deep learning and why?

Mackenzie: Deep learning burst onto the scene around six years ago, when it became clear that it was going to be usable and have a massive impact on all facets of society. About four years ago, you started seeing these incredible algorithms and what they could do. It was a natural fit.

Deep neural networks were built with inspiration from the brain, in terms of the units or the nodes. These networks model what we think our neurons might do.
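As a loose illustration (this sketch is ours, not from the interview), a single unit in such a network is just a weighted sum of inputs passed through a nonlinearity, a rough analogy to synaptic integration followed by a firing threshold:

```python
import numpy as np

def unit(inputs: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """One artificial 'neuron': a weighted sum of inputs passed through a
    ReLU nonlinearity, loosely analogous to synaptic integration plus a
    firing threshold."""
    return max(0.0, float(inputs @ weights + bias))

# Two 'presynaptic' inputs with one excitatory and one inhibitory weight.
print(unit(np.array([0.2, 0.9]), np.array([1.5, -0.5]), bias=0.1))
```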

There's this interesting interplay between understanding biological intelligence on one hand and artificial intelligence on the other. That's when it came into our work. Instead of working with stem cell-derived neurons, we're now trying to build models of the system. We still use mice as a model system, and our lab does both wet and dry work to compare biological and machine intelligence.

Abby: With some of the machine learning tools that you've developed, you're leveraging a lot of imaging to understand how the brain is functioning. Can you tell me about that? How can you measure neural activity to understand what behaviors it drives?

Mackenzie: We can have a window into the brain. With mice, we can genetically engineer them such that any time a neuron fires, i.e., transmits information, it glows green; this is called calcium imaging. We can record that, look at these populations of green-glowing neurons, and try to make sense of the dynamics: how well does this firing, this glowing green, correlate with the behavior? Do the same neurons fire every time I reach my arm?
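To make that concrete, here is a minimal sketch (with invented data; `dff` and `paw_speed` are hypothetical names) of how one might correlate calcium traces with a behavioral variable extracted from video:

```python
import numpy as np

# Hypothetical data: 'dff' holds dF/F calcium traces (neurons x timepoints),
# 'paw_speed' is a behavioral variable sampled at the same rate from video.
rng = np.random.default_rng(0)
dff = rng.standard_normal((50, 1000))
paw_speed = rng.standard_normal(1000)

# Pearson correlation of each neuron's trace with the behavior.
dff_z = (dff - dff.mean(axis=1, keepdims=True)) / dff.std(axis=1, keepdims=True)
beh_z = (paw_speed - paw_speed.mean()) / paw_speed.std()
corr = dff_z @ beh_z / beh_z.size

# Rank neurons by how strongly their 'glow' tracks the movement.
top = np.argsort(-np.abs(corr))[:5]
print("most behavior-correlated neurons:", top, corr[top])
```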

We can also go back in and perturb those neural circuits. Not only can we measure them with this green fluorescent protein variant, but some tools also allow us to do light-gated activation or inhibition, which is called optogenetics. We can selectively, in specific space and time windows, shut down populations of neurons or stimulate them. That allows us to causally probe how the brain relates to behavior. If I stimulate these neurons, did they cause behavior X? Or if I shut them down, can behavior X no longer be done? This is something that was built over a decade ago by researchers at Stanford and MIT (Karl Deisseroth and Ed Boyden) and that has completely transformed the field.

Abby: When you talk about stimulating neurons to drive a specific motor behavior, how do you stimulate them?

Mackenzie: Optogenetics allows us to use engineered proteins to drive neurons to fire when you shine a light on them. In recent years, new variants that leverage far-red light have made it possible to do this almost noninvasively (of course, it still requires genetic engineering). You can put a light over the brain region and then activate it.

Abby: We consider this a visual technology because light is used to examine the gene or the neuron that's firing. Can you tell me about chemogenetics?

Mackenzie: Chemogenetics is a newer tool in the box; it was invented by Bryan Roth and his colleagues. Instead of light, you use a designer receptor. Imagine a drug that would otherwise be inert, that does nothing if you give it to a mouse or a human. You can build a receptor for that drug into subsets of cells, and when you give this otherwise inert drug, it drives the activity of those subsets of neurons.

Chemogenetics is a powerful tool, and it’s different from optogenetics: it lets you study things over longer timescales, whereas optogenetics is geared towards fine-timescale manipulations.

Having both of these tools, plus more, in the toolbox of “hacking the mouse” is a powerful platform, because, as you said, the whole world of using visual tools – computer vision and artificial networks – to analyze this data then becomes a reality too.

Abby: Mapping it back to the motor side of things: when you're doing your research and looking to understand which neurons fire in association with which motor activity, can you give me an example?

Mackenzie: We are currently interested in two avenues of motor research. One: what does the typical life of a mouse look like? What do they learn in their day-to-day life? To answer this, we study natural movements. Two: how do you learn skills? In both cases, we're trying to understand the natural, healthy state of the system so that, when it goes wrong, we can understand what these circuits' functions are.

To do this, we teach mice to play video games, which maybe sounds a little crazy, but bear with me here.

The aim is for them to learn a new skill, namely skilled games that they don't normally play. We give mice a little joystick; it's a tiny robot they can reach out and move in multiple directions. You can think of it as a simple Pac-Man game for mice. That allows us to play games with them and change the rules of the game. The animal has to learn to play in real-time, and we can watch this unfold in the behavior space. We can film them and analyze all the movements they make, while at the same time using calcium imaging and optogenetics to measure and perturb the neural circuits.

Abby: It seems like with DeepLabCut – the software that you developed to monitor their behavior – it wasn't enough for you to watch, observe, and make notes; you wanted some form of measurable, objective movement monitoring.

Mackenzie: You're right. DeepLabCut aims to leverage computer vision to track the movements of anything you can see. If you can see it, you can train this algorithm to find it. It comes from a long history of computer vision research, which we could go on about for hours.

Do you know how motion capture is done for movies like Avatar? Humans get in front of a green screen with reflective markers all over them, and those markers are easy to track in a computer vision sense. They're bright, they're salient, it could have worked... but mice don't like wearing costumes.

Abby: So you guys did try this, is what you're saying?

Mackenzie: I think it went as far as painting their fingernails with glow-in-the-dark paint to track their hands. We want to know exactly what they're doing when they grab the joystick. This is a better, non-invasive way to do it, purely through video. We can teach a deep neural network to track any points of interest: as long as a human can label it, we can train the supervised algorithm.
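For readers curious what that label-then-train loop looks like in practice, here is a minimal sketch using DeepLabCut's published Python API (project name, experimenter, and video paths below are hypothetical, and exact options may vary by version):

```python
import deeplabcut

# Create a project from one or more videos (paths are hypothetical).
config = deeplabcut.create_new_project(
    "joystick-reach", "researcher", ["/data/mouse1.mp4"], copy_videos=True
)

deeplabcut.extract_frames(config, mode="automatic")  # select frames to label
deeplabcut.label_frames(config)      # GUI: a human labels the points of interest
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)     # fine-tunes a pretrained backbone
deeplabcut.evaluate_network(config)

# Apply the trained network to new videos.
deeplabcut.analyze_videos(config, ["/data/mouse2.mp4"])
```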

Abby: What was the driving factor behind making DeepLabCut open source?

Mackenzie: I'm standing on the shoulders of giants in computer vision, and people in that field have done so much for human pose estimation. Our contribution in many ways was to say, "Look, we can take these incredible algorithms, use a trick called transfer learning, which we can talk about if you want, and apply this to any animal or species that you want." It’s a matter of giving back to the field. It's a pay-it-forward situation, if you will.

I'm a proponent of open science. I like democratizing AI and making it accessible to people.
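Transfer learning, mentioned above, means starting from a network pretrained on a large dataset (e.g., ImageNet) and fine-tuning it on a small set of labeled frames. Here is a generic sketch of the idea in PyTorch, not DeepLabCut's actual implementation: a pretrained ResNet backbone with a new deconvolution head predicting one heatmap per tracked bodypart.

```python
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights  # torchvision >= 0.13

class KeypointNet(nn.Module):
    """Pretrained backbone + new deconvolution head: one heatmap per bodypart."""

    def __init__(self, num_bodyparts: int):
        super().__init__()
        backbone = resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)
        # Keep everything up to (but not including) the pooling/classifier layers.
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.head = nn.ConvTranspose2d(2048, num_bodyparts,
                                       kernel_size=4, stride=2, padding=1)

    def forward(self, x):
        return self.head(self.features(x))  # (batch, bodyparts, H, W) heatmaps

model = KeypointNet(num_bodyparts=4)
# Freeze most pretrained weights; only the later layers and the new head are
# trained on the small labeled dataset.
for p in list(model.features.parameters())[:-20]:
    p.requires_grad = False
```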

Abby: What other scientists have you seen uniquely using DeepLabCut that you were impressed by?

Mackenzie: There are so many applications that I never could have imagined when we first released this! I find the research on octopuses in the wild interesting. There's also a little organism called the Hydra, which is transparent and can extend its arms, grow new arms, and do incredible things.

We touched on the topic of surgical robots in our interview with Carol Reiley, the Mother of Robots. She previously worked for Intuitive, the market leader in the surgical robotics space.

There have been some great reports of using it in human applications too, for example, in surgical robots. Applications range from quantifying surgeons' movements to teach them to operate better, to gait analysis in humans with Parkinson's.

When you put something out in the world that gets used, your efforts aren't wasted, and at the same time, it’s rewarding to see the incredible breadth of ideas and applications people find for it.

Abby: We have a portfolio company – CIONIC – that's using pose estimation. With smart clothing, they can delay the need for crutches in people with Parkinson's disease and others. Something that you developed to look at mice is going to be used to help people walk more safely and delay the serious harm and challenges associated with their disease.

Mackenzie: It's kind of full-circle and takes your breath away. My grandfather passed away; prior to that, he had a stroke. At the time, there was just a hospital bed, one of the saddest places, and I felt helpless given the lack of tech. Now we have so much AI and technology in our day-to-day lives, and yet a hospital bed is not a smart tool. I hope this changes.

Smart clothing and smart devices are the future. It's not about replacing humans. It’s all about making the world a better place. It's about augmentation. 

Dr. Mathis is showing some mouse video games © Photo by Cassandra Klos

Abby: How does your research on the neurological workings of mice relate to fighting something like Lou Gehrig's disease?

Mackenzie: Mice can be an incredible platform for lots of different technologies, anything from optogenetics to pose estimation. These tools have spread beyond the mouse, but the ability to use mice for this has been impactful. The other side is that there have been mouse models of these different diseases, especially neurodegenerative diseases, but they haven't quite panned out: many of the drugs that worked in mice have failed in human clinical trials. A lot of this has to do with how we measure and think about behavior.

We care about curing diseases but also about quality of life, whereas historically in biomedical research we look at the data and ask ourselves, "Did it extend the mouse's life by X number of days?"

I hope to be one of the players in that space reassessing some of these diseases in mice. Mice and humans have a lot of differences, but we also have a lot of similarities: we both have a motor cortex, a cerebellum, and a spinal cord that drive motor control and locomotion.

We’ve seen in the past that understanding one circuit at a higher resolution in mice – which is much more accessible and ethical than human work – can translate into the human domain.

Abby: In five to ten years, how are we going to treat neurodegenerative disorders? Will we be able to rid the world of them?

Mackenzie: It's a tough question because there's a side of you that always wants to be the extreme optimist. I don't know if we'll be able to get rid of them, but we're already on a path to looking at these questions from a more holistic standpoint.

I believe machine intelligence will help us in multiple different ways. 

In 5 to 10 years, many more of us will be wearing smart technology to monitor not only our heart rate but our broader health and other things that come along with it. We probably all have our smartwatches on now, but you can imagine where this toolkit goes. These types of products will also push further into the medical domain.

For neurodegeneration, we'll have much more monitoring, a better understanding of symptoms, and more individualized treatment in terms of dosage. It's not sufficient to see your doctor every few months, fill out a five-minute survey, and have 15 minutes to try to say all the right things. As a patient, you could talk for hours about your daily life, but that's not what you're asked. You end up doing tests rated 1 to 4, e.g., “how well am I walking today?”. It shouldn't be that simple, because the brain isn't simple and our biology isn't simple. In the next 5 to 10 years, or sooner, we’ll probably have this holistic viewpoint and deeper metrics that are meaningful. Even if these diseases aren't eradicated, I hope diagnosis will be transformed.

Abby: A couple of years ago, we wrote a report on healthcare. We reviewed where visual technologies are being implemented to revolutionize how healthcare is administered and practiced. We realized that so much of healthcare is going to be data-driven by computer vision algorithms like yours. We think patients are going to own their data and choose to share it with their doctors as they go to them.

Mackenzie: Absolutely. Like you touched on, it also gets into questions about ownership of data and how you choose to share it, or how you are going to share terabytes of videos. 

We have to be able to distill it down to something meaningful, but how to do that smartly raises questions, specifically ethical questions in AI, which we're already seeing. We need to address them before deploying these technologies.

Abby: The other day I was listening to a podcast where Bill Gates said that researchers and technologists always have to be ardent optimists to keep trying to do things that are otherwise believed to be unachievable, like those still putting billions of dollars into malaria research despite repeatedly hitting dead ends. That might be the same case with neurodegenerative disorders: up until now, there have been a lot of dead ends and a lot of money thrown at the problem. It sounds like AI-powered tools like yours are one way to open up this field and come up with great solutions.

Mackenzie: We all have our biases and I have mine, but I see the future where this will be much more integrated with patient care, diagnosis, and even drug discovery. 

It'll matter on a lot of different fronts, and as you said, it might come from little tweaks in many domains that people are already working in. It's hard to imagine that one day it'll be, “oh, computer vision solved ALS.” What we can do is provide tools and start thinking about the problems in new ways. This is what has driven innovation in science and the healthcare industry.

Abby: There are a lot of similarities between building out your research lab and taking the first steps to get a company off the ground. What was your one big learning from getting your lab started?

Mackenzie: Running a lab is almost like starting a small business; it's a great training ground in that respect too. It’s important to surround yourself with people who inspire you, people who are better than you. Always try to hire people with diverse backgrounds and attitudes toward problems. Our team is diverse, and that's worked well for us. Everyone comes to the table with their own perspective and skillset. Then you learn to speak the same language.

Abby: Is there one piece of advice that you would give to researchers who are looking to start something of their own?

Mackenzie: Be fearless and have fun. Let the data speak to you and don't be afraid of it. That's the best advice I can give to my students or anyone working in this space. 

None of us were born programmers; none of us were born understanding all of these things. It's learnable. We have the most amazing computer sitting on top of our shoulders.

Abby: Talking about computers...If they didn't exist, what would your career choice have been?

Mackenzie: I grew up in the countryside and was an avid equestrian. In a different life, I probably would have been a horse trainer, which maybe has some interplay with the fact that I now teach mice to play video games… If computers didn't exist, I would be doing that, or trying my best to figure out how to make a computer.