Speaker 1: Hello everyone. My name is Sarah Biber, and I am the program director at the National Alzheimer's Coordinating Center (NACC), which is directed by Bud Kukull, who is here in the audience. I'm going to be calling on some other folks in the audience as well. I also have the opportunity to work closely with Sean Mooney, who spearheads this symposium as well as many other exciting things at the University of Washington.

I want to start off today with a quick poll. How many of you have done research on Alzheimer's disease? Okay. And how many of you knew about NACC before I got up here today? Okay, not as bad as I thought. So for the next 20 minutes, I'm going to try to get you all fired up and excited about working on Alzheimer's disease, and I'm also going to try to illustrate how NACC data and the NACC data platform have been an incredibly powerful tool for AI-driven discovery in the Alzheimer's space.

I don't need to convince many folks here that Alzheimer's is terrible. It is currently the seventh leading cause of death in the United States. An estimated 6.5 million people over the age of 65 are living with Alzheimer's disease in the US alone, roughly 30 million worldwide, and the prevalence is expected to go up: by 2050, it is expected to nearly double. This is a huge problem, and the disease is an incredible burden on patients and families. It's also an economic burden: costs in the US are currently projected at around $321 billion, and that is expected to rise to nearly a trillion dollars by 2050. So a huge problem, and an incredibly devastating disease.

Alzheimer's disease progresses along a continuum, from preclinical Alzheimer's disease to mild cognitive impairment to increasing degrees of dementia, from mild to moderate to severe. I want to talk about some of the key challenges and opportunities in Alzheimer's disease research right now, and encourage all of you to think about how AI and data science can be incredibly powerful tools for investigating this disease.

One of the key challenges right now is to advance early detection. One of the things you should know about Alzheimer's disease, if you don't already, is that biomarker changes are evident 15 to 20 years before neuronal damage and cognitive symptoms occur. So I want you all to take a look at that preclinical window there. You can see this figure goes from preclinical to MCI to dementia, and those red and green lines show the levels of amyloid and tau in the brain. I'll remind you that amyloid and tau, plaques and tangles, are the hallmark of Alzheimer's disease, and their levels are rising in the brains of these individuals prior to any neuronal damage or any cognitive symptoms. Early detection is critical because this window, prior to disease onset, provides an important opportunity for testing clinical interventions. And you may be saying, well, there aren't any cures right now, right?
And yet early detection is critical: for detecting these biomarker changes; for running clinical trials earlier in the course of the disease, with the opportunity to modify the disease course before damage has occurred; for developing therapeutics that prevent AD symptoms and disease progression; for identifying patients for current and future disease-modifying therapies; and for counseling patients and families on what to expect and connecting them to care early on.

Many of you have probably heard of lecanemab; it's been all over the news. This is a drug that came out last year and is showing incredible promise. In its clinical trial, they were able to detect these early changes, ran the trial on patients with early-stage Alzheimer's disease, and showed that the drug, which specifically targets amyloid, slowed the progression of the disease in those individuals. One thing that's also needed is cheaper, blood-based biomarkers for detecting these biomarker changes, and there are opportunities to develop digital biomarkers as well, both to detect these early changes and to test the outcomes of clinical trials. And I should mention that AI presents a really exciting opportunity to do better early detection in that window when clinical interventions can still have a big impact on the course of the disease.

Another challenge I want to mention is that it's really critical that we do better differential diagnosis. Dementia is an incredibly heterogeneous condition. Within Alzheimer's disease alone there are many different subtypes that you can detect via MRI and via neuropathology. In addition, Alzheimer's disease rarely occurs alone in the brain of an individual; it's almost always accompanied by other co-pathologies, so each brain has its own cocktail of different dementias. That's found in anywhere from 50% to 75% of individuals and can include co-pathologies such as cerebrovascular disease, Lewy body disease, frontotemporal dementia, Parkinson's disease, and other conditions. So it's really important to be able to do this differential diagnosis: to develop better targeted therapies for specific etiologies, to help with prognosis and expected clinical progression, and to open the door to precision medicine approaches for therapeutic intervention.

And new etiologies are still being discovered. Just recently, a new condition called LATE was described, which mimics Alzheimer's disease but is caused by dysfunction of a protein called TDP-43, and that particular type of dementia was discovered using NACC neuropathology data. So AI presents a really powerful tool here too, for differentiating and disentangling these different etiologies so that better therapeutics can be developed.

Another thing I want to point out is that it's really important to be able to identify and stratify risk factors and protective mechanisms. Numerous factors contribute to and modify the onset and progression of Alzheimer's disease symptoms.
Getting older is the biggest risk factor, unfortunately, which we can't really avoid, but there are also a number of potentially modifiable factors, such as diet, socioeconomic factors, sleep, and other lifestyle components; the Mediterranean diet, you've probably heard about that. So it's really important to understand the impact of these different factors, many more than are even listed here in red, so we can understand the contribution of the modifiable ones, identify and stratify risk factors and protective mechanisms to better predict individual disease prognosis, and have the potential to delay or prevent AD onset and progression. Once again, AI presents a really exciting opportunity to stratify and understand the impact of these different risk factors.

So, can artificial intelligence advance Alzheimer's research? The answer is definitely yes. I'm not going to go into these papers in detail, but here are three recent papers that illustrate some of the areas in which AI is helping to advance Alzheimer's research. There have been some really promising papers showing that AI can be incredibly effective for early AD detection; in the paper I show here, the algorithm performed better than many of the specialized clinicians doing this diagnosis early in the disease. There are also many papers showing that AI can be really effective for predicting AD progression; this particular paper looked at the transition from mild cognitive impairment to Alzheimer's disease and was able to predict, with high accuracy, which patients would make that transition within a three-year period. And as I mentioned, AI can be a really powerful tool for patient subtyping; it can find patterns and classify subtypes in ways that humans cannot do as effectively, and this last paper on the right illustrates an example of how that's been done. Lots of people are doing this work, and I encourage you to look into it: there's an increasing number of papers using machine learning and deep learning to better understand Alzheimer's disease as well as other neurodegenerative diseases.

Okay, and with that, I want to give an introduction to NACC. The National Alzheimer's Coordinating Center serves as the data, collaboration, and communication hub for NIA's Alzheimer's Disease Research Center program. There are 37 Alzheimer's Disease Research Centers (ADRCs) spread out across the country; you can see here where they're located geographically. These centers all contribute data to NACC; they actually get paid to contribute that data. So how does this work? These independent ADRCs, including one at the University of Washington, enroll research participants who are healthy, at risk, or showing dementia symptoms, and they collect standardized longitudinal data through annual visits using tools that are co-developed and provided by NACC. This is an incredible longitudinal data set: we've been doing this for more than 23 years, and we have up to 18 years of data on some of these participants. NACC collects, harmonizes, integrates, and shares this ADRC data with researchers all over the world. And we have a lot of data.
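Concretely, the annual-visit design described above produces one record per participant per visit. Here is a minimal sketch in Python of what that shape of longitudinal data might look like, and how the transition in cognitive status that longitudinal tracking is meant to capture could be pulled out of it. The table layout and column names are invented for illustration; they are not NACC's actual variables or file formats.

```python
# Sketch: longitudinal visit records, one row per participant per annual visit.
# All column names and values are hypothetical stand-ins.
import pandas as pd

visits = pd.DataFrame({
    "participant_id":   [1, 1, 1, 2, 2, 3],
    "visit_year":       [2015, 2016, 2017, 2016, 2017, 2017],
    "cognitive_status": ["normal", "MCI", "dementia", "normal", "normal", "MCI"],
})

# Sort each participant's visits chronologically, then flag the first visit
# at which cognitive status changed -- the pre- to post-onset transition
# that a longitudinal design can capture and a cross-sectional one cannot.
visits = visits.sort_values(["participant_id", "visit_year"])
visits["prev_status"] = visits.groupby("participant_id")["cognitive_status"].shift()
transitions = visits[visits["prev_status"].notna()
                     & (visits["cognitive_status"] != visits["prev_status"])]
print(transitions[["participant_id", "visit_year", "prev_status", "cognitive_status"]])
```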
So: we have data on more than 47,000 participants at NACC. More than 16,000 of those are active participants, as in they're still alive, and participants are added every year. We have data on more than 174,000 clinical assessments, and as I mentioned, up to 18 years of data on some participants. We also collect neuropathology data from a large portion of these participants after they die, up to 58% of them; we have more than 7,000 neuropathology data sets in our system. And people are using this data all over the world. This number is definitely an underestimate, because not everyone remembers to cite NACC data when they publish their papers, but more than 1,000 published studies are currently using NACC data.

I want to talk a little bit about what's really unique about the value of NACC data. It is one of the largest, most comprehensive, longitudinal, standardized clinical and neuropathological data sets in the world. Because it is longitudinal, we're capturing data for participants prior to and after dementia onset, so we have a chance to capture that transition, and we have the infrastructure in place for long-term tracking of these individuals into the future. As I mentioned, we have data on normal cognition, MCI, and dementia, as well as other pathologies.

Another unique thing about NACC data is that it is incredibly rich. The clinical phenotyping data we're collecting is standardized, longitudinal, multi-domain neurocognitive data. We have extensive sociodemographic data, which can be really valuable for all sorts of things, including supporting genomic analyses. We also have extensive neuropathology data, also standardized, so you can explore diverse etiologies within the NACC data set, including mixed pathologies. And we have non-standardized, heterogeneous MRI and PET data, also longitudinal, currently available within our data set. So one of the nice things about NACC data is that the clinical, neuropathology, and MRI/PET data are already integrated, which enables multimodal analysis and better precision studies.

And NACC data is set to become a lot more multimodal, so I want to tell you a little bit about where we're headed. I already mentioned our existing data streams: the longitudinal clinical data, the neuropathology data, and the non-standardized MRI/PET data, which we already have integrated. All of NACC's data right now is moving into the cloud; we are building a modern and secure multimodal data integration and harmonization platform. This is all underway right now. You can go to Ben Keller's poster to learn a little more about it and ask him questions; he's helping to spearhead a lot of that. In addition, we are coming up with new tools for sharing that data with researchers around the world.

I want to talk a little bit more about the data we're integrating. We are working with various partners to integrate additional metadata and analysis data for standardized MRI and PET data. That initiative is called SCAN, and NACC is a critical partner in it. We are in the process right now of collecting standardized MRI and PET data from across the ADRC program, the 37 centers I told you about.
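To make the idea of participant-level multimodal integration above concrete, here is a minimal sketch assuming hypothetical clinical, neuropathology, and imaging-metadata tables keyed by a shared participant ID. None of these table or column names come from NACC; the point is only the basic shape of the join.

```python
# Sketch: joining clinical, neuropathology, and imaging-metadata tables on a
# shared participant key -- the basic shape of multimodal integration.
# All table and column names are hypothetical.
import pandas as pd

clinical  = pd.DataFrame({"participant_id": [1, 2, 3],
                          "last_cognitive_status": ["MCI", "normal", "dementia"]})
neuropath = pd.DataFrame({"participant_id": [1, 3],
                          "braak_stage": [3, 5]})   # autopsy data exists for only some
imaging   = pd.DataFrame({"participant_id": [2, 3],
                          "mri_scan_count": [4, 2]})  # imaging metadata

# Left joins keep every participant; missing modalities appear as NaN, which
# is the realistic case when not everyone has autopsy or imaging data.
merged = (clinical
          .merge(neuropath, on="participant_id", how="left")
          .merge(imaging, on="participant_id", how="left"))
print(merged)
```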
We partner closely with NCRAD, another Alzheimer's big-data center, to integrate biospecimen data for ADRC participants, including blood biomarker data. We already have some of that data available for participants through NCRAD, but we're going to be integrating a lot more of it into our database. In addition, genomic and genetic data are available for a lot of ADRC participants, and we're going to be working to integrate metadata and analysis data for that genetic data into the NACC database as well.

We also are going to be bringing new data streams directly into the NACC database. That includes digital biomarker data (we are about to launch a pilot focused on that), as well as digital neuropathology data, which will open the door to all sorts of additional AI investigations, and electronic health record and CMS data. We currently have a pilot, focused on COVID-19 and Alzheimer's disease, that is working to bring electronic health record data into our database and integrate it with all of these other data streams.

You can access data right now at NACC through a quick-access file; I'll tell you at the end how you can do that. In the future, we're also going to be building real-time data search, visualization, and access tools, building off a tool you may have heard of from Nick Dobbins called Leaf. Another nice thing about our platform is that it can provide sandboxes for collaborative data analysis, and it is ready for AI-driven discovery.

So I want to talk a little bit about leveraging NACC data for AI-driven Alzheimer's discovery, and I'm just going to highlight two papers, both of which I highly recommend. The first one developed a new approach for AD diagnosis based on 3D deep convolutional neural networks using standardized structural MRI data. Another organization, ADNI, has lots of imaging data, incredibly standardized MRI data, and the authors used it to develop an algorithm that robustly performed early detection of Alzheimer's disease; it actually performed better than some clinicians in that work. The model relied on a wide range of regions associated with Alzheimer's disease and was validated using both the standardized MRI data from ADNI and the heterogeneous MRI data available through NACC. I think this is a really good example of how NACC data can be used: there's a lot of value in this heterogeneous MRI data, which more closely mimics the real-world clinical data you might encounter, and you can take algorithms developed on incredibly standardized MRI data and then test them within the NACC data set.

Some key takeaways: this model was able to accurately differentiate between cognitively normal subjects and subjects with either MCI or mild Alzheimer's disease. It can be used to forecast progression, predicting which MCI subjects would be faster to progress to dementia. And these deep neural networks were able to learn to identify imaging biomarkers that are predictive of AD and leverage them to achieve accurate early detection of the disease. So a very exciting development.
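For readers who want a feel for the architecture family this paper used, here is a toy 3D convolutional network in PyTorch. It is a minimal stand-in under assumed layer sizes and an assumed three-way label set (cognitively normal vs. MCI vs. mild AD); it is not the published model.

```python
# Sketch: a tiny 3D convolutional network for volumetric MRI classification.
# Layer sizes, input resolution, and class count are illustrative assumptions.
import torch
import torch.nn as nn

class Tiny3DCNN(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1),  # single-channel MRI volume
            nn.ReLU(),
            nn.MaxPool3d(2),                            # halve each spatial dimension
            nn.Conv3d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                    # one value per channel
        )
        self.classifier = nn.Linear(16, n_classes)      # e.g. CN vs MCI vs mild AD

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)
        return self.classifier(h)

# One batch of 2 fake 64x64x64 volumes: (batch, channel, depth, height, width).
model = Tiny3DCNN()
logits = model(torch.randn(2, 1, 64, 64, 64))
print(logits.shape)  # torch.Size([2, 3])
```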
Another new paper I want to highlight is this one here. Essentially, they created a deep learning framework for accurate differential diagnosis of normal cognition, MCI, and Alzheimer's disease, as well as for differentiating non-Alzheimer's dementias. NACC data was used in this case to develop the model, several models actually, capable of classifying cognitive status. One model used just MRI data; another used non-imaging data such as demographics, medical history, functional assessments, and neuropsychological test results; and they also developed a fusion model that combined non-imaging and MRI data (a rough sketch of that fusion idea follows at the end of this transcript). One of the nice things about this model was that it was interpretable and was validated on multiple heterogeneous independent cohorts; you can see those cohorts listed on the top left. And it linked computational predictions with well-known anatomical and pathological markers of neurodegeneration from NACC neuropathology data. So they used NACC data to develop the algorithm, to test it, and to validate it against actual neuropathology data. Some key takeaways: this model approached clinical standards of diagnosis, so it could be a potentially valuable tool for physician-assisted diagnosis; it was comparable to expert assessment by neurologists and radiologists; and it determined dementia status from routinely collected clinical care data.

I mentioned that NACC data is about to become even more valuable for AI discovery, because we are expanding our data modalities. We're working to collect additional sociodemographics, including social determinants of health data. We're going to be integrating additional genetic and genomic data, as well as additional biospecimen and biomarker data. We have both heterogeneous and standardized MRI and PET data, with digital neuropathology data, digital neuropsychological tests, and EHR data all coming online. This is going to be really powerful for developing new AI algorithms and tasks, and the hope is that it will lead to better outcomes and advancements, including early diagnosis tools, better patient subtyping, better prediction of conversion from MCI to Alzheimer's disease, identification of additional risk factors, and helping folks understand how the disease is going to progress. We also are interested in running community challenges to develop these algorithms using open data science, so stay tuned for more information on that; we hope to launch that within the coming year.

And you can access data at NACC right now. It takes about 15 minutes to submit a data request. You can go to naccdata.org to do that and to learn about the additional data we have available, and you'll get the data within about 48 hours, which is great.

I want to acknowledge the incredible team at NACC. I'd love for folks in the audience right now who are part of NACC to raise their hands. I see John, I see Ben Keller, I see Bud, I see Zach, I see Brendan. There are a number of folks here you can speak with to learn more about NACC and the work we're doing. Thank you all for your time, and I'm happy to take any questions.
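The fusion sketch promised above: a minimal late-fusion classifier combining an imaging-derived embedding with non-imaging features (demographics, medical history, neuropsychological scores), in the spirit of the fusion model described in the talk. The dimensions, layer sizes, and four-way label set are illustrative assumptions, not the paper's architecture.

```python
# Sketch: late fusion of an imaging embedding with non-imaging tabular
# features. All dimensions and the label set are hypothetical.
import torch
import torch.nn as nn

class FusionClassifier(nn.Module):
    def __init__(self, img_dim: int = 16, tab_dim: int = 10, n_classes: int = 4):
        super().__init__()
        # Encode the tabular (non-imaging) features into a small representation.
        self.tab_encoder = nn.Sequential(nn.Linear(tab_dim, 16), nn.ReLU())
        # Concatenate both representations, then classify jointly:
        # e.g. normal vs MCI vs AD vs non-AD dementia.
        self.head = nn.Linear(img_dim + 16, n_classes)

    def forward(self, img_feat: torch.Tensor, tab_feat: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([img_feat, self.tab_encoder(tab_feat)], dim=1)
        return self.head(fused)

# Fake imaging embeddings (e.g. from a pretrained CNN) plus tabular features.
model = FusionClassifier()
print(model(torch.randn(2, 16), torch.randn(2, 10)).shape)  # torch.Size([2, 4])
```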