Over the last 15 years, I have met countless medical school applicants: some fueled by intense ambition, some exhausted, some used to excelling, but all overwhelmed. As the founder of the medical school admissions consulting company Inspira Advantage, I am familiar with the traditional ways applicants cope with that feeling. In the last year or so, however, medical school applicants have begun turning to something else entirely: artificial intelligence.
At Inspira Advantage, we recently ran a survey of 145 pre-medical students across the U.S. to understand their mental health during the application process. Nearly 60% of med school applicants said they frequently feel overwhelmed by stress or anxiety while preparing their applications. Around 50% said they feel the need for therapy during the same tumultuous process. What came next, however, felt like a wake-up call.
More than half of the respondents said they were comfortable sharing sensitive mental-health information with an AI bot. More than 55% said they believe AI tools can improve their mental health, not only during medical school admissions preparation but also later in the healthcare field. It's clear that these aspiring healthcare professionals need help, and AI is showing up for them without the price tag or the scheduling constraints.
Another critical point our survey uncovered was how these aspiring doctors believe AI can improve their mental health. Close to 59% believed it would be by offloading tasks to AI bots, while 29% said they would discuss their mental health directly with an AI bot. So it isn’t just AI’s role as a low-level assistant that students are relying on; it’s also its use as an emotional buffer.
Before we crown AI as the new study buddy and mental health savior on campus, however, we need to consider certain gaps.
Players in the medical school application space must rethink the support structures available to these young applicants. Despite the presence of mental health services at colleges, some students are actively turning to free and ever-available tools like AI to work through their mental health struggles. According to a 2024 study, 50% of undergraduate students surveyed did not know how to get help on campus. If universities and medical schools can responsibly use AI to make human-centered mental health services more accessible and affordable, they can create safer environments for our nation’s future doctors, ones that are not overly or solely dependent on AI.
Furthermore, awareness of AI’s limitations needs to be built into curricula as well. Even the applicants themselves recognized this need: Nearly two-thirds of survey respondents agreed that future physicians must understand the limitations and risks of using AI, both before and after medical school.
It is the responsibility of educational institutions, where students spend most of their time, to educate them on such perils while simultaneously adopting AI practices that better prepare the next generation of doctors. Today’s medical students will inherit a world where patients walk in having already diagnosed their conditions, perhaps even with a customized care plan crafted using AI. Students need to speak that language fluently or risk being overshadowed by it, and schools must equip them with these skills starting in their early undergraduate years.
Once students are in medical school, institutions can introduce AI modules in pre-clinical training or elective courses on the use of AI in healthcare. These should be accompanied by courses on the ethical use of AI and on the ways it can endanger patient safety.
The time to fight AI is behind us. While accepting it as part of the medical school landscape, what we can do is establish policies and principles that guide students in their use of this currently open-to-all tool, both as a cognitive assistant and as an emotional anchor. AI does come with risks, but with better literacy and ethical use, we can ensure that we wield this tool responsibly, minimizing the ways it can harm us and maximizing how it equips the next generation of doctors to care for themselves before saving others.
Arush Chandna is a member of the Dartmouth Thayer School of Engineering Class of 2016 and a co-founder of Inspira Advantage. Guest columns represent the views of their author(s), which are not necessarily those of The Dartmouth.



