The Dartmouth
December 9, 2025 | Latest Issue
World’s first campus wellness AI under development at Dartmouth

The program, Evergreen.AI, is expected to enter beta testing in early 2026.

This article is featured in the 2025 Homecoming Special Issue.

The world’s first college-specific wellness artificial intelligence, named Evergreen.AI, is being built at Dartmouth. The $5 million project will feature personalized wellness check-ins and organizational tools and will offer Dartmouth-specific social advice.

Evergreen is being promoted as a “by-students, for-students” platform. It is the product of almost two years of design and more than 100 undergraduate employees, according to Geisel School of Medicine psychiatry professor Lisa Marsch, a faculty advisor on the project.

“The goal is to give someone a resource that really understands students and knows the life of this community [and] language of Dartmouth, so that it feels relevant to student life and can help students meet their goals,” Marsch said.

A “closed beta” — during which project managers will make Evergreen available to a select group of test users — planned for early 2026 will use pre-structured dialogues rather than large language model prompting, according to project manager Jennifer Li ’27. Full large language model integration is expected by 2028. Pre-structured dialogues allow only a pre-scripted set of user prompts, rather than the dynamic prompting available to users of generative AI; the two-year lag before the integration of LLMs allows time for model development and safety testing.

Evergreen will “scale” with the data students give it, according to user experience designer Rachael Huang ’27.

“If you give it consent to your location, your health data or your assignments, it can passively track different aspects of your life and notice if something’s not going well,” Huang explained.

While the decision to give Evergreen comprehensive data access is entirely optional, some students are skeptical of the premise overall. 

One DALI Lab member, who is not working on the project, expressed concerns over both data privacy and the untested nature of chatbots in mental health.

“There are benefits to mental health apps,” said the DALI member, who was granted anonymity in order to share their opinions candidly. “The issue with introducing AI into that is guardrails. What if [the AI] goes haywire?”

Several recent tragedies have raised concerns. In August 2025, a 16-year-old boy took his own life after receiving advice on how to tie a noose from ChatGPT-4o, according to The New York Times. In October 2024, The New York Times also reported that a 14-year-old boy took his own life after becoming emotionally attached to a chatbot on Character.AI.

AI in mental health is “something I’m opposed to now, because we don’t have answers to questions of liability,” the DALI member said. “Doctors can mess up. Doctors mess up all the time, but there’s legislation governing that. With stuff like this, we haven’t gotten there yet.”

The DALI member also brought up privacy concerns. AI companies “can access whatever data you give them that’s part of the user agreements,” including prompt data, the DALI member said. “A lot of consumers probably do not know that.”

When asked what privacy protections are in place, computer science professor Nicholas Jacobson, a technical lead for the project, said Evergreen is “not an effort to collect data” on Dartmouth students. 

“The goal is only to actually use [student data] to derive interventions that would promote their own goals and their ability to flourish on campus,” Jacobson said. “To be clear, we are not collecting and trying to use this data for other purposes.”

“The data wouldn’t be retained anywhere for any long period of time, like kept locally,” Jacobson added. “For any [personal] data, we’ll try to process that and then delete it.”

Evergreen will use a server purchased and hosted by Dartmouth, with the actual data processing done at a physical location contracted out by the College, according to Jacobson. The AI is based on open-source models and will not send prompts out to commercial firms; the College will own the Evergreen large language model as intellectual property.

Evergreen’s interventions are trained on more than 200 dialogues rooted in behavioral science research, according to project assistant Keene Dampal ’28. 

The dialogues themselves were written by more than 80 Dartmouth students from a variety of academic and personal backgrounds, according to Dampal. 

“We’re trying to make sure that it’s not just a small portion of the Dartmouth population … to make sure that this platform actually knows what being a Dartmouth student is like,” Dampal said.

“Highly relevant training data” reduces the likelihood of model hallucinations, Jacobson wrote in an email statement to The Dartmouth.

“Hallucinations broadly happen when models lack training data in the given area and are doing latent extrapolations,” Jacobson explained. “We plan to customize this to Dartmouth to reduce the likelihood of this.”

While 210 dialogues have been completed, students will continue to write more after the project goes live, according to Dampal. Further, the project’s $5 million in funding is less than half of its $16.5 million fundraising goal.

“This larger [fundraising] target reflects a multi‑year program with extensive product and curriculum development, staffing and facilities and multiple phases of independent, rigorous evaluation to ensure the product is safe, effective and scalable [to other campuses],” Marsch wrote in an email statement to The Dartmouth.

Dampal, who is also a peer wellness coach at the Student Health Center, added that he sees Evergreen fitting into the “myriad” mental health resources on campus as a sort of “first aid.”

“Ultimately, we want students to receive the support of mental health professionals,” Dampal said. “But unfortunately, that’s not accessible for everyone. That’s where Evergreen can come in.”

Marsch, whose research as founding director of the Dartmouth Center for Technology and Behavioral Health was foundational to Evergreen, explained that the platform is not for “treatment,” but rather “for building resilience, protective factors and life skills during college life.”

“AI is not going away. Mobile devices aren’t going away,” Marsch said. “So what I’ve been excited about is how we can use the ubiquity of these platforms for good, for helping people achieve their goals in life.”


Jackson Hyde

Jackson Hyde '28 is a news reporter. He is from Los Angeles, Calif., and is majoring in Government modified with Philosophy.
