When Mani Devi, an Accredited Social Health Activist (ASHA) in rural Rajasthan, saw the underweight infant, she knew something was wrong, but not how serious it might be or what advice to give.
So she reached for her phone and opened WhatsApp. In Hindi, she typed a question to a new tool called ASHABot: What is the ideal weight for a baby this age?
The chatbot, trained in Hindi, English, and a hybrid known as Hinglish, responded within seconds: a baby that age should weigh around 4 to 5 kilograms. This one weighed less.
The bot's answer was clear and specific. It encouraged feeding the baby eight to 10 times a day, and it explained how to counsel the mother without causing alarm.
That, she said, was one of many encounters with ASHABot that have changed the way she does her job.
The tool is part of a quiet but significant shift in public health, one that blends cutting-edge artificial intelligence with on-the-ground realities in some of India's most underserved communities.
ASHABot, launched in early 2024, is what happens when a generative AI model such as OpenAI's ChatGPT or GPT-4 is not only trained on the broader internet but also connected to a knowledge base containing India's public health manuals, immunization guidelines, and family planning protocols. It accepts voice notes when prompted and provides answers that help ASHAs serve patients.
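The article doesn't include ASHABot's code, but the pattern it describes, a model that answers only from a curated knowledge base, is commonly called retrieval-augmented generation. The Python sketch below is a minimal illustration of that idea under stated assumptions: `search_guidelines`, `call_llm`, and the toy corpus are hypothetical stand-ins, not the actual Khushi Baby or Microsoft Research implementation.

```python
# Illustrative sketch of retrieval-augmented answering. Every name here
# is a hypothetical stand-in, not ASHABot's published code.

def call_llm(prompt: str) -> str:
    """Stub for a GPT-4-style completion call; a real system would
    invoke a hosted model API here."""
    return "(model response)"

def search_guidelines(question: str, corpus: dict[str, str]) -> list[str]:
    """Toy keyword retrieval over the curated documents. A production
    system would typically use embedding-based semantic search."""
    q_words = set(question.lower().split())
    return [text for text in corpus.values()
            if q_words & set(text.lower().split())]

def grounded_answer(question: str, corpus: dict[str, str]) -> str:
    """Build a prompt that restricts the model to retrieved passages."""
    passages = search_guidelines(question, corpus)
    prompt = (
        "Answer using ONLY the passages below. If they do not contain "
        "the answer, reply exactly: UNSURE.\n\n"
        + "\n\n".join(passages)
        + f"\n\nQuestion: {question}"
    )
    return call_llm(prompt)

corpus = {"infant_nutrition": "A baby of two months should weigh about 4 to 5 kg."}
print(grounded_answer("What is the ideal weight for a baby this age?", corpus))
```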
Built by the nonprofit Khushi Baby using technology developed and open sourced by Microsoft Research, the bot has been transforming how some of the country's ASHA workers do their jobs. These women are the glue between India's rural households and the health system, responsible for everything from vaccination records to childbirth counseling. But they receive just 23 days of basic training and often work in settings where doctors are far away, supervisors are overburdened, and even mobile signal is unreliable.
"ASHAs have always been on the front lines," said Ruchit Nagar, co-founder and CEO of Khushi Baby and a Harvard-trained physician. "But they haven't always had the tools."
Nagar's relationship with ASHAs goes back nearly a decade. In 2015, he launched Khushi Baby with the goal of digitizing health records in underserved communities, often designing tech systems that were locally grounded. The idea for ASHABot emerged in late 2023, during a summit with stakeholders in Rajasthan.
At the time, Khushi Baby was working with Microsoft Research on a separate AI project, one that used eye photographs to detect anemia. But the buzz around large language models, especially ChatGPT, was growing fast. Nagar and his collaborators began to ask whether this technology could help ASHAs, who often lacked real-time access to high-quality, understandable, medically sound guidance.
"ASHAs were already using WhatsApp and YouTube. We saw an inflection point, new digital users ready for something more," said Nagar, now a resident at the Yale School of Medicine in New Haven, Conn.
So they began building.
Microsoft researcher Pragnya Ramjee joined the project around that time, leaving a design job at a hedge fund to focus on technology with social impact. With a background in human-centered design, she helped lead the qualitative research, interviewing ASHAs in Rajasthan alongside a trained translator.
"It made a huge difference that the translator and I were women," she said. "The ASHAs felt more comfortable being open with us, especially about sensitive issues like contraception or gender-based violence."

Ramjee and the team helped fine-tune the system in collaboration with doctors and public health experts. The model, based on GPT-4, was trained to be highly accurate. When it receives a question, it consults a carefully curated database of around 40 documents from the Indian government, UNICEF, and other health bodies. If the bot doesn't find a clear answer, it doesn't guess. Instead, it forwards the question to a small group of nurses, whose responses are then synthesized by the model and returned to the ASHA within hours.
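That answer-or-escalate behavior can be sketched in a few lines. The version below is an assumption-laden illustration, not the deployed design: it reuses the `UNSURE` sentinel from the earlier sketch, and the queue and function names are invented for clarity.

```python
# Illustrative answer-or-escalate routing; all names here are assumed.
from dataclasses import dataclass, field

@dataclass
class NurseQueue:
    """Questions the bot could not ground in the curated corpus,
    held for the small group of nurses to answer."""
    pending: list[str] = field(default_factory=list)

def route(question: str, draft_answer: str, queue: NurseQueue) -> str | None:
    """Return a grounded answer, or defer to human experts instead of guessing."""
    if draft_answer.strip() == "UNSURE":
        queue.pending.append(question)  # nurses reply later; the model then
        return None                     # synthesizes their answer for the ASHA
    return draft_answer

queue = NurseQueue()
print(route("A question the manuals do not cover", "UNSURE", queue))  # None
print(queue.pending)  # ['A question the manuals do not cover']
```

Routing the nurses' replies back through the model, rather than forwarding them verbatim, plausibly keeps the tone and language of every response consistent for the ASHA, though the article does not spell out that rationale.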
The goal, Ramjee said, is to ensure the bot always stays grounded in reality and in the actual training ASHAs receive.
So far, more than 24,000 messages have been sent through the system and 869 ASHAs have been onboarded. Some workers have used it only once or twice. Others send up to 20 messages in a single day. Topics range from the expected, such as childhood immunization schedules and breastfeeding best practices, to the unexpected.
"They're asking about contraception, about child marriage, about what to do if there's a conflict in the family," Ramjee said. "These aren't just medical questions. They're social questions."

One woman came to Mani Devi saying she'd missed her period for two months but wasn't pregnant. The bot provided Devi with information that gave her the confidence to reassure the patient that she had nothing to worry about.
The responses come as both text and voice notes, the latter often played aloud by ASHAs for the patient to hear. In some cases, voice responses about long-acting contraception have helped persuade hesitant women to begin treatment.
There is no question the technology works. But the team is quick to emphasize that it doesn't replace human knowledge. Instead, it amplifies it. ASHABot illustrates how LLM-powered chatbots can help bridge the information gap for people, particularly those with limited access to formal training and technology, said Mohit Jain, principal researcher at Microsoft Research India.
"There is a lot of debate about whether LLMs are a boon or a bane," Jain said. "I believe it's up to us to design and deploy them responsibly, in ways that unlock their potential for real societal benefit. ASHABot is one example of how that's possible."

Of course, the chatbot isn't perfect. Some users still prefer to call people they know, and the big question of scaling remains. The team is exploring personalization options, multimodal support such as image inputs, and parallel LLM agents for quality assurance at scale.
Still, the vision is expansive. For now, ASHABot is used only in Udaipur, one of the 50 districts in Rajasthan. The long-term goal is to bring ASHABot to all one million ASHAs across the country, who serve about 800 to 900 million people in rural India. The potential ripple effect across maternal health, vaccination, and disease surveillance is immense.
Nagar, who has traveled to India twice a year for the last 10 years to research the needs of ASHAs, said there are still "many things yet to explore, and many big questions to answer."
For ASHAs like Mani Devi, the shift is already real. She says she feels more informed, more confident. She can talk about previously taboo subjects because the bot helps her break the silence.
"Overall, I can give better information to people who need help," she said. "I can ask it anything."