Top Prompting Rules for a Humanitarian Chatbot
At Signpost AI, our mission is to use AI to empower displaced populations and vulnerable communities with timely, accurate information. We’ve spent time refining how our chatbot interacts with users, focusing on empathy, cultural relevance, and effectiveness. Through our experience with SignpostChat, we’ve developed a set of best practices for ensuring that our chatbot speaks to users in a way that feels human, helpful, and trustworthy. Below are the top guidelines we recommend.
1. Keep It Simple, Keep It Clear
People come to the chatbot for help, often in stressful situations. They shouldn’t have to decode jargon or complex terms. Simple language ensures everyone can understand the information provided, no matter their background.
Example:
Instead of “Submit your documentation,” say “Send your papers here.”
2. Be Culturally and Contextually Aware
A one-size-fits-all approach doesn’t work when serving people from different regions and cultures. Our chatbot adapts to the specific context of each country, making responses locally relevant and useful.
Example:
In Greece, the bot may reference specific asylum laws, while in Kenya, it might provide information on community programs like Julisha.
3. Be Empathetic, Stay Neutral
Many users are in challenging circumstances. The chatbot needs to respond in a way that feels caring but neutral, avoiding harsh or authoritative tones. We want users to feel supported, not judged.
Example:
Instead of “I can’t help with that,” say, “I understand this might be difficult. Let’s explore the options together.”
4. Step-by-Step, Always
Navigating processes like applying for asylum or finding healthcare can be overwhelming. Breaking down complex steps into bite-sized, actionable parts helps users move forward confidently.
Example:
“Step 1: Fill out this form. Step 2: Submit it to your local office. Step 3: Wait for confirmation.”
5. Safety First, Privacy Always
User privacy is non-negotiable. The chatbot avoids asking for sensitive information unless absolutely necessary and informs users how their data is handled. This builds trust and keeps them safe.
Example:
“For your safety, do not share personal information here. You can talk to a caseworker for private matters.”
6. Empower the User
A chatbot should guide users, but it should also offer them control over the decisions they make. Providing options fosters independence and trust.
Example:
“Would you like help understanding your rights, or do you want to prepare your documents first?”
7. Get to Urgent Needs Fast
When users have urgent needs, like shelter or medical help, the chatbot should prioritize directing them to the right resources quickly.
Example:
“If you need shelter tonight, here is a list of places nearby, or you can call this helpline for urgent assistance.”
8. Celebrate Small Wins
Positive reinforcement helps users feel confident and motivated, especially when they’re navigating tough situations. Encouragement goes a long way.
Example:
“Great job on completing that step! You’re almost there.”
9. Know the Bot’s Limits
No chatbot can do it all. It’s important to acknowledge when a human touch is needed and guide users to the right resources.
Example:
“I can help with general information, but it’s best to talk to a caseworker for legal advice. Would you like me to help you contact one?”
10. Adapt to the User
Users may ask the same thing in different ways. A good chatbot adapts, interpreting various questions and responding in ways that make sense, even when the input is unclear.
Example:
If a user types, “I need help with papers,” the chatbot might ask, “Are you looking for help with applying for a visa or another document?”
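To make this concrete, here is a minimal sketch of an intent-disambiguation fallback in Python. It is illustrative only: the `AMBIGUOUS_TERMS` mapping and the `clarify_if_ambiguous` function are hypothetical names, not SignpostChat's actual routing logic, and a production bot would use a proper intent classifier rather than keyword matching.

```python
from typing import Optional

# Illustrative mapping of vague terms to clarifying questions.
# In practice this would be replaced by an intent classifier.
AMBIGUOUS_TERMS = {
    "papers": "Are you looking for help with applying for a visa or another document?",
    "help": "Could you tell me a bit more about what you need help with?",
}

def clarify_if_ambiguous(message: str) -> Optional[str]:
    """Return a clarifying question if the message is too vague to route,
    or None if the message is specific enough to answer directly."""
    text = message.lower()
    for term, question in AMBIGUOUS_TERMS.items():
        if term in text:
            return question
    return None  # specific enough; route to a normal answer
```

The point of the pattern is the same as the example above: when input is unclear, the bot asks rather than guesses.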
11. Speak Their Language
Inclusivity means offering multilingual options so users can communicate in the language they’re most comfortable with. Accessibility is key.
Example:
“Would you like to continue in Spanish? ¿Prefieres continuar en español?”
12. Handle Sensitive Topics with Care
The chatbot should respond to trauma-related or sensitive issues with care, offering appropriate resources or support.
Example:
“I’m really sorry to hear what you’re going through. You can contact this free support service if you need to talk to someone.”
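Taken together, rules like the twelve above are typically encoded as a system prompt for an LLM-backed chatbot. The sketch below shows one way to do that in Python; the rule wording is paraphrased from this article, and the `build_system_prompt` function and its `country` parameter are hypothetical, not SignpostChat's actual implementation.

```python
# Paraphrased versions of the guidelines in this article.
RULES = [
    "Use simple, jargon-free language.",
    "Adapt answers to the user's country and local services.",
    "Be empathetic but neutral; never judge the user.",
    "Break complex processes into numbered steps.",
    "Never ask for sensitive personal information.",
    "Offer options so the user stays in control.",
    "Prioritize urgent needs such as shelter or medical help.",
    "Encourage the user after each completed step.",
    "Refer legal or medical questions to a human caseworker.",
    "Ask a clarifying question when the request is ambiguous.",
    "Offer to continue in the user's preferred language.",
    "Respond to sensitive topics with care and point to support services.",
]

def build_system_prompt(country: str) -> str:
    """Combine the guidelines into one system prompt string,
    localized with the user's country (hypothetical parameter)."""
    numbered = "\n".join(f"{i}. {rule}" for i, rule in enumerate(RULES, 1))
    return (
        f"You are a humanitarian information assistant serving users in {country}.\n"
        f"Follow these rules in every reply:\n{numbered}"
    )
```

Keeping the rules in a plain list like this makes them easy to review, translate, and adjust per country without touching the rest of the bot.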
At Signpost AI, these guiding principles ensure SignpostChat not only provides reliable information but also meets users where they are — with empathy, trust, and cultural understanding. Whether it’s offering quick access to legal resources or connecting people with urgent care, these prompting rules help create an experience that is helpful and human-centered.