MOOD MNKY leverages cutting-edge AI technologies to enhance product personalization, customer experience, and operational efficiency. This documentation outlines our AI technology stack, implementation details, and best practices.
MOOD MNKY uses a clear architectural separation between chatflow/chatbot workflows and server-side AI processing.

Flowise (primary for chatflows/chatbots):
- Dojo Blending Lab chat, custom tools, document store RAG
- Hosted Flowise instance with S3/MinIO for document storage
- Elements AI SDK for chat UI; flowise-sdk as app bridge
- Automatic fallback to OpenAI when Flowise is unavailable
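The fallback behavior above can be sketched as a small wrapper: attempt the Flowise call first, and route to OpenAI only when it fails. The `withFallback` helper and the stubbed providers below are illustrative and not part of the actual codebase; in a real deployment, `primary` would call the hosted Flowise instance (e.g. via `flowise-sdk`) and `fallback` would call OpenAI directly.

```typescript
// Hypothetical helper illustrating the Flowise → OpenAI fallback pattern.
// Any error thrown by the primary provider triggers the fallback provider.
async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>,
): Promise<T> {
  try {
    return await primary()
  } catch (err) {
    // In production this is where you would log/alert that Flowise is down.
    console.warn('Flowise unavailable, falling back to OpenAI:', err)
    return await fallback()
  }
}

// Example with stubbed providers (no network access needed):
async function demo() {
  const reply = await withFallback(
    async () => {
      throw new Error('Flowise instance unreachable')
    },
    async () => 'OpenAI fallback response',
  )
  console.log(reply) // prints "OpenAI fallback response"
}
demo()
```

Because the helper is generic over the return type, the same wrapper can front both plain-text chat replies and structured prediction payloads.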
Our NLU system processes user inputs to extract intents, entities, and sentiment:
```typescript
import { OpenAI } from 'langchain/llms/openai'
import { PromptTemplate } from 'langchain/prompts'
import { LLMChain } from 'langchain/chains'

const llm = new OpenAI({
  modelName: 'gpt-4o',
  temperature: 0.2,
})

const template = `Analyze the following customer message and extract:
1. Primary intent (question, purchase, support, feedback)
2. Product categories mentioned
3. Sentiment (positive, negative, neutral)
4. Key entities

Customer message: {userMessage}

Response in JSON format:`

const promptTemplate = new PromptTemplate({
  template,
  inputVariables: ['userMessage'],
})

const chain = new LLMChain({ llm, prompt: promptTemplate })

const result = await chain.call({
  userMessage: 'I love your vanilla candles but would like something with more citrus notes.',
})
console.log(result.text)
```
Our AI creates detailed descriptions of fragrance experiences based on composition:
```typescript
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { HumanMessage, SystemMessage } from 'langchain/schema'

const chat = new ChatOpenAI({
  modelName: 'gpt-4o',
  temperature: 0.7,
})

// Get fragrance composition from database
const fragrance = {
  name: 'Midnight Amber',
  notes: {
    top: ['Bergamot', 'Orange'],
    middle: ['Jasmine', 'Rose'],
    base: ['Amber', 'Vanilla', 'Sandalwood'],
  },
  intensity: 'moderate',
  family: 'Oriental',
}

const messages = [
  new SystemMessage(
    `You are an expert perfumer with decades of experience.
    Describe how a fragrance would smell and evolve over time based on its composition.
    Include details about the initial impression, how it develops, and the lasting impression.
    Use sensory, evocative language that helps the reader imagine the scent experience.`
  ),
  new HumanMessage(
    `Create a virtual testing experience for this fragrance:
    Name: ${fragrance.name}
    Top notes: ${fragrance.notes.top.join(', ')}
    Middle notes: ${fragrance.notes.middle.join(', ')}
    Base notes: ${fragrance.notes.base.join(', ')}
    Intensity: ${fragrance.intensity}
    Family: ${fragrance.family}`
  ),
]

const response = await chat.call(messages)
console.log(response.content)
```
Our mood analysis system helps suggest products based on user emotional state:
```typescript
import { HfInference } from '@huggingface/inference'
import { OpenAI } from 'langchain/llms/openai'

// Initialize the models
const hf = new HfInference(process.env.HF_API_KEY)
const llm = new OpenAI({
  modelName: 'gpt-4o',
  temperature: 0.3,
})

// Analyze user text for emotional indicators
async function analyzeMood(userText: string) {
  // Use Hugging Face for sentiment analysis
  const sentimentResult = await hf.textClassification({
    model: 'distilbert-base-uncased-finetuned-sst-2-english',
    inputs: userText,
  })

  // Use emotion detection model
  const emotionResult = await hf.textClassification({
    model: 'SamLowe/roberta-base-go_emotions',
    inputs: userText,
  })

  return {
    // textClassification returns an array of { label, score }, highest score first
    sentiment: sentimentResult[0],
    emotions: emotionResult,
  }
}

// Generate product recommendations based on mood
async function getMoodBasedRecommendations(moodAnalysis: {
  sentiment: { label: string; score: number }
  emotions: Array<{ label: string; score: number }>
}) {
  const prompt = `
    Based on the following mood analysis, recommend 3 MOOD MNKY products
    that would help enhance or complement the user's current emotional state.

    Sentiment: ${moodAnalysis.sentiment.label} (${moodAnalysis.sentiment.score})
    Primary emotions: ${moodAnalysis.emotions
      .slice(0, 3)
      .map((e) => `${e.label} (${e.score})`)
      .join(', ')}

    For each product:
    1. Name and category
    2. Key fragrance notes
    3. Why it's appropriate for this mood
    4. How it might benefit the user

    Format as a JSON array.
  `

  const result = await llm.call(prompt)
  return JSON.parse(result)
}

// Example usage
const userJournal =
  "I've been feeling overwhelmed with work lately, but today I finally completed a big project. I'm relieved but still feeling a bit anxious about the next steps."

const moodAnalysis = await analyzeMood(userJournal)
const recommendations = await getMoodBasedRecommendations(moodAnalysis)

console.log('Mood Analysis:', moodAnalysis)
console.log('Recommended Products:', recommendations)
```