Merged
15 changes: 14 additions & 1 deletion packages/backend/src/ai/ai.constants.ts
@@ -106,6 +106,8 @@ about the people in this conversation that would be worth remembering for future
The participant named "Moonbeam" (or "muzzle3") is the bot. You can see its messages for context (to understand
what humans were reacting to), but extract observations about the HUMANS only.

YOUR DEFAULT ANSWER IS NONE. Only extract something if you are confident it meets the criteria below.

WHAT TO EXTRACT:
- Specific statements or positions someone argued with conviction
- How someone interacts with Moonbeam and others (telling it off, asking it to settle arguments, testing it,
@@ -115,6 +117,7 @@ WHAT TO EXTRACT:

WHAT TO SKIP:
- Idle chatter, one-liners, greetings, link shares without commentary
- Someone asking a question — asking about a topic is NOT the same as caring about it
- Names of partners, kids, or family members (e.g. "his wife Katie", "her son Jake")
- Addresses, workplaces, or job titles (e.g. "works at Capital One", "lives in Cranford")
- Medical info (e.g. "diagnosed with ADD", "had hernia surgery")
@@ -127,12 +130,22 @@ HOW TO DECIDE:
Look for energy. Did someone care enough to write more than a sentence? Did they argue back and forth? Did they
directly engage with Moonbeam or another person? If the conversation is just casual banter, the answer is NONE.

A single question to Moonbeam is NOT energy. Someone asking "what happened to chuck norris" is idle curiosity, not
a memorable observation. You need to see sustained engagement — multiple messages, a debate, a strong reaction,
someone going off about something they care about.

EXAMPLES OF NONE (do not extract from conversations like these):
- Someone asks Moonbeam a factual question and gets an answer
- Someone shares a link with no commentary
- A few people exchange short one-liners or greetings
- Someone makes a single joke or observation and moves on

EXISTING MEMORIES (for context — do not duplicate these):
{existing_memories}

For each observation, classify:
- NEW: not captured in existing memories
- REINFORCE: an existing memory came up again
- REINFORCE: an existing memory came up again — only if the conversation shows genuine sustained engagement with the topic, not just a passing mention
- EVOLVE: contradicts or meaningfully updates an existing memory

Return a JSON array, or the string NONE if nothing is worth extracting. Most of the time, NONE is the right answer.
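The output contract above (a JSON array, or the literal string NONE) implies defensive parsing on the caller's side. A minimal sketch of such a parser — the helper name, the `Observation` shape, and the fallback behavior are hypothetical, not the project's actual code:

```typescript
// Hypothetical shape for one extracted observation; the real project's
// type may differ.
type Observation = { kind: 'NEW' | 'REINFORCE' | 'EVOLVE'; text: string };

// Parse the model's raw reply under the prompt's contract:
// a JSON array of observations, or the string NONE (the common case).
function parseExtraction(raw: string | undefined): Observation[] {
  const trimmed = raw?.trim();
  // Empty output and the explicit NONE sentinel both mean "extract nothing".
  if (!trimmed || trimmed === 'NONE') return [];
  try {
    const parsed = JSON.parse(trimmed);
    // Anything that is not an array is treated as NONE rather than crashing.
    return Array.isArray(parsed) ? parsed : [];
  } catch {
    // Malformed JSON is also treated as NONE.
    return [];
  }
}
```

Treating malformed output as NONE matches the prompt's bias: when in doubt, extract nothing.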
21 changes: 3 additions & 18 deletions packages/backend/src/ai/ai.service.ts
@@ -25,10 +25,6 @@ import {
import { MemoryPersistenceService } from './memory/memory.persistence.service';
import { MemoryWithSlackId } from '../shared/db/models/Memory';
import { logger } from '../shared/logger/logger';
import {
ResponseOutputMessage,
ResponseOutputText,
} from 'openai/resources/responses/responses';
import { SlackService } from '../shared/services/slack/slack.service';
import { MuzzlePersistenceService } from '../muzzle/muzzle.persistence.service';
import { OpenAIService } from './openai/openai.service';
@@ -348,18 +344,7 @@ export class AIService {
.replace('{all_memories_grouped_by_user}', formattedMemories);

try {
const response = await this.openAiService.openai.responses.create({
model: GATE_MODEL,
input: prompt,
});

const textBlock = response.output.find(
(block): block is ResponseOutputMessage => block.type === 'message',
);
const outputText = textBlock?.content?.find(
(block): block is ResponseOutputText => block.type === 'output_text',
);
const raw = outputText?.text?.trim();
const raw = await this.openAiService.generateText(prompt, 'selection', undefined, GATE_MODEL);

if (!raw) return [];

Expand Down Expand Up @@ -467,10 +452,10 @@ export class AIService {
const extractionInput = `${conversationHistory}\n\nMoonbeam: ${moonbeamResponse}`;
const prompt = MEMORY_EXTRACTION_PROMPT.replace('{existing_memories}', existingMemoriesText);

const result = await this.openAiService.generateText(extractionInput, 'extraction', prompt);
const result = await this.openAiService.generateText(extractionInput, 'extraction', prompt, GATE_MODEL);

if (!result) {
this.aiServiceLogger.warn('Extraction returned no result from generateText');
this.aiServiceLogger.warn('Extraction returned no result');
return;
}

Expand Down
6 changes: 3 additions & 3 deletions packages/backend/src/ai/openai/openai.service.ts
@@ -12,11 +12,11 @@ export class OpenAIService {
apiKey: process.env.OPENAI_API_KEY,
});

generateText = (text: string, userId: string, instructions?: string) => {
generateText = (text: string, userId: string, instructions?: string, model?: string) => {
return this.openai.responses
.create({
model: GPT_MODEL,
tools: [{ type: 'web_search_preview' }],
model: model || GPT_MODEL,
Collaborator: Instead of this just make the model have a default param. IE: model = GPT_MODEL in the function params

Collaborator: Then we can always pass model here and it'll either be the default when the caller doesn't pass one in, or it'll be whatever they passed in. Slightly more clean (this is kinda becoming a nit though because we might get rid of this class entirely in the near future if @bajman EVER GETS IT TOGETHER COUGH COUGH AHEM)

...(model ? {} : { tools: [{ type: 'web_search_preview' }] }),
Collaborator: This might break the tool calling. Why did Claude do this? Nano doesn't have tool calling or something?

Author: Right — gpt-4.1-nano doesn't support web_search_preview. It throws an error if you pass tools to it. The conditional skips tools when a custom model is passed (extraction/selection via nano), but keeps them for the default path (GPT-5.2 responses). Extraction and selection don't need web search anyway — they're analyzing conversation text that's already in the prompt.
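Combining the default-parameter suggestion with the tools-skipping behavior described here, the request construction could be sketched as a plain builder function. This is an illustrative sketch only — the model strings are placeholders and the builder is not the project's actual code:

```typescript
// Placeholder for the service's default-model constant; the real value
// lives in the project's config.
const GPT_MODEL = 'gpt-5';

// Hypothetical request builder: model defaults via a parameter default,
// and the web-search tool is attached only on the default-model path,
// since gpt-4.1-nano rejects the web_search_preview tool.
function buildRequest(
  text: string,
  userId: string,
  instructions?: string,
  model: string = GPT_MODEL,
) {
  return {
    model,
    // Only the default model gets tools; nano-based extraction/selection
    // calls skip them entirely.
    ...(model === GPT_MODEL ? { tools: [{ type: 'web_search_preview' }] } : {}),
    instructions,
    input: text,
    user: `${userId}-DaBros2016`,
  };
}
```

With the default parameter, the condition tests `model === GPT_MODEL` rather than truthiness of `model`, since `model` is now always defined.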

instructions: instructions,
input: text,
user: `${userId}-DaBros2016`,