Sending prompts to an embedded query using LangChain (Node.js)

I am using a Pinecone index containing the data I want to query with OpenAI, and that part works. However, I would like to know whether it is possible to expand the current `chain.invoke` call beyond a single question string to a full set of system and user prompts passed as a messages array.

This works when I pass in the Pinecone client and the question string:

const { OpenAI, OpenAIEmbeddings } = require('@langchain/openai');
const { loadQAStuffChain } = require('langchain/chains');
const { Document } = require('langchain/document');

async function answerQuestion(client, indexName, question) {
  const index = client.Index(indexName);
  // Create query embedding
  const queryEmbedding = await new OpenAIEmbeddings().embedQuery(question);
  // Query the Pinecone index and return matches
  let queryResponse = await index.query({
    topK: 100,
    vector: queryEmbedding,
    includeMetadata: true,
    includeValues: true,
  });
  if (queryResponse.matches.length > 0) {
    // Create an OpenAI instance and load the QAStuffChain
    const llm = new OpenAI({
      modelName: 'gpt-4-turbo-preview',
      temperature: 0,
      topP: 0.1,
    });

    const chain = loadQAStuffChain(llm);

    // Extract and concatenate page content from matched documents
    const concatenatedPageContent = queryResponse.matches
      .map(match => match.metadata.pageContent)
      .join(' ');
    // Execute the chain with input documents and the question
    // ("question" is the input key the default QA stuff prompt expects)
    const result = await chain.invoke({
      input_documents: [new Document({ pageContent: concatenatedPageContent })],
      question,
    });
    // Log the answer
    console.log(`\n${result.text.trim()}`);
  }
}
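For reference, this is the shape of the messages array I'd like the chain to end up sending (the standard OpenAI chat format, with the Pinecone context stuffed into the system message). The `buildMessages` helper and the prompt wording here are just my own sketch, not anything from LangChain:

```javascript
// Hypothetical helper: combine a system prompt, the concatenated Pinecone
// context, and the user's question into an OpenAI-style messages array.
function buildMessages(systemPrompt, context, question) {
  return [
    { role: 'system', content: `${systemPrompt}\n\nContext:\n${context}` },
    { role: 'user', content: question },
  ];
}

const messages = buildMessages(
  'You are a helpful assistant. Answer only from the provided context.',
  'concatenated page content from the Pinecone matches...',
  'What does the document say about pricing?'
);

console.log(JSON.stringify(messages, null, 2));
```

Essentially I want `chain.invoke` to accept something like this array instead of only the bare question string.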
