
window.ai.contextCache #159

Open
hemanth opened this issue Jul 10, 2024 · 0 comments
hemanth commented Jul 10, 2024

Use Case

The window.ai.contextCache API allows web applications to efficiently manage and reuse context information for AI models. This is particularly useful for applications that involve ongoing conversations or require maintaining state across multiple AI interactions, such as chatbots, virtual assistants, or context-aware content generation tools.

By caching context, applications can:

  1. Improve response relevance in multi-turn conversations
  2. Reduce latency by avoiding the need to resend full conversation history
  3. Optimize resource usage by managing context size

API Description

interface ContextCacheOptions {
  maxSize?: number; // Maximum number of tokens or characters to store
  ttl?: number; // Time-to-live in milliseconds
}

interface ContextEntry {
  id: string;
  content: string;
  timestamp: number;
}

interface WindowAI {
  contextCache: {
    add(id: string, content: string): Promise<void>;
    get(id: string): Promise<string | null>;
    update(id: string, content: string): Promise<void>;
    delete(id: string): Promise<void>;
    clear(): Promise<void>;
    setOptions(options: ContextCacheOptions): Promise<void>;
  };
}

interface Window {
  ai: WindowAI;
}
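To make the proposed contract concrete, here is a minimal in-memory sketch of how a user agent (or a polyfill) might implement `contextCache`. The eviction and truncation behavior shown is an assumption on my part; the proposal does not specify whether `maxSize` counts tokens or characters, whether `add` on an existing id should fail, or how expiry interacts with `get`.

```typescript
interface ContextCacheOptions {
  maxSize?: number; // Maximum number of characters to store (assumption: characters, not tokens)
  ttl?: number;     // Time-to-live in milliseconds
}

interface ContextEntry {
  id: string;
  content: string;
  timestamp: number;
}

// Hypothetical in-memory polyfill; a real implementation would live in the browser.
class ContextCache {
  private entries = new Map<string, ContextEntry>();
  private options: ContextCacheOptions = {};

  async setOptions(options: ContextCacheOptions): Promise<void> {
    this.options = { ...this.options, ...options };
  }

  async add(id: string, content: string): Promise<void> {
    // Assumption: adding an existing id is an error, mirroring the add/update split.
    if (this.entries.has(id)) throw new Error(`Entry "${id}" already exists`);
    this.entries.set(id, { id, content: this.truncate(content), timestamp: Date.now() });
  }

  async get(id: string): Promise<string | null> {
    const entry = this.entries.get(id);
    if (!entry) return null;
    // Assumption: entries past their ttl behave as if absent.
    if (this.options.ttl && Date.now() - entry.timestamp > this.options.ttl) {
      this.entries.delete(id);
      return null;
    }
    return entry.content;
  }

  async update(id: string, content: string): Promise<void> {
    if (!this.entries.has(id)) throw new Error(`Entry "${id}" not found`);
    this.entries.set(id, { id, content: this.truncate(content), timestamp: Date.now() });
  }

  async delete(id: string): Promise<void> {
    this.entries.delete(id);
  }

  async clear(): Promise<void> {
    this.entries.clear();
  }

  // Assumption: when content exceeds maxSize, keep the most recent tail,
  // since recent turns matter most in a conversation.
  private truncate(content: string): string {
    const max = this.options.maxSize;
    return max && content.length > max ? content.slice(-max) : content;
  }
}
```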

Methods

  • add(id: string, content: string): Adds a new context entry to the cache.
  • get(id: string): Retrieves a context entry by its ID.
  • update(id: string, content: string): Updates an existing context entry.
  • delete(id: string): Removes a context entry from the cache.
  • clear(): Removes all entries from the cache.
  • setOptions(options: ContextCacheOptions): Configures cache behavior.
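The methods above would compose into a simple lifecycle. Since no browser ships `window.ai.contextCache`, this sketch is written against any object implementing the proposed surface, so the intended semantics can be exercised with a stub:

```typescript
// The contextCache surface, as proposed in this issue.
interface ContextCacheLike {
  add(id: string, content: string): Promise<void>;
  get(id: string): Promise<string | null>;
  update(id: string, content: string): Promise<void>;
  delete(id: string): Promise<void>;
  clear(): Promise<void>;
}

// Walks through the intended lifecycle against any implementation
// (e.g. window.ai.contextCache, were it available).
async function demoLifecycle(cache: ContextCacheLike): Promise<string | null> {
  await cache.add("chat-1", "User: hi");                       // create entry
  await cache.update("chat-1", "User: hi\nAssistant: hello");  // replace content
  const snapshot = await cache.get("chat-1");                  // read it back
  await cache.delete("chat-1");                                // remove one entry
  await cache.clear();                                         // remove everything
  return snapshot;
}
```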

Example Usage

async function manageConversationContext(conversationId, newMessage) {
  // Configure cache: cap stored context at 1000 characters, expire after 1 hour
  await window.ai.contextCache.setOptions({ maxSize: 1000, ttl: 3600000 });

  // Retrieve existing context (null if this conversation has no entry yet)
  const existing = await window.ai.contextCache.get(conversationId);

  // Append the new message, then add or update the entry as appropriate
  const context = (existing ? existing + "\n" : "") + newMessage;
  if (existing === null) {
    await window.ai.contextCache.add(conversationId, context);
  } else {
    await window.ai.contextCache.update(conversationId, context);
  }

  // Use updated context in AI interaction
  const response = await someAIFunction(context);

  return response;
}
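Because this API is only a proposal, a page adopting the pattern above would need to feature-detect it and fall back gracefully. The fallback below is a plain in-memory `Map` and is purely illustrative; the `getContextCache` helper and `CacheLike` type are hypothetical names, not part of the proposal:

```typescript
type CacheLike = {
  get(id: string): Promise<string | null>;
  add(id: string, content: string): Promise<void>;
  update(id: string, content: string): Promise<void>;
};

// Hypothetical feature detection: use window.ai.contextCache if present,
// otherwise fall back to a session-local in-memory store.
function getContextCache(): CacheLike {
  const native = (globalThis as any).window?.ai?.contextCache;
  if (native) return native;
  const store = new Map<string, string>();
  return {
    async get(id) { return store.get(id) ?? null; },
    async add(id, content) { store.set(id, content); },
    async update(id, content) { store.set(id, content); },
  };
}
```

The fallback loses the browser-managed eviction and ttl behavior, but it keeps application code uniform while the API remains unimplemented.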

This API provides a simple yet flexible way to manage context information for AI interactions in web applications.
