
Storing generated functions #16

Open
PranavOlety opened this issue Mar 7, 2024 · 8 comments

Comments

@PranavOlety

Based on my review of the Fructose codebase, I did not see any functionality for storing generated functions. This might go against the design philosophy of the project, but I think it would be useful to have some abstraction that lets you reuse a function generated in a previous run. Ideally, if there are no changes to its arguments, signature, docstring, or the Fructose decorator and its arguments, then you would call the previously generated function instead of making an API call to an LLM.

This could be made a flavor or some other argument to the Fructose decorator.

@erik-dunteman
Contributor

Potentially a bit outside of the scope of this project, but curious what you'd have in mind.

What's the safety/isolation aspect? Running exec() on generated code without some form of sandbox would be risky.

@ErikKaum is our resident sandbox/vm guy, tagging him in for thoughts

@erik-dunteman
Contributor

To clarify, @PranavOlety could you trace out a series of actions fructose would take? (like 1. generate function, 2. store function, 3. register function in uses, 4. execute function in some sandbox just as we do the uses functions)

@ErikKaum
Member

ErikKaum commented Mar 7, 2024

If I understood this correctly, it sounds more like @PranavOlety is referring to caching (correct me if I'm wrong). Let's use a concrete example.

If we have a function like this:

from fructose import Fructose  # assuming the standard Fructose setup
ai = Fructose()

@ai()
def describe(animals: list[str]) -> str:
  """
  Given a list of animals, use one word that'd describe them all.
  """

and call the function once --> the call goes to the LLM --> we cache the result (in this case "pets", type str):

result = describe(["dog", "cat", "parrot", "goldfish"])
print(result) # -> "pets" type: str

If we call the function again with the same ["dog", "cat", "parrot", "goldfish"], we could just fetch the "pets" result from a local cache instead of calling an LLM.

Did I get this right?
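The caching flow described above could look roughly like this. `cached_ai` is a hypothetical stand-in, not part of Fructose: results are keyed on the call arguments plus the function's signature and docstring, so editing the function invalidates the cache, matching the condition @PranavOlety described.

```python
import functools
import hashlib
import inspect
import json

def cached_ai(func):
    """Hypothetical caching layer (illustrative, not the Fructose API).

    Cache entries are keyed on the call arguments plus the function's
    signature and docstring, so changing the function definition
    invalidates previously cached results.
    """
    cache = {}

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Build a deterministic key from the arguments and the
        # function definition itself.
        raw = json.dumps(
            {
                "args": args,
                "kwargs": kwargs,
                "signature": str(inspect.signature(func)),
                "doc": func.__doc__,
            },
            default=str,
            sort_keys=True,
        )
        key = hashlib.sha256(raw.encode()).hexdigest()
        if key not in cache:
            # On a miss, this is where the actual LLM call would happen.
            cache[key] = func(*args, **kwargs)
        return cache[key]

    return wrapper
```

With this in place, a second call with the same list would return "pets" from the local cache without touching the LLM.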

@PranavOlety
Author

@ErikKaum, yes, that's exactly what I was referring to: caching locally. The functionality is exactly as you described; you would fetch "pets" from a local cache if nothing has changed in the function itself.

@ErikKaum
Member

ErikKaum commented Mar 7, 2024

Cool! Yeah this is definitely on our roadmap and not at all against our design philosophy.

The thing we'll probably discuss with the team is how to present the interface: should it be on by default, do you enable it per function or globally, should it be a flavour, and so on.
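One hypothetical shape for that interface (all names here are illustrative sketches, not the actual Fructose API): a global default on the client, which individual decorator calls can override per function.

```python
class SketchClient:
    """Illustrative sketch of a client with a global caching default.

    Not the real Fructose API; just demonstrates the per-function
    vs. global enablement question from the discussion.
    """

    def __init__(self, cache=False):
        self.cache_by_default = cache  # global default

    def __call__(self, cache=None):
        # A per-function setting overrides the global default.
        enabled = self.cache_by_default if cache is None else cache

        def decorator(func):
            store = {}

            def wrapper(*args):
                key = repr(args)  # crude but hashable key
                if enabled and key in store:
                    return store[key]
                result = func(*args)  # stands in for the LLM call
                if enabled:
                    store[key] = result
                return result

            return wrapper

        return decorator

# Usage mirrors the shape of the example earlier in the thread:
ai = SketchClient(cache=True)   # globally on

@ai()                           # inherits the global default
def describe(animals):
    """Given a list of animals, use one word that'd describe them all."""
    return "pets"
```

A "flavour" argument, as suggested in the original issue, would be a third option layered on the same decorator.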

@PranavOlety
Author

Awesome! Would love to help out with this feature once the interface has been set.

@ErikKaum
Member

ErikKaum commented Mar 7, 2024

Thank you very much! We'll let you know 👍

@erik-dunteman
Contributor

I see I completely misunderstood this.

100%, result caching is a feature that makes sense. I'd tend toward putting it on by default and not exposing an API to shut it off until people ask for it.
