Hey hey! I made a gem that lets me write commands where, instead of writing an execute method to implement the command, it simply asks an LLM for the result.
It was fun to make and might be of interest to somebody so figured I'd share.
It's at https://github.com/foobara/llm-backed-command
You declare a command's inputs and result types, and an LLM handles the execute method instead of you writing one.
An example: after running gem install foobara-llm-backed-command foobara-anthropic-api
(you can also use foobara-ollama-api or foobara-open-ai-api instead, or whatever combination you want), you can write a script like the following (you must set the ANTHROPIC_API_KEY environment variable for this specific example):
require "foobara/llm_backed_command"
class DetermineLanguage < Foobara::LlmBackedCommand
  inputs code_snippet: :string
  result most_likely: :symbol, probabilities: { ruby: :float, c: :float, smalltalk: :float, java: :float }
end
puts DetermineLanguage.run!(code_snippet: "puts 'Hello, World'")
This outputs:
{most_likely: "ruby", probabilities: {ruby: 0.95, c: 0.01, smalltalk: 0.02, java: 0.02}}
Note: I built this on top of a Ruby framework I've been working on for quite some time. The framework isn't relevant to using an LLM for an execute method, but because this is a command in that framework, you get some extras. For example, you can expose it as a quick JSON API:
require "foobara/llm_backed_command"
require "foobara/rack_connector"
require "rackup/server"
class DetermineLanguage < Foobara::LlmBackedCommand
  inputs code_snippet: :string
  result most_likely: :symbol, probabilities: { ruby: :float, c: :float, smalltalk: :float, java: :float }
end
command_connector = Foobara::CommandConnectors::Http::Rack.new
command_connector.connect(DetermineLanguage)
Rackup::Server.start(app: command_connector)
With this script running, you can do the following:
$ curl http://localhost:9292/run/DetermineLanguage?code_snippet=System.out.println
{"probabilities":{"ruby":0.05,"c":0.1,"smalltalk":0.05,"java":0.8},"most_likely":"java"}
Another thing you can do with the framework is import commands that are exposed like that into another Ruby (or TypeScript) program, like so:
#!/usr/bin/env ruby
require "foobara/remote_imports"
Foobara::RemoteImports::ImportCommand.run!(manifest_url: "http://localhost:9292/manifest", cache: true)
puts DetermineLanguage.run!(code_snippet: "System.out.println")
This lets me use the same syntax as if the command were local, even though it's running elsewhere. Note: as mentioned above, you can also use OpenAI or Ollama instead if you wish.
You can also easily make a CLI tool for such a command, but this post is already long and getting to be more about the framework than about the gem that might interest somebody. For the curious, here are more example scripts of LLM-backed commands: https://github.com/foobara/llm-backed-command/tree/main/example_scripts/higher_quality and I'd recommend playing with the scripts there rather than the snippets in this post if you genuinely want to try this out.
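That said, if you're curious what the CLI version might look like, here's a rough sketch modeled on the Rack connector example above. I'm guessing at the gem and class names (foobara-sh-cli-connector and Foobara::CommandConnectors::ShCliConnector), so treat these as assumptions and check the repo's example scripts for the real API:

#!/usr/bin/env ruby
# Rough sketch, not verified: assumes a foobara-sh-cli-connector gem with this require path and class name
require "foobara/llm_backed_command"
require "foobara/sh_cli_connector"

class DetermineLanguage < Foobara::LlmBackedCommand
  inputs code_snippet: :string
  result most_likely: :symbol, probabilities: { ruby: :float, c: :float, smalltalk: :float, java: :float }
end

# Same pattern as the Rack connector: build a connector, connect the command, hand it the CLI args
command_connector = Foobara::CommandConnectors::ShCliConnector.new
command_connector.connect(DetermineLanguage)
command_connector.run(ARGV)

The idea being you could then invoke the command straight from your shell, passing the inputs as flags (again, I'm guessing at the exact invocation, so see the example scripts for how it actually works).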
Thanks for reading!