
Intro to function calling

What is function calling?

AI models are trained on data gathered from web pages, data sets, and so on, but that data reflects a point in time. They cannot, for example, tell you what the weather is right now in Boston, or which game is currently number one in the charts.

Function calling is the ability to add one or more API endpoints to the request, giving the model access to the correct, up-to-date information.

To do this the model needs to know the following:

  • The API endpoint
  • What the endpoint's requests and responses look like, described as a JSON Schema.
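
For example, the weather question above might be described to the model with a schema along these lines (a hypothetical get_current_weather endpoint, shown purely to illustrate what the model is given):

{
    "name": "get_current_weather",
    "description": "Returns the current weather for a given city",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {
                "type": "string",
                "description": "The city to look up, e.g. Boston"
            }
        },
        "required": ["city"]
    }
}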

So how does this work without JellyFaaS?

First you ask your question, supplying the JSON Schema of the API request. The model replies with the parameters it wants you to run against the API. You then run that query yourself, take the results, and send them back to the model attached to the previous response. The model then replies with your answer.

These steps are outlined below. In JellyFaaS this whole process is automated: all you need to do is supply the name of a function from our library, which already contains all the information needed, with the JSON Schemas pre-generated.

Our functions also let complex interactions with APIs, which would normally require a number of repeated calls, be wrapped in a single simple function, making them far easier for the model to understand. Many APIs are powerful but complex to work with, requiring multiple calls to different endpoints to carry out a task, which makes them error prone and likely to fail with current models. In most cases only a few key interactions cover most of what people need; these can be turned into functions that the model can easily understand and work with, resulting in a much higher chance of success.

flowchart TD
    A1(Query, with function call description)
    B[Return parameters]
    C[Manually run against API with required params]
    D[Send results back into the model]
    E1[The AI model processes and returns the response]

    subgraph Start
        A1
    end

    subgraph End
        E1
    end

    A1 --> B
    B --> C
    C --> D
    D --> E1
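
To make the manual flow concrete, here is a minimal sketch of the first step using the OpenAI Chat Completions API (the gameslist function and its schema are borrowed from the tutorial below; any model that supports function calling follows the same pattern):

# Step 1: send the question plus the function's JSON Schema.
curl https://api.openai.com/v1/chat/completions --header "Authorization: Bearer $OPENAI_API_KEY" --header "Content-Type: application/json" --data '{
    "model": "gpt-4o",
    "messages": [
        {"role": "user", "content": "Tell me which horror games are in stock rated M"}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "gameslist",
            "description": "Returns games in stock for a given genre and rating",
            "parameters": {
                "type": "object",
                "properties": {
                    "genre": {"type": "string"},
                    "rating": {"type": "string"}
                },
                "required": ["genre", "rating"]
            }
        }
    }]
}'

# Step 2: the model replies with a tool call such as
#   {"name": "gameslist", "arguments": "{\"genre\":\"horror\",\"rating\":\"M\"}"}
# Step 3: you run that request against the API yourself.
# Step 4: you append the API results as a "tool" message and call the model again
#         to get the final answer.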

What are the use cases for function calling?

The primary use case is giving the AI model access to real-time information, whether from your own systems (for example, the current stock count) or from external sources (the weather in Arlington).

Because it can use functions, the model also has the ability to output information to Slack (for example), or to create a ticket in Jira dynamically.

Privacy and your data

We do not store any of your data in our system; all data is in transit only and is discarded after the response is sent back to the client.

Tutorial

In this example we are going to call a function in JellyFaaS that returns information on computer games. This is a 'mock' process, designed to show how you can use a function to provide context to the model.

The function we are looking at is called gameslist and can be found here

As you can see from the readme, the function takes a JSON body:

{
    "genre":"action-adventure",
    "rating":"M"
}

Genres are:

  • Action-Adventure
  • 3d
  • Sci-fi
  • Horror
  • Action
  • Sandbox
  • Simulation
  • RPG
  • Action RPG

This will return suggested games that fit the criteria.

Calling the Function endpoint

To call the function you will first need a token; to get a token see here

Then you can use the following curl command:

Request asking for M rated games
curl --location 'localhost:8080/query-service/v1/function' --header 'jfwt: <token>' --header 'Content-Type: application/json' --data '{
    "query": "Tell me which horror games are in stock rated M",
    "function":"gameslist"
}'
Response
{
    "answer": "We have two horror games rated M in stock: Resident Evil Village and Silent Hill 2.",
    "query": "Tell me which horror games are in stock rated M",
    "spanId": "sjIjJSdNg",
    "messages": null
}

or asking for more details:

Request asking for more information
curl --location 'localhost:8080/query-service/v1/function' --header 'jfwt: <token>' --header 'Content-Type: application/json' --data '{
    "query": "Tell me which horror games are in stock rated M and how many of each one?, Tell me about each game, and why people think of it, and why I should by it",
    "function":"gameslist"
}'
Response
{
    "answer": "We have 1 copy of Resident Evil Village in stock. It is a survival horror game where you play as Ethan Winters, who is searching for his kidnapped daughter. The game is known for its intense atmosphere, terrifying enemies, and challenging gameplay. You should buy it if you enjoy survival horror games with a strong story and a lot of action. \n\nWe are currently out of stock of Silent Hill 2. It is a psychological horror game that follows James Sunderland as he searches for his wife in the town of Silent Hill. The game is known for its disturbing atmosphere, memorable characters, and thought-provoking story. You should buy it if you enjoy psychological horror games with a focus on atmosphere and story. \n",
    "query": "Tell me which horror games are in stock rated M and how many of each one?, Tell me about each game, and why people think of it, and why I should by it",
    "spanId": "NkOLxSdHR",
    "messages": null
}

If you forget to supply the rating, the model will ask for it in its response:

{
    "answer": "I need a rating to find horror games in stock. What rating are you looking for? \n",
    "query": "Tell me which horror games are in stock",
    "spanId": "R-quJIOHg",
    "messages": null
}

Using the SDKs

Other query options

REST options. You can optionally set these when using the SDK too (see the SDK docs):

Key                Required  Description
query              Y         The query to ask
function           Y         The function to use
ai_platform        N         The AI platform to use: Gemini (the default, included as part of the product, or bring your own) or OpenAI ChatGPT
api_key            N         Your own secret key, if you want to bring one (optional for Gemini, required for ChatGPT)
structured_output  N         Base64-encoded JSON Schema representing the requested output model
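
As a sketch of how the optional keys might be used, a structured_output request could look like the following. The schema here is hypothetical, and the exact values accepted by ai_platform and api_key should be checked against the SDK docs:

# Base64-encode a (hypothetical) JSON Schema describing the output shape you want back
SCHEMA=$(echo -n '{"type":"object","properties":{"games":{"type":"array","items":{"type":"string"}}},"required":["games"]}' | base64 | tr -d '\n')

curl --location 'localhost:8080/query-service/v1/function' --header 'jfwt: <token>' --header 'Content-Type: application/json' --data '{
    "query": "Tell me which horror games are in stock rated M",
    "function": "gameslist",
    "structured_output": "'"$SCHEMA"'"
}'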