I've built many AI agents, and every framework I tried felt bloated, slow, and unpredictable. So I hacked together a minimal library where each step is defined as a plain JSON/dict/kwargs structure, giving you a simpler way to define reproducible agents. It supports concurrency of up to 1000 calls/min, keeping your workflows fast and predictable.<p><i>Install</i><p><pre><code> pip install flashlearn
</code></pre>
<i>Input is a list of dictionaries</i><p>Simply take user inputs, API responses, and calculations from other tools and feed them to <i>FlashLearn</i>.<p><pre><code> user_inputs = [{"query": "When was python launched?"}]
</code></pre>
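If your data comes from an API or another tool, just map it into that shape. A quick sketch (the fetch_support_tickets() helper below is hypothetical, standing in for whatever source you have):<p><pre><code> tickets = fetch_support_tickets()  # hypothetical: replace with your own API call or query
user_inputs = [{"query": t["subject"] + " " + t["body"]} for t in tickets]
</code></pre>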
<i>Skill is just a simple dictionary</i><p>A skill describes a single task for the LLM and bundles everything needed to perform it. You can write your own, use the predefined samples, or generate one automatically from example data.<p><pre><code> ConvertToGoogleQueries = {
    "skill_class": "GeneralSkill",
    "system_prompt": "Exactly populate the provided function definition",
    "function_definition": {
        "type": "function",
        "function": {
            "name": "ConvertToGoogleQueries",
            "description": "Convert the given question into between 1 and n Google queries that answer it.",
            "strict": True,
            "parameters": {
                "type": "object",
                "properties": {
                    "google_queries": {
                        "type": "array",
                        "items": {"type": "string"}
                    }
                },
                "required": ["google_queries"],
                "additionalProperties": False
            }
        }
    }
}
</code></pre>
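Since a skill is just a dict, you can also keep it in a JSON file next to your code and version-control it. A minimal sketch using only the standard library (the filename is arbitrary):<p><pre><code> import json

# save the skill definition for reuse
with open("convert_to_google_queries.json", "w") as f:
    json.dump(ConvertToGoogleQueries, f, indent=2)

# load it back later, e.g. in another service
with open("convert_to_google_queries.json") as f:
    ConvertToGoogleQueries = json.load(f)
</code></pre>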
<i>Run in 3 lines of code</i><p>Load the skill, create tasks (a list of dictionaries), and run them in parallel. Results are easy to parse in downstream steps.<p><pre><code> from flashlearn.skills import GeneralSkill  # adjust the import path to your installed version if needed

skill = GeneralSkill.load_skill(ConvertToGoogleQueries)
tasks = skill.create_tasks(user_inputs)
flash_results = skill.run_tasks_in_parallel(tasks)
</code></pre>
<i>Get structured results</i><p>The output is a dictionary, where each key corresponds to an index in the original list. This lets you keep track of results easily.<p><pre><code> flash_results = {'0': {'google_queries': ["QUERY_1", "QUERY_2"]}}
</code></pre>
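Because the keys are stringified indices into the input list, joining each result back to the row that produced it is a plain loop:<p><pre><code> for idx, output in flash_results.items():
    row = user_inputs[int(idx)]  # the original input dict
    print(row["query"], "->", output["google_queries"])
</code></pre>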
<i>Pass on to downstream tasks</i><p>Use the structured JSON output in your next steps.<p><pre><code> from openai import OpenAI

queries = flash_results["0"]["google_queries"]
results = SimpleGoogleSearch(GOOGLE_API_KEY, GOOGLE_CSE_ID).search(queries)

client = OpenAI()  # reads OPENAI_API_KEY from the environment
msgs = [
    {"role": "system", "content": "Insert links from the search results into your response to cite sources."},
    {"role": "user", "content": str(results)},
    {"role": "user", "content": "When was python launched?"}
]
print(client.chat.completions.create(model=MODEL_NAME, messages=msgs).choices[0].message.content)
</code></pre>
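Putting the steps together, the whole flow fits in one small function. This is just a sketch that reuses the calls shown above; MODEL_NAME and the Google credentials are placeholders, as before.<p><pre><code> def answer(question: str) -> str:
    """End-to-end: question -> Google queries -> search -> answer with citations."""
    tasks = skill.create_tasks([{"query": question}])
    queries = skill.run_tasks_in_parallel(tasks)["0"]["google_queries"]
    search_results = SimpleGoogleSearch(GOOGLE_API_KEY, GOOGLE_CSE_ID).search(queries)
    msgs = [
        {"role": "system", "content": "Insert links from the search results into your response to cite sources."},
        {"role": "user", "content": str(search_results)},
        {"role": "user", "content": question},
    ]
    return client.chat.completions.create(model=MODEL_NAME, messages=msgs).choices[0].message.content

print(answer("When was python launched?"))
</code></pre>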
Feel free to ask anything!