In this tutorial, we walk through building a compact but fully functional Cipher-based workflow. We start by securely capturing our Gemini API key in the Colab UI without exposing it in code. We then implement a dynamic LLM selection function that can automatically switch between OpenAI, Gemini, or Anthropic based on which API key is available. The setup phase ensures Node.js and the Cipher CLI are installed, after which we programmatically generate a cipher.yml configuration to enable a memory agent with long-term recall. We create helper functions to run Cipher commands directly from Python, store key project decisions as persistent memories, retrieve them on demand, and finally spin up Cipher in API mode for external integration. Check out the FULL CODES here.

import os, getpass
os.environ["GEMINI_API_KEY"] = getpass.getpass("Enter your Gemini API key: ").strip()


import subprocess, tempfile, pathlib, textwrap, time, requests, shlex


def choose_llm():
   if os.getenv("OPENAI_API_KEY"):
       return "openai", "gpt-4o-mini", "OPENAI_API_KEY"
   if os.getenv("GEMINI_API_KEY"):
       return "gemini", "gemini-2.5-flash", "GEMINI_API_KEY"
   if os.getenv("ANTHROPIC_API_KEY"):
       return "anthropic", "claude-3-5-haiku-20241022", "ANTHROPIC_API_KEY"
   raise RuntimeError("Set one API key before running.")

We start by securely entering our Gemini API key using getpass so it stays hidden in the Colab UI. We then define a choose_llm() function that checks our environment variables and automatically selects the appropriate LLM provider, model, and key based on what is available.
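To see the priority order in action, here is a self-contained sketch of the same fallback logic, run with a dummy key set purely for illustration:

```python
import os

def choose_llm():
    # Providers are checked in a fixed priority order: OpenAI, then Gemini, then Anthropic.
    if os.getenv("OPENAI_API_KEY"):
        return "openai", "gpt-4o-mini", "OPENAI_API_KEY"
    if os.getenv("GEMINI_API_KEY"):
        return "gemini", "gemini-2.5-flash", "GEMINI_API_KEY"
    if os.getenv("ANTHROPIC_API_KEY"):
        return "anthropic", "claude-3-5-haiku-20241022", "ANTHROPIC_API_KEY"
    raise RuntimeError("Set one API key before running.")

# Simulate a Colab session where only the Gemini key was entered.
os.environ.pop("OPENAI_API_KEY", None)
os.environ["GEMINI_API_KEY"] = "dummy-key-for-demo"  # placeholder, not a real key
provider, model, key_env = choose_llm()
print(provider, model, key_env)
```

Because the checks short-circuit, setting an OpenAI key later would silently take precedence over Gemini, which is worth keeping in mind when debugging which provider actually answered.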

def run(cmd, check=True, env=None):
   print("▸", cmd)
   p = subprocess.run(cmd, shell=True, text=True, capture_output=True, env=env)
   if p.stdout: print(p.stdout)
   if p.stderr: print(p.stderr)
   if check and p.returncode != 0:
       raise RuntimeError(f"Command failed: {cmd}")
   return p

We create a run() helper function that executes shell commands, prints both stdout and stderr for visibility, and raises an error when the command fails and check is enabled, making our workflow execution more transparent and reliable.
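The check flag is what lets us treat some commands as best-effort. A quick self-contained demonstration (using harmless shell commands in place of the real setup steps):

```python
import subprocess

def run(cmd, check=True, env=None):
    print("▸", cmd)
    p = subprocess.run(cmd, shell=True, text=True, capture_output=True, env=env)
    if p.stdout: print(p.stdout)
    if p.stderr: print(p.stderr)
    if check and p.returncode != 0:
        raise RuntimeError(f"Command failed: {cmd}")
    return p

# A successful command returns its CompletedProcess with captured output...
ok = run("echo hello")
# ...while a failing command only raises when check=True; here we opt out.
failed = run("false", check=False)
print(ok.returncode, failed.returncode)
```

This is the same pattern ensure_node_and_cipher() relies on below: the apt-get step is allowed to fail (Node.js may already be present), while the npm install must succeed.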

def ensure_node_and_cipher():
   run("sudo apt-get update -y && sudo apt-get install -y nodejs npm", check=False)
   run("npm install -g @byterover/cipher")

We define ensure_node_and_cipher() to install Node.js, npm, and the Cipher CLI globally, ensuring our environment has all the necessary dependencies before running any Cipher-related commands.

def write_cipher_yml(workdir, provider, model, key_env):
   cfg = """
llm:
 provider: {provider}
 model: {model}
 apiKey: ${key_env}
systemPrompt:
 enabled: true
 content: |
   You are an AI programming assistant with long-term memory of prior decisions.
embedding:
 disabled: true
mcpServers:
 filesystem:
   type: stdio
   command: npx
   args: ['-y','@modelcontextprotocol/server-filesystem','.']
""".format(provider=provider, model=model, key_env=key_env)


   (workdir / "memAgent").mkdir(parents=True, exist_ok=True)
   (workdir / "memAgent" / "cipher.yml").write_text(cfg.strip() + "\n")

We implement write_cipher_yml() to generate a cipher.yml configuration file inside a memAgent folder, setting the chosen LLM provider, model, and API key, enabling a system prompt with long-term memory, and registering a filesystem MCP server for file operations.
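For the Gemini path selected by choose_llm(), the rendered memAgent/cipher.yml would look like this (the $GEMINI_API_KEY reference is resolved by Cipher from the environment at runtime):

```yaml
llm:
 provider: gemini
 model: gemini-2.5-flash
 apiKey: $GEMINI_API_KEY
systemPrompt:
 enabled: true
 content: |
   You are an AI programming assistant with long-term memory of prior decisions.
embedding:
 disabled: true
mcpServers:
 filesystem:
   type: stdio
   command: npx
   args: ['-y','@modelcontextprotocol/server-filesystem','.']
```

Keeping the key as an environment-variable reference rather than a literal value means the generated file can be committed or shared without leaking the secret.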

def cipher_once(text, env=None, cwd=None):
   cmd = f'cipher {shlex.quote(text)}'
   p = subprocess.run(cmd, shell=True, text=True, capture_output=True, env=env, cwd=cwd)
   print("Cipher says:\n", p.stdout or p.stderr)
   return p.stdout.strip() or p.stderr.strip()

We define cipher_once() to run a single Cipher CLI command with the provided text, capture and display its output, and return the response, allowing us to interact with Cipher programmatically from Python.
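The shlex.quote() call matters here: our memory prompts contain semicolons and colons, which the shell would otherwise interpret as command separators. A small illustration of what the constructed command line looks like:

```python
import shlex

# Without quoting, the ';' would end the cipher command and the shell would
# try to execute the rest of the prompt as a second command.
text = "Store decision: use pydantic; enforce black + isort."
cmd = f"cipher {shlex.quote(text)}"
print(cmd)
```

The whole prompt reaches Cipher as a single argument, wrapped in single quotes.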

def start_api(env, cwd):
   proc = subprocess.Popen("cipher --mode api", shell=True, env=env, cwd=cwd,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True)
   for _ in range(30):
       try:
           r = requests.get("http://127.0.0.1:3000/health", timeout=2)
           if r.ok:
               print("API /health:", r.text)
               break
       except Exception:
           pass
       time.sleep(1)
   return proc

We create start_api() to launch Cipher in API mode as a subprocess, then repeatedly poll its /health endpoint until it responds, ensuring the API server is ready before proceeding.
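The readiness loop is a general pattern worth isolating. Here is a minimal stdlib-only sketch of the same poll-until-healthy logic, with a throwaway local HTTP server standing in for Cipher's API (purely for illustration, no Cipher required):

```python
import http.server
import threading
import time
import urllib.request

class Health(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every GET with 200, mimicking a healthy /health endpoint.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):
        pass  # silence per-request logging

# Bind to port 0 so the OS picks a free port, then serve in the background.
srv = http.server.HTTPServer(("127.0.0.1", 0), Health)
threading.Thread(target=srv.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{srv.server_address[1]}/health"

ready = False
for _ in range(30):  # poll for up to ~30 seconds
    try:
        with urllib.request.urlopen(url, timeout=2) as r:
            if r.status == 200:
                ready = True
                break
    except OSError:
        time.sleep(1)  # server not up yet; back off and retry
print("ready:", ready)
srv.shutdown()
```

Sleeping only on failure keeps the happy path fast, while the bounded loop guarantees we never hang forever if the server fails to start.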

def main():
   provider, model, key_env = choose_llm()
   ensure_node_and_cipher()
   workdir = pathlib.Path(tempfile.mkdtemp(prefix="cipher_demo_"))
   write_cipher_yml(workdir, provider, model, key_env)
   env = os.environ.copy()


   cipher_once("Store decision: use pydantic for config validation; pytest fixtures for testing.", env, str(workdir))
   cipher_once("Remember: follow conventional commits; enforce black + isort in CI.", env, str(workdir))


   cipher_once("What did we standardize for config validation and Python formatting?", env, str(workdir))


   api_proc = start_api(env, str(workdir))
   time.sleep(3)
   api_proc.terminate()


if __name__ == "__main__":
   main()

In main(), we select the LLM provider, install dependencies, and create a temporary working directory with a cipher.yml configuration. We then store key project decisions in Cipher's memory, query them back, and finally start the Cipher API server briefly before shutting it down, demonstrating both CLI and API-based interactions.

In conclusion, we now have a working Cipher environment that securely manages API keys, selects the appropriate LLM provider automatically, and configures a memory-enabled agent entirely through Python automation. Our implementation includes decision logging, memory retrieval, and a live API endpoint, all orchestrated in a Notebook/Colab-friendly workflow. This makes the setup reusable for other AI-assisted development pipelines, allowing us to store and query project knowledge programmatically while keeping the environment lightweight and easy to redeploy.


Check out the FULL CODES here. Feel free to check out our GitHub Page for Tutorials, Codes and Notebooks. Also, feel free to follow us on Twitter and don't forget to join our 100k+ ML SubReddit and Subscribe to our Newsletter.


Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence Media Platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.
