Setup is done. Here's where your documents live and how to connect your recording service.
Your dashboard is running locally. Open it in your browser.
Log in with the username and password you set during the installer. If your port is different, the installer printed it: look for the line that says Dashboard : http://localhost:XXXX.
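If the page won't load, you can first confirm the dashboard is answering at all. A minimal sketch using only the standard library (the port 7777 is just an example; substitute the port from your installer output):

```python
import urllib.request
import urllib.error

def dashboard_up(url: str, timeout: float = 3.0) -> bool:
    """Return True if anything answers HTTP at `url`, even with an error page."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server responded (e.g. a 401 login prompt), so it is up.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: nothing is listening there.
        return False

# Example port only; use the one printed by your installer.
print(dashboard_up("http://localhost:7777"))
```

If this prints False, the service isn't listening on that port; re-check the installer output before troubleshooting login credentials.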
(If you access the machine remotely, use its IP address instead, e.g. http://100.x.x.x:7777.)

Run this command in Terminal. It reads your config and prints every folder ID the pipeline is watching:
python3 -c "
import yaml, os
cfg = os.environ.get('COS_CONFIG_DIR', os.path.expanduser('~/cos-pipeline-config'))
ctx = yaml.safe_load(open(f'{cfg}/firm_context.yaml'))
for src in ctx.get('transcript_sources', []):
    print(f\"\n{src['name']}\")
    for fid in src.get('folder_ids', []):
        print(f\"  https://drive.google.com/drive/folders/{fid}\")
"
This prints one clickable Drive URL per folder. Open each one to confirm it exists in your Google Drive, then copy the folder ID (the long string at the end of the URL) to paste into your recording service in the next step.
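If you'd rather not pick the ID out of each URL by hand, a small helper can do it. A sketch (the ID is simply the last path segment of the folder URL; the sample ID below is made up):

```python
from urllib.parse import urlparse

def folder_id_from_url(url: str) -> str:
    """Return the Drive folder ID: the last path segment of the URL.

    urlparse drops any query string (e.g. ?usp=sharing), so only the
    path is inspected.
    """
    return urlparse(url).path.rstrip("/").rsplit("/", 1)[-1]

# Hypothetical folder ID for illustration.
print(folder_id_from_url("https://drive.google.com/drive/folders/1AbC_dEf?usp=sharing"))
# prints 1AbC_dEf
```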
If a folder is missing or a link fails to open, run ./setup.sh --instance=<your-slug> --validate to diagnose.
Run this to print direct links to all three of your shared docs:
python3 -c "
import yaml, os
cfg = os.environ.get('COS_CONFIG_DIR', os.path.expanduser('~/cos-pipeline-config'))
docs = yaml.safe_load(open(f'{cfg}/config/drive-docs.yaml'))
for k, v in docs.get('docs', {}).items():
    print(f\"{k}: https://docs.google.com/document/d/{v}\")
"
Paste the folder ID from Step 2 into whichever recording service you use. You only need to do this once.
To trigger any pipeline manually, or to check when it last ran, open your dashboard's admin panel.