secrets

Use this skill when a user wants to store, manage, or work with Goldsky secrets — the named credential objects used by pipeline sinks. This includes: creating a new secret from a connection string or credentials, listing or inspecting existing secrets, updating or rotating credentials after a password change, and deleting secrets that are no longer needed. Trigger for any query where the user mentions 'goldsky secret', wants to securely store database credentials for a pipeline, or is working with sink authentication for PostgreSQL, Neon, Supabase, ClickHouse, Kafka, S3, Elasticsearch, DynamoDB, SQS, OpenSearch, or webhooks.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.


Install skill "secrets" with this command: npx skills add goldsky-io/goldsky-agent/goldsky-io-goldsky-agent-secrets

Goldsky Secrets Management

Create and manage secrets for pipeline sink credentials.

Agent Instructions

When this skill is invoked, follow this streamlined workflow:

Step 1: Verify Login + List Existing Secrets

Run goldsky secret list to confirm authentication and show existing secrets.

If authentication fails: Invoke the auth-setup skill first.

Step 2: Determine Intent Quickly

Skip unnecessary questions. If the user's intent is clear from context, proceed directly:

  • User says "create a postgres secret" → Go straight to credential collection
  • User pastes a connection string → Parse it immediately (see Connection String Parsing)
  • User mentions a specific provider (Neon, Supabase, etc.) → Use provider-specific guidance

Only use AskUserQuestion if intent is genuinely unclear.

Step 3: Connection String Parsing (Preferred for PostgreSQL)

If user provides a connection string, parse it directly instead of asking questions.

PostgreSQL connection string format:

postgres://USER:PASSWORD@HOST:PORT/DATABASE?sslmode=require
postgresql://USER:PASSWORD@HOST/DATABASE

Parsing logic:

  1. Extract: user, password, host, port (default 5432), databaseName
  2. Construct JSON immediately
  3. Create the secret without further questions
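The three parsing steps above can be sketched in plain shell. This is a hypothetical parser, not the bundled helper script; the JSON key names (host, port, user, password, databaseName) follow the extracted fields listed above, but the exact value shape should be checked against the schemas/ files.

```shell
# Hypothetical sketch: parse a PostgreSQL connection string into the
# JSON shape used by `goldsky secret create --value`.
conn="postgresql://neondb_owner:abc123@ep-cool-name.us-east-2.aws.neon.tech/neondb?sslmode=require"

rest="${conn#*://}"              # strip scheme (postgres:// or postgresql://)
creds="${rest%%@*}"              # USER:PASSWORD
hostpart="${rest#*@}"            # HOST[:PORT]/DATABASE[?params]
user="${creds%%:*}"
password="${creds#*:}"
hostport="${hostpart%%/*}"
host="${hostport%%:*}"
port="${hostport#*:}"
[ "$port" = "$hostport" ] && port=5432   # default port when none is given
db="${hostpart#*/}"
db="${db%%\?*}"                  # drop ?sslmode=... query params

printf '{"type":"jdbc","protocol":"postgres","host":"%s","port":%s,"databaseName":"%s","user":"%s","password":"%s"}\n' \
  "$host" "$port" "$db" "$user" "$password"
```

Pure parameter expansion keeps the password out of argv and avoids a dependency on external tools.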

Example - user provides:

postgresql://neondb_owner:abc123@ep-cool-name.us-east-2.aws.neon.tech/neondb?sslmode=require

Create using the connection string directly:

goldsky secret create --name SUGGESTED_NAME
# When prompted, paste the connection string:
# postgresql://neondb_owner:abc123@ep-cool-name.us-east-2.aws.neon.tech/neondb?sslmode=require

Step 4: Provider-Specific Quick Paths

Neon:

  • Connection string format: postgresql://USER:PASS@ep-XXX.REGION.aws.neon.tech/neondb
  • Default port: 5432
  • Common issue: the free tier has a 512 MB storage limit; pipelines will fail with "project size limit exceeded" once it is reached

Supabase:

  • Connection string format: postgresql://postgres:PASS@db.PROJECT.supabase.co:5432/postgres
  • Use the "Connection string" from Project Settings → Database

PlanetScale (MySQL):

  • Use "protocol": "mysql" and port 3306
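A MySQL value can mirror the jdbc shape used for PostgreSQL. In this sketch only "protocol": "mysql" and port 3306 come from the guidance above; host, databaseName, user, and password are placeholder assumptions.

```shell
# Hypothetical MySQL secret value in the jdbc format; HOST/USER/PASS and
# databaseName are placeholders, not real field guidance from Goldsky.
value='{"type":"jdbc","protocol":"mysql","host":"HOST","port":3306,"databaseName":"mydb","user":"USER","password":"PASS"}'
echo "$value" | jq .
# Then: goldsky secret create --name MY_MYSQL_SECRET --value "$value"
```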

Step 5: Create Secret Directly

Once you have credentials (from parsing or user input), create immediately:

goldsky secret create \
  --name SECRET_NAME \
  --value '{"type":"jdbc","protocol":"postgres",...}' \
  --description "Optional description"

Naming convention: PROJECT_PROVIDER (e.g., TRADEWATCH_NEON, ANALYTICS_SUPABASE)

Step 6: Verify

Run goldsky secret list to confirm creation.


Secret JSON Schemas

JSON schema files are available in the schemas/ folder. Each file contains the full schema with examples.

Secret Type    Schema File         Type Field     Use Case
PostgreSQL     postgres.json       jdbc           Database sink
MySQL          postgres.json       jdbc           Database sink (protocol: mysql)
ClickHouse     clickhouse.json     clickHouse     Analytics database
Kafka          kafka.json          kafka          Event streaming
AWS S3         s3.json             s3             Object storage
ElasticSearch  elasticsearch.json  elasticSearch  Search engine
DynamoDB       dynamodb.json       dynamodb       NoSQL database
SQS            sqs.json            sqs            Message queue
OpenSearch     opensearch.json     opensearch     Search/analytics
Webhook        webhook.json        httpauth       HTTP endpoints

Schema location: schemas/ (relative to this skill's directory)

Quick Reference Examples

PostgreSQL — Connection string format:

postgres://username:password@host:port/database
goldsky secret create --name MY_POSTGRES_SECRET
# The CLI will prompt for the connection string interactively

ClickHouse — Connection string format:

https://username:password@host:port/database

Kafka — JSON format:

{
  "type": "kafka",
  "bootstrapServers": "broker:9092",
  "securityProtocol": "SASL_SSL",
  "saslMechanism": "PLAIN",
  "saslJaasUsername": "user",
  "saslJaasPassword": "pass"
}

S3 — Colon-separated format:

access_key_id:secret_access_key

Or with session token: access_key_id:secret_access_key:session_token
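Concretely, the value is just the colon-joined pair; the credentials below are placeholders, and the final goldsky call is shown as a comment for context:

```shell
# Build the colon-separated S3 secret value (placeholder credentials).
AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
AWS_SECRET_ACCESS_KEY="exampleSecret"
value="${AWS_ACCESS_KEY_ID}:${AWS_SECRET_ACCESS_KEY}"
echo "$value"
# With a temporary session: value="$value:$AWS_SESSION_TOKEN"
# Then: goldsky secret create --name MY_S3_SECRET --value "$value"
```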

Webhook:

Note: Turbo pipeline webhook sinks do not support Goldsky's native secrets management. Include auth headers directly in the pipeline YAML headers: field instead.

Connection String Parser

For PostgreSQL, use the helper script to parse connection strings:

./scripts/parse-connection-string.sh "postgresql://user:pass@host:5432/dbname"
# Output: JSON ready for goldsky secret create --value

Confirm Before Creating

Before running the create command, show the user exactly what will be created (mask the password with ***) and ask for confirmation. After creation, run goldsky secret list to verify the secret exists.

Quick Reference

Action  Command
Create  goldsky secret create --name NAME --value "value"
List    goldsky secret list
Reveal  goldsky secret reveal NAME
Update  goldsky secret update NAME --value "new-value"
Delete  goldsky secret delete NAME

Prerequisites

  • Goldsky CLI installed
  • Logged in (goldsky login)
  • Connection credentials for your target sink

Why Secrets Are Needed

Pipelines that write to external sinks (PostgreSQL, ClickHouse, Kafka, S3) need credentials to connect. Instead of putting credentials directly in your pipeline YAML, you store them as secrets and reference them by name.

Benefits:

  • Credentials are encrypted and stored securely
  • Pipeline configs can be shared without exposing secrets
  • Credentials can be rotated without modifying pipelines

Command Reference

Command                       Purpose              Key Flags
goldsky secret create         Create a new secret  --name, --value, --description
goldsky secret list           List all secrets
goldsky secret reveal <name>  Show secret value
goldsky secret update <name>  Update secret value  --value, --description
goldsky secret delete <name>  Delete a secret      -f (force, skip confirmation)

Common Patterns

PostgreSQL Secret

goldsky secret create --name PROD_POSTGRES
# When prompted, provide the connection string:
# postgres://admin:secret@db.example.com:5432/mydb

Pipeline usage:

sinks:
  output:
    type: postgres
    from: my_source
    schema: public
    table: transfers
    secret_name: PROD_POSTGRES

ClickHouse Secret

goldsky secret create --name CLICKHOUSE_ANALYTICS
# When prompted, provide the connection string:
# https://default:secret@abc123.clickhouse.cloud:8443/analytics

Pipeline usage:

sinks:
  output:
    type: clickhouse
    from: my_source
    table: events
    secret_name: CLICKHOUSE_ANALYTICS
    primary_key: id

Rotating Credentials

Update an existing secret without changing pipeline configs:

goldsky secret update MY_POSTGRES_SECRET --value 'postgres://admin:NEW_PASSWORD@db.example.com:5432/mydb'

Active pipelines will pick up the new credentials on their next connection.

Deleting Unused Secrets

# With confirmation prompt
goldsky secret delete OLD_SECRET

# Skip confirmation (for scripts)
goldsky secret delete OLD_SECRET -f

Warning: Deleting a secret that's in use will cause pipeline failures.

Secret Naming Conventions

Use descriptive, uppercase names with underscores:

Good                Bad
PROD_POSTGRES_MAIN  secret1
STAGING_CLICKHOUSE  my-secret
KAFKA_PROD_CLUSTER  postgres

Include environment and purpose in the name for clarity.

Troubleshooting

Error: Secret not found

Error: Secret 'MY_SECRET' not found

Cause: The secret name doesn't exist or is misspelled.
Fix: Run goldsky secret list to see available secrets and check the exact name.

Error: Secret already exists

Error: Secret 'MY_SECRET' already exists

Cause: Attempting to create a secret with a name that's already in use.
Fix: Use goldsky secret update MY_SECRET --value "new-value" to update, or choose a different name.

Error: Invalid secret value format

Error: Invalid JSON in secret value

Cause: JSON syntax error in the secret value.
Fix: Validate your JSON before creating the secret:

# Test JSON validity
echo '{"url":"...","user":"..."}' | jq .

Pipeline fails with "connection refused"

Cause: The credentials in the secret are incorrect or the database is unreachable.
Fix:

  1. Verify credentials work outside Goldsky: psql "postgresql://..."
  2. Check the secret value: goldsky secret reveal MY_SECRET
  3. Ensure the database allows connections from Goldsky's IP ranges

Pipeline fails with "authentication failed"

Cause: Username or password in the secret is incorrect.
Fix: Update the secret with correct credentials:

goldsky secret update MY_SECRET --value 'postgres://correct:credentials@host:5432/db'

Secret value contains special characters

Cause: JSON strings containing special characters need escaping.
Fix: Escape special characters in password fields:

  • Backslash: use \\
  • Double quote: use \"
  • Newline: use \n

With the structured JSON format, most special characters in passwords work without URL encoding since the password is a separate field.
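A safe way to avoid hand-escaping entirely is to let jq build the JSON; it escapes quotes, backslashes, and newlines for you. Field names other than the password follow the jdbc examples earlier, and the host/user/database values below are placeholders.

```shell
# Let jq handle escaping of special characters in the password.
password='we"ird\pass'
value=$(jq -nc --arg pass "$password" \
  '{type:"jdbc", protocol:"postgres", host:"db.example.com", port:5432,
    databaseName:"mydb", user:"admin", password:$pass}')
echo "$value"
# Then: goldsky secret create --name MY_SECRET --value "$value"
```

Because jq emits valid JSON by construction, this also sidesteps the "Invalid JSON in secret value" error above.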

Related

  • /turbo-builder — Build and deploy pipelines that use these secrets
  • /auth-setup — Invoke this if user is not logged in
  • /turbo-pipelines — Pipeline YAML configuration reference
