
DBML CLI Usage Guide

Convert DBML files to SQL and vice versa, or generate DBML directly from a live database.

Prerequisites

This guide assumes @dbml/cli is already installed globally. If not, install it with: npm install -g @dbml/cli
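
To confirm the tools are on your PATH, you can list the global install (a standard npm command, not specific to @dbml/cli):

$ npm list -g @dbml/cli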

Convert a DBML file to SQL

$ dbml2sql schema.dbml

CREATE TABLE "staff" (
  "id" INT PRIMARY KEY,
  "name" VARCHAR,
  "age" INT,
  "email" VARCHAR
);
...
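
For reference, a schema.dbml that yields output like the above could be as simple as the following sketch (the same table syntax appears in the sql2dbml examples later in this guide):

Table staff {
  id int [pk]
  name varchar
  age int
  email varchar
}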

By default, dbml2sql generates PostgreSQL. To target a different database:

$ dbml2sql schema.dbml --mysql

CREATE TABLE `staff` (
  `id` INT PRIMARY KEY,
  `name` VARCHAR(255),
  `age` INT,
  `email` VARCHAR(255)
);
...

To write the output to a file, use --out-file or -o:

$ dbml2sql schema.dbml -o schema.sql
  ✔ Generated SQL dump file (PostgreSQL): schema.sql

Syntax Manual

$ dbml2sql <path-to-dbml-file>
           [--mysql|--postgres|--mssql|--oracle]
           [-o|--out-file <output-filepath>]
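
Flags can be combined; for example, to emit SQL Server DDL straight to a file (file names are illustrative):

$ dbml2sql schema.dbml --mssql -o schema.mssql.sql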

Convert a SQL file to DBML

To convert a SQL dump file to DBML:

$ sql2dbml dump.sql --postgres

Table staff {
  id int [pk]
  name varchar
  age int
  email varchar
}
...

Output to a file:

$ sql2dbml --mysql dump.sql -o mydatabase.dbml
  ✔ Generated DBML file from SQL file (MySQL): mydatabase.dbml
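
Because dbml2sql and sql2dbml are inverses, a schema can be round-tripped through SQL and back (a sketch with illustrative file names):

$ dbml2sql schema.dbml -o schema.sql
$ sql2dbml schema.sql --postgres -o roundtrip.dbml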

Syntax Manual

$ sql2dbml <path-to-sql-file>
           [--mysql|--postgres|--mssql|--postgres-legacy|--mysql-legacy|--mssql-legacy|--snowflake]
           [-o|--out-file <output-filepath>]

Note: The --postgres-legacy, --mysql-legacy and --mssql-legacy options import PostgreSQL/MySQL/MSSQL to DBML using the old parsers. They are faster but less accurate.
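
For example, to import a PostgreSQL dump with the faster legacy parser (output file name is illustrative):

$ sql2dbml dump.sql --postgres-legacy -o legacy.dbml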

Generate DBML directly from a database

$ db2dbml postgres 'postgresql://dbml_user:dbml_pass@localhost:5432/dbname?schemas=public'

Table "staff" {
  "id" int4 [pk, not null]
  "name" varchar
  "age" int4
  "email" varchar
}
...

Output to a file:

$ db2dbml postgres 'postgresql://dbml_user:dbml_pass@localhost:5432/dbname?schemas=public' -o schema.dbml
  ✔ Generated DBML file from database's connection: schema.dbml

Syntax Manual

$ db2dbml postgres|mysql|mssql|snowflake|bigquery
          <connection-string>
          [-o|--out-file <output-filepath>]

Connection string examples:

  • postgres: 'postgresql://user:password@localhost:5432/dbname?schemas=schema1,schema2,schema3'
  • mysql: 'mysql://user:password@localhost:3306/dbname'
  • mssql: 'Server=localhost,1433;Database=master;User Id=sa;Password=your_password;Encrypt=true;TrustServerCertificate=true;Schemas=schema1,schema2,schema3;'
  • snowflake: 'SERVER=<account_identifier>.<region>;UID=<your_username>;PWD=<your_password>;DATABASE=<your_database>;WAREHOUSE=<your_warehouse>;ROLE=<your_role>;SCHEMAS=schema1,schema2,schema3;'
  • bigquery: /path_to_json_credential.json
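
Putting one of these together, a MySQL export could look like the following (credentials, host, and database name are placeholders):

$ db2dbml mysql 'mysql://user:password@localhost:3306/dbname' -o mydatabase.dbml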

For BigQuery, the credential file supports flexible authentication:

1. Application Default Credentials (ADC):

  • An empty file containing {} uses the environment's authentication (a sample run appears at the end of this section).
  • Specific fields can be overridden, e.g. {"project_id": "my-project", "datasets": [...]}

For more information about ADC, see Google's documentation, "How Application Default Credentials works".

2. Explicit Service Account (bypasses ADC):

{
  "project_id": "your-project-id",
  "client_email": "your-client-email",
  "private_key": "your-private-key",
  "datasets": ["dataset_1", "dataset_2", ...]
}

Note:

  • Both client_email and private_key must be provided together.
  • If datasets is not specified or is empty, all accessible datasets will be fetched.
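
Either credential style is passed to db2dbml the same way. A minimal ADC-based run might look like this (file names are illustrative; assumes your environment is already authenticated, e.g. via gcloud):

$ echo '{}' > bigquery-credential.json
$ db2dbml bigquery ./bigquery-credential.json -o schema.dbml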