
Schema Upload

Upload your database schema as CSV or DDL when you cannot provide live database access. Squish analyzes the uploaded schema to discover relationships using naming conventions and column analysis.

When to Use Schema Upload

Compliance restrictions

Security policy prohibits external tools from connecting to production databases.

Air-gapped environments

Database is on a private network with no external access.

Evaluation without credentials

Test Squish before sharing database credentials with the team.

CI/CD integration

Export schema from your migration tool and upload as part of your pipeline.
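For pipelines, the schema CSV can be generated directly from database metadata. The sketch below uses SQLite's PRAGMA table_info as a stand-in (an assumption for illustration; for PostgreSQL you would query information_schema.columns instead) and writes the column names Squish expects:

```python
import csv
import sqlite3

def export_schema_csv(db_path: str, out_path: str) -> None:
    """Export table/column metadata in Squish's required CSV format."""
    conn = sqlite3.connect(db_path)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["table_name", "column_name", "data_type", "is_nullable"])
        tables = [r[0] for r in conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        for table in tables:
            # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
            for _, name, ctype, notnull, _, _ in conn.execute(
                    f"PRAGMA table_info({table})"):
                writer.writerow([table, name, ctype.lower(),
                                 "NO" if notnull else "YES"])
    conn.close()
```

Run this as a pipeline step after migrations apply, then upload the resulting file.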

CSV Format

Required Columns

Column        Type    Description
table_name    string  Name of the database table
column_name   string  Name of the column
data_type     string  SQL data type (e.g. integer, varchar, timestamp)

Optional Columns

Column                    Description
is_nullable               Whether the column allows NULL values (YES/NO)
column_default            Default value expression
character_maximum_length  Max length for string columns
Example CSV (including the optional is_nullable column):

table_name,column_name,data_type,is_nullable
users,id,integer,NO
users,email,varchar,NO
users,name,varchar,YES
users,created_at,timestamp,NO
orders,id,integer,NO
orders,user_id,integer,NO
orders,total,decimal,NO
orders,status,varchar,NO
orders,created_at,timestamp,NO
products,id,integer,NO
products,name,varchar,NO
products,price,decimal,NO
order_items,id,integer,NO
order_items,order_id,integer,NO
order_items,product_id,integer,NO
order_items,quantity,integer,NO
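Before uploading, it can save a round trip to check the file locally for the three required columns. A minimal validation sketch (the function name and error wording are illustrative, not part of Squish):

```python
import csv
import io

REQUIRED = {"table_name", "column_name", "data_type"}

def validate_schema_csv(text: str) -> list[str]:
    """Return a list of problems found in a schema CSV; empty means OK."""
    reader = csv.DictReader(io.StringIO(text))
    missing = REQUIRED - set(reader.fieldnames or [])
    if missing:
        return [f"missing required columns: {sorted(missing)}"]
    problems = []
    for i, row in enumerate(reader, start=2):  # line 1 is the header
        for col in REQUIRED:
            if not row[col]:
                problems.append(f"line {i}: empty {col}")
    return problems
```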

DDL Format

Upload PostgreSQL-compatible CREATE TABLE statements. Squish parses table names, column names, data types, and foreign key constraints from the DDL.

CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  email VARCHAR(255) NOT NULL UNIQUE,
  name VARCHAR(255),
  created_at TIMESTAMP NOT NULL DEFAULT NOW()
);

CREATE TABLE orders (
  id SERIAL PRIMARY KEY,
  user_id INTEGER NOT NULL REFERENCES users(id),
  total DECIMAL(10,2) NOT NULL,
  status VARCHAR(50) NOT NULL DEFAULT 'pending',
  created_at TIMESTAMP NOT NULL DEFAULT NOW()
);

CREATE TABLE products (
  id SERIAL PRIMARY KEY,
  name VARCHAR(255) NOT NULL,
  price DECIMAL(10,2) NOT NULL
);

CREATE TABLE order_items (
  id SERIAL PRIMARY KEY,
  order_id INTEGER NOT NULL REFERENCES orders(id),
  product_id INTEGER NOT NULL REFERENCES products(id),
  quantity INTEGER NOT NULL DEFAULT 1
);

DDL upload extracts explicit REFERENCES constraints as foreign key relationships in addition to running naming convention analysis. This gives you higher-confidence results than CSV upload alone.
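To illustrate what "extracting REFERENCES constraints" means, here is a simplified sketch of the idea using regular expressions. This is not Squish's actual parser; it only handles inline column-level REFERENCES like the examples above and would miss table-level FOREIGN KEY clauses and types containing commas:

```python
import re

# Matches "<column> <type> ... REFERENCES <table>(<column>)" inside a table body.
FK_RE = re.compile(
    r"(\w+)\s+\w+[^,]*?REFERENCES\s+(\w+)\s*\(\s*(\w+)\s*\)",
    re.IGNORECASE,
)
TABLE_RE = re.compile(
    r"CREATE\s+TABLE\s+(\w+)\s*\((.*?)\);",
    re.IGNORECASE | re.DOTALL,
)

def extract_foreign_keys(ddl: str) -> list[tuple[str, str, str, str]]:
    """Return (table, column, ref_table, ref_column) tuples from inline REFERENCES."""
    fks = []
    for table, body in TABLE_RE.findall(ddl):
        for col, ref_table, ref_col in FK_RE.findall(body):
            fks.append((table, col, ref_table, ref_col))
    return fks
```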

Upload Flow

1. Navigate to Upload

In the Squish app, go to Dashboard and click "Upload Schema" or navigate directly to the upload page.

2. Select your file

Drag and drop your CSV or DDL file, or click to browse. Files up to 10MB are supported.

3. Review parsed schema

Squish shows you the tables and columns it parsed from your file. Verify the count matches your expectations.

4. Run analysis

Click "Analyze" to run relationship discovery on the uploaded schema. Results appear in the same relationships table and ERD view as live discovery.
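A quick local pre-flight check can catch rejected files before step 2. The 10MB limit comes from the docs above; the accepted extensions are an assumption for illustration:

```python
import os

MAX_UPLOAD_BYTES = 10 * 1024 * 1024            # 10MB limit stated in the docs
ALLOWED_EXTENSIONS = {".csv", ".sql", ".ddl"}  # assumed; check the upload page

def preflight_check(path: str) -> list[str]:
    """Return problems that would block an upload; empty means good to go."""
    problems = []
    ext = os.path.splitext(path)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        problems.append(f"unsupported extension: {ext or '(none)'}")
    if os.path.getsize(path) > MAX_UPLOAD_BYTES:
        problems.append("file exceeds the 10MB upload limit")
    return problems
```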

Limitations

No statistical analysis

Without live database access, Squish cannot run COUNT or COUNT(DISTINCT) queries. Cardinality-based scoring is not available.

Convention and column name methods only

Discovery relies on naming conventions (e.g. user_id -> users.id) and column name matching. Foreign key constraints are extracted from DDL but not from CSV.
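The naming-convention idea can be sketched as follows. This is an illustrative approximation, not Squish's scoring logic; it guesses that a column named <stem>_id points at a table named after the stem (with naive pluralization):

```python
def infer_fk_by_convention(columns, tables):
    """Guess foreign keys from the <singular>_id naming convention.

    columns: iterable of (table, column) pairs; tables: set of table names.
    Returns (table, column, ref_table, ref_column) guesses.
    """
    guesses = []
    for table, column in columns:
        if not column.endswith("_id"):
            continue
        stem = column[: -len("_id")]
        # naive pluralization: user -> users; also try the bare stem
        for candidate in (stem + "s", stem):
            if candidate in tables and candidate != table:
                guesses.append((table, column, candidate, "id"))
                break
    return guesses
```

On the example schema above, this recovers orders.user_id -> users.id, order_items.order_id -> orders.id, and order_items.product_id -> products.id.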

No query history analysis

JOIN pattern analysis from pg_stat_statements or Snowflake QUERY_HISTORY requires a live connection.

Point-in-time snapshot

Uploaded schemas are static. To pick up schema changes, upload a new file.

Ready to upload your schema?

Open Schema Upload