Import Methods
- File Upload: CSV, Excel, JSON files up to 100MB
- API Integration: Real-time data import via REST API
- Database Connection: PostgreSQL, MySQL, SQL Server, Oracle, MongoDB
- Third-Party: Google Sheets, Airtable, webhooks
File Upload
Navigate to your dataset → Import Data → File Upload.

Supported Formats
- CSV: Best for structured data. UTF-8 encoding recommended; 100MB limit.
- Excel: .xlsx/.xls files; multiple sheets supported; formulas converted to values.
- JSON: Array of objects or line-delimited JSON; automatic schema detection.

Upload Process
- Select File: Browse or drag-and-drop your file
- Configure: System detects headers, delimiters, and encoding automatically
- Map Columns: Match file columns to dataset columns, create new columns if needed
- Handle Issues: System flags missing headers, type mismatches, duplicates, invalid dates
- Choose Import Mode:
  - Append: Add to existing data
  - Replace: Replace all data
  - Update: Update existing records by key column
- Execute Import: Review summary and start processing
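The three import modes above can be sketched as a merge over keyed records. This is an illustrative sketch, not the platform's implementation; `apply_import` and its arguments are hypothetical names:

```python
def apply_import(existing, incoming, mode, key=None):
    """Illustrative merge logic for the three import modes.
    `existing` and `incoming` are lists of dict records; `key` is the
    key column used by Update mode."""
    if mode == "replace":
        # Replace: discard current data and keep only the new records
        return list(incoming)
    if mode == "append":
        # Append: keep existing rows and add the new ones at the end
        return existing + list(incoming)
    if mode == "update":
        # Update: overwrite rows whose key matches; append unmatched rows
        merged = {rec[key]: rec for rec in existing}
        for rec in incoming:
            merged[rec[key]] = rec
        return list(merged.values())
    raise ValueError(f"unknown mode: {mode}")
```

Update mode requires a key column with unique values; Append does not deduplicate, so re-importing the same file in Append mode will produce duplicate rows.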
Best Practices
File Preparation: Use clean headers, consistent formatting, and UTF-8 encoding, and remove empty rows.
Large Files: Split files over 100MB, remove unnecessary columns, and test with a small sample first.

API Integration
Use the REST API for real-time or automated imports.

Setup
- Get API Key: Workspace Settings → API Keys → Create with the datasets:write permission
- Use Endpoint: POST /api/v1/workspaces/{workspace_id}/datasets/{dataset_id}/records
- Authentication: Bearer token in the Authorization header
Example
Optional flags: validate_schema, skip_duplicates, and update_existing in the options object of the request body.
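A minimal sketch of the request, using only the Python standard library. The endpoint path, header, and option flags come from the Setup section above; the base URL, the workspace/dataset IDs, and the `records`/`options` body shape are assumptions for illustration:

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # from Workspace Settings → API Keys
WORKSPACE_ID = "ws_123"   # placeholder IDs
DATASET_ID = "ds_456"

def build_request(records):
    """Build the POST request for the records endpoint.
    The records/options body shape is an assumption; only the
    three option flags are documented above."""
    body = {
        "records": records,
        "options": {
            "validate_schema": True,
            "skip_duplicates": True,
            "update_existing": False,
        },
    }
    # Base URL is a placeholder; the path matches the Setup section
    url = (f"https://api.example.com/api/v1/workspaces/"
           f"{WORKSPACE_ID}/datasets/{DATASET_ID}/records")
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To send:
# with urllib.request.urlopen(build_request([{"name": "Ada"}])) as resp:
#     print(resp.status)
```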