aws-import is a script that helps you import data into DynamoDB from a JSON file. It automatically splits the data into chunks. It can be used with a local DynamoDB, LocalStack, or AWS CloudShell.
npm i -g aws-import
Run in CloudShell or locally:
aws-import file=sample.json
Use split to split a source JSON into chunks only:
aws-import file=sample.json split
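The source file is expected to contain DynamoDB-formatted JSON. The exact layout aws-import reads is not spelled out here, but as a rough sketch, DynamoDB-formatted data (the shape produced by aws dynamodb scan, see the export command at the end of this page) looks like this; the table contents and attribute names are made up for illustration:

```json
{
  "Items": [
    {
      "id":    { "S": "user-001" },
      "name":  { "S": "Alice" },
      "age":   { "N": "30" },
      "admin": { "BOOL": true }
    }
  ],
  "Count": 1,
  "ScannedCount": 1
}
```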
Options:

- Help: shows a short help brief.
- file: the source JSON file. Optional; if the JSON file is the only one in the directory, it can be omitted.
- AWS region. Optional; default: eu-west-1.
- endpoint: the DynamoDB endpoint, e.g. endpoint=http://localhost:4566. Optional; the default for local use is http://localhost:4566, and the default for AWS CloudShell is https://dynamodb.<region>.amazonaws.com. (See the note after this list for starting a local endpoint.)
- split: split the source JSON into chunks. Optional; default: false. When split is specified, the source JSON is only split into chunks and nothing is imported. The number of rows in a chunk is 25 by default, and the chunk files are numbered with the -001, -002, -003, etc. pattern.
- Table name: the table to import data into. Optional; specifying a table name overrides the table name in the JSON file.
- Output format. Optional; default: dynamo.
  - dynamo: keep the DynamoDB format.
  - clean: remove the DynamoDB format and produce clean, raw JSON. (See the example after this list.)
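To illustrate the two formats, here is a rough sketch of what the clean variant of the made-up item shown earlier could look like: the DynamoDB type descriptors (S, N, BOOL, ...) are removed and plain JSON values remain. The exact output of aws-import may differ.

```json
{ "id": "user-001", "name": "Alice", "age": 30, "admin": true }
```

The default local endpoint, http://localhost:4566, is LocalStack's edge port. If no local DynamoDB is running yet, one way to start one (an assumption about your setup, not a requirement of aws-import) is:

```sh
# Start LocalStack; its DynamoDB service is reachable on port 4566
docker run --rm -p 4566:4566 localstack/localstack

# Or start DynamoDB Local, which listens on port 8000 by default;
# in that case pass endpoint=http://localhost:8000 to aws-import
docker run --rm -p 8000:8000 amazon/dynamodb-local
```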
If you want to export a DynamoDB table to a JSON file, you can use the following command:
aws dynamodb scan --table-name TABLE_NAME > export.json
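The scan output keeps the DynamoDB attribute format, so the exported file should be usable as a source for aws-import again (assuming the tool accepts scan-style output, which is not stated explicitly here):

```sh
# Export a table, then re-import the exported file
aws dynamodb scan --table-name TABLE_NAME > export.json
aws-import file=export.json
```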