Posting JSON to DynamoDB via the AWS CLI can fail with Unicode errors, so it may be worth importing your data with a short Python script instead.
Fortunately this is relatively simple. First, install boto3, the AWS SDK for Python:
pip install boto3
Then you can run this code to walk every file under a directory tree and upload each one:
from __future__ import print_function
import decimal
import json
import os

import boto3

dynamodb = boto3.resource('dynamodb', region_name='us-east-1')
table = dynamodb.Table('talks')

rootdir = "d:\\projects\\image-annotation\\data\\talks\\json\\1"

for subdir, dirs, files in os.walk(rootdir):
    for file in files:
        selected = os.path.join(subdir, file)
        try:
            with open(selected, encoding="UTF-8") as json_file:
                # DynamoDB does not accept native Python floats,
                # so parse them as Decimal
                row = json.load(json_file, parse_float=decimal.Decimal)
                # the table's key attribute is a string, so cast it
                row['id_i'] = str(row['id_i'])
                table.put_item(Item=row)
        except Exception as e:
            print("error: " + selected + " (" + str(e) + ")")
Note here that I’ve chosen “id_i” as my primary key and set it up as a string in AWS, which is why the value is cast to a string before the upload.
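The parse_float = decimal.Decimal argument is load-bearing, by the way: boto3 rejects native Python floats in DynamoDB items, so any floating-point values in the JSON must arrive as Decimal. You can check the effect without touching AWS at all (the field names here are just illustrative):

```python
import decimal
import json

# boto3 raises a TypeError on native Python floats in an item,
# which is why the loader above parses floats as Decimal.
raw = '{"id_i": 42, "rating": 4.5}'
row = json.loads(raw, parse_float=decimal.Decimal)

print(type(row["rating"]))   # floats become decimal.Decimal
print(row["id_i"])           # integers are unaffected by parse_float
```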
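For anything beyond a few hundred files, issuing one put_item call per file gets slow. boto3 offers a batch_writer context manager on the table object that buffers items and flushes them in batches behind the scenes. Here is a sketch of the same loop using it – same table name, region and key handling as above, but untested against a live table, so treat it as a starting point rather than a drop-in replacement:

```python
def bulk_upload(rootdir, table_name="talks", region="us-east-1"):
    """Walk rootdir and upload every JSON file via a DynamoDB batch writer."""
    # imports live inside the function so the sketch can be defined
    # even where boto3 is not installed
    import decimal
    import json
    import os

    import boto3

    table = boto3.resource("dynamodb", region_name=region).Table(table_name)
    errors = []
    # batch_writer buffers put_item calls and sends them in batches
    with table.batch_writer() as batch:
        for subdir, dirs, files in os.walk(rootdir):
            for name in files:
                path = os.path.join(subdir, name)
                try:
                    with open(path, encoding="UTF-8") as f:
                        row = json.load(f, parse_float=decimal.Decimal)
                    row["id_i"] = str(row["id_i"])  # string key, as above
                    batch.put_item(Item=row)
                except Exception:
                    errors.append(path)
    return errors  # paths of files that failed to parse or upload
```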