My table is around 220 MB with about 250k records in it. I'm trying to pull all of this data into Python. I realize this needs to be a chunked, looped batch process, but I'm not sure how to make each batch start where the previous one left off.
Is there some way to filter my scan? From what I've read, filtering happens after the data is loaded, and a single scan stops reading at 1 MB, so a filter wouldn't actually let me pull in new items beyond that limit.
Any assistance would be appreciated.
import boto3

dynamodb = boto3.resource(
    'dynamodb',
    aws_session_token=aws_session_token,
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    region_name=region,
)
table = dynamodb.Table('widgetsTableName')
data = table.scan()
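In case it helps frame the question, here is the kind of loop I imagine, based on the LastEvaluatedKey and ExclusiveStartKey fields described in the boto3 Scan documentation. I'm not sure this is the right approach; the credentials setup is omitted and the table name is the placeholder from above:

import boto3

dynamodb = boto3.resource('dynamodb', region_name=region)
table = dynamodb.Table('widgetsTableName')

items = []

# First page of the scan.
response = table.scan()
items.extend(response['Items'])

# Each response may include a LastEvaluatedKey; passing it back as
# ExclusiveStartKey should resume the scan where the previous call stopped.
# When LastEvaluatedKey is absent, the whole table has been read.
while 'LastEvaluatedKey' in response:
    response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
    items.extend(response['Items'])

print(f'Fetched {len(items)} items')

Is this the intended pattern for paging past the 1 MB limit, or is there a better way?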