A Python application that connects to Arlo cameras and creates timelapses of construction sites or other scenes using the Arlo API, AWS Lambda, and S3 for image storage.
- Features
- Development
- Project Structure
- Setup
- Quick Code Update & Redeployment Guide
- Downloading and Managing Images
- Creating Timelapses
- Acknowledgements & Inspiration
- License
- Uses the PyArlo library
- Connect to Arlo cameras using credentials with MFA support
- Docker-based deployment for reliable Lambda compatibility
- Capture snapshots from cameras
- Schedule automatic snapshots via AWS Lambda and EventBridge
- Store images in S3 for easy timelapse creation
This project was developed using Cursor IDE's agent mode with Claude 3.7 Sonnet.
For all API interactions with the Arlo cameras, we reference the PyArlo-Docs.mdc file, which contains documentation of the PyArlo library functions and their usage. This ensures consistent and correct usage of the Arlo API across the codebase.
This project contains two main Python files:
- `lambda_function.py`: The main function deployed to AWS Lambda that:
- Connects to Arlo using PyArlo with automated email-based 2FA (via IMAP)
- Captures a snapshot from your configured camera
- Uploads the snapshot to an S3 bucket with a timestamp
- Designed to run on a schedule via EventBridge
- `test_connect_arlo.py`: A simple test script for local development that:
- Tests basic connectivity to your Arlo account
- Uses manual console-based 2FA (prompts you to enter the code)
- Verifies the camera can be found and accessed
- Checks basic camera information (battery, signal strength)
- Requests a test snapshot to confirm camera operation
- Python 3.6+
- Arlo account credentials
- Gmail account (for automated 2FA)
- AWS account (for Lambda and S3 setup)
- Clone this repository:

  ```bash
  git clone https://github.com/niklasingvar/arlo-timelapse.git
  cd arlo-timelapse
  ```

- Create a virtual environment:

  ```bash
  python -m venv venv
  ```

- Activate the virtual environment:

  ```bash
  source venv/bin/activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- Create a `.env` file with your Arlo credentials:

  ```bash
  USER_NAME=your_arlo_email
  PASSWORD=your_arlo_password
  CAMERA_NAME=your_camera_name
  # S3 is only required for lambda_function.py (not test_connect_arlo.py)
  S3_BUCKET_NAME=my-arlo-timelapse-images
  # For automated 2FA via Gmail IMAP (required for lambda_function.py)
  TFA_USERNAME=your_gmail_address
  TFA_PASSWORD=your_gmail_app_password
  ```

  Note: The TFA credentials are only required for the full Lambda function test with automatic 2FA. The simple connectivity test will prompt you for the code manually.
This project provides two ways to test your setup locally:
- Test Arlo connectivity with manual 2FA verification:

  ```bash
  make test-arlo
  ```

  This runs the `test_connect_arlo.py` script, which will prompt you to enter a 2FA code from your email or phone. Use this first to verify your Arlo credentials work correctly.

- Test the full Lambda function locally (with automatic Gmail IMAP 2FA):

  ```bash
  make run
  ```

  This runs the `lambda_function.py` script, which uses automated IMAP-based 2FA and attempts to save a snapshot. This requires `TFA_USERNAME` and `TFA_PASSWORD` to be properly configured in your `.env` file.
Since Lambda functions run without user interaction, you need an automated way to handle Arlo's 2FA requirements. The PyArlo library supports automatic 2FA code retrieval via IMAP:
- IMAP in Gmail:
  - Gmail now has IMAP permanently enabled by default (as of January 2025)
  - No configuration is needed for IMAP access
- Create an App Password (if your Gmail account uses 2FA):
  - Go to your Google Account
  - Go to "Security" > "2-Step Verification"
  - Scroll down and click "App passwords"
  - Select "Mail" as the app and "Other" as the device (name it "ArloLambdaTimeLapse")
  - Click "Generate"
  - Copy the 16-character password that appears (you'll use this in Lambda)
- Update Your Lambda Function Code:
  - Modify `lambda_function.py` to use IMAP for 2FA:

    ```python
    # In lambda_function.py
    arlo = PyArlo(
        username=username,
        password=password,
        tfa_type="email",
        tfa_source="imap",  # Use IMAP to fetch codes from email
        tfa_host="imap.gmail.com",
        tfa_username=os.getenv("TFA_USERNAME"),  # Gmail username
        tfa_password=os.getenv("TFA_PASSWORD"),  # Gmail app password
    )
    ```
- Add Environment Variables to Lambda:
  - Go to AWS Lambda Console > Your Function > Configuration > Environment Variables
  - Add these new variables:
    - `TFA_USERNAME`: Your Gmail address (e.g., yourname@gmail.com)
    - `TFA_PASSWORD`: Your Gmail app password (the 16-character code)
- Update Your `.env` File (for local testing):

  ```bash
  USER_NAME=your_arlo_email
  PASSWORD=your_arlo_password
  CAMERA_NAME=your_camera_name
  S3_BUCKET_NAME=my-arlo-timelapse-images
  TFA_USERNAME=your_gmail_address
  TFA_PASSWORD=your_gmail_app_password
  ```

- Update Your Lambda Deployment:
  - Redeploy your Lambda function using the `build_and_deploy.sh` script
To automate snapshot collection for timelapses, you can deploy this script to AWS Lambda and schedule it to run 6 times per day.
First, create an IAM user to manage your AWS resources from your local machine:
- Go to AWS IAM Console
- Click "Users" then "Create user"
- Set a name (e.g., "arlo-timelapse-admin")
- Enable "Provide user access to the AWS Management Console" (optional)
- Under "Set permissions", attach the following policies:
- AmazonS3FullAccess
- AWSLambda_FullAccess
- CloudWatchLogsFullAccess
- Complete the user creation
- On the success page, save the Access key ID and Secret access key (or download the CSV)
- IMPORTANT: This is the only time AWS will show you the secret key
- Download the CSV file or copy both keys immediately
- Store these securely as they provide access to your AWS account
If you already have an IAM user and need to create new access keys:
- Go to AWS IAM Console
- Click on "Users" and select your user
- Go to "Security credentials" tab
- Under "Access keys", click "Create access key"
- Choose "Command Line Interface (CLI)" and follow the prompts
Configure AWS CLI with your credentials:
```bash
aws configure
```

Enter your Access Key ID, Secret Access Key, preferred region, and output format.
Output Format Options:

- `json` (default): output in JSON format
- `text`: output in plain text
- `table`: output in an ASCII table
- `yaml`: output in YAML format

For most users, the default `json` format is recommended.
When prompted for a region, choose one closest to your physical location for best performance:
- North America: `us-east-1` (Virginia), `us-east-2` (Ohio), `us-west-1` (California), `us-west-2` (Oregon)
- Europe: `eu-west-1` (Ireland), `eu-central-1` (Frankfurt), `eu-west-2` (London)
- Asia Pacific: `ap-southeast-1` (Singapore), `ap-northeast-1` (Tokyo), `ap-southeast-2` (Sydney)
Important: All your AWS resources (Lambda, S3, etc.) should be in the same region to avoid cross-region data transfer costs and latency.
Create an S3 bucket to store your snapshots:

```bash
aws s3 mb s3://my-arlo-timelapse-images
```

Or through the AWS Console:
- Go to AWS S3 Console
- Click "Create bucket"
- Name your bucket (e.g., `my-arlo-timelapse-images`)
- Choose your region and configure as needed
- Click "Create bucket"
You can prepare your Lambda deployment package in two ways:
This approach ensures binary compatibility with the Lambda environment, especially for libraries with compiled components.
- Create a `Dockerfile`:

  ```dockerfile
  FROM public.ecr.aws/lambda/python:3.12

  # Copy requirements file
  COPY requirements.txt .

  # Install dependencies with a specific platform tag to ensure compatibility
  RUN python -m pip install --platform=manylinux2014_x86_64 --target="${LAMBDA_TASK_ROOT}" --implementation=cp --only-binary=:all: --upgrade -r requirements.txt

  # Copy function code
  COPY lambda_function.py ${LAMBDA_TASK_ROOT}

  # Set the CMD to your handler
  CMD [ "lambda_function.lambda_handler" ]
  ```
- Create a deployment script (`build_and_deploy.sh`):

  ```bash
  #!/bin/bash
  set -e

  echo "Building Docker image..."
  docker build -t arlo-lambda .

  echo "Creating container to extract files..."
  docker create --name arlo-temp arlo-lambda:latest
  mkdir -p docker-lambda

  echo "Copying Lambda package files from container..."
  docker cp arlo-temp:/var/task/. docker-lambda/
  docker rm arlo-temp

  echo "Creating deployment package..."
  cd docker-lambda
  zip -r ../lambda_deployment_docker.zip .
  cd ..

  echo "Updating Lambda function..."
  aws lambda update-function-code \
    --function-name arlo-snapshot-function \
    --zip-file fileb://lambda_deployment_docker.zip

  echo "Deployment completed"
  ```
- Make the script executable and run it:

  ```bash
  chmod +x build_and_deploy.sh
  ./build_and_deploy.sh
  ```
This Docker-based approach ensures that native libraries and dependencies are compiled correctly for the Lambda environment, avoiding compatibility issues with libraries like cryptography.
Note: This is different from the IAM user "arlo-timelapse-admin" created earlier. The user is for your CLI access, while this role is what the Lambda service itself will use to access S3.
Create a role that gives Lambda permission to access S3:
- Go to IAM Roles
- Click "Create role"
- Select "AWS service" as the trusted entity and "Lambda" as the service
- Attach these policies:
- AWSLambdaBasicExecutionRole (for CloudWatch Logs)
- AmazonS3FullAccess (or create a custom policy for just your bucket)
- Name the role (e.g., "arlo-lambda-s3-role") and create it
Before creating the Lambda function, you need your AWS account ID:
- Go to the AWS Management Console
- Click on your username in the top-right corner
- Your 12-digit account ID is shown in the dropdown menu
- Alternatively, run this command:

  ```bash
  aws sts get-caller-identity
  ```
Create the Lambda function using AWS CLI:
NOTE: Replace YOUR_ACCOUNT_ID with your 12-digit AWS account ID and YOUR_REGION with your preferred region.
```bash
aws lambda create-function \
  --function-name arlo-snapshot-function \
  --runtime python3.9 \
  --handler lambda_function.lambda_handler \
  --role arn:aws:iam::YOUR_ACCOUNT_ID:role/arlo-lambda-s3-role \
  --zip-file fileb://lambda_deployment.zip \
  --timeout 30 \
  --region YOUR_REGION \
  --environment "Variables={USER_NAME=your_arlo_email,PASSWORD=your_arlo_password,CAMERA_NAME=your_camera_name,S3_BUCKET_NAME=my-arlo-timelapse-images}"
```

Or through the AWS Console:
- Go to AWS Lambda Console
- Click "Create function"
- Select "Author from scratch"
- Name your function (e.g., `arlo-snapshot-function`)
- Select a Python runtime (3.9 or newer)
- Under "Permissions," select "Use an existing role" and choose the role you created
- Click "Create function"
- On the function page, upload your ZIP file (`lambda_deployment.zip`)
In the Lambda function configuration:
- Go to "Configuration" > "Environment variables"
- Add the following variables:
  - `USER_NAME`: Your Arlo account email
  - `PASSWORD`: Your Arlo account password
  - `CAMERA_NAME`: Name of your Arlo camera
  - `S3_BUCKET_NAME`: The name of your S3 bucket
  - `TFA_USERNAME`: Your Gmail address used for receiving 2FA codes
  - `TFA_PASSWORD`: Your Gmail app password (16-character code)
Note: For the TFA variables, refer to the "Two-Factor Authentication (2FA)" section above for detailed setup instructions.
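A missing variable otherwise only surfaces as a confusing runtime error. One way to fail fast is a small guard at the top of the handler; this is a sketch assuming the variable names above (`load_config` is a hypothetical helper, not code from this repository):

```python
import os

REQUIRED_VARS = ["USER_NAME", "PASSWORD", "CAMERA_NAME",
                 "S3_BUCKET_NAME", "TFA_USERNAME", "TFA_PASSWORD"]

def load_config() -> dict:
    """Read required settings from the environment, failing fast if any are missing."""
    missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Calling this first thing in `lambda_handler` turns a mid-run authentication failure into a clear configuration error in CloudWatch.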
- Go to the EventBridge Console
- Select "EventBridge Schedule" (not EventBridge Rule)
- Click "Create schedule"
- Configure the schedule:
  - Name: `arlo-snapshot-schedule`
  - For the cron expression, enter the following in the separate fields:
    - Minutes: `0`
    - Hours: `8,10,12,14,16,18`
    - Day of month: `*`
    - Month: `*`
    - Day of the week: `?`
    - Year: `*`

    (This runs at 8am, 10am, 12pm, 2pm, 4pm, and 6pm)
  - Select your time zone from the dropdown
- Under Target details:
- Select "AWS Lambda function" as the target
- Select your "arlo-snapshot-function"
- Leave the default version/alias settings
- For execution role:
- Select "Create new role for this schedule"
- AWS will automatically create a role with the necessary permissions
- This role will have a trust relationship allowing EventBridge Scheduler to assume it
- It will also have permissions to invoke your Lambda function
Note: If you choose to use an existing role, it must have a trust relationship with EventBridge Scheduler (`scheduler.amazonaws.com`) and permissions to invoke your Lambda function.

- Review and click "Create schedule"
This will create a schedule that automatically triggers your Lambda function at the specified times.
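As a back-of-the-envelope check, six snapshots per day means six timelapse frames per day, so you can estimate the final video length for a given frame rate:

```python
def timelapse_seconds(days: int, snapshots_per_day: int = 6, fps: int = 10) -> float:
    """Estimate output duration: one video frame per snapshot."""
    return days * snapshots_per_day / fps

print(timelapse_seconds(30))  # 30 days -> 180 frames -> 18.0 seconds at 10 fps
```

If a month of footage feels too short, you can lower the frame rate or add more daily snapshot times to the schedule.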
If you prefer to create a custom role instead of using the automatically generated one:
- Go to the IAM Console
- Select "Roles" → "Create role"
- For trusted entity, select "AWS service" and choose "EventBridge Scheduler"
- Attach the following policies:
  - `AWSLambdaRole` (or a custom policy that allows `lambda:InvokeFunction` on your specific function)
- Name the role (e.g., `EventBridge-Scheduler-Arlo-Role`) and create it
- Use this role when setting up your schedule
- Create a test event in the Lambda console
- Click "Test" to verify your function works
- Check your S3 bucket for the uploaded image
Check CloudWatch logs to see if your function is working:

```bash
aws logs filter-log-events --log-group-name /aws/lambda/arlo-snapshot-function
```

Or visit the CloudWatch Logs console and select the log group for your function.
If you're returning to the project and need to update the Lambda function:
- ✅ Activate your virtual environment:

  ```bash
  source venv/bin/activate  # On macOS/Linux
  ```

- ✅ Make your code changes to `lambda_function.py`

- ✅ Test locally:

  ```bash
  make run
  ```

- ✅ Rebuild and deploy using Docker:

  ```bash
  ./build_and_deploy.sh
  ```

- ✅ Test the function in AWS Lambda console:
- Go to AWS Lambda Console > Functions > arlo-snapshot-function
- Click the "Test" tab at the top
- Create a new test event if needed (an empty JSON `{}` is fine)
- Click the "Test" button and wait for execution to complete
- Check the execution results for success (status code 200)
- ✅ Verify in CloudWatch logs:
- Go to CloudWatch Console > Log groups > /aws/lambda/arlo-snapshot-function
- Open the most recent log stream
- Look for success messages like "Successfully uploaded snapshot_*.jpg to S3"
- Check for any errors if the function failed
List all images in your bucket:

```bash
aws s3 ls s3://my-arlo-timelapse-images/
```

Download all images to your local machine:

```bash
aws s3 sync s3://my-arlo-timelapse-images/ ./timelapse-images/
```

Once you have collected images in your S3 bucket, you can create a timelapse:
First, install FFmpeg:

```bash
# macOS
brew install ffmpeg

# Ubuntu/Debian
sudo apt-get install ffmpeg
```

Then create the timelapse:

```bash
cd timelapse-images
ffmpeg -framerate 10 -pattern_type glob -i '*.jpg' -c:v libx264 -pix_fmt yuv420p timelapse.mp4
```
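If you'd rather drive FFmpeg from Python (for example, to script the download and render in one step), the same invocation can be assembled as an argument list and handed to `subprocess.run`. This is a sketch; `build_ffmpeg_cmd` is a hypothetical helper mirroring the flags above:

```python
import shlex

def build_ffmpeg_cmd(image_dir: str, output: str = "timelapse.mp4",
                     framerate: int = 10) -> list[str]:
    """Assemble the ffmpeg argument list; pass the result to subprocess.run()."""
    return ["ffmpeg",
            "-framerate", str(framerate),   # playback rate for the input images
            "-pattern_type", "glob",
            "-i", f"{image_dir}/*.jpg",     # every snapshot in the directory
            "-c:v", "libx264",              # H.264 output
            "-pix_fmt", "yuv420p",          # broad player compatibility
            output]

cmd = build_ffmpeg_cmd("timelapse-images")
print(shlex.join(cmd))
```

Passing a list (rather than a shell string) avoids quoting issues with the glob pattern, since ffmpeg expands it itself.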
This project was inspired by arlo-timelapse-lambda by Notalifeform, which provides a similar approach to creating timelapses with Arlo cameras. While no code was used from that project, it provided valuable insights into working with Arlo cameras and AWS Lambda integration.
Special thanks to the developers of the PyArlo library, which makes interfacing with Arlo cameras possible. The PyArlo-Docs.mdc file has been an invaluable reference for understanding and implementing the Arlo API functionality in this project.