When I saw that Enphase allows API access to the data generated by the solar panels, I had to do something with it. This guide describes how I set up the collection of data for display on the Solar Data page.
Because my Enphase plan limits API calls, I collect the data from my solar panels through the API twice a day via a scheduled Lambda function, which stores it in a DynamoDB table. The web page then gathers and displays the data from DynamoDB through an API Gateway connected to a Lambda function. The API Gateway is called from AngularJS to display the charts.
Tools used in this project
- AWS - Lambda, DynamoDB, CloudWatch, API Gateway, IAM
- Enphase Developer API
- Python 2.7
- AWS CLI, AWS SDK for Python & Boto3
- AngularJS and Angular-Chart
- Code from GitHub: https://github.com/smyleeface/smylee_solar_data
Setup DynamoDB Schema with Python and Boto3
- Determined and scripted the table creation and schema for DynamoDB. (This could have been done against a local DynamoDB, which I've done before, but I was too lazy to set it up.)
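As a sketch of what that table definition might look like (the key and table names here are assumptions for illustration; the real schema lives in dynamo_create_and_put.py on GitHub), using the 3 read units and 1 write unit of provisioned capacity mentioned in the pricing section below:

```python
# Illustrative DynamoDB table definition; the actual schema is in
# dynamo_create_and_put.py in the GitHub repo. Key names are assumptions.
table_definition = {
    "TableName": "solar_data",
    "KeySchema": [
        {"AttributeName": "data_type", "KeyType": "HASH"},  # e.g. "daily"
        {"AttributeName": "date", "KeyType": "RANGE"},      # e.g. "2016-05-01"
    ],
    "AttributeDefinitions": [
        {"AttributeName": "data_type", "AttributeType": "S"},
        {"AttributeName": "date", "AttributeType": "S"},
    ],
    "ProvisionedThroughput": {"ReadCapacityUnits": 3, "WriteCapacityUnits": 1},
}

# With boto3 installed and AWS credentials configured, the table could be
# created with:
#   import boto3
#   boto3.client("dynamodb").create_table(**table_definition)
```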
Getting the Solar Panel Data From Enphase
- Create an account on http://developer.enphase.com.
- Choose a plan (Watt is the free tier, with some catches).
- Create a new application.
- Open the new application and copy the API Key for later use.
- Copy the Authorization URL into your browser and allow access.
- After allowing access, copy the user ID it displays on the screen for later use.
- If you have an Enlighten profile, you will see the application in your list of authorized apps.
- Get the data from the Enphase API using the code on GitHub: https://github.com/smyleeface/smylee_solar_data/blob/master/dynamo_create_and_put.py. In the code, add the API key and user ID you copied earlier to the URL. You will also need the system ID, which can be found in your Enlighten profile, in the Enphase app, or on the Enphase device box.
- Store the data in DynamoDB.
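The two steps above can be sketched roughly as follows. The endpoint path and the item attribute names are assumptions on my part (the Enlighten API v2 takes the API key and user ID as query parameters); check the code on GitHub for the exact calls the project makes:

```python
# Sketch of building an Enphase Enlighten API request. The endpoint path
# and attribute names are illustrative; see dynamo_create_and_put.py in
# the repo for the real calls.
try:
    from urllib.parse import urlencode  # Python 3
except ImportError:
    from urllib import urlencode        # Python 2.7, as used in the project

API_KEY = "your-api-key"      # copied from developer.enphase.com
USER_ID = "your-user-id"      # shown after the authorization step
SYSTEM_ID = "your-system-id"  # from your Enlighten profile or the device box

params = urlencode({"key": API_KEY, "user_id": USER_ID})
url = "https://api.enphaseenergy.com/api/v2/systems/%s/summary?%s" % (SYSTEM_ID, params)

# The JSON response can then be reshaped into a DynamoDB item and stored
# with put_item, e.g. (attribute names illustrative):
#   boto3.client("dynamodb").put_item(
#       TableName="solar_data",
#       Item={"data_type": {"S": "daily"},
#             "date": {"S": "2016-05-01"},
#             "energy_today": {"N": str(summary["energy_today"])}})
```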
Add a Lambda Function to Store Data
- Once the data was being stored correctly, I copied the Python script to the Lambda function.
- I created a new IAM role that allows connection to DynamoDB. The default lambda_dynamo role that AWS provides allows CRUD on items in a DynamoDB table, but the script also needs to describe and create tables. Modify the lambda_dynamo role to include the DynamoDB actions "CreateTable" and "DescribeTable".
- Once that is saved, add an Event to run on a cron schedule at 11:55 AM and 11:55 PM PT daily.
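CloudWatch schedule expressions take six fields and are evaluated in UTC, so the Pacific times have to be converted by hand. Assuming standard time (UTC-8; during daylight saving the hours shift by one), 11:55 AM and 11:55 PM PT become 19:55 and 07:55 UTC:

```
cron(55 7,19 * * ? *)
```

The fields are Minutes, Hours, Day-of-month, Month, Day-of-week, and Year; CloudWatch requires either Day-of-month or Day-of-week to be `?`.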
Getting the Solar Panel Data from DynamoDB
- Use the code on GitHub: https://github.com/smyleeface/smylee_solar_data/blob/master/dynamo_get_data.py to pull all the data: daily, monthly, and current. To test in the terminal, you will need the AWS CLI configured with the access key and secret key of a user who has permission to run queries against DynamoDB.
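As a rough sketch of what such a query might look like (table and key names are assumptions; the real query logic is in dynamo_get_data.py), note that the HASH key must be an exact match, while `begins_with` works only on the RANGE key:

```python
# Illustrative query parameters for pulling one month of daily records.
# Table and key names are assumptions; see dynamo_get_data.py for the
# real query logic.
def build_month_query(data_type, month_prefix):
    """The HASH key only supports eq; begins_with can only be applied
    to the RANGE key."""
    return {
        "TableName": "solar_data",
        "KeyConditionExpression": "data_type = :t AND begins_with(#d, :m)",
        "ExpressionAttributeNames": {"#d": "date"},  # "date" is reserved-ish, alias it
        "ExpressionAttributeValues": {
            ":t": {"S": data_type},
            ":m": {"S": month_prefix},
        },
    }

query = build_month_query("daily", "2016-05")
# With boto3:  boto3.client("dynamodb").query(**query)
```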
Add a Lambda Function to Get Data
- Once the data was being pulled correctly, I copied the Python script to the Lambda function.
- Test the Lambda function. The different payloads are commented in the dynamodb_data.py file; just change the date.
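A hypothetical pair of test events might look like the following. The key names match the two API Gateway query strings used later in this guide; the values are illustrative, so check the commented payloads in the repo for the real shape:

```python
import json

# Hypothetical Lambda test payloads; the real ones are commented in the
# repository. Key names mirror the API Gateway query strings (filter,
# getType); values are illustrative.
daily_event = {"getType": "daily", "filter": "2016-05-01"}
monthly_event = {"getType": "monthly", "filter": "2016-05"}

# Paste the JSON form into the Lambda console's test event editor:
print(json.dumps(daily_event))
```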
Connect API Gateway to Lambda
- Add a new GET resource to an existing or new API Gateway. Choose the Lambda function that gets the data from DynamoDB.
- Under "Method Request", add two query strings: filter and getType.
- Under "Integration Request" > Body Mapping Templates, choose "When there are no templates defined" and add template code for application/json.
- Test the API Gateway. The different payloads are commented in the dynamodb_data.py file; just change the date.
- Deploy the API. Get the URL for later use in AngularJS.
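The body mapping template's job here is to pass the two query strings through to the Lambda event. A minimal application/json template, using API Gateway's `$input.params` accessor, might look like this (a sketch; the actual template is in the repo notes):

```
{
  "filter": "$input.params('filter')",
  "getType": "$input.params('getType')"
}
```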
Call API Gateway with AngularJS
- Download AngularJS, Angular-Chart, and Chart.js. Use the URL from the API Gateway to feed the data from DynamoDB into the Angular-Chart charts. You can find the code on GitHub: https://github.com/smyleeface/smylee_solar_data (solar.html and solar_angular.js).
- After some tinkering, I finally got the charts to display. More bars will be added on the fly as more data is added to the database. See the updated solar data page.
- Originally I was going to write a shell script to create the DynamoDB table, but Python is much easier to write. Going down that path, I found and used a really cool AWS CLI auto-complete tool called aws-shell: https://github.com/awslabs/aws-shell.
- You cannot filter (e.g. begins_with) on a DynamoDB HASH key; only eq is allowed.
- Timezones! Everything on AWS uses UTC, so you must account for your local timezone.
- Event schedules in CloudWatch have a specific way of writing cron.
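The timezone lesson above can be sketched with the standard library alone. This ignores daylight saving (during PDT the offset is UTC-7), so it is a simplification rather than what the project's code necessarily does:

```python
from datetime import datetime, timedelta

# AWS timestamps are UTC; to bucket records by local date you have to
# apply the Pacific offset yourself. UTC-8 is PST; daylight saving
# (UTC-7) is ignored in this sketch.
PT_OFFSET = timedelta(hours=-8)

utc_time = datetime(2016, 5, 2, 5, 30)    # 05:30 UTC on May 2
local_time = utc_time + PT_OFFSET         # 21:30 PT on May 1
print(local_time.strftime("%Y-%m-%d"))    # -> 2016-05-01
```

Without the offset, a reading taken late in the evening Pacific time would land under the following day's date.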
AWS Costs for creating this project
- AWS provides many services for free for the first 12 months after sign-up, and some beyond that. More information can be found on the AWS pricing pages.
- Cost should be less than $1.00 to create and use this project.
- Lambda pricing: you aren't charged until you've made 1 million requests and used over 400,000 GB-seconds (each billed separately).
- API Gateway pricing: rates differ by region, but all regions are billed per million calls received, plus the cost of data transfer out in gigabytes. You only pay for what you use.
- CloudWatch Events rule pricing: $1.00 per million custom events generated.
- DynamoDB pricing: DynamoDB estimates my database at $0.59/month, with 25 units of read capacity per month covered by the free tier. Pricing on their page states: write throughput is $0.0065 per hour for every 10 units of write capacity (this project uses 1 unit of write capacity, writing twice a day); read throughput is $0.0065 per hour for every 50 units of read capacity (3 units of read capacity per web page view).
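As a back-of-the-envelope check on those rates (ignoring the free tier, which actually covers this usage, and using a flat 30-day month, so it won't match DynamoDB's own estimator exactly):

```python
# Rough monthly throughput cost at the rates quoted above. Provisioned
# capacity is billed per hour whether or not it is used, so the 2x/day
# write schedule doesn't reduce the bill.
HOURS_PER_MONTH = 24 * 30

write_cost = 0.0065 / 10 * 1 * HOURS_PER_MONTH  # 1 write capacity unit
read_cost = 0.0065 / 50 * 3 * HOURS_PER_MONTH   # 3 read capacity units

print(round(write_cost + read_cost, 2))  # -> 0.75
```

That lands in the same ballpark as the $0.59/month estimate; either way, the free tier makes it effectively $0.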