I’ve decided to finally go back and revamp my original Hugo setup, described in the Hugo Site Auto Generated by Lambda post. The issue with that setup was that uploading multiple posts (markdown files) triggered the Lambda function once per file via S3 events, when I only needed it to run once after all the files were uploaded. My new plan is to publish the markdown files to GitHub and have a single Lambda function pull the files and put them into S3.
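A minimal sketch of that plan: one Lambda invocation (e.g. from a GitHub push webhook) downloads the whole repo archive and copies every markdown file into S3, so the function runs once per push instead of once per uploaded file. The tarball URL, bucket name, and function names here are placeholders, not details from the post.

```python
import io
import tarfile
import urllib.request

# Placeholder values -- substitute your own repo and bucket.
REPO_TARBALL_URL = "https://github.com/OWNER/REPO/archive/refs/heads/main.tar.gz"
BUCKET = "my-hugo-content-bucket"


def markdown_members(names):
    """Filter archive member names down to just the markdown files."""
    return [n for n in names if n.endswith(".md")]


def handler(event, context):
    """Hypothetical Lambda handler: fetch the repo archive once, then
    upload every markdown file it contains to S3 in a single run."""
    with urllib.request.urlopen(REPO_TARBALL_URL) as resp:
        data = resp.read()

    import boto3  # available in the Lambda runtime
    s3 = boto3.client("s3")

    with tarfile.open(fileobj=io.BytesIO(data), mode="r:gz") as tar:
        for name in markdown_members(tar.getnames()):
            member = tar.extractfile(name)
            if member is not None:
                s3.put_object(Bucket=BUCKET, Key=name, Body=member.read())
    return {"statusCode": 200}
```

Because the trigger is the push event rather than per-object S3 events, the "one invocation per file" problem goes away by construction.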
- When I saw that Enphase allows API access to the data generated by my solar panels, I had to do something with it. This guide describes how I set up collection of the data displayed on the Solar Data page. Because Enphase limits my API calls, a scheduled Lambda function pulls data from the solar panels through the API twice a day and stores it in a DynamoDB database.
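The scheduled collector can be sketched as below. The endpoint shape, field names, and table name are assumptions for illustration (Enphase's actual response fields and the author's table schema may differ); the one firm detail is that DynamoDB requires numbers as `Decimal`, so the conversion step is real.

```python
import json
from decimal import Decimal


def to_dynamo_item(summary, system_id):
    """Convert a solar summary response into a DynamoDB item.
    boto3's DynamoDB resource rejects floats, so numbers go through
    Decimal(str(...)). Field names here are hypothetical."""
    return {
        "system_id": str(system_id),
        "reading_date": summary["summary_date"],
        "energy_today_wh": Decimal(str(summary["energy_today"])),
        "energy_lifetime_wh": Decimal(str(summary["energy_lifetime"])),
    }


def handler(event, context):
    """Hypothetical Lambda handler run twice daily by a CloudWatch
    Events schedule: fetch one summary, store one item."""
    import urllib.request
    import boto3

    # Placeholder URL and credentials -- substitute real values.
    url = ("https://api.enphaseenergy.com/api/v2/systems/SYSTEM_ID/summary"
           "?key=API_KEY&user_id=USER_ID")
    with urllib.request.urlopen(url) as resp:
        summary = json.loads(resp.read())

    table = boto3.resource("dynamodb").Table("solar_data")  # hypothetical name
    table.put_item(Item=to_dynamo_item(summary, "SYSTEM_ID"))
```

Scheduling it at two fixed times a day keeps the call volume well under the API limit while still giving a daily production curve.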
- When a custom Slack command is triggered, the corresponding command changes the state of a GitHub pull request. This is a magic night project from the AWS User Group hosted by MindTouch. Tools used in this project: Slack (smylee.slack.com), GitHub (github.com/smyleeface/shiny-palm-tree), Amazon API Gateway, Lambda, IAM roles, encryption keys, and the AWS CLI. Prerequisites: GitHub Pull Request Alert on Slack (Magic Night Project - Part 1), and the AWS CLI installed with a user whose API key is authorized to run the kms command before continuing.
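One way the Slack-to-GitHub path could look, sketched under assumptions: Slack slash commands POST a form-encoded body whose `text` field carries the user's arguments, and GitHub's REST API changes a pull request's state with `PATCH /repos/{owner}/{repo}/pulls/{number}` and a `{"state": ...}` body. The command grammar (`close 12` / `open 12`) and the token handling are hypothetical; in the project the token would be decrypted via KMS.

```python
import json
from urllib.parse import parse_qs


def pr_patch_from_command(body):
    """Parse a Slack slash-command body such as 'text=close+12' into
    a PR number and a GitHub PATCH payload. Hypothetical grammar:
    '<action> <pr-number>' where action is 'close' or 'open'."""
    params = parse_qs(body)
    action, number = params["text"][0].split()
    state = {"close": "closed", "open": "open"}[action]
    return int(number), {"state": state}


def handler(event, context):
    """Hypothetical Lambda behind API Gateway for the slash command."""
    import urllib.request

    number, payload = pr_patch_from_command(event["body"])
    req = urllib.request.Request(
        "https://api.github.com/repos/smyleeface/shiny-palm-tree/pulls/%d" % number,
        data=json.dumps(payload).encode(),
        headers={"Authorization": "token GITHUB_TOKEN"},  # placeholder token
        method="PATCH",
    )
    urllib.request.urlopen(req)
    return {"statusCode": 200,
            "body": "PR %d set to %s" % (number, payload["state"])}
```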
- When a GitHub pull request is made, it triggers an alert on Slack. This is a magic night project from the AWS User Group hosted by MindTouch. Tools used in this project: Slack (smylee.slack.com), GitHub (github.com/smyleeface/shiny-palm-tree), Amazon API Gateway, and Lambda. Slack setup: set up a Slack team if you don’t have one, then create a new channel (or skip this step and use an existing channel) where the messages will appear.
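The webhook-to-alert hop can be sketched as follows: GitHub's `pull_request` webhook payload includes an `action`, a `pull_request.html_url`, `title`, and `user.login`, and a Slack incoming webhook accepts a JSON body with a `text` field (where `<url|label>` renders a link). The webhook URL below is a placeholder.

```python
import json


def slack_message(gh_event):
    """Build a Slack incoming-webhook payload from a GitHub
    pull_request webhook event."""
    pr = gh_event["pull_request"]
    return {"text": "Pull request %s: <%s|%s> by %s" % (
        gh_event["action"], pr["html_url"], pr["title"], pr["user"]["login"])}


def handler(event, context):
    """Hypothetical Lambda behind API Gateway receiving the GitHub
    webhook and forwarding a formatted message to Slack."""
    import urllib.request

    msg = slack_message(json.loads(event["body"]))
    req = urllib.request.Request(
        "https://hooks.slack.com/services/T000/B000/XXXX",  # placeholder
        data=json.dumps(msg).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
    return {"statusCode": 200}
```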
- So now that my site uses Hugo for static HTML generation, S3 to store the files, Lambda to trigger the Hugo update, and CloudFront to serve images, the next step is to upload files to S3 directly from the browser. Thanks to my buddy Robert, who got me started and helped me get the API Gateway set up. The HTML page uses an input of type file with Evaporate.js, configured with the aws_key, bucket, and aws_url.
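Evaporate.js does not hold the AWS secret key in the browser; it calls a signer endpoint (the API Gateway route here) with a `to_sign` string and expects back a base64-encoded HMAC-SHA1 signature (the AWS signature v2 style used by Evaporate 1.x). A sketch of that signer as a Lambda, with the secret-handling left as a placeholder:

```python
import base64
import hashlib
import hmac


def sign_request(string_to_sign, secret_key):
    """Return the base64 HMAC-SHA1 signature that an Evaporate.js v1
    signer endpoint responds with."""
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


def handler(event, context):
    """Hypothetical Lambda behind API Gateway: Evaporate calls the
    signerUrl with a ?to_sign=... query parameter."""
    to_sign = event["queryStringParameters"]["to_sign"]
    secret = "AWS_SECRET_KEY"  # placeholder: load from env var / KMS in practice
    return {"statusCode": 200, "body": sign_request(to_sign, secret)}
```

With this endpoint in place, the browser-side Evaporate config only ever sees the public aws_key, the bucket, and the signer's aws_url, never the secret.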