created Apr 23, 2019
Back in 2016, I began using AWS services Route 53, CloudFront, EC2, and S3.
This time, I only want to use S3 for a test website that uses a custom domain name. I may also want to enable HTTPS by using CloudFront.
My Kinglet S3 bucket, which does not use a custom domain name, can be accessed at these two URLs.
http://scaup.org/notessat-dec-2-2017.text - a Markdown file that gets rendered into HTML on the client side by Chrome and Firefox Markdown-viewing browser extensions. I prefer to use .md extensions, but apparently, S3 is not configured to send the browser text/plain for .md files. I had to add that setting to the Nginx config on my Digital Ocean Droplet.
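For reference, the Nginx setting on my Droplet looked roughly like this (a sketch; the exact location block and server layout depend on the install):

```nginx
# Serve .md files as plain text instead of the default
# application/octet-stream, which triggers a download prompt.
location ~ \.md$ {
    types { }                # clear the inherited MIME type map here
    default_type text/plain; # then fall back to text/plain
}
```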
The setup might seem "easy" to me, but to non-tech people, no way. I think that non-tech people would try, give up partway through, and stick with social media.
I set up a GitHub repo for the test scaup.org website, which is a bad domain name, since I created a CMS app called Scaup a few years ago. I'm not using scaup.org to demo my Scaup CMS. We have viewed a lot of Lesser Scaup ducks this spring along the lakeshore.
I can create Markdown or HTML posts on my local machine, using a text editor or a more sophisticated editor. Then I can backup the content by committing it to the GitHub repo. I don't remember if the content can be copied from GitHub to S3. I have been uploading the content to S3 via the AWS S3 website. I could also use the S3 command line utility on my local machine. But that's a lot of command line work, and most non-tech people won't do that.
Non-tech people may fire up a nice editor, create the content, and save it to a local folder. The Dropbox app installed on the local machine would copy the file to the user's Dropbox account. That's the backup process. Then the user could use the AWS S3 web interface to upload the file to the user's bucket that the domain name points to.
I think that it's important to have a copy on the local machine and then a copy on a service, such as GitHub or Dropbox, and then of course a copy on the S3 bucket that's used for the website. That's three locations. Maybe the local machine and the S3 bucket are enough.
- local editor to create or update and then save the post to the local machine.
- git commit
- s3 copy
- local editor saves file into folder that installed Dropbox app accesses and automatically copies file to Dropbox account
- AWS S3 web interface to upload file to bucket
Of course, if using GitHub somewhere in the pipeline, one could host the website at GitHub, using GitHub Pages, instead of hosting the site at S3. The author could create and update files at GitHub through GitHub's basic web browser editing interface. No need to store files locally or anywhere else. Well, it might still be nice to store a copy elsewhere. Storing the files in only one location is risky.
With the AWS console, I can modify the properties or meta info for individual S3 files, which permits me to set the content type to text/plain for files with .md extensions.
The process of using a local machine, Dropbox, and S3 is too messy when using IA Writer on a Chromebook. IA Writer creates and updates files at Dropbox, but I still need to manually download the file from Dropbox to the local Chromebook Downloads folder and then upload that file to the S3 bucket through the AWS web interface.
On my old Linux computer, I wanted to install the local Dropbox app, which would automatically store files from the local machine to the Dropbox server. I would then need only to upload the file manually to S3 through the AWS web interface.
But my Linux computer runs Python 2.7.5, and the Dropbox app needs Python 3.7. I think I need to upgrade Ubuntu on my computer in order to install Python 3.7.
It ain't easy to use old or limited tech to manage a personal website without using a CMS. On my old Linux computer, however, I could use an old version of the Chrome web browser to post to Facebook if I had a Facebook account. Or on the same computer, I could use an updated version of Firefox to post to Facebook if I desired.
Facebook offers a limited version of its service at https://mbasic.facebook.com, which would work with old web browsers, running on old computers.
I'm struggling to determine a way to create web pages and manage a simple static website without using a CMS, and I want the files to exist in three places: local computer, Dropbox or GitHub, and S3.
I could host the website at GitHub and use S3 as the backup storage, instead of Dropbox.
I could host the website at S3 and use Dropbox as the backup storage, and leave GitHub out of the mix.
I could host the website at GitHub and back up files at Dropbox and ignore S3.
The easiest for me is to use GitHub and S3, and I would use both from the command line.
Last week, I created a command-line utility in Lua that creates HTML pages from the Markdown files. The utility uses simple template files that I created, such as a header and a footer. I would like to rewrite it in Golang, since it would be easier to distribute an executable. The utility only needs to read the Markdown file and the templates and create the HTML page by relying on a Markdown module for the programming language.
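As a rough sketch of what that utility does, here's a minimal Python version. The header.html and footer.html template names are assumptions, and convert_markdown below is a toy that only handles # headings and paragraphs; the real utility would lean on a proper Markdown library instead:

```python
import re
import sys
from pathlib import Path

def convert_markdown(text):
    """Toy Markdown converter: handles '#' headings and plain paragraphs.
    A real utility would call a full Markdown library here instead."""
    blocks = []
    for chunk in re.split(r"\n\s*\n", text.strip()):
        m = re.match(r"(#{1,6})\s+(.*)", chunk)
        if m:
            level = len(m.group(1))
            blocks.append(f"<h{level}>{m.group(2).strip()}</h{level}>")
        else:
            # Collapse internal line breaks into one paragraph.
            blocks.append("<p>" + " ".join(chunk.split()) + "</p>")
    return "\n".join(blocks)

def build_page(md_text, header, footer):
    """Wrap the converted Markdown in the header and footer templates."""
    return header + convert_markdown(md_text) + footer

if __name__ == "__main__" and len(sys.argv) > 1:
    md_path = Path(sys.argv[1])
    header = Path("header.html").read_text()
    footer = Path("footer.html").read_text()
    md_path.with_suffix(".html").write_text(
        build_page(md_path.read_text(), header, footer))
```

Running something like `python3 md2html.py post.md` would then write post.html next to the source file.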
- edit file with Vim or some other text editor and type in Markdown and maybe HTML too when needed.
- use my utility to convert the Markdown into an HTML page.
- use git to commit the file to my GitHub repo.
- use the s3cmd command to copy the file to the S3 bucket.
A hook may exist that fires automatically when a file is committed to the GitHub repo. This hook would copy the file to the S3 bucket, saving me a command-line step.
Connecting GitHub to S3 for automatic deployment seems a bit complicated. It might be easier to run git and s3cmd separately.
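For the simpler local route, a git post-commit hook could run the s3cmd step for me. A sketch, assuming s3cmd is already configured; the bucket name is a placeholder:

```python
#!/usr/bin/env python3
# Sketch of a .git/hooks/post-commit hook that copies the files
# changed by the latest commit to an S3 bucket via s3cmd.
import subprocess

BUCKET = "example-bucket"  # placeholder bucket name

def changed_files():
    """List the files touched by the most recent commit."""
    out = subprocess.run(
        ["git", "diff-tree", "--no-commit-id", "--name-only", "-r", "HEAD"],
        capture_output=True, text=True, check=True)
    return [line for line in out.stdout.splitlines() if line]

def s3_put_command(path, bucket=BUCKET):
    """Build the s3cmd invocation that uploads one file."""
    return ["s3cmd", "put", path, f"s3://{bucket}/{path}"]

# The hook body would then run:
#   for path in changed_files():
#       subprocess.run(s3_put_command(path), check=True)
```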
It's far easier to use a social media silo or a CMS-hosted solution, such as Blogger or WordPress without domain name mapping.
If I used only Dropbox and GitHub, I would still need another service to buy and manage the domain name. At least when using AWS S3 to host the website or to back up the files, I can also use AWS Route 53 to buy and manage the domain name.
I have experience using AWS, S3, and GitHub. I have hardly any experience with Dropbox. And right now, I can use my old Linux computer and my Chromebook with S3 and GitHub.
I can do this from the command line, but the goal would be to manage a static website by using a local text editor app and a web browser and never use the command line.
Baby steps. I need to figure out the simplest way to do this from the command line first. The hard part of the editor-and-browser-only approach is this: how does the Markdown file get converted to HTML?
I think that using GitHub Pages to host the website is the best option. I could create and update Jekyll Markdown pages by using GitHub's simple web editing interface, or I could connect the nicer web editing experience hosted at Prose.io to my GitHub account.
Either way, the creating and editing are done through the web browser. GitHub Pages/Jekyll converts the Markdown files to HTML pages.
Then I could use AWS to buy and manage the domain name via Route 53, and I could use AWS S3 to host the files as a backup.
I would use the web browser to do the following:
- create and edit GitHub Pages files
- save the Markdown pages to the local machine
- upload the Markdown files to the S3 bucket
I could use any computer. No need for a local install of git, Dropbox, or s3cmd.
I could still use a local editor if I desired and then copy and paste the markup into GitHub's or Prose.io's web editor.
GitHub Pages uses Jekyll, which is a static site generator. Instead of running a command locally to create HTML pages and then uploading or committing those pages to some cloud server, I can create and update markup files on GitHub's servers directly through the web browser.
http://www.perchwire.com was a mirror of http://babyutoledo.com. I created babyutoledo.com in 2015, using my Grebe CMS app. I still manage the website, but I have not posted updates to perchwire.com in a couple years.
I created perchwire.com to test using GitHub Pages, no local git install, and Prose.io.
Here is the GitHub Pages repo:
One issue with using GitHub Pages is enabling TLS/SSL for the website. I think that CloudFlare can be used, but that is yet another service. It's not simple. It's not.
Here's the raw version of the .md file that I used for saving to my local machine.
https://s3.amazonaws.com/kinglet/2016-09-30-baby-university-budget.md - during the upload process, I set the permission for the file to be viewed publicly by all. After it uploaded, I clicked "properties" for the file and modified its "Content-Type", changing it to text/plain.