Google Cloud Storage for static website

Started by proofread, Oct 16, 2022, 08:17 AM


proofread (Topic starter)

Consider the task of hosting a static website, for example a Leaflet map with pre-calculated data on it, or a static 3D model. For these purposes you can use static file hosting on Google Cloud Storage. In addition, this method makes it very easy to restrict access to the website in the Google Cloud web interface by specifying the email addresses of users who are allowed access.
Thanks to Google's CDN and file caching, you don't have to worry about handling heavy load, and files can be added or deleted with the gsutil console utility or in the Google Cloud web interface.
There is also no need to deal with obtaining and renewing SSL certificates or creating domain names for them. In just a few console commands you get a scalable, easily maintained solution with data stored in the Google cloud and a guarantee of data protection.



Creating a site with limited access

Below are the commands needed to create and configure web access for the bucket bucketname:

# create new bucket
$ gsutil mb gs://bucketname
# allow fine-grained access control for the bucket
$ gsutil uniformbucketlevelaccess set off gs://bucketname
# prepare OAuth wrapper
$ cat redirect.html
<html>
  <head>
    <meta http-equiv="Refresh" content="0; url=https://storage.cloud.google.com/bucketname/index.html">
  </head>
  <body>Redirecting...</body>
</html>
# copy the OAuth wrapper to the bucket
$ gsutil -m cp redirect.html gs://bucketname/
# make the OAuth wrapper publicly available
$ gsutil iam ch allUsers:legacyObjectReader gs://bucketname/redirect.html
# set up the website entry point for the bucket
$ gsutil web set -m redirect.html gs://bucketname/
# copy your data to the bucket
$ gsutil -m cp -r index.html gs://bucketname/
# prepare CORS configuration
$ cat cors.json
[
    {
      "origin": ["*"],
      "responseHeader": ["Content-Type"],
      "method": ["GET"],
      "maxAgeSeconds": 3600
    }
]
# set the CORS configuration for the website
$ gsutil cors set cors.json gs://bucketname
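The redirect.html wrapper above is just a meta-refresh page. As a sketch, a small Python helper (the function name is hypothetical) can generate one for any bucket, so you don't have to edit the URL by hand:

```python
def redirect_html(bucket, entry="index.html"):
    """Build the meta-refresh OAuth wrapper pointing at the
    authenticated storage.cloud.google.com endpoint."""
    url = f"https://storage.cloud.google.com/{bucket}/{entry}"
    return (
        "<html>\n"
        "  <head>\n"
        f'    <meta http-equiv="Refresh" content="0; url={url}">\n'
        "  </head>\n"
        "  <body>Redirecting...</body>\n"
        "</html>\n"
    )

print(redirect_html("bucketname"))
```

Write the result to redirect.html and copy it to the bucket with gsutil as shown above.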


Attention: the method described in the Google documentation does not work if you try to set access rights on the file redirect.html.
An error occurs in the Google Cloud web interface when granting the allUsers user ObjectViewer rights, and likewise it is not possible to set allUsers:ObjectViewer rights via gsutil. The trick is to use the old version of the API and grant allUsers the legacyObjectReader role.

Now, when the link https://storage.googleapis.com/bucketname/redirect.html is opened in a browser, it redirects the user to the Google account login page and, after successful authorization, returns to the index.html page of the created static site. Links to script and style files must be absolute, in the form https://storage.cloud.google.com/bucketname/filename or https://www.googleapis.com/download/storage/v1/b/bucketname/o/filename?alt=media .
In the latter case, slashes in subdirectory paths must be percent-encoded, for example https://www.googleapis.com/download/storage/v1/b/bucketname/o/dirname%2Ffilename?alt=media
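To avoid hand-encoding those slashes, a helper like this (hypothetical name, standard library only) builds the JSON API media link for any object path:

```python
from urllib.parse import quote

def media_url(bucket, object_path):
    # Percent-encode the whole object path, including '/' -> %2F,
    # as required by the JSON API object URL scheme
    encoded = quote(object_path, safe="")
    return (f"https://www.googleapis.com/download/storage/v1"
            f"/b/{bucket}/o/{encoded}?alt=media")

print(media_url("bucketname", "dirname/filename"))
# -> https://www.googleapis.com/download/storage/v1/b/bucketname/o/dirname%2Ffilename?alt=media
```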

Site access control

Above, we created a site with restricted access. You can add new users both in the Google Cloud web interface and using the console command:

gsutil iam ch user:USER_EMAIL:objectViewer gs://BUCKET_NAME

To turn our web resource into a public one, it will be enough to allow all users access to all files:

gsutil iam ch allUsers:legacyObjectReader gs://BUCKET_NAME

After executing this command, any user will be able to access the website at https://storage.googleapis.com/bucketname/index.html . If you want to shorten this link to https://storage.googleapis.com/bucketname/ you can use the command

# setup site entry point for the bucket
$ gsutil web set -m index.html gs://bucketname/

You can also create a public site right away, without configuring CORS and OAuth.

Custom Domain

It is possible to use custom domains: in the DNS settings, specify c.storage.googleapis.com as a CNAME for the bucket-named host on your custom domain. If this is not done in Google DNS, browsers will rightly complain about the website's mismatched SSL certificate when such an address is opened.
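In BIND zone-file notation, the record would look roughly like this (static.example.com is a placeholder for your own verified host; the bucket must be named after it):

```
; point the custom host at Google Cloud Storage
static.example.com.  3600  IN  CNAME  c.storage.googleapis.com.
```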

Conclusion

Fortunately, in today's world of unpredictable loads, huge files and SSL certificates, there are still simple ways to host static websites.
There are other options, including GitHub Pages, which lets you create a site from a GitHub repository; see, as an example, Geological models in virtual/augmented reality (VR/AR). Amazon S3 can also be turned into a static site (I used to host similar projects there, but stopped because of its painfully inconvenient handling of regions).

jckdear

Cloud storage implies that both the architecture and the hardware are provided as a service you pay for.
There are several options for interacting with the cloud, for example an integrated approach such as VSaaS (Video Surveillance as a Service): in addition to the cost of the video cameras, you do not bear the costs of video processing, analytics and storage, but buy this package from a provider. Another option is backing up the video archive from a local system to the cloud.
The advantage is that you have round-the-clock access to your system from anywhere in the world. The most obvious consumer of cloud systems is probably urban infrastructure, where video cameras installed across a huge area are nevertheless controlled from one place and available for viewing online.

MOHAN

Before we start on the step-by-step guide to configuring Google Cloud, we must first make some preparations, as Google Cloud requires more credentials than similar services. Ensure that you already have a working Google Cloud account (a payment option must have been added) and a domain whose ownership you have confirmed in the Search Console. A custom domain is necessary to take advantage of the website functionality in Google Cloud Storage (that is, working index.html redirects that send browsers from URLs to the correct directories).

When you have met all of these requirements, we can move on to creating and configuring your bucket to work with Publii. Follow these steps to set up Google Cloud:

Start off by opening your Google Cloud console at https://console.cloud.google.com/.
Make sure that your website project is selected in the drop-down list in the toolbar at the top of the screen, or create a new one if necessary.
Select Project

Now click on the hamburger menu on the left of the toolbar, and select the Storage option.
Google Cloud Storage

The storage screen should default to the Browser tab; on this page, click on the Enable billing button; this will allow your account to be charged for any usage over the complimentary amounts.
Enable billing

With billing enabled the Browser tab will change to display some new options; click on the Create bucket button to get started.
Creating Google Cloud bucket

You'll now need to enter some details for your bucket. For the bucket name, use your previously-verified domain name. For the Default storage class option, choose either Regional or Multi-regional depending on whether you expect your visitors to be from across the world or just one particular region. Finally, click the Create button to finish creating the bucket.
Storage class multi-regional

With the bucket created we now need to create a JSON key file that Publii can use to access the bucket in order to upload or update your website files. In the Google Cloud console, click on the hamburger menu and choose the IAM & Admin → Service accounts option.
IAM and Admin

Click the Create service account button and a form will appear; in the Service account name option enter a name of your choice (it will not affect functionality). Set the Role option to Storage Admin, check the Furnish a new private key checkbox, then click the Create button.
Service account management

Creating service account

The JSON file containing the key will be downloaded; make sure to keep it somewhere safe because it will not be possible to download it again later.
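The downloaded file is a standard service-account key in JSON format. As a quick sanity check (the values below are placeholders mimicking the standard key layout, not a real key), you can verify with the standard library that it contains the fields deploy tools rely on:

```python
import json

# Placeholder content standing in for a downloaded key file;
# in practice use: key = json.load(open("path/to/key.json"))
sample_key = json.loads("""
{
  "type": "service_account",
  "project_id": "my-website-project",
  "private_key": "-----BEGIN PRIVATE KEY-----\\n...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "publii-deploy@my-website-project.iam.gserviceaccount.com"
}
""")

# A valid key file identifies itself and names the account it belongs to
assert sample_key["type"] == "service_account"
print(sample_key["client_email"])
```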
Service account and key created

To enable communication between Publii and Google Cloud we need to enable the Cloud Storage JSON API service; start by clicking on the hamburger menu and then on the API Manager option.
API Manager

In the API Manager screen, click on the Enable API link near the top of the page; you'll be taken to a list of APIs that you can enable, plus a search bar for ones not already visible.
Enable API

To find the API we need, Cloud Storage JSON API, click on the More link at the bottom of the list of Google Cloud APIs. Once you find it, click on it.
Google Storage JSON API

The settings screen for the API will load, but we don't need to change anything; just click the Enable button to the right of the name at the top of the page; with this done, configuration is complete.
Enable Google Cloud storage JSON API

Google Cloud has now been set up, so we just need to open Publii and configure the server settings to make sure it can connect and upload files to the bucket.

arthyk

Such cloud storage is certainly interesting for employees of various enterprises and/or ordinary users who need fast access to various kinds of databases and files. I find the platform especially interesting for accessing audio and video content in real time. Users may only be concerned about whether the declared access speeds hold up in different geographic regions; the owner of such 'hosting' should probably test the service in the target locations in advance. 8)

sigma-sem

Unlike AWS CloudFront, Cloud CDN is just a cache and requires Google's premium network tier. Although it is something you enable to improve performance, it requires a load balancer in front of it to serve the content, and that is where most of the configuration lives.
This means you will always pay at least $15 per month for the load balancer, plus all applicable data transfer fees. That makes a "free S3-style website" impossible on GCP, although Firebase Hosting can do the same job.