I tried the Serverless Framework with Python + AWS Lambda.
Prerequisites
Node.js and npm:
$ node -v
v10.19.0
$ npm -v
7.5.2
Install
I created my Serverless account with Google SSO:
sudo npm install -g serverless
Hello world
Create a project:
$ serverless
Serverless: No project detected. Do you want to create a new one? Yes
Serverless: What do you want to make? AWS Python
Serverless: What do you want to call this project?
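The generated project is tiny. As a rough sketch (exact contents vary with the template version), handler.py looks something like this:

# handler.py -- minimal Lambda handler, roughly what the aws-python template generates
import json

def hello(event, context):
    body = {"message": "Hello from Serverless!", "input": event}
    # Lambda proxy integrations expect a statusCode and a JSON-string body
    return {"statusCode": 200, "body": json.dumps(body)}

serverless deploy packages and deploys the function, serverless invoke -f hello calls it (the template's function is named hello), and serverless remove tears the stack down.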
Motivation
If you use multiple AWS accounts in your work environment, I highly recommend configuring aws-cli profiles so that you can easily switch your aws-cli account.
The only thing you need to do when you want to change your aws-cli environment is to add the --profile option. Very simple.
How to configure
https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-profiles.html
Check under ~/.aws/. You can configure multiple profiles there.
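For illustration only (the "work" profile name and the regions are made up), the two files could look like this:

~/.aws/credentials:
[default]
aws_access_key_id = ...
aws_secret_access_key = ...
[work]
aws_access_key_id = ...
aws_secret_access_key = ...

~/.aws/config:
[default]
region = eu-central-1
[profile work]
region = us-east-1

After that, for example, aws s3 ls --profile work runs against the "work" account.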
Prerequisite
I installed Docker because I wanted to use the Docker executor for better isolation of environments.
Install GitLab Runner on the server
Follow the official manual.
https://docs.gitlab.com/runner/install/linux-repository.html
On Ubuntu
From the repo (recommended):
curl -L https://packages.gitlab.com/install/repositories/runner/gitlab-runner/script.deb.sh | sudo bash
export GITLAB_RUNNER_DISABLE_SKEL=true; sudo -E apt install gitlab-runner
Check.
$ sudo gitlab-runner status
Runtime platform arch=amd64 os=linux pid=3163 revision=2ebc4dc4 version=13.9.0
gitlab-runner: Service is running!
$ sudo gitlab-runner list
Runtime platform arch=amd64 os=linux pid=3172 revision=2ebc4dc4 version=13.
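Registration is a separate step. A sketch using the Docker executor (the URL and registration token are placeholders, taken from your GitLab project's CI/CD runner settings):

sudo gitlab-runner register \
  --non-interactive \
  --url https://gitlab.example.com/ \
  --registration-token YOUR_TOKEN \
  --executor docker \
  --docker-image alpine:latest \
  --description "docker-runner"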
Reference
I found a good slide deck that contains helpful figures for understanding AWS networking.
https://de.slideshare.net/AmazonWebServicesLATAM/aws-vpc-fundamentals-webinar
Physical location
From slide 10/58:
A Region consists of multiple AZs (Availability Zones): AZ-a, AZ-b, AZ-c, etc.
Each AZ (AZ-x, x = a, b, c) consists of data centers.
The latency within a Region is ~2 ms.
Logical network
VPC: A private network, like 172.31.0.0/16, spanning the AZs (AZ-a, AZ-b, AZ-c).
VPC subnet: Each AZ-x is assigned a subnet carved out of the VPC range (e.g. one /20 per AZ).
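To make the carving concrete, here is a small Python sketch. The /20 size and the AZ names are assumptions for illustration (they mirror how the default VPC allocates one subnet per AZ):

import ipaddress

# Split the VPC CIDR into /20 blocks and hand one to each AZ (sizes are illustrative)
vpc = ipaddress.ip_network("172.31.0.0/16")
azs = ["eu-central-1a", "eu-central-1b", "eu-central-1c"]
for az, subnet in zip(azs, vpc.subnets(new_prefix=20)):
    print(az, subnet)
# eu-central-1a 172.31.0.0/20
# eu-central-1b 172.31.16.0/20
# eu-central-1c 172.31.32.0/20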
How to think about the Linux volume system
Here is a comprehensive image from Wikipedia.
From Wikipedia: There are several physical volumes (PVs) in a server. Each PV is divided into physical partitions (PPs). Linux groups the physical volumes into a volume group (VG). Out of a VG, we can allocate logical volumes (LVs). On an LV, we create a file system (FS). From the Linux side, we mount the FS on a directory.
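As a command-line sketch of that chain (device names, sizes, and the mount point are placeholders):

sudo pvcreate /dev/sdb1 /dev/sdc1          # physical volumes
sudo vgcreate vg0 /dev/sdb1 /dev/sdc1      # group the PVs into a volume group
sudo lvcreate -L 20G -n lv_data vg0        # allocate a logical volume from the VG
sudo mkfs.ext4 /dev/vg0/lv_data            # put a file system on the LV
sudo mkdir -p /mnt/data
sudo mount /dev/vg0/lv_data /mnt/data      # mount the FS on a directory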
First thing you should decide
We should decide between a "User pool" and an "Identity pool".
Here is the official blog post about the differences.
https://aws.amazon.com/de/premiumsupport/knowledge-center/cognito-user-pools-identity-pools/
In a nutshell, User pools are for authentication (identity verification), and Identity pools are for authorization (access control).
I’ll try a User pool.
Future scope: integrate with AppSync.
Tutorial 1: Create a User pool
I followed the link below. Very easy.
https://docs.aws.amazon.com/cognito/latest/developerguide/tutorial-create-user-pool.html
Choose Manage User Pools.
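Once the pool and an app client exist, you can exercise it from Python with boto3. A sketch, assuming the app client allows the USER_PASSWORD_AUTH flow (the client ID, region, and user credentials below are placeholders):

import boto3

CLIENT_ID = "your-app-client-id"  # placeholder: the User pool's app client ID
client = boto3.client("cognito-idp", region_name="eu-central-1")

# Register a user; Cognito then sends a confirmation code by email or SMS
client.sign_up(
    ClientId=CLIENT_ID,
    Username="alice@example.com",
    Password="S3cure-Passw0rd!",
)

# After the user is confirmed, authenticate and receive JWT tokens
resp = client.initiate_auth(
    ClientId=CLIENT_ID,
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={"USERNAME": "alice@example.com", "PASSWORD": "S3cure-Passw0rd!"},
)
print(resp["AuthenticationResult"]["IdToken"][:30])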
Tutorial
The URL I followed:
https://developer.mozilla.org/en-US/docs/WebAssembly/Rust_to_wasm
Prerequisites
You should install npm beforehand. To compile wasm-pack, install gcc with apt install -y build-essential. On Ubuntu, also apt install -y libssl-dev pkg-config.
Download wasm-pack
To build the package, we need an additional tool, wasm-pack. It compiles the code to WebAssembly and produces the right packaging for npm.
cargo install wasm-pack
Write the code
cargo new --lib hello-wasm
cd hello-wasm
Edit src/lib.rs.
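According to the MDN tutorial above, src/lib.rs ends up roughly like this: a greet function exported to JavaScript through wasm-bindgen (Cargo.toml additionally needs the wasm-bindgen dependency and crate-type = ["cdylib"]; see the tutorial for the exact setup):

use wasm_bindgen::prelude::*;

// Import the browser's alert() so Rust can call it
#[wasm_bindgen]
extern "C" {
    fn alert(s: &str);
}

// Export greet() so JavaScript can call it
#[wasm_bindgen]
pub fn greet(name: &str) {
    alert(&format!("Hello, {}!", name));
}

wasm-pack build then compiles the crate and produces the npm package.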
Getting started
Set up
Go to the AppSync page. Create API. Getting Started.
Customize your API or import from Amazon DynamoDB -> Create with wizard. Click Start.
Create model
Model Name: Atlex00Model
Configure model fields:
Name: uid, Type: ID, Required
Name: first_name, Type: String
Name: last_name, Type: String
Name: gender, Type: Int
Name: age, Type: Int
Name: email, Type: Email
Configure model table (optional)
Table Name: Atlex00ModelTable
Primary Key: uid, Sort key: first_name
Create resources
API configuration
API name: atlex00AppSync
Your API is almost ready… Updating the schema.
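For orientation, the type generated for this model should look roughly like the following (the AWSEmail scalar and the exact shape are my assumption; check the generated schema in the console):

type Atlex00Model {
  uid: ID!
  first_name: String
  last_name: String
  gender: Int
  age: Int
  email: AWSEmail
}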
Original source
I followed the link below.
https://hatemtayeb2.medium.com/hello-graphql-a-practical-guide-a2f7f9f70ab4
Install the library
pip install -U graphene
Use it
schema.py
import graphene
import json

class Query(graphene.ObjectType):
    hello = graphene.String()

    def resolve_hello(self, info):
        return "world"

schema = graphene.Schema(query=Query)
result = schema.execute(
    '''
    {
      hello
    }
    '''
)
data = dict(result.data.items())
print(json.dumps(data, indent=2))

Try the code.
$ python schema.py
{
  "hello": "world"
}
GraphQL concepts
Every GraphQL implementation needs a Query class that contains some fields, and each field (almost always) has a resolver function that returns the data for that field.
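To show the field/resolver pairing with an argument, here is a second sketch along the lines of the graphene documentation (the field and argument names are my own):

import graphene

class Query(graphene.ObjectType):
    # A field that takes an argument; graphene passes it to the resolver as a keyword
    greeting = graphene.String(argument=graphene.String(default_value="world"))

    def resolve_greeting(self, info, argument):
        return f"Hello {argument}"

schema = graphene.Schema(query=Query)
print(schema.execute('{ greeting(argument: "GraphQL") }').data)
# {'greeting': 'Hello GraphQL'}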