
Commit 615366c

Merge pull request #43 from RisingStack/master

feat: replace api key with personal access token

2 parents d9413be + 1ca39ef

7 files changed (+114 −20 lines)

Dockerfile

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,6 +7,6 @@ COPY . .
 
 RUN npm i --production
 
-EXPOSE 3001
+EXPOSE 8080
 
 CMD ["node", "."]
```

README.md

Lines changed: 78 additions & 10 deletions

````diff
@@ -35,23 +35,91 @@ $ npm start
 
 ## For some basic source code explanation see the [wiki page](https://github.com/nditech/CyberSim-Backend/wiki)
 
-## AWS environment:
+# CyberSim-Backend Deployment Guide
 
-### The AWS environment supports both continuous integration and continuous deployment. The environment is made of the following components:
+The CyberSim Game comprises two distinct applications: a Node.js-based backend API and a React-based frontend UI. This guide specifically covers the deployment process for the Node.js-based backend API. For instructions on deploying the frontend application, please refer to the [CyberSim-UI README](https://github.com/nditech/CyberSim-UI#readme).
 
-- **Github repository (nditech/CyberSim-Backend)**: Each change on the local repositories are pushed to a new branch in the Github remote. Once these changes are reviewed, they are merged into the `master` branch.
+1. [Set up the Elastic Beanstalk environment](#elastic-beanstalk-eg-ndicybersimgame-env)
+2. [Set up the CodePipeline](#codepipeline-eg-ndicybersim-backend)
 
-- **CodePipeline (ndi-cybersim-backend-staging, ndi-cybersim-backend-prod)**: A CodePipeline project is created for both staging and production environments. For the staging environment a webhook is registered on Github, so each change on the `master` branch will trigger and automatic build on AWS. For the production environment no webhook is registered so changes on the `master` branch will require a manual release in CodePipeline.
+## Environment Component Naming Convention
 
-- **CodeBuild (ndi-cybersim-backend)**: A single CodeBuild project is created to build and test both staging and production changes. After a change is triggered on CodePiepeline (manually or automatically), the source code is transferred to CodeBuild and the steps defined in the `buildspec.yml` file are executed. These steps include the installation of node, npm packages and Postgres on a virtual machine and the execution of predefined database related tests. Once these tests are run, CodePiepeline begins the deployment.
+The environment component name follows this format: `<ACCOUNT_ALIAS>@<COMPONENT_NAME>`.
 
-- **Elastic Beanstalk (NdiCybersimBackend-staging, NdiCybersimBackend-prod)**: In Elastic Beanstalk (EB) a different environment is created for both staging and production. In both environments, the node application is running inside a docker container so an existing Dockerfile inside the source code is necessary for EB. The docker image is created using the Dockerfile in the root. Once the deployment is compleated, the API will be live. For EB the following environment variables must be set (These variables can be set seberatly for each EB environment under the Configuration/Software Tab of the AWS Console.):
+## GitHub Repository (nditech/CyberSim-UI)
 
-  - **PORT**: The PORT must match the port exposed in the docker container which currently is **3001**.
-  - **NODE_ENV**: Must be either `production`, `development` or `test`. If the given value is `test` the server will reset the Postgres Database on each restart.
-  - **DB_URL**: The connection string for the Postgres Database which is the following: postgres://<USERNAME>:<PASSWORD>@<HOST>:<PORT>/<DB_NAME>
+All local repository changes are pushed to new branches in the GitHub remote repository. These changes are reviewed and merged into the `master` branch.
 
-- **RDS (aa1p6s0h1so0mf9)**: A Postgres Database is created to store the data for both staging and production environments The default name of the RDS database is **ebdb**. This DB is only available for the EC2 instance running inside the EB environment. In order to connect to the DB from a different client, port forwarding must be configured. In order to create an ssh tunnel and forward the traffic from the DB to locahost run to following commands:
+## CodePipeline (e.g., ndi@Cybersim-backend)
+
+A separate CodePipeline project is created for each production environment. The pipeline consists of 2 stages:
+
+### SOURCE
+
+1. Set the _Source provider_ to "GitHub (Version 2)" and connect to the following repository: [nditech/CyberSim-Backend](https://github.com/nditech/CyberSim-Backend). Branch name should be `master`.
+2. Keep the _Start the pipeline on source code change_ option checked to trigger automatic builds on AWS whenever changes are pushed to the `master` branch. (Troubleshooting: manually trigger a release in CodePipeline.)
+3. Leave the _Output artifact format_ on the default "CodePipeline default".
+
+### BUILD
+
+Skip the build stage; it's not needed.
+
+### DEPLOY
+
+1. Set the _Deploy provider_ to "AWS Elastic Beanstalk".
+2. [Select the Cybersim application and the environment you've just created.](#elastic-beanstalk-eg-ndicybersimgame-env)
+
+## Elastic Beanstalk (e.g., ndi@Cybersimgame-env)
+
+In Elastic Beanstalk (EB) a different environment is created for each game instance. In each environment, the node application runs inside a Docker container, so an existing Dockerfile inside the source code is necessary for EB. The Docker image is created using the Dockerfile in the root. Once the deployment is completed, the API will be live.
+
+To create a new Elastic Beanstalk environment follow these steps:
+
+1. For _Environment tier_ select "Web server environment".
+2. _Environment name_ and _domain_ are arbitrary. This domain is required in the [frontend setup](https://github.com/nditech/CyberSim-UI#readme).
+3. For _Platform_ select a managed Docker platform. Recommended: "Docker running on 64bit Amazon Linux 2@3.6.1".
+4. You might want to deploy an existing version right away; you can do it under the _Application code_ setting.
+5. Leave the _Preset_ on "Single instance".
+6. For _Service role_ settings:
+
+   - Existing service role --> "aws-elasticbeanstalk-service-role"
+   - Add your SSL key for easier access to the EC2 instance. You can create a new "EC2 keypair" under the "EC2 service console / Network & security / Key pairs".
+   - EC2 instance profile --> "aws-elasticbeanstalk-ec2-role"
+
+7. VPC is not required.
+8. Set the _Public IP address_ to "Activated".
+9. For _Database_ select "Enabled" and follow [these steps](#rds-eg-ndiaa1jiteus75zy8h).
+10. For the _Instances_ and _Capacity_ settings leave everything on default.
+11. Set the _Monitoring_ to "Basic".
+12. Set _Managed updates_ to "false".
+13. For _Rolling updates and deployments_ leave everything on default.
+14. Under the _Platform software_ settings you must set these 3 environment variables. You can modify these variables later at any time under the Configuration/Software tab of the AWS Console:
+
+    - **PORT**: The PORT must match the port exposed in the docker container, which currently is **8080**.
+    - **NODE_ENV**: Must be either `production`, `development` or `test`. If the given value is `test` the server will reset the Postgres Database on each restart.
+    - **DB_URL**: The connection string for the Postgres Database, in the following format: postgres://<USERNAME>:<PASSWORD>@<HOST>:<PORT>/<DB_NAME>. You can find the <HOST>:<PORT> information under the "Connectivity & security / Endpoint (/Port)" tab of the database's RDS page.
+
+15. Optional environment variables:
+
+    - **MIGRATION_PASSWORD**: Required for creating a migration on the frontend's "Migration" page. Defaults to "secret".
+    - **LOG_LEVEL**: Only for the `test` NODE_ENV. Defaults to "error".
+
+## RDS (e.g., ndi@aa1jiteus75zy8h)
+
+A Postgres Database is created to store the data for each environment. When creating a DB for an Elastic Beanstalk environment, by default the database created will get a name like `awseb-e-<Environment ID>-...`. The default name of the RDS database is **ebdb**.
+
+Database settings:
+
+1. _Engine_ - "postgres"
+2. _Version_ - "15.3"
+3. _Instance class_ - "db.t3.micro"
+4. _Storage_ - "20GB"
+5. _Username_ - arbitrary, required in the "DB_URL" environment variable
+6. _Password_ - arbitrary, required in the "DB_URL" environment variable
+7. _Availability_ - "Low"
+8. _Deletion policy_ - "Snapshot"
+
+This DB is only available to the EC2 instance running inside the EB environment. To connect to the DB from a different client, port forwarding must be configured. To create an SSH tunnel and forward traffic from the DB to localhost, run the following commands:
 
 ```
 # Create the tunnel using ssh. Please note that you need the private key for this command.
````

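As a sanity check for the **DB_URL** variable described above, the connection string can be assembled from the RDS settings entered during setup. This is an illustrative sketch only — the helper name and the endpoint below are made up, not part of the repository:

```javascript
// Illustrative helper: assemble DB_URL from the RDS values. The password is
// URI-encoded in case it contains characters that are special in URLs.
const buildDbUrl = ({ username, password, host, port, dbName }) =>
  `postgres://${username}:${encodeURIComponent(password)}@${host}:${port}/${dbName}`;

const dbUrl = buildDbUrl({
  username: 'ebroot',
  password: 'secret',
  host: 'example.rds.amazonaws.com', // hypothetical RDS endpoint
  port: 5432,
  dbName: 'ebdb',
});
console.log(dbUrl);
// → postgres://ebroot:secret@example.rds.amazonaws.com:5432/ebdb
```

The host and port come from the "Connectivity & security / Endpoint (/Port)" tab mentioned in the README section above.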
index.js

Lines changed: 1 addition & 1 deletion

```diff
@@ -26,7 +26,7 @@ const checkEnviroment = async () => {
 
 checkEnviroment()
   .then(() => {
-    const port = process.env.PORT || 3001;
+    const port = process.env.PORT || 8080;
     const http = createServer(app);
     createSocket(http);
     const server = http.listen(port, () => {
```
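The change keeps the `process.env.PORT || 8080` fallback pattern; only the default moves to 8080 to match the Dockerfile's new `EXPOSE` line. A minimal sketch of that fallback behavior (the helper name is illustrative, not from the codebase, and it coerces to a number where the original keeps the raw string):

```javascript
// Illustrative only: mirrors the PORT fallback in index.js. PORT arrives as a
// string from the environment; an unset PORT falls back to 8080.
const resolvePort = (env) => Number(env.PORT) || 8080;

console.log(resolvePort({}));               // → 8080 (no PORT set)
console.log(resolvePort({ PORT: '3001' })); // → 3001 (explicit override)
```

8080 is, to my understanding, the port Elastic Beanstalk's reverse proxy forwards to by default, which would explain the choice of the new default.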

knexfile.js

Lines changed: 4 additions & 1 deletion

```diff
@@ -2,5 +2,8 @@ require('dotenv').config();
 
 module.exports = {
   client: 'pg',
-  connection: process.env.DB_URL,
+  connection: {
+    connectionString: process.env.DB_URL,
+    ssl: { rejectUnauthorized: false },
+  },
 };
```
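Switching from a plain connection string to an object is what lets the `ssl` option reach the `pg` driver; `rejectUnauthorized: false` accepts the RDS certificate chain without a local CA bundle (at the cost of skipping certificate verification). A small sketch of the resulting shape — the helper is illustrative, not from the repository:

```javascript
// Illustrative only: build the knex "connection" setting, optionally with the
// relaxed SSL used for RDS in this commit.
const makeConnection = (dbUrl, useSsl) =>
  useSsl
    ? { connectionString: dbUrl, ssl: { rejectUnauthorized: false } }
    : { connectionString: dbUrl };

const conn = makeConnection('postgres://user:pass@host:5432/ebdb', true);
console.log(conn.ssl.rejectUnauthorized); // → false
```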

src/app.js

Lines changed: 13 additions & 3 deletions

```diff
@@ -71,15 +71,15 @@ app.get('/curveballs', async (req, res) => {
 });
 
 app.post('/migrate', async (req, res) => {
-  const { password, apiKey, tableId } = req.body;
+  const { password, accessToken, tableId } = req.body;
   if (password === config.migrationPassword) {
     try {
-      await migrate(apiKey, tableId);
+      await migrate(accessToken, tableId);
       res.send();
     } catch (err) {
       if (err.error === 'AUTHENTICATION_REQUIRED') {
         res.status(400).send({
-          apiKey: 'Invalid airtable api key',
+          accessToken: 'Invalid airtable access token',
         });
       } else if (err.error === 'NOT_FOUND') {
         res.status(400).send({
@@ -92,6 +92,16 @@ app.post('/migrate', async (req, res) => {
           message: err.message,
           errors,
         });
+      } else if (err.error === 'NOT_AUTHORIZED') {
+        res.status(400).send({
+          validation: true, // Set this to "true" to show it in an alert box on the frontend.
+          message: err.message,
+          errors: [
+            {
+              message: `An authorization error has occurred! Please make sure that the base ID and the required personal access token scope settings are correct!`,
+            },
+          ],
+        });
       } else {
         console.log(500);
         res.status(500).send({
```
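The new `NOT_AUTHORIZED` branch gives the frontend an alert-friendly payload for Airtable tokens with the wrong scopes. The mapping can be summarized as a pure function — a simplified sketch, not the actual Express handler, and the helper name is illustrative:

```javascript
// Simplified sketch of the /migrate error mapping after this commit.
// Status codes and body shapes follow the diff.
function migrateErrorResponse(err) {
  if (err.error === 'AUTHENTICATION_REQUIRED') {
    // invalid personal access token
    return { status: 400, body: { accessToken: 'Invalid airtable access token' } };
  }
  if (err.error === 'NOT_AUTHORIZED') {
    // token is valid but lacks access to the base or the required scopes
    return {
      status: 400,
      body: {
        validation: true, // shown in an alert box on the frontend
        message: err.message,
        errors: [{ message: 'An authorization error has occurred!' }],
      },
    };
  }
  return { status: 500, body: { message: err.message } };
}

const res = migrateErrorResponse({ error: 'NOT_AUTHORIZED', message: 'denied' });
console.log(res.status); // → 400
```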

src/util/errors.js

Lines changed: 13 additions & 1 deletion

```diff
@@ -1,3 +1,9 @@
+function transformSchemaError(schemaError) {
+  return {
+    message: `A schema error occurred when querying the ${schemaError.tableName} table. Please check if the table is set correctly!`,
+  };
+}
+
 function transformValidationError(validationError) {
   return validationError.errors.map((error) => {
     const [id, message] = error.split('.');
@@ -11,7 +17,13 @@ function transformValidationError(validationError) {
 
 function transformValidationErrors(validationErrors) {
   if (Array.isArray(validationErrors)) {
-    return validationErrors.map(transformValidationError).flat();
+    return validationErrors
+      .map((validationError) =>
+        validationError.validation
+          ? transformValidationError(validationError)
+          : transformSchemaError(validationError),
+      )
+      .flat();
   }
   return transformValidationError(validationErrors);
 }
```
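To illustrate the new branching in `transformValidationErrors`, here is a self-contained sketch. `transformValidationError` below is a simplified stand-in (the real one splits `id.message` strings), so only the routing logic matches the diff:

```javascript
// Matches the diff: schema errors carry a tableName set by fetchTable.
const transformSchemaError = (schemaError) => ({
  message: `A schema error occurred when querying the ${schemaError.tableName} table. Please check if the table is set correctly!`,
});

// Simplified stand-in for the real transformValidationError.
const transformValidationError = (validationError) =>
  validationError.errors.map((message) => ({ message }));

// Route each error by its "validation" flag, then flatten, as in the diff.
const transformValidationErrors = (validationErrors) =>
  Array.isArray(validationErrors)
    ? validationErrors
        .map((e) => (e.validation ? transformValidationError(e) : transformSchemaError(e)))
        .flat()
    : transformValidationError(validationErrors);

const out = transformValidationErrors([
  { validation: true, errors: ['name is required'] },
  { tableName: 'responses' }, // schema error annotated upstream
]);
console.log(out.length); // → 2
```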

src/util/migrate.js

Lines changed: 4 additions & 3 deletions

```diff
@@ -48,6 +48,7 @@ function fetchTable(base, tableName) {
     },
     function done(err) {
       if (err) {
+        err.tableName = tableName;
         reject(err);
       } else {
         validate(airtableSchemas[tableName], allFields, tableName)
@@ -73,16 +74,16 @@ function addPartyLocation(locations) {
     : locations?.[0];
 }
 
-async function migrate(apiKey, baseId) {
+async function migrate(accessToken, baseId) {
   // connect to the airtable instance
   Airtable.configure({
     endpointUrl: 'https://api.airtable.com',
-    apiKey,
+    apiKey: accessToken,
   });
 
   const base = Airtable.base(baseId);
 
-  // do a starting "fake" fetch to check if the API key and table id are correct
+  // do a starting "fake" fetch to check if the personal access token and table id are correct
   await fetchTable(base, 'handbook_categories');
 
   // define arrays for junctions tables that must be added at the end of the migration
```
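The one-line `err.tableName = tableName;` addition is what feeds `transformSchemaError` in `src/util/errors.js`: the failing table's name travels with the rejected error. A minimal sketch of the pattern — the function name is illustrative, not from the repository:

```javascript
// Illustrative pattern: annotate an error with context before rejecting it,
// so upstream handlers can produce a message naming the failing table.
function annotateTableError(err, tableName) {
  err.tableName = tableName;
  return err;
}

const err = annotateTableError(new Error('NOT_AUTHORIZED'), 'injections');
console.log(err.tableName); // → injections
```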
