merge from Feature/integration test fix #13


Merged (7 commits) on Nov 20, 2020
2 changes: 2 additions & 0 deletions .circleci/config.yml
@@ -68,6 +68,8 @@ workflows:
branches:
only:
- dev
- feature/es-segregation
- feature/integration-test-fix

# Production builds are executed only on tagged commits to the
# master branch.
5 changes: 4 additions & 1 deletion .gitignore
@@ -113,4 +113,7 @@ dist
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*
.pnp.*

# api.env
api.env
66 changes: 56 additions & 10 deletions README.md
@@ -19,27 +19,49 @@ The following parameters can be set in config files or in env variables:
- `BASE_PATH`: the server api base path
- `AUTH_SECRET`: The authorization secret used during token verification.
- `VALID_ISSUERS`: The valid issuers of tokens, a JSON array containing the valid issuers.

- `AUTH0_URL`: Auth0 URL, used to get TC M2M token
- `AUTH0_AUDIENCE`: Auth0 audience, used to get TC M2M token
- `AUTH0_AUDIENCE_FOR_BUS_API`: Auth0 audience, used to get TC M2M token to be used in bus api client
- `TOKEN_CACHE_TIME`: Auth0 token cache time, used to get TC M2M token
- `AUTH0_CLIENT_ID`: Auth0 client id, used to get TC M2M token
- `AUTH0_CLIENT_SECRET`: Auth0 client secret, used to get TC M2M token
- `AUTH0_PROXY_SERVER_URL`: Proxy Auth0 URL, used to get TC M2M token

- `DATABASE_URL`: PostgreSQL database url.
- `DB_SCHEMA_NAME`: string - PostgreSQL database target schema
- `PROJECT_API_URL`: the project service url
- `TC_API`: the Topcoder v5 url
- `ORG_ID`: the organization id
- `HOST`: the elasticsearch host
- `ES_INDEX_JOB`: the job index
- `ES_INDEX_JOB_CANDIDATE`: the job candidate index
- `ES_INDEX_RESOURCE_BOOKING`: the resource booking index

- `esConfig.HOST`: the elasticsearch host
- `esConfig.ES_INDEX_JOB`: the job index
- `esConfig.ES_INDEX_JOB_CANDIDATE`: the job candidate index
- `esConfig.ES_INDEX_RESOURCE_BOOKING`: the resource booking index
- `esConfig.AWS_REGION`: The Amazon region to use when using AWS Elasticsearch service
- `esConfig.ELASTICCLOUD.id`: The elastic cloud id, if your elasticsearch instance is hosted on elastic cloud. DO NOT provide a value for ES_HOST if you are using this
- `esConfig.ELASTICCLOUD.username`: The elastic cloud username for basic authentication. Provide this only if your elasticsearch instance is hosted on elastic cloud
- `esConfig.ELASTICCLOUD.password`: The elastic cloud password for basic authentication. Provide this only if your elasticsearch instance is hosted on elastic cloud

- `BUSAPI_URL`: Topcoder Bus API URL
- `KAFKA_ERROR_TOPIC`: The error topic at which bus api will publish any errors
- `KAFKA_MESSAGE_ORIGINATOR`: The originator value for the kafka messages

- `TAAS_JOB_CREATE_TOPIC`: the create job entity Kafka message topic
- `TAAS_JOB_UPDATE_TOPIC`: the update job entity Kafka message topic
- `TAAS_JOB_DELETE_TOPIC`: the delete job entity Kafka message topic
- `TAAS_JOB_CANDIDATE_CREATE_TOPIC`: the create job candidate entity Kafka message topic
- `TAAS_JOB_CANDIDATE_UPDATE_TOPIC`: the update job candidate entity Kafka message topic
- `TAAS_JOB_CANDIDATE_DELETE_TOPIC`: the delete job candidate entity Kafka message topic
- `TAAS_RESOURCE_BOOKING_CREATE_TOPIC`: the create resource booking entity Kafka message topic
- `TAAS_RESOURCE_BOOKING_UPDATE_TOPIC`: the update resource booking entity Kafka message topic
- `TAAS_RESOURCE_BOOKING_DELETE_TOPIC`: the delete resource booking entity Kafka message topic
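
For local development these parameters can be exported as environment variables before starting the app. A minimal sketch (the values below are placeholders, not working credentials; substitute your own):

```bash
# placeholder values for local development only
export DATABASE_URL='postgres://<user>:<password>@localhost:5432/<database>'
export ES_HOST='http://localhost:9200'
export AUTH0_URL='<your Auth0 URL>'
export AUTH0_AUDIENCE='<your Auth0 audience>'
export AUTH0_AUDIENCE_FOR_BUS_API='<your Auth0 audience for the Bus API>'
export AUTH0_CLIENT_ID='<your Auth0 client id>'
export AUTH0_CLIENT_SECRET='<your Auth0 client secret>'
```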


## PostgreSQL Database Setup
- Go to https://www.postgresql.org/, then download and install PostgreSQL.
- Modify `DATABASE_URL` under `config/default.js` to meet your environment.
- Run `npm run init-db` to create table
- Run `npm run init-db` to create the tables (run `npm run init-db force` to force recreating the tables)
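
If you do not yet have a database to point `DATABASE_URL` at, one way to create one is via `psql` (assuming a default local PostgreSQL installation; the database name is a placeholder):

```bash
# create an empty database for the API (placeholder name), then create the tables
psql -U postgres -c "CREATE DATABASE taas_bookings;"
npm run init-db
```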

## ElasticSearch Setup
- Go to https://www.elastic.co/downloads/, then download and install Elasticsearch.
@@ -52,17 +74,41 @@ The following parameters can be set in config files or in env variables:
- Install dependencies `npm install`
- Run lint `npm run lint`
- Run lint fix `npm run lint:fix`
- Clear and init db `npm run init-db`
- Clear and create es index `npm run delete-index && npm run create-index`
- Clear and init db `npm run init-db force`
- Clear and create es index

``` bash
npm run delete-index # run this only if the index has already been created
npm run create-index
```

- Start app `npm start`
- App is running at `http://localhost:3000` (a quick smoke test is sketched below)
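
Once the server is up, a quick check could look like the following. The base path shown (`/api/v5`) is an assumption; verify it against your `BASE_PATH` setting, and replace `<token>` with a valid JWT:

```bash
# hypothetical smoke test; adjust the base path and token to your environment
curl -H "Authorization: Bearer <token>" http://localhost:3000/api/v5/jobs
```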

## Docker Deployment
- Run `docker-compose up`
## Local Deployment with Docker

Make sure all config values are correct and the app runs successfully on your local machine, then follow the steps below:

1. Navigate to the directory `docker`

2. Rename the file `sample.api.env` to `api.env`

3. Set the required AUTH0 configuration, the PostgreSQL database URL and the Elasticsearch host in the file `api.env`

Note that you can also add other variables to `api.env`, one `<key>=<value>` pair per line.
If you are using AWS Elasticsearch, add the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` variables as well (see the example after this list).

4. Once that is done, run the following command

```bash
docker-compose up
```

5. The first time you run the application it will take some time to download the image and install the dependencies
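
As a sketch, an `api.env` for an AWS-hosted Elasticsearch domain might look like the following (all values are placeholders; the AWS keys are only needed when using AWS ES):

```
DATABASE_URL=<Database Url>
ES_HOST=<AWS ES endpoint>
AWS_ACCESS_KEY_ID=<access key id>
AWS_SECRET_ACCESS_KEY=<secret access key>
AUTH0_URL=<AUTH0 URL>
AUTH0_AUDIENCE=<AUTH0 Audience>
AUTH0_AUDIENCE_FOR_BUS_API=<AUTH0 Audience For Bus Api>
TOKEN_CACHE_TIME=500000
AUTH0_CLIENT_ID=<AUTH0 Client ID>
AUTH0_CLIENT_SECRET=<AUTH0 Client Secret>
```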

## Testing
- Run `npm run test` to execute unit tests
- Run `npm run cov` to execute unit tests and generate coverage report.

## Verification
Refer to the verification document [Verification.md](Verification.md)
Refer to the verification document [Verification.md](Verification.md)
55 changes: 30 additions & 25 deletions Verification.md
@@ -1,35 +1,40 @@
# Topcoder Bookings API

## Postman test
- Refer `ReadMe.md` to start the app and postgreSQL database
- Run `npm run init-db` to init db before testing.
- Run `npm run create-index` to create es index before testing
- Start PostgreSQL and Elasticsearch
- Refer to `README.md#Local Deployment` to start the app
- Import the Postman collection and environment file from the `docs` folder into Postman and execute the requests to validate the app from top to bottom.

## Note About Testing `/taas-teams` Endpoints
Before you run tests against the `taas-teams` endpoints, you should insert the dedicated test data by running `npm run test-data`.

## Unit test Coverage


63 passing (43s)
``` bash
96 passing (170ms)


File | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
----------------------------|---------|----------|---------|---------|-------------------
All files | 99.49 | 97.62 | 100 | 99.74 |
config | 100 | 100 | 100 | 100 |
default.js | 100 | 100 | 100 | 100 |
test.js | 100 | 100 | 100 | 100 |
src | 90.48 | 50 | 100 | 94.12 |
bootstrap.js | 90.48 | 50 | 100 | 94.12 | 18
src/common | 100 | 100 | 100 | 100 |
errors.js | 100 | 100 | 100 | 100 |
helper.js | 100 | 100 | 100 | 100 |
src/models | 100 | 92.86 | 100 | 100 |
Job.js | 100 | 100 | 100 | 100 |
JobCandidate.js | 100 | 100 | 100 | 100 |
ResourceBooking.js | 100 | 100 | 100 | 100 |
index.js | 100 | 80 | 100 | 100 | 29
src/services | 100 | 100 | 100 | 100 |
JobCandidateService.js | 100 | 100 | 100 | 100 |
JobService.js | 100 | 100 | 100 | 100 |
ResourceBookingService.js | 100 | 100 | 100 | 100 |
----------------------------|---------|----------|---------|---------|----------------------------
File | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
----------------------------|---------|----------|---------|---------|----------------------------
All files | 98.43 | 91.03 | 100 | 98.56 |
config | 100 | 100 | 100 | 100 |
default.js | 100 | 100 | 100 | 100 |
test.js | 100 | 100 | 100 | 100 |
src | 90.91 | 50 | 100 | 94.44 |
bootstrap.js | 90.91 | 50 | 100 | 94.44 | 18
src/common | 97.69 | 90.91 | 100 | 97.66 |
errors.js | 100 | 50 | 100 | 100 | 23
helper.js | 97.5 | 92.86 | 100 | 97.46 | 94,176,284
src/models | 100 | 92.86 | 100 | 100 |
Job.js | 100 | 100 | 100 | 100 |
JobCandidate.js | 100 | 100 | 100 | 100 |
ResourceBooking.js | 100 | 100 | 100 | 100 |
index.js | 100 | 80 | 100 | 100 | 29
src/services | 98.81 | 89.5 | 100 | 98.8 |
JobCandidateService.js | 98.77 | 88 | 100 | 98.77 | 37
JobService.js | 97.37 | 85.37 | 100 | 97.32 | 73,285,326
ResourceBookingService.js | 98.86 | 93.1 | 100 | 98.86 | 54
TeamService.js | 100 | 90.7 | 100 | 100 | 19,135-138,188-202,251,267
----------------------------|---------|----------|---------|---------|----------------------------
```
6 changes: 3 additions & 3 deletions app.js
@@ -8,8 +8,8 @@ const config = require('config')
const express = require('express')
const cors = require('cors')
const HttpStatus = require('http-status-codes')
const logger = require('./src/common/logger')
const interceptor = require('express-interceptor')
const logger = require('./src/common/logger')

// setup express app
const app = express()
@@ -52,7 +52,7 @@ require('./app-routes')(app)
// The error handler
// eslint-disable-next-line no-unused-vars
app.use((err, req, res, next) => {
logger.logFullError(err, req.signature || `${req.method} ${req.url}`)
logger.logFullError(err, { component: 'app', signature: req.signature || `${req.method}_${req.url}` })
const errorResponse = {}
const status = err.isJoi ? HttpStatus.BAD_REQUEST : (err.status || err.httpStatus || HttpStatus.INTERNAL_SERVER_ERROR)

@@ -87,7 +87,7 @@ app.use((err, req, res, next) => {
})

const server = app.listen(app.get('port'), () => {
logger.info(`Express server listening on port ${app.get('port')}`)
logger.info({ component: 'app', message: `Express server listening on port ${app.get('port')}` })
})

if (process.env.NODE_ENV === 'test') {
28 changes: 27 additions & 1 deletion config/default.js
@@ -8,6 +8,7 @@ module.exports = {
VALID_ISSUERS: process.env.VALID_ISSUERS || '["https://api.topcoder-dev.com", "https://api.topcoder.com", "https://topcoder-dev.auth0.com/", "https://auth.topcoder-dev.com/"]',
AUTH0_URL: process.env.AUTH0_URL,
AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,
AUTH0_AUDIENCE_FOR_BUS_API: process.env.AUTH0_AUDIENCE_FOR_BUS_API,
TOKEN_CACHE_TIME: process.env.TOKEN_CACHE_TIME,
AUTH0_CLIENT_ID: process.env.AUTH0_CLIENT_ID,
AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,
@@ -22,8 +23,33 @@

esConfig: {
HOST: process.env.ES_HOST || 'http://localhost:9200',

ELASTICCLOUD: {
id: process.env.ELASTICCLOUD_ID,
username: process.env.ELASTICCLOUD_USERNAME,
password: process.env.ELASTICCLOUD_PASSWORD
},

AWS_REGION: process.env.AWS_REGION || 'us-east-1', // AWS Region to be used if we use AWS ES

ES_INDEX_JOB: process.env.ES_INDEX_JOB || 'job',
ES_INDEX_JOB_CANDIDATE: process.env.ES_INDEX_JOB_CANDIDATE || 'job_candidate',
ES_INDEX_RESOURCE_BOOKING: process.env.ES_INDEX_RESOURCE_BOOKING || 'resource_booking'
}
},

BUSAPI_URL: process.env.BUSAPI_URL || 'https://api.topcoder-dev.com/v5',
KAFKA_ERROR_TOPIC: process.env.KAFKA_ERROR_TOPIC || 'common.error.reporting',
KAFKA_MESSAGE_ORIGINATOR: process.env.KAFKA_MESSAGE_ORIGINATOR || 'taas-api',
// topics for job service
TAAS_JOB_CREATE_TOPIC: process.env.TAAS_JOB_CREATE_TOPIC || 'taas.job.create',
TAAS_JOB_UPDATE_TOPIC: process.env.TAAS_JOB_UPDATE_TOPIC || 'taas.job.update',
TAAS_JOB_DELETE_TOPIC: process.env.TAAS_JOB_DELETE_TOPIC || 'taas.job.delete',
// topics for jobcandidate service
TAAS_JOB_CANDIDATE_CREATE_TOPIC: process.env.TAAS_JOB_CANDIDATE_CREATE_TOPIC || 'taas.jobcandidate.create',
TAAS_JOB_CANDIDATE_UPDATE_TOPIC: process.env.TAAS_JOB_CANDIDATE_UPDATE_TOPIC || 'taas.jobcandidate.update',
TAAS_JOB_CANDIDATE_DELETE_TOPIC: process.env.TAAS_JOB_CANDIDATE_DELETE_TOPIC || 'taas.jobcandidate.delete',
// topics for resourcebooking service
TAAS_RESOURCE_BOOKING_CREATE_TOPIC: process.env.TAAS_RESOURCE_BOOKING_CREATE_TOPIC || 'taas.resourcebooking.create',
TAAS_RESOURCE_BOOKING_UPDATE_TOPIC: process.env.TAAS_RESOURCE_BOOKING_UPDATE_TOPIC || 'taas.resourcebooking.update',
TAAS_RESOURCE_BOOKING_DELETE_TOPIC: process.env.TAAS_RESOURCE_BOOKING_DELETE_TOPIC || 'taas.resourcebooking.delete'
}
7 changes: 6 additions & 1 deletion config/test.js
@@ -1,3 +1,8 @@
module.exports = {
LOG_LEVEL: process.env.LOG_LEVEL || 'info'
LOG_LEVEL: process.env.LOG_LEVEL || 'info',
AUTH0_URL: 'http://example.com',
AUTH0_AUDIENCE: 'http://example.com',
AUTH0_AUDIENCE_FOR_BUS_API: 'http://example.com',
AUTH0_CLIENT_ID: 'fake_id',
AUTH0_CLIENT_SECRET: 'fake_secret'
}
2 changes: 1 addition & 1 deletion docker/Dockerfile
@@ -14,4 +14,4 @@ RUN npm install
COPY . .

# Run the app
CMD [ "node", "app.js" ]
CMD [ "npm", "start" ]
11 changes: 11 additions & 0 deletions docker/docker-compose.yml
@@ -0,0 +1,11 @@
version: '3'
services:
taas_api:
image: taas_api:latest
build:
context: ../
dockerfile: docker/Dockerfile
env_file:
- api.env
ports:
- "3000:3000"
9 changes: 9 additions & 0 deletions docker/sample.api.env
@@ -0,0 +1,9 @@
DATABASE_URL=<Database Url>
ES_HOST=<ES Host Endpoint>

AUTH0_URL=<AUTH0 URL>
AUTH0_AUDIENCE=<AUTH0 Audience>
AUTH0_AUDIENCE_FOR_BUS_API=<AUTH0 Audience For Bus Api>
TOKEN_CACHE_TIME=500000
AUTH0_CLIENT_ID=<AUTH0 Client ID>
AUTH0_CLIENT_SECRET=<AUTH0 Client Secret>