Develop #24

Merged 7 commits on May 24, 2019.
1 change: 1 addition & 0 deletions .eslintignore
@@ -0,0 +1 @@
coverage
14 changes: 14 additions & 0 deletions .eslintrc
@@ -0,0 +1,14 @@
{
"parserOptions": {
"ecmaVersion": 2018
},
"env": {
"mocha": true,
"node": true,
"es6": true
},
"extends": "eslint:recommended",
"rules": {
"no-trailing-spaces": "error"
}
}
81 changes: 7 additions & 74 deletions README.md
@@ -1,4 +1,4 @@
# Topcoder - Submission Legacy Processor Application
# Topcoder - Legacy Submission Processor Application
----------------------

## Requirements
@@ -29,81 +29,14 @@ You can update the configuration file or set values to the corresponding environ
- `AUTH0_URL` auth0 url
- `AUTH0_AUDIENCE` auth0 audience
- `TOKEN_CACHE_TIME` auth0 token cache time
- `AUTH0_PROXY_SERVER_URL` auth0 proxy server url
- `AUTH0_CLIENT_ID` auth0 client id
- `AUTH0_CLIENT_SECRET` auth0 client secret
- `CHALLENGE_INFO_API` The challenge info api template; `{cid}` gets replaced with the challenge id
- `CHALLENGE_SUBTRACK` The sub track of marathon match challenge
- `MM_CHALLENGE_SUBTRACK` The sub track of marathon match challenge
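For illustration, here is a minimal sketch of how the `{cid}` template is resolved, mirroring the `getSubTrack` helper that this PR moves out of `src/services/SubmissionService.js` (`axios` and `lodash` are existing dependencies):
```js
const _ = require('lodash')
const Axios = require('axios')
const config = require('config')

// Resolve the {cid} placeholder and read the sub track from the challenge details response
async function getSubTrack (challengeId) {
  const result = await Axios.get(config.CHALLENGE_INFO_API.replace('{cid}', challengeId))
  return _.get(result.data, 'result.content[0].subTrack')
}
```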

`./config/production.js`, `./config/staging.js`, `./config/test.js` will use the same configuration variables as `./config/default.js`, except that `./config/test.js` has additional test-only configuration.
- `MOCK_SUBMISSION_API_PORT` The mock submission api port
- `MOCK_SERVER_PORT` The mock server port for challenge api

`./config/mock.js` will use the same configuration variables as `./config/default.js` except
- `MOCK_SERVER_PORT` The mock server port for challenge api

`./test/test_files/sqlParams.json` loads the sql params needed by the tests; it only works if you have run `./test/sql/test.sql`.

> NOTE: ALL COMMANDS BELOW ARE EXECUTED UNDER THE ```<legacy-sub-processor>``` DIRECTORY
`./config/production.js`, `./config/staging.js`, `./config/test.js` will use the same configuration variables as `./config/default.js`, except that `./config/test.js` has additional test-only configuration:
- `MOCK_API_PORT` The mock server port for the submission and challenge APIs

To build the application you must set the `DB_SERVER_NAME` environment variable. This variable holds the database hostname.

## Build Application Docker Image
This only needs to be done once:
```bash
export DB_SERVER_NAME=informix
docker-compose build lsp-app
```

## Run Kafka and Create Topic (if running kafka locally)

Build Kafka image:
```bash
docker-compose build kafka
```

Run Kafka server:
```bash
docker-compose up -d kafka
docker exec -ti kafka bash -c "kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.create"
docker exec -ti kafka bash -c "kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.update"
```
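To smoke-test the topics you can publish any message; the repo's own helper is `npm run produce-test-event`, but a minimal no-kafka producer sketch looks like this (no-kafka is an existing dependency; the local broker address is an assumption):
```js
const Kafka = require('no-kafka')

const producer = new Kafka.Producer({ connectionString: 'kafka://localhost:9092' })

async function send () {
  await producer.init()
  // Any JSON string works for a smoke test; the processor validates the real schema
  await producer.send({
    topic: 'submission.notification.create',
    partition: 0,
    message: { value: JSON.stringify({ hello: 'kafka' }) }
  })
  await producer.end()
}

send().catch(console.error)
```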

## Install App Requirements
```bash
export DB_SERVER_NAME=informix
docker-compose up lsp-app-install
```

## Deployment
```bash
export DB_SERVER_NAME=informix
docker-compose up lsp-app
```

## Running Test
- Make sure you're running a clean database (you can take the tc-informix container down and then bring it up again)
- Stop the `legacy-sub-processor` application if it is running
- Install the test data (if the `tc-informix` service is not already running, start it first with `docker-compose up tc-informix`)
```bash
docker cp test/sql/test.sql iif_innovator_c:/
docker exec -ti iif_innovator_c bash -c "source /home/informix/ifx_informixoltp_tcp.env && dbaccess - /test.sql"
```
- Run the kafka container (and create the topics if you haven't done so before)
- Run the tests (you can also use `docker-compose run lsp-app-test`, which is better suited to a test command)
```bash
docker-compose up lsp-app-test
```

## Docker Build

```bash
heroku login
heroku create
heroku container:push web --arg servername=<DATABASE_SERVER>
heroku container:release web
```

## Standard Code Style

- Check code style `npm run lint`
- Check code style with option to fix the errors `npm run lint:fix`
## Validation
Follow the steps in [Validation.md](Validation.md)
130 changes: 71 additions & 59 deletions Validation.md
@@ -1,91 +1,103 @@
# Topcoder - Submission Legacy Processor Application - Verification
# Topcoder - Legacy Submission Processor Application - Verification
------------
> NOTE: ALL COMMANDS BELOW ARE EXECUTED UNDER THE ```<legacy-sub-processor>``` DIRECTORY
Please check [docs/Validation.md](/docs/Validation.md) and [docs/Verification_with_DB.md](/docs/Verification_with_DB.md).

Please note that currently you have to verify the application with a database, so check [docs/Verification_with_DB.md](/docs/Verification_with_DB.md) only.

I recommend verifying and testing with docker; otherwise you will need to study the related Dockerfiles to understand how to set up the environment properly (not recommended).
## Run Kafka and Create Topic

Build Kafka image:
```bash
docker-compose build kafka
```

## Topcoder - Marathon Match - Legacy Processor
Please verify under Linux or OSX; I tested under Ubuntu 18.04 and OSX 12, and Windows may have issues verifying with docker.
Run Kafka server:
```bash
docker-compose up -d kafka
docker exec -ti kafka bash -c "kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.create"
docker exec -ti kafka bash -c "kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic submission.notification.update"
```

Please check README.md and ensure you can run the tests successfully in the docker environment; check the coverage folder to confirm that
`src/services/MarathonSubmissionService.js`, `src/services/NonMarathonSubmissionService.js` and `src/services/LegacySubmissionIdService.js` are fully tested.
## Run Informix and Insert Test Data
Make sure you're running a clean database (you can take down and remove the iif_innovator_c container and then bring it up again)
```bash
export DB_SERVER_NAME=informix

## Setup data in direct
Only the necessary MM challenge related test data is included in `./test/sql/test.sql`; you can also set up complete test data using the direct application.
docker kill iif_innovator_c
docker rm iif_innovator_c

You can follow [docs/Verification_with_DB.md](/docs/Verification_with_DB.md) to set up an MM challenge in the direct app: choose Marathon Match under the Data menu when creating the challenge, create a new project with a billing account if saving the MM challenge fails, create it as a draft challenge in the last step, add a user as Submitter, and note the match submission phase id.
docker-compose up -d tc-informix

Currently tc-direct has an [issue saving the Match Round ID](https://github.com/appirio-tech/direct-app/issues/341); if you try to save it in the page and refresh, you may see logs like the following from `docker-compose logs tc-direct`:
```bash
| 07:48:37,097 ERROR [ExceptionMappingInterceptor] Invalid action class configuration that references an unknown class named [saveDraftContestAction]
tc-direct_1 | java.lang.RuntimeException: Invalid action class configuration that references an unknown class named [saveDraftContestAction]
docker logs iif_innovator_c
# Informix has started when you see logs like the following:
# starting rest listener on port 27018
# starting mongo listener on port 27017
```

So you have to run the following SQL, replacing <mm challenge id> with the id of your newly created MM challenge.
**Then insert the test data (used by the Unit Tests and Verification steps)**:
```bash
database tcs_catalog;
INSERT INTO informix.project_info
(project_id, project_info_type_id, value, create_user, create_date, modify_user, modify_date)
VALUES(<mm challenge id>, 56, 2001, '132456', current, '132456', current);
docker cp test/sql/test.sql iif_innovator_c:/
docker exec -ti iif_innovator_c bash -c "source /home/informix/ifx_informixoltp_tcp.env && dbaccess - /test.sql"
```

Even after running the SQL, the direct app will still fail to show the details page, with an error like:
```bash
java.lang.NullPointerException
tc-direct_1 | at com.topcoder.direct.services.view.action.analytics.longcontest.MarathonMatchHelper.getMarathonMatchDetails(MarathonMatchHelper.java:115)
tc-direct_1 | at com.topcoder.direct.services.view.action.contest.launch.GetContestAction.executeAction(GetContestAction.java:476)
tc-direct_1 | at com.topcoder.direct.services.view.action.BaseDirectStrutsAction.execute(BaseDirectStrutsAction.java:305)
tc-direct_1 | at sun.reflect.GeneratedMethodAccessor553.invoke(Unknown Source)

## Build Application Docker Image
This only needs to be done once:
```bash
export DB_SERVER_NAME=informix
docker-compose build lsp-app
```

Please check https://github.com/appirio-tech/direct-app/blob/dev/src/java/main/com/topcoder/direct/services/view/action/contest/launch/GetContestAction.java#L502

Even the latest direct code comments out the related code to work around this issue, so just leave the direct page as it is.

## Run Legacy Submission Proc. app

Make sure the related services are started and the test data is prepared, then start the app with `NODE_ENV=mock` to mock the challenge api; otherwise a newly created mm challenge will still be treated as a non-mm challenge
## Install App dependencies
```bash
export NODE_ENV=mock
export DB_SERVER_NAME=informix
docker-compose up lsp-app
rm -rf node_modules && docker-compose run lsp-app-install
```

## Send Test data
From the previous data setup I got:
- challengeId = 40005570
- memberId = 132458 (user)
- submissionPhaseId = 100024

Let's send an mm submission event with example = 0 to kafka:
**Note**: if ***legacy-processor-module*** has been changed locally (e.g. during local dev and not yet pushed to git), you need to delete it from *node_modules* and copy the locally changed one into *node_modules*:

```bash
docker exec -ti lsp-app bash -c "npm run produce-test-event mm 40005570 132458 100024 0"
rm -rf ./node_modules/legacy-processor-module
cp -rf <path/to/legacy-processor-module> ./node_modules
# e.g cp -rf ../legacy-processor-module ./node_modules
```

Let's send an mm submission event with example = 1 to kafka:
## Standard Code Style

- Check code style `npm run lint`
- Check code style with option to fix the errors `npm run lint:fix`

## Run Unit Tests
- Stop the `legacy-sub-processor` application if it is running: `docker stop lsp-app`
- Make sure the kafka container is running with the topics created, and the informix container is running with the test data inserted
- Run unit tests:
```bash
docker exec -ti lsp-app bash -c "npm run produce-test-event mm 40005570 132458 100024 1"
docker-compose run lsp-app-test
```

Or you can send the sample mm submission message directly (valid if you have run `test/sql/test.sql`):
## Verify with Test Data
Deploy first:
```bash
docker exec -ti lsp-app bash -c "npm run produce-test-event 9"
export DB_SERVER_NAME=informix
docker-compose up lsp-app
```


Please note that the processor currently calls the challenge api to check whether a challenge is an MM challenge, and the default server api.topcoder-dev.com may not contain an mm challenge created in a local direct application.
So when we start the app with NODE_ENV=mock it uses the mock challenge api configuration and starts a mock challenge api server.
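For reference, the mock only has to satisfy one read path: the processor extracts `result.content[0].subTrack` from the challenge details response. A hypothetical minimal handler (using Node's built-in http; the real mock lives in legacy-processor-module/mock/mock-api):
```js
const http = require('http')

// Hypothetical minimal mock: report every challenge as an MM challenge
http.createServer((req, res) => {
  res.setHeader('Content-Type', 'application/json')
  res.end(JSON.stringify({
    result: { content: [{ subTrack: 'DEVELOP_MARATHON_MATCH' }] }
  }))
}).listen(3001)
```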
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event different-topic"` and verify that the app doesn't consume this message (no log)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event null-message"` and verify that the app skips this message (log: `Skipped null or empty event`)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event empty-message"` and verify that the app skips this message (log: `Skipped null or empty event`)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event invalid-json"` and verify that the app skips this message (log: `Skipped Invalid message JSON`)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event empty-json"` and verify that the app skips this message (log: `Skipped the message topic "undefined" doesn't match the Kafka topic submission.notification.create`)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event invalid-payload"` and verify that the app skips this message (log: `Skipped invalid event, reasons: "timestamp" must be...`)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event wrong-topic"` and verify that the app skips this message (log: `Skipped the message topic "wrong-topic" doesn't match the Kafka topic submission.notification.create`)
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event wrong-originator"` and verify that the app skips this message (log: `Skipped event from originator wrong-originator`)

- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event submission"` and verify that the app makes a call to the Submission API successfully (log: `Successfully processed non MM message - Patched to the Submission API: id 111, patch: {"legacySubmissionId":60000}`) and that the Mock API log (`docker logs mock-api`) shows something like `Patch /submissions/111 with {"legacySubmissionId":60000}`; a sample event shape is sketched after this list.
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event final-fix"` and verify that the app has log like `final fix upload, only insert upload`, and it should only insert into `upload` table, but not `submission`/`resource_submission` table.
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event not-allow-multiple"` and verify that the app has log like `delete previous submission for challengeId...`.
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event update-url"` and verify that the app has log like `Successfully processed non MM message - Submission url updated...`.
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event no-challenge-props"` and verify that the app has error log like `Error: null or empty result get challenge properties...`.
- Run `docker exec -ti lsp-app bash -c "npm run produce-test-event mm-submission"` and verify that the app skips this message (log: `Skipped event for subTrack: DEVELOP_MARATHON_MATCH`).
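For reference, the `submission` test event has roughly the shape below (illustrative values; the exact payload lives in legacy-processor-module/test/produce-test-event):
```js
{
  topic: 'submission.notification.create',
  originator: 'submission-api',
  timestamp: '2019-05-24T00:00:00.000Z',
  'mime-type': 'application/json',
  payload: {
    id: 111,                   // submission id patched back to the Submission API
    resource: 'submission',
    challengeId: 30005530,     // illustrative ids
    memberId: 131530,
    submissionPhaseId: 95245,
    type: 'Contest Submission',
    url: 'http://content.topcoder.com/some/path'
  }
}
```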

## Verify Database
Open your database explorer (the **DBeaver** application, for instance) and connect to the `informixoltp` database.
Check the `long_component_state` and `long_submission` tables, or run the SQL below:
```bash
select lcs.status_id, lcs.points, lcs.example_submission_number, lcs.submission_number, ls.*
from informixoltp:long_submission ls, informixoltp:long_component_state lcs
where ls.long_component_state_id = lcs.long_component_state_id
```
Then connect to the `tcs_catalog` database.
Check the `upload`, `submission` and `resource_submission` tables.

## Cleanup
After verification, run `docker-compose down` to take down and remove containers.
18 changes: 9 additions & 9 deletions config/default.js
@@ -12,10 +12,10 @@ module.exports = {
KAFKA_URL: process.env.KAFKA_URL || 'ssl://kafka-host:9093',

// The client cert, can be (1) the path to the cert file, or (2) the cert content
KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT || './test/kafka-ssl/client.crt',
KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT || './docker/kafka/kafka-ssl/client.crt',

// The client cert key, can be (1) the path to the cert key file, or (2) the cert key content
KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY || './test/kafka-ssl/client.key',
KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY || './docker/kafka/kafka-ssl/client.key',

// The topic from which the app consumes events
KAFKA_NEW_SUBMISSION_TOPIC: process.env.KAFKA_NEW_SUBMISSION_TOPIC || 'submission.notification.create',
@@ -27,7 +27,7 @@ module.exports = {
KAFKA_NEW_SUBMISSION_ORIGINATOR: process.env.KAFKA_NEW_SUBMISSION_ORIGINATOR || 'submission-api',

// The Submission API URL
SUBMISSION_API_URL: process.env.SUBMISSION_API_URL || 'http://submission-api-host:3000',
SUBMISSION_API_URL: process.env.SUBMISSION_API_URL || 'http://mock-api-host:3000',

// The Submission API timeout
SUBMISSION_TIMEOUT: process.env.SUBMISSION_TIMEOUT || '10000',
@@ -50,19 +50,19 @@ module.exports = {
// The Informix Submission Table Sequence Name
ID_SEQ_SUBMISSION: process.env.ID_SEQ_SUBMISSION || 'submission_id_seq',

AUTH0_URL: process.env.AUTH0_URL, // Auth0 credentials for Submission Service
AUTH0_URL: process.env.AUTH0_URL || 'https://topcoder-dev.auth0.com/oauth/token', // Auth0 credentials for Submission Service

AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,
AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE || 'https://m2m.topcoder-dev.com/',

TOKEN_CACHE_TIME: process.env.TOKEN_CACHE_TIME || '86400000',

AUTH0_CLIENT_ID: process.env.AUTH0_CLIENT_ID,

AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,

CHALLENGE_INFO_API: process.env.CHALLENGE_INFO_API || 'https://api.topcoder-dev.com/v4/challenges?filter=id={cid}', // {cid} gets replaced with challenge id
AUTH0_PROXY_SERVER_URL: process.env.AUTH0_PROXY_SERVER_URL,

CHALLENGE_SUBTRACK: process.env.CHALLENGE_SUBTRACK || 'MARATHON_MATCH, DEVELOP_MARATHON_MATCH',
AUTH0_PROXY_SERVER_URL: process.env.AUTH0_PROXY_SERVER_URL,
CHALLENGE_INFO_API: process.env.CHALLENGE_INFO_API || 'http://mock-api-host:3000/challenges?filter=id={cid}', // {cid} gets replaced with challenge id

MM_CHALLENGE_SUBTRACK: process.env.MM_CHALLENGE_SUBTRACK || 'MARATHON_MATCH, DEVELOP_MARATHON_MATCH'
}
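`MM_CHALLENGE_SUBTRACK` is a comma-separated list; a quick sketch of how the processor consumes it (the same split-and-trim appears in src/services/SubmissionService.js below):
```js
// Parse the comma-separated sub track list the way the service does
const mmSubtracks = 'MARATHON_MATCH, DEVELOP_MARATHON_MATCH'.split(',').map(x => x.trim())
console.log(mmSubtracks.includes('DEVELOP_MARATHON_MATCH')) // true
```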
9 changes: 0 additions & 9 deletions config/mock.js

This file was deleted.

6 changes: 5 additions & 1 deletion config/production.js
@@ -60,5 +60,9 @@ module.exports = {

AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,

CHALLENGE_INFO_API: process.env.CHALLENGE_INFO_API
AUTH0_PROXY_SERVER_URL: process.env.AUTH0_PROXY_SERVER_URL,

CHALLENGE_INFO_API: process.env.CHALLENGE_INFO_API,

MM_CHALLENGE_SUBTRACK: process.env.MM_CHALLENGE_SUBTRACK
}
16 changes: 15 additions & 1 deletion config/staging.js
@@ -50,5 +50,19 @@ module.exports = {
// The Informix Submission Table Sequence Name
ID_SEQ_SUBMISSION: process.env.ID_SEQ_SUBMISSION || 'submission_id_seq',

CHALLENGE_INFO_API: process.env.CHALLENGE_INFO_API
AUTH0_URL: process.env.AUTH0_URL, // Auth0 credentials for Submission Service

AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,

TOKEN_CACHE_TIME: process.env.TOKEN_CACHE_TIME || '86400000',

AUTH0_CLIENT_ID: process.env.AUTH0_CLIENT_ID,

AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,

AUTH0_PROXY_SERVER_URL: process.env.AUTH0_PROXY_SERVER_URL,

CHALLENGE_INFO_API: process.env.CHALLENGE_INFO_API,

MM_CHALLENGE_SUBTRACK: process.env.MM_CHALLENGE_SUBTRACK
}
16 changes: 7 additions & 9 deletions config/test.js
@@ -1,11 +1,9 @@
/**
* The test configuration.
*/
const MOCK_SUBMISSION_API_PORT = 3000
const MOCK_SERVER_PORT = 3001
const MOCK_API_PORT = 3001
module.exports = {
MOCK_SUBMISSION_API_PORT,
MOCK_SERVER_PORT,
MOCK_API_PORT,
LOG_LEVEL: 'debug',

// The client group ID for committing and fetching offsets.
@@ -16,10 +14,10 @@ module.exports = {
KAFKA_URL: 'ssl://kafka-host:9093',

// The client cert, can be (1) the path to the cert file, or (2) the cert content
KAFKA_CLIENT_CERT: './test/kafka-ssl/client.crt',
KAFKA_CLIENT_CERT: './docker/kafka/kafka-ssl/client.crt',

// The client cert key, can be (1) the path to the cert key file, or (2) the cert key content
KAFKA_CLIENT_CERT_KEY: './test/kafka-ssl/client.key',
KAFKA_CLIENT_CERT_KEY: './docker/kafka/kafka-ssl/client.key',

// The topic from which the app consumes events
KAFKA_NEW_SUBMISSION_TOPIC: 'submission.notification.create',
@@ -31,7 +29,7 @@ module.exports = {
KAFKA_NEW_SUBMISSION_ORIGINATOR: 'new-submission-originator',

// The Submission API URL
SUBMISSION_API_URL: `http://localhost:${MOCK_SUBMISSION_API_PORT}`,
SUBMISSION_API_URL: `http://localhost:${MOCK_API_PORT}`,

// The Submission API timeout
SUBMISSION_TIMEOUT: 2000,
@@ -54,7 +52,7 @@ module.exports = {
// The Informix Submission Table Sequence Name
ID_SEQ_SUBMISSION: 'submission_id_seq',

CHALLENGE_INFO_API: `http://localhost:${MOCK_SERVER_PORT}/challenges?filter=id={cid}`, // {cid} gets replaced with challenge id
CHALLENGE_INFO_API: `http://localhost:${MOCK_API_PORT}/challenges?filter=id={cid}`, // {cid} gets replaced with challenge id

CHALLENGE_SUBTRACK: 'MARATHON_MATCH, DEVELOP_MARATHON_MATCH'
MM_CHALLENGE_SUBTRACK: 'MARATHON_MATCH, DEVELOP_MARATHON_MATCH'
}
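The `config` package selects this file from NODE_ENV; a quick sketch of the lookup, assuming the standard node-config behavior:
```js
// NODE_ENV must be set before config is first required
process.env.NODE_ENV = 'test'
const config = require('config')
console.log(config.SUBMISSION_API_URL)    // http://localhost:3001
console.log(config.MM_CHALLENGE_SUBTRACK) // MARATHON_MATCH, DEVELOP_MARATHON_MATCH
```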
54 changes: 5 additions & 49 deletions docker-compose.yml
@@ -1,10 +1,5 @@
version: '3.2'
services:
tc-cache:
image: "redis:3.2.5"
ports:
- "6379:6379"

tc-informix:
image: "appiriodevops/tc-database-scripts:latest"
hostname: ${DB_SERVER_NAME}
@@ -19,42 +14,6 @@ services:
- "27883:27883"
tty: true

tc-direct:
image: "appiriodevops/direct-app:latest"
links:
- "tc-informix"
- "tc-cache"
- "mock-services:tc-api.cloud.topcoder.com"
hostname: cockpit.cloud.topcoder.com
ports:
- "443:443"
- "8180:8180"
- "1198:1198"
- "1199:1199"
- "3973:3973"
- "5446:5446"

run-online-review:
image: "appiriodevops/online-review:run"
environment:
- JAVA_OPTS=-Xms256m -Xmx512m
- DISABLE_ONLINE_REVIEW=0
- DISABLE_AUTO_PILOT=0
- DISABLE_LATE_DELIVERABLES_TRACKER=1
- DISABLE_REVIEW_ASSIGNMENT=1
ports:
- "80:8080"
links:
- tc-informix:db
- "mock-services:tc-api.cloud.topcoder.com"
entrypoint: /root/scripts/run.sh

mock-services:
image: "appiriodevops/mock-services:latest"
ports:
- "8080:8080"
- "8443:8443"

kafka:
build:
context: ./docker/kafka
@@ -63,21 +22,18 @@ services:
ports:
- "9093:9093"
- "9092:9092"
# to override ssl related file
# volumes:
# - "./test/kafka-ssl:/kafka-ssl"
environment:
- "ENABLE_SSL=true"
- "TRUSTSTORE_PASSWORD=test1234"
- "KEYSTORE_PASSWORD=test1234"
- "KEY_PASSWORD=test1234"

mock-submission-api:
mock-api:
image: lsp-app:latest
container_name: mock-submission-api
container_name: mock-api
volumes:
- ".:/app"
command: run mock-submission-api
command: run mock-api

lsp-app-install:
build:
@@ -106,12 +62,12 @@ services:
- "3000:3300"
links:
- "kafka:kafka-host"
- "mock-submission-api:submission-api-host"
- "mock-api:mock-api-host"
- "tc-informix:informix"
depends_on:
- "kafka"
- "tc-informix"
- "mock-submission-api"
- "mock-api"

lsp-app-test:
image: lsp-app:latest
1 change: 1 addition & 0 deletions docker/legacy-submission-processor/Dockerfile
@@ -6,6 +6,7 @@ USER root
RUN mkdir /app
WORKDIR /home/informix

RUN sed -i '/jessie-updates/d' /etc/apt/sources.list
RUN apt-get -qq update && apt-get -qq install -y \
wget gcc g++ make xz-utils python2.7 git

109 changes: 0 additions & 109 deletions docs/Validation.md

This file was deleted.

138 changes: 0 additions & 138 deletions docs/Verification_with_DB.md

This file was deleted.

108 changes: 4 additions & 104 deletions index.js
@@ -1,115 +1,15 @@
/**
* The main entry of the application.
*/
const logger = require('./src/common/logger')
require('./src/bootstrap')
require('legacy-processor-module/bootstrap')

const _ = require('lodash')
const Kafka = require('no-kafka')
const config = require('config')
const util = require('util')
const healthcheck = require('topcoder-healthcheck-dropin')
const m2mAuth = require('tc-core-library-js').auth.m2m
const NewSubmissionService = require('./src/services/SubmissionService')
const IDGenerator = require('legacy-processor-module/IdGenerator')
const Informix = require('legacy-processor-module/Informix')
const KafkaConsumer = require('legacy-processor-module/KafkaConsumer')

logger.info(`KAFKA URL - ${config.KAFKA_URL}`)
const SubmissionService = require('./src/services/SubmissionService')

// db informix option
const dbOpts = {
database: config.DB_NAME,
username: config.DB_USERNAME,
password: config.DB_PASSWORD,
pool: {
min: 0,
max: 10
}
}

const db = new Informix(dbOpts)
const m2m = ((config.AUTH0_CLIENT_ID && config.AUTH0_CLIENT_SECRET) ? m2mAuth(_.pick(config, ['AUTH0_URL',
'AUTH0_AUDIENCE', 'TOKEN_CACHE_TIME', 'AUTH0_PROXY_SERVER_URL'
])) : null)

/**
* Handle the messages from Kafka.
* @param {Array<Object>} messages the messages
* @param {String} topic the topic
* @param {Number} partition the partition
* @private
*/
function handleMessages (messages, topic, partition) {
return Promise.each(messages, (m) => {
const messageValue = m.message.value ? m.message.value.toString('utf8') : null
const messageInfo = `message from topic ${topic}, partition ${partition}, offset ${m.offset}: ${messageValue}`

logger.debug(`Received ${messageInfo}`)

// Handle the event
return NewSubmissionService.handle(messageValue, db, m2m, idUploadGen, idSubmissionGen)
.then(() => {
logger.debug(`Completed handling ${messageInfo}`)

// Commit offset
return consumer.commitOffset({
topic, partition, offset: m.offset
})
.catch(err => {
logger.error(`Failed to commit offset for ${messageInfo}: ${err.message}`)
logger.error(util.inspect(err))
})
})
.catch(err => {
// Catch all errors thrown by the handler
logger.error(`Failed to handle ${messageInfo}: ${err.message}`)
logger.error(util.inspect(err))
})
})
}

const options = { connectionString: config.KAFKA_URL }
if (config.KAFKA_CLIENT_CERT && config.KAFKA_CLIENT_CERT_KEY) {
options.ssl = { cert: config.KAFKA_CLIENT_CERT, key: config.KAFKA_CLIENT_CERT_KEY }
}
const consumer = new Kafka.SimpleConsumer(options)

const idUploadGen = new IDGenerator(db, config.ID_SEQ_UPLOAD)
const idSubmissionGen = new IDGenerator(db, config.ID_SEQ_SUBMISSION)

// check if there is kafka connection alive
function check () {
if (!consumer.client.initialBrokers && !consumer.client.initialBrokers.length) {
return false
}
let connected = true
consumer.client.initialBrokers.forEach(conn => {
logger.debug(`url ${conn.server()} - connected=${conn.connected}`)
connected = conn.connected & connected
})
return connected
}

consumer
.init()
// consume configured topics
.then(() => {
healthcheck.init([check])
const topics = [config.KAFKA_NEW_SUBMISSION_TOPIC, config.KAFKA_UPDATE_SUBMISSION_TOPIC]
_.each(topics, (tp) => {
consumer.subscribe(tp, { time: Kafka.LATEST_OFFSET }, handleMessages)
})
})
.catch((err) => logger.error(err))
const consumer = KafkaConsumer.startConsumer(SubmissionService, [config.KAFKA_NEW_SUBMISSION_TOPIC, config.KAFKA_UPDATE_SUBMISSION_TOPIC])

if (process.env.NODE_ENV === 'test') {
module.exports = consumer
}
if (process.env.NODE_ENV === 'mock') {
// start mock server if NODE_ENV = mock
const {
mockServer
} = require('./test/mock-api')
mockServer.listen(config.MOCK_SERVER_PORT)
console.log(`mock api is listen port ${config.MOCK_SERVER_PORT}`)
}
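For context, a condensed sketch of the consumer loop that `KafkaConsumer.startConsumer` replaces, based on the inline code removed above (the actual module lives in legacy-processor-module):
```js
const Kafka = require('no-kafka')
const config = require('config')
const SubmissionService = require('./src/services/SubmissionService')

const options = { connectionString: config.KAFKA_URL }
if (config.KAFKA_CLIENT_CERT && config.KAFKA_CLIENT_CERT_KEY) {
  // no-kafka accepts either file paths or literal PEM content here
  options.ssl = { cert: config.KAFKA_CLIENT_CERT, key: config.KAFKA_CLIENT_CERT_KEY }
}

const consumer = new Kafka.SimpleConsumer(options)
consumer.init().then(() => {
  const topics = [config.KAFKA_NEW_SUBMISSION_TOPIC, config.KAFKA_UPDATE_SUBMISSION_TOPIC]
  topics.forEach(topic => {
    consumer.subscribe(topic, { time: Kafka.LATEST_OFFSET }, (messages, t, partition) =>
      Promise.all(messages.map(m =>
        SubmissionService.handle(JSON.parse(m.message.value.toString('utf8')))
          .then(() => consumer.commitOffset({ topic: t, partition, offset: m.offset }))
      ))
    )
  })
})
```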
1,447 changes: 444 additions & 1,003 deletions package-lock.json

Large diffs are not rendered by default.

18 changes: 8 additions & 10 deletions package.json
@@ -4,14 +4,14 @@
"description": "Topcoder - Legacy Submission Processor Application",
"main": "index.js",
"scripts": {
"start:docker": "npm --unsafe-perm install && node -r dotenv/config index",
"test:docker": "npm --unsafe-perm install && npm run test",
"start:docker": "node -r dotenv/config index",
"test:docker": "npm run test",
"clean": "rm -rf node_modules && rm -rf coverage && rm -rf .nyc_output",
"start": "node index",
"lint": "standard",
"lint:fix": "standard --fix",
"mock-submission-api": "node test/mock-submission-api",
"produce-test-event": "node test/produce-test-event",
"lint": "eslint .",
"lint:fix": "eslint --fix .",
"mock-api": "node node_modules/legacy-processor-module/mock/mock-api",
"produce-test-event": "node node_modules/legacy-processor-module/test/produce-test-event",
"test": "npm run lint && nyc --reporter=html --reporter=text mocha test/tests.js --timeout 20000 --exit"
},
"dependencies": {
@@ -30,13 +30,12 @@
"no-kafka": "^3.2.10",
"tc-core-library-js": "appirio-tech/tc-core-library-js.git#v2.6",
"topcoder-healthcheck-dropin": "^1.0.2",
"winston": "^2.4.2",
"flatted": "^2.0.0"
"winston": "^2.4.2"
},
"devDependencies": {
"eslint": "^5.16.0",
"mocha": "^5.2.0",
"nyc": "^12.0.2",
"standard": "^11.0.1",
"should": "^13.2.3"
},
"standard": {
@@ -51,7 +50,6 @@
},
"nyc": {
"exclude": [
"src/common/logger.js",
"test/*.js"
]
}
5 changes: 0 additions & 5 deletions src/bootstrap.js

This file was deleted.

49 changes: 0 additions & 49 deletions src/common/constant.js

This file was deleted.

15 changes: 0 additions & 15 deletions src/common/logger.js

This file was deleted.

216 changes: 108 additions & 108 deletions src/services/SubmissionService.js
@@ -1,138 +1,138 @@
/**
* The service to handle new submission events.
* The service to handle new submission events for non-MM challenge.
*/
const _ = require('lodash')
const Axios = require('axios')
const config = require('config')
const Flatted = require('flatted')
const Joi = require('joi')
const logger = require('../common/logger')
const { handleSubmission } = require('legacy-processor-module/AllSubmissionService')
const config = require("config");
const Joi = require("joi");

// Custom Joi type
Joi.id = () => Joi.number().integer().positive().required()
const logger = require("legacy-processor-module/common/logger");
const Schema = require("legacy-processor-module/Schema");
const LegacySubmissionIdService = require("legacy-processor-module/LegacySubmissionIdService");

// The event schema to validate events from Kafka
const eventSchema = Joi.object().keys({
topic: Joi.string().required(),
originator: Joi.string().required(),
timestamp: Joi.date().required(),
'mime-type': Joi.string().required(),
payload: Joi.object().keys({
id: Joi.alternatives().try(Joi.id(), Joi.string().uuid()).required(),
resource: Joi.alternatives().try(Joi.string().valid('submission'), Joi.string().valid('review')),
challengeId: Joi.id().optional(),
memberId: Joi.id().optional(),
submissionPhaseId: Joi.id().optional(),
url: Joi.string().uri().optional(),
type: Joi.string().optional(),
legacySubmissionId: Joi.number().integer().positive().optional(),
isExample: Joi.number().integer().valid(0, 1).optional(),
typeId: Joi.string().optional(),
score: Joi.number().min(0).max(100).optional(),
metadata: Joi.object().keys({
testType: Joi.string().required()
}).optional()
}).required().unknown(true)
})
const eventSchema = Schema.createEventSchema({
id: Joi.sid().required(),
resource: Joi.resource(),
challengeId: Joi.id().required(),
memberId: Joi.id().required(),
submissionPhaseId: Joi.id().required(),
type: Joi.string().required(),
url: Joi.string()
.uri()
.optional(),
legacySubmissionId: Joi.id().optional()
});

/**
* Get the subtrack for a challenge.
* @param {string} challengeId - The id of the challenge.
* @returns {string} The subtrack type of the challenge.
* Handle new submission and update submission event.
* @param {Object} event the event object
*/
async function getSubTrack (challengeId) {
try {
// attempt to fetch the subtrack
const result = await Axios.get(config.CHALLENGE_INFO_API.replace('{cid}', challengeId))
// use _.get to avoid access with undefined object
return _.get(result.data, 'result.content[0].subTrack')
} catch (err) {
if (err.response) { // non-2xx response received
logger.error(`Challenge Details API Error: ${Flatted.stringify({
data: err.response.data,
status: err.response.status,
headers: err.response.headers
}, null, 2)}`)
} else if (err.request) { // request sent, no response received
// may throw such error Converting circular structure to JSON if use native JSON.stringify
// https://github.com/axios/axios/issues/836
logger.error(`Challenge Details API Error (request sent, no response): ${Flatted.stringify(err.request, null, 2)}`)
} else {
logger.error(util.inspect(err))
}
async function handle(event) {
if (!event) {
logger.debug("Skipped null or empty event");
return;
}
}

/**
* Handle new submission message.
* @param {String} value the message value (JSON string)
* @param {Object} db the informix database
* @param {Object} m2m the m2m auth
* @param {IDGenerator} idUploadGen IDGenerator instance of upload
* @param {IDGenerator} idSubmissionGen IDGenerator instance of submission
*/
async function handle (value, db, m2m, idUploadGen, idSubmissionGen) {
if (!value) {
logger.debug('Skipped null or empty event')
return
// Check topic and originator
if (
event.topic !== config.KAFKA_NEW_SUBMISSION_TOPIC &&
event.topic !== config.KAFKA_UPDATE_SUBMISSION_TOPIC
) {
logger.debug(`Skipped event from topic ${event.topic}`);
return;
}

// Parse JSON string to get the event
let event
try {
event = JSON.parse(value)
} catch (err) {
logger.debug(`Skipped non well-formed JSON message: ${err.message}`)
return
if (event.originator !== config.KAFKA_NEW_SUBMISSION_ORIGINATOR) {
logger.debug(`Skipped event from originator ${event.originator}`);
return;
}

if (!event) {
logger.debug('Skipped null or empty event')
return
if (event.payload.resource !== "submission") {
logger.debug(`Skipped event from resource ${event.payload.resource}`);
return;
}

// Validate event
const validationResult = Joi.validate(event, eventSchema, { abortEarly: false, stripUnknown: true })
if (validationResult.error) {
const validationErrorMessage = _.map(validationResult.error.details, 'message').join(', ')
logger.debug(`Skipped invalid event, reasons: ${validationErrorMessage}`)
return
if (!Schema.validateEvent(event, eventSchema)) {
return;
}

// Check topic and originator
if (event.topic !== config.KAFKA_NEW_SUBMISSION_TOPIC && event.topic !== config.KAFKA_UPDATE_SUBMISSION_TOPIC) {
logger.debug(`Skipped event from topic ${event.topic}`)
return
}
// Attempt to retrieve the subTrack of the challenge
const subTrack = await LegacySubmissionIdService.getSubTrack(
event.payload.challengeId
);
logger.debug(
`Challenge ${event.payload.challengeId} get subTrack ${subTrack}`
);

if (event.originator !== config.KAFKA_NEW_SUBMISSION_ORIGINATOR) {
logger.debug(`Skipped event from originator ${event.originator}`)
return
}
const mmChallangeSubtracks = config.MM_CHALLENGE_SUBTRACK.split(",").map(x =>
x.trim()
);

if (event.payload.resource !== 'submission') {
logger.debug(`Skipped event from resource ${event.payload.resource}`)
return
// Skip MM challenge submissions
if (!subTrack || mmChallangeSubtracks.includes(subTrack)) {
logger.debug(`Skipped event for subTrack: ${subTrack}`);
return;
}

// will convert to Date object by Joi and assume UTC timezone by default
const timestamp = validationResult.value.timestamp.getTime()
if (event.topic === config.KAFKA_NEW_SUBMISSION_TOPIC) {
// Handle new submission
logger.debug(`Started adding submission for ${event.payload.id}`);
try {
const patchObject = await LegacySubmissionIdService.addSubmission(
event.payload.id,
event.payload.challengeId,
event.payload.memberId,
event.payload.submissionPhaseId,
event.payload.url,
event.payload.type
);

logger.debug(
`Successfully processed non MM message - Patched to the Submission API: id ${
event.payload.id
}, patch: ${JSON.stringify(patchObject)}`
);
} catch (error) {
logger.error(`Failed to handle ${JSON.stringify(event)}: ${error.message}`)
logger.error(error);
}
} else if (event.payload.url) {
// We are only concerned with updating the url,
// though the update event may not be caused by a url update

// attempt to retrieve the subTrack of the challenge
const subTrack = await getSubTrack(event.payload.challengeId)
logger.debug(`Challenge ${event.payload.challengeId} get subTrack ${subTrack}`)
const challangeSubtracks = config.CHALLENGE_SUBTRACK.split(',').map(x => x.trim())
let legacySubmissionId = event.payload.legacySubmissionId;
if (!legacySubmissionId) {
// In case legacySubmissionId not present, try to get it from submission API
const submission = await LegacySubmissionIdService.getSubmission(
event.payload.id
);
legacySubmissionId = submission.legacySubmissionId || 0;
}

// process all challenge submissions
if (subTrack && challangeSubtracks.includes(subTrack)) {
await handleSubmission(Axios, event, db, m2m, idUploadGen, idSubmissionGen, timestamp, true)
} else {
await handleSubmission(Axios, event, db, m2m, idUploadGen, idSubmissionGen, timestamp, false)
logger.debug(`Successful Processing of non MM challenge submission message: ${JSON.stringify(event, null, 2)}`)
logger.debug(
`Started updating URL for submission for ${legacySubmissionId}`
);
try {
await LegacySubmissionIdService.updateUpload(
event.payload.challengeId,
event.payload.memberId,
event.payload.submissionPhaseId,
event.payload.url,
event.payload.type,
legacySubmissionId
);
logger.debug(
`Successfully processed non MM message - Submission url updated, legacy submission id : ${legacySubmissionId} with url ${
event.payload.url
}`
);
} catch (error) {
logger.error(`Failed to handle ${JSON.stringify(event)}: ${error.message}`)
logger.error(error);
}
}
}

module.exports = {
handle
}
};
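A hypothetical direct invocation, handy when exercising the service outside Kafka (event shape per the schema above; ids are illustrative):
```js
const SubmissionService = require('./src/services/SubmissionService')

SubmissionService.handle({
  topic: 'submission.notification.create',
  originator: 'submission-api',
  timestamp: new Date().toISOString(),
  'mime-type': 'application/json',
  payload: {
    id: 111,
    resource: 'submission',
    challengeId: 30005530,
    memberId: 131530,
    submissionPhaseId: 95245,
    type: 'Contest Submission',
    url: 'http://content.topcoder.com/some/path'
  }
}).catch(console.error)
```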
16 changes: 0 additions & 16 deletions test/kafka-ssl/client.crt

This file was deleted.

27 changes: 0 additions & 27 deletions test/kafka-ssl/client.key

This file was deleted.

Binary file removed test/kafka-ssl/server.keystore.jks
Binary file removed test/kafka-ssl/server.truststore.jks
39 changes: 0 additions & 39 deletions test/mock-api.js

This file was deleted.

168 changes: 0 additions & 168 deletions test/mock-submission-api.js

This file was deleted.

155 changes: 0 additions & 155 deletions test/produce-test-event.js

This file was deleted.

3 changes: 0 additions & 3 deletions test/sql/prepare.sql

This file was deleted.

1,621 changes: 132 additions & 1,489 deletions test/sql/test.sql

Large diffs are not rendered by default.

5 changes: 0 additions & 5 deletions test/test_files/Test.java

This file was deleted.

15 changes: 0 additions & 15 deletions test/test_files/sqlParams.json

This file was deleted.

618 changes: 292 additions & 326 deletions test/tests.js

Large diffs are not rendered by default.