
Commit 00adcf0

Initial commit
0 parents · commit 00adcf0 · 23 files changed: +2829 −0 lines changed

.dockerignore

Lines changed: 4 additions & 0 deletions

node_modules/
.env
coverage/
.nyc_output/

Procfile

Lines changed: 1 addition & 0 deletions

worker: npm start

README.md

Lines changed: 227 additions & 0 deletions

# Topcoder - Member Processor

## Dependencies

- nodejs https://nodejs.org/en/ (v8+)
- Kafka
- ElasticSearch v6
- Docker, Docker Compose

## Configuration

Configuration for the processor is at `config/default.js`.
The following parameters can be set in config files or in env variables:
- DISABLE_LOGGING: whether to disable logging
- LOG_LEVEL: the log level; default value: 'debug'
- KAFKA_URL: comma separated Kafka hosts; default value: 'localhost:9092'
- KAFKA_CLIENT_CERT: Kafka connection certificate, optional; default value is undefined;
  if not provided, an SSL connection is not used and a direct insecure connection is used instead;
  if provided, it can be either a path to the certificate file or the certificate content
- KAFKA_CLIENT_CERT_KEY: Kafka connection private key, optional; default value is undefined;
  if not provided, an SSL connection is not used and a direct insecure connection is used instead;
  if provided, it can be either a path to the private key file or the private key content
- CREATE_PROFILE_TOPIC: create profile Kafka topic, default value is 'member.action.profile.create'
- UPDATE_PROFILE_TOPIC: update profile Kafka topic, default value is 'member.action.profile.update'
- DELETE_PROFILE_TOPIC: delete profile Kafka topic, default value is 'member.action.profile.delete'
- CREATE_TRAIT_TOPIC: create trait Kafka topic, default value is 'member.action.profile.trait.create'
- UPDATE_TRAIT_TOPIC: update trait Kafka topic, default value is 'member.action.profile.trait.update'
- DELETE_TRAIT_TOPIC: delete trait Kafka topic, default value is 'member.action.profile.trait.delete'
- CREATE_PHOTO_TOPIC: create photo Kafka topic, default value is 'member.action.profile.photo.create'
- UPDATE_PHOTO_TOPIC: update photo Kafka topic, default value is 'member.action.profile.photo.update'
- esConfig: ElasticSearch config

Refer to the `esConfig` variable in `config/default.js` for ES related configuration.
Usually you need to set the ES_HOST environment variable to match your ES setup, e.g.
`export ES_HOST=localhost:9200`

Also note that there is a `/health` endpoint that checks the health of the app. It sets up an Express server listening on the environment variable `PORT`. `PORT` is not part of the configuration file and needs to be passed as an environment variable.
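
A quick way to exercise the health check locally (a sketch; port 3000 is an arbitrary choice, not a project default):

```bash
# start the processor with the health-check port set
PORT=3000 npm start &

# probe the health endpoint
curl http://localhost:3000/health
```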

## Local Kafka setup

- `http://kafka.apache.org/quickstart` contains details to set up and manage a Kafka server;
  the steps below set up a Kafka server on Mac; on Windows, use the bat commands in bin/windows instead
- download Kafka at `https://www.apache.org/dyn/closer.cgi?path=/kafka/1.1.0/kafka_2.11-1.1.0.tgz`
- extract the downloaded tgz file
- go to the extracted directory kafka_2.11-1.1.0
- start the ZooKeeper server:
  `bin/zookeeper-server-start.sh config/zookeeper.properties`
- in another terminal, go to the same directory and start the Kafka server:
  `bin/kafka-server-start.sh config/server.properties`
- note that the ZooKeeper server is at localhost:2181, and the Kafka server is at localhost:9092
- in another terminal, go to the same directory and create the topics (a loop that creates all eight is sketched after this list):
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.create`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.update`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.delete`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.trait.create`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.trait.update`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.trait.delete`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.photo.create`
  `bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic member.action.profile.photo.update`
- verify that the topics are created:
  `bin/kafka-topics.sh --list --zookeeper localhost:2181`,
  it should list the created topics
- run the producer and then write messages into the console to send to the `member.action.profile.create` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic member.action.profile.create`
  in the console, write messages, one message per line:
  `{ "topic": "member.action.profile.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "email": "[email protected]", "sex": "male", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }`
- optionally, in another terminal, go to the same directory and start a consumer to view the messages:
  `bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic member.action.profile.create --from-beginning`
- writing/reading messages to/from other topics is similar
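
All eight `--create` commands above follow the same pattern, so they can be collapsed into a single loop; a minimal sketch, run from the Kafka directory and assuming the same local ZooKeeper at localhost:2181:

```bash
# create every topic the processor consumes: 1 partition, 1 replica each
for topic in member.action.profile.{create,update,delete} \
             member.action.profile.trait.{create,update,delete} \
             member.action.profile.photo.{create,update}; do
  bin/kafka-topics.sh --create --zookeeper localhost:2181 \
    --replication-factor 1 --partitions 1 --topic "$topic"
done
```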

## ElasticSearch setup

You may download ElasticSearch v6, install it and run it locally.
Or set up an ES service on AWS.
Another simple way is to use docker compose:
go to the `docker-es` folder and run `docker-compose up`
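
Whichever route you choose, a quick check that ES is reachable (a sketch, assuming the default local port 9200 used by `docker-es/docker-compose.yml`):

```bash
# a healthy node answers with its name, cluster and version info as JSON
curl http://localhost:9200
```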

## Local deployment

- install dependencies: `npm i`
- run the code lint check: `npm run lint`; running `npm run lint:fix` can fix some lint errors, if any
- initialize Elasticsearch, creating the configured Elasticsearch index if not present: `npm run init-es`
- or, to re-create the index: `npm run init-es force`
- run tests: `npm run test`
- start the processor app: `npm start`
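
End to end, a first local run therefore boils down to (a sketch, assuming Kafka and ElasticSearch are already running):

```bash
npm i            # install dependencies
npm run init-es  # create the configured ES index if it does not exist
npm start        # start the processor
```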

## Local Deployment with Docker

To run the Member ES Processor using docker, follow the steps below:

1. Navigate to the directory `docker`

2. Rename the file `sample.api.env` to `api.env`

3. Set the required AWS credentials in the file `api.env`

4. Once that is done, run the following command

```
docker-compose up
```

5. When you run the application for the first time, it will take some time to download the image and install the dependencies
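
Compose builds the image from the repository root (see `docker/docker-compose.yml` later in this commit) and runs `npm start`; after changing sources, force a rebuild (a sketch):

```bash
cd docker
docker-compose up --build
```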

## Unit tests and Integration tests

Integration tests use a different index, `member-test`, which may not be the same as the usual index.

Please ensure the index `member-test` (or the index specified in the environment variable `ES_INDEX_TEST`) is created before running the integration tests. You can re-use the existing scripts to create the index, but you need to set the environment variable below

```
export ES_INDEX=member-test
```
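
Putting that together, creating the test index with the existing init script looks like this (a sketch; `init-es` creates whatever index `ES_INDEX` names):

```bash
export ES_INDEX=member-test
npm run init-es   # creates the index named by ES_INDEX if it is not present
```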

#### Running unit tests and coverage

To run unit tests alone:

```
npm run test
```

To run unit tests with coverage report:

```
npm run cov
```

#### Running integration tests and coverage

To run integration tests alone:

```
npm run e2e
```

To run integration tests with coverage report:

```
npm run cov-e2e
```

## Verification

- start the Kafka server, start ElasticSearch, initialize Elasticsearch, start the processor app
- start kafka-console-producer to write messages to the `member.action.profile.create` topic (a non-interactive variant is sketched after this list):
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic member.action.profile.create`
- write message:
  `{ "topic": "member.action.profile.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "email": "[email protected]", "sex": "male", "created": "2018-02-16T00:00:00", "createdBy": "admin" } }`
- run the command `npm run view-data profile1111` to view the created data; you will see that the data was properly created:

```bash
info: Elasticsearch data:
info: {
  "userId": 1111,
  "userHandle": "handle",
  "email": "[email protected]",
  "sex": "male",
  "created": "2018-02-16T00:00:00",
  "createdBy": "admin",
  "resource": "profile"
}
```

- you may write an invalid message like:
  `{ "topic": "member.action.profile.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "user-id": "1111", "userHandle": "handle", "sex": "male", "created": "2018-01-02T00:00:00", "createdBy": "admin" } }`
- then in the app console, you will see an error message

- start kafka-console-producer to write messages to the `member.action.profile.update` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic member.action.profile.update`
- write message:
  `{ "topic": "member.action.profile.update", "originator": "member-api", "timestamp": "2018-03-02T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "email": "[email protected]", "sex": "male", "created": "2018-01-02T00:00:00", "createdBy": "admin", "updated": "2018-03-02T00:00:00", "updatedBy": "admin" } }`
- run the command `npm run view-data profile1111` to view the updated data; you will see that the data was properly updated:

```bash
info: Elasticsearch data:
info: {
  "userId": 1111,
  "userHandle": "handle",
  "email": "[email protected]",
  "sex": "male",
  "created": "2018-01-02T00:00:00",
  "createdBy": "admin",
  "resource": "profile",
  "updatedBy": "admin",
  "updated": "2018-03-02T00:00:00"
}
```

- start kafka-console-producer to write messages to the `member.action.profile.delete` topic:
  `bin/kafka-console-producer.sh --broker-list localhost:9092 --topic member.action.profile.delete`
- write message:
  `{ "topic": "member.action.profile.delete", "originator": "member-api", "timestamp": "2018-04-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle" } }`
- run the command `npm run view-data profile1111` to view the deleted data; you will see that the data was properly deleted:

```bash
info: The data is not found.
```

- management of the other data types is similar; below are valid Kafka messages for the other resource types, so that you may test them easily
- create trait:
  `{ "topic": "member.action.profile.trait.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "traitId": 123, "created": "2018-02-16T00:00:00", "createdBy": "admin" } }`
  `{ "topic": "member.action.profile.trait.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "traitId": 456, "created": "2018-02-16T00:00:00", "createdBy": "admin" } }`
- update trait:
  `{ "topic": "member.action.profile.trait.update", "originator": "member-api", "timestamp": "2018-02-17T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "traitId": 123, "created": "2018-02-16T00:00:00", "createdBy": "admin", "updated": "2018-02-17T00:00:00", "updatedBy": "admin" } }`
- delete trait:
  `{ "topic": "member.action.profile.trait.delete", "originator": "member-api", "timestamp": "2018-02-18T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "memberProfileTraitIds": [123, 456] } }`

- create photo:
  `{ "topic": "member.action.profile.photo.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "photoURL": "http://test.com/123.png", "created": "2018-02-16T00:00:00", "createdBy": "admin" } }`
- update photo:
  `{ "topic": "member.action.profile.photo.update", "originator": "member-api", "timestamp": "2018-02-17T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "photoURL": "http://test.com/456.png", "created": "2018-02-16T00:00:00", "createdBy": "admin", "updated": "2018-02-16T00:00:00", "updatedBy": "admin" } }`

- to view photo data, run the command `npm run view-data profile<userId>photo`, e.g. `npm run view-data profile1111photo`
- to view trait data, run the command `npm run view-data profile<userId>trait<traitId>`, e.g. `npm run view-data profile1111trait123`
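
As referenced above, each verification message can also be piped straight into the producer, which is handy for scripting these steps; a minimal sketch using the first create message:

```bash
# send the profile-create message without an interactive producer session
echo '{ "topic": "member.action.profile.create", "originator": "member-api", "timestamp": "2018-02-16T00:00:00", "mime-type": "application/json", "payload": { "userId": 1111, "userHandle": "handle", "email": "[email protected]", "sex": "male", "created": "2018-02-16T00:00:00", "createdBy": "admin" } }' \
  | bin/kafka-console-producer.sh --broker-list localhost:9092 --topic member.action.profile.create
```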

## Notes

- the processor adds a `resource` field (profile/photo/trait) to the message payload before indexing it in ElasticSearch;
  ('profile' + userId) is used to identify a profile,
  ('profile' + userId + 'photo') is used to identify a photo,
  ('profile' + userId + 'trait' + traitId) is used to identify a trait

config/default.js

Lines changed: 31 additions & 0 deletions

/**
 * The default configuration file.
 */

module.exports = {
  DISABLE_LOGGING: process.env.DISABLE_LOGGING || false, // If true, logging will be disabled
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',

  KAFKA_URL: process.env.KAFKA_URL || 'localhost:9092',
  // below are used for secure Kafka connection, they are optional
  // for the local Kafka, they are not needed
  KAFKA_CLIENT_CERT: process.env.KAFKA_CLIENT_CERT,
  KAFKA_CLIENT_CERT_KEY: process.env.KAFKA_CLIENT_CERT_KEY,

  CREATE_PROFILE_TOPIC: process.env.CREATE_PROFILE_TOPIC || 'member.action.profile.create',
  UPDATE_PROFILE_TOPIC: process.env.UPDATE_PROFILE_TOPIC || 'member.action.profile.update',
  DELETE_PROFILE_TOPIC: process.env.DELETE_PROFILE_TOPIC || 'member.action.profile.delete',
  CREATE_TRAIT_TOPIC: process.env.CREATE_TRAIT_TOPIC || 'member.action.profile.trait.create',
  UPDATE_TRAIT_TOPIC: process.env.UPDATE_TRAIT_TOPIC || 'member.action.profile.trait.update',
  DELETE_TRAIT_TOPIC: process.env.DELETE_TRAIT_TOPIC || 'member.action.profile.trait.delete',
  CREATE_PHOTO_TOPIC: process.env.CREATE_PHOTO_TOPIC || 'member.action.profile.photo.create',
  UPDATE_PHOTO_TOPIC: process.env.UPDATE_PHOTO_TOPIC || 'member.action.profile.photo.update',

  esConfig: {
    HOST: process.env.ES_HOST,
    AWS_REGION: process.env.AWS_REGION || 'us-east-1', // AWS Region to be used if we use AWS ES
    API_VERSION: process.env.ES_API_VERSION || '6.3',
    ES_INDEX: process.env.ES_INDEX || 'member-test',
    ES_TYPE: process.env.ES_TYPE || '_doc' // ES 6.x accepts only 1 Type per index and it's mandatory to define it
  }
}

config/production.js

Lines changed: 7 additions & 0 deletions

/**
 * Production configuration file
 */

module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'info'
}
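
The `config` package used by this repo layers an environment-specific file over `config/default.js` based on `NODE_ENV`, so these production overrides kick in with (a sketch):

```bash
# pick up config/production.js (LOG_LEVEL defaults to 'info' there)
NODE_ENV=production npm start
```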

config/test.js

Lines changed: 13 additions & 0 deletions

/**
 * Configuration file to be used while running tests
 */

module.exports = {
  DISABLE_LOGGING: false, // If true, logging will be disabled
  LOG_LEVEL: 'debug',
  esConfig: {
    ES_HOST: process.env.ES_HOST || 'https://test.es.com',
    ES_INDEX: process.env.ES_INDEX_TEST || 'member-test',
    ES_TYPE: process.env.ES_TYPE_TEST || '_doc' // ES 6.x accepts only 1 Type per index and it's mandatory to define it
  }
}
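
To point the tests at a live ES instance instead of the placeholder host, set the variables this file reads before running them (a sketch; the test setup may impose further requirements):

```bash
export ES_HOST=localhost:9200
export ES_INDEX_TEST=member-test
npm run test
```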

docker-es/docker-compose.yml

Lines changed: 6 additions & 0 deletions

version: "2"
services:
  esearch:
    image: "docker.elastic.co/elasticsearch/elasticsearch:6.3.1"
    ports:
      - "9200:9200"

docker/Dockerfile

Lines changed: 11 additions & 0 deletions

# Use the base image with Node.js 8.11.3
FROM node:8.11.3

# Copy the current directory into the Docker image
COPY . /member-processor-es

# Set working directory for future use
WORKDIR /member-processor-es

# Install the dependencies from package.json
RUN npm install
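
The compose file below builds this image with the repository root as the build context; the manual equivalent, run from the repo root with a hypothetical tag name, would be:

```bash
docker build -f docker/Dockerfile -t member-processor-es .
```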

docker/docker-compose.yml

Lines changed: 10 additions & 0 deletions

version: '3'
services:
  member-processor-es:
    build:
      context: ../
      dockerfile: docker/Dockerfile
    env_file:
      - api.env
    command: npm start
    network_mode: "host"

docker/sample.api.env

Lines changed: 3 additions & 0 deletions

AWS_ACCESS_KEY_ID=<AWS Access Key ID>
AWS_SECRET_ACCESS_KEY=<AWS Secret Access Key>
ES_HOST=<ES Host Endpoint>

package.json

Lines changed: 49 additions & 0 deletions

{
  "name": "tc-submission-processor",
  "version": "1.0.0",
  "description": "Topcoder - Submission Processor",
  "main": "src/app.js",
  "scripts": {
    "start": "node src/app.js",
    "lint": "standard",
    "lint:fix": "standard --fix",
    "init-es": "node src/init-es.js",
    "view-data": "node test/common/view-data.js",
    "test": "mocha test/unit/*.test.js --require test/unit/prepare.js --exit",
    "e2e": "mocha test/e2e/*.test.js --exit",
    "cov": "nyc --reporter=html --reporter=text mocha test/unit/*.test.js --require test/unit/prepare.js --exit",
    "cov-e2e": "nyc --reporter=html --reporter=text mocha test/e2e/*.test.js --exit"
  },
  "author": "TCSCODER",
  "license": "none",
  "devDependencies": {
    "chai": "^4.1.1",
    "mocha": "^3.5.0",
    "mocha-prepare": "^0.1.0",
    "nock": "^9.4.4",
    "nyc": "^12.0.2",
    "standard": "^11.0.1"
  },
  "dependencies": {
    "aws-sdk": "^2.286.2",
    "bluebird": "^3.5.1",
    "co": "^4.6.0",
    "config": "^1.21.0",
    "elasticsearch": "^15.1.1",
    "get-parameter-names": "^0.3.0",
    "http-aws-es": "^6.0.0",
    "joi": "^9.0.4",
    "lodash": "^4.17.10",
    "no-kafka": "^3.2.4",
    "topcoder-healthcheck-dropin": "^1.0.2",
    "winston": "^2.2.0"
  },
  "engines": {
    "node": "8.x"
  },
  "standard": {
    "env": [
      "mocha"
    ]
  }
}
