This repository was archived by the owner on Mar 12, 2025. It is now read-only.

Commit 76d79f8

enrich disintegration
1 parent: 6c079bb

26 files changed: +1654 −294 lines

.gitignore

Lines changed: 1 addition & 0 deletions

```diff
@@ -0,0 +1 @@
+.env
```

README.md

Lines changed: 47 additions & 29 deletions

```diff
@@ -21,32 +21,49 @@
 
 Configuration for the application is at `config/default.js` and `config/production.js`. The following parameters can be set in config files or in env variables:
 
-- LOG_LEVEL: the log level
-- PORT: the server port
-- AUTH_SECRET: TC Authentication secret
-- VALID_ISSUERS: valid issuers for TC authentication
-- PAGE_SIZE: the default pagination limit
-- MAX_PAGE_SIZE: the maximum pagination size
-- API_VERSION: the API version
-- DB_NAME: the database name
-- DB_USERNAME: the database username
-- DB_PASSWORD: the database password
-- DB_HOST: the database host
-- DB_PORT: the database port
-- ES_HOST: Elasticsearch host
-- ES_REFRESH: Should elastic search refresh. Default is 'true'. Values can be 'true', 'wait_for', 'false'
-- ELASTICCLOUD_ID: The elastic cloud id, if your elasticsearch instance is hosted on elastic cloud. DO NOT provide a value for ES_HOST if you are using this
-- ELASTICCLOUD_USERNAME: The elastic cloud username for basic authentication. Provide this only if your elasticsearch instance is hosted on elastic cloud
-- ELASTICCLOUD_PASSWORD: The elastic cloud password for basic authentication. Provide this only if your elasticsearch instance is hosted on elastic cloud
-- ES.DOCUMENTS: Elasticsearch index, type and id mapping for resources.
-- SKILL_INDEX: The Elastic search index for skill. Default is `skill`
-- SKILL_ENRICH_POLICYNAME: The enrich policy for skill. Default is `skill-policy`
-- TAXONOMY_INDEX: The Elastic search index for taxonomy. Default is `taxonomy`
-- TAXONOMY_PIPELINE_ID: The pipeline id for enrichment with taxonomy. Default is `taxonomy-pipeline`
-- TAXONOMY_ENRICH_POLICYNAME: The enrich policy for taxonomy. Default is `taxonomy-policy`
-- MAX_BATCH_SIZE: Restrict number of records in memory during bulk insert (Used by the db to es migration script)
-- MAX_BULK_SIZE: The Bulk Indexing Maximum Limits. Default is `100` (Used by the db to es migration script)
+- `LOG_LEVEL`: the log level
+- `PORT`: the server port
+- `AUTH_SECRET`: TC Authentication secret
+- `VALID_ISSUERS`: valid issuers for TC authentication
+- `PAGE_SIZE`: the default pagination limit
+- `MAX_PAGE_SIZE`: the maximum pagination size
+- `API_VERSION`: the API version
+- `DB_NAME`: the database name
+- `DB_USERNAME`: the database username
+- `DB_PASSWORD`: the database password
+- `DB_HOST`: the database host
+- `DB_PORT`: the database port
+- `ES_HOST`: Elasticsearch host
+- `ES_REFRESH`: Whether Elasticsearch should refresh. Default is 'true'. Valid values are 'true', 'wait_for' and 'false'
+- `ELASTICCLOUD_ID`: The Elastic Cloud id, if your Elasticsearch instance is hosted on Elastic Cloud. DO NOT provide a value for ES_HOST if you are using this
+- `ELASTICCLOUD_USERNAME`: The Elastic Cloud username for basic authentication. Provide this only if your Elasticsearch instance is hosted on Elastic Cloud
+- `ELASTICCLOUD_PASSWORD`: The Elastic Cloud password for basic authentication. Provide this only if your Elasticsearch instance is hosted on Elastic Cloud
+- `ES.DOCUMENTS`: Elasticsearch index, type and id mapping for resources
+- `SKILL_INDEX`: The Elasticsearch index for skills. Default is `skill`
+- `TAXONOMY_INDEX`: The Elasticsearch index for taxonomies. Default is `taxonomy`
+- `MAX_BATCH_SIZE`: Restricts the number of records in memory during bulk insert (used by the db-to-es migration script)
+- `MAX_BULK_SIZE`: The maximum bulk indexing size. Default is `100` (used by the db-to-es migration script)
+
+- `AUTH0_URL`: Auth0 URL, used to get TC M2M token
+- `AUTH0_AUDIENCE`: Auth0 audience, used to get TC M2M token
+- `TOKEN_CACHE_TIME`: Auth0 token cache time, used to get TC M2M token
+- `AUTH0_CLIENT_ID`: Auth0 client id, used to get TC M2M token
+- `AUTH0_CLIENT_SECRET`: Auth0 client secret, used to get TC M2M token
+- `AUTH0_PROXY_SERVER_URL`: Proxy Auth0 URL, used to get TC M2M token
+
+- `BUSAPI_URL`: Topcoder Bus API URL
+- `KAFKA_ERROR_TOPIC`: The error topic to which the Bus API will publish any errors
+- `KAFKA_MESSAGE_ORIGINATOR`: The originator value for the Kafka messages
+- `SKILLS_ERROR_TOPIC`: Kafka topic for reporting operation errors
+
+**NOTE** AUTH0-related configuration is normally shared on the challenge forum.
+
+## DB and Elasticsearch In Docker
+- Navigate to the `docker-pgsql-es` folder. Rename `sample.env` to `.env` and change any values if required.
+- Run `docker-compose up -d` to have Docker instances of pgsql and Elasticsearch to use with the API
 
+**NOTE** To completely restart the services, run `docker-compose down --volumes` and then `docker-compose up`.
+Notice the `--volumes` argument is passed to the `docker-compose down` command to remove the volume that stores DB data. Without the `--volumes` argument the DB data will persist after the services are brought down.
 
 ## Local deployment
 
@@ -58,17 +75,16 @@ Setup your Postgresql DB and Elasticsearch instance and ensure that they are up
 - Run the migrations - `npm run migrations up`. This will create the tables.
 - Then run `npm run insert-data` and insert mock data into the database.
 - Run `npm run migrate-db-to-es` to sync data with ES.
-- Startup server `npm run start`
+- Startup server `npm run start:dev`
 
 ## Migrations
 
 Migrations are located under the `./scripts/db/` folder. Run `npm run migrations up` and `npm run migrations down` to apply the migrations or roll back earlier ones
 
 ## Local Deployment with Docker
+Setup your Postgresql DB and Elasticsearch instance and ensure that they are up and running.
 
-- Navigate to the directory `docker-pgsql-es` folder. Rename `sample.env` to `.env` and change any values if required.
-- Run `docker-compose up -d` to have docker instances of pgsql and elasticsearch to use with the api
-
+- Configure AUTH0-related parameters via ENV variables. Note that you normally don't need to change other configuration.
 - Create database using `npm run create-db`.
 - Run the migrations - `npm run migrations up`. This will create the tables.
 - Then run `npm run insert-data` and insert mock data into the database.
@@ -102,6 +118,8 @@ Migrations are located under the `./scripts/db/` folder. Run `npm run migrations
 | `npm run delete-data` | Delete the data from the database |
 | `npm run migrations up` | Run up migration |
 | `npm run migrations down` | Run down migration |
+| `npm run create-index` | Create Elasticsearch indexes. Use the `-- --force` flag to skip confirmation |
+| `npm run delete-index` | Delete Elasticsearch indexes. Use the `-- --force` flag to skip confirmation |
 | `npm run generate:doc:permissions` | Generate [permissions.html](docs/permissions.html) |
 | `npm run generate:doc:permissions:dev` | Generate [permissions.html](docs/permissions.html) on any changes (useful during development). |
```
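The README's pattern of environment-driven configuration with defaults can be sketched as follows. This is an illustrative sketch, not code from the repository; the `fromEnv` helper is hypothetical, and the `LOG_LEVEL`/`PORT` defaults shown are placeholders (only the `skill`, `taxonomy` and `100` defaults are stated in the README).

```javascript
// Minimal sketch of env-driven config with defaults, mirroring the README's
// parameter list. The helper and the LOG_LEVEL/PORT defaults are assumptions.
function fromEnv (name, defaultValue) {
  const value = process.env[name]
  return value === undefined ? defaultValue : value
}

const config = {
  LOG_LEVEL: fromEnv('LOG_LEVEL', 'debug'), // placeholder default
  PORT: parseInt(fromEnv('PORT', '3000'), 10), // placeholder default
  SKILL_INDEX: fromEnv('SKILL_INDEX', 'skill'), // README default
  TAXONOMY_INDEX: fromEnv('TAXONOMY_INDEX', 'taxonomy'), // README default
  MAX_BULK_SIZE: parseInt(fromEnv('MAX_BULK_SIZE', '100'), 10) // README default
}

console.log(config.SKILL_INDEX, config.MAX_BULK_SIZE)
```

Setting any of these variables in the environment (or in `.env`, per the Docker instructions) overrides the default.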

config/default.js

Lines changed: 16 additions & 5 deletions

```diff
@@ -20,6 +20,20 @@ module.exports = {
   DB_HOST: process.env.DB_HOST || 'localhost',
   DB_PORT: process.env.DB_PORT || 5432,
 
+  AUTH0_URL: process.env.AUTH0_URL,
+  AUTH0_AUDIENCE: process.env.AUTH0_AUDIENCE,
+  TOKEN_CACHE_TIME: process.env.TOKEN_CACHE_TIME,
+  AUTH0_CLIENT_ID: process.env.AUTH0_CLIENT_ID,
+  AUTH0_CLIENT_SECRET: process.env.AUTH0_CLIENT_SECRET,
+  AUTH0_PROXY_SERVER_URL: process.env.AUTH0_PROXY_SERVER_URL,
+
+  BUSAPI_URL: process.env.BUSAPI_URL || 'https://api.topcoder-dev.com/v5',
+
+  KAFKA_ERROR_TOPIC: process.env.KAFKA_ERROR_TOPIC || 'common.error.reporting',
+  KAFKA_MESSAGE_ORIGINATOR: process.env.KAFKA_MESSAGE_ORIGINATOR || 'skills-api',
+
+  SKILLS_ERROR_TOPIC: process.env.SKILLS_ERROR_TOPIC || 'skills.action.error',
+
   // ElasticSearch
   ES: {
     HOST: process.env.ES_HOST || 'http://localhost:9200',
@@ -35,14 +49,11 @@ module.exports = {
     DOCUMENTS: {
       skill: {
         index: process.env.SKILL_INDEX || 'skill',
-        type: '_doc',
-        enrichPolicyName: process.env.SKILL_ENRICH_POLICYNAME || 'skill-policy'
+        type: '_doc'
       },
       taxonomy: {
         index: process.env.TAXONOMY_INDEX || 'taxonomy',
-        type: '_doc',
-        pipelineId: process.env.TAXONOMY_PIPELINE_ID || 'taxonomy-pipeline',
-        enrichPolicyName: process.env.TAXONOMY_ENRICH_POLICYNAME || 'taxonomy-policy'
+        type: '_doc'
      }
    },
    MAX_BATCH_SIZE: parseInt(process.env.MAX_BATCH_SIZE, 10) || 10000,
```
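`MAX_BATCH_SIZE` bounds how many records the db-to-es migration script holds in memory, while `MAX_BULK_SIZE` caps the size of each bulk indexing request. How such limits are typically applied can be sketched as follows; this chunking helper is illustrative and not the repository's actual migration code:

```javascript
// Illustrative sketch: split an in-memory batch (bounded by MAX_BATCH_SIZE)
// into bulk indexing requests of at most `maxBulkSize` documents each.
function chunk (records, maxBulkSize) {
  const chunks = []
  for (let i = 0; i < records.length; i += maxBulkSize) {
    chunks.push(records.slice(i, i + maxBulkSize))
  }
  return chunks
}

// A batch of 250 records with MAX_BULK_SIZE=100 yields 3 bulk requests.
const batch = Array.from({ length: 250 }, (_, i) => ({ id: i }))
const bulks = chunk(batch, 100)
console.log(bulks.map((b) => b.length)) // [ 100, 100, 50 ]
```

Each chunk would then be sent as one Elasticsearch bulk request, keeping request sizes predictable regardless of how large the source table is.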

docker/sample.env

Lines changed: 5 additions & 0 deletions

```diff
@@ -6,3 +6,8 @@ DB_PORT=5432
 
 ES_HOST=http://host.docker.internal:9200
 PORT=3001
+
+AUTH0_CLIENT_ID=<auth0 client id>
+AUTH0_CLIENT_SECRET=<auth0 client secret>
+AUTH0_URL=<auth0 url>
+AUTH0_AUDIENCE=<auth0 audience>
```
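Since `sample.env` ships with placeholder AUTH0 values, a startup check can fail fast when they are left unset. The helper below is hypothetical (not part of the repository) and uses a plain object in place of `process.env` so the example is self-contained:

```javascript
// Hypothetical startup check: report which required AUTH0 variables are
// missing from an environment object (here a stub instead of process.env).
function missingVars (env, required) {
  return required.filter((name) => !env[name])
}

const required = ['AUTH0_CLIENT_ID', 'AUTH0_CLIENT_SECRET', 'AUTH0_URL', 'AUTH0_AUDIENCE']
// Stub env with only AUTH0_URL set, as if .env were partially filled in.
const missing = missingVars({ AUTH0_URL: 'https://example.auth0.com' }, required)
console.log(missing) // the three unset variable names
```

A real check would run against `process.env` at boot and throw (or log and exit) when `missing` is non-empty.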

docs/permissions.html

Lines changed: 2 additions & 2 deletions

```diff
@@ -263,7 +263,7 @@ <h2 class="anchor-container">
 <div class="row">
   <div class="col pt-5 pb-2">
     <h2 class="anchor-container">
-      <a href="#section-taxonomy-metadata" name="section-taxonomy-metadata" class="anchor"></a>Taxonomy Metadata
+      <a href="#section-taxonomy" name="section-taxonomy" class="anchor"></a>Taxonomy
     </h2>
   </div>
 </div>
@@ -360,7 +360,7 @@ <h2 class="anchor-container">
 <div class="row">
   <div class="col pt-5 pb-2">
     <h2 class="anchor-container">
-      <a href="#section-taxonomy" name="section-taxonomy" class="anchor"></a>Taxonomy
+      <a href="#section-taxonomy-metadata" name="section-taxonomy-metadata" class="anchor"></a>Taxonomy Metadata
     </h2>
   </div>
 </div>
```
