
Add execution_id to the labels of a manual stackdriver log entry #560


Closed
rhodgkins opened this issue Sep 20, 2019 · 15 comments

Comments

@rhodgkins
Contributor

Related issues

This is similar to #554, but I don't think it's the same.

[REQUIRED] Version info

node: v8.16
firebase-functions: 3.2.0
firebase-tools: 7.3.2
firebase-admin: 8.3.0
@google-cloud/logging: 5.3.1
@google-cloud/error-reporting: 1.1.1

[REQUIRED] Test case

const { Logging } = require('@google-cloud/logging')
const stackdriverLogger = new Logging()

const log = stackdriverLogger.log('errors', { removeCircular: true })

console.log('console.log: Hello')

// resourceLabels is assumed to hold function_name, project_id and region
// (it is not defined in this snippet; see the sketch below)
log.write(log.entry({
  resource: {
    type: 'cloud_function',
    labels: resourceLabels
  },
  severity: 'INFO'
}, '@google-cloud/logging: Hello'))
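
A minimal sketch of how resourceLabels could be built on the Node 8 runtime, assuming the legacy FUNCTION_NAME, GCP_PROJECT and FUNCTION_REGION environment variables are available (not part of the original snippet):

const resourceLabels = {
  function_name: process.env.FUNCTION_NAME, // e.g. 'users-intercom-userData'
  project_id: process.env.GCP_PROJECT,      // e.g. 'bookcreator-dev'
  region: process.env.FUNCTION_REGION       // e.g. 'us-central1'
}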

[REQUIRED] Steps to reproduce

I followed the setup info from here: https://firebase.google.com/docs/functions/reporting-errors#sending_to_stackdriver

Put the above code in a function (returning log.write(...)).

[REQUIRED] Expected behavior

In the Functions Logs page I'd expect @google-cloud/logging: Hello to show under the same execution as console.log: Hello, but it does not. There is also no "..." menu to the far right of that log entry.

[REQUIRED] Actual behavior

It seems that the @google-cloud/logging entry is missing labels.execution_id, which a console.log entry automatically has.

Is there any way to get this? I couldn't find it in an environment variable, and it isn't the function's event context eventId either.

console.log: Hello from Stackdriver logging:

{
 insertId: "000001-95cfaf51-ffab-474e-8b73-bc1809bd7b8c"  
 labels: {
  execution_id: "753669694570642"   
 }
 logName: "projects/bookcreator-dev/logs/cloudfunctions.googleapis.com%2Fcloud-functions"  
 receiveTimestamp: "2019-09-20T11:33:16.298554054Z"  
 resource: {
  labels: {
   function_name: "users-intercom-userData"    
   project_id: "bookcreator-dev"    
   region: "us-central1"    
  }
  type: "cloud_function"   
 }
 severity: "INFO"  
 textPayload: "console.log: Hello"  
 timestamp: "2019-09-20T11:33:11.812Z"  
 trace: "projects/bookcreator-dev/traces/1ce745b577485e3bd81a77a88810badd"  
}

@google-cloud/logging: Hello from Stackdriver logging:

{
 insertId: ".........0kfYGDy8d8E4O5gD7igjF1n"  
 logName: "projects/bookcreator-dev/logs/errors"  
 receiveTimestamp: "2019-09-20T11:33:11.871903937Z"  
 resource: {
  labels: {
   function_name: "users-intercom-userData"    
   project_id: "bookcreator-dev"    
   region: "us-central1"    
  }
  type: "cloud_function"   
 }
 severity: "INFO"  
 textPayload: "@google-cloud/logging: Hello"  
 timestamp: "2019-09-20T11:33:11.812000036Z"  
}

Were you able to successfully deploy your functions?

Yes

@google-oss-bot
Collaborator

I couldn't figure out how to label this issue, so I've labeled it for a human to triage. Hang tight.

@Elyx0

Elyx0 commented Nov 8, 2019

Looks pretty similar to my case, let's hope it gets triaged fast

@cmditch

cmditch commented Jan 12, 2020

Also seeing this.

@scottypickering

I'm also having trouble with this issue.

In the meantime, I found a few relevant Stack Overflow threads that contain decent workarounds.

  • For HTTP functions, there is a Function-Execution-Id header in each request. I suspect this applies to callable functions as well. Read more here.
  • For Pub/Sub functions, there is an EventID field in the context metadata that appears to correspond to the execution_id. Read more here.

By logging the execution ID this way, you can create a filter to see all the logs from the same execution in the Stackdriver Logging console. Unfortunately, I don't believe it works natively in the Firebase Functions console.
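
A minimal sketch of the HTTP-header variant of that workaround, assuming an HTTP function (the export name demo and log name errors are placeholders; as noted above, the label is only useful for filtering in the Stackdriver Logging console):

const functions = require('firebase-functions')
const { Logging } = require('@google-cloud/logging')

const logging = new Logging()
const log = logging.log('errors')

exports.demo = functions.https.onRequest(async (req, res) => {
  // HTTP functions receive the execution ID as a request header
  const executionId = req.get('Function-Execution-Id')

  // Stamp the ID onto the manual entry so it can be filtered on later
  const entry = log.entry({
    resource: { type: 'cloud_function' },
    labels: { execution_id: executionId },
    severity: 'INFO'
  }, 'manual entry correlated via Function-Execution-Id')

  await log.write(entry)
  res.sendStatus(200)
})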

@rhodgkins
Contributor Author

Is it possible to get an update / workaround for this? It's been open for almost 6 months now without any triaging from the Firebase team...

@Elyx0

Elyx0 commented Mar 18, 2020

Hey guys can you look at this from home? 🤓

@andieromero

We apologize for the delay. I know the CF team is aware of the issue, but they don't have plans to fix this in the near future. Do you mind submitting a customer support issue to Cloud Functions directly to get help with a workaround? https://cloud.google.com/support/docs/issue-trackers#feature_requests Thanks!

@andieromero andieromero added the Needs: Author Feedback Issues awaiting author feedback label Mar 20, 2020
@rhodgkins
Contributor Author

Done https://issuetracker.google.com/issues/152153718

@Elyx0

Elyx0 commented Apr 16, 2020

@rhodgkins Did you find a solution?
@scottypickering You're wrong about the link you sent, as Josh says in the thread:

execution_id should change on every call, but your identifier will only change when the function environment is restarted

It's much harder to have a non-global execution ID that you have to pass around everywhere, but stashing it in a global isn't safe either, since for the same container being reused the behavior can be (sketched below):

  • Execution 1 runs and logs: it grabs the execution ID and sets the global.
  • Execution 1 hits an await.
  • Execution 2 runs and logs: it grabs its execution ID and sets the global.
  • Execution 1 resumes, but the grabbed execution ID is now set to Execution 2's ID.
  • Execution 1 wrongly logs under Execution 2's ID.
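
A minimal sketch of that interleaving in plain Node (no Functions APIs; the IDs and timing are made up, and it assumes executions could overlap inside one container):

let currentExecutionId = null // the shared global

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

async function handleRequest(executionId) {
  currentExecutionId = executionId // grab the execution ID and set the global
  await sleep(10)                  // the execution hits an await
  // If another execution started in the meantime, the global now holds its ID
  console.log(`execution ${executionId} logs with id ${currentExecutionId}`)
}

// Execution 1 resumes after execution 2 has overwritten the global,
// so it logs with the wrong ID.
handleRequest('execution-1')
handleRequest('execution-2')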

@scottypickering

@Elyx0 Interesting, I've always thought that function executions run in sequence on a given container, regardless of whether they are "awaiting" or not. In your scenario, I would expect Execution 2 to cause a new instance to spin up, rather than running on the same container as Execution 1. Only when Execution 1 completely finishes would I expect an additional execution to run on that instance. Could you point me towards a resource that explains otherwise?

For what it's worth, @CodingDoug says here that, for a single instance, events are processed in serial. I'm admittedly not sure that means what I think it means though.

@rhodgkins
Contributor Author

rhodgkins commented Apr 16, 2020

@Elyx0 I'm currently using the eventId from a PubSub message, as that corresponds to the execution_id. I can't set the correct label in the logs, though, so I have a filter that looks for both. Neither of these is an ideal solution.

I'd expect this to work out of the box, but it seems development on Firebase Functions has stalled somewhat (Node 10 support is still in beta despite Node 10 going LTS 6 months ago, and a lot of @google-cloud npm packages now seem to be dropping Node 8 support...)!
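
A minimal sketch of the Pub/Sub eventId workaround described above, assuming the ID is stamped onto the manual entry under a custom label (the label name event_id and the topic name are placeholders):

const functions = require('firebase-functions')
const { Logging } = require('@google-cloud/logging')

const logging = new Logging()
const log = logging.log('errors')

exports.onMessage = functions.pubsub.topic('some-topic').onPublish(async (message, context) => {
  // For Pub/Sub triggers the eventId corresponds to the execution_id
  const entry = log.entry({
    resource: { type: 'cloud_function' },
    labels: { event_id: context.eventId }, // custom label; not the platform-set execution_id
    severity: 'INFO'
  }, 'manual entry tagged with the Pub/Sub eventId')
  await log.write(entry)
})

In the Stackdriver Logging console, a filter over both labels.execution_id and labels.event_id then pulls in the automatic and the manual entries for one execution, matching the "filter that looks for both" described above.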

@Elyx0

Elyx0 commented Apr 17, 2020

@scottypickering You were right: https://cloud.google.com/functions/docs/concepts/exec#auto-scaling_and_concurrency, so I guess it's safe to use a global for now.

@Elyx0

Elyx0 commented Apr 17, 2020

@rhodgkins Did you ever hit Exception from a finished function: Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information. when trying to log from @google-cloud/logging? I'm hitting it intermittently.

@rhodgkins
Contributor Author

rhodgkins commented Jun 8, 2020

Is there any update on this now that functions are on Node 10 (and the structured logging API has been added in #665)?

This is much better (I'd go as far as to say fixed, at least for my use of it!) from 3.7.x using Node 10, as structured logging means that @google-cloud/logging is no longer needed; you can just use require('firebase-functions').logger instead. It supports most of the fields (except labels for the moment), and if used with @google-cloud/error-reporting the stack traces can be reported.
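
For reference, a minimal sketch of the structured logging approach described above (firebase-functions 3.7+ on Node 10; the extra field names are just examples):

const { logger } = require('firebase-functions')

// Simple structured entry; extra objects become structured fields of the entry
logger.info('structured: Hello', { userId: 'some-user-id' })

// logger.write() takes an explicit severity plus arbitrary structured fields
logger.write({
  severity: 'ERROR',
  message: 'something went wrong',
  errorCode: 42
})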

@taeold
Contributor

taeold commented Jun 21, 2021

I separately confirmed that execution_id is now being correctly piped through to Cloud Logging! Thanks everyone for your patience.

@taeold taeold closed this as completed Jun 21, 2021