Miłosz Orzeł

.net, js/ts, html/css, arduino, java... no rants or clickbait.

The Daily Grind (Quick Tips on Miscellaneous Issues, Ep. 2)

Intro

Here's a second round of random issues and quick tips for working around them. Part one is here.

The issues:

 

Firebase app GitHub Actions deploy failing (IAM_PERMISSION_DENIED)

I added a feature that used a Firebase Cloud Storage bucket. It all worked well on the emulator, but when I wanted to deploy it to the test environment through GitHub Actions, the job failed with this error:

Error: Request to https://firebasestorage.googleapis.com/v1alpha/projects/example-project/defaultBucket had HTTP Error: 403, Permission 'firebasestorage.defaultBucket.get' denied on resource '//firebasestorage.googleapis.com/projects/example-project/defaultBucket' (or it may not exist).

"details": [
  {
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "reason": "IAM_PERMISSION_DENIED",
    "domain": "firebasestorage.googleapis.com",
    "metadata": {
      "resource": "projects/example-project/defaultBucket",
      "permission": "firebasestorage.defaultBucket.get"
    }
  }
]

It happened because the GitHub Actions workflow was using a service account that lacked the firebasestorage.defaultBucket.get permission. This permission is included in roles/firebasestorage.viewer, so that role should be added to the service account.

You can see your project's service accounts on the Project settings / Service accounts page in the Firebase console and manage them on the GCP IAM & Admin / IAM page (IAM stands for Identity and Access Management). My project uses Terraform and Terragrunt, so after adding the role through the GCP console (for a quick test) I had to add it to a .tf file and run terragrunt apply to update all the environments...
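For reference, granting that role in Terraform can look roughly like this (a minimal sketch with placeholder project and service account names; your module layout with Terragrunt will differ):

# Hypothetical example: grant the viewer role to the deploy service account
resource "google_project_iam_member" "firebasestorage_viewer" {
  project = "example-project"
  role    = "roles/firebasestorage.viewer"
  member  = "serviceAccount:github-actions-deploy@example-project.iam.gserviceaccount.com"
}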

 

SignatureDoesNotMatch error while fetching file from a bucket

I was fetching a file with an unusual extension in a React app (an in-house DSL used to program clinical trial questionnaires). To work correctly, it needed to be recognized as the text/plain MIME type instead of the default application/octet-stream. When the file was loaded from Firebase Hosting, I could set the 'Content-Type': 'text/plain' header while fetching. When I tried to do the same after switching to a Firebase Storage bucket, a 403 error with SignatureDoesNotMatch in the response started to appear.

The signature feature didn't like the fetch header setting, but it was possible to do without it by setting the Content-Type property in the uploaded file's metadata. To edit the metadata, you can go to the GCP storage browser and use the Bucket details -> Edit metadata menu.
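If you prefer the command line, gsutil can set the same metadata (the bucket and object names below are placeholders):

gsutil setmeta -h "Content-Type:text/plain" gs://your-bucket-name/path/to/your-file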

BTW: You might face another issue while working with cloud storage: your newly uploaded file will be reachable when you open the file link in a browser or do a GET with curl, but fetching it from your app might fail due to the lack of CORS settings. If you hit this issue, look here or search for how to use the gsutil cors set cors.json gs://your-bucket-name command.
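For completeness, a minimal cors.json for that command could look something like this (the origin and headers are just an example - tighten them for your app):

[
  {
    "origin": ["https://your-app.example.com"],
    "method": ["GET"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]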

 

No level config per handler in @datadog/browser-logs

Datadog has a pretty neat feature that allows sending logs from the browser to Datadog servers (including automatic reporting of unhandled errors and rejected promises). You might want to use the http handler that sends data over the network (debounced) and the console handler that forwards entries to the browser's console. You can set the log level (e.g. info vs debug), but the issue is that you cannot configure a different level for each handler. That's a bummer: it would be cool to use the info level for logs sent over the network but still see the debug logs in the console...

Fortunately, there is a simple workaround. datadogLogs has a beforeSend callback, and you can use it to filter out debug entries:

datadogLogs.init({
  // More init props here
  beforeSend: logsEvent => {
    // Drop debug entries from what gets sent to Datadog; console output is unaffected
    return logsEvent.status !== 'debug';
  },
});

const logger = datadogLogs.createLogger('example-logger', {
  level: 'debug',
  handler: ['http', 'console'],
});
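With that setup, both calls below show up in the browser console, but only the second one should reach Datadog (debug events get dropped by beforeSend):

logger.debug('Verbose detail');    // console only
logger.info('Something happened'); // console + sent to Datadog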

 

Grep skipping hidden files (fzf-lua+ripgrep in Neovim)

I like ripgrep and fzf a lot! They are great not only as standalone command line tools, but can also be combined to provide an awesome search experience in Vim/Neovim. For Vim there's the fzf.vim plugin, and for Neovim I can recommend fzf-lua.

I often use the live_grep command for a quick regexp search of the codebase (which can later be fine-tuned by a fuzzy search of the files that contain a match). I have this mapping:

vim.keymap.set('n', '<leader>sg', ':FzfLua live_grep<CR>', { desc = '[s]earch live [g]rep' })

I used to have such config for the fzf-lua plugin:

require('fzf-lua').setup { 'max-perf' }

but there was an issue: grepping would skip hidden files and directories (.env files, the .github folder, etc.). I think it's much safer and more useful to search the hidden files too, so now I'm using this config:

require('fzf-lua').setup {
  'max-perf',
  grep = {
    rg_opts = "--hidden --glob '!.git/' --color=never --no-heading --column -n --smart-case",
  },
}

With the above, live_grep still avoids scanning the .git folder, and node_modules is still skipped thanks to ripgrep's sensible defaults, but I won't miss occurrences in hidden files.

There's also a simpler option, if you don't mind .git being included in the search:

require('fzf-lua').setup {
  'max-perf',
  grep = {
    hidden = true,
  },
}

 

The Daily Grind (Quick Tips on Miscellaneous Issues, Ep. 1)

Intro

This post will be different from other texts I've put on this blog so far. Instead of digging deep into a single topic, I'll mention some issues that stumped me recently and suggest solutions. These issues might be quite niche, but I hope this will help at least one person get unstuck. Well, it might be me in the future. I'm 40 now - it's about time I started writing things down ;)

The issues:

 

Logout not working in React + Capacitor app (Auth Connect)

I was working on a PoC app with React and Capacitor for the Web and Android platforms, based on the Auth Connect tutorial. The app was initially using Auth0Provider to connect to Ionic's test client. It worked fine. Unfortunately, as soon as I switched to the OIDC auth used at my company, the logout action stopped working (while login still worked)! Logout was not invoking the correct URL; it looked like the end_session_endpoint property from the .well-known discovery document was ignored.

Switching from Auth0Provider to OktaProvider solved the issue. If more problems appear, it might be necessary to write a custom provider...
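The swap itself is small. Here's a rough sketch, assuming the @ionic-enterprise/auth API used in the tutorial (all option values are placeholders for your own OIDC client, and AuthConnect.setup is assumed to run at app startup):

import { AuthConnect, AuthResult, OktaProvider, ProviderOptions } from '@ionic-enterprise/auth';

// Placeholder options - use the values of your own OIDC client
const options: ProviderOptions = {
  clientId: 'your-client-id',
  discoveryUrl: 'https://auth.example.com/.well-known/openid-configuration',
  redirectUri: 'com.example.myapp://callback',
  logoutUrl: 'com.example.myapp://logout',
  scope: 'openid offline_access email',
  audience: '',
};

const provider = new OktaProvider(); // was: new Auth0Provider()

export const login = () => AuthConnect.login(provider, options);
export const logout = (authResult: AuthResult) => AuthConnect.logout(provider, authResult);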

Login action stuck in React + Capacitor app (Auth Connect)

While working on the same PoC app mentioned above, I faced an issue with login on Android (the Web version was all good). The first part of the login process worked fine: the user was redirected to the auth page and got a chance to enter an email and a one-time password. But after clicking the sign-in button, nothing happened (the user should be redirected back to the Android app). Keycloak logs showed a successful LOGIN event... Weird... I checked out an older version (based on the Auth Connect tutorial) which worked OK and compared the logs in Android Studio Logcat: the broken version lacked the App restarted, Auth Code, Login Token Provider and Fetch Token URL entries that follow a successful login...

It turned out that I didn't have the proper AUTH_URL_SCHEME setting in the android/variables.gradle file. If your Android URLs start with com.example.myapp://, then you should add the AUTH_URL_SCHEME = 'com.example.myapp' variable (BTW: for production apps, look into Deep Links). After fixing the setting, the login flow started to work correctly. Now the Keycloak events log shows not only the LOGIN entry but also CODE_TO_TOKEN following it.
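In android/variables.gradle that's just one extra entry in the ext block (the scheme below is a placeholder):

ext {
    // ...other Capacitor variables (minSdkVersion, compileSdkVersion, etc.)
    AUTH_URL_SCHEME = 'com.example.myapp'
}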

No details in Capacitor/Console logs

It's easy to get used to how good modern console logging is. If you write console.log('Some message', {someObject}), then Chrome will display the logged message and show expandable object details (which you can copy to the clipboard or save as a variable). Node will show the details too (formatted and colored)... Watch out though, because if you are targeting a native platform in Capacitor, the same console.log call will end up as a useless 'Some message', [object Object] line in Android Studio Logcat.

You can make the log more useful by creating a small utility that prepares the parameters in such a way that when the app runs on a native platform, the details object is serialized (you don't want to do it for the Web unless you are ready to lose the special object handling and see just a string instead):

import { Capacitor } from '@capacitor/core';

export const getLogArgs = (message: string, details: unknown) =>
  Capacitor.isNativePlatform()
    ? [message, JSON.stringify(details, null, 2)] // native: Logcat only shows strings, so serialize the details
    : [message, details]; // web: keep the object for the interactive console

// Usage:
console.log(...getLogArgs('Some message', { someObject }));

It's tempting to create a simple function that wraps a call to console.log, but that has a serious drawback: your log entries will be marked as coming from the file and line that contain the wrapper function instead of the place that logged the message.

Unable to start application on Android Virtual Device (activity class does not exist)

I occasionally get an "Activity class {com.example.something/com.example.something.MainActivity} does not exist" error while trying to start the application in the Android emulator from Android Studio. Restarting the virtual device and using the "Invalidate Caches" or "Reload All from Disk" options in Android Studio doesn't help. Rebuilding the project, doing npx cap sync, etc. also doesn't solve the issue. The error persists...

One thing that finally solves it is wiping the emulated device's data. Losing the data is a bit annoying (for example, I need to onboard a fingerprint again), but it's much better than not being able to test the app at all :) There's probably a better way, but I don't do that much on Android... You can find the "Wipe Data" option in the virtual device manager:

Wipe Data
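If you prefer the terminal, the emulator can also be started with its data wiped (the AVD name is a placeholder - list yours with emulator -list-avds):

emulator -avd Pixel_6_API_34 -wipe-data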

 

Problems with accessing internal NPM dependency

I work on projects that mix dependencies on publicly accessible NPM packages with internal packages from the GitHub Packages registry. Recently I couldn't get a project running after moving to a new laptop: Yarn was not able to install an internal package, and it did not provide useful info on why the authorization failed...

Your projects might be set up in a different way, but for me the issue was that my GITHUB_TOKEN was refreshed and lost SSO (single sign-on) access to my company's organization on GitHub. The issue was easy to spot after running this command:

curl -H "Authorization: Bearer ${GITHUB_TOKEN}" https://npm.pkg.github.com/your-org/your-package

because it gave this helpful information:

{"error":"Permission permission_denied: Resource protected by organization SAML enforcement. You must grant your Personal Access token access to this organization."}

Some other things worth checking if you use GITHUB_TOKEN and an internal NPM registry: verify that your shell gives you the proper (current) value of the token from the environment variable by running the env | grep GITHUB_TOKEN command. Also check that you have a good config in .npmrc and run npm login at least once (in my case the expected password was the token, not my normal login password!)...
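For reference, a typical .npmrc that mixes the public registry with GitHub Packages looks roughly like this (the org scope is a placeholder):

@your-org:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}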