6.3.10 timeout issues #1361

Open
philipwindsora55 opened this issue Aug 6, 2024 · 42 comments

Comments

@philipwindsora55

Which version of Duende IdentityServer are you using?
6.3.10

Which version of .NET are you using?
6.0

Describe the bug
We patched IdentityServer as per the CVE. We've gone from 6.3.5 to 6.3.10.

I'm not saying for sure that this patch caused the issues, but they only started happening after I applied it, so I'm reaching out to see if anyone else has experienced problems.

Our UI is working fine; we can log in and edit details. However, the following endpoints are timing out:

/.well-known/openid-configuration
/.well-known/openid-configuration/jwks

The timeouts have occurred across our dev and test instances over the past few days. I've just rebooted our test instance and /.well-known/openid-configuration is loading again.

There are no logs showing in Application Insights; I'm also in the process of raising a ticket with Azure support.

@philipwindsora55
Author

I've turned on trace logging for our test instance and will post the logs once we have them. This might give more insight into why these endpoints are timing out.

@sanket-mistry-jm

sanket-mistry-jm commented Aug 12, 2024

We are running into the same situation in all of our environments. We had this issue with the pre-release version as well as the publicly released version. Rebooting seems to resolve it, but the issue comes back after a few days and seems consistent.

Worth noting: we rolled back to version 6.3.8 and the issue was gone.

What I have noticed is that the issue mostly starts occurring around UTC midnight, in our lower environments as well as production as far as I can tell. I hope this helps the investigation.

@josephdecock
Member

Thanks for this report. We're investigating. My initial suspicion is that this is related to 6.3.9's update of our dependencies on ASP.NET framework packages. In that version, we updated framework packages from version 6.0.0 to version 6.0.26. This updates our transitive dependency on the System.IdentityModel.Tokens.Jwt and Microsoft.IdentityModel.JsonWebTokens packages past versions that have a known Denial of Service vulnerability.

@sanket-mistry-jm it would be extremely helpful if you could try to reproduce the issue in your environment with the 6.3.9 build of IdentityServer.

In the meantime, since this is preventing application of the hotfix, my recommendation is to apply the workarounds that we describe in the advisory: basically, your UI code should not rely on the interaction service to determine whether URLs are local. You should use the IsLocalUrl API from ASP.NET instead.
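
To illustrate, here is a minimal sketch of that workaround in a login action. It is only a sketch: LoginInputModel and the redirect targets are placeholders, and the point is simply to validate the return URL with ASP.NET Core's built-in Url.IsLocalUrl check instead of the interaction service.

// Sketch only: validate returnUrl with ASP.NET Core's built-in helper
// rather than relying on the IdentityServer interaction service.
[HttpPost]
[ValidateAntiForgeryToken]
public IActionResult Login(LoginInputModel model)
{
    // ... sign the user in ...

    if (Url.IsLocalUrl(model.ReturnUrl))
    {
        return Redirect(model.ReturnUrl);
    }

    // Fall back to a safe local default when the return URL is not local.
    return Redirect("~/");
}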

@philipwindsora55
Author

Hi @josephdecock, thanks for looking into this. Is it likely that the same issue affects 7.x.x versions? We are planning to upgrade to .NET 8 and v7 in the very near future.

@sanket-mistry-jm

@josephdecock, I will see what I can do and report back here.

@sanket-mistry-jm

FYI, we ran into the same issue again in Production. Again, the issue started at almost 12:00 UTC today.

@philipwindsora55
Author

Our test instance started timing out again today; here are the logs.

Logs from a request that times out

13/08/2024, 09:39:50.064 - Request starting HTTP/1.1 GET https://[redacted]/.well-known/openid-configuration - -
13/08/2024, 09:39:50.065 - All hosts are allowed.
13/08/2024, 09:39:50.065 - Adding HSTS header to response.
13/08/2024, 09:39:50.065 - The request path /.well-known/openid-configuration does not match a supported file type
13/08/2024, 09:39:50.065 - No candidates found for the request path '/.well-known/openid-configuration'
13/08/2024, 09:39:50.065 - Request did not match any endpoints
13/08/2024, 09:39:50.065 - AuthenticationScheme: idsrv was not authenticated.
13/08/2024, 09:39:50.066 - AuthenticationScheme: idsrv was not authenticated.
13/08/2024, 09:39:50.066 - Request path /.well-known/openid-configuration matched to endpoint type Discovery
13/08/2024, 09:39:50.066 - Endpoint enabled: Discovery, successfully created handler: Duende.IdentityServer.Endpoints.DiscoveryEndpoint
13/08/2024, 09:39:50.066 - Invoking IdentityServer endpoint: Duende.IdentityServer.Endpoints.DiscoveryEndpoint for /.well-known/openid-configuration
13/08/2024, 09:39:50.066 - Processing discovery request.
13/08/2024, 09:39:50.066 - Start discovery request
13/08/2024, 09:39:50.066 - Calling into discovery response generator: Duende.IdentityServer.ResponseHandling.DiscoveryResponseGenerator
13/08/2024, 09:39:50.066 - Getting all the keys.
13/08/2024, 09:39:50.066 - Cache miss when loading all keys.
13/08/2024, 09:39:50.066 - Loading keys from store.
13/08/2024, 09:39:50.067 - Entity Framework Core 6.0.5 initialized 'PersistedGrantDbContext' using provider 'Microsoft.EntityFrameworkCore.SqlServer:6.0.5' with options: None
13/08/2024, 09:39:50.067 - Creating DbCommand for 'ExecuteReader'.
13/08/2024, 09:39:50.067 - Created DbCommand for 'ExecuteReader' (0ms).
13/08/2024, 09:39:50.067 - Opening connection to database '[redacted]' on server '[redacted]'.
13/08/2024, 09:39:50.067 - Opened connection to database '[redacted]' on server '[redacted]'.
"13/08/2024, 09:39:50.067 - Executing DbCommand [Parameters=[], CommandType='""Text""', CommandTimeout='30']
SELECT [k].[Id], [k].[Algorithm], [k].[Created], [k].[Data], [k].[DataProtected], [k].[IsX509Certificate], [k].[Use], [k].[Version]
FROM [Keys] AS [k]
WHERE [k].[Use] = N'signing'"
"13/08/2024, 09:39:50.069 - Executed DbCommand (1ms) [Parameters=[], CommandType='""Text""', CommandTimeout='30']
SELECT [k].[Id], [k].[Algorithm], [k].[Created], [k].[Data], [k].[DataProtected], [k].[IsX509Certificate], [k].[Use], [k].[Version]
FROM [Keys] AS [k]
WHERE [k].[Use] = N'signing'"
13/08/2024, 09:39:50.069 - A data reader was disposed.
13/08/2024, 09:39:50.069 - Closing connection to database '[redacted]' on server '[redacted]'.
13/08/2024, 09:39:50.069 - Closed connection to database '[redacted]' on server '[redacted]'.
13/08/2024, 09:39:50.069 - Performing unprotect operation to key {528a74a2-4935-449f-bf9b-99c525077820} with purposes ('C:\home\site\wwwroot', 'DataProtectionKeyProtector').
13/08/2024, 09:39:50.069 - Performing unprotect operation to key {9ff190bf-9bb3-4ec1-9314-95bb4692d19c} with purposes ('C:\home\site\wwwroot', 'DataProtectionKeyProtector').

Logs from a successful request for comparison.

12/08/2024, 09:40:26.426 -  Request starting HTTP/1.1 GET https://[Redacted]/.well-known/openid-configuration - -
12/08/2024, 09:40:26.426 -  All hosts are allowed.
12/08/2024, 09:40:26.426 -  Adding HSTS header to response.
12/08/2024, 09:40:26.426 -  The request path /.well-known/openid-configuration does not match a supported file type
12/08/2024, 09:40:26.426 -  No candidates found for the request path '/.well-known/openid-configuration'
12/08/2024, 09:40:26.426 -  Request did not match any endpoints
12/08/2024, 09:40:26.426 -  CORS request made for path: /.well-known/openid-configuration from origin: https://[Redacted]
12/08/2024, 09:40:26.426 -  Cache hit for https://[Redacted]
12/08/2024, 09:40:26.426 -  CorsPolicyService allowed origin: https://[Redacted]
12/08/2024, 09:40:26.427 -  The request has an origin header: 'https://[Redacted]'.
12/08/2024, 09:40:26.427 -  CORS policy execution successful.
12/08/2024, 09:40:26.427 -  AuthenticationScheme: idsrv was not authenticated.
12/08/2024, 09:40:26.427 -  AuthenticationScheme: idsrv was not authenticated.
12/08/2024, 09:40:26.427 -  Request path /.well-known/openid-configuration matched to endpoint type Discovery
12/08/2024, 09:40:26.428 -  Endpoint enabled: Discovery, successfully created handler: Duende.IdentityServer.Endpoints.DiscoveryEndpoint
12/08/2024, 09:40:26.428 -  Invoking IdentityServer endpoint: Duende.IdentityServer.Endpoints.DiscoveryEndpoint for /.well-known/openid-configuration
12/08/2024, 09:40:26.428 -  Processing discovery request.
12/08/2024, 09:40:26.428 -  Start discovery request
12/08/2024, 09:40:26.428 -  Calling into discovery response generator: Duende.IdentityServer.ResponseHandling.DiscoveryResponseGenerator
12/08/2024, 09:40:26.428 -  Getting all the keys.
12/08/2024, 09:40:26.428 -  Cache hit when loading all keys.
12/08/2024, 09:40:26.428 -  Looking for active signing keys.
12/08/2024, 09:40:26.428 -  Looking for an active signing key for alg RS256.
12/08/2024, 09:40:26.428 -  Checking if key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is active (respecting activation delay).
12/08/2024, 09:40:26.428 -  Key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is inactive: the current time is prior to its activation delay.
12/08/2024, 09:40:26.428 -  Checking if key with kid DDA1ACB7B81465052B036C1C2DC59864 is active (respecting activation delay).
12/08/2024, 09:40:26.428 -  Key with kid DDA1ACB7B81465052B036C1C2DC59864 is active.
12/08/2024, 09:40:26.428 -  Active signing key found (respecting the activation delay) with kid: DDA1ACB7B81465052B036C1C2DC59864.
12/08/2024, 09:40:26.428 -  Found active signing key for alg RS256 with kid DDA1ACB7B81465052B036C1C2DC59864.
12/08/2024, 09:40:26.428 -  Checking if key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is active (respecting activation delay).
12/08/2024, 09:40:26.428 -  Key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is inactive: the current time is prior to its activation delay.
12/08/2024, 09:40:26.428 -  Checking if key with kid DDA1ACB7B81465052B036C1C2DC59864 is active (respecting activation delay).
12/08/2024, 09:40:26.428 -  Key with kid DDA1ACB7B81465052B036C1C2DC59864 is active.
12/08/2024, 09:40:26.428 -  Active signing key found (respecting the activation delay) with kid: DDA1ACB7B81465052B036C1C2DC59864.
12/08/2024, 09:40:26.429 -  Key rotation not required for alg RS256; New key expected to be created in "73.22:23:02"
12/08/2024, 09:40:26.429 -  Cache hit for __all__
12/08/2024, 09:40:26.429 -  Getting the current key.
12/08/2024, 09:40:26.429 -  Cache hit when loading all keys.
12/08/2024, 09:40:26.429 -  Looking for active signing keys.
12/08/2024, 09:40:26.429 -  Looking for an active signing key for alg RS256.
12/08/2024, 09:40:26.429 -  Checking if key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is active (respecting activation delay).
12/08/2024, 09:40:26.429 -  Key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is inactive: the current time is prior to its activation delay.
12/08/2024, 09:40:26.429 -  Checking if key with kid DDA1ACB7B81465052B036C1C2DC59864 is active (respecting activation delay).
12/08/2024, 09:40:26.429 -  Key with kid DDA1ACB7B81465052B036C1C2DC59864 is active.
12/08/2024, 09:40:26.429 -  Active signing key found (respecting the activation delay) with kid: DDA1ACB7B81465052B036C1C2DC59864.
12/08/2024, 09:40:26.429 -  Found active signing key for alg RS256 with kid DDA1ACB7B81465052B036C1C2DC59864.
12/08/2024, 09:40:26.429 -  Checking if key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is active (respecting activation delay).
12/08/2024, 09:40:26.429 -  Key with kid 6A8BD84B554DEA5B5A9FAF797D1D1736 is inactive: the current time is prior to its activation delay.
12/08/2024, 09:40:26.429 -  Checking if key with kid DDA1ACB7B81465052B036C1C2DC59864 is active (respecting activation delay).
12/08/2024, 09:40:26.431 -  Key with kid DDA1ACB7B81465052B036C1C2DC59864 is active.
12/08/2024, 09:40:26.431 -  Active signing key found (respecting the activation delay) with kid: DDA1ACB7B81465052B036C1C2DC59864.
12/08/2024, 09:40:26.431 -  Key rotation not required for alg RS256; New key expected to be created in "73.22:23:02"
12/08/2024, 09:40:26.431 -  Active signing key found with kid DDA1ACB7B81465052B036C1C2DC59864 for alg RS256. Expires in "11.22:15:08". Retires in "25.22:15:08"
12/08/2024, 09:40:26.431 -  Invoking result: Duende.IdentityServer.Endpoints.Results.DiscoveryDocumentResult
12/08/2024, 09:40:26.431 -  Request finished HTTP/1.1 GET https://[Redacted]/.well-known/openid-configuration - - - 200 - application/json;+charset=UTF-8 5.6007ms

@josephdecock
Member

Just wanted to update everyone that we are continuing our investigation. We don't have a fix yet unfortunately, but we'll keep you all updated as we go.

@josephdecock
Member

The picture that I'm getting is that

  1. there is an intermittent time-out on the discovery endpoint
  2. the timeout goes away after a restart
  3. the timeouts always occur around UTC 12:00

Has anyone in this thread seen it happen at other times of day?

@sanket-mistry-jm

sanket-mistry-jm commented Aug 15, 2024

The picture that I'm getting is that

  1. there is an intermittent time-out on the discovery endpoint
  2. the timeout goes away after a restart
  3. the timeouts always occur around UTC 12:00

Has anyone in this thread seen it happen at other times of day?

@josephdecock, for us pretty much all endpoints start delaying by x seconds, eventually going to x minutes. In our case, /connect/authorize works fine and shows the login UI, but the post-login action times out when the issue occurs. I am thinking it may be something related to how tokens are signed, or how keys are generated/regenerated?

We were able to capture a .NET Profiler Trace, and we see this:
[.NET Profiler Trace screenshots attached]

@Stumm304

Stumm304 commented Aug 19, 2024

Hey @josephdecock,

we've also been experiencing this issue for around two weeks now.

But we are using different versions than those reported.
I also cannot confirm that it only happens around 12:00 UTC; we also had it around 5 pm UTC.
In our dev environment it sometimes happens twice a day, and sometimes nothing for a couple of days.
In Prod it has luckily only happened once so far.

  • DEV Environment
    • Duende 7.0.6, .NET 8
  • PROD Environment
    • Duende 7.0.5, .NET 8

Besides different versions and times, the issue is exactly the same as reported. Only a restart helps.

[screenshot attached: 2024-08-19_17h24_50]

@AndersAbel
Member

If anyone is able to capture OTel traces of the timeout, that could help us understand the root cause.

https://docs.duendesoftware.com/identityserver/v7/diagnostics/otel/traces/
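
For anyone able to help with this, a minimal sketch of the wiring (it assumes the OpenTelemetry.Extensions.Hosting, OpenTelemetry.Instrumentation.AspNetCore and OTLP exporter packages, and the IdentityServer activity source names from the linked docs; swap the exporter for whatever your APM backend uses):

// Sketch: export traces from the IdentityServer activity sources.
builder.Services.AddOpenTelemetry()
    .WithTracing(tracing => tracing
        .AddAspNetCoreInstrumentation()
        .AddSource(IdentityServerConstants.Tracing.Basic)
        .AddSource(IdentityServerConstants.Tracing.Stores)
        .AddSource(IdentityServerConstants.Tracing.Cache)
        .AddSource(IdentityServerConstants.Tracing.Services)
        .AddSource(IdentityServerConstants.Tracing.Validation)
        .AddOtlpExporter()); // or your vendor's exporter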

@Stumm304

Stumm304 commented Aug 20, 2024

We're currently using the Elastic APM agent to collect traces. Unfortunately, it seems that we don't get any logs when the issue comes up.

@AndersAbel
Member

@Stumm304 Thank you for getting back with that information. We have similar reports from other customers. May I ask what log levels you have enabled?

@Stumm304

It happened in our dev environment today at around 11:45 UTC+2.
In our dev environment we are already at log level Debug, but still no logs :-/

@techyian

Hi, we're running Duende 7.0.4 and .NET 8 in our Prod environment and we believe this is happening for us also. @AndersAbel, could you please advise which OTel traces you're interested in capturing? Is it all those referenced in https://docs.duendesoftware.com/identityserver/v7/diagnostics/otel/traces/? Do we have an update on this issue from your investigations?

@josephdecock
Member

please advise which OTel traces you're interested in capturing? Is it all those referenced in https://docs.duendesoftware.com/identityserver/v7/diagnostics/otel/traces/?

Yes, that's what we're looking for.

Do we have an update on this issue from your investigations?

Unfortunately, we don't yet have a fix. We're continuing to work on the problem though.

@techyian can you give us more history please? Was a previous deployment working?

@philipwindsora55
Author

Hi @josephdecock, thanks for looking into this. Is it likely that the same issue affects 7.x.x versions? We are planning to upgrade to .NET 8 and v7 in the very near future.

Can anyone advise on this please? With .NET 6 approaching EOL, we will need to upgrade to .NET 8.

Another thing of note: most versions have now been marked as deprecated on NuGet because of the vulnerability. We treat warnings as errors in our build pipeline, so we would need to disable this to be able to use a previously stable version.

Thanks for this report. We're investigating. My initial suspicion is that this is related to 6.3.9's update of our dependencies on ASP.NET framework packages. In that version, we updated framework packages from version 6.0.0 to version 6.0.26. This updates our transitive dependency on the System.IdentityModel.Tokens.Jwt and Microsoft.IdentityModel.JsonWebTokens packages past versions that have a known Denial of Service vulnerability.

@sanket-mistry-jm it would be extremely helpful if you could try to reproduce the issue in your environment with the 6.3.9 build of IdentityServer.

In the meantime, since this is preventing application of the hotfix, my recommendation is to apply the workarounds that we describe in the advisory: basically, your UI code should not rely on the interaction service to determine whether URLs are local. You should use the IsLocalUrl API from ASP.NET instead.

@josephdecock, you mentioned an update of dependencies as a possible cause; is there any way to confirm this?

@keithlfs

We are seeing this issue on 6.3.10 using .NET 7 hosted in Azure; calls to /.well-known/openid-configuration just time out, with fairly catastrophic consequences.

@josephdecock
Member

josephdecock commented Aug 29, 2024

Hi everyone, I wanted to give you all an update to let you know that we're working hard on this issue, and it is in fact priority 1 for us now. If anyone has an environment where the issue is occurring that they are willing to show to the Duende engineering team, I would be very interested in a troubleshooting call with you. Please email me ([email protected]) if you're able to do so.

Other things that would be helpful include:

  • Trace level logs from an environment where this occurs.
  • OTel Traces from an environment where this occurs.
  • Information about usage of Automatic signing keys, including
    • Are you using automatic signing keys?
    • How many are in the store?
    • Which store implementation are you using (Duende.IdentityServer.EntityFramework or something custom)?
    • Which database are you using? (Provider, Version, Hosting info, etc)
  • Information about versions of the Microsoft.IdentityModel dependencies that you have (the output of dotnet list package --include-transitive | sls "Microsoft.IdentityModel|System.IdentityModel")

@josephdecock
Member

@philipwindsora55 We do have reports of this issue affecting both the 6.3.x and 7.0.x release branches. We're still investigating a number of possibilities of what the cause of the issue might be, but one of the few things that did change recently in both of those release branches is that we updated our dependency on the Microsoft.IdentityModel libraries. Are you able to show a before and after of those dependencies?

So, first, on the commit that you deployed to your dev and test environments that caused the issue, run
dotnet list package --include-transitive | sls "Microsoft.IdentityModel|System.IdentityModel". Then, on a known good commit, run the same.

@AndersAbel
Member

@keithlfs Do you use Linux or Windows as the hosting OS in Azure?

We have reports of this happening on Linux hosts; is there anyone affected who hosts on Windows?

@philipwindsora55
Author

@josephdecock I'm happy to set up a troubleshooting call with you; I'll send you an email.

Another point to note: we have uptime monitors in place which keep the app alive 24/7.

As for our configuration:

We have deployment slots, so we need to store keys in a location both instances can access:

services.AddDataProtection()
    .PersistKeysToAzureBlobStorage(new Uri(dataProtectionBlobUri), new DefaultAzureCredential())
    .ProtectKeysWithAzureKeyVault(new Uri(dataProtectionKeyUri), new DefaultAzureCredential());

Our Identity configuration is:

var builder = services.AddIdentityServer(options =>
{
    options.Events.RaiseSuccessEvents = true;
    options.Events.RaiseFailureEvents = true;
    options.Events.RaiseErrorEvents = true;
    options.UserInteraction.ErrorUrl = "/error/idsvr";
    options.Authentication.CookieSlidingExpiration = true;
    options.Authentication.CookieLifetime = TimeSpan.FromDays(7);
    options.ServerSideSessions.UserDisplayNameClaimType = "email";
    options.LicenseKey = duendeLicenseKey;
})
.AddConfigurationStore(options =>
{
    options.ConfigureDbContext = b => b.UseSqlServer(connectionString);
})
.AddOperationalStore(options =>
{
    options.ConfigureDbContext = b => b.UseSqlServer(connectionString);
})
.AddInMemoryCaching()
.AddConfigurationStoreCache()
.AddServerSideSessions();

@AndersAbel We are hosting on Windows/IIS in Azure

@sanket-mistry-jm

@philipwindsora55 We do have reports of this issue affecting both the 6.3.x and 7.0.x release branches. We're still investigating a number of possibilities of what the cause of the issue might be, but one of the few things that did change recently in both of those release branches is that we updated our dependency on the Microsoft.IdentityModel libraries. Are you able to show a before and after of those dependencies?

So, first, on the commit that you deployed to your dev and test environments that caused the issue, run dotnet list package --include-transitive | sls "Microsoft.IdentityModel|System.IdentityModel". Then, on a known good commit, run the same.

@josephdecock, here is the output requested.

v6.3.10 - Where issue occurs. Every Microsoft.IdentityModel/System.IdentityModel package, across all projects, resolves to 6.36.0:

Microsoft.IdentityModel.Abstractions 6.36.0
Microsoft.IdentityModel.JsonWebTokens 6.36.0
Microsoft.IdentityModel.Logging 6.36.0
Microsoft.IdentityModel.Protocols 6.36.0
Microsoft.IdentityModel.Protocols.OpenIdConnect 6.36.0
Microsoft.IdentityModel.Tokens 6.36.0
System.IdentityModel.Tokens.Jwt 6.36.0

v6.2.3 - Where issue did not occur. Every Microsoft.IdentityModel/System.IdentityModel package, across all projects, resolves to 6.25.1:

Microsoft.IdentityModel.Abstractions 6.25.1
Microsoft.IdentityModel.JsonWebTokens 6.25.1
Microsoft.IdentityModel.Logging 6.25.1
Microsoft.IdentityModel.Protocols 6.25.1
Microsoft.IdentityModel.Protocols.OpenIdConnect 6.25.1
Microsoft.IdentityModel.Tokens 6.25.1
System.IdentityModel.Tokens.Jwt 6.25.1

Information about usage of automatic signing keys:

Q: Are you using automatic signing keys?
A: Yes, we are using automatic signing keys plus a certificate-based validation key. The certificate is loaded from Azure Key Vault on startup.

Q: How many are in the store?
A: At this moment we are not storing the keys in a database but using the default option the framework provides (which I believe is the file system store). We plan to move to a database with the v7 upgrade.

Q: Which store implementation are you using (Duende.IdentityServer.EntityFramework or something custom)?
A: We are using a custom implementation which uses Azure Cosmos DB (MongoDB API).

Q: Which database are you using? (Provider, Version, Hosting info, etc.)
A: See above.

Let me know if this helps.

@keithlfs

keithlfs commented Aug 29, 2024 via email

@keithlfs

keithlfs commented Sep 6, 2024

@josephdecock, please can we have an update on this issue?

@sanket-mistry-jm

I wanted to check whether anyone who has reported this has been able to work around it, or are we all still seeing the same issue? For us, it is still consistent.

@keithlfs

@AndersAbel @josephdecock please may we have an update? This is a critical issue and we're stuck between reverting to a known insecure version and having arbitrary application crashes in production. It would be great to have an understanding of what Duende is doing to investigate and rectify this issue.

@leastprivilege
Member

Hi Keith,

I am really sorry for the radio silence. The engineering team is working on it, but it is very hard to consistently reproduce.

Are you saying that by reverting to an earlier version (which exact version?) the problem goes away?

thanks!

@keithlfs

keithlfs commented Sep 11, 2024 via email

@DanBlumenfeld

In case it helps others: I was seeing the issue every 2-3 days, but it has not recurred in the past two weeks.

My environment: IdentityServer 7.0.6, Azure SQL Server, ASP.NET Core 8 hosted in an Azure container app.

I made the following three changes more-or-less simultaneously, and haven't seen a failure since:

  1. Set logging of Duende.IdentityServer.Services.KeyManagement to Trace level
  2. Reduced KeyCacheDuration (IdentityServerOptions.KeyManagement.KeyCacheDuration) to 2 hours from the default 24 (see the sketch after this list)
  3. Added an external uptime monitor (UptimeRobot) which calls the discovery endpoint every 60 seconds
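
For anyone who wants to try the same mitigation, a rough sketch of item 2 (KeyCacheDuration is a documented IdentityServerOptions setting; the 2-hour value is just what I chose):

// Sketch: shorten the signing key cache from the default 24 hours to 2 hours.
var builder = services.AddIdentityServer(options =>
{
    options.KeyManagement.KeyCacheDuration = TimeSpan.FromHours(2);
});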

@techyian

Echoing @DanBlumenfeld, we haven't seen a timeout in a few weeks, which is strange considering how frequently it was occurring. Could this be a hosting provider issue? Are we all hosted in Azure in this thread?

@sanket-mistry-jm

sanket-mistry-jm commented Sep 11, 2024

For us it is a different story: we rolled back to v6.3.8 and we are seeing the same issue on that version as well, so we are unsure what is happening.

@AndersAbel
Member

We are investigating if the hang is related to data protection and specifically the Azure data protection services.

For those affected, do you use Azure.Extensions.AspNetCore.DataProtection.Blobs? In that case, what version?

@DanBlumenfeld

DanBlumenfeld commented Sep 12, 2024

We are investigating if the hang is related to data protection and specifically the Azure data protection services.

For those affected, do you use Azure.Extensions.AspNetCore.DataProtection.Blobs? In that case, what version?

Azure.Extensions.AspNetCore.DataProtection.Blobs, v1.3.3
Azure.Extensions.AspNetCore.DataProtection.Keys, v1.2.3

@mcolebiltd

We are investigating if the hang is related to data protection and specifically the Azure data protection services.

For those affected, do you use Azure.Extensions.AspNetCore.DataProtection.Blobs? In that case, what version?

Azure.Extensions.AspNetCore.DataProtection.Keys: 1.2.3
Azure.Extensions.AspNetCore.DataProtection.Blobs: 1.3.4

@keithlfs

keithlfs commented Sep 12, 2024

We are investigating if the hang is related to data protection and specifically the Azure data protection services.

For those affected, do you use Azure.Extensions.AspNetCore.DataProtection.Blobs? In that case, what version?

<PackageReference Include="Azure.Extensions.AspNetCore.DataProtection.Blobs" Version="1.3.2" />
<PackageReference Include="Azure.Extensions.AspNetCore.DataProtection.Keys" Version="1.2.2" />

@OmidID

OmidID commented Sep 16, 2024

Hello Everyone.

We found that the issue is related to a specific version of the Azure.Core package, which causes this problem in our product.

Don't forget, you may have another package that has a dependency on this one (our case).

Azure/azure-sdk-for-net#44882

Force-update Azure.Core to at least the fixed version.
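
For example, one way to force the newer Azure.Core is to add a direct package reference so it overrides the vulnerable transitive version (1.42.0 is the version mentioned later in this thread; check the linked issue for the minimum fixed version):

<!-- Direct reference overrides the transitive Azure.Core version -->
<PackageReference Include="Azure.Core" Version="1.42.0" />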

@AndersAbel
Member

Thank you @OmidID for sharing that information. The information you linked to is consistent with the version numbers reported above. Azure.Extensions.AspNetCore.DataProtection.Keys 1.2.3 references Azure.Core 1.37.0.

@keithlfs @mcolebiltd @DanBlumenfeld Could you please try updating Azure.Extensions.AspNetCore.DataProtection.Keys to version 1.2.4 (which brings in Azure.Core >= 1.42.0)?
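
In csproj terms, the suggested update would look roughly like this (version as stated above):

<PackageReference Include="Azure.Extensions.AspNetCore.DataProtection.Keys" Version="1.2.4" />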

@DanBlumenfeld

Thank you @OmidID for sharing that information. The information you linked to is consistent with the version numbers reported above. Azure.Extensions.AspNetCore.DataProtection.Keys 1.2.3 references Azure.Core 1.37.0.

@keithlfs @mcolebiltd @DanBlumenfeld Could you please try updating Azure.Extensions.AspNetCore.DataProtection.Keys to version 1.2.4 (which brings in Azure.Core >= 1.42.0)?

Updated and am deploying now.

In my case, I've not seen the issue recur since the changes I mentioned above, so hopefully this makes no difference :-)

AndersAbel self-assigned this Sep 17, 2024
@philipwindsora55
Author

We have not experienced a timeout issue for a good few weeks now; maybe it was an Azure infrastructure issue?

I've also updated the references to our Azure packages as suggested above and, again, no issues so far. I'll continue to monitor for another few weeks and report back.

@AndersAbel
Member

@philipwindsora55 Thanks for reporting back. So far, everyone who has reported this issue has been running on Azure, or at least using Azure services. At this point we strongly suspect that this is/was an Azure issue, either with the Azure services or with the Azure SDK (most likely the Azure.Core library).
