
Migrate manual files to logging abstraction#742

Draft
mihaimitrea-db wants to merge 2 commits into main from mihaimitrea-db/stack/logging-migration

Conversation

@mihaimitrea-db
Contributor

@mihaimitrea-db mihaimitrea-db commented Mar 26, 2026

🥞 Stacked PR

Use this link to review incremental changes.


Summary

Migrates all manually-maintained source files from direct org.slf4j imports to the SDK's logging abstraction (com.databricks.sdk.core.logging). After this PR, no hand-written file in the SDK references SLF4J directly — all logging goes through the abstraction introduced in PR #740.
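The per-file change is mechanical. A representative sketch, drawn from the diffs quoted below (DatabricksConfig is one of the touched files):

```java
// Before
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// After
import com.databricks.sdk.core.logging.Logger;
import com.databricks.sdk.core.logging.LoggerFactory;

// Call sites and field declarations keep their shape, e.g.:
private static final Logger LOG = LoggerFactory.getLogger(DatabricksConfig.class);
```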

Why

PRs #740 and #741 introduced the logging abstraction and the JUL backend, but every existing call site still imported org.slf4j.Logger and org.slf4j.LoggerFactory directly. Until those imports are rewritten, users cannot actually swap the logging backend — the abstraction would be dead code.

This PR completes the migration for all manually-maintained files. Auto-generated files are left unchanged and will be addressed separately via codegen updates.

What changed

Interface changes

None.

Behavioral changes

None. All logging calls pass through the abstraction layer, which defaults to SLF4J. Existing users see no difference.
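For users who do want a different backend, the switch stays a one-liner before the first client is constructed. A minimal sketch based on the API from PRs #740/#741; the wrapping main method is illustrative:

```java
import com.databricks.sdk.WorkspaceClient;
import com.databricks.sdk.core.logging.JulLoggerFactory;
import com.databricks.sdk.core.logging.LoggerFactory;

public class JulBackendExample {
  public static void main(String[] args) {
    // Must run before any SDK client is created; loggers already obtained
    // keep whatever backend was active when they were handed out.
    LoggerFactory.setDefault(JulLoggerFactory.INSTANCE);

    // From here on, all SDK logging goes through java.util.logging.
    WorkspaceClient workspace = new WorkspaceClient();
  }
}
```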

Internal changes

  • 25 files updated: each file's org.slf4j.Logger / org.slf4j.LoggerFactory imports are replaced with com.databricks.sdk.core.logging.Logger / com.databricks.sdk.core.logging.LoggerFactory. No other code changes — the Logger API is identical.
  • Files span core, core.oauth, core.retry, core.utils, core.error, core.commons, and mixin packages.
  • The ApiClient request/response log now uses LOG.debug(() -> makeLogRecord(in, resp)) (the Supplier<String> overload) instead of an explicit isDebugEnabled() guard, since the abstraction handles the guard internally.
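The shape of that ApiClient change, as it appears in the range-diff further down (surrounding retry loop abridged):

```java
// Before: explicit guard, message string built eagerly.
if (LOG.isDebugEnabled()) {
  LOG.debug(makeLogRecord(in, response));
}

// After: the Supplier<String> overload defers building the message until the
// abstraction has checked the level itself. The local copy presumably exists
// so the lambda captures an effectively final variable, since `response` is
// reassigned across retries.
Response resp = response;
LOG.debug(() -> makeLogRecord(in, resp));
```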

How is this tested?

  • This is a purely mechanical import swap; the Logger abstraction exposes the same debug/info/warn/error methods as org.slf4j.Logger.
  • Full test suite passes.

@mihaimitrea-db mihaimitrea-db self-assigned this Mar 26, 2026
@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch 2 times, most recently from e07eca3 to b52d9a0 on March 30, 2026 08:16
@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from b52d9a0 to 429b118 on March 30, 2026 08:33
@mihaimitrea-db
Contributor Author

Range-diff: stack/logging-jul (b52d9a0 -> 429b118)
NEXT_CHANGELOG.md
@@ -0,0 +1,10 @@
+diff --git a/NEXT_CHANGELOG.md b/NEXT_CHANGELOG.md
+--- a/NEXT_CHANGELOG.md
++++ b/NEXT_CHANGELOG.md
+ ### Internal Changes
+ * Introduced a logging abstraction (`com.databricks.sdk.core.logging`) that decouples the SDK from SLF4J. Users can now provide their own logging backend by extending `LoggerFactory` and calling `LoggerFactory.setDefault()` before creating any SDK client. SLF4J remains the default.
+ * Added `java.util.logging` as a supported alternative logging backend. Activate it with `LoggerFactory.setDefault(JulLoggerFactory.INSTANCE)`.
++* Migrated internal SDK classes from direct `org.slf4j` imports to the new logging abstraction.
+ 
+ ### API Changes
+ * Add `createCatalog()`, `createSyncedTable()`, `deleteCatalog()`, `deleteSyncedTable()`, `getCatalog()` and `getSyncedTable()` methods for `workspaceClient.postgres()` service.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/ApiClient.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ApiClient.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ApiClient.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ApiClient.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ApiClient.java
+ import com.databricks.sdk.core.http.Request;
+ import com.databricks.sdk.core.http.RequestOptions;
+ import com.databricks.sdk.core.http.Response;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.retry.NoRetryStrategyPicker;
+ import com.databricks.sdk.core.retry.RequestBasedRetryStrategyPicker;
+ import com.databricks.sdk.core.retry.RetryStrategy;
  import java.time.format.DateTimeFormatter;
  import java.util.*;
  import java.util.function.Function;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * Simplified REST API client with retries, JSON POJO SerDe through Jackson and exception POJO
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/AzureCliCredentialsProvider.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/AzureCliCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/AzureCliCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/AzureCliCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/AzureCliCredentialsProvider.java
+ package com.databricks.sdk.core;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.oauth.CachedTokenSource;
+ import com.databricks.sdk.core.oauth.OAuthHeaderFactory;
+ import com.databricks.sdk.core.oauth.Token;
  import com.databricks.sdk.support.InternalApi;
  import com.fasterxml.jackson.databind.ObjectMapper;
  import java.util.*;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class AzureCliCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/CliTokenSource.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/CliTokenSource.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/CliTokenSource.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/CliTokenSource.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/CliTokenSource.java
+ package com.databricks.sdk.core;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.oauth.Token;
+ import com.databricks.sdk.core.oauth.TokenSource;
+ import com.databricks.sdk.core.utils.Environment;
  import java.util.Arrays;
  import java.util.List;
  import org.apache.commons.io.IOUtils;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class CliTokenSource implements TokenSource {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/ConfigLoader.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ConfigLoader.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ConfigLoader.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ConfigLoader.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/ConfigLoader.java
+ package com.databricks.sdk.core;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.utils.Environment;
+ import com.databricks.sdk.support.InternalApi;
+ import java.io.FileNotFoundException;
  import org.apache.commons.configuration2.INIConfiguration;
  import org.apache.commons.configuration2.SubnodeConfiguration;
  import org.apache.commons.configuration2.ex.ConfigurationException;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class ConfigLoader {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksCliCredentialsProvider.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksCliCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksCliCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksCliCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksCliCredentialsProvider.java
+ package com.databricks.sdk.core;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.oauth.CachedTokenSource;
+ import com.databricks.sdk.core.oauth.OAuthHeaderFactory;
+ import com.databricks.sdk.core.oauth.Token;
  import com.fasterxml.jackson.databind.ObjectMapper;
  import java.nio.charset.StandardCharsets;
  import java.util.*;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class DatabricksCliCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
+ import com.databricks.sdk.core.http.HttpClient;
+ import com.databricks.sdk.core.http.Request;
+ import com.databricks.sdk.core.http.Response;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.oauth.ErrorTokenSource;
+ import com.databricks.sdk.core.oauth.HostMetadata;
+ import com.databricks.sdk.core.oauth.OAuthHeaderFactory;
  import java.time.Duration;
  import java.util.*;
  import org.apache.http.HttpMessage;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  public class DatabricksConfig {
    private static final Logger LOG = LoggerFactory.getLogger(DatabricksConfig.class);
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DefaultCredentialsProvider.java
@@ -1,13 +1,17 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DefaultCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DefaultCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DefaultCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/DefaultCredentialsProvider.java
+ package com.databricks.sdk.core;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.oauth.*;
+ import com.databricks.sdk.support.InternalApi;
  import com.google.common.base.Strings;
  import java.util.ArrayList;
  import java.util.List;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * The DefaultCredentialsProvider is the primary authentication handler for the Databricks SDK. It
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleCredentialsCredentialsProvider.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleCredentialsCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleCredentialsCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleCredentialsCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleCredentialsCredentialsProvider.java
+ import static com.databricks.sdk.core.utils.GoogleUtils.GCP_SCOPES;
+ import static com.databricks.sdk.core.utils.GoogleUtils.SA_ACCESS_TOKEN_HEADER;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
+ import com.google.auth.oauth2.*;
+ import com.google.auth.oauth2.IdTokenProvider.Option;
  import java.nio.file.Files;
  import java.nio.file.Paths;
  import java.util.*;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class GoogleCredentialsCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleIdCredentialsProvider.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleIdCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleIdCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleIdCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleIdCredentialsProvider.java
+ import static com.databricks.sdk.core.utils.GoogleUtils.GCP_SCOPES;
+ import static com.databricks.sdk.core.utils.GoogleUtils.SA_ACCESS_TOKEN_HEADER;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
+ import com.google.auth.oauth2.GoogleCredentials;
+ import com.google.auth.oauth2.IdTokenCredentials;
  import com.google.auth.oauth2.ImpersonatedCredentials;
  import java.io.IOException;
  import java.util.*;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class GoogleIdCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/NotebookNativeCredentialsProvider.java
@@ -1,13 +1,16 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/NotebookNativeCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/NotebookNativeCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/NotebookNativeCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/NotebookNativeCredentialsProvider.java
+ package com.databricks.sdk.core;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
  import java.lang.reflect.Field;
  import java.lang.reflect.InvocationTargetException;
  import java.util.*;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * A CredentialsProvider that uses the API token from the command context to authenticate.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/commons/CommonsHttpClient.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/commons/CommonsHttpClient.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/commons/CommonsHttpClient.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/commons/CommonsHttpClient.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/commons/CommonsHttpClient.java
+ import com.databricks.sdk.core.http.HttpClient;
+ import com.databricks.sdk.core.http.Request;
+ import com.databricks.sdk.core.http.Response;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.utils.CustomCloseInputStream;
+ import com.databricks.sdk.core.utils.ProxyUtils;
+ import com.databricks.sdk.support.InternalApi;
  import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
  import org.apache.http.protocol.BasicHttpContext;
  import org.apache.http.protocol.HttpContext;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  public class CommonsHttpClient implements HttpClient {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/error/AbstractErrorMapper.java
@@ -1,13 +1,16 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/error/AbstractErrorMapper.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/error/AbstractErrorMapper.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/error/AbstractErrorMapper.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/error/AbstractErrorMapper.java
+ import com.databricks.sdk.core.DatabricksError;
+ import com.databricks.sdk.core.error.details.ErrorDetails;
+ import com.databricks.sdk.core.http.Response;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
  import com.databricks.sdk.support.InternalApi;
  import java.util.HashMap;
  import java.util.Map;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  @InternalApi
  abstract class AbstractErrorMapper {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/AzureServicePrincipalCredentialsProvider.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/AzureServicePrincipalCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/AzureServicePrincipalCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/AzureServicePrincipalCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/AzureServicePrincipalCredentialsProvider.java
+ package com.databricks.sdk.core.oauth;
+ 
+ import com.databricks.sdk.core.*;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.utils.AzureUtils;
+ import com.databricks.sdk.support.InternalApi;
  import com.fasterxml.jackson.databind.ObjectMapper;
  import java.util.HashMap;
  import java.util.Map;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * Adds refreshed Azure Active Directory (AAD) Service Principal OAuth tokens to every request,
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/CachedTokenSource.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/CachedTokenSource.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/CachedTokenSource.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/CachedTokenSource.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/CachedTokenSource.java
+ package com.databricks.sdk.core.oauth;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.utils.ClockSupplier;
+ import com.databricks.sdk.core.utils.UtcClockSupplier;
+ import java.time.Duration;
  import java.time.Instant;
  import java.util.Objects;
  import java.util.concurrent.CompletableFuture;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * An OAuth TokenSource which can be refreshed.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/Consent.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/Consent.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/Consent.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/Consent.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/Consent.java
+ import com.databricks.sdk.core.DatabricksException;
+ import com.databricks.sdk.core.commons.CommonsHttpClient;
+ import com.databricks.sdk.core.http.HttpClient;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.sun.net.httpserver.Headers;
+ import com.sun.net.httpserver.HttpExchange;
+ import com.sun.net.httpserver.HttpHandler;
  import java.util.Objects;
  import java.util.Optional;
  import org.apache.commons.io.IOUtils;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * Consent provides a mechanism to retrieve an authorization code and exchange it for an OAuth token
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/DatabricksOAuthTokenSource.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/DatabricksOAuthTokenSource.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/DatabricksOAuthTokenSource.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/DatabricksOAuthTokenSource.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/DatabricksOAuthTokenSource.java
+ 
+ import com.databricks.sdk.core.DatabricksException;
+ import com.databricks.sdk.core.http.HttpClient;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.google.common.base.Strings;
+ import java.time.Instant;
+ import java.util.Arrays;
  import java.util.List;
  import java.util.Map;
  import java.util.Objects;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * Implementation of TokenSource that handles OAuth token exchange for Databricks authentication.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/EndpointTokenSource.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/EndpointTokenSource.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/EndpointTokenSource.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/EndpointTokenSource.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/EndpointTokenSource.java
+ 
+ import com.databricks.sdk.core.DatabricksException;
+ import com.databricks.sdk.core.http.HttpClient;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
+ import java.time.Instant;
  import java.util.HashMap;
  import java.util.Map;
  import java.util.Objects;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * Represents a token source that exchanges a control plane token for an endpoint-specific dataplane
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/ExternalBrowserCredentialsProvider.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/ExternalBrowserCredentialsProvider.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/ExternalBrowserCredentialsProvider.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/ExternalBrowserCredentialsProvider.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/ExternalBrowserCredentialsProvider.java
+ import com.databricks.sdk.core.CredentialsProvider;
+ import com.databricks.sdk.core.DatabricksConfig;
+ import com.databricks.sdk.core.DatabricksException;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
+ import java.io.IOException;
+ import java.nio.file.Path;
  import java.util.Objects;
  import java.util.Optional;
  import java.util.Set;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * A {@code CredentialsProvider} which implements the Authorization Code + PKCE flow by opening a
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/FileTokenCache.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/FileTokenCache.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/FileTokenCache.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/FileTokenCache.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/FileTokenCache.java
+ package com.databricks.sdk.core.oauth;
+ 
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.core.utils.SerDeUtils;
+ import com.databricks.sdk.support.InternalApi;
+ import com.fasterxml.jackson.databind.ObjectMapper;
  import java.nio.file.Files;
  import java.nio.file.Path;
  import java.util.Objects;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /** A TokenCache implementation that stores tokens as plain files. */
  @InternalApi
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentials.java
@@ -1,13 +1,16 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentials.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentials.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentials.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentials.java
+ import com.databricks.sdk.core.CredentialsProvider;
+ import com.databricks.sdk.core.DatabricksConfig;
+ import com.databricks.sdk.core.http.HttpClient;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
  import com.databricks.sdk.support.InternalApi;
  import java.io.Serializable;
  import java.util.Optional;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * An implementation of RefreshableTokenSource implementing the refresh_token OAuth grant type.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentialsTokenSource.java
@@ -1,13 +1,17 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentialsTokenSource.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentialsTokenSource.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentialsTokenSource.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/SessionCredentialsTokenSource.java
+ 
+ import com.databricks.sdk.core.DatabricksException;
+ import com.databricks.sdk.core.http.HttpClient;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
  import java.util.HashMap;
  import java.util.Map;
  import java.util.Optional;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * TokenSource that handles OAuth token refresh for SessionCredentials.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/TokenEndpointClient.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/TokenEndpointClient.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/TokenEndpointClient.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/TokenEndpointClient.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/oauth/TokenEndpointClient.java
+ import com.databricks.sdk.core.http.HttpClient;
+ import com.databricks.sdk.core.http.Request;
+ import com.databricks.sdk.core.http.Response;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
+ import com.fasterxml.jackson.databind.ObjectMapper;
+ import java.io.IOException;
  import java.util.Map;
  import java.util.Objects;
  import org.apache.http.HttpHeaders;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * Client for interacting with an OAuth token endpoint.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/retry/NonIdempotentRequestRetryStrategy.java
@@ -1,13 +1,18 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/retry/NonIdempotentRequestRetryStrategy.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/retry/NonIdempotentRequestRetryStrategy.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/retry/NonIdempotentRequestRetryStrategy.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/retry/NonIdempotentRequestRetryStrategy.java
+ package com.databricks.sdk.core.retry;
+ 
+ import com.databricks.sdk.core.DatabricksError;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import java.net.*;
+ import java.util.Arrays;
  import java.util.HashSet;
  import java.util.List;
  import java.util.Set;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * This class is used to determine if a non-idempotent request should be retried. We essentially
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/utils/OSUtils.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/utils/OSUtils.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/utils/OSUtils.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/utils/OSUtils.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/utils/OSUtils.java
+ package com.databricks.sdk.core.utils;
+ 
+ import com.databricks.sdk.core.DatabricksException;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.support.InternalApi;
+ import java.io.IOException;
+ import java.nio.file.Files;
  import java.nio.file.Path;
  import java.nio.file.Paths;
  import java.util.List;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  /**
   * OSUtils is an interface that provides utility methods for determining the current operating
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/mixin/ClustersExt.java
@@ -1,13 +1,19 @@
 diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/mixin/ClustersExt.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/mixin/ClustersExt.java
 --- a/databricks-sdk-java/src/main/java/com/databricks/sdk/mixin/ClustersExt.java
 +++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/mixin/ClustersExt.java
+ 
+ import com.databricks.sdk.core.ApiClient;
+ import com.databricks.sdk.core.DatabricksError;
++import com.databricks.sdk.core.logging.Logger;
++import com.databricks.sdk.core.logging.LoggerFactory;
+ import com.databricks.sdk.service.compute.*;
+ import java.time.Duration;
+ import java.util.ArrayList;
  import java.util.concurrent.TimeoutException;
  import java.util.function.Function;
  import java.util.stream.Collectors;
 -import org.slf4j.Logger;
 -import org.slf4j.LoggerFactory;
-+import com.databricks.sdk.core.logging.Logger;
-+import com.databricks.sdk.core.logging.LoggerFactory;
  
  public class ClustersExt extends ClustersAPI {
    private static final Logger LOG = LoggerFactory.getLogger(ClustersExt.class);
\ No newline at end of file

Reproduce locally: git range-diff 80ecee4..b52d9a0 02b6cb5..429b118 | Disable: git config gitstack.push-range-diff false

@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from 429b118 to 6ea8c92 on March 30, 2026 15:05
@mihaimitrea-db
Contributor Author

Range-diff: stack/logging-jul (429b118 -> 6ea8c92)
NEXT_CHANGELOG.md
@@ -1,11 +1,9 @@
 diff --git a/NEXT_CHANGELOG.md b/NEXT_CHANGELOG.md
 --- a/NEXT_CHANGELOG.md
 +++ b/NEXT_CHANGELOG.md
- ### Documentation
- 
  ### Internal Changes
-+* Introduced a logging abstraction (`com.databricks.sdk.core.logging`) that decouples the SDK from SLF4J. Users can now provide their own logging backend by extending `LoggerFactory` and calling `LoggerFactory.setDefault()` before creating any SDK client. SLF4J remains the default.
-+* Added `java.util.logging` as a supported alternative logging backend. Activate it with `LoggerFactory.setDefault(JulLoggerFactory.INSTANCE)`.
+ * Introduced a logging abstraction (`com.databricks.sdk.core.logging`) that decouples the SDK from SLF4J. Users can now provide their own logging backend by extending `LoggerFactory` and calling `LoggerFactory.setDefault()` before creating any SDK client. SLF4J remains the default.
+ * Added `java.util.logging` as a supported alternative logging backend. Activate it with `LoggerFactory.setDefault(JulLoggerFactory.INSTANCE)`.
 +* Migrated internal SDK classes from direct `org.slf4j` imports to the new logging abstraction.
  
  ### API Changes
databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
@@ -1,23 +0,0 @@
-diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
---- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
-+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
- /**
-  * Creates and configures {@link Logger} instances for the SDK.
-  *
-- * <p>By default, logging goes through SLF4J. Users can override the backend programmatically
-- * before creating any SDK client:
-+ * <p>By default, logging goes through SLF4J. Users can override the backend programmatically before
-+ * creating any SDK client:
-  *
-  * <pre>{@code
-  * LoggerFactory.setDefault(JulLoggerFactory.INSTANCE);
-   /**
-    * Overrides the logging backend used by the SDK.
-    *
--   * <p>Must be called before creating any SDK client or calling {@link #getLogger}. Loggers
--   * already obtained will not be affected by subsequent calls.
-+   * <p>Must be called before creating any SDK client or calling {@link #getLogger}. Loggers already
-+   * obtained will not be affected by subsequent calls.
-    */
-   public static void setDefault(LoggerFactory factory) {
-     if (factory == null) {
\ No newline at end of file
databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
@@ -1,21 +0,0 @@
-diff --git a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
---- a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
-+++ b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
-         Arguments.of("info", "hello", null, "hello", null),
-         Arguments.of("warn", "hello", null, "hello", null),
-         Arguments.of("error", "hello", null, "hello", null),
--        Arguments.of("info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-+        Arguments.of(
-+            "info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-         Arguments.of("info", "a={}, b={}", new Object[] {1, 2}, "a=1, b=2", null),
-         Arguments.of("error", "failed: {}", new Object[] {"op", ex}, "failed: op", ex),
-         Arguments.of("error", "Error: {}", new Object[] {ex}, "Error: {}", ex),
- 
-   @Test
-   void isDebugEnabledReflectsJulLevel() {
--    java.util.logging.Logger julLogger =
--        java.util.logging.Logger.getLogger("isDebugTest");
-+    java.util.logging.Logger julLogger = java.util.logging.Logger.getLogger("isDebugTest");
-     Logger logger = JulLogger.create("isDebugTest");
- 
-     julLogger.setLevel(Level.FINE);
\ No newline at end of file
databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
@@ -1,12 +0,0 @@
-diff --git a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
---- a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
-+++ b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
- import static org.junit.jupiter.api.Assertions.*;
- 
- import org.junit.jupiter.api.Test;
--import org.slf4j.helpers.MessageFormatter;
- import org.slf4j.helpers.FormattingTuple;
-+import org.slf4j.helpers.MessageFormatter;
- 
- /**
-  * Verifies that JulLogger's placeholder formatting and Throwable extraction produce the same
\ No newline at end of file
databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
@@ -1,21 +0,0 @@
-diff --git a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
---- a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
-+++ b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
-         Arguments.of("info", "hello", null, "hello", null),
-         Arguments.of("warn", "hello", null, "hello", null),
-         Arguments.of("error", "hello", null, "hello", null),
--        Arguments.of("info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-+        Arguments.of(
-+            "info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-         Arguments.of("info", "a={}, b={}", new Object[] {1, 2}, "a=1, b=2", null),
-         Arguments.of("error", "failed: {}", new Object[] {"op", ex}, "failed: op", ex),
-         Arguments.of("error", "Error: {}", new Object[] {ex}, "Error: {}", ex),
-   void deliversCorrectOutput(
-       String level, String format, Object[] args, String expectedMsg, Throwable expectedThrown) {
-     CapturingAppender appender = new CapturingAppender();
--    org.apache.log4j.Logger log4jLogger =
--        org.apache.log4j.Logger.getLogger(Slf4jLoggerTest.class);
-+    org.apache.log4j.Logger log4jLogger = org.apache.log4j.Logger.getLogger(Slf4jLoggerTest.class);
-     log4jLogger.addAppender(appender);
-     try {
-       Logger logger = Slf4jLogger.create(Slf4jLoggerTest.class);
\ No newline at end of file

Reproduce locally: git range-diff 80ecee4..429b118 63c2968..6ea8c92 | Disable: git config gitstack.push-range-diff false

@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from 6ea8c92 to 9538f13 on March 30, 2026 15:11
@mihaimitrea-db
Contributor Author

Range-diff: stack/logging-jul (6ea8c92 -> 9538f13)
NEXT_CHANGELOG.md
@@ -1,12 +1,10 @@
 diff --git a/NEXT_CHANGELOG.md b/NEXT_CHANGELOG.md
 --- a/NEXT_CHANGELOG.md
 +++ b/NEXT_CHANGELOG.md
- ### Documentation
- 
  ### Internal Changes
-+* Introduced a logging abstraction (`com.databricks.sdk.core.logging`) that decouples the SDK from SLF4J. Users can now provide their own logging backend by extending `LoggerFactory` and calling `LoggerFactory.setDefault()` before creating any SDK client. SLF4J remains the default.
-+* Added `java.util.logging` as a supported alternative logging backend. Activate it with `LoggerFactory.setDefault(JulLoggerFactory.INSTANCE)`.
-+* Migrated internal SDK classes from direct `org.slf4j` imports to the new logging abstraction.
+ * Introduced a logging abstraction (`com.databricks.sdk.core.logging`) to decouple the SDK from a specific logging backend.
+ * Added `java.util.logging` as a supported alternative logging backend. Activate it with `LoggerFactory.setDefault(JulLoggerFactory.INSTANCE)`.
++* Migrated internal SDK classes to the logging abstraction. The SDK now supports SLF4J, `java.util.logging`, or a custom backend via `LoggerFactory.setDefault()`.
  
  ### API Changes
  * Add `createCatalog()`, `createSyncedTable()`, `deleteCatalog()`, `deleteSyncedTable()`, `getCatalog()` and `getSyncedTable()` methods for `workspaceClient.postgres()` service.
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLogger.java
@@ -1,23 +0,0 @@
-diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLogger.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLogger.java
---- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLogger.java
-+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLogger.java
- 
-   private final java.util.logging.Logger delegate;
- 
--  private JulLogger(java.util.logging.Logger delegate) {
-+  JulLogger(java.util.logging.Logger delegate) {
-     this.delegate = delegate;
-   }
- 
--  static Logger create(Class<?> type) {
--    return create(type.getName());
--  }
--
--  static Logger create(String name) {
--    java.util.logging.Logger julLogger = java.util.logging.Logger.getLogger(name);
--    return new JulLogger(julLogger);
--  }
--
-   @Override
-   public boolean isDebugEnabled() {
-     return delegate.isLoggable(Level.FINE);
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLoggerFactory.java
@@ -1,19 +0,0 @@
-diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLoggerFactory.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLoggerFactory.java
---- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLoggerFactory.java
-+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/JulLoggerFactory.java
-   public static final JulLoggerFactory INSTANCE = new JulLoggerFactory();
- 
-   @Override
--  protected Logger newInstance(Class<?> type) {
--    return JulLogger.create(type);
-+  protected Logger createLogger(Class<?> type) {
-+    return new JulLogger(java.util.logging.Logger.getLogger(type.getName()));
-   }
- 
-   @Override
--  protected Logger newInstance(String name) {
--    return JulLogger.create(name);
-+  protected Logger createLogger(String name) {
-+    return new JulLogger(java.util.logging.Logger.getLogger(name));
-   }
- }
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
@@ -1,55 +0,0 @@
-diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
---- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
-+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/LoggerFactory.java
- /**
-  * Creates and configures {@link Logger} instances for the SDK.
-  *
-- * <p>By default, logging goes through SLF4J. Users can override the backend programmatically
-- * before creating any SDK client:
-+ * <p>By default, logging goes through SLF4J. Users can override the backend programmatically before
-+ * creating any SDK client:
-  *
-  * <pre>{@code
-  * LoggerFactory.setDefault(JulLoggerFactory.INSTANCE);
- 
-   /** Returns a logger for the given class, using the current default factory. */
-   public static Logger getLogger(Class<?> type) {
--    return getDefault().newInstance(type);
-+    return getDefault().createLogger(type);
-   }
- 
-   /** Returns a logger with the given name, using the current default factory. */
-   public static Logger getLogger(String name) {
--    return getDefault().newInstance(name);
-+    return getDefault().createLogger(name);
-   }
- 
-   /**
-    * Overrides the logging backend used by the SDK.
-    *
--   * <p>Must be called before creating any SDK client or calling {@link #getLogger}. Loggers
--   * already obtained will not be affected by subsequent calls.
-+   * <p>Must be called before creating any SDK client or calling {@link #getLogger}. Loggers already
-+   * obtained will not be affected by subsequent calls.
-    */
-   public static void setDefault(LoggerFactory factory) {
-     if (factory == null) {
-     return defaultFactory.get();
-   }
- 
--  /** Creates a new logger for the given class. */
--  protected abstract Logger newInstance(Class<?> type);
-+  /**
-+   * Creates a {@link Logger} for the given class. Subclasses obtain the backend logger (e.g. SLF4J)
-+   * and return an adapter.
-+   */
-+  protected abstract Logger createLogger(Class<?> type);
- 
--  /** Creates a new logger with the given name. */
--  protected abstract Logger newInstance(String name);
-+  /**
-+   * Creates a {@link Logger} for the given name. Subclasses obtain the backend logger and return an
-+   * adapter.
-+   */
-+  protected abstract Logger createLogger(String name);
- }
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLogger.java
@@ -1,22 +0,0 @@
-diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLogger.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLogger.java
---- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLogger.java
-+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLogger.java
- 
-   private final org.slf4j.Logger delegate;
- 
--  private Slf4jLogger(org.slf4j.Logger delegate) {
-+  Slf4jLogger(org.slf4j.Logger delegate) {
-     this.delegate = delegate;
-   }
- 
--  static Logger create(Class<?> type) {
--    return new Slf4jLogger(org.slf4j.LoggerFactory.getLogger(type));
--  }
--
--  static Logger create(String name) {
--    return new Slf4jLogger(org.slf4j.LoggerFactory.getLogger(name));
--  }
--
-   @Override
-   public boolean isDebugEnabled() {
-     return delegate.isDebugEnabled();
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLoggerFactory.java
@@ -1,19 +0,0 @@
-diff --git a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLoggerFactory.java b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLoggerFactory.java
---- a/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLoggerFactory.java
-+++ b/databricks-sdk-java/src/main/java/com/databricks/sdk/core/logging/Slf4jLoggerFactory.java
-   public static final Slf4jLoggerFactory INSTANCE = new Slf4jLoggerFactory();
- 
-   @Override
--  protected Logger newInstance(Class<?> type) {
--    return Slf4jLogger.create(type);
-+  protected Logger createLogger(Class<?> type) {
-+    return new Slf4jLogger(org.slf4j.LoggerFactory.getLogger(type));
-   }
- 
-   @Override
--  protected Logger newInstance(String name) {
--    return Slf4jLogger.create(name);
-+  protected Logger createLogger(String name) {
-+    return new Slf4jLogger(org.slf4j.LoggerFactory.getLogger(name));
-   }
- }
\ No newline at end of file
databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
@@ -1,41 +0,0 @@
-diff --git a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
---- a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
-+++ b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/JulLoggerTest.java
-         Arguments.of("info", "hello", null, "hello", null),
-         Arguments.of("warn", "hello", null, "hello", null),
-         Arguments.of("error", "hello", null, "hello", null),
--        Arguments.of("info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-+        Arguments.of(
-+            "info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-         Arguments.of("info", "a={}, b={}", new Object[] {1, 2}, "a=1, b=2", null),
-         Arguments.of("error", "failed: {}", new Object[] {"op", ex}, "failed: op", ex),
-         Arguments.of("error", "Error: {}", new Object[] {ex}, "Error: {}", ex),
-     CapturingHandler handler = new CapturingHandler();
-     julLogger.addHandler(handler);
-     try {
--      Logger logger = JulLogger.create(JulLoggerTest.class);
-+      Logger logger =
-+          new JulLogger(java.util.logging.Logger.getLogger(JulLoggerTest.class.getName()));
-       dispatch(logger, level, format, args);
- 
-       assertEquals(1, handler.records.size(), "Expected exactly one log record");
- 
-   @Test
-   void isDebugEnabledReflectsJulLevel() {
--    java.util.logging.Logger julLogger =
--        java.util.logging.Logger.getLogger("isDebugTest");
--    Logger logger = JulLogger.create("isDebugTest");
-+    java.util.logging.Logger julLogger = java.util.logging.Logger.getLogger("isDebugTest");
-+    Logger logger = new JulLogger(julLogger);
- 
-     julLogger.setLevel(Level.FINE);
-     assertTrue(logger.isDebugEnabled());
-     CapturingHandler handler = new CapturingHandler();
-     julLogger.addHandler(handler);
-     try {
--      Logger logger = JulLogger.create(JulLoggerTest.class);
-+      Logger logger =
-+          new JulLogger(java.util.logging.Logger.getLogger(JulLoggerTest.class.getName()));
-       logger.info("test");
- 
-       assertEquals(1, handler.records.size());
\ No newline at end of file
databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
@@ -1,12 +0,0 @@
-diff --git a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
---- a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
-+++ b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/LoggingParityTest.java
- import static org.junit.jupiter.api.Assertions.*;
- 
- import org.junit.jupiter.api.Test;
--import org.slf4j.helpers.MessageFormatter;
- import org.slf4j.helpers.FormattingTuple;
-+import org.slf4j.helpers.MessageFormatter;
- 
- /**
-  * Verifies that JulLogger's placeholder formatting and Throwable extraction produce the same
\ No newline at end of file
databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
@@ -1,43 +0,0 @@
-diff --git a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
---- a/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
-+++ b/databricks-sdk-java/src/test/java/com/databricks/sdk/core/logging/Slf4jLoggerTest.java
- public class Slf4jLoggerTest {
- 
-   @Test
--  void createReturnsSlf4jLogger() {
--    Logger logger = Slf4jLogger.create(Slf4jLoggerTest.class);
-+  void getLoggerReturnsSlf4jLogger() {
-+    Logger logger = LoggerFactory.getLogger(Slf4jLoggerTest.class);
-     assertNotNull(logger);
-     assertTrue(logger instanceof Slf4jLogger);
-   }
-   @Test
-   void isDebugEnabledReflectsBackend() {
-     // com.databricks is set to TRACE in log4j.properties
--    Logger logger = Slf4jLogger.create(Slf4jLoggerTest.class);
-+    Logger logger = LoggerFactory.getLogger(Slf4jLoggerTest.class);
-     assertTrue(logger.isDebugEnabled());
-   }
- 
-         Arguments.of("info", "hello", null, "hello", null),
-         Arguments.of("warn", "hello", null, "hello", null),
-         Arguments.of("error", "hello", null, "hello", null),
--        Arguments.of("info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-+        Arguments.of(
-+            "info", "user {} logged in", new Object[] {"alice"}, "user alice logged in", null),
-         Arguments.of("info", "a={}, b={}", new Object[] {1, 2}, "a=1, b=2", null),
-         Arguments.of("error", "failed: {}", new Object[] {"op", ex}, "failed: op", ex),
-         Arguments.of("error", "Error: {}", new Object[] {ex}, "Error: {}", ex),
-   void deliversCorrectOutput(
-       String level, String format, Object[] args, String expectedMsg, Throwable expectedThrown) {
-     CapturingAppender appender = new CapturingAppender();
--    org.apache.log4j.Logger log4jLogger =
--        org.apache.log4j.Logger.getLogger(Slf4jLoggerTest.class);
-+    org.apache.log4j.Logger log4jLogger = org.apache.log4j.Logger.getLogger(Slf4jLoggerTest.class);
-     log4jLogger.addAppender(appender);
-     try {
--      Logger logger = Slf4jLogger.create(Slf4jLoggerTest.class);
-+      Logger logger = new Slf4jLogger(org.slf4j.LoggerFactory.getLogger(Slf4jLoggerTest.class));
-       dispatch(logger, level, format, args);
- 
-       assertEquals(1, appender.events.size(), "Expected exactly one log event");
\ No newline at end of file

Reproduce locally: git range-diff 80ecee4..6ea8c92 43c742c..9538f13 | Disable: git config gitstack.push-range-diff false

@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from 9538f13 to b9a1c3e on March 30, 2026 16:24
@mihaimitrea-db
Contributor Author

Range-diff: stack/logging-jul (9538f13 -> b9a1c3e)
databricks-sdk-java/src/main/java/com/databricks/sdk/core/ApiClient.java
@@ -16,4 +16,15 @@
 -import org.slf4j.LoggerFactory;
  
  /**
-  * Simplified REST API client with retries, JSON POJO SerDe through Jackson and exception POJO
\ No newline at end of file
+  * Simplified REST API client with retries, JSON POJO SerDe through Jackson and exception POJO
+ 
+       try {
+         response = httpClient.execute(in);
+-        if (LOG.isDebugEnabled()) {
+-          LOG.debug(makeLogRecord(in, response));
+-        }
++        Response resp = response;
++        LOG.debug(() -> makeLogRecord(in, resp));
+ 
+         if (isResponseSuccessful(response)) {
+           return response; // stop here if the request succeeded
\ No newline at end of file

Reproduce locally: git range-diff 43c742c..9538f13 a8043ce..b9a1c3e | Disable: git config gitstack.push-range-diff false

@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from b9a1c3e to ef19539 on April 9, 2026 10:41
@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from ef19539 to d5c03d4 on April 13, 2026 16:33
@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from d5c03d4 to b8920e0 on April 13, 2026 16:41
github-merge-queue bot pushed a commit that referenced this pull request Apr 14, 2026
## 🥞 Stacked PR
Use this [link](https://github.com/databricks/databricks-sdk-java/pull/740/files) to review incremental changes.
- [**stack/logging-abstraction**](#740) [[Files changed](https://github.com/databricks/databricks-sdk-java/pull/740/files)]
- [stack/logging-jul](#741) [[Files changed](https://github.com/databricks/databricks-sdk-java/pull/741/files/5156924f..c04a3808)]
- [stack/logging-migration](#742) [[Files changed](https://github.com/databricks/databricks-sdk-java/pull/742/files/c04a3808..d5c03d4f)]

---------
## Summary

Introduces a logging abstraction layer
(`com.databricks.sdk.core.logging`) that decouples the SDK's internal
logging from any specific backend. SLF4J remains the default. This PR
contains only the abstraction and the SLF4J backend; the JUL backend and
the call-site migration follow in stacked PRs.

## Why

The SDK currently imports `org.slf4j.Logger` and
`org.slf4j.LoggerFactory` directly in every class that logs. This hard
coupling means users who embed the SDK in environments where SLF4J is
impractical (e.g. BI tools with constrained classpaths) have no way to
switch to an alternative logging backend like `java.util.logging`.

We need an indirection layer that lets users swap the logging backend
programmatically while keeping SLF4J as the zero-configuration default
so that existing users don't have to change anything.

The design follows the pattern established by SLF4J itself and [Netty's
`InternalLoggerFactory`](https://netty.io/4.1/api/io/netty/util/internal/logging/InternalLoggerFactory.html):
an `ILoggerFactory` interface that backends implement, a `LoggerFactory`
utility class with static `getLogger` / `setDefault` methods, and a
separate `Logger` abstract class that serves as a clean extension point
for custom implementations.
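
As an illustration of that shape (hypothetical names only, not the SDK's real classes), the static facade simply delegates to whichever factory was last installed:

```java
// Minimal sketch of the SLF4J/Netty-style indirection described above.
// All names here are illustrative, not the SDK's actual types.
interface MiniLoggerFactory {
  MiniLogger createLogger(String name);
}

abstract class MiniLogger {
  public abstract void info(String msg);
}

final class MiniLoggers {
  private static volatile MiniLoggerFactory defaultFactory = ConsoleFactory.INSTANCE;

  private MiniLoggers() {}

  // Swapping the backend means replacing the default factory; call sites
  // keep using the same static entry point.
  public static void setDefault(MiniLoggerFactory factory) {
    if (factory == null) {
      throw new IllegalArgumentException("factory must not be null");
    }
    defaultFactory = factory;
  }

  public static MiniLogger getLogger(Class<?> clazz) {
    return defaultFactory.createLogger(clazz.getName());
  }
}

final class ConsoleFactory implements MiniLoggerFactory {
  static final ConsoleFactory INSTANCE = new ConsoleFactory();

  @Override
  public MiniLogger createLogger(String name) {
    return new MiniLogger() {
      @Override
      public void info(String msg) {
        System.out.println("[INFO] " + name + ": " + msg);
      }
    };
  }
}
```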

## What changed

### Interface changes

- **`Logger`** — new abstract class in
`com.databricks.sdk.core.logging`. Defines the logging contract:
`debug`, `info`, `warn`, `error` (each with plain-string, varargs, and
`Supplier<String>` overloads). Users extend this to build custom
loggers.
- **`ILoggerFactory`** — new interface. Backends implement
`createLogger(Class<?>)` and `createLogger(String)` to produce `Logger`
instances. Users implement this to provide a fully custom logging
backend.
- **`LoggerFactory`** — new `final` utility class. Static methods
`getLogger(Class<?>)` and `getLogger(String)` return loggers from the
current default factory. `setDefault(ILoggerFactory)` overrides the
backend — must be called before creating any SDK client (a usage sketch
follows this list).
- **`Slf4jLoggerFactory`** — public concrete `ILoggerFactory`
implementation with a singleton `INSTANCE`. This is the default.
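
To make that concrete, a minimal usage sketch; only the class names, the `getLogger`/`setDefault` signatures, and `Slf4jLoggerFactory.INSTANCE` are taken from the description above, the rest is illustrative:

```java
import com.databricks.sdk.core.logging.Logger;
import com.databricks.sdk.core.logging.LoggerFactory;
import com.databricks.sdk.core.logging.Slf4jLoggerFactory;

public class LoggingSetupExample {
  public static void main(String[] args) {
    // Explicitly selecting the (already default) SLF4J backend. Per the
    // contract above, setDefault must run before any SDK client is created.
    LoggerFactory.setDefault(Slf4jLoggerFactory.INSTANCE);

    Logger log = LoggerFactory.getLogger(LoggingSetupExample.class);
    log.info("logging now flows through the abstraction to SLF4J");
  }
}
```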

### Behavioral changes

None. SLF4J is the default backend and all logging calls pass through to
`org.slf4j.Logger` exactly as before. Existing users see no difference.

### Internal changes

- **`Slf4jLogger`** — package-private class that delegates all calls to
an `org.slf4j.Logger`. Fully qualifies `org.slf4j.LoggerFactory`
references to avoid collision with the SDK's `LoggerFactory`.
- All new classes live in `com.databricks.sdk.core.logging`.

## How is this tested?

- `LoggerFactoryTest` — verifies the default factory is SLF4J, that
`setDefault(null)` is rejected, and that `getLogger(String)` works.
- `Slf4jLoggerTest` — verifies `LoggerFactory.getLogger` returns the
correct type, exercises all logging methods including varargs and
trailing Throwable via a capturing Log4j appender that asserts on
message content, level, and attached throwable.
- Full test suite passes.
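
The `CapturingAppender` referenced in the range-diff earlier in this thread is not shown in full; a plausible minimal shape for it with Log4j 1.x looks like the sketch below (the `events` field name matches the diff, everything else is assumed):

```java
import java.util.ArrayList;
import java.util.List;
import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

// Records every log event so a test can assert on level, rendered message,
// and attached throwable.
class CapturingAppender extends AppenderSkeleton {
  final List<LoggingEvent> events = new ArrayList<>();

  @Override
  protected void append(LoggingEvent event) {
    events.add(event);
  }

  @Override
  public void close() {}

  @Override
  public boolean requiresLayout() {
    return false;
  }
}
```

A test attaches it with `org.apache.log4j.Logger.getLogger(...).addAppender(appender)` before logging and then asserts on `appender.events`, which is what the `deliversCorrectOutput` test in the diff above appears to do.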
@mihaimitrea-db mihaimitrea-db force-pushed the mihaimitrea-db/stack/logging-migration branch from b8920e0 to 2370a96 Compare April 14, 2026 08:19
@mihaimitrea-db
Contributor Author

Range-diff: stack/logging-jul (b8920e0 -> 2370a96)
NEXT_CHANGELOG.md
@@ -1,13 +1,6 @@
 diff --git a/NEXT_CHANGELOG.md b/NEXT_CHANGELOG.md
 --- a/NEXT_CHANGELOG.md
 +++ b/NEXT_CHANGELOG.md
- ## Release v0.104.0
- 
- ### New Features and Improvements
-+* Added automatic detection of AI coding agents (Antigravity, Claude Code, Cline, Codex, Copilot CLI, Cursor, Gemini CLI, OpenCode) in the user-agent string. The SDK now appends `agent/<name>` to HTTP request headers when running inside a known AI agent environment.
- 
- ### Bug Fixes
- * Fixed Databricks CLI authentication to detect when the cached token's scopes don't match the SDK's configured scopes. Previously, a scope mismatch was silently ignored, causing requests to use wrong permissions. The SDK now raises an error with instructions to re-authenticate.
  ### Internal Changes
  * Introduced a logging abstraction (`com.databricks.sdk.core.logging`) to decouple the SDK from a specific logging backend.
  * Added `java.util.logging` as a supported alternative logging backend. Activate it with `LoggerFactory.setDefault(JulLoggerFactory.INSTANCE)`.
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksCliCredentialsProvider.java
@@ -15,23 +15,4 @@
 -import org.slf4j.LoggerFactory;
  
  @InternalApi
- public class DatabricksCliCredentialsProvider implements CredentialsProvider {
-   List<String> buildHostArgs(String cliPath, DatabricksConfig config) {
-     List<String> cmd =
-         new ArrayList<>(Arrays.asList(cliPath, "auth", "token", "--host", config.getHost()));
--    if (config.getExperimentalIsUnifiedHost() != null && config.getExperimentalIsUnifiedHost()) {
--      // For unified hosts, pass account_id, workspace_id, and experimental flag
--      cmd.add("--experimental-is-unified-host");
--      if (config.getAccountId() != null) {
--        cmd.add("--account-id");
--        cmd.add(config.getAccountId());
--      }
--      if (config.getWorkspaceId() != null) {
--        cmd.add("--workspace-id");
--        cmd.add(config.getWorkspaceId());
--      }
--    } else if (config.getClientType() == ClientType.ACCOUNT) {
-+    if (config.getClientType() == ClientType.ACCOUNT) {
-       cmd.add("--account-id");
-       cmd.add(config.getAccountId());
-     }
\ No newline at end of file
+ public class DatabricksCliCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/DatabricksConfig.java
@@ -16,108 +16,4 @@
 -import org.slf4j.LoggerFactory;
  
  public class DatabricksConfig {
-   private static final Logger LOG = LoggerFactory.getLogger(DatabricksConfig.class);
-   }
- 
-   /**
--   * Attempts to resolve host metadata from the well-known endpoint. Only called for unified hosts.
--   * Logs a warning and continues if metadata resolution fails, since not all hosts support the
--   * discovery endpoint.
-+   * Attempts to resolve host metadata from the well-known endpoint. Logs a warning and continues if
-+   * metadata resolution fails, since not all hosts support the discovery endpoint.
-    */
-   private void tryResolveHostMetadata() {
-     if (host == null) {
-       return;
-     }
--    if (experimentalIsUnifiedHost == null || !experimentalIsUnifiedHost) {
--      return;
--    }
-     try {
-       resolveHostMetadata();
-     } catch (Exception e) {
-       }
-       Map<String, String> headers = new HashMap<>(headerFactory.headers());
- 
--      // For unified hosts with workspace operations, add the X-Databricks-Org-Id header
--      if (getHostType() == HostType.UNIFIED && workspaceId != null && !workspaceId.isEmpty()) {
--        headers.put("X-Databricks-Org-Id", workspaceId);
--      }
--
-       return headers;
-     } catch (DatabricksException e) {
-       String msg = String.format("%s auth: %s", credentialsProvider.authType(), e.getMessage());
-   }
- 
-   public boolean isAccountClient() {
--    if (getHostType() == HostType.UNIFIED) {
--      throw new DatabricksException(
--          "Cannot determine account client status for unified hosts. "
--              + "Use getHostType() or getClientType() instead. "
--              + "For unified hosts, client type depends on whether workspaceId is set.");
--    }
-     if (host == null) {
-       return false;
-     }
-     return host.startsWith("https://accounts.") || host.startsWith("https://accounts-dod.");
-   }
- 
--  /** Returns the host type based on configuration settings and host URL. */
-+  /** Returns the host type based on the host URL pattern. */
-   public HostType getHostType() {
--    if (experimentalIsUnifiedHost != null && experimentalIsUnifiedHost) {
--      return HostType.UNIFIED;
--    }
-     if (host == null) {
-       return HostType.WORKSPACE;
-     }
-     return HostType.WORKSPACE;
-   }
- 
--  /** Returns the client type based on host type and workspace ID configuration. */
-+  /** Returns the client type based on host type. */
-   public ClientType getClientType() {
-     HostType hostType = getHostType();
-     switch (hostType) {
--      case UNIFIED:
--        // For unified hosts, client type depends on whether workspaceId is set
--        return (workspaceId != null && !workspaceId.isEmpty())
--            ? ClientType.WORKSPACE
--            : ClientType.ACCOUNT;
-       case ACCOUNTS:
-         return ClientType.ACCOUNT;
-       case WORKSPACE:
-       discoveryUrl = oidcUri.resolve(".well-known/oauth-authorization-server").toString();
-       LOG.debug("Resolved discovery_url from host metadata: \"{}\"", discoveryUrl);
-     }
-+    // For account hosts, use the accountId as the token audience if not already set.
-+    if (tokenAudience == null && getClientType() == ClientType.ACCOUNT && accountId != null) {
-+      tokenAudience = accountId;
-+    }
-   }
- 
-   private OpenIDConnectEndpoints fetchOidcEndpointsFromDiscovery() {
-     return null;
-   }
- 
--  private OpenIDConnectEndpoints getUnifiedOidcEndpoints(String accountId) throws IOException {
--    if (accountId == null || accountId.isEmpty()) {
--      throw new DatabricksException(
--          "account_id is required for unified host OIDC endpoint discovery");
--    }
--    String prefix = getHost() + "/oidc/accounts/" + accountId;
--    return new OpenIDConnectEndpoints(prefix + "/v1/token", prefix + "/v1/authorize");
--  }
--
-   private OpenIDConnectEndpoints fetchDefaultOidcEndpoints() throws IOException {
-     if (getHost() == null) {
-       return null;
-     }
- 
--    // For unified hosts, use account-based OIDC endpoints
--    if (getHostType() == HostType.UNIFIED) {
--      return getUnifiedOidcEndpoints(getAccountId());
--    }
-     if (isAccountClient() && getAccountId() != null) {
-       String prefix = getHost() + "/oidc/accounts/" + getAccountId();
-       return new OpenIDConnectEndpoints(prefix + "/v1/token", prefix + "/v1/authorize");
\ No newline at end of file
+   private static final Logger LOG = LoggerFactory.getLogger(DatabricksConfig.class);
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleCredentialsCredentialsProvider.java
@@ -16,26 +16,4 @@
 -import org.slf4j.LoggerFactory;
  
  @InternalApi
- public class GoogleCredentialsCredentialsProvider implements CredentialsProvider {
-       Map<String, String> headers = new HashMap<>();
-       headers.put("Authorization", String.format("Bearer %s", idToken.getTokenValue()));
- 
--      if (config.getClientType() == ClientType.ACCOUNT) {
--        AccessToken token;
--        try {
--          token = finalServiceAccountCredentials.createScoped(GCP_SCOPES).refreshAccessToken();
--        } catch (IOException e) {
--          String message =
--              "Failed to refresh access token from Google service account credentials.";
--          LOG.error(message + e);
--          throw new DatabricksException(message, e);
--        }
-+      try {
-+        AccessToken token =
-+            finalServiceAccountCredentials.createScoped(GCP_SCOPES).refreshAccessToken();
-         headers.put(SA_ACCESS_TOKEN_HEADER, token.getTokenValue());
-+      } catch (IOException e) {
-+        LOG.warn("Failed to refresh GCP SA access token, skipping header: {}", e.getMessage());
-       }
- 
-       return headers;
\ No newline at end of file
+ public class GoogleCredentialsCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
databricks-sdk-java/src/main/java/com/databricks/sdk/core/GoogleIdCredentialsProvider.java
@@ -16,24 +16,4 @@
 -import org.slf4j.LoggerFactory;
  
  @InternalApi
- public class GoogleIdCredentialsProvider implements CredentialsProvider {
-         throw new DatabricksException(message, e);
-       }
- 
--      if (config.getClientType() == ClientType.ACCOUNT) {
--        try {
--          headers.put(
--              SA_ACCESS_TOKEN_HEADER, gcpScopedCredentials.refreshAccessToken().getTokenValue());
--        } catch (IOException e) {
--          String message = "Failed to refresh access token from scoped id token credentials.";
--          LOG.error(message + e);
--          throw new DatabricksException(message, e);
--        }
-+      try {
-+        headers.put(
-+            SA_ACCESS_TOKEN_HEADER, gcpScopedCredentials.refreshAccessToken().getTokenValue());
-+      } catch (IOException e) {
-+        LOG.warn("Failed to refresh GCP SA access token, skipping header: {}", e.getMessage());
-       }
- 
-       return headers;
\ No newline at end of file
+ public class GoogleIdCredentialsProvider implements CredentialsProvider {
\ No newline at end of file
.github/actions/setup-build-environment/action.yml
@@ -1,54 +0,0 @@
-diff --git a/.github/actions/setup-build-environment/action.yml b/.github/actions/setup-build-environment/action.yml
-new file mode 100644
---- /dev/null
-+++ b/.github/actions/setup-build-environment/action.yml
-+name: Setup build environment
-+description: Set up JDK with JFrog Artifactory as Maven mirror for hardened runners
-+
-+inputs:
-+  java-version:
-+    description: "Java version to install"
-+    required: true
-+
-+runs:
-+  using: composite
-+  steps:
-+    - name: Setup JFrog CLI with OIDC
-+      if: runner.os != 'macOS'
-+      id: jfrog
-+      uses: jfrog/setup-jfrog-cli@279b1f629f43dd5bc658d8361ac4802a7ef8d2d5 # v4.9.1
-+      env:
-+        JF_URL: https://databricks.jfrog.io
-+      with:
-+        oidc-provider-name: github-actions
-+
-+    - name: Set up JDK
-+      uses: actions/setup-java@b6e674f4b717d7b0ae3baee0fbe79f498905dfde # v1.4.4
-+      with:
-+        java-version: ${{ inputs.java-version }}
-+
-+    - name: Configure Maven for JFrog
-+      if: runner.os != 'macOS'
-+      shell: bash
-+      run: |
-+        mkdir -p ~/.m2
-+        cat > ~/.m2/settings.xml << EOF
-+        <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
-+                  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-+                  xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 https://maven.apache.org/xsd/settings-1.0.0.xsd">
-+          <mirrors>
-+            <mirror>
-+              <id>jfrog-maven</id>
-+              <url>https://databricks.jfrog.io/artifactory/db-maven/</url>
-+              <mirrorOf>*</mirrorOf>
-+            </mirror>
-+          </mirrors>
-+          <servers>
-+            <server>
-+              <id>jfrog-maven</id>
-+              <username>${{ steps.jfrog.outputs.oidc-user }}</username>
-+              <password><![CDATA[${{ steps.jfrog.outputs.oidc-token }}]]></password>
-+            </server>
-+          </servers>
-+        </settings>
-+        EOF
\ No newline at end of file
.github/conftest/README.md
@@ -1,47 +0,0 @@
-diff --git a/.github/conftest/README.md b/.github/conftest/README.md
-new file mode 100644
---- /dev/null
-+++ b/.github/conftest/README.md
-+# Conftest policies for GitHub Actions
-+
-+This directory contains [Conftest](https://www.conftest.dev/) policies that
-+validate GitHub Actions [workflows] and [composite actions]. They are evaluated
-+by the [conftest workflow](../workflows/conftest.yml) on every push and pull
-+request that touches `.github/`.
-+
-+## Adding a new rule
-+
-+1. Create a new `.rego` file under `rules/`.
-+2. Use `package main` and add violations to `deny`.
-+3. Include a comment block at the top of the file explaining the rule and how
-+   to fix violations.
-+4. Push — the conftest workflow picks up new rules automatically.
-+
-+Note that workflows and composite actions have different YAML schemas.
-+Workflows define jobs under `jobs.<name>.steps`, while composite actions define
-+steps under `runs.steps`. Rules that inspect steps must handle both.
-+
-+## Running locally
-+
-+```bash
-+# Install conftest (macOS)
-+brew install conftest
-+
-+# Run all policies against workflows and composite actions
-+conftest test \
-+  .github/workflows/*.yml \
-+  .github/actions/*/action.yml \
-+  --policy .github/conftest/rules
-+```
-+
-+## References
-+
-+- [Conftest](https://www.conftest.dev/) — policy testing tool for configuration files
-+- [Rego](https://www.openpolicyagent.org/docs/latest/policy-language/) — the policy language used by Conftest and OPA
-+- [Workflow syntax](https://docs.github.com/en/actions/writing-workflows/workflow-syntax-for-github-actions) — YAML schema for `.github/workflows/*.yml`
-+- [Composite actions](https://docs.github.com/en/actions/sharing-automations/creating-actions/creating-a-composite-action) — YAML schema for `action.yml` in composite actions
-+- [Security hardening](https://docs.github.com/en/actions/security-for-github-actions/security-guides/security-hardening-for-github-actions) — GitHub's guide to securing workflows
-+- [Using third-party actions](https://docs.github.com/en/actions/security-for-github-actions/security-guides/security-hardening-for-github-actions#using-third-party-actions) — why pinning to commit SHAs matters
-+
-+[workflows]: https://docs.github.com/en/actions/writing-workflows/workflow-syntax-for-github-actions
-+[composite actions]: https://docs.github.com/en/actions/sharing-automations/creating-actions/creating-a-composite-action
\ No newline at end of file
.github/conftest/rules/pinned_actions.rego
@@ -1,55 +0,0 @@
-diff --git a/.github/conftest/rules/pinned_actions.rego b/.github/conftest/rules/pinned_actions.rego
-new file mode 100644
---- /dev/null
-+++ b/.github/conftest/rules/pinned_actions.rego
-+# Action pinning — supply-chain protection
-+#
-+# External actions must be pinned to a full 40-character commit SHA.
-+# Mutable tags like @v1 can be reassigned to point at malicious commits.
-+# Local composite actions (./...) are exempt.
-+#
-+# Good:  actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683
-+# Bad:   actions/checkout@v4
-+# Bad:   actions/checkout@main
-+#
-+# How to fix:
-+#   1. Find the tag you want to pin (e.g. v4.3.1).
-+#   2. Look up the commit SHA:
-+#        git ls-remote --tags https://github.com/<owner>/<action>.git '<tag>^{}' '<tag>'
-+#   3. Replace the tag with the SHA and add a comment with the tag name:
-+#        uses: actions/checkout@<sha> # v4.3.1
-+#
-+# Always include the "# <tag>" suffix comment so humans can tell which
-+# version is pinned. This cannot be enforced by conftest (YAML strips
-+# comments during parsing), so it is a convention to follow manually.
-+
-+package main
-+
-+import rego.v1
-+
-+_is_pinned(ref) if {
-+	regex.match(`^[^@]+@[0-9a-f]{40}$`, ref)
-+}
-+
-+_is_local(ref) if {
-+	startswith(ref, "./")
-+}
-+
-+# Workflow files: jobs.<name>.steps[].uses
-+deny contains msg if {
-+	some job_name, job in input.jobs
-+	some i, step in job.steps
-+	step.uses
-+	not _is_local(step.uses)
-+	not _is_pinned(step.uses)
-+	msg := sprintf("%s: step %d: action '%s' must be pinned to a full commit SHA", [job_name, i, step.uses])
-+}
-+
-+# Composite actions: runs.steps[].uses
-+deny contains msg if {
-+	some i, step in input.runs.steps
-+	step.uses
-+	not _is_local(step.uses)
-+	not _is_pinned(step.uses)
-+	msg := sprintf("step %d: action '%s' must be pinned to a full commit SHA", [i, step.uses])
-+}
\ No newline at end of file
.github/workflows/conftest.yml
@@ -1,45 +0,0 @@
-diff --git a/.github/workflows/conftest.yml b/.github/workflows/conftest.yml
-new file mode 100644
---- /dev/null
-+++ b/.github/workflows/conftest.yml
-+name: conftest
-+
-+on:
-+  push:
-+    branches: [main]
-+    paths:
-+      - '.github/**'
-+  pull_request:
-+    paths:
-+      - '.github/**'
-+
-+env:
-+  CONFTEST_VERSION: "0.67.0"
-+  CONFTEST_SHA256: "a98cfd236f3cadee16d860fbe31cc6edcb0da3efc317f661c7ccd097164b1fdf"
-+
-+jobs:
-+  conftest:
-+    runs-on: ubuntu-latest
-+
-+    steps:
-+      - name: Checkout
-+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
-+
-+      - name: Install conftest
-+        run: |-
-+          curl -fsSL "https://github.com/open-policy-agent/conftest/releases/download/v${CONFTEST_VERSION}/conftest_${CONFTEST_VERSION}_Linux_x86_64.tar.gz" \
-+            -o conftest.tar.gz
-+          echo "${CONFTEST_SHA256}  conftest.tar.gz" | sha256sum --check --strict
-+          tar xz -f conftest.tar.gz conftest
-+          sudo mv conftest /usr/local/bin/
-+
-+      - name: Run conftest
-+        shell: bash
-+        run: |-
-+          shopt -s nullglob
-+          conftest test \
-+            .github/workflows/*.yml \
-+            .github/workflows/*.yaml \
-+            .github/actions/*/action.yml \
-+            .github/actions/*/action.yaml \
-+            --policy .github/conftest/rules
\ No newline at end of file
.github/workflows/push.yml
@@ -1,111 +0,0 @@
-diff --git a/.github/workflows/push.yml b/.github/workflows/push.yml
---- a/.github/workflows/push.yml
-+++ b/.github/workflows/push.yml
-   merge_group:
-     types: [checks_requested]
- 
-+permissions:
-+  id-token: write
-+  contents: read
-+
- jobs:
-   fmt:
--    runs-on: ubuntu-latest
--    steps:
--      - name: Set up JDK 11
--        uses: actions/setup-java@b6e674f4b717d7b0ae3baee0fbe79f498905dfde # v1.4.4
--        with:
--          java-version: 11
-+    runs-on:
-+      group: databricks-protected-runner-group
-+      labels: linux-ubuntu-latest
- 
-+    steps:
-       - name: Checkout
-         uses: actions/checkout@ee0669bd1cc54295c223e0bb666b733df41de1c5 # v2.7.0
- 
-       - name: Cache Maven packages
-         uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
-         with:
--          path: ~/.m2
-+          path: ~/.m2/repository
-           key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
-           restore-keys: ${{ runner.os }}-m2
- 
-+      - name: Setup build environment
-+        uses: ./.github/actions/setup-build-environment
-+        with:
-+          java-version: 11
-+
-       - name: Check formatting
-         run: mvn --errors spotless:check
- 
-+      - name: Check for JFrog proxy URLs in lockfiles
-+        run: |
-+          make fix-lockfile
-+          git diff --exit-code -- '**/lockfile.json'
-+
-   unit-tests:
-     strategy:
-       fail-fast: false
-       matrix:
-         os: [macos-latest, ubuntu-latest]
-         java-version: [8, 11, 17, 20] # 20 is the latest version as of 2023 and 17 is the latest LTS
-+        include:
-+          - os: ubuntu-latest
-+            runner:
-+              group: databricks-protected-runner-group
-+              labels: linux-ubuntu-latest
-+          - os: macos-latest
-+            runner: macos-latest
- 
--    runs-on: ${{ matrix.os }}
-+    runs-on: ${{ matrix.runner }}
- 
-     steps:
--      - name: Set up JDK
--        uses: actions/setup-java@b6e674f4b717d7b0ae3baee0fbe79f498905dfde # v1.4.4
-+      - name: Checkout
-+        uses: actions/checkout@ee0669bd1cc54295c223e0bb666b733df41de1c5 # v2.7.0
-+
-+      - name: Cache Maven packages
-+        uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
-+        with:
-+          path: ~/.m2/repository
-+          key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
-+          restore-keys: ${{ runner.os }}-m2
-+
-+      - name: Setup build environment
-+        uses: ./.github/actions/setup-build-environment
-         with:
-           java-version: ${{ matrix.java-version }}
- 
-+      - name: Check Unit Tests
-+        run: mvn --errors test
-+
-+  check-lock:
-+    runs-on:
-+      group: databricks-protected-runner-group
-+      labels: linux-ubuntu-latest
-+
-+    steps:
-       - name: Checkout
-         uses: actions/checkout@ee0669bd1cc54295c223e0bb666b733df41de1c5 # v2.7.0
- 
-       - name: Cache Maven packages
-         uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
-         with:
--          path: ~/.m2
-+          path: ~/.m2/repository
-           key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
-           restore-keys: ${{ runner.os }}-m2
- 
--      - name: Check Unit Tests
--        run: mvn --errors test
-+      - name: Setup build environment
-+        uses: ./.github/actions/setup-build-environment
-+        with:
-+          java-version: 11
-+
-+      - name: Validate lockfile
-+        run: make check-lock
\ No newline at end of file
.github/workflows/release.yml
@@ -1,94 +0,0 @@
-diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml
---- a/.github/workflows/release.yml
-+++ b/.github/workflows/release.yml
-     branches:
-       - "**"
- 
-+permissions:
-+  id-token: write
-+  contents: read
-+
- jobs:
-   publish:
-     # Dynamically set the job name based on the trigger
-     name: ${{ startsWith(github.ref, 'refs/tags/') && 'Publish Release' || 'Run Release Dry-Run' }}
- 
-     runs-on:
--      group: databricks-deco-testing-runner-group
--      labels: ubuntu-latest-deco
-+      group: databricks-protected-runner-group
-+      labels: linux-ubuntu-latest
- 
-     steps:
-       - name: Checkout
-         uses: actions/checkout@ee0669bd1cc54295c223e0bb666b733df41de1c5 # v2.7.0
- 
-+      - name: Cache Maven packages
-+        uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
-+        with:
-+          path: ~/.m2/repository
-+          key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
-+          restore-keys: ${{ runner.os }}-m2
-+
-+      - name: Setup JFrog CLI with OIDC
-+        id: jfrog
-+        uses: jfrog/setup-jfrog-cli@279b1f629f43dd5bc658d8361ac4802a7ef8d2d5 # v4.9.1
-+        env:
-+          JF_URL: https://databricks.jfrog.io
-+        with:
-+          oidc-provider-name: github-actions
-+
-       - name: Set up Java for publishing to Maven Central Repository
-         uses: actions/setup-java@17f84c3641ba7b8f6deff6309fc4c864478f5d62 # v3.14.1
-         with:
-           java-version: 8
--          server-id: central
-           distribution: "adopt"
--          server-username: MAVEN_CENTRAL_USERNAME
--          server-password: MAVEN_CENTRAL_PASSWORD
-           gpg-private-key: ${{ secrets.GPG_PRIVATE_KEY }}
-           gpg-passphrase: GPG_PASSPHRASE
--          
-+
-+      - name: Configure Maven for JFrog and Maven Central
-+        run: |
-+          mkdir -p ~/.m2
-+          cat > ~/.m2/settings.xml << EOF
-+          <settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
-+                    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-+                    xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 https://maven.apache.org/xsd/settings-1.0.0.xsd">
-+            <mirrors>
-+              <mirror>
-+                <id>jfrog-maven</id>
-+                <url>https://databricks.jfrog.io/artifactory/db-maven/</url>
-+                <mirrorOf>*</mirrorOf>
-+              </mirror>
-+            </mirrors>
-+            <servers>
-+              <server>
-+                <id>jfrog-maven</id>
-+                <username>${{ steps.jfrog.outputs.oidc-user }}</username>
-+                <password><![CDATA[${{ steps.jfrog.outputs.oidc-token }}]]></password>
-+              </server>
-+              <server>
-+                <id>central</id>
-+                <username>${{ secrets.MAVEN_CENTRAL_USERNAME }}</username>
-+                <password>${{ secrets.MAVEN_CENTRAL_PASSWORD }}</password>
-+              </server>
-+              <server>
-+                <id>gpg.passphrase</id>
-+                <passphrase>\${env.GPG_PASSPHRASE}</passphrase>
-+              </server>
-+            </servers>
-+          </settings>
-+          EOF
-+
-       # This step runs ONLY on branch pushes (dry-run)
-       - name: Run Release Dry-Run (Verify)
-         if: "!startsWith(github.ref, 'refs/tags/')"
-         uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844 # v0.1.15
-         with:
-           files: databricks-sdk-java/target/*.jar
--          body_path: /tmp/release-notes/release-notes.md
-\ No newline at end of file
-+          body_path: /tmp/release-notes/release-notes.md
\ No newline at end of file
Makefile
@@ -1,18 +0,0 @@
-diff --git a/Makefile b/Makefile
---- a/Makefile
-+++ b/Makefile
- test:
- 	mvn test
- 
-+lock:
-+	mvn io.github.chains-project:maven-lockfile:5.5.2:generate
-+
-+check-lock:
-+	mvn io.github.chains-project:maven-lockfile:5.5.2:validate
-+
-+fix-lockfile:
-+	@# Replace JFrog proxy URLs with public Maven Central equivalents in lockfiles.
-+	@# Prevents proxy URLs from being accidentally committed.
-+	find . -type f -name 'lockfile.json' \
-+	  -exec sed -i 's|databricks\.jfrog\.io/artifactory/db-maven|repo.maven.apache.org/maven2|g' {} +
-+
\ No newline at end of file

... (truncated, output exceeded 60000 bytes)

Reproduce locally: git range-diff a8043ce..b8920e0 e281114..2370a96 | Disable: git config gitstack.push-range-diff false

@github-actions
Contributor

If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
go/deco-tests-run/sdk-java

Inputs:

  • PR number: 742
  • Commit SHA: 2370a961e60f666f0e28c2358deb5fe4df7635f4

Checks will be approved automatically on success.
